diff --git a/.asf.yaml b/.asf.yaml index 6b1bb1599f463..fe6fbce3d35f2 100644 --- a/.asf.yaml +++ b/.asf.yaml @@ -41,7 +41,7 @@ github: rebase: false protected_branches: - master: + main: required_pull_request_reviews: required_approving_review_count: 1 v1-10-stable: diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md index 1e3c23d5ad347..3846ae698c717 100644 --- a/.github/PULL_REQUEST_TEMPLATE.md +++ b/.github/PULL_REQUEST_TEMPLATE.md @@ -17,7 +17,7 @@ http://chris.beams.io/posts/git-commit/ --- **^ Add meaningful description above** -Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)** for more information. +Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information. In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). -In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md). +In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md). diff --git a/.github/boring-cyborg.yml b/.github/boring-cyborg.yml index 19b4c3fb148b0..7d97c2f6eafce 100644 --- a/.github/boring-cyborg.yml +++ b/.github/boring-cyborg.yml @@ -199,20 +199,20 @@ labelerFlags: firstPRWelcomeComment: > Congratulations on your first Pull Request and welcome to the Apache Airflow community! 
If you have any issues or are unsure about anything please check our - Contribution Guide (https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst) + Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst) Here are some useful points: - Pay attention to the quality of your code (flake8, pylint and type annotations). Our [pre-commits]( - https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) + https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks) will help you with that. - In case of a new feature add useful documentation (in docstrings or in `docs/` directory). Adding a new operator? Check this short - [guide](https://github.com/apache/airflow/blob/master/docs/apache-airflow/howto/custom-operator.rst) + [guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst) Consider adding an example DAG that shows how users should use it. - - Consider using [Breeze environment](https://github.com/apache/airflow/blob/master/BREEZE.rst) for testing + - Consider using [Breeze environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for testing locally, it's a heavy Docker image but it ships with a working Airflow and a lot of integrations. - Be patient and persistent. It might take some time to get a review or get the final approval from @@ -222,7 +222,7 @@ firstPRWelcomeComment: > communication including (but not limited to) comments on Pull Requests, Mailing list and Slack. - Be sure to read the [Airflow Coding style]( - https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#coding-style-and-best-practices). + https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices). Apache Airflow is a community-driven project and together we are making it better 🚀. 
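The welcome comment above points contributors at the pre-commit hooks; those hooks (and the breeze commands changed later in this diff) find a branch's changed files via ``git merge-base``. A self-contained sketch of that pattern — the throwaway repository, the ``feature`` branch name, and the file name are purely illustrative, not part of Airflow:

```shell
#!/usr/bin/env bash
# Illustrative sketch (not code from this repository): build a throwaway
# repo and show how "git merge-base main HEAD" finds the reference point
# that pre-commit/breeze use via --from-ref/--to-ref.
set -eu

tmp=$(mktemp -d)
cd "$tmp"
git init -q
git checkout -q -b main
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "base"

# Simulate a contributor branch with one changed file.
git checkout -q -b feature
echo "change" > new_file.txt
git add new_file.txt
git -c user.email=ci@example.com -c user.name=ci commit -q -m "feature change"

# Everything changed since the branch diverged from main:
from_ref=$(git merge-base main HEAD)
changed=$(git diff --name-only "${from_ref}" HEAD)
echo "${changed}"   # prints: new_file.txt
```

The same ``--from-ref $(git merge-base main HEAD) --to-ref HEAD`` pair appears in the BREEZE.rst changes further down in this diff.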
diff --git a/.github/workflows/build-images.yml b/.github/workflows/build-images.yml index cf71c8357fd42..34ddcc39d8ecb 100644 --- a/.github/workflows/build-images.yml +++ b/.github/workflows/build-images.yml @@ -22,7 +22,7 @@ on: # yamllint disable-line rule:truthy - cron: '28 0 * * *' pull_request_target: push: - branches: ['main', 'master', 'v1-10-test', 'v1-10-stable', 'v2-0-test'] + branches: ['main', 'v1-10-test', 'v1-10-stable', 'v2-0-test'] env: MOUNT_SELECTED_LOCAL_SOURCES: "false" FORCE_ANSWER_TO_QUESTIONS: "yes" @@ -130,7 +130,7 @@ jobs: strategy: matrix: # We need to attempt to build all possible versions here because pull_request_target - # event is run from master for both master and v1-10-tests + # event is run for both main and v1-10-tests python-version: ${{ fromJson(needs.build-info.outputs.allPythonVersions) }} fail-fast: true if: needs.build-info.outputs.image-build == 'true' @@ -158,7 +158,7 @@ jobs: # We cannot "source" the script here because that would be a security problem (we cannot run # any code that comes from the sources coming from the PR. Therefore we extract the # DEFAULT_BRANCH and DEFAULT_CONSTRAINTS_BRANCH via custom grep/awk/sed commands - # Also 2.7 and 3.5 versions are not allowed to proceed on master + # Also 2.7 and 3.5 versions are not allowed to proceed on main id: defaults run: | DEFAULT_BRANCH=$(grep "export DEFAULT_BRANCH" scripts/ci/libraries/_initialization.sh | \ @@ -218,7 +218,7 @@ jobs: strategy: matrix: # We need to attempt to build all possible versions here because pull_request_target - # event is run from master for both master and v1-10-tests + # event is run for both main and v1-10-tests python-version: ${{ fromJson(needs.build-info.outputs.allPythonVersions) }} fail-fast: true if: needs.build-info.outputs.image-build == 'true' @@ -245,7 +245,7 @@ jobs: # We cannot "source" the script here because that would be a security problem (we cannot run # any code that comes from the sources coming from the PR. 
Therefore we extract the # DEFAULT_BRANCH and DEFAULT_CONSTRAINTS_BRANCH via custom grep/awk/sed commands - # Also 2.7 and 3.5 versions are not allowed to proceed on master + # Also 2.7 and 3.5 versions are not allowed to proceed on main id: defaults run: | DEFAULT_BRANCH=$(grep "export DEFAULT_BRANCH" scripts/ci/libraries/_initialization.sh | \ diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 350e6db434f0c..af4103c8c7f3b 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -21,9 +21,9 @@ on: # yamllint disable-line rule:truthy schedule: - cron: '28 0 * * *' push: - branches: ['master', 'v[0-9]+-[0-9]+-test'] + branches: ['main', 'v[0-9]+-[0-9]+-test'] pull_request: - branches: ['master', 'v[0-9]+-[0-9]+-test', 'v[0-9]+-[0-9]+-stable'] + branches: ['main', 'v[0-9]+-[0-9]+-test', 'v[0-9]+-[0-9]+-stable'] env: MOUNT_SELECTED_LOCAL_SOURCES: "false" @@ -496,7 +496,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" - name: Configure AWS credentials uses: ./.github/actions/configure-aws-credentials if: > - github.ref == 'refs/heads/master' && github.repository == 'apache/airflow' && + github.ref == 'refs/heads/main' && github.repository == 'apache/airflow' && github.event_name == 'push' with: aws-access-key-id: ${{ secrets.DOCS_AWS_ACCESS_KEY_ID }} @@ -504,7 +504,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" aws-region: eu-central-1 - name: "Upload documentation to AWS S3" if: > - github.ref == 'refs/heads/master' && github.repository == 'apache/airflow' && + github.ref == 'refs/heads/main' && github.repository == 'apache/airflow' && github.event_name == 'push' run: aws s3 sync --delete ./files/documentation s3://apache-airflow-docs @@ -519,13 +519,13 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}} VERSION_SUFFIX_FOR_PYPI: ".dev0" GITHUB_REGISTRY: ${{ needs.ci-images.outputs.githubRegistry }} - if: needs.build-info.outputs.image-build == 'true' && 
needs.build-info.outputs.default-branch == 'master' + if: needs.build-info.outputs.image-build == 'true' && needs.build-info.outputs.default-branch == 'main' steps: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" uses: actions/checkout@v2 with: persist-credentials: false - if: needs.build-info.outputs.default-branch == 'master' + if: needs.build-info.outputs.default-branch == 'main' - name: "Setup python" uses: actions/setup-python@v2 with: @@ -566,13 +566,13 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" PYTHON_MAJOR_MINOR_VERSION: ${{needs.build-info.outputs.defaultPythonVersion}} VERSION_SUFFIX_FOR_PYPI: ".dev0" GITHUB_REGISTRY: ${{ needs.ci-images.outputs.githubRegistry }} - if: needs.build-info.outputs.image-build == 'true' && needs.build-info.outputs.default-branch == 'master' + if: needs.build-info.outputs.image-build == 'true' && needs.build-info.outputs.default-branch == 'main' steps: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" uses: actions/checkout@v2 with: persist-credentials: false - if: needs.build-info.outputs.default-branch == 'master' + if: needs.build-info.outputs.default-branch == 'main' - name: "Setup python" uses: actions/setup-python@v2 with: @@ -897,8 +897,8 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" uses: actions/setup-python@v2 with: python-version: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} - - name: "Set issue id for master" - if: github.ref == 'refs/heads/master' + - name: "Set issue id for main" + if: github.ref == 'refs/heads/main' run: | echo "ISSUE_ID=10118" >> $GITHUB_ENV - name: "Set issue id for v1-10-stable" @@ -1109,7 +1109,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" - docs # TODO: Generalize me (find a better way to select matching branches) if: > - (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/v1-10-test' || + (github.ref == 'refs/heads/main' || github.ref == 'refs/heads/v1-10-test' || github.ref == 'refs/heads/v2-0-test' || github.ref == 'refs/heads/v2-1-test') && 
github.event_name != 'schedule' strategy: @@ -1134,7 +1134,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" - name: Set push-python-image id: push-python-image run: | - if [[ "${REF}" == 'refs/head/master' || "${REF}" == 'refs/head/main' ]]; then + if [[ "${REF}" == 'refs/head/main' ]]; then echo "::set-output name=wanted::true" else echo "::set-output name=wanted::false" fi @@ -1171,7 +1171,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" - docs # TODO: Generalize me (find a better way to select matching branches) if: > - (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/v1-10-test' || + (github.ref == 'refs/heads/main' || github.ref == 'refs/heads/v1-10-test' || github.ref == 'refs/heads/v2-0-test' || github.ref == 'refs/heads/v2-1-test') && github.event_name != 'schedule' strategy: @@ -1221,7 +1221,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" # Only run it for direct pushes # TODO: Generalize me (find a better way to select matching branches) if: > - github.ref == 'refs/heads/master' || github.ref == 'refs/heads/v1-10-test' || + github.ref == 'refs/heads/main' || github.ref == 'refs/heads/v1-10-test' || github.ref == 'refs/heads/v2-0-test' || github.ref == 'refs/heads/v2-1-test' steps: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" @@ -1308,7 +1308,7 @@ ${{ hashFiles('.pre-commit-config.yaml') }}" github_token: ${{ secrets.GITHUB_TOKEN }} tags: true force: true - branch: master + branch: main tests-ui: timeout-minutes: 10 diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml index 5c4b0afd6aa0d..87d6c995a02ff 100644 --- a/.github/workflows/codeql-analysis.yml +++ b/.github/workflows/codeql-analysis.yml @@ -20,7 +20,7 @@ name: "CodeQL" on: # yamllint disable-line rule:truthy push: - branches: [master, main] + branches: [main] schedule: - cron: '0 2 * * *' diff --git a/.github/workflows/label_when_reviewed_workflow_run.yml 
b/.github/workflows/label_when_reviewed_workflow_run.yml index 59bde48284a00..e64b85bab44f8 100644 --- a/.github/workflows/label_when_reviewed_workflow_run.yml +++ b/.github/workflows/label_when_reviewed_workflow_run.yml @@ -72,7 +72,7 @@ jobs: ref: ${{ steps.source-run-info.outputs.targetCommitSha }} fetch-depth: 2 persist-credentials: false - # checkout the master version again, to use the right script in master workflow + # checkout the main branch again, to use the right script in main workflow - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" uses: actions/checkout@v2 with: @@ -107,7 +107,7 @@ jobs: comment: > The PR most likely needs to run full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. - If they don't merge it quickly - please rebase it to the latest master at your convenience, + If they don't merge it quickly - please rebase it to the latest main at your convenience, or amend the last commit of the PR, and push it with --force-with-lease. - name: "Initiate GitHub Check forcing rerun of SH ${{ github.event.pull_request.head.sha }}" uses: ./.github/actions/checks-action @@ -139,7 +139,7 @@ jobs: The PR is likely OK to be merged with just subset of tests for default Python and Database versions without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full tests matrix is needed, they will add the label - 'full tests needed'. Then you should rebase to the latest master or amend the last commit + 'full tests needed'. Then you should rebase to the latest main or amend the last commit of the PR, and push it with --force-with-lease. - name: "Label when approved by committers for PRs that do not require tests at all" uses: ./.github/actions/label-when-approved-action @@ -153,7 +153,7 @@ jobs: comment: > The PR is likely ready to be merged. 
No tests are needed as no important environment files, nor python files were modified by it. However, committers might decide that full test matrix is - needed and add the 'full tests needed' label. Then you should rebase it to the latest master + needed and add the 'full tests needed' label. Then you should rebase it to the latest main or amend the last commit of the PR, and push it with --force-with-lease. - name: Update Selective Build check uses: ./.github/actions/checks-action diff --git a/.github/workflows/repo_sync.yml b/.github/workflows/repo_sync.yml deleted file mode 100644 index 76afc19213948..0000000000000 --- a/.github/workflows/repo_sync.yml +++ /dev/null @@ -1,36 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
-# ---- -name: Force sync master from apache/airflow -on: # yamllint disable-line rule:truthy - workflow_dispatch: -jobs: - repo-sync: - if: github.repository != 'apache/airflow' - runs-on: ubuntu-20.04 - steps: - - uses: actions/checkout@v2 - with: - persist-credentials: false - - name: repo-sync - uses: repo-sync/github-sync@v2 - with: - source_repo: "apache/airflow" - source_branch: "master" - destination_branch: "master" - github_token: ${{ secrets.GITHUB_TOKEN }} diff --git a/BREEZE.rst b/BREEZE.rst index b6e82d17d4686..fac5104e17ca9 100644 --- a/BREEZE.rst +++ b/BREEZE.rst @@ -541,7 +541,7 @@ For all development tasks, unit tests, integration tests, and static code checks **CI image** maintained on the DockerHub in the ``apache/airflow-ci`` repository. This Docker image contains a lot of test-related packages (size of ~1GB). Its tag follows the pattern of ``<BRANCH>-python<PYTHON_MAJOR_MINOR_VERSION>-ci`` -(for example, ``apache/airflow:master-python3.6-ci`` or ``apache/airflow-ci:v2-1-test-python3.6-ci``). +(for example, ``apache/airflow:main-python3.6-ci`` or ``apache/airflow-ci:v2-1-test-python3.6-ci``). The image is built using the ``_ Dockerfile. The CI image is built automatically as needed, however it can be rebuilt manually with @@ -637,7 +637,7 @@ Building Production images The **Production image** is also maintained on the DockerHub in both ``apache/airflow`` (for tagged and latest releases) or ``apache/airflow-ci`` repository (for branches). This Docker image (built using official Dockerfile) contains size-optimised Airflow installation with selected extras and dependencies. Its tag follows -the pattern of ``<BRANCH>-python<PYTHON_MAJOR_MINOR_VERSION>`` (for example, ``apache/airflow-ci:master-python3.6`` +the pattern of ``<BRANCH>-python<PYTHON_MAJOR_MINOR_VERSION>`` (for example, ``apache/airflow-ci:main-python3.6`` or ``apache/airflow-ci:v2-1-test-python3.6``) or in case of production images tagged with releases ``apache/airflow:2.1.2-python3.8`` or ``apache/airflow:latest`` or ``apache/airflow:latest-python3.8``. 
@@ -730,7 +730,7 @@ If you ever need to get a list of the files that will be checked (for troub .. code-block:: bash breeze static-check identity --verbose # currently staged files - breeze static-check identity --verbose -- --from-ref $(git merge-base master HEAD) --to-ref HEAD # branch updates + breeze static-check identity --verbose -- --from-ref $(git merge-base main HEAD) --to-ref HEAD # branch updates Building the Documentation -------------------------- @@ -773,8 +773,8 @@ easily identify the location the problems with documentation originated from. Generating constraints ---------------------- -Whenever setup.py gets modified, the CI master job will re-generate constraint files. Those constraint -files are stored in separated orphan branches: ``constraints-master``, ``constraints-2-0``. +Whenever setup.py gets modified, the CI main job will re-generate constraint files. Those constraint +files are stored in separate orphan branches: ``constraints-main``, ``constraints-2-0``. Those are constraint files as described in detail in the ``_ contributing documentation. @@ -795,7 +795,7 @@ Constraints are generated separately for each python version and there are separ * "constraints-source-providers" - those are constraints generated by using providers installed from current sources. While adding new providers their dependencies might change, so this set of providers - is the current set of the constraints for airflow and providers from the current master sources. + is the current set of the constraints for airflow and providers from the current main sources. Those providers are used by CI system to keep "stable" set of constraints. Use ``source-providers`` mode for that. @@ -1272,7 +1272,7 @@ This is the current syntax for `./breeze <./breeze>`_: -t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE Installs Airflow directly from reference in GitHub when building PROD image. - This can be a GitHub branch like master or v2-1-test, or a tag like 2.1.0a1. 
+ This can be a GitHub branch like main or v2-1-test, or a tag like 2.1.0a1. --installation-method INSTALLATION_METHOD Method of installing Airflow in PROD image - either from the sources ('.') @@ -1420,7 +1420,7 @@ This is the current syntax for `./breeze <./breeze>`_: --use-github-registry flag) to build images. The pulled images will be used as cache. Those builds are usually faster than when ''--build-cache-local'' with the exception if the registry images are not yet updated. The DockerHub images are updated nightly and the - GitHub images are updated after merges to master so it might be that the images are still + GitHub images are updated after merges to main so it might be that the images are still outdated vs. the latest version of the Dockerfiles you are using. In this case, the ''--build-cache-local'' might be faster, especially if you iterate and change the Dockerfiles yourself. @@ -1535,7 +1535,7 @@ This is the current syntax for `./breeze <./breeze>`_: Generates pinned constraint files with all extras from setup.py. Those files are generated in the files folder - separate files for different python versions. Those constraint files when - pushed to orphan constraints-master, constraints-2-0 branches are used + pushed to orphan constraints-main, constraints-2-0 branches are used to generate repeatable CI builds as well as run repeatable production image builds and upgrades when you want to include installing or updating some of the released providers released at the time a particular airflow version was released. You can use those @@ -2074,7 +2074,7 @@ This is the current syntax for `./breeze <./breeze>`_: --use-github-registry flag) to build images. The pulled images will be used as cache. Those builds are usually faster than when ''--build-cache-local'' with the exception if the registry images are not yet updated. 
The DockerHub images are updated nightly and the - GitHub images are updated after merges to master so it might be that the images are still + GitHub images are updated after merges to main so it might be that the images are still outdated vs. the latest version of the Dockerfiles you are using. In this case, the ''--build-cache-local'' might be faster, especially if you iterate and change the Dockerfiles yourself. @@ -2255,9 +2255,9 @@ This is the current syntax for `./breeze <./breeze>`_: 'breeze static-check mypy -- --files tests/core.py' 'breeze static-check mypy -- --all-files' - To check all files that differ between you current branch and master run: + To check all files that differ between your current branch and main run: - 'breeze static-check all -- --from-ref $(git merge-base master HEAD) --to-ref HEAD' + 'breeze static-check all -- --from-ref $(git merge-base main HEAD) --to-ref HEAD' To check all files that are in the HEAD commit run: @@ -2485,7 +2485,7 @@ This is the current syntax for `./breeze <./breeze>`_: -t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE Installs Airflow directly from reference in GitHub when building PROD image. - This can be a GitHub branch like master or v2-1-test, or a tag like 2.1.0a1. + This can be a GitHub branch like main or v2-1-test, or a tag like 2.1.0a1. --installation-method INSTALLATION_METHOD Method of installing Airflow in PROD image - either from the sources ('.') @@ -2656,7 +2656,7 @@ This is the current syntax for `./breeze <./breeze>`_: --use-github-registry flag) to build images. The pulled images will be used as cache. Those builds are usually faster than when ''--build-cache-local'' with the exception if the registry images are not yet updated. The DockerHub images are updated nightly and the - GitHub images are updated after merges to master so it might be that the images are still + GitHub images are updated after merges to main so it might be that the images are still outdated vs. 
the latest version of the Dockerfiles you are using. In this case, the ''--build-cache-local'' might be faster, especially if you iterate and change the Dockerfiles yourself. diff --git a/CI.rst b/CI.rst index 6551f188e6e70..769d8d851f811 100644 --- a/CI.rst +++ b/CI.rst @@ -21,7 +21,7 @@ CI Environment ============== Continuous Integration is an important component of making Apache Airflow robust and stable. We are running -a lot of tests for every pull request, for master and v2-*-test branches and regularly as CRON jobs. +a lot of tests for every pull request, for main and v2-*-test branches and regularly as CRON jobs. Our execution environment for CI is `GitHub Actions `_. GitHub Actions (GA) are very well integrated with GitHub code and Workflow and it has evolved fast in 2019/2020 to become @@ -57,20 +57,20 @@ Container Registry used as cache For our CI builds we are using Container Registry to store results of the "Build Image" workflow and pass it to the "CI Build" workflow. -Currently in master version of Airflow we run tests in 3 different versions of Python (3.6, 3.7, 3.8) +Currently in the main version of Airflow we run tests in 3 different versions of Python (3.6, 3.7, 3.8) which means that we have to build 6 images (3 CI ones and 3 PROD ones). Yet we run around 12 jobs with each of the CI images. That is a lot of time to just build the environment to run. Therefore we are utilising the ``pull_request_target`` feature of GitHub Actions. This feature allows running a separate, independent workflow, when the main workflow is run - -this separate workflow is different than the main one, because by default it runs using ``master`` version +this separate workflow is different than the main one, because by default it runs using the ``main`` version of the sources but also - and most of all - that it has WRITE access to the repository. 
This is especially important in our case where Pull Requests to Airflow might come from any repository, and it would be a huge security issue if anyone from outside could utilise the WRITE access to Apache Airflow repository via an external Pull Request. -Thanks to the WRITE access and fact that the 'pull_request_target' by default uses the 'master' version of the +Thanks to the WRITE access and the fact that 'pull_request_target' by default uses the 'main' version of the sources, we can safely run some logic there that will check out the incoming Pull Request, build the container image from the sources from the incoming PR and push such image to a GitHub Docker Registry - so that this image can be built only once and used by all the jobs running tests. The image is tagged with unique @@ -304,13 +304,13 @@ You can use those variables when you try to reproduce the build locally. | | | | | tested set of dependency constraints | | | | | | stored in separated "orphan" branches | | | | | | of the airflow repository | -| | | | | ("constraints-master, "constraints-2-0") | +| | | | | ("constraints-main", "constraints-2-0") | | | | | | but when this flag is set to anything but false | | | | | | (for example commit SHA), they are not used | | | | | | and "eager" upgrade strategy is used | | | | | | when installing dependencies. We set it | | | | | | to true in case of direct pushes (merges) | -| | | | | to master and scheduled builds so that | +| | | | | to main and scheduled builds so that | | | | | | the constraints are tested. In those builds, | | | | | | in case we determine that the tests pass | | | | | | we automatically push latest set of | @@ -391,7 +391,7 @@ Note that you need to set "CI" variable to true in order to get the same results | CI_TARGET_REPO | ``apache/airflow`` | Target repository for the CI build. Used to | | | | compare incoming changes from PR with the target. 
| +------------------------------+----------------------+-----------------------------------------------------+ -| CI_TARGET_BRANCH | ``master`` | Target branch where the PR should land. Used to | +| CI_TARGET_BRANCH | ``main`` | Target branch where the PR should land. Used to | | | | compare incoming changes from PR with the target. | +------------------------------+----------------------+-----------------------------------------------------+ | CI_BUILD_ID | ``0`` | Unique id of the build that is kept across re runs | @@ -404,7 +404,7 @@ Note that you need to set "CI" variable to true in order to get the same results | | | [``pull_request``, ``pull_request_target``, | | | | ``schedule``, ``push``] | +------------------------------+----------------------+-----------------------------------------------------+ -| CI_REF | ``refs/head/master`` | Branch in the source repository that is used to | +| CI_REF | ``refs/head/main`` | Branch in the source repository that is used to | | | | make the pull request. | +------------------------------+----------------------+-----------------------------------------------------+ @@ -480,9 +480,9 @@ We are currently in the process of testing using GitHub Container Registry as ca the CI process. The default registry is set to "GitHub Packages", but we are testing the GitHub Container Registry. In case of GitHub Packages, authentication uses GITHUB_TOKEN mechanism. Authentication is needed for both pushing the images (WRITE) and pulling them (READ) - which means that GitHub token -is used in "master" build (WRITE) and in fork builds (READ). For container registry, our images are +is used in "main" build (WRITE) and in fork builds (READ). For container registry, our images are Publicly Visible and we do not need any authentication to pull them so the CONTAINER_REGISTRY_TOKEN is -only set in the "master" builds only ("Build Images" workflow). +only set in the "main" builds ("Build Images" workflow). 
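The WRITE/READ split described above can be summarized in a tiny helper. This is a hypothetical sketch, not code from the workflows: the function name and the exact push/pull_request conditions are assumptions made for illustration.

```shell
# Hypothetical helper (illustration only): classify the registry access a
# build needs. Pushes to the main apache/airflow repository publish images
# (WRITE); fork and pull-request builds only consume them (READ).
registry_access_for() {
  local repo="$1" event="$2"
  if [ "${repo}" = "apache/airflow" ] && [ "${event}" = "push" ]; then
    echo "WRITE"
  else
    echo "READ"
  fi
}

registry_access_for "apache/airflow" "push"          # prints: WRITE
registry_access_for "myfork/airflow" "pull_request"  # prints: READ
```

This mirrors the sentence above: the token with WRITE scope is needed only where images are pushed, so it is configured only for the "Build Images" runs in the main repository.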
Dockerhub Variables =================== @@ -574,7 +574,7 @@ The housekeeping is important - Python base images are refreshed with varying fr usually but sometimes several times per week) with the latest security and bug fixes. Those patch-level image releases can occasionally break Airflow builds (specifically Docker image builds based on those images) therefore in PRs we only use the latest "good" Python image that we store in the -private GitHub cache. The direct push/master builds are not using registry cache to pull the Python images +private GitHub cache. The direct push/main builds are not using registry cache to pull the Python images - they are directly pulling the images from DockerHub, therefore they will try the latest images after they are released and in case they are fine, CI Docker image is built and tests are passing - those jobs will push the base images to the private GitHub Registry so that they can be used by subsequent @@ -583,13 +583,13 @@ PR runs. Scheduled runs -------------- -Those runs are results of (nightly) triggered job - only for ``master`` branch. The +Those runs are results of the (nightly) triggered job - only for the ``main`` branch. The main purpose of the job is to check if there was no impact of external dependency changes on the Apache Airflow code (for example transitive dependencies released that fail the build). It also checks if the Docker images can be built from scratch (again - to see if some dependencies have not changed - for example downloaded package releases etc.). Another reason for the nightly build is that the build tags the most -recent master with ``nightly-master`` tag so that DockerHub build can pick up the moved tag and prepare a -nightly public master build in the DockerHub registry. The ``v1-10-test`` branch images are build in +recent main with ``nightly-main`` tag so that DockerHub build can pick up the moved tag and prepare a +nightly public main build in the DockerHub registry. 
The ``v1-10-test`` branch images are built in DockerHub when pushing ``v1-10-stable`` manually. All runs consist of the same jobs, but the jobs behave slightly differently or they are skipped in different @@ -603,13 +603,13 @@ repository, they are not executed in forks - we want to be nice to the contribut free build minutes on GitHub Actions. Sometimes (bugs in DockerHub or prolonged periods when the scheduled builds are failing) -the automated build for nightly master is not executed for a long time. Such builds can be manually +the automated build for nightly main is not executed for a long time. Such builds can be manually prepared and pushed by a maintainer who has the rights to push images to DockerHub (committers need to file a JIRA ticket to Apache Infra in order to get access). .. code-block:: bash - export BRANCH=master + export BRANCH=main export DOCKER_REPO=docker.io/apache/airflow for python_version in "3.6" "3.7" "3.8" ( @@ -747,12 +747,12 @@ Comments: (6) Nightly tag is pushed to the repository only in CRON job and only if all tests pass. This causes the DockerHub images to be built automatically and made available to developers. -Force sync master from apache/airflow +Force sync main from apache/airflow ------------------------------------- This is a manually triggered workflow (via GitHub UI manual run) that should only be run in GitHub forks. -When triggered, it will force-push the "apache/airflow" master to the fork's master. It's the easiest -way to sync your fork master to the Apache Airflow's one. +When triggered, it will force-push the "apache/airflow" main to the fork's main. It's the easiest +way to sync your fork's main to Apache Airflow's. Delete old artifacts -------------------- @@ -772,7 +772,7 @@ It is run for JavaScript and Python code. Publishing documentation ------------------------ -Documentation from the ``master`` branch is automatically published on Amazon S3. 
+Documentation from the ``main`` branch is automatically published on Amazon S3. To make this possible, GitHub Action has secrets set up with credentials for an Amazon Web Service account - ``DOCS_AWS_ACCESS_KEY_ID`` and ``DOCS_AWS_SECRET_ACCESS_KEY``. @@ -787,7 +787,7 @@ Naming conventions for stored images The images produced during the CI builds are stored in the `GitHub Registry `_ -The images are stored with both "latest" tag (for last master push image that passes all the tests as well +The images are stored with both the "latest" tag (for the last main push image that passes all the tests) as well as with the tags indicating the origin of the image. The image names follow the patterns: @@ -807,10 +807,10 @@ The image names follow the patterns: | | | | It contains only compiled libraries and minimal set of dependencies to run Airflow. | +--------------+----------------------------+--------------------------------+--------------------------------------------------------------------------------------------+ -* might be either "master" or "v1-10-test" or "v2-*-test" -* - Python version (Major + Minor). For "master" and "v2-*-test" should be in ["3.6", "3.7", "3.8"]. For +* might be either "main" or "v1-10-test" or "v2-*-test" +* - Python version (Major + Minor). For "main" and "v2-*-test" it should be in ["3.6", "3.7", "3.8"]. For v1-10-test it should be in ["2.7", "3.5", "3.6", "3.7", "3.8"]. -* - for images that get merged to "master", "v2-*-test" of "v1-10-test", or built as part of a +* - for images that get merged to "main", "v2-*-test" or "v1-10-test", or built as part of a pull request the images are tagged with the (full length) commit SHA of that particular branch. For pull requests the SHA used is the tip of the pull request branch. @@ -823,9 +823,9 @@ For example knowing that the CI build was for commit ``cd27124534b46c9688a1d89e7 ..
code-block:: bash - docker pull docker.pkg.github.com/apache/airflow/master-python3.6-ci:cd27124534b46c9688a1d89e75fcd137ab5137e3 + docker pull docker.pkg.github.com/apache/airflow/main-python3.6-ci:cd27124534b46c9688a1d89e75fcd137ab5137e3 - docker run -it docker.pkg.github.com/apache/airflow/master-python3.6-ci:cd27124534b46c9688a1d89e75fcd137ab5137e3 + docker run -it docker.pkg.github.com/apache/airflow/main-python3.6-ci:cd27124534b46c9688a1d89e75fcd137ab5137e3 But you usually need to pass more variables and complex setup if you want to connect to a database or @@ -878,7 +878,7 @@ In 2.0 line we currently support Python 3.6, 3.7, 3.8. In order to add a new version the following operations should be done (example uses Python 3.9) -* copy the latest constraints in ``constraints-master`` branch from previous versions and name it +* copy the latest constraints in ``constraints-main`` branch from previous versions and name it using the new Python version (``constraints-3.9.txt``). Commit and push * add the new Python version to `breeze-complete `_ and @@ -911,7 +911,7 @@ In order to add a new version the following operations should be done (example u +-------------+----------------+-----------------------+---------------------+---------------+-----------+---------------+------------------------------------------------------------------------+ | Source type | Source | Docker Tag | Dockerfile location | Build Context | Autobuild | Build caching | Comment | +=============+================+=======================+=====================+===============+===========+===============+========================================================================+ -| Tag | nightly-master | master-python3.9 | Dockerfile | / | x | - | Nightly CI/PROD images from successful scheduled master nightly builds | +| Tag | nightly-main | main-python3.9 | Dockerfile | / | x | - | Nightly CI/PROD images from successful scheduled main nightly builds | 
+-------------+----------------+-----------------------+---------------------+---------------+-----------+---------------+------------------------------------------------------------------------+ | Branch | v2-*-stable | v2-*-stable-python3.9 | Dockerfile | / | x | | CI/PROD images automatically built pushed stable branch | +-------------+----------------+-----------------------+---------------------+---------------+-----------+---------------+------------------------------------------------------------------------+ diff --git a/COMMITTERS.rst b/COMMITTERS.rst index facb8913efdac..d5883e1192679 100644 --- a/COMMITTERS.rst +++ b/COMMITTERS.rst @@ -22,7 +22,7 @@ Committers and PMC's This document assumes that you know how Airflow's community work, but you would like to learn more about the rules by which we add new members. -Before reading this document, you should be familiar with `Contributor's guide `__. +Before reading this document, you should be familiar with `Contributor's guide `__. Guidelines to become an Airflow Committer ------------------------------------------ @@ -49,7 +49,7 @@ General prerequisites that we look for in all candidates: 2. Visibility on discussions on the dev mailing list, Slack channels or GitHub issues/discussions 3. Contributions to community health and project's sustainability for the long-term 4. Understands contributor/committer guidelines: - `Contributors' Guide `__ + `Contributors' Guide `__ Code contribution diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst index 2aa18f952ef87..f2e02b86460c8 100644 --- a/CONTRIBUTING.rst +++ b/CONTRIBUTING.rst @@ -29,7 +29,7 @@ rules of that community. New Contributor --------------- -If you are a new contributor, please follow the `Contributors Quick Start `__ guide to get a gentle step-by-step introduction to setting up the development environment and making your first contribution. 
@@ -198,9 +198,9 @@ Step 2: Configure Your Environment ---------------------------------- You can use either a local virtual env or a Docker-based env. The differences -between the two are explained `here `_. +between the two are explained `here `_. -The local env's instructions can be found in full in the `LOCAL_VIRTUALENV.rst `_ file. +The local env's instructions can be found in full in the `LOCAL_VIRTUALENV.rst `_ file. The Docker env is here to maintain a consistent and common development environment so that you can replicate CI failures locally and work on solving them locally rather than by pushing to CI. You can configure the Docker-based Breeze development environment as follows: @@ -261,24 +261,24 @@ Step 4: Prepare PR * Read about `email configuration in Airflow `__. * Find the class you should modify. For the example GitHub issue, - this is `email.py `__. + this is `email.py `__. * Find the test class where you should add tests. For the example ticket, - this is `test_email.py `__. + this is `test_email.py `__. - * Make sure your fork's master is synced with Apache Airflow's master before you create a branch. See + * Make sure your fork's main is synced with Apache Airflow's main before you create a branch. See `How to sync your fork <#how-to-sync-your-fork>`_ for details. * Create a local branch for your development. Make sure to use the latest - ``apache/master`` as base for the branch. See `How to Rebase PR <#how-to-rebase-pr>`_ for some details + ``apache/main`` as base for the branch. See `How to Rebase PR <#how-to-rebase-pr>`_ for some details on setting up the ``apache`` remote. Note, some people develop their changes directly in their own - ``master`` branches - this is OK and you can make PR from your master to ``apache/master`` but we + ``main`` branches - this is OK and you can make a PR from your main to ``apache/main`` but we recommend always creating a local branch for your development.
This allows you to easily compare changes, have several changes that you work on at the same time and many more. - If you have ``apache`` set as remote then you can make sure that you have latest changes in your master - by ``git pull apache master`` when you are in the local ``master`` branch. If you have conflicts and - want to override your locally changed master you can override your local changes with - ``git fetch apache; git reset --hard apache/master``. + If you have ``apache`` set as remote then you can make sure that you have latest changes in your main + by ``git pull apache main`` when you are in the local ``main`` branch. If you have conflicts and + want to override your locally changed main you can override your local changes with + ``git fetch apache; git reset --hard apache/main``. * Modify the class and add necessary code and unit tests. @@ -395,11 +395,11 @@ these guidelines: Airflow Git Branches ==================== -All new development in Airflow happens in the ``master`` branch. All PRs should target that branch. +All new development in Airflow happens in the ``main`` branch. All PRs should target that branch. We also have a ``v2-*-test`` branches that are used to test ``2.*.x`` series of Airflow and where committers -cherry-pick selected commits from the master branch. +cherry-pick selected commits from the main branch. Cherry-picking is done with the ``-x`` flag. @@ -422,7 +422,7 @@ time when they converge. The production images are build in DockerHub from: -* master branch for development +* main branch for development * v2-*-test branches for testing 2.*.x release * ``2.*.*``, ``2.*.*rc*`` releases from the ``v2-*-stable`` branch when we prepare release candidates and final releases. There are no production images prepared from v2-*-stable branch. @@ -683,7 +683,7 @@ the providers are installed from PyPI, they provide the entry-point containing t in the previous chapter. 
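As an aside to the installation commands used throughout this guide: every ``--constraint`` URL follows one fixed pattern built from the constraints branch and the Python version. A minimal sketch (variable names are illustrative, not an official helper) that derives such a URL:

```shell
# Illustrative only: assemble the constraints URL used by the
# "pip install --constraint ..." examples in this guide.
constraints_branch="constraints-main"   # or e.g. constraints-2-0 for 2.0.* versions
python_version="3.6"
url="https://raw.githubusercontent.com/apache/airflow/${constraints_branch}/constraints-${python_version}.txt"
echo "pip install -e . --constraint \"${url}\""
```

Swapping ``constraints_branch`` or ``python_version`` yields the URL for any of the pinned-constraint variants mentioned later in this document.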
However when they are locally developed, together with Airflow, the mechanism of discovery of the providers is based on the ``provider.yaml`` file that is placed in the top-folder of the provider. As in the packaged case, the ``provider.yaml`` file is compliant with the -`json-schema specification `_. +`json-schema specification `_. Thanks to that mechanism, you can develop community managed providers in a seamless way directly from Airflow sources, without preparing and releasing them as packages. This is achieved by: @@ -804,7 +804,7 @@ There are several sets of constraints we keep: * "constraints-source-providers" - those are constraints generated by using providers installed from current sources. While adding new providers their dependencies might change, so this set of providers - is the current set of the constraints for airflow and providers from the current master sources. + is the current set of the constraints for airflow and providers from the current main sources. Those providers are used by the CI system to keep a "stable" set of constraints. They are named ``constraints-source-providers-.txt`` @@ -820,7 +820,7 @@ It can be done from the sources: .. code-block:: bash pip install -e . \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" or from the PyPI package: @@ -828,7 +828,7 @@ or from the PyPI package: .. code-block:: bash pip install apache-airflow \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" This also works with extras - for example: @@ -836,7 +836,7 @@ This also works with extras - for example: ..
code-block:: bash pip install .[ssh] \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" As of apache-airflow 1.10.12 it is also possible to use constraints directly from GitHub using specific @@ -857,7 +857,7 @@ If you want to update just airflow dependencies, without paying attention to pro .. code-block:: bash pip install . --upgrade \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-no-providers-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-no-providers-3.6.txt" The ``constraints-.txt`` and ``constraints-no-providers-.txt`` @@ -868,7 +868,7 @@ Manually generating constraint files ------------------------------------ The constraint files are generated automatically by the CI job. Sometimes, however, it is necessary to regenerate -them manually (committers only). For example when master build did not succeed for quite some time). +them manually (committers only), for example when the main build did not succeed for quite some time. This can be done by running the following (it utilizes parallel preparation of the constraints): .. code-block:: bash @@ -894,7 +894,7 @@ generated files: .. code-block:: bash - cd + cd git pull cp ${AIRFLOW_SOURCES}/files/constraints-*/constraints*.txt . git diff @@ -1222,14 +1222,14 @@ commands: How to sync your fork ===================== -When you have your fork, you should periodically synchronize the master of your fork with the -Apache Airflow master. In order to do that you can ``git pull --rebase`` to your local git repository from -apache remote and push the master (often with ``--force`` to your fork). There is also an easy -way using ``Force sync master from apache/airflow`` workflow.
You can go to "Actions" in your repository and +When you have your fork, you should periodically synchronize the main of your fork with the +Apache Airflow main. In order to do that you can ``git pull --rebase`` to your local git repository from +the apache remote and push main (often with ``--force``) to your fork. There is also an easy +way using the ``Force sync main from apache/airflow`` workflow. You can go to "Actions" in your repository and choose the workflow and manually trigger it using the "Run workflow" command. -This will force-push the master from apache/airflow to the master in your fork. Note that in case you -modified the master in your fork, you might loose those changes. +This will force-push the main from apache/airflow to the main in your fork. Note that in case you +modified the main in your fork, you might lose those changes. How to rebase PR @@ -1240,7 +1240,7 @@ providing a better alternative to the merge workflow. We've therefore written a As opposed to the merge workflow, the rebase workflow allows us to clearly separate your changes from the changes of others. It puts the responsibility of rebasing on the -author of the change. It also produces a "single-line" series of commits on the master branch. This +author of the change. It also produces a "single-line" series of commits on the main branch. This makes it easier to understand what was going on and to find reasons for problems (it is especially useful for "bisecting" when looking for a commit that introduced some bugs). @@ -1248,9 +1248,9 @@ First of all, we suggest you read about the rebase workflow here: `Merging vs. rebasing `_. This is an excellent article that describes all the ins/outs of the rebase workflow. I recommend keeping it for future reference.
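The ``git merge-base`` + ``git rebase HASH --onto`` flow that this section describes can be exercised end-to-end in a throwaway repository. The script below is a local toy demonstration (all names and the temporary repo are made up; a local ``main`` branch stands in for ``apache/main``), not a command to run against a real fork:

```shell
set -e
# Toy demonstration of the rebase workflow: find the fork point of a feature
# branch with merge-base, then transplant the feature commit onto latest main.
workdir=$(mktemp -d)
cd "$workdir"
git init -q repo
cd repo
git config user.email "dev@example.com"
git config user.name "Dev"
echo base > file.txt && git add . && git commit -qm "base commit"
git branch -M main
git checkout -qb my-branch
echo feature > feature.txt && git add . && git commit -qm "feature commit"
# Meanwhile, "upstream" (main) moves forward.
git checkout -q main
echo upstream > upstream.txt && git add . && git commit -qm "upstream commit"
# Find the fork point of my-branch relative to main...
HASH=$(git merge-base my-branch main)
# ...and rebase the feature commits on top of the latest main.
git checkout -q my-branch
git rebase -q "$HASH" --onto main
git log --format=%s   # "feature commit" now sits on top of "upstream commit"
```

After the rebase, the feature branch contains the upstream commit followed by the replayed feature commit - the "single-line" history the rebase workflow aims for.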
-The goal of rebasing your PR on top of ``apache/master`` is to "transplant" your change on top of +The goal of rebasing your PR on top of ``apache/main`` is to "transplant" your change on top of the latest changes that are merged by others. It also allows you to fix all the conflicts -that arise as a result of other people changing the same files as you and merging the changes to ``apache/master``. +that arise as a result of other people changing the same files as you and merging the changes to ``apache/main``. Here is how rebase looks in practice (you can find a summary below these detailed steps): @@ -1262,7 +1262,7 @@ as "apache" so you can refer to it easily: * If you use ssh: ``git remote add apache git@github.com:apache/airflow.git`` * If you use https: ``git remote add apache https://github.com/apache/airflow.git`` -2. You then need to make sure that you have the latest master fetched from the ``apache`` repository. You can do this +2. You then need to make sure that you have the latest main fetched from the ``apache`` repository. You can do this via: ``git fetch apache`` (to fetch apache remote) @@ -1272,7 +1272,7 @@ as "apache" so you can refer to it easily: 3. Assuming that your feature is in a branch in your repository called ``my-branch`` you can easily check what is the base commit you should rebase from by: - ``git merge-base my-branch apache/master`` + ``git merge-base my-branch apache/main`` This will print the HASH of the base commit which you should use to rebase your feature from. For example: ``5abce471e0690c6b8d06ca25685b0845c5fd270f``. Copy that HASH and go to the next step. @@ -1297,11 +1297,11 @@ as "apache" so you can refer to it easily: 5. Rebase: - ``git rebase HASH --onto apache/master`` + ``git rebase HASH --onto apache/main`` For example: - ``git rebase 5abce471e0690c6b8d06ca25685b0845c5fd270f --onto apache/master`` + ``git rebase 5abce471e0690c6b8d06ca25685b0845c5fd270f --onto apache/main`` 6. If you have no conflicts - that's cool. 
You rebased. You can now run ``git push --force-with-lease`` to push your changes to your repository. That should trigger the build in our CI if you have a @@ -1334,9 +1334,9 @@ Summary Useful when you understand the flow but don't remember the steps and want a quick reference. ``git fetch --all`` -``git merge-base my-branch apache/master`` +``git merge-base my-branch apache/main`` ``git checkout my-branch`` -``git rebase HASH --onto apache/master`` +``git rebase HASH --onto apache/main`` ``git push --force-with-lease`` How to communicate @@ -1373,7 +1373,7 @@ You can join the channels via links at the `Airflow Community page `_ as stated in `Contribution Workflow Example `_ +We don't create new issues on JIRA anymore. The reason we still look at JIRA issues is that there are valuable tickets inside of it. However, each new PR should be created on `GitHub issues `_ as stated in `Contribution Workflow Example `_ * The `Apache Airflow Slack `_ for: * ad-hoc questions related to development (#development channel) diff --git a/CONTRIBUTORS_QUICK_START.rst b/CONTRIBUTORS_QUICK_START.rst index dd8b0487d0b8b..92b3ff7eab0a9 100644 --- a/CONTRIBUTORS_QUICK_START.rst +++ b/CONTRIBUTORS_QUICK_START.rst @@ -300,8 +300,8 @@ Using Breeze Use CI image. - Branch name: master - Docker image: apache/airflow:master-python3.8-ci + Branch name: main + Docker image: apache/airflow:main-python3.8-ci Airflow source version: 2.0.0b2 Python version: 3.8 DockerHub user: apache @@ -408,7 +408,7 @@ For more information visit : |Breeze documentation| .. |Breeze documentation| raw:: html - Breeze documentation + Breeze documentation Following are some of important topics of Breeze documentation: @@ -417,7 +417,7 @@ Following are some of important topics of Breeze documentation: .. |Choosing different Breeze environment configuration| raw:: html - Choosing different Breeze environment configuration @@ -425,7 +425,7 @@ Following are some of important topics of Breeze documentation: .. 
|Troubleshooting Breeze environment| raw:: html - Troubleshooting + Troubleshooting Breeze environment @@ -433,7 +433,7 @@ Following are some of important topics of Breeze documentation: .. |Installing Additional tools to the Docker Image| raw:: html - Installing + Installing Additional tools to the Docker Image @@ -441,7 +441,7 @@ Following are some of important topics of Breeze documentation: .. |Internal details of Breeze| raw:: html - + Internal details of Breeze @@ -449,7 +449,7 @@ Following are some of important topics of Breeze documentation: .. |Breeze Command-Line Interface Reference| raw:: html - Breeze Command-Line Interface Reference @@ -457,7 +457,7 @@ Following are some of important topics of Breeze documentation: .. |Cleaning the environment| raw:: html - + Cleaning the environment @@ -465,7 +465,7 @@ Following are some of important topics of Breeze documentation: .. |Other uses of the Airflow Breeze environment| raw:: html - Other uses of the Airflow Breeze environment @@ -648,7 +648,7 @@ All Tests are inside ./tests directory. .. |TESTING.rst| raw:: html - TESTING.rst + TESTING.rst - Following are the some of important topics of TESTING.rst @@ -656,7 +656,7 @@ All Tests are inside ./tests directory. .. |Airflow Test Infrastructure| raw:: html - + Airflow Test Infrastructure @@ -664,7 +664,7 @@ All Tests are inside ./tests directory. .. |Airflow Unit Tests| raw:: html - Airflow Unit + Airflow Unit Tests @@ -672,7 +672,7 @@ All Tests are inside ./tests directory. .. |Helm Unit Tests| raw:: html - Helm Unit Tests + Helm Unit Tests @@ -680,7 +680,7 @@ All Tests are inside ./tests directory. .. |Airflow Integration Tests| raw:: html - + Airflow Integration Tests @@ -688,7 +688,7 @@ All Tests are inside ./tests directory. .. |Running Tests with Kubernetes| raw:: html - + Running Tests with Kubernetes @@ -696,7 +696,7 @@ All Tests are inside ./tests directory. .. 
|Airflow System Tests| raw:: html - Airflow + Airflow System Tests @@ -704,7 +704,7 @@ All Tests are inside ./tests directory. .. |Local and Remote Debugging in IDE| raw:: html - Local and Remote Debugging in IDE @@ -712,7 +712,7 @@ All Tests are inside ./tests directory. .. |BASH Unit Testing (BATS)| raw:: html - + BASH Unit Testing (BATS) @@ -846,7 +846,7 @@ To avoid burden on CI infrastructure and to save time, Pre-commit hooks can be r .. |STATIC_CODE_CHECKS.rst| raw:: html - + STATIC_CODE_CHECKS.rst - Following are some of the important links of STATIC_CODE_CHECKS.rst @@ -855,14 +855,14 @@ To avoid burden on CI infrastructure and to save time, Pre-commit hooks can be r .. |Pre-commit Hooks| raw:: html - + Pre-commit Hooks - |Pylint Static Code Checks| .. |Pylint Static Code Checks| raw:: html - Pylint Static Code Checks @@ -870,7 +870,7 @@ To avoid burden on CI infrastructure and to save time, Pre-commit hooks can be r .. |Running Static Code Checks via Breeze| raw:: html - Running Static Code Checks via Breeze @@ -884,7 +884,7 @@ Contribution guide .. |CONTRIBUTING.rst| raw:: html - CONTRIBUTING.rst + CONTRIBUTING.rst - Following are some of important links of CONTRIBUTING.rst @@ -892,7 +892,7 @@ Contribution guide .. |Types of contributions| raw:: html - + Types of contributions @@ -900,7 +900,7 @@ Contribution guide .. |Roles of contributor| raw:: html - Roles of + Roles of contributor @@ -908,7 +908,7 @@ Contribution guide .. |Workflow for a contribution| raw:: html - + Workflow for a contribution @@ -949,7 +949,7 @@ Syncing Fork and rebasing Pull request Often it takes several days or weeks to discuss and iterate with the PR until it is ready to merge. In the meantime new commits are merged, and you might run into conflicts, therefore you should periodically -synchronize master in your fork with the ``apache/airflow`` master and rebase your PR on top of it. 
Following +synchronize main in your fork with the ``apache/airflow`` main and rebase your PR on top of it. Following describes how to do it. @@ -957,7 +957,7 @@ describes how to do it. .. |Syncing fork| raw:: html - + Update new changes made to apache:airflow project to your fork @@ -965,5 +965,5 @@ describes how to do it. .. |Rebasing pull request| raw:: html - + Rebasing pull request diff --git a/Dockerfile b/Dockerfile index ca15184765940..35a78fa179b99 100644 --- a/Dockerfile +++ b/Dockerfile @@ -133,7 +133,7 @@ RUN mkdir -pv /usr/share/man/man1 \ ARG INSTALL_MYSQL_CLIENT="true" ARG AIRFLOW_REPO=apache/airflow -ARG AIRFLOW_BRANCH=master +ARG AIRFLOW_BRANCH=main ARG AIRFLOW_EXTRAS ARG ADDITIONAL_AIRFLOW_EXTRAS="" # Allows to override constraints source @@ -141,7 +141,7 @@ ARG CONSTRAINTS_GITHUB_REPOSITORY="apache/airflow" ARG AIRFLOW_CONSTRAINTS="constraints" ARG AIRFLOW_CONSTRAINTS_REFERENCE="" ARG AIRFLOW_CONSTRAINTS_LOCATION="" -ARG DEFAULT_CONSTRAINTS_BRANCH="constraints-master" +ARG DEFAULT_CONSTRAINTS_BRANCH="constraints-main" ARG AIRFLOW_PIP_VERSION # By default PIP has progress bar but you can disable it. ARG PIP_PROGRESS_BAR @@ -206,7 +206,7 @@ ENV AIRFLOW_PRE_CACHED_PIP_PACKAGES=${AIRFLOW_PRE_CACHED_PIP_PACKAGES} \ AIRFLOW_SOURCES_FROM=${AIRFLOW_SOURCES_FROM} \ AIRFLOW_SOURCES_TO=${AIRFLOW_SOURCES_TO} -# In case of Production build image segment we want to pre-install master version of airflow +# In case of Production build image segment we want to pre-install main version of airflow # dependencies from GitHub so that we do not have to always reinstall it from the scratch. 
# The Airflow (and providers in case INSTALL_PROVIDERS_FROM_SOURCES is "false") # are uninstalled, only dependencies remain diff --git a/Dockerfile.ci b/Dockerfile.ci index 14e4294c6b029..41be4787f8e37 100644 --- a/Dockerfile.ci +++ b/Dockerfile.ci @@ -201,7 +201,7 @@ RUN curl -sSL https://github.com/bats-core/bats-core/archive/v${BATS_VERSION}.ta && tar -zxf /tmp/bats-file.tgz -C /opt/bats/lib/bats-file --strip 1 && rm -rf /tmp/* ARG AIRFLOW_REPO=apache/airflow -ARG AIRFLOW_BRANCH=master +ARG AIRFLOW_BRANCH=main # Airflow Extras installed ARG AIRFLOW_EXTRAS="all" ARG ADDITIONAL_AIRFLOW_EXTRAS="" @@ -210,7 +210,7 @@ ARG CONSTRAINTS_GITHUB_REPOSITORY="apache/airflow" ARG AIRFLOW_CONSTRAINTS="constraints" ARG AIRFLOW_CONSTRAINTS_REFERENCE="" ARG AIRFLOW_CONSTRAINTS_LOCATION="" -ARG DEFAULT_CONSTRAINTS_BRANCH="constraints-master" +ARG DEFAULT_CONSTRAINTS_BRANCH="constraints-main" # By changing the CI build epoch we can force reinstalling Airflow and pip all dependencies # It can also be overwritten manually by setting the AIRFLOW_CI_BUILD_EPOCH environment variable. ARG AIRFLOW_CI_BUILD_EPOCH="3" @@ -273,7 +273,7 @@ ARG UPGRADE_TO_NEWER_DEPENDENCIES="false" ENV EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS=${EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS} \ UPGRADE_TO_NEWER_DEPENDENCIES=${UPGRADE_TO_NEWER_DEPENDENCIES} -# In case of CI builds we want to pre-install master version of airflow dependencies so that +# In case of CI builds we want to pre-install main version of airflow dependencies so that # We do not have to always reinstall it from the scratch. # And is automatically reinstalled from the scratch every time patch release of python gets released # The Airflow (and providers in case INSTALL_PROVIDERS_FROM_SOURCES is "false") diff --git a/IMAGES.rst b/IMAGES.rst index 9e9e469853de7..6b4ca3117db92 100644 --- a/IMAGES.rst +++ b/IMAGES.rst @@ -76,8 +76,8 @@ And for production images with ``latest`` tag: where: -* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. 
Examples: ``master``, - ``v2-1-test``, ``2.1.0``. The ``master``, ``v2-*-test`` labels are +* ``BRANCH_OR_TAG`` - branch or tag used when creating the image. Examples: ``main``, + ``v2-1-test``, ``2.1.0``. The ``main``, ``v2-*-test`` labels are built from branches so they change over time. The ``2.*.*`` labels are built from git tags and they are "fixed" once built. * ``PYTHON_MAJOR_MINOR_VERSION`` - version of Python used to build the image. Examples: ``3.6``, ``3.7``, @@ -193,13 +193,13 @@ This will build the image using command similar to: You can also build production images from specific Git version via providing ``--install-airflow-reference`` -parameter to Breeze (this time constraints are taken from the ``constraints-master`` branch which is the +parameter to Breeze (this time constraints are taken from the ``constraints-main`` branch which is the HEAD of development for constraints): .. code-block:: bash pip install "https://github.com/apache/airflow/archive/.tar.gz#egg=apache-airflow" \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" You can also skip installing airflow and install it from locally provided files by using ``--install-from-docker-context-files`` parameter and ``--disable-pypi-when-building`` to Breeze: @@ -292,10 +292,10 @@ For example: .. 
code-block:: bash - apache/airflow-ci:master-python3.6 - production "master" image from current master - apache/airflow-ci:master-python3.6-ci - CI "master" image from current master - apache/airflow-ci:v2-1-test-python3.6-ci - CI "master" image from current v2-1-test branch - apache/airflow:python3.6-master - base Python image for the master branch + apache/airflow-ci:main-python3.6 - production "main" image from current main + apache/airflow-ci:main-python3.6-ci - CI "main" image from current main + apache/airflow-ci:v2-1-test-python3.6-ci - CI image from the current v2-1-test branch + apache/airflow:python3.6-main - base Python image for the main branch You can see those CI DockerHub images at ``_ @@ -327,7 +327,7 @@ By default DockerHub registry is used when you push or pull such images. However for CI builds we keep the images in the GitHub registry as well - this way we can easily push the images automatically after merge requests and use such images for Pull Requests as cache - which makes it much faster for CI builds (images are available in the cache -right after merged request in master finishes it's build), The difference is visible especially if +right after a merged request in main finishes its build). The difference is especially visible if significant changes are done in the Dockerfile.CI. The images are named differently (in Docker definition of image names - registry URL is part of the @@ -355,7 +355,7 @@ Images with a commit SHA (built for pull requests and pushes) docker.pkg.github.com/apache-airflow/-pythonX.Y-build-v2: - for production build stage docker.pkg.github.com/apache-airflow/python-v2:X.Y-slim-buster- - for base Python images -Latest images (pushed when master merge succeeds): +Latest images (pushed when main merge succeeds): ..
code-block:: bash @@ -377,7 +377,7 @@ Images with a commit SHA (built for pull requests and pushes) ghcr.io/apache/airflow--pythonX.Y-build-v2: - for production build stage ghcr.io/apache/airflow-python-v2:X.Y-slim-buster- - for base Python images -Latest images (pushed when master merge succeeds): +Latest images (pushed when main merge succeeds): .. code-block:: bash @@ -565,7 +565,7 @@ The following build arguments (``--build-arg`` in docker build command) can be u | ``AIRFLOW_REPO`` | ``apache/airflow`` | the repository from which PIP | | | | dependencies are pre-installed | +------------------------------------------+------------------------------------------+------------------------------------------+ -| ``AIRFLOW_BRANCH`` | ``master`` | the branch from which PIP dependencies | +| ``AIRFLOW_BRANCH`` | ``main`` | the branch from which PIP dependencies | | | | are pre-installed | +------------------------------------------+------------------------------------------+------------------------------------------+ | ``AIRFLOW_CI_BUILD_EPOCH`` | ``1`` | increasing this value will reinstall PIP | @@ -590,7 +590,7 @@ The following build arguments (``--build-arg`` in docker build command) can be u | ``AIRFLOW_CONSTRAINTS_REFERENCE`` | | reference (branch or tag) from GitHub | | | | repository from which constraints are | | | | used. By default it is set to | -| | | ``constraints-master`` but can be | +| | | ``constraints-main`` but can be | | | | ``constraints-2-0`` for 2.0.* versions | | | | or it could point to specific version | | | | for example ``constraints-2.0.0`` | @@ -714,12 +714,12 @@ way of querying image details via API. You really need to download the image to We workaround it in the way that always when we build the image we build a very small image manifest containing randomly generated UUID and push it to registry together with the main CI image. The tag for the manifest image reflects the image it refers to with added ``-manifest`` suffix. 
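The comparison that the manifest mechanism relies on can be sketched in a few lines. This is a simplified illustration only (the UUID values and messages are made up, and the real CI code does considerably more, including pulling the tiny manifest image):

```shell
# Simplified illustration of the manifest-UUID check described above:
# if the UUID embedded in the remote manifest differs from the one baked
# into the local image, the local image is stale and should be refreshed.
local_uuid="11111111-aaaa-4bbb-8ccc-000000000001"
remote_uuid="22222222-dddd-4eee-8fff-000000000002"
if [ "$local_uuid" != "$remote_uuid" ]; then
  echo "Image changed remotely - rebase to latest main and pull the image."
else
  echo "Local image is up to date."
fi
```

Because only the tiny manifest image has to be pulled to obtain ``remote_uuid``, this check is far cheaper than comparing the full CI images themselves.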
-The manifest image for ``apache/airflow:master-python3.6-ci`` is named -``apache/airflow:master-python3.6-ci-manifest``. +The manifest image for ``apache/airflow:main-python3.6-ci`` is named +``apache/airflow:main-python3.6-ci-manifest``. The image is quickly pulled (it is really, really small) when important files change and the content of the randomly generated UUID is compared with the one in our image. If the contents are different -this means that the user should rebase to latest master and rebuild the image with pulling the image from +this means that the user should rebase to latest main and rebuild the image with pulling the image from the repo as this will likely be faster than rebuilding the image locally. The random UUID is generated right after pre-cached pip install is run - and usually it means that diff --git a/INSTALL b/INSTALL index 50ac1500fd638..ced87c9460560 100644 --- a/INSTALL +++ b/INSTALL @@ -40,21 +40,21 @@ python setup.py install # There are different constraint files for different python versions. For example" pip install . \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" By default `pip install` in Airflow 2.0 installs only the provider packages that are needed by the extras and install them as packages from PyPI rather than from local sources: pip install .[google,amazon] \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" You can upgrade just airflow, without paying attention to provider's dependencies by using 'no-providers' constraint files. This allows you to keep installed provider packages. pip install . 
--upgrade \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-no-providers-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-no-providers-3.6.txt" You can also install airflow in "editable mode" (with -e) flag and then provider packages are @@ -69,7 +69,7 @@ and in ``CONTRIBUTING.rst`` for developing community maintained providers. This is useful if you want to develop providers: pip install -e . \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" You can also skip installing provider packages from PyPI by setting INSTALL_PROVIDERS_FROM_SOURCE to "true". In this case Airflow will be installed in non-editable mode with all providers installed from the sources. @@ -77,13 +77,13 @@ Additionally `provider.yaml` files will also be copied to providers folders whic discoverable by Airflow even if they are not installed from packages in this case. INSTALL_PROVIDERS_FROM_SOURCES="true" pip install . 
\ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" Airflow can be installed with extras to install some additional features (for example 'async' or 'doc' or to install automatically providers and all dependencies needed by that provider: pip install .[async,google,amazon] \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" The list of available extras: diff --git a/LOCAL_VIRTUALENV.rst b/LOCAL_VIRTUALENV.rst index 15f828d462791..f97f89a821cf9 100644 --- a/LOCAL_VIRTUALENV.rst +++ b/LOCAL_VIRTUALENV.rst @@ -151,7 +151,7 @@ for different python versions: .. code-block:: bash pip install -e ".[devel,]" \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" This will install Airflow in 'editable' mode - where sources of Airflow are taken directly from the source @@ -164,7 +164,7 @@ You can also install Airflow in non-editable mode: .. code-block:: bash pip install ".[devel,]" \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" This will copy the sources to directory where usually python packages are installed. You can see the list of directories via ``python -m site`` command. In this case the providers are installed from PyPI, not from @@ -173,7 +173,7 @@ sources, unless you set ``INSTALL_PROVIDERS_FROM_SOURCES`` environment variable .. 
code-block:: bash INSTALL_PROVIDERS_FROM_SOURCES="true" pip install ".[devel,]" \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" Note: when you first initialize database (the next step), you may encounter some problems. @@ -231,7 +231,7 @@ before running ``pip install`` command: .. code-block:: bash INSTALL_PROVIDERS_FROM_SOURCES="true" pip install -U -e ".[devel,]" \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt" This way no providers packages will be installed and they will always be imported from the "airflow/providers" folder. diff --git a/PULL_REQUEST_WORKFLOW.rst b/PULL_REQUEST_WORKFLOW.rst index 37444bafc77e2..96cc5b3f09514 100644 --- a/PULL_REQUEST_WORKFLOW.rst +++ b/PULL_REQUEST_WORKFLOW.rst @@ -50,7 +50,7 @@ We approached the problem by: the builds can complete in < 2 minutes) but also by limiting the number of tests executed in PRs that do not touch the "core" of Airflow, or only touching some - standalone - parts of Airflow such as "Providers", "WWW" or "CLI". This solution is not yet perfect as there are likely some edge cases but - it is easy to maintain and we have an escape-hatch - all the tests are always executed in master pushes, + it is easy to maintain and we have an escape-hatch - all the tests are always executed in main pushes, so contributors can easily spot if there is a "missed" case and fix it - both by fixing the problem and adding those exceptions to the code. More about it can be found in the `Selective CI checks <#selective-ci-checks>`_ chapter. 
@@ -126,7 +126,7 @@ The logic implemented for the changes works as follows: 1) In case of direct push (so when PR gets merged) or scheduled run, we always run all tests and checks. This is in order to make sure that the merge did not miss anything important. The remainder of the logic is executed only in case of Pull Requests. We do not add providers tests in case DEFAULT_BRANCH is - different than master, because providers are only important in master branch and PRs to master branch. + different than main, because providers are only important in main branch and PRs to main branch. 2) We retrieve which files have changed in the incoming Merge Commit (github.sha is a merge commit automatically prepared by GitHub in case of Pull Request, so we can retrieve the list of changed @@ -135,8 +135,8 @@ The logic implemented for the changes works as follows: 3) If any of the important, environment files changed (Dockerfile, ci scripts, setup.py, GitHub workflow files), then we again run all tests and checks. Those are cases where the logic of the checks changed or the environment for the checks changed so we want to make sure to check everything. We do not add - providers tests in case DEFAULT_BRANCH is different than master, because providers are only - important in master branch and PRs to master branch. + providers tests in case DEFAULT_BRANCH is different than main, because providers are only + important in main branch and PRs to main branch. 4) If any of py files changed: we need to have CI image and run full static checks so we enable image building @@ -160,7 +160,7 @@ The logic implemented for the changes works as follows: b) if any of the Airflow API files changed we enable ``API`` test type c) if any of the Airflow CLI files changed we enable ``CLI`` test type and Kubernetes tests (the K8S tests depend on CLI changes as helm chart uses CLI to run Airflow). 
- d) if this is a master branch and if any of the Provider files changed we enable ``Providers`` test type + d) if this is a main branch and if any of the Provider files changed we enable ``Providers`` test type e) if any of the WWW files changed we enable ``WWW`` test type f) if any of the Kubernetes files changed we enable ``Kubernetes`` test type g) Then we subtract count of all the ``specific`` above per-type changed files from the count of @@ -184,7 +184,7 @@ The logic implemented for the changes works as follows: Similarly to selective tests we also run selective security scans. In Pull requests, the Python scan will only run when there is a python code change and JavaScript scan will only run if -there is a JavaScript or yarn.lock file change. For master builds, all scans are always executed. +there is a JavaScript or yarn.lock file change. For main builds, all scans are always executed. The selective check algorithm is shown here: diff --git a/README.md b/README.md index d0d2e173a0fa8..15d97e3df0d1e 100644 --- a/README.md +++ b/README.md @@ -21,7 +21,7 @@ [![PyPI version](https://badge.fury.io/py/apache-airflow.svg)](https://badge.fury.io/py/apache-airflow) [![GitHub Build](https://github.com/apache/airflow/workflows/CI%20Build/badge.svg)](https://github.com/apache/airflow/actions) -[![Coverage Status](https://img.shields.io/codecov/c/github/apache/airflow/master.svg)](https://codecov.io/github/apache/airflow?branch=master) +[![Coverage Status](https://img.shields.io/codecov/c/github/apache/airflow/main.svg)](https://codecov.io/github/apache/airflow?branch=main) [![License](https://img.shields.io/:license-Apache%202-blue.svg)](https://www.apache.org/licenses/LICENSE-2.0.txt) [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/apache-airflow.svg)](https://pypi.org/project/apache-airflow/) [![Docker Pulls](https://img.shields.io/docker/pulls/apache/airflow.svg)](https://hub.docker.com/r/apache/airflow) @@ -97,7 +97,7 @@ We **highly** recommend 
upgrading to the latest Airflow major release at the ear Apache Airflow is tested with: -| | Master version (dev) | Stable version (2.0.2) | Previous version (1.10.15) | +| | Main version (dev) | Stable version (2.0.2) | Previous version (1.10.15) | | -------------------- | ------------------------- | ------------------------ | ------------------------- | | Python | 3.6, 3.7, 3.8 | 3.6, 3.7, 3.8 | 2.7, 3.5, 3.6, 3.7, 3.8 | | Kubernetes | 1.20, 1.19, 1.18 | 1.20, 1.19, 1.18 | 1.18, 1.17, 1.16 | @@ -121,9 +121,9 @@ They are based on the official release schedule of Python and Kubernetes, nicely [Kubernetes version skew policy](https://kubernetes.io/docs/setup/release/version-skew-policy/). 1. We drop support for Python and Kubernetes versions when they reach EOL. We drop support for those - EOL versions in master right after EOL date, and it is effectively removed when we release the + EOL versions in main right after EOL date, and it is effectively removed when we release the first new MINOR (Or MAJOR if there is no new MINOR version) of Airflow - For example for Python 3.6 it means that we drop support in master right after 23.12.2021, and the first + For example for Python 3.6 it means that we drop support in main right after 23.12.2021, and the first MAJOR or MINOR version of Airflow released after will not have it. 2. The "oldest" supported version of Python/Kubernetes is the default one. "Default" is only meaningful @@ -132,7 +132,7 @@ They are based on the official release schedule of Python and Kubernetes, nicely are both Python 3.6 images, however the first MINOR/MAJOR release of Airflow release after 23.12.2021 will become Python 3.7 images. -3. We support a new version of Python/Kubernetes in master after they are officially released, as soon as we +3. 
We support a new version of Python/Kubernetes in main after they are officially released, as soon as we make them work in our CI pipeline (which might not be immediate due to dependencies catching up with new versions of Python mostly) we release a new images/support in Airflow based on the working CI setup. @@ -148,7 +148,7 @@ Visit the official Airflow website documentation (latest **stable** release) for [getting started](https://airflow.apache.org/docs/apache-airflow/stable/start/index.html), or walking through a more complete [tutorial](https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html). -> Note: If you're looking for documentation for master branch (latest development branch): you can find it on [s.apache.org/airflow-docs](https://s.apache.org/airflow-docs/). +> Note: If you're looking for documentation for main branch (latest development branch): you can find it on [s.apache.org/airflow-docs](https://s.apache.org/airflow-docs/). For more information on Airflow Improvement Proposals (AIPs), visit the [Airflow Wiki](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals). @@ -165,7 +165,7 @@ if needed. This means that from time to time plain `pip install apache-airflow` produce unusable Airflow installation. In order to have repeatable installation, however, we also keep a set of "known-to-be-working" constraint -files in the orphan `constraints-master`, `constraints-2-0` branches. We keep those "known-to-be-working" +files in the orphan `constraints-main`, `constraints-2-0` branches. We keep those "known-to-be-working" constraints files separately per major/minor Python version. You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify correct Airflow tag/version/branch and Python versions in the URL. @@ -265,12 +265,12 @@ following the ASF Policy. ## Contributing -Want to help build Apache Airflow? 
Check out our [contributing documentation](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst). +Want to help build Apache Airflow? Check out our [contributing documentation](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst). ## Who uses Apache Airflow? More than 400 organizations are using Apache Airflow -[in the wild](https://github.com/apache/airflow/blob/master/INTHEWILD.md). +[in the wild](https://github.com/apache/airflow/blob/main/INTHEWILD.md). ## Who Maintains Apache Airflow? @@ -278,7 +278,7 @@ Airflow is the work of the [community](https://github.com/apache/airflow/graphs/ but the [core committers/maintainers](https://people.apache.org/committers-by-project.html#airflow) are responsible for reviewing and merging PRs as well as steering conversation around new feature requests. If you would like to become a maintainer, please review the Apache Airflow -[committer requirements](https://github.com/apache/airflow/blob/master/COMMITTERS.rst#guidelines-to-become-an-airflow-committer). +[committer requirements](https://github.com/apache/airflow/blob/main/COMMITTERS.rst#guidelines-to-become-an-airflow-committer). ## Can I use the Apache Airflow logo in my presentation? diff --git a/TESTING.rst b/TESTING.rst index 81b857908939f..54f433a1beb36 100644 --- a/TESTING.rst +++ b/TESTING.rst @@ -595,7 +595,7 @@ Deploying Airflow to the Kubernetes cluster created is also done via ``kind-clus The deploy command performs those steps: -1. It rebuilds the latest ``apache/airflow:master-pythonX.Y`` production images using the +1. It rebuilds the latest ``apache/airflow:main-pythonX.Y`` production images using the latest sources using local caching. It also adds example DAGs to the image, so that they do not have to be mounted inside. 2. Loads the image to the Kind Cluster using the ``kind load`` command. @@ -715,8 +715,8 @@ The typical session for tests with Kubernetes looks like follows: Use CI image. 
- Branch name: master - Docker image: apache/airflow:master-python3.7-ci + Branch name: main + Docker image: apache/airflow:main-python3.7-ci Airflow source version: 2.0.0.dev0 Python version: 3.7 @@ -757,8 +757,8 @@ The typical session for tests with Kubernetes looks like follows: Use CI image. - Branch name: master - Docker image: apache/airflow:master-python3.7-ci + Branch name: main + Docker image: apache/airflow:main-python3.7-ci Airflow source version: 2.0.0.dev0 Python version: 3.7 @@ -1300,7 +1300,7 @@ By default ``/files/dags`` folder is mounted from your local `` the directory used by airflow scheduler and webserver to scan dags for. You can place your dags there to test them. -The DAGs can be run in the master version of Airflow but they also work +The DAGs can be run in the main version of Airflow but they also work with older versions. To run the tests for Airflow 1.10.* series, you need to run Breeze with diff --git a/UPDATING.md b/UPDATING.md index 6fb1bf5a53322..9b658c7e9b89a 100644 --- a/UPDATING.md +++ b/UPDATING.md @@ -26,7 +26,7 @@ assists users migrating to a new version. **Table of contents** -- [Master](#master) +- [Main](#main) - [Airflow 2.1.0](#airflow-210) - [Airflow 2.0.2](#airflow-202) - [Airflow 2.0.1](#airflow-201) @@ -54,7 +54,7 @@ assists users migrating to a new version. -## Master +## Main **Table of contents** -- [Master](#master) +- [Main](#main) -## Master +## Main
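
Every hunk in this patch applies the same mechanical substitution: `master` becomes `main` in branch names, CI image tags, badge URLs, and constraint-file URLs. As a rough sketch of that rename (the `rewrite` helper and its pattern list are illustrative assumptions, not the tooling actually used to prepare this PR), the recurring cases can be covered with `sed`:

```shell
#!/usr/bin/env bash
# Illustrative sketch only: rewrite the master-based references changed in
# this patch to their main-based equivalents. Each pattern is anchored to a
# known prefix so unrelated occurrences of the word "master" are left alone.
rewrite() {
  sed -e 's|blob/master|blob/main|g' \
      -e 's|constraints-master|constraints-main|g' \
      -e 's|airflow:master-|airflow:main-|g' \
      -e 's|branch=master|branch=main|g' "$@"
}

# Example: one of the constraint URLs updated in the INSTALL hunks above.
echo 'https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt' \
  | rewrite
# prints https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.6.txt
```

Scoping each pattern to a prefix such as `constraints-` or `blob/` is what keeps a bulk rename like this safe: a bare `s/master/main/g` would also rewrite unrelated text (changelog history, words containing "master"), which is why every hunk above touches only branch references, tags, and URLs.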