The Common-CI project contains a guideline for the creation of continuous integration scripts and describes the general approach to continuous integration within 51Degrees. This readme should provide a comprehensive overview of the rules and conventions to be expected from existing jobs and which should be followed when new jobs are created.
NOTE: A Release Engineer following this guide should first check the `internal-ci` readme to understand the 51Degrees release process.
- Common-CI
  - Table of contents
  - Reasoning
  - Continuous integration
  - Continuous deployment
  - License
In order to maintain high standards of development work and have a clear indication of successful build, test and package creation, a common set of rules should be followed to measure quality in a consistent manner across all of the projects. The main reason for having continuous integration in 51Degrees is to assure the best possible quality of software products by confirming successful execution of unit, functional, regression and example tests whenever a change to the code base is made. Apart from the code-related tests, other measures prove the quality of software development through verification of successful execution of the build and test processes on all supported platforms and architectures.
The purpose of this document is to describe the technical solutions used for continuous integration in 51Degrees, as well as to provide clear guidance on common rules across:
- Naming conventions;
- Compulsory elements of CI scripts;
- Platforms and environments;
- Requirements for additional documentation;
This section describes the general approach to continuous integration in 51Degrees.
As its internal repository management system, 51Degrees uses Azure DevOps services, and continuous integration is achieved through Azure DevOps Pipelines. Each pipeline is defined by one or more `yml` scripts. High maintainability of continuous integration is achieved by keeping tasks shared between jobs in separate `yml` scripts and reusing them where possible to avoid code duplication and "copy & paste" errors.
51Degrees is using continuous software development practices described in principle as Gitflow Workflow.
At least two main continuous integration jobs should be provided for each software project/repository:
- “Build and test”, and
- “Create packages”
Binaries built by continuous integration should be configured to perform a release build by default. If a debug build configuration is required, additional, explicit jobs should be created to clearly indicate that the pipeline output will be in debug mode.
The build and test job should be used for the general purpose build and test process, and should be the initial step of "Build, test and publish". Continuous integration should be configured to automatically trigger this type of job whenever a pull request is created, regardless of the destination branch. The job should be automatically performed whenever any code change is made to the active pull request.
The build and test job provides the tasks required for the project to build and run unit and regression tests. This job usually runs a sequence of tasks:

- Configuration: This task (or tasks) configures the environment to meet the build requirements. It should install all dependencies and platform specific packages required for the build and test processes.
- Code checkout: A task to check out the source code from the version control system. 51Degrees uses Git repositories hosted on the Azure DevOps platform: `git clone` with, where required, submodule initialisation (`git submodule update --init --recursive`) should be used.
- Build: Language and project specific build tool execution.
- Test (and publish the results): Language and project specific unit, functional, example, or regression test execution.
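As a sketch under these assumptions, a minimal "Build and test" `yml` could look like the following. The .NET commands, display names and configuration value are illustrative placeholders, not taken from an actual 51Degrees pipeline; note also that in Azure DevOps the checkout step conventionally runs first:

```yaml
# Illustrative "build and test" sketch; commands and names are placeholders.
steps:
# Code checkout, including submodule initialisation.
- checkout: self
  submodules: recursive

# Configuration: install dependencies required by the build and tests.
- script: dotnet restore
  displayName: 'Configure - restore dependencies'

# Build: language and project specific build tool execution (release by default).
- script: dotnet build --configuration Release --no-restore
  displayName: 'Build'

# Test: run the unit and regression tests and publish the results.
- script: dotnet test --configuration Release --no-build
  displayName: 'Test'
```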
The set of tasks may differ between projects where an individual approach is required for language or platform specific solutions. If an individual solution is in place, it should be documented in the `ci/readme.md` file of the given project.
Job must indicate a fail state if any of the following occurs:
- Configuration step fails on installation of any of the dependencies
- Code checkout step fails regardless of the reason
- Build step fails with error or warning - all warnings should be treated as errors
- Any test fails
If multiple operating system platforms should be supported according to the tested versions table, the "Build and test" job should either:

- implement support for each operating system in a single `yml` file, or
- implement support for each operating system in a separate `yml` file and create a combining `yml` script.
The general guideline for selecting the approach is to keep the `yml` file at a consumable size; if environment configuration, build, test, and any platform specific tasks sum up to more than 4 tasks, create a separate `yml` file. Try to use a multi-platform matrix configuration whenever possible; more details can be found in the Microsoft documentation.
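For illustration, a multi-platform matrix might be declared as follows; the job name and image names are assumptions based on standard Microsoft-hosted agent naming:

```yaml
# Run the same build and test steps once per operating system.
jobs:
- job: build_and_test
  strategy:
    matrix:
      linux:
        imageName: 'ubuntu-latest'
      macos:
        imageName: 'macOS-latest'
      windows:
        imageName: 'windows-latest'
  pool:
    vmImage: $(imageName)
```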
Note: The build and test job should be configured in a separate `yml` file to allow the set of tasks defined in this job to be performed as a part of the "Build, test and publish" job.
The create packages job should be used for the creation of packages or the tagging of the repository, and the continuous integration system should be configured to automatically execute this job whenever a pull request from a `release` or `hotfix` branch is merged to the `main` branch (as described in the gitflow workflow).
Create packages job performs any tasks required for creation of packages and/or repository version tag. This job usually runs a sequence of tasks which differ for creating the packages and tagging the repository.
Typical tasks for package creation:

- Package creation: Language and project specific task generating the packages for the given language and/or platform. This task should be documented in the project specific `ci/readme.md` file.
- Digital signing: This task should digitally sign the generated binaries or packages to assure a high level of quality and trust for the end user.
- Publish artifacts: Packages or binaries produced by the Build, test and publish job should be published as artifacts of the Azure DevOps Pipeline execution. This task is important to support a smooth release process, where the product of this step is used as the final release package.
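For example, the artifact publishing step might be sketched as follows; the staging path and artifact name are placeholders:

```yaml
# Publish generated packages as pipeline artifacts so the release process can
# collect them later as the final release packages.
- task: PublishBuildArtifacts@1
  displayName: 'Publish packages'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'packages'
```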
Typical tasks for creating a repository tag:

- Determine the repository version number: This step should determine the version number to be used for repository tagging. 51Degrees uses the GitVersion Azure DevOps plugin to identify the repository version based on the gitflow workflow.
- Tag the repository: Perform a `git tag` operation on the repository using the version number determined in the previous step and `push` the newly created tag to the remote.
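The tagging tasks above can be sketched with plain Git commands. In the sketch below, a throwaway repository with a local bare "origin" is set up purely so the commands are runnable end to end; in a real pipeline the version would come from the GitVersion step rather than a literal value:

```shell
set -e
# Illustration only: create a throwaway repository with a local bare "origin"
# so the tagging commands can be demonstrated end to end.
WORK=$(mktemp -d)
git init --bare --quiet "$WORK/origin.git"
git clone --quiet "$WORK/origin.git" "$WORK/repo"
cd "$WORK/repo"
git config user.email "ci@example.com"
git config user.name "ci"
git commit --quiet --allow-empty -m "initial commit"
git push --quiet origin HEAD

# VERSION would normally be produced by the GitVersion step earlier in the
# pipeline; a literal value is used here for illustration.
VERSION="4.3.1"
git tag -a "v$VERSION" -m "Release v$VERSION"   # create an annotated tag
git push --quiet origin "v$VERSION"             # push the new tag to the remote
```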
Job must indicate a fail state if any of the following occurs:
- Package creation fails
- Digital signature process fails
- Artifacts cannot be found or published
There are two main jobs per pipeline: `build and test` and `create packages`. The common naming convention is as follows:

- For the "build and test" job: `<package-name>-test`, where `<package-name>` represents the dash-separated repository name; for example, for repository `pipeline-python`, the "build and test" job name should be configured as `pipeline-python-test`.
- For the "create packages" job when packages are created: `<package-name>-create-packages`, where `<package-name>` represents the dash-separated repository name; for example, for repository `pipeline-python`, the "create packages" job name should be configured as `pipeline-python-create-packages`.
- For the "create packages" job when the repository is only tagged: `<package-name>-tag-repository`, where `<package-name>` represents the dash-separated repository name; for example, for repository `location-php`, the "create packages" job name should be configured as `location-php-tag-repository`.
- For jobs in debug configuration: `<package-name>-<job>-debug`, where `<package-name>` represents the dash-separated repository name and `<job>` represents the job suffix selected above; for example, for repository `device-detection-dotnet`, the "create packages" job in `debug` configuration should be named `device-detection-dotnet-create-packages-debug`.
Detailed documentation and useful information about Azure DevOps pipelines can be found in the Microsoft documentation.

YAML ("YAML Ain't Markup Language") configuration files are used to configure Azure DevOps continuous integration pipelines; more details about how to use them can be found in the Microsoft documentation.

This guideline obligates the CI developer to add comments to any tasks defined in `yml` files that are not self-descriptive and require more information to understand the implemented process. Follow the general rule that "if in doubt - comment" and always ask for peer review in order to address any concerns or possible misunderstandings.

Comments in `yml` files are introduced with the `#` character prefix, for example:
The Visual Studio build task from the `pipeline-dotnet` project:

```yaml
- task: VSBuild@1
  displayName: 'Build solutions'
  inputs:
    solution: '$(RestoreBuildProjects)'
    vsVersion: '15.0'
    platform: 'Any CPU'
    configuration: '$(BuildConfiguration)'
    clean: true
```
Although relatively self-descriptive, it could be extended with comments:

```yaml
# Visual Studio build task - VS2017 configuration
- task: VSBuild@1
  displayName: 'Build solutions' # defines the name of the task displayed in Azure DevOps
  inputs: # task specific inputs
    solution: '$(RestoreBuildProjects)' # location of the solution file obtained from the RestoreBuildProjects variable set by the previous NuGet restore step
    vsVersion: '15.0' # version of Visual Studio to be used (version 15.0 is VS2017)
    platform: 'Any CPU' # target platform
    configuration: '$(BuildConfiguration)' # build configuration as set by the strategy matrix at the top of this file
    clean: true # clean the working directory before the build
```
51Degrees provides information about supported platforms and language/API versions. The full table is available on the tested versions page. Azure DevOps Pipelines should be configured to at least mirror the requirements set out by the documentation. If the platform architecture is not specified in the supported versions matrix, it is assumed that both 32 and 64 bit platforms are supported and relevant continuous integration jobs should be provided (ignore the 32 bit architecture for operating systems not supporting x86 platforms). If any changes are applied, and support is removed or added, either the documentation table or the CI configuration must be updated to assure full synchronization between the two.
Whenever a testing environment is set up for a project, continuous integration scripts should be configured to perform the full set of tests for:
- All platforms supported by the software project
- All architectures supported by the operating system (Linux 32/64bit; Windows 32/64bit; MacOS 64bit)
- All variants of configuration (e.g. for APIs all performance profiles)
- Both debug and release build configurations
This guideline covers a high-level overview and basic principles for continuous integration configuration in 51Degrees. Due to the nature of the software products supported and provided by the company, different approaches may be required for various types of platforms, languages, APIs and their versions. Therefore, this document should be treated as the guideline, and any project specific configuration that alters the information provided by this document should be explained in the `readme.md` file stored under the `ci` folder of the given project. The repository containing this document should be added as a submodule to any project that contains a Continuous Integration pipeline configured within the 51Degrees Azure DevOps environment. Example directory tree expected in the project:
```
<project_root>
    \ci
        \common-ci
            \readme.md
        \readme.md
        \build-and-test.yml
        \build-and-publish.yml
```
Language specific configuration should be documented in a `readme.md` file stored under the corresponding folder of the `common-ci` repository. Example directory tree of `common-ci` for `readme.md` files:
```
\common-ci
    \readme.md
    \java
        \readme.md
    \dotnet
        \readme.md
    ...
```
When tasks are replicated across APIs, they should be made into templates and kept in the `common-ci` repository. Templates that are shared across languages are kept in the root directory of `common-ci`, and templates which are only shared within a language's APIs should be kept in a distinct folder named after the language. Below is an illustration of the `common-ci` directory structure:
```
\common-ci
    \readme.md
    \languages-common-template.yml
    ...
    \java
        \java-apis-common-template.yml
        ...
    ...
```
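A pipeline in an API repository can then consume a shared template through the submodule path, for example as sketched below; the relative path assumes `common-ci` is mounted under the `ci` folder as shown earlier:

```yaml
# Reuse a cross-language template from the common-ci submodule.
steps:
- template: ci/common-ci/languages-common-template.yml
```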
Continuous deployment in 51Degrees is configured to continuously publish packages to the internal package manager feed available in the Azure DevOps Artifacts service. Deployment is configured to create and publish the packages internally on a daily basis (overnight) so that the latest version is available for development purposes.

All of the packages for daily continuous deployment are created based on the latest version of the `develop` branch of each repository.
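A nightly schedule of this kind can be sketched in the pipeline YAML as follows; the cron expression, display name and branch filter are illustrative:

```yaml
# Overnight deployment schedule sketch; cron times are in UTC.
schedules:
- cron: "0 2 * * *"
  displayName: 'Nightly internal package deployment'
  branches:
    include:
    - develop
  always: false  # only run when develop has changed since the last run
```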
51Degrees is using Azure DevOps services for continuous integration and deployment configuration. Azure DevOps provides internal repository managers for the main languages supported by 51Degrees APIs:
- NuGet
- Maven
- NPM
- PyPI
Deployment to internal package managers is performed daily (overnight) based on changes applied to the `develop` branches of the source code repositories.
Refer to the `internal-ci` readme for instructions on the 51Degrees release process.
The package release process in 51Degrees is handled through Azure DevOps, and deployment to the public repositories is performed manually using packages generated by the Create packages continuous integration job. As explained in the "Create packages" section, the process of creating the packages is automatically triggered by the completion of a pull request to the `main` branch of the repository. Created packages are stored as artifacts in Azure DevOps Artifacts and are used in internal release pipelines in order to upload them to the public package managers/repositories.
API release process steps:

- PR completed to the `main` branch.
- Automatic execution of the build, test and publish job.
- Automatic trigger of the release pipeline:
  - Automatic upload to the internal package manager
  - Manual deployment to the public package manager/repository
51Degrees provides APIs for a selection of programming languages and packages are available on the following public package managers:
The release process uses the following flow:

- Changes are prepared in `release|hotfix` branches.
- Switch the `AutomatedRelease` variable in the `CIAutomation` variable group to `On`. Then either the trigger release job starts, or all `leaf submodules` are manually or automatically merged to `main|master`.
  - Since `common-ci` is a submodule of all APIs, this is normally done by having the `common-ci` `release|hotfix` branch merged to `main`.
  - To merge the `common-ci` `release|hotfix` branch to `main`, creating a pull request to `main` is enough, since the pull request will be automatically completed at the end of the pull request `build-and-test` check. This is done by a task that calls a set of APIs in the `Release` modules.
- All `leaf submodules` `release|hotfix` branches are merged to `main`.
  - NOTE: `leaf submodules` are modules which do not have any dependencies.
- Completion of submodule merging triggers tag and package deployment of the submodules. If a module does not generate packages, only the tag creation step is performed.
- Completion of tag and package deployment (or creation) triggers the `submodule_trigger` pipelines of the parent APIs, which then update submodule references and package dependency versions, and create a pull request to `main` if one does not exist.
- The pull requests of the parent APIs are then, again, completed once the `build_and_test` check passes.
- Completion of merging `release|hotfix` branches to `main` also triggers `deployment` of the packages, both internally and externally, in the same way as for the submodules.
- At the end of the `release` process, packages are available to be collected internally, and `deployment` to external repositories is left to be approved by the release engineer.
- Switch the `AutomatedRelease` variable in the `CIAutomation` variable group to `Off` to prevent any accidental reference updates or pull request completion after this stage.
The fully automated release process is controlled by a `release-config.json` file, located in the `common-ci` repository. There is also a `release-config-template.json`, which contains all settings and APIs that can be included in a `release-config.json`. The automated release process can also be enabled or disabled by a global variable, `AutomatedRelease`, part of the Azure DevOps variable group `CIAutomation`. To support automating the deployment process, PowerShell scripts and additional pipelines are required. These scripts are located in the `common-ci` repository and are grouped into modules. The additional pipelines are required per API, but shared templates can be reused from `common-ci`.
- The `submodule trigger` pipeline is required for each API to pick up the package deployment from each of its submodules. Since a module can have multiple submodules, multiple triggers might happen at the same time; thus, this pipeline should cancel all previous builds and only run one at a time.
  - This must be triggered only by a submodule deployment that happened on the `main` branch.
  - This pipeline will then update all submodule references and package dependencies, using the versions specified in the `release-config.json`.
- The `pull request completion` job is required at the end of each `build and test` pipeline.
  - This job checks whether the current build was triggered by a pull request from a `release|hotfix` branch to the `main` branch. If it was, and the following conditions have been met, it proceeds to complete the corresponding pull request:
    - The `AutomatedRelease` variable has been enabled.
    - All submodule references and package dependencies have been updated.
    - The pull request has been approved and no comments have been left unresolved; or approval is not required.
- Neither of the above pipelines will proceed if a tag already exists for the release version.
The release configuration files must be updated to configure which repositories will be released. Each repository can be configured with the following fields:

Files to update:

- `release-config.json`
- `cloud-release-config.json`

| Property | Type | Description |
|---|---|---|
| `version` | string | The target version of this repository, i.e. the version to release, or the current version if a release version does not exist. Should be a valid Semantic Version, e.g. `4.3.1`. |
| `packageName` | string | The name of the package that this repository contains code for, e.g. `FiftyOne.Pipeline`. |
| `versionVariableName` | string | The variable name used in pipelines to update dependencies. |
| `isLeaf` | boolean | Is this a leaf of the dependency tree, i.e. does this repository have no dependencies? |
| `dependencies` | string[] | A list of dependencies; these are the names of the repositories that this one depends on, or which contain dependencies of the contained projects. |
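Putting the fields together, a single repository entry might look like the sketch below; the surrounding structure (a map keyed by repository name) and the `versionVariableName` value are assumptions for illustration only:

```json
{
  "pipeline-dotnet": {
    "version": "4.3.1",
    "packageName": "FiftyOne.Pipeline",
    "versionVariableName": "PipelineVersion",
    "isLeaf": false,
    "dependencies": [ "common-ci" ]
  }
}
```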
While the cloud release is not triggered by the automated API release, the `cloud-release-config.json` file will also need to be updated in preparation for the cloud release the following week. Only the version numbers must be updated in this file.
A `release-config-template.json` file is provided which contains configurations for all the repos with a release pipeline; this should be used as a reference when modifying the release config files. Only edit this file when adding a new repo configuration or updating dependencies.
Rules
There are some rules that a release engineer will need to pay attention to when updating the file:
- `packageName`: This name is used to search for a matching package reference in a package file of a target language. It is used as part of a regular expression search, so regex syntax can also be used. Most of the time, the value will just be a prefix of the package name, since an API usually uses a unique prefix such as `FiftyOne.DeviceDetection`. However, there are APIs which use more than one prefix, such as `license-dotnet`, where `FiftyOne.License` and `FiftyOne.Resource` are used. In this specific case, the value of `packageName` can be `FiftyOne.(License|Resource)`. Currently `DotNet` is the only one which tests and supports a regex `packageName`; other language APIs have a unique package prefix, so regex is not required.
- `dependencies`: When specifying dependencies, make sure the definitions of the dependencies are also specified, so that the release process can detect which tag to update to for a submodule reference, or which version to update to for package dependencies. Failing to include a dependency definition will result in submodule references and package dependencies not being updated.
  - A `dependency` can be either a submodule or a package dependency.
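The regex behaviour of `packageName` can be illustrated with a quick check against some hypothetical package names (the names below are invented for the example; the `.` is escaped here as required in strict regex syntax):

```shell
# A regex-style packageName value such as "FiftyOne\.(License|Resource)"
# matches several related prefixes while leaving other packages untouched.
printf '%s\n' \
  "FiftyOne.License.Core" \
  "FiftyOne.Resource.Data" \
  "FiftyOne.Pipeline.Engines" \
  | grep -E '^FiftyOne\.(License|Resource)'
```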
Validation
Once the files have been updated, they can be validated by running the following PowerShell commands:
- `Get-Content release-config.json | Out-String | ConvertFrom-Json` - test that the JSON is valid.
- `Test-Json -Json "$(Get-Content .\release-config.json)" -SchemaFile .\release-config-schema.json` - test that the file matches the release-config schema.
Steps to make changes and update the release config in the common-ci repository:

- Create a new hotfix/release branch in common-ci and push it to the remote.
  - When determining the branch name for the common-ci hotfix or release branch, the version should typically have the same MAJOR and MINOR version as the API releases; however, the PATCH version does not need to match and should simply be incremented from the previous version in common-ci. See validating branch names.
- Create a new feature branch for the release config in common-ci and push it to the remote.
- Update the release config with repos, dependencies and versions for both files.
- Commit the modified release config to the feature branch and create a PR from the feature branch into the release/hotfix branch.
- At the start, the release engineer will need to update the `release-config.json` to specify all release packages and their target release versions. Any additional details should also be specified.
  - NOTE: There is a `release-config-template.json` available for reference.
- Once the `release-config.json` is ready in one of the `release|hotfix` branches of `common-ci`, either of the following should trigger the release process:
  - Complete the pull request that contains the updated `release-config.json` changes to the `main` branch. This will only work based on the assumption that `common-ci` is specified as a submodule of all release APIs.
    - This pull request can be completed manually, or automatically by the `build-and-test.yml` pull request auto-completion task. The automatic completion is only enabled if the global variable `AutomatedRelease`, specified in the variable group `CIAutomation`, is set to `On`.
  - Trigger the `trigger-release` pipeline of `common-ci` against the hotfix or release branch.
As briefly mentioned, the automated release process requires PowerShell scripts to support tasks that were previously done manually. These scripts are located in the `scripts/modules` directory of the `common-ci` repository.
Testing of the release process is done using both script tests and manual testing.
- Each module has a set of tests. It is in the nature of PowerShell scripts to contain many calls to utility tools, which makes them difficult to mock and test effectively; therefore, where testing is not possible, the `test` file is left empty as a placeholder for future usage.
  - To run the tests, the `Pester` module is required. `Pester` is included in the PowerShell environment by default; however, it is recommended to install the latest version.
  - Update the environment variables and authorization string as mentioned in the "Manual testing" section below.
  - Set the environment variable `$Env:51D_ENVIRONMENT` to `Test` to ensure that no test writes to or changes the production environment.
  - Load the modules by setting `$Env:PSModulePath` (and using the `Import-Module` command if needed) in a PowerShell terminal.
  - Navigate to the `scripts/modules` path and run the following command: `Invoke-Pester`
  - To run an individual module's tests, run the above command in that module's folder.
- Manual testing involves the creation of a test environment so that the release process can be completed and examined without affecting the production environment.
- To create a test environment, use the `New-TestEnvironment` function provided in the `51DReleaseTestEnvModule` with an input configuration file. There are many examples of what a configuration file looks like in the `config` folder of the `51DReleaseTestEnvModule`.
  - The module is mainly used locally, but can also be integrated as part of the Azure DevOps pipeline YAML. However, since the configuration file might often be adjusted to provide a test environment suitable for certain scenarios, it is best to use the script locally.
  - To use the module locally, the `$env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI` and `$env:SYSTEM_TEAMPROJECTID` variables should be updated as guided in the `SharedVariables.psm1` of the `51DReleaseTestEnvModule`. The `AuthorizationString` of the `51DAuthorizationModule` is also required to be updated. More details can be found in each module file.
  - The test environment also requires a separate `artifact` feed. Creation of the test `artifact` feed is done manually. Once it is created, the `InternalFeedName` and `InternalFeedId` of the variable group `CIAutomation` should be updated accordingly.
  - Once the release process has completed in a test environment, the `release-config.json` used for the release process can be fed into the `Test-Release` function to verify that the output completed correctly.
  - To delete the test environment, use the `Clear-TestEnvironment` function of the `51DReleaseTestEnvModule`. The test artifact feed should be deleted manually. As deletion is a dangerous operation, be cautious when performing it; `Clear-TestEnvironment` will prompt for confirmation, together with the target environment, before proceeding.
The following flow chart can be edited and updated by changing the `release-process.drawio` file.
There are two main jobs per pipeline: `deploy internally` and `deploy externally`. The common naming convention is as follows:

- For the "deploy internally" job: `<package-name>-deploy-internal`, where `<package-name>` represents the dash-separated repository name; for example, for repository `pipeline-python`, the "deploy internally" job name should be configured as `pipeline-python-deploy-internal`.
- For the "deploy externally" job: `<package-name>-deploy-external`, where `<package-name>` represents the dash-separated repository name; for example, for repository `pipeline-python`, the "deploy externally" job name should be configured as `pipeline-python-deploy-external`.
- For the "deploy" job, where there is no internal or external package repository: `<package-name>-deploy`, where `<package-name>` represents the dash-separated repository name; for example, for repository `location-php`, the "deploy" job name should be configured as `location-php-deploy`. This normally happens when there is no package management for a target language, or the only package management supported for a language is via publishing the source code to a public Git repository.
There are other jobs required to support the automated deployment process: `submodule trigger` and `pull request completion`. The `pull request completion` job is required as part of the `build and test` pipeline; thus, the only required common naming convention is for the `submodule trigger` pipeline.

- For the "submodule trigger" job: `<package-name>-submodule-trigger`, where `<package-name>` represents the dash-separated repository name; for example, for repository `pipeline-python`, the "submodule trigger" job name should be configured as `pipeline-python-submodule-trigger`.
License information can be found in the `LICENSE` file available in this repository.