ReplaceTokens

This Azure Pipelines task replaces tokens in text based files with variable values.

What's new

Please refer to the release page for the latest release notes.

Breaking changes in v6

The task was completely rewritten to use the npm package @qetza/replacetokens and to align more closely with the new ReplaceTokens GitHub Action:

  • supports only Node 16 (minimum agent version 2.206.1)
  • renamed input targetFiles to sources
  • migrated to fast-glob for glob patterns, causing syntax changes (you must use a forward slash (/) as the directory separator regardless of the OS)
  • removed support for comma-separated paths in targetFiles
  • renamed encoding value win1252 to windows1252
  • renamed escapeType to escape
    • renamed escapeType value none to off
  • merged inputs variableFiles and inlineVariables in additionalVariables
  • renamed input variableSeparator to separator
  • renamed input enableRecursion to recursive
  • renamed input rootDirectory to root
  • renamed tokenPattern value rm to doubleunderscores
  • renamed input writeBOM to addBOM
  • changed writeBOM default value to false
  • renamed input verbosity to logLevel
    • renamed verbosity value detailed to debug
    • renamed verbosity value normal to info
    • removed verbosity value off (see new supported values for replacement)
  • renamed input actionOnMissing to missingVarLog
    • renamed actionOnMissing value continue to off
    • renamed actionOnMissing value fail to error
  • replaced keepToken with missingVarAction with value keep
  • renamed input actionOnNoFiles to ifNoFilesFound
    • renamed actionOnNoFiles value continue to ignore
    • renamed actionOnNoFiles value fail to error
  • renamed input enableTransforms to transforms
    • renamed transform noescape to raw
  • renamed input transformPrefix to transformsPrefix
  • renamed input transformSuffix to transformsSuffix
  • removed input useLegacyPattern
  • removed input useLegacyEmptyFeature
  • replaced input useDefaultValue with missingVarAction with value replace
  • removed input emptyValue
  • renamed input defaultValue to missingVarDefault
  • renamed input enableTelemetry to telemetryOptout and inverted its meaning
  • renamed output tokenReplacedCount to replaced
  • renamed output tokenFoundCount to tokens
  • renamed output fileProcessedCount to files
  • renamed output transformExecutedCount to transforms
  • renamed output defaultValueCount to defaults
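
To illustrate some of these renames, a v5 configuration and its v6 equivalent could look like the following (the glob pattern and values are placeholders):

# v5
- task: qetza.replacetokens.replacetokens-task.replacetokens@5
  inputs:
    targetFiles: '**/*.json'
    writeBOM: true
    verbosity: 'detailed'
    actionOnMissing: 'warn'
    actionOnNoFiles: 'continue'

# v6 equivalent
- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  inputs:
    sources: '**/*.json'
    addBOM: true
    logLevel: 'debug'
    missingVarLog: 'warn'
    ifNoFilesFound: 'ignore'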

You can find documentation to help you migrate here: Migrate from v5 to v6

If you are migrating from v3 to v6, make sure to read this documentation first: Migrate from v3 to v5

Usage

Inputs

- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  inputs:
    # A multiline list of files to replace tokens in.
    # Each line supports:
    #   - multiple glob patterns separated by a semi-colon ';' using fast-glob syntax 
    #     (you must always use a forward slash '/' as the directory separator; on win32, 
    #     backslashes are automatically replaced with forward slashes)
    #   - outputting the result in another file by adding the output path after an arrow '=>' 
    #     (if the output path is a relative path, it is relative to the input file)
    #   - wildcard replacement in the output file name using an asterisk '*' in the input 
    #     and output file names
    #
    # Example: '**/*.json; !**/local/* => out/*.json' will match all files ending with 
    # '.json' in all directories and sub directories except in `local` directory and the 
    # output will be in a sub directory `out` relative to the input file keeping the file 
    # name.
    #
    # Required.
    sources: ''

    # Add BOM when writing files.
    #
    # Optional. Default: false
    addBOM: ''

    # A YAML formatted string containing additional variable values (keys are case-insensitive).
    # Value can be:
    #   - an object: properties will be parsed as key/value pairs
    #   - a string starting with '@': value is parsed as multiple glob patterns separated 
    #     by a semi-colon ';' using fast-glob syntax to JSON or YAML files
    #   - a string starting with '$': value is parsed as an environment variable name 
    #     containing JSON encoded key/value pairs
    #   - an array: each item must be an object or a string and will be parsed as 
    #     specified previously
    #
    # Multiple entries are merged into a single list of key/value pairs.
    #
    # Example:
    # - '@**/*.json;**/*.yml;!**/local/*'
    # - '$COMPUTER_VARS'
    # - var1: '${{ parameters.var1 }}'
    #
    # will add all variables from:
    # - '.json' and '.yml' files except under 'local' directory, 
    # - the environment variable 'COMPUTER_VARS'
    # - the inline variable 'var1'
    #
    # Optional.
    additionalVariables: ''

    # Enable case-insensitive file path matching in glob patterns for sources and additionalVariables.
    #
    # Optional. Default: true
    caseInsensitivePaths: ''

    # The characters to escape when using 'custom' escape.
    #
    # Optional.
    charsToEscape: ''

    # The encoding to read and write all files.
    #
    # Accepted values:
    #   - auto: detect encoding using js-chardet
    #   - any value supported by iconv-lite
    #
    # Optional. Default: auto
    encoding: ''

    # The character escape type to apply on each value.
    #
    # Accepted values:
    #  - auto: automatically apply JSON or XML escape based on file extension
    #  - off: don't escape values
    #  - json: JSON escape
    #  - xml: XML escape
    #  - custom: apply custom escape using escapeChar and charsToEscape
    #
    # Optional. Default: auto
    escape: ''

    # The escape character to use when using 'custom' escape.
    #
    # Optional.
    escapeChar: ''

    # The behavior if no files are found.
    #
    # Accepted values:
    #   - ignore: do not output any message, the task does not fail
    #   - warn: output a warning but do not fail the task
    #   - error: fail the task with an error message
    #
    # Optional. Default: ignore
    ifNoFilesFound: ''

    # Include directories and files starting with a dot ('.') in glob matching results for sources and additionalVariables.
    #
    # Optional. Default: true
    includeDotPaths: ''

    # The log level.
    #
    # Accepted values:
    #   - debug
    #   - info
    #   - warn
    #   - error
    #
    # Debug messages are always sent to the internal debug system.
    # Error messages always fail the task.
    #
    # Optional. Default: info
    logLevel: ''

    # The behavior if a variable is not found.
    #
    # Accepted values:
    #   - none: replace the token with an empty string and log a message
    #   - keep: leave the token and log a message
    #   - replace: replace with the value from missingVarDefault and do not 
    #     log a message
    #
    # Optional. Default: none
    missingVarAction: ''

    # The default value to use when a key is not found.
    #
    # Optional. Default: empty string
    missingVarDefault: ''

    # The level to log key not found messages.
    #
    # Accepted values:
    #   - off
    #   - warn
    #   - error
    #
    # Optional. Default: warn
    missingVarLog: ''

    # Enable token replacement in values recursively.
    #
    # Example: '#{message}#' with variables '{"message":"hello #{name}#!","name":"world"}' 
    # will result in 'hello world!'
    #
    # Optional. Default: false
    recursive: ''

    # The root path to use when reading files with a relative path.
    #
    # Optional. Default: $(System.DefaultWorkingDirectory)
    root: ''

    # The separator to use when flattening keys in variables.
    #
    # Example: '{ "key": { "array": ["a1", "a2"], "sub": "s1" } }' will be flattened as 
    # '{ "key.array.0": "a1", "key.array.1": "a2", "key.sub": "s1" }'
    #
    # Optional. Default: .
    separator: ''

    # Opt out of the anonymous telemetry feature.
    # You can also set the REPLACETOKENS_TELEMETRY_OPTOUT environment variable to '1' or 
    # 'true'.
    #
    # Optional. Default: false
    telemetryOptout: ''

    # The token pattern to use.
    # Use 'custom' to provide your own prefix and suffix.
    #
    # Accepted values:
    #   - default: #{ ... }#
    #   - azpipelines: $( ... )
    #   - custom: token-prefix ... token-suffix
    #   - doublebraces: {{ ... }}
    #   - doubleunderscores: __ ... __
    #   - githubactions: #{{ ... }}
    #   - octopus: #{ ... }
    #
    # Optional. Default: default
    tokenPattern: ''

    # The token prefix when using 'custom' token pattern.
    #
    # Optional.
    tokenPrefix: ''

    # The token suffix when using 'custom' token pattern.
    #
    # Optional.
    tokenSuffix: ''

    # Enable transforms on values.
    # The syntax to apply transform on a value is '#{<transform>(<name>[,<parameters>])}#'.
    #
    # Supported transforms:
    #   - base64(name): base64 encode the value
    #   - indent(name[, size, firstline]): indent lines in the value where size is the 
    #     indent size (default is '2') and firstline specifies if the first line must be 
    #     indented also (default is 'false')
    #   - lower(name): lowercase the value
    #   - raw(name): raw value (disable escaping)
    #   - upper(name): uppercase the value
    #
    # Example: 'key=#{upper(KEY1)}#' with '{ "KEY1": "value1" }' will result in 
    # 'key=VALUE1'
    #
    # Optional. Default: false
    transforms: ''

    # The transforms prefix when using transforms.
    #
    # Optional. Default: (
    transformsPrefix: ''

    # The transforms suffix when using transforms.
    #
    # Optional. Default: )
    transformsSuffix: ''

    # Use only variables declared in 'additionalVariables'.
    #
    # Optional. Default: false
    useAdditionalVariablesOnly: ''

Output

Name       Description                                                                  Example
defaults   The number of tokens replaced with the default value if one was specified.  1
files      The number of source files parsed.                                          2
replaced   The number of values replaced by a value different than the default value.  7
tokens     The number of tokens found in all files.                                    8
transforms The number of transforms applied.                                           2

Examples

Multiple target files

- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  inputs:
    sources: |
      **/*.json;!**/*.dev.json;!**/vars.json => _tmp/*.json
      **/*.yml

Additional variables

- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  inputs:
    sources: '**/*.json'
    additionalVariables: |
      - '@**/vars.(json|yml|yaml)'      # read from files
      - '$ENV_VARS'                     # read from env
      - var1: '${{ parameters.var1 }}'  # inline key/value pairs
        var2: '${{ parameters.var2 }}'
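
For reference, a file referenced with the '@' syntax is a plain JSON or YAML document of key/value pairs; nested keys are flattened using the separator. A hypothetical vars.yml could look like this (names and values are illustrative):

var1: value1
database:
  host: localhost   # available as the token '#{database.host}#' with the default separator '.'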

Access outputs

steps:
- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  name: replaceTokens
  inputs:
    sources: '**/*.json'
- script: |
    echo "defaults  : $(replaceTokens.defaults)"
    echo "files     : $(replaceTokens.files)"
    echo "replaced  : $(replaceTokens.replaced)"
    echo "tokens    : $(replaceTokens.tokens)"
    echo "transforms: $(replaceTokens.transforms)"
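
Custom token pattern

The following sketch combines a custom token pattern (here '%...%', a hypothetical choice) with transforms; the source pattern is a placeholder:

- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  inputs:
    sources: '**/*.properties'      # placeholder pattern
    tokenPattern: 'custom'
    tokenPrefix: '%'
    tokenSuffix: '%'
    transforms: true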

Data/Telemetry

The Replace Tokens task for Azure Pipelines collects anonymous usage data and sends it by default to its author to help improve the product. If you don't wish to send usage data, you can change your telemetry settings through the telemetryOptout parameter or by setting the REPLACETOKENS_TELEMETRY_OPTOUT environment variable to 1 or true.
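
For example, either opt-out approach can be expressed in the pipeline YAML (the source pattern is a placeholder):

# via the task input
- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  inputs:
    sources: '**/*.json'
    telemetryOptout: true

# via the environment variable
- task: qetza.replacetokens.replacetokens-task.replacetokens@6
  inputs:
    sources: '**/*.json'
  env:
    REPLACETOKENS_TELEMETRY_OPTOUT: 'true'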

The following anonymous data is sent:

  • the hash of your collection id
  • the hash of your project id and pipeline definition id
  • the hosting (server or cloud)
  • the agent OS (Windows, macOS or Linux)
  • the inputs values for
    • addBOM
    • caseInsensitivePaths
    • charsToEscape
    • encoding
    • escape
    • escapeChar
    • ifNoFilesFound
    • includeDotPaths
    • logLevel
    • missingVarAction
    • missingVarDefault
    • missingVarLog
    • recursive
    • separator
    • tokenPattern
    • tokenPrefix
    • tokenSuffix
    • transforms
    • transformsPrefix
    • transformsSuffix
    • useAdditionalVariablesOnly
  • the number of sources entries
  • the number of additionalVariables entries referencing file
  • the number of additionalVariables entries referencing environment variables
  • the number of additionalVariables inline entries
  • the task result (succeeded or failed)
  • the task execution duration
  • the outputs (defaults, files, replaced, tokens and transforms)

You can see the JSON serialized telemetry data sent in debug logs.