Using multiple .tfvars files with a YAML template

Early days

When we first started our project we were using a fairly standard setup for Terraform:

main.tf
outputs.tf
variables.tf
sandbox.tfvars
dev.tfvars
prod.tfvars

This pairs up with a YAML template that, after initialising and validating the Terraform code, runs the appropriate plan/apply/destroy command. Here’s a cut-down version of the plan template:

parameters:
- name: tfVarsPath
  type: string
  default: 'sandbox.tfvars'
# rest of parameters snipped

jobs:
  - job:
    pool:
      vmImage: ${{ parameters.vmImage }}
    steps:
    - template: initialise-terraform.yml
      parameters:
      # snipped

    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
      displayName: 'Terraform Plan'
      inputs:
        command: plan
        commandOptions: '-var-file=${{ parameters.tfVarsPath }}'
        environmentServiceNameAzureRM: ${{ parameters.serviceConnection }}
        backendAzureRmResourceGroupName: ${{ parameters.stateRgName }}
        backendAzureRmStorageAccountName: ${{ parameters.stateStorageAccount }}
        backendAzureRmContainerName: ${{ parameters.containerName }}
        backendAzureRmKey: ${{ parameters.key }}

The default of sandbox.tfvars ensures that a sensible fallback is in place if, for whatever reason, somebody forgets to specify an environment file.
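
In other words, with the default in place the plan step ends up running roughly the equivalent of:

terraform plan -var-file=sandbox.tfvars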

Thanks to the combination of parameters and variables, calling the template is fairly generic (i.e. we copy-paste this block a lot):

jobs:
- template: terraform-plan.yml@templates
  parameters:
    containerName: ${{ variables.containerName }}
    key: ${{ variables.key }}
    tfVarsPath: ${{ variables.tfVarsPath }}
    serviceConnection: ${{ variables.serviceConnection }}
    stateStorageAccount: ${{ variables.stateStorageAccount }}
    workingDirectory: ${{ variables.workingDirectory }}
    vmImage: ${{ variables.vmImage }}

Evolution and growth

As the project evolved, the number of variables we were defining in our environment-specific .tfvars files grew with it. The critical point – the one that forced me to look into supporting multiple .tfvars files in our templates – came when we added a separate Key Vault to store a large number of secrets. Over 300 of them, all of which vary between environments. Overnight our files more than tripled in size. It was clear that if we could split the new Key Vault's secrets out into their own file, everything else would become far easier to manage. But how to do it?
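
Concretely, the layout we were aiming for was one 'main' file and one 'vault' file per environment, something along these lines (the names are illustrative and match the dev examples used later in this post):

main.tf
outputs.tf
variables.tf
sandbox-main.tfvars
sandbox-vault.tfvars
dev-main.tfvars
dev-vault.tfvars
prod-main.tfvars
prod-vault.tfvars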

The official Terraform documentation tells you how to use one file. Plenty of example sites tell you to use one file per environment (which we had been doing). We wanted to use more than one, though, so I was relieved to find Marcel Zehner's blog post on the subject and see that it was possible.
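
Terraform itself is quite happy to be given the -var-file flag more than once, so outside of a pipeline the command we needed to end up with looks like this:

terraform plan -var-file dev-main.tfvars -var-file dev-vault.tfvars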

Updating the template

The challenge with adding support for multiple .tfvars files into the template was twofold:

  1. We have a lot of pipelines calling the template, so we need to make sure that they continue to work.
  2. The filenames are passed through to the task via the commandOptions parameter, which is just a string.

To solve this, I had to modify the template to accept multiple values for the tfVarsPath parameter, then work out how to combine them into the string needed for the commandOptions task parameter.

Fortunately, YAML is quite flexible, so this was all that was needed to accept multiple values for the parameter:

parameters:
- name: tfVarsPath
  displayName: '.tfvars files to supply to Terraform'
  type: object
  default:
  - 'sandbox.tfvars'

Joining the values together required a combination of the expression functions available in Azure DevOps YAML pipelines to create a variable:

# in the job variables section
  variables:
  - name: commandOptions
    value: ${{ format(' -var-file {0}', join(' -var-file ', parameters.tfVarsPath)) }}

# at the end
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
      displayName: 'Terraform Plan'
      inputs:
        command: plan
        commandOptions: ${{ variables.commandOptions }}
        # rest of parameters as before

The join function on its own wouldn't produce the correct result. Passing in an array such as ["dev-main.tfvars", "dev-vault.tfvars"] would only create dev-main.tfvars -var-file dev-vault.tfvars, with no -var-file in front of the first entry. This is why the format function is used as well: it prepends the flag for the first file.
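
To spell out the expansion for a two-file list (illustrative values, reusing the dev file names from above):

# tfVarsPath: ['dev-main.tfvars', 'dev-vault.tfvars']
#
# join(' -var-file ', parameters.tfVarsPath)
#   => dev-main.tfvars -var-file dev-vault.tfvars
#
# format(' -var-file {0}', <result of the join>)
#   => -var-file dev-main.tfvars -var-file dev-vault.tfvars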

The call to the task was then updated such that the variable above is passed to commandOptions.

In use

With the template updated, it was necessary to test it with both multiple and single .tfvars files, to check that everything was alright. Which it was. First, passing in two files:

jobs:
- template: terraform-plan.yml@templates
  parameters:
    containerName: ${{ variables.containerName }}
    key: ${{ variables.key }}
    serviceConnection: ${{ variables.serviceConnection }}
    stateStorageAccount: ${{ variables.stateStorageAccount }}
    workingDirectory: ${{ variables.workingDirectory }}
    tfVarsPath:
    - ${{ variables.environmentTerraformVarsPath }}
    - ${{ variables.environmentVaultVarsPath }}

And passing a single value, just as before:

jobs:
- template: terraform-plan.yml@templates
  parameters:
    containerName: ${{ variables.containerName }}
    key: ${{ variables.key }}
    serviceConnection: ${{ variables.serviceConnection }}
    stateStorageAccount: ${{ variables.stateStorageAccount }}
    workingDirectory: ${{ variables.workingDirectory }}
    tfVarsPath: ${{ variables.tfVarsPath }}
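
With a single-entry list (or the sandbox.tfvars default), the same expression simply collapses back down to one flag, which is part of why the existing callers could carry on unchanged:

# tfVarsPath: ['sandbox.tfvars']
#
# join(' -var-file ', parameters.tfVarsPath)  => sandbox.tfvars
# format(' -var-file {0}', <result of the join>)  => -var-file sandbox.tfvars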