
I have a third-party-maintained file with a lot of lines, and it looks like this:

# a comment line
aaa.bbb.cc 423 
ddd.ee.fff 452
...
tons.like.them 111

Is there any way to load the file into a local map and access its keys in Terraform using syntax like the following?

"${lookup(var.locals.mylist, 'ddd.ee.fff', 0)}"
kivagant

2 Answers

It's often better to pre-process files like this using a separate program written in a general-purpose programming language, since Terraform's DSL is not well-suited for this sort of ad-hoc parsing logic.

Terraform v0.11 and earlier lack the primitives required to implement parsing like this within the language itself. The v0.12.0 release (which is in beta at the time of writing) introduces some primitives that do allow this sort of parsing to be implemented, though I would still generally prefer to do it outside of Terraform as a preprocessing step if at all possible.
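
For illustration, here's a minimal sketch of that preprocessing approach, assuming a hypothetical separate script (not shown) has already converted the third-party file into a flat JSON object in records.json; the file name and its shape are my own assumptions, and jsondecode requires Terraform v0.12:

locals {
  # records.json is assumed to contain e.g. {"aaa.bbb.cc":"423","ddd.ee.fff":"452"}
  records = jsondecode(file("${path.module}/records.json"))
}

output "one_value" {
  value = lookup(local.records, "ddd.ee.fff", "0")
}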

With that said, here's an example of parsing a file like your example into a map using Terraform v0.12 features:

locals {
  # Split the file into lines, trim surrounding whitespace, and split each
  # line on spaces into its fields.
  raw_lines = [
    for line in split("\n", file("${path.module}/input.txt")) :
    split(" ", trimspace(line))
  ]
  # Discard blank lines and comment lines (those starting with "#").
  lines = [
    for line in local.raw_lines :
    line if length(line[0]) > 0 && substr(line[0], 0, 1) != "#"
  ]
  # Build a map from the first field (key) to the second field (value).
  records = { for line in local.lines : line[0] => line[1] }
}

output "result" {
  value = local.records
}

Given the following input file:

# a comment line
aaa.bbb.cc 423 
ddd.ee.fff 452
tons.like.them 111

# another comment
another.record abc

This produces the following result using Terraform v0.12.0-beta1:

$ terraform apply

Apply complete! Resources: 0 added, 0 changed, 0 destroyed.

Outputs:

result = {
  "aaa.bbb.cc" = "423"
  "another.record" = "abc"
  "ddd.ee.fff" = "452"
  "tons.like.them" = "111"
}
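
With local.records built this way, a lookup like the one in the question should also work; a small sketch (the output name is arbitrary):

output "single_value" {
  value = lookup(local.records, "ddd.ee.fff", "0")
}
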
Martin Atkins
  • Thank you. Very nice answer for 0.12. I'm marking this as a solution even though I can't test it right now. But v0.12 is a big step forward for Terraform and I hope that it will be released soon. – kivagant Mar 17 '19 at 16:04

The file name is myfile.txt and the content is:

# a comment line
aaa.bbb.cc 423 
ddd.ee.fff 452
...
tons.like.them 111

The external data source should look like this:

data "external" "myfile" {
  program = [
    "bash",
    "${path.module}/template/parser.sh",
    "${path.module}/template/myfile.txt",
    "ddd.ee.fff"
  ]
}

And the template/parser.sh contains:

#!/usr/bin/env bash
set -eo pipefail

# Drop comment lines, turn "key value" into "key=value", and build a JSON
# object of all records under the "data" key.
CONFIG=$(grep -v '#' "$1" | sed 's/ /=/g' | jq -R -n -c '.data = ([inputs | split("=") | {(.[0]): .[1]}] | add)')

# Extract the single requested key and wrap it in the flat JSON object that
# Terraform's external data source expects.
OUTPUT=$(echo "$CONFIG" | jq -r --arg key "$2" '.data[$key]')
jq -n --arg output "$OUTPUT" '{"output": $output}'

This tricky script will return:

{"output": 452}

And the interpolation can be used to get the resulting value:

${data.external.myfile.result.output}
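
For example, it could be wired into an output (a minimal sketch; the output name is my own):

output "ddd_ee_fff_value" {
  value = "${data.external.myfile.result.output}"
}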

Unfortunately, I didn't find a way to load the full list of rows into Terraform and access them using "variable of variable" syntax or lookup(). If the bash script returns the result of the first jq call directly, Terraform throws an error:

command "bash" produced invalid JSON: json: cannot unmarshal object into Go value of type string

Update 2: To access the resulting values, you need to use the additional ".result" key before the actual ".output" key returned by the script.

kivagant
  • Hey, how can we get the list of values or the list of key-value pairs? I have `{"privateIpAddress":"10.48.131.30"} {"privateIpAddress":"10.48.131.34"}` and I would like to extract the IP addresses alone and process them further. – harshavmb Jul 30 '20 at 14:16