
I have a json file like this:

[
  {
    "classname": "Test endpoint",
    "name": "expect failure",
    "failure_system_out": "expected 404 Not Found\nError in test endpoint\n\tat Test._assertStatus"
  },
  {
    "classname": "Test inner functions",
    "name": "expect failure",
    "failure_system_out": "Example fo test\n\tExpect 4 and got 5"
  }
]

As you can see, the value of "failure_system_out" is a string containing newline chars (\n) and tab chars (\t).

I am trying to read the file, loop over the objects, and print them with this code:

jq -c '.[]' myfile.json | while read i; do
    test_name=$(echo "$i" | jq -r .name)
    system_error=$(echo "$i" | jq -r .failure_system_out)
    printf "${system_error}"

done

The problem is that with this approach printf doesn't print the string according to the newline and tab characters; instead it prints something like expected 404 Not FoundnError in test endpointntat Test._assertStatus. Basically, I think that jq -c removes the \ character and therefore printf doesn't work properly.

How can I iterate over an array of objects stored in a file and keep the characters used to format the string?

Desired output for the first item:

expected 404 Not Found
Error in test endpoint
   at Test._assertStatus

Desired output for the second item:

Example fo test
    Expect 4 and got 5
dventi3

3 Answers


Just use jq; it's a scripting language in its own right.

$ jq -r '.[0].failure_system_out' /tmp/1
expected 404 Not Found
Error in test endpoint
    at Test._assertStatus
$ jq -r '.[1].failure_system_out' /tmp/1
Example fo test
    Expect 4 and got 5
$ jq -r '.[] | .name as $test_name | .failure_system_out as $system_error | $system_error' /tmp/1
expected 404 Not Found
Error in test endpoint
    at Test._assertStatus
Example fo test
    Expect 4 and got 5
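
If the test name should show up next to each failure text, jq's string interpolation can emit both fields in a single pass (a small sketch against the same /tmp/1 file; the "name:" label is just an illustration):

$ jq -r '.[] | "\(.name):\n\(.failure_system_out)"' /tmp/1
expect failure:
expected 404 Not Found
Error in test endpoint
    at Test._assertStatus
expect failure:
Example fo test
    Expect 4 and got 5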

As for using bash, first read https://mywiki.wooledge.org/BashFAQ/001 . I like using base64 to safely transfer the data from jq to bash and handle all the corner cases.

jq -r '.[] | @base64' /tmp/1 |
while IFS= read -r line; do
    line=$(<<<"$line" base64 -d);
    test_name=$(<<<"$line" jq -r .name);
    system_error=$(<<<"$line" jq -r .failure_system_out);
    printf "%s\n" "$system_error";
done

But it's not needed here; a proper while read loop should be enough:

jq -c '.[]' /tmp/1 |
while IFS= read -r line; do
    test_name=$(<<<"$line" jq -r .name);
    system_error=$(<<<"$line" jq -r .failure_system_out);
    printf "%s\n" "$system_error";
done
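
The crucial part here is read -r (together with IFS=): without -r, read itself eats the backslashes in the \n and \t escape sequences, which is exactly the mangling seen in the question's loop. A quick illustration with a plain string:

$ printf '%s\n' 'a\tb' | { read x; printf '%s\n' "$x"; }
atb
$ printf '%s\n' 'a\tb' | { read -r x; printf '%s\n' "$x"; }
a\tb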
KamilCuk

The question seems to weave amongst several goals, but in any case:

  • there is no need for jq to be called more than once, and

  • there should be no need to use base64 conversions, except possibly if the values corresponding to the keys of interest contain NULs.

If the goal is simply to emit the values of .failure_system_out then:

 jq -r '.[].failure_system_out' test.json

would do it.

If the values of both .name and .failure_system_out must be made available separately as bash variables, then consider:

while IFS= read -d $'\0' test_name ; do
    IFS= read -d $'\0' system_error
    printf "%s\n" name="$test_name"
    printf "%s\n" fso="$system_error"
    echo ""
done < <(jq -rj '.[] | [.name, .failure_system_out, ""] | join("\u0000")' test.json)

readarray could also be used -- see e.g. Storing JQ NULL-delimited output in bash array
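
A minimal sketch of that readarray route, assuming bash 4.4+ (for readarray -d '') and reusing the NUL-joined jq output from above:

readarray -d '' -t fields < <(jq -rj '.[] | [.name, .failure_system_out, ""] | join("\u0000")' test.json)

# fields alternates: name, failure_system_out, name, failure_system_out, ...
for ((i = 0; i < ${#fields[@]}; i += 2)); do
    printf 'name=%s\n' "${fields[i]}"
    printf 'fso=%s\n\n' "${fields[i+1]}"
done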

peak

@KamilCuk's answer works great and gives you quite a bit more control.

Thought I'd still share this one-line solution:

printf "%s\n" "$(jq -r -c '.[] | .failure_system_out' test.json)"

This will produce:

expected 404 Not Found
Error in test endpoint
    at Test._assertStatus
Example fo test
    Expect 4 and got 5
0stone0