171

I'm trying to get jq to parse a JSON structure like:

{
  "a" : 1,
  "b" : 2,
  "c" : "{\"id\":\"9ee ...\",\"parent\":\"abc...\"}\n"
}

That is, one value in the JSON is a string containing escaped JSON.

So, I have something along the lines of `jq [.c] myFile.json | jq [.id]`

But that fails with `jq: error: Cannot index string with string`

This is because the output of .c is a string, not more JSON. How do I get jq to parse this string?

My initial solution is to use sed to replace all the escape chars (\":\", \",\" and \") but that's messy, I assume there's a way built into jq to do this?

Thanks!

edit: Also, the jq version available here is:

$ jq --version
jq version 1.3

I guess I could update it if required.

Colin Grogan

3 Answers

301

jq has the fromjson builtin for this:

jq '.c | fromjson | .id' myFile.json

fromjson was added in version 1.4.
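For readers without jq 1.4 at hand, the effect of `fromjson` can be sketched in Python: it is simply a second round of JSON decoding applied to the string value. The ids below are shortened, hypothetical stand-ins for the truncated values in the question.

```python
import json

# Stand-in for myFile.json; the "c" value is a JSON document
# serialized as a string (note the trailing newline).
doc = json.loads('{"a": 1, "b": 2, "c": "{\\"id\\":\\"9ee\\",\\"parent\\":\\"abc\\"}\\n"}')

# `.c | fromjson` corresponds to a second json.loads on the string value:
inner = json.loads(doc["c"])
print(inner["id"])  # -> 9ee
```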

jwodder
  • Thank you. This works. I'll accept this answer, as it's more 'idiomatic', I feel. Cheers. – Colin Grogan Feb 02 '16 at 14:04
  • @ColinGrogan please do. – vbence Oct 20 '17 at 12:51
  • @ColinGrogan: I don't see any reason to change the accepted answer, since you clearly wrote in your question that you used version 1.3 of jq, in which the `fromjson` feature isn't available. In other words, even if this answer is interesting, it doesn't answer the question. – Casimir et Hippolyte Oct 20 '17 at 15:57
  • Is it possible to use this but on an entire json file (not specifying the .id property)? – Itération 122442 Mar 15 '19 at 10:01
  • @FlorianCastelain yes, either omit it or use a dot: `jq 'fromjson | .' myfile`, where myfile contains `"{\"key\":1, \"word\":\"cat\"}"` –  Sep 30 '19 at 08:21
75

You can use raw output (-r), which prints the string with its escapes removed:

jq -r .c myfile.json | jq .id

ADDENDUM: This has the advantage that it works in jq 1.3 and up; indeed, it should work in every version of jq that has the -r option.
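Why `-r` matters can be sketched in Python (a stand-in, not jq itself; ids shortened to hypothetical sample values): without `-r`, jq re-encodes the value as a JSON string literal on output, so the second jq receives a quoted string rather than an object.

```python
import json

# Stand-in for myFile.json; the ids are shortened hypothetical values.
doc = json.loads('{"a": 1, "b": 2, "c": "{\\"id\\":\\"9ee\\",\\"parent\\":\\"abc\\"}\\n"}')

# Without -r: the value is re-encoded as a JSON string literal,
# quotes and backslashes included, so the second jq sees a string:
print(json.dumps(doc["c"]))  # "{\"id\":\"9ee\",\"parent\":\"abc\"}\n"

# With -r: the raw string itself is printed, which the second jq can parse:
print(doc["c"], end="")
print(json.loads(doc["c"])["id"])  # 9ee
```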

peak
Casimir et Hippolyte
  • -r is much easier than the accepted answer. You got my upvote. I use jq to view JSON from my clipboard, so when I run into this I do: `pbpaste | jq -r` – Noah Gary Oct 27 '21 at 15:30
  • Bonus: if you want to remove escaped quotes you can use sed: `pbpaste | sed -e 's/\\\"/\"/g'`. This helps when you have escaped JSON that itself contains escaped JSON: take out the first level of escaped quotes with sed, and then jq can remove the last level, e.g. `pbpaste | sed -e 's/\\\"/\"/g' | jq -r` – Noah Gary Oct 27 '21 at 15:33
2

Motivation: you want to parse a JSON string - that is, unescape a JSON object that is wrapped in quotes and represented as a string, and convert it into a valid JSON object. For example:

an escaped JSON string:

"{\"name\":\"John Doe\",\"position\":\"developer\"}"

the expected result (a JSON object):

{"name":"John Doe","position":"developer"}

Solution: To unescape the JSON string and convert it into a valid JSON object, use the sed tool on the command line with regular expressions to remove/replace specific characters:

sed -e 's/\\\"/\"/g' -e 's/^.//g' -e 's/.$//g' current_json.txt

s/\\\"/\"/g replacing all backslashes and quotes ( \" ) into quotes only (")

s/^.//g replacing the first character in the stream to none character

s/.$//g replacing the last character in the stream to none character
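The three substitutions above can be mirrored in Python with `re.sub`, which makes it easy to see what each one does. Note (as a comment below points out) that this approach only handles escaped quotes, not other escape sequences such as `\t` or `\n`, so a real JSON decoder is safer.

```python
import re

# Python literal for the escaped sample from the answer:
# the actual characters are "{\"name\":\"John Doe\",\"position\":\"developer\"}"
escaped = '"{\\"name\\":\\"John Doe\\",\\"position\\":\\"developer\\"}"'

s = re.sub(r'\\"', '"', escaped)  # s/\\\"/\"/g : turn \" into "
s = re.sub(r'^.', '', s)          # s/^.//g    : drop the leading quote
s = re.sub(r'.$', '', s)          # s/.$//g    : drop the trailing quote
print(s)  # -> {"name":"John Doe","position":"developer"}
```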

avivamg
  • I had a similar problem. Used the same sed command verbatim. Bravo! Pipe that result to `jq -r` and you can process escaped JSON inside your JSON. – Noah Gary Oct 27 '21 at 15:37
  • Boo, hiss re: trying to parse JSON with `sed`. Any tool that tries to match just a few of the most common escapings is going to miss others (for example, you're handling `"` but not `\t`) – Charles Duffy Jun 14 '22 at 23:44
  • I had to also replace double quote characters: `cat a.json | sed -e 's/\\\"/\"/g' -e 's/^.//g' -e 's/.$//g' -e 's/""/"/g' | jq ` – Geoff Langenderfer Jul 21 '22 at 20:48