
In a PostgreSQL table I have a column with a JSON like:

{"elements":[{"val":"value1", "column":"column1"}, {"val":"val2", "column":"column2"}, ...]}.

Is there any way to transform this into a result set like:

column1 | column2 | ...
-----------------------
value1  | value2  | ...

I looked through the PostgreSQL JSON functions but didn't find an answer.

A.Sizov

4 Answers


The number of columns of a query needs to be known before the query is executed, so you will have to write one expression for each possible column in your array.

With Postgres 12, you can do this with a SQL/JSON path expression:

select jsonb_path_query_first(input -> 'elements', '$[*].val ? ($.column == "column1")' ) #>> '{}' as column_1,
       jsonb_path_query_first(input -> 'elements', '$[*].val ? ($.column == "column2")' ) #>> '{}' as column_2
from data;

You need to repeat the jsonb_path_query_first() part for every possible column in the array.

The #>> '{}' operator is there to convert the JSONB value returned by the function to a text value.
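For reference, a setup the query above could run against might look like this (the table name data and column name input are assumptions inferred from the query):

```sql
-- hypothetical setup matching the query above
create table data (input jsonb);

insert into data (input) values
  ('{"elements":[{"val":"value1", "column":"column1"},
                 {"val":"val2", "column":"column2"}]}');
```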

  • I tried this, but it returned the same value for each column. Changing it to "input -> 'elements', '$[*] ? (@.column == "column1").val' ) #>> '{}'" did the job – A.Sizov May 27 '20 at 13:23
  • @A.Sizov: no idea, the above works for me: https://dbfiddle.uk/?rdbms=postgres_12&fiddle=ebe43d407c8dea3eb29c53551977b5ba –  May 27 '20 at 13:28

You can use the json_to_recordset function to convert JSON into a rowset. In any case, the final rowset cannot have a dynamic number of columns, i.e. whatever solution you choose, you will have to list them explicitly in some way.

For example, in the select clause, when doing a manual transposition of the 1:1-converted JSON:

with t(d) as (values 
  ('{"elements":[{"val":"value1", "column":"column1"}, {"val":"val2", "column":"column2"}]}'::json)
), matrix(val,col) as (
  select x.val, x."column"
  from t
  inner join lateral json_to_recordset((t.d->>'elements')::json) as x(val text, "column" text) on true
)
select (select val from matrix where col = 'column1') as column1
     , (select val from matrix where col = 'column2') as column2

Or in the as x(column1 text, column2 text) clause when using the crosstab function from the tablefunc extension (see this question).

Or in JSON that has somehow been transformed or converted to XML first.
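The crosstab route might be sketched as follows, assuming the tablefunc extension is available. Note that the single-argument crosstab(text) assigns values by position rather than by category name, so the inner query must be ordered consistently:

```sql
-- sketch: requires the tablefunc extension
create extension if not exists tablefunc;

select *
from crosstab($$
  select 1 as row_id, x."column", x.val
  from (values
    ('{"elements":[{"val":"value1", "column":"column1"},
                   {"val":"val2", "column":"column2"}]}'::json)) t(d)
  cross join lateral json_to_recordset(t.d -> 'elements')
       as x(val text, "column" text)
  order by 1, 2
$$) as ct(row_id int, column1 text, column2 text);
```

The two-argument form crosstab(text, text), which takes a separate category query, is safer when some rows may be missing a column.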

Tomáš Záluský

The sample JSON value can be dynamically unpivoted into col and val columns through use of the json_array_elements_text() and json_each() functions by this query

   SELECT json_array_elements_text(v)::json->>'column' AS col,
          json_array_elements_text(v)::json->>'val' AS val
     FROM tab t
    CROSS JOIN json_each(jsval) as js(k,v)

but to pivot the results coming from the above query, the columns still need to be listed individually, depending on the number of columns in the desired result. One way to do this is conditional aggregation:

SELECT MAX(val) FILTER (WHERE col = 'column1') as column1,
       MAX(val) FILTER (WHERE col = 'column2') as column2,
       MAX(val) FILTER (WHERE col = 'column3') as column3
  FROM 
  (
   SELECT json_array_elements_text(v)::json->>'column' AS col,
          json_array_elements_text(v)::json->>'val' AS val
     FROM tab t
    CROSS JOIN json_each(jsval) as js(k,v)
  ) q
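For a self-contained run, the tab table used above could be populated like this (the table name tab and column name jsval are taken from the query; the third element is added to exercise column3):

```sql
-- hypothetical setup matching the queries above
create table tab (jsval json);

insert into tab (jsval) values
  ('{"elements":[{"val":"value1", "column":"column1"},
                 {"val":"val2", "column":"column2"},
                 {"val":"val3", "column":"column3"}]}');
```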

Demo

Barbaros Özhan

Based on @a_horse_with_no_name's proposal:

select 
    jsonb_path_query_first(raw_data_1thbase.data -> 'elements', '$[*] ? (@.column == "column1").val' ) #>> '{}' as column1,
    jsonb_path_query_first(raw_data_1thbase.data -> 'elements', '$[*] ? (@.column == "column2").val' ) #>> '{}' as column2
from raw_data_1thbase;

worked for me.

A.Sizov