
I'm working on a trigger.

declare
  v_time number(11, 0);
begin
  for i in 0..1 loop
    select json_value(body, '$.sections[0].capsules[0].timer.seconds')
    into   v_time
    from   bodycontent
    where  contentid = 1081438;
    dbms_output.put_line(v_time);
  end loop;
end;

However, I cannot make the array index dynamic, like this:

JSON_VALUE(body, '$.sections[i].capsules[i].timer.seconds')

Is there any way I can do this?

GMB
Junseok Lee

4 Answers


You would need to concatenate the variable into the JSON path:

JSON_VALUE(body, '$.sections[' || to_char(i) || '].capsules[0].timer.seconds')
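If that concatenation raises ORA-00907, it is because Oracle requires the path argument of JSON_VALUE to be a compile-time text literal; one workaround is to build the entire statement with native dynamic SQL instead. A minimal sketch, assuming the table and column names from the question (untested, index hard-coded to 0 for illustration):

```sql
declare
  v_time number(11, 0);
  i      pls_integer := 0;  -- the index you want to substitute
begin
  -- the whole statement is assembled as a string, so the path
  -- JSON_VALUE receives at parse time is a plain text literal
  execute immediate
    'select json_value(body, ''$.sections[' || i ||
    '].capsules[' || i || '].timer.seconds'')' ||
    ' from bodycontent where contentid = 1081438'
  into v_time;
  dbms_output.put_line(v_time);
end;
```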

I don't really see how your question relates to a trigger.

GMB
  • First, thank you for your answer. We were creating a trigger to store JSON data in other tables when the table data was updated. But when I do as you suggested, the following error occurs: [ORA-00907: missing right parenthesis]. How should I fix it? – Junseok Lee Oct 30 '20 at 00:41
  • Not quite like that. The path must be a text literal. The solution is dynamic SQL. I just wrote an answer to show how. –  Feb 21 '21 at 06:53

You can use JSON_TABLE:

declare
  v_time number(11, 0);
begin
  for i in 0..1 loop
    SELECT time
    INTO   v_time
    FROM   bodycontent b
           CROSS APPLY
           JSON_TABLE(
             b.body,
             '$.sections[*]'
             COLUMNS (
               section_index FOR ORDINALITY,
               NESTED PATH '$.capsules[*]'
               COLUMNS (
                 capsule_index FOR ORDINALITY,
                 time NUMBER(11,0) PATH '$.timer.seconds'
               )
             )
           ) j
    WHERE  j.section_index = i + 1
    AND    j.capsule_index = i + 1
    AND    b.contentid=1081438;

    dbms_output.put_line(v_time);
  end loop;
end;
/

Which, for the test data:

CREATE TABLE bodycontent ( body CLOB CHECK ( body IS JSON ), contentid NUMBER );

INSERT INTO bodycontent ( body, contentid ) VALUES (
  '{"sections":[
     {"capsules":[{"timer":{"seconds":0}},{"timer":{"seconds":1}},{"timer":{"seconds":2}}]},
     {"capsules":[{"timer":{"seconds":3}},{"timer":{"seconds":4}},{"timer":{"seconds":5}}]},
     {"capsules":[{"timer":{"seconds":6}},{"timer":{"seconds":7}},{"timer":{"seconds":8}}]}]}',
  1081438
);

Outputs:

0
4

Or, you can just use a query:

SELECT section_index, capsule_index, time
FROM   bodycontent b
       CROSS APPLY
       JSON_TABLE(
         b.body,
         '$.sections[*]'
         COLUMNS (
           section_index FOR ORDINALITY,
           NESTED PATH '$.capsules[*]'
             COLUMNS (
               capsule_index FOR ORDINALITY,
               time NUMBER(11,0) PATH '$.timer.seconds'
             )
           )
       ) j
WHERE  ( j.section_index, j.capsule_index) IN ( (1,1), (2,2) )
AND    b.contentid=1081438;

Which outputs:

SECTION_INDEX | CAPSULE_INDEX | TIME
------------: | ------------: | ---:
            1 |             1 |    0
            2 |             2 |    4

(Note: the indexes from FOR ORDINALITY are 1 higher than the array indexes in the JSON path.)

db<>fiddle here

MT0

So, the problem is that you want to loop over an index (to go diagonally through the array of arrays, picking up just two elements) - but the JSON_* functions don't work with variables as array indices - they require hard-coded indexes.

PL/SQL has an answer for that - native dynamic SQL, as demonstrated below.

However, note that this approach makes repeated calls to JSON_VALUE() over the same document. Depending on the actual requirement (I assume the one in your question is just for illustration) and the size of the document, it may be more efficient to make a single call to JSON_TABLE(), as illustrated in MT0's answer. If you really are extracting only two scalar values from a very large document, two calls to JSON_VALUE() may be justified. But if you are extracting many scalar values from a document that is not too complicated, a single call to JSON_TABLE() may be better, even if you don't end up using all the data it produces.

Anyway - as an illustration of native dynamic SQL, here's the alternative solution (using MT0's table):

declare
  v_time number(11, 0);
begin
  for i in 0..1 loop
    execute immediate q'#
    select json_value(body, '$.sections[#' || i ||
                      q'#].capsules[#' || i || q'#].timer.seconds')
    from   bodycontent
    where  contentid = 1081438#'
    into v_time;
    dbms_output.put_line(v_time);
  end loop;
end;
/


0
4

PL/SQL procedure successfully completed.

You can see the example below.

Run this query in your console and study how the path expression addresses the array:

SELECT JSON_VALUE('{
  "increment_id": "2500000043",
  "item_id": "845768",
  "options": [
    {
      "firstname": "Kevin"
    },
    {
      "firstname": "Okay"
    },
    {
      "lastname": "Test"
    }
  ]
}', '$.options[0].firstname') AS value
  FROM DUAL;
Golam Kibria