I have some problems with the sp_parse_json procedure (the docs) in v16.0.0.2614 (I haven't tested v17 yet). Here is a script to reproduce it (I've simplified the JSON string as much as I could; the real string is much more complex):

begin
    declare @json long varchar;
    drop variable if exists @json_row;
    set @json = '{
    "TestA": [{
            "TestN": null,
            "TestA": [{
                    "TestN": null,
                    "TestA": []
                }
            ]
        }, {
            "TestN": null,
            "TestA": []
        }
    ]}';
    call sp_parse_json('@json_row', @json, 100);
    select (@json_row).TestA[[1]].TestA[[1]].TestN;
end

The first (and the most critical) problem

It gives me this error message (SQLCODE=-157): Cannot convert nvarchar to a varray.

It seems like in this case sp_parse_json behaves the same as if the third argument maxlen were less than 5: it parses only the first level of TestA. I came to this conclusion because this query returns the string representation of the second-level TestA:

select (@json_row).TestA[[1]].TestA;

The result string is like this:

[{"TestN": null,"TestA": []}]

It behaves this way when the first-level TestA array has more than one item. When I remove the second item from the first-level TestA, everything seems to work as expected in this example.

It behaves the same when I don't supply the third argument maxlen.

Is it a bug, or am I missing something? Are there any quick workarounds (an option to change)?

The second problem

The docs say that the first argument to the procedure sp_parse_json is the name of the local variable to create. However, it creates a connection-level (global) variable instead of a local one. This is why the drop variable statement is included in my repro: without it, modifying the JSON structure and running the script again on the same connection gives yet another error.

Is it a bug in the docs or in the procedure, or am I missing something again?

The third... question

The procedure sp_parse_json doesn't accept a maxlen bigger than 100, although the docs say it is an INTEGER parameter.

Is it by design or not? It is not a problem for now but maybe this is somehow related.

asked 30 Jan '18, 05:25 by Arthoor

#3. The docs say that maxlen is the recursion limit, and it seems that the deepest JSON you can process is 100 levels deep.

I don't know who named this parameter "maxlen". It should have been "maxdepth" instead.

(30 Jan '18, 07:54) Vlad

I'll try to answer each of your questions:

Issue #1: "Cannot convert nvarchar to a varray" error when attempting to access an element.

This is expected, because your input JSON does not have the same form in each row element of the outer array. Let me try to explain: due to the strict SQL requirement that all declared variables must have a well-formed declaration, sp_parse_json will convert elements of the input JSON into varchar (or nvarchar) when needed. In your input JSON, each element of the outer array has a different form: TestA[[1]].TestA is an ARRAY(1) of a ROW, whereas TestA[[2]].TestA is an ARRAY(0) of something(?). Therefore, to make both elements of the outer TestA look the same, sp_parse_json parses and converts each of these elements into [n]varchar values. The resulting @json_row variable will be something like ARRAY(2) of ROW( TestA varchar(?), TestN varchar(4) ), where (?) is the minimum length required to hold the corresponding JSON fragments. See the docs for more information about composite data types.
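To make that concrete, here is a rough, untested sketch of a possible workaround that follows from this behavior: since the nested fragment is stored as a plain string, it can be re-parsed with a second sp_parse_json call. It assumes the batch from the question has already been run on this connection, and the names @fragment and @json_inner are just placeholders.

begin
    declare @fragment long varchar;
    -- The nested array is held as a plain string inside @json_row:
    set @fragment = (@json_row).TestA[[1]].TestA;   -- '[{"TestN": null,"TestA": []}]'
    drop variable if exists @json_inner;
    -- Re-parse just that fragment into a new connection-level variable.
    call sp_parse_json('@json_inner', @fragment);
    -- The inner elements can now be addressed structurally again.
    select (@json_inner)[[1]].TestN;
end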

Issue #2: The created variable is global, not local.

The documentation is incorrect. When sp_parse_json creates the variable, it is a connection-level (global) variable, not a local variable. I will pass the error on to the doc team.
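As a quick illustration of what connection-level means here (a minimal, untested sketch; @conn_var is a placeholder name), the variable survives the end of the batch that created it, which is why the drop variable guard in the question is needed:

begin
    call sp_parse_json('@conn_var', '{"a": 1}');
end;

-- In a later batch on the same connection the variable is still visible:
select (@conn_var).a;

-- Clean up explicitly; otherwise a later sp_parse_json call against the
-- same name (possibly with a different JSON shape) can fail.
drop variable if exists @conn_var;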

Issue #3: maxlen does not accept a value larger than 100.

I am not sure why this parameter is called "maxlen", since it actually specifies the maximum recursion depth; I agree with Vlad that it would have been better named "maxdepth". A maximum depth of 100 was chosen to ensure that the parser doesn't overflow the CPU stack during parsing.
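For anyone who wants to see the limit in action, here is a minimal sketch (untested; the exact behavior at the boundary is worth verifying on your own build) that builds a JSON value nested 101 levels deep using only arrays:

begin
    declare @deep long varchar;
    -- 101 levels of array nesting: [[[ ... 1 ... ]]]
    set @deep = repeat('[', 101) || '1' || repeat(']', 101);
    drop variable if exists @deep_var;
    -- Expected to fail (or be cut short), since recursion is capped at 100.
    call sp_parse_json('@deep_var', @deep, 100);
end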

answered 30 Jan '18, 08:46 by Mark Culp

Thank you for the explanation.

As to issue #1, wouldn't it be possible to implement slightly different behavior (maybe dependent on some additional argument), so that in the case of an empty TestA[[2]].TestA array, sp_parse_json would determine the structure from the non-empty TestA[[1]].TestA array? An empty array would suit any structure, wouldn't it?

Furthermore, maybe the structure could even be determined using some kind of "union" of the different elements. I mean, if we had a JSON string like [{"N1": 11, "N2": 12}, {"N2": 22, "N3": 23}], couldn't the resulting variable be an ARRAY of ROW(N1, N2, N3)? Only in the case of conflicting elements (say, different types with the same name at the same level) would (N)VARCHAR be generated instead of a ROW.

(31 Jan '18, 01:34) Arthoor
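To make the "union" suggestion above concrete, here is a hypothetical sketch of what such a result might look like if it were built by hand (sp_parse_json does not do this today; the row type, the NULLs for missing members, and the field-access syntax mirroring the question are all assumptions):

begin
    declare @u array(2) of row(N1 int, N2 int, N3 int);
    set @u[[1]] = row(11, 12, null);   -- {"N1": 11, "N2": 12} -> N3 missing, so NULL
    set @u[[2]] = row(null, 22, 23);   -- {"N2": 22, "N3": 23} -> N1 missing, so NULL
    select (@u)[[1]].N1 as n1_1, (@u)[[2]].N3 as n3_2;
end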

I'm not sure I understand your first point. An empty array (an array with zero elements) is not the same as an array with one (or more) elements, so automatically changing an empty array to have an element would be changing the data and thus cannot be done.

I need to think more about your second suggestion, i.e. does adding an (undefined) element to a row change the meaning of the row? That is, does {"N1": 11, "N2": 12} mean the same as {"N1": 11, "N2": 12, "N3": undefined}?

(01 Feb '18, 13:53) Mark Culp

I don't mean to add an element to an array if that array is supposed to be empty. I am talking about the structure of the generated data type, while the actual data in that structure should of course reflect the input string. Maybe these two examples will explain what I mean.

The first example:

begin
    declare @aa array(5) of array(10) of int;
    set @aa[[1]] = array(11, null);
    set @aa[[2]] = array(21);
    set @aa[[3]] = array();
    select cardinality(@aa) as c_outer,
           cardinality(@aa[[1]]) as c_inner_1,
           cardinality(@aa[[2]]) as c_inner_2,
           cardinality(@aa[[3]]) as c_inner_3;
    // the result is 3, 2, 1, 0, as expected
end

The equivalent using sp_parse_json:

begin
    declare @js long varchar;
    set @js = '[[11, null], [21], []]';
    drop variable if exists @aa;
    call sp_parse_json('@aa', @js);
    select cardinality(@aa) as c_outer,
           cardinality(@aa[[1]]) as c_inner_1,
           cardinality(@aa[[2]]) as c_inner_2,
           cardinality(@aa[[3]]) as c_inner_3;
    // this succeeds only when c_inner_1 = c_inner_2 = c_inner_3
end

The SQL Anywhere implementation of arrays allows sub-arrays to hold different numbers of elements, while sp_parse_json is IMHO too restrictive here. :)

(02 Feb '18, 01:51) Arthoor

So what should be the expected outcome here?

(02 Feb '18, 07:31) Volker Barth

I would expect the same result in both examples, i.e. I would expect sp_parse_json to generate ARRAY of ARRAY of INT (instead of ARRAY of VARCHAR).

(05 Feb '18, 01:04) Arthoor