20

I'm concatenating strings together using "for JSON path('')".

I have set the Tools->Options->SQL Server->Results to Grid options to max.

I have set the Tools->Options->SQL Server->Results to Text options to max.

Executing the query in Grid mode and copying the one row/one column results, I see the return value is limited to 2033 characters.

How can I ensure the returned value isn't truncated?

צח חורי
  • `I'm concatenating strings together using "for JSON path` - why? Maybe we can solve the problem you're solving if you give us more background on what the problem is and why you thought `for JSON path` was a good fit. – Aaron Bertrand Mar 03 '19 at 23:36
  • Please look: https://stackoverflow.com/questions/52899115/sql-server-2016-ssms-json-formatting – BlueGreenWorld Aug 15 '22 at 22:35

10 Answers

14

This behavior is documented in the FOR JSON documentation:

A large result set splits the long JSON string across multiple rows.

By default, SQL Server Management Studio (SSMS) concatenates the results into a single row when the output setting is Results to Grid. The SSMS status bar displays the actual row count.

Other client applications may require code to recombine lengthy results into a single, valid JSON string by concatenating the contents of multiple rows. For an example of this code in a C# application, see Use FOR JSON output in a C# client app.
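For instance, the following query produces a JSON string far longer than 2,033 characters; run it with Results to Text and you can watch the documented splitting happen (a quick sanity check, using a catalog view that exists in every database):

-- One long JSON document, streamed back in roughly 2 KB chunks:
SELECT name, object_id FROM sys.all_objects FOR JSON AUTO;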

Therefore, using FOR JSON to concatenate strings (when the result is longer than 2033 bytes) is not the best idea.

Try using FOR XML instead. For example:

SELECT STUFF((
    SELECT ', ' + name FROM sys.columns FOR XML PATH(''), TYPE
    -- TYPE plus .value() keeps the result a single nvarchar(max) and unescapes XML entities
).value('.', 'nvarchar(max)'), 1, 2, '')
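If you are on SQL Server 2017 or later, a sketch of the same concatenation using STRING_AGG avoids the XML round-trip; note the cast to nvarchar(max), which keeps the aggregate from failing once the result exceeds 8,000 bytes:

SELECT STRING_AGG(CAST(name AS nvarchar(max)), ', ')
FROM sys.columns;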
Razvan Socol
11

Inserting into an nvarchar(max) variable or a table will do it.

declare @json table (j nvarchar(max));
insert into @json select * from (select * from Table where Criteria1 for json auto) a(j);
insert into @json select * from (select * from Table where Criteria2 for json auto) a(j);
select * from @json;

or

DECLARE @JSON nvarchar(max);
SET @JSON = (SELECT * FROM Table FOR JSON AUTO);
SELECT @JSON;
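The variable holds the complete document even if the grid still shortens what it displays; a quick check that nothing was lost:

SELECT LEN(@JSON) AS json_length; -- full length of the stored string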
Barak G.
7

I've noticed that it suffices to just put the JSON (or XML) into a scalar subquery:

-- Useless
select row_number() over(order by (select 1)) as rn
from sys.all_objects
for json auto

-- Useful
select (
  select row_number() over(order by (select 1)) as rn
  from sys.all_objects
  for json auto
)

... at least in newer versions of SQL Server, such as 2019; presumably the scalar subquery returns the JSON as a single nvarchar(max) value instead of a streamed multi-row result. There is a dbfiddle demo here.

Lukas Eder
5

I had a similar question and found an answer by casting the result to varchar. Be aware that the result can still be truncated elsewhere (a plain varchar(n) caps at 8,000 characters, which is why the cast below uses VARCHAR(MAX)), but it works for my situation. Hope it helps.

SELECT CAST((SELECT [Columns] FROM [Tables] FOR JSON PATH) AS VARCHAR(MAX)) AS JSONDATA
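Since FOR JSON produces nvarchar(max) output, casting to NVARCHAR(MAX) instead also preserves Unicode characters; a minimal variant of the same query:

SELECT CAST((SELECT [Columns] FROM [Tables] FOR JSON PATH) AS NVARCHAR(MAX)) AS JSONDATA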
SheldonZ
2

If you use Azure Data Studio for this query, the results can be opened as expected, without truncation.

In newer versions of SSMS, it is available from the Tools menu.

Download it from here - https://learn.microsoft.com/en-us/sql/azure-data-studio/

Elken
1

Put your SQL query into a database scalar function (a "scalar" function returns a single value - in this case a string of text). That seems to be the most straightforward way to handle the truncation.
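A minimal sketch of that idea, assuming a hypothetical table dbo.MyTable (adjust the inner query to your schema):

CREATE FUNCTION dbo.GetTableAsJson ()
RETURNS nvarchar(max)
AS
BEGIN
    -- Assigning the FOR JSON result to a variable yields one complete nvarchar(max) value
    DECLARE @json nvarchar(max) = (SELECT * FROM dbo.MyTable FOR JSON AUTO);
    RETURN @json;
END;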

Graham Laight
0

I got this to work by using a string builder and appending the additional rows inside the DataReader loop.

My original logic read only a single row, so as soon as the reader returned more than one row the value hit the 2033-character limit.

0

With the latest SSMS (18.8), truncation is no longer a problem: just click the highlighted output in the grid-view result set and the entire string opens in a new document.

0

You can generate JSON for each row to avoid the line limit, then manually replace the line breaks in the result with commas and add brackets to turn it into an array.

SELECT (
    SELECT [column1], [column2]
    FOR JSON PATH, without_array_wrapper
)
FROM [dbo].[table_name]

Note that you cannot use * to select all columns. You can query column names with:

SELECT column_name FROM information_schema.columns WHERE table_name = 'table_name'
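Alternatively, on SQL Server 2017 or later you can skip the manual post-processing and assemble the array server-side with STRING_AGG; a sketch using the same hypothetical table and columns:

SELECT '[' + STRING_AGG(j, ',') + ']' AS json_array
FROM (
    SELECT CAST((SELECT [column1], [column2]
                 FOR JSON PATH, WITHOUT_ARRAY_WRAPPER) AS nvarchar(max)) AS j
    FROM [dbo].[table_name]
) AS per_row; -- nvarchar(max) input keeps STRING_AGG from truncating at 8,000 bytes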

If you are running from the command line, use the bcp utility instead of sqlcmd, since sqlcmd truncates long variable-length output.

Here is a sample PowerShell script:

# Assumes $DB_SERVER, $DB_NAME, $DB_USERNAME, $DB_PASSWORD, $tableName,
# $conditionExpression and $outFile are defined earlier in the script.

# Fetch the column names for the table
$columnsQuery = "SET NOCOUNT on;SELECT column_name FROM information_schema.columns WHERE table_name = N'$tableName'"
$columns = sqlcmd -S $DB_SERVER -d $DB_NAME -U $DB_USERNAME -P $DB_PASSWORD -h -1 -u -y 8000 -Q $columnsQuery

# Comma-separate the column names, stripping spaces and wrapping each name in brackets
$columnsCommaSeparated = ($columns -join ',').replace(' ','') -replace '[a-zA-Z0-9_]+', '[$0]'

# Run bcp command
$query="select (select $columnsCommaSeparated for json path, without_array_wrapper) from [$tableName] $conditionExpression"
bcp """$query""" queryout $outFile -S $DB_SERVER -d $DB_NAME -U $DB_USERNAME -P $DB_PASSWORD -w

# Join the per-row JSON fragments into a single JSON array
$result = Get-Content $outFile 
$jsonWithoutBracket = $result -join ','
$json = "[$jsonWithoutBracket]"

# Write
$Utf8WithBomEncoding = New-Object System.Text.UTF8Encoding $True
[System.IO.File]::WriteAllLines($outFile , $json, $Utf8WithBomEncoding)
Lagyu
0

I was able to retrieve a giant JSON document from SQL Server into .NET Core with Dapper by doing this:

using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class AccountMigrationQueries : SingleBaseRepository
{
    public AccountMigrationQueries(IReadOnlyDbConnection readOnlyConnection) : base(readOnlyConnection) { }

    private readonly string sqlReturnJsonLarge = @"
        declare @json table (result nvarchar(max));
        insert into @json
        select * from (select ... from ... where ... for Json auto, without_array_wrapper) a(result)
        select * from @json
    ";
        
    public async Task<YourClass> GetByIdAsync()
    {
        var dReader = await ExecuteReaderAsync(sqlReturnJsonLarge);

        // FOR JSON output longer than ~2 KB arrives split across rows;
        // rebuild the complete document with a StringBuilder.
        StringBuilder json = new StringBuilder();
        while (dReader.Read())
        {
            json.Append(dReader.GetString(0));
        }
        dReader.Close();
        return JsonConvert.DeserializeObject<YourClass>(json.ToString());
    }
}

// other file:
using System.Data;
using System.Threading.Tasks;
using Dapper;
public abstract class SingleBaseRepository
{
    protected readonly IReadOnlyDbConnection _readOnlyConnection;
    protected SingleBaseRepository(IReadOnlyDbConnection readOnlyConnection)
    {
        _readOnlyConnection = readOnlyConnection;
    }

    protected async Task<IDataReader> ExecuteReaderAsync(string sql, object parameters = null)
    {
        return await _readOnlyConnection.ExecuteReaderAsync(sql, parameters); // <= Dapper
    }
}