20

For the following code:

$sql = "select ..... for xml path('row')" # returns a long xml file
$x = Invoke-SqlCmd -Server xxxx $sql
$r = $x[0].ItemArray[0]

The returned $r holds a truncated xml string. How do I make sure the full long string is returned?

ca9163d9

2 Answers

32

That cmdlet has a default maximum character length of 4,000 characters for XML and char data types (see the Invoke-Sqlcmd documentation and search for -MaxCharLength).

Try the following:

$x = Invoke-Sqlcmd -ServerInstance xxxx -Query $sql -MaxCharLength <some large enough number>
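
For a concrete illustration, here is a minimal sketch; the server, database, query, and the value passed to -MaxCharLength are placeholders, so adjust them to your environment:

# Hypothetical server/database/query; raise -MaxCharLength well above the
# expected document size so the XML column is no longer cut off at 4,000 chars.
$sql = "SELECT Name, Value FROM dbo.Settings FOR XML PATH('row'), ROOT('rows')"
$x = Invoke-Sqlcmd -ServerInstance "MyServer\MyInstance" -Database "MyDb" -Query $sql -MaxCharLength 10000000
$r = $x[0].ItemArray[0]
$r | Out-File -FilePath "C:\temp\result.xml" -Encoding utf8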
New Guy
  • Actually I found the result is broken into multiple rows. – ca9163d9 Feb 06 '15 at 21:35
  • @dc7a9163d9 I found that to be true as well from some quick tests I've done. However, it should be easy enough to rebuild your xml document (see the sketch after these comments). Just put humpty dumpty together again. ;) That said, I have also found that if your query returns more than 4,000 characters of char data for a single column (1,024 bytes for binary; use `-MaxBinaryLength`), you will need the command above to make sure you get the whole document. – New Guy Feb 07 '15 at 00:54
  • Admittedly, though, my tests were just with binary data I already had in a DB somewhere. I didn't do any tests on xml or other char data, so I don't know how PowerShell would chunk them up. For me it looked like it was chopping them up by bytes; perhaps for an xml doc it does it per line. I would have to test it. – New Guy Feb 07 '15 at 01:02
  • This answer solved the problem for me, without the data being split into multiple rows. I successfully read a string of over 80,000 bytes. – bart Feb 20 '17 at 18:51
  • Thanks. The fact that this is necessary at all shows how broken PowerShell is, doubly so that the solution needs to involve guessing a large-enough number instead of them providing something like -NoTruncate. – CrazyPyro Dec 03 '20 at 14:22
  • @CrazyPyro This is not a limitation or issue with PowerShell. This is all up to the SQL Server team and the module they wrote; it is no different from having a similar issue in Python or other languages that support 3rd-party modules. – New Guy Jul 06 '21 at 14:29
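
If the document does come back split across several rows, as the comments above discuss, a sketch along the following lines can stitch it back together (the server name is a placeholder, and $sql is assumed to hold the FOR XML query from the question):

# Each returned row may hold one chunk of the XML document.
$rows = Invoke-Sqlcmd -ServerInstance "MyServer" -Query $sql -MaxCharLength 10000000

# Concatenate the first column of every row to rebuild the full document,
# then cast to [xml] to verify the result is well formed before saving it.
$xmlText = -join ($rows | ForEach-Object { $_.ItemArray[0] })
[xml]$doc = $xmlText
$doc.Save("C:\temp\full.xml")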
4

In many cases this solves the problem:

$x = Invoke-Sqlcmd -ServerInstance xxxx -Query $sql -MaxCharLength <some large enough number>

However, there are some cases where it doesn't work:

  • Somewhere between 80,000 and 3,500,000 chars this solution appears to break down.
  • The result I got was scrambled: the inner XML broke the outer XML, so clearly at least the version we use has some defects in it as well.

You could try a couple of workarounds:

  • Limit content to x chars, such as 80,000, and don't try to export anything longer than that. I didn't test whether this also avoids the defect case, so if someone else has this problem, please comment on whether it helps.
  • I exported everything as CSV, which broke the inner XML, created a temporary XML result, and finally restored the inner XML (see the sketch after this list). This solution worked. The -Raw option when reading the files was necessary to get acceptable performance with files of almost one GB.
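
As a rough illustration of the second workaround (the file paths and the exact fix-up replacement are assumptions; adapt them to whatever your CSV export actually produces):

# Hypothetical paths; export the result set to CSV first.
Invoke-Sqlcmd -ServerInstance "MyServer" -Query $sql -MaxCharLength 10000000 |
    Export-Csv -Path "C:\temp\result.csv" -NoTypeInformation

# -Raw returns the whole file as one string instead of an array of lines,
# which is much faster to post-process for files approaching 1 GB.
$text = Get-Content -Path "C:\temp\result.csv" -Raw

# Fix up the inner XML that the CSV export mangled; the exact replacement
# depends on your data (here: undo the doubled quotes that CSV escaping adds).
$text = $text -replace '""', '"'
Set-Content -Path "C:\temp\fixed.xml" -Value $text -Encoding UTF8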
  • I had a 4,000 char limit and I couldn't fix it by adding any random number. Changing to 80,000 resolved my char limit issue. Thanks! – Arnine Aug 31 '22 at 12:57