
I am using PowerShell 5.1 and adding key/value content to a hashtable, but I see that the complete content is not being added for a value when the value is large (around 6000 characters). So my question is: is there any limit when using PowerShell hashtables? If so, what would be another way to save the content as key/value pairs?

    $inputFileJson = @{}
    $inputFileJson.Add($config.NAME, $config.VALUE)
    $inputFileJson | ConvertTo-Json -Depth 20 | Out-File -Encoding utf8 "$localDir\input.json" -Force
  • AFAIK there is no practical limit, other than it has to fit in memory. – Theo Sep 13 '21 at 08:47
    Please add ***to the question*** how you determined "*but I see, it is not adding the complete content in value when the value is too large around 6000 characters*", as that is probably where the real issue lies. – iRon Sep 13 '21 at 09:21
    _The variable '$config' cannot be retrieved because it has not been set_. Please [edit] the question and add some example to your [mcve]. – JosefZ Sep 13 '21 at 16:21

1 Answer


There is no limit on hashtables themselves, but the `-Depth 20` you defined limits how many levels deep `ConvertTo-Json` will convert nested values to JSON format. You are likely hitting the depth limit, not a content size limit. Once this limit is hit, the next object is simply rendered as its `.ToString()` representation, and `ConvertTo-Json` will not delve into further nested values.

You can specify up to 100 levels deep with ConvertTo-Json -Depth 100. However, I find that determining how deep my object actually is, determining how much additional depth to account for, etc. is more trouble than it's worth. Unless you really need to truncate objects more than a certain level deep, just use -Depth 100.
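As a minimal sketch of the truncation behavior (the nested hashtable here is illustrative):

    # A hashtable nested four levels deep
    $data = @{ a = @{ b = @{ c = @{ d = 'value' } } } }

    # With too small a depth, the deeper levels are flattened to a
    # .ToString() value such as "System.Collections.Hashtable"
    $data | ConvertTo-Json -Depth 2

    # With sufficient depth, the full structure is serialized
    $data | ConvertTo-Json -Depth 100

Running both commands and comparing the output should make it obvious whether depth truncation is what is cutting your values short.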
