I am trying to read multiple JSON files into SQL. I do not have bulk permissions and am using DbVisualizer, so I currently have to copy and paste each file. Is there a way to read in over 400 files with different names?

Declare @json nvarchar(Max)
Set @json = '{
  "AudioFileResults": [
    {
      "AudioFileName": "test"
    }]}'
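
Once a file's contents are in @json, the array can be shredded into rows with OPENJSON; a minimal sketch, assuming SQL Server 2016+ (the column width is just a placeholder):

-- shred the AudioFileResults array into one row per element
SELECT AudioFileName
FROM OPENJSON(@json, '$.AudioFileResults')
     WITH (AudioFileName nvarchar(200) '$.AudioFileName');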
Pfwangs

1 Answer

A PowerShell script like this:


# collect all the json files in the folder
$files = Get-ChildItem -Path "c:\temp\400jsons\*.json"

# sql fragments that get wrapped around each file's contents
$beforeSql = Get-Content -Path "c:\temp\400jsons\before.sql" -Raw
$afterSql = Get-Content -Path "c:\temp\400jsons\after.sql" -Raw

$sql = "declare @json nvarchar(max);`r`n"

foreach ($f in $files){
    $sql += $beforeSql
    # double up single quotes so the json survives inside a sql string literal
    $sql += (Get-Content $f.FullName -Raw) -replace "'", "''"
    $sql += $afterSql + "`r`n"
}

Set-Content -Path "C:\temp\400jsons\out.sql" -Value $sql

Will read a before.sql script like this:

set @json = '

And an after.sql script like this:

';
insert into tableX values(@json);

And then repeatedly read hundreds or thousands of JSON files in a folder, building a string that is before + json + after over and over again, ultimately writing a file out.sql that looks like this:

declare @json nvarchar(max);
set @json = '{ "my":"json1" }';
insert into tableX values(@json);

set @json = '{ "my":"json2" }';
insert into tableX values(@json);

You can copy and paste that into DbVisualizer and run it. It's basically your 400 JSON files concatenated into one script with some SQL in between.

Of course, you still have to write the before and after files so they contain the SQL you actually want to run.
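
For example, to shred each file into columns instead of inserting the raw JSON, an after.sql along these lines could work (AudioFiles and the column width are placeholder guesses, keyed to the JSON shape in the question):

';
-- the line above closes the string literal opened in before.sql
-- AudioFiles is a placeholder table name; adjust to your schema
INSERT INTO AudioFiles (AudioFileName)
SELECT AudioFileName
FROM OPENJSON(@json, '$.AudioFileResults')
     WITH (AudioFileName nvarchar(200) '$.AudioFileName');

The generated out.sql then runs one such block per file, just like the example above, only with parsed columns instead of the raw JSON value.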

Caius Jard