A PowerShell script like this:
# Gather every JSON file in the folder, plus the SQL fragments to wrap each one in
$files = Get-ChildItem -Path "c:\temp\400jsons\*.json"
$beforeSql = Get-Content -Path "c:\temp\400jsons\before.sql" -Raw
$afterSql = Get-Content -Path "c:\temp\400jsons\after.sql" -Raw
$sql = "declare @json nvarchar(max);`r`n"
foreach ($f in $files) {
    # before + json + after, once per file
    $sql += $beforeSql
    $sql += Get-Content $f.FullName -Raw
    $sql += $afterSql + "`r`n"
}
Set-Content -Path "C:\temp\400jsons\out.sql" -Value $sql
Will read a before.sql script like this:
set @json = '
And an after.sql script like this:
';
insert into tableX values(@json);
And then repeatedly read the hundreds or thousands of JSON files in the folder, building a string that is before + json + after for each file, ultimately writing a file out.sql that looks like this:
declare @json nvarchar(max);
set @json = '{ "my":"json1" }';
insert into tableX values(@json);
set @json = '{ "my":"json2" }';
insert into tableX values(@json);
You can copy and paste that into DBV and run it. It's basically your 400 JSON files concatenated into one file with some SQL in between.
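One caveat before running it: if any of the JSON files contain a single quote, the string literal in the generated SQL will break. A minimal, hypothetical tweak to the loop body, assuming you want to escape them the T-SQL way (doubling the quote):
# Hypothetical variant of the loop body: double any single quotes so the JSON
# survives being wrapped inside a T-SQL string literal
$json = (Get-Content $f.FullName -Raw) -replace "'", "''"
$sql += $beforeSql + $json + $afterSql + "`r`n"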
Of course you still have to write before.sql and after.sql so that they form correct SQL for what you actually want to run.
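And if the folder really does hold thousands of files, building $sql with += can get slow, since each += copies the whole string so far. A sketch of the same loop using a .NET StringBuilder instead (assuming PowerShell 5 or later for the ::new() syntax):
$sb = [System.Text.StringBuilder]::new()
[void]$sb.Append("declare @json nvarchar(max);`r`n")
foreach ($f in $files) {
    # Same before + json + after pattern, appended without re-copying the string each time
    [void]$sb.Append($beforeSql)
    [void]$sb.Append((Get-Content $f.FullName -Raw))
    [void]$sb.Append($afterSql + "`r`n")
}
Set-Content -Path "C:\temp\400jsons\out.sql" -Value $sb.ToString()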