
I am migrating a database from On Premises SQL to Azure. This is a process we have performed several times already, successfully.

The process is normally quite simple:

1. Export the database to a Data-tier Application (.bacpac).
2. Change the extension from .bacpac to .zip and pull out the model.xml file.
3. Change the collation from Latin1_General_CI_AS to SQL_Latin1_General_CP1_CI_AS.
4. Recalculate the checksum and update the Origin.xml file.
5. Save both files back into the .zip, change the extension back from .zip to .bacpac, and import it.

Easy peasy.
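Scripted, that edit looks roughly like the sketch below. One assumption to flag: it treats the checksum recorded in Origin.xml as the uppercase SHA-256 hex of model.xml, and all file names are illustrative, so verify both against your own export before trusting it.

```python
# Rough sketch of the collation fix on a .bacpac (which is just a ZIP archive).
# Assumption: the <Checksum Uri="/model.xml"> element in Origin.xml holds the
# uppercase SHA-256 hex of model.xml. Entries are read fully into memory.
import hashlib
import re
import zipfile

SRC = "MyDatabase.bacpac"        # original export (illustrative name)
DST = "MyDatabase_fixed.bacpac"  # rewritten copy (illustrative name)

OLD = "Latin1_General_CI_AS"
NEW = "SQL_Latin1_General_CP1_CI_AS"

with zipfile.ZipFile(SRC, "r") as src, \
     zipfile.ZipFile(DST, "w", zipfile.ZIP_DEFLATED) as dst:

    # Fix the collation in model.xml and recalculate its checksum.
    model = src.read("model.xml").decode("utf-8").replace(OLD, NEW)
    new_hash = hashlib.sha256(model.encode("utf-8")).hexdigest().upper()

    # Update the recorded checksum for /model.xml in Origin.xml.
    origin = src.read("Origin.xml").decode("utf-8")
    origin = re.sub(
        r'(<Checksum Uri="/model\.xml">)[0-9A-Fa-f]+(</Checksum>)',
        r"\g<1>" + new_hash + r"\g<2>",
        origin,
    )

    # Copy every entry across, substituting the two edited files.
    for item in src.infolist():
        if item.filename == "model.xml":
            dst.writestr(item, model)
        elif item.filename == "Origin.xml":
            dst.writestr(item, origin)
        else:
            dst.writestr(item, src.read(item.filename))
```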

But not this time.

The .bacpac is much larger than usual, around 4GB.

As an exercise, I tried simply extracting the model.xml and saving it back into the .zip file without modifying it at all. When I then try to do that a second time, Windows complains that the .zip file is "corrupted".

Even if I don't modify the .bacpac at all, when I try to import it the operation fails with: The ImportExport operation with Request Id '' failed due to 'Could not load package. Multidisk ZIP format is not supported. Multidisk ZIP format is not supported.'.

I'm stuck. Is there anything I can do here? For example, would it be possible to take a larger (4GB+) .bacpac and repackage it so it isn't seen as a "Multidisk" ZIP format, using 7-Zip perhaps?

Thanks

DinahMoeHumm
  • Can you try with this import parameter? `/p:DatabaseMaximumSize=` – Venkataraman R Jul 05 '21 at 08:59
  • I use the Import Data Tier Application wizard in SQL Server Management Studio and pick a service tier with a maximum size of 250GB, which is about 4 times the actual amount of data (including indexes and fulltext indexes) in the original database. So I don't think that's it. – DinahMoeHumm Jul 05 '21 at 13:10
  • When I do the same thing in Azure, I upload the .bacpac to blob storage and use the Import Database wizard in the Portal, picking a serverless configuration with a maximum size of 2.5TB and many cores, thereby ensuring that the available log space is massive too; percentage-wise I never go near 100% (I scale the DB back down after the import is complete). Same problem. – DinahMoeHumm Jul 05 '21 at 13:13

1 Answer


In case someone else comes across this same issue: I have come up with a workaround. It's not a full solution, and I am still open to hearing one, but it is just about acceptable.

The databases we migrate contain, among other things, a table with a VARCHAR(MAX) column used to store text extracted from documents, so that we can implement a fulltext index over that table and give users some search functionality. That table is huge(ish), certainly compared to the rest of the database.

When I create a .bacpac that contains the entire database except that table, the .bacpac is much smaller, so much smaller in fact that our usual process works again: I can correct the collation, import the .bacpac, and thereby migrate the entire database except that one table.

Once I have done that, I go to the local database in SSMS, right-click it and use the Tasks > Import Data wizard: I point the source at a local copy containing the "big table", point the target at the migrated Azure database and import the table, making sure that its columns are correctly defined. Doing the import that way took about an hour, but that's an acceptable time frame in our business. After the import I recreate the constraints, keys and indexes, and recreate the fulltext indexes; a rough sketch of that last step is below.
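For reference, the "recreate constraints, keys and fulltext indexes" step looks roughly like this sketch. Every name in it (server, database, table, column, key, catalog) is illustrative rather than our real schema, and it assumes pyodbc with the Microsoft ODBC driver is installed; adapt it to whatever your own schema needs.

```python
# Sketch of the post-import cleanup run against the migrated Azure database.
# All object names are illustrative; replace them with your own schema.
import pyodbc

# Fulltext DDL generally cannot run inside a user transaction, hence autocommit=True.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # illustrative server
    "DATABASE=MyMigratedDb;UID=...;PWD=...",  # illustrative database/credentials
    autocommit=True,
)
cur = conn.cursor()

# 1. Put back the primary key that the Import Data wizard did not carry over.
cur.execute("""
    ALTER TABLE dbo.DocumentText
    ADD CONSTRAINT PK_DocumentText PRIMARY KEY (DocumentId);
""")

# 2. Recreate the fulltext catalog and the index over the VARCHAR(MAX) column.
cur.execute("CREATE FULLTEXT CATALOG DocumentCatalog AS DEFAULT;")
cur.execute("""
    CREATE FULLTEXT INDEX ON dbo.DocumentText (ExtractedText)
    KEY INDEX PK_DocumentText
    ON DocumentCatalog
    WITH CHANGE_TRACKING AUTO;
""")

cur.close()
conn.close()
```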

This is not a proper "answer" to my question. It worked for us in our specific circumstances, and it may work for another reader who comes across the same issue. If so, you're welcome. If not, I hope someone else comes up with a better answer that works generally. Here's hopin'

DinahMoeHumm