I have an Azure SQL Database that's been operational for around 5 years and is now approaching 300 GB in size.
The bulk of the data is in a table with around 4 million rows. One of the columns of that table is NVARCHAR(MAX) and stores raw JSON. The size of the stored JSON varies: most rows hold less than 200 KB of JSON, but some rows have JSON up to 16 MB.
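To give a sense of the distribution, this is roughly the query I used to measure it (`dbo.Events` and `JsonPayload` are placeholder names for the real table and column):

```sql
-- Placeholder names: dbo.Events / JsonPayload stand in for the real table/column.
-- DATALENGTH returns the byte size of the NVARCHAR(MAX) value (2 bytes per character).
SELECT
    COUNT(*)                                                               AS TotalRows,
    SUM(CASE WHEN DATALENGTH(JsonPayload) < 200 * 1024 THEN 1 ELSE 0 END) AS RowsUnder200KB,
    MAX(DATALENGTH(JsonPayload)) / 1048576.0                              AS LargestJsonMB,
    SUM(CAST(DATALENGTH(JsonPayload) AS BIGINT)) / 1073741824.0           AS TotalJsonGB
FROM dbo.Events;
```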
My initial thought was to migrate the JSON out of the database into Azure Blob Storage, and update each row with the URL of its blob. Then, after the migration is complete, run DBCC SHRINKDATABASE to release the previously allocated space that is no longer needed.
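Concretely, the plan would be a per-row update as the migration job uploads each blob, followed by a single shrink at the end. A sketch of what I have in mind (`dbo.Events`, `BlobUrl`, and `MyDatabase` are placeholder names; the `10` is the target percent of free space to leave after shrinking):

```sql
-- Per-row step, issued by the migration job after uploading a row's JSON to Blob Storage:
UPDATE dbo.Events
SET BlobUrl = 'https://myaccount.blob.core.windows.net/json/12345.json',  -- URL of the uploaded blob
    JsonPayload = NULL                                                    -- drop the inline JSON
WHERE Id = 12345;

-- Once all rows are migrated, reclaim the freed space, leaving 10% headroom:
DBCC SHRINKDATABASE (N'MyDatabase', 10);
```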
Is this a valid approach, or is there a better strategy? Are there pitfalls to watch out for when using DBCC SHRINKDATABASE?