
[This is a different approach to a similar question I asked a couple of days ago; I recognize the "cloning" approach was wrong, as it would involve too many changes and take the actual DB offline.]

EXPLANATION:

I have a set of system installations (web app + SQL Server database), where each database contains two groups of objects: a small "core" group and a second, large "instance_specific" group.

Those systems need to be rebuilt from another system from time to time (like an Excel financial scenario case). A rebuild works best if "core" can stay while everything else is deleted and re-imported from a source database (the source database can differ from one rebuild to the next).

SO FAR:

CREATE DATABASE Database1_copy AS COPY OF Database1; 

is not an option, as I would then need to rebuild all of "core"; besides, Azure does not handle well a DB created that way from SSMS, and I would have to re-establish the needed connections and re-create the users and roles... (swapping the app to a different db is not a good idea).

The same applies to restoring from a backup / Import Data-Tier Application ...

Generating a script does not really seem like a good option either: it does not include SEQUENCES (and probably other objects as well), and it is far too large to be practical (see the numbers at the end). How would I even load a 360 MB script into SSMS to execute it against the destination system?

SQLAzureMW v5.15.6 Release Binary for SQL Server 2014: I was not really able to try it, as it runs out of memory and throws an exception (with 16 GB of RAM). Are the databases too big, or is it buggy?

Solutions based on manual use of the SQL backup functions, such as https://www.mssqltips.com/sqlservertip/1243/auto-generate-sql-server-database-restore-scripts/, are not supported in Azure ...

To get an instance "clean" and ready to receive the import, I really like the idea from the following post, filtering out the objects that belong to "core": Drop all the tables, stored procedures, triggers, constraints and all the dependencies in one sql statement
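
Adapting that idea, I imagine something along these lines; this is only a sketch (the [core] schema is my own assumption, the real filter would depend on how core objects are actually marked in the db):

    -- Sketch only: generate DROP statements for everything outside a hypothetical [core] schema.
    DECLARE @sql nvarchar(max) = N'';

    -- Foreign keys first, so the tables can be dropped afterwards.
    SELECT @sql += N'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
                 + N' DROP CONSTRAINT ' + QUOTENAME(fk.name) + N';' + CHAR(13)
    FROM sys.foreign_keys AS fk
    JOIN sys.tables AS t ON fk.parent_object_id = t.object_id
    WHERE SCHEMA_NAME(t.schema_id) <> N'core';

    -- Then the instance-specific tables, views, procedures and functions.
    SELECT @sql += N'DROP ' + CASE o.type
                                  WHEN 'U' THEN N'TABLE '
                                  WHEN 'V' THEN N'VIEW '
                                  WHEN 'P' THEN N'PROCEDURE '
                                  ELSE N'FUNCTION ' END
                 + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.' + QUOTENAME(o.name) + N';' + CHAR(13)
    FROM sys.objects AS o
    WHERE o.type IN ('U', 'V', 'P', 'FN', 'IF', 'TF')
      AND SCHEMA_NAME(o.schema_id) <> N'core'
      AND o.is_ms_shipped = 0;

    -- PRINT @sql;   -- review the generated statements before executing
    EXEC sp_executesql @sql;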

And given the amount of data, it would be great if the data flowed from db to db directly, not back through my local machine, and even better if it could be executed from the destination DB.
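
From what I have read, Azure SQL "elastic query" external tables might allow exactly that kind of db-to-db pull, executed from the destination database. A rough sketch (the server name, credential and the dbo.Customer table are placeholders of mine; one external table would be needed per source table):

    -- On the DESTINATION database (sketch; names and secrets are placeholders).
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    CREATE DATABASE SCOPED CREDENTIAL SourceDbCred
        WITH IDENTITY = '<sql user on the source server>', SECRET = '<password>';

    CREATE EXTERNAL DATA SOURCE SourceDb WITH
    (
        TYPE = RDBMS,
        LOCATION = 'sourceserver.database.windows.net',
        DATABASE_NAME = 'Database1',
        CREDENTIAL = SourceDbCred
    );

    -- External table mirroring the source table definition (read-only).
    CREATE EXTERNAL TABLE dbo.Customer_src
    (
        CustomerId int NOT NULL,
        Name       nvarchar(100) NOT NULL
    )
    WITH (DATA_SOURCE = SourceDb, SCHEMA_NAME = 'dbo', OBJECT_NAME = 'Customer');

    -- Pull the data across directly, db to db (IDENTITY columns would need SET IDENTITY_INSERT ON).
    INSERT INTO dbo.Customer (CustomerId, Name)
    SELECT CustomerId, Name
    FROM dbo.Customer_src;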

SO:

In my best dreams I imagine a stored procedure (similar to the one from the Stack Overflow post above) that goes object by object, reconstructs it, loads the data into it, and even initializes the sequences to their last used values.

I am trying to copy TABLES (with constraints and indexes, AND DATA), SEQUENCES (with their current values), VIEWS, SPs, and FUNCTIONS, and I think that's all (there is no need to pass USERS and ROLES).
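
For the sequences part, for example, I guess something like this could be run on the source to generate statements that are then executed on the destination, so its sequences continue from the last issued values (sketch, not tested):

    -- Run on the SOURCE db: emits one ALTER SEQUENCE per sequence, to be executed on the destination.
    SELECT N'ALTER SEQUENCE ' + QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name)
         + N' RESTART WITH ' + CAST(CAST(current_value AS bigint) + 1 AS nvarchar(20)) + N';'
    FROM sys.sequences;
    -- (+1 assumes current_value has already been issued; a never-used sequence would need its start value instead.)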

I also came across ApexSQL and RedBeltTools; I haven't tried them yet, but maybe they are the solution, even though I would prefer not to rely on 3rd party software that runs locally.

Am I out of options?

Should I start digging into how to build my own stored-procedure-based migration tool? (I am not really sure how/where to start ...)

Just some numbers from an actual source database (a query sketch for obtaining such counts follows the list):

CHECK_CONSTRAINT    12
DEFAULT_CONSTRAINT  259
FOREIGN_KEY_CONSTRAINT  145
PRIMARY_KEY_CONSTRAINT  162
SEQUENCE_OBJECT 7
SERVICE_QUEUE   3
SQL_INLINE_TABLE_VALUED_FUNCTION    1
SQL_SCALAR_FUNCTION 27
SQL_STORED_PROCEDURE    765
SQL_TABLE_VALUED_FUNCTION   6
UNIQUE_CONSTRAINT   54
USER_TABLE  268
VIEW    42
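
(For reference, a query along these lines against sys.objects produces that kind of breakdown:)

    -- Count user objects per type, excluding system objects.
    SELECT o.type_desc, COUNT(*) AS object_count
    FROM sys.objects AS o
    WHERE o.is_ms_shipped = 0
    GROUP BY o.type_desc
    ORDER BY o.type_desc;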

Other than users and roles, I don't think anything else is missing here, is it?

Scripting the DB produces a 360 MB file (over 500k lines), and it seems not to include SEQUENCES, at least (SSMS scripting only lets me choose: Tables/Views/SPs/UserDefFunctions/Users/DatabaseRole).
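
If it comes to that, I suppose the missing sequences could be scripted by hand from sys.sequences, along these lines (sketch; only the type, start value and increment are covered, MIN/MAX/CYCLE/CACHE options would need adding):

    -- Generate one CREATE SEQUENCE per sequence, restarting just after the last issued value.
    SELECT N'CREATE SEQUENCE ' + QUOTENAME(SCHEMA_NAME(s.schema_id)) + N'.' + QUOTENAME(s.name)
         + N' AS ' + TYPE_NAME(s.system_type_id)
         + N' START WITH ' + CAST(CAST(s.current_value AS bigint) + 1 AS nvarchar(20))
         + N' INCREMENT BY ' + CAST(CAST(s.increment AS bigint) AS nvarchar(20)) + N';'
    FROM sys.sequences AS s;
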
pGrnd2
  • Could you clarify what your question is? An unclear question will give you an unclear answer. Example: How to pass the schema from one Azure SQL db to another? How to pass the data? How to pass both data and schema? – feranto Jan 29 '17 at 00:21
  • @feranto, sorry, yes, true, I thought it was clear (English is not my first language). I am trying to pass 90% of the schema plus the data in the respective tables – pGrnd2 Jan 30 '17 at 19:35

1 Answer


Since you have mentioned that you want to replicate 90% of an Azure SQL database into an existing one, I see this scenario as comparing two databases in schema and data and sending selected changes from an origin db to a destination db.

That said, I would recommend using SQL Server Data Tools. It has a complete suite of tools that could aid you in that scenario, and multiple ways to update the destination db: inline, via script, via dacpac, via bacpac.

Also, since your question is so open, I don't know if it will fully solve your scenario, but take a look at it and its compare functionality.

feranto
  • thanks for the reply; I'm not sure how I could make the question less open. To me (in my mind) it seems as simple as restoring a bacpac into an existing db while excluding a few objects. I realise that a "selective restore of a backup" may seem counterintuitive. I believe my scenario is not that different from a PRODUCTION -> QUALITY clone where some objects are specific to the environment. Thanks for SQL Server Data Tools, I didn't know about them, but basically they look like the RedBelt or ApexSQL tools, don't they? – pGrnd2 Jan 30 '17 at 19:42