
I'm developing a simple utility which is going to create a *.bak file from a SQL Server database.

If the database is quite small, less than 50 MB, then there is no problem and it works well. But eventually I'm going to work with big databases of 2-3 GB.

Since it's impossible (I guess) to keep such a large amount of data in memory to create the *.bak file, would my utility still work in this case?

marc_s
Alan Coromano
  • It's impossible to say without seeing the code. – Tejs Oct 30 '12 at 17:39
  • You have not constructed your question well. – Furqan Safdar Oct 30 '12 at 17:49
  • 3
    How does the SQL Server Management Studio backup of such large databases work? The backup process is *streaming* the data to disk - chunk by chunk. It's not keeping the whole database in memory and then dumping it to disk.... – marc_s Oct 30 '12 at 17:50
  • 1
    insert junk into db, test yourself – Luke Hutton Oct 30 '12 at 17:53
  • Note: The default command timeout is set to 10 minutes; you can set the `StatementTimeout` property on the `ServerConnection` class. Setting this to zero disables the timeout. – Luke Hutton Oct 30 '12 at 18:03
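To illustrate the comment above: a minimal sketch of disabling the statement timeout through SMO's `ServerConnection`, assuming the SMO assemblies are referenced; the server name is a placeholder.

```
using Microsoft.SqlServer.Management.Common;

// Placeholder instance name; replace with your own server.
var connection = new ServerConnection(@".\SQLEXPRESS");

// The default is 600 seconds (10 minutes); 0 disables the timeout so a
// long-running BACKUP DATABASE statement is not cancelled mid-backup.
connection.StatementTimeout = 0;
```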

2 Answers


Yes, your utility will still work with databases that are 2-3 GB in size. The backup runs on the SQL Server instance itself and is streamed to the *.bak file on disk, so the data never has to fit in your utility's memory.

Tarzan

Yes! Behind the scenes, SQL Server Management Studio uses SMO for tasks such as backup and restore. SMO can handle backups of this size, so you can do your job using SMO functionality as well.

Ehsan Mirsaeedi
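A minimal sketch of the SMO-based backup the answer above points to, assuming the SMO assemblies (Microsoft.SqlServer.Smo, Microsoft.SqlServer.SmoExtended, Microsoft.SqlServer.ConnectionInfo) are referenced; the server name, database name, and backup path are placeholders, not anything from the asker's utility.

```
using System;
using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;

class BackupUtility
{
    static void Main()
    {
        // Placeholder connection; disable the 10-minute default timeout
        // so a multi-gigabyte backup is not cancelled partway through.
        var connection = new ServerConnection(@".\SQLEXPRESS");
        connection.StatementTimeout = 0;

        var server = new Server(connection);

        var backup = new Backup
        {
            Action = BackupActionType.Database,
            Database = "MyDatabase"               // placeholder database name
        };
        backup.Devices.AddDevice(@"C:\Backups\MyDatabase.bak", DeviceType.File);

        // Optional progress reporting; SQL Server streams the data to the
        // .bak file itself, so this process never holds the database in memory.
        backup.PercentComplete += (s, e) => Console.WriteLine($"{e.Percent}% complete");

        backup.SqlBackup(server);
    }
}
```

The key point for large databases is that `SqlBackup` simply issues a `BACKUP DATABASE` command; the server writes the file, so the size of the database does not affect the utility's memory footprint.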