
Hi experts :) I'm using WPF with SQL Server.

Problem 1: A lot of data gets created and must be saved to the DB every second, but at the same time multiple parts of the program write to the same tables. Saving to the DB every second seems inefficient since DB calls are expensive. Do you experts disagree, or what should I do? I'm not sure what the best approach is; when would XML or text files be more useful?

Problem 2: I have to read the data back from the same tables that problem 1 is saving to, so I can show it on live graphs. Would this cause read/write problems?

Felice Pollano
Mercury
  • I had a DB class, sent each item to write to a queue, and every x dumped the queue to the DB as a transaction. The problem then is you can't really view completely live data; it's always up to x behind. – BugFinder Oct 04 '12 at 09:42
  • 1
    I remove the tag WPF since the question has nothing to do with WPF – Felice Pollano Oct 04 '12 at 09:42

4 Answers

  1. Loading a lot of data with one-by-one inserts is not a good idea. Look at SqlBulkCopy instead (see the sketch after this list).

  2. The database handles concurrency very well; you can isolate the writes in a proper transaction so that readers only see the data once a complete write is done.
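
A minimal sketch of the SqlBulkCopy approach; the connection string, table name, and column layout are assumptions for illustration, not something from the question:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    // Accumulate rows in a DataTable as data arrives, e.g.:
    //   buffer.Columns.Add("Timestamp", typeof(DateTime));
    //   buffer.Columns.Add("Value", typeof(double));
    // then flush the whole batch in one call.
    static void FlushBatch(DataTable buffer)
    {
        using (var connection = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=True")) // assumed
        {
            connection.Open();
            using (var bulk = new SqlBulkCopy(connection))
            {
                bulk.DestinationTableName = "dbo.Readings"; // assumed table
                bulk.WriteToServer(buffer); // one round trip for the whole batch
            }
        }
    }

LINQ to SQL itself issues one INSERT per item, so dropping down to SqlBulkCopy for the hot path is the usual workaround.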

Felice Pollano
  • Thanks. I'm using LINQ to SQL, and I'm used to doing single inserts. Even if I have a list of 100 items and save it to the DB, it will still be single inserts. Is there a way to save a list of items in bulk using LINQ to SQL? – Mercury Oct 04 '12 at 10:00
  • @user1622488 as far as I know, not directly. – Felice Pollano Oct 04 '12 at 12:16

1. You can use caching to hold your data in memory between writes; you can save it with the Cache class (see the sketch below).

Link : http://msdn.microsoft.com/en-us/library/system.web.caching.cache.add.aspx
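
For instance (a sketch only; in a desktop app you reach this class through HttpRuntime.Cache, and the key name and expiration below are assumptions):

    using System;
    using System.Web;          // requires a reference to System.Web.dll
    using System.Web.Caching;

    // Hold the latest chunk of readings until the next DB flush.
    HttpRuntime.Cache.Add(
        "pendingReadings",             // cache key (assumed name)
        readings,                      // placeholder for your in-memory data
        null,                          // no cache dependency
        DateTime.UtcNow.AddSeconds(5), // absolute expiration (assumed)
        Cache.NoSlidingExpiration,
        CacheItemPriority.Normal,
        null);                         // no removal callback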

2. You don't have a problem with the second scenario; you can use a Transaction to ensure that you only read committed data (see the sketch below).

Link : http://msdn.microsoft.com/en-us/library/system.transactions.transaction.aspx
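
A sketch of the read side; the query, table name, and connectionString are assumptions:

    using System.Data.SqlClient;
    using System.Transactions;

    // Read only committed rows for the live graphs.
    var options = new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted };
    using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
    using (var connection = new SqlConnection(connectionString)) // your connection string
    {
        connection.Open();
        using (var command = new SqlCommand(
            "SELECT Timestamp, Value FROM dbo.Readings ORDER BY Timestamp DESC", connection))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // hand each row to the chart's data source
            }
        }
        scope.Complete(); // read-only here, but completes the scope cleanly
    }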

Aghilas Yakoub

Considering that you have a time limit, you have to process the data in some way within 1 second.

I would suggest:

Problem 1: Save the data you generate in chunks pushed onto a Stack<..>. Then, from another thread, process that Stack<..> and save each chunk to the DB until the Stack<..> is empty. There is no guarantee that you will manage to write the data within 1 second, but once it is in memory you can flush it as fast as the DB allows (see the sketch at the end of this answer).

Problem 2: Since the data is already in memory, you can achieve the maximum possible performance for the live graphs while staying within acceptable memory limits.

It's hard to suggest something really practical here, as performance is always strictly domain specific and cannot be described completely in a short question. But this solution can be taken as a basic guideline.
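
A sketch of that idea, using a thread-safe ConcurrentQueue<..> in place of a hand-locked Stack<..>; DataPoint and SaveChunkToDb are placeholders for your own item type and DB code:

    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Threading;

    class BufferedWriter
    {
        private readonly ConcurrentQueue<DataPoint> _pending =
            new ConcurrentQueue<DataPoint>();

        // Producers call this each time a value is generated.
        public void Add(DataPoint item)
        {
            _pending.Enqueue(item);
        }

        // Run on a background thread: drain the buffer, write one chunk, repeat.
        public void FlushLoop()
        {
            while (true)
            {
                var chunk = new List<DataPoint>();
                DataPoint item;
                while (_pending.TryDequeue(out item))
                    chunk.Add(item);

                if (chunk.Count > 0)
                    SaveChunkToDb(chunk); // e.g. one transaction or SqlBulkCopy

                Thread.Sleep(1000); // flush roughly once per second
            }
        }

        private void SaveChunkToDb(List<DataPoint> chunk) { /* your DB code */ }
    }

    class DataPoint { /* your item type */ }

The UI can also read this in-memory buffer directly for the live graphs, so the charts never have to wait on the database.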

Tigran

Another option is to break up the table(s).

Are your other functions only writing to a subset of the fields of the records?

If so, you can have a 1-to-1 mapping between two tables: one for the initial data, the other for functionA. It depends on how well you can partition the backend needs, but it can significantly reduce collisions.

The basic idea is to look at your tables more like objects. You have a base table; then, if a record is a type 1 thingy, you add a 1-to-1 link to the relevant table. The business functions around type 1 only ever need to write to that table, never the base entity table (see the sketch below).

A possibility anyway.
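
Since the question mentions LINQ to SQL, here is a sketch of what such a split could look like as mapped classes; all table and column names are invented for illustration:

    using System;
    using System.Data.Linq.Mapping;

    // Base table: the initial data every function can see.
    [Table(Name = "Reading")]
    public class Reading
    {
        [Column(IsPrimaryKey = true)] public int Id;
        [Column] public DateTime Timestamp;
    }

    // 1-to-1 extension table: only "function A" ever writes here,
    // so it never contends with writers of the base table.
    [Table(Name = "ReadingFunctionA")]
    public class ReadingFunctionA
    {
        [Column(IsPrimaryKey = true)] public int ReadingId; // same key as Reading
        [Column] public double ProcessedValue;
    }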

Tony Hopkinson