I am coding in ASP.NET C# 4. The database is SQL Server 2012.

I have a table that has 2000 rows and 10 columns. I want to load this table into memory, and if the table is updated or inserted into in any way, I want to refresh the in-memory copy from the database.

I looked into SQL Server Change Tracking, and while it does what I need, it appears I have to write quite a bit of code to select from the change functions -- more coding than I want to do for such a simple scenario.

What is the best (simplest) solution for this problem? Do I go with CacheDependency?
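
If CacheDependency is the answer, I imagine it would look something like the sketch below (the connection string name, table name, and column names are placeholders I made up, and the database would need Service Broker / query notifications enabled, with SqlDependency.Start called once at application startup):

// Assumptions: a "MyDb" connection string exists, the table is dbo.MyTable,
// and query notifications are enabled on the database.
using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class TableCache
{
    private static readonly string ConnectionString =
        System.Configuration.ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;

    // Call once, e.g. in Application_Start.
    public static void Init()
    {
        SqlDependency.Start(ConnectionString);
    }

    public static DataTable GetTable()
    {
        var cached = (DataTable)HttpRuntime.Cache["MyTable"];
        if (cached != null)
            return cached;

        using (var conn = new SqlConnection(ConnectionString))
        // Query notifications require two-part table names and an explicit column list.
        using (var cmd = new SqlCommand("SELECT Id, Col1, Col2 FROM dbo.MyTable", conn))
        {
            var dependency = new SqlCacheDependency(cmd); // cache entry is evicted on any change
            conn.Open();

            var table = new DataTable();
            table.Load(cmd.ExecuteReader());

            HttpRuntime.Cache.Insert("MyTable", table, dependency);
            return table;
        }
    }
}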

user1044169
  • You can create your own audit table to track inserts, updates, and deletes using triggers. Then when any updates are detected by searching the audit table from some timestamp, just reload the updated records. – mellamokb Jul 11 '12 at 16:32
  • My hope is that this is such a common scenario that I can just hook into some existing framework, preferably with a one-liner of code. CacheDependency comes to mind, but I am not sure it's the only thing out there. – user1044169 Jul 11 '12 at 16:50

1 Answer

I currently have a similar problem: I'm implementing a REST service that returns a table with 50+ columns, and I want to cache the data on the client to reduce traffic.

I'm thinking about this implementation:

All my tables have the fields

ID AutoIncrement (primary key)
Version RowVersion (a numeric value that will be incremented 
                    every time the record is updated)

To calculate a "fingerprint" of the table I use the select

select count(*), max(id), sum(version) from ...

Deleting records changes the first value, inserting records changes the second, and updating records changes the third.

So if any of the three values changes, I have to reload the table.
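
A rough sketch of how I would consume that fingerprint from C# (table name, column names, and connection string are placeholders; the caching policy is up to you):

// Assumptions: table dbo.MyTable with ID (int) and Version (rowversion) columns,
// and a connection string passed in by the caller.
using System.Data;
using System.Data.SqlClient;

public class FingerprintCache
{
    private readonly string _connectionString;
    private string _lastFingerprint;
    private DataTable _cachedTable;

    public FingerprintCache(string connectionString)
    {
        _connectionString = connectionString;
    }

    public DataTable GetTable()
    {
        string current = ReadFingerprint();
        if (_cachedTable == null || current != _lastFingerprint)
        {
            _cachedTable = LoadTable();   // reload only when the fingerprint changed
            _lastFingerprint = current;
        }
        return _cachedTable;
    }

    private string ReadFingerprint()
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "SELECT COUNT(*), MAX(ID), SUM(CAST(Version AS bigint)) FROM dbo.MyTable", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                reader.Read();
                // Combine the three values into a single comparable string.
                return string.Format("{0}|{1}|{2}", reader[0], reader[1], reader[2]);
            }
        }
    }

    private DataTable LoadTable()
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand("SELECT * FROM dbo.MyTable", conn))
        {
            conn.Open();
            var table = new DataTable();
            table.Load(cmd.ExecuteReader());
            return table;
        }
    }
}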

Volkmar Rigo