I have a two-tier (non-web) system where I load some objects into memory and occasionally flush them back to the database. Let's assume there is just one object I need to update, but that object is a pretty big graph containing maybe hundreds of entities, and I use cascade so that saving the root propagates to the whole graph.
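Roughly what the load side looks like (BigObject, Children, sessionFactory and id are placeholder names, not my real model; assume the usual using NHibernate;):

```csharp
BigObject the_big_object;

// First session: load the root and initialize the graph, then let the
// session close so everything becomes detached.
using (var session = sessionFactory.OpenSession())
{
    the_big_object = session.Get<BigObject>(id);
    NHibernateUtil.Initialize(the_big_object.Children);
}

// The graph now lives detached in memory; the application mutates
// a handful of its entities over time.
```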
When Session.Update(the_big_object) is called in the second session, NHibernate "correctly" writes every single entity in the graph back to the database, producing a flood of UPDATE statements even for entities that haven't changed. That's understandable, since it has no idea what was changed outside the session. Using Session.Merge(the_big_object) doesn't help much either, since it has to issue a lot of queries as well to load the current state before comparing.
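This is the save side, more or less (again, the names are placeholders):

```csharp
// Second session: reattach the detached graph and save it. With
// cascade="save-update" (or "all") on the collections, the call walks
// the entire graph.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    session.Update(the_big_object);    // one UPDATE per entity in the graph
    // session.Merge(the_big_object);  // alternative, but it SELECTs the graph first
    tx.Commit();
}
```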
The problem is that in my case usually only a few entities in the big graph have actually changed. So what I'm thinking is: instead of relying on cascade to save the whole graph, maybe it's better to keep a collection of dirty entities in memory and only flush those when needed. To do that I would probably have to add a dirty flag to the classes and wire it into their setters, something like the sketch below.
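Here IDirtyAware and ChildEntity are made-up names just to illustrate the idea:

```csharp
public interface IDirtyAware
{
    bool IsDirty { get; }
    void MarkClean();
}

public class ChildEntity : IDirtyAware
{
    private string _name;

    // virtual so NHibernate can still proxy the class
    public virtual string Name
    {
        get { return _name; }
        set
        {
            if (_name != value)
            {
                _name = value;
                IsDirty = true;   // flag the entity whenever a setter changes a value
            }
        }
    }

    public virtual bool IsDirty { get; protected set; }

    public virtual void MarkClean()
    {
        IsDirty = false;
    }
}

// On flush, only the flagged entities would be passed to Session.Update
// (and then marked clean again) instead of cascading from the root.
```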
Then I figured: isn't this exactly what NHibernate already does inside a session to determine which objects are dirty? All that proxy / version machinery works perfectly for persistent objects, but apparently not for detached ones (or does it?). I just feel kind of dumb having to redo it by hand.
Are there any approaches I could take, or is there some magic trick I'm missing?
Thanks a lot!