
I'm dealing with a large-scale DB that grows every day. Pulling the required data from this DB involves several joins, and because of the large amount of data it takes too long. A friend suggested the following:

Once a day, pull all the required data from the DB and write it to a binary file that will reside in your source control. Then create a DAL implementation that works against this binary file; this way things should run smoothly.

I'm not familiar with this methodology and I'm wondering: is it good practice? What are the advantages and disadvantages of such an approach, and is there any reference for such an implementation (currently I'm using JPA)?
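To make the question concrete, here is how I read the suggestion, as a minimal sketch in plain Java: the nightly job serializes the joined result rows to a file, and the DAL reads from that file instead of the database. All class and method names here are my own, just for illustration.

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

// Hypothetical DTO representing one row of the expensive joined query.
record OrderSummary(long id, String customer, double total) implements Serializable {}

public class FileBackedDao {
    // Nightly job: run the expensive joins once, then dump the rows to a file.
    static void dump(List<OrderSummary> rows, Path file) throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(Files.newOutputStream(file))) {
            out.writeObject(new ArrayList<>(rows));
        }
    }

    // DAL read path: deserialize the snapshot instead of querying the database.
    @SuppressWarnings("unchecked")
    static List<OrderSummary> load(Path file) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(Files.newInputStream(file))) {
            return (List<OrderSummary>) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Path file = Files.createTempFile("orders", ".bin");
        List<OrderSummary> rows = List.of(new OrderSummary(1, "acme", 99.5));
        dump(rows, file);
        System.out.println(load(file).equals(rows)); // round trip preserves the data
    }
}
```

Of course the data would then be up to a day stale, which is part of what I'm asking about.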

Thanks in advance

forhas

1 Answer


A few things:

You could probably optimize how you are querying your data. See, http://java-persistence-performance.blogspot.com/2010/08/batch-fetching-optimizing-object-graph.html
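The core idea behind batch fetching is to replace the "N+1 query" access pattern (one query for the parent rows, then one query per row for its related objects) with one query per batch. A plain-Java illustration of the arithmetic, with no JPA involved; the method names are mine, and the counts stand in for round trips to the database (in EclipseLink the actual mechanism would be a query hint such as `eclipselink.batch`):

```java
// Illustrates why batch fetching cuts query volume: counts simulate
// database round trips for N parent rows and their related objects.
public class BatchFetchDemo {
    static int naiveQueries(int orders) {
        // N+1 pattern: 1 query for the orders, then 1 per order for its customer.
        return 1 + orders;
    }

    static int batchedQueries(int orders, int batchSize) {
        // Batch fetching: 1 query for the orders, then one IN (...) query per batch.
        return 1 + (orders + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        System.out.println(naiveQueries(1000));        // 1001 round trips
        System.out.println(batchedQueries(1000, 256)); // 5 round trips
    }
}
```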

If you enable a cache in JPA (or EclipseLink) then you could avoid having to query the database for the data. See, http://wiki.eclipse.org/EclipseLink/UserGuide/JPA/Basic_JPA_Development/Caching
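As a sketch, the standard JPA 2.0 way to turn this on is the `shared-cache-mode` element in `persistence.xml` (the persistence-unit name below is a placeholder), combined with `@Cacheable` on the entities you want cached:

```xml
<!-- persistence.xml fragment: enable the shared (second-level) cache
     only for entities annotated with @Cacheable(true). -->
<persistence-unit name="myUnit">
  <shared-cache-mode>ENABLE_SELECTIVE</shared-cache-mode>
</persistence-unit>
```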

You could archive old data in the database to keep it from growing too big, or partition the data so only the current data is in the main table.

You could use a local or in-memory database for your current data, or use a product such as Oracle TimesTen, or a caching product such as Oracle Coherence.
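For the local/in-memory option, a sketch of a `persistence.xml` fragment pointing a persistence unit at an embedded H2 in-memory database (the unit name and credentials are placeholders; the property keys are the standard JPA 2.0 ones):

```xml
<!-- persistence.xml fragment: embedded in-memory H2 database for the
     current working data set. -->
<persistence-unit name="currentData" transaction-type="RESOURCE_LOCAL">
  <properties>
    <property name="javax.persistence.jdbc.driver" value="org.h2.Driver"/>
    <property name="javax.persistence.jdbc.url" value="jdbc:h2:mem:current;DB_CLOSE_DELAY=-1"/>
    <property name="javax.persistence.jdbc.user" value="sa"/>
    <property name="javax.persistence.jdbc.password" value=""/>
  </properties>
</persistence-unit>
```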

James