
We have a TDBGrid connected to a TClientDataSet via a TDataSetProvider in Delphi 7, with an Oracle database.

It works fine for showing the contents of small tables, but the program hangs when you try to open a table with many rows (e.g. 2 million rows), because TClientDataSet tries to load the whole table into memory.

I tried setting "FetchOnDemand" to True on our TClientDataSet and enabling "poFetchDetailsOnDemand" in the Options of the TDataSetProvider, but that does not solve the problem. Any ideas?

Update:

My solution is:

TClientDataSet.FetchOnDemand = True
TDataSetProvider.Options.poFetchDetailsOnDemand = True
TClientDataSet.PacketRecords = 500

I succeeded in solving the problem by setting the "PacketRecords" property of TCustomClientDataSet. This property indicates the number or type of records in a single data packet. PacketRecords defaults to -1, meaning a single packet should contain all records in the dataset, but I changed it to 500 rows.
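For reference, the same settings can be applied in code. This is a minimal sketch; the component names (ClientDataSet1, DataSetProvider1) and the FormCreate placement are assumptions:

```delphi
procedure TForm1.FormCreate(Sender: TObject);
begin
  // Fetch additional packets automatically as the grid scrolls.
  ClientDataSet1.FetchOnDemand := True;
  // Fetch nested detail datasets only when they are needed.
  DataSetProvider1.Options := DataSetProvider1.Options + [poFetchDetailsOnDemand];
  // Fetch 500 records per packet instead of the whole table (-1).
  ClientDataSet1.PacketRecords := 500;
  ClientDataSet1.Open;
end;
```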

Andrey Prokhorov
  • I think [this post](http://stackoverflow.com/questions/343553/how-to-prevent-delphi-ado-from-loading-the-entire-table-into-memory) can help you. – Iqbal Feb 06 '14 at 15:15

1 Answer


When working with an RDBMS, and especially with large datasets, trying to access a whole table is exactly what you shouldn't do. That's a typical newbie mistake, or a habit borrowed from old file-based small database engines. With an RDBMS, you should load only the rows you're interested in, display/modify/update/insert them, and send the changes back to the database. That means a SELECT with a proper WHERE clause and also an ORDER BY - remember that row ordering is never guaranteed when you issue a SELECT without an ORDER BY; a database engine is free to retrieve rows in whatever order it sees fit for a given query.

If you have to perform bulk changes, do them in SQL and have them processed on the server; don't load a whole table client side, modify it, and send changes row by row to the database. Loading large datasets client side may fail for several reasons: lack of memory (especially in 32-bit applications), memory fragmentation, etc. You will probably flood the network with data you don't need, force the database to perform a full scan, maybe flood the database cache as well, and so on. Client datasets are not designed to handle millions of rows. They are designed to cache the rows you need client side, and then apply changes to the remote data. You need to change your application logic.
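As an illustration of "load only the rows you're interested in", the query feeding the provider could be restricted server side. This is a sketch only; the component, table, and column names are assumptions, not taken from the question:

```delphi
// Restrict the result set on the server with WHERE and ORDER BY,
// instead of opening the whole 2M-row table.
Query1.SQL.Text :=
  'SELECT id, customer, amount FROM orders ' +
  'WHERE order_date >= :from_date ' +
  'ORDER BY order_date, id';
Query1.ParamByName('from_date').AsDateTime := EncodeDate(2014, 1, 1);
ClientDataSet1.Open;  // the provider now fetches only the matching rows
```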

Mad Hatter
  • I don't want to load a whole table with 2 million rows in memory; Delphi tries to do that if you don't set right properties in TClientDataSet and TDataSetProvider components. – Andrey Prokhorov Feb 08 '14 at 17:02
  • @Andrey: yes, you don't want, but you're doing it :) If you issue SELECT * FROM TABLE on a 2M rows table, you're trying to do it. First, the database engine thinks you want all 2M rows and works to retrieve them and send to you. Then you can try to retrieve them by chunks (but remember you told the DB engine you want all of them already), but the more you retrieve, the more they fill your memory. If you don't retrieve them all, the server did some useless work using resources that went wasted. You should always tell the DB engine which rows you want exactly, and in which order. – Mad Hatter Feb 10 '14 at 16:36
  • @Andrey: as long as you're the single user of a database with 2M rows, everything may work regardless of what you do. But when you have more users, every row you retrieve (by the SELECT, not your client) needs to activate some kind of lock to implement transactions - for each user. Rows need to be retrieved from disk, put in cache, etc. etc. - all of these are shared resources on the DB server. The more resources each user consumes, the fewer users the DB engine can support on given hardware. A well-written DB application uses as few resources as it can, to scale better. – Mad Hatter Feb 10 '14 at 16:40