
Well, I am going to query about 4 GB of data using a cfquery. Querying the whole database is painful because it takes a very long time to get the data back. I tried a stored procedure when the data was 2 GB, and it wasn't really fast at that time either. The data will be pulled based on a date range the user selects on an HTML page. It has been suggested that I archive the data in order to speed up querying the database.

Do you think that I'll have to create a separate table with only the fields that are required and then query this newly created table? The current table is 4 GB and increasing day by day; basically, it's a response database (it stores information sent from somewhere else). After doing some research, I am wondering if writing a trigger could be one option. If I do this, then as soon as a new entry (row) is added to the current 4 GB table, the trigger will run a SQL statement that transfers the contents of the required fields into the newly created table. This will keep happening as long as I keep getting new values in my original 4 GB database.
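A minimal sketch of that trigger idea in T-SQL, purely as an illustration: the table and column names (`dbo.Responses`, `dbo.ResponsesSlim`, `ResponseDate`, `MetricValue`) are hypothetical, since the original schema isn't shown:

```sql
-- Hypothetical narrow table holding only the columns the charts need.
CREATE TABLE dbo.ResponsesSlim (
    ResponseID   INT           NOT NULL PRIMARY KEY,
    ResponseDate DATETIME      NOT NULL,
    MetricValue  DECIMAL(18,4) NOT NULL
);
GO

-- AFTER INSERT trigger on the large source table: copy just the
-- required fields into the narrow table whenever new rows arrive.
CREATE TRIGGER dbo.trg_Responses_CopySlim
ON dbo.Responses
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.ResponsesSlim (ResponseID, ResponseDate, MetricValue)
    SELECT i.ResponseID, i.ResponseDate, i.MetricValue
    FROM inserted AS i;  -- "inserted" is the pseudo-table of new rows
END;
```

Note that the trigger reads from the `inserted` pseudo-table rather than assuming a single row, so it handles multi-row inserts correctly.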

Does the above approach sound good enough to tackle my problem? I have one more concern: even though I am filtering only the required fields into a new table, at some point the new table will also grow, and that could also slow down queries against it.

Please correct me if I am wrong somewhere. Thanks.

More Information:

I am using SQL Server. Indexing is currently in place, but it's not effective.
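Since the queries filter on a user-selected date range, the index that matters most is one that leads on the date column; a covering index lets the query be answered from the index alone. A sketch, again using hypothetical names (`dbo.Responses`, `ResponseDate`, `MetricValue`) since the actual DDL isn't shown:

```sql
-- A date-range query can only seek (rather than scan) if an index
-- leads on the date column. INCLUDE-ing the charted columns makes
-- the index covering, so the 4 GB base table is never touched.
CREATE NONCLUSTERED INDEX IX_Responses_ResponseDate
ON dbo.Responses (ResponseDate)
INCLUDE (MetricValue);  -- the columns the chart query selects
```

Checking the actual execution plan would confirm whether the query performs an index seek or still falls back to a table scan.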

Miguel-F
Jack
    Are you going to try and push 4GB of data onto a web page? Or is the original DB 4GB and you are trying to select a subset of that? – James A Mohler Nov 14 '13 at 18:42
  • Yeah, I will be displaying the information retrieved from the 4GB data using AmCharts on a web page. – Jack Nov 14 '13 at 18:47
  • 2
    You need to summarize the data down via a query or view to something smaller. Much smaller. Get you query down do something like 1,000 rows and a 50 columns. Just showing 50,000 data points will be hard enough – James A Mohler Nov 14 '13 at 18:59
  • 3
    Your last comment suggests something *completely different* than the actual question itself. I would suggest [revising your question](http://stackoverflow.com/posts/19985691/edit) to describe the *actual* goal, not how you think you should approach it, because charting really has nothing to do with archiving. As with any query question, always specify your DBMS and version. I suspect you may need some advice on indexing too, so be sure to include the queries and [DDL](http://en.wikipedia.org/wiki/Data_Definition_Language) of any tables involved. – Leigh Nov 14 '13 at 19:22
  • @Leigh I have revised my question. Please take a look. Thanks – Jack Nov 18 '13 at 02:21
  • 1
    @Jack - You are still telling us how you think you should tackle the task, rather than describing what the actual task involves. You mentioned generating a chart from a query. What queries and charting code are you currently using? What are the DDL's of the table(s)? – Leigh Nov 18 '13 at 13:51

1 Answer


Archiving the data will be a losing battle. The data has to travel from your database to your application, and then your application has to process it to build the chart. The more data you have, the longer that will take.

If it is really necessary to chart that much data, you might want to simply acknowledge that your app will be slow and take steps to deal with it. This includes code to prevent duplicate page requests, progress indicators for the user, and the like.

Dan Bracuk