
I am currently using Highcharts' Highstock data-grouping chart (http://www.highcharts.com/stock/demo/data-grouping) to display the data returned from the server.

We retrieve the data from a MySQL database, and the result set is really big: we store sensor metrics every second. After a while we got the following error:

[Wed Sep 12 00:15:56 2012] [error] [client 127.0.0.1] PHP Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 4756882 bytes) in C:\\wamp\\www\\admin\\getTrends.php on line 156, referer: http://localhost/admin/trends.php
[Wed Sep 12 00:15:56 2012] [error] [client 127.0.0.1] PHP Stack trace:, referer: http://localhost/admin/trends.php
[Wed Sep 12 00:15:56 2012] [error] [client 127.0.0.1] PHP   1. {main}() C:\\wamp\\www\\admin\\getTrends.php:0, referer: http://localhost/admin/trends.php
[Wed Sep 12 00:15:56 2012] [error] [client 127.0.0.1] PHP   2. getTrendsDataAI() C:\\wamp\\www\\admin\\getTrends.php:33, referer: http://localhost/admin/trends.php
[Wed Sep 12 00:15:56 2012] [error] [client 127.0.0.1] PHP   3. printResults() C:\\wamp\\www\\admin\\getTrends.php:102, referer: http://localhost/admin/trends.php
[Wed Sep 12 00:15:56 2012] [error] [client 127.0.0.1] PHP   4. createData() C:\\wamp\\www\\admin\\getTrends.php:230, referer: http://localhost/admin/trends.php
[Wed Sep 12 00:15:56 2012] [error] [client 127.0.0.1] PHP   5. implode() C:\\wamp\\www\\admin\\getTrends.php:156, referer: http://localhost/admin/trends.php

What is the best way to return this data as a JSON object to Highstock for display? And how can we overcome the PHP memory limitation? Should we return a chunk of the data at a time? How is an enormous amount of data usually presented to users when building charts and reports from it? Another big problem we need to overcome is that the returned JSON object itself is enormous: at this point it is around 20-30 MB, and it will be much larger in the future. Is it OK to return this data to the user and perform everything client side?
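From the trace, the script dies inside implode(), so the whole response string is apparently being assembled in memory before it is sent. One option we are considering is streaming the JSON row by row instead; a minimal sketch (the table and column names are placeholders for our real schema):

    <?php
    // Stream the JSON array row by row instead of building it with
    // implode(); peak memory stays at roughly one row. The table and
    // column names ("metrics", "ts", "value") are hypothetical.
    $mysqli = new mysqli('localhost', 'user', 'pass', 'sensors');

    header('Content-Type: application/json');
    echo '[';

    // MYSQLI_USE_RESULT = unbuffered query: rows are pulled from the
    // server one at a time instead of being copied into PHP memory.
    $result = $mysqli->query(
        'SELECT ts, value FROM metrics ORDER BY ts',
        MYSQLI_USE_RESULT
    );

    $first = true;
    while ($row = $result->fetch_assoc()) {
        if (!$first) {
            echo ',';
        }
        // Highstock expects [timestamp_ms, value] pairs.
        echo json_encode([(int)$row['ts'] * 1000, (float)$row['value']]);
        $first = false;
    }

    echo ']';
    $result->close();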

Any suggestions or thoughts welcome.

glarkou
  • I've used RRD for storing sensor data in the past, and in my experience it works better than MySQL for sensor data. There's a PECL extension for interacting with RRD. RRD also prunes data at points you can set, so after a month, instead of having data for every second, it would replace those with averages for every minute, or however you want to configure it (see the sketch below these comments). RRDTool: http://oss.oetiker.ch/rrdtool/ Extension: http://php.net/manual/en/book.rrd.php – G-Nugget Sep 11 '12 at 21:40
  • I'd try to reduce the number of datapoints you're sending over. E.g. if you're sending 100,000 hourly datapoints, try sending daily datapoints instead (that'd be roughly 4,200). – Frank Farmer Sep 11 '12 at 22:53
  • But guys, what if precision is a critical requirement? – glarkou Sep 11 '12 at 22:56
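To make the RRD suggestion above concrete, here is a minimal sketch using the PECL rrd extension; the file path, data-source name, and retention steps are all assumptions to be adapted:

    <?php
    // Create a round-robin database for one sensor sampled every second.
    // Keep per-second averages for a day, then per-minute averages for
    // roughly a month; older data is consolidated automatically.
    rrd_create('/var/lib/rrd/sensor.rrd', [
        '--step', '1',                 // one sample per second
        'DS:value:GAUGE:2:U:U',        // data source, 2-second heartbeat
        'RRA:AVERAGE:0.5:1:86400',     // 1-second averages for 1 day
        'RRA:AVERAGE:0.5:60:43200',    // 1-minute averages for ~30 days
    ]);

    // Record a reading at the current time ("N").
    rrd_update('/var/lib/rrd/sensor.rrd', ['N:23.4']);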

2 Answers


When displaying large amounts of data, I think the best thing you could do (and the most common one) is to generate the views at a certain resolution.

As the user narrows down (zooms in) on a given spot, you increase the resolution for that spot, which keeps the size of each chunk small.

This way you can generate smaller responses in PHP that still represent, ideally, the same graph; something similar to the way Google Maps used to work.
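A minimal sketch of that idea, assuming a hypothetical `metrics` table with `ts` (unix seconds) and `value` columns: pick a bucket size from the requested window so the server never returns more than about 1,000 points.

    <?php
    // downsample.php?start=...&end=... -- return at most ~1000 averaged
    // points for the requested window. Names here are hypothetical.
    $mysqli = new mysqli('localhost', 'user', 'pass', 'sensors');

    $start  = (int)$_GET['start'];                    // window start, unix seconds
    $end    = (int)$_GET['end'];                      // window end, unix seconds
    $bucket = max(1, (int)(($end - $start) / 1000));  // seconds per returned point

    $stmt = $mysqli->prepare(
        'SELECT ts - (ts % ?) AS bucket_ts, AVG(value) AS avg_value
           FROM metrics
          WHERE ts BETWEEN ? AND ?
          GROUP BY bucket_ts
          ORDER BY bucket_ts'
    );
    $stmt->bind_param('iii', $bucket, $start, $end);
    $stmt->execute();
    $result = $stmt->get_result(); // requires mysqlnd

    $points = [];
    while ($row = $result->fetch_assoc()) {
        // Highstock expects [timestamp_ms, value] pairs.
        $points[] = [$row['bucket_ts'] * 1000, (float)$row['avg_value']];
    }

    header('Content-Type: application/json');
    echo json_encode($points);

Zooming in shrinks the window, which shrinks the bucket, so the resolution rises exactly where the user is looking while the payload stays roughly constant.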

aurbano
  • +1 I agree. To add: you need to do some kind of server-side processing to limit the number of records you work with based on the current zoom level. I recently answered a [**question**](http://stackoverflow.com/a/12534878/1253219) with links explaining how to do server-side processing on zoom levels. – Linger Sep 21 '12 at 17:37

JSON isn't really amenable to chunking. Each JSON string has to be complete in and of itself, since it directly represents a complete data structure: a string, an array, an object, etc.

What you could do is send the data over in independent chunks and rebuild the data structure in JS: e.g. request #1 sends over 10,000 rows worth of data, request #2 gets rows 10,001-20,000, #3 gets 20,001-30,000, etc.
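A hedged sketch of the server half of that, paging through the table with LIMIT/OFFSET so each response is a complete, self-contained JSON array; the chunk size, table, and column names are assumptions:

    <?php
    // getChunk.php?chunk=N -- return the N-th block of 10,000 rows as a
    // complete JSON array. Table and column names are hypothetical.
    $mysqli = new mysqli('localhost', 'user', 'pass', 'sensors');

    $chunkSize = 10000;
    $offset    = max(0, (int)$_GET['chunk']) * $chunkSize;

    $stmt = $mysqli->prepare(
        'SELECT ts, value FROM metrics ORDER BY ts LIMIT ? OFFSET ?'
    );
    $stmt->bind_param('ii', $chunkSize, $offset);
    $stmt->execute();
    $result = $stmt->get_result(); // requires mysqlnd

    $rows = [];
    while ($row = $result->fetch_assoc()) {
        $rows[] = [(int)$row['ts'] * 1000, (float)$row['value']];
    }

    // An empty array tells the client there are no more chunks to fetch.
    header('Content-Type: application/json');
    echo json_encode($rows);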

But then you're still going to end up with a huge memory footprint in your browser; neither PHP nor JS is particularly memory-efficient with large structures.

Marc B
  • So, any suggestions for an alternative technology (or technologies)? – glarkou Sep 11 '12 at 21:41
  • Not really. Either reduce the amount of data by summarizing/grouping into (say) day-sized chunks instead of minute-sized ones, or be resigned to basically slurping an entire database across the wire. – Marc B Sep 11 '12 at 21:44