I recently started thinking about my projects that are being developed for self-hosting, and one of the biggest things on my mind is analysis of the data sitting in an SQL table. I know that SQL, JavaScript, and PHP can all handle data analysis (sums, averages, unit conversions, ratios, and other derived figures), but which layer exactly should be responsible for turning the raw SQL data into calculated figures that make sense for everyone to analyze?
SQL - Can do the calculations in the queries themselves, referencing the same fields multiple times in the expressions; the SQL server seems to cope just fine with the load spikes when this data needs to be analyzed.
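Roughly what I mean by this option, as a sketch only (the `jobs` table, the column names, and the connection details are all made up for illustration; the real schema is different):

```php
<?php
// Option 1: push the aggregation into the SQL query itself.
// Hypothetical `jobs` table with driver_id, distance_km, fuel_used, revenue.
$pdo = new PDO('mysql:host=localhost;dbname=vtc', 'user', 'pass');

$sql = "SELECT driver_id,
               COUNT(*)                                AS job_count,
               SUM(distance_km)                        AS total_distance,
               AVG(fuel_used)                          AS avg_fuel,
               SUM(revenue) / NULLIF(SUM(distance_km), 0) AS revenue_per_km
        FROM jobs
        GROUP BY driver_id";

foreach ($pdo->query($sql) as $row) {
    // Rows arrive already aggregated; PHP only formats them for output.
    printf("Driver %d: %.1f km, %.2f revenue/km\n",
           $row['driver_id'], $row['total_distance'], $row['revenue_per_km'] ?? 0);
}
```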
PHP - Can do the calculations on the raw data returned by a query (mysqli, PDO, etc.), but it delays loading of the page content.
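Again just a sketch of what I mean, using the same made-up `jobs` table as above:

```php
<?php
// Option 2: fetch the raw rows and let PHP do the arithmetic before rendering.
$pdo = new PDO('mysql:host=localhost;dbname=vtc', 'user', 'pass');

$rows = $pdo->query("SELECT distance_km, revenue FROM jobs")
            ->fetchAll(PDO::FETCH_ASSOC);

$totalDistance = 0.0;
$totalRevenue  = 0.0;
foreach ($rows as $row) {
    $totalDistance += (float) $row['distance_km'];
    $totalRevenue  += (float) $row['revenue'];
}

// Every raw row crosses the wire and sits in PHP memory before the page renders.
$revenuePerKm = $totalDistance > 0 ? $totalRevenue / $totalDistance : 0.0;
echo "Revenue per km: " . round($revenuePerKm, 2);
```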
JavaScript - Can also do the calculations on the resulting data, using AJAX or PHP embedded in the page header to run the query and retrieve the data. This is more complex, but it seems to have the least delay in loading content, since JavaScript runs client side and therefore makes the client machine do all the hard work.
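For this option I'm only sketching the server half (same made-up table and credentials as before): the endpoint just dumps the raw rows as JSON, and the browser-side JavaScript, which I haven't shown, would fetch this via AJAX and do the sums, averages, and ratios client side.

```php
<?php
// Option 3 (server side only): expose raw rows as JSON for client-side math.
$pdo = new PDO('mysql:host=localhost;dbname=vtc', 'user', 'pass');

$rows = $pdo->query("SELECT driver_id, distance_km, fuel_used, revenue FROM jobs")
            ->fetchAll(PDO::FETCH_ASSOC);

// No aggregation here; the browser does all the calculation work.
header('Content-Type: application/json');
echo json_encode($rows);
```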
In a practical industry setting, managing big data and doing statistical analysis, which approach would be best in terms of resource management for the server? For the client?
EDIT revision 1: Clarification on data size and scope:
The data itself pushes the limits on the maximum number of columns/fields and characters per row that SQL can handle. As for the number of rows, it is expected to stay in the low thousands, but it has the potential to grow massively (a 0.1-1% chance for 99% of installations). The project is a self-hosted user management system built specifically for virtual trucking companies. As far as table counts go, there will be approximately a dozen tables in the database. But with the amount of data being gathered, especially in one particular table, there are a lot of statistics that can be compiled from it.