
I have the following function that reads data from a remote CSV file and does some processing on it before displaying it in a DataTable.

function loadGrid() {
    link = '{{ url("thecsvfile") }}/' + $('#class-id').val();
    Papa.parse(link, {
        download: true,
        withCredentials: true,
        header: true,
        step: function(results, parser) {
            // ... process the data row by row
            cleanDataForGrid(results.data);
        },
        complete: function() {
            table = $('#the-grid').DataTable({
                responsive: true,
                destroy: true,
                processing: true,
                paging: false,
                searching: false,
                info: false,
                scrollCollapse: true,
                columnDefs: [{
                    targets: 0,
                    searchable: false,
                    orderable: false,
                    width: '1%',
                    className: 'dt-body-center',
                    render: function(data, type, full, meta) {
                        return '<input type="checkbox" class="the_check">';
                    }
                }]
            });
            updateDownloaded();
        }
    });
}

It currently works well. The problem is that when the CSV file has many rows, it takes a long time to load and at times the browser hangs or crashes. Is there a way to get the number of rows first and skip the download completely if the file has more than X rows? I would like an implementation like the following, where only the row count is returned:

function getRows() {
    var rowcount = Papa.parse(link, { download: false });
    if (rowcount < 50001) loadGrid();
}

Is it possible to have such an implementation?


1 Answer


It is perfectly possible to query a server-side endpoint that handles this for you: it returns the number of rows of the CSV file.

Example: http://www.yoursite.com/handlers/csv/count?file=thecsvfile

You can then initialize the view with the returned value in JavaScript. With PHP and the Zend Framework, for example, you can inject the returned value through the view model and echo it inside your view.

<script type="text/javascript">
var maxcsvfileSize = 50001;
var thecsvfileSize = <?php echo $this->thecsvfileSize; ?>;

function loadGrid(thecsvfileSize) {
    // test the length of thecsvfile and do the job you need
    if (thecsvfileSize < maxcsvfileSize) {
        // do the job
    }
}
</script>
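
If injecting the value server-side is not convenient, the same (hypothetical) count endpoint from the example URL above could be queried from the browser before loading the grid. A minimal sketch, assuming the endpoint simply returns the row count as plain text:

<script type="text/javascript">
var maxcsvfileSize = 50001;

// Minimal sketch: ask the (hypothetical) count endpoint first,
// and only download and render the CSV when it is small enough.
function checkThenLoad() {
    $.get('/handlers/csv/count', { file: 'thecsvfile' }, function (rowCount) {
        if (parseInt(rowCount, 10) < maxcsvfileSize) {
            loadGrid();
        } else {
            alert('The CSV file is too large to display.');
        }
    });
}
</script>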

[EDIT]

Looking a bit at the Papa Parse documentation, you can use streaming to handle big files:

Papa.parse("http://example.com/big.csv", {
    download: true,
    step: function(row) {
        console.log("Row:", row.data);
    },
    complete: function() {
        console.log("All done!");
    }
});

...which is relatively easy to set up and fully addresses your question and needs.
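
If you have no control over the remote server at all, a variation of that streaming approach (not part of the answer above, just a sketch) is to count rows inside the step callback and call parser.abort() once a limit is exceeded. This does not avoid starting the download, but it caps how much of the file is parsed, which is usually what makes the browser hang:

var maxRows = 50000;
var rowCount = 0;

Papa.parse("http://example.com/big.csv", {
    download: true,
    header: true,
    step: function(results, parser) {
        rowCount++;
        if (rowCount > maxRows) {
            // Too many rows: stop streaming/parsing right away.
            parser.abort();
            return;
        }
        cleanDataForGrid(results.data);
    },
    complete: function() {
        if (rowCount > maxRows) {
            console.log("CSV too large, stopped after " + maxRows + " rows.");
        } else {
            // safe to build the DataTable here, as in loadGrid()
        }
    }
});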

  • This assumes that I have control over the server that serves the CSV file... – indago Sep 06 '16 at 09:03
  • Is your application deployed on a local web server (WAMP-like) or on a distributed server architecture from a hosting company? If it is on a local PC, install PHP and Apache and parse the CSV file after downloading it with wget or any Linux tool you prefer. – BendaThierry.com Sep 06 '16 at 09:20
  • My app is with a hosting company, but the CSV file comes from another system; it is normally generated on the fly when you call that URL, e.g. `http://theip/path/theclassid` – indago Sep 06 '16 at 13:17
  • OK then, I have made an edit which will give you what you need to handle big CSV files. – BendaThierry.com Sep 07 '16 at 07:15