
So far, every resource I have seen on loading CSV files into D3.js requires defining callback functions in order to use the data (due to JavaScript's asynchronous execution).

I originally approached this the usual way: defining functions outside and calling them inside the callback. However, this proved repetitive. The alternative, defining everything inside the callback, would lead to the same code being executed over and over, which is redundant.

I found that if I knew how many data items were going to be used, I could simply set a condition like `if (read_data_length == total_data_length)` so that my code runs only once, after every CSV row has been read.
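
Roughly what I mean, as a sketch only (I'm assuming the v5 optional row-conversion function here; `total_data_length`, `data.csv`, and `drawChart` are placeholders):

```
// Sketch only: assumes the total number of rows is somehow known in advance.
var total_data_length = 100;  // hypothetical, known ahead of time
var read_data_length = 0;
var rows = [];

d3.csv("data.csv", function (d) {  // row-conversion function, runs once per row
  read_data_length += 1;
  rows.push(d);
  if (read_data_length === total_data_length) {
    drawChart(rows);  // the "real" work runs exactly once, after the last row
  }
  return d;
});
```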

My question: is there a way to identify the length of a CSV file (in number of objects, or rows) BEFORE loading it, in order to establish such a limit?

From what I can tell, this is difficult, if not impossible, but I am hoping somebody with more experience in JavaScript can clear this up. Thanks in advance.

Robert Melikyan
  • You could parse the CSV before D3 becomes involved https://code.tutsplus.com/tutorials/parsing-a-csv-file-with-javascript--cms-25626 ... maybe parse it to JSON. – Ted Fitzpatrick Oct 16 '19 at 17:52
  • There is a slight misconception here, it’s the other way around: The entire file is read before any callback is executed. The callback is executed exactly **once**. Some functions for loading data take an optional row conversion function, though, which is executed for every row in the data file. However, this also does not happen before the entire file has been read. It happens while parsing the loaded file’s content. Given that, there is no need for what you are trying to accomplish. – altocumulus Oct 16 '19 at 19:49
  • @altocumulus is there a way I can identify how many CSV rows are loaded before the callback function is called, then? I have found that placing counter functions outside is unsuccessful, and within the callback function itself I have no way of knowing when the last element appears. – Robert Melikyan Oct 17 '19 at 20:15
  • @RobertMelikyan You don’t need the count beforehand! It’s sufficient to know the number of rows once the callback is executed. Again, this happens only after **all** rows have been loaded. You won’t have to wait for *the last element* to appear, it’s already there. `d3.csv(url, rows => console.log(rows.length))` up to v4 or `d3.csv(url).then(rows => console.log(rows.length))` for v5 will print the number of rows. – altocumulus Oct 18 '19 at 14:56
  • @altocumulus thank you for your feedback, and sorry for the delay; I have been trying out what you mentioned. I guess my problem is: how can I wait to execute a function AFTER all the data has been loaded? – Robert Melikyan Nov 07 '19 at 20:57
  • That's the magic therein: the callback you provide is guaranteed to not be executed before all the data has finished loading. You don't need to waste any thought on that. On the other hand, this does also mean that anything—and I really mean **anything**—which relies on that data, any logic acting on the data, needs to go into that callback or, at least, needs to be called from within the callback. – altocumulus Nov 07 '19 at 21:11
  • @altocumulus your responses have really been helpful. What I've noticed is that my code within the callback function is executed iteratively for every data point. Say I want to apply a scale which relies on all my data points being processed: is there any way I can determine when I have reached the end of my data so the function is only called once? – Robert Melikyan Nov 12 '19 at 17:46
  • I suppose you are using code designed for D3 **≤v4** whereas you are actually running D3 **v5**. The API for loading files has significantly changed between those versions. Have a look at my second comment above or read through my answer to [*d3 importing csv file to array*](https://stackoverflow.com/questions/52638816/d3-importing-csv-file-to-array) for an explanation. – altocumulus Nov 12 '19 at 21:20
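
Putting the v5 pattern from the comments above into a minimal sketch (the file name, the `value` column, and the chart code are assumptions for illustration):

```
// D3 v5+: d3.csv returns a Promise that resolves only after the whole
// file has been loaded and parsed, so no row counting is needed.
d3.csv("data.csv").then(function (rows) {
  console.log(rows.length);  // all rows are already here

  // Anything relying on the complete data set goes inside this callback,
  // e.g. a scale whose domain depends on every data point:
  var y = d3.scaleLinear()
    .domain([0, d3.max(rows, function (d) { return +d.value; })])  // assumes a "value" column
    .range([300, 0]);

  // ... build the chart with rows and y here; this block runs exactly once ...
});
```

For D3 ≤v4 the equivalent shape is `d3.csv("data.csv", function (error, rows) { ... })`, where the callback likewise runs exactly once with the complete data.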

0 Answers