
I am trying to write embedded JavaScript in an OBIEE report. The idea is for the report to take tabular data (rows and columns), let the user extract specified columns from it, and download the resulting data as a CSV or Excel file. I'm trying to do this by storing the data as an array of objects, something like this:


[
  {'column1': 'Entry1', 'column2': 'Entry2', ...},
  {'column1': 'Entry1', 'column2': 'Entry2', ...},
  ...
]

The problem is that I get a C-runtime error (std::bad_alloc), which I assume means I'm running out of memory, because the code works when I take in fewer rows. The expected data is up to about 200 columns (empty or non-empty) and 1-2 million rows. What is the most memory-efficient way to store such data: one copy of the full data, plus one copy with only the required columns? I can't post the exact code here due to security reasons, as it's on a work laptop on a secure server.

  • Are you using the Array.push method in order to push multiple array elements at once? – DalexHD Apr 13 '22 at 13:04
  • You should really consider using indexedDB – ControlAltDel Apr 13 '22 at 13:08
  • Storing two million rows in memory is silly. An Excel sheet cannot even contain that many. All you need to store in memory are the selected columns. Pass those back and generate the download file on the server. – JavaScript Apr 13 '22 at 13:12
  • @DalexHD Yes I'm using Array.push to push one object at a time as provided by back end – Pranav Bharadwaj Apr 13 '22 at 13:40
  • @JavaScript My boss wants me to do this stuff on the embedded javascript itself not using any other server or anything like that. – Pranav Bharadwaj Apr 13 '22 at 13:42
  • I can't believe I'm commenting on this, but here goes: So you're using the most complete and complex analytical platform tool out there as an Excel export mechanism AND are trying to do that in JS?! I mean - sure it gives job security and wastes time like there's no tomorrow (yay timesheet), but apart from everything else: That's kind of why the tool has self-service capabilities where the user can click together whatever they want. You're literally re-inventing the wheel in a worse way with less governance, worse upgradability and you're developing a new point of failure. Kudos all around. – Chris Apr 14 '22 at 07:22
  • @Chris I agree with you, tbh it feels like a silly way to do this to me as well, and I have communicated the same to my superiors, but they don't seem very flexible about doing this with the facilities the actual analytical tool provides rather than in JS, just to keep the changes needed small. They insist on asking whether there is "any other technology" available to handle the use case while still using JavaScript. So I don't have a choice but to try to do as much as the JS can support before I throw in the towel and say it can't handle any more data than this from the JS side. – Pranav Bharadwaj Apr 14 '22 at 08:16

0 Answers