Thanks for helping!
I have a general task: I need to sort a big file, for example a 1 GB .csv containing strings. Of course I must do it without loading the whole file into memory, only via streams.
The exact sorting method is not important; I want to sort by string.length.
I divided the task into subtasks.
The first one is to build an object (maybe a Map collection) that holds this information:
key = string.length, value = the count of strings with that length.
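To illustrate, the structure I have in mind would look roughly like this (the values are made up, just for illustration):

// Hypothetical example of the length -> count structure
const lengthCounts = new Map([
  [5, 120],  // 120 strings of length 5
  [12, 87],  // 87 strings of length 12
  [30, 3],   // 3 strings of length 30
]);
console.log(lengthCounts.get(12)); // 87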
The idea is that I can then use this object in a second pass with a read stream and a write stream: search the big file for strings in a given length range (up to length 15, for example), write them to the new sorted file, then repeat for the next range, and so on.
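Roughly, I imagine that second pass looking something like this (just a sketch of the idea; it assumes one string per line, and writeRange, minLen and maxLen are names I made up):

const fs = require('fs');
const readline = require('readline');

// Sketch: append every line whose length falls into [minLen, maxLen]
// from the big input file to the output file.
// Note: lines inside a single range keep their original order here.
function writeRange(inputPath, outputPath, minLen, maxLen) {
  return new Promise((resolve, reject) => {
    const input = fs.createReadStream(inputPath);
    input.on('error', reject);

    const rl = readline.createInterface({ input, crlfDelay: Infinity });
    const out = fs.createWriteStream(outputPath, { flags: 'a' });

    rl.on('line', (line) => {
      // Backpressure is ignored here for brevity.
      if (line.length >= minLen && line.length <= maxLen) {
        out.write(line + '\n');
      }
    });
    rl.on('close', () => {
      out.end();
      resolve();
    });
  });
}

// Usage: process the ranges one after another, e.g.
// writeRange('input.csv', 'sorted.csv', 1, 15).then(() => writeRange('input.csv', 'sorted.csv', 16, 30));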
I am only just learning how to use streams correctly, so I need help with two things:
1. The overall task: did I choose a correct algorithm, or is there a better solution?
2. I am stuck on the read stream that should add the information to the object. This is my code:
const fs = require('fs');
const readline = require('readline');

const readableStream = fs.createReadStream('input.csv', { highWaterMark: 8000 });

let fileStructure = {};
let count = 1;

readableStream.on('data', (chunk) => {
  if (!fileStructure[chunk.length]) {
    fileStructure[chunk.length] = count;
  }
  fileStructure[chunk.length] = ++count;
});

console.log(fileStructure);
If I add a console.log like this, I see that the fileStructure object is empty.
Thanks!
P.S. Please share best practices for working with streams.