Questions tagged [node.js-stream]

A stream is an abstract interface for working with streaming data in Node.js. The stream module provides a base API that makes it easy to build objects that implement the stream interface.

There are many stream objects provided by Node.js. For instance, a request to an HTTP server and process.stdout are both stream instances.

Streams can be readable, writable, or both. All streams are instances of EventEmitter.

The stream module can be accessed using:

const stream = require('stream');

There are four fundamental stream types within Node.js:

  • Readable - streams from which data can be read (for example fs.createReadStream()).
  • Writable - streams to which data can be written (for example fs.createWriteStream()).
  • Duplex - streams that are both Readable and Writable (for example net.Socket).
  • Transform - Duplex streams that can modify or transform the data as it is written and read (for example zlib.createDeflate()).
205 questions
2 votes • 1 answer

Using a Node.js server with the request package and pipe()?

I'm using a Node.js server to mock up a backend at the moment. The server is a web server and returns JSON objects for different requests; it works flawlessly. Now I have to get the JSON objects from another domain, so I have to proxy the server. I have…

petur • 1,366
2 votes • 1 answer

Node.js stdin 'readable' event not triggered

The readable event is not triggered on process.stdin. test.js: var self = process.stdin, data; self.on('readable', function() { var chunk = this.read(); if (chunk === null) { handleArguments(); } else { data += chunk; …

Rajkamal Subramanian • 6,884
1 vote • 0 answers

Using `pipeline` from `node:stream/promises` for multiple writable sources

I have a Readable stream in object mode that I'm pushing data into, like this: const getReadStream = () => { const stream = new Readable({ objectMode: true, read: () => {} }); const get = async (page = 1) => { const { data } = await…

hieumdd • 13
1 vote • 1 answer

Unable to successfully close an ffmpeg stream in Node.js

I'm trying to write a Node video app that generates frames using the canvas API (via node-canvas, the project's only npm dependency right now) and writes them to ffmpeg via a stream to generate a video: const { createCanvas } =…

K. Russell Smith • 133
1 vote • 0 answers

Throwing an error from a Node.js Transform stream

I need to throw an error in a Transform stream. Normally, I'd do this with the callback function on _transform(). I can't in my situation because I need to throw the error even if no data is currently flowing through my stream. That is, if no data…

Brad • 159,648
1 vote • 2 answers

How to make a Node.js server act like a proxy: get an image from Cloudinary and send it to the browser

Because of storage space issues I cannot save images on the server, so I had to store them in Cloudinary, and for SEO purposes I had to serve them from my domain, not Cloudinary's. So I thought to get the image files from Cloudinary and send them directly to the browser (to be…

Andrew Naem • 162
1 vote • 1 answer

Node.js stream 'finish' or 'error' event not called

I have a Node.js function that should create a writable stream wrapped in a Promise. The issue is that the promise stays pending, because the stream's finish or error event, which should resolve or reject the promise, is never called. storeFile async…

Matt • 8,195
1 vote • 1 answer

Serve a remote URL with a Node.js stream

I have a video stored in Amazon S3. Now I'm serving it to the client with a Node.js stream: return request(content.url).pipe(res). But this is not working with Safari. Safari is unable to play the streamed data, yet the same code works…
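Safari refuses to play video unless the server honors HTTP byte-range requests (206 Partial Content), which a bare `request(url).pipe(res)` does not do. A sketch of the range parsing; in the handler you would then answer 206 with a Content-Range header and pipe only that byte window (S3's getObject accepts a Range parameter):

```javascript
// Parse a "bytes=start-end" Range header against the resource size.
// Returns { start, end } (inclusive) or null for a missing/unsatisfiable range.
function parseRange(header, size) {
  const m = /^bytes=(\d*)-(\d*)$/.exec(header || '');
  if (!m || (!m[1] && !m[2])) return null;
  if (!m[1]) {
    // suffix form "bytes=-N": the final N bytes
    const n = Math.min(parseInt(m[2], 10), size);
    return n > 0 ? { start: size - n, end: size - 1 } : null;
  }
  const start = parseInt(m[1], 10);
  const end = m[2] ? Math.min(parseInt(m[2], 10), size - 1) : size - 1;
  return start <= end ? { start, end } : null;
}

// For a valid range the response would look like:
//   res.writeHead(206, {
//     'Content-Range': `bytes ${start}-${end}/${size}`,
//     'Accept-Ranges': 'bytes',
//     'Content-Length': end - start + 1,
//     'Content-Type': 'video/mp4',
//   });

console.log(parseRange('bytes=0-99', 1000)); // { start: 0, end: 99 }
```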
1 vote • 1 answer

How to detect encoding errors in a Node.js Buffer

I'm reading a file in Node.js into a Buffer object, and I'm decoding the UTF-8 content of the Buffer using Buffer.toString('utf8'). If there are encoding errors, I want to report a failure. The toString() method handles decoding errors by…

Michael Kay • 156,231
1 vote • 1 answer

Download an Excel file from Azure Blob Storage and process its data without saving the file to a local directory

I want to download an Excel file from Azure Blob Storage and process its data using the 'xlsx' npm module. I have achieved this by saving the file to a local directory on my Node.js server, but I have to implement it without needing to save the file…

Piyush Upadhyay • 177
1 vote • 0 answers

How to make the 'end' event work in Node.js

I don't understand why I cannot get the end event to fire in the code below. I have already read a lot about how to do it, and I understand that I need to switch the stream (process.stdin) into flowing mode. But if I have a data event in…
1 vote • 0 answers

Using axios to get an external image and then saving it to the file system?

I have the following function that is called on every request: async function checkForNewData() { var now = moment(); var lastUpdateUnix = fs.readFileSync('.data/last-update.txt').toString(); var lastUpdate = moment.duration(lastUpdateUnix,…

Phoebe • 108
1 vote • 2 answers

Twitter Bot with Node.js and the Twit Package

PROBLEM: So I'm trying to create a Twitter bot, and everything was going fine until I tried to auto-reply to users who follow me. I'm learning and was following this tutorial, Coding Train Twitter Bot(LINK), but I seem to get this error(PHOTO) even…

user8360723 • 23
1 vote • 0 answers

res.send() too slow when returning large JSON object

I am building an Express GET route that responds with a large JSON object. res.send(myobject) or res.json(myobject) works fine but is too slow. Is there an alternative method that I can use? I took a look at streaming, but that seemed to be file…

Jon • 123
1 vote • 0 answers

Node.js readStream from a large JSON file causes the process to slow down over time

I have a JSON file of size 2GB. I create a readStream from it using fs.createReadStream and pipe it through JSONStream.parse (https://github.com/dominictarr/JSONStream), and then push each record into the database. The setup works totally fine, but…

Sai • 1,790