Consider an API scenario where several different HTTP requests cause reads from and writes to the same file. The requests arrive asynchronously. To avoid data or file-access conflicts, I implemented a kind of asynchronous promise chain, basically like this:
class PromiseChain {
  // Tail of the chain; every pushed operation is appended to it.
  private current: Promise<any> = Promise.resolve();

  // Appends an operation to the chain and returns its promise,
  // so the caller can still await that individual result.
  public push<T, TResult1 = T, TResult2 = never>(
    onfulfilled?:
      | ((value: T) => TResult1 | PromiseLike<TResult1>)
      | undefined
      | null,
    onrejected?:
      | ((reason: any) => TResult2 | PromiseLike<TResult2>)
      | undefined
      | null
  ): Promise<TResult1 | TResult2> {
    return (this.current = this.current.then(onfulfilled).catch(onrejected));
  }
}
(for details check out this post). So my different route handlers push their file-operation functions onto one global chain as the requests come in, and the chain processes them one by one.
import { readFile, writeFile } from 'fs/promises';

const chain = new PromiseChain();

// one route: read and parse the shared file
chain.push(() => oneFileOperation());

// another route: write updated data back to the same file
chain.push(() => anotherFileOperation({ some: 'data' }));

function oneFileOperation(): Promise<unknown> {
  return readFile('my-file.json', 'utf-8').then(JSON.parse);
}

function anotherFileOperation<T>(data: T): Promise<void> {
  return writeFile('my-file.json', JSON.stringify(data), 'utf-8');
}
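In the real route handlers I also await the promise returned by push to build the response, roughly like this (an Express-style sketch just for illustration, not my actual code):

import express from 'express';

const app = express();

// Simplified handler: the read goes through the chain, and the
// handler awaits the chain's promise to send the parsed data back.
app.get('/data', async (_req, res) => {
  const data = await chain.push(() => oneFileOperation());
  res.json(data);
});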
Still, it sometimes seems that a file operation is not yet complete when another function already tries to access the file. I would like to solve this (and maybe improve performance) with some kind of global, continuous file stream that my different request handlers pipe their file operations into, in an asynchronous way similar to my chain. I'm familiar with rxjs, which is what I had in mind, but not so much with fs, so maybe someone could point me in the right direction on how to get started with streams.
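To make it a bit more concrete, this is roughly the shape I had in mind with rxjs (just an untested sketch; error handling is omitted, and a rejected operation would terminate the subscription): a global Subject of file-operation functions, serialized with concatMap.

import { Subject, from } from 'rxjs';
import { concatMap } from 'rxjs/operators';
import { readFile } from 'fs/promises';

// Global queue of file operations; concatMap waits for each inner
// promise to resolve before starting the next one.
const fileOps$ = new Subject<() => Promise<unknown>>();
fileOps$.pipe(concatMap(op => from(op()))).subscribe();

// A route handler would enqueue its operation like this:
fileOps$.next(() => readFile('my-file.json', 'utf-8').then(JSON.parse));

What I don't see yet, compared to the chain, is how to get the result of an individual operation back to the handler that queued it, and whether fs streams would be the better tool for this in the first place.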