
I have business logic that requires fetching a huge amount of data from a MySQL table, approximately 1 billion rows. After fetching the records I have to hit an API of another system with each row. I implemented this using streams: I read from a MySQL stream, write into a writable stream, and process the rows there. If my script receives a forceClose event, I want to stop writing and ensure no further API calls are made.

I implemented this with a MySQL stream and Scramjet, but I have an issue: the MySQL read stream finishes writing to the writable stream, and while the writable stream is still processing data and hitting the API, a forceClose event is received and the stream closes, yet a few API calls are still made even after closing. Is there a fix? Here is my code:

const { DataStream } = require('scramjet');
const axios = require('axios');

let stopSending = false;
let newDataStream = new DataStream();

pool.getConnection((err, connection) => {
  let numbersStream = connection
    .query(`SELECT id, number FROM ${msisdnTableName} WHERE status = 'Pending'`)
    .stream({ highWaterMark: 5 });

  numbersStream.on('data', (data) => {
    newDataStream.write(data);
  });

  numbersStream.on('end', () => {
    connection.release();
    newDataStream.end();
  });
});

newDataStream
  .map(async (numberObj) => {
    if (!stopSending) {
      let response = await axios.get('http://localhost:8081/data', {
        params: { number: numberObj.number }
      });
      if (response.status === 200) {
        await mysqlUpdateStatus(numberObj.id, 'Sent');
      } else {
        await mysqlUpdateStatus(numberObj.id, 'Failed');
      }
    }
  })
  .on('end', () => {
    console.log('all data processed');
  });

// simulate receiving a forceClose event after 1 second
setTimeout(() => {
  stopSending = true;
}, 1000);
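For context on the race I am seeing: because `map` runs several rows concurrently, some rows have already passed the `if (!stopSending)` check when the flag flips, so their requests go out anyway. A minimal sketch of the pattern I was considering, assuming Node 15+ (for the global `AbortController`) and an HTTP client that accepts an abort signal, with `processRow`, `sendFn`, and `forceClose` as hypothetical stand-ins for my worker, my axios call, and my shutdown handler:

```javascript
// Sketch: check the stop flag immediately before each hit, and also
// abort requests that are already in flight via an AbortController.
const controller = new AbortController();
let stopSending = false;

function forceClose() {
  stopSending = true;
  controller.abort(); // rejects any request already in flight
}

// hypothetical per-row worker standing in for the axios call
async function processRow(row, sendFn) {
  if (stopSending) return 'skipped'; // check right before the API hit
  try {
    await sendFn(row, controller.signal);
    return 'sent';
  } catch (err) {
    if (err.name === 'AbortError') return 'aborted';
    throw err; // real failures propagate as before
  }
}
```

With axios specifically, the signal would be passed as the `signal` option of `axios.get`, so a `forceClose` both prevents new calls and cancels the ones already on the wire.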