
There's a similar question / answer on this topic but I don't understand it.

Let's say I'm watching a log file being written to by a game and reading it when it updates, like so:

const fs = require('fs')

const logFilePath = 'C:/Users/*...snip...*.log'

fs.watchFile(logFilePath, (curr, prev) => {
  console.log(`${logFilePath} file Changed`)
  fs.readFile(logFilePath, 'utf8', function(err, data) {
    let lines = data.split("\r\n,")
    //console.log("lines:", lines)
    for (let i = lines.length - 1; i > 0; i--) {
      let line = lines[i]
      console.log("parsing line:", line)
      let trimmed = line.split("\r\n").join("")
      console.log("trimmed line:", trimmed)
      break // just test reading one line for now
    }
  })
})

This actually crashes the game completely and consistently the moment the file is written to. I'm pretty sure it's because I'm trying to read the file while it's being written to by the game. How can I ensure my Node.js program doesn't interfere with the game's interaction with the file?

J.Todd
  • On what OS? You probably need to have the watcher open the file for read only which would then allow another process to open the file for read/write access.. – jfriend00 Jan 27 '21 at 23:54
  • Also, do you really need to read the entire file every time something is added to the end of the file? That's not very efficient. – jfriend00 Jan 27 '21 at 23:55
  • @jfriend00 You're right, this was lazy. No, I don't. And Windows – J.Todd Jan 28 '21 at 00:02
  • @jfriend00 assuming the lines are variable length, how would I know how many lines to check? Is there a pre-built solution or would I write a function to read backwards line by line until one matches? – J.Todd Jan 28 '21 at 04:14
  • It depends upon what exactly you're trying to accomplish and what assumptions you can make. If you know the file is always appending, you can just keep track of the previous length of the file from the last read and just read from there to the end. That will get whatever has been added. If you're looking for the last N lines of the file, then you'd have to read a chunk at the end, parse into lines and perhaps read another chunk back from the end if you didn't have enough lines (reading backwards through the file). – jfriend00 Jan 28 '21 at 04:30
  • FYI, You could probably find a C++ implementation of the unix command `tail` online to see how they did it. – jfriend00 Jan 28 '21 at 04:31
  • @jfriend00 yeah, if I was comfortable with C++. Not all of us are god-tier coders like you. I've seen your name on this site for over 5 years. You were teaching when I was learning JavaScript hello world lol – J.Todd Jan 28 '21 at 04:45
  • Here is something that purports to be a tail implementation written entirely in nodejs: https://github.com/lucagrulla/node-tail/blob/master/src/tail.js, but I would not say the code is easy to follow. Here's a different approach: https://stackoverflow.com/questions/11225001/reading-a-file-in-real-time-using-node-js – jfriend00 Jan 28 '21 at 04:49

0 Answers