
I am connected to a WebSocket using Node.js. Every few seconds I get a message from the WebSocket with some JSON data.

Something like this:

{"event":"add","data":[
{"id":["12280965172"],"m":"Marko","e":"FA","u":"pn1","p":98.72},
{"id":["12280964541"],"m":"Darko","e":"FA","u":"kk6","p":1.02},
{"id":["12280965661"],"m":"Ivan","e":"FB","u":"jb1","p":0.23}
]}

When I get this message I want to save it into "example.json" on localhost.

"example.json" is an already existing file. Before I add this new data, I want to check whether "example.json" already contains an entry with the same "m", and only add it if it does not.

What would be the best way to do it?

MHB2011
  • This depends on how large the filesize is and whether or not you are willing/able to use a real database instead. – str Oct 13 '17 at 19:52
  • Also your priorities -- if you care about integrity then you get to do the create-a-new-file+rename-over-old-file dance. If for some reason it's acceptable to have the file have partial/bad/corrupt contents if someone reads your file in the middle of an update or if your script crashes/loses power, then one can rewrite the same inode in-place. – Charles Duffy Oct 13 '17 at 19:56
  • Closely related: [transactionally writing files in node.js](https://stackoverflow.com/questions/17047994/transactionally-writing-files-in-node-js) (though those answers don't appear to discuss all the work that needs to be done to be safe against powerloss with journaled filesystems). https://github.com/npm/write-file-atomic is a thing that exists, as well. – Charles Duffy Oct 13 '17 at 19:58
  • ...really, though, you *should* use a real database if this has any chance of growing large. One of the things a database will give you is the ability to do *real* in-place updates, which traditional filesystems don't natively support without a bunch of restrictions (i.e. new data needs to be the exact same size as the old data, or anything after the point of change needs to be fully rewritten). – Charles Duffy Oct 13 '17 at 20:02
  • The thing is that some of my other scripts use this JSON file and I really don't want to rewrite 20 scripts to use a "real" database. example.json will have 20k-30k entries. – MHB2011 Oct 13 '17 at 20:03
  • Rewriting twenty or thirty thousand lines every few seconds doesn't sound like fun to me. There *are* real databases out there that can return query results in JSON form, or you could write an API call to return the JSON you want, and have your scripts use the output from that call. – Charles Duffy Oct 13 '17 at 20:04
  • The other thing a real database worthy of the name will get you is some means of resolving conflicts -- if you get two updates being processed at the same time, you don't want one of them to be ignored (the likely case with naive code that uses the write-file-atomic library I linked above) or a corrupt or inconsistent store as a result (with code that *doesn't* try to do something similar to write-file-atomic). – Charles Duffy Oct 13 '17 at 20:08
  • @CharlesDuffy I fully understand you, I am just lazy I guess. I will do it with a real database, all from scratch... – MHB2011 Oct 13 '17 at 20:25
  • I don't know if it suits your needs but you could use RethinkDB, which is a realtime database that uses JSON. – Jesus Mendoza Oct 13 '17 at 20:32

0 Answers