28

Currently I send JSON from an Ajax POST to the server, where it is converted to objects using the Jackson mapper.

The format is like this:

{"id":"780710","folderID":"42024","displayOrder":2},{"id":"780724","folderID":"42024","displayOrder":3}

What is the best JavaScript library to compress this data, and will the Jackson mapper be able to handle the new format?
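
A minimal sketch of the binding being described, assuming the objects arrive wrapped in a JSON array (the `Item` POJO is a hypothetical stand-in for the real server-side type):

```java
// Illustration only: Item is a stand-in name, not the actual class from the question.
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class JacksonBindingSketch {

    // Hypothetical POJO mirroring the keys in the posted JSON
    public static class Item {
        public String id;
        public String folderID;
        public int displayOrder;
    }

    public static void main(String[] args) throws Exception {
        String json = "[{\"id\":\"780710\",\"folderID\":\"42024\",\"displayOrder\":2},"
                    + "{\"id\":\"780724\",\"folderID\":\"42024\",\"displayOrder\":3}]";

        ObjectMapper mapper = new ObjectMapper();
        List<Item> items = mapper.readValue(json, new TypeReference<List<Item>>() {});
        System.out.println(items.size() + " items, first id = " + items.get(0).id);
    }
}
```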

rayray
  • Any smaller than that and you're talking about using fewer characters in key names, which can begin to degrade the readability of your JSON object. Then again, some programmers have no problem using obscure key names that don't describe the data they point to. – kevin628 Jul 13 '12 at 15:32
  • OK, if the above JSON data example were 100 times the size, I could use a library like https://github.com/WebReflection/json.hpack/wiki, which would reduce the JSON to ["id",["780710","780724"],"folderID",["42024"],"displayOrder",["2","3"]]. Can Jackson handle this format? – rayray Jul 13 '12 at 15:35
  • What is Jackson? Michael Jackson? ;) If you want to compress it, you would need to have something uncompress it and then run it through JSON.parse. What you have in the comment above is some weird array. – epascarello Jul 13 '12 at 15:45
  • @epascarello What is Google? Try that first – Ian Jul 13 '12 at 15:45
  • rjson anyone? http://www.cliws.com/e/06pogA9VwXylo_GknPEeFA/ - I don't know enough, but looks interesting – David Hobs Dec 27 '12 at 12:19
  • Instead of using transformations, actual real compression (gzip, lzf/lz4/snappy) gets better results, and is more compatible if the client supports it (browsers do) – StaxMan Nov 26 '13 at 19:51
  • TL;DR: Use [MessagePack](https://msgpack.org/) [(2)](https://github.com/msgpack/msgpack) or [JSONH](https://github.com/WebReflection/JSONH) – Andrew Jul 02 '20 at 04:31

5 Answers

41

Why not just enable the gzip compression that browsers and web servers support? It will compress data sizes very nicely, with very little explicit work.
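
Usually this is just a setting in the web server or servlet container. If it has to be done by hand, a minimal sketch (an illustration, not the recommended route) using `java.util.zip` in a plain servlet might look like this:

```java
// Sketch only: manually gzipping a Jackson-serialized response with java.util.zip.
// Normally the web server / container does this transparently when configured.
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class GzipJsonWriter {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void writeJson(HttpServletRequest req, HttpServletResponse resp, Object payload)
            throws Exception {
        resp.setContentType("application/json");
        OutputStream out = resp.getOutputStream();

        // Only compress if the browser advertised gzip support
        String acceptEncoding = req.getHeader("Accept-Encoding");
        if (acceptEncoding != null && acceptEncoding.contains("gzip")) {
            resp.setHeader("Content-Encoding", "gzip"); // tell the browser the body is gzipped
            out = new GZIPOutputStream(out);
        }

        // Jackson writes straight to the (possibly gzipped) stream and closes it
        // by default, which also writes the gzip trailer.
        MAPPER.writeValue(out, payload);
    }
}
```

The browser advertises support via `Accept-Encoding: gzip` and decompresses the response transparently.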

StaxMan
  • A very good idea, but how would you enable that? – humanityANDpeace Nov 23 '13 at 09:08
  • This depends on the container you are using. Or, if you have to do it manually, use classes from `java.util.zip` and set the HTTP header indicating the compression used (`Content-Encoding` or such). Note, though, that the caller is supposed to send the list of compression types it supports. – StaxMan Nov 26 '13 at 19:50
  • This approach looks way better than the others: https://groups.google.com/forum/#!topic/golang-nuts/8AZ3wfseoJE. 76%, 91% compression, impressive – allenhwkim Jun 10 '14 at 16:44
  • Yes, gzip can produce impressive ratios with JSON and XML, especially if the content is indented (which should NOT be done in production, but some developers leave it on accidentally). – StaxMan Jun 10 '14 at 18:14
  • "Why not?" gzip is an excellent solution, so +1. But there might be better solutions, because gzip is a generic compression algorithm. In principle, you can do better with a specific algorithm that knows how the data is structured, and JSON has a simple, well-defined structure. I have not found one. – James Jul 22 '14 at 12:20
  • @James Theoretically yes, but in practice I've yet to see something practical for JSON, except for better trade-offs between CPU and compression. LZ4/LZF/Snappy, for example, are MUCH faster, with somewhat lower compression. But I am not aware of any compression algorithm specifically targeting JSON (or XML, for that matter); there are binary formats with a compatible logical model, but those offer more modest compression. – StaxMan Dec 31 '14 at 19:42
  • This answer is a little inadequate, since the OP asked for a way to compress an AJAX request. Without any external libraries, gzip compression is only possible for data sent from the server. – Lewis Jul 21 '15 at 10:40
  • Might be worth checking what the browser's Ajax helper methods offer. There may be a setting to enable/force gzip compression. I haven't written Ajax apps for a while, but I recall there being a few settings there. If not, you are right, you can't really do it from JavaScript itself without external libs. – StaxMan Jul 21 '15 at 18:25
  • This answer doesn't address the question "send JSON from an Ajax post to the server" – Josh Noe Feb 07 '17 at 21:59
  • @JoshNoe No it does not. Why should it? That would have been odd as that was not being asked. – StaxMan Feb 08 '17 at 19:23
  • @StaxMan Maybe I'm misunderstanding the question. It looks like rayray wants to know how to compress his data in the browser and post it to the server. Gzip compression only works server -> browser. – Josh Noe Feb 08 '17 at 19:32
15

As @JamWaffles said, this is about the best JSON itself can do in terms of compression, and in your case (the snippet you posted) compressing further may be overkill.

But if you have larger responses and you want to save those bytes, have a look at BSON (http://bsonspec.org) or MessagePack (https://msgpack.org). They are not JSON, but they serialize the same data to a smaller format (in most cases).
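
A sketch of one convenient route on the Jackson side (my assumption, not something this answer prescribes): MessagePack's Jackson module swaps the JSON factory for a MessagePack one, so the existing data binding keeps working.

```java
// Sketch: assumes the jackson-dataformat-msgpack module
// (org.msgpack:jackson-dataformat-msgpack) is on the classpath;
// the Item POJO is a hypothetical stand-in mirroring the question's keys.
import com.fasterxml.jackson.databind.ObjectMapper;
import org.msgpack.jackson.dataformat.MessagePackFactory;

public class MessagePackSketch {

    public static class Item {
        public String id;
        public String folderID;
        public int displayOrder;
    }

    public static void main(String[] args) throws Exception {
        Item item = new Item();
        item.id = "780710";
        item.folderID = "42024";
        item.displayOrder = 2;

        ObjectMapper jsonMapper = new ObjectMapper();
        ObjectMapper msgpackMapper = new ObjectMapper(new MessagePackFactory());

        byte[] asJson = jsonMapper.writeValueAsBytes(item);
        byte[] asMsgpack = msgpackMapper.writeValueAsBytes(item);
        System.out.println("JSON: " + asJson.length + " bytes, MessagePack: " + asMsgpack.length + " bytes");

        // Same Jackson data binding, smaller wire format
        Item back = msgpackMapper.readValue(asMsgpack, Item.class);
        System.out.println(back.id);
    }
}
```

As the comments below note, if the values are mostly strings, the savings over plain JSON can be small.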

Beat Richartz
  • Alas, BSON is typically NOT more compact than JSON. Also, if the client is written in JavaScript, binary formats are a really bad match, since the client side will be much slower (JSON parsers/generators are native; the browser provides no native binary codecs). – StaxMan Jul 13 '12 at 19:07
  • Thanks Beat Richartz and staxman – rayray Jul 16 '12 at 08:51
  • Neither of those worked great for me. Here's the breakdown I got on a 4 MB JSON file. BSON: 5.2 MB; msgpack: 5.1 MB; lz-string (UTF16): 607 KB; lz-string: 576 KB; lz4: 555 KB; lz-string (base64): 519 KB – Peter Ehrlich Mar 31 '14 at 02:31
  • @PeterEhrlich I'd love to see that 4 MB JSON file. Especially in the case of msgpack, I can't see how you could possibly end up with a roughly 25% larger file. – Beat Richartz Mar 31 '14 at 12:18
  • @BeatRichartz Perhaps due to base64 encoding... generally, if the content is mostly String-valued, there isn't much binary formats can do, so that plus base64 would explain it. – StaxMan Jun 10 '14 at 18:16
6

You can use e.g. JSONH, the successor of hpack, which has benchmarks at web-resource-optimization. It helps, but the same site will also show you that gzip alone will probably be enough.

So to be clear, gzip works better than hpack, but combining them adds a little more compression.
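
On the Jackson side, a packed array like this is still valid JSON, so it can be read as a tree and unpacked by hand. A minimal sketch, assuming the usual JSONH layout of `[keyCount, keys..., row values...]` (check the library's actual output before relying on this):

```java
// Sketch: reading a JSONH-style packed array back into maps with plain Jackson.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class JsonhUnpackSketch {
    public static void main(String[] args) throws Exception {
        String packed = "[3,\"id\",\"folderID\",\"displayOrder\","
                      + "\"780710\",\"42024\",2,\"780724\",\"42024\",3]";

        ObjectMapper mapper = new ObjectMapper();
        JsonNode arr = mapper.readTree(packed);

        int keyCount = arr.get(0).asInt();                 // number of keys shared by every row
        List<Map<String, Object>> rows = new ArrayList<>();
        for (int i = 1 + keyCount; i < arr.size(); i += keyCount) {
            Map<String, Object> row = new LinkedHashMap<>();
            for (int k = 0; k < keyCount; k++) {
                row.put(arr.get(1 + k).asText(), mapper.treeToValue(arr.get(i + k), Object.class));
            }
            rows.add(row);
        }
        // [{id=780710, folderID=42024, displayOrder=2}, {id=780724, folderID=42024, displayOrder=3}]
        System.out.println(rows);
    }
}
```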

Mark
  • jsonh is not a successor of hpack. They use similar storage formats, but hpack goes deep into objects, while jsonh converts only one level of an array, ignoring nested objects. Neither is a compression tool; rather, they are efficient storage formats for homogeneous data. – metalim Jan 20 '17 at 13:02
  • Okay, thanks for the correction. I must've been confused by the "JSONH is the latest version of json.hpack project and based on JSONDB concept." on the jsonh page. – Mark Jan 20 '17 at 16:34
  • Well, ok, it is successor-ish. They have different functionality though. – metalim Jan 24 '17 at 10:00
  • Details of changes here: http://webreflection.blogspot.com.ee/2011/08/last-version-of-json-hpack.html – metalim Jan 24 '17 at 10:11
2

According to this tweet by modec, compressing JSON is indeed possible, and provides better results than the tested alternatives.

It's possible to handle the JSON format with Node.js, and a recent open-source project just implemented a very fast compression algorithm for Node.js.

Cyan
2

JsonZipper is also great for multiple similar repeated objects. It allows you to use an object in its zipped state by extracting only one object by index from the array, so memory-wise it is great, as it will only ever have extracted what you want.

Oh, and you can actually compress on the fly: as you are generating data objects, you can compress them as you go, so you always keep a small memory footprint.

Most other compression algorithms have to compress and extract all the data at once.

Note, however: if your data is a homogeneous collection (exactly the same keys), then hpack will be better.

Gerardlamo