Is there some way (using grunt, gulp, plain JS, a node module, etc.) to do the following?

Taking a JS file as input, build a series of browser-specific files based upon each browser's current support of ES6 features, transpiling only the features that are not yet supported.

I'd like to use ES6 features natively as they become available and transpile the ones that aren't yet supported to ES5.

Also, for those of us who have the pleasure of dealing with older browsers (e.g. IE9, IE10, and soon IE11), some process for handling them other than transpiling all the JS I write forever will need to emerge eventually :).

  • I haven't heard of any, but have thought about this as well. – Felix Kling Jun 02 '15 at 02:03
  • An interesting idea (which I had considered once as well), and it would probably push ES6 implementation in browsers if it was widely used. However, see [Why not sending JavaScript files in browser-specific bytecode?](http://stackoverflow.com/q/28649467/1048572) for some contra arguments (not all of them apply of course) – Bergi Jun 02 '15 at 04:17
  • You might be able to do this for slow-moving browsers like IE, but evergreen browsers with tight release cycles and partial feature support would be much more difficult – CodingIntrigue Jun 02 '15 at 14:22
  • More difficult but not impossible :). Chrome and FF are really the fast movers and depending on what they're up to for a particular release, releases are normally at least a month apart. I could look it up, but I'm too lazy. – JayRu Jun 03 '15 at 01:25

1 Answer

You apparently mean to have separate builds for different browsers, which you will presumably serve up depending on some kind of user agent sniffing, or dynamically load after browser-side feature detection. That sounds complicated and error-prone. You'll also need to constantly rebuild your browser-specific versions as new browser versions come out, such as a version of Chrome that supports fat arrow.
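
For concreteness, here is a minimal sketch of what the browser-side "feature-detect, then load" approach ends up looking like. The bundle names and the detection test are assumptions for illustration, not the output of any existing tool:

// hypothetical loader: feature-detect arrow functions, then load either
// the untranspiled ES6 bundle or the fully transpiled ES5 fallback
(function () {
  var supportsArrows = false;
  try {
    // eval keeps the ES6 syntax out of this file's own parse,
    // so old browsers don't choke on the loader itself
    eval("(x => x)");
    supportsArrows = true;
  } catch (e) { /* SyntaxError: no arrow support */ }
  var script = document.createElement("script");
  script.src = supportsArrows ? "app.es6.js" : "app.es5.js";
  document.head.appendChild(script);
}());

Multiply that by every ES6 feature you rely on, and by every browser release in the wild, and you can see how quickly the matrix of builds becomes a maintenance burden.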

What is the problem you are trying to solve? Do you believe that the native implementations will be faster? That's possible, but not necessarily the case, and if there is a difference it is likely to be minimal. Are you worried about payload size, since ES6 syntax is often more succinct? That difference is also likely to be negligible once JS is minified and zipped. I'd much rather have the same ES5 transpiled code running in all browsers, and avoid having to track down weird bugs where a certain browser's support of a certain ES6 feature, which you thought would allow you to avoid transpiling, turns out to be shaky.

I'll give you a concrete example. Let's say you decide that the code you compile for node does not need to transpile fat arrows, since you heard that node supports them with the --harmony flag. So you write

var foo = {
  x: 42,
  bar: function() { setTimeout(() => console.log(this.x)); }
};  
foo.bar();

But then you find out that node does not support the lexicalization of `this` in fat arrow functions:

> node --harmony test.js
< undefined

Wouldn't you rather have a transpiler like babel reliably transpile this into correct ES5?

var foo = {
  x: 42,
  bar: function () {
    var _this = this;
    setTimeout(function () {
      return console.log(_this.x);
    });
  }
};
foo.bar();

> babel-node test.js
< 42

Once you are comfortable that all the browsers you want to support have implemented a particular ES6 feature, most transpilers provide feature-by-feature flags that let you tell them to skip transpiling it.
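
For example, with babel 5.x (current at the time of writing) you can blacklist individual transformers. Here's a rough gulp sketch; the source glob, output directory, and the choice of blacklisted transformers are assumptions for illustration:

// gulpfile.js sketch: gulp 3.x + gulp-babel (wrapping Babel 5.x)
var gulp = require("gulp");
var babel = require("gulp-babel");
gulp.task("build", function () {
  return gulp.src("src/**/*.js")
    .pipe(babel({
      // skip transpiling features every target browser already handles natively
      blacklist: ["es6.arrowFunctions", "es6.templateLiterals"]
    }))
    .pipe(gulp.dest("dist"));
});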

A modified version of your proposal would be to have two builds, one fully transpiled, and one non-transpiled for browsers with complete ES6 support. That would allow you to avoid having to include the transpiler's runtime (such as babel's browser-polyfill.js) in the latter case. However, it would also prevent you from taking advantage of babel's support of ES7 features, some of which are very useful, such as async functions.
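
A rough sketch of that two-build variant, again assuming gulp and gulp-babel; the task names and output paths are made up:

// gulpfile.js sketch: one fully transpiled ES5 bundle for everything else,
// one untouched ES6 bundle for browsers with complete ES6 support
var gulp = require("gulp");
var babel = require("gulp-babel");
gulp.task("build-es5", function () {
  return gulp.src("src/**/*.js")
    .pipe(babel())                  // transpile everything to ES5
    .pipe(gulp.dest("dist/es5"));
});
gulp.task("build-es6", function () {
  return gulp.src("src/**/*.js")    // pass the source through untouched
    .pipe(gulp.dest("dist/es6"));
});
gulp.task("build", ["build-es5", "build-es6"]);

You would still need something like the feature-detection switch sketched above to decide which of the two bundles a given browser actually gets.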

  • I'd assume one would dynamically load the built script after feature-detection, instead of serving them based on UA-sniffing (which is bad for a couple of other reasons, too) – Bergi Jun 02 '15 at 04:07
  • @Bergi: That would impact time to render(/use) significantly though. – Felix Kling Jun 02 '15 at 18:29
  • @FelixKling: You might be able to come up with an adaptive solution that tries (for unknown user agents) pure ES6 first and then gracefully degrades to transpiled scripts with fewer demands. – Bergi Jun 02 '15 at 18:31
  • @Bergi: True. I was thinking about having a transpiled version for each latest browser based on the data from kangax' compatibility table. Ideally that browser test would be taken care of by the webserver itself. If only UA strings were more reliable :-/ But maybe if such a feature was added to web servers, UAs would be spoofed less. One can dream :) – Felix Kling Jun 02 '15 at 18:34
  • @torazburo: Mostly accurate, astute points, but one thing you said: "Once you are comfortable that all the browsers you want to support have implemented a particular ES6 feature" is a problem for me and most likely a lot of the world for a long time, as we deal with those browsers that aren't updatable (e.g. IE's) and will NEVER support ES6 for as long as they shall exist. So while it may suck, we'll have to come up with something to support the browsers that never support ES6, and might as well support the browsers as they move towards full ES6 compatibility. – JayRu Jun 03 '15 at 01:44