
I'm integrating a *.webm video with alpha transparency. At the moment, the transparency is only supported in Chrome and Opera (demo: http://simpl.info/videoalpha/). Firefox, for example, plays the video since it supports the WebM format, but shows a black background instead of the transparency.

My plan is to display the video's poster image instead of the video if the browser does not support alpha transparency. So the video should only play if the browser supports WebM alpha transparency. I know how to detect the browser or rendering engine and play the video based on that (see code below), but is there a "feature detection" way?

var supportsAlphaVideo = (/Chrome/.test(navigator.userAgent) && /Google Inc/.test(navigator.vendor)) || /OPR/.test(navigator.userAgent);
if (supportsAlphaVideo) {
    document.querySelector(".js-video").play();
}

See also http://updates.html5rocks.com/2013/07/Alpha-transparency-in-Chrome-video

chaenu
  • You could use [Modernizr](http://modernizr.com/docs/), a JavaScript library that detects HTML5 and CSS3 features in the user's browser. – Chris Traveis Apr 02 '15 at 15:25
  • In my opinion there's no way to detect a specific feature of the WebM codec with Modernizr. – chaenu Apr 02 '15 at 15:30

2 Answers

Here's a working solution to test for alpha support in WebM.

I basically combined Capture first frame of an embedded video and check_webp_feature

The video used for the test is base64-encoded into the source. It's a tiny VP9 WebM video encoded with:

ffmpeg -i alpha.png -c:v libvpx-vp9 alpha.webm

If you want to test for VP8 alpha support instead, encode your own video and drop the -vp9 suffix from the encoder name (i.e. use libvpx instead of libvpx-vp9). alpha.png is a 64x64 pixel, 100% transparent PNG image.
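Assuming the same alpha.png input, the VP8 variant of that command would presumably be just:

ffmpeg -i alpha.png -c:v libvpx alpha.webm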

var supportsWebMAlpha = function(callback)
{
    var vid = document.createElement('video');
    vid.autoplay = false;
    vid.loop = false;
    vid.style.display = "none";
    vid.addEventListener("loadeddata", function()
    {
        document.body.removeChild(vid);
        // Create an off-screen canvas; it is only used to sample the decoded frame, never shown to the user.
        var canvas = document.createElement("canvas");

        //If we don't support the canvas, we definitely don't support webm alpha video.
        if (!(canvas.getContext && canvas.getContext('2d')))
        {
            callback(false);
            return;
        }

        // Get the drawing context for canvas.
        var ctx = canvas.getContext("2d");

        // Draw the current frame of video onto canvas.
        ctx.drawImage(vid, 0, 0);
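        // The test video is fully transparent, so an alpha of 0 in the
        // top-left pixel means the browser decoded the alpha channel.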
        if (ctx.getImageData(0, 0, 1, 1).data[3] === 0)
        {
            callback(true);
        }
        else
        {
            callback(false);
        }

    }, false);
    vid.addEventListener("error", function()
    {
        document.body.removeChild(vid);
        callback(false);
    });

    vid.addEventListener("stalled", function()
    {
        document.body.removeChild(vid);
        callback(false);
    });

    //Just in case
    vid.addEventListener("abort", function()
    {
        document.body.removeChild(vid);
        callback(false);
    });

    var source = document.createElement("source");
    source.src="data:video/webm;base64,GkXfowEAAAAAAAAfQoaBAUL3gQFC8oEEQvOBCEKChHdlYm1Ch4ECQoWBAhhTgGcBAAAAAAACBRFNm3RALE27i1OrhBVJqWZTrIHlTbuMU6uEFlSua1OsggEjTbuMU6uEHFO7a1OsggHo7AEAAAAAAACqAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAVSalmAQAAAAAAADIq17GDD0JATYCNTGF2ZjU3LjU3LjEwMFdBjUxhdmY1Ny41Ny4xMDBEiYhARAAAAAAAABZUrmsBAAAAAAAARq4BAAAAAAAAPdeBAXPFgQGcgQAitZyDdW5khoVWX1ZQOYOBASPjg4QCYloA4AEAAAAAAAARsIFAuoFAmoECU8CBAVSygQQfQ7Z1AQAAAAAAAGfngQCgAQAAAAAAAFuhooEAAACCSYNCAAPwA/YAOCQcGFQAADBgAABnP///NXgndmB1oQEAAAAAAAAtpgEAAAAAAAAk7oEBpZ+CSYNCAAPwA/YAOCQcGFQAADBgAABnP///Vttk7swAHFO7awEAAAAAAAARu4+zgQC3iveBAfGCAXXwgQM=";
    source.addEventListener("error", function()
    {
       document.body.removeChild(vid);
       callback(false);
    });
    vid.appendChild(source);

    //This is required for IE
    document.body.appendChild(vid);
};

supportsWebMAlpha(function(result)
{
   if (result)
   {
       alert('Supports WebM Alpha');
   }
   else 
   {
       alert('Doesn\'t support WebM Alpha');
   }
});
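Applied to the question's setup (the .js-video selector and the play() call come from the question; the poster fallback assumes the video element has no autoplay attribute), the detection could be wired up like this:

supportsWebMAlpha(function(result)
{
    if (result)
    {
        // Alpha is supported: start the video.
        document.querySelector(".js-video").play();
    }
    // Otherwise do nothing, so the poster image stays visible.
});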
Tom Mettam
  • I found out that some versions of IE11 tend to fall back `ctx.getImageData(0, 0, 1, 1).data` to the array `[0, 0, 0, 0]`, therefore passing the test. I made another version with a 1-pixel video that has semi-transparency, where `data[3] === 128`. Unfortunately it doesn't fit in a comment and I don't want to create a separate post out of it, so: https://pastebin.com/swzkRpsw – Luixo Feb 08 '19 at 12:13
  • @Luixo: at 1x1 some older versions of WebKit fail the test even though they do support alpha. 64x64 is the minimum video size that appeared to work on all of them. – Tom Mettam Feb 13 '19 at 14:20
  • Do you have an exact webkit engine version? I should check on that. – Luixo Feb 14 '19 at 17:02
  • Unfortunately not, I did this research a long time ago. Sorry :( – Tom Mettam Feb 14 '19 at 20:29

There are no properties exposed that give any information about the video or its channels.

The only ways to do this are to:

  • Know in advance, and serve that knowledge to the browser as metadata alongside the video when it is requested (see the small sketch after this list)
  • Use a canvas to analyze the image data
  • Load the file as binary data and parse the WebM container manually to extract the information. Doable, but very inconvenient, since the complete file must be downloaded and a parser must be written.
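A minimal sketch of the first option, assuming the server marks the video element with a hypothetical data-has-alpha attribute:

// data-has-alpha is an invented attribute name; any server-supplied
// metadata (attribute, JSON, HTTP header) would work the same way.
var video = document.querySelector(".js-video");
if (video.dataset.hasAlpha === "true")
{
    video.play();
}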

If you don't know in advance, or have no way to supply the metadata, then canvas is your best option.

Canvas

You can use a canvas to test for actual transparency. However, this does have CORS requirements: the video must be served from the same origin, or the external server must allow cross-origin use.

Additionally, you have to actually start loading the video, which can have an impact on bandwidth as well as performance. You probably want to do this with a dynamically created video element and canvas.

From there, it is fairly straightforward (a sketch follows the list below):

  • Create a small canvas
  • Draw a frame into it (one that is expected to have an alpha channel)
  • Extract the pixels (CORS requirements here)
  • Loop through the buffer using a Uint32Array view and check each pixel's alpha for values < 255 (on little-endian platforms the alpha channel is the high byte of each 32-bit value, so the check is (pixel >>> 24) < 255).

This is fairly fast to do; a canvas of half the video's size or even smaller is enough.
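Here is a minimal sketch of that approach (not part of the original answer; frameHasAlpha and the half-size canvas are just illustrations, and the video is assumed to be same-origin or CORS-enabled with a transparent frame already decoded):

var frameHasAlpha = function(video)
{
    // Sample the current frame at reduced size.
    var canvas = document.createElement("canvas");
    canvas.width = Math.max(1, video.videoWidth >> 1);
    canvas.height = Math.max(1, video.videoHeight >> 1);

    var ctx = canvas.getContext("2d");
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);

    // getImageData throws if the canvas is tainted by a cross-origin video.
    var data = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
    var pixels = new Uint32Array(data.buffer);

    for (var i = 0; i < pixels.length; i++)
    {
        // ImageData is RGBA in memory, so on little-endian platforms
        // the alpha channel is the high byte of each 32-bit value.
        if ((pixels[i] >>> 24) < 255)
        {
            return true;
        }
    }
    return false;
};

Call it once the video has fired loadeddata (or after seeking to a frame that is known to contain transparency).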

  • Sorry for my late answer. Thank you for proposed solution, using canvas to detect the alpha transparency is a clever way to analyze the content of the video. – chaenu Apr 13 '15 at 16:31