We're using PIXI.js for our games, which internally uses WebGL for rendering. Every now and then I stumble across mentions of power-of-two textures and possible performance benefits of avoiding NPOT textures (https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL#Non_power-of-two_textures, https://github.com/pixijs/pixi.js/blob/master/src/core/textures/BaseTexture.js#L116). Confusingly, there are also claims that it doesn't make a difference anymore (OpenGL - Power Of Two Textures). With WebGL and browser development moving so fast, it's hard to tell which of these pieces of information is still accurate.
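As far as I understand the MDN article, the concrete difference in WebGL 1 is that NPOT textures can't use mipmaps or REPEAT wrapping. A minimal sketch of that restriction, assuming `gl` is a WebGLRenderingContext and `image` is an already-loaded image:

```js
// Check for power-of-two dimensions with the usual bit trick.
const isPow2 = (n) => (n & (n - 1)) === 0;

const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);

if (isPow2(image.width) && isPow2(image.height)) {
  // POT: mipmapping and REPEAT wrapping are available.
  gl.generateMipmap(gl.TEXTURE_2D);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
} else {
  // NPOT in WebGL 1: no mipmaps, wrapping must be CLAMP_TO_EDGE.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
}
```

But whether losing mipmaps/REPEAT actually translates into a measurable performance cost for 2D sprite rendering is exactly what I can't figure out.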
Specifically, I'm wondering whether the overhead of padding images to create POT textures (longer downloads, increased memory usage) is worth the performance benefits, if those benefits indeed exist. I couldn't find any comparisons or performance benchmarks of POT vs. NPOT textures, and I sadly don't really know how I would go about creating one myself. The padding step I have in mind is sketched below.
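For context, this is roughly what I mean by padding (`nextPow2` and `padToPow2` are just names I made up; this does it client-side, though the same thing could be done at build time):

```js
// Smallest power of two >= n.
function nextPow2(n) {
  return Math.pow(2, Math.ceil(Math.log2(n)));
}

function padToPow2(image) {
  const canvas = document.createElement('canvas');
  canvas.width = nextPow2(image.width);
  canvas.height = nextPow2(image.height);
  // Draw the source into the top-left corner; the rest stays transparent,
  // which is the extra memory I'm worried about.
  canvas.getContext('2d').drawImage(image, 0, 0);
  return canvas; // usable as a texture source (PIXI can build textures from canvases)
}
```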
Does anyone have experience in that regard, or some up-to-date numbers? Is there a good way of measuring WebGL performance?
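The best benchmark idea I've come up with so far is something like the following, assuming a `PIXI.Application` instance `app` and passing in either the padded-POT or the NPOT version of the same texture. I'm not sure it's sound, since requestAnimationFrame is vsync-capped, so the sprite count presumably has to be high enough to push frame times past the refresh interval before any difference would show up:

```js
// Render many sprites from one texture and return the average frame time in ms.
function benchmark(app, texture, frames = 600) {
  for (let i = 0; i < 2000; i++) {
    const sprite = new PIXI.Sprite(texture);
    sprite.x = Math.random() * app.renderer.width;
    sprite.y = Math.random() * app.renderer.height;
    app.stage.addChild(sprite);
  }
  return new Promise((resolve) => {
    let count = 0;
    const start = performance.now();
    app.ticker.add(function tick() {
      if (++count >= frames) {
        app.ticker.remove(tick);
        resolve((performance.now() - start) / frames); // ms per frame
      }
    });
  });
}
```

I've also seen the EXT_disjoint_timer_query extension mentioned for GPU-side timing, but I don't know how widely available it is or whether it would be more appropriate here.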