It seems to me that one could theoretically use WebGL for computation, such as computing primes or π or something along those lines. However, from what little I've seen, the shader itself isn't written in JavaScript, so I have a few questions:
- What language are the shaders written in?
- Would it even be worthwhile to attempt to do such a thing, taking into account how shaders work?
- How does one pass variables back and forth at runtime? Or, if that isn't possible, how does one pass information back after the shader finishes executing?
- Since it isn't JavaScript, how would one handle very large integers (like Java's BigInteger, or a ported version in JavaScript)?
- I would assume this automatically compiles the shader so that it runs across all the cores on the graphics card; can I get a confirmation?
If relevant, in this specific case, I'm trying to factor fairly large numbers as part of a [very] extended compsci project.
EDIT:
- WebGL shaders are written in GLSL.
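
For anyone else wondering what GLSL looks like, here is a minimal WebGL 1 (GLSL ES 1.00) fragment shader, purely for illustration. A GPGPU setup would encode its inputs as textures or uniforms, write results into the fragment colour, and read them back on the JavaScript side with `gl.readPixels`; this sketch just writes a constant value per pixel.

```glsl
// Minimal GLSL ES 1.00 fragment shader (WebGL 1), illustration only.
precision mediump float;

void main() {
    // Each fragment (pixel) writes one RGBA result; in a GPGPU scheme
    // the computed value would be packed into these four channels.
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```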