Using HTML5 WebGL shaders for computation
It seems to me like one could theoretically use WebGL for computation--such as computing primes or π or something along those lines. However, from what little I've seen, the shader itself isn't written in JavaScript, so I have a few questions:
- What language are the shaders written in?
- Would it even be worthwhile to attempt to do such a thing, taking into account how shaders work?
- How does one pass variables back and forth during runtime? Or if not possible, how does one pass information back after the shader finishes executing?
- Since it isn't JavaScript, how would one handle very large integers (BigInteger in Java or a ported version in JavaScript)?
- I would assume this automatically compiles the script so that it runs across all the cores in the graphics card; can I get a confirmation?
If relevant, in this specific case, I'm trying to factor fairly large numbers as part of a [very] extended compsci project.
EDIT:
- WebGL shaders are written in GLSL.
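As a minimal sketch (not part of the original question; it assumes `gl` is a WebGL rendering context obtained from a `<canvas>`, and the shader is deliberately trivial), this is roughly what GLSL looks like and how JavaScript hands it to the driver for compilation at runtime:

```js
// GLSL fragment shader source, kept in a JavaScript string.
const fragmentSource = `
  precision highp float;
  uniform float uInput;                     // a value passed in from JavaScript
  void main() {
    // The only "output" a fragment shader has is the pixel it writes.
    gl_FragColor = vec4(uInput, 0.0, 0.0, 1.0);
  }
`;

// Compile at runtime; the GPU driver turns the GLSL into code that runs
// in parallel across the card's shader cores.
const shader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(shader, fragmentSource);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
  throw new Error(gl.getShaderInfoLog(shader));
}
```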
2 Answers
I've used compute shaders from JavaScript in Chrome using WebGL to solve the travelling salesman problem as a distributed set of smaller optimization problems solved in the fragment shader, and in a few other genetic optimization problems.
Problems:
You can put floats in (r: 1.00, g: 234.24234, b: -22.0) but you can only get integers out (r: 255, g: 255, b: 0). This can be overcome by encoding a single float into 4 integers as an output per fragment. This is actually so heavy an operation that it almost defeats the purpose for 99% of problems. You're better off solving problems with simple integer or boolean sub-solutions.
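A rough sketch of that packing trick (the names `packFloat` and `unpackFloat` are mine, not from the answer; precision is limited by the shader's float type, so the round trip is approximate):

```js
// GLSL side: spread a float in [0, 1) across the four 8-bit RGBA channels.
const packGLSL = `
  vec4 packFloat(float v) {
    vec4 enc = fract(v * vec4(1.0, 255.0, 65025.0, 16581375.0));
    enc -= enc.yzww * vec4(1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0);
    return enc;                      // assign this to gl_FragColor
  }
`;

// JavaScript side: rebuild the float from one RGBA quadruple read back
// with gl.readPixels (each channel is an integer in 0..255).
function unpackFloat(rgba) {
  return (rgba[0] +
          rgba[1] / 255 +
          rgba[2] / 65025 +
          rgba[3] / 16581375) / 255;
}
```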
Debugging is a nightmare of epic proportions, and at the time of writing the community was still actively working on it.
Injecting data into the shader as pixel data is VERY slow, and reading it out is even slower. To give you an example, reading and writing the data to solve a TSP problem takes 200 and 400 ms respectively, while the actual 'draw' or 'compute' time for that data is 14 ms. In order to be usable, your data set has to be large enough in the right way.
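For context, the upload/readback path being described looks roughly like this (a sketch only; it assumes `gl` is an existing WebGL context and omits the draw call that actually runs the fragment shader):

```js
const width = 256, height = 256;

// Upload: the input data has to be disguised as an RGBA texture.
const input = new Uint8Array(width * height * 4);    // problem data encoded as bytes
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, input);

// ... run the draw call that executes the fragment shader over every pixel ...

// Readback: the results come out as pixels, one RGBA quadruple per fragment.
const output = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, output);
```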
JavaScript is weakly typed (on the surface...), whereas OpenGL ES is strongly typed. In order to interoperate we have to use things like Int32Array or Float32Array in JavaScript, which feels awkward and constraining in a language normally touted for its freedoms.
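To make the typed-array point concrete (again a sketch of my own, assuming `gl` is a WebGL context):

```js
// WebGL only accepts typed arrays for buffer uploads, so plain JavaScript
// arrays have to be converted up front.
const positions = new Float32Array([
  -1, -1,   1, -1,   -1, 1,      // one triangle in clip space
]);

const buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);  // a plain Array is rejected here
```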
Big number support comes down to using 5 or 6 textures of input data, combining all that pixel data into a single number structure (somehow...), then operating on that big number in a meaningful way. Very hacky, not at all recommended.
There's a project currently being worked on to do pretty much exactly what you're doing - WebCL. I don't believe it's live in any browsers yet, though.
To answer your questions: