Using HTML5 WebGL shaders for computation

Posted 2024-12-04 14:08:39

It seems to me like one could theoretically use WebGL for computation--such as computing primes or π or something along those lines. However, from what little I've seen, the shader itself isn't written in JavaScript, so I have a few questions:

  1. What language are the shaders written in?
  2. Would it even be worthwhile to attempt to do such a thing, taking into account how shaders work?
  3. How does one pass variables back and forth during runtime? Or if not possible, how does one pass information back after the shader finishes executing?
  4. Since it isn't JavaScript, how would one handle very large integers (like BigInteger in Java, or a ported version in JavaScript)?
  5. I would assume this automatically compiles the script so that it runs across all the cores in the graphics card, can I get a confirmation?

If relevant, in this specific case, I'm trying to factor fairly large numbers as part of a [very] extended compsci project.

EDIT:

  1. WebGL shaders are written in GLSL.
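
For reference, a minimal sketch of what a GLSL ES fragment shader looks like when driven from JavaScript; uResolution is an assumed uniform name, not anything standard:

    // Fragment shader source, passed to gl.shaderSource() as a string.
    // Every pixel the draw covers becomes one parallel "work item".
    const fragmentSource = `
      precision highp float;
      uniform vec2 uResolution;  // assumed uniform: output size in pixels
      void main() {
        // gl_FragCoord is this fragment's pixel position; derive a
        // per-pixel value from it as a stand-in for real work.
        vec2 cell = gl_FragCoord.xy / uResolution;
        gl_FragColor = vec4(cell.x * cell.y, 0.0, 0.0, 1.0);
      }
    `;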

Comments (2)

呆橘 2024-12-11 14:08:39

I've used compute shaders from JavaScript in Chrome using WebGL to solve the travelling salesman problem as a distributed set of smaller optimization problems solved in the fragment shader, and in a few other genetic optimization problems.

Problems:

  1. You can put floats in (r: 1.00, g: 234.24234, b: -22.0) but you can only get integers out (r: 255, g: 255, b: 0). This can be overcome by encoding a single float into 4 integers as an output per fragment. This is actually so heavy an operation that it almost defeats the purpose for 99% of problems. You're better off solving problems with simple integer or boolean sub-solutions (see the sketch after this list).

  2. Debugging is a nightmare of epic proportions, and at the time of writing the community is still actively working on it.

  3. Injecting data into the shader as pixel data is VERY slow, and reading it out is even slower. To give you an example, reading and writing the data to solve a TSP problem take 200 ms and 400 ms respectively, while the actual 'draw' or 'compute' time for that data is 14 ms. To be usable, your data set has to be large enough that the compute time dominates this transfer overhead.

  4. JavaScript is weakly typed (on the surface...), whereas OpenGL ES is strongly typed. In order to interoperate we have to use things like Int32Array or Float32Array in JavaScript, which feels awkward and constraining in a language normally touted for its freedom.

  5. Big number support comes down to using 5 or 6 textures of input data, combining all that pixel data into a single number structure (somehow...), then operating on that big number in a meaningful way. Very hacky, not at all recommended.
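
To make points 1, 3 and 4 concrete, here is a rough sketch of the round trip under WebGL 1. It assumes a gl context, a bound texture, and width/height are already set up; packFloat and unpackFloat are illustrative helpers built on a common byte-packing trick, not part of the WebGL API.

    // GLSL: pack a float in [0, 1) into the four 8-bit RGBA channels,
    // since only bytes come back out (point 1).
    const fragmentSource = `
      precision highp float;
      vec4 packFloat(float v) {
        vec4 enc = fract(v * vec4(1.0, 255.0, 65025.0, 16581375.0));
        enc -= enc.yzww * vec4(1.0/255.0, 1.0/255.0, 1.0/255.0, 0.0);
        return enc;
      }
      void main() {
        float result = 0.123456;  // stand-in for the real per-fragment work
        gl_FragColor = packFloat(result);
      }
    `;

    // Input also travels as pixel data (point 3), via typed arrays (point 4):
    const inputBytes = new Uint8Array(width * height * 4);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, inputBytes);

    // ...compile, link, draw a full-screen quad...

    // Read the bytes back out and decode fragment i into a float in [0, 1):
    const out = new Uint8Array(width * height * 4);
    gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, out);
    const unpackFloat = (bytes, i) =>
      bytes[4 * i] / 255 +
      bytes[4 * i + 1] / 65025 +      // 255^2
      bytes[4 * i + 2] / 16581375 +   // 255^3
      bytes[4 * i + 3] / 4228250625;  // 255^4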

橙味迷妹 2024-12-11 14:08:39

There's a project currently being worked on to do pretty much exactly what you're doing - WebCL. I don't believe it's live in any browsers yet, though.

To answer your questions:

  1. Already answered I guess!
  2. Probably not worth doing in WebGL. If you want to play around with GPU computation, you'll probably have better luck doing it outside the browser for now, as the toolchains are much more mature there.
  3. If you're stuck with WebGL, one approach might be to write your results into a texture and read that back.
  4. With difficulty. Much like CPUs, GPUs can only work with values of certain sizes natively, and everything else has to be emulated (a rough sketch follows this list).
  5. Yep.
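
To illustrate point 4, here is a hedged JavaScript sketch of the usual workaround: represent a big unsigned integer as base-2^16 limbs in a Uint16Array (the same layout you would pack into texture channels to ship it to a shader) and propagate carries manually. addBig is an illustrative name, not a library function.

    // Large unsigned integer as base-2^16 limbs, least significant first.
    function addBig(a, b) {
      const n = Math.max(a.length, b.length);
      const out = new Uint16Array(n + 1);
      let carry = 0;
      for (let i = 0; i < n; i++) {
        const sum = (a[i] || 0) + (b[i] || 0) + carry;
        out[i] = sum & 0xffff;  // keep the low 16 bits in this limb
        carry = sum >>> 16;     // overflow moves to the next limb
      }
      out[n] = carry;
      return out;
    }

    // Example: 0x1ffff + 1 = 0x20000
    // [0xffff, 0x0001] + [0x0001] -> [0x0000, 0x0002, 0x0000]
    addBig(Uint16Array.of(0xffff, 0x0001), Uint16Array.of(0x0001));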