Is GPGPU a hack?

Posted 2024-08-04 16:12:35

A few days ago I started working on GPGPU and successfully implemented a Cholesky factorization with good performance. I then attended a conference on High Performance Computing, where some people said that "GPGPU is a hack".

I am still confused about what that means and why they called it a hack. One person said it is a hack because you are converting your problem into a matrix and doing operations on it. But I am still not sure: do people really consider it a hack, and if so, why?

Can anyone explain why they called it a hack? I found nothing wrong with it.
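For readers who have not seen GPGPU code, here is a minimal sketch of what an unblocked, right-looking Cholesky factorization can look like in CUDA. This is my own illustrative assumption, not the asker's implementation; a production version (e.g. cuSOLVER's potrf) would be blocked and far more elaborate. It assumes a column-major symmetric positive-definite matrix already resident in device memory.

```
#include <cmath>
#include <cuda_runtime.h>

// Scale the part of column j below the diagonal by 1/A(j,j).
// Column-major storage: element (row, col) lives at A[col * n + row].
__global__ void scale_column(double* A, int n, int j) {
    int i = j + 1 + blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) A[j * n + i] /= A[j * n + j];
}

// Rank-1 update of the trailing submatrix (lower triangle only):
// A(i,k) -= A(i,j) * A(k,j) for i >= k > j.
__global__ void trailing_update(double* A, int n, int j) {
    int i = j + 1 + blockIdx.x * blockDim.x + threadIdx.x;
    int k = j + 1 + blockIdx.y * blockDim.y + threadIdx.y;
    if (i < n && k < n && i >= k)
        A[k * n + i] -= A[j * n + i] * A[j * n + k];
}

// Factor dA (device pointer, n x n, SPD) in place into its lower
// Cholesky factor. One kernel-launch pair per column: simple, not fast.
void cholesky_inplace(double* dA, int n) {
    for (int j = 0; j < n; ++j) {
        // sqrt of the diagonal element, done on the host (one scalar).
        double ajj;
        cudaMemcpy(&ajj, dA + j * n + j, sizeof(double),
                   cudaMemcpyDeviceToHost);
        ajj = std::sqrt(ajj);
        cudaMemcpy(dA + j * n + j, &ajj, sizeof(double),
                   cudaMemcpyHostToDevice);

        int rem = n - j - 1;
        if (rem <= 0) continue;
        scale_column<<<(rem + 127) / 128, 128>>>(dA, n, j);
        dim3 threads(16, 16);
        dim3 blocks((rem + 15) / 16, (rem + 15) / 16);
        trailing_update<<<blocks, threads>>>(dA, n, j);
    }
    cudaDeviceSynchronize();
}
```

Even this toy version hints at why the "hack" debate exists: the mathematics is textbook, but the structure (per-column kernel launches, a host round-trip for a single scalar) is dictated entirely by the hardware.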

Comments (6)

遥远的她 2024-08-11 16:12:35

One possible reason for this opinion is that the GPU was not originally intended for general-purpose computation. Also, programming a GPU is less traditional and more hardcore, and therefore more likely to be perceived as a hack.

The point that "you convert the problem into a matrix" is not reasonable at all. Whatever task you solve by writing code, you choose sensible data structures. In the case of the GPU, matrices are likely the most sensible data structures, and using them is not a hack but a natural choice.

However, I suppose it is only a matter of time before GPGPU becomes widespread. People just have to get used to the idea. After all, who cares which unit of the computer runs the program?

2024-08-11 16:12:35

On the GPU, efficient memory access is paramount to achieving optimal performance. This often involves restructuring, or even replacing outright, your algorithms and data structures. That is one reason why GPU programming can be perceived as a hack.

Secondly, adapting an existing algorithm to run on the GPU is not in and of itself science. The relatively low scientific contribution of some GPU algorithm papers has led to a negative perception of GPU programming as strictly "engineering".
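As a concrete, hypothetical illustration of the memory-access point: the two CUDA kernels below do the same arithmetic, summing along one dimension of a row-major n x n matrix, yet the second is typically much faster because at every loop step consecutive threads read consecutive addresses (coalesced loads).

```
// Row sums, one thread per row: at each loop step, neighbouring threads
// read addresses n floats apart -- strided, uncoalesced, slow.
__global__ void row_sums(const float* A, float* out, int n) {
    int row = blockIdx.x * blockDim.x + threadIdx.x;
    if (row >= n) return;
    float s = 0.0f;
    for (int col = 0; col < n; ++col)
        s += A[row * n + col];
    out[row] = s;
}

// Column sums, one thread per column: at each loop step, neighbouring
// threads read adjacent floats -- coalesced, fast.
__global__ void col_sums(const float* A, float* out, int n) {
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (col >= n) return;
    float s = 0.0f;
    for (int row = 0; row < n; ++row)
        s += A[row * n + col];
    out[col] = s;
}
```

If the problem genuinely needs row sums, you end up transposing the data or redesigning the kernel around shared memory: exactly the kind of restructuring this answer describes.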

习惯成性 2024-08-11 16:12:35

Obviously, only the person who said it can say for certain why he said it, but here's my take:

  • A "hack" is not necessarily a bad thing.
  • It forces people to learn new programming languages and concepts. For people who are just trying to model the weather, protein folding, or drug reactions, this is an unwelcome annoyance. They didn't really want to learn FORTRAN (or whatever) in the first place, and now they have to learn yet another programming system.
  • The programming tools are NOT very mature yet.
  • The hardware isn't as reliable as CPUs (yet), so all of the calculations have to be done twice to make sure you've got the right answer (see the sketch after this answer). One reason for this is that GPUs don't yet come with error-correcting memory, so if you're trying to build a supercomputer with thousands of processors, the probability of a cosmic ray flipping a bit in your numbers approaches certainty.

As for the comment "you are converting your problem into a matrix and doing operations on it", I think that shows a lot of ignorance. Virtually ALL of high-performance computing fits that description!
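Here is a minimal sketch of the "compute twice" workaround from the last bullet, assuming a CUDA setup; this is my own illustration, not a scheme the answerer specified. The idea is simply to run the same work into two independent buffers and accept the result only when they agree.

```
#include <cstring>
#include <vector>
#include <cuda_runtime.h>

// Run the same device computation twice into independent buffers and
// compare on the host. A transient fault (e.g. a cosmic-ray bit flip
// in non-ECC device memory) during one run shows up as a mismatch, and
// the caller can redo the work. Assumes the computation is deterministic
// (no atomics with order-dependent results), or bitwise comparison is
// too strict.
template <typename Launch>
bool results_agree(Launch launch, float* d_out1, float* d_out2, size_t n) {
    launch(d_out1);   // first run
    launch(d_out2);   // second, independent run
    std::vector<float> h1(n), h2(n);
    cudaMemcpy(h1.data(), d_out1, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaMemcpy(h2.data(), d_out2, n * sizeof(float), cudaMemcpyDeviceToHost);
    return std::memcmp(h1.data(), h2.data(), n * sizeof(float)) == 0;
}
```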

贵在坚持 2024-08-11 16:12:35

One of the major problems with GPGPU over the past few years, and probably for the next few, is that programming GPUs for arbitrary tasks is not very easy. Until DX10 there was no integer support on GPUs, and branching is still very poor. This is very much a situation where, in order to get maximum benefit, you have to write your code in a very awkward manner to extract all sorts of efficiency gains from the GPU. This is because you're running on hardware that is still dedicated to processing polygons and textures rather than abstract parallel tasks.

Obviously, that's my take on it, and YMMV.
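To make the "awkward manner" concrete, here is an assumed CUDA example of restructuring around poor branching: threads in a warp that take different branch paths are serialized, so conditionals are often rewritten as arithmetic.

```
// Branchy version: if neighbouring threads in a warp disagree on the
// condition, the hardware executes both paths one after the other.
__global__ void clamp_branchy(const float* x, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (x[i] > 0.0f) out[i] = x[i];
    else             out[i] = 0.0f;
}

// Restructured version: same result, no divergent branch.
__global__ void clamp_branchless(const float* x, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    out[i] = fmaxf(x[i], 0.0f);
}
```

(A toy case; a compiler may well predicate this particular branch by itself, but the pattern generalizes to heavier conditionals.)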

总攻大人 2024-08-11 16:12:35

GPGPU harks back to the days of the math co-processor. A hack is a shortcut to solving a long-winded problem. GPGPU is a hack just like NAT on top of IPv4 is a hack. Computational problems, just like networks, are getting bigger as we try to do more. GPGPU is a useful interim solution; whether it stays outside the core CPU chip with its own cranky API, or gets absorbed into the CPU via API or manufacturing, is up to the pathfinders.

情绪失控 2024-08-11 16:12:35

I suppose he meant that using GPGPU forces you to restructure your implementation so that it fits the hardware rather than the problem domain. An elegant implementation should fit the latter.

Note that the word "hack" may have several different meanings:
http://www.urbandictionary.com/define.php?term=hack
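A small, assumed example of what "fitting the hardware, not the problem domain" means in practice: the natural problem-domain layout is an array of structs, but the GPU prefers a struct of arrays, so that a warp reading one field touches contiguous memory.

```
// Problem-domain layout (array-of-structs): natural to reason about,
// but a warp reading 32 x-values must gather from 32 interleaved spots.
struct ParticleAoS {
    float x, y, z, mass;
};

// Hardware-friendly layout (struct-of-arrays): a warp reading 32
// x-values touches one contiguous run of memory.
struct ParticlesSoA {
    float* x;
    float* y;
    float* z;
    float* mass;
};
```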
