Financial applications on GPGPU

Published 2024-09-01 01:12:49


I want to know what sort of financial applications can be implemented using a GPGPU. I'm aware of option pricing / stock price estimation using Monte Carlo simulation on a GPGPU with CUDA. Can someone enumerate the various possibilities of utilizing GPGPU for applications in the finance domain?



Comments (7)

久伴你 2024-09-08 01:12:49


There are many financial applications that can be run on the GPU in various fields, including pricing and risk. There are some links from NVIDIA's Computational Finance page.

It's true that Monte Carlo is the most obvious starting point for many people. Monte Carlo is a very broad class of applications, many of which are amenable to the GPU. Many lattice-based problems can also be run on the GPU. Explicit finite difference methods run well and are simple to implement; there are many examples on NVIDIA's site as well as in the SDK, and the technique is also used heavily in oil-and-gas codes, so there is plenty of material. Implicit finite difference methods can also work well depending on the exact nature of the problem; Mike Giles has a 3D ADI solver on his site, which also hosts other useful finance material.
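To make the explicit finite-difference idea concrete, here is a minimal CPU sketch in NumPy (all parameters are illustrative, not from the answer). Each time step is a uniform, independent update over the whole price grid, which is exactly the embarrassingly parallel pattern that maps well to a GPU kernel:

```python
import numpy as np

def explicit_fd_call(S_max=200.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     M=200, N=20000):
    """European call via an explicit finite-difference scheme for the
    Black-Scholes PDE. M price steps, N time steps; N must be large
    enough for the explicit scheme to be stable."""
    dS = S_max / M
    dt = T / N
    S = np.linspace(0.0, S_max, M + 1)
    V = np.maximum(S - K, 0.0)             # payoff at maturity
    j = np.arange(1, M)                    # interior grid indices
    a = 0.5 * dt * (sigma**2 * j**2 - r * j)
    b = 1.0 - dt * (sigma**2 * j**2 + r)
    c = 0.5 * dt * (sigma**2 * j**2 + r * j)
    for n in range(N):                     # march backwards from T to 0
        # This whole-grid update is the part a GPU kernel would do in parallel.
        V[1:M] = a * V[0:M-1] + b * V[1:M] + c * V[2:M+1]
        V[0] = 0.0                         # boundary at S = 0
        V[M] = S_max - K * np.exp(-r * dt * (n + 1))  # boundary at S = S_max
    return float(np.interp(100.0, S, V))   # value at spot S0 = 100
```

For these parameters the result should sit close to the Black-Scholes closed-form value (roughly 10.45).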

GPUs are also good for linear-algebra-type problems, especially when you can keep the data on the GPU long enough to do a reasonable amount of work. NVIDIA provides cuBLAS with the CUDA Toolkit, and cuLAPACK is available as well.
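As a rough illustration of that linear-algebra point, the core of a simple portfolio-risk calculation is built from BLAS-shaped operations. This is a CPU sketch with synthetic data; in a GPU setting these are the matrix kernels cuBLAS accelerates, and keeping the returns matrix resident on the device between calls is what makes it pay off:

```python
import numpy as np

# Daily returns for 4 hypothetical assets over 250 days (synthetic data).
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=(250, 4))

weights = np.array([0.4, 0.3, 0.2, 0.1])

# Covariance estimation and the quadratic form below are GEMM/GEMV-shaped
# operations -- the kind of work cuBLAS handles when data stays on-device.
cov = np.cov(returns, rowvar=False)           # 4x4 sample covariance
port_var = weights @ cov @ weights            # quadratic form w' C w
port_vol = float(np.sqrt(port_var) * np.sqrt(252))  # annualised volatility
```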

樱桃奶球 2024-09-08 01:12:49


Basically, anything that requires a lot of parallel mathematics to run. As you originally stated, Monte Carlo simulation of options that cannot be priced with closed-form solutions is an excellent candidate. Anything that involves large matrices and operations upon them is ideal; after all, 3D graphics uses a great deal of matrix mathematics.
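A minimal sketch of that Monte Carlo case, written as vectorised NumPy with illustrative parameters; on a GPU, each simulated path would typically become one thread, since the paths are fully independent:

```python
import numpy as np

def mc_european_call(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     n_paths=1_000_000, seed=42):
    """Price a European call by simulating terminal prices under
    geometric Brownian motion. Each path is independent, so on a GPU
    one thread (or one lane of a vectorised kernel) handles one path."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.maximum(ST - K, 0.0)
    return float(np.exp(-r * T) * payoff.mean())
```

For a plain European call this has a closed-form answer (around 10.45 here) to check against; the Monte Carlo approach earns its keep on the path-dependent and multi-asset payoffs that don't.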

Given that many trader desktops have 'workstation'-class GPUs in order to drive several monitors, possibly with video feeds and limited 3D graphics (volatility surfaces, etc.), it would make sense to run some of the pricing analytics on the GPU rather than pushing the responsibility onto a compute grid. In my experience, compute grids frequently struggle under the weight of everyone in the bank trying to use them, and some grid-computing products leave a lot to be desired.

Outside of this particular problem, there's not a great deal more that can be easily achieved with GPUs, because the instruction set and pipelines are more limited in their functional scope compared to a regular CISC CPU.

The problem with adoption has been one of standardisation; NVidia had CUDA, ATI had Stream. Most banks have enough vendor lock-in to deal with without hooking their derivative analytics (which many regard as extremely sensitive IP) into a gfx card vendor's acceleration technology. I suppose with the availability of OpenCL as an open standard this may change.

少女的英雄梦 2024-09-08 01:12:49


High-end GPUs are starting to offer ECC memory (a serious consideration for financial and, eh, military applications) and high-precision types.

But it really is all about Monte Carlo at the moment.

You can go to workshops on it, and from their descriptions see that it'll focus on Monte Carlo.

飞烟轻若梦 2024-09-08 01:12:49


A good start would probably be to check NVIDIA's website:

爱冒险 2024-09-08 01:12:49


Using a GPU introduces limitations on the architecture, deployment, and maintenance of your app.
Think twice before you invest effort in such a solution.
For example, if you're running in a virtualised environment, all physical machines would need GPU hardware installed, plus special vGPU hardware and software support and licenses.
What if you decide to host your service in the cloud (e.g. Azure, Amazon)?
In many cases it is worth building your architecture up front to support scale-out and to be flexible and scalable (with some overhead, of course), rather than scaling up and squeezing as much as you can from your hardware.

她如夕阳 2024-09-08 01:12:49


Answering the complement of your question: anything that involves accounting can't be done on a GPGPU (or in binary floating point, for that matter).
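The binary-floating-point half of that claim is easy to demonstrate: most decimal cash amounts have no exact binary representation, so repeated additions drift, which is why accounting code uses decimal arithmetic instead:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.10 exactly, so summing three
# dimes does not give exactly 0.30 -- unacceptable in a ledger.
total_float = sum([0.10] * 3)
print(total_float == 0.30)           # False (0.30000000000000004)

# Exact decimal arithmetic keeps every cent.
total_dec = sum([Decimal("0.10")] * 3, Decimal("0"))
print(total_dec == Decimal("0.30"))  # True
```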
