Problem with GLUT initialization
I have simplified my problem to this example:
#include <GL/glut.h>

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(600, 600);
    glutInitWindowPosition(0, 0);
    int win = glutCreateWindow("Recon");
    return 0;
}
When it executes the glutCreateWindow call, it takes about 1 minute and the screens flicker several times.
This is ridiculously long; it can't be normal.
Environment:
- Fedora 10
- Dual NVIDIA GTX280 cards driving 3 monitors.
- NVIDIA driver version 190.53, CUDA 2.3 installed
- gcc version 4.3.2 20081105 (Red Hat 4.3.2-7) (GCC)
Any ideas as to what could be wrong?
Edit: I have no display function because my ultimate goal is to create a rendering context so that I can create a Pixel Buffer Object from some CUDA code (which for the moment is not going to be displaying its output). I have also tried creating a context with a series of GLX calls; the same delay and flickering happen when glXMakeCurrent is called.
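For reference, the GLX route mentioned in the edit might look roughly like the sketch below. This is an illustration, not the asker's actual code: the FBConfig and pbuffer attributes are assumptions, and error handling is minimal. The point of interest is the glXMakeCurrent call, which is where the delay and flicker were reported.

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display* dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

    /* Assumed attributes: an offscreen-capable RGBA config. */
    static const int fbAttribs[] = {
        GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DOUBLEBUFFER,  False,
        None
    };
    int nConfigs = 0;
    GLXFBConfig* configs = glXChooseFBConfig(dpy, DefaultScreen(dpy),
                                             fbAttribs, &nConfigs);
    if (!configs || nConfigs == 0) { fprintf(stderr, "no FBConfig\n"); return 1; }

    /* Pbuffer size chosen to match the 600x600 window in the question. */
    static const int pbAttribs[] = {
        GLX_PBUFFER_WIDTH,  600,
        GLX_PBUFFER_HEIGHT, 600,
        None
    };
    GLXPbuffer pbuf = glXCreatePbuffer(dpy, configs[0], pbAttribs);
    GLXContext  ctx = glXCreateNewContext(dpy, configs[0], GLX_RGBA_TYPE,
                                          NULL, True /* direct rendering */);

    /* The reported delay and flicker occurred at this point. */
    glXMakeCurrent(dpy, pbuf, ctx);

    /* ... create the PBO and register it with CUDA here ... */

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    glXDestroyPbuffer(dpy, pbuf);
    XFree(configs);
    XCloseDisplay(dpy);
    return 0;
}
```

Compile with something like `gcc glx_pbuffer.c -lGL -lX11` (the filename is illustrative); it requires a running X server.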
Do you have a display function?
I'm not sure if this will fix it, but maybe putting in a display function in which you clear the buffers would help?
e.g.
glutDisplayFunc(myDisplay);
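A minimal sketch of that suggestion (the name myDisplay and its clear-and-swap body are illustrative, not from the question):

```c
#include <GL/glut.h>

/* Hypothetical display callback: clear the buffers and swap. */
static void myDisplay(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(600, 600);
    glutCreateWindow("Recon");

    /* Register the callback before entering the event loop. */
    glutDisplayFunc(myDisplay);
    glutMainLoop();  /* never returns */
    return 0;
}
```

Note that glutDisplayFunc must be called after glutCreateWindow, and the callback only runs once glutMainLoop starts.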
What compiler are you using? And have you looked into any possible performance issues associated with Fedora 10 and OpenGL? (I'm looking into the second bit right now.)
Edit: There are definitely some anecdotal reports of a performance hit in Fedora 10,
here and here. The second one seems to describe at least one of your symptoms. Are you able to try your code on another OS?