How to run Keras on GPU?
I am trying to run my ML task on a remote server with GPU. I typed
nvidia-smi
and I confirmed that the server has one GPU.
I am using Keras to write my ML task, and I intend to run the task on one GPU. But I just can't get the program to run on the GPU; I've checked the running processes and my task was not among them.
I've tried to print out available devices using multiple methods in my code, but they did not show that I have a GPU.
from tensorflow.python.client import device_lib
print(device_lib.list_local_devices())
This gave me:
[name: "/device:CPU:0" device_type: "CPU" memory_limit: 268435456
locality { }....
but no "/device:GPU:0" is shown
And then I tried:
print("Num GPUs Available: ", len(tf.config.experimental.list_physical_devices('GPU')))
This gave me:
Num GPUs Available: 0
Also,
print("GPUs: ", tf.config.experimental.list_physical_devices('GPU'))
This gave me:
GPUs: []
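An empty list here can mean either that the installed TensorFlow wheel has no GPU support at all, or that a GPU-enabled build cannot load the CUDA/cuDNN libraries. A quick way to tell the two cases apart (a minimal sketch, assuming TensorFlow 2.x; get_build_info only exists on newer 2.x releases):

import tensorflow as tf

# False means a CPU-only wheel is installed; no CUDA/path fix will help
# until a GPU-enabled TensorFlow build is installed instead.
print("Built with CUDA:", tf.test.is_built_with_cuda())

# On newer TF 2.x releases this reports the CUDA/cuDNN versions the wheel
# was compiled against, which must match the toolkit installed on the server.
try:
    info = tf.sysconfig.get_build_info()
    print("CUDA version expected by TF: ", info.get("cuda_version"))
    print("cuDNN version expected by TF:", info.get("cudnn_version"))
except AttributeError:
    print("tf.sysconfig.get_build_info() is not available in this TensorFlow version")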
And,
with tf.device("gpu:0"):
    print('-------------------------------------------')
    print("tf.keras code in this scope will run on GPU")
    print('-------------------------------------------')
This gave me:
-------------------------------------------
tf.keras code in this scope will run on GPU
-------------------------------------------
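For what it's worth, the three lines above print no matter what: print is an ordinary Python statement, not a TensorFlow op, so the tf.device scope has no effect on it and the banner says nothing about where TensorFlow actually places work. A way to see the real placement (a minimal sketch, assuming TensorFlow 2.x with eager execution) is device-placement logging:

import tensorflow as tf

# Log the device every TensorFlow op is actually assigned to; on a machine
# where no GPU is visible, the log will show ".../device:CPU:0".
tf.debugging.set_log_device_placement(True)

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.matmul(a, a)  # the placement of this op is written to the log
print(b)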
Apparently, my code was not running on the GPU, and I don't know why it printed that my Keras code would run on the GPU.
I've searched for a lot of information, but I still couldn't solve it.
I would like to know how to run my Keras code on the GPU, and also what is going on in the situations above. Thanks in advance!
Comments (1)
After installing CUDA

(The original answer's command listings and screenshot are not reproduced here.)

If you run the command above, it says to install cuda-toolkit. The problem is caused by the CUDA installation path not being found, because it was never specified. Add the top two lines to your shell configuration (they put the CUDA installation directory on the search path), adjusting them to match your CUDA version. Then enter the command below so that CUDA is recognized from its installation path and the current version is printed. If your output matches the screenshot, CUDA is set up and TensorFlow should be able to detect the GPU.
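Since those listings are missing, here is a rough Python-side check (an illustrative sketch, not from the original answer; the /usr/local/cuda location and the ":" separator assume a typical Linux setup) for whether the CUDA path is visible to the process and whether TensorFlow can now see the GPU:

import os
import tensorflow as tf

# If both of these are False, the export lines described above have not
# taken effect in the current shell/session.
print("PATH mentions cuda:           ",
      any("cuda" in p.lower() for p in os.environ.get("PATH", "").split(":")))
print("LD_LIBRARY_PATH mentions cuda:",
      any("cuda" in p.lower() for p in os.environ.get("LD_LIBRARY_PATH", "").split(":")))

# Typical (assumed) default install location on Linux.
print("/usr/local/cuda exists:       ", os.path.isdir("/usr/local/cuda"))

# After the paths are fixed and the session restarted, the GPU should appear here.
print("GPUs:", tf.config.experimental.list_physical_devices("GPU"))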