To check whether TensorFlow can use a GPU, you can call tf.config.list_physical_devices('GPU'). This method returns a list of the visible GPU devices; the list is empty when no GPU is available, so it can be used directly in a truth test. Here is an example of how to use it:
import tensorflow as tf
# Check if TensorFlow can use a GPU
if tf.config.list_physical_devices('GPU'):
    print("TensorFlow is using a GPU")
else:
    print("TensorFlow is not using a GPU")
If you have a GPU, the output will be:
TensorFlow is using a GPU
If you want to check which GPUs are available, you can call tf.config.list_physical_devices('GPU') (in older TensorFlow 2.x releases it is also available under tf.config.experimental). Here is an example:
import tensorflow as tf
# Get the list of available GPUs
gpus = tf.config.experimental.list_physical_devices('GPU')
# Print the list of available GPUs
print(gpus)
The output of the above program will be as follows:
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
This prints the list of available GPUs, if any. If no GPUs are available, it prints an empty list.
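If you also want details about each listed GPU, newer TensorFlow 2.x releases (2.4 and later) provide tf.config.experimental.get_device_details(). A minimal sketch, assuming such a TensorFlow version is installed:

```python
import tensorflow as tf

# List the visible GPUs; the list is empty when no GPU is present
gpus = tf.config.list_physical_devices('GPU')
for gpu in gpus:
    # get_device_details returns a dict, e.g. device name and compute capability
    details = tf.config.experimental.get_device_details(gpu)
    print(gpu.name, details.get('device_name'), details.get('compute_capability'))
```

On a CPU-only machine the loop body never runs, so the script prints nothing.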
You could also run the following in an interactive Python session:
import tensorflow as tf
tf.config.list_physical_devices('GPU')
Output:
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
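To confirm that operations actually run on the GPU, TensorFlow can log where each operation is placed. A short sketch, assuming TensorFlow 2.x; on a CPU-only machine the operations are simply placed on the CPU instead:

```python
import tensorflow as tf

# Log which device (CPU or GPU) each operation is placed on
tf.debugging.set_log_device_placement(True)

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 1.0], [0.0, 1.0]])
c = tf.matmul(a, b)  # the placement log names the device used, e.g. /device:GPU:0
print(c)
```

The placement messages go to the log, so this is mainly useful for a quick one-off check rather than production code.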
In a notebook, you could run the following command to get information about the GPU available to TensorFlow:
! nvidia-smi
You will get output similar to the following:
Sun Dec 11 10:56:52 2022
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 460.32.03    Driver Version: 460.32.03    CUDA Version: 11.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            Off  | 00000000:00:04.0 Off |                    0 |
| N/A   44C    P8    11W /  70W |      3MiB / 15109MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
The above table provides information about the GPU currently in use.
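The same information can be captured from a plain Python script (outside a notebook) with the standard subprocess module. A minimal sketch, which assumes nvidia-smi may or may not be on the PATH and checks before calling it:

```python
import shutil
import subprocess

# nvidia-smi ships with the NVIDIA driver; check it exists before calling it
if shutil.which("nvidia-smi"):
    result = subprocess.run(["nvidia-smi"], capture_output=True, text=True)
    print(result.stdout)
else:
    print("nvidia-smi not found; no NVIDIA driver on this machine")
```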
To list all devices available to TensorFlow, you could use the following:
from tensorflow.python.client import device_lib
print(device_lib.list_local_devices())
Output:
[name: "/device:CPU:0"
device_type: "CPU"
memory_limit: 268435456
locality {
}
incarnation: 6268366153253263738
xla_global_id: -1
, name: "/device:GPU:0"
device_type: "GPU"
memory_limit: 14415560704
locality {
  bus_id: 1
  links {
  }
}
incarnation: 17524716795020899435
physical_device_desc: "device: 0, name: Tesla T4, pci bus id: 0000:00:04.0, compute capability: 7.5"
xla_global_id: 416903419
]
The above output lists all devices. Your output will differ depending on the devices available on your machine.
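If you only need the GPU entries from that list, you can filter on the device_type field. A small sketch (the helper name get_gpu_names is my own, not a TensorFlow API):

```python
from tensorflow.python.client import device_lib

def get_gpu_names():
    # Keep only entries whose device_type is "GPU" and return their names
    return [d.name for d in device_lib.list_local_devices() if d.device_type == "GPU"]

# Empty list on a CPU-only machine, e.g. ['/device:GPU:0'] with one GPU
print(get_gpu_names())
```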