This is an original article by HinGwenWoong. If you find it helpful, you are welcome to reprint it; please read the [authorization instructions] at the end of the article first. Thank you for supporting HinGwenWoong's work!
When deploying a deep-learning service, a single process is often not allowed to occupy an entire GPU, and fatal OOM (Out of Memory) errors can occur. It is therefore necessary to limit GPU memory usage appropriately. This article shows how to do that in code.
Detailed instructions on using GPUs in TF 2.x can be found in the official documentation. In this article, I pull out the snippets from that documentation that you can use immediately.
I'm HinGwenWoong, a programmer with clear goals who keeps pushing forward. I love technology and enjoy sharing. Writing takes effort; if this article helps you, please give it a like 👍 at the bottom of the page so the content can reach more people. Thank you!
1. TensorFlow 2.x
Method 1: set_memory_growth
This method sets GPU memory to grow on demand, so the process does not grab all of the memory up front.
```python
import tensorflow as tf

gpu_list = tf.config.experimental.list_physical_devices('GPU')
if len(gpu_list) > 0:
    for gpu in gpu_list:
        try:
            # Configure every GPU; to configure a single GPU instead of
            # looping, index into gpu_list directly.
            tf.config.experimental.set_memory_growth(gpu, True)
        except RuntimeError as e:
            print(e)
else:
    print("Got no GPUs")
```
- The [memory growth on demand] setting must be the same for all GPUs.
- [Memory growth on demand] must be set before the GPUs are initialized.
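Because the setting only takes effect before GPU initialization, it can be worth checking what was actually applied. A minimal sketch, assuming TensorFlow is installed (on a machine with no GPUs the loop simply does nothing):

```python
import tensorflow as tf

# Read back the memory-growth flag for each physical GPU to confirm
# the configuration was applied before initialization.
for gpu in tf.config.experimental.list_physical_devices('GPU'):
    print(gpu.name, tf.config.experimental.get_memory_growth(gpu))
```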
Method 2: memory_limit
This snippet limits the memory usage of the first GPU to 2048 MB. Change the `gpu_list` index and the `memory_limit` value as needed.
```python
import tensorflow as tf

using_gpu_index = 0  # index of the GPU to configure
gpu_list = tf.config.experimental.list_physical_devices('GPU')
if len(gpu_list) > 0:
    try:
        tf.config.experimental.set_virtual_device_configuration(
            gpu_list[using_gpu_index],
            [tf.config.experimental.VirtualDeviceConfiguration(memory_limit=2048)]
        )
    except RuntimeError as e:
        print(e)
else:
    print("Got no GPUs")
```
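As an aside, from TF 2.4 onward the experimental names above have stable equivalents. A sketch of the same 2048 MB limit with the non-experimental API (same behavior, newer names):

```python
import tensorflow as tf

using_gpu_index = 0  # index of the GPU to configure
gpus = tf.config.list_physical_devices('GPU')
if gpus:
    try:
        # Stable replacement for set_virtual_device_configuration
        tf.config.set_logical_device_configuration(
            gpus[using_gpu_index],
            [tf.config.LogicalDeviceConfiguration(memory_limit=2048)]
        )
    except RuntimeError as e:
        print(e)
else:
    print("Got no GPUs")
```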
2. TensorFlow 1.x
Method 1: allow_growth
The following code corresponds to method 1 of TF2.x.
```python
import tensorflow as tf

config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
```
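If you are training with Keras on top of TF 1.x, the configured session also needs to be handed to Keras, otherwise Keras creates its own unrestricted session. A sketch, assuming a TF 1.x installation:

```python
import tensorflow as tf

config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
# Register the configured session with Keras so its models inherit the limit
tf.keras.backend.set_session(sess)
```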
Method 2: per_process_gpu_memory_fraction
The following code corresponds to Method 2 of TF 2.x, except that it limits GPU memory by percentage rather than by an absolute value. The example below limits usage to 60% of GPU memory.
```python
import tensorflow as tf

gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.6)  # change this fraction as needed
sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
```
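If you are migrating such a fraction to the TF 2.x `memory_limit` setting, you need an absolute MB value instead. A small helper for the conversion (the 8192 MB card size below is only an illustrative assumption; substitute your GPU's actual capacity):

```python
def fraction_to_memory_limit(total_mb, fraction):
    """Convert a TF1-style per-process memory fraction to an absolute MB
    limit suitable for the TF 2.x memory_limit setting."""
    return int(total_mb * fraction)

# Hypothetical 8192 MB card at 60%:
print(fraction_to_memory_limit(8192, 0.6))  # -> 4915
```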
These are the GPU memory limiting techniques for TF 2.x and 1.x. I hope they help you deploy more smoothly.
More reading recommendations
- Deep learning in practice | Smart construction-site helmet and danger-zone detection system (the code is open source!)
- Deep learning OCR | CRNN text recognition network, paper translation
- Original articles may only be reprinted one day after publication.
- Reprinted articles must not carry an "original" declaration.
- Second-hand reprints are not allowed; please contact the author via the original link.
- If no edits are needed, you may reprint directly, provided you clearly credit the author and the source/original link at the top of the text and remove the [original statement].
However, I reserve the right to take down reprints that do not credit the source/original link.
Author: HinGwenWoong
A programmer with clear goals who keeps pushing forward; loves technology, enjoys sharing and growing together!
Original link: Deep learning | Limiting GPU memory in TensorFlow 2.x and 1.x (super detailed)
- If you need to modify the article's layout, please contact the author via the original link.
- Thanks again for your support; please follow the reprint instructions above!