Google Colab CUDA out of memory


I am running the Colab notebook "Disco Diffusion", a text-to-image ML algorithm. When I try to render, I get a runtime error:

CUDA out of memory. Tried to allocate 960.00 MiB (GPU 0; 15.78 GiB total capacity; 14.11 GiB already allocated; 158.75 MiB free; 14.14 GiB reserved in total by PyTorch) If reserved memory is allocated memory, try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Now, if I render at small dimensions, it works perfectly fine. I tried upgrading my GCE VM, but I am still getting the same error.

Here is the code:

https://colab.research.google.com/github/alembics/disco-diffusion/blob/main/Disco_Diffusion.ipynb

CodePudding user response:

What GPU are you using to run the notebook? If it is a K80, that might be why, as the K80 has less memory. When you try to run the notebook, it also shows the following warning:

Notebook requires high RAM: This notebook was authored in a high RAM runtime environment and may require more resources than are allocated in the version of Colab that is free of charge. If you encounter errors, you may want to check out Colab Pro.

I would suggest using a GPU with more RAM if one is available to you, or reducing the image resolution. You can check which GPU you were assigned with a quick snippet like the one below.
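
For reference, here is a minimal check (assuming PyTorch, which Disco Diffusion already uses) that prints the device name and total memory:

```python
# Run this in a Colab cell to see which GPU was assigned and how much memory it has.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, total memory: {props.total_memory / 1024**3:.1f} GiB")
else:
    print("No CUDA device available")
```

You can also simply run `!nvidia-smi` in a cell to see the GPU model and its current memory usage.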

CodePudding user response:

You are getting this error because of a combination of:

  • The batch size is too big
  • The image resolution is too high, which makes each sample (and its intermediate activations) too large
  • The model you're using is too heavy for the given input shape

The only solutions are to get Colab Pro (for a GPU with more memory) or to address the three problems mentioned above: lower the batch size, reduce the image resolution, or use a lighter model. You could also try the max_split_size_mb setting that the error message itself suggests; see the sketch below.
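
A minimal sketch of how that setting could be applied in a Colab cell, before any CUDA memory is allocated (the value 128 is an assumption, not a recommendation from the notebook authors; you may need to tune it):

```python
import os

# Tell the PyTorch caching allocator to cap the split block size,
# which can reduce fragmentation when large allocations fail.
# 128 MiB is only a starting point to experiment with.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch

torch.cuda.empty_cache()  # release any cached, unused blocks back to the driver
```

Note that this only helps with fragmentation; if the model and activations genuinely need more than the ~15 GiB available, reducing the resolution or batch size is still required.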
