Hi all,
I hate to ask such a dumb question, but I've been searching for a concise answer for a good hour now. After running some code, my GPU's memory is full.
…
On this PyTorch thread, they discuss how to free up memory. The steps boil down to:

1. Check GPU memory using `nvidia-smi`
2. `del a`
3. `torch.cuda.empty_cache()`
4. Check GPU memory again using `nvidia-smi`

Works.
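For the record, the whole thing looks something like this (a minimal sketch; `a` just stands in for whatever large tensor my code had allocated, and the shape is made up):

```python
import torch

# Stand-in for whatever big allocation my code actually made
a = torch.randn(10000, 10000, device="cuda")
print(torch.cuda.memory_allocated())  # a large number of bytes

# Drop the Python reference, then release PyTorch's cached blocks back to the driver
del a
torch.cuda.empty_cache()
print(torch.cuda.memory_allocated())  # back down to (nearly) zero
```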
…
However, in Python, how can I list all the objects I've created that are currently in memory, ideally with their sizes, without poking through my code? Something similar to `ls` for files on Linux, or `ls()` in R.
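To make it concrete, this is roughly the kind of thing I'm imagining (a rough sketch using `globals()` and `sys.getsizeof()`; `list_objects` is just a name I made up, and I know `getsizeof()` only reports the shallow Python object, not e.g. the GPU memory behind a CUDA tensor):

```python
import sys

def list_objects(namespace):
    """Print user-created names in a namespace with a rough size estimate."""
    for name, obj in namespace.items():
        if name.startswith("_"):
            continue  # skip dunders and other internals
        size = sys.getsizeof(obj)  # shallow size only, in bytes
        print(f"{name:20s} {type(obj).__name__:15s} {size:>12d} bytes")

# e.g. at the top level of a script or notebook:
list_objects(globals())
```

But I'm not sure this is the right approach, hence the question.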
Thanks a lot