After deleting some variables and calling torch.cuda.empty_cache(), I was able to free some of the GPU memory, but not all of it.
Here is a code example:
from fastai.imports import *
from fastai.transforms import *
from fastai.conv_learner import *
from fastai.model import *
from fastai.dataset import *
from fastai.sgdr import *
from fastai.plots import *
PATH = "data/dogscats/"
sz = 224
arch = resnet34
data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))
learn = ConvLearner.pretrained(arch, data, precompute=True)
learn.fit(0.01, 3)
# running nvidia-smi → 689MB used
torch.cuda.empty_cache()
# running nvidia-smi → 687MB used
del data, learn
torch.cuda.empty_cache()
# running nvidia-smi → 571MB used
Any ideas what could be using the rest of the memory?
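For what it's worth, here is a minimal sketch of how one might check how much of the remaining usage is tensors PyTorch still holds versus its caching allocator and the CUDA context. report_gpu_memory is just a hypothetical helper, and I'm assuming a PyTorch version that exposes torch.cuda.memory_allocated() and torch.cuda.memory_cached():

import gc
import torch

def report_gpu_memory(tag=""):
    # Hypothetical helper: bytes held by live tensors vs. bytes kept by
    # PyTorch's caching allocator. Whatever nvidia-smi reports beyond the
    # cached figure is mostly the CUDA context, which Python cannot release.
    allocated = torch.cuda.memory_allocated() / 1024 ** 2
    cached = torch.cuda.memory_cached() / 1024 ** 2  # memory_reserved() on newer versions
    print(f"{tag}: allocated {allocated:.1f} MB, cached {cached:.1f} MB")

# Cleanup order, using the same objects as in the example above:
report_gpu_memory("before cleanup")
del data, learn           # drop the Python references to the learner and data
gc.collect()              # collect anything else still pointing at those tensors
torch.cuda.empty_cache()  # hand cached blocks back to the CUDA driver
report_gpu_memory("after cleanup")

If allocated and cached both drop to (near) zero after the cleanup, then whatever nvidia-smi still shows is presumably just the CUDA context overhead rather than memory PyTorch is holding on to.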