hook WITHOUT removing/deleting it after training. Now, when I try to re-run the notebook, I get an OOM error as soon as training of the first epoch starts.
I have now added .remove() to the hooks, but I still cannot get past training. How should I free the CUDA memory? I tried resetting the runtime, but that did not work either!
from functools import partial

layers_ = flatten_model(learn.model)

class Hook():
    def __init__(self, m, f): self.hook = m.register_forward_hook(partial(f, self))
    def remove(self): self.hook.remove()
    def __del__(self): self.remove()

def append_stats(hook, mod, inp, outp):
    if not hasattr(hook, 'stats'): hook.stats = ([], [], [])
    means, stds, outs = hook.stats
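For reference, the cleanup I am attempting looks roughly like the sketch below. The tiny `nn.Sequential` model stands in for `learn.model` (it is hypothetical, just to make the snippet self-contained), but the Hook class and the remove/gc/empty_cache sequence are what I am running:

```python
import gc
from functools import partial

import torch
import torch.nn as nn

class Hook():
    def __init__(self, m, f): self.hook = m.register_forward_hook(partial(f, self))
    def remove(self): self.hook.remove()
    def __del__(self): self.remove()

def append_stats(hook, mod, inp, outp):
    if not hasattr(hook, 'stats'): hook.stats = ([], [], [])
    means, stds, outs = hook.stats
    means.append(outp.data.mean().item())
    stds.append(outp.data.std().item())
    outs.append(outp.data)   # keeps a reference to the activation tensor!

# Stand-in model (hypothetical; in the notebook the hooks go on learn.model's layers).
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 2))
hooks = [Hook(m, append_stats) for m in model]

model(torch.randn(8, 4))     # a forward pass populates each hook's stats

# Cleanup: detach the hooks, drop the Python references that keep the
# stored activations alive, then release PyTorch's cached CUDA blocks.
for h in hooks: h.remove()
del hooks
gc.collect()
if torch.cuda.is_available():
    torch.cuda.empty_cache()
```

Even with this, the memory does not seem to come back, which makes me think something else (the stored `outs` tensors, maybe, or a reference held by the notebook itself) is still pinning the activations.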