Accumulating Gradients

Please note that v1.0.47 will include a breaking change that affects this callback (see the announcement in the developer chat). To skip the optimizer step and the gradient zeroing, just return:

```python
return {'skip_step': True, 'skip_zero': True}
```

wherever it is most convenient (probably in on_backward_end).
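
For context, here is a minimal sketch of what a gradient-accumulation callback using that return value might look like. It assumes fastai v1's LearnerCallback API and that on_backward_end receives the current iteration in its kwargs; the class name AccumulateGradients and the n_step parameter are purely illustrative, not the library's own implementation:

```python
from fastai.basic_train import LearnerCallback

class AccumulateGradients(LearnerCallback):
    "Sketch: only step the optimizer and zero grads every `n_step` batches."
    def __init__(self, learn, n_step: int = 4):
        super().__init__(learn)
        self.n_step = n_step

    def on_backward_end(self, iteration, **kwargs):
        # On all but every n_step-th batch, skip both the optimizer step and
        # the gradient zeroing, so gradients keep accumulating across batches.
        if (iteration + 1) % self.n_step != 0:
            return {'skip_step': True, 'skip_zero': True}
```

One caveat: accumulated gradients are summed rather than averaged, so depending on your loss reduction you may also want to scale the loss (or the gradients) by 1/n_step before the step that is actually taken.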
