I was wondering if the learning rate finder method presented in class can be applied to RNNs as well? Thank you so much.
As an additional follow-up question: should we try to tweak the learning rate of the Adam algorithm? I believe most implementations ship with recommended defaults.
We covered that in detail in part 1 and in quite a few previous threads in #part1-v2, FYI. tl;dr: yes, you should.
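In case a concrete picture helps: the core idea of the LR range test is just to try increasing learning rates and watch which one drops the loss fastest, and that works regardless of the optimizer or architecture (RNNs included). Here's a toy sketch in plain Python on a made-up 1-D regression problem (all data and constants are invented for illustration); it is not the fastai implementation, just the principle:

```python
def lr_range_test(lrs=None, steps_per_lr=5):
    """Toy LR range test: fit w in y = w*x by gradient descent,
    restarting from the same init for each candidate learning rate,
    and record the loss each rate reaches after a few steps."""
    # Hypothetical synthetic data: y = 3*x, so the true weight is 3.
    xs = [0.5, 1.0, 1.5, 2.0]
    ys = [3 * x for x in xs]
    if lrs is None:
        lrs = [10 ** e for e in range(-4, 1)]  # 1e-4 .. 1e0
    losses = []
    for lr in lrs:
        w = 0.0  # same starting point for every candidate rate
        for _ in range(steps_per_lr):
            # gradient of mean squared error wrt w
            grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
            w -= lr * grad
        loss = sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
        losses.append(loss)
    return lrs, losses

lrs, losses = lr_range_test()
best_lr = lrs[losses.index(min(losses))]
```

Too small a rate barely moves the loss, too large a rate makes it blow up, and the sweet spot sits in between; that is exactly the shape the finder's loss-vs-lr plot shows. The same sweep applies whether the model is a CNN or an RNN, and with Adam you would sweep its `lr` argument rather than accepting the default.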