Hey,
Just a small thing I've found interesting and useful while tinkering with training parameters:
It's a Keras callback that records the loss for each batch/epoch, plus a function that uses seaborn to plot it nicely per epoch and fits a trend line, so you have to squint less to see whether the loss is still going down.
Example notebook here: https://github.com/ah-/courses/blob/master/deeplearning1/nbs/plot_loss.ipynb
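For anyone who just wants the idea without opening the notebook, here is a minimal sketch of how such a callback and plot can be put together. The names (`LossHistory`, `plot_loss`) and the exact plotting call are my own choices for illustration, not necessarily what the notebook uses; the `Callback` fallback stub just lets the sketch run even without Keras installed.

```python
try:
    from keras.callbacks import Callback
except ImportError:
    # Fallback stub so the sketch runs without Keras installed
    class Callback(object):
        pass


class LossHistory(Callback):
    """Records the training loss after every batch and every epoch."""

    def on_train_begin(self, logs=None):
        self.batch_losses = []
        self.epoch_losses = []

    def on_batch_end(self, batch, logs=None):
        self.batch_losses.append(logs.get('loss'))

    def on_epoch_end(self, epoch, logs=None):
        self.epoch_losses.append(logs.get('loss'))


def plot_loss(history, batches_per_epoch):
    """Plot per-batch loss with seaborn, fitting one line per epoch."""
    # Imported lazily so loss recording works without the plotting libs
    import numpy as np
    import pandas as pd
    import seaborn as sns

    losses = history.batch_losses
    df = pd.DataFrame({
        'batch': np.arange(len(losses)),
        'loss': losses,
        'epoch': np.arange(len(losses)) // batches_per_epoch,
    })
    # lmplot fits a separate regression line per hue group, so the
    # trend within each epoch is easy to eyeball
    sns.lmplot(data=df, x='batch', y='loss', hue='epoch')
```

Usage would be something like `history = LossHistory()`, pass `callbacks=[history]` to `model.fit(...)`, then call `plot_loss(history, batches_per_epoch)` afterwards.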