Plotting metrics after learning

I’m trying to do the equivalent of learn.recorder.plot_metrics() with fastai2 (master). I was able to find plot_loss, and I can look up my metric after fitting via learn.metrics and get its latest value, but I wasn’t able to find the per-epoch series data so I could plot it. What am I missing?
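Roughly what I’ve tried so far (just a sketch of my attempts, learn.metrics may not be the intended way to get at the history):

learn.recorder.plot_loss()  # works, but only shows train/valid loss
learn.metrics               # the metric objects, but I can only see the latest value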

Thanks in advance for your help.

@indigoviolet You should use a Callback with the fit method:

learn.fit_one_cycle(10, slice(5e-3, 5e-2), cbs=[ShowGraphCallback()])

http://dev.fast.ai/callback.progress#ShowGraphCallback
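You can also attach it when creating the Learner so every fit call shows the live graph. A minimal sketch, assuming a vision setup (cnn_learner, dls and accuracy here are placeholders for your own data and learner):

# cbs passed at construction time applies to every subsequent fit/fine_tune call
learn = cnn_learner(dls, resnet34, metrics=accuracy, cbs=ShowGraphCallback())
learn.fine_tune(5)  # the loss graph updates as training progresses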


You may want to take a look at a function I’ve built that lets you see the loss alongside all the metrics you have selected. It’s part of the timeseriesAI repo, but it can be used independently of it.

from fastai2.imports import *
from fastai2.torch_core import *
from fastai2.learner import *

@patch
@delegates(subplots)
def plot_metrics(self: Recorder, nrows=None, ncols=None, figsize=None, **kwargs):
    # One row per epoch: [train_loss, valid_loss, metric_1, metric_2, ...]
    metrics = np.stack(self.values)
    # metric_names also contains 'epoch' and 'time'; drop them to match the columns above
    names = self.metric_names[1:-1]
    n = len(names) - 1  # train and valid loss share a single plot
    if nrows is None and ncols is None:
        nrows = int(math.sqrt(n))
        ncols = int(np.ceil(n / nrows))
    elif nrows is None: nrows = int(np.ceil(n / ncols))
    elif ncols is None: ncols = int(np.ceil(n / nrows))
    figsize = figsize or (ncols * 6, nrows * 4)
    fig, axs = subplots(nrows, ncols, figsize=figsize, **kwargs)
    # Hide any unused axes in the grid
    axs = [ax if i < n else ax.set_axis_off() for i, ax in enumerate(axs.flatten())][:n]
    # The first two series (train/valid loss) are drawn on the same axis; metrics get their own
    for i, (name, ax) in enumerate(zip(names, [axs[0]] + axs)):
        ax.plot(metrics[:, i], color='#1f77b4' if i == 0 else '#ff7f0e', label='valid' if i > 0 else 'train')
        ax.set_title(name if i > 1 else 'losses')
        ax.legend(loc='best')
    plt.show()

To use it you only need to run:

learn.recorder.plot_metrics()

once training has finished, and you may get something like this:
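Since plot_metrics delegates its extra keyword arguments to subplots, you can also tweak the layout, for example:

learn.recorder.plot_metrics(nrows=1, figsize=(18, 4))  # all plots in a single row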


Thanks for the responses, @WaterKnight and @oguiza!

@oguiza’s function does what I want, and I’m glad to learn about the ShowGraphCallback that @WaterKnight suggested; it’s not exactly what I wanted here, but I think I’ll use it regularly!

Thanks @WaterKnight, this is exactly what I was looking for, and it’s in the latest version.

Hi, I was just looking into a way to plot the results from learn.fine_tune(). Once training is done, is there a way to access the table that’s printed during training? That way I could easily plot the train/valid loss.
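From the plot_metrics code above it looks like the per-epoch numbers behind that table live on the recorder, so something like this might work (just a guess based on that code, using pandas for convenience):

import pandas as pd

# learn.recorder.values holds one row per epoch; metric_names gives the column
# labels, with 'epoch' and 'time' at either end
df = pd.DataFrame(learn.recorder.values, columns=learn.recorder.metric_names[1:-1])
df[['train_loss', 'valid_loss']].plot()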