How to save the best model during training in fastai v1

In fastai 0.7 we could use learn.fit(…, best_save_name='my_best_model'),
but how do we save the best model during training in fastai v1?
Thanks

7 Likes

Jeremy will probably introduce this in later lectures, but if you're up for reading some docs, you can pass a custom callback to the learner, where you can save the model weights and run any Python code.

callbacks param for fit_one_cycle and fit

Click on Callback in the docs and you'll arrive at pretty solid documentation on using it.

You probably want to use on_epoch_end.
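
For example, a minimal sketch of a custom callback (the class name and print are just illustrative, assuming fastai v1's Callback API):

from fastai.callback import Callback

class PrintMetrics(Callback):
    # illustrative only: run any Python code at the end of each epoch
    def on_epoch_end(self, epoch, last_metrics, **kwargs):
        # last_metrics is [valid_loss, metric_1, ...]
        print(f'epoch {epoch}: {last_metrics}')

learn.fit_one_cycle(4, callbacks=[PrintMetrics()])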

9 Likes

Thanks!
Following the previous version, I wrote a simple callback to save the best weights during training.

# assumes fastai v1 is imported, e.g. `from fastai.vision import *`, which provides Recorder
class SaveBestModel(Recorder):
    def __init__(self, learn, name='best_model'):
        super().__init__(learn)
        self.name = name
        self.best_loss = None
        self.best_acc = None
        self.save_method = self.save_when_acc

    def save_when_acc(self, metrics):
        # metrics is last_metrics: [valid_loss, accuracy, ...]
        loss, acc = metrics[0], metrics[1]
        if self.best_acc is None or acc > self.best_acc:
            self.best_acc = acc
            self.best_loss = loss
            self.learn.save(f'{self.name}')
            print("Save the best accuracy {:.5f}".format(self.best_acc))
        elif acc == self.best_acc and loss < self.best_loss:
            # same accuracy: prefer the checkpoint with the lower validation loss
            self.best_loss = loss
            self.learn.save(f'{self.name}')
            print("Accuracy is equal, save the lower loss {:.5f}".format(self.best_loss))

    def on_epoch_end(self, last_metrics=None, **kwargs):
        self.save_method(last_metrics)
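
Usage would be along these lines (a sketch; data and the learner setup are assumed from your own notebook, and accuracy must be the first metric so that last_metrics[1] is the accuracy):

learn = create_cnn(data, models.resnet34, metrics=[accuracy])
learn.fit_one_cycle(5, callbacks=[SaveBestModel(learn, name='best_model')])
learn.load('best_model')  # reload the best checkpoint afterwards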

The result:

6 Likes

There is also a built-in SaveModelCallback that you can find in fastai/callbacks/tracker.py. You will find a couple of other useful callbacks there as well (ReduceLROnPlateauCallback, EarlyStoppingCallback).

I also rolled my own initially, but quite appreciate the behavior of the built-in one 🙂
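
For reference, a quick sketch of using it (the monitor and name are example choices; check your version's defaults):

from fastai.callbacks.tracker import SaveModelCallback

# saves a checkpoint each time the monitored metric improves
learn.fit_one_cycle(4, callbacks=[SaveModelCallback(learn, every='improvement',
                                                    monitor='accuracy', name='best')])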

16 Likes

Thanks for your advice, I didn't find it before.
The behavior of the built-in one is very useful 😄 I will learn from it.

2 Likes

How do you use SaveModelCallback? I have tried

learn.fit_one_cycle(4,callbacks=[SaveModelCallback])

but I get the following error


TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 learn.fit_one_cycle(4,callbacks=[SaveModelCallback])

/usr/local/lib/python3.6/dist-packages/fastai/train.py in fit_one_cycle(learn, cyc_len, max_lr, moms, div_factor, pct_start, wd, callbacks, **kwargs)
     20     callbacks.append(OneCycleScheduler(learn, max_lr, moms=moms, div_factor=div_factor,
     21                                        pct_start=pct_start, **kwargs))
---> 22     learn.fit(cyc_len, max_lr, wd=wd, callbacks=callbacks)
     23
     24 def lr_find(learn:Learner, start_lr:Floats=1e-7, end_lr:Floats=10, num_it:int=100, stop_div:bool=True, **kwargs:Any):

/usr/local/lib/python3.6/dist-packages/fastai/basic_train.py in fit(self, epochs, lr, wd, callbacks)
    160         callbacks = [cb(self) for cb in self.callback_fns] + listify(callbacks)
    161         fit(epochs, self.model, self.loss_func, opt=self.opt, data=self.data, metrics=self.metrics,
--> 162             callbacks=self.callbacks+callbacks)
    163
    164     def create_opt(self, lr:Floats, wd:Floats=0.)->None:

/usr/local/lib/python3.6/dist-packages/fastai/basic_train.py in fit(epochs, model, loss_func, opt, data, callbacks, metrics)
     72     cb_handler = CallbackHandler(callbacks, metrics)
     73     pbar = master_bar(range(epochs))
---> 74     cb_handler.on_train_begin(epochs, pbar=pbar, metrics=metrics)
     75
     76     exception=False

/usr/local/lib/python3.6/dist-packages/fastai/callback.py in on_train_begin(self, epochs, pbar, metrics)
    192         self.state_dict['n_epochs'],self.state_dict['pbar'],self.state_dict['metrics'] = epochs,pbar,metrics
    193         names = [(met.name if hasattr(met, 'name') else camel2snake(met.__class__.__name__)) for met in self.metrics]
--> 194         self('train_begin', metrics_names=names)
    195
    196     def on_epoch_begin(self)->None:

/usr/local/lib/python3.6/dist-packages/fastai/callback.py in __call__(self, cb_name, call_mets, **kwargs)
    185         "Call through to all of the CallbakHandler functions."
    186         if call_mets: [getattr(met, f'on_{cb_name}')(**self.state_dict, **kwargs) for met in self.metrics]
--> 187         return [getattr(cb, f'on_{cb_name}')(**self.state_dict, **kwargs) for cb in self.callbacks]
    188
    189     def on_train_begin(self, epochs:int, pbar:PBar, metrics:MetricFuncList)->None:

/usr/local/lib/python3.6/dist-packages/fastai/callback.py in <listcomp>(.0)
    185         "Call through to all of the CallbakHandler functions."
    186         if call_mets: [getattr(met, f'on_{cb_name}')(**self.state_dict, **kwargs) for met in self.metrics]
--> 187         return [getattr(cb, f'on_{cb_name}')(**self.state_dict, **kwargs) for cb in self.callbacks]
    188
    189     def on_train_begin(self, epochs:int, pbar:PBar, metrics:MetricFuncList)->None:

TypeError: on_train_begin() missing 1 required positional argument: 'self'

This works for me. The error above comes from passing the SaveModelCallback class itself; the callback needs to be instantiated with the learner first:

    callbacks = [
        EarlyStoppingCallback(learn, min_delta=1e-5, patience=3),
        SaveModelCallback(learn)
    ]

    learn.callbacks = callbacks
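
One difference worth noting (my understanding of fastai v1): assigning to learn.callbacks attaches them to every subsequent fit, while passing callbacks to a single fit call keeps them scoped to that run:

    # scoped to this run only:
    learn.fit_one_cycle(4, callbacks=[
        EarlyStoppingCallback(learn, min_delta=1e-5, patience=3),
        SaveModelCallback(learn)
    ])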
11 Likes

In lecture 2 Jeremy describes overfitting, saying that "the sign that you are overfitting is that your error starts getting worse". I understood this to be the error_rate, i.e. 1 - accuracy.
If this is correct, would it be fair to say the best model is also the one with the lowest error_rate (or highest accuracy)?
If that is also correct, shouldn't the default value for monitor below (taken from tracker.py) be 'error_rate'?

class TrackerCallback(LearnerCallback):
    "A `LearnerCallback` that keeps track of the best value in `monitor`."
    monitor:str='val_loss'
1 Like

Here is a dev notebook which shows another way to do it:

learn.callback_fns.append(partial(SaveModel, monitor='accuracy'))
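
The same pattern works with the released SaveModelCallback (a sketch; SaveModel was the dev-notebook name, and 'best_acc' is just an example file name):

from functools import partial
from fastai.callbacks.tracker import SaveModelCallback

# callback_fns entries are constructed with the learner on every fit,
# so partial pins the remaining arguments
learn.callback_fns.append(partial(SaveModelCallback, monitor='accuracy', name='best_acc'))
learn.fit_one_cycle(4)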

1 Like

Thank you, so I can pass error_rate to the callback in the following way:

learn.fit_one_cycle(30, max_lr=slice(3e-5,3e-4),callbacks=[SaveModelCallback(learn,monitor='error_rate',mode='min')])

However, I am still unsure whether this is a good idea. As a rule of thumb, is the best model the one with the lowest error_rate or the lowest valid_loss?

2 Likes

Lowest error rate.

6 Likes

I wanted both to display the losses during training and to save the best model, but the two seem incompatible. How can I solve that?

learn = create_cnn(data, arch, metrics=[error_rate], callback_fns = [ShowGraph])
learn.fit_one_cycle(5, max_lr=lr, callbacks=[SaveModelCallback(learn)])

Error message:

Try the following:

learn = create_cnn(data, arch, metrics=[error_rate])
learn.fit_one_cycle(5, max_lr=lr, callbacks=[ShowGraph(learn), SaveModelCallback(learn, monitor='error_rate', mode='min')])
6 Likes

Hi @cudawarped,

Thanks for your code. I'm using fastai 1.0.43 and the following two snippets both work:

Code 1

from fastai.callbacks import *

learn = create_cnn(data, arch, metrics=[error_rate], callback_fns=ShowGraph)
learn.fit_one_cycle(5, callbacks=[SaveModelCallback(learn)])

Code 2 (my default option)

from fastai.callbacks import *

learn = create_cnn(data, arch, metrics=[error_rate])
learn.fit_one_cycle(5, callbacks=[ShowGraph(learn), SaveModelCallback(learn)])
7 Likes

You'll have to pass:

callbacks = [SaveModelCallback(learn=learn, {otherparameters})]
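
For instance (hypothetical parameter choices):

callbacks = [SaveModelCallback(learn=learn, monitor='error_rate', mode='min', name='best_err')]
learn.fit_one_cycle(4, callbacks=callbacks)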

1 Like

Is this still the case? I am asking because I have noticed that in fastai v2 all the callbacks inside 17_callback_tracker.ipynb default to monitor='valid_loss', not error_rate as I naively would have expected.
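
For what it's worth, a v2-style sketch of overriding that default (assuming the fastai v2 API, where comp is inferred as "lower is better" for monitor names containing 'loss' or 'error'):

from fastai.callback.tracker import SaveModelCallback

# fname='best_err' is an example; checkpoints go under learn.path/learn.model_dir
learn.fit_one_cycle(4, cbs=[SaveModelCallback(monitor='error_rate', fname='best_err')])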