When we call learn.fit in two consecutive blocks, does that mean incremental learning?
The second block in the notebook has:
Say the third block has:
Does the training in block 3 start from the weights of the final layers as block 2 left them? Or does it start learning again from the raw weights of the imported model?
It is an interesting question. Let's wait and see if someone can answer. If not, you'll have to dig into the code.
As per my observation, it will start from where it left off.
I am not sure whether we would call that incremental learning. From my understanding:
The aim of incremental learning is for the model to adapt to new data without forgetting its existing knowledge; it does not retrain the model from scratch. ~ Source: Wikipedia Link
And to answer your other question:
We are simply breaking the total number of epochs into smaller chunks so that you can see intermediate results and stop when required. Say you are in a situation where the model starts to overfit after 4 epochs: if you can spot it early, you can save time and stop further training. This is only possible when you split the epochs into chunks, because after each chunk you have the option to proceed (execute a new block with the same code) or stop. When you run a large number of epochs in a single fit call (a single block), you won't be able to stop in the middle (say at epoch 10 in the previous example); you might have to rerun the whole thing again up to the point just before overfitting.
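To make the chunking idea concrete, here is a minimal sketch. `ToyLearner` is a hypothetical stand-in for a fastai `Learner` (not its real API): its weight persists between `fit` calls, so running epochs in chunks lets you check the loss after each chunk and stop early.

```python
import numpy as np

# Hypothetical toy learner fitting y = 2x by gradient descent.
# It stands in for learn.fit: self.w persists across fit calls.
class ToyLearner:
    def __init__(self):
        self.w = 0.0  # weight carried over between fit calls

    def fit(self, epochs, lr=0.1):
        x = np.array([1.0, 2.0, 3.0])
        y = 2.0 * x
        losses = []
        for _ in range(epochs):
            grad = np.mean(2 * (self.w * x - y) * x)
            self.w -= lr * grad
            losses.append(np.mean((self.w * x - y) ** 2))
        return losses

learn = ToyLearner()
prev = float("inf")
# Run the epochs in chunks; after each chunk, decide to continue or stop.
for chunk in range(5):
    losses = learn.fit(2)       # same call, repeated as in separate blocks
    if losses[-1] >= prev:      # crude "stop if no longer improving" check
        break
    prev = losses[-1]
```

With a single `fit(10)` call you would get the same final weights here, but no chance to inspect the loss and bail out between chunks.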
One advantage of splitting the training into chunks is that you can save intermediate models and ensemble them to get better performance. (Snapshot Ensembles: Train 1, Get M for Free: Link)
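A rough sketch of that ensembling idea, using a toy linear model (the actual paper also uses a cyclic learning rate schedule, which is omitted here): save a copy of the weights after each chunk, then average the snapshots' predictions at inference time.

```python
import numpy as np

# Toy data: y = x @ true_w, fitted by plain gradient descent.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = x @ true_w

w = np.zeros(3)
snapshots = []
for chunk in range(3):            # three chunks of training
    for _ in range(20):           # epochs within a chunk
        grad = 2 * x.T @ (x @ w - y) / len(x)
        w -= 0.05 * grad
    snapshots.append(w.copy())    # save a model snapshot after each chunk

# Ensemble prediction: average the predictions of all snapshots.
ensemble_pred = np.mean([x @ s for s in snapshots], axis=0)
```

Each snapshot is a usable model on its own; the ensemble simply averages their outputs.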
As @lokeshdangi pointed out, it continues from the weights where it left off.
Hope it helps!
@ArchieIndian Well, according to François Chollet, the creator of Keras, successive calls to fit will incrementally train the model.
Please see the closed thread below for reference.
@ArchieIndian, every time you call learn.fit(…), it will pick up from where the model previously left off. At least, that's my understanding.
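You can convince yourself of this with a toy check. `ToyLearner` below is a hypothetical stand-in for a fastai `Learner`, not its real API: because the second `fit` call starts from the current weights, two `fit(3)` calls end up identical to one `fit(6)` call with deterministic gradient descent.

```python
import numpy as np

# Stand-in for a Learner whose weight persists between fit calls.
class ToyLearner:
    def __init__(self):
        self.w = 0.0
        self.x = np.array([1.0, 2.0, 3.0])
        self.y = 2.0 * self.x  # target: y = 2x

    def fit(self, epochs, lr=0.05):
        for _ in range(epochs):
            grad = np.mean(2 * (self.w * self.x - self.y) * self.x)
            self.w -= lr * grad

a = ToyLearner(); a.fit(3); a.fit(3)   # two consecutive blocks
b = ToyLearner(); b.fit(6)             # one single block
assert np.isclose(a.w, b.w)            # same final weights either way
```

If the second call restarted from the raw imported weights instead, `a.w` would match a fresh `fit(3)`, not `fit(6)`.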