It seems I had to go back to lesson 2 again...
The code for the third lesson loads the 'finetune3.h5' file produced by the last piece of code in lesson 2. I had to skip that part because the 4-epoch training was taking too long to compute. This is the line of code I'm talking about:
fit_model(model, batches, val_batches, 4)
Every time I executed this piece of code before (with 2 and 3 epochs, during lesson 2), the textual progress indicators ended up freezing. The number of processed batches would grow normally up to, say, 29500, and then everything would freeze until the end of the execution, when all the remaining output would appear at once and the code would finish correctly. Roughly, the 2-epoch training took around 15 minutes, and the 3-epoch training around 30 minutes.
However, I ran the 4-epoch training mentioned above for almost 2 hours and it didn't finish: the notebook cell still showed the [*] execution symbol and, of course, the output was frozen as usual. As I didn't have time, I interrupted the execution to start lesson 3 the next day. To my surprise, the weights from that 4-epoch training are loaded at the start of lesson 3, so I went back to lesson 2 and left the 4-epoch training running overnight, thinking it might just be a matter of time/computational load. It has now been running for almost 10 hours and still hasn't finished, with the output, of course, frozen. No error is thrown and I don't see anything wrong; it simply seems to be stuck.
I suppose I could load the 3-epoch weights at the start of lesson 3 instead, right? However, I'm not sure it would work as expected, and in any case I would like to know what is happening with the 4-epoch training and why it seems to be never-ending.
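To be clear about what I mean by loading the earlier weights: something along these lines, sketched here with a toy model rather than the lesson's VGG finetune model (the file name 'demo.weights.h5' is just a stand-in for 'finetune3.h5'; the real notebook builds the model first and then calls load_weights on it). As far as I understand, the only requirement is that the receiving model has the same architecture as the one that saved the weights:

```python
# Minimal sketch (toy model, NOT the lesson's VGG): save_weights /
# load_weights let you resume from an earlier checkpoint as long as
# the receiving model has an identical layer structure.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([Dense(2, input_shape=(4,), activation='softmax')])
model.compile(optimizer='sgd', loss='categorical_crossentropy')
model.save_weights('demo.weights.h5')   # stand-in for e.g. 'finetune3.h5'

# A freshly built model with the same layers can pick the weights up:
model2 = Sequential([Dense(2, input_shape=(4,), activation='softmax')])
model2.load_weights('demo.weights.h5')
```

So in principle I could point lesson 3 at the 3-epoch file this way, but I'd still lose whatever the 4th epoch would have gained.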
Can anybody help me? Thanks in advance, again.
EDIT: I found the solution. After trying everything, I simply ran the same notebook in Firefox instead of Chrome. It worked perfectly, with no frozen output, and the computation finished after training the 4th epoch, as expected.