When I was using Keras, if the number of neurons in a hidden layer was too large, the kernel would die. I deliberately made the model very large and complex to stress-test my computer. To this day I don't know what actually happens when the model gets that large, or what causes the Jupyter kernel to die. Are you running the program on your own computer? Maybe the model is too complex for your machine to handle. How about switching to a different platform?
It worked. A few days back, while I was looking into that issue, I noticed that ImageClassifierCleaner, just above, had somehow started giving this issue too. At that call, CPU usage shoots up, Jupyter suddenly freezes for a few seconds, and the kernel dies.
I commented out this call and tried to train again. Now the kernel is not crashing at learn.export().
I will experiment to find out why ImageClassifierCleaner is messing with the kernel (sketch of my current workaround below). It's possible the issue is something else entirely. Need to check more.
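For reference, this is roughly what the notebook looks like now. It's only a minimal sketch: the dataset path, DataLoaders, and learner setup are placeholders, not my exact code.

```python
from fastai.vision.all import *
# from fastai.vision.widgets import ImageClassifierCleaner  # only needed if the cleaner is used

# Placeholder setup -- the real DataLoaders/learner come from earlier in the notebook
dls = ImageDataLoaders.from_folder(Path('images'), valid_pct=0.2, item_tfms=Resize(224))
learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)

# Commenting out the cleaner is what avoids the CPU spike / kernel death for me:
# cleaner = ImageClassifierCleaner(learn)
# cleaner

learn.export()  # exports fine once the cleaner call above is skipped
```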
I haven't had any problem at export time, but when running the superres model in inference mode with large images, I experienced very similar issues. So far, I haven't found a better solution than reducing the model's input image size.
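In case it helps, this is the kind of pre-shrinking I mean. It's just a sketch: the exported file name, input image, and the 512px cap are made-up placeholders to illustrate the idea.

```python
import numpy as np
from fastai.vision.all import *
from PIL import Image

MAX_SIDE = 512  # assumed cap on the longest side; tune to what your machine can handle

learn = load_learner('superres.pkl')   # hypothetical path to the exported superres model

img = Image.open('input.jpg')          # hypothetical large input image
img.thumbnail((MAX_SIDE, MAX_SIDE))    # shrinks in place, preserving aspect ratio

# Run inference on the downscaled image instead of the full-size one
pred = learn.predict(PILImage.create(np.array(img)))
```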