Fastai Unet

At around 56:52 in https://youtu.be/9shn3Oz7Ptw Jeremy says he uses “ICNR” to tweak the unet, which can be seen above in the PixelShuffle function. What is “ICNR”? I could not find anything about it online. Also, what are those attention and blur modules? I am working on a unet for a medical segmentation problem and want to incorporate some of these things.

4 Likes

From what I have checked (please correct me if I am wrong): the idea of blur comes from https://arxiv.org/abs/1806.02658 (to avoid checkerboard artifacts), self-attention comes from https://arxiv.org/pdf/1805.08318.pdf (it incorporates more context into the convolution operation, effectively enlarging the receptive field), and ICNR is the initialization recommended for the upsampling layers’ weights in https://arxiv.org/pdf/1707.02937.pdf.
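To make the ICNR idea concrete, here is a minimal NumPy sketch (not fastai’s actual code; function names here are my own). The trick from the paper: instead of initializing all `out_ch` kernels of the sub-pixel conv independently, initialize only `out_ch / r²` distinct kernels and copy each one `r²` times. After the pixel shuffle, every `r×r` output block then comes from identical filters, so the layer starts out as nearest-neighbour upsampling, which avoids checkerboard artifacts at initialization.

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange (C*r^2, H, W) -> (C, H*r, W*r), same mapping as torch.nn.PixelShuffle."""
    c2, h, w = x.shape
    c = c2 // (r * r)
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)        # (c, h, r, w, r)
    return x.reshape(c, h * r, w * r)

def icnr(out_ch, in_ch, k, r=2, rng=None):
    """ICNR: draw only out_ch/r^2 distinct kernels (He-style scale here as an
    illustrative choice), then repeat each r^2 times along the output axis."""
    rng = np.random.default_rng(rng)
    sub = out_ch // (r * r)
    kernels = rng.standard_normal((sub, in_ch, k, k)) * np.sqrt(2.0 / (in_ch * k * k))
    return np.repeat(kernels, r * r, axis=0)   # (out_ch, in_ch, k, k)
```

Because channels `c*r² … c*r²+r²-1` share one kernel, shuffling the conv output produces constant `r×r` blocks, i.e. a nearest-neighbour upsample of the `sub`-channel feature map.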

6 Likes

While I check on this, there is a lot of information in Lesson 14 of part 2 of the older course, from 22 minutes onwards, for anyone looking to dig into more details about upsampling. Interestingly, batch norm is not good for segmentation problems, and there is some weird constant multiplier. I don’t know if that mystery has been solved yet…
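On the blur trick mentioned earlier in the thread: my understanding (an assumption on my part, based on reading fastai’s `PixelShuffle_ICNR`) is that it is just a stride-1 2×2 average pool applied after the pixel shuffle, with one pixel of edge-replication padding on the top/left so the spatial size is preserved. A NumPy sketch of that idea:

```python
import numpy as np

def blur(x):
    """Stride-1 2x2 average pool over a (C, H, W) map, after replicating the
    top row and left column by one pixel so the output stays (C, H, W).
    Smooths the r x r blocks produced by sub-pixel upsampling."""
    p = np.pad(x, ((0, 0), (1, 0), (1, 0)), mode="edge")   # (C, H+1, W+1)
    return (p[:, 1:, 1:] + p[:, :-1, 1:] + p[:, 1:, :-1] + p[:, :-1, :-1]) / 4.0
```

Averaging over every 2×2 window blends neighbouring sub-pixel blocks together, which is why it suppresses the residual checkerboard pattern.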

I am also trying to understand at exactly what point the self-attention layer is used, and why it is used at that point. The library documentation states that it is used in the third block before the end. A visual guide to the location of this block (the third block before the end) would be helpful.
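While figuring out where it goes, it helped me to write out what the layer itself computes. A minimal NumPy sketch of SAGAN-style self-attention on a single feature map, assuming the usual setup from the paper (query/key projections to `C//8` channels, a value projection, and a learned scalar `gamma` initialized to 0 so the layer starts as the identity); the weight names here are my own, not fastai’s:

```python
import numpy as np

def self_attention(x, w_f, w_g, w_h, gamma=0.0):
    """SAGAN-style self-attention on one (C, H, W) feature map.
    w_f, w_g: (C//8, C) query/key projections; w_h: (C, C) value projection;
    gamma: learned scale, 0 at init so the block is initially a no-op."""
    C, H, W = x.shape
    flat = x.reshape(C, H * W)           # N = H*W spatial positions
    q = w_f @ flat                       # (C//8, N) queries
    k = w_g @ flat                       # (C//8, N) keys
    v = w_h @ flat                       # (C, N) values
    e = q.T @ k                          # (N, N) pairwise similarities
    e -= e.max(axis=1, keepdims=True)    # softmax numerical stability
    a = np.exp(e)
    a /= a.sum(axis=1, keepdims=True)    # each row: attention over all positions
    o = v @ a.T                          # every position mixes features from everywhere
    return x + gamma * o.reshape(C, H, W)
```

The key point for a unet: because each output position attends over all `H*W` positions, the receptive field becomes global at whatever stage the block is inserted, which is presumably why it sits in one of the later (larger-resolution) decoder blocks rather than everywhere.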