Normalizing images with a lambda instead of stats (latest EfficientNet requires it)

Hi all,
the latest weights for EfficientNet are AdvProp-trained and thus require a different normalization scheme for images.
Specifically (per the Luke Melas implementation):
ap_normalize = lambda img: img * 2.0 - 1.0

or:

from torchvision import transforms

if advprop:  # for models using AdvProp pretrained weights
    normalize = transforms.Lambda(lambda img: img * 2.0 - 1.0)
else:
    normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                     std=[0.229, 0.224, 0.225])
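
For reference, in a plain PyTorch/torchvision pipeline that normalize just drops in where the stats-based transform would go (the sizes below are typical ImageNet eval values, not from the original post):

# typical eval-time preprocessing; `normalize` is whichever branch was picked above
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),  # float tensor in [0, 1], which the lambda then rescales to [-1, 1]
    normalize,
])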

Besides just hacking into data.py and adding a new normalize function…is there a better way to do this within fastai?

You can skip the built-in normalization and add it as a transform instead. Just make sure you set the proper order so it runs either first or last; I’m not sure which order normalization should have.
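
A minimal sketch of that idea for fastai v1 (untested; the helper names here are made up for illustration, not part of fastai):

from fastai.vision import *

def _ap_norm(x):
    "Scale pixel values from [0, 1] to [-1, 1], as AdvProp weights expect."
    return x * 2.0 - 1.0

# high order so it runs after the other pixel transforms (flip, lighting, ...)
ap_norm = TfmPixel(_ap_norm, order=100)

# append it to both the train and valid transform lists, and skip .normalize()
tfms = get_transforms()
tfms = (tfms[0] + [ap_norm()], tfms[1] + [ap_norm()])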


I feel like this could be adapted in fastai2 fairly easily, as a Normalize.from_advprop (or a plain function) using delegates.


@muellerzr - if you can show how to do this with code, I’ll start moving my production work to v2 :slight_smile: (I’m already close but will need this before I can make the jump). Either way, much appreciated!

I’d take a look at this notebook:

And go down to where Normalize.from_stats is for ideas. I don’t have the time to do it exactly yet, so if you run into issues I’d ask on the v2 chat :slight_smile:
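
One quick idea along those lines (just an observation, not from the notebook): img * 2.0 - 1.0 is the same mapping as (img - 0.5) / 0.5, so the existing from_stats constructor can already express the AdvProp scheme:

from fastai2.vision.all import *

# (x - 0.5) / 0.5 == x * 2 - 1, so stock Normalize with mean=std=0.5 per channel does the job
advprop_norm = Normalize.from_stats([0.5, 0.5, 0.5], [0.5, 0.5, 0.5])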


For v1 I ended up just rewriting the main normalize functions in data.py (made an ap_normalize, etc.) and it all appears to be working now.
I’ll take a look at the v2 code, thanks for posting the link!


Hi @LessW2020!

Have you managed to use transforms.Lambda in the DataBlock batch_tfms pipeline? I am frying my brain and cannot make it work.

For those who are interested, I managed to normalize for AdvProp for EfficientNet, following @muellerzr’s advice. I changed the Normalize transform a bit and created a NormalizeEf transform:

from torchvision import transforms
from fastai2.vision.all import *

def enc_advprop(x):
    # scale a [0, 1] image batch to [-1, 1]
    normalize = transforms.Lambda(lambda img: img * 2.0 - 1.0)
    return normalize(x)

def dec_advprop(x):
    # map a [-1, 1] batch back to [0, 1] for display
    normalize = transforms.Lambda(lambda img: (img + 1.0) / 2.0)
    return normalize(x)

class NormalizeEf(Transform):
    "Normalize/denorm batch of `TensorImage` for AdvProp weights"
    order=99
    def __init__(self, axes=(0,2,3)): self.axes = axes

    @classmethod
    def from_advprop(cls, dim=1, ndim=4, cuda=True): return cls(*broadcast_vec(dim, ndim, cuda=cuda))

    def encodes(self, x:TensorImage): return enc_advprop(x)
    def decodes(self, x:TensorImage): return dec_advprop(x)

    _docs=dict(encodes="Normalize batch", decodes="Denormalize batch")


batch_tfms=[*aug_transforms(size=224, 
                            do_flip=True,
                            max_rotate=15,
                            max_zoom=1.1,
                            max_lighting=0.3,
                            max_warp=0.0,
                            p_affine=1.0,
                            p_lighting=1.0), NormalizeEf.from_advprop()]
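
In case it helps anyone with the same DataBlock question as above, here is a sketch of where that batch_tfms list goes (path, parent_label, etc. are placeholders for your own setup, not from my actual code):

dblock = DataBlock(blocks=(ImageBlock, CategoryBlock),
                   get_items=get_image_files,
                   get_y=parent_label,
                   splitter=RandomSplitter(seed=42),
                   item_tfms=Resize(256),
                   batch_tfms=batch_tfms)  # NormalizeEf runs last thanks to order=99
dls = dblock.dataloaders(path, bs=64)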

I’m not sure I understand the reason for creating a transforms.Lambda(). Also, you’re not broadcasting a tensor in this case, so there’s no need for the classmethod. Could you tell me what would be wrong with writing it this way?

class NormalizeEf(Transform):
    "AdvProp Normalization for `TensorImage`"
    order=99
    def encodes(self, x:TensorImage): return (x * 2.) - 1.
    def decodes(self, x:TensorImage): return (x + 1.) / 2.
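
I’d expect it to slot into batch_tfms the same way as above, e.g. (sketch):

batch_tfms = [*aug_transforms(size=224), NormalizeEf()]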

@LessW2020 How do I invoke this?

data = (src.transform(tfms, size=(sz, sz), resize_method=ResizeMethod.NO, padding_mode='zeros')
        .databunch(bs=bs)
        .normalize1  # (imagenet_stats)
       )