Saving a model with a Lambda function

I referred to this: if we only save the weights, then the model architecture always needs to be known or documented separately. So I wanted to save the full model, including the Lambda layer, in Keras.

I tried two approaches:

Approach 1: I left the vgg_preprocess function and the vgg_mean array unchanged

# Mean of each channel as provided by VGG researchers
vgg_mean = np.array([123.68, 116.779, 103.939]).reshape((3,1,1))

def vgg_preprocess(x):
    x = x - vgg_mean     # subtract the per-channel mean
    return x[:, ::-1]    # reverse channel axis: rgb -> bgr
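(Aside: inside the Lambda, x is a whole batch of shape (N, 3, H, W), so x[:, ::-1] flips the channel axis, not the rows. A quick NumPy-only sanity check, standalone and not from the gist:)

```python
import numpy as np

# Per-channel mean as provided by the VGG researchers, shaped to
# broadcast over (channels, height, width)
vgg_mean = np.array([123.68, 116.779, 103.939]).reshape((3, 1, 1))

def vgg_preprocess(x):
    x = x - vgg_mean     # subtract the per-channel mean
    return x[:, ::-1]    # reverse channel axis (batch-first input)

# A tiny fake batch whose channels become 1, 2, 3 after subtraction
batch = np.zeros((1, 3, 2, 2))
batch[:, 0] = 123.68 + 1.0
batch[:, 1] = 116.779 + 2.0
batch[:, 2] = 103.939 + 3.0

out = vgg_preprocess(batch)
print(out.shape)        # shape is unchanged
print(out[0, 0, 0, 0])  # first channel now holds the old last channel
```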

With this, I was able to save the model. But when I reload the saved model:

new_model = load_model('my_model_copy2.h5', custom_objects={'vgg_mean': vgg_mean})

I get an error saying "global name 'vgg_mean' is not defined". Gist file for this attempt is located here.
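As far as I understand, Keras serializes the Lambda's function as a marshalled code object without its globals, so at load time vgg_mean simply does not exist in the rebuilt function's namespace; defining vgg_mean (and vgg_preprocess) at the top of the loading script before calling load_model should avoid this. A small simulation of the failure mode (this only mimics the mechanism, it is not Keras code):

```python
import marshal
import types

def vgg_preprocess(x):
    return x - vgg_mean   # vgg_mean is looked up as a module-level global

# Serialize just the code object, the way a Lambda's function is saved
code_bytes = marshal.dumps(vgg_preprocess.__code__)

# Rebuild the function in a fresh namespace, simulating a new session
# where the loading script never defined vgg_mean
fresh_globals = {}
rebuilt = types.FunctionType(marshal.loads(code_bytes), fresh_globals)

try:
    rebuilt(0)
    err = None
except NameError as e:
    err = str(e)   # the global lookup fails, as in the reported error

print(err)
```

Defining vgg_mean in fresh_globals (or at module level in the loading script) makes the rebuilt function work again, which matches the error going away when the name exists before load_model runs.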

Approach 2:

I referred to this and changed the vgg_preprocess function as below:

def vgg_preprocess(x, vgg_mean):
    x = x - vgg_mean     # subtract the per-channel mean
    return x[:, ::-1]    # reverse channel axis: rgb -> bgr


def VGG_16():
    model = Sequential()
    model.add(Lambda(vgg_preprocess, arguments={'vgg_mean': vgg_mean},
                     input_shape=(3, 224, 224)))
    ConvBlock(2, model, 64)
    ConvBlock(2, model, 128)
    ConvBlock(3, model, 256)
    ConvBlock(3, model, 512)
    ConvBlock(3, model, 512)
    model.add(Flatten())
    FCBlock(model)
    FCBlock(model)
    model.add(Dense(1000, activation='softmax'))
    return model

I get the error below when I try to save the model:

ValueError: can only convert an array of size 1 to a Python scalar

Gist file for this attempt is here.
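That ValueError is exactly what NumPy raises when a multi-element array is coerced to a single Python scalar, which suggests the ndarray in arguments cannot be serialized into the layer config. A minimal reproduction of the message (my guess at the cause, not traced through the Keras source):

```python
import numpy as np

vgg_mean = np.array([123.68, 116.779, 103.939]).reshape((3, 1, 1))

# Asking for a single scalar from a size-3 array raises the same
# ValueError reported when saving the model
try:
    vgg_mean.item()
    msg = None
except ValueError as e:
    msg = str(e)

print(msg)
```

If that is the cause, one possible workaround (untested on my side) is to pass a plain Python list instead, e.g. arguments={'vgg_mean': vgg_mean.tolist()}, since lists are JSON-serializable, and rebuild the array with np.array inside vgg_preprocess.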

How can I save the model (both architecture and weights) with a Lambda function? Please let me know if you succeed.

I’m having a similar problem. Did you get it working?

No. I was busy with other things and did not get time to pursue this.

I think you’ll need to change the function to this:

def vgg_preprocess(x, vgg_mean):
    vgg_mean = np.array([123.68, 116.779, 103.939]).reshape((3,1,1))
    x = x - vgg_mean     # subtract mean
    return x[:, ::-1]


Changing the vgg_preprocess function and defining vgg_mean inside it worked for me!
