UnpicklingError: invalid load key, '<'

I tried to use the pretrained InceptionResnetV1:

resnet = InceptionResnetV1(
    classify=True,
    pretrained='vggface2',
    num_classes=len(dataset.class_to_idx)
).to(device)

but I get the following error:

magic_number = pickle_module.load(f, **pickle_load_args)
if magic_number != MAGIC_NUMBER:
raise RuntimeError("Invalid magic number; corrupt file?")
UnpicklingError: invalid load key, '<'.

Is there something I've missed adding or removing? Thanks for any advice.

A little late here, but I was running into the same problem.

I also hit the error UnpicklingError: invalid load key, '<'.

In my case it happened because I was following the guide for downloading models from Google Drive to avoid the Heroku slug size limit (see here).
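The error message itself is a big hint: the '<' is literally the first byte of whatever file the unpickler was handed, i.e. the start of an HTML page rather than a pickle. You can reproduce it in isolation:

```python
import pickle

# Feeding HTML (e.g. Google Drive's warning page) to the unpickler
# produces exactly the error from the original post.
try:
    pickle.loads(b"<!DOCTYPE html><html></html>")
except pickle.UnpicklingError as e:
    print(e)  # invalid load key, '<'.
```

So the model file on disk isn't a corrupt checkpoint; it's an HTML page that was saved under the model's filename.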

The guide uses the following snippet:

import urllib.request
from pathlib import Path

from fastai.basic_train import load_learner  # fastai v1

MODEL_URL = "https://drive.google.com/uc?export=download&id=YOUR_FILE_ID"
urllib.request.urlretrieve(MODEL_URL, "model.pkl")

learner = load_learner(Path("."), "model.pkl")

This isn’t a great way to download your model, though: if your file is too large, Google won’t return the file directly and will instead return an HTML warning page, which then gets saved under your model’s filename. If your model is large (mine is ~500 MB), I recommend the following method for downloading files from Google Drive:

import requests

def download_file_from_google_drive(id, destination):
    URL = "https://docs.google.com/uc?export=download"

    session = requests.Session()

    response = session.get(URL, params={'id': id}, stream=True)
    token = get_confirm_token(response)

    if token:
        # Large files trigger Google's download-warning page; re-request
        # with the confirmation token taken from the cookie.
        params = {'id': id, 'confirm': token}
        response = session.get(URL, params=params, stream=True)

    save_response_content(response, destination)

def get_confirm_token(response):
    for key, value in response.cookies.items():
        if key.startswith('download_warning'):
            return value

    return None

def save_response_content(response, destination):
    CHUNK_SIZE = 32768

    with open(destination, "wb") as f:
        for chunk in response.iter_content(CHUNK_SIZE):
            if chunk:  # filter out keep-alive chunks
                f.write(chunk)


file_id = 'YOUR_FILE_ID'
destination = 'model.pkl'
download_file_from_google_drive(file_id, destination)
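Whichever download method you use, a quick sanity check after the download catches the HTML-page case before torch.load or load_learner chokes on it. A minimal sketch (the helper name is mine, not part of any library):

```python
import os

def check_download(path):
    """Raise if `path` looks like an HTML page rather than a pickle file."""
    with open(path, "rb") as f:
        magic = f.read(2)
    # A pickle/PyTorch file never starts with '<'; an HTML warning page does.
    if magic.startswith(b"<"):
        size_mb = os.path.getsize(path) / 1e6
        raise RuntimeError(
            f"{path} looks like HTML ({size_mb:.1f} MB) -- Google probably "
            "returned its warning page instead of the file."
        )
```

Call check_download("model.pkl") right after downloading; if it raises, re-download with the confirm-token method above.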