Export Tabular model


Does anybody know why I can’t use export to export this tabular model?
Jeremy told us to do

save_pickle(path/'to.pkl', to)

instead

You’re trying to export TabularPandas, not the model. I have some code here that enables you to do so:

(See the bottom for a usage example; install the package with pip install wwf.)
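If it helps, here’s a minimal sketch of the save_pickle route Jeremy mentioned, assuming to is your fitted TabularPandas and df_test is a hypothetical new DataFrame with the same columns as your training data:

from fastai.tabular.all import *

path = Path('.')  # wherever you want to store the file

# save_pickle/load_pickle are small fastcore helpers (re-exported by fastai)
save_pickle(path/'to.pkl', to)   # persist the fitted TabularPandas
to = load_pickle(path/'to.pkl')  # restore it in a new session

# Apply the already-fitted procs to new data
to_new = to.train.new(df_test)   # df_test: same columns as training
to_new.process()                 # reruns Categorify/FillMissing/Normalize
                                 # using the stored categories and stats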


Hi, thank you for your response. Could you also explain what the benefit of using TabularPandas over a normal pandas DataFrame is? Both seem rather similar to my novice eyes, so I don’t see why we couldn’t just use the DataFrame. Also:

Why can’t I use x instead of xs?

TabularPandas is how we preprocess our tabular data in fastai. It uses pandas internally, hence the name. It’s similar to our DataBlock/Datasets, which we can then turn into DataLoaders. The procs (Normalize, Categorify, and FillMissing) are in-place transforms rather than the lazy transforms we use for vision (since we need to apply these right away). Does this help alleviate some confusion? 🙂
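For reference, here’s a minimal sketch of building one on fastai’s ADULT_SAMPLE dataset (the column choices are just an example):

from fastai.tabular.all import *

path = untar_data(URLs.ADULT_SAMPLE)
df = pd.read_csv(path/'adult.csv')

# The procs run once, eagerly, over the whole in-memory DataFrame
to = TabularPandas(df,
    procs=[Categorify, FillMissing, Normalize],
    cat_names=['workclass', 'education', 'occupation'],
    cont_names=['age', 'fnlwgt', 'education-num'],
    y_names='salary',
    splits=RandomSplitter(valid_pct=0.2)(range_of(df)))

dls = to.dataloaders(bs=64)  # turn it into DataLoaders for training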

It’s not exactly the same as a regular DataLoader/dataset, so some things are different. We don’t have x here, we just have xs. You can also do .conts and .cats to get the continuous and the categorical variable data. It’s a different API from the rest of the library, since tabular data is special.
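Concretely, with the to object from the sketch above:

to.train.xs     # fully processed inputs (categorical and continuous together)
to.train.cats   # just the categorical columns (integer codes after Categorify)
to.train.conts  # just the continuous columns (after Normalize)
to.train.y      # the target column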

I’m sorry, what’s the difference between lazy transforms and the transforms we’re doing now? Apart from that, the explanation is golden. Also, I asked some other questions on the forum; could I link you to them?

So lazy transforms are done as we get items. For instance, we can’t exactly store 100,000 images in memory; we’d need GB upon GB to do that. So instead we pull what we need batch by batch and apply our transforms to those 64 images (if our bs is 64). We read them in, apply our item_tfms and batch_tfms, pass the batch to the model, then take that batch out of memory and load in a new one.

In this case our DataFrame is always in memory, so as a result we just apply the full transforms at once (which makes sense: we’re not reading things in lazily, and we need the full DataFrame to calculate our preprocessors anyway, like Normalize).
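As a rough sketch of the lazy side, using standard fastai vision pieces (the is_cat labeling function is just an illustrative example based on the PETS filename convention):

from fastai.vision.all import *

path = untar_data(URLs.PETS)/'images'

def is_cat(p): return p.name[0].isupper()  # PETS: cat breeds are capitalized

# Lazy: nothing is transformed when the DataBlock is built;
# item_tfms/batch_tfms run as each batch of images is actually loaded
dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    get_y=is_cat,
    item_tfms=Resize(224),        # per image, as it's read from disk
    batch_tfms=aug_transforms())  # per mini-batch, on the GPU

dls = dblock.dataloaders(path, bs=64)
xb, yb = dls.one_batch()  # only now are 64 images read and transformed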

Re: questions, sure. Go ahead and drop a link and I’ll see if I can answer them.


https://forums.fast.ai/t/unable-to-visualize-embedding-distances/83957/2

Hi sir, above are the two questions. Also, yep, I get what you mean: lazy transforms are done “on the fly”, so to speak. Correct me if I’m wrong.

Exactly! 🙂


Hi sir, another question I have:

Please help me out; I need to find a way to beat this learning curve.