I’m running the documentation notebooks on my laptop, but I keep getting this error. I’ve looked all over the forum and other sites, but I cannot find any similar issue.
I ran the notebook on Google Colab and it works fine. I tried to update fastai, but it’s already the latest version. I don’t know what the problem could be.
I have the same issue. I can see/access classes like ImageBBox and ImageImageList, but not ImageList.
I run on Crestle and the fastai lib is up to date.
I’m trying to run lesson 3 (planet).
I just found the issue. Apparently this is a change introduced in the latest fastai library update: they replaced ImageItemList with ImageList. I managed to update the library to version 1.0.46 and now it’s working.
In my case I changed ImageList to ImageItemList and it seems to work.
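The two fixes above are really the same rename seen from opposite sides: on fastai >= 1.0.46 the class is called ImageList, and on earlier releases it is ImageItemList. A minimal sketch of how one might handle both names, plus a hypothetical helper for deciding which name a given version string should have (the 1.0.46 cutoff is taken from this thread, and `has_image_list` is an illustrative name, not fastai API):

```python
def import_image_list():
    """Try the new class name first, fall back to the old one.

    fastai >= 1.0.46 exposes ImageList; older releases call the
    same class ImageItemList. The import only runs when called,
    so this works on either version.
    """
    try:
        from fastai.vision import ImageList  # fastai >= 1.0.46
    except ImportError:
        from fastai.vision import ImageItemList as ImageList  # older fastai
    return ImageList


def has_image_list(version: str) -> bool:
    """Return True if this fastai version string should use the new
    ImageList name (i.e. the version is at least 1.0.46).

    Handles suffixed versions like '1.0.50.post1' by keeping only
    the leading numeric components of the version string.
    """
    parts = []
    for piece in version.split("."):
        if not piece[:1].isdigit():
            break  # stop at non-numeric suffixes such as 'post1'
        parts.append(int("".join(ch for ch in piece if ch.isdigit())))
    return tuple(parts) >= (1, 0, 46)
```

So `has_image_list("1.0.34")` is False, while `has_image_list("1.0.50.post1")` is True; you can check your own install with `fastai.__version__`.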
I am stuck now with this error:
RuntimeError: The size of tensor a (418) must match the size of tensor b (64) at non-singleton dimension 1
In my case, using 1.0.46 with ItemList no longer allows me to load a dataset smaller than the default batch size (i.e. 64 images). With ImageItemList from prior package versions, I would force the batch size to be smaller than 64 with “bs=new_batch_size”, where new_batch_size = size of my dataset. I recognize that my dataset is very small, but it would be great to have this capability again.
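The workaround described above (forcing the batch size down to the dataset size) can be sketched as a tiny helper. `safe_batch_size` is a hypothetical name, not part of fastai; the idea is just to clamp the default batch size of 64 so a tiny dataset still loads:

```python
def safe_batch_size(n_items: int, default_bs: int = 64) -> int:
    """Clamp the batch size to the dataset size.

    With fewer than `default_bs` images, use the dataset size itself
    as the batch size (and never go below 1), mirroring the manual
    bs=new_batch_size workaround described in the thread.
    """
    return max(1, min(default_bs, n_items))
```

For example, `safe_batch_size(10)` returns 10 and `safe_batch_size(200)` returns 64; the result would then be passed as the `bs=` argument when building the DataBunch.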
The recommended solution is to update the fastai lib using the following command:
conda install -c fastai fastai
This solved the issue for me.
Same here, have you solved this?
How can you update the library to 1.0.46?
When I run “conda install -c fastai fastai”, I get “All requested packages already installed.”,
but the fastai version is still only ‘1.0.34’.
Do you use the Crestle.ai platform? In my case, upgrading the fastai version the way you mentioned was also blocked. Specify the version explicitly, like
conda install -c fastai fastai=1.0.46. That allowed me to upgrade past 1.0.34. Hope it helps.
I upgraded fastai to
1.0.50.post1 (I could not upgrade to 1.0.50, the newest version on GitHub at this moment), and everything seems to be going well.
Thank you very much, that worked for me :grinning: