What should n_inp of dataloaders return

After creating a unet_learner and trying to get its summary(), I got the error TypeError: 'int' object is not iterable. Going through the traceback, if I understood it right, the error is caused by the DataBlock instance having an integer stored in its n_inp attribute when it needs to hold something that can be iterated over. Am I understanding the n_inp variable correctly? What should it return?

The traceback:
TypeError Traceback (most recent call last)
<ipython-input-...> in <module>
----> 1 learn.summary()

~/anaconda3/envs/rmnt/lib/python3.8/site-packages/fastai/callback/hook.py in summary(self)
    189     "Print a summary of the model, optimizer and loss function."
    190     xb = self.dls.train.one_batch()[:self.dls.train.n_inp]
--> 191     res = module_summary(self, *xb)
    192     res += f"Optimizer used: {self.opt_func}\nLoss function: {self.loss_func}\n\n"
    193     if self.opt is not None:

~/anaconda3/envs/rmnt/lib/python3.8/site-packages/fastai/callback/hook.py in module_summary(learn, *xb)
    166     infos = layer_info(learn, *xb)
    167     n,bs = 64,find_bs(xb)
--> 168     inp_sz = _print_shapes(apply(lambda x:x.shape, xb), bs)
    169     res = f"{learn.model.__class__.__name__} (Input shape: {inp_sz})\n"
    170     res += "=" * n + "\n"

~/anaconda3/envs/rmnt/lib/python3.8/site-packages/fastai/callback/hook.py in _print_shapes(o, bs)
    156 def _print_shapes(o, bs):
    157     if isinstance(o, torch.Size): return ' x '.join([str(bs)] + [str(t) for t in o[1:]])
--> 158     else: return str([_print_shapes(x, bs) for x in o])
    159 
    160 # Cell

~/anaconda3/envs/rmnt/lib/python3.8/site-packages/fastai/callback/hook.py in <listcomp>(.0)
    156 def _print_shapes(o, bs):
    157     if isinstance(o, torch.Size): return ' x '.join([str(bs)] + [str(t) for t in o[1:]])
--> 158     else: return str([_print_shapes(x, bs) for x in o])
    159 
    160 # Cell

~/anaconda3/envs/rmnt/lib/python3.8/site-packages/fastai/callback/hook.py in _print_shapes(o, bs)
    156 def _print_shapes(o, bs):
    157     if isinstance(o, torch.Size): return ' x '.join([str(bs)] + [str(t) for t in o[1:]])
--> 158     else: return str([_print_shapes(x, bs) for x in o])
    159 
    160 # Cell

~/anaconda3/envs/rmnt/lib/python3.8/site-packages/fastai/callback/hook.py in <listcomp>(.0)
    156 def _print_shapes(o, bs):
    157     if isinstance(o, torch.Size): return ' x '.join([str(bs)] + [str(t) for t in o[1:]])
--> 158     else: return str([_print_shapes(x, bs) for x in o])
    159 
    160 # Cell

~/anaconda3/envs/rmnt/lib/python3.8/site-packages/fastai/callback/hook.py in _print_shapes(o, bs)
    156 def _print_shapes(o, bs):
    157     if isinstance(o, torch.Size): return ' x '.join([str(bs)] + [str(t) for t in o[1:]])
--> 158     else: return str([_print_shapes(x, bs) for x in o])
    159 
    160 # Cell

TypeError: 'int' object is not iterable
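
In case it helps, the frame at line 190 above appears to just slice the batch with n_inp before the shapes are printed. A rough, self-contained sketch of that step with made-up tensors (these are not my actual shapes):

```python
import torch

# hypothetical segmentation batch: (images, masks), shapes invented for illustration
batch = (torch.zeros(8, 3, 224, 224), torch.zeros(8, 224, 224, dtype=torch.long))

n_inp = 1               # used as a plain slice index, as in the frame at line 190
xb = batch[:n_inp]      # -> (images,): only these elements reach module_summary
print([t.shape for t in xb])   # [torch.Size([8, 3, 224, 224])]
```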

I had a similar problem too. See
https://docs.fast.ai/data.core.html#Datasets
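
If I read those docs right, n_inp is simply the number of elements in each sample tuple that count as model inputs, so it is expected to be an int rather than something iterable. A tiny made-up example (the items and transforms here are invented just for illustration):

```python
from fastai.data.core import Datasets

def add_one(x): return x + 1
def double(x):  return x * 2

items = list(range(10))
# two transform pipelines -> each item becomes an (input, target) tuple
dsets = Datasets(items, [[add_one], [double]])
print(dsets[0])       # (1, 0)
print(dsets.n_inp)    # should print 1: only the first element is a model input
```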

I fixed learn.summary() with
dls.train.n_inp=1

Still, I don't know if this is the same issue as yours. It seems there is some fragility around the implementation of n_inp, maybe when transforms are not used.
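
For completeness, here is the workaround in context. This is only a sketch: dls and learn are meant to be the DataLoaders and the unet_learner Learner from the posts above, so it is not a standalone snippet:

```python
# `dls` and `learn` come from the posts above (not defined here)
print(dls.train.n_inp)   # worth checking what this is before the fix
dls.train.n_inp = 1      # treat only the first batch element as a model input
learn.summary()          # summary() now passes just the input tensor to the model
```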

Good luck! :slightly_smiling_face:
