Trouble in 1.0.55 making unet_learner with TensorDataset

(Sean) #1

Hi all.
I'm attempting to make the lesson7-superres notebook work on floating point data instead of png images. I had it working in a borderline reasonable way with a slightly older fastai version, but it broke when I updated.

By analogy with the lesson5-mnist notebook, I read the data in as numpy arrays, mapped them to tensors, wrapped X_train, Y_train, X_valid, Y_valid into TensorDatasets, and called DataBunch.create.
Then I passed that DataBunch to unet_learner:

train_ds, valid_ds = TensorDataset(X_train, Y_train), TensorDataset(X_valid,Y_valid)
train_ds.c=3
valid_ds.c=3
data2 = DataBunch.create(train_ds, valid_ds, bs=10, num_workers=1)

But when I try to make the unet_learner:

wd = 1e-3
learn = unet_learner(data2, arch, wd=wd, loss_func=feat_loss, callback_fns=LossMetrics,
blur=True, norm_type=NormType.Weight)

I get the following error:

/group03/secrawle/Pystuf/python-virtual-environments/env/lib/python3.6/site-packages/fastai/callbacks/hooks.py in dummy_batch(m, size)
    103     "Create a dummy batch to go through `m` with `size`."
    104     ch_in = in_channels(m)
--> 105     return one_param(m).new(1, ch_in, *size).requires_grad_(False).uniform_(-1.,1.)
    106
    107 def dummy_eval(m:nn.Module, size:tuple=(64,64)):

TypeError: new() argument after * must be an iterable, not builtin_function_or_method

I'm not very knowledgeable about Python/PyTorch/etc. Is this happening because `Tensor.size` is a built-in method rather than a tuple, so `size` ends up being a bound method by the time it's splatted in `dummy_batch`? Somehow this worked pretty recently.
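To illustrate my guess (this is just a plain-PyTorch sketch of what I think the traceback means, not the actual fastai call path): on a raw tensor, `.size` without parentheses is a bound method, and splatting it with `*` reproduces the exact TypeError above.

```python
import torch

t = torch.zeros(3, 128, 128)

# On a raw tensor, .size is a method, not a tuple, until it is called:
print(type(t.size))        # a builtin_function_or_method
print(tuple(t.size()))     # (3, 128, 128)

# Splatting the uncalled method reproduces the traceback's TypeError:
try:
    t.new(1, 3, *t.size)
except TypeError as e:
    print(e)  # new() argument after * must be an iterable, not builtin_function_or_method
```

So presumably whatever fastai reads the image size from expects a tuple-like `.size` (as on fastai's `Image` items), but a bare `TensorDataset` hands it the tensor's method instead.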

With fastai 1.0.52 (I think it was 1.0.52?), I had this working: I could train the network, then read new datasets from disk and produce results. (However, presort=True wasn't implemented yet in the ItemList part, so the dataset would come out in a different order than it was on disk.)
It stopped working when I upgraded fastai to 1.0.55.

Any suggestions? Has anyone else built a unet_learner from tensor datasets in a more straightforward way? (I'm still padding my (1,128,128) arrays out to (3,128,128), for instance, so I'm definitely doing a few clunky things; have mercy, I'm learning.)
