Code:
batch_size = 64
do_flip = True
flip_vert = True
max_rotate = 90
max_zoom = 1.1
max_lighting = 0.2
max_warp = 0.2
p_affine = 0.75
p_lighting = 0.75
tfms = get_transforms(do_flip=do_flip,
flip_vert=flip_vert,
max_rotate=max_rotate,
max_zoom=max_zoom,
max_lighting=max_lighting,
max_warp=max_warp,
p_affine=p_affine,
p_lighting=p_lighting)
train, valid = ObjectItemListSlide(train_images), ObjectItemListSlide(valid_images)
item_list = ItemLists(".", train, valid)
lls = item_list.label_from_func(lambda x: x.y, label_cls=SlideObjectCategoryList)
lls = lls.transform(tfms, tfm_y=True, size=patch_size)
data = lls.databunch(bs=batch_size, collate_fn=bb_pad_collate, num_workers=0).normalize()
Error:
fastai/vision/transform.py:247: UserWarning: torch.solve is deprecated in favor of torch.linalg.solve and will be removed in a future PyTorch release.
torch.linalg.solve has its arguments reversed and does not return the LU factorization.
To get the LU factorization see torch.lu, which can be used with torch.lu_solve or torch.lu_unpack.
X = torch.solve(B, A).solution
should be replaced with
X = torch.linalg.solve(A, B) (Triggered internally at ../aten/src/ATen/native/BatchLinearAlgebra.cpp:859.)
return _solve_func(B,A)[0][:,0]
Can anyone tell me whether this warning is stopping my transforms or not? What can I do to get rid of it?
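For context, a `UserWarning` like this is informational only: the code that raised it still runs to completion, so the transforms are not interrupted. If the message is just noise, it can be filtered with Python's standard `warnings` module. A minimal sketch (the message pattern below is copied from the warning above; the two `warnings.warn` calls only simulate what the library emits):

```python
import warnings

# Sketch: silence only the known torch.solve deprecation message.
# A UserWarning is informational -- it does not stop the code that raised it.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")            # record everything by default
    warnings.filterwarnings(                   # ...except this one message
        "ignore",
        message="torch.solve is deprecated",
        category=UserWarning,
    )
    # Simulate the library's warning plus an unrelated one:
    warnings.warn("torch.solve is deprecated in favor of torch.linalg.solve",
                  UserWarning)
    warnings.warn("unrelated warning", UserWarning)

# Only the unrelated warning was recorded; the deprecation was filtered out.
print([str(w.message) for w in caught])  # ['unrelated warning']
```

In a real script the `warnings.filterwarnings("ignore", message=...)` call would go once near the top, before the fastai code runs.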
Installed library versions:
bottleneck-1.3.5 fastai-1.0.61 nvidia-ml-py3-7.352.0 object-detection-fastai-0.0.10
torch-1.11.0 torchvision-0.12.0