Unet_learner memory issue with large dataset

Hi everyone,
I am facing an issue with unet_learner. The error occurs on learn.lr_find(). I have a dataset of 100k records; the same code works with 5k records.

The DataBlock is:

final_size = 512
dblock = DataBlock(
    blocks=(ImageBlock, MaskBlock(codes=codes)),
    get_items=get_image_files,
    # note: a lambda here will block learn.export()/pickling; a named function avoids that
    get_y=lambda x: path/'masks'/f'{x.stem}{x.suffix}',
    item_tfms=[Resize(size=final_size, method='squish')],
    # aug_transforms() already returns a list, so concatenate it rather than nesting it,
    # and unpack imagenet_stats (a (mean, std) tuple) into from_stats
    batch_tfms=aug_transforms(size=(final_size, final_size)) + [Normalize.from_stats(*imagenet_stats)]
)

Error:

Try running on the CPU to see what the real error is.

The kernel gets restarted when the dls device is set to the GPU.