Launching distributed training scripts with multiple args

For example, I’m trying to run a script like so:

!python3 -m torch.distributed.launch --nproc_per_node=2 train.py --bs 32 --bs_big 16 --model 'resnet50' --fp16 --eps 1e-4 --stage1_big --stage2_big

But is it possible to run it across multiple lines, like this:

!python3 -m torch.distributed.launch --nproc_per_node=2 train.py \
    --bs 32 \
    --bs_big 16 \
    --model 'resnet50' \
    --fp16 \
    --eps 1e-4 \
    --stage1_big \
    --stage2_big \
    --div_factor_end_small 3 \
    --div_factor_end_big 3

I tried running that, but I get an "unexpected character after line continuation character" error on the \ characters.
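One possible workaround, assuming this is being run in a Jupyter/IPython notebook (implied by the `!` prefix): the `!` shell escape only applies to a single input line, so the continuation lines after the first are handed to the Python parser, which is what produces the "unexpected character" error. The `%%bash` cell magic instead sends the whole cell to bash, which does understand backslash line continuations. A sketch, keeping the same flags as above:

```shell
%%bash
# The entire cell is executed by bash, so backslash continuations work here.
# Every line except the last must end in " \" with nothing after the backslash.
python3 -m torch.distributed.launch --nproc_per_node=2 train.py \
    --bs 32 \
    --bs_big 16 \
    --model 'resnet50' \
    --fp16 \
    --eps 1e-4 \
    --stage1_big \
    --stage2_big \
    --div_factor_end_small 3 \
    --div_factor_end_big 3
```

Note that even with `!`, every continuation line would need a trailing backslash (the `--fp16` and `--eps 1e-4` lines above were missing theirs), and a stray space or character after a backslash also breaks the continuation.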