I had not known about any of the !
commands before, and the whole bringing-to-background/foreground thing also seemed a little remote to me even though I technically knew about it, but now that I have started using all this, moving around in bash feels so much nicer 
Dunno why that is, as I genuinely did know about pushd
for example, but it took being part of the walk-thrus to actually start using all this
Awesome and a big thanks! 
Inspired by all the installing of stuff, changing the PATH environment variable (and symlinking!), I wrote this small program for running docker, and it has been super useful to me
#!/usr/bin/env python3
import argparse
import subprocess

parser = argparse.ArgumentParser()
parser.add_argument('-i', '--image', help='name of the docker image to run', type=str, default='nvcr.io/nvidia/merlin/merlin-pytorch-training:22.05')
parser.add_argument('-c', '--cmd', help='cmd to run', type=str, default='jupyter notebook --ip=0.0.0.0 --allow-root --no-browser --NotebookApp.token=""')
parser.add_argument('-np', '--no-port-forwarding', help='do not forward port 8888', action='store_true')
args = parser.parse_args()

docker_cmd = ['docker run -it --gpus all --ipc=host']
if not args.no_port_forwarding:
    docker_cmd.append('-p 8888:8888')
# mount the current directory as /workspace inside the container and start there
docker_cmd.append('--mount src=`pwd`,target=/workspace,type=bind --ulimit memlock=-1 --ulimit stack=67108864 --entrypoint="" -w /workspace')
docker_cmd += [args.image, args.cmd]
subprocess.call(' '.join(docker_cmd), shell=True)
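In case it helps anyone, one way to sanity-check a wrapper like this without actually launching docker is to feed argparse a simulated argv and just print the assembled command. This is a hedged sketch with made-up values (my-image:latest and the -c argument are placeholders, not the real defaults above):

```python
# Sanity-check the command assembly without launching docker,
# by feeding argparse a simulated argv and printing the result.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-i', '--image', type=str, default='my-image:latest')  # shortened default, for illustration
parser.add_argument('-c', '--cmd', type=str, default='bash')
parser.add_argument('-np', '--no-port-forwarding', action='store_true')

# Simulate invoking the script as: ./run_docker.py -np -c "python train.py"
args = parser.parse_args(['-np', '-c', 'python train.py'])

docker_cmd = ['docker run -it --gpus all --ipc=host']
if not args.no_port_forwarding:
    docker_cmd.append('-p 8888:8888')   # skipped here because of -np
docker_cmd += [args.image, args.cmd]

print(' '.join(docker_cmd))
# -> docker run -it --gpus all --ipc=host my-image:latest python train.py
```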
I initially tried to do all this in bash (by defining a function), but that quickly turned into a painful ordeal as the logic became more complex. Turns out Python is quite handy for such a thing 
I am not sure if that is the “correct” way to use it, but I keep this code somewhere in my home dir where I can easily edit it, have made it executable, and have symlinked to it from /usr/local/bin
. Not sure if that is where we should put our own stuff? Probably!
One mystery I continue to be faced with is why, in a docker container, I cannot do pip install -e .
on a local repo? I mean, the command runs, but docker still imports the library that was preinstalled there 
But one cool thing I stumbled upon: if I do the following at the top of my notebook:
os.chdir('/workspace')
and /workspace
is where the code for the library I would like to work on lives, somehow that code gets imported instead of what is installed.
But… how? Why? So what is pip install -e .
doing? Now this will be a stupid question… but for all the Python packages in the universe, can I always do this? Be at the root of the repo where they are defined, where I could do pip install -e .
, and simply import them “directly”? What magic is this?!
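I think the notebook behaviour has less to do with pip than with sys.path: interactive sessions put the current directory (an empty-string entry) on sys.path, and that entry is resolved at import time, so os.chdir changes where imports look first. A minimal sketch of that mechanism (mylib is a made-up module name):

```python
# Why changing the working directory can shadow an installed package:
# an empty-string entry in sys.path means "the current directory",
# and it is resolved at import time, not at startup.
import os
import sys
import tempfile

# Create a throwaway directory containing a module named "mylib"
# ("mylib" is purely illustrative).
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "mylib.py"), "w") as f:
    f.write("VERSION = 'local copy'\n")

# Make sure the cwd is searched first, as it is in a notebook session.
sys.path.insert(0, "")

os.chdir(workdir)          # like os.chdir('/workspace') in the notebook
import mylib               # found via the cwd entry, not site-packages
print(mylib.VERSION)       # -> local copy
```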
If that does indeed work with all Python code, that would be quite neat because… I could then always use pdb.set_trace
to step through any Python code in the entire known universe without doing pip install -e .
; I would just need to change the workdir using os.chdir
.
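And as far as I can tell, pdb genuinely does not care whether code was pip-installed; anything importable can be stepped through. A tiny sketch (buggy_sum is a made-up function, and the set_trace call is left as a comment so the snippet runs non-interactively):

```python
import pdb

def buggy_sum(xs):
    # Uncommenting the next line would drop into the debugger on every
    # call; since Python 3.7 the built-in breakpoint() does the same.
    # pdb.set_trace()
    total = 0
    for x in xs:
        total += x
    return total

print(buggy_sum([1, 2, 3]))  # -> 6
```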
Well, figuring out these pieces was fun; now I actually have to figure out what the code I wanted to learn about does
That will be aaaaa long process…