Having been through a lot of pain with wrong package versions and broken builds…
Something I have been wondering about… the software package management ecosystem in Python seems very wild and woolly. For instance, Java has Maven and even Javascript has NPM to ensure that the proper dependency versions are in use. (Never mind the constant pain of preferring Windows on my workstation and Python not having a platform-independent bytecode format like those other languages… yeah, I know it’s a rougher road, but somebody’s got to do it.)
Is Python left out in the cold when it comes to specifying project dependency versions? It’s been a huge pain point for me, and for many others I’m sure. If there is such a thing, maybe we should be using it in the lessons? The whole bash script thing seems like quite a hack. (Although maybe a necessary one in the Python world?)
Python has virtualenv and requirements files that help you manage third-party libs for different projects. I still find it a bit clunky to use, but it can remove version pains if you need different versions of the same lib for different projects.
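A minimal sketch of that workflow, using the venv module that ships with Python 3.3+ (the older virtualenv package works the same way); the directory and file names here are just conventions, not anything the post specifies:

```shell
# Create an isolated environment in the project directory
python3 -m venv .venv
. .venv/bin/activate            # on Windows: .venv\Scripts\activate

# Install the project's pinned dependencies
pip install -r requirements.txt

# Record the exact versions currently installed, so the
# environment can be reproduced elsewhere
pip freeze > requirements.txt
```

Because each project gets its own .venv, two projects can pin conflicting versions of the same library without stepping on each other.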
You should use conda for package management. For most purposes, it’s better and easier to use than virtualenv.
Conda allows easy installs of many packages (e.g. SciPy) and easy creation of environments (conda create -n env_name python=3 scipy numpy other_cool_package).
You can freeze your environment settings in a YAML file and recreate the environment easily.
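The freeze-and-recreate round trip looks roughly like this (a sketch of the standard conda commands; environment.yml is the conventional file name, not one the post mandates):

```shell
# Snapshot the active environment (packages, versions, channels) to YAML
conda env export > environment.yml

# Later, or on another machine, rebuild the same environment from the file
conda env create -f environment.yml
```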
Yep. The combination of a conda YAML file and a pip requirements file should get you everything (and I think there may be a way to store the pip requirements in the conda file too, although I’ve never tried it).
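For what it’s worth, conda’s environment files do accept a pip subsection under dependencies, so pip-only packages can live in the same file; a hypothetical example (the environment name and package names here are made up for illustration):

```yaml
# environment.yml -- hypothetical example
name: my_project
channels:
  - defaults
dependencies:
  - python=3
  - scipy
  - numpy
  - pip
  - pip:
      # packages below are installed with pip, not conda
      - some-pip-only-package
```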
That said, these libraries are changing quickly. You’re often better off recreating everything with the most recent versions than just running someone else’s pinned environment as-is.