What are the best tools to manage DL models?

Interested in knowing what tools people are using to manage/analyze their data, deep learning models, and results!

Google Sheets works pretty well for me for keeping track of my results. It's easy to create, easy to collaborate on, and it has tons of features for analyzing results as well as exporting them to other formats.

Is there nothing better than spreadsheets? I just feel there ought to be something that makes it easy to manage all the different variables, particularly when data from various sources is being fed into different iterations of the model(s).

I use Trello to manage projects/challenges/tasks.

I use a personal Slack instance as a bit of a brain-dump log, and a few .txt files as well.

I usually also start a notes.ipynb, which gives things a bit more structure.

This is the answer to the common problem:

I recently used this for visualizing performance: https://www.wandb.com/
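
In case it helps anyone, getting metrics into it is just a couple of calls. A minimal sketch (the project name and the logged numbers are placeholders, not from a real run):

import wandb

wandb.init(project='my-experiments')  # placeholder project name

# inside your training loop, log whatever you want the dashboard to plot
for epoch in range(10):
    train_loss = 1.0 / (epoch + 1)  # dummy value standing in for a real metric
    wandb.log({'epoch': epoch, 'train_loss': train_loss})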

Have you found it useful? I took a look and it seems like TensorBoard + a spreadsheet.

Hi @echan00,

Jakub from neptune.ml here.
It sounds like Neptune would solve your problems. It helps you keep track of code, hyperparameters, data versions, summary charts (e.g. prediction distributions), and things like that. Some time ago I wrote a blog post about organizing your experimentation process with Neptune, so you can check it out here if you want. What's more, it also allows discussions that link to code or charts, so that collaboration on your machine learning project is smoother.

Also, I have just added a simple callback to our neptune-contrib library that lets you monitor fastai training in Neptune. I explain how it works in this blog post, but basically, with no change to your workflow, you can track code, hyperparameters, metrics, and more.
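
For the curious, here is a minimal sketch of what such a metric-forwarding callback can look like in fastai v1. The class name and the experiment handle (assumed to expose a log_metric(name, value) method, roughly like Neptune's client) are illustrative stand-ins, not the actual neptune-contrib implementation:

from fastai.basic_train import LearnerCallback

class TrackerMonitor(LearnerCallback):
    # hypothetical callback: forwards fastai's per-epoch metrics to a tracker
    def __init__(self, learn, experiment):
        super().__init__(learn)
        self.experiment = experiment  # e.g. a Neptune experiment handle

    def on_epoch_end(self, epoch, smooth_loss, last_metrics, **kwargs):
        # smoothed training loss is available at the end of every epoch
        self.experiment.log_metric('train_loss', float(smooth_loss))
        if last_metrics and last_metrics[0] is not None:
            # first entry is the validation loss, the rest are your metrics
            self.experiment.log_metric('valid_loss', float(last_metrics[0]))
            for i, value in enumerate(last_metrics[1:]):
                self.experiment.log_metric('metric_{}'.format(i), float(value))

Usage is just learn.fit_one_cycle(5, callbacks=[TrackerMonitor(learn, experiment)]), so nothing about the training loop itself has to change.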

@fredguth
We have TensorBoard integration, so you can track your experiments in Neptune. We are currently working on functionality that lets you port your TensorBoard logs directory into Neptune. I think it is a good option to have because it keeps your logs backed up and ready to share with anyone you want. This option should be out in the next few weeks, so stay tuned! What do you think?

Before you ask: Neptune is now open and free for non-organizations.
Read more about it on the docs page to get a fuller picture.

Jovian is also nice (https://www.jvn.io/).
@aakashns, @init_27, and others (I'm saying this based on seeing the contributors on GitHub) are working on it.

Thanks for the mention @spock!

If anyone is interested, you can find the docs here (WIP).

To give a high-level overview, here's what Jovian does:

  • Allows you to share the notebook and the Anaconda env with a one-line commit, from inside a notebook cell:
import jovian
jovian.commit()

This auto-captures your environment and uploads the notebook and env, along with any checkpoints (these need to be explicitly included), to the website. Ex: Notebook

If you clone a notebook, it grabs the notebook, sets up the conda env, and downloads the checkpoints and artifacts as well.

  • Tracking Metrics:

For multiple experiments, we let you compare metrics by logging them manually; we're also adding callbacks to allow auto-capture (there's a rough sketch of the manual flow at the end of this post).

Ex: check out the fast.ai callbacks here

Example of comparing versions

  • Versions/Collaborations:

Every commit to a notebook creates a new version. We also let you compare differences between notebooks on a cell-by-cell basis, rather than the raw JSON diffs you get on GitHub.

Collaborators: You can add collaborators to your notebook; when they commit to it, a new version is created and the contributor's name gets added to the version list.

We're still in beta but hoping to launch soon. If anyone would like to try it out, do let us know whether you like it or have any feedback. Many of the design philosophies also align with what's mentioned in the lectures (2019, part 1 and part 2).
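
As a rough illustration of the manual metric logging mentioned above, the flow looks something like this (the names and numbers are made up, and the exact function names may differ from the released API):

import jovian

# record the hyperparameters you care about for this run
jovian.log_hyperparams({'arch': 'resnet34', 'lr': 1e-3, 'epochs': 5})

# ... train your model ...

# record the final metrics so different versions can be compared on the site
jovian.log_metrics({'train_loss': 0.21, 'valid_loss': 0.25, 'accuracy': 0.91})

# commit the notebook; the logged values are attached to this version
jovian.commit()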

- Sanyam

You're most welcome!