Study plan after this course?

OK, thanks…

Anyone from Bangalore interested in forming a study group to go through the DL and ML lessons?

We could have a pan-country group, or the forum will suffice…

I could copy/paste all your to-do lists, as I have the same things to work on (review the DL and ML part 1 videos, code, write…) before part 2, which I will apply to (many thanks to @jeremy, @rachel and @yinterian for the part 1 course!).

One (more) thing to do: get CUDA/cuDNN working on my Windows 10 NVIDIA GPU computer (GeForce GTX 1070) using Ubuntu, in order to run all the part 1 Jupyter notebooks locally.
Is that possible? Any online tutorial on that?

You should be able to get everything working (or at least the vast majority) directly in Windows 10 now (with the newest PyTorch). Everything you need to set up on Ubuntu is covered in the thread on setting up your own DL box (Personal DL box).
Basically:

Thanks @beacrett for your list of steps. But in fact, I would like to avoid dual-booting Windows/Linux.

I would like to run my PyTorch Jupyter notebooks on Ubuntu installed inside Windows 10, not from a true Linux partition.
I tried the conda install from peterjc123, but it did not work. I guess I’m missing steps in the installation of CUDA/cuDNN on Ubuntu-on-Windows.
What is your opinion of this package (peterjc123)?

I don’t believe the required NVIDIA and CUDA setup is supported under the Windows Ubuntu subsystem (pretty sure this is discussed in the DL box thread). As per one of Jeremy’s recent posts, you should now be able to get everything running directly in Windows if you don’t want to dual boot.

Thanks for your answer @beacrett.

I do not quite understand the difference between “I don’t believe the required NVIDIA and CUDA setup is supported under the Windows Ubuntu subsystem” and what seems to me the opposite: “you should now be able to get everything running directly in Windows”. If you have time to explain further (or link to a post answering that), I would appreciate it very much.

+1 on working on Tamil MNIST. Are you in the Bangalore area? Would love to discuss.


PS

I am here until next month. Please PM me.

To word my suggestions another way: if you want to run everything for this class in an Ubuntu environment (given that you have a Windows system), setting up a dual-boot machine or a virtual machine are both options. Using the Windows Subsystem for Linux may be problematic (in the “setting up a DL box” thread I linked above, the consensus seemed to be that it doesn’t work for this class; the issues seem to revolve around getting the GPU configured, and that was the wall I hit).

As per Jeremy’s recent post, however, you should now be able to get everything running in Windows 10 (without using the Ubuntu subsystem).
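As a first sanity check after a native Windows install (a rough sketch, assuming PyTorch was installed via conda or pip, e.g. from the peterjc123 builds that predated official Windows support), something like this reports whether the GPU is actually visible:

```python
def gpu_status():
    """Report whether PyTorch is installed and can see a CUDA GPU.

    Illustrative helper only; run it in a notebook or script after
    installing PyTorch on Windows to confirm the CUDA setup worked.
    """
    try:
        import torch
    except ImportError:
        return "PyTorch is not installed"
    return f"PyTorch {torch.__version__}, CUDA available: {torch.cuda.is_available()}"

print(gpu_status())
```

If CUDA shows up as unavailable despite an NVIDIA GPU, the usual suspects are the driver version or a CPU-only PyTorch build.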

I am interested in forming a study group…

My plan is similar to that of many forum participants…

  • Revisit the lectures
  • Work on kaggle competitions (Willing to form teams to learn together)
  • Document / Blog the results

My plan in order:

  • Review last lectures
  • Finish the Porto Seguro winning solution (show that representation learning increases performance, and write a blog post about it)
  • Submit a deep learning kernel using fastai + PyTorch to each active Kaggle competition
  • Learn about dicom brain image contouring using CNNs.

While doing all this, spend time with loved ones :slight_smile:

I hope to finish it all. Happy holidays to everyone!

It’s there already.

My plan is as follows :

  1. Carefully watch the last one and a half lectures, which I missed due to exams
  2. Work on the Taxi Trajectory Kaggle winner’s solution using PyTorch
  3. Write something related to Collaborative filtering (I have some ideas, just need time)
  4. Read a lot of blog posts/papers to prepare myself for DL part 2 :wink:

:slight_smile:

One thought I had for making the fastai lib more useful in Kaggle kernels was to upload pretrained model weights as Kaggle datasets, and then add functionality to fastai to allow loading weights from CSV files or SQLite DBs, so that we could use pretrained models on Kaggle. If anyone wants to give that a go and see if it works OK, I think it would be a cool project…
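To make the idea concrete, here is a minimal sketch of the SQLite round-trip (not fastai’s actual API; the function names and the flat-list weight format are made up for illustration). Weights are stored one row per layer, so a `.db` file could be uploaded as a Kaggle dataset and read back inside a kernel:

```python
import json
import sqlite3

def save_weights_sqlite(weights, path):
    """Store a dict of layer-name -> list-of-floats in a SQLite DB.

    Hypothetical helper: a real version would serialize actual
    model tensors, but the storage pattern is the same.
    """
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS weights (layer TEXT PRIMARY KEY, data TEXT)"
    )
    for name, values in weights.items():
        # JSON round-trips lists of floats exactly enough for this sketch
        con.execute(
            "INSERT OR REPLACE INTO weights VALUES (?, ?)",
            (name, json.dumps(values)),
        )
    con.commit()
    con.close()

def load_weights_sqlite(path):
    """Read the stored weights back into a plain dict."""
    con = sqlite3.connect(path)
    rows = con.execute("SELECT layer, data FROM weights").fetchall()
    con.close()
    return {name: json.loads(data) for name, data in rows}

w = {"conv1.weight": [0.1, -0.2, 0.3], "fc.bias": [0.0, 1.5]}
save_weights_sqlite(w, "weights.db")
print(load_weights_sqlite("weights.db") == w)  # True
```

The appeal of SQLite over CSV here is that it is a single file, keeps layer names keyed, and is readable with the Python standard library alone.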

I am more into the “paper to code” concept, which is understanding how an advanced concept from a paper gets converted into a library. I remember that in one lecture @jeremy said every efficient technique will be outdated by different upcoming approaches, so we should be in a position to understand how to convert or apply the techniques defined in papers. I have started on this, and it is quite intense.

I have a question regarding this project:

We usually store trained models, computed values, etc. in some folder (e.g. tmp). For that we need write access in the uploaded Kaggle repos. I guess the problem will be that we can’t change a Kaggle repository: we can only upload one and read from it (as far as I know). How would that work?
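One possible answer, as a sketch (the dataset path below is hypothetical): uploaded datasets in a Kaggle kernel are indeed mounted read-only under `../input`, but the kernel’s own working directory is writable, so temporary files can be redirected there rather than written next to the dataset:

```python
import os

# Uploaded Kaggle datasets are read-only, so point any cache/tmp
# output at the kernel's writable working directory instead.
DATASET_DIR = "../input/my-dataset"         # hypothetical read-only dataset
TMP_DIR = os.path.join(os.getcwd(), "tmp")  # writable scratch directory

os.makedirs(TMP_DIR, exist_ok=True)
model_path = os.path.join(TMP_DIR, "model_weights.dat")
print(os.path.isdir(TMP_DIR))  # True
```

So the library would need a way to point its tmp/models folders somewhere other than the data folder; whether fastai exposes that cleanly for this use case is exactly what the project would have to sort out.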

Hello all! Sorry I’m a bit late to this thread. I’m also interested in joining a study group and participating in some projects. Has one been started already from this conversation? Thank you! :slight_smile: