Share your work here ✅

Hi @champs.jaideep,
I took a look at your kernel and studied the paper. From what I understand, ArcFace loss might be applicable to problems other than computer vision.
Does replacing the Linear layers of other models (a text classifier, for example) with ArcMarginProduct make sense?
What do you think?

Well, I haven't tried it on models other than the classification model. It produced fantastic results for the Kaggle Humpback Whale Identification competition, which had 5,000-plus classes. You are welcome to try it out for other models. ArcFace is also useful for face recognition, since it learns deep features.
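For anyone who wants to try that swap, here is a minimal ArcMarginProduct sketch (modeled on the common public ArcFace implementations, not the exact code from the kernel above; the scale s=30 and margin m=0.5 are typical defaults, and all names here are mine):

import torch
import torch.nn as nn
import torch.nn.functional as F

class ArcMarginProduct(nn.Module):
    """Drop-in replacement for a final nn.Linear(in_features, out_features)."""
    def __init__(self, in_features, out_features, s=30.0, m=0.50):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.xavier_uniform_(self.weight)
        self.s, self.m = s, m  # feature scale and additive angular margin

    def forward(self, x, labels):
        # cosine similarity between L2-normalized embeddings and class centers
        cosine = F.linear(F.normalize(x), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))
        # add the margin m to the target class's angle only
        target = F.one_hot(labels, cosine.size(1)).bool()
        logits = torch.where(target, torch.cos(theta + self.m), cosine)
        return logits * self.s  # feed these into nn.CrossEntropyLoss

Nothing in it is image-specific: forward() just needs an embedding and the labels, so in principle it can sit on top of a text encoder as well.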

As homework for lesson 1, I have created a dataset of images of text in four languages: English, Punjabi, Hindi and Urdu.
Each image contains text in one of the four languages (some images contain text in two languages). The goal is to detect the language of the text in the images.
With what I learned from lesson 1, I was able to achieve an error rate of 0.303125 using resnet34.
Believe me, the data is very noisy. I will try to improve it with what I learn from the next lessons.

Update: 12/08/19
I applied the data cleaning taught by Jeremy in lesson 2.
With it, I was able to reduce the error rate to 0.240385.
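For anyone curious, the lesson-2 cleaning step uses the fastai v1 widgets; a minimal sketch, assuming a trained learn and the dataset path (run in a Jupyter notebook):

from fastai.vision import *
from fastai.widgets import DatasetFormatter, ImageCleaner

# surface the highest-loss (most confidently wrong) images for relabeling or deletion
ds, idxs = DatasetFormatter().from_toplosses(learn)
ImageCleaner(ds, idxs, path)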

I experimented tonight with changing the batch size, and nothing else, on a network trained on anime character faces, and the results are not what I expected. I have placed a notebook at https://github.com/Dakini/BatchSize that shows my results.

Training for 5 epochs with fit_one_cycle and a batch size of 728 got an accuracy of ~58%, while a batch size of 32 got 79%. There seemed to be no difference in training time between them either.
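For reference, a minimal sketch of that comparison (fastai v1; the dataset path and image size are my assumptions, not taken from the notebook):

from fastai.vision import *

path = Path('data/anime_faces')  # hypothetical folder, one subfolder per character

for bs in (32, 728):
    data = ImageDataBunch.from_folder(path, train='.', valid_pct=0.2, size=224, bs=bs)
    learn = cnn_learner(data, models.resnet34, metrics=accuracy)
    learn.fit_one_cycle(5)  # 5 epochs; only bs changes between runs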

This blog describes how I created a simple web app using the 102 Flower Dataset.

ShopNet: A Neural Network for Product Images (http://www.shopnet.ai/)


Trained a CNN with the amazing techniques mentioned throughout the course and deployed it as a web app. Currently at ~95% accuracy but with room for improvement.

The stats (see the code sketch after this list):

  • Training data: 26 classes, approx. 400 samples each, batch size of 16 (see current list of classes)
  • CNN learner with resnet50: cnn_learner(data, arch, metrics=accuracy, wd=0.1, ps=0.01)
  • weight decay: wd=0.1
  • dropout: ps=0.01
  • 3 cycles, 5 epochs each
  • Accuracy: 94.6%
  • Other fine-tuning techniques:
    • data augmentation (rotate, zoom, lighting, warp)
    • progressive resizing from 224 to 352
    • adjusting learning rates
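A sketch of that recipe in fastai v1 code (the path, transform magnitudes, and learning rates are assumptions on my part; only the hyperparameters listed above come from the actual run):

from fastai.vision import *

path = Path('data/products')  # hypothetical folder, one subfolder per class

def get_data(size, bs=16):
    tfms = get_transforms(max_rotate=10., max_zoom=1.1,
                          max_lighting=0.2, max_warp=0.2)  # rotate/zoom/lighting/warp
    return ImageDataBunch.from_folder(path, train='.', valid_pct=0.2,
                                      ds_tfms=tfms, size=size, bs=bs)

learn = cnn_learner(get_data(224), models.resnet50, metrics=accuracy, wd=0.1, ps=0.01)
learn.fit_one_cycle(5)                 # cycle 1 at 224px

learn.data = get_data(352)             # progressive resizing: 224 -> 352
learn.fit_one_cycle(5)                 # cycle 2 at the larger size

learn.unfreeze()
learn.fit_one_cycle(5, max_lr=slice(1e-5, 1e-4))  # cycle 3, adjusted learning rates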

Is there a way to explore PyTorch classes and functions through VSCode?

If anyone’s interested in StyleGANs I have a fast.ai implementation here.

Hi, I'm trying to follow your example to build a real application. Thanks for sharing your work! Congratulations on your medal! I don't know if fast.ai has changed since then, but it doesn't find ConvLearner, so I'm using cnn_learner as I've seen in the Lesson 1 Jupyter notebook.
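For reference, the rename boils down to this (a sketch assuming the standard lesson-1 setup, where data is an ImageDataBunch):

# learn = ConvLearner(data, models.resnet34, metrics=error_rate)  # old fastai API
learn = cnn_learner(data, models.resnet34, metrics=error_rate)    # current fastai v1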

Thanks again!

Try something like this:

command = "./ffmpeg -ss {} -i $(youtube-dl -f mp4 --get-url {}) -t {} -vn -c:a copy {}.aac".format(int(start), url, dur, i)

By doing so, ffmpeg will be in charge of everything, and with -vn the video won't be saved.
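If it helps, a runnable sketch of calling that from Python (the start, dur, url, and i values here are placeholders of mine, and it assumes ./ffmpeg and youtube-dl are available):

import subprocess

start, dur, i = 12.0, 30, 0                   # hypothetical clip offset, duration, index
url = "https://www.youtube.com/watch?v=XXXX"  # placeholder video URL

cmd = "./ffmpeg -ss {} -i $(youtube-dl -f mp4 --get-url {}) -t {} -vn -c:a copy {}.aac".format(int(start), url, dur, i)
subprocess.run(cmd, shell=True, check=True)   # shell=True so $(...) is expanded

Note that -c:a copy writing straight to .aac assumes the source audio is already AAC (true for typical YouTube mp4 streams).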

Check this thread for more info: https://github.com/ytdl-org/youtube-dl/issues/622

Hi guys, I just did the Lesson 1 homework with the cricket and baseball classification. Can anyone give me feedback on my code? Thank you!

Hi naraB, hope you are well!

Your notebook looks good; it's well laid out and easy to follow.
If I were you, I would try to deploy it somewhere so a friend can see it.

I would then do as Jeremy says: go full speed ahead and try to complete the other lessons.

There is plenty to do, as the lessons get a little more challenging.

Cheers mrfabulous1 :smiley::smiley:

Hello everyone, I want to ask: what is the better way to share a Jupyter notebook on GitHub, using a Gist or a repo? Thanks.

Hey everyone, I just wanted to share some work I’ve been doing. At my university, I have been preaching fastai to both undergraduate and graduate students. As a result, I “teach” the course there through my club: essentially I use the lessons as a base and expand from there. Through this, I’ve been able to get four research projects going for other students using the fastai library, and the professors love it. I wanted to share my lecture/meetup material, in case anyone else would find it useful. This year I made it two days a week: the first day we go over a particular type of problem (tabular, images, etc.), and the second day is focused on state-of-the-art practices with code examples, along with helpful tips/resources/functions for applying fastai and deep learning to research. If anyone wants to take a look, my notebooks are here :slight_smile:

It may look slightly disorganized; I’m still preparing for the next class this semester. I should be completely done with the new material in the next week or so.

The notebooks are all finished :slight_smile:

Hi muellerzr, hope all is well!

Thanks once again for sharing your work.

Have you got any time management tips or a specific work ethic I could learn from or emulate?
You seem to create and help so much.

Thank you.

mrfabulous1 :smiley::smiley:

Hey @mrfabulous1! Sure :slight_smile: I usually find some project where I can just get lost in it, exploring it until it frustrates me and continuing until it doesn’t. Also, trying to teach and guide others at my school has really helped me make sure I know the material, as the people I’m helping sometimes come in having never even touched Python. That takes a lot of prep work and thinking about how to steer them in the right direction.

For the past few months, I’ve also worked roughly 1-2 hours a day on smaller projects (this was before the meetup work), just exploring what some functions do, how they work, and applying them to any dataset I could find. Since most of my research is tabular, I was going through datasets from the UCI repository.

Then I’d explore pure PyTorch code from papers and try to migrate it to fastai. Sometimes this is easy, e.g. the new optimizer LessW2020 got working, where it’s a simple port of a function; other times it’s trying to pull full architectures from papers such as NTS-Net or Deep High Res. Again, only working at most 2 hours a day so I don’t get too frustrated.

I also explore the source code and lecture notebooks. Often. How does x work? Why does x work? And why does doing y break x’s code? (What did I do wrong?) Most of the time, simply tracing back what a function does answers most of my questions. As for the course notebooks, I still can’t remember how to write an image databunch from memory, so I cheat (oh no!). I try not to, and if something doesn’t quite work, the course notebooks show an example for almost any problem, so I debug there.

I write (or try to) when I can. I haven’t lately for my blog as things have been crazy, but I’ve found that writing blog posts has helped me figure out what the most important bits are from the lectures, the library, etc., and it also helps me explain things to others.

And lastly, the lectures (the actual fastai course). Honestly, I didn’t complete course v3 for four months. Why? I focused on what I needed at the time and slowly worked my way through. Doing this allowed me not to get overwhelmed right away by the super-advanced topics at the end of the course, and instead to focus on what I needed to learn and do at the time for my various tasks.

I know I said lastly, but this just came to me: also, don’t be afraid to be curious. Einstein once said, “The important thing is not to stop questioning. Curiosity has its own reason for existing.” This can come in many forms, such as feature engineering, playing around with the number of variables or classes, hyperparameter tuning, etc. Even if someone’s done it, assume their way may not be the best, and try to see if you can outthink it. Even if that somebody is yourself! :slight_smile: I had a research project where I was trying to beat a baseline with random forests. I spent two months on it and couldn’t quite do it; I always fell 1-2% short. Then a few months later I discovered a paper on feature engineering for sensors, revisited the project with my new knowledge and practices, and wound up blowing the baseline out of the water! Patience, persistence, and curiosity are everything. While I know a decent amount about the library, there is much I don’t know, and I always remember that to stay level-headed. Every day I’m learning something new just by playing around.

So basic sum-up:

  • Spend 1-2 hours a day on mini projects that I can get deep into, for a month or two at most.
  • Look over the source code and notebooks often.
  • Write blogs and lectures geared toward those who either barely know what fastai is or are just getting the basics, to make sure I know the material well enough to explain it.
  • Go through the lectures and courses slowly, relistening and rerunning the notebooks often.
  • You are your own rival. Try to outperform yourself on your projects and you will see growth.
  • Read the forum daily, even if it’s a casual browse of a topic. I may not understand something people are talking about, but I know it exists and can revisit it later if I need to.

Hope some of that helps you or others keep going :slight_smile: I’ve only been at this for 9 months now, and doing the above has helped me solidify my comprehension of the material to the point where it’s allowed me to teach and help others at a young age (I’m still 21) and has opened many research and job opportunities. It doesn’t take much to get there :slight_smile:

Hi muellerzr, thank you for providing a comprehensive reply. :smiley:

I’m happy to say I have always had a lot of perseverance, curiosity, and patience with others, though according to my partner, not with myself. I do a few of the things you mentioned, but from your reply I can see I can do a lot more. I will endeavor to add some of your tips to my repertoire.

Many Thanks mrfabulous1 :smiley::smiley:

I’ve been going through lessons 1 and 2, and I think I’ve got the ideas okay; my problem has been trying to deploy to a web app for free. I’ve tried Heroku a few times but have been having problems. I’d like to get it working, but it’s been hard; I’ll check Android next.

But anyway, the thing I’m doing is a basic art classifier that tells you the artistic movement. Here’s a sample of the dataset:

My current error_rate is down to between 2% and 5%.

Hi @LauraB. Great job!... in fact, wow, you did a LOT of work adding and trimming fastai for the Lambda. But why? :slight_smile:

I just deployed a small project (I’ll share it soon), but I didn’t have to add fastai, so I saved a lot of time there. I just exported the model to PyTorch and then used the Dockerfile from PyTorch, which had all the modules I needed (https://github.com/brunosan/iris-ai/blob/master/iris-aws-lambda/pytorch/app.py#L4). What made you need fastai? Just curious; you must have spent A LOT of time on that bit, but I don’t know why. I think the reason is that you didn’t port your model from the fastai format to the PyTorch format (explained here: https://course.fast.ai/deployment_aws_lambda.html#export-your-trained-model-and-upload-to-s3).
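For anyone reading along, the export step from that guide boils down to tracing the model into TorchScript; a minimal sketch, assuming a trained learn and a 224px input size:

import torch

learn.model.eval()
example = torch.rand(1, 3, 224, 224)            # dummy input matching the training size
traced = torch.jit.trace(learn.model, example)  # plain-PyTorch TorchScript module
traced.save('model.pt')                         # this file is what gets uploaded to S3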

Hi @brunosan

You are right that you don’t need the fast.ai library for inference if you export your model to the PyTorch format.

I wanted to see if it was possible to have the fast.ai library running on AWS lambda, and it was a good learning experience for me :slight_smile:

Laura
