Introduce yourself here

Hey, I’m Jack

My guiding aim in joining Part 2 is to eventually develop tools to help me create art. I’m a keen oil painter and I’m utterly obsessed with using Midjourney.

Looking forward to starting this journey, meeting some cool people and picking up some skills along the way. I’ve not done Part 1, so I may have to go back and do that first to cement that stuff in my head :slight_smile:


Hey everyone, I’m Aarushi!

I’m super pumped about Artificial Intelligence and all the amazing things it can do. Learning about AI is my passion, and I’m thrilled to connect with all of you here. Let’s dive into this exciting journey together and have a great time exploring the world of AI. Good luck to everyone!


Hi! My name is Daniel.

I work as a software developer and I am curious about how we learn.

I am very grateful to Jeremy, Raquel and the community for such an incredible project. Thank you very, very much!

I’ve done the first lesson of Part 1, and I’ve got what I think is a nice little insight that I would like to share with you.

Reading about the “vanishing gradient problem”, I learned that it is a problem we may run into when we train very deep models. In the backpropagation process (the mechanism by which the network learns from its mistakes), activations used to be passed through a squashing function called sigmoid (or its cousin tanh, which returns both positive and negative feedback), and the gradient of these functions shrinks toward zero when the input gets large.

The problem was that progress on the training was too slow: multiplying many of those tiny gradients together, layer after layer, makes the learning signal vanish. So, to solve this, people started to use a function that, instead of returning positive and negative values, only returns positive values and zeros, passing positive inputs through unchanged. This function is called ReLU.
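To make this concrete, here is a minimal sketch in plain Python (my own illustration, not something from the lesson) of why gradients vanish through stacked sigmoid layers but survive through ReLU layers:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid: at most 0.25 (at x = 0), and tiny for large |x|
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: exactly 1 for positive inputs, 0 otherwise
    return 1.0 if x > 0 else 0.0

# Backprop multiplies gradients layer by layer. Through 10 sigmoid layers,
# even in the best case (input exactly 0) the signal shrinks geometrically:
grad = 1.0
for _ in range(10):
    grad *= sigmoid_grad(0.0)
print(grad)  # 0.25**10 ≈ 9.5e-07 -- effectively vanished

# Through 10 ReLU layers with positive activations, it survives intact:
grad = 1.0
for _ in range(10):
    grad *= relu_grad(1.0)
print(grad)  # 1.0
```

In a real network the inputs are rarely exactly 0, so the sigmoid gradients are even smaller than 0.25 and the shrinking is even faster.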

And since we are neural networks ourselves…

If we want to learn small things, negative reactions to feedback may not bother us too much… because it is something easy and fast! So we just achieve our goals!

But when we want to learn large, deep subjects, negative reactions to feedback will slow us down too much!

So! If you want to go far, don’t react negatively to failure. If something doesn’t work, maybe it is not good, but it is also not bad! It is either good or not good (1 or 0), but it is never bad (-1). Keep yourself positive!

See you around!

Daniel


Hi, I am Marius. I have been an IT professional for the last 20 years, and a Materials Scientist in my previous incarnation. I have just discovered this course and am navigating my way through it. A fascinating topic, very well structured and addressed in the course material. I am looking forward to making a start.
Well done, guys, for building this course.


Hey all,
I am Shyam, and I am fascinated by the fastai course and the possibilities of neural networks. I have been a PyTorch user for 2 years, and fastai has taken away a lot of the trouble.
I am learning fastai mainly for Computer Vision tasks. Hit me up on LinkedIn if you want to learn together: https://www.linkedin.com/in/shyam-gupta-5356511aa/
Looking forward to contributing and learning from everyone here. :slight_smile:


Hello all, Tobi here. Glad to finally get a handle on the nitty-gritty of Stable Diffusion; can’t wait to explore all that’s possible with it :).
Follow my code here: BetikuOluwatobi (Betiku Oluwatobi) · GitHub
Hit me up on Linkedin: Oluwatobi Betiku - Volunteer - AIA | LinkedIn


Hey! I’m Jay!

I’m a Data Science Software Developer. I was first introduced to AI back in 2017, and from there it became my career. I have worked on Computer Vision, Machine Learning, NLP and LLMs, and now I’m ready to start my journey with Diffusers.

I’m super excited to learn about this new concept, and very grateful to Jeremy, Raquel and the whole community for making both Part 1 and Part 2 of the course easy and accessible.

Reach out to me on LinkedIn if you want to learn together: https://www.linkedin.com/in/jaydeepsinh-rajput/
Looking forward to contributing and learning together!


Hi Everyone,

My name is Mike. I’m from Poland :poland: . I’m here because TF finally made me sick. More seriously, I decided to switch to PyTorch. I have forgotten where I first heard about fast.ai, but when I saw “What is torch.nn really?” in PyTorch’s docs, the name Jeremy Howard was already known to me. Maybe Jeremy wrote a tutorial about cluster training on TPUs in the Google Cloud docs? My area of ML interest is PyTorch Geometric. Great book, Jeremy, thanks for sharing your great thoughts :slight_smile:
