Introduce yourself here

Hello everyone,

I’m Wayde, a Full Stack ML Engineer from San Diego, California. I’m excited to be a part of the new Part 1 course, as it was great going through it last year with everyone, and it looks like the new and improved fastai library is going to kick ass :slight_smile:. I’m hoping to use the things we learn over the coming months toward deploying ML applications for personal projects and (at some point soon, hopefully) for an employer.

If anyone is hiring, I’m looking for work! Check out my personal website, which has links to all my socials and some write-ups.

Anyways, I’m excited to work with everyone again this year and to really learn how to leverage the power that is practical deep learning with Fastai!


Yes you can. You will miss out on the opportunity to ask questions during the class, of course. I’d strongly suggest trying to watch it no later than the next day, because the conversation online moves fast!


Hi, my name is Matthew. I live in the US and I aspire to be useful in the DL field…someday. In the meantime, I’ll keep my day job as an institutional investor.


Hi everyone.

My name is John Garcia, from Colombia.

I am an information / cybersecurity consultant that started to learn ML/DL four months ago.

I recently took the 2019 Part 1 course and learned a lot in a short time, mainly because the library is so great that people like me, who last coded 15 years ago at university, can implement things from the first week!

My main topic of interest is trustworthy AI, so I am learning about fairness, accountability, and explainable AI, and of course using my domain expertise in privacy and cybersecurity to deal with these kinds of risks. I would love to help companies and people build trustworthy AI systems.

I am thinking about how to use v2 to integrate third-party tools that deal with the topics above, and also as a tool for evaluating metrics to detect bias and create explainable systems.


Hi, everyone –

My name is Andrew Nguyen, currently working as an ML Engineer, managing infrastructure and orchestration for production models on healthcare data.

I started the courses almost two years ago, and in that time, I went from being a novice programmer and researcher to landing a spot in Fellowship.AI, and now I work with Launchpad.AI and their clients on AI products, thanks in large part to the course.

I’m always excited about the unconventional, creative thinking that leads to big wins in the AI space, and the advances in the field now are as exciting as ever!


My name is Matanya (@HananMatanya). I work as an NLP developer at a startup called DigitalOwl (while completing my BSc). At the company, we process medical documents in the insurance field.
I owe a lot to this community!

  1. The project that got me hired was Hebrew ULMFiT.
  2. I created a learning group at my place of residence on v3 part 1. The group completed the course successfully!
  3. I think v3 part 2 is one of the best practical Deep Learning courses on the web today. A lot of my work environment is built on this course. It lets me think through and run almost any experiment I want.
    Anyone who deals with advanced NLP is welcome to contact me :slight_smile:
  4. To decentralize our experiments we use FastEc2.

I was very happy to receive the email and very excited about the course.


Hello everyone,

Bhoomit (@bhoomitt)
Work @ GoIbibo/MakeMyTrip, India
Building an internal Chatbot Platform
Using ULMFiT & fastai in production for intent classification for about a year :slight_smile:

Learned most of what I know about NLP with V1. Looking forward to V4.


I’m Butch Landingin, a software developer from the Philippines.

I first started my ML journey years ago with Andrew Ng’s ML course on Coursera, which, in contrast to the fastai approach, uses a bottom-up approach: starting with matrix algebra and building up to more advanced concepts like forward propagation, back propagation, and gradient descent.
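For anyone curious what those bottom-up building blocks look like in practice, here is a minimal sketch (my own illustration, not code from either course): a tiny linear model trained with a hand-written forward pass, hand-derived gradients, and plain gradient descent.

```python
def train_linear(xs, ys, lr=0.05, epochs=2000):
    """Fit y = w*x + b by minimizing mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            pred = w * x + b            # forward propagation
            err = pred - y
            grad_w += 2 * err * x / n   # back propagation (chain rule by hand)
            grad_b += 2 * err / n
        w -= lr * grad_w                # gradient descent step
        b -= lr * grad_b
    return w, b

# Recover y = 2x + 1 from noiseless samples; converges toward w ≈ 2, b ≈ 1
w, b = train_linear([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
```

The fastai course inverts this: you start from a working high-level model and only later peel back to loops like the one above.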

While Andrew is great at explaining complicated mathematical concepts and making them easy to understand, it’s a hard course to master, especially when you want to start applying deep learning to real-world problems.

In contrast, when I first took the fastai course a year ago, I was very excited to be able to apply what I learned right away (and get good results as well!) just after the first few lessons.

I am excited to join this next iteration of the course and its improved fastai v2 library and would like to thank Jeremy, Rachel and Sylvain for building a fantastic course, a great library and an inclusive community!

– Butch
twitter : @butchland


Hi! My name is Andrea Panizza. I’m a Senior Staff Data Scientist at Baker Hughes, and on Twitter I’m @unsorsodicorda. I got interested in Statistics years ago while working as an Aero Design Engineer for turbomachinery, then I started learning about Bayesian Inference, Statistical Learning… and finally in 2016 I got on the Deep Learning train :grinning:. As I’m mostly self-taught in these topics, I followed quite a few MOOCs in the past.
I wanted to go through the MOOC again, and this time I tried to start a study group in my hometown, without success. So I’m going to do it the classic way, i.e., with you folks as my wonderful remote fellow students!

BTW, let me thank @jeremy and all the people not only for this great library, but also for:

  1. an excellent getting-started guide to GCP
  2. finally convincing me to start my blog :grinning:. It’s still in beta, so to speak, but after long procrastination, Jeremy’s excellent template for GitHub Pages got me going!

:wave: My name is Vova, I’m from St Petersburg, Russia.

I’ve worked as a software engineer for a long time (since ~2004) and have never been involved in ML/DL in my day job. Still, I’m enthusiastic about learning new stuff in general, and ML in particular.

Last year I enjoyed the Swift for TensorFlow part of the course and even took part in its development by creating the SwiftCV library, contributing to S4TF itself, and doing a fun project with S4TF.

This year, somewhat to my own surprise, I became a PyTorch/OpenMined grant recipient, working on a Federated Learning open-source project with the PySyft team! PySyft is a privacy-preserving deep learning framework based on PyTorch.

Privacy is a relatively new (yet important) thing in DL, and I’d love to hear more on it in the new 2020 part, @jeremy :slight_smile:

My main takeaway from the course is that you don’t have to have a PhD to do things in ML. Thank you for that!
Feel free to connect with me on GitHub or LinkedIn.


Hi everyone,

I am Ariel and live in Austin, Texas. I have been a huge fan of fastai and the people behind it: Rachel, Jeremy, Sylvain, and all the contributors. Not only because they are bringing all of us forward in this field, but also because they bring up important topics such as bias and how to avoid misusing AI.

I am a Software Engineer working on AI and have deployed text classifiers in production using fastai. I have been lucky enough to attend the past few live classes remotely, and I am grateful for the opportunity to do it again in 2020.

Looking forward to learning from all of you and to continue my journey in this amazing and exciting field. I am also looking forward to the release of the book.



Absolutely, this sounds great!


Hi! My name is Ravi Vijayakumar. I’m an IC Design Engineer at Broadcom, and on Twitter I’m @hiphopswami. I got interested in Deep Learning four years ago and have been a student of fastai from the beginning. I am still awe-struck at the foresight Jeremy and Rachel showed in knowing the impact Deep Learning would have, their generosity in opening up all their experience to anyone willing to put in the hours, and their starting such a fantastic community. I have been putting in the hours but haven’t been consistent. I plan to throw in the kitchen sink this time and look forward to participating actively in the forums.

I am passionate about applying deep learning to analytical problems that mix Deep Learning and heuristics together. Works like Reinforcement Learning Driven Heuristic Optimization and GAP: Generalizable Approximate Graph Partitioning Framework seem to be making strides in this area, and I hope to jump on the bandwagon. I would love to collaborate with anyone with similar interests.


I’m Francesco, father of two and co-founder of two Data-related companies in Italy.
I have been following the course for a couple of iterations.
Together with my colleague @pietro.latorre, we created the ULMFiT model for Italian, and some other prototypes!


I’m a software engineer turned ML practitioner turned startup founder based in Bengaluru, India. I’m currently building a platform for sharing and collaborating on Jupyter notebooks.

I’m lucky to have attended the last 4 iterations of FastAI (Part 1 & 2) live online, and I recommend it to pretty much everyone I meet, as the best place to start learning ML :sweat_smile:. It’s a privilege to be part of the FastAI community. I look forward to organizing a FastAI study group in Bengaluru, as we do for every iteration of the course. I blog infrequently about FastAI, PyTorch & data science on Medium.

EDIT: I’m @aakashns on Twitter.


Yeah! Rock’n’roll!


I’m James Briggs. I’m a data scientist and software engineer from the UK. I studied computational physics at university. I got a little interested in ML back in 2008 when I tried to read the late David MacKay’s Information Theory, Inference, and Learning Algorithms. It’s a beautiful book, but very math-heavy, so it was hard to get inspired and see applications. I learned about basic neural nets like Hopfield nets at that time too. (Back then Wikipedia said something like “neural networks have some limited applications, but they are largely a curiosity and likely won’t be part of the solution of AI.” Hehe)

I got into ML properly through Andrew Ng’s course in 2013. I tried to do things like cats vs dogs and Galaxy Zoo with classical methods, but gave up because the results sucked.

I struggled to learn deep learning before starting fastai last year. I’d been on an NVIDIA weekend course on DL and TensorFlow, and at the end I felt that I would have to study for years to get good at this stuff. The course basically taught us some TensorFlow, and by the end we could code a CNN to solve MNIST from scratch… yay!

Fastai was a revelation. I’d been learning things backwards all along! My old cats vs dogs problem was solved on the first day; it was almost too easy. I immediately started using what I was learning at work, especially tabular learning. I even wrote up my own notes on my blog and taught my colleagues about what we could now do.

I’ve been fairly active on the forums, asking questions and helping people. I also made a small pull request to fastai v1 a few weeks after starting, fixing a bug I found in how the Category datatype worked with Python dictionaries. I also write a blog on GitHub Pages, where I have my personal lesson notes and project write-ups.
Twitter: jimypbr


I’d be very happy to see the early drafts - Jeremy. I’m not directly on the Azure ML/AI side, but we have some internal discussions started up.

Hello Everyone,

I’m Nikky. I studied image processing and computer vision, focusing on image generation. I’ve gone through Part 1 v3 and part of Part 2 v2. I’m looking forward to doing more work with this version and improving my Python skills, especially by writing more blog posts reimplementing interesting papers. One of my current favorites is SinGAN, the ICCV 2019 best paper.


Very interesting! Can you recommend some literature on these topics?