Hi @Daniel, I am glad you like Excel too. If Jeremy wants to share his Excel tips, I would love to attend. BTW, Excel now supports the LAMBDA function, which makes it even more powerful.
I know you have great attention to detail, so I tried to answer in as much detail as possible. Please feel free to ask further questions if anything is unclear to you.
Coming from an accounting background, Excel is my universal tool, with strong applications in forecasting and scenario modelling. When I came across fast.ai, Jeremy used Keras at first, then moved to TensorFlow. In the middle of the course, he changed again to PyTorch. For someone who had only learnt Python recently, it was hard to cope with the different frameworks. So, I focused on learning the underlying concepts. Jeremy used Excel to explain softmax (maths), cross-entropy (maths), gradient descent with different variations (maths and the Solver add-on), convolution (visualisation), and recommender systems (matrix multiplication and the Solver add-on). So, I could follow along with Part 1 in 2017.
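For anyone who prefers code to spreadsheet cells, here is a minimal plain-Python sketch of softmax and cross-entropy (my own illustration, not Jeremy's spreadsheet; the function names are mine):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability, then normalise
    # the exponentials so the outputs sum to 1 (a probability distribution).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, target_index):
    # Negative log-probability assigned to the true class:
    # low when the model is confident and correct, high otherwise.
    return -math.log(probs[target_index])

probs = softmax([2.0, 1.0, 0.1])
loss = cross_entropy(probs, target_index=0)
```

You can replicate exactly this in a spreadsheet row: one column of EXP cells, a SUM, a division, and a -LN on the true class's probability.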
I am a visual person. I need to “see” before I can absorb new information/concepts. I found “dropout” very unintuitive. WHY would we spend all that time training a model (much slower and more expensive to train back then) only to delete some of the features/activations the model just learnt??? Yet, by applying dropout, the model generalises better!? I couldn’t process this concept in my head.
So, I did the visualisation (note: Jeremy explained the detailed operation in Lesson 8 at 1:08:42 a few days ago). All of a sudden, I GOT IT!!! (For those who don’t have Excel, all the files were previously converted into Google Sheets.)
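If it helps anyone else, the same operation fits in a few lines of plain Python. This is a sketch of "inverted" dropout (the variant PyTorch uses; the names here are my own):

```python
import random

def dropout(activations, p=0.5, training=True):
    # Inverted dropout: during training, zero each activation with
    # probability p and scale the survivors by 1/(1-p), so the
    # expected sum of the layer's output is unchanged.
    # At inference time (training=False), it is a no-op.
    if not training or p == 0:
        return list(activations)
    return [0.0 if random.random() < p else a / (1 - p)
            for a in activations]
```

In a spreadsheet you can see the same thing: one row of activations, one row of random 0/1 masks, and a scaled product, which is exactly the visualisation I built.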
Attending Part 2 in Spring 2018 was a big stretch for me. Reading ML research papers, full of maths notation, was intimidating. Again, I tried to learn the concept and immediately fell back into my Excel comfort zone. I managed to reproduce the focal loss graph in Excel first and then reproduce it again in Python. So, I learned it twice. (I just realised this helps flatten my forgetting curve.) While I was running (and waiting impatiently for) the Part 2 notebooks, I kept using Excel to understand/experiment with the following concepts:
- Gradient explosion for CycleGAN
- Wasserstein GAN (comparing L1 and L2 losses)
- Artifacts when up-sampling images using different methods and kernel sizes
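For the focal loss graph mentioned above, the curve you plot (in Excel or Python) comes from one small formula. A minimal sketch, using the standard form from the Focal Loss paper for the positive class (the function name is mine):

```python
import math

def focal_loss(p, gamma=2.0):
    # Focal loss for the true class with predicted probability p:
    #   FL(p) = -(1 - p)^gamma * log(p)
    # gamma = 0 recovers plain cross-entropy; larger gamma shrinks the
    # loss on easy, well-classified examples (p close to 1), so training
    # focuses on the hard ones.
    return -((1 - p) ** gamma) * math.log(p)
```

Plotting `focal_loss(p, gamma)` for p from 0.01 to 0.99 at a few gamma values reproduces the familiar family of curves, which is exactly the graph I rebuilt in both tools.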
If you are interested, here is the repo. Feedback is welcome.
Over the years, deep learning frameworks and libraries have come to do most of the heavy lifting for us. We don’t even need to fine-tune a cats-and-dogs classifier anymore. Knowing the impact of, and reasons for, picking certain parameters/loss functions is far more important.
How useful, or unexpectedly useful, has Excel been for you?
Additionally, I use Excel extensively for project management (general projects or even deep learning projects) in my corporate career. I use it to:
- develop checklists based on the concept of Work Breakdown Structure
- keep brief minutes that contain decisions and actions only (a tab for each meeting, so I can follow up on action items every meeting and hold my team accountable for their tasks)
- keep track of project deadlines, milestones and leaves
- keep a data collection register (since we needed to collect our own ground-truth dataset)
- explore best visualisation options (much easier to change chart types in Excel than in Python)
- mock up model specifications (broken down into input, process, and output) to avoid misunderstanding, and reuse the predefined output for User Acceptance Testing later. (Very important for system customisation projects, to ensure they are delivered on time and on budget)
I successfully applied the above with a multi-disciplinary team, located in four different time zones, to deliver a deep learning project: using computer vision for digital pathology. Last year, my team published the findings in Nature’s Scientific Reports. Most of the techniques we used were covered in Part 1. But applying them to real problems and executing within limited resources is still challenging.
In summary, if Jeremy had not used Excel in his teaching, I would not have contemplated learning deep learning at all. Without fastai, I might still be using Excel and working in Finance/Accounting. But now, fastai has opened up a whole new world for me to explore.
PS. Thank you for all your detailed notes. They are very helpful. 深度碎片, keep up the good work!