Jeremy mentions (in the video for Lesson 1) that the design of fastai v2 was inspired by APL and Lisp. I would love to know what those design/implementation decisions were (specifically for APL).
APL (A Programming Language) was designed to be mathematical, so it centred on operations on whole vectors and arrays. It used mathematical symbols, so it required a special keyboard with its own symbol set. Jeremy said it was the broadcasting element which was the important part of the language. For example, you might use it to calculate eigenvectors. It was used in the 1980s to provide user tools; for example, A Departmental Reporting System (IBM) was written in APL.
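As a rough illustration (a sketch in NumPy, not APL itself), the whole-array style mentioned above, including broadcasting and eigen-decomposition, looks like this; all names here are my own choices for the example:

```python
import numpy as np

# APL-style whole-array operations: no explicit loops.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
b = np.array([1.0, -1.0])

# Broadcasting: b is added to every row of A in one expression.
shifted = A + b

# Eigen-decomposition is likewise a single whole-array call.
eigvals, eigvecs = np.linalg.eig(A)
```

This is the same "solution over mechanics" flavour: the code states what is computed, and the library handles the element-wise mechanics.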
LISP (jokingly expanded as Lots of Irrelevant Silly Parentheses) was a popular language for teaching students about AI in the 1970s.
Both were designed to focus on the solution rather than on the mechanics of third-generation languages such as COBOL and ALGOL.
That said, it is important to distinguish language concepts from a particular implementation, because the same concepts could still be coded in assembler.
Thanks! I should have given more detail. I’ve worked in Common Lisp and am familiar with APL (side note: there is this wonderful paper which excites me: https://dl.acm.org/doi/pdf/10.1145/3315454.3329960).
I was curious about its impact on the implementation of the fastai v2 API design (e.g. function signatures, classes, etc.).
Yes, APL is a natural fit for the Wx+b of deep learning. I conducted a thought exercise while I was washing the dinnerware. I suppose the big question is how APL would generate the code to run on the GPU: you could generate either CUDA or C-like programs that the GPU supports. It has been nearly 40 years since I used APL, but even then APIs existed to read external files.

In Part 2 last year (2019) we looked at using Swift to replace Python; it might be an interesting amusement to use APL. I think the class is just a consequence of everybody being forced to learn C++ until, thank goodness, C# came along (C++ minus the bad bits). I must admit I must have missed when Jeremy said Lisp, although Chris Lattner did mention FORTRAN last year. It would be interesting if Jeremy or Sylvain replied.
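To make the Wx+b point concrete, here is a minimal NumPy sketch (my own hypothetical example, with made-up shapes and names): the core of a dense layer written as one array expression, much as APL would express it.

```python
import numpy as np

# A tiny "dense layer": the Wx+b at the heart of deep learning.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # weights: 4 outputs, 3 inputs
b = np.zeros(4)                   # bias vector
x = rng.standard_normal(3)        # one input vector

# Matrix product plus broadcast bias, in a single expression.
y = W @ x + b
```

An array-language compiler's job would then be lowering exactly this kind of expression to GPU kernels.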
That’s wonderful that you have coded in APL. I’m going through J right now to learn the syntax. I’m hoping to take that as inspiration and make a Python library (though NumPy is said to be inspired by APL).
Dyalog APL has something to say on GPUs (https://www.dyalog.com/uploads/conference/dyalog17/presentations/U04_APL_on_GPUs_Progress_Report.pdf).
Yes, let’s wait for Jeremy or Sylvain to reply.