My interest in neural networks began in 1975, when I entered kindergarten already reading the newspaper and first learned about cyborgs from the government. In 1988, when my University of Washington adviser asked me to choose a major, I said cyborgs, and then had to explain the idea to him as remote-controlled people. The school didn't offer any classes like that back then, but 25 years later it published its EEG brain-to-brain interface, allowing one person's thoughts to control another person's actions. In the meantime, although the Department of Defense ordered me in February 1997 to stand down when I tried to discuss cyborgs, my concerns were heeded: the news blackout covering Dolly the sheep was lifted, cloning became newsworthy, and legislation followed. Eventually, in 2005, my paper was credited as the basis for the Wikipedia article about brain implants.
Just before skeletons from the closet so rudely forced me out of school, I was going to port my professor's neural-network music-conducting gaming glove interface from Mac to Windows. Over the years I kept an eye on the technology, and when Stanford published its lectures on stochastic gradient descent, the process just screamed for improvement; it still does: the fast.ai engine requires eyeballing a loss plot to pick the learning rate where the loss drops most steeply, a step that could surely be automated by a programmer versed in calculus.
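To make the automation idea concrete, here is a minimal sketch of a learning-rate "range test": run a few SGD steps at each of several candidate rates and pick the one that lowers the loss the most. This is my own illustration on a toy regression problem, not fast.ai's actual code; the names `find_lr`, `loss`, and `grad` are all hypothetical.

```python
import numpy as np

# Toy 1-D regression problem: learn w in y ~ w * x (true w = 3).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=100)

def loss(w):
    # Mean squared error of the linear model with weight w.
    return np.mean((X[:, 0] * w - y) ** 2)

def grad(w):
    # Gradient of the MSE with respect to w.
    return np.mean(2 * (X[:, 0] * w - y) * X[:, 0])

def find_lr(lrs, steps=5):
    """Hypothetical automated picker: return the candidate learning rate
    whose short SGD trial run ends with the lowest loss."""
    losses = []
    for lr in lrs:
        w = 0.0
        for _ in range(steps):
            w -= lr * grad(w)
        losses.append(loss(w))
    losses = np.array(losses)
    losses[~np.isfinite(losses)] = np.inf  # discard diverged runs
    return lrs[int(np.argmin(losses))]

candidates = np.logspace(-4, 0, 9)  # 1e-4 up to 1.0
best = find_lr(candidates)
```

Instead of eyeballing the curve, the loop scores each candidate rate directly; a fancier version could fit the slope of the loss-versus-rate curve, which is exactly the kind of calculus a motivated programmer could bolt on.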
Upon discovering fast.ai at 11pm the other night and burning through most of my free Crestle hour watching the first lecture with the clock ticking, I was hooked after the Cats v Dogs lesson, and went way overboard trying to buy a GPU system without any advice. Suffice it to say, when my baby Alien arrives on the doorstep, I'm sending it back to planet Dell. Otherwise, I've made it my mission to find the best deep learning computer build for the starving student, and along the way, hopefully to find folks to build them. Word on the street is that computer salespeople are being asked more and more about AI systems than about gaming.
(I see three local minima in the pictured data.)
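As a sketch of why those local minima matter (my own toy example, not the pictured data): plain gradient descent lands in a different minimum depending on where it starts. The polynomial below has exactly three local minima, near x = -1.69, 0, and 1.69.

```python
# f(x) = x^6 - 5x^4 + 4x^2 has three local minima; gradient descent
# from three different starting points finds a different one each time.

def f_prime(x):
    # Derivative of f(x) = x^6 - 5x^4 + 4x^2.
    return 6 * x**5 - 20 * x**3 + 8 * x

def descend(x, lr=0.01, steps=500):
    # Plain gradient descent from starting point x.
    for _ in range(steps):
        x -= lr * f_prime(x)
    return x

# Each start stays in its own basin of attraction.
minima = [round(descend(start), 2) for start in (-2.0, 0.1, 2.0)]
```

Which minimum you reach is decided entirely by the starting point, which is why a plot with three dips is worth pausing over.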