Hi Jeremy,
I’ve been watching the 2019 Deep Learning course and was inspired by the example you gave of the gentleman who contributed code, especially because I’ve been working on something one of the students asked about: automatically finding the learning rate. I developed an AutoML hyperparameter-search algorithm that has been beating random search and skopt’s Bayesian optimization. So far I have only tested it on random forests and ExtraTreesRegressor, with a simplified version of the Rossmann Kaggle competition as my dataset.

I think it would be interesting to try it on a neural network. Would you have a small toy problem and dataset we could use to test it? (I have a first-generation i7 with 24 GB of memory and no graphics card.) I’m using pattern search, the method MATLAB uses for finding global optima. My code is in Python 3.6 and I have adapted it to the scikit-learn framework, so you can call it just like you call RandomizedSearchCV. Is there anyone I could show the notebook to, to see whether it would be something interesting to incorporate into fast.ai?
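In case it helps to see what I mean by pattern search in a scikit-learn setting, here is a toy, self-contained sketch of the idea: a simple compass search over two random-forest hyperparameters, using cross-validated MSE as the objective. The starting point, parameter ranges, step sizes, and evaluation budget here are just illustrative placeholders, not my actual implementation:

```python
# Toy compass/pattern search over two RandomForestRegressor hyperparameters.
# Illustrative sketch only -- parameter choices and budget are placeholders.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

def objective(params):
    """Cross-validated MSE (lower is better) for a given (n_estimators, max_depth)."""
    n_estimators = max(int(round(params[0])), 1)
    max_depth = max(int(round(params[1])), 1)
    model = RandomForestRegressor(n_estimators=n_estimators,
                                  max_depth=max_depth,
                                  random_state=0)
    # cross_val_score returns negative MSE, so negate it to get a loss to minimize.
    return -cross_val_score(model, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()

point = np.array([50.0, 5.0])    # starting guess: n_estimators, max_depth
steps = np.array([50.0, 4.0])    # initial step size in each dimension
best = objective(point)

for _ in range(10):              # fixed evaluation budget
    improved = False
    for i in range(len(point)):
        for direction in (+1, -1):
            candidate = point.copy()
            candidate[i] += direction * steps[i]
            score = objective(candidate)
            if score < best:
                best, point, improved = score, candidate, True
    if not improved:
        steps /= 2.0             # contract the pattern when no neighbor improves

print("best params:", point, "CV MSE:", best)
```

My real code wraps this kind of search behind a RandomizedSearchCV-style interface rather than a bare loop like the one above, which is why it drops into existing scikit-learn pipelines.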
Best regards,
Rodrigo.