I am currently in the final year of my B.E. in Computer Engineering and have to submit a project, for which I need some ideas. I have about 4-5 months to complete it. I want it to be in deep learning or machine learning (preferably deep learning). Need some help.
My knowledge about this field:
I have watched Part 1 of the fast.ai course (v2), taken part in some older Kaggle competitions, and I know enough Python to use ML and DL libraries.
Thanks in advance!
A few ideas that I want to explore but are ill-formed (for now):
Alternatives to backpropagation: In particular, one could look at second-order methods (Hessian-free optimization in particular). And while something like simulated annealing would likely perform far worse than backprop as a training method, it would still be interesting to run the experiments, or even to take an already-trained network and apply a few iterations of simulated annealing to "settle" it into the minimum.
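The "settling" experiment could start from something as small as this. A minimal sketch in NumPy, assuming a tiny tanh network on toy data as a stand-in for a trained model; the cooling schedule, step size, and acceptance rule are all illustrative choices, not a prescribed setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x)
X = np.linspace(-3, 3, 64).reshape(-1, 1)
y = np.sin(X)

def forward(params, X):
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def loss(params):
    return float(np.mean((forward(params, X) - y) ** 2))

# Stand-in for a network trained with backprop: in the real experiment
# these weights would come from an ordinary training run.
params = [rng.normal(0, 0.5, s) for s in [(1, 8), (8,), (8, 1), (1,)]]

def anneal(params, steps=2000, T0=1e-3, sigma=1e-2):
    """Metropolis-style simulated annealing on the network weights."""
    current = [p.copy() for p in params]
    cur_loss = loss(current)
    best, best_loss = [p.copy() for p in current], cur_loss
    for t in range(steps):
        T = T0 * (1 - t / steps) + 1e-9            # linear cooling schedule
        proposal = [p + rng.normal(0, sigma, p.shape) for p in current]
        prop_loss = loss(proposal)
        # Accept downhill moves always, uphill moves with Boltzmann probability
        if prop_loss < cur_loss or rng.random() < np.exp((cur_loss - prop_loss) / T):
            current, cur_loss = proposal, prop_loss
            if cur_loss < best_loss:
                best, best_loss = [p.copy() for p in current], cur_loss
    return best, best_loss

settled, final_loss = anneal(params)
print(f"loss before annealing: {loss(params):.4f}")
print(f"loss after annealing:  {final_loss:.4f}")
```

Keeping the best-so-far weights (rather than the last accepted state) guarantees the experiment never reports a result worse than the starting network, which makes before/after comparisons cleaner.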
Exploration of the loss landscape: Train a neural network normally, but record the weights at each iteration. At the end, study the path the weights took to reach the final solution, for example by computing the cosine similarity between the weight vector at each update and the final weight vector. A second idea is to perturb the initial random weights with small amounts of noise and study how sensitive the trajectory is to initialization.
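The trajectory-tracking part of this idea can be prototyped in a few lines. A hedged sketch using plain gradient descent on logistic regression, just to show the bookkeeping; the model, data, and step size are illustrative assumptions, and in the real project the snapshots would come from a neural network's training loop:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary classification data with a known separating direction
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = rng.normal(size=5) * 0.01
lr, steps = 0.5, 200
trajectory = [w.copy()]              # snapshot of the weights at every update

for _ in range(steps):
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)   # mean logistic-loss gradient
    w -= lr * grad
    trajectory.append(w.copy())

final = trajectory[-1]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Cosine similarity of each intermediate weight vector to the final one
sims = [cosine(snap, final) for snap in trajectory]
print(f"cosine(init, final) = {sims[0]:.3f}")
print(f"cosine(mid,  final) = {sims[steps // 2]:.3f}")
print(f"cosine(last, final) = {sims[-1]:.3f}")
```

The same loop extends naturally to the second idea: rerun it with `w` perturbed by small Gaussian noise at initialization and compare the resulting `sims` curves (or final weight vectors) across runs to measure sensitivity to initialization.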