I was trying to train a character-level RNN on some Python scripts to see how it learns the structure of the code. Over time it does pick up the syntactic structures, but I think it would work better if we trained it on an AST instead, because then, instead of wasting capacity figuring out the next character, the RNN could learn the token sequence directly.
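To make the idea concrete, here is a minimal sketch of how you could flatten Python source into a sequence of AST node-type names using the stdlib `ast` module; the function name `ast_token_sequence` is just my own, and training on node types (rather than a richer tokenization) is one of several possible choices:

```python
import ast

def ast_token_sequence(source):
    """Flatten Python source into a sequence of AST node type names.

    An RNN could be trained on this sequence instead of raw characters,
    so it never has to learn low-level syntax like spelling out keywords.
    """
    tree = ast.parse(source)
    return [type(node).__name__ for node in ast.walk(tree)]

# A tiny script becomes a short sequence of structural tokens
tokens = ast_token_sequence("x = 1\nprint(x + 2)")
print(tokens)
```

One design note: `ast.walk` visits nodes breadth-first and drops ordering information you might want for generation, so a real dataset would probably use a depth-first traversal that also emits "end of node" markers so the tree can be reconstructed.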
Another wild thought: is there a way to add logic to a NN as well, like building in some kind of intelligence?
What a coincidence… I was thinking of the same thing… I think it could open up a totally new way to do Deep Learning…
Yes, you can change the generator code to incorporate a prior - e.g. only sample tokens that are syntactically legal at that point.
This is a cool idea. Maybe it would fit well with genetic programming or other evolutionary techniques, as a more informed search or mutation operator?
That might be pretty cool. Not sure I’ve seen that tried.