Does anyone have experience with this?
I have one dataset with about 1 million rows and another with 210,000.
I have successfully built a three-dense-layer model (not counting the embedding layers, of which I have many) that does a really good job of predicting my 16 continuous outputs.
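For reference, the existing model is roughly this shape. This is just a Keras sketch, and the input names, cardinalities, and layer widths are placeholders since my real feature set is wider:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Placeholder inputs: in reality there are many categorical columns,
# each with its own embedding. Swap in your real shapes/cardinalities.
cat_in = layers.Input(shape=(1,), name="cat_feature")        # one categorical input
num_in = layers.Input(shape=(20,), name="numeric_features")  # continuous inputs

emb = layers.Embedding(input_dim=1000, output_dim=16)(cat_in)
emb = layers.Flatten()(emb)

# Three dense layers on top of the concatenated embeddings + numerics
x = layers.Concatenate()([emb, num_in])
x = layers.Dense(256, activation="relu")(x)
x = layers.Dense(128, activation="relu")(x)
x = layers.Dense(64, activation="relu")(x)
out = layers.Dense(16, activation="linear", name="outputs")(x)  # 16 continuous targets

model = Model([cat_in, num_in], out)
model.compile(optimizer="adam", loss="mse")
```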
I’m considering three approaches (rough sketches of the first two after the list):
- Add 2-3 dense layers to the end of the existing model
- Retrain the existing model with a very low learning rate
- Don’t retrain at all and push the 210,000-row dataset through the existing model with no changes
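Here's what I mean by the first two options, as Keras sketches. I'm assuming option 1 means freezing the existing weights and training only the new head; `new_X`/`new_y` stand in for the 210,000-row features and targets, and the widths and learning rates are placeholders:

```python
from tensorflow.keras import layers, Model, optimizers

# Option 1: freeze the trained model, bolt 2-3 new dense layers onto its
# output, and train only that new head on the 210k-row dataset.
model.trainable = False
x = layers.Dense(64, activation="relu")(model.output)
x = layers.Dense(32, activation="relu")(x)
new_out = layers.Dense(16, activation="linear")(x)
head_model = Model(model.inputs, new_out)
head_model.compile(optimizer=optimizers.Adam(1e-3), loss="mse")
# head_model.fit(new_X, new_y, epochs=10, validation_split=0.1)

# Option 2: fine-tune the whole original model with a very low learning
# rate, so the 210k rows nudge rather than overwrite the learned weights.
model.trainable = True
model.compile(optimizer=optimizers.Adam(1e-5), loss="mse")  # ~100x lower than default
# model.fit(new_X, new_y, epochs=5, validation_split=0.1)
```

Option 3 is just `model.predict(new_X)` with no training at all.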
I’m open to suggestions or ideas. Thanks, guys.