Hi friends,
I’m running different experiments with a neural net I built from scratch in Python using Kaggle notebooks. Everything goes well, but once the size of the input data and/or the number of hidden units grows beyond a certain point, I start receiving these warnings:
/opt/conda/lib/python3.6/site-packages/ipykernel_launcher.py:13: RuntimeWarning: overflow encountered in exp
del sys.path[0]
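
For reference, the warning is easy to reproduce standalone (this isn’t my training code, just a demonstration), since float64 overflows once the exponent passes roughly 709:

import numpy as np

# exp(1000) exceeds the float64 range, so NumPy warns and returns inf
x = np.array([-1000.0])
print(np.exp(-x))  # RuntimeWarning: overflow encountered in exp -> [inf]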
The only part of my code where I use exp is in the sigmoid:
def sigmoid(x):
    return 1.0 / (1 + np.exp(-x))
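
One workaround I’ve been considering (just a sketch, not in my code yet; the ±500 bound is an arbitrary choice of mine) is clipping the input so exp can never overflow:

def stable_sigmoid(x):
    # sigmoid saturates to 0 or 1 long before |x| reaches 500,
    # so clipping should not change the output in any meaningful way
    x = np.clip(x, -500, 500)
    return 1.0 / (1 + np.exp(-x))

I’ve also seen scipy.special.expit mentioned as a ready-made numerically safe sigmoid, but I don’t know whether either of these is the right fix or just hides a deeper problem.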
It all works and trains fine until I reach a certain size in data/units; then the warning starts appearing.
I am standardizing the data using sklearn:
import pandas as pd
from sklearn import preprocessing

names = df.columns
scaler = preprocessing.StandardScaler()
scaled_df = scaler.fit_transform(df)
scaled_df = pd.DataFrame(scaled_df, columns=names)
and initializing the weights with:
np.random.randn(layer_size[l], layer_size[l-1]) * np.sqrt(2 / layer_size[l-1])
and the biases to 0.
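
For context, the initialization loop looks roughly like this (a sketch; layer_size is my list of layer widths, and weights/biases are illustrative names for the parameter dicts):

weights, biases = {}, {}
for l in range(1, len(layer_size)):
    # He initialization: scale by sqrt(2 / fan_in)
    weights[l] = np.random.randn(layer_size[l], layer_size[l-1]) * np.sqrt(2 / layer_size[l-1])
    biases[l] = np.zeros((layer_size[l], 1))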
Any tips or advice on how to deal with these warnings? Thanks very much!