Is there a way to programmatically tune a neural network's (NN) hyperparameters in Lasagne (something like GridSearchCV in scikit-learn), or do I have to tune them experimentally, changing one at a time? I know there are rules of thumb for choosing NN parameters, e.g. that the hidden layer size should be somewhere between the input layer size and the output layer size, etc.
Also, how do I perform cross-validation in Lasagne to measure the NN's performance? A link to any resource would be helpful.
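For reference, this is what I mean by programmatic tuning in scikit-learn: a minimal GridSearchCV sketch using scikit-learn's own MLPRegressor as a stand-in estimator (the data here is synthetic, just to keep the snippet self-contained; the parameter grid values are arbitrary):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

# Synthetic regression data: 6 inputs, 1 output, as in my problem
rng = np.random.RandomState(0)
X = rng.rand(200, 6)
y = X.sum(axis=1) + 0.1 * rng.randn(200)

# Grid over hidden-layer size and learning rate; GridSearchCV
# cross-validates every combination and keeps the best one.
param_grid = {
    'hidden_layer_sizes': [(5,), (10,), (17,)],
    'learning_rate_init': [0.01, 0.1],
}
search = GridSearchCV(MLPRegressor(max_iter=500, random_state=0),
                      param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

Ideally I would like the same pattern to work with a Lasagne-based network.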
Below is my Lasagne implementation of the NN, using nolearn's NeuralNet wrapper (number of inputs = 6, number of outputs = 1):
import numpy as np
import pandas as pd
from sklearn.utils import shuffle
from sklearn.model_selection import train_test_split

X = pd.read_csv('....\Full_Data.csv')
Y = X.pop("Eeg")
X, Y = shuffle(X, Y, random_state=13)
X = X.round(2)
Y = Y.round(2)

# Min-max scale features and target to [0, 1]
X_min = np.min(X)
X_max = np.max(X)
Y_min = np.min(Y)
Y_max = np.max(Y)
X = (X - X_min) / (X_max - X_min)
Y = (Y - Y_min) / (Y_max - Y_min)

X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.3, random_state=10)
X_train = np.array(X_train)
y_train = np.array(y_train)
X_test = np.array(X_test)
y_test = np.array(y_test)
import lasagne
from lasagne import layers
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet

net1 = NeuralNet(
    layers=[
        ('input', layers.InputLayer),
        ('hidden', layers.DenseLayer),
        # ('hidden2', layers.DenseLayer),
        ('output', layers.DenseLayer),
    ],
    input_shape=(None, 6),
    hidden_num_units=17,
    # hidden2_num_units=100,
    output_nonlinearity=lasagne.nonlinearities.tanh,
    output_num_units=1,
    update=nesterov_momentum,
    update_learning_rate=0.01,
    update_momentum=0.9,
    regression=True,
    max_epochs=1000,
    verbose=1,
)
net1.fit(X_train, y_train)
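For the cross-validation part, my understanding (an assumption I have not verified with nolearn's NeuralNet) is that any object exposing the scikit-learn fit/predict interface can be passed to cross_val_score. A sketch with a stand-in regressor in place of NeuralNet:

```python
import numpy as np
from sklearn.model_selection import cross_val_score, KFold
from sklearn.neural_network import MLPRegressor  # stand-in for NeuralNet

# Synthetic data with the same shape as my problem (6 inputs, 1 output)
rng = np.random.RandomState(1)
X = rng.rand(150, 6)
y = X.mean(axis=1)

# 5-fold cross-validation; the score is R^2 by default for regressors
cv = KFold(n_splits=5, shuffle=True, random_state=1)
scores = cross_val_score(
    MLPRegressor(hidden_layer_sizes=(17,), max_iter=500, random_state=1),
    X, y, cv=cv)
print(scores.mean(), scores.std())
```

Would replacing the stand-in with net1 above work, or does the Lasagne/nolearn setup need something extra?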