
I found the following in the documentation of trainAutoencoder:

Indicator to rescale the input data, specified as the comma-separated pair consisting of 'ScaleData' and either true or false.

Autoencoders attempt to replicate their input at their output. For it to be possible, the range of the input data must match the range of the transfer function for the decoder. trainAutoencoder automatically scales the training data to this range when training an autoencoder. If the data was scaled while training an autoencoder, the predict, encode, and decode methods also scale the data.

How does MATLAB do this? When I ran a model like this:

hiddenSize1 = 1;
autoenc1 = trainAutoencoder(tdata, hiddenSize1, 'UseGPU',true);
factor_1 = encode(autoenc1, tdata);

I found that the mean of factor_1 is not 0 and its standard deviation is not 1. So what is the procedure for standardizing the data?

Eghbal

1 Answer

'ScaleData' does not z-score the data. As the documentation you quoted says, trainAutoencoder rescales each input feature so that its range matches the range of the decoder transfer function; with the default logsig decoder, that is a per-feature min-max rescaling to [0, 1]. That is why the encoded features do not come out with zero mean and unit standard deviation. You can reproduce the scaling manually:

[x,t] = wine_dataset;

% Train with explicit min-max scaling ('ScaleData' defaults to true)
autoenc = trainAutoencoder(x, 10, 'EncoderTransferFunction', 'logsig', 'ScaleData', true);

% What encode does internally
encoded_data1 = encode(autoenc, x);

% Reproduce the scaling manually: per-feature (per-row) min-max rescaling to [0, 1]
x_scaled = (x - min(x,[],2)) ./ (max(x,[],2) - min(x,[],2));

% Apply the encoder weights, biases, and transfer function to the scaled data
encoded_data2 = logsig(autoenc.EncoderWeights * x_scaled + autoenc.EncoderBiases);
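To confirm that this manual scaling is exactly what encode applies, you can compare the two encodings element-wise (a sketch; the tolerance value is an assumption chosen to absorb floating-point round-off):

```matlab
% Compare encode's output with the manually scaled-and-encoded data.
% The two should agree up to floating-point round-off.
difference = max(abs(encoded_data1(:) - encoded_data2(:)));
fprintf('Max elementwise difference: %g\n', difference);
assert(difference < 1e-6, 'Manual min-max scaling does not reproduce encode');
```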
Zoe