Deep learning beginner asking for help: using MATLAB's autoencoder functions I built a two-layer (stacked) autoencoder (encode - encode - decode - decode), and the reconstruction error on the training input x is larger than with a single layer (encode - decode). Is this normal, or is something wrong with my code?
The code is as follows:
% "Train" the first encoder
autoenc1 = trainAutoencoder(x, I, 'MaxEpochs', 400, 'DecoderTransferFunction', 'purelin');
% Extract the encoded data for new images using the first autoencoder.
features1 = encode(autoenc1, x);
% "Train" the second encoder
autoenc2 = trainAutoencoder(features1, 10, 'MaxEpochs', 400, 'DecoderTransferFunction', 'purelin');
% Extract the encoded data for new images using the second autoencoder.
features2 = encode(autoenc2, features1);
% Decode the encoded data from the autoencoder.
regenerated2 = decode(autoenc2, features2);
regenerated = decode(autoenc1, regenerated2);
% Calculate the reconstruction error
performance = sqrt(mse(x - regenerated));
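For an apples-to-apples comparison, the single-layer baseline could be computed the same way. A minimal sketch, assuming the same x and autoenc1 as above (predict runs one encode/decode pass through a trained autoencoder):
% For comparison: reconstruction error of the first autoencoder alone
regenerated1 = predict(autoenc1, x);          % encode + decode in a single pass
performance1 = sqrt(mse(x - regenerated1));   % same RMSE measure as above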
P.S. Please go easy on me about why I'm not using Python...