Given a matrix X with an arbitrary number of rows and columns (each column representing a feature of the dataset, since the statistics are taken per column), I want to normalize each value to (value - column mean) / (column standard deviation). I came up with the following code, which works. Can this be optimized for less computation, or is it already optimal?
mu = mean(X);             % 1-by-n row vector of column means
sigma = std(X);           % 1-by-n row vector of column standard deviations
x_ones = ones(size(X));   % all-ones matrix, same size as X
zero_mean_X = X - x_ones * diag(mu);             % subtract each column's mean
X_norm = zero_mean_X ./ (x_ones * diag(sigma));  % divide by each column's std
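For reference, the same column-wise normalization can be expressed without forming the ones matrix or the diagonal matrices at all, by letting the row vectors mu and sigma broadcast across the rows of X. A sketch (assuming MATLAB R2016b or later, or Octave, for implicit expansion; bsxfun for older MATLAB):

```matlab
mu = mean(X);
sigma = std(X);

% Implicit expansion: the 1-by-n vectors mu and sigma are applied
% to every row of X element-wise (MATLAB R2016b+, Octave).
X_norm = (X - mu) ./ sigma;

% Equivalent on older MATLAB versions, via bsxfun:
X_norm = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);
```

This avoids the O(m*n*n) matrix products with diag(mu) and diag(sigma), replacing them with element-wise operations.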