I am trying to generate 100 packets (row vectors, each of length 8192) of random bits (+1/-1), filter them with a Butterworth filter, and then plot their average power spectral density. I have to do this in MATLAB. My output should be a filtered sinc with a very sharp peak. When I use this code with smaller packets, say of length 100, it works, but for length 8192 it does not. Could someone please review my code for errors?
%generates a 100x8192 matrix of uniform random numbers (one row per packet)
n=rand(100,8192);
%initialises a row vector of 64 zeros to accumulate the PSDs
B=zeros(1,64);
%makes a 20th-order Butterworth lowpass filter with cutoff 0.6*Nyquist
[num,den]=butter(20,.6);
%two for loops that turn the 100 rows (packets) of length 8192 into random
%bits: 1 for any value >= 0.5, -1 otherwise
for c=1:100
    for k=1:8192
        if n(c,k)>=0.5
            n(c,k)=1;
        else
            n(c,k)=-1;
        end
    end
    %filter the completed packet and accumulate its power spectral density
    %(once per packet, i.e. outside the inner sample loop)
    x=filter(num,den,n(c,:));
    A=fftshift(fft(x,64));
    psd=A.*conj(A);
    B=B+psd;
end
%average over the 100 packets and plot
plot(B./100)
xlabel('Frequency'), ylabel('Average Power Spectral Density')
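
For comparison, here is a loop-free sketch of the same computation. It is only a sketch under my assumptions: it keeps the 64-point FFT from above, applies filter and fft along the second dimension so every packet (row) is processed at once, and the variable names (numPackets, packetLen, nfft) are purely illustrative.

numPackets = 100;
packetLen  = 8192;
nfft       = 64;                                   % FFT length, as in the loop version

bits = sign(rand(numPackets, packetLen) - 0.5);    % random +/-1 bits, one packet per row
[num, den] = butter(20, 0.6);                      % 20th-order lowpass, cutoff 0.6*Nyquist

filtered = filter(num, den, bits, [], 2);          % filter each row (packet) independently
spectra  = fftshift(fft(filtered, nfft, 2), 2);    % 64-point centred spectrum of each packet
avgPsd   = mean(abs(spectra).^2, 1);               % average power spectral density over packets

f = (-nfft/2 : nfft/2 - 1) / (nfft/2);             % normalised frequency axis (x pi rad/sample)
plot(f, avgPsd)
xlabel('Normalized Frequency (\times\pi rad/sample)')
ylabel('Average Power Spectral Density')

Note that fft(x,64) only uses the first 64 samples of each 8192-sample packet, so the frequency resolution is coarse; if the peak looks too blunt, increasing nfft (e.g. towards 8192) should sharpen it.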