
In Python, I have a 1-D array coef defined as follows:

import numpy as np

coef = np.ones(frame_num - 1)

for i in range(1, frame_num):
    # correlation between consecutive columns (frames) of data
    coef[i-1] = np.corrcoef(data[:, i], data[:, i-1])[0, 1]

np.savetxt('serial_corr_results/coef_rest.txt', coef)

Now I want to do autocorrelation on it, and for that I use the code posted by deltap in another post:

timeseries = coef
timeseries -= np.mean(timeseries)
autocorr_f = np.correlate(timeseries, timeseries, mode='full')
temp = autocorr_f[autocorr_f.size // 2:] / autocorr_f[autocorr_f.size // 2]

The autocorrelation works fine. However, when I then want to plot or otherwise work with the original coef, its values have changed to the result of timeseries -= np.mean(timeseries).

Why does the original array coef get changed here and how can I prevent it from being altered? I need it further down in my script for some other operations.

Also, what exactly does the -= operation do? I have tried to Google it, but haven't found an explanation. Thanks!

knut_h

1 Answer


NumPy arrays are mutable, and plain assignment does not copy them, e.g.

timeseries = coef           # timeseries and coef point to same data
timeseries[:] = 0

will set both timeseries and coef to zero.
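
You can check whether two names refer to the same data with np.shares_memory (available in newer NumPy versions); a small sketch using the names from the question:

timeseries = coef                             # plain assignment, no copy is made
print(np.shares_memory(timeseries, coef))     # True: two names, one buffer
print(np.shares_memory(coef.copy(), coef))    # False: the copy owns its own data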

If you do

timeseries = coef.copy()    # timeseries is a copy of coef with its own data
timeseries[:] = 0

instead, coef will remain untouched.
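
Applied to the snippet from the question, that means copying coef before the in-place subtraction (a minimal sketch of the same autocorrelation code, assuming coef is defined as above):

timeseries = coef.copy()                      # independent copy; coef stays unchanged
timeseries -= np.mean(timeseries)             # in-place subtraction on the copy only
autocorr_f = np.correlate(timeseries, timeseries, mode='full')
temp = autocorr_f[autocorr_f.size // 2:] / autocorr_f[autocorr_f.size // 2]

As for -=: a -= b is augmented (in-place) assignment. For a NumPy array it writes the result of a - b back into a's existing memory instead of creating a new array, so timeseries -= np.mean(timeseries) subtracts the scalar mean from every element in place, and every name bound to that array sees the change.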

Nils Werner
  • thanks, that perfectly helped! And I assume the command -= is just a way to subtract a constant (here np.mean(timeseries)) from each value in an array, respectively? – knut_h May 25 '16 at 10:26