I am trying to unwrap a signal, and the NumPy function doesn't do what I want. I have already searched the web for an answer, of course, but unfortunately nothing helped. Here is part of the code:
- NT can be in the range of 1 to 48.
- The size of h1 is in GB (like 10-20 GB), and I extract about 20,000,000 rows per NT.
- I can't do it in one big process; I need to divide the work into NT chunks.
for i in range(NT):  # process the file in NT chunks
    # read only this chunk's rows from the HDF5 file
    h1 = pd.read_hdf(path, 'foo',
                     start=int(Compt1 * i / NT),
                     stop=int(Compt1 * (i + 1) / NT))
    print('\033[0m' + 'computing d1 and d2: NT ' + str(i + 1) + ' / ' + str(NT))
    temp = np.asarray(h1.phi)
    # earlier manual attempt, kept for reference:
    # s = temp[0] - tp
    # if s < -np.pi:
    #     temp[0] += 2*np.pi
    # elif s > np.pi:
    #     temp[0] -= 2*np.pi
    # for y in range(1, len(temp)):
    #     s = temp[y] - temp[y-1]
    #     if s < -np.pi:
    #         temp[y] += 2*np.pi
    #     elif s > np.pi:
    #         temp[y] -= 2*np.pi
    temp0 = np.unwrap(temp)  # unwrap the phase of this chunk
    A = pd.DataFrame({
        'd1': temp0 * (londe / (4 * np.pi)),  # scale the unwrapped phase
        'd2': Q + np.cumsum(h1.deltax)})
    h1 = 0                 # release the chunk
    tp = temp[-1]          # last raw phase, kept for the next chunk
    Q = A.iloc[-1, 1]      # carry the cumulative d2 forward
    A = A.apply(pd.to_numeric, downcast='float')
    print('\033[0m' + 'saving')
    with pd.HDFStore(pathout) as store:
        try:
            nrows = store.get_storer('foo').nrows
        except (KeyError, AttributeError):
            nrows = 0
        A.index = pd.Series(A.index) + nrows
        store.append('foo', A, data_columns=True)
print('\033[32m' + 'Ok')
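I think the core difficulty is that unwrapping each chunk independently loses continuity at the chunk boundaries: np.unwrap only sees the samples inside one chunk. One idea is to seed each chunk with the last unwrapped sample of the previous chunk. Here is a minimal self-contained sketch of that idea (the `unwrap_chunked` helper and the test signal are just an illustration, not my real code):

```python
import numpy as np

def unwrap_chunked(phase, n_chunks):
    """Unwrap `phase` in n_chunks pieces while keeping continuity
    across chunk boundaries (illustrative helper)."""
    out = []
    prev = None  # last unwrapped sample of the previous chunk
    bounds = np.linspace(0, len(phase), n_chunks + 1, dtype=int)
    for a, b in zip(bounds[:-1], bounds[1:]):
        chunk = phase[a:b]
        if prev is None:
            u = np.unwrap(chunk)
        else:
            # prepend the previous chunk's last unwrapped value so
            # np.unwrap can see the jump at the boundary, then drop it
            u = np.unwrap(np.concatenate(([prev], chunk)))[1:]
        prev = u[-1]
        out.append(u)
    return np.concatenate(out)

# quick check on a synthetic linear phase ramp
t = np.linspace(0, 50, 5000)
true_phase = 3.0 * t
wrapped = np.angle(np.exp(1j * true_phase))  # wrapped into (-pi, pi]
rec = unwrap_chunked(wrapped, 4)
```

With this seeding, the chunked result matches a single full-signal np.unwrap, so the loop could process each chunk, write it to the output store, and only carry `prev` between iterations.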
Here is what I got (d1 is not good, d2 is good):
https://i.stack.imgur.com/ULYFP.jpg
The last plot in the link is what I want; the first one is what I get, and the second one is the same result zoomed in.