I am trying to convert a TradingView indicator into Python (using pandas to store its results).
This is the public code of the indicator I want to convert:
https://www.tradingview.com/script/sU9molfV/
I am stuck recreating Pine Script's built-in linreg function.
This is the fragment of the Pine Script indicator I have trouble with:
lrc = linreg(src, length, 0)
lrc1 = linreg(src, length, 1)
lrs = (lrc-lrc1)
TSF = linreg(src, length, 0)+lrs
This is its documentation:
Linear regression curve. A line that best fits the prices specified over a user-defined time period. It is calculated using the least squares method. The result of this function is calculated using the formula: linreg = intercept + slope * (length - 1 - offset), where length is the y argument, offset is the z argument, intercept and slope are the values calculated with the least squares method on source series (x argument). linreg(source, length, offset) → series[float]
Source:
https://www.tradingview.com/pine-script-reference/#fun_linreg
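I am not sure this is exactly what TradingView computes, but my reading of that documentation is: fit an ordinary least-squares line to the last length values (x = 0 .. length - 1) and evaluate it at x = length - 1 - offset. A minimal sketch of that reading, using numpy.polyfit over a pandas rolling window (pine_linreg is just my own name, and df['Close'] is my source series):

import numpy as np
import pandas as pd

def pine_linreg(source: pd.Series, length: int, offset: int) -> pd.Series:
    # For each full window of `length` bars, fit y = slope*x + intercept
    # and evaluate the fitted line at x = length - 1 - offset,
    # as described in the TradingView documentation quoted above.
    def last_point(window):
        x = np.arange(len(window))
        slope, intercept = np.polyfit(x, window, 1)
        return intercept + slope * (len(window) - 1 - offset)
    return source.rolling(window=length).apply(last_point, raw=True)

If that reading is right, the Pine fragment above would translate to something like:

lrc = pine_linreg(df['Close'], length, 0)
lrc1 = pine_linreg(df['Close'], length, 1)
lrs = lrc - lrc1
TSF = lrc + lrs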
I have also found this MQL4 code and tried to follow it step by step, in order to build a linreg function in Python that I can then use for the rest of the indicator:
https://www.mql5.com/en/code/8016
And this is my code so far (df is a pandas DataFrame with a Close column):
# calculate linear regression:
# https://www.mql5.com/en/code/8016
barsToCount = 14
# sumy+=Close[i];
df['sumy'] = df['Close'].rolling(window=barsToCount).mean()
# sumxy+=Close[i]*i;
tmp = []
sumxy_lst = []
for window in df['Close'].rolling(window=barsToCount):
    for index in range(len(window)):
        tmp.append(window[index] * index)
    sumxy_lst.append(sum(tmp))
    del tmp[:]
df.loc[:,'sumxy'] = sumxy_lst
# sumx+=i;
sumx = 0
for i in range(barsToCount):
    sumx += i
# sumx2+=i*i;
sumx2 = 0
for i in range(barsToCount):
    sumx2 += i * i
# c=sumx2*barsToCount-sumx*sumx;
c = sumx2*barsToCount - sumx*sumx
# Line equation:
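# (these should be the standard least-squares formulas: b is the slope, a is the intercept)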
# b=(sumxy*barsToCount-sumx*sumy)/c;
df['b'] = ((df['sumxy']*barsToCount)-(sumx*df['sumy']))/c
# a=(sumy-sumx*b)/barsToCount;
df['a'] = (df['sumy']-sumx*df['b'])/barsToCount
# Linear regression line in buffer:
df['LR_line'] = 0.0
for x in range(barsToCount):
    # LR_line[x]=a+b*x;
    df['LR_line'].iloc[x] = df['a'].iloc[x] + df['b'].iloc[x] * x
    # print(x, df['a'].iloc[x], df['b'].iloc[x], df['b'].iloc[x]*x)
print(df.tail(50))
print(list(df))
It doesn't work as expected.
Any idea how to recreate the Pine Script linreg function in Python, please?
Thank you in advance!