I'm currently trying to build a hybrid convolutional recurrent neural network to analyze time series. The network is described in this image. The basic idea is to take a historical series and run the network over it. At each iteration the Conv1D takes M adjacent values (from Xt-1 to Xt-M) and processes them. The Conv1D returns a value that is concatenated with the Xt data, and the result is then passed through two LSTMs and a dense layer to produce the output.
I tried to build the network with TensorFlow/Keras, but I think it's wrong:
from tensorflow.keras.layers import Input, Conv1D, Concatenate, LSTM, Dense, TimeDistributed
from tensorflow.keras.models import Model

input_stock = Input(shape=(None, 1300, 1))
input_ip = Input(shape=(1, 1))
# convolution over adjacent values of the series
x = Conv1D(filters=1, kernel_size=100, activation='relu', padding='causal')(input_stock)
# concatenate the conv output with the Xt input
x = Concatenate(axis=2)([x, input_ip])
# two stacked LSTMs
x = LSTM(units=50, return_sequences=True, activation='tanh')(x)
x = LSTM(units=50, activation='tanh')(x)
# dense output layer
output = TimeDistributed(Dense(1, activation='relu'))(x)
model = Model(inputs=[input_stock, input_ip], outputs=output)
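For comparison, here is a minimal shape-consistent sketch of what I think the picture describes, assuming the whole series is fed as a single sample of 1300 timesteps and that the raw series itself supplies the Xt values that get concatenated with the causal convolution output; the filter count, kernel size and LSTM units are only placeholders:

from tensorflow.keras.layers import Input, Conv1D, Concatenate, LSTM, Dense
from tensorflow.keras.models import Model

# One sample: 1300 timesteps, 1 feature each (the batch dimension is implicit).
series_in = Input(shape=(1300, 1))

# Causal 1-D convolution: each output step only sees the current and previous values.
conv = Conv1D(filters=8, kernel_size=100, padding='causal', activation='relu')(series_in)

# Concatenate the conv features with the raw series value at every timestep (feature axis).
merged = Concatenate(axis=-1)([conv, series_in])

# Two stacked LSTMs; the second returns only its final hidden state.
h = LSTM(50, return_sequences=True, activation='tanh')(merged)
h = LSTM(50, activation='tanh')(h)

# Single dense output for the whole series.
out = Dense(1)(h)

model = Model(inputs=series_in, outputs=out)
model.summary()

This version at least builds and model.summary() shows consistent shapes, but I'm not sure it matches the architecture in the image.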
Also, I don't know what to use as the input shape: I have a series of 1300 values with one feature each.
X.shape
(1300,)
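For context, this is roughly how I am preparing the array before calling model.fit; I am assuming Keras expects a batch dimension, so the single series of 1300 values becomes one sample with 1300 timesteps and 1 feature:

import numpy as np

X = np.asarray(X, dtype='float32')   # raw series, shape (1300,)
X = X.reshape(1, 1300, 1)            # (batch=1, timesteps=1300, features=1)
print(X.shape)                       # (1, 1300, 1)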
Is this input shape "Input(shape=(None, 1300, 1))" correct? Thanks in advance.