I am trying to build a feed-forward neural network n-gram language model using TensorFlow 2.0. To be clear, I do not want to implement this with a recurrent neural network; I just want a few Dense layers and a softmax output layer. I followed the architecture outlined in this reference: https://www.researchgate.net/publication/301875194_Authorship_Attribution_Using_a_Neural_Network_Language_Model
However, when I tried this, I kept getting an error. Here is my model:
import tensorflow as tf

# Note: the optimizer has to be assigned and passed to compile(),
# otherwise the learning rate setting is silently ignored.
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(total_words, 300, weights=[embeddings_matrix],
                              input_length=inputs.shape[1], trainable=False),
    tf.keras.layers.Dense(100, activation='relu'),
    tf.keras.layers.Dense(total_words, activation='softmax')
])

model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=['accuracy'])
When this code is run, I get the following error:
ValueError: Shapes (None, 7493) and (None, 116, 7493) are incompatible
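For reference, a minimal self-contained sketch that reproduces the same shape mismatch. The values of total_words and the sequence length are guesses taken from the error message, and the embedding matrix is random, since my real preprocessing code is not shown; I also use a Constant initializer in place of weights=[embeddings_matrix] so the snippet runs on newer Keras versions as well:

```python
import numpy as np
import tensorflow as tf

total_words = 7493   # hypothetical vocabulary size, read off the error message
seq_len = 116        # hypothetical input sequence length, read off the error message
embeddings_matrix = np.random.rand(total_words, 300).astype("float32")

model = tf.keras.Sequential([
    # Constant initializer plays the same role as weights=[embeddings_matrix]
    tf.keras.layers.Embedding(
        total_words, 300,
        embeddings_initializer=tf.keras.initializers.Constant(embeddings_matrix),
        trainable=False),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(total_words, activation="softmax"),
])

# Embedding outputs a 3-D tensor (batch, seq_len, 300), and Dense only
# transforms the last axis, so the model's output stays 3-D:
# (batch, seq_len, total_words), while my one-hot labels are 2-D,
# (batch, total_words) -- which is exactly the mismatch in the ValueError.
dummy = np.zeros((2, seq_len), dtype="int32")
print(model(dummy).shape)  # (2, 116, 7493)
```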
Can someone please tell me how to resolve this? I am slightly confused.