Recently I ran into a very strange issue while training RNN-type neural networks (LSTM and GRU). Last month I finished all the code and got my results. I am fairly sure I set the random seeds every time I initialized a new model (e.g. model = RNNModel()), and back then I got the same result on every run. Today the results are still identical across repeated runs, but they no longer match the results from a month ago. I hope someone can save me! Thanks a lot!
Here is some example code:
import numpy as np
import torch
import models  # my local module containing RNNModel

def train(model_type, seq_list, lag_k, hidden_size, seed=0):
    # seed NumPy and PyTorch before constructing the model
    np.random.seed(seed)
    torch.manual_seed(seed)
    model = models.RNNModel(model_type, seq_dim, hidden_size, seq_dim)  # seq_dim comes from the outer scope
    ...
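For completeness, this is the fuller seeding setup I have seen recommended for PyTorch reproducibility. The CUDA seeding and the cuDNN flags are not in my current code, so this is just a sketch of what might differ between runs or environments, and the helper name set_all_seeds is my own, not from my script:

import random
import numpy as np
import torch

def set_all_seeds(seed=0):
    # seed Python, NumPy, and PyTorch (CPU and all GPUs)
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # ask cuDNN for deterministic kernels instead of auto-benchmarking
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False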
And here are the results I recorded a month ago versus the results I get today:
before : | test_loss : 0.0040 | total_time : 71.3345(s) | MSE : 0.4701 | MAE : 0.5380 | MAPE : 6.2668 |
now : | test_loss : 0.0040 | total_time : 82.9247(s) | MSE : 0.4700 | MAE : 0.5380 | MAPE : 6.2728 |