Dropout is a regularization technique, i.e. it is meant to keep the network from overfitting your data. The validation loss, by contrast, only gives you an indication of *when* your network is overfitting. These are two completely different things: tracking a validation loss does not help your model when it is overfitting; it just shows you that it is.
I would say that a validation loss is valuable information to have during training, and you should never go without it. Whether you need regularization techniques such as noise, dropout, or batch normalization depends on how your network learns: if you see that it overfits, you should employ regularization techniques, as in the sketch below.
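
Here is a minimal PyTorch sketch of both ideas side by side (the model, layer sizes, and synthetic data are illustrative assumptions, not a prescription): dropout sits *inside* the model as a regularizer, while the validation loss is computed separately and only *observed* as a diagnostic.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic regression data, split into train and validation sets (assumed setup).
X = torch.randn(1000, 20)
y = X @ torch.randn(20, 1) + 0.1 * torch.randn(1000, 1)
X_train, y_train = X[:800], y[:800]
X_val, y_val = X[800:], y[800:]

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # regularization: randomly zeroes activations during training
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):
    model.train()  # dropout active
    optimizer.zero_grad()
    train_loss = loss_fn(model(X_train), y_train)
    train_loss.backward()
    optimizer.step()

    model.eval()  # dropout disabled for evaluation
    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val)
    # The validation loss only *diagnoses* overfitting: if it starts rising
    # while the training loss keeps falling, that is the cue to add or
    # strengthen regularization (dropout, noise, batch norm, ...).
    print(f"epoch {epoch:2d}  train {train_loss.item():.4f}  val {val_loss.item():.4f}")
```

Note the `model.train()` / `model.eval()` switch: dropout is only applied during training, so evaluating the validation loss with dropout still active would give you a misleading picture.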