A residual neural network is a class of deep, feed-forward artificial neural networks that utilizes skip connections or short-cuts to jump over some layers in order to make the optimization of very deep networks tractable.
The motivation for skipping over layers is to avoid the vanishing-gradient problem: in very deep networks, gradients can shrink toward zero as they are backpropagated through many layers, which stalls learning in the earliest layers.
By reusing the activation from a previous layer until the adjacent layer has learned its weights, the network effectively collapses into fewer layers during the initial training phase and gradually expands as it learns more of the feature space, thus making the optimization of very deep networks tractable.
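This behavior can be illustrated with a minimal sketch of a two-layer residual block. The code below is a simplified NumPy illustration, not the convolutional architecture from the original ResNet paper; the function and variable names are hypothetical. With the residual branch's weights at zero, the skip connection simply passes the input through, which is the "collapsed" state described above.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, b1, W2, b2):
    """A two-layer residual block (illustrative sketch).

    The input x is added back to the branch's output via a skip
    connection, so the block only has to learn a residual F(x)
    rather than a full mapping.
    """
    h = relu(W1 @ x + b1)   # first layer of the residual branch
    f = W2 @ h + b2         # second layer of the residual branch
    return relu(f + x)      # skip connection: add the input back

rng = np.random.default_rng(0)
d = 4
x = rng.standard_normal(d)

# With zero-initialized weights the residual branch contributes nothing,
# so the block initially behaves like an identity mapping (up to the ReLU):
W0, b0 = np.zeros((d, d)), np.zeros(d)
y = residual_block(x, W0, b0, W0, b0)
```

Here `y` equals `relu(x)`: until the branch learns nonzero weights, the block contributes no transformation of its own, so depth is added without making optimization harder at the start of training.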
Additional resources: