Residual neural network
A residual neural network (also referred to as a residual network or ResNet) is a deep learning model in which the weight layers learn residual functions with reference to the layer inputs. It behaves like a highway network whose gates are opened through strongly positive bias weights. This design allows models with tens or hundreds of layers to train easily and to achieve better accuracy as depth increases. The identity skip connections, often referred to as "residual connections", are also used in the 1997 LSTM networks, in Transformer models (e.g., BERT and GPT models such as ChatGPT), and in the AlphaGo Zero, AlphaStar, and AlphaFold systems.
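The core idea is that if the desired mapping is H(x), the weight layers learn the residual F(x) = H(x) − x, and the block outputs F(x) + x via an identity skip connection. The following is a minimal sketch of such a block in PyTorch, assuming the input and output feature maps have the same shape; the class name, layer sizes, and channel count are illustrative rather than taken from the original paper.

```python
import torch
from torch import nn


class ResidualBlock(nn.Module):
    """Minimal residual block: output = F(x) + x, where F is a small
    stack of weight layers that learns the residual relative to x."""

    def __init__(self, channels: int):
        super().__init__()
        # Two convolutional weight layers form the residual function F(x).
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = self.relu(self.bn1(self.conv1(x)))
        residual = self.bn2(self.conv2(residual))
        # Identity skip connection: add the unmodified input back in.
        return self.relu(residual + x)


if __name__ == "__main__":
    block = ResidualBlock(channels=64)          # 64 channels chosen for illustration
    x = torch.randn(1, 64, 32, 32)              # one 64-channel 32x32 feature map
    y = block(x)
    print(y.shape)                              # torch.Size([1, 64, 32, 32])
```

Because the skip connection is an identity mapping, gradients can flow directly through the addition, which is what makes very deep stacks of such blocks trainable in practice.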
Residual networks were developed by Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun, whose model won the 2015 ImageNet Large Scale Visual Recognition Challenge (ILSVRC).