Residual connections (also called skip connections) are an architectural component in deep learning where a layer's input is added directly to its output, so a block computes y = x + F(x) rather than y = F(x). Because the identity path passes gradients through unchanged, they mitigate the vanishing-gradient problem and let much deeper networks train effectively. They are commonly used in deep convolutional neural networks, notably in architectures like ResNet.
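The idea y = x + F(x) can be sketched in a few lines. Below is a minimal, hedged illustration in NumPy, where F is a toy linear-plus-ReLU transform (real residual blocks typically use convolution and normalization layers); the function name `residual_block` and the weight shapes are illustrative, not from any particular library.

```python
import numpy as np

def residual_block(x, w, b):
    """Compute y = x + F(x), where F is a toy linear + ReLU transform.

    The `x +` term is the residual (skip) connection: even if F's
    gradient shrinks, the identity path still carries gradient through.
    """
    fx = np.maximum(0.0, x @ w + b)  # F(x): linear layer followed by ReLU
    return x + fx                    # add the input back (skip connection)

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))
w = rng.standard_normal((4, 4)) * 0.1
b = np.zeros(4)

y = residual_block(x, w, b)

# With zero weights, F(x) = 0 and the block reduces to the identity,
# which is one reason residual blocks are easy to optimize:
assert np.allclose(residual_block(x, np.zeros((4, 4)), np.zeros(4)), x)
```

One design consequence worth noting: since y = x + F(x) implies dy/dx = I + dF/dx, the Jacobian always contains an identity term, which is the mechanism behind the improved gradient flow described above.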