Layer normalization is a technique used in deep learning to standardize the inputs across the feature dimension for each data sample independently: the features of each sample are shifted to zero mean and scaled to unit variance, then adjusted by learnable scale and shift parameters. Because the statistics are computed per sample rather than per batch, it stabilizes the learning process and improves the convergence speed of training deep neural networks.
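A minimal NumPy sketch of this per-sample normalization (the names `gamma`, `beta`, and `eps` are illustrative; `gamma` and `beta` stand in for the learnable scale and shift, and `eps` guards against division by zero):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Statistics are computed per sample, over the feature (last) axis.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    # Standardize each sample to zero mean and unit variance.
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable per-feature scale (gamma) and shift (beta).
    return gamma * x_hat + beta

# Two samples with three features each; very different scales per sample.
x = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
out = layer_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out)
```

With `gamma = 1` and `beta = 0`, each row of the output has (approximately) zero mean and unit variance, regardless of the original scale of that sample; this independence from other samples in the batch is what distinguishes layer normalization from batch normalization.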