Mutual information measures the amount of information that one random variable contains about another. It quantifies the reduction in uncertainty about one variable gained by observing the other, and is a core quantity of information theory that sees wide use in statistics and machine learning.
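To make this concrete, the standard discrete form can be written out explicitly; the random variables $X$ and $Y$, their joint distribution $p(x, y)$, and the marginals $p(x)$ and $p(y)$ are introduced here purely as notation for the definition:

$$
I(X; Y) \;=\; \sum_{x} \sum_{y} p(x, y) \, \log \frac{p(x, y)}{p(x)\, p(y)}.
$$

When $X$ and $Y$ are independent, $p(x, y) = p(x)\,p(y)$ and every term in the sum is zero, so $I(X; Y) = 0$: knowing one variable tells us nothing about the other. Equivalently, $I(X; Y) = H(X) - H(X \mid Y)$, the entropy of $X$ minus the conditional entropy of $X$ given $Y$, which is precisely the "reduction in uncertainty" described above.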