A minibatch is a small subset of the training dataset used to update a model's parameters during training in machine learning. It is larger than a single example but much smaller than the full dataset. Minibatches are central to minibatch stochastic gradient descent (SGD): instead of computing the exact gradient of the loss over the entire dataset, the gradient is estimated on each minibatch, which is far cheaper per update and typically speeds up training while still converging well.
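
To make this concrete, here is a minimal sketch of minibatch SGD for linear regression in NumPy. The dataset, the `batch_size` of 32, and the learning rate are illustrative assumptions, not prescribed values; the point is that each update uses a gradient estimated from only a small random subset of the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: 1,000 examples, 5 features (illustrative only).
X = rng.normal(size=(1000, 5))
true_w = np.array([1.0, -2.0, 0.5, 3.0, -1.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(5)      # model parameters to learn
batch_size = 32      # minibatch size: more than 1 example, far fewer than 1,000
lr = 0.1             # learning rate

for epoch in range(20):
    # Shuffle once per epoch so each minibatch is a fresh random subset.
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        X_b, y_b = X[idx], y[idx]
        # Gradient of the mean squared error, estimated on this minibatch only.
        grad = (2.0 / len(idx)) * X_b.T @ (X_b @ w - y_b)
        w -= lr * grad

print("learned weights:", np.round(w, 2))
```

Each pass over the data (an epoch) performs many cheap parameter updates, one per minibatch, rather than a single expensive full-batch update.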