A regularization technique that prevents overfitting in neural networks by randomly setting a fraction of neuron activations to zero during training; at inference time all neurons are kept active.
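
As a minimal sketch of how this works in practice, here is an inverted-dropout layer in NumPy; the function name `dropout` and the `rate` parameter are illustrative choices, not part of the original definition:

```python
import numpy as np

def dropout(x, rate=0.5, training=True):
    """Inverted dropout: zero out a random fraction of activations during
    training and rescale the survivors so the expected value is unchanged."""
    if not training or rate == 0.0:
        return x  # at inference time, use all activations unchanged
    # Bernoulli mask: keep each activation with probability (1 - rate)
    mask = (np.random.rand(*x.shape) >= rate).astype(x.dtype)
    return x * mask / (1.0 - rate)

# Example: apply dropout to a batch of hidden activations
activations = np.random.randn(4, 8).astype(np.float32)
print(dropout(activations, rate=0.5, training=True))
```

The rescaling by `1 / (1 - rate)` during training (the "inverted" variant) means no extra scaling is needed at inference, which is the convention used by most modern frameworks.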