Hard Attention


Hard attention is an attention mechanism used in some neural network architectures for natural language processing tasks. Instead of computing a weighted average over the entire input sequence, as soft attention does, it selects a discrete subset of the input (often a single position) to attend to. Because this discrete selection is not differentiable, models with hard attention are typically trained with techniques such as reinforcement-learning-style gradient estimators rather than plain backpropagation.
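As a rough illustration of the contrast, the sketch below (a minimal NumPy toy, not any particular model's implementation; the dot-product scoring and sampling strategy are illustrative assumptions) computes an attention distribution over a short sequence, then either blends all value vectors (soft) or samples a single position and returns only that value vector (hard):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(scores):
    # Numerically stable softmax over the last axis.
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def soft_attention(query, keys, values):
    # Soft attention: a weighted sum over ALL value vectors,
    # using the attention weights as mixing coefficients.
    weights = softmax(keys @ query)
    return weights @ values

def hard_attention(query, keys, values, rng):
    # Hard attention: sample ONE position from the attention
    # distribution and return that value vector alone.
    weights = softmax(keys @ query)
    idx = rng.choice(len(values), p=weights)
    return values[idx]

# Toy sequence of 4 positions with 3-dimensional features.
keys = rng.standard_normal((4, 3))
values = rng.standard_normal((4, 3))
query = rng.standard_normal(3)

soft_out = soft_attention(query, keys, values)        # blend of all rows
hard_out = hard_attention(query, keys, values, rng)   # exactly one row of `values`
```

Note that the sampling step (`rng.choice`) is where differentiability breaks: no gradient flows through a discrete draw, which is why hard-attention models need stochastic gradient estimators for training.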