Soft Attention

Soft attention is an attention mechanism commonly used in neural machine translation. At each decoding step it assigns a softmax-normalized weight to every position in the input sequence and forms a context vector as the weighted sum of the encoder states; this context vector is fed to the decoder to help generate the next target token. Because the weights are continuous and the whole operation is differentiable, soft attention can be trained end to end with standard backpropagation, in contrast to hard attention, which selects a single input position.
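The weighting step can be sketched in a few lines of NumPy. This is a minimal illustration of dot-product soft attention, not a full translation model: the function name, the toy dimensions, and the random inputs are all assumptions chosen for the example.

```python
import numpy as np

def soft_attention(query, keys, values):
    """Compute a soft-attention context vector.

    query:  (d,)      current decoder state
    keys:   (T, d)    encoder states used for scoring
    values: (T, d_v)  encoder states to be averaged
    """
    scores = keys @ query                    # (T,) alignment scores
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ values               # (d_v,) weighted sum
    return context, weights

# Toy example: 4 encoder positions with 3-dimensional states.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 3))
values = rng.normal(size=(4, 3))
query = rng.normal(size=3)

context, weights = soft_attention(query, keys, values)
# weights is a probability distribution over the 4 input positions,
# and context is their weighted average.
```

Because every input position receives some (possibly tiny) weight, gradients flow through all of them, which is what makes this "soft" rather than a discrete selection.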