A

Attention Mechanism

Definition

A technique in neural networks that lets the model focus on the most relevant parts of the input when producing each output. Self-attention enables every element in a sequence to attend to all other elements, capturing long-range dependencies more effectively than recurrent architectures, which must propagate information step by step.
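The mechanism can be illustrated with a minimal sketch of scaled dot-product self-attention in NumPy; the function name and the projection matrices `w_q`, `w_k`, `w_v` are illustrative, not from a specific library:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (seq_len, d_model)."""
    q = x @ w_q  # queries (seq_len, d_k)
    k = x @ w_k  # keys    (seq_len, d_k)
    v = x @ w_v  # values  (seq_len, d_v)
    # Pairwise relevance scores between every position and every other position
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    # Each output is a weighted mix of all value vectors in the sequence
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))      # 5 tokens, embedding dimension 8
w_q = rng.normal(size=(8, 8))
w_k = rng.normal(size=(8, 8))
w_v = rng.normal(size=(8, 8))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 8): one attended vector per input position
```

Because the score matrix relates every position to every other one, each output row can draw on information from anywhere in the sequence in a single step.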
