Wednesday, January 29, 2025

Attention

The attention mechanism is like placing bookmarks in a lengthy text. The bookmarks let you keep track of the important or significant portions of the text, which can then be used to produce output later. By focusing on these parts of the input, the output becomes more accurate and contextual.

Self-attention is a variant that captures relationships between different parts of the input sequence, regardless of how far apart they are in the text.
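To make this concrete, here is a minimal NumPy sketch of scaled dot-product self-attention. The weight matrices Wq, Wk, Wv, the sequence length, and the dimensions are illustrative assumptions, not something specified in this post; real implementations add multiple heads, masking, and learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q = X @ Wq                      # queries: what each token is looking for
    K = X @ Wk                      # keys: what each token offers
    V = X @ Wv                      # values: the content that gets mixed together
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k) # similarity of every token with every other token
    weights = softmax(scores)       # each row sums to 1: how much to "bookmark" each token
    return weights @ V              # weighted mix of values for each position

# Toy example (hypothetical sizes): 4 tokens, model dimension 8, head dimension 4
rng = np.random.default_rng(0)
X  = rng.normal(size=(4, 8))
Wq = rng.normal(size=(8, 4))
Wk = rng.normal(size=(8, 4))
Wv = rng.normal(size=(8, 4))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 4)
```

Because the score matrix compares every position with every other position, a token at the start of the sequence can attend directly to one at the end, which is why distance in the text does not matter.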
