
Self-Attention

Jun 09, 2025 · 1 min read

machine-learning transformers attention


Self-attention, sometimes called intra-attention, is an attention mechanism that relates different positions of a single sequence in order to compute a representation of that sequence. [1]

It is a special case of cross-attention in which the two input sequences coincide, i.e. $x_1 = x_2$: queries, keys, and values are all derived from the same sequence.
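
The special case is easy to see in code. Below is a minimal NumPy sketch of scaled dot-product self-attention; the function and weight names are illustrative and not taken from the cited references. Queries, keys, and values are all projected from the same input `x`; projecting them from two different sequences instead would give the general cross-attention case.

```python
# Minimal sketch of scaled dot-product self-attention (names/shapes illustrative).
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, W_q, W_k, W_v):
    """x: (seq_len, d_model); W_q, W_k, W_v: (d_model, d_head)."""
    Q, K, V = x @ W_q, x @ W_k, x @ W_v       # all three come from the same x (x1 = x2)
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len) pairwise similarities
    weights = softmax(scores, axis=-1)        # each row sums to 1: attention over positions
    return weights @ V                        # context-mixed representation of the sequence

# Toy example: 4 tokens, model dim 8, head dim 4.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(x, W_q, W_k, W_v)
print(out.shape)  # (4, 4)
```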

Visualisation

Figure taken from [2].

Footnotes

  1. Attention Is All You Need (Vaswani et al., 2017)

  2. Sebastian Raschka, https://sebastianraschka.com/blog/2023/self-attention-from-scratch.html


