
Prerequisites for Forgetting Transformer (FoX)

Topics you need before working through Forgetting Transformer (FoX). Direct prerequisites are listed first; transitive prerequisites (the chain reachable through them) follow.

Direct prerequisites (5)

  1. Attention Mechanism Theory (layer 4, tier 2)
  2. Recurrent Neural Networks (layer 3, tier 2)
  3. Transformer Architecture (layer 4, tier 2)
  4. Attention Variants and Efficiency (layer 4, tier 2)
  5. Sparse Attention and Long Context (layer 4, tier 2)

Reachable through the chain (15)

These topics are not cited directly as prerequisites but are reached transitively by following the chain upward; working through the direct prerequisites pulls them in.