Prerequisite chain
Prerequisites for Symbolic Regression and Equation Discovery
Topics you need before working through Symbolic Regression and Equation Discovery. Direct prerequisites are listed first; transitive prerequisites (the chain reachable through them) follow.
Direct prerequisites (2)
- Lasso Regression (layer 2, tier 1)
- Sparse Recovery and Compressed Sensing (layer 4, tier 3)
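These two direct prerequisites are where equation discovery gets its core mechanism: regress the target onto a library of candidate terms and let an L1 penalty zero out the terms that don't belong. A minimal sketch of that idea, assuming scikit-learn; the variable names and the toy target y = 3x² − 2x are illustrative, not from the source:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Toy data from a hidden equation y = 3*x^2 - 2*x plus small noise.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = 3 * x**2 - 2 * x + rng.normal(0, 0.05, size=x.shape)

# Candidate library of basis functions the "discovered" equation may use.
names = ["x", "x^2", "x^3", "sin(x)"]
library = np.column_stack([x, x**2, x**3, np.sin(x)])

# Lasso performs the sparse recovery: L1 regularization drives the
# coefficients of irrelevant library terms toward zero.
model = Lasso(alpha=0.01, fit_intercept=False).fit(library, y)
coefs = dict(zip(names, model.coef_))
```

The recovered coefficient on x² lands near 3, and the fit reproduces the hidden equation; the same library-plus-sparse-regression pattern is what the symbolic-regression topic builds on.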
Reachable through the chain (20)
These topics are not directly cited as prerequisites but are reached transitively by following the chain upward. Working through the direct prerequisites pulls these in.
- Linear Regression (layer 1, tier 1)
- Matrix Operations and Properties (layer 0A, tier 1)
- Sets, Functions, and Relations (layer 0A, tier 1)
- Basic Logic and Proof Techniques (layer 0A, tier 2)
- Maximum Likelihood Estimation: Theory, Information Identity, and Asymptotic Efficiency (layer 0B, tier 1)
- Common Probability Distributions (layer 0A, tier 1)
- Differentiation in Rⁿ (layer 0A, tier 1)
- Vectors, Matrices, and Linear Maps (layer 0A, tier 1)
- Continuity in Rⁿ (layer 0A, tier 1)
- Metric Spaces, Convergence, and Completeness (layer 0A, tier 1)
- Central Limit Theorem (layer 0B, tier 1)
- Law of Large Numbers (layer 0B, tier 1)
- Random Variables (layer 0A, tier 1)
- Kolmogorov Probability Axioms (layer 0A, tier 1)
- Expectation, Variance, Covariance, and Moments (layer 0A, tier 1)
- KL Divergence (layer 1, tier 1)
- Information Theory Foundations (layer 0B, tier 2)
- Convex Optimization Basics (layer 1, tier 1)
- Sub-Gaussian Random Variables (layer 2, tier 1)
- Concentration Inequalities (layer 1, tier 1)