ALGORITHMS
The Geometry of Latent Space: Visualizing AI Reasoning Paths
New research suggests that LLM attention heads form complex topological manifolds during multi-step inference tasks.
Mixture-of-experts (MoE) techniques enable massive parameter counts while activating only a fraction of the model's compute per token.
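The core MoE idea can be sketched in a few lines: a gating network scores every expert, only the top-k experts actually run, and their outputs are mixed by the softmaxed gate scores. The sketch below is a minimal, illustrative toy (plain Python, hypothetical function and variable names, not from any particular MoE implementation); active compute is roughly k / n_experts of the dense equivalent.

```python
import math

def top_k_moe(x, experts, gate, k=2):
    """Toy top-k mixture-of-experts layer (illustrative sketch).

    x:       input vector (list of floats)
    experts: one weight matrix per expert (list of lists of rows)
    gate:    router projection, one row per expert
    Only the k highest-scoring experts run; their outputs are
    combined with softmax weights over the selected gate scores.
    """
    # Score each expert with the gating projection.
    scores = [sum(w * xi for w, xi in zip(g, x)) for g in gate]
    # Keep only the indices of the k best-scoring experts.
    top = sorted(range(len(scores)), key=lambda i: scores[i])[-k:]
    # Softmax over the selected experts' scores (max-subtracted for stability).
    m = max(scores[i] for i in top)
    weights = [math.exp(scores[i] - m) for i in top]
    z = sum(weights)
    # Run only the selected experts and mix their outputs.
    out = [0.0] * len(x)
    for w, i in zip(weights, top):
        for r, row in enumerate(experts[i]):
            out[r] += (w / z) * sum(a * b for a, b in zip(row, x))
    return out
```

For example, if every expert is the 2x2 identity matrix, the mixed output equals the input regardless of which experts the router picks, since the gate weights sum to one.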