## Overview
Context management is the art of providing LLM coding assistants with the right information at the right time. Models have limited context windows, so strategic inclusion and exclusion of code, docs, and instructions directly impacts output quality.
## Context Window Sizes
| Model | Context Window | Approx. Lines of Code |
|---|---|---|
| Claude 3.5 Sonnet | 200K tokens | ~50,000 lines |
| GPT-4 Turbo | 128K tokens | ~32,000 lines |
| Gemini 1.5 Pro | 1M tokens | ~250,000 lines |
| Llama 3.1 | 128K tokens | ~32,000 lines |
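Before assembling a prompt, it helps to check whether the selected files actually fit the target model's window. Below is a minimal sketch assuming the common rule of thumb of roughly 4 characters per token; the helper names (`estimate_tokens`, `fits`) and the `CONTEXT_WINDOWS` map are illustrative, and a real tokenizer (e.g. `tiktoken`) would give exact counts.

```python
# Rough token-budget check before sending files to a model.
# Assumes ~4 characters per token — a crude heuristic, not a tokenizer.

CONTEXT_WINDOWS = {  # sizes from the table above
    "claude-3.5-sonnet": 200_000,
    "gpt-4-turbo": 128_000,
    "gemini-1.5-pro": 1_000_000,
    "llama-3.1": 128_000,
}

def estimate_tokens(text: str) -> int:
    """Crude estimate: ~4 characters per token."""
    return len(text) // 4

def fits(texts: list[str], model: str, reserve: int = 4_000) -> bool:
    """Check whether the combined texts fit the model's window,
    keeping `reserve` tokens free for the model's reply."""
    total = sum(estimate_tokens(t) for t in texts)
    return total + reserve <= CONTEXT_WINDOWS[model]
```

A check like this is most useful as a guard in a prompt-assembly script: if the files don't fit, summarize or drop the least relevant ones rather than letting the model silently truncate.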
## Strategies
- Provide only the relevant files — not the entire codebase
- Summarize large modules with their public API signatures
- Use file headers to describe purpose and dependencies
- Front-load the most important context (instructions, types)
- Remove conversation history that is no longer relevant
- Split large tasks into smaller, focused conversations
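The second strategy — summarizing a large module down to its public API — can be sketched with Python's standard `ast` module. The `public_api` helper below is a hypothetical example, not a standard tool: it keeps top-level function and class names (skipping `_`-prefixed ones) and drops bodies, producing a compact stand-in for a module too large to include whole.

```python
import ast

def public_api(source: str) -> list[str]:
    """Summarize a module as its public top-level signatures,
    dropping bodies and private (underscore-prefixed) names."""
    tree = ast.parse(source)
    sigs = []
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if not node.name.startswith("_"):
                args = ", ".join(a.arg for a in node.args.args)
                sigs.append(f"def {node.name}({args}): ...")
        elif isinstance(node, ast.ClassDef):
            if not node.name.startswith("_"):
                sigs.append(f"class {node.name}: ...")
    return sigs
```

For example, a 2,000-line module might reduce to a dozen signature lines — enough for the model to call its functions correctly without spending tokens on implementation details.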
## File Selection Guide
### Bug Fix
Include: the buggy code, its tests, related type definitions, and the error message.
### New Feature
Include: the nearest similar feature, relevant types, and the routing/API layer where it integrates.
### Refactoring
Include: the code to refactor, its tests, and any code that imports/depends on it.