Claude Code hooks that auto-switch model tier based on task complexity
Updated Mar 8, 2026 · Shell
A smart LLM routing layer: it classifies each incoming prompt by complexity and routes it to the optimal model backend (Ollama, OpenAI, Anthropic, Groq) through the Bifrost gateway. Supports cost-optimized, quality-first, and local-only routing strategies.
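The core idea of tier switching can be sketched as a small shell classifier: score the prompt with cheap heuristics (keyword matches, word count) and emit a model tier for the hook to use. This is a minimal illustration under assumed names; the model identifiers and keyword list here are hypothetical, not the repository's actual configuration.

```shell
#!/usr/bin/env sh
# classify_tier: print a model tier for a given prompt.
# Heuristics (assumed for illustration):
#   - prompts mentioning heavy tasks (refactor, architect, debug) -> top tier
#   - long prompts (> 100 words) -> mid tier
#   - everything else -> cheap tier
classify_tier() {
  prompt="$1"
  # Word count, stripped of the leading spaces some wc builds emit.
  words=$(printf '%s' "$prompt" | wc -w | tr -d ' ')
  case "$prompt" in
    *refactor*|*architect*|*debug*)
      echo "opus"; return ;;
  esac
  if [ "$words" -gt 100 ]; then
    echo "sonnet"
  else
    echo "haiku"
  fi
}

classify_tier "fix typo in README"          # short, simple -> haiku
classify_tier "refactor the auth module"    # heavy keyword -> opus
```

A hook would then export the chosen tier (or write it to the hook's expected output) so the session picks up the cheaper or stronger model before the request is sent.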