feat: streaming token output with Print Mode rendering #30
Open
Travelguest wants to merge 3 commits into MagicCube:main from
Conversation
- Extend AgentProgressThinkingEvent with `text` (accumulated) and `delta` (incremental) fields so consumers can render tokens in real time.
- Update Agent._deriveProgress() to extract text from streaming snapshots and compute the delta between consecutive progress events.
- Add StreamingMessage component for Ink-mode real-time Markdown rendering of streaming text tokens.
- Update use-agent-loop to consume progress events:
  - Ink Mode: update streamingText React state for re-rendering
  - Print Mode: write delta directly to stdout for instant output
- Update App.tsx to show StreamingMessage while text is streaming, replacing the generic shimmer indicator.
- Add AgentLoopProvider `printMode` prop to toggle between modes.
- Add 3 unit tests for streaming progress events (thinking text+delta, final message, tool progress without text fields).
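The accumulated/delta split described above can be sketched as follows. This is a minimal illustration, not the PR's actual implementation: the event shape and the `computeDelta` helper are assumptions, keeping only the field names (`text`, `delta`) that the commit message documents.

```typescript
// Hypothetical sketch of the extended event shape.
interface AgentProgressThinkingEvent {
  type: "thinking";
  text: string;  // accumulated text so far
  delta: string; // incremental tokens since the previous event
}

// Compute the delta between two consecutive accumulated snapshots.
// If the new snapshot no longer extends the previous one (e.g. the
// stream was reset), treat the whole text as new.
function computeDelta(previous: string, current: string): string {
  return current.startsWith(previous)
    ? current.slice(previous.length)
    : current;
}

let prev = "";
const deltas: string[] = [];
for (const snapshot of ["Hel", "Hello, ", "Hello, world"]) {
  deltas.push(computeDelta(prev, snapshot));
  prev = snapshot;
}
console.log(deltas.join("|")); // Hel|lo, |world
```

A consumer can then either re-render from `text` (Ink Mode) or append `delta` to an output stream (Print Mode) without recomputing anything.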
DeepSeek's OpenAI-compatible API rejects top_p=0 with: 'Invalid top_p value, the valid range of top_p is (0, 1.0]'. top_p=0.1 is the closest valid value that preserves near-deterministic behavior while satisfying the API constraint.
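A guard for this constraint could look like the following sketch. The `sanitizeTopP` helper is hypothetical (not part of the PR); it only encodes the (0, 1.0] range quoted in the error message above.

```typescript
// Hypothetical guard: DeepSeek's OpenAI-compatible API requires
// top_p to lie in the half-open interval (0, 1.0].
function sanitizeTopP(topP: number): number {
  if (topP <= 0) {
    // 0 (and negatives) are rejected; 0.1 is the closest valid
    // value that stays near-deterministic.
    return 0.1;
  }
  return Math.min(topP, 1.0); // clamp anything above the upper bound
}

console.log(sanitizeTopP(0));   // 0.1
console.log(sanitizeTopP(0.9)); // 0.9
console.log(sanitizeTopP(1.5)); // 1
```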
Print Mode (direct stdout.write) conflicts with Ink's terminal control and provides no real benefit over Ink Mode's real-time Markdown rendering. A proper non-interactive mode (like Claude Code's -p flag) would bypass Ink entirely rather than mixing stdout.write within the React tree. Removed: printMode prop, stdout.write branches, printMode state.
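The "bypass Ink entirely" alternative mentioned above amounts to choosing the output path once at startup rather than branching inside the React tree. A minimal sketch of that selection, assuming a hypothetical `-p` flag modeled on Claude Code's:

```typescript
// Hypothetical: decide the output mode up front from CLI args,
// so the Ink render tree never needs a printMode branch.
function selectOutput(argv: string[]): "print" | "tui" {
  // "print": stream deltas straight to stdout, never call Ink's render()
  // "tui":   hand the whole terminal to the interactive Ink app
  return argv.includes("-p") ? "print" : "tui";
}

console.log(selectOutput(["-p"])); // print
console.log(selectOutput([]));     // tui
```

Keeping the decision at the process boundary avoids the conflict between direct stdout.write calls and Ink's terminal control described above.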
Force-pushed 38d49ef to 57dceec
Description:
Adds real-time streaming token output to the TUI, replacing the generic "Thinking..." shimmer with actual model text as it arrives.
Changes:
- Extend `AgentProgressThinkingEvent` with text (accumulated) and delta (incremental) fields
- Update `Agent._deriveProgress()` to extract text from streaming snapshots and compute deltas
- Add `StreamingMessage` component for real-time Markdown rendering of streaming tokens
- Update `use-agent-loop` to consume progress events and maintain streamingText state
- Update `App.tsx` to show StreamingMessage while text is streaming, replacing the shimmer indicator
- Fix `top_p: 0` → `0.1` for DeepSeek API compatibility (top_p must be in (0, 1.0])
- Add 3 unit tests covering thinking events (text + delta), final message, and tool progress