
Add broadcast for stream open in process_stream_chunk #308

Open

TheRealNeil wants to merge 1 commit into activeagents:main from qlarity:fix-ollama-provider-stream-completion

Conversation

@TheRealNeil (Contributor)

Fixes #289

Summary

Add a broadcast_stream_open call in OpenAI::ChatProvider#process_stream_chunk to establish the streaming lifecycle.

Root cause

The OpenAI ChatProvider#process_stream_chunk never called broadcast_stream_open, so the streaming flag was never set to true. When streaming finished, broadcast_stream_close hit its return unless streaming guard and returned early, meaning on_stream_close callbacks never fired. This affected both OpenAI and Ollama (which inherits from the OpenAI provider).

The Anthropic and mock providers already called broadcast_stream_open correctly; the OpenAI provider was the odd one out.
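
A minimal sketch of that guard, to make the failure mode concrete. The broadcast_* method names, the streaming flag, and the on_stream_* callbacks come from the description above; the class scaffolding and event recording are assumptions for illustration, not ActiveAgent's actual code:

    # Hedged sketch: only the method names and guard logic are taken from
    # the PR description; the rest is illustrative scaffolding.
    class StreamingProvider
      attr_reader :events

      def initialize
        @streaming = false
        @events = []                     # records which callbacks fired
      end

      def broadcast_stream_open
        return if @streaming             # idempotent: safe to call on every chunk
        @streaming = true
        @events << :on_stream_open
      end

      def broadcast_stream_close
        return unless @streaming         # the guard that swallowed the close event
        @streaming = false
        @events << :on_stream_close
      end
    end

    provider = StreamingProvider.new
    provider.broadcast_stream_close      # no prior open, so this is a no-op
    provider.events                      # => []  (the bug: close never fires)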

Fix

Add broadcast_stream_open at the top of process_stream_chunk, mirroring the pattern in the Anthropic and mock providers. This ensures the full stream lifecycle (open → update → close) fires correctly.

Test plan

  • Verify Ollama streaming now triggers on_stream_close callbacks
  • Verify OpenAI streaming open/close lifecycle works end-to-end
  • Confirm Anthropic and mock provider streaming is unaffected
  • Confirm broadcast_stream_open is idempotent when called on every chunk

Copilot AI left a comment

Pull request overview

Adds the missing stream “open” broadcast to the OpenAI chat streaming path so the stream lifecycle (open → update → close) is correctly established and on_stream_close callbacks can fire (fixing behavior for both OpenAI and Ollama, which inherits from it).

Changes:

  • Call broadcast_stream_open at the start of OpenAI::ChatProvider#process_stream_chunk.

Comment on lines 95 to +99
    def process_stream_chunk(api_response_event)
      instrument("stream_chunk.active_agent")

      broadcast_stream_open

Copilot AI Feb 12, 2026

Add a regression test that asserts the OpenAI (and ideally Ollama) provider streaming lifecycle emits :open and :close events when stream: true. Right now the change fixes a lifecycle bug, but there isn't provider-level test coverage to prevent broadcast_stream_open from being accidentally removed again (similar to the existing mock provider streaming test).
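
A hedged sketch of such a test, reusing the StreamingProvider sketch from the root-cause section above (a real regression test would drive the actual OpenAI and Ollama providers with a stubbed chunk stream rather than this stand-in):

    require "minitest/autorun"

    class StreamLifecycleTest < Minitest::Test
      def test_open_and_close_both_fire
        provider = StreamingProvider.new
        3.times { provider.broadcast_stream_open }   # per-chunk calls stay idempotent
        provider.broadcast_stream_close
        assert_equal [:on_stream_open, :on_stream_close], provider.events
      end
    end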

Development

Successfully merging this pull request may close these issues.

Ollama Provider stream completion callback or done message not being sent
