Conversation

@SteveSandersonMS SteveSandersonMS commented Mar 11, 2025

For Ollama, grouping was broken because the client set a new ResponseId for every chunk, so each chunk was treated as a separate message. It now correctly sets per-response IDs.

For OpenAI, there was a minor issue with handling empty-string ResponseId values. These are now treated as equivalent to null ResponseId values, which avoids introducing extra blank messages.
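The grouping behavior described above can be sketched roughly as follows. This is a minimal, hypothetical Python illustration (the actual library is C# and its types, such as ChatResponseUpdate, differ): a chunk carrying a new non-empty response ID starts a new message, while chunks whose ID is null or an empty string (treated alike, per the fix) are appended to the current message.

```python
def coalesce_updates(updates):
    """Group streaming (response_id, text) chunks into messages.

    A chunk with a new non-empty response_id starts a new message;
    chunks whose response_id is None or "" (treated the same, as in
    the OpenAI fix) are appended to the current message.
    """
    messages = []
    current_id = None
    for response_id, text in updates:
        response_id = response_id or None  # "" behaves like None
        if response_id is not None and response_id != current_id:
            messages.append("")            # new message boundary
            current_id = response_id
        elif not messages:
            messages.append("")            # first chunk had no ID
        messages[-1] += text
    return messages
```

Under the pre-fix Ollama behavior, every chunk arrived with a fresh ID, so each chunk became its own message instead of being coalesced.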


@SteveSandersonMS SteveSandersonMS requested a review from a team as a code owner March 11, 2025 12:30
@github-actions github-actions bot added the area-ai Microsoft.Extensions.AI libraries label Mar 11, 2025
@SteveSandersonMS SteveSandersonMS merged commit 44fd1c1 into main Mar 11, 2025
7 checks passed
@SteveSandersonMS SteveSandersonMS deleted the stevesa/fix-streaming-grouping branch March 11, 2025 14:11
joperezr pushed a commit to joperezr/extensions that referenced this pull request Mar 11, 2025
* For Ollama client, ensure ToChatResponseAsync coalesces text chunks into a single message

* Fix OpenAI case by not treating empty-string response IDs as message boundaries
joperezr pushed a commit to joperezr/extensions that referenced this pull request Mar 11, 2025
Fix grouping of ChatResponseUpdate into ChatMessage (dotnet#6074)

* For Ollama client, ensure ToChatResponseAsync coalesces text chunks into a single message

* Fix OpenAI case by not treating empty-string response IDs as message boundaries
@github-actions github-actions bot locked and limited conversation to collaborators Apr 11, 2025