Python: .NET: Fix .NET conversation memory in DevUI (#3484) #4294
victordibia wants to merge 5 commits into main from
Conversation
Pull request overview
This PR fixes a critical bug in the .NET DevUI where agents couldn't remember prior messages across conversation turns. The root cause was that InMemoryResponsesService stored conversation items after each execution but never loaded them before the next one, resulting in agents receiving only the current message with no history context.
Changes:
- Added `ItemResourceConversions.ToChatMessages()` to convert stored conversation items back to `ChatMessage` objects for history injection
- Modified `InMemoryResponsesService` to load prior conversation items from storage and pass them to executors before each run
- Updated the `IResponseExecutor` interface and both implementations (`AIAgentResponseExecutor`, `HostedAgentResponseExecutor`) to accept an optional `conversationHistory` parameter
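The history-injection flow these changes describe can be sketched language-agnostically. The following is a minimal Python illustration of the prepend step, not the actual C# executor code; the function name and message shape are hypothetical:

```python
def build_executor_input(conversation_history, current_input):
    """Prepend stored conversation history (if any) to the current turn's messages."""
    messages = []
    if conversation_history:
        messages.extend(conversation_history)
    messages.extend(current_input)
    return messages

history = [("user", "Hi"), ("assistant", "Hello!")]
turn = [("user", "What did I just say?")]
merged = build_executor_input(history, turn)
print(len(merged))  # 3: prior user + prior assistant + new user
```

Because `conversation_history` is optional, a first turn (no prior items) passes `None` and the executor behaves exactly as before the fix.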
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 2 comments.
Show a summary per file
| File | Description |
|---|---|
| dotnet/src/Microsoft.Agents.AI.Hosting.OpenAI/Responses/Converters/ItemResourceConversions.cs | New converter that transforms stored ItemResource objects back to ChatMessage objects, handling messages, function calls, and function results |
| dotnet/src/Microsoft.Agents.AI.Hosting.OpenAI/Responses/InMemoryResponsesService.cs | Added conversation history loading logic that retrieves up to 100 prior items in ascending order and passes them to the executor |
| dotnet/src/Microsoft.Agents.AI.Hosting.OpenAI/Responses/IResponseExecutor.cs | Extended interface to accept optional conversationHistory parameter |
| dotnet/src/Microsoft.Agents.AI.Hosting.OpenAI/Responses/AIAgentResponseExecutor.cs | Prepends conversation history to input messages before agent execution |
| dotnet/src/Microsoft.Agents.AI.Hosting.OpenAI/Responses/HostedAgentResponseExecutor.cs | Prepends conversation history to input messages before agent execution |
| dotnet/tests/Microsoft.Agents.AI.Hosting.OpenAI.UnitTests/TestHelpers.cs | Added ConversationMemoryMockChatClient that captures full message lists for testing conversation history |
| dotnet/tests/Microsoft.Agents.AI.Hosting.OpenAI.UnitTests/OpenAIResponsesIntegrationTests.cs | Added comprehensive test verifying conversation history is passed correctly on subsequent requests |
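The loading logic summarized in the table above, fetch up to 100 prior items in ascending order and convert them back to chat messages, might look roughly like this. This is a Python sketch with a hypothetical storage API, not the actual C# `InMemoryResponsesService` code:

```python
MAX_HISTORY_ITEMS = 100  # matches the "up to 100 prior items" described above

def to_chat_message(item):
    # Stand-in for ItemResourceConversions.ToChatMessages(): map a stored
    # item back to a (role, text) message.
    return (item["role"], item["text"])

def load_conversation_history(storage, conversation_id):
    """Fetch prior items oldest-first and convert them to chat messages."""
    if conversation_id is None:
        return None
    items = storage.list_items(conversation_id, limit=MAX_HISTORY_ITEMS, order="asc")
    return [to_chat_message(item) for item in items]

class FakeStorage:
    """Minimal in-memory stand-in for the conversation item store."""
    def __init__(self, items):
        self._items = items
    def list_items(self, conversation_id, limit, order):
        return self._items[:limit] if order == "asc" else list(reversed(self._items))[:limit]

storage = FakeStorage([{"role": "user", "text": "Hi"},
                       {"role": "assistant", "text": "Hello!"}])
print(load_conversation_history(storage, "conv_1"))
```

Ascending order matters: the executor prepends this list verbatim, so items must already be oldest-first for the model to see the conversation in chronological order.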
dotnet/tests/Microsoft.Agents.AI.Hosting.OpenAI.UnitTests/OpenAIResponsesIntegrationTests.cs
dotnet/src/Microsoft.Agents.AI.Hosting.OpenAI/Responses/InMemoryResponsesService.cs
…ent type on both objects (`.type`) and dicts (`.get("type")`). Replaced all 4 bare `event.type` accesses in `_executor.py` (lines 267, 477, 499, 523).
Root cause: PR #3690 changed `event.__class__.__name__ == "RequestInfoEvent"` (safe) to `event.type == "request_info"` (crashes on dicts), but `_execute_workflow` still yields raw dicts on error paths.
Test: `test_workflow_error_yields_dict_event_without_crash` mocks a workflow that raises and verifies `execute_entity` consumes the dict error events without crashing.
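The duck-typed access described in that commit can be sketched as a small helper. This is a hypothetical illustration, not the actual code in `_executor.py`:

```python
def get_event_type(event):
    """Read the event type from an object (.type) or a dict (.get("type"))."""
    if isinstance(event, dict):
        return event.get("type")
    return getattr(event, "type", None)

class RequestInfoEvent:
    type = "request_info"

print(get_event_type(RequestInfoEvent()))  # request_info
print(get_event_type({"type": "error"}))   # error
```

Using `getattr(..., None)` and `dict.get` keeps the call safe on both shapes, which is exactly what the bare `event.type` access broke for the raw dicts yielded on error paths.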
```csharp
public async IAsyncEnumerable<StreamingResponseEvent> ExecuteAsync(
    AgentInvocationContext context,
    CreateResponse request,
    IReadOnlyList<ChatMessage>? conversationHistory = null,
```
Does it make sense to add conversationHistory to AgentInvocationContext?
History sounds like an essential part of the conversation.
```csharp
var context = new AgentInvocationContext(new IdGenerator(responseId: responseId, conversationId: state.Response?.Conversation?.Id));

// Load conversation history if a conversation ID is provided
IReadOnlyList<Extensions.AI.ChatMessage>? conversationHistory = null;
```
While this change does fix loading of chat history, the code is still broken for other scenarios.
We should really be storing and loading the AgentSession based on the request conversation id. We can then attach a custom per-run ChatHistoryProvider which loads chat history for the session from _conversationStorage and persists new messages into _conversationStorage. If we don't store the session, memory scenarios are broken for all agents exposed via this host.
Happy to work with you to get this fixed for all memory scenarios.
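The reviewer's suggested design, persist the session keyed by conversation ID and attach a per-run chat-history provider backed by the conversation storage, could be sketched like this. This is a Python sketch; the class and method names are hypothetical, not the actual Microsoft.Agents.AI API:

```python
class ConversationBackedHistoryProvider:
    """Loads a session's prior messages from storage and persists new ones.

    `storage` is a plain dict standing in for _conversationStorage,
    keyed by conversation ID.
    """

    def __init__(self, storage, conversation_id):
        self._storage = storage
        self._conversation_id = conversation_id

    def load(self):
        # Attach point before a run: history for this conversation.
        return list(self._storage.get(self._conversation_id, []))

    def save(self, new_messages):
        # Attach point after a run: persist the new turn's messages.
        self._storage.setdefault(self._conversation_id, []).extend(new_messages)

storage = {}
ConversationBackedHistoryProvider(storage, "conv_1").save(
    [("user", "Hi"), ("assistant", "Hello!")]
)
# A later run for the same conversation sees the persisted history:
print(ConversationBackedHistoryProvider(storage, "conv_1").load())
```

Routing both load and save through one provider tied to the session is what would make memory work for every agent exposed via the host, rather than only for the executors patched in this PR.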
Fix .NET conversation memory in DevUI (#3484)
Agents hosted via the .NET DevUI couldn't remember prior messages across turns. The root cause:
`InMemoryResponsesService` stored conversation items after each execution but never loaded them before the next one, so agents always received only the current message with no history.

What changed
- `ItemResourceConversions.ToChatMessages()`: converts stored `ItemResource` objects back to `ChatMessage` for history injection
- `InMemoryResponsesService` now loads prior conversation items from storage, converts them to a `ChatMessage` list, and passes them to the executor before each run
- `IResponseExecutor.ExecuteAsync` and both implementations (`AIAgentResponseExecutor`, `HostedAgentResponseExecutor`) accept an optional `conversationHistory` parameter and prepend it to the current input

Test plan
- `CreateResponse_WithConversation_SecondRequestIncludesPriorMessagesAsync`: sends two messages in the same conversation and asserts the agent receives all 3 messages (prior user + prior assistant + new user) on the second call

Other Changes
Contribution Checklist
Closes #3484