Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
The instructions generated by PromptedOutput are not saved in message_history. Other instructions are saved, so I don't think this is the expected behavior.
I checked the code:
if instructions := self._get_instructions(messages, model_request_parameters):
    openai_messages.insert(0, chat.ChatCompletionSystemMessageParam(content=instructions, role='system'))
The instructions text is inserted into openai_messages ad hoc, but it is never saved to the list[ModelMessage] that makes up message_history.
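A minimal sketch of how to observe this, assuming PromptedOutput is importable from the top-level pydantic_ai package and that persisted instructions would appear on ModelRequest.instructions (as regular agent instructions do); the model name and CityInfo schema are just placeholders:

from pydantic import BaseModel
from pydantic_ai import Agent, PromptedOutput
from pydantic_ai.messages import ModelRequest

class CityInfo(BaseModel):
    city: str
    country: str

# Placeholder agent; the only point is to force the prompted output mode.
agent = Agent('openai:gpt-4o', output_type=PromptedOutput(CityInfo))

result = agent.run_sync('What is the capital of France?')

# The prompted-output instructions are sent to the model, but the
# ModelRequest entries in the persisted history do not carry them.
for message in result.all_messages():
    if isinstance(message, ModelRequest):
        print(message.instructions)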
Example Code
from pydantic import BaseModel
from pydantic_ai import Agent

agent = Agent('openai:gpt-4o')  # agent construction not shown in the original report; model is a placeholder

class Step(BaseModel):
explanation: str
output: str
class MathReasoning(BaseModel):
steps: list[Step]
final_answer: str
result = await agent.run("how can I solve 8x + 7 = -23", output_type=MathReasoning)
print(result.all_messages_json())

Python, Pydantic AI & LLM client version
Python 3.13.9
pydantic_ai 1.20.0