
Expose the instruction generated by PromptedOutput on ModelRequest #3489

@xcpky

Description


The instruction generated by PromptedOutput is not saved in message_history. Since other instructions are saved, this doesn't seem to be the expected behavior.

I checked the code:

        if instructions := self._get_instructions(messages, model_request_parameters):
            openai_messages.insert(0, chat.ChatCompletionSystemMessageParam(content=instructions, role='system'))

the instruction is inserted into openai_messages ad hoc for the outgoing request, but it is never written back to the list[ModelMessage] that becomes the message history.

Example Code

from pydantic import BaseModel
from pydantic_ai import Agent

class Step(BaseModel):
    explanation: str
    output: str

class MathReasoning(BaseModel):
    steps: list[Step]
    final_answer: str

agent = Agent('openai:gpt-4o')  # agent definition was omitted in the original report; any model works

result = await agent.run("how can I solve 8x + 7 = -23", output_type=MathReasoning)
print(result.all_messages_json())

Python, Pydantic AI & LLM client version

Python 3.13.9
pydantic_ai 1.20.0
