Anthropic prompt caching fails when there are more than 4 cache points #3467

@DouweM

Initial Checks

Description

As pointed out in #3453.

We should post-process the messages sent to the Anthropic API to strip all cache markers but the last 4.

Example Code

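The proposed fix could be sketched as a post-processing pass over the outgoing message list: find every content block carrying a cache marker and drop the marker from all but the last four. A minimal sketch, assuming Anthropic-style message dicts whose `content` is a list of block dicts with an optional `cache_control` key (the function name and message shape here are illustrative, not Pydantic AI's actual internals):

```python
def strip_extra_cache_points(messages, max_cache_points=4):
    """Remove all but the last `max_cache_points` cache_control markers.

    Hypothetical helper: `messages` is a list of Anthropic-style message
    dicts whose "content" is a list of content-block dicts; any block may
    carry a "cache_control" key marking a cache breakpoint.
    """
    # Collect references to every cache-marked block, in message order.
    marked = [
        block
        for message in messages
        for block in message.get("content", [])
        if isinstance(block, dict) and "cache_control" in block
    ]
    # Drop the marker from every block except the last `max_cache_points`.
    excess = marked[:-max_cache_points] if max_cache_points else marked
    for block in excess:
        block.pop("cache_control")
    return messages
```

With six marked blocks, only the last four keep their `cache_control` entry after the pass, which keeps the request within the API's four-breakpoint limit.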
Python, Pydantic AI & LLM client version

latest

Metadata

    Labels

    bug (Something isn't working)
