
Incompatible Token Limit Parameter (e.g., max_tokens) for OpenAI models #41

@j-ber

Description


The following models throw this error when used with langchain_oci:
"openai.gpt-5-2025-08-07",
"openai.gpt-5-mini",
"openai.gpt-5-mini-2025-08-07",
"openai.gpt-5-nano",
"openai.gpt-5-nano-2025-08-07",
"openai.o1",
"openai.o1-2024-12-17",
"openai.o3",
"openai.o3-2025-04-16",
"openai.o3-mini",
"openai.o3-mini-2025-01-31",
"openai.o4-mini",
"openai.o4-mini-2025-04-16",

Error processing gpt_5_mini_2025_08_07: {'target_service': 'generative_ai_inference', 'status': 400, 'code': '400', 'opc-request-id': 'xx', 'message': "Invalid 'maxTokens': Unsupported parameter: 'maxTokens' is not supported with this model. Use 'maxCompletionTokens' instead."}
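
For reference, a minimal sketch of the kind of call that triggers the 400 above. The service endpoint, compartment OCID, and token limit are placeholder values, not the exact ones from my run; the relevant part is that the limit is passed as max_tokens in model_kwargs, which gets sent as 'maxTokens' and is rejected for these models:

```python
from langchain_oci import ChatOCIGenAI

# Placeholder endpoint / compartment values; substitute real ones.
chat = ChatOCIGenAI(
    model_id="openai.gpt-5-mini-2025-08-07",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
    # Token limit passed the usual way; the service rejects 'maxTokens' for these models.
    model_kwargs={"max_tokens": 512},
)

chat.invoke("Hello")  # -> 400: Invalid 'maxTokens' ... Use 'maxCompletionTokens' instead
```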

But even when max_completion_tokens (or maxCompletionTokens) is passed instead:
TypeError: Unrecognized keyword arguments: max_completion_tokens
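
For completeness, the attempted workaround looks like this (again a sketch with the same placeholder values); the keyword is apparently not recognized by the request model, so it never reaches the service:

```python
chat = ChatOCIGenAI(
    model_id="openai.gpt-5-mini-2025-08-07",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
    # Swapping in the parameter the error message asks for:
    model_kwargs={"max_completion_tokens": 512},
)
# -> TypeError: Unrecognized keyword arguments: max_completion_tokens
```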

Labels: bug, good first issue, help wanted
