Add tracing guide for non-OpenAI LLMs in docs/tracing.md #1329
Conversation
docs/tracing.md
Outdated
```diff
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
 - Audio outputs (text-to-speech) are wrapped in a `speech_span()`
 - Related audio spans may be parented under a `speech_group_span()`

-By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
```
"Agent workflow" is the correct default name.
```diff
-By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
```
docs/tracing.md
Outdated
```python
set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))
gemini_api_key = os.getenv("GEMINI_API_KEY")

external_client = AsyncOpenAI(
```
The compatible API layers provided by other platforms are not perfectly compatible, so we'd like to avoid recommending those endpoints. Instead, can you use code examples with LiteLLM + Chat Completions? Also, rather than mentioning a specific model / endpoint, we would like to simply show how to set up a non-OpenAI model in general.
See also: https://openai.github.io/openai-agents-python/models/litellm/
```python
model = "your model name here"
api_key = "your api key here"

model=LitellmModel(model=model, api_key=api_key),
```
docs/tracing.md
Outdated
```diff
@@ -117,4 +152,3 @@ To customize this default setup, to send traces to alternative or additional bac
 - [Okahu-Monocle](https://github.com/monocle2ai/monocle)
 - [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
 - [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
-- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)
```
Please revert this change.
Add guide for free tracing with non-OpenAI LLMs. Fixes #1327