I'm running the simple LangGraph test from Getting Started. This test creates a simple (hard-coded) tool that answers questions about the weather. When I call the sample with the Cohere model cohere.command-latest, it runs fine.
But when I try the Llama model meta.llama-3.3-70b-instruct, it calls the tool, but does not integrate the tool response into the answer.
This is my code:
import logging
import os

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_oci import ChatOCIGenAI  # or langchain_community.chat_models, depending on the package installed
from langgraph.prebuilt import create_react_agent

chat = ChatOCIGenAI(
    auth_type="API_KEY",
    auth_profile=os.getenv("OCI_PROFILE"),
    model_id=os.getenv("OCI_GENAI_MODEL_ID"),
    service_endpoint=os.getenv("OCI_GENAI_SERVICE_ENDPOINT"),
    compartment_id=os.getenv("OCI_COMPARTMENT_OCID"),
    model_kwargs={"temperature": 1, "max_tokens": 500},
)

messages = [
    SystemMessage(content="You are an AI assistant."),
    # AIMessage(content="Hi there human!"),
    HumanMessage(content="What is the weather in SF?"),
]

# get_weather is the hard-coded weather tool from the Getting Started sample
agent = create_react_agent(
    model=chat,
    tools=[get_weather],
    prompt="You are a helpful assistant",
)

# Run the agent
response = agent.invoke({"messages": messages})
log_response(response, logging.DEBUG)  # my own logging helper

# Print the last message as the response
print(response["messages"][-1].content)
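For reference, the hard-coded get_weather tool from the Getting Started sample is assumed to look roughly like this (the return string matches the tool output in the debug logs below):

def get_weather(city: str) -> str:
    """Get the weather for a given city."""
    # Hard-coded response; no real weather lookup is performed.
    return f"It's always sunny in {city}!"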
This is the answer using Cohere:
2025-08-31 18:49:48,415 - DEBUG - tool: It's always sunny in SF!
2025-08-31 18:49:48,415 - DEBUG - ai: The weather in SF is always sunny!
The weather in SF is always sunny!
And this is the answer using Llama:
2025-08-31 18:43:44,568 - DEBUG - tool: It's always sunny in SF!
2025-08-31 18:43:44,568 - DEBUG - ai: I hope this is what you were looking for. let me know if I can help with anything else.
I hope this is what you were looking for. let me know if I can help with anything else.
Notice the debug lines. The tool returns the same (and correct) answer in both cases; it is the LLM that is not incorporating the tool result into its final response.
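A minimal way to confirm this, assuming the default LangGraph state where response["messages"] is a list of message objects, is to print the full message history and compare the tool message with the final AI message:

# Print every message type and its content from the agent's state
for msg in response["messages"]:
    print(f"{msg.type}: {msg.content}")

With Llama, the tool message contains the correct weather string, but the last ai message does not reference it.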