
[BUG] LiteLLM+Tools not working #312

@dpolistwm

Description

Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched ./issues and there are no duplicates of my issue

Strands Version

0.1.9

Python Version

3.12.3

Operating System

Ubuntu 24.04.2

Installation Method

pip

Steps to Reproduce

  1. Declare a simple tool (e.g. multiply)
  2. Use LiteLLMModel with model_id='gemini/gemini-2.5-flash' or 'ollama/llama3.1' (probably happens with all models)
  3. Create a new agent using this simple tool
  4. Call the agent with a prompt that triggers the tool (e.g. "Calculate 10x20"); a minimal reproduction sketch follows below
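
A minimal reproduction sketch of the steps above. The import paths, the `@tool` decorator, and the `Agent`/`LiteLLMModel` constructor arguments are assumed from the Strands documentation and may need adjusting; the model_id and prompt are the ones from this report.

```python
# Minimal reproduction sketch; import paths assumed from the Strands docs.
from strands import Agent, tool
from strands.models.litellm import LiteLLMModel


@tool
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b


# Also reproduces with "ollama/llama3.1"; provider credentials are assumed
# to be configured through the usual LiteLLM environment variables.
model = LiteLLMModel(model_id="gemini/gemini-2.5-flash")
agent = Agent(model=model, tools=[multiply])

# Expected: the multiply tool is invoked and the agent answers "10 x 20 is 200".
# Actual: output stops right after "Tool #1: multiply".
agent("Calculate 10x20")
```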

Expected Behavior

Tool #1: multiply
10 x 20 is 200

Actual Behavior

Only "Tool #1: multiply" is printed; the process ends before the tool is actually called.

Additional Context

No response

Possible Solution

No response

Related Issues

No response

Labels

area-provider (Related to model providers), bug (Something isn't working)
