Passing <class 'agno.models.message.Message'> to Team.run passes the full object to the LLM provider

Hi,

Passing a List[Message] to team.run sends the full object repr as a 'user' message to the LLM API.

I'm migrating to v2, so I'm not sure if this happened before; I think in v1 passing a list to Team.run was not possible.

Is this expected behaviour? It doesn't seem right, and I'm also not sure how we can pass multiple messages and distinguish between assistant and user messages.
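
For context, here is roughly how I'm calling it (a simplified sketch; the model id and team setup are placeholders for my real configuration):

```python
from agno.agent import Agent
from agno.models.message import Message
from agno.models.openai import OpenAIChat
from agno.team import Team

# Placeholder team: my real members/instructions are more involved.
team = Team(
    members=[Agent(model=OpenAIChat(id="gpt-4o-mini"))],
    model=OpenAIChat(id="gpt-4o-mini"),
)

# What I expected: each Message becomes its own chat turn, keeping its role.
messages = [
    Message(role="user", content="hi"),
    Message(role="assistant", content="Hello! How can I help?"),
    Message(role="user", content="What did I just say?"),
]

response = team.run(messages)
print(response.content)
```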

This is what I see when inspecting the OpenAI logs:

```
[Message(role='user', content=[{'type': 'text', 'text': 'hi'}], name=None, tool_call_id=None, tool_calls=None, audio=None, images=None, videos=None, files=None, audio_output=None, image_output=None, video_output=None, redacted_reasoning_content=None, provider_data=None, citations=None, reasoning_content=None, tool_name=None, tool_args=None, tool_call_error=None, stop_after_tool_call=False, add_to_agent_memory=True, from_history=False, metrics=Metrics(input_tokens=0, output_tokens=0, total_tokens=0, audio_input_tokens=0, audio_output_tokens=0, audio_total_tokens=0, cache_read_tokens=0, cache_write_tokens=0, reasoning_tokens=0, timer=None, time_to_first_token=None, duration=None, provider_metrics=None, additional_metrics=None), references=None, created_at=1757682929)]
```

Is this a bug?

Hi @thomas, thanks for reaching out and supporting Agno. I've shared this with the team; we're working through all requests one by one and will get back to you soon. If it's urgent, please let us know. We appreciate your patience!

Hi @thomas!

Message is an Agno class. At runtime, each Message is converted to the model-specific format. For OpenAI, that would look something like: {"role": "system", "content": "You are a helpful assistant."}
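
To illustrate (this is a sketch of the intended mapping, not the exact internal code), each Agno Message should serialize to an OpenAI-style dict with its role preserved, rather than the whole object being stringified into a single user message:

```python
from agno.models.message import Message

agno_messages = [
    Message(role="user", content="hi"),
    Message(role="assistant", content="Hello! How can I help?"),
]

# Expected OpenAI chat payload: one dict per Message, role preserved.
openai_messages = [{"role": m.role, "content": m.content} for m in agno_messages]
# [{'role': 'user', 'content': 'hi'},
#  {'role': 'assistant', 'content': 'Hello! How can I help?'}]
```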

Can you please share how you are inspecting the OpenAI logs?