Hi
How can I get the number of input and output tokens the LLM used?
Hi @shai,
Thanks for reaching out!
Yes, you can definitely access the metrics data. Please refer to the following documentation for detailed guidance:
Agent.run() - Agno
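As a quick illustration, something like this should surface the usage data (a minimal sketch; the exact metrics fields, e.g. input_tokens / output_tokens, depend on your Agno version, so double-check them against the docs above):

from agno.agent import Agent
from agno.models.openai import OpenAIChat

agent = Agent(model=OpenAIChat(id="gpt-4o-mini"))
response = agent.run("What is the capital of France?")

# Aggregated usage for the run; field names such as input_tokens and
# output_tokens are assumptions here -- verify them in the Agent.run() docs.
print(response.metrics)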
Let me know if you have any questions or need help with anything specific!
cool
Can we get a serialized object as a response from agent.run?
Yes @shai, you can easily serialize the response from agent.run, for example to JSON. Here’s how to do it in Python:
import json
from agno.agent import Agent
from agno.models.openai import OpenAIChat

# Minimal agent for the example
agent = Agent(
    name="Simple Agent",
    role="Answer basic questions",
    agent_id="simple-agent",
    model=OpenAIChat(id="gpt-4o-mini"),
)

response = agent.run("What is the capital of France?")

# The run response can be serialized to a JSON string with to_json()
serialized = response.to_json()
print(serialized)
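Since to_json() gives you a JSON string, you can also load it back into a plain Python dict if you want to inspect individual fields (the "content" key below is an assumption based on the response schema, so check the serialized output first):

# Reuses the json import from above
data = json.loads(serialized)
print(data.get("content"))  # the model's answer, assuming the schema includes "content"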
Let me know if you need any more help.