Unable to call an LLM using LiteLLM

I'm facing an issue with LiteLLM: when using the LiteLLM model, the agent doesn't produce any output. Here is the code:

```python
from agno.agent import Agent
from agno.models.litellm import LiteLLM
from dotenv import load_dotenv

load_dotenv()

agent = Agent(
    model=LiteLLM(
        id="ollama/qwen3:latest",  # LiteLLM provider/model format for a local Ollama model
    ),
    markdown=True,
)

agent.print_response("tell me a joke", stream=True)
```

Response

Hey @anamul, thanks for reaching out and supporting Agno. I've shared this with the team; we're working through all requests one by one and will get back to you soon.
If it's urgent, please let us know. We appreciate your patience!

Hey @anamul, before running this file, do you have the Ollama server running with the qwen3 model pulled?
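
If the server isn't up or the model isn't pulled, that would explain getting no output. A quick way to check from Python — a minimal sketch, assuming Ollama's default endpoint at `http://localhost:11434` and the `requests` package:

```python
import requests

# Minimal sketch: verify the local Ollama server is reachable and that
# qwen3 has been pulled, assuming Ollama's default endpoint (localhost:11434).
try:
    resp = requests.get("http://localhost:11434/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is running. Local models:", models)
    if not any(name.startswith("qwen3") for name in models):
        print("qwen3 is not pulled yet; run `ollama pull qwen3` first.")
except requests.exceptions.ConnectionError:
    print("Ollama is not reachable; start it with `ollama serve`.")
```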
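
If the server is up and qwen3 shows in the list, calling the model through litellm directly can narrow down whether the problem is in the Agno layer or in the LiteLLM/Ollama path — a sketch, assuming the `litellm` package is installed:

```python
from litellm import completion

# Sketch: bypass Agno and call the same Ollama model through litellm directly.
# If this prints a joke, the LiteLLM/Ollama path works and the issue is more
# likely in the Agent setup; if it hangs or errors, the server side is at fault.
response = completion(
    model="ollama/qwen3:latest",
    messages=[{"role": "user", "content": "tell me a joke"}],
)
print(response.choices[0].message.content)
```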