I’m really struggling to get a Qwen LLM working with Agno when building an agent. I tried using OpenAIChat to point at Qwen3, but it isn’t working correctly, and I couldn’t find any clear steps in the official docs on how to set it up. Would it be possible for Agno to add support for Qwen LLMs?
Hey @Ran! Which inference provider are you using for Qwen models? We recommend using Qwen via Together or Groq for easy access. You could also run it locally via Ollama / vLLM.
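Here's a rough sketch of how wiring a local Qwen model into an agent could look. It assumes the `agno.agent.Agent` and `agno.models.ollama.Ollama` imports exposed by recent Agno versions, and that you've already pulled a Qwen model (e.g. `qwen3`) with Ollama; the exact model id and module paths may differ in your version:

```python
# Minimal sketch: an Agno agent backed by a Qwen model served locally through Ollama.
# Assumes `ollama pull qwen3` has already been run and the Ollama server is up.
from agno.agent import Agent
from agno.models.ollama import Ollama

agent = Agent(
    model=Ollama(id="qwen3"),  # local Qwen model served by Ollama
    markdown=True,
)

# Stream the agent's answer to the terminal
agent.print_response("Summarise what the Agno framework does in two sentences.")
```

If you're instead serving Qwen behind vLLM's OpenAI-compatible server, the same pattern should work by swapping the model class for Agno's OpenAI-compatible option (if your version provides one) and pointing its base URL at the vLLM endpoint rather than the OpenAI API.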
Let us know if you need more help here!