OllamaClient to a remote server is ignored by agno.models.ollama

Hi guys,

I'm trying to use my remote Ollama server, and it seems like passing an OllamaClient to Ollama isn't working: the model still connects to localhost instead of the remote host.

Example code:

import asyncio
from agno.agent import Agent
from agno.models.ollama import Ollama
from ollama import Client as OllamaClient

task = "The more you take, the more you leave behind. What are they?"
client = OllamaClient(host="10.50.0.1:11434")
regular_agent = Agent(model=Ollama(client=client, id="llama3.1:latest"), markdown=True)
asyncio.run(regular_agent.aprint_response(task, stream=True))

The OllamaClient can be used directly to chat with the remote model, but passing it to Ollama results in Ollama still using the local server.
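A possible explanation (my assumption, not confirmed from agno's source): `aprint_response` goes through an async code path, and if the model lazily builds its own async client with default settings instead of deriving it from the `client` you passed, the sync client's `host` is silently ignored. Here is a minimal stdlib sketch of that bug pattern, using hypothetical `FakeClient`/`FakeModel` classes rather than agno's real ones:

```python
# Hypothetical sketch (not agno's actual code): a model wrapper that accepts
# a pre-configured sync client but lazily constructs its *own* async client
# with defaults. Passing only the sync client then has no effect on async
# calls, which matches the behaviour described above.

class FakeClient:
    def __init__(self, host="http://localhost:11434"):
        self.host = host

class FakeModel:
    def __init__(self, client=None):
        self.client = client        # used for synchronous requests only
        self._async_client = None   # created on first async request

    def get_async_client(self):
        if self._async_client is None:
            # Bug pattern: ignores self.client.host, falls back to defaults
            self._async_client = FakeClient()
        return self._async_client

model = FakeModel(client=FakeClient(host="http://10.50.0.1:11434"))
print(model.get_async_client().host)  # the async path still targets localhost
```

If that is what is happening in agno, any fix would need the async client to be built from (or alongside) the client the caller supplied.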

best regards
B

Hi
Thanks for reaching out and for using Agno! I’ve looped in the right engineers to help with your question. We usually respond within 24 hours, but if this is urgent, just let us know, and we’ll do our best to prioritize it.
Appreciate your patience—we’ll get back to you soon! :smile:

Hey @Barbatio, I tried to run this script with DeepSeek-R1 but couldn't replicate the error. Here is my script:

import asyncio
from agno.agent import Agent
from agno.models.ollama import Ollama
from ollama import Client as OllamaClient

task = "The more you take, the more you leave behind. What are they?"
client = OllamaClient(host="10.50.0.1:11434")
regular_agent = Agent(model=Ollama(client=client, id="deepseek-r1:8b"), markdown=True)
asyncio.run(regular_agent.aprint_response(task, stream=True))

It's odd that you are testing this against the same IP, since 10.50.0.1 is an address on my network. Anyway, I updated to the latest version of Agno but still get `httpcore.ConnectError: All connection attempts failed`.

And there's nothing wrong with the connection, as this works:

client = OllamaClient(host="10.50.0.1:11434")

response = client.chat(model='llama3.2', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response)
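Since the direct `client.chat` call works, one possible workaround while this gets debugged is the `OLLAMA_HOST` environment variable: the ollama Python client falls back to it whenever no `host=` argument is passed, so if agno constructs a default client internally, it should pick up the remote address. This is a hedged suggestion based on the ollama client's behaviour, not something verified against agno:

```python
import os

# Must be set before any default Client()/AsyncClient() is constructed.
# The ollama Python client uses OLLAMA_HOST when no host= argument is given.
os.environ["OLLAMA_HOST"] = "http://10.50.0.1:11434"

# With the variable in place, the agent can be created without passing a
# client at all (untested against agno, sketched here as comments):
#
#   from agno.agent import Agent
#   from agno.models.ollama import Ollama
#   agent = Agent(model=Ollama(id="llama3.1:latest"), markdown=True)
```

The key detail is ordering: the variable has to be exported before the first client is created, or the default localhost URL is already baked in.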

Hey @Barbatio, I also tested it against an ngrok remote server and still couldn't replicate the issue:

import asyncio
from agno.agent import Agent
from agno.models.ollama import Ollama
from ollama import Client as OllamaClient

task = "The more you take, the more you leave behind. What are they?"
client = OllamaClient(host="https://5b5f-2401-4900-1f3e-716f-21dd-a017-dd03-7731.ngrok-free.app")
regular_agent = Agent(model=Ollama(client=client, id="deepseek-r1:8b"), markdown=True)
asyncio.run(regular_agent.aprint_response(task, stream=True))