Ollama: ConnectionError: Failed to connect to Ollama

from agno.agent import Agent
from agno.models.ollama import Ollama

agent = Agent(model=Ollama(id="phi3.5:latest"), markdown=True)
agent.print_response("Share a 2 sentence horror story")

Here is the output:
agent.print_response("Share a 2 sentence horror story")
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\agno\agent\agent.py", line 4006, in print_response
run_response = self.run(
^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\agno\agent\agent.py", line 994, in run
return next(resp)
^^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\agno\agent\agent.py", line 706, in _run
model_response = self.model.response(messages=run_messages.messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\agno\models\base.py", line 177, in response
assistant_message, has_tool_calls = self._process_model_response(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\agno\models\base.py", line 313, in _process_model_response
response = self.invoke(messages=messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\agno\models\ollama\chat.py", line 200, in invoke
return self.get_client().chat(
^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\ollama\_client.py", line 333, in chat
return self._request(
^^^^^^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\ollama\_client.py", line 178, in _request
return cls(**self._request_raw(*args, **kwargs).json())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Program Files\anaconda\envs\sentinenv\Lib\site-packages\ollama\_client.py", line 124, in _request_raw
raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. Download Ollama on macOS

The Ollama server is running:
(base) PS C:\Users\18845> curl http://localhost:11434

StatusCode : 200
StatusDescription : OK
Content : Ollama is running
RawContent : HTTP/1.1 200 OK
Content-Length: 17
Content-Type: text/plain; charset=utf-8
Date: Wed, 09 Apr 2025 10:25:00 GMT

                Ollama is running

Forms : {}
Headers : {[Content-Length, 17], [Content-Type, text/plain; charset=utf-8], [Date, Wed, 09 Apr 2025 10:25:00
GMT]}
Images : {}
InputFields : {}
Links : {}
ParsedHtml : mshtml.HTMLDocumentClass
RawContentLength : 17

Hi @Polaris,
Thanks for reaching out and supporting Agno. I’ve shared this with the team; we’re working through all requests one by one and will get back to you soon.
If it’s urgent, please let us know. We appreciate your patience!

Thanks for your reply

Hi there!

Let’s try these steps to debug:

  1. Ensure you have activated the sentinenv environment that corresponds to the D:\Program Files\anaconda installation. It’s easy to accidentally run the script from the wrong environment (such as base, or one from a different Anaconda installation).

  2. Try running this script to test the network connection to Ollama from within Python:

    import urllib.request
    import urllib.error

    OLLAMA_URL = "http://localhost:11434"
    print(f"Attempting to connect to Ollama at: {OLLAMA_URL} from within Python...")
    try:
        # Added a user-agent header, sometimes helps bypass simple blocks
        headers = {'User-Agent': 'Python-urllib/3'}
        req = urllib.request.Request(OLLAMA_URL, headers=headers)
        with urllib.request.urlopen(req, timeout=10) as response:
            print(f"SUCCESS! Status: {response.status}")
            print(f"Response: {response.read().decode('utf-8')}")
    except urllib.error.URLError as e:
        print(f"FAILED to connect via urllib: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
  3. Check environment variables (OLLAMA_HOST): the ollama-python library (used by agno) reads the OLLAMA_HOST environment variable, which could be incorrectly set only within the sentinenv environment (see the snippet after this list).
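
A quick way to check points 1 and 3 together is to print the active interpreter and the OLLAMA_HOST value from inside the environment where the Agent fails. This is a minimal sketch using only the standard library:

    import os
    import sys

    # Which Python is actually running? With sentinenv active, this should
    # point somewhere under D:\Program Files\anaconda\envs\sentinenv.
    print("Interpreter:", sys.executable)

    # The ollama Python client falls back to OLLAMA_HOST when no host is
    # passed explicitly, so an unexpected value here would explain the error.
    print("OLLAMA_HOST:", os.environ.get("OLLAMA_HOST", "<not set>"))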

In case none of the above works, try creating a new conda environment and running the Agent again. Hopefully that resolves the issue :smiley:

I modified the OLLAMA_HOST environment variable. After changing it from 0.0.0.0:11434 to :11434, the agent runs normally. The issue is solved.
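
For anyone hitting the same error: OLLAMA_HOST is read both by the Ollama server (as the address to bind to) and by the ollama Python client (as the address to connect to). 0.0.0.0 is a listen-on-all-interfaces bind address, not a reachable connect address on Windows, which is why the client failed even though curl to localhost succeeded. If you need to keep the server bound to 0.0.0.0, a possible workaround is to point just the client at localhost; note the host keyword on agno’s Ollama model is an assumption here, so check the class in your installed version:

    import os
    from agno.agent import Agent
    from agno.models.ollama import Ollama

    # Option 1: override OLLAMA_HOST for this process only, so the client
    # connects to localhost regardless of what the server binds to.
    os.environ["OLLAMA_HOST"] = "http://localhost:11434"

    # Option 2 (parameter name assumed, verify against your agno version):
    # pass the host explicitly instead of relying on the environment variable.
    agent = Agent(
        model=Ollama(id="phi3.5:latest", host="http://localhost:11434"),
        markdown=True,
    )
    agent.print_response("Share a 2 sentence horror story")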
