ERROR: Error loading ASGI app. Could not import module "multi_ai_agent"

from phi.agent import Agent
from phi.model.ollama import Ollama
from phi.tools.duckduckgo import DuckDuckGo
from phi.tools.yfinance import YFinanceTools
from phi.playground import Playground, serve_playground_app

# Create a base Ollama model configuration
ollama_model = Ollama(id="llama3.2:1b")

web_search_agent = Agent(
    name="Web Search Agent",
    role="Search the web for information",
    model=ollama_model,
    tools=[DuckDuckGo()],
    instructions=["Always include sources"],
    show_tool_calls=True,
    markdown=True,
)

finance_agent = Agent(
    name="Finance AI Agent",
    role="Get financial data",
    model=ollama_model,
    tools=[
        YFinanceTools(
            stock_price=True,
            analyst_recommendations=True,
            company_info=True
        )
    ],
    instructions=["Use table to display data"],
    show_tool_calls=True,
    markdown=True,
)

multi_ai_agent = Agent(
    model=ollama_model,
    team=[web_search_agent, finance_agent],
    instructions=["Always include sources", "Use table to display data"],
    show_tool_calls=True,
    markdown=True,
)

# Response on the terminal (uncomment to test without the playground)
# multi_ai_agent.print_response("Summarize analyst recommendations and share the latest news for NVDA", stream=True)

# Create a Playground app that exposes the multi-agent team
app = Playground(agents=[multi_ai_agent]).get_app()

if __name__ == "__main__":
    serve_playground_app("multi_ai_agent:app", reload=True, port=7777)

The terminal response from the commented-out print_response command works fine.

After this I ran phi auth. It authenticated, and my email address was shown in the terminal.

After that I ran the file, and it gave the error shown at the top.

On the phidata website, inside the playground, when I select LOCALHOST:7777 it gives this error:

The endpoint is not available
Choose a different endpoint

How do I fix it?


@Taimour the line

serve_playground_app("multi_ai_agent:app", reload=True, port=7777)

needs your file name instead of multi_ai_agent. So if your file name is test, it would be

serve_playground_app("test:app", reload=True, port=7777)

Thank you. After changing the filename it progressed further, but it still doesn't work fully.
After changing the filename, it shows this on the console:

Meanwhile, the phidata website now shows a green circle next to localhost:7777 in the upper right corner, and a message in the bottom right corner that the connection was successful.

After this I wrote a prompt in the playground, and it gave a (network) error on phidata.

Meanwhile, the console is giving an error about the OPENAI_API_KEY, even though I am using Ollama:

INFO:     127.0.0.1:41142 - "GET /v1/playground/status HTTP/1.1" 200 OK
INFO:     127.0.0.1:41142 - "GET /v1/playground/agent/get HTTP/1.1" 200 OK
INFO:     127.0.0.1:41142 - "POST /v1/playground/agent/sessions/all HTTP/1.1" 404 Not Found
INFO:     127.0.0.1:60430 - "OPTIONS /v1/playground/agent/run HTTP/1.1" 200 OK
INFO:     127.0.0.1:60440 - "POST /v1/playground/agent/run HTTP/1.1" 200 OK
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/responses.py", line 259, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/responses.py", line 255, in wrap
    await func()
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/responses.py", line 232, in listen_for_disconnect
    message = await receive()
              ^^^^^^^^^^^^^^^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/uvicorn/protocols/http/httptools_impl.py", line 563, in receive
    await self.message_event.wait()
  File "/usr/lib/python3.13/asyncio/locks.py", line 213, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7f6eea955010

During handling of the above exception, another exception occurred:

  + Exception Group Traceback (most recent call last):
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/uvicorn/protocols/http/httptools_impl.py", line 409, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |         self.scope, self.receive, self.send
  |         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |     )
  |     ^
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
  |     return await self.app(scope, receive, send)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/applications.py", line 113, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/middleware/errors.py", line 187, in __call__
  |     raise exc
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/middleware/errors.py", line 165, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/middleware/cors.py", line 93, in __call__
  |     await self.simple_response(scope, receive, send, request_headers=headers)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/middleware/cors.py", line 144, in simple_response
  |     await self.app(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     raise exc
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/routing.py", line 715, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/routing.py", line 735, in app
  |     await route.handle(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/routing.py", line 288, in handle
  |     await self.app(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/routing.py", line 76, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     raise exc
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/routing.py", line 74, in app
  |     await response(scope, receive, send)
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/responses.py", line 252, in __call__
  |     async with anyio.create_task_group() as task_group:
  |                ~~~~~~~~~~~~~~~~~~~~~~~^^
  |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/anyio/_backends/_asyncio.py", line 815, in __aexit__
  |     raise BaseExceptionGroup(
  |         "unhandled errors in a TaskGroup", self._exceptions
  |     )
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/responses.py", line 255, in wrap
    |     await func()
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/starlette/responses.py", line 244, in stream_response
    |     async for chunk in self.body_iterator:
    |     ...<2 lines>...
    |         await send({"type": "http.response.body", "body": chunk, "more_body": True})
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/playground/router.py", line 400, in chat_response_streamer
    |     async for run_response_chunk in run_response:
    |         run_response_chunk = cast(RunResponse, run_response_chunk)
    |         yield run_response_chunk.to_json()
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/agent/agent.py", line 2149, in _arun
    |     async for model_response_chunk in model_response_stream:  # type: ignore
    |     ...<32 lines>...
    |                 )
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/model/openai/chat.py", line 927, in aresponse_stream
    |     async for response in self.ainvoke_stream(messages=messages):
    |     ...<22 lines>...
    |             self.add_response_usage_to_metrics(metrics=metrics, response_usage=response.usage)
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/model/openai/chat.py", line 406, in ainvoke_stream
    |     async_stream = await self.get_async_client().chat.completions.create(
    |                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |     ...<5 lines>...
    |     )
    |     ^
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/resources/chat/completions.py", line 1720, in create
    |     return await self._post(
    |            ^^^^^^^^^^^^^^^^^
    |     ...<42 lines>...
    |     )
    |     ^
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/_base_client.py", line 1843, in post
    |     return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/_base_client.py", line 1537, in request
    |     return await self._request(
    |            ^^^^^^^^^^^^^^^^^^^^
    |     ...<5 lines>...
    |     )
    |     ^
    |   File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/_base_client.py", line 1638, in _request
    |     raise self._make_status_error_from_response(err.response) from None
    | openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: ollama. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

I think we are now close to solving this error. Can you kindly guide me further on how to solve it?
Thank you very much.

The error seems to be about an invalid API key. You might want to check your API rate limit and make sure you are using the correct API key, since it is asking for an OpenAI API key. But you said you are using Ollama. Can you share your code so we can better understand what the problem really is?

My code is in the first message. Kindly have a look. Thank you.

Can you try running your code using this command and see if it throws the error?

multi_ai_agent.cli_app(stream=False, show_full_reasoning=True)
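(cli_app starts an interactive chat loop in the terminal; as the traceback below shows, it calls print_response under the hood, so it exercises the agent directly and takes the playground's network layer out of the equation.)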

OK, sure. Give me a moment, and I will try it now and get back to you. Thank you for your response.

It gives this response on the terminal.

After this, when I type something, it gives the same error again:

 😎 User : sumarize news about NVDA
▰▱▱▱▱▱▱ Thinking...
Traceback (most recent call last):
  File "/home/user/Documents/Work/Python/frameworks_agents/phidata_multiagent.py", line 53, in <module>
    multi_ai_agent.cli_app(stream=False, show_full_reasoning=True)
    ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/agent/agent.py", line 3217, in cli_app
    self.print_response(message=message, stream=stream, markdown=markdown, **kwargs)
    ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/agent/agent.py", line 2904, in print_response
    run_response = self.run(message=message, messages=messages, stream=False, **kwargs)
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/agent/agent.py", line 2070, in run
    return next(resp)
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/agent/agent.py", line 1842, in _run
    model_response = self.model.response(messages=messages_for_model)
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/model/openai/chat.py", line 591, in response
    response: Union[ChatCompletion, ParsedChatCompletion] = self.invoke(messages=messages)
                                                            ~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/phi/model/openai/chat.py", line 343, in invoke
    return self.get_client().chat.completions.create(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        model=self.id,
        ^^^^^^^^^^^^^^
        messages=[self.format_message(m) for m in messages],  # type: ignore
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        **self.request_kwargs,
        ^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/_utils/_utils.py", line 279, in wrapper
    return func(*args, **kwargs)
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/resources/chat/completions.py", line 859, in create
    return self._post(
           ~~~~~~~~~~^
        "/chat/completions",
        ^^^^^^^^^^^^^^^^^^^^
    ...<40 lines>...
        stream_cls=Stream[ChatCompletionChunk],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/_base_client.py", line 1280, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/_base_client.py", line 957, in request
    return self._request(
           ~~~~~~~~~~~~~^
        cast_to=cast_to,
        ^^^^^^^^^^^^^^^^
    ...<3 lines>...
        retries_taken=retries_taken,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/home/user/Documents/Work/Python/frameworks_agents/.venv/lib/python3.13/site-packages/openai/_base_client.py", line 1061, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: ollama. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

@Taimour you need to specify the model in the multi_ai_agent as well. It defaults to OpenAI if no model is provided.
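For reference, a minimal sketch of the fix, mirroring the code in the first post: give the team-level agent its own model explicitly, otherwise phidata falls back to OpenAI and the playground request fails with the 401 invalid_api_key error shown above.

from phi.agent import Agent
from phi.model.ollama import Ollama

# Explicit local model for the team agent; without it, phidata
# defaults to OpenAI and demands an OPENAI_API_KEY.
multi_ai_agent = Agent(
    model=Ollama(id="llama3.2:1b"),
    team=[web_search_agent, finance_agent],
    instructions=["Always include sources", "Use table to display data"],
    show_tool_calls=True,
    markdown=True,
)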


Thank you, it is working now.


But previously you had already mentioned the model and were still getting the error.

Hi,
I see a similar stack trace, but the reason is different.

agent = Agent(
    name="My Finance Model",
    model=Gemini(id="gemini-1.5-flash"),
    prevent_hallucinations=True,
    tools=[
        yt.top_best_performance_stocks_day_data,
        yt.top_best_performance_52_week_stocks_data,
        YFinanceTools(enable_all=True),
        yt.create_stock_chart,
    ],
    show_tool_calls=True,
    debug_mode=True,
    monitoring=True,
    add_chat_history_to_messages=True,
    num_history_responses=5,
    description="You are an investment analyst that researches stock prices, analyst recommendations, and stock fundamentals. For the best gainers of the day.",
    instructions=["some instructions"],
)


app = Playground(agents=[agent]).get_app()

Exception

Exception Group Traceback (most recent call last):
  |   File " financial_agent/venv/lib/python3.13/site-packages/uvicorn/protocols/http/httptools_impl.py", line 409, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |         self.scope, self.receive, self.send
  |         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |     )
  |     ^
  |   File " financial_agent/venv/lib/python3.13/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
  |     return await self.app(scope, receive, send)
  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  |   File " financial_agent/venv/lib/python3.13/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/applications.py", line 113, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/middleware/errors.py", line 187, in __call__
  |     raise exc
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/middleware/errors.py", line 165, in __call__
  |     await self.app(scope, receive, _send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/middleware/cors.py", line 93, in __call__
  |     await self.simple_response(scope, receive, send, request_headers=headers)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/middleware/cors.py", line 144, in simple_response
  |     await self.app(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     raise exc
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
  |     await app(scope, receive, sender)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/routing.py", line 715, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/routing.py", line 735, in app
  |     await route.handle(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/routing.py", line 288, in handle
  |     await self.app(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/routing.py", line 76, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     raise exc
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
  |     await app(scope, receive, sender)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/routing.py", line 74, in app
  |     await response(scope, receive, send)
  |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/responses.py", line 252, in __call__
  |     async with anyio.create_task_group() as task_group:
  |                ~~~~~~~~~~~~~~~~~~~~~~~^^
  |   File " financial_agent/venv/lib/python3.13/site-packages/anyio/_backends/_asyncio.py", line 815, in __aexit__
  |     raise BaseExceptionGroup(
  |         "unhandled errors in a TaskGroup", self._exceptions
  |     )
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/responses.py", line 255, in wrap
    |     await func()
    |   File " financial_agent/venv/lib/python3.13/site-packages/starlette/responses.py", line 244, in stream_response
    |     async for chunk in self.body_iterator:
    |     ...<2 lines>...
    |         await send({"type": "http.response.body", "body": chunk, "more_body": True})
    |   File " financial_agent/venv/lib/python3.13/site-packages/phi/playground/router.py", line 471, in chat_response_streamer
    |     async for run_response_chunk in run_response:
    |         run_response_chunk = cast(RunResponse, run_response_chunk)
    |         yield run_response_chunk.to_json()
    |   File " financial_agent/venv/lib/python3.13/site-packages/phi/agent/agent.py", line 2155, in _arun
    |     raise NotImplementedError(f"{self.model.id} does not support streaming")
    | NotImplementedError: gemini-1.5-flash does not support streaming
    +------------------------------------

My guess is that streaming for gemini-1.5-flash is simply not implemented yet, per:

NotImplementedError: gemini-1.5-flash does not support streaming

Is there any workaround for it?

phidata 2.7.9
Thanks a lot

@mikeR you can replace

app = Playground(agents=[agent]).get_app()

with

app = Playground(agents=[agent]).get_app(use_async=False)
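This registers the synchronous router, so playground requests should go through the agent's blocking run path instead of the async streaming path (_arun in the traceback above) that raises the NotImplementedError.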