Hi,
Can someone help me solve this problem please?
Everything seems OK in the interface and I see no errors when debugging, but as soon as I start a conversation, I get this error message!
Thank you for your help
Hi @adilsasse
Thank you for reaching out and using Phidata! I’ve tagged the relevant engineers to assist you with your query. We aim to respond within 24 hours.
If this is urgent, please feel free to let us know, and we’ll do our best to prioritize it.
Thanks for your patience!
Hey @adilsasse can you share the agent config you are running?
I just replaced the Gemini model with GPT-4o and it works.
The model must be configured to run text generation in streaming mode by default, otherwise it doesn't work.
Gemini is not configured for streaming by default.
Can someone help me add this in the code?
Here is the code for one of the agents I use:
from phi.agent import Agent
from phi.model.google import Gemini
from phi.tools.duckduckgo import DuckDuckGo
from phi.storage.agent.sqlite import SqlAgentStorage
from phi.playground import Playground, serve_playground_app

web_search_agent = Agent(
    name="WebAgent",
    description="This is the agent for searching content from the web",
    model=Gemini(id="gemini-1.5-flash"),
    tools=[DuckDuckGo()],
    instructions=["Always include the sources"],
    storage=SqlAgentStorage(table_name="web_search_agent", db_file="agents.db"),
    add_history_to_messages=True,
    show_tool_calls=True,
    markdown=True,
    debug_mode=True,
)

# finance_agent is defined elsewhere in the same file
app = Playground(agents=[finance_agent, web_search_agent]).get_app()

if __name__ == "__main__":
    serve_playground_app("playground:app", reload=True)
Sorry for the inconvenience. I just found the answer in the GitHub repo:
cookbook/playground/gemini_agents.py
You just have to call get_app(use_async=False) instead of get_app() on the line app = Playground(agents=[finance_agent, web_search_agent]).get_app().
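For anyone hitting the same error, here is a minimal sketch of the corrected Playground setup (assuming finance_agent and web_search_agent are the agents defined earlier in the same file):

from phi.playground import Playground, serve_playground_app

# use_async=False makes the Playground run the agents synchronously,
# which avoids the error for models that do not stream by default.
app = Playground(agents=[finance_agent, web_search_agent]).get_app(use_async=False)

if __name__ == "__main__":
    serve_playground_app("playground:app", reload=True)

Nothing else in the agent definitions needs to change.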
Thanks
I am getting the same error and I am using model=Groq(id="llama3-70b-8192").
Can someone suggest how to fix the issue?
Hi @jaibhavani
Thank you for reaching out and using Phidata! I’ve tagged the relevant engineers to assist you with your query. We aim to respond within 48 hours.
If this is urgent, please feel free to let us know, and we’ll do our best to prioritize it.
Thanks for your patience!
@jaibhavani can you share your code?
from phi.agent import Agent
from phi.model.groq import Groq
from phi.tools.yfinance import YFinanceTools
from phi.tools.duckduckgo import DuckDuckGo
import openai
import os
from dotenv import load_dotenv

load_dotenv()
openai.api_key = os.getenv("OPENAI_API_KEY")

web_search_agent = Agent(
    name="web search agent",
    role="Search the web for information",
    model=Groq(id="llama3-70b-8192"),
    tools=[DuckDuckGo()],
    instructions=["Always include the source"],
    show_tool_calls=True,
    markdown=True,
)

finance_agent = Agent(
    name="Finance AI Agent",
    model=Groq(id="llama3-70b-8192"),
    tools=[
        YFinanceTools(stock_price=True, analyst_recommendations=True, stock_fundamentals=True, company_news=True)
    ],
    instructions=["Use tables to display the data"],
    show_tool_calls=True,
    markdown=True,
)

multi_ai_agent = Agent(
    model=Groq(id="llama3-70b-8192"),
    team=[web_search_agent, finance_agent],
    instructions=["Always include sources", "Use tables to display the data"],
    show_tool_calls=True,
    markdown=True,
)

multi_ai_agent.print_response("Summarize analyst recommendations and share the latest news for NVDA", stream=True)
@manthanguptaa I have shared the code.
@jaibhavani you haven't added the code to deploy it on the Playground.
You have to add this code block instead of the print_response call:
from phi.playground import Playground, serve_playground_app

app = Playground(agents=[multi_ai_agent]).get_app()
if __name__ == "__main__":
    serve_playground_app("playground:app", reload=True)
import openai
from phi.agent import Agent
import phi.api
from phi.model.openai import OpenAIChat
from phi.tools.yfinance import YFinanceTools
from phi.tools.duckduckgo import DuckDuckGo
from dotenv import load_dotenv
from phi.model.groq import Groq
import os
import phi
from phi.playground import Playground, serve_playground_app

load_dotenv()
phi.api = os.getenv("PHI_API_KEY")

web_search_agent = Agent(
    name="Web Search Agent",
    role="Search the web for the information",
    model=Groq(id="llama3-groq-70b-8192-tool-use-preview"),
    tools=[DuckDuckGo()],
    instructions=["Always include sources"],
    show_tool_calls=True,
    markdown=True,
)

finance_agent = Agent(
    name="Finance AI Agent",
    model=Groq(id="llama3-groq-70b-8192-tool-use-preview"),
    tools=[
        YFinanceTools(stock_price=True, analyst_recommendations=True, stock_fundamentals=True, company_news=True),
    ],
    instructions=["Use tables to display the data"],
    show_tool_calls=True,
    markdown=True,
)

app = Playground(agents=[finance_agent, web_search_agent]).get_app()

if __name__ == "__main__":
    serve_playground_app("playground:app", reload=True)
@manthanguptaa I already had the code to deploy it.
@manthanguptaa
The error I am getting appears when I select "Tell me about yourself".
Hello
There are two possibilities to solve the problem:
1. Use a model with the "stream" option enabled by default. Check the model documentation to confirm (OpenAI and Gemini models have streaming enabled by default).
2. If the "stream" option isn't enabled by default for your model, modify this line (see the sketch below):
app = Playground(agents=[finance_agent, web_search_agent]).get_app()
Replace get_app() with get_app(use_async=False).
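Applied to the Groq code shared above, only the get_app() call changes; a minimal sketch:

# Before: errors with models that do not stream by default
app = Playground(agents=[finance_agent, web_search_agent]).get_app()

# After: run the agents synchronously instead
app = Playground(agents=[finance_agent, web_search_agent]).get_app(use_async=False)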
@adilsasse that's correct! Groq doesn't support streaming at the moment, which is why the error bubbles up.