LLM as endpoint

I have an LLM exposed as an API endpoint, as shown below. Could you advise on how to use this model for building agents with Agno?

import requests
import json

url = "https://host.com/v1/app_id/chat"

payload = json.dumps({
    "chat_model_id": "Claude 3.5 Sonnet v2",  # or "Claude 3.5 Haiku"
    "history": [
        {
            "assistant": "I'm doing well, thanks! How can I assist you today?",
            "user": "Hi, how are you?"
        }
    ],
    "response_length": "Short",
    "session_id": "string",
    "session_mode": "private",
    "app_id": 1,
    "app_name": "Default Prompt",
    "temperature": {
        "temperature": 0.0,
        "temperature_key": "Balanced"
    },
    "user_docs": [],
    "user_question": "Who are you?",
    "user_time": "Mon Jun 24 2024 11:11:01 GMT+0200 (Central European Summer Time)"
})
headers = {
    "api-key": "<your_api_key>",
    "Content-Type": "application/json"
}
response = requests.post(url, headers=headers, data=payload)

Hi @Dinesh, thanks for reaching out and supporting Agno!
We’ve shared this with the team and are working through requests one by one—we’ll get back to you as soon as we can.
We’ve just kicked off the Global Agent Hackathon, so things are a bit busier than usual. If you’re up for it, we’d love for you to join—it’s a great chance to build, win some exciting prizes and connect with the agent community!
If it’s urgent, just let us know. Thanks for your patience!

Hi @Dinesh, if your endpoint supports an OpenAI-like API structure, you can refer to OpenAI Like - Agno and implement it that way. Let me know if that helps.
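
For reference, here is a minimal sketch of what that could look like, assuming your gateway exposes an OpenAI-compatible chat completions route. The base_url, model id, and API key below are placeholders taken from your snippet, not values I've verified:

from agno.agent import Agent
from agno.models.openai.like import OpenAILike

# Point Agno at the custom endpoint through its OpenAI-compatible interface.
# id, api_key, and base_url are placeholders -- adjust them to match your gateway.
agent = Agent(
    model=OpenAILike(
        id="Claude 3.5 Sonnet v2",               # model name your gateway expects
        api_key="<your_api_key>",
        base_url="https://host.com/v1/app_id",   # must serve /chat/completions
    ),
    markdown=True,
)

agent.print_response("Who are you?")

Note that this only works if the endpoint accepts OpenAI-style requests; the payload in your snippet (chat_model_id, user_question, history) looks custom, so it's worth checking with whoever hosts the gateway whether an OpenAI-compatible route is available.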