DynamoDB as knowledge base for agents

Hi there,

I’m working on an autonomous, agentic analyst for data science, to analyze time series and other types of results.

My question is how to use a knowledge base in DynamoDB instead of PostgreSQL with vector search. Ideally, the agent would search by keyword (the type of analysis matching the type of data at the moment of analysis) and thereby be able to pull in a large, specific knowledge base for each type of analysis.

Is there any example available using DynamoDB as a knowledge source? (I have searched and not found one yet.)
Thanks

Hi @TheDev, welcome to the Agno community!
Thank you for reaching out and supporting Agno. I’ve shared this with the team; we’re working through all queries one by one and will get back to you soon. If it’s urgent, please let us know. We appreciate your patience!


Hi @TheDev, DynamoDB does not yet offer vector-database capabilities such as vector, semantic, or hybrid search.
However, runtime vector DB selection and support for multiple vector DBs is a planned feature on the Agno roadmap and will be released soon.
Thanks
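In the meantime, since your lookup is keyed by analysis type rather than by semantic similarity, a plain DynamoDB query could serve as an interim keyword-based knowledge store outside Agno’s Knowledge abstraction. A minimal sketch below — the table name `analysis_knowledge`, partition key `analysis_type`, and attribute `content` are all assumptions, not an existing Agno feature:

```python
# Sketch of a keyword-keyed knowledge store in DynamoDB (not an Agno
# feature): items are partitioned by analysis_type, so fetching all
# knowledge for one analysis type is a single Query call.

def build_query(analysis_type: str) -> dict:
    """Kwargs for boto3's Table.query, keyed on the analysis_type partition key."""
    return {
        "KeyConditionExpression": "analysis_type = :t",
        "ExpressionAttributeValues": {":t": analysis_type},
    }

# Usage (requires AWS credentials and the hypothetical table):
# import boto3
# table = boto3.resource("dynamodb", region_name="us-east-2").Table("analysis_knowledge")
# items = table.query(**build_query("time_series"))["Items"]
# context = "\n".join(item["content"] for item in items)
```

The retrieved text could then be passed to the agent as extra context for that run, until native multi-vector-DB support lands.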

Alright @monalisha, thanks for that.

So, regarding the use of storage for the knowledge base: how do I fix a recurring pgvector dimensions error? The error and the code snippet are shown below.

For context, the database and table already exist (using the pgvector Docker container).

Thanks

The result:

I’m sorry, I was unable to find a recipe for pad thai.

Error:

Table might not exist, creating for future use
ERROR Error performing semantic search: (builtins.ValueError) expected 1024
dimensions, not 1536
[SQL: SELECT ai.vectors.id, ai.vectors.name, ai.vectors.meta_data,
ai.vectors.content, ai.vectors.embedding, ai.vectors.usage
FROM ai.vectors ORDER BY ai.vectors.embedding <=> %(embedding_1)s
LIMIT %(param_1)s::INTEGER]
[parameters: [{}]]

The code snippet:


import boto3
from agno.agent import Agent
from agno.knowledge.embedder.aws_bedrock import AwsBedrockEmbedder
from agno.knowledge.knowledge import Knowledge
from agno.vectordb.pgvector import PgVector, SearchType
from agno.models.aws import AwsBedrock
from agno.db.postgres import PostgresDb

boto3_session = boto3.Session(aws_access_key_id="key_id",
                              aws_secret_access_key="key_secret")

db_url = "postgresql+psycopg://ai:ai@localhost:5532/ai"

postgres_db = PostgresDb(db_url=db_url)

vector_db = PgVector(
    table_name="vectors",
    db_url=db_url,
    embedder=AwsBedrockEmbedder(
        aws_region="us-east-2",
        aws_access_key_id="key_id",
        aws_secret_access_key="key_secret",
        id="us.cohere.embed-v4:0",
        session=boto3_session,
    ),
    search_type=SearchType.vector,
)

knowledge = Knowledge(
    name="My PG Vector Knowledge Base",
    description="This is a knowledge base that uses a PG Vector DB",
    vector_db=vector_db,
    contents_db=postgres_db,
)
knowledge.add_content(
    name="Recipes",
    url="https://agno-public.s3.amazonaws.com/recipes/ThaiRecipes.pdf",
    metadata={"doc_type": "recipe_book"},
)

agent = Agent(
    model=AwsBedrock(id="us.meta.llama4-maverick-17b-instruct-v1:0",
                     aws_region="us-east-2",
                     aws_access_key_id="key_id",
                     aws_secret_access_key="key_secret"),
    knowledge=knowledge,
    # Enable the agent to search the knowledge base
    search_knowledge=True,
    # Enable the agent to read the chat history
    read_chat_history=True,
)

agent.print_response("How do I make pad thai?", markdown=True)

vector_db.delete_by_name("Recipes")
# or
vector_db.delete_by_metadata({"doc_type": "recipe_book"})
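For what it’s worth, the “expected 1024 dimensions, not 1536” error usually means the existing `ai.vectors` table was created earlier with an embedding column of `vector(1024)` (from a previous embedder), while `us.cohere.embed-v4:0` is now returning 1536-dimension vectors, so the query embedding no longer matches the column. One fix — assuming the table holds no data worth keeping, and relying on the “Table might not exist, creating for future use” log suggesting PgVector recreates missing tables on the next run — is to drop the stale table:

```python
# Sketch, not Agno API: drop the stale table so it is recreated with the
# dimension the current embedder actually produces. Schema/table names
# match the snippet above.
SCHEMA, TABLE = "ai", "vectors"
drop_stale_table = f"DROP TABLE IF EXISTS {SCHEMA}.{TABLE};"

# Run it against the same database (requires psycopg):
# import psycopg
# with psycopg.connect("postgresql://ai:ai@localhost:5532/ai") as conn:
#     conn.execute(drop_stale_table)
#     conn.commit()
print(drop_stale_table)
```

Alternatively, if the embedder exposes a way to configure the output dimension (Cohere Embed v4 supports several output sizes; check Agno’s embedder options), setting it to 1024 would match the existing table without dropping anything.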