Seamless long-term memory for AI agents

The Embeddings Vector Database connector provides bidirectional access to vector stores, enabling Camunda processes and AI agents to write new embeddings and retrieve the most relevant chunks at runtime. Typical use cases include long-term conversational memory, retrieval-augmented generation (RAG), and semantic search. The connector is built for Camunda 8.8-alpha4, is fully compatible with the Camunda AI Agent connector, and can be pointed at any compatible vector endpoint. It supports write-only and search-only modes as well as hybrid read/write, and it returns the top-k matches with cosine similarity scores for deterministic routing. Please note that alpha software interfaces may change without notice.
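
The top-k matches and their cosine similarity scores are what make downstream routing deterministic: gateway conditions can branch on the scores themselves. The following is a minimal, illustrative sketch of that scoring logic in plain Python; the names query_vector, stored_chunks, and the 0.75 threshold are hypothetical placeholders, not part of the connector's interface.

import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vector, stored_chunks, k=3):
    # Rank stored (embedding, text) pairs and keep the k best matches.
    scored = [(cosine_similarity(query_vector, emb), text) for emb, text in stored_chunks]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[:k]

# Deterministic routing: only chunks scoring above a threshold reach the LLM prompt.
matches = top_k([0.1, 0.9, 0.2],
                [([0.1, 0.8, 0.3], "order history"),
                 ([0.9, 0.1, 0.0], "shipping policy")])
relevant = [text for score, text in matches if score >= 0.75]
print(relevant)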


Features and Benefits

Long-term memory for AI agents

Works out-of-the-box with the Camunda AI Agent connector to ground LLM calls with historical context.

Bidirectional vector operations

Write embeddings for new content and perform top-k similarity search from the same connector.
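
To make the write-then-search flow concrete, here is a small, self-contained in-memory sketch. The InMemoryVectorStore class, its write/search methods, and the (record_id, embedding, text) record shape are illustrative stand-ins, not the connector's actual API.

import math

class InMemoryVectorStore:
    # Toy stand-in for a vector store: write embeddings, then search them.

    def __init__(self):
        self.records = []  # list of (record_id, embedding, text)

    def write(self, record_id, embedding, text):
        # "Write" direction: persist a new chunk alongside its embedding.
        self.records.append((record_id, embedding, text))

    def search(self, query, k=3):
        # "Search" direction: rank stored chunks by cosine similarity.
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            return dot / (math.sqrt(sum(x * x for x in a)) *
                          math.sqrt(sum(x * x for x in b)))
        scored = sorted(((cosine(query, emb), rid, text)
                         for rid, emb, text in self.records), reverse=True)
        return scored[:k]

# Usage: remember a conversation turn, then recall it in a later process step.
store = InMemoryVectorStore()
store.write("turn-1", [0.2, 0.7, 0.1], "Customer asked about the refund policy")
print(store.search([0.25, 0.65, 0.05], k=1))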

Details

  • Marketplace release date -
  • Last GitHub commit -
  • Associated Product Group Categories:
    • AI Services
  • Version Compatibility:
  • Used resources:

Support and documentation

Creator
https://camunda.com/

Resources