Long-term memory for AI agents
Works out-of-the-box with the Camunda AI Agent connector to ground LLM calls with historical context.
The Embeddings Vector Database connector provides bidirectional access to vector stores, letting Camunda processes and AI agents write new embeddings and retrieve the most relevant chunks at runtime. Typical use cases include long-term conversational memory, Retrieval-Augmented Generation (RAG), and semantic search.

The connector is built for Camunda 8.8-alpha4 and is fully compatible with the Camunda AI Agent connector. You can point it at any compatible vector endpoint, and it supports write-only, search-only, and combined read-and-write modes. Searches return the top-k matches with cosine similarity scores, so downstream gateways can route on results deterministically. Note that alpha software interfaces may change without notice.
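To illustrate the retrieval step conceptually, the sketch below shows what "top-k matches with cosine similarity scores" means: embed a query, score each stored chunk against it, and return the k highest-scoring chunks. This is a plain-Python illustration under assumed data, not the connector's actual API; the `top_k` and `cosine` helpers and the sample store are hypothetical.

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, store, k=2):
    # store: list of (chunk_text, embedding) pairs, as a stand-in for a vector DB.
    scored = [(text, cosine(query_vec, vec)) for text, vec in store]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Hypothetical embeddings for three stored chunks (3-dimensional for clarity).
store = [
    ("order history", [0.9, 0.1, 0.0]),
    ("shipping policy", [0.1, 0.9, 0.0]),
    ("refund request", [0.8, 0.2, 0.1]),
]

# A query vector close to "order history" retrieves it first.
results = top_k([1.0, 0.0, 0.0], store, k=2)
```

Because scores are returned alongside each match, a process can branch deterministically, for example by routing only matches above a similarity threshold to the LLM prompt.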