cognee

Reliable LLM Memory for AI Applications and AI Agents

AI Agent Memory · Technology · Open Source

cognee Overview

Cognee implements scalable, modular ECL (Extract, Cognify, Load) pipelines that allow you to interconnect and retrieve past conversations, documents, and audio transcriptions while reducing hallucinations, developer effort, and cost.
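
In code, the Extract, Cognify, Load flow maps onto a handful of async calls. The sketch below assumes cognee's Python package and its add / cognify / search entry points, with an OpenAI key supplied via LLM_API_KEY; exact signatures and defaults can differ between versions, so treat it as illustrative rather than canonical.

import asyncio
import os

import cognee


async def main():
    # cognee reads the LLM credential from the environment.
    os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"

    # Extract: ingest raw content (documents, transcripts, past conversations).
    await cognee.add("Cognee builds a knowledge graph from your documents and conversations.")

    # Cognify: turn the ingested data into a knowledge graph plus embeddings.
    await cognee.cognify()

    # Retrieve: query the memory layer in natural language.
    results = await cognee.search("What does cognee build from documents?")
    for result in results:
        print(result)


asyncio.run(main())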

cognee Key Features

Modular: Cognee is modular by nature, using tasks grouped into pipelines.
Local Setup: By default, Cognee uses LanceDB and NetworkX locally, with OpenAI as the LLM provider.
Vector Stores: Cognee supports LanceDB, Qdrant, PGVector, and Weaviate for vector storage.
Language Models (LLMs): In addition to OpenAI, you can use Anyscale or Ollama as your LLM provider.
Graph Stores: In addition to NetworkX, Neo4j is also supported for graph storage; see the configuration sketch after this list.
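
Swapping out the default local backends is mostly a matter of configuration before any cognee calls run. The environment keys below (VECTOR_DB_PROVIDER, GRAPH_DATABASE_PROVIDER, and the connection settings for Qdrant and Neo4j) follow cognee's .env conventions but are assumptions here and may vary by version, so confirm the exact names against the project's docs.

import os

# LLM provider credential (OpenAI shown; Anyscale or Ollama need their own settings).
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"

# Vector store: LanceDB is the local default; Qdrant, PGVector, and Weaviate are alternatives.
os.environ["VECTOR_DB_PROVIDER"] = "qdrant"           # assumed key name
os.environ["VECTOR_DB_URL"] = "http://localhost:6333"
os.environ["VECTOR_DB_KEY"] = "YOUR_QDRANT_API_KEY"

# Graph store: NetworkX is the local default; Neo4j is the supported alternative.
os.environ["GRAPH_DATABASE_PROVIDER"] = "neo4j"       # assumed key name
os.environ["GRAPH_DATABASE_URL"] = "bolt://localhost:7687"
os.environ["GRAPH_DATABASE_USERNAME"] = "neo4j"
os.environ["GRAPH_DATABASE_PASSWORD"] = "YOUR_NEO4J_PASSWORD"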

cognee Use Cases

Memory for AI Agents
Ontology definition
Entity resolution
Chatbot memory

Pricing

Free
