oms-cognee
Overview
Cognee is an open-source knowledge-graph memory engine for AI agents. It combines a vector store (semantic search), a graph store (entities + relationships), and a relational store (provenance) into a single three-layer memory architecture. The canonical V1 workflow is add → cognify → search: ingest data, build a knowledge graph, then query it. The new V2 memory-oriented API wraps this as remember → recall with an optional improve/forget/serve/agent_memory layer for agent contexts.
- Source: topoteretes/cognee @ v1.0.0 (commit 3c048aa4) [SRC:pyproject.toml:L4]
- Language: Python >=3.10, <3.14 [SRC:pyproject.toml:L10]
- Forge tier: Deep (AST + ccc + QMD + docs fetch)
- Public exports: 34 top-level names in cognee/__init__.py [AST:cognee/__init__.py:L1]
- Confidence: All T1 (AST-verified from source clone)
- Async model: Cognee is async-first; nearly all top-level functions are coroutines and must be awaited [EXT:https://docs.cognee.ai/getting-started/quickstart]
Quick Start
```python
import asyncio
import cognee
from cognee import SearchType
```
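The imports above stop short of the actual workflow. Here is a minimal sketch of the canonical add → cognify → search loop described in the Overview, assuming the top-level coroutines `cognee.add`, `cognee.cognify`, and `cognee.search` and an LLM provider already configured via environment variables; the `cognee` import is deferred into the function so the sketch reads without the package installed:

```python
import asyncio

async def quickstart() -> None:
    # Deferred import: requires `pip install cognee` and an LLM
    # provider configured (e.g. an API key in the environment).
    import cognee
    from cognee import SearchType

    # 1. add: ingest raw data into the relational (provenance) layer.
    await cognee.add("Cognee turns documents into a queryable knowledge graph.")

    # 2. cognify: extract entities and relationships, building the
    #    graph store and vector embeddings.
    await cognee.cognify()

    # 3. search: query across the three memory layers.
    results = await cognee.search(
        query_type=SearchType.GRAPH_COMPLETION,
        query_text="What does cognee do?",
    )
    for result in results:
        print(result)

# To run: asyncio.run(quickstart())
```

Because every step is a coroutine, forgetting an `await` is the most common beginner error; the calls return unawaited coroutine objects and nothing is ingested.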