AI agents forget everything.
CoMeT gives them memory. 


Store, Compact, and Recall agent memory across sessions, without losing context or increasing cost

CoMeT

[cognitive memory tree]

agent memory protocol. store, compact, and recall agent memory across sessions, without losing context or increasing cost.

unlike simple markdown files, CoMeT gives agents infinite, lossless, shareable memory that persists and improves over time.

How it works
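
A minimal sketch of the store, compact, and recall cycle described above, assuming a hypothetical in-process memory store. The MemoryTree class, its method names, and the token budget are illustrative placeholders, not CoMeT's actual API.

```python
# Hypothetical sketch of a store -> compact -> recall loop.
# MemoryTree, its methods, and the token budget are illustrative only.
from dataclasses import dataclass, field

@dataclass
class MemoryTree:
    """Toy stand-in for a persistent, compactable agent memory."""
    entries: list[str] = field(default_factory=list)
    token_budget: int = 4_000  # assumed compaction threshold

    def store(self, turn: str) -> None:
        """Persist a new conversation turn."""
        self.entries.append(turn)

    def compact(self) -> None:
        """Fold older turns into a single summary node once over budget.
        A real implementation would summarize with an LLM; here we just collapse."""
        if sum(len(e) for e in self.entries) // 4 > self.token_budget:
            summary = f"summary of {len(self.entries) - 1} earlier turns"
            self.entries = [summary, self.entries[-1]]

    def recall(self, query: str) -> list[str]:
        """Return the stored entries most relevant to the current query."""
        matches = [e for e in self.entries if query.lower() in e.lower()]
        return matches or self.entries[-3:]

# Usage across sessions: the same tree is re-loaded instead of starting from scratch.
memory = MemoryTree()
memory.store("User prefers TypeScript and strict typing.")
memory.compact()
print(memory.recall("typescript"))
```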

Case-study & benchmark

200k Context: LongMemEval-S

Model            CoMeT   Original context   Description
GPT 5.4          0.683   0.667              Better than original
Gemini 3.1 Pro   0.883   0.900              11x less input, only 2% loss
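
The "only 2% loss" figure follows from the two Gemini scores in the table; the numbers below are taken from the table, the arithmetic is a quick check.

```python
# Relative score change for Gemini 3.1 Pro, using the table's numbers.
comet_score, original_score = 0.883, 0.900
relative_loss = (original_score - comet_score) / original_score
print(f"{relative_loss:.1%}")  # 1.9%, i.e. roughly the quoted 2% loss

# "11x less input" means roughly 1/11 of the original prompt tokens are sent.
print(f"{1 - 1/11:.0%} fewer input tokens")  # 91% fewer input tokens
```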

947 turns. 26 hours. Zero context loss.

Architecturally unbounded. We just stopped testing.

Session length comparison (turns):

CoMeT          947 turns ~
Claude Code    ~50 turns
Cursor         ~50 turns
Devin          ~40 turns

Unlike simple markdown files, CoMeT gives agents infinite, lossless, shareable memory that persists and improves over time.

CoBrA: an AI agent built to retain context and evolve across sessions with zero manual intervention.

Powered by CoMeT, CoBrA retains context across sessions, learns from its actions, and continuously improves without manual intervention.

CoBrA in action: