IDE Monitoring

Every AI conversation, captured and searchable

AI coding conversations are ephemeral. Insights, solutions, and architectural decisions made during AI-assisted sessions disappear when the chat window closes. Our IDE extensions fix that.

How it works

One-Click Setup

1. Install the Extension

Add the Hallucinated extension from your IDE marketplace. One click for VS Code, Cursor, and Windsurf. JetBrains plugin available separately.

2. Monitor Automatically

The extension hooks into your IDE natively. It captures AI conversations, code changes, and session metadata without interrupting your flow.

3. Search & Explore

Sessions sync to the cloud and become searchable via the knowledge base. Find past solutions, replay conversations, and extract insights.
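Conceptually, each captured session boils down to a small record that the extension syncs to the cloud. The field names below are a hypothetical illustration, not the product's actual wire format:

```python
from dataclasses import asdict, dataclass, field

@dataclass
class CapturedSession:
    # Hypothetical session record; the real schema is internal to the product.
    session_id: str
    ide: str                 # e.g. "cursor", "vscode", "jetbrains"
    started_at: str          # ISO-8601 timestamp
    messages: list = field(default_factory=list)       # AI conversation turns
    files_touched: list = field(default_factory=list)  # code changes observed

session = CapturedSession(
    session_id="sess-001",
    ide="cursor",
    started_at="2024-01-01T09:00:00Z",
)
session.messages.append({"role": "user", "text": "How do I add auth middleware?"})
payload = asdict(session)  # plain dict, ready to serialize and sync
```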

Extension-First Architecture

Why IDE Extensions?

We chose native extensions over background daemons for better integration, trust, and performance.

Native IDE Access

Extensions hook directly into your IDE APIs — editor state, active files, AI chat panels. No file-watching hacks needed.

Only When You Code

No background processes draining resources. The extension only runs when your IDE is open.

Marketplace Distribution

Install from VS Code Marketplace or JetBrains Plugin Hub. Auto-updates, one-click install, no CLI required.

Transparent & Trusted

Extensions are explicitly installed by you. No mysterious background processes — full visibility into what runs on your machine.

Supported IDEs

Works With Your Stack

Native extensions for all major AI-powered IDEs and editors

Cursor: VS Code Extension
Windsurf: VS Code Extension
Claude Code: VS Code Extension + CLI Hook
VS Code: VS Code Extension
JetBrains: JetBrains Plugin

Cursor and Windsurf are VS Code forks — the same extension works across all three.
Claude Code CLI is supported via shell hooks when used outside an IDE.
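For the CLI case, a shell hook would forward events into the same pipeline. A minimal sketch of the receiving side, assuming the hook delivers a JSON event on stdin; the event shape here is hypothetical, not Claude Code's actual hook schema:

```python
import json

def parse_hook_event(raw: str) -> dict:
    """Parse a hypothetical CLI hook event, keeping only the fields to sync."""
    event = json.loads(raw)
    return {
        "source": "claude-code-cli",            # distinguishes CLI from IDE captures
        "kind": event.get("event", "unknown"),  # hypothetical event-type field
        "prompt": event.get("prompt", ""),      # hypothetical prompt text field
    }

if __name__ == "__main__":
    import sys
    # Invoked by the shell hook with the event JSON piped to stdin.
    record = parse_hook_event(sys.stdin.read())
    print(json.dumps(record))
```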

Cross-Session Memory

RAG-Powered Knowledge Base

Search past sessions using natural language. Our RAG pipeline indexes your conversations with OpenAI embeddings and returns relevant context from any previous session.

OpenAI embeddings (text-embedding-3-small, 1536 dims)
0.7 similarity threshold, max 10 results/query
Up to 1M indexed documents on Enterprise
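The retrieval step can be sketched in a few lines. The 0.7 threshold and 10-result cap match the figures quoted above; the in-memory index layout is a stand-in for whatever vector store the pipeline actually uses:

```python
import math

SIMILARITY_THRESHOLD = 0.7  # minimum cosine similarity, per the spec above
MAX_RESULTS = 10            # max results per query, per the spec above

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(query_embedding, index):
    """index: list of (session_id, embedding) pairs.
    Returns up to MAX_RESULTS (session_id, score) hits above the threshold,
    best match first."""
    scored = [(sid, cosine_similarity(query_embedding, emb)) for sid, emb in index]
    hits = [(sid, score) for sid, score in scored if score >= SIMILARITY_THRESHOLD]
    hits.sort(key=lambda pair: pair[1], reverse=True)
    return hits[:MAX_RESULTS]
```

In production the query would first be embedded with text-embedding-3-small (1536 dimensions) before being handed to the search step.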
Example searches across past sessions:

How did we implement auth middleware?
Cursor · 3 days ago · 0.92 match

Redis caching strategy discussion
Claude Code · 1 week ago · 0.87 match

Database migration patterns
Windsurf · 2 weeks ago · 0.81 match

Features by Plan

More IDEs, longer retention, and deeper insights at every tier

Plan        IDEs      Conversations  Retention       Extras
Free        1 IDE     250            30 days         Basic monitoring
Pro         2 IDEs    1,000          90 days         Conversation replay
Ultra       4 IDEs    3,000          365 days        Knowledge extraction
Teams       Per-user  Pooled         Up to 365 days  Shared knowledge base
Enterprise  10 IDEs   5,000          Custom          Advanced audit logs

Never lose an AI conversation again

Free tier includes 1 IDE and 250 stored conversations.