🚧 Early Development

Root cause in seconds, not hours

The open-source AI debugger for distributed apps. Give your AI assistant "hands" inside your running code: profile live services, trace functions, and capture traffic without redeployment.

terminal
$ coral ask "Why is the payment API slow?"
🤖 Analyzing host metrics and continuous profiles...
Host: api-v1-pod-abc (CPU: 12%, Mem: 45%)
Service: payment-api (P95: 2.3s)
Initial findings: High "Off-CPU" wait time detected in process.
Executing coral_profile_functions (strategy: critical_path)...
Analysis of 30s capture:
• ProcessPayment() total: 2.1s
  └─ Mutex Contention: 1.8s (Blocked by Logger.Write)
  └─ VFS Write (Disk I/O): 1.7s (Wait on /var/log/app.log)

Root Cause:
Synchronous logging to a slow disk volume is blocking the main execution thread.
⏱️ Root cause in seconds, zero code changes
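The fix implied by that finding is to take the disk write off the request path. The sketch below is an illustrative stand-in, not Coral's code and not the payment service above: a simulated slow log handler is driven first synchronously, then behind Python's stdlib `QueueHandler`, which enqueues instantly and drains to disk on a background thread.

```python
import logging
import logging.handlers
import queue
import time

class SlowDiskHandler(logging.Handler):
    """Pretend handler: every record costs ~50 ms of 'disk' wait."""
    def emit(self, record):
        time.sleep(0.05)

slow = SlowDiskHandler()

# Synchronous: the request thread pays the disk latency on every log call.
sync_logger = logging.getLogger("demo.sync")
sync_logger.addHandler(slow)

# Asynchronous: QueueHandler enqueues instantly; QueueListener drains
# records to the slow handler on a background thread.
log_queue = queue.Queue()
async_logger = logging.getLogger("demo.async")
async_logger.addHandler(logging.handlers.QueueHandler(log_queue))
listener = logging.handlers.QueueListener(log_queue, slow)
listener.start()

def timed_log(logger):
    """Return how long one log call blocks the calling thread."""
    t0 = time.perf_counter()
    logger.warning("payment processed")
    return time.perf_counter() - t0

sync_cost = timed_log(sync_logger)    # roughly the simulated disk latency
async_cost = timed_log(async_logger)  # just a queue insert
listener.stop()
```

With the queue in place, the request thread's cost drops from the disk latency to a queue insert, which is why the mutex wait in the transcript above disappears once logging is asynchronous.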
$ coral ask "Why is the order-processor using 10GB of RAM?"
🤖 Analyzing host metrics and continuous memory profiles...
Host: worker-node-5 (CPU: 18%, Mem: 85% - 10.2GB)
Service: order-processor (Heap growth: +200MB/hour)
Memory leak detected. Analyzing allocation patterns...
Executing coral_profile_memory...
Top Memory Allocators (30s sample):
• cache.Store: 45.2% (523 MB/s)
  └─ No TTL-based eviction detected

Root Cause:
cache.Store retains entries indefinitely, causing unbounded memory growth.
⏱️ <1 second analysis using your own LLM
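The remediation for that finding is TTL-based eviction. Here is a minimal sketch of the idea; it is illustrative only, and this `TTLCache` is a hypothetical stand-in for the `cache.Store` seen in the transcript, not the real implementation.

```python
import time

class TTLCache:
    """Store with TTL-based eviction: entries expire instead of
    accumulating without bound."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._data = {}  # key -> (value, expiry deadline)

    def store(self, key, value):
        self._data[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        item = self._data.get(key)
        if item is None:
            return None
        value, deadline = item
        if time.monotonic() >= deadline:
            del self._data[key]  # lazy eviction on read
            return None
        return value

    def evict_expired(self):
        """Periodic sweep so keys that are never read don't linger."""
        now = time.monotonic()
        for key in [k for k, (_, d) in self._data.items() if now >= d]:
            del self._data[key]

cache = TTLCache(ttl_seconds=0.1)
cache.store("order:1", {"total": 42})
assert cache.get("order:1") == {"total": 42}
time.sleep(0.15)
cache.evict_expired()
assert "order:1" not in cache._data
```

Lazy eviction on read plus a periodic sweep bounds the heap; without either, every stored key lives forever and the +200 MB/hour growth above is the predictable result.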
<1s · Root cause analysis
Zero · Mandatory code changes
100% · Your infrastructure, your AI
Any · AI assistant via MCP
∞ · Environments supported

The Problem: Debugging Is Still Manual

Modern distributed systems don’t fail in obvious ways. They fail across interactions: between services, retries, network calls, and kernel behavior.

Observability shows you symptoms. Engineers still have to investigate the cause.

🧩 The Context Gap

Metrics spike. A trace looks slow. Logs look noisy. But none of them explain why. You jump between dashboards, correlate timestamps by hand, and try to reconstruct what actually happened.

πŸ”The Escalation Gap

When surface data isn’t enough, you add logs, enable tracing, redeploy, or attach a profiler. Each step is reactive, disruptive, and often changes the very behavior you're trying to observe.

⏱️ Passive Tools, Active Toil

Traditional observability tools collect data. They don’t investigate. Finding the right question, across services and layers, is still a human task, and it dominates MTTR.

Coral Turns This Upside Down

The depth of a kernel debugger
with the reasoning of an AI

Instead of dashboards and guesswork, Coral performs on-demand investigations β€” safely, across your distributed stack.

One Interface for Everything

πŸ‘οΈObserve

Passive, always-on data collection:

  • Zero-config eBPF metrics: Rate, Errors, Duration (RED)
  • Host health: Continuous monitoring of CPU, memory, disk, and network
  • Continuous profiling: Always-on CPU and memory profiling (<1% overhead)
  • OTLP ingestion: For apps using OpenTelemetry
  • Auto-discovered dependencies: Service connection mapping
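For reference, RED metrics are simple rollups of per-request samples. The sketch below shows how rate, error ratio, and a crude P95 duration fall out of a 30-second window; the `Request` shape and the numbers are made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class Request:
    duration_ms: float
    status: int

# Made-up 30-second window of request samples.
WINDOW_S = 30
samples = [Request(12, 200), Request(8, 200), Request(2300, 500), Request(15, 200)]

# Rate: requests per second over the window.
rate = len(samples) / WINDOW_S
# Errors: fraction of requests with a 5xx status.
errors = sum(r.status >= 500 for r in samples) / len(samples)
# Duration: a crude P95 via the nearest-rank method.
durations = sorted(r.duration_ms for r in samples)
p95 = durations[min(len(durations) - 1, int(0.95 * len(durations)))]
```

In production the samples come from eBPF probes rather than application code, but the aggregation is the same idea.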

πŸ”Explore

Deep introspection and investigation tools:

  • Remote execution: Run standard tools like netstat, curl, and grep on any agent
  • Remote shell: Jump into any agent's shell
  • On-demand profiling: High-frequency CPU profiling with Flame Graphs for line-level analysis
  • Live debugging: Attach eBPF uprobes to specific functions to capture args and return values
  • Traffic capture: Sample live requests to understand payload structures
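Coral's uprobes capture this in the kernel with no code changes. As a rough in-process analogy, here is the same idea (recording a function's arguments and return value as it runs) using Python's stdlib `sys.settrace`; the `process_payment` function is a hypothetical target, not part of Coral.

```python
import sys

captured = []  # (kind, payload) pairs recorded by the tracer

def tracer(frame, event, arg):
    """Record arguments on entry and the return value on exit of
    process_payment -- the data a uprobe captures from a live process."""
    if frame.f_code.co_name == "process_payment":
        if event == "call":
            captured.append(("args", dict(frame.f_locals)))
        elif event == "return":
            captured.append(("return", arg))
    return tracer  # keep receiving events for nested frames

def process_payment(order_id, amount):
    return amount * 100  # convert to cents

sys.settrace(tracer)
process_payment("o-1", 20.0)
sys.settrace(None)
```

The difference in production is that a uprobe does this from outside the process, on a live binary, which is what makes redeploy-free capture possible.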

🤖 Diagnose

AI-powered insights for intelligent Root Cause Analysis (RCA):

  • Profiling-enriched summaries: AI gets metrics + code-level hotspots in one call
  • Regression detection: Automatically identifies performance shifts across deployment versions
  • Built-in assistant: Use coral ask directly from your terminal
  • Universal AI integration: Works with Claude Desktop, IDEs, any MCP client
  • Real-time data access: AI queries live observability data, not stale dashboards

MCP Integration: Use Any AI Assistant

Bring your own LLM - Claude Desktop, VS Code, Cursor, or custom apps
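As a concrete example of the wiring: Claude Desktop registers MCP servers in its `claude_desktop_config.json`. The entry below is a hypothetical sketch only; Coral is early-stage, and the `command`/`args` used to launch its MCP endpoint are assumptions, not documented flags.

```json
{
  "mcpServers": {
    "coral": {
      "command": "coral",
      "args": ["colony", "mcp"]
    }
  }
}
```

The `mcpServers`/`command`/`args` shape is the standard Claude Desktop format; only the Coral invocation is invented here, so check the project's repository for the real one.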

🤖

External AI

Claude • VS Code • Cursor
⌨️

Coral Ask

Built-in Terminal AI
MCP Protocol
🧠

Colony Server

MCP Server & Analytics
Encrypted Mesh
πŸ‘οΈ

Agents

eBPF & OTLP Collection
Instrumentation & OTEL
📦

Application

SDK & Runtime

What Makes Coral Different?

A shift in how debugging works

Mode: You explore dashboards → Coral investigates systems
Escalation: You add logs & redeploy → Coral attaches live probes
Instrumentation: Pre-instrument everything → Instrument only when needed
Outcome: Symptoms → Root cause with evidence

How It Works

From observability to insights - a complete journey through Coral's architecture

1

Observe Everywhere

Progressive integration levels - start with zero-config, add capabilities as needed

Level 0

📡 eBPF Probes

Zero-config RED metrics · No code changes required

Level 1

🔭 OTLP Ingestion

Rich traces if using OpenTelemetry · Optional

Level 2

📊 Continuous Intel

Always-on host metrics & continuous profiling · Low overhead

Level 3

🎯 Deep Introspection

On-demand profiling, function tracing & active investigation

Agents collect locally
2

Aggregate Intelligently

Colony receives and stores data from all agents across your distributed infrastructure

→ DuckDB storage for fast analytical queries
→ Cross-agent correlation discovers dependencies
→ Encrypted mesh connects fragmented infrastructure
MCP Server exposes tools
3

Query with AI

Colony exposes MCP server for universal AI integration

→ Works with any MCP client: Claude Desktop, VS Code, Cursor, custom apps
→ Bring your own LLM: Anthropic, OpenAI, or local Ollama
→ Natural language queries: "Why is checkout slow?" instead of PromQL
Insights delivered
4

Act on Insights

Get actionable recommendations in natural language, execute with approval

→ Root cause analysis in <1 second
→ Actionable recommendations with evidence
→ Human-approved execution for safety

Get Started in Minutes

1

Install Coral

# macOS / Linux
brew install coral # COMING SOON

# Or download from GitHub Releases
2

Start the Colony

coral colony start
3

Start Agent & Connect Services

coral agent start
4

Ask Questions

coral ask "What's happening with the API?"

🚧 Early Development

Coral is an experimental project currently in active development.

Stay tuned for future updates.

Contact