VERSION 0.1.0

Engram: The Neural Interoperability Layer for AI Agents

CONNECT ANY AGENT. ANY TOOL. ANY API.

Engram is a mission-critical middleware platform that provides a universal bridge for AI agents, tools, and third-party APIs. It abstracts away protocol complexities (A2A, MCP, ACP), auto-heals semantic schema mismatches through a combination of ontologies and machine learning, and orchestrates multi-agent tasks through a performance-aware routing engine.


1. System Vision & The Request Lifecycle

Engram acts as the "connective tissue" of the agentic web. It is designed to be a transparent, high-throughput layer that ensures semantic fidelity and operational reliability across heterogeneous agent ecosystems.

The Full Execution Cycle

A typical request (e.g., a natural language command via POST /api/v1/delegate) undergoes the following transformations:

  1. Ingress & Auth: The edge gateway verifies the Engram Access Token (EAT) for tool-level permissions.
  2. Intent Resolution: The IntentResolver decomposes the prompt into AtomicTask objects with confidence scores.
  3. Pathfinding: The ProtocolGraph (NetworkX) runs a Dijkstra shortest-path search to find the most efficient corridor between protocols (e.g., NL -> MCP -> MiroFish).
  4. Envelope Normalization: The TranslatorEngine applies Version Deltas (BFS-based upgrades) to ensure the source message matches the current standard before translation.
  5. Multi-Hop Translation: For each hop in the determined path:
    • Structural Translation: Field names and envelopes are morphed.
    • Tiered Semantic Resolution: (See Section 3).
    • Crypto Snapshot: A SHA-256 state proof is generated for the hop.
  6. Task Enqueueing: The resulting payload is persisted to the Postgres Task Queue.
  7. Async Processing: The TaskWorker picks up the task, executes the final connector (e.g., MiroFish or a Tool), and manages the lifecycle (Retries -> Ack -> Success/Dead-Letter).
  8. Verification: The final HandoffResult returns an aggregate execution proof (v1:agg) proving the integrity of the entire chain.
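The pathfinding in step 3 is, at its core, a weighted shortest-path search over the protocol graph. The sketch below reproduces that search with only the standard library (the real ProtocolGraph uses NetworkX); the protocol names and edge weights here are illustrative assumptions, not production values:

```python
import heapq

def dijkstra_path(graph, source, target):
    """Shortest weighted path; a stdlib stand-in for the NetworkX search
    the ProtocolGraph performs. graph: {node: {neighbor: weight}}."""
    frontier = [(0.0, source, [source])]  # (cost so far, node, path taken)
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == target:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(frontier, (cost + weight, neighbor, path + [neighbor]))
    raise ValueError(f"no corridor from {source} to {target}")

# Hypothetical corridor weights; real values come from the DiscoveryService.
graph = {
    "NL":  {"MCP": 1.2, "A2A": 6.15},
    "MCP": {"MiroFish": 1.0},
    "A2A": {"MiroFish": 1.0},
}
cost, path = dijkstra_path(graph, "NL", "MiroFish")
print(path)  # ['NL', 'MCP', 'MiroFish']
```

With these weights the NL -> MCP -> MiroFish corridor (total cost 2.2) wins over the A2A route (7.15), which is exactly how a degraded agent gets routed around.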

2. Advanced Protocol Routing

The Dynamic Protocol Graph

Engram treats protocols as nodes and translators as edges. Each edge carries a Dynamic Weight that shifts based on real-time agent performance.

The Weighting Formula

Weights are computed by the DiscoveryService and injected into the graph during pathfinding:

$$W = 1.0 + (L \times 0.1) + ((1.0 - R) \times 50.0)$$
  • Base (1.0): Minimum cost per hop.
  • Latency (L): 0.1 point penalty per second of average response time.
  • Reliability (R): 5.0 points per 10% drop below 100% success rate.

Example: An agent with 1.5 s average latency and a 90% success rate gets a weight of 1.0 + 0.15 + 5.0 = 6.15, making each hop through it roughly six times as costly to the pathfinder as a hop through a perfectly reliable, sub-second agent (weight ≈ 1.0).
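The formula reduces to a one-line helper. This is a hedged sketch (`edge_weight` is not a documented Engram function name), reproducing the worked example above:

```python
def edge_weight(latency_s: float, reliability: float) -> float:
    """Dynamic edge weight: W = 1.0 + (L * 0.1) + ((1.0 - R) * 50.0)."""
    return 1.0 + (latency_s * 0.1) + ((1.0 - reliability) * 50.0)

print(edge_weight(1.5, 0.90))  # ~6.15, the degraded agent from the example
print(edge_weight(0.4, 1.00))  # ~1.04, a fast and fully reliable agent
```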

3. Tiered Semantic Resolution (The Bridge)

Engram's SemanticMapper solves the "Data Silo" problem using a tiered fall-through strategy to maximize both speed and depth.

Tier               Technology       Rationale
1. Validation      jsonschema       Ensures the incoming payload matches the expected contract.
2. Flattening      Recursive walk   Converts nested structures (user.info.name) into dotted paths for mapping.
3. Explicit Rules  PyDatalog        Near-instant renaming for high-traffic, known field pairs.
4. Ontology        owlready2        High-depth resolution of concept IRIs in protocols.owl (namespaces: A2A, MCP, ACP).
5. ML Inference    TF-IDF + LogReg  Probabilistic fallback that predicts field names from training data in the ProtocolMapping table.
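Tier 2's recursive walk is the simplest piece to show concretely. The helper below is an illustrative sketch of the dotted-path flattening, not the SemanticMapper's actual code:

```python
def flatten(obj, prefix=""):
    """Tier 2: recursively convert nested dicts into dotted-path keys,
    e.g. {"user": {"info": {"name": ...}}} -> {"user.info.name": ...}."""
    flat = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, path))
        else:
            flat[path] = value
    return flat

payload = {"user": {"info": {"name": "Ada"}, "id": 7}}
print(flatten(payload))  # {'user.info.name': 'Ada', 'user.id': 7}
```

Once every field is addressable as a flat dotted path, Tiers 3-5 only need to solve string-to-string renaming.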

The Self-Healing Loop

When a Tier 5 prediction meets the 85% confidence threshold, the system auto-applies the mapping to the database and marks the corresponding MappingFailureLog entry as applied. After 5 such corrections, the ML model automatically triggers a background retraining loop to incorporate the new knowledge.
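The decision logic of that loop can be sketched as below. The function name and return shape are assumptions for illustration, and the actual database writes and retraining job are omitted:

```python
CONFIDENCE_THRESHOLD = 0.85  # auto-apply cutoff from the docs
RETRAIN_EVERY = 5            # corrections before a background retrain

def maybe_self_heal(mapping, confidence, corrections_so_far):
    """Decide whether to auto-apply a Tier-5 mapping and whether the
    retraining loop is now due. Persistence is elided in this sketch."""
    if confidence < CONFIDENCE_THRESHOLD:
        return {"applied": False, "retrain": False}
    applied = corrections_so_far + 1
    return {
        "applied": True,  # mapping persisted, MappingFailureLog marked applied
        "mapping": mapping,
        "retrain": applied % RETRAIN_EVERY == 0,
    }

# 5th high-confidence correction -> applied, and retraining is triggered
print(maybe_self_heal(("usr_name", "user.name"), 0.91, 4))
```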

4. Cryptographic Proof of Execution

Every data transformation in Engram is verifiable. This prevents "Man-in-the-Middle" payload manipulation within the middleware.

  • Hop Proof (v1:sha256): A hash of (Input + Output + Source + Target + Timestamp).
  • Aggregate Proof (v1:agg): A hash of the concatenated hop proofs.

Agents can verify the execution_proof field in the API response to ensure that the data they received is exactly what was generated by the translation chain.
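The two proof tiers can be sketched as follows. The exact serialization of the hashed material is an assumption here; Engram's canonical encoding may differ, but the shape of the scheme is as described above:

```python
import hashlib
import json

def hop_proof(input_msg, output_msg, source, target, timestamp):
    """v1:sha256 hop proof over (Input + Output + Source + Target + Timestamp)."""
    material = json.dumps([input_msg, output_msg, source, target, timestamp],
                          sort_keys=True, separators=(",", ":"))
    return "v1:sha256:" + hashlib.sha256(material.encode()).hexdigest()

def aggregate_proof(hop_proofs):
    """v1:agg aggregate proof: hash of the concatenated hop proofs."""
    return "v1:agg:" + hashlib.sha256("".join(hop_proofs).encode()).hexdigest()

hops = [
    hop_proof({"q": "price?"}, {"method": "quote"}, "NL", "MCP", "2026-03-01T00:00:00Z"),
    hop_proof({"method": "quote"}, {"tick": 101.5}, "MCP", "MiroFish", "2026-03-01T00:00:01Z"),
]
print(aggregate_proof(hops))  # deterministic: same chain, same proof
```

Because each hop proof commits to both its input and output, tampering with any intermediate payload changes that hop's hash and therefore the aggregate, which is what makes the chain verifiable end to end.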

5. Security & Identity: The EAT System

The Engram Access Token (EAT) is the project’s security backbone.

  • Stateless but Revocable: JTI-based revocation in Redis allows for immediate global sign-out of long-lived tokens.
  • Scoped Permissions:
    • translate:a2a: Access to the protocol translator.
    • discovery:read: Access to search for collaborators.
    • tools:*: Recursive access to all tool connectors.
  • Credential Vaulting: AI provider keys (OpenAI, Anthropic, etc.) are stored as encrypted metadata in the ProviderCredential table, accessible only to the owner’s sub-tasks.
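A scope check with wildcard expansion might look like the sketch below. The function name and matching rule are assumptions for illustration; the real enforcement happens in the edge gateway:

```python
def scope_allows(granted, required):
    """Check whether a set of EAT scopes covers a required permission.
    A trailing '*' (e.g. tools:*) matches everything under its prefix."""
    for scope in granted:
        if scope == required:
            return True
        if scope.endswith(":*") and required.startswith(scope[:-1]):
            return True
    return False

granted = {"translate:a2a", "discovery:read", "tools:*"}
print(scope_allows(granted, "tools:binance.order"))  # True, via the wildcard
print(scope_allows(granted, "admin:revoke"))         # False
```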

6. Functional Modules & Special Bridges

MiroFish Swarm Bridge

Designed for predictive financial modeling, this bridge pipes data directly into a MiroFish simulation.

  • Context Injection: Automatically enriches seeds with real-time prices (Binance/Kalshi), sentiment scores (X), and headlines.
  • God's-Eye Mode: Allows for mid-simulation "shocks" (event injection) to test swarm stability under volatility.

Trading Developer Templates

Unified schemas for the decentralized economy:

  • Trade Order: Symbols, actions (buy/sell/limit/market), and quantities.
  • Payment Intent: Amounts and currencies across Stripe and PayPal.
  • Data Feed: Structured queries for FRED economic data or X-sentiment.
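As an illustration, the Trade Order template could be modeled as the dataclass below. The field names are assumptions drawn from the bullet above, not the canonical Engram schema:

```python
from dataclasses import asdict, dataclass
from typing import Literal

@dataclass
class TradeOrder:
    """Illustrative shape of the Trade Order template: a symbol,
    one of the documented actions, and a quantity."""
    symbol: str
    action: Literal["buy", "sell", "limit", "market"]
    quantity: float

order = TradeOrder(symbol="BTC-USD", action="buy", quantity=0.25)
print(asdict(order))  # {'symbol': 'BTC-USD', 'action': 'buy', 'quantity': 0.25}
```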

7. Operational Command & Control

Terminal Dashboard (TUI)

Run python app/cli.py debug for a live NOC-style view:

  • Side-by-Side Trace: Inspect Input Protocol vs Output Protocol payloads in real-time.
  • Task Telemetry: View lease timestamps and worker attempts for the background queue.
  • Event Stream: Real-time structured logs from the DiscoveryService and Orchestrator.

Performance Benchmarks (JMeter Verified)

Metric        Result               Environment
Throughput    ≥ 150 requests/sec   Local Docker Stack
p50 Latency   ≤ 120 ms             Local Docker Stack
p99 Latency   ≤ 600 ms             Local Docker Stack

8. Implementation & Deployment Tips

Deployment Matrix

  • Docker (Standard): Use docker compose up --build for the full stack (Redis, Postgres, Prometheus).
  • Staging: Use docker compose -f docker-compose.staging.yml up for a WireMock-enriched environment that mocks external agent APIs.

# Start the full stack in detached mode
docker compose up --build -d

Essential Env Vars

  • DATABASE_URL: Your primary Postgres sink.
  • REDIS_ENABLED: Critical for caching OWL ontology hits.
  • AUTH_JWT_SECRET: The master key for signing EATs.
  • SENTRY_DSN: Required for production error tracking.

9. Troubleshooting & Common Errors

HTTP Code   Error Category   Solution
401/403     EAT Failure      Check token exp and iss claims. Verify tool scopes.
422         Mapping Gap      The MappingFailureLog contains the raw fields. Check the mapping_suggestions in the response.
503         Worker Lag       Monitor /metrics. If task_latency_seconds spikes, scale the app service containers.

Documentation Hub Info

Documentation last updated: March 2026. For further assistance, check the main repository.