AI infrastructure for organizations that can't afford to leak data

Curie is a self-hosted control plane for the LLMs your organization already pays for. Run a secure team chat surface, expose a managed AI gateway to your in-house apps, and process documents on your own infrastructure — all behind one set of credentials, one set of policies, and one audit trail.

Fully Offline Deployment · US-Owned & Operated · No Vendor Egress
One control plane, three jobs
  • Chat — secure, collaborative AI for your team
  • Gateway — managed AI access for your in-house apps
  • Documents — PDFs and scans to structured markdown
  • Always — on your hardware, with your policies

What Curie does

A single platform that replaces three separate procurement decisions

Secure Team Chat

A ChatGPT-style surface your staff actually uses, but on your infrastructure with your credentials and your guardrails.

  • Shared workspaces with discussion threads
  • Agent profiles with layered prompt stacks
  • Per-org capability gates and usage groups
  • Per-user and per-team cost limits
  • OpenAI, Anthropic, Azure, or your own endpoint

AI Gateway for Your Apps (New)

Point your internal apps, agents, and back-office tools at Curie instead of the LLM vendor. Centralize keys, limits, audit, and tool forwarding in one place.

  • HTTP + SSE streaming API
  • User-issued API keys, scoped to an org
  • Tool forwarding to OpenAI / Anthropic / Azure
  • Same rate, cost, and group limits as chat
  • Read-only audit of every programmatic conversation
See the API
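For a concrete sense of what that looks like, the sketch below assembles a streaming chat request against a hypothetical Curie deployment. The base URL, endpoint path, and payload shape are assumptions made for illustration; the API reference linked above is the authoritative contract.

```python
import json

# Hypothetical example of calling a Curie gateway instead of the LLM vendor.
# CURIE_BASE, the /v1/chat/completions path, and the payload shape are
# assumptions for this sketch -- consult your deployment's API reference.
CURIE_BASE = "https://curie.internal.example"  # your self-hosted instance
API_KEY = "curie_key_..."                      # a user-issued, org-scoped key

def build_chat_request(messages, model="gpt-4o", stream=True):
    """Assemble the URL, headers, and JSON body for a streaming chat call."""
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
        # SSE responses arrive as text/event-stream when streaming
        "Accept": "text/event-stream" if stream else "application/json",
    }
    body = json.dumps({"model": model, "messages": messages, "stream": stream})
    return f"{CURIE_BASE}/v1/chat/completions", headers, body

url, headers, body = build_chat_request(
    [{"role": "user", "content": "Summarize last week's incident reports."}]
)
```

Because only the base URL and key change, an internal app keeps its existing request shape while Curie enforces the org's limits and audit trail in the middle.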

Document Pipeline

Convert long PDFs, scanned drawings, and mixed-format documents into structured markdown — without shipping a single page to a third-party OCR vendor.

  • Chunked PDF → markdown with vision reconciliation
  • Local Tesseract OCR pre-pass for scans
  • Auto-rotation, focus zones, prefer-literal mode
  • Self-paced text-query batches over large corpora
  • No Google Vision, no AWS Textract, no Azure Document Intelligence
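To make the chunked conversion concrete, here is a minimal sketch of the page-batching step, assuming an overlapping fixed-size window over the PDF. The batch size and overlap values are illustrative assumptions, not Curie's actual parameters.

```python
# Hypothetical sketch of the chunking step: a long PDF is processed in page
# batches so each batch fits a model's context window. A one-page overlap
# means a table split across a boundary appears in both batches.
def chunk_pages(page_count, batch_size=10, overlap=1):
    """Yield (start, end) page ranges, 1-indexed and inclusive.

    overlap must be smaller than batch_size, or the walk never advances.
    """
    start = 1
    while start <= page_count:
        end = min(start + batch_size - 1, page_count)
        yield (start, end)
        if end == page_count:
            break
        start = end - overlap + 1
```

Each range would then go through the OCR pre-pass and vision reconciliation before the per-batch markdown is stitched back together.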

The spine of every feature: data stays put

Curie targets restricted-cloud and air-gapped deployments. The configured LLM is the only egress path for user content — and that path is one your operator chose.

US-Based, Owned & Operated

All development, operations, and support are conducted in the United States, with complete transparency.

Fully Offline Deployment

Run Curie on your hardware, in your VPC, or fully air-gapped. No internet connection required for the platform itself.

No Side-Channel Egress

No third-party OCR, no error trackers that capture request bodies, no telemetry that ships content. User data hits one vendor: the one you configured.

Content-Free Logs

Log streams (journald, Datadog, etc.) carry record IDs and metrics — never message bodies, filenames, or document content.
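As an illustration of what a content-free log entry can carry, the sketch below builds a record out of opaque IDs and metrics only. The field names are assumptions for this example, not Curie's actual log schema.

```python
import json

# Hypothetical content-free log record: identifiers and metrics only,
# never message bodies, filenames, or document content.
def audit_log_record(conversation_id, model, prompt_tokens,
                     completion_tokens, latency_ms):
    """Build a structured log entry safe to ship to journald or Datadog."""
    return json.dumps({
        "event": "chat.completion",
        "conversation_id": conversation_id,   # opaque record ID, no content
        "model": model,
        "prompt_tokens": prompt_tokens,
        "completion_tokens": completion_tokens,
        "latency_ms": latency_ms,
    })
```

An operator can correlate a log line with a stored conversation by ID without the log stream itself ever containing user content.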

Deployment Options

On-Premises
Your hardware, complete isolation. Container images delivered via private registry.
Private Cloud
Run inside your VPC alongside your existing services and identity provider.
Restricted Cloud (e.g. Azure GCC-High)
Targeted at FedRAMP / GCC-High tenants where data residency is non-negotiable.
Air-Gapped
Fully offline, paired with an in-tenant or on-prem model endpoint.

Who reaches for which surface

The same policies, limits, and audit trail cover all three

Your Team

Open the chat surface in a browser. Share workspaces, comment on responses, attach context items, run text-query batches against a folder of documents.

Your Developers

Mint an API key, point an internal app or agent at Curie, and ship without re-implementing rate limits, audit logs, or tool plumbing for every project.

Your Admins

Set per-org capability gates, override them per usage-group, watch token budgets in real time, and audit every programmatic conversation from one console.

Ready to deploy AI on your terms?

Join organizations that need security, control, and a single integration point for every AI workload — without compromise.

Currently in private beta; US-based organizations with strict security requirements are prioritized.