Simple, Transparent Pricing

Open source core with enterprise features when you need them. Start free, scale when ready.

Open Source

Free

Full detection pipeline for individual developers and small projects.

Install from PyPI
  • 5-stage detection pipeline
  • ML classifier (F1: 0.98)
  • 12 LLM provider backends
  • 9 SDK integrations
  • Deception modes (honeypot, tarpit, redirect)
  • CEF/SIEM logging
  • Apache 2.0 license
  • Community support via GitHub
Most Popular

Pro

$499 /mo

$4,990/yr (save 17%)

For teams securing production LLM applications. Priority support and enterprise features.

Start 30-Day Free Trial
  • Everything in Open Source
  • Multi-tenant isolation
  • Role-based access control (RBAC)
  • SQLite persistent storage
  • ML drift monitoring and alerts
  • Webhook alerts (Slack, Teams, PagerDuty)
  • Threat intelligence (STIX 2.1 export)
  • Output scanning (secrets, PII, URLs)
  • MCP Guard (tool call validation)
  • Agent policy enforcement
  • Usage metering dashboard
  • Priority email support
  • Quarterly security briefing

Enterprise

Custom

For organizations with compliance requirements, air-gapped deployments, or high-volume workloads.

Contact Sales
  • Everything in Pro
  • Air-gapped / SCIF deployment
  • Custom LLM judge tuning
  • Dedicated red team assessment (57 scenarios)
  • MITRE ATLAS threat mapping
  • NIST AI RMF alignment report
  • OWASP LLM Top 10 coverage report
  • CMMC 2.0 compliance documentation
  • Dedicated support engineer
  • Custom SLA (99.9%+ uptime)
  • On-site deployment assistance
  • Federal procurement support (SDVOSB)

The ROI Case

  • 847% annual ROI at 100K msgs/day
  • 1.9 months average payback period
  • 85-95% of LLM inference calls eliminated

Based on quantitative analysis across three deployment scenarios. Shield's tiered pipeline eliminates 85-95% of expensive LLM judge calls, paying for itself through infrastructure savings alone. Request the full ROI report.
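The savings mechanism can be sketched with a back-of-the-envelope model. The per-call judge cost below is a purely illustrative assumption (not a figure from the ROI report); the volume and elimination rate come from the numbers above:

```python
# Hypothetical cost model: the tiered pipeline screens most messages cheaply
# and only escalates the remainder to an expensive LLM judge call.
MSGS_PER_DAY = 100_000        # volume from the ROI scenario above
JUDGE_COST_PER_CALL = 0.002   # assumed $/judge call (illustrative only)
ELIMINATION_RATE = 0.90       # midpoint of the 85-95% range

baseline_daily = MSGS_PER_DAY * JUDGE_COST_PER_CALL      # judge every message
shielded_daily = baseline_daily * (1 - ELIMINATION_RATE) # judge the remainder
annual_savings = (baseline_daily - shielded_daily) * 365

print(f"Baseline: ${baseline_daily:.0f}/day, with Shield: ${shielded_daily:.0f}/day")
print(f"Annual inference savings: ${annual_savings:,.0f}")
```

Under these assumptions the inference savings alone are tens of thousands of dollars per year at this volume; actual figures depend on your model pricing and traffic mix.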

Frequently Asked Questions

What counts as a "deployment"?

One deployment is a single Shield instance protecting one application or API endpoint. If you run multiple applications, each needs its own license.

Can I try Pro features before buying?

Yes. Start a free 30-day pilot with full Pro features enabled. No credit card required. Install from PyPI and contact us to activate.

Do you support air-gapped environments?

Yes. Oubliette Shield runs fully offline with Ollama or llama.cpp backends. No internet connectivity required. Available on Enterprise plans.

How does federal procurement work?

Oubliette Security is a veteran-owned small business registered on SAM.gov with SDVOSB certification pending. We support GSA Schedule, micro-purchase, and sole-source procurement under FAR 19.1405. Contact us for federal pricing.

What LLM providers are supported?

All 12 backends are available on every tier: OpenAI, Anthropic, Azure, Bedrock, Vertex AI, Gemini, Ollama, llama.cpp, Transformers, LiteLLM, and more.

Is there a discount for annual billing?

Yes. Annual Pro billing is $4,990/year, saving 17% compared to monthly. Enterprise plans are quoted annually.
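The discount arithmetic works out as follows, using the listed prices:

```python
# Annual vs. monthly Pro billing, from the prices shown on this page.
MONTHLY = 499
ANNUAL = 4_990

monthly_total = MONTHLY * 12        # $5,988 if paid month-to-month
savings = monthly_total - ANNUAL    # $998/year
discount = savings / monthly_total  # advertised as "save 17%"

print(f"Save ${savings} ({discount:.0%})")
```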

Not Sure Which Plan?

Start with the open source package. Upgrade when you need enterprise features, support, or compliance documentation.