Nora gives your team a single place to deploy isolated AI agent runtimes, manage LLM provider keys, monitor performance, and connect integrations — all from a clean web dashboard. It is self-hostable under Apache 2.0, commercially usable without restriction, and designed to scale from a single-host evaluation up to enterprise-grade Proxmox, Kubernetes, or cloud deployments.

Documentation Index
Fetch the complete documentation index at: https://noradocs.solomontsao.com/llms.txt
Use this file to discover all available pages before exploring further.
Quick Start
Install Nora, create your first account, and deploy an agent in under 15 minutes.
Self-Hosting
Run Nora on your own infrastructure with Docker Compose, Proxmox, or Kubernetes.
API Reference
Explore the full REST API for agents, providers, integrations, and monitoring.
Guides
Step-by-step walkthroughs for deploying agents, wiring up integrations, and more.
What you can do with Nora
Deploy Agents
Launch agent runtimes into isolated Docker or NemoClaw sandboxed containers with configurable CPU, RAM, and disk.
Manage LLM Providers
Store API keys for Anthropic, OpenAI, Google, Groq, NVIDIA, and 10+ more providers, with keys encrypted at rest.
Connect Integrations
Wire GitHub, Slack, Jira, AWS, and 60+ other tools directly to your running agents.
Monitor & Observe
Track agent health, CPU/memory metrics, LLM cost, and activity events in real time.
Get started in three steps
Install Nora
Run the one-line installer or use Docker Compose to stand up the full stack on your infrastructure.
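For the Docker Compose route, a minimal stack can be sketched roughly as below. The image name, port mapping, and volume path here are placeholders for illustration, not Nora's published configuration; the Self-Hosting docs have the real file.

```yaml
services:
  nora:
    image: nora/nora:latest      # placeholder image name
    ports:
      - "8080:8080"              # placeholder dashboard port
    volumes:
      - nora-data:/data          # persist agent and provider state
    restart: unless-stopped

volumes:
  nora-data:
```

With a file like this saved as `compose.yaml`, `docker compose up -d` brings the stack up in the background.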
Add an LLM provider
Open Settings in the dashboard and save an API key for Anthropic, OpenAI, or any supported provider. Keys are encrypted at rest with AES-256-GCM.
Deploy your first agent
From the dashboard, launch an agent runtime into an isolated sandbox with your chosen CPU, RAM, and disk limits.
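AES-256-GCM is an authenticated cipher: decryption fails loudly if the ciphertext or its tag has been tampered with. The sketch below shows the general technique using Python's `cryptography` package; the function names and the practice of prepending the nonce are illustrative conventions, not Nora's actual implementation.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_key(master_key: bytes, api_key: str) -> bytes:
    """Encrypt a provider API key with AES-256-GCM, prepending a random 96-bit nonce."""
    nonce = os.urandom(12)                                    # fresh nonce per encryption
    ciphertext = AESGCM(master_key).encrypt(nonce, api_key.encode(), None)
    return nonce + ciphertext                                 # store nonce alongside ciphertext

def decrypt_key(master_key: bytes, blob: bytes) -> str:
    """Split off the nonce and decrypt; raises if the blob was tampered with."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(master_key).decrypt(nonce, ciphertext, None).decode()

master = AESGCM.generate_key(bit_length=256)                  # 256-bit master key
blob = encrypt_key(master, "sk-example-not-a-real-key")
assert decrypt_key(master, blob) == "sk-example-not-a-real-key"
```

The master key itself would live outside the database (for example in an environment variable or a secrets manager), so a database dump alone never exposes provider keys.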
Nora is licensed under Apache 2.0 — you can self-host, modify, and use it commercially without restriction.