OyaAI Documentation

The runtime platform for AI Employees

OyaAI is built to power truly native AI agencies. We run AI Employees more like software systems than chatbots so they can do real operational work at production scale.

What is OyaAI?

OyaAI is a runtime platform for AI Employees. You describe the role: an SDR that emails 300 leads a day, an executive assistant that triages your inbox, an SEO manager that audits sites every Monday. The platform assembles the agent: a soul (persona plus behavior rules), the skills it needs (web search, email, Slack, your CRM), the routines that run on a schedule, and the knowledge base it consults. Each AI Employee gets a chat surface, an OpenAI-compatible API key, webhooks, and integrations into wherever your work happens. They run continuously in isolated sandboxes.
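Since every AI Employee exposes an OpenAI-compatible API key, calling one from code looks like calling any chat-completions endpoint. Here is a minimal sketch using only the Python standard library; the base URL and model id are placeholders, not real values — take the actual ones from your deployment:

```python
# Sketch (not an official client): calling an AI Employee through its
# OpenAI-compatible endpoint. Base URL and model id are placeholders.
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, message: str) -> urllib.request.Request:
    """Assemble a standard chat-completions request for the agent."""
    payload = {
        "model": "oya-employee",  # placeholder model id
        "messages": [{"role": "user", "content": message}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# To send: pass the request to urllib.request.urlopen() and read
# choices[0].message.content from the JSON response, exactly as with
# any other OpenAI-compatible endpoint.
```

Because the surface is OpenAI-compatible, existing SDKs and tooling that speak that protocol should work by pointing them at the employee's base URL.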

The problem we are solving

Most AI agencies today rely on prompt-based wrappers around LLMs. They break easily because token-only systems are inherently non-deterministic. The same input produces different outputs, retries cascade, and small drift compounds across long workflows. Building AI Employees that can reliably perform real operational work requires far deeper infrastructure than a clever prompt.

The tooling alternatives available today are limited:

  • Hand-coded Python on a VPS that only one person on the team can edit.
  • A no-code automation tool that breaks the moment a workflow needs anything custom.
  • A chat agent that cannot do anything outside the chat window.

OyaAI replaces that gap with infrastructure: a proprietary orchestration layer, a sandboxed skills runtime, and an execution engine where only ~20% of work runs on tokens and the other ~80% is structured, deterministic compute. That ratio is the difference between a demo and an AI Employee that handles real load.

Why OyaAI

80% structured compute, 20% tokens

LLMs handle reasoning. Everything else (data fetching, schema validation, scheduling, integrations, file I/O, retries) runs on deterministic infrastructure. The result: agents that behave predictably across thousands of runs, not demos that break on the second prompt.
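The split is easier to see in code. Below is an illustrative sketch (not the platform's actual internals): the only non-deterministic step is the model call, and everything around it — parsing, schema checking, bounded retries — is ordinary deterministic code. `call_llm` is a stand-in for whatever model client is in use:

```python
# Illustrative 80/20 sketch: deterministic validation and retry logic
# wrapped around a single token step. Field names are hypothetical.
import json

REQUIRED_KEYS = {"lead_name", "email", "priority"}


def validate(raw: str) -> dict:
    """Deterministic gate: parse the model's text and check its shape."""
    data = json.loads(raw)
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return data


def run_step(call_llm, prompt: str, max_retries: int = 3) -> dict:
    """call_llm is the only non-deterministic piece; the loop is plain code."""
    for _ in range(max_retries):
        try:
            return validate(call_llm(prompt))
        except ValueError:  # json.JSONDecodeError is a ValueError subclass
            continue
    raise RuntimeError(f"no schema-valid output after {max_retries} attempts")
```

The payoff of this structure is that a malformed model output never leaks downstream: it either passes the deterministic gate or the step fails loudly after a bounded number of retries.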

Skills are the deterministic primitives

Every integration is a versioned skill in a catalog: Python code in an isolated sandbox, with a typed input/output schema. Drop them into an agent like LEGO bricks. When something is missing, write a 50-line skill, import it, and reuse it across every future agent.
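To make "typed input/output schema" concrete, here is what a tiny skill might look like. This is illustrative only — the platform's real skill interface may differ; dataclasses stand in here for the schema:

```python
# Hypothetical skill sketch: a typed, deterministic function.
# The actual OyaAI skill interface may look different.
from dataclasses import dataclass
from urllib.parse import urlparse


@dataclass
class SkillInput:
    url: str


@dataclass
class SkillOutput:
    domain: str
    is_https: bool


def run(inp: SkillInput) -> SkillOutput:
    """Deterministic skill body: same input, same output, every run."""
    parsed = urlparse(inp.url)
    return SkillOutput(domain=parsed.netloc, is_https=parsed.scheme == "https")
```

Because the body is plain code with a declared shape on both ends, the orchestrator can validate inputs before the sandbox runs and validate outputs before the next step consumes them.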

Orchestration handles the messy parts

Routines, triggers, retries, credential rotation, multi-channel identity, run history with full LLM traces, and the sandbox runtime are all part of the platform, not glue code you maintain.

One identity, many channels

The same AI Employee chats in your web app, replies on Slack, takes Telegram messages, fires on webhooks, and runs scheduled jobs. One memory, one personality, one API key.

Agency-native

Templates compress build time from weeks to minutes. Sub-accounts let one operator run dozens of customer deployments from a single login. Consolidated billing rolls every customer's usage up to the agency account.

No black box

Read every persona, behavior rule, skill, and routine an agent uses. Run history shows the LLM trace, tool calls, and sandbox stdout. Export any agent as a YAML spec and version it in git.
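An exported spec might look something like the following. This is a hypothetical shape — every field name below is illustrative, not the actual export schema:

```yaml
# Hypothetical shape of an exported AI Employee spec — illustrative only.
name: sdr-outbound
soul:
  persona: "Direct, friendly outbound SDR"
  rules:
    - never email the same lead twice in one week
skills:
  - web_search@1.2.0
  - send_email@2.0.1
routines:
  - name: morning-outreach
    schedule: "0 9 * * 1-5"   # weekday mornings, cron syntax
knowledge:
  - docs/icp.md
```

A spec like this diffs cleanly, so routine changes to an agent can go through the same review process as any other code.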

Built for AI agencies

For agencies and resellers serving 50+ customers

OyaAI partners with agencies that package and deploy production-grade AI Employees for their clients. Sub-accounts let one operator run dozens of customer deployments from a single login, templates keep build times to minutes, and consolidated billing rolls every customer's usage up to the agency account.

Operator playbook lives in the Partners docs: multi-customer accounts, template authoring, customer impersonation, consolidated billing.

Getting started

Build your first AI Employee

Walk through the Agent Builder: describe the role, pick skills, connect platforms, and deploy. A few guided steps end to end.

Or pick a pre-built AI Employee from the gallery in the app and customize what you need.

Pick your path

Three audiences, three focused manuals.