Your AI Agents. Your Infrastructure. Your Rules.
A powerful self-hosted AI agent platform powered by Claude Code CLI — free for internal business use. Connect any LLM, run agents in isolated Docker containers, and process real files on your own servers. Then unlock the full Enterprise suite: Command Center, KPI Dashboards, IDE & Chat integration, Knowledge Graph, Database Semantic Layer, Code Migration & Review Agents — all built on the ezinsights.ai Data Intelligence & SDLC Framework.
# Clone & enter the repo
$ git clone https://github.com/EzinsightsInc/EzCoworker
$ cd ezcoworker-community

# Build the agent container image (~5 min)
$ docker build -f agent.Dockerfile -t claude-agent-image-community .

# Download the skills from the skills folder and extract them into the
# relevant skill directory, depending on your OS:
#   Windows: HOST_SKILLS_PATH=C:/claude_data_community/skills
#   macOS:   HOST_SKILLS_PATH=/opt/claude_data_community/skills
# Do the same for plugins, and again for the MCP servers setup.

# Configure API keys, then launch
$ cp .env.example .env && vi .env
$ docker-compose up -d --build

✓ frontend  http://localhost:3600
✓ backend   http://localhost:5600
✓ litellm   http://localhost:4000 (OpenAI/Gemini/Groq router)
✓ postgres  ready
✓ 22 models discovered (anthropic · openai · gemini · groq · ollama)
Watch EzCoworker handle real-world tasks — from data analysis to document generation to live code execution.
Install EzCoworker, connect Ollama, and run your first agent end-to-end.
⏱ ~5 min · Upload a CSV, ask for analysis, watch the agent generate charts and a formatted PDF output.
⏱ ~2 min · Add a custom MCP server as an agent tool and watch it get called live during a session.
⏱ ~2 min · Create a SKILL.md, assign it to a user, and see how it shapes agent behavior in real time.
⏱ ~2 min · Manage users, configure models, assign skills, and monitor storage from the admin UI.
⏱ ~6 min · Configure Ollama local models side by side with Anthropic and OpenAI cloud models.
Community Edition gives you a complete, production-ready self-hosted AI agent platform — free for internal business use. The Enterprise tier layers on a full Data Intelligence & SDLC automation suite that redefines what an AI platform can do for your organisation.
Ollama for free local inference, or API keys for Anthropic Claude, OpenAI GPT, Google Gemini, Groq, and OpenRouter. Models auto-discovered and synced from your environment on every startup. Switch models per conversation. 22 models supported out of the box.
Every agent run spawns a dedicated Docker container with a per-user workspace volume. The container has full filesystem access, runs real code, and is destroyed after idle timeout. Absolute zero cross-user contamination — no shared memory, no shared state.
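The isolation model can be pictured as a plain Docker invocation, one per user session. This is a conceptual sketch only: the container name, volume name, and `/workspace` mount path are assumptions for illustration, not EzCoworker's actual internals.

```shell
# Conceptual sketch, not EzCoworker internals: one container per session,
# with a user-scoped named volume so workspaces never overlap.
USER_ID=alice
docker run --rm -d \
  --name "agent-${USER_ID}" \
  -v "claude_ws_${USER_ID}:/workspace" \
  claude-agent-image-community
# On idle timeout the orchestrator stops the container; --rm destroys it,
# while the named volume keeps the user's files for the next session.
```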
Connect any Model Context Protocol server as live agent tooling. SSE and stdio transports both supported. Per-user MCP configurations. Bundle tools with instruction prompts as reusable plugins — the agent gets the right tools injected automatically per conversation.
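For illustration, a per-user MCP configuration in the common MCP client format might look like the following. The exact schema EzCoworker reads is an assumption here, and the server names and URL are placeholders.

```json
{
  "mcpServers": {
    "local-files": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
    },
    "internal-tools": {
      "type": "sse",
      "url": "https://mcp.internal.example.com/sse"
    }
  }
}
```

The first entry uses the stdio transport (a command the platform spawns); the second connects to a remote server over SSE.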
Upload files before or during a conversation. The agent reads from the input folder, processes data, and writes all outputs to a separate output folder. Download results instantly. Supports CSV, PDF, Excel, DOCX, PPTX, ZIP, images, code files and any file type the agent runtime can handle.
Markdown instruction files injected into the agent's system context at runtime. Each skill teaches the agent a specific domain or workflow — data analysis, PDF generation, web scraping, security audits, code review. Users enable the skills they need; unused skills are never injected.
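A skill can be as small as a single Markdown file. Here is a minimal sketch; the frontmatter field names are assumptions, so check the bundled skills for the exact format.

```markdown
---
name: csv-analysis
description: Analyse uploaded CSV files and produce charts plus a PDF summary.
---

When the user uploads a CSV:
1. Load it and report row/column counts and column types.
2. Compute summary statistics and generate two or three relevant charts.
3. Write every output (chart PNGs, summary PDF) to the output folder, never the input folder.
```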
Configurable 10-turn conversation history per user, persisted in PostgreSQL. The agent maintains context across multiple messages in the same session. Context survives page refreshes and re-connections. Each user's history is fully isolated from other users.
User registration and JWT authentication out of the box. Each user gets their own isolated workspace, skill assignments, and conversation history. An admin monitoring dashboard shows usage metrics, token consumption, storage, response times, and failure rates across all users and models.
Claude Code CLI + Python 3 + pandas + openpyxl + LibreOffice + WeasyPrint + reportlab + pdfplumber + python-pptx + Node.js baked into every agent container. Agents don't just write code — they execute it, process real files, and deliver actual outputs like PDFs, charts, and spreadsheets.
docker-compose up starts the frontend (nginx), backend (Node.js), PostgreSQL, and LiteLLM proxy together. Works on any Linux server, Mac, or Windows host with Docker Desktop. No Kubernetes, no cloud dependencies. Full production setup in under 10 minutes.
Claude Code CLI speaks Anthropic format natively. For OpenAI, Gemini, Groq, and Azure, a built-in LiteLLM gateway handles format translation. A custom stripping proxy removes any Anthropic-specific parameters before they reach providers that would reject them — no errors, no manual configuration per provider.
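EzCoworker generates this routing automatically at startup, but for intuition, a standalone LiteLLM proxy maps model names to providers with a config along these lines (standard LiteLLM `model_list` format; the model entries are illustrative):

```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gemini-2.5-pro
    litellm_params:
      model: gemini/gemini-2.5-pro
      api_key: os.environ/GEMINI_API_KEY
```

Claude Code CLI then speaks Anthropic format to the proxy, and the proxy handles per-provider translation.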
Vision capability auto-detected per model — all Claude 4.x, GPT-4o family, and Gemini models support image input. Reasoning models (o3, o4-mini, gemini-2.5-pro) identified and classified for complex tasks. The auto-router uses these flags to select the right model for the task at hand.
Auto mode analyses each message and routes to the most cost-effective model tier. Simple scripts go to capable local Ollama models. Data analysis and coding go to mid-tier cloud. Complex reasoning and agentic tasks go to flagship models. Skill context influences routing but never overrides intent for simple tasks.
Per-user and per-model token consumption, response times, storage breakdown, failed responses, and conversation counts — all tracked in PostgreSQL. Monitoring views with sortable, searchable tables. Activity trends from the actual DB creation date. Failed responses surfaced in red with error type classification.
Generate a public share link for any conversation. Recipients see the full message history and can download agent-generated output files without creating an account. Input/uploaded files are never exposed via share links — only agent outputs are publicly accessible.
Spawn multiple specialised agents that work in parallel on different aspects of the same task. A planner agent decomposes complex goals into sub-tasks, delegates to worker agents, and synthesises results. Dramatically reduces time-to-completion for large, multi-step projects.
An LLM-powered execution planner analyses each request, builds a structured task graph, and determines optimal agent composition, model selection, and execution order before a single agent fires. Results in significantly higher task success rates on complex, ambiguous requests.
ezinsights.ai Data Intelligence: A purpose-built executive intelligence hub that gives C-suite leaders a single conversational interface to every corner of the business. Ask natural-language questions across finance, ops, HR, and sales simultaneously. The Command Center dispatches specialised data agents, synthesises results in real-time, and delivers board-ready narrative reports with charts, variance analysis, and recommended actions — all without touching a BI tool.

ezinsights.ai Data Intelligence: Live KPI dashboards powered by AI agents, not static SQL. Define your business metrics once in plain language; the dashboard agent continuously queries connected data sources, detects anomalies, surfaces trend breakdowns, and pushes proactive alerts when targets drift. Supports department-level drill-down, cohort comparisons, and predictive trend lines. Exportable as PDF, PPTX, or live-shareable links for leadership reviews.

Chat Platforms: Deploy EzCoworker agents natively inside Zoho Cliq, Slack, WhatsApp, Microsoft Teams, and Discord. Users chat with specialised AI agents in the tools they already live in — no context switching, no separate UI. Webhook-based integration delivers rich formatted results, file attachments, and download links back to the originating channel. Team-level agent pools, channel-scoped skill assignments, and per-channel audit logs included.

Developer Tooling: First-class IDE plug-in for VS Code and JetBrains IDEs that surfaces EzCoworker's full agent suite directly in the editor sidebar. Developers trigger code review, security scans, migration suggestions, and documentation generation without leaving their workspace. The IDE extension routes requests through the same enterprise agent infrastructure — all governed by your org's model policies, skill assignments, and audit trail. Not a copilot autocomplete — a full agentic pipeline accessible from the editor.

ezinsights.ai Knowledge Graph: A persistent, enterprise-wide Knowledge Graph that agents use to understand entity relationships, business context, and domain ontologies across your entire organisation. Ingest documentation, code repositories, data dictionaries, org charts, and internal wikis. The graph enables agents to answer questions like "which services depend on this API?" or "what teams own this data domain?" — grounding every response in your real business context rather than hallucinated assumptions. Backed by a graph database with automatic relationship inference and a built-in MCP server interface.

ezinsights.ai Data Ontology: A natural-language-to-SQL semantic layer that sits on top of any database — PostgreSQL, MySQL, Snowflake, BigQuery, Redshift, or Databricks. Business users describe what they want in plain English; the semantic layer agent translates intent into optimised, governance-compliant SQL, executes the query, and returns results as structured data, charts, or narrative summaries. Includes schema discovery, column-level data classification, query history, and a masking layer for PII fields — making every database safely queryable by any enterprise user without writing a single line of SQL.

ezinsights.ai Persona Agents: A suite of specialised data agents from the ezinsights.ai Data Intelligence framework: the Profiler Agent automatically scans and classifies datasets; the Lineage Agent traces data from source to report; the Quality Agent continuously monitors freshness, completeness, and consistency rules; the Enrichment Agent joins internal data with external market or reference datasets; and the Insight Agent proactively surfaces non-obvious patterns and correlations without being asked. Together they form an always-on intelligence layer over your entire data estate.

SDLC Automation: Multi-step agentic workflows for large-scale codebase migrations — Python 2 → 3, Java 8 → 21, Angular → React, monolith → microservices, on-prem SQL → cloud-native, and more. The Migration Agent analyses the full dependency graph, generates a phased migration plan, executes transformations file by file, runs the test suite after each change, resolves failures autonomously, and produces a detailed migration report. Far beyond find-and-replace — it understands idiom, library equivalence, and runtime semantics. Dramatically reduces migration timelines from months to days.

SDLC Automation: Autonomous design agents that translate product requirements, wireframes, and brand guidelines into production-ready UI components, design system tokens, and full page layouts. The Design Generation Agent ingests a plain-language brief or an existing Figma/Sketch artefact, applies your organisation's design system constraints, and outputs pixel-accurate HTML/CSS, React components, or Tailwind markup ready to hand directly to engineering. Covers accessibility compliance (WCAG 2.2), responsive breakpoints, dark/light theming, and component documentation. Dramatically compresses the design-to-code handoff from days to minutes — without losing design intent or brand consistency.

SDLC Automation: Full-stack code generation agents that go far beyond autocomplete. Describe a feature, a service, or an entire module in natural language — the Code Generation Agent analyses your existing codebase architecture, matches your team's patterns and naming conventions, generates implementation files, writes inline documentation, and opens a pull request. Supports REST and GraphQL API scaffolding, database schema generation with migration scripts, event-driven microservice boilerplate, and CLI tooling. The agent runs the generated code in an isolated Docker container, validates that it compiles and passes basic smoke tests, and only submits the PR once it has verified its own output. Integrates with Jira and Linear to auto-link generated code to the originating ticket.

SDLC Automation: End-to-end test generation and execution agents that analyse your codebase, identify coverage gaps, and autonomously author unit, integration, and end-to-end test suites. The Test Automation Agent reads existing source code and documentation to infer intent, generates tests in your framework of choice — Jest, Pytest, JUnit, Playwright, Cypress, or Selenium — and runs them in an isolated container to confirm they pass before committing. On every pull request it identifies net-new code paths with no test coverage and automatically proposes the missing tests inline. For regression suites, it detects flaky tests, diagnoses root causes, and proposes fixes. Coverage reports, failure summaries, and trend analytics are surfaced in the engineering dashboard — giving QA leads full visibility without manual test management overhead.

SDLC Automation: Automated, multi-dimensional code review that goes well beyond linting. The Code Review Agent covers security vulnerability scanning (OWASP Top 10, SSRF, injection), performance bottleneck identification, architecture pattern conformance, test coverage gap analysis, documentation completeness, and pull request summarisation for human reviewers. Integrates with GitHub, GitLab, and Bitbucket via webhooks — reviews are posted as inline PR comments the moment a pull request opens. Engineering leads configure review policies per repo; the agent enforces them consistently across every commit, every team, every time zone.

EzCoworker orchestrates Docker containers running Claude Code CLI, routing to your chosen LLM and managing isolated user workspaces automatically.
From zero to running AI agents on your own machine in under 10 minutes.
Clone EzCoworker Community Edition from GitHub. The repo includes the backend, frontend, agent Dockerfile, and all configuration templates.
git clone https://github.com/EzinsightsInc/EzCoworker
Build the Claude Code agent image with Python, LibreOffice, and all document tools baked in. Takes ~5 min first time.
docker build -f agent.Dockerfile -t claude-agent-image-community .
Copy the example env and add API keys. Set GEMINI_API_KEY for Google, OPENAI_API_KEY for GPT models. LiteLLM handles the translation automatically.
cp .env.example .env
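Representative values to set (the authoritative variable list is in .env.example; OPENAI_API_KEY, GEMINI_API_KEY, and ADMIN_SETUP_KEY are named in this guide, and which other keys you add depends on the providers you enable):

```shell
# Illustrative .env entries; see .env.example for the full variable list.
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...
ADMIN_SETUP_KEY=change-me-before-first-boot
```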
Start all services — frontend, backend, PostgreSQL, and LiteLLM proxy — with one command.
docker-compose up -d --build
POST to the setup endpoint with your ADMIN_SETUP_KEY from .env to create the first admin account. Then register regular users via the web interface at localhost:3600.
POST /api/admin/setup
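A hypothetical request shape, assuming the backend on port 5600: the endpoint path comes from this guide, but the JSON field names (setupKey, email, password) are assumptions, so check the repo's API docs for the exact contract.

```shell
# Hypothetical body fields; verify against the repo's API reference.
curl -X POST http://localhost:5600/api/admin/setup \
  -H "Content-Type: application/json" \
  -d "{\"setupKey\": \"$ADMIN_SETUP_KEY\", \"email\": \"admin@example.com\", \"password\": \"change-me\"}"
```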
Open the platform, register a user, pick from auto-discovered models, and run your first agent session. Upload a file, ask the agent to process it, and download the output.
http://localhost:3600
No lock-in. Add as many providers as you want. Models are auto-discovered from API keys and your Ollama installation at startup.
Two comparisons: first against the leading commercial AI coworker products, then against open-source alternatives. EzCoworker wins on every dimension that matters for enterprise deployment.
| Capability | Claude Cowork (Anthropic · SaaS · Paid · Jan 2026) | Microsoft Copilot Cowork (Microsoft · M365 SaaS · Mar 2026) | EzCoworker Community (Free · Self-Hosted · IBU) | EzCoworker Enterprise (Contact for pricing) |
|---|---|---|---|---|
| Self-hosted / on-premise | ✗ SaaS only | ✗ M365 cloud only | ✓ 100% | ✓ 100% |
| Your data stays on your servers | ✗ Anthropic cloud | ✗ Microsoft cloud | ✓ | ✓ |
| Any LLM provider (Ollama, OpenAI, Gemini, Groq…) | ✗ Claude only | ~ Claude + OpenAI only | ✓ 22 models, 6+ providers | ✓ |
| Free local LLM inference (Ollama) | ✗ | ✗ | ✓ | ✓ |
| Works outside Microsoft 365 / Office apps | ✓ Mac & Windows desktop | ✗ M365 apps only | ✓ Any workflow | ✓ |
| Multi-user platform with admin panel | ✗ Per-seat SaaS | ✗ Per-seat M365 licence | ✓ Unlimited users | ✓ |
| Isolated Docker agent containers per user | ✗ | ✗ | ✓ | ✓ |
| MCP tool integration + custom skills | ~ Limited | ~ Copilot Studio connectors only | ✓ Full MCP + skills | ✓ |
| File workspace — process & download outputs | ✓ Local files | ~ M365/SharePoint only | ✓ Any file type | ✓ |
| Zero per-seat / per-query SaaS fees | ✗ Per-seat billing | ✗ M365 E3/E5/E7 required | ✓ Free | ✓ Flat enterprise |
| CEO Command Center + KPI Dashboards | ✗ | ✗ | — | ✓ |
| IDE integration (VS Code, JetBrains) | ✗ | ✗ | — | ✓ All repos + orgs |
| Chat integrations (Slack, Teams, Cliq, Discord) | ✗ | ~ Teams only | — | ✓ All platforms |
| Knowledge Graph Server | ✗ | ~ Work IQ (M365 graph only) | — | ✓ Any data source |
| Database Semantic Layer (NL→SQL any warehouse) | ✗ | ✗ | — | ✓ |
| Smart Data Intelligence Agents (5 specialist) | ✗ | ✗ | — | ✓ |
| Code Migration Agents (full codebase, autonomous) | ✗ | ✗ | — | ✓ Fully autonomous |
| Code Review Agents (PR webhook, OWASP, perf) | ✗ | ✗ | — | ✓ + security + arch |
| Multi-agent parallel orchestration | ✗ | ~ Sequential M365 tasks only | — | ✓ |
| Open source — inspect, extend, own the code | ✗ Closed source | ✗ Closed source | ✓ | ✓ Core open |
Community Edition is the open-source foundation — free, self-hosted, and genuinely production-ready. Enterprise Edition is the only platform that extends that foundation all the way to full Data Intelligence and SDLC automation — capabilities no SaaS coworker or open-source alternative offers — built on the ezinsights.ai Data Intelligence & SDLC Framework.
Community Edition is free for internal business use — a complete self-hosted AI agent platform with multi-user support, MCP tools, file workspaces, and 22+ models. Enterprise adds the full ezinsights.ai Data Intelligence & SDLC suite — Command Center, KPI Dashboards, IDE & Chat integrations, Knowledge Graph, Semantic Layer, and specialist AI agents — contact sales@ezinsights.ai for pricing and onboarding.
| Capability | Community (Free · Internal Business Use) | Enterprise |
|---|---|---|
| Single agent execution | ✓ | ✓ |
| All LLM providers (Ollama, Anthropic, OpenAI, Google Gemini, Groq, OpenRouter) | ✓ | ✓ |
| LiteLLM proxy — translates Anthropic ↔ OpenAI format for all cloud providers | ✓ | ✓ |
| Latest 2026 models — claude-sonnet-4-6, gpt-4.1, gemini-3.1-pro, o3, o4-mini | ✓ | ✓ |
| Vision model flag (image input) + reasoning model flag (chain-of-thought) | ✓ | ✓ |
| MCP server support & plugin system | ✓ | ✓ |
| File workspace (upload / process / download) — handles any filename format | ✓ | ✓ |
| Shared chat links — visitors download output files without an account | ✓ | ✓ |
| Custom skills system | ✓ | ✓ |
| Multi-user platform + admin panel | ✓ | ✓ |
| Admin dashboard — failed response tracking (per user, model, day) | ✓ | ✓ |
| Admin tables — sortable & searchable across all views | ✓ | ✓ |
| Activity trends from DB creation date — no phantom empty rows | ✓ | ✓ |
| Conversation history (10 turns) | ✓ | ✓ |
| Docker Compose deployment | ✓ | ✓ |
| Multi-agent parallel execution | — | ✓ |
| Intent-based LLM execution planner | — | ✓ |
| Smart model tier routing | — | ✓ |
| CEO Command Center — conversational executive intelligence hub | — | ✓ |
| KPI Intelligence Dashboard — live AI-driven metrics with anomaly alerts | — | ✓ |
| Channel integrations (Zoho Cliq, Slack, WhatsApp, Teams, Discord) | — | ✓ |
| IDE integration — VS Code & JetBrains sidebar agent | — | ✓ |
| Knowledge Graph Server — org-wide entity & relationship intelligence | — | ✓ |
| Database Semantic Layer — natural-language-to-SQL over any warehouse | — | ✓ |
| Smart Data Intelligence Agents (Profiler, Lineage, Quality, Enrichment, Insight) | — | ✓ |
| Code Migration Agents — automated large-scale codebase migrations | — | ✓ |
| Code Review Agents — security, perf, architecture, PR comments via webhook | — | ✓ |
| Advanced admin intelligence dashboard — cost attribution & anomaly detection | — | ✓ |
| SSO/SAML + SOC 2 / ISO 27001 / GDPR audit logging | — | ✓ |
| Priority support & SLA | — | ✓ |
Internal Business Use License
EzCoworker Community Edition is free to deploy and use within your organisation for internal business purposes. You may not resell, sublicense, or offer it as a hosted service to third parties. Modifications for internal use are permitted; redistributed forks require written approval from ezinsights.ai. For commercial redistribution, SaaS deployments, or OEM embedding, contact
sales@ezinsights.ai.

Deploy Community Edition free, or reach out to unlock the full Enterprise Data Intelligence & SDLC suite — the most powerful AI agent platform available anywhere.