The self-hosted AI platform for chat, assistants & apps
Run chat, configurable assistants, and installable apps. Full control of your code and data.
npx create-ai-portal@latest
or Get Started
Everything you need to run AI in production
One platform for chat, assistants, and apps. Add agents and apps; optionally install plugins. Configure once, scale with your team.
Chat & Central Assistant
Built-in chat UI and configurable Central assistant. Connect OpenAI, Gemini, or your own LLM via Admin.
Assistants & Apps
Register Agent APIs (assistants) and Applications (apps with UI). Each has its own alias and endpoint. Multi-agent workflows supported.
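Since each registered assistant gets its own alias and endpoint, calling one over REST might look like the sketch below. The host, path, alias ("support-bot"), and payload shape are illustrative assumptions, not the documented AI-Portal API.

```shell
# Hypothetical call to a registered agent by its alias.
# URL, path, and JSON fields are assumptions for illustration only.
curl -s -X POST "http://localhost:3000/api/agents/support-bot" \
  -H "Content-Type: application/json" \
  -d '{"message": "Summarize the latest release notes"}'
```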
Plugins (optional)
Install optional plugins from Admin → Plugins when you need extra features. Not required by default.
Self-hosted
Docker Compose stack: PostgreSQL, MinIO, backend, frontend. You own the data and the infrastructure.
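A stack like this might be sketched in a `docker-compose.yml` along the following lines; the service names, images, and ports are illustrative assumptions, not the file the scaffolder actually generates.

```yaml
# Illustrative sketch only; the scaffolded compose file will differ.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: change-me   # placeholder credential
  minio:
    image: minio/minio
    command: server /data
  backend:
    build: ./backend                 # assumed project layout
    depends_on: [postgres, minio]
  frontend:
    build: ./frontend                # assumed project layout
    ports:
      - "3000:80"                    # assumed published port
```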
Setup wizard
First run: /setup for language, branding, database, admin account, and optional Central LLM.
Admin panel
Manage agents, applications, system settings, and users from a single admin UI.
Solutions for every use case
From side projects to production. AI-Portal adapts to your needs.
For Developers
Integrate AI-Portal with your stack. REST APIs, Docker, and a clear Admin for agents and apps.
Quick Start →
For Teams
Central assistant and custom assistants. Share one portal across the team with role-based access.
Documentation →
Enterprise
Self-host on your infrastructure. Full control over data, compliance, and scaling.
Deploy guide →
Built for developers
Get started in minutes. No lock-in. Use the tools you already love.
Quick Start
Create a new project with one command and run with Docker.
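The two steps might run as follows; the project name is arbitrary, and the `docker compose` invocation assumes the scaffold ships a compose file, as the self-hosted stack described above suggests.

```shell
# Scaffold a new portal project ("my-portal" is an arbitrary name)
npx create-ai-portal@latest my-portal
cd my-portal
# Start the stack; assumes the scaffold includes a docker-compose.yml
docker compose up -d
# On first run, visit /setup in the browser to finish configuration
```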
npx create-ai-portal@latest
Documentation
Setup, Admin, agents, applications, and system settings.
GitHub
Source code, issues, and contributions.
npm
create-ai-portal package for scaffolding.
LakeFlow
Data Lake pipelines for embeddings & RAG. Ingest docs, store embeddings in Qdrant, expose a semantic search API.
Ecosystem
AI-Portal works alongside complementary open-source projects for a complete AI stack.
LakeFlow: Data Lake pipelines for Vector DB & AI
Ingest raw documents (PDF, Excel), run staged pipelines, produce embeddings, and store in Qdrant. Expose semantic search and embedding APIs for RAG, LLM, and AI-Portal agents.
- Ingest & pipeline
- Embeddings & Qdrant
- Semantic search API
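A query against the semantic search API might look like this hypothetical sketch; the host, path, and JSON fields (`query`, `top_k`) are assumptions for illustration, not LakeFlow's documented interface.

```shell
# Hypothetical LakeFlow semantic search request;
# endpoint and payload are assumptions, not the real API.
curl -s -X POST "http://localhost:8080/api/search" \
  -H "Content-Type: application/json" \
  -d '{"query": "quarterly revenue by region", "top_k": 5}'
```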
Ready to deploy?
Start with a free, self-hosted AI-Portal. Run in Docker in minutes.
Open source (MIT). No account required. You own your data.