A new /ai/chat SSE endpoint is live via a fresh Elysia AI plugin, proxying streaming chat through TanStack AI to OpenAI, Anthropic, Gemini, Ollama, and Grok (with a per-request x-provider-api-key header, so keys never hit storage). Users can now build real-time chat UIs against the Epicenter server, with better disconnect cleanup and clearer provider errors.
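A minimal client sketch of consuming the endpoint. The request body shape (`provider`, `messages`) is an assumption for illustration, not a confirmed API contract; only the endpoint path and the `x-provider-api-key` header come from the note above.

```typescript
// Parse one SSE frame into its data payloads ("data: ..." lines).
function parseSSEFrame(frame: string): string[] {
  return frame
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).trim());
}

// Stream a chat completion from the Epicenter server. The per-request
// API key travels in the x-provider-api-key header, so it is forwarded
// to the provider but never persisted server-side.
async function streamChat(baseUrl: string, apiKey: string, prompt: string) {
  const res = await fetch(`${baseUrl}/ai/chat`, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-provider-api-key": apiKey, // per-request, never stored
    },
    // Hypothetical body shape; check the plugin's schema for the real one.
    body: JSON.stringify({
      provider: "openai",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok || !res.body) throw new Error(`chat failed: ${res.status}`);

  const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
  let buffer = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    // SSE frames are separated by a blank line.
    const frames = buffer.split("\n\n");
    buffer = frames.pop() ?? "";
    for (const frame of frames) {
      for (const data of parseSSEFrame(frame)) process.stdout.write(data);
    }
  }
}
```

Buffering on the blank-line delimiter matters because a network chunk can split an SSE frame mid-line; only complete frames are parsed, and the remainder is carried into the next read.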

AI chat streaming lands in server - EpicenterHQ/epicenter