# AI Provider Setup
Moraya has a built-in AI assistant panel that supports multiple LLM providers with streaming responses.
## Opening the AI Panel
Toggle the AI chat panel with Cmd+Shift+I (macOS) or Ctrl+Shift+I (Windows/Linux). The panel opens as a 340px sidebar on the right side of the editor.
## Configuring Providers
Open Settings (Cmd+,) > AI tab. Select your provider and enter the required API key:
| Provider | API Key Required | Notes |
|---|---|---|
| Claude (Anthropic) | Yes | Anthropic API key from console.anthropic.com |
| OpenAI | Yes | OpenAI API key from platform.openai.com |
| Gemini (Google) | Yes | Google AI Studio key from aistudio.google.com |
| DeepSeek | Yes | DeepSeek API key from platform.deepseek.com |
| Ollama | No | Runs locally. Install Ollama and pull a model first. |
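The exact settings schema is internal to Moraya, but conceptually each provider entry boils down to a provider id, an optional API key, a model name, and (for local or custom providers) a base URL. The sketch below is illustrative only; the type and field names are assumptions, not Moraya's actual configuration format.

```typescript
// Hypothetical shape of an AI provider configuration -- field names are
// illustrative and not Moraya's actual settings schema.
type ProviderId = "claude" | "openai" | "gemini" | "deepseek" | "ollama";

interface AIProviderConfig {
  provider: ProviderId;
  apiKey?: string;   // required for cloud providers, omitted for Ollama
  model: string;     // model identifier as exposed by the provider
  baseUrl?: string;  // only needed for local or custom endpoints
}

// Cloud provider: API key required.
const claude: AIProviderConfig = {
  provider: "claude",
  apiKey: process.env.ANTHROPIC_API_KEY ?? "",
  model: "claude-sonnet-example",      // placeholder model name
};

// Local provider: no key, just point at the local Ollama server.
const ollama: AIProviderConfig = {
  provider: "ollama",
  model: "llama3",
  baseUrl: "http://localhost:11434",   // Ollama's default port
};
```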
## Custom Endpoints
Moraya supports any OpenAI-compatible API endpoint. Enter the base URL and API key in the custom provider section.
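For reference, here is a minimal sketch of how an OpenAI-compatible endpoint is typically called. The base URL, key, and model name are placeholders, and this is not Moraya's internal client code; it only illustrates the request and response shape such endpoints share.

```typescript
// Sketch of a non-streaming call against an OpenAI-compatible endpoint.
// The base URL, key, and model below are placeholders, not Moraya internals.
const baseUrl = "https://my-gateway.example.com/v1";   // custom provider base URL
const apiKey = process.env.CUSTOM_API_KEY ?? "";

async function chatCompletion(prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,               // same auth scheme as OpenAI
    },
    body: JSON.stringify({
      model: "my-model",                               // whatever the endpoint serves
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;              // standard OpenAI response shape
}
```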
## Streaming
All providers support streaming, delivered via Server-Sent Events (SSE) or a ReadableStream response body. Responses appear character-by-character in the AI panel as they arrive.
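As a rough illustration, the sketch below consumes an OpenAI-style SSE stream through the response body's ReadableStream. The `data:` line parsing and the `[DONE]` sentinel follow the OpenAI-compatible streaming format; the endpoint, model, and key are placeholders rather than Moraya's actual implementation.

```typescript
// Sketch of reading an OpenAI-style SSE stream token by token.
// Placeholder endpoint/model/key -- not Moraya's internal client.
async function streamChat(
  baseUrl: string,
  apiKey: string,
  prompt: string,
  onToken: (token: string) => void,
): Promise<void> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "my-model",                               // placeholder model name
      stream: true,                                    // request SSE chunks
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();                 // ReadableStream of raw bytes
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each SSE event arrives as a "data: {...}" line; keep partial lines for later.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      const trimmed = line.trim();
      if (!trimmed.startsWith("data:")) continue;
      const payload = trimmed.slice("data:".length).trim();
      if (payload === "[DONE]") return;                // end-of-stream sentinel
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (delta) onToken(delta);                       // append token to the panel
    }
  }
}
```

Calling `streamChat(baseUrl, apiKey, "Hello", t => appendToPanel(t))` (where `appendToPanel` stands in for whatever updates the UI) would surface the reply incrementally rather than waiting for the full response.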