AI Provider Setup

Moraya has a built-in AI assistant panel that supports multiple LLM providers with streaming responses.

Opening the AI Panel

Toggle the AI chat panel with Cmd+Shift+I (macOS) or Ctrl+Shift+I (Windows/Linux). The panel opens as a 340px sidebar on the right side of the editor.

Configuring Providers

Open Settings (Cmd+,) > AI tab. Select your provider and enter the required API key:

| Provider | API Key Required | Notes |
|----------|------------------|-------|
| Claude (Anthropic) | Yes | Anthropic API key from console.anthropic.com |
| OpenAI | Yes | OpenAI API key from platform.openai.com |
| Gemini (Google) | Yes | Google AI Studio key from aistudio.google.com |
| DeepSeek | Yes | DeepSeek API key from platform.deepseek.com |
| Ollama | No | Runs locally. Install Ollama and pull a model first. |
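
In practice, a provider entry boils down to a provider name plus an optional API key. The shape below is a hypothetical illustration only; the type, field names, and model identifier are assumptions, not Moraya's actual settings schema.

```typescript
// Hypothetical illustration of the provider options listed above; the
// type and field names are assumptions, not Moraya's actual settings schema.
type AIProvider = "claude" | "openai" | "gemini" | "deepseek" | "ollama";

interface AIProviderConfig {
  provider: AIProvider;
  apiKey?: string;  // required for every provider except Ollama
  model?: string;   // e.g. a model already pulled locally for Ollama
}

// Cloud provider: needs a key from the provider's console.
const claude: AIProviderConfig = {
  provider: "claude",
  apiKey: "<your-anthropic-api-key>",
};

// Local provider: no key, but Ollama must be installed with a model pulled.
const ollama: AIProviderConfig = {
  provider: "ollama",
  model: "llama3.2",  // assumed example model name
};
```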

Custom Endpoints

Moraya supports any OpenAI-compatible API endpoint. Enter the base URL and API key in the custom provider section.
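
As a minimal sketch of what "OpenAI-compatible" means here, such an endpoint accepts the standard chat-completions request shape with Bearer-token authentication. The base URL, API key, and model name below are placeholders.

```typescript
// Placeholder base URL, key, and model for an OpenAI-compatible endpoint.
const baseUrl = "https://my-llm-gateway.example.com/v1";
const apiKey = "<your-api-key>";

const response = await fetch(`${baseUrl}/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: "my-model",
    messages: [{ role: "user", content: "Hello from Moraya" }],
  }),
});

console.log((await response.json()).choices[0].message.content);
```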

Streaming

All providers support streaming, either over Server-Sent Events (SSE) or through a ReadableStream; responses appear character by character in the AI panel.
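
The sketch below shows one way to consume such a stream, assuming an OpenAI-compatible endpoint called with "stream": true. The baseUrl, apiKey, model name, and onToken callback are placeholders, and Moraya's internal streaming code may be structured differently.

```typescript
// Minimal SSE/ReadableStream consumer for an OpenAI-compatible endpoint.
async function streamChat(
  baseUrl: string,
  apiKey: string,
  prompt: string,
  onToken: (token: string) => void,
): Promise<void> {
  const response = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "my-model",
      stream: true,
      messages: [{ role: "user", content: prompt }],
    }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each SSE chunk carries one or more "data: {...}" lines; a robust
    // parser would also buffer lines split across chunk boundaries.
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (!line.startsWith("data: ") || line.includes("[DONE]")) continue;
      const delta = JSON.parse(line.slice(6)).choices?.[0]?.delta?.content;
      if (delta) onToken(delta); // e.g. append to the AI panel
    }
  }
}

// Usage: print tokens as they arrive.
// streamChat("https://api.openai.com/v1", "<your-api-key>", "Hello", t => console.log(t));
```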