Configuration
LLM Models
Use Engine with most frontier LLMs
Engine works with all of the best frontier LLMs for code, and we add new models all the time.
Access to some models requires a paid subscription.
You can set the model on a per-repo basis from the repository settings menu.
Supported models:
Anthropic Claude 4 Sonnet
Anthropic Claude 4 Opus
Anthropic Claude 3.7 Sonnet
Anthropic Claude 3.5 Sonnet
Google Gemini 2.5 Pro Preview (beta)
OpenAI GPT 4.1
OpenAI o3 (3x usage costs)
OpenAI o4 Mini
Performance may vary on some models. Be prepared to experiment when changing Engine’s model.
LLM models are never trained or fine-tuned on any of your code or Engine usage.