There are two kinds of scenarios for using Lima with AI:
- AI agents inside Lima: running an AI agent inside a VM
- AI agents outside Lima: calling Lima’s MCP tools from an AI agent running outside a VM
Lima is useful for running AI agents (e.g., Codex, Claude, Gemini) in a VM, to prevent them from directly reading, writing, or executing files on the host.
Lima v2.0 is planned to be released with built-in templates for well-known AI agents.
For Lima v1.x, you can install AI agents in Lima manually.
For example, to install and run the Gemini CLI:
lima sudo apt install -y npm
lima sudo npm install -g @google/gemini-cli
lima gemini
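Other agents can be installed the same way. A sketch for Claude Code, assuming its npm package is named @anthropic-ai/claude-code and provides a claude binary:

```shell
# Install Claude Code globally inside the default Lima instance
lima sudo npm install -g @anthropic-ai/claude-code
# Run the agent inside the VM
lima claude
```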
Starting with Lima v2.0, Lima provides Model Context Protocol (MCP) tools for reading, writing, and executing local files using a VM sandbox.
⚡ Requirement: Lima >= 2.0
This page describes how to use Lima as a sandbox for Google Gemini CLI.
In addition to Gemini and Lima, make sure that the limactl mcp plugin is installed:
$ limactl mcp -v
limactl-mcp version 2.0.0-alpha.1
The limactl mcp plugin has been bundled with Lima since v2.0; however, it may not be installed depending on the installation method.
Launch an instance with the current directory mounted:
limactl start --mount-only "$(pwd):w" default
Drop the :w suffix if you do not want to allow writing to the mounted directory.
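To confirm the mount behaves as intended, you can try writing to it from inside the guest. A quick check, assuming the instance is named default and your shell is still in the mounted directory (limactl shell enters the matching directory in the guest when it is mounted):

```shell
# Succeeds when the directory was mounted with :w;
# fails with a permission error on a read-only mount.
lima sh -c "touch .lima-write-test && rm .lima-write-test && echo writable"
```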
Create .gemini/extensions/lima/gemini-extension.json as follows:
{
"name": "lima",
"version": "2.0.0",
"mcpServers": {
"lima": {
"command": "limactl",
"args": [
"mcp",
"serve",
"default"
]
}
}
}
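To check that the command in the extension manifest works, you can run it manually; it serves MCP (presumably over stdio, as is conventional for command-based MCP servers), so it waits for a client until you press Ctrl-C. The instance name default is assumed to match the instance you started:

```shell
# Start the MCP server for the "default" instance by hand
limactl mcp serve default
```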
Edit .gemini/settings.json to disable Gemini CLI's built-in tools, except those unrelated to local command execution and file I/O:{
"coreTools": ["WebFetchTool", "WebSearchTool", "MemoryTool"]
}
Just run gemini in your project directory.
Gemini automatically recognizes the MCP tools provided by Lima.