- ✨ Create background agents with full shell access
- E.g. "Generate a NotebookLM-style podcast from my saved articles every morning"
- 🔧 Connect any MCP server to add capabilities
- Add MCP servers and RowboatX handles the integration
- 🎯 Let RowboatX control and monitor your background agents
- Easily inspect state on the filesystem
Inspired by Claude Code, RowboatX brings the same shell-native power to background automations.
- Install RowboatX
npx @rowboatlabs/rowboatx
- Configure LLM (defaults to OpenAI)
edit ~/.rowboat/config/models.json
Then set your API key in your environment. Supported providers: OpenAI, Ollama, Anthropic, Gemini, LM Studio, OpenRouter, and LiteLLM.
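For example, with the default OpenAI provider, exporting the standard OpenAI key variable before launching is enough (the variable name follows the OpenAI SDK convention; other providers read their own key variables):

export OPENAI_API_KEY="sk-..."  # assumes the default OpenAI provider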
$ rowboatx
- Add MCP: 'Add this MCP server config: <config>' (an example config is sketched below)
- Explore tools: 'What tools are there in <server-name>?'
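As a sketch, a config you might paste into that prompt could look like the snippet below; it follows the common mcpServers convention used by MCP clients, and the filesystem server shown is only an illustration:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}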
$ rowboatx
- 'Create an agent to do X.'
- '... Attach the correct tools from <mcp-server-name> to the agent'
- '... Allow the agent to run shell commands including ffmpeg'
$ rowboatx
- 'Make agent <background-agent-name> run every day at 10 AM'
- 'What agents do I have scheduled to run, and at what times?'
- 'When was <background-agent-name> last run?'
- 'Are any agents waiting for my input or confirmation?'
rowboatx --agent=<agent-name> --input="xyz" --no-interactive=true # run an agent non-interactively with the given input
rowboatx --agent=<agent-name> --run_id=<run_id> # resume from a previous run
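For example, the non-interactive mode makes it easy to drive an agent from an external scheduler such as cron; the agent name, input, and log path below are hypothetical:

# crontab entry: run a (hypothetical) morning-podcast agent every day at 10 AM
0 10 * * * rowboatx --agent=morning-podcast --input="generate today's episode" --no-interactive=true >> "$HOME/rowboatx-morning-podcast.log" 2>&1

Note that cron runs with a minimal environment, so rowboatx needs to be on its PATH (or invoked by absolute path) and any API key variables must be set in the crontab.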
You can configure your models in ~/.rowboat/config/models.json:
{
"providers": {
"openai": {
"flavor": "openai"
},
"openai-compatible-host": {
"flavor": "openai",
"baseURL": "http://localhost:2000/...",
"apiKey": "...",
"headers": {
"foo": "bar"
}
},
"anthropic": {
"flavor": "anthropic"
},
"google": {
"flavor": "google"
},
"ollama": {
"flavor": "ollama"
}
},
"defaults": {
"provider": "openai",
"model": "gpt-5"
}
}

To use Rowboat Classic UI (not RowboatX), refer to Classic.
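As a further sketch, a fully local setup could point the defaults in ~/.rowboat/config/models.json at the ollama provider; the model name below is illustrative, so substitute whichever model you have pulled with Ollama:

{
  "providers": {
    "ollama": {
      "flavor": "ollama"
    }
  },
  "defaults": {
    "provider": "ollama",
    "model": "llama3.1"
  }
}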

