# Microcode

Minimal terminal agent powered by RLMs.
Microcode is an efficient terminal-based AI agent with an internal REPL environment for coding assistance. It leverages Reasoning Language Models (RLMs) to help developers with coding tasks directly from the command line. Because it relies solely on RLMs, it can handle very large code snippets, file contents, and pasted content without dumping them directly into the context window. Run with the `--verbose` flag to view the trajectories, or "internal monologue," of the agent.
> **Warning:** Microcode is currently in beta and does not yet have the standard guardrails (such as asking the user to accept changes) that production coding agents provide. Use at your own risk.
## Features

- **Verbose Output** - Enable verbose output for debugging
- **Interactive CLI** - Seamless conversational interface with AI models
- **Multiple Model Support** - Choose from various AI providers (OpenAI, Anthropic, Google, Qwen, etc.)
- **MCP Integration** - Model Context Protocol server support for extended capabilities
- **Smart Caching** - Persistent settings, API keys, and model configurations
- **Rich Terminal UI** - Beautiful output with markdown rendering, gradient banners, and status indicators
- **Paste Support** - Handle large code snippets and file contents with ease
- **Configurable** - Environment variables and CLI flags for full customization
## Installation

### Prerequisites

Install Deno with one of the following:

```bash
curl -fsSL https://deno.land/install.sh | sh
# or
brew install deno
```

Install ripgrep (fast file search) with one of the following:

```bash
brew install ripgrep
```

For Debian/Ubuntu:

```bash
curl -LO https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep_14.1.1-1_amd64.deb
sudo dpkg -i ripgrep_14.1.1-1_amd64.deb
```

### Install Microcode

With uv:

```bash
uv tool install microcode
```

To upgrade:

```bash
uv tool upgrade microcode
```

With pip:

```bash
pip install microcode
```

From source:

```bash
git clone https://github.com/modaic-ai/microcode.git
cd microcode
uv sync  # or: pip install -e .
```

## Configuration

Microcode uses OpenRouter for model access. Set your API key using one of these methods:
1. **Environment Variable** (recommended for CI/CD):

   ```bash
   export OPENROUTER_API_KEY="your-api-key"
   ```

2. **Interactive Setup** (persisted to cache):

   ```
   microcode
   /key   # Then enter your API key when prompted
   ```
### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `OPENROUTER_API_KEY` | OpenRouter API key | - |
| `MICROCODE_MODEL` | Primary model ID | Auto-selected |
| `MICROCODE_SUB_LM` | Sub-model for auxiliary tasks | Auto-selected |
| `MICROCODE_MAX_ITERATIONS` | Max iterations per task | - |
| `MICROCODE_MAX_TOKENS` | Max tokens per response | - |
| `MICROCODE_MAX_OUTPUT_CHARS` | Max output characters | - |
| `MICROCODE_API_BASE` | Custom API base URL | - |
| `MICROCODE_VERBOSE` | Enable verbose logging (1/0) | 0 |
| `MODAIC_ENV` / `MICROCODE_ENV` | Environment (dev/prod) | prod |
## Usage

Start the interactive CLI:

```bash
microcode
```

View all options:

```bash
microcode --help
```

### CLI Flags

| Flag | Description |
|---|---|
| `--lm, -m` | Override primary model |
| `--sub-lm, -s` | Override sub-model |
| `--api-key, -k` | Provide API key directly |
| `--max-iterations` | Set max iterations |
| `--max-tokens` | Set max tokens |
| `--max-output-chars` | Set max output characters |
| `--api-base` | Custom API base URL |
| `--verbose, -v` | Enable verbose output |
| `--env` | Set environment (dev/prod) |
| `--history-limit` | Conversation history limit |
| `--no-banner` | Disable startup banner |
### Interactive Commands

Once in the CLI, use these commands:

| Command | Description |
|---|---|
| `/help`, `/h`, `?` | Show help menu |
| `/q`, `exit` | Exit the CLI |
| `/clear`, `/cls` | Clear the terminal screen |
| `/c` | Clear conversation history |
| `/key [key]` | Set OpenRouter API key (or enter interactively) |
| `/key clear` | Remove stored API key |
| `/model` | Change primary model via TUI selector |
| `/model <id>` | Set primary model directly |
| `/mcp add <name> <command>` | Add an MCP server |
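Slash commands like these are often routed through a simple lookup on the first token. The dispatcher below is a hypothetical sketch of that pattern (function name and return values are illustrative, not Microcode's actual code):

```python
# Hypothetical dispatcher for slash commands like those in the table above.
def handle_command(line: str, history: list[str]) -> str:
    cmd, _, arg = line.strip().partition(" ")
    if cmd in ("/help", "/h", "?"):
        return "help"
    if cmd in ("/q", "exit"):
        return "exit"
    if cmd == "/c":
        history.clear()  # drop the conversation history in place
        return "cleared"
    if cmd == "/model" and arg:
        return f"model set to {arg}"
    return "unknown"
```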
### Available Models

| # | Model | Provider |
|---|---|---|
| 1 | GPT-5.2 Codex | OpenAI |
| 2 | GPT-5.2 | OpenAI |
| 3 | Claude Opus 4.5 | Anthropic |
| 4 | Claude Opus 4 | Anthropic |
| 5 | Qwen 3 Coder | Qwen |
| 6 | Gemini 3 Flash Preview | Google |
| 7 | Kimi K2 0905 | Moonshot AI |
| 8 | Minimax M2.1 | Minimax |
| 9 | Add your own OpenRouter model | Custom |
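Custom entries take OpenRouter-style `provider/model` IDs. As a purely illustrative sketch (not Microcode's actual normalization logic), coercing user input into that form might look like:

```python
def normalize_model_id(raw: str, default_provider: str = "openai") -> str:
    """Illustrative normalization to OpenRouter's provider/model form.

    The default-provider fallback is an assumption for this sketch.
    """
    raw = raw.strip().lower()
    if "/" in raw:
        return raw  # already provider-qualified
    return f"{default_provider}/{raw}"
```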
### One-Shot Tasks

Run a single task without entering the interactive loop:

```bash
microcode task "Your task here"
```

## Project Structure

```
microcode/
├── main.py                  # Entry point and interactive CLI loop
├── pyproject.toml           # Project configuration and dependencies
├── utils/
│   ├── __init__.py
│   ├── cache.py             # API key and settings persistence
│   ├── constants.py         # Colors, models, paths, and banner art
│   ├── display.py           # Terminal rendering and UI utilities
│   ├── mcp.py               # MCP server integration
│   ├── models.py            # Model selection and configuration
│   └── paste.py             # Clipboard and paste handling
└── tests/
    └── test_main_settings.py
```

### Module Overview

- `main.py` - Orchestrates the interactive session, handles user input, manages conversation history, and invokes the RLM agent via Modaic's `AutoProgram`
- `utils/cache.py` - Secure storage for API keys and user preferences using JSON files
- `utils/constants.py` - Centralized configuration including available models, ANSI color codes, and file paths
- `utils/display.py` - Terminal output formatting, markdown rendering, and the startup banner
- `utils/models.py` - Model selection TUI using Textual, model ID normalization, and agent reconfiguration
- `utils/mcp.py` - Model Context Protocol server registration and management
- `utils/paste.py` - Handles large text inputs via placeholder replacement
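The placeholder-replacement approach described for `utils/paste.py` can be sketched roughly as follows. All names, the placeholder format, and the size threshold are assumptions for illustration, not the actual implementation:

```python
# Hypothetical sketch: large pastes are stored out-of-band and replaced
# in the prompt with a short placeholder token, keeping the context small.
_PASTES: dict[int, str] = {}


def stash_paste(text: str, threshold: int = 200) -> str:
    """Return the text itself, or a placeholder if it exceeds the threshold."""
    if len(text) <= threshold:
        return text
    key = len(_PASTES) + 1
    _PASTES[key] = text
    return f"[pasted #{key}: {len(text)} chars]"


def resolve_paste(key: int) -> str:
    """Look up the original pasted text when the agent actually needs it."""
    return _PASTES[key]
```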
## Development

```bash
git clone https://github.com/modaic-ai/microcode.git
cd microcode
uv sync --dev
```

Run the tests:

```bash
uv run pytest tests/
```

Core dependencies:
- `click` / `typer` - CLI framework
- `modaic` - RLM program execution
- `dspy` - Language model programming
- `mcp2py` - MCP integration
- `rich` - Rich terminal output
- `textual` - Terminal UI framework
- `prompt-toolkit` - Input handling
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the MIT License. See the LICENSE file for details.
## Links

- Homepage: https://modaic.dev/
- Repository: https://github.com/modaic-ai/microcode
- Issues: https://github.com/modaic-ai/microcode/issues
Built with ❤️ by Modaic

