
Microcode

Minimal Terminal Agent powered by RLMs


Microcode is an efficient terminal-based AI agent with an internal REPL environment for coding assistance. It leverages Reasoning Language Models (RLMs) to help developers with coding tasks directly from the command line. Because it relies solely on RLMs, Microcode can handle very large code snippets, file contents, and pasted content without dumping them directly into the context window. Run with the --verbose flag to view the trajectories, i.e. the agent's "internal monologue".

Warning: Microcode is currently in beta and does not yet have the standard guardrails (such as asking the user to accept changes) that production coding agents provide. Use at your own risk.


Features

  • Verbose Output - Inspect the agent's reasoning trajectories for debugging
  • Interactive CLI - Seamless conversational interface with AI models
  • Multiple Model Support - Choose from various AI providers (OpenAI, Anthropic, Google, Qwen, etc.)
  • MCP Integration - Model Context Protocol server support for extended capabilities
  • Smart Caching - Persistent settings, API keys, and model configurations
  • Rich Terminal UI - Beautiful output with markdown rendering, gradient banners, and status indicators
  • Paste Support - Handle large code snippets and file contents with ease
  • Configurable - Environment variables and CLI flags for full customization

Installation

Prerequisites

  • Python 3.11 or higher
  • uv (recommended) or pip
  • Deno (runtime for the RLM's REPL environment)
  • ripgrep (install instructions below)

Install Deno with one of the following:

curl -fsSL https://deno.land/install.sh | sh
brew install deno

Install ripgrep with one of the following:

brew install ripgrep

For Debian/Ubuntu:

curl -LO https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep_14.1.1-1_amd64.deb
sudo dpkg -i ripgrep_14.1.1-1_amd64.deb

Install via uv (recommended)

uv tool install microcode

Upgrade via uv

uv tool upgrade microcode

Install via pip

pip install microcode

Install from source

git clone https://github.com/modaic-ai/microcode.git
cd microcode
uv sync  # or: pip install -e .
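
After syncing, you can run the local checkout directly. This assumes the package defines a microcode console script; the exact entry point may differ:

uv run microcode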

Configuration

API Key Setup

Microcode uses OpenRouter for model access. Set your API key using one of these methods:

  1. Environment Variable (recommended for CI/CD):

    export OPENROUTER_API_KEY="your-api-key"
  2. Interactive Setup (persisted to cache):

    microcode
    /key  # Then enter your API key when prompted

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| OPENROUTER_API_KEY | OpenRouter API key | - |
| MICROCODE_MODEL | Primary model ID | Auto-selected |
| MICROCODE_SUB_LM | Sub-model for auxiliary tasks | Auto-selected |
| MICROCODE_MAX_ITERATIONS | Max iterations per task | - |
| MICROCODE_MAX_TOKENS | Max tokens per response | - |
| MICROCODE_MAX_OUTPUT_CHARS | Max output characters | - |
| MICROCODE_API_BASE | Custom API base URL | - |
| MICROCODE_VERBOSE | Enable verbose logging (1/0) | 0 |
| MODAIC_ENV / MICROCODE_ENV | Environment (dev/prod) | prod |
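
For example, you can configure Microcode entirely through the environment before launching. The model ID below is only illustrative; use any model ID available through OpenRouter:

export OPENROUTER_API_KEY="your-api-key"
export MICROCODE_MODEL="qwen/qwen3-coder"
export MICROCODE_VERBOSE=1
microcode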

Usage

Starting the CLI

microcode

CLI Options

microcode --help

| Flag | Description |
| --- | --- |
| --lm, -m | Override primary model |
| --sub-lm, -s | Override sub-model |
| --api-key, -k | Provide API key directly |
| --max-iterations | Set max iterations |
| --max-tokens | Set max tokens |
| --max-output-chars | Set max output characters |
| --api-base | Custom API base URL |
| --verbose, -v | Enable verbose output |
| --env | Set environment (dev/prod) |
| --history-limit | Conversation history limit |
| --no-banner | Disable startup banner |
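
These flags can be combined in a single invocation; the model ID below is illustrative:

microcode --lm "qwen/qwen3-coder" --verbose --history-limit 50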

Interactive Commands

Once in the CLI, use these commands:

| Command | Description |
| --- | --- |
| /help, /h, ? | Show help menu |
| /q, exit | Exit the CLI |
| /clear, /cls | Clear the terminal screen |
| /c | Clear conversation history |
| /key [key] | Set OpenRouter API key (or enter interactively) |
| /key clear | Remove stored API key |
| /model | Change primary model via TUI selector |
| /model <id> | Set primary model directly |
| /mcp add <name> <command> | Add an MCP server |
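
For example, a first session might store a key, pick a model, and register an MCP server. The key value and the server command are placeholders that only illustrate the syntax:

/key sk-or-your-key
/model
/mcp add fs "npx -y @modelcontextprotocol/server-filesystem ."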

Available Models

| # | Model | Provider |
| --- | --- | --- |
| 1 | GPT-5.2 Codex | OpenAI |
| 2 | GPT-5.2 | OpenAI |
| 3 | Claude Opus 4.5 | Anthropic |
| 4 | Claude Opus 4 | Anthropic |
| 5 | Qwen 3 Coder | Qwen |
| 6 | Gemini 3 Flash Preview | Google |
| 7 | Kimi K2 0905 | Moonshot AI |
| 8 | Minimax M2.1 | Minimax |
| 9 | Add your own OpenRouter model | Custom |

Single Task Mode

microcode task "Your task here"
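
For example, an illustrative one-off task against this repository's own layout:

microcode task "Add docstrings to the functions in utils/paste.py"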

Project Structure

microcode/
├── main.py              # Entry point and interactive CLI loop
├── pyproject.toml       # Project configuration and dependencies
├── utils/
│   ├── __init__.py
│   ├── cache.py         # API key and settings persistence
│   ├── constants.py     # Colors, models, paths, and banner art
│   ├── display.py       # Terminal rendering and UI utilities
│   ├── mcp.py           # MCP server integration
│   ├── models.py        # Model selection and configuration
│   └── paste.py         # Clipboard and paste handling
└── tests/
    └── test_main_settings.py

Key Components

  • main.py - Orchestrates the interactive session, handles user input, manages conversation history, and invokes the RLM agent via Modaic's AutoProgram
  • utils/cache.py - Secure storage for API keys and user preferences using JSON files
  • utils/constants.py - Centralized configuration including available models, ANSI color codes, and file paths
  • utils/display.py - Terminal output formatting, markdown rendering, and the startup banner
  • utils/models.py - Model selection TUI using Textual, model ID normalization, and agent reconfiguration
  • utils/mcp.py - Model Context Protocol server registration and management
  • utils/paste.py - Handles large text inputs via placeholder replacement

Development

Setting Up Development Environment

git clone https://github.com/modaic-ai/microcode.git
cd microcode
uv sync --dev

Running Tests

uv run pytest tests/
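
To run a single test module (the file name comes from the project tree above):

uv run pytest tests/test_main_settings.py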

Dependencies

Core dependencies are declared in pyproject.toml.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.



Built by Modaic
