feat: update README to reflect model pricing display, enhanced context compaction, and new interactive/JSON-based configuration.

2026-03-04 12:09:03 +01:00
parent d12a75bcfb
commit e6a607f3c4


@@ -24,7 +24,8 @@ This eliminates:
- **Interactive REPL**: Chat with an AI model to edit files, manage directories, and execute commands
- **Slash Commands**: `/setup`, `/help`, `/exit`, `/clear`, `/status`, `/compact`
- **Token Tracking**: Real-time token usage and cost per response, plus session totals
-- **Context Compaction**: Automatic conversation history compression when approaching context limits
+- **Model Pricing Display**: Shows current model pricing from OpenRouter in the header
+- **Context Compaction**: Automatic conversation history compression when approaching context limits, including stale tool result compaction
- **Comprehensive Toolset**: 15 tools for file operations, editing, directory management, and command execution
- **AOT-Ready**: Native AOT compilation for ~12 MB binaries with no .NET runtime dependency
- **Rich CLI**: Beautiful terminal output using Spectre.Console with tables, rules, and colored text
@@ -40,13 +41,11 @@ This eliminates:
## Quick Start
```bash
# Option 1: Set environment variables
export ANCHOR_API_KEY=your_key_here
export ANCHOR_MODEL=qwen3.5-27b # optional, default: gpt-4o
# Run the application
dotnet run --project AnchorCli
# Option 2: Use interactive setup
# First time? The app will prompt you to run /setup
# Or run it explicitly:
dotnet run --project AnchorCli
/setup
```
@@ -60,20 +59,11 @@ dotnet publish AnchorCli -r linux-x64 -c Release
The resulting binary is ~12 MB, has no .NET runtime dependency, and starts instantly.
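The same publish command works for other targets by swapping the runtime identifier; a sketch using standard .NET RIDs (the ~12 MB figure above is measured for linux-x64 — sizes on other platforms are not stated in this README):

```shell
# Publish self-contained AOT binaries for other platforms
dotnet publish AnchorCli -r win-x64 -c Release
dotnet publish AnchorCli -r osx-arm64 -c Release
```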
## Environment Variables
| Variable | Default | Description |
|---|---|---|
| `ANCHOR_API_KEY` | *(required)* | API key for the LLM provider |
| `ANCHOR_ENDPOINT` | `https://openrouter.ai/api/v1` | Any OpenAI-compatible endpoint |
| `ANCHOR_MODEL` | `gpt-4o` | Model name |
| `ANCHOR_MAX_TOKENS` | `4096` | Max response tokens |
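Because `ANCHOR_ENDPOINT` accepts any OpenAI-compatible endpoint, Anchor can also target a self-hosted server; a minimal sketch (the URL and model name below are illustrative examples, not project defaults):

```shell
# Example: point Anchor at a local OpenAI-compatible server instead of OpenRouter
export ANCHOR_API_KEY=local-key
export ANCHOR_ENDPOINT=http://localhost:11434/v1   # e.g. Ollama's OpenAI-compatible API
export ANCHOR_MODEL=llama3
dotnet run --project AnchorCli
```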
## Slash Commands
| Command | Description |
|---|---|
-| `/setup` | Run interactive TUI to configure API key, model, endpoint |
+| `/setup` | Run interactive TUI to configure API key and model (also accessible via `anchor setup` subcommand) |
| `/help` | Show available tools and commands |
| `/exit` | Exit the application |
| `/clear` | Clear the conversation history |
@@ -113,7 +103,7 @@ The resulting binary is ~12 MB, has no .NET runtime dependency, and starts insta
```
AnchorCli/
├── Program.cs # Entry point + REPL loop + AI client setup
-├── AnchorConfig.cs # Environment variable configuration
+├── AnchorConfig.cs # JSON file-based configuration (%APPDATA%\anchor\config.json)
├── ContextCompactor.cs # Conversation history compression
├── AppJsonContext.cs # Source-generated JSON context (AOT)
├── Hashline/
@@ -137,7 +127,7 @@ AnchorCli/
## How It Works
-1. **Setup**: Configure API credentials via environment variables or `/setup` command
+1. **Setup**: Configure API credentials via the `/setup` command (or `anchor setup` subcommand)
2. **REPL Loop**: You interact with the AI through a conversational interface
3. **Tool Calling**: The AI can call any of the available tools to read/edit files, manage directories, or execute commands
4. **Hashline Validation**: All file edits are validated using the Hashline technique to ensure precision
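Given the environment variables above, the `config.json` that `/setup` writes presumably mirrors them; a plausible shape (all field names here are assumptions, not taken from the source):

```json
{
  "apiKey": "your_key_here",
  "endpoint": "https://openrouter.ai/api/v1",
  "model": "gpt-4o",
  "maxTokens": 4096
}
```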