BotBeat
OPEN SOURCE · Anthropic · 2026-03-17

rtk: Open-Source CLI Proxy Slashes LLM Token Consumption by 60-90%

Key Takeaways

  • rtk reduces LLM token consumption by 60-90% through intelligent output filtering and compression of CLI command results
  • The tool integrates with Claude Code via an optional global Bash hook that transparently rewrites commands without user intervention
  • A single 30-minute session showed an 80% token reduction (from ~118,000 to ~23,900 tokens) while maintaining development context
Source: Hacker News — https://github.com/rtk-ai/rtk

Summary

A new open-source command-line tool called rtk has been released that dramatically reduces token consumption for LLM-based coding assistants like Claude Code. The lightweight Rust-based proxy filters and compresses command outputs before they reach the LLM context window, achieving token savings of 60-90% across common development tasks. In a 30-minute Claude Code session, rtk reduced total token usage from approximately 118,000 to 23,900 tokens—an 80% reduction—while maintaining all necessary context for the AI assistant.

rtk works by applying four optimization strategies: smart filtering to remove noise, grouping to aggregate similar items, truncation to preserve relevant context, and deduplication to collapse repeated information. The tool supports a wide range of development workflows including git operations, file reading, search, and test runners. With zero dependencies, sub-10ms overhead, and easy installation via Homebrew or cargo, rtk integrates seamlessly into Claude Code through an optional global Bash hook that transparently rewrites commands to use compressed output.
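Two of the strategies described above, deduplication and truncation, can be sketched in a few lines of Rust. This is a hypothetical illustration of the general technique, not rtk's actual implementation; the function names and the `(xN)` / "lines omitted" markers are invented for this example.

```rust
// Hypothetical sketch: collapse consecutive duplicate lines, then truncate
// long output, keeping a note about what was dropped. Not rtk's real code.

fn compress_output(raw: &str, max_lines: usize) -> String {
    let mut out: Vec<String> = Vec::new();
    let mut last: Option<(&str, usize)> = None;

    // Deduplication: fold runs of identical lines into "line (xN)".
    for line in raw.lines() {
        match last {
            Some((prev, count)) if prev == line => last = Some((prev, count + 1)),
            Some((prev, count)) => {
                push_line(&mut out, prev, count);
                last = Some((line, 1));
            }
            None => last = Some((line, 1)),
        }
    }
    if let Some((prev, count)) = last {
        push_line(&mut out, prev, count);
    }

    // Truncation: keep the head of the output and note how much was cut.
    if out.len() > max_lines {
        let dropped = out.len() - max_lines;
        out.truncate(max_lines);
        out.push(format!("... [{dropped} more lines omitted]"));
    }
    out.join("\n")
}

fn push_line(out: &mut Vec<String>, line: &str, count: usize) {
    if count > 1 {
        out.push(format!("{line} (x{count})"));
    } else {
        out.push(line.to_string());
    }
}

fn main() {
    let raw = "warning: unused import\nwarning: unused import\nok";
    println!("{}", compress_output(raw, 10));
}
```

A real proxy would sit between the shell command and the LLM, applying filters like this per command type before the output ever enters the context window.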

The project addresses a real pain point for developers using AI coding assistants: context window limits and per-token costs. By cutting noise from command outputs—stripping comments from code summaries, condensing git diffs, and surfacing only test failures—rtk enables longer, more productive AI-assisted coding sessions without hitting token limits or incurring excessive API costs.
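The "show only test failures" idea can be illustrated with a simple line filter: keep lines that signal a failure or carry the final summary, and discard the passing-test noise. This is a hypothetical sketch of the general approach (the marker strings shown are assumptions), not rtk's actual filtering rules.

```rust
// Hypothetical illustration: forward only failure lines and the summary
// line from test-runner output, since passing tests add no signal for the LLM.
fn failures_only(test_output: &str) -> String {
    test_output
        .lines()
        .filter(|l| {
            l.contains("FAILED") || l.contains("error") || l.starts_with("test result")
        })
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let raw = "test parse ... ok\ntest encode ... FAILED\ntest result: FAILED. 1 passed; 1 failed";
    println!("{}", failures_only(raw));
}
```

On a suite where most tests pass, a filter like this discards the bulk of the output, which is consistent with the up-to-90% compression the project reports for test runs.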

  • rtk uses four optimization strategies: smart filtering, grouping, truncation, and deduplication tailored to different command types (git, test runners, file operations)
  • Available as a lightweight open-source Rust binary with zero dependencies and <10ms overhead, installable via Homebrew or cargo

Editorial Opinion

rtk represents a pragmatic approach to optimizing AI-assisted development workflows. While not groundbreaking technically, its focus on reducing token consumption through intelligent output curation addresses a genuine pain point for developers using context-window-limited models. The high compression ratios (up to 90% for test outputs) suggest the tool successfully distinguishes signal from noise—a valuable capability as AI coding assistants become more integral to development practices.

Large Language Models (LLMs) · AI Agents · Machine Learning · Open Source

© 2026 BotBeat