The Agentic CLI

CONNECT ANY LLM.

Local or remote, it doesn't matter. Work with whichever model is most convenient for you.

ANYLLM v2.5
> Explain this file @src/App.php and fix the bug on line 42
Thinking...
✦ Applying changes to: src/App.php
│ 41 - return $data;
│ 42 + return $data ?? [];

Integration Made Simple

No installers, no complex environment variables, no headaches.

Quick Install
curl -sSfL https://anyllm.tech/install.sh | sh

Why AnyLLM is better

Stop being limited by your tools.

The Problem with Others

  • Other tools lock you into their cloud models.
  • Local model support is often broken or an afterthought.
  • Over-engineered agents confuse small models like Phi-3 or the smaller DeepSeek variants.

The AnyLLM Way

  • Connect to ANY provider: local Ollama, Gemini, OpenAI, and more.
  • Lightweight agent logic: designed to stay reliable even on 7B models.
  • No bloat: clean code with zero dependencies.
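
In practice, "any provider" mostly means any OpenAI-compatible endpoint. A minimal sketch of the idea (the base URLs are the providers' documented OpenAI-compatible routes; the mapping itself is illustrative, not AnyLLM's actual configuration format):

```python
# Illustrative only: each provider below exposes an OpenAI-compatible
# chat-completions API, so one client implementation covers them all.
# This dict is a sketch, not AnyLLM's real configuration schema.
PROVIDERS = {
    "ollama": "http://localhost:11434/v1",  # local Ollama server
    "openai": "https://api.openai.com/v1",
    "gemini": "https://generativelanguage.googleapis.com/v1beta/openai",
}

def chat_url(provider: str) -> str:
    # Every provider answers on the same route shape.
    return PROVIDERS[provider] + "/chat/completions"

print(chat_url("ollama"))  # http://localhost:11434/v1/chat/completions
```

Swapping providers then comes down to swapping a base URL and an API key, which is why a single lightweight client can stay vendor-neutral.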
Feature                   | Competitors              | AnyLLM
Local model support       | Poor / hard to configure | Native (Ollama/GGUF)
Agent logic for 7B models | Often unreliable         | Optimized & simple
Vendor lock-in            | High                     | Zero

Technical Superiority

AnyLLM isn't just another wrapper. It's a re-engineered approach to CLI-AI interaction.

⚡️

Zero-Runtime Overhead

While others rely on heavy Python environments or massive node_modules directories, AnyLLM runs on native PHP with a memory footprint under 40 MB. Performance is instant, even on a low-end VPS.

🎯

Deterministic Logic

Competitors use complex chain-of-thought prompting that confuses 7B models. Our Atomic Agent Logic breaks tasks into small binary steps, making local LLMs as reliable as GPT-4.
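
As a hypothetical illustration of the idea (not AnyLLM's actual implementation), atomic stepping replaces one sprawling prompt with a sequence of narrow questions, each expecting one short, checkable answer:

```python
# Hypothetical sketch of "atomic" agent prompting, for illustration only.
# Each step asks the model a single question with a constrained answer,
# so a wrong turn is caught immediately instead of derailing a long chain.
def atomic_steps(task: str) -> list[str]:
    return [
        f"Does the task '{task}' require editing a file? Answer YES or NO.",
        "Name exactly one file to edit, nothing else.",
        "Output only the unified diff for that file.",
    ]

for prompt in atomic_steps("fix the bug on line 42"):
    print(prompt)
```

Small models handle a yes/no or a single filename far more reliably than a free-form multi-step plan, which is the whole bet behind this design.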

🔓

True Privacy

No telemetry. No "middleman" servers. Your requests go directly from your terminal to your local Ollama or chosen API. 100% OpenAI compatible.
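
For example, a direct request to a local Ollama server goes through its documented OpenAI-compatible endpoint, so nothing leaves your machine. A sketch using only Python's standard library ("llama3" is an example model name; use whatever you have pulled locally):

```python
import json
import urllib.request

# Request payload in the standard OpenAI chat-completions shape.
# "llama3" is an example; substitute any model pulled into your local Ollama.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Explain this function."}],
}

req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",  # Ollama's OpenAI-compatible route
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# urllib.request.urlopen(req) would send it; the same payload works
# unchanged against any other OpenAI-compatible endpoint.
```

The terminal talks straight to localhost: there is no relay server in the middle to log or inspect your code.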

Beyond a Simple Chat:
Your Terminal, Empowered.

01
Legacy Code Surgeon

Pass an entire legacy file using @filename and ask AnyLLM to refactor it. It reads, analyzes, and applies diffs precisely.

02
Context-Aware Grep

Stop guessing. AnyLLM uses its [[GREP]] tool to find logic across your project and explain how pieces connect.

03
Server-Side Assistant

Since it's PHP-based, you can run it on almost any production or staging server where Python is forbidden or unavailable.

Real Utility Example
> @src/Database.php Find potential SQL injections and fix them.
"AnyLLM found 3 vulnerable queries, performed a project-wide search for similar patterns, and applied patches in 12 seconds."

Switching from
OpenCode or Codex?

We made AnyLLM compatible with common patterns. Your muscle memory stays, but the limitations disappear.

  • Standard OpenAI Headers
  • Familiar @file syntax
  • Common Tool-Calling
  • Easy History Export
Migration Time
< 2 min
average setup time for existing OpenAI / Ollama users
🛠

Transparent Agent

Logic is simplified so even small models (Phi-3, DeepSeek 7B) can manage files without errors.

🏠

Local-First

Full support for Ollama and local GGUF models. No cloud lock-in.