Configuration

The heart of AnyLLM's flexibility lies in its configuration file, anyllm.json. This file allows you to define and manage connections to various AI providers and models.

The anyllm.json file

This JSON file is where you configure all of your AI providers, and it is structured to be easy to read and manage: a top-level provider object holds a collection of individual provider configurations, one entry per provider connection.
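As an illustrative sketch of that shape (the key names below, such as "openai-main", "name", "apiKey", and "model", are hypothetical placeholders, not documented AnyLLM fields), a file with a single provider entry might look like:

```json
{
  "provider": {
    "openai-main": {
      "name": "OpenAI",
      "apiKey": "YOUR_API_KEY",
      "model": "gpt-4o"
    }
  }
}
```

Each key under the provider object identifies one connection, so you can define several entries side by side and switch between them.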

Getting Started

When you first start AnyLLM, it will look for the anyllm.json file in the root of the project. If it doesn't exist, you'll need to create it.
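If the file is missing, one way to bootstrap it is to write a minimal skeleton into the project root (the "provider" key here mirrors the structure described above and is an assumption, not a documented default):

```shell
# Create a minimal anyllm.json skeleton in the project root.
# The empty provider object is a placeholder to be filled in later.
cat > anyllm.json <<'EOF'
{
  "provider": {}
}
EOF
```

AnyLLM should then find the file on its next start, and you can add provider entries as needed.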

This documentation section will guide you through setting up different types of providers and models.