
Installation

liter-llm has prebuilt packages for every supported language. Pick your stack, run one command, and start calling models.

Every package includes prebuilt binaries for Linux (x86_64 / aarch64), macOS (Apple Silicon), and Windows. No Rust toolchain needed unless you're building from source.

CLI / Docker

The CLI runs the proxy server and MCP tool server. You don't need it if you're only using a language binding.

Homebrew:

brew tap kreuzberg-dev/tap
brew install liter-llm

Cargo:

cargo install liter-llm-cli

Docker:

docker pull ghcr.io/kreuzberg-dev/liter-llm:latest
docker run -p 4000:4000 -e LITER_LLM_MASTER_KEY=sk-your-key ghcr.io/kreuzberg-dev/liter-llm

Start the proxy:

liter-llm api --config liter-llm-proxy.toml

Or the MCP server:

liter-llm mcp --transport stdio

See the Proxy Server docs and MCP Server docs for configuration details.
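Once the proxy is running, any HTTP client can talk to it. A minimal sketch using only the Python standard library, assuming the proxy exposes an OpenAI-compatible /v1/chat/completions endpoint on port 4000 (the endpoint path and payload shape are assumptions based on common proxy conventions, not confirmed by these docs):

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str,
                       base_url: str = "http://localhost:4000") -> urllib.request.Request:
    # Build an OpenAI-style chat completion request for the proxy.
    # Path and payload shape are assumptions, not taken from these docs.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Master key matches LITER_LLM_MASTER_KEY set when starting the proxy
            "Authorization": "Bearer sk-your-key",
        },
    )

req = build_chat_request("openai/gpt-4o", "Say hello")
# To actually send it (requires the proxy to be running):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```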


Choose your language

Python

Requires Python 3.10+

pip install liter-llm

Or with uv:

uv add liter-llm

Node.js

Requires Node.js 18+

pnpm add @kreuzberg/liter-llm

Or with npm / yarn:

npm install @kreuzberg/liter-llm
# or
yarn add @kreuzberg/liter-llm

Rust

Requires Rust 1.75+ (stable)

cargo add liter-llm

Go

Requires Go 1.23+

go get github.com/kreuzberg-dev/liter-llm/packages/go

Java

Requires Java 17+ (Panama FFM)

Maven:

<dependency>
    <groupId>dev.kreuzberg</groupId>
    <artifactId>liter-llm</artifactId>
    <version>1.4.0-rc.17</version>
</dependency>

Gradle:

implementation("dev.kreuzberg:liter-llm:1.4.0-rc.17")

Ruby

Requires Ruby 3.2+

gem install liter_llm

Or add to your Gemfile:

gem "liter_llm"

PHP

Requires PHP 8.2+

composer require kreuzberg/liter-llm

.NET

Requires .NET 8+

dotnet add package LiterLlm

Elixir

Requires Elixir 1.14+ / OTP 25+

Add to mix.exs:

defp deps do
  [
    {:liter_llm, "~> 1.4.0-rc.17"}
  ]
end

Then run:

mix deps.get

WASM

pnpm add @kreuzberg/liter-llm-wasm

Build from source (requires Rust toolchain):

git clone https://github.com/kreuzberg-dev/liter-llm.git
cd liter-llm
cargo build --release -p liter-llm-ffi

The shared library and C header are output to target/release/.
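The shared library's file name differs by platform. A small sketch of the expected names, assuming Cargo's standard cdylib naming and a base name of liter_llm_ffi inferred from the crate name (the actual name may differ):

```python
import sys

def ffi_library_name(base: str = "liter_llm_ffi") -> str:
    # Cargo names cdylib outputs lib<base>.so on Linux,
    # lib<base>.dylib on macOS, and <base>.dll on Windows.
    if sys.platform.startswith("win"):
        return f"{base}.dll"
    if sys.platform == "darwin":
        return f"lib{base}.dylib"
    return f"lib{base}.so"
```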


API Key Setup

Set the environment variable for the provider you're calling:

export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GOOGLE_API_KEY="..."
export GROQ_API_KEY="gsk_..."
export MISTRAL_API_KEY="..."
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."

You only need one key

If you only call OpenAI models, only OPENAI_API_KEY is needed. liter-llm resolves the provider from the model prefix (e.g. openai/gpt-4o) and picks the matching key automatically.
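The resolution rule can be pictured with a small standalone helper. This mirrors the documented prefix behavior for illustration only; it is not the library's actual code, and the model names below are just examples:

```python
def provider_from_model(model: str) -> str:
    # liter-llm picks the API key based on the provider prefix,
    # i.e. the part of the model name before the first "/".
    provider, sep, _ = model.partition("/")
    return provider if sep else ""

provider_from_model("openai/gpt-4o")        # → "openai"
provider_from_model("anthropic/claude-x")   # → "anthropic"
```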

You can also pass the key at client construction:

Python:

from liter_llm import LlmClient

client = LlmClient(api_key="sk-...")

TypeScript:

import { LlmClient } from "@kreuzberg/liter-llm";

const client = new LlmClient({ apiKey: "sk-..." });

Rust:

use liter_llm::{ClientConfigBuilder, DefaultClient};

let config = ClientConfigBuilder::new("sk-...").build();
let client = DefaultClient::new(config, None)?;

Don't hard-code keys in source files

Use environment variables or a secret manager. Keys passed to LlmClient are wrapped in secrecy::SecretString and never logged.


Verify it works

Python:

python -c "from liter_llm import LlmClient; print('ok')"

Node.js:

node -e "import('@kreuzberg/liter-llm').then(m => { new m.LlmClient({ apiKey: 'test' }); console.log('ok') })"

Rust:

cargo build

Go:

go build ./...

Building from source

If prebuilt binaries aren't available for your platform, build from source. You'll need the Rust toolchain (stable 1.75+):

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
git clone https://github.com/kreuzberg-dev/liter-llm.git
cd liter-llm
task build

Next steps