liter-llm

Universal LLM API client -- one Rust core, 11 native language bindings, 142 providers.

liter-llm gives you a single, unified interface to 142 LLM providers -- OpenAI, Anthropic, Google, AWS Bedrock, Groq, Mistral, and many more -- with native bindings for Python, TypeScript, Go, Java, Ruby, PHP, C#, Elixir, WebAssembly, and C/FFI.

Built in Rust for performance, safety, and reliability.

[Quick Start](getting-started/quickstart.md){ .md-button .md-button--primary } [Installation](getting-started/installation.md){ .md-button } [GitHub](https://github.com/kreuzberg-dev/liter-llm){ .md-button }
  • Getting Started


    Install liter-llm in your language of choice and make your first API call in minutes.

    Installation

  • 142 Providers


    Access OpenAI, Anthropic, Google, AWS Bedrock, Groq, Mistral, and 130+ more through one interface.

    Providers

  • Architecture


    Understand the Rust core, Tower middleware stack, and how language bindings work.

    Architecture

  • API Reference


    Complete API documentation for all 11 supported languages.

    Python | TypeScript | Go

Why liter-llm?

A universal LLM API client with a core written entirely in Rust: no interpreter, no sprawling transitive dependency tree, and a far smaller supply-chain attack surface. The kind of supply-chain compromise that hit litellm has much less room to take hold here.

API keys are wrapped in secrecy::SecretString, observability is built in via OpenTelemetry, and middleware is composable via Tower. We give credit to litellm for proving the category -- see ATTRIBUTIONS.md.
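
The value `secrecy::SecretString` adds is that a key can never leak through `Debug` or log output, and every read of it is an explicit, grep-able call. Here is a minimal std-only sketch of that pattern (the `Redacted` type is illustrative, not the crate's actual implementation):

```rust
use std::fmt;

// Illustrative stand-in for the pattern secrecy::SecretString provides:
// Debug output is always redacted, and the raw value is only reachable
// through an explicit method call.
struct Redacted(String);

impl fmt::Debug for Redacted {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str("Redacted([REDACTED])")
    }
}

impl Redacted {
    fn expose_secret(&self) -> &str {
        &self.0
    }
}

fn main() {
    let key = Redacted("sk-not-a-real-key".into());
    // Accidentally logging the wrapper leaks nothing:
    assert_eq!(format!("{:?}", key), "Redacted([REDACTED])");
    // Actually using the key is always an explicit act:
    let header = format!("Bearer {}", key.expose_secret());
    assert!(header.starts_with("Bearer sk-"));
}
```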

Key Features

  • Polyglot -- Native bindings for 11 languages from a single Rust core
  • 142 Providers -- OpenAI, Anthropic, Google, Bedrock, Groq, Mistral, and more
  • Streaming -- First-class SSE and AWS EventStream support
  • Observability -- Built-in OpenTelemetry with GenAI semantic conventions
  • Type Safe -- Compile-time checked types across all bindings
  • Secure -- API keys wrapped in secrecy::SecretString, never logged or exposed
  • Middleware -- Composable Tower stack: rate limiting, caching, cost tracking, health checks, fallback
  • Tool Calling -- Parallel tools, structured outputs, JSON schema validation
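
The streaming bullet refers to Server-Sent Events, the wire format most providers use for chat deltas. As a rough illustration of what the client handles for you, here is a std-only sketch of extracting `data:` payloads from an SSE chunk (the JSON payload and the OpenAI-style `[DONE]` terminator are illustrative, not a real provider response):

```rust
// Minimal SSE parsing sketch: events are separated by a blank line,
// and each "data: " field carries one payload.
fn sse_data_lines(chunk: &str) -> Vec<String> {
    chunk
        .split("\n\n") // blank line ends an event
        .flat_map(|event| event.lines())
        .filter_map(|line| line.strip_prefix("data: "))
        .filter(|payload| *payload != "[DONE]") // OpenAI-style stream terminator
        .map(str::to_owned)
        .collect()
}

fn main() {
    let chunk = "data: {\"delta\":\"Hel\"}\n\ndata: {\"delta\":\"lo\"}\n\ndata: [DONE]\n\n";
    assert_eq!(
        sse_data_lines(chunk),
        vec!["{\"delta\":\"Hel\"}", "{\"delta\":\"lo\"}"]
    );
}
```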

Quick Example

Python

```python
import asyncio
import os

from liter_llm import LlmClient

async def main() -> None:
    client = LlmClient(api_key=os.environ["OPENAI_API_KEY"])
    response = await client.chat(
        model="openai/gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```
TypeScript

```typescript
import { LlmClient } from "@kreuzberg/liter-llm";

const client = new LlmClient({ apiKey: process.env.OPENAI_API_KEY! });
const response = await client.chat({
  model: "openai/gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(response.choices[0].message.content);
```
Rust

```rust
use liter_llm::{
    ChatCompletionRequest, ClientConfigBuilder, DefaultClient, LlmClient,
    Message, UserContent, UserMessage,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = ClientConfigBuilder::new(std::env::var("OPENAI_API_KEY")?)
        .build();
    let client = DefaultClient::new(config, Some("openai/gpt-4o"))?;

    let request = ChatCompletionRequest {
        model: "openai/gpt-4o".into(),
        messages: vec![Message::User(UserMessage {
            content: UserContent::Text("Hello!".into()),
            name: None,
        })],
        ..Default::default()
    };

    let response = client.chat(request).await?;
    if let Some(choice) = response.choices.first() {
        println!("{}", choice.message.content.as_deref().unwrap_or(""));
    }
    Ok(())
}
```
Go

```go
package main

import (
	"context"
	"fmt"
	"os"

	llm "github.com/kreuzberg-dev/liter-llm/packages/go"
)

func main() {
	client := llm.NewClient(llm.WithAPIKey(os.Getenv("OPENAI_API_KEY")))
	resp, err := client.Chat(context.Background(), &llm.ChatCompletionRequest{
		Model: "openai/gpt-4o",
		Messages: []llm.Message{
			llm.NewTextMessage(llm.RoleUser, "Hello!"),
		},
	})
	if err != nil {
		panic(err)
	}
	if len(resp.Choices) > 0 && resp.Choices[0].Message.Content != nil {
		fmt.Println(*resp.Choices[0].Message.Content)
	}
}
```
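
Every example above addresses a model as `"provider/model"`. As a rough sketch of that routing convention, splitting on the first `/` yields the provider to dispatch to (the bare-name fallback shown here is an assumption for illustration; check the provider docs for the actual resolution rule):

```rust
// Split a "provider/model" string into its routing parts.
fn route(model: &str) -> (&str, &str) {
    match model.split_once('/') {
        Some((provider, name)) => (provider, name),
        // Assumed fallback for un-prefixed names; illustrative only.
        None => ("openai", model),
    }
}

fn main() {
    assert_eq!(route("openai/gpt-4o"), ("openai", "gpt-4o"));
    assert_eq!(route("anthropic/claude-3-5-sonnet"), ("anthropic", "claude-3-5-sonnet"));
}
```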

Part of kreuzberg.dev

liter-llm is built by the kreuzberg.dev team -- the same people behind Kreuzberg (document extraction for 91+ formats), tree-sitter-language-pack, and html-to-markdown. All our libraries share the same Rust-core, polyglot-bindings architecture.

Community