Documentation Index

Fetch the complete documentation index at: https://docs.llm7.io/llms.txt

Use this file to discover all available pages before exploring further.

Start here

Quickstart

Install the SDK, configure the client, and make your first requests.
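As a rough sketch of what a first request might look like: the snippet below builds an OpenAI-style chat-completion request with only the standard library. The base URL `https://api.llm7.io/v1` and the model id are assumptions, not taken from this index — check the Quickstart page for the actual endpoint and supported models.

```python
import json
import urllib.request

BASE_URL = "https://api.llm7.io/v1"  # assumed endpoint; verify in the Quickstart

payload = {
    "model": "gpt-4o-mini",  # placeholder model id, see Available models
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

# Build the request without sending it; add an Authorization header
# here if the service requires an API key.
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Sending it is one call:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```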

Text generation

Streaming

Stream tokens for low-latency UIs.
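Streaming responses in OpenAI-compatible APIs typically arrive as server-sent events, one `data: {...}` line per token delta. The parser below is a minimal sketch of that convention; the exact wire format used here is an assumption to confirm on the Streaming page.

```python
import json

def parse_sse_chunk(line: str):
    """Parse one server-sent-events line from a streaming response.

    Returns the decoded JSON chunk, or None for blank keep-alive lines
    and the terminal "[DONE]" sentinel used by OpenAI-style streams.
    """
    line = line.strip()
    if not line.startswith("data:"):
        return None
    data = line[len("data:"):].strip()
    if data == "[DONE]":
        return None
    return json.loads(data)

# A chunk in the shape OpenAI-style APIs stream token deltas:
chunk = parse_sse_chunk('data: {"choices":[{"delta":{"content":"Hel"}}]}')
```

In a real client you would read the response line by line, feed each line through a parser like this, and append each delta's content to the UI as it arrives.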

Function calling

Let the model call your functions with structured arguments.
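The sketch below shows the OpenAI-style `tools` request shape: you describe a function with a JSON Schema, and when the model chooses to call it, the response carries the arguments as a JSON string to decode before dispatching. The `get_weather` function is hypothetical, and the exact request shape is an assumption to verify on the Function calling page.

```python
import json

# Hypothetical tool definition; the name and parameters are illustrative.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

payload = {
    "model": "gpt-4o-mini",  # placeholder model id
    "messages": [{"role": "user", "content": "Weather in Oslo?"}],
    "tools": tools,
}

# The model returns tool-call arguments as a JSON string;
# decode them before calling your own function:
args = json.loads('{"city": "Oslo"}')
```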

JSON mode

Force valid JSON outputs for structured responses.
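In OpenAI-style APIs, JSON mode is usually switched on with a `response_format` field, after which the returned message content parses directly with `json.loads`. The flag shown below is an assumption based on that convention — confirm the exact field on the JSON mode page.

```python
import json

payload = {
    "model": "gpt-4o-mini",  # placeholder model id
    "messages": [
        # JSON mode typically also requires the prompt to mention JSON:
        {"role": "system", "content": "Reply as a JSON object."},
        {"role": "user", "content": "Name three primary colors."},
    ],
    # Assumed OpenAI-style JSON mode switch:
    "response_format": {"type": "json_object"},
}

# With JSON mode on, the message content is guaranteed-parseable JSON,
# e.g. a response body like this decodes without a try/except:
parsed = json.loads('{"colors": ["red", "yellow", "blue"]}')
```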

Available models

See model options and recommended selectors.
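OpenAI-compatible services usually expose the model list programmatically at a `/models` endpoint; the sketch below assumes that convention and the same base URL as above, both of which should be checked against the Available models page.

```python
import urllib.request

BASE_URL = "https://api.llm7.io/v1"  # assumed endpoint

# Build (but do not send) the listing request:
req = urllib.request.Request(f"{BASE_URL}/models")

# To fetch the ids:
# import json
# with urllib.request.urlopen(req) as resp:
#     ids = [m["id"] for m in json.load(resp)["data"]]
```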