# Quick Start

Get started with FreeInference in 5 minutes.
## Step 1: Get Your API Key

1. Register for a free account.
2. Log in and create your API key from the dashboard.
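Once you have a key, you can sanity-check it before wiring up an IDE. This is a minimal sketch that assumes FreeInference exposes the standard OpenAI-compatible `GET /v1/models` endpoint (not confirmed by this guide); `your-api-key-here` is a placeholder.

```python
import json
import urllib.request

API_KEY = "your-api-key-here"  # placeholder; substitute your real key

# Build an authenticated request against the (assumed) models endpoint.
req = urllib.request.Request(
    "https://freeinference.org/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

# Requires network access; a 200 response listing models means the key works:
# with urllib.request.urlopen(req) as resp:
#     print(json.dumps(json.load(resp), indent=2))
```

A valid key should return the same model list you will choose from in Step 3.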
## Step 2: Choose Your IDE

### Cursor

1. Open Settings (`Cmd + ,` or `Ctrl + ,`).
2. Go to **API Keys** → enter your API key.
3. Click **Override OpenAI Base URL** → enter: `https://freeinference.org/v1`
4. Enable the toggle and start coding.
### Codex

1. Create `~/.codex/config.toml`:

   ```toml
   model = "glm-4.6"
   model_provider = "free_inference"

   [model_providers.free_inference]
   name = "FreeInference"
   base_url = "https://freeinference.org/v1"
   wire_api = "chat"
   env_http_headers = { "X-Session-ID" = "CODEX_SESSION_ID", "Authorization" = "FREEINFERENCE_API_KEY" }
   ```

2. Add to `~/.zshrc`:

   ```shell
   export CODEX_SESSION_ID="$(date +%Y%m%d-%H%M%S)-$(uuidgen)"
   export FREEINFERENCE_API_KEY="Bearer your-api-key-here"
   ```

3. Reload your shell:

   ```shell
   source ~/.zshrc
   ```
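The `CODEX_SESSION_ID` export above joins a timestamp with a UUID. For reference, the same ID shape can be sketched in Python (`make_session_id` is a hypothetical helper name, not part of any tool here):

```python
import uuid
from datetime import datetime

def make_session_id() -> str:
    # Mirrors the shell export: `date +%Y%m%d-%H%M%S` plus `uuidgen`.
    # Note: uuidgen may print uppercase on some systems; uuid4() is lowercase.
    return f"{datetime.now():%Y%m%d-%H%M%S}-{uuid.uuid4()}"

print(make_session_id())  # e.g. 20250101-093000-<random UUID>
```

Because the value is generated when `~/.zshrc` is sourced, each new shell gets a fresh session ID.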
### Roo Code / Kilo Code

1. Install the extension in your IDE.
2. Open Settings → **OpenAI Compatible**.
3. Base URL: `https://freeinference.org/v1`
4. API Key: `your-api-key-here`
## Step 3: Choose a Model

See the available models and select one that fits your needs.
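After picking a model, you can also call it directly, outside any IDE. This sketch assumes the endpoint follows the OpenAI-compatible chat-completions wire format (consistent with `wire_api = "chat"` in the Codex config) and uses the `glm-4.6` model name from that config as an example; swap in your chosen model and real key.

```python
import json
import urllib.request

API_KEY = "your-api-key-here"  # placeholder

# Standard OpenAI-style chat-completions payload.
payload = {
    "model": "glm-4.6",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}

req = urllib.request.Request(
    "https://freeinference.org/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Requires network access and a valid key:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Any OpenAI-compatible SDK pointed at the same base URL should work equivalently.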
## Next Steps

- Integration Guides - detailed setup and troubleshooting
- Available Models - model specifications and features
## Need Help?

Having issues? Check the integration guide for troubleshooting.