Using Open WebUI
Open WebUI is an extensible, self-hosted web interface for interacting with LLMs. It provides a ChatGPT-like experience that you can run on your own infrastructure and connect to the AI models of your choice.
Prerequisites
- Docker installed on your system
- Your Azerion Intelligence API key from https://app.azerion.ai/account#api-tokens
Secure Your Credentials
Store your API key in an environment variable. Never include API keys directly in configuration files.
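A minimal sketch of that on Linux or macOS, assuming you name the variable AZERION_API_KEY (the same name the Docker Compose example below uses); the value shown is a placeholder:

# Export the key for the current shell session only (placeholder value)
export AZERION_API_KEY=your_azerion_api_key_here

# If you keep the key in a .env file instead, restrict who can read it
chmod 600 .env

# Commands further down can then reference "${AZERION_API_KEY}" instead of the raw key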
Integration Steps
Quick Docker Setup
- Run Open WebUI with Azerion Intelligence:

  docker run -d \
    --name open-webui \
    -p 3000:8080 \
    -e OPENAI_API_BASE_URL=https://api.azerion.ai/v1 \
    -e OPENAI_API_KEY=your_azerion_api_key \
    -v open-webui:/app/backend/data \
    ghcr.io/open-webui/open-webui:main

- Access the Interface:
  - Open http://localhost:3000 in your browser
  - Create your admin account on first visit
  - Start using Azerion Intelligence models
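If the interface does not load right away, check that the container started cleanly before digging further. These are standard Docker and curl commands, nothing Azerion-specific:

# Confirm the container is running and review its startup output
docker ps --filter name=open-webui
docker logs --tail 20 open-webui

# Check that something is answering on the mapped port
curl -I http://localhost:3000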
Docker Compose Setup
Create a docker-compose.yml file:
version: '3.8'
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"
    environment:
      - OPENAI_API_BASE_URL=https://api.azerion.ai/v1
      - OPENAI_API_KEY=${AZERION_API_KEY}
    volumes:
      - ./data:/app/backend/data
    restart: unless-stopped
Create a .env file:
AZERION_API_KEY=your_azerion_api_key_here
Run with: docker-compose up -d
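A few standard Compose commands are useful once the stack is up (on newer Docker installations the plugin form docker compose behaves the same as docker-compose):

# Check that the service is running
docker compose ps

# Follow the application logs
docker compose logs -f open-webui

# Stop and remove the container when you are done
docker compose down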
Basic Example
Simple Docker run command for testing:
docker run -d \
--name azerion-webui \
-p 3000:8080 \
-e OPENAI_API_BASE_URL=https://api.azerion.ai/v1 \
-e OPENAI_API_KEY=sk-your-api-key-here \
ghcr.io/open-webui/open-webui:main
After running, visit http://localhost:3000 to start chatting with the meta.llama3-3-70b-instruct-v1:0 model.
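If you want to confirm the key and model outside Open WebUI first, and assuming the Azerion endpoint follows the standard OpenAI chat-completions schema (which the OPENAI_API_BASE_URL setting implies but this guide does not document), a request like the sketch below should return a completion:

# Hedged sketch: assumes an OpenAI-compatible /chat/completions endpoint
curl https://api.azerion.ai/v1/chat/completions \
  -H "Authorization: Bearer ${AZERION_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta.llama3-3-70b-instruct-v1:0",
        "messages": [{"role": "user", "content": "Hello"}]
      }'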
Troubleshooting
API Connection Issues
Error: Cannot connect to Azerion Intelligence API
- Verify your API key is correct and active
- Check that the API base URL is https://api.azerion.ai/v1
- Ensure environment variables are properly set
Error: Invalid API key
- Confirm API key hasn't been revoked in your Azerion account
- Check for extra spaces or characters in the API key
- Verify the key has the correct permissions
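To rule out Open WebUI itself, test the key directly against the API. Assuming the service exposes the standard OpenAI-compatible /models listing (implied by the OPENAI_API_BASE_URL setting, not confirmed here), a 200 response with a model list means the key is accepted:

# Hedged check: assumes an OpenAI-compatible /models endpoint
curl -i https://api.azerion.ai/v1/models \
  -H "Authorization: Bearer ${AZERION_API_KEY}"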
Model Access Issues
Error: Model not found
- Ensure you're using supported model names like meta.llama3-3-70b-instruct-v1:0
- Check your account has access to the requested models
- Verify your API key has sufficient credits
Container Issues
Container won't start:
# Check container logs
docker logs open-webui
# Verify environment variables
docker inspect open-webui
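docker inspect prints a large JSON document; to see only the environment variables the container was started with, filter the output with Docker's built-in Go-template syntax (this prints the key to your terminal, so only do it on a trusted machine):

# Print only the container's environment variables
docker inspect open-webui --format '{{range .Config.Env}}{{println .}}{{end}}' | grep OPENAI_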
Port conflicts:
- Change the port mapping: use -p 8080:8080 instead of -p 3000:8080
- Check if port 3000 is already in use: lsof -i :3000
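Switching to a different host port requires recreating the container with the new mapping; a short sketch, reusing the container name from the quick setup above:

# Remove the old container and start it again on host port 8080
docker rm -f open-webui
docker run -d \
  --name open-webui \
  -p 8080:8080 \
  -e OPENAI_API_BASE_URL=https://api.azerion.ai/v1 \
  -e OPENAI_API_KEY="${AZERION_API_KEY}" \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main

# The interface is then available at http://localhost:8080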