Using Dify
Dify is an open-source platform for creating AI-powered applications with a visual interface. This guide shows how to integrate Azerion Intelligence with Dify.
What is Dify?
Dify provides a no-code/low-code environment for building AI applications with visual workflow builders and multi-modal support.
Prerequisites
- A Dify account and installation (cloud or self-hosted)
- Your Azerion Intelligence API key from https://app.azerion.ai/account#api-tokens
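The code sketches later in this guide read the key from an environment variable so it never ends up in config files or screenshots. A minimal sketch (the variable name AZERION_API_KEY is our own convention, not something Dify or Azerion requires):

```python
# Sketch: read the API key from an environment variable instead of
# hard-coding it. AZERION_API_KEY is an assumed name, not a requirement.
import os

api_key = os.environ.get("AZERION_API_KEY")
if not api_key:
    raise RuntimeError("Set AZERION_API_KEY to your Azerion Intelligence API key")
```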
Integration Steps
1. Add Azerion Intelligence as a Model Provider
- Access Model Settings:
  - Log into your Dify workspace
  - Navigate to Settings → Model Provider
  - Click Add Model Provider
- Configure OpenAI-Compatible Provider:
  - Select OpenAI-API-compatible as the provider type
  - Set the following configuration:
    Provider Name: Azerion Intelligence
    Base URL: https://api.azerion.ai/v1
    API Key: YOUR_AZERION_API_KEY
- Test Connection:
  - Click Test to verify the connection
  - If successful, click Save to add the provider
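If the connection test fails, it often helps to call the endpoint directly, outside Dify, to separate credential problems from Dify configuration problems. A minimal sketch using the openai Python package, assuming the Azerion endpoint follows the standard OpenAI chat-completions contract and that the key is in the AZERION_API_KEY environment variable:

```python
# Sketch: call the OpenAI-compatible endpoint directly to confirm the
# base URL and API key work before wiring them into Dify.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.azerion.ai/v1",
    api_key=os.environ["AZERION_API_KEY"],  # assumed env var, see Prerequisites
)

response = client.chat.completions.create(
    model="meta.llama3-3-70b-instruct-v1:0",
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
    max_tokens=16,
)
print(response.choices[0].message.content)
```

If this call succeeds but the Dify test still fails, the issue is in the provider configuration inside Dify rather than in the key or the endpoint.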
2. Configure Available Models
- Add Model Configuration:
  - In the Model Provider settings, click Add Model
  - Configure the model:
    Model Name: meta.llama3-3-70b-instruct-v1:0
    Model Type: Text Generation
    Context Length: 32768
    Max Tokens: 4096
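Model IDs must match exactly, so it can be worth confirming them against the endpoint itself before typing them into Dify. A short sketch that lists available models, assuming Azerion exposes the standard OpenAI-compatible /v1/models route:

```python
# Sketch: list the model IDs the endpoint actually exposes, so the
# "Model Name" field in Dify matches exactly.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.azerion.ai/v1",
    api_key=os.environ["AZERION_API_KEY"],
)

for model in client.models.list():
    print(model.id)
```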
3. Create an Application
- Create New App:
  - Go to Studio and click Create App
  - Choose your application type (Chatbot, Agent, or Workflow)
- Select Model:
  - Choose Azerion Intelligence as your provider
  - Select meta.llama3-3-70b-instruct-v1:0 as the model
  - Configure temperature and max tokens as needed
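Once the app is created and published, Dify exposes it over its own app API, which is handy for smoke-testing the whole chain (Dify → Azerion Intelligence). A sketch using Dify's chat-messages endpoint; the base URL and the DIFY_APP_API_KEY variable are placeholders, and the exact endpoint and payload for your Dify version are shown on the app's API Access page:

```python
# Sketch: call the finished Dify chat app over Dify's app API.
# Check your app's "API Access" page for the exact base URL and payload.
import os
import requests

DIFY_API_BASE = "https://api.dify.ai/v1"        # or your self-hosted Dify URL
DIFY_APP_KEY = os.environ["DIFY_APP_API_KEY"]   # app-level key from Dify, not the Azerion key

resp = requests.post(
    f"{DIFY_API_BASE}/chat-messages",
    headers={"Authorization": f"Bearer {DIFY_APP_KEY}"},
    json={
        "inputs": {},
        "query": "Hello! Which model are you running on?",
        "response_mode": "blocking",
        "user": "integration-test",
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["answer"])
```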
Basic Example
Simple chatbot configuration:
{
  "model": {
    "provider": "azerion-intelligence",
    "name": "meta.llama3-3-70b-instruct-v1:0",
    "mode": "chat",
    "completion_params": {
      "temperature": 0.7,
      "max_tokens": 1024
    }
  }
}
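The completion_params above map one-to-one onto an OpenAI-compatible request, which makes it easy to compare the app's behavior with the raw model. A sketch of the equivalent direct call (same assumptions as earlier: openai package, AZERION_API_KEY environment variable):

```python
# Sketch: the same settings expressed as a direct OpenAI-compatible request,
# useful for comparing Dify's output with the raw model output.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.azerion.ai/v1",
    api_key=os.environ["AZERION_API_KEY"],
)

response = client.chat.completions.create(
    model="meta.llama3-3-70b-instruct-v1:0",
    messages=[{"role": "user", "content": "Summarize what Dify is in one sentence."}],
    temperature=0.7,   # matches completion_params.temperature
    max_tokens=1024,   # matches completion_params.max_tokens
)
print(response.choices[0].message.content)
```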
Troubleshooting
Connection Issues
Problem: "Failed to connect to model provider"
Solutions:
- Verify your API key is correct and active
- Check that the base URL is set to https://api.azerion.ai/v1
- Ensure your network allows outbound HTTPS requests
Model Not Available
Problem: "Model not found" error
Solutions:
- Verify the model name is correct: meta.llama3-3-70b-instruct-v1:0
- Check if the model is available in your Azerion Intelligence plan
Rate Limiting
Problem: "Rate limit exceeded" errors
Solutions:
- Implement request queuing in your Dify workflows
- Consider upgrading your Azerion Intelligence plan
- Add retry logic with exponential backoff
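For the last point, a minimal sketch of exponential backoff around a direct call to the provider; the same pattern applies to any custom code that talks to the API on behalf of your Dify workflows:

```python
# Sketch: simple exponential backoff around a chat completion call.
# Retries on rate-limit errors only; tune attempts and delays for your workload.
import os
import time
from openai import OpenAI, RateLimitError

client = OpenAI(
    base_url="https://api.azerion.ai/v1",
    api_key=os.environ["AZERION_API_KEY"],
)

def chat_with_backoff(messages, max_retries=5):
    delay = 1.0
    for attempt in range(max_retries):
        try:
            return client.chat.completions.create(
                model="meta.llama3-3-70b-instruct-v1:0",
                messages=messages,
                max_tokens=1024,
            )
        except RateLimitError:
            if attempt == max_retries - 1:
                raise
            time.sleep(delay)
            delay *= 2  # 1s, 2s, 4s, ...

reply = chat_with_backoff([{"role": "user", "content": "Hello"}])
print(reply.choices[0].message.content)
```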