Using LangChain

What is LangChain?

LangChain is a framework for developing applications powered by large language models. It provides tools for chaining operations, managing memory, and building AI agents.

Prerequisites

Install LangChain:

pip install langchain langchain-openai python-dotenv
Secure Your Credentials

Store your API key as an environment variable. Never hardcode API keys in your source code.

Integration Steps

1. Environment Setup

Create a .env file:

AZERION_API_KEY=your_azerion_api_key_here

2. Configure Azerion Intelligence

import os
from langchain_openai import ChatOpenAI
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Configure Azerion Intelligence as OpenAI-compatible
llm = ChatOpenAI(
    openai_api_key=os.getenv("AZERION_API_KEY"),
    openai_api_base="https://api.azerion.ai/v1",
    model_name="meta.llama3-3-70b-instruct-v1:0",
    temperature=0.7,
)
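
A quick smoke test (optional) can confirm the client is wired up before building anything larger. This sketch assumes the AZERION_API_KEY from step 1 is set and that the model name above is available to your key:

# Fail fast if the key did not load, then make a one-line test call
assert os.getenv("AZERION_API_KEY"), "AZERION_API_KEY is not set; check your .env file"
print(llm.invoke("Reply with a one-sentence greeting.").content)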

Basic Example

from langchain_core.messages import HumanMessage, SystemMessage

# Create messages
messages = [
    SystemMessage(content="You are a helpful AI assistant."),
    HumanMessage(content="What are the benefits of renewable energy?"),
]

# Get response
response = llm.invoke(messages)
print(response.content)
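
Since chaining is LangChain's core feature, here is a minimal prompt-template chain built on the same llm object as above; the prompt wording and the "renewable energy" input are only illustrative:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Compose a prompt -> model -> string-output chain with the LCEL pipe syntax
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI assistant."),
    ("human", "List three benefits of {topic}."),
])
chain = prompt | llm | StrOutputParser()

# Run the chain with a concrete input
print(chain.invoke({"topic": "renewable energy"}))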

Troubleshooting

Common Issues

Authentication Error:

Error: Incorrect API key provided
Solution: Verify your API key is correct and properly loaded from environment variables
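
One way to confirm the key is actually being picked up from the environment, without ever printing the secret itself:

import os
from dotenv import load_dotenv

load_dotenv()
key = os.getenv("AZERION_API_KEY")
# Report only presence and length, never the key value
print("AZERION_API_KEY loaded:", key is not None, "| length:", len(key) if key else 0)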

Connection Error:

Error: Connection timeout or refused
Solution: Verify the API base URL is set to "https://api.azerion.ai/v1"

Model Not Found:

Error: Model not found
Solution: Ensure you're using a valid model name like "meta.llama3-3-70b-instruct-v1:0"
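
If the endpoint exposes the standard OpenAI-compatible /v1/models route (an assumption, not confirmed here), you can list the model IDs your key can access with the openai client that langchain-openai installs:

import os
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()
client = OpenAI(
    api_key=os.getenv("AZERION_API_KEY"),
    base_url="https://api.azerion.ai/v1",
)
# Print every model ID the key is entitled to use
for model in client.models.list():
    print(model.id)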

Rate Limiting:

Error: Rate limit exceeded
Solution: Implement retry logic with exponential backoff or reduce request frequency
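
A minimal sketch of retry logic with exponential backoff and jitter, reusing the llm and messages objects from the examples above. The except clause is deliberately broad because the exact rate-limit exception type depends on the client version, so narrow it in production code; LangChain runnables also provide a with_retry() helper you may prefer.

import random
import time

def invoke_with_backoff(llm, messages, max_retries=5):
    """Call llm.invoke, retrying on failure with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return llm.invoke(messages)
        except Exception:
            # In real code, catch only the rate-limit exception raised by your client
            if attempt == max_retries - 1:
                raise
            time.sleep(2 ** attempt + random.random())  # waits ~1s, 2s, 4s, ...

response = invoke_with_backoff(llm, messages)
print(response.content)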