# 🚀 Deployment

Deploy Unified AI Router to various platforms.
## 🌈 Render.com

### 🖥️ Dashboard Method
1. **Push to GitHub first:**

   ```bash
   git push origin main
   ```

2. **Create Web Service on Render:**
   - Go to dashboard.render.com
   - Click "New +" → "Web Service"
   - Connect your GitHub repository

3. **Configure Build Settings:**
   - Build Command: `npm install`
   - Start Command: `npm start`
   - Node Version: 24.x or higher

4. **Add Environment Variables:**

   ```bash
   OPENROUTER_API_KEY=your_key_here
   OPENAI_API_KEY=your_key_here
   PORT=3000
   ```

5. **Deploy and Test:**

   ```bash
   curl https://your-app.onrender.com/health
   curl https://your-app.onrender.com/models
   ```
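The deploy-and-test step above can also be scripted as a readiness check that polls the health endpoint until the service comes up. A sketch — `wait_for_healthy` is our own helper, and the URL and retry count are placeholders:

```bash
# Poll a health endpoint until it returns HTTP 200, or give up after N attempts.
wait_for_healthy() {
  url="$1"
  attempts="${2:-10}"
  i=1
  while [ "$i" -le "$attempts" ]; do
    # -s silent, -m 5 timeout, -o discard body, -w print only the status code
    code=$(curl -s -m 5 -o /dev/null -w '%{http_code}' "$url")
    if [ "$code" = "200" ]; then
      echo "healthy after $i attempt(s)"
      return 0
    fi
    echo "attempt $i: got HTTP $code, retrying in 3s..."
    sleep 3
    i=$((i + 1))
  done
  echo "service did not become healthy"
  return 1
}

# Example: wait_for_healthy "https://your-app.onrender.com/health" 20
```

This is useful right after a deploy, since Render instances can take a little while to accept traffic.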
### ✅ Verify Deployment

```bash
# Health check
curl https://your-app.onrender.com/health

# List available models
curl https://your-app.onrender.com/v1/models

# Test chat completions
curl -X POST https://your-app.onrender.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"Hello!"}],"model":"any"}'
```

## 🚄 Railway
1. **Install Railway CLI:**

   ```bash
   npm install -g @railway/cli
   ```

2. **Login and Initialize:**

   ```bash
   railway login
   railway init
   ```

3. **Set Environment Variables:**

   ```bash
   railway variables set OPENROUTER_API_KEY=your_key
   railway variables set OPENAI_API_KEY=your_key
   ```

4. **Deploy:**

   ```bash
   railway up
   ```
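When you have more than a couple of variables, the variable-setting step can read them from a local `.env` file instead. `push_env_vars` below is a hypothetical helper (not part of the Railway CLI) that runs any command once per `KEY=VALUE` line:

```bash
# push_env_vars FILE CMD...: run CMD once for each KEY=VALUE line in FILE,
# skipping blank lines and comments.
push_env_vars() {
  file="$1"
  shift
  while IFS= read -r line; do
    case "$line" in
      ''|\#*) continue ;;
    esac
    "$@" "$line"
  done < "$file"
}

# Example (assumes you are logged in to Railway):
# push_env_vars .env railway variables set
```

Because the command is passed as arguments, the same helper works unchanged with other CLIs that accept `KEY=VALUE` pairs.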
## ⚙️ Environment Configuration

### 🌍 Production .env

```bash
# Required API Keys
OPENROUTER_API_KEY=sk-or-your-key
OPENAI_API_KEY=sk-your-openai-key

# Server Configuration
PORT=3000
NODE_ENV=production

# Optional: Circuit Breaker Settings
CIRCUIT_TIMEOUT=30000
CIRCUIT_ERROR_THRESHOLD=50
CIRCUIT_RESET_TIMEOUT=300000
```

### 🔒 Security Considerations
- Never commit .env files to git
- Use platform-specific environment variable management
- Rotate API keys regularly
- Monitor API usage and costs
- Implement rate limiting if needed
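Along the same lines, a start script can fail fast when a required secret is missing, rather than surfacing opaque provider errors later. A minimal sketch — `require_env` is our own helper, not part of the project:

```bash
# require_env NAME...: return non-zero if any named variable is unset or empty.
require_env() {
  missing=0
  for name in "$@"; do
    # Indirectly expand the variable named in $name
    eval "value=\${$name:-}"
    if [ -z "$value" ]; then
      echo "Missing required environment variable: $name" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Example, before starting the server:
# require_env OPENROUTER_API_KEY OPENAI_API_KEY || exit 1
```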
## 📊 Monitoring

### 🏥 Health Checks

```bash
# Basic health check
curl http://localhost:3000/health

# Provider status
curl http://localhost:3000/providers/status
```

### Logging
Logs include:
- Provider fallback events
- Circuit breaker state changes
- API response times
- Error details
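For a quick count of these events in a captured log file, something like the following works; the `fallback` and `circuit` patterns are guesses at the message text, so adjust them to your app's actual log format:

```bash
# summarize_logs FILE: count provider-fallback and circuit-breaker lines.
# The search patterns are placeholders for whatever your logger emits.
summarize_logs() {
  printf 'fallback events: %s\n' "$(grep -Eic 'fallback' "$1")"
  printf 'circuit breaker events: %s\n' "$(grep -Eic 'circuit' "$1")"
}

# Example: summarize_logs app.log
```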
View logs:

```bash
# Render
tail -f /var/log/app.log

# Heroku
heroku logs --tail

# Docker
docker logs -f container-name
```

## ⚡ Performance Tips
- Use multiple API keys for load balancing
- Configure appropriate timeouts based on provider speeds
- Monitor circuit breaker settings to avoid premature failures
- Use streaming for better user experience with long responses
- Cache responses where appropriate
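For the streaming tip, the request differs from the earlier chat-completions example only by `"stream": true` in the payload and curl's `-N` flag (disable output buffering). A sketch, assuming the router honors the OpenAI-style `stream` parameter; the URL is a placeholder:

```bash
# stream_chat BASE_URL: request a streamed chat completion.
# -N disables curl's output buffering so chunks print as they arrive.
stream_chat() {
  curl -N -X POST "$1/v1/chat/completions" \
    -H "Content-Type: application/json" \
    -d '{"messages":[{"role":"user","content":"Hello!"}],"model":"any","stream":true}'
}

# Example: stream_chat https://your-app.onrender.com
```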