# Biblical Guide - Production Deployment Plan
## Overview
This guide covers deploying the Biblical Guide (ghidul-biblic) application in production mode without a local nginx instance, assuming traffic is routed through a separate nginx proxy manager server.
## Current Application Status
- ✅ Next.js 15.5.3 application
- ✅ Database configured and working
- ✅ Running on port 3010 in development
- ✅ Multi-language support (English/Romanian)
- ✅ Docker configuration ready
## Production Deployment Options
### Option 1: Docker Compose Production (Recommended)
#### Prerequisites
1. Docker and Docker Compose installed
2. Environment variables configured
3. External nginx proxy manager configured to route to your server
#### Step 1: Environment Configuration
Create production environment file:
```bash
cp .env.example .env.production
```
Edit `.env.production` with production values:
```bash
# Database - Use strong password in production
DATABASE_URL=postgresql://bible_admin:STRONG_PASSWORD_HERE@postgres:5432/bible_chat
DB_PASSWORD=STRONG_PASSWORD_HERE
# Authentication - Generate secure secrets
NEXTAUTH_URL=https://yourdomain.com
NEXTAUTH_SECRET=generate-long-random-secret-here
JWT_SECRET=another-long-random-secret
# Azure OpenAI (if using AI features)
AZURE_OPENAI_KEY=your-azure-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
AZURE_OPENAI_DEPLOYMENT=gpt-4
# Ollama (optional - if using local AI)
OLLAMA_API_URL=http://your-ollama-server:11434
```
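To generate the random secrets above, one option (assuming `openssl` is available on the server) is:
```bash
# Produces a 32+ character base64 string; run once each for NEXTAUTH_SECRET and JWT_SECRET
openssl rand -base64 48
```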
#### Step 2: Create Production Docker Compose
Create `docker-compose.prod.simple.yml`:
```yaml
version: '3.8'

services:
  postgres:
    image: pgvector/pgvector:pg16
    restart: always
    environment:
      POSTGRES_DB: bible_chat
      POSTGRES_USER: bible_admin
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./scripts/init.sql:/docker-entrypoint-initdb.d/init.sql
    networks:
      - bible_network
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U bible_admin -d bible_chat"]
      interval: 30s
      timeout: 10s
      retries: 3

  app:
    build:
      context: .
      dockerfile: docker/Dockerfile.prod
    restart: always
    ports:
      - "3010:3000" # Expose on port 3010 for external proxy
    environment:
      DATABASE_URL: postgresql://bible_admin:${DB_PASSWORD}@postgres:5432/bible_chat
      AZURE_OPENAI_KEY: ${AZURE_OPENAI_KEY}
      AZURE_OPENAI_ENDPOINT: ${AZURE_OPENAI_ENDPOINT}
      AZURE_OPENAI_DEPLOYMENT: ${AZURE_OPENAI_DEPLOYMENT}
      OLLAMA_API_URL: ${OLLAMA_API_URL}
      JWT_SECRET: ${JWT_SECRET}
      NEXTAUTH_URL: ${NEXTAUTH_URL}
      NEXTAUTH_SECRET: ${NEXTAUTH_SECRET}
      NODE_ENV: production
    depends_on:
      postgres:
        condition: service_healthy
    networks:
      - bible_network
    healthcheck:
      test: ["CMD-SHELL", "curl -f http://localhost:3000/api/health || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 3

networks:
  bible_network:
    driver: bridge

volumes:
  postgres_data:
```
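Once the environment from Step 1 is loaded (see Step 3), you can sanity-check the file before deploying:
```bash
# Renders the final configuration with variables substituted; errors point to YAML or env problems
docker-compose -f docker-compose.prod.simple.yml config
```
Note that the postgres service deliberately publishes no host port, so the database stays reachable only on the internal `bible_network`.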
#### Step 3: Deploy to Production
```bash
# Stop development server first
pkill -f "next dev"
# Load production environment (skip comment lines)
export $(grep -v '^#' .env.production | xargs)
# Build and start production services
docker-compose -f docker-compose.prod.simple.yml up -d --build
# Check status
docker-compose -f docker-compose.prod.simple.yml ps
docker-compose -f docker-compose.prod.simple.yml logs app
```
#### Step 4: Configure External Nginx Proxy Manager
Point your nginx proxy manager to:
- **Target**: `http://your-server-ip:3010`
- **Health Check**: `http://your-server-ip:3010/api/health`
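Once the stack is up, you can confirm from the proxy server that the target responds before creating the proxy host (replace `your-server-ip` with the actual address):
```bash
# Expect an HTTP 200 response if the app container is healthy and port 3010 is reachable
curl -i http://your-server-ip:3010/api/health
```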
### Option 2: Direct Node.js Production (Alternative)
#### Step 1: Build the Application
```bash
# Install dependencies (the build step also needs dev dependencies)
npm ci
# Generate Prisma client
npx prisma generate
# Build the application
npm run build
```
#### Step 2: Start Production Server
```bash
# Set production environment
export NODE_ENV=production
export PORT=3010
export HOSTNAME=0.0.0.0
# Load environment variables (skip comment lines)
export $(grep -v '^#' .env.production | xargs)
# Start the production server
npm start
```
#### Step 3: Process Management (Optional)
Use PM2 for process management:
```bash
# Install PM2
npm install -g pm2
# Create ecosystem file
cat > ecosystem.config.js << 'EOF'
module.exports = {
  apps: [{
    name: 'ghidul-biblic',
    script: 'npm',
    args: 'start',
    cwd: '/root/ghidul-biblic',
    env: {
      NODE_ENV: 'production',
      PORT: 3010,
      HOSTNAME: '0.0.0.0'
    },
    env_file: '.env.production'
  }]
}
EOF
# Start with PM2
pm2 start ecosystem.config.js
pm2 save
pm2 startup
```
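Support for `env_file` varies between PM2 versions; if the variables are not picked up, a simple workaround (assuming plain KEY=VALUE lines in `.env.production`) is to export them into the shell before starting:
```bash
# Export every variable from the env file, then start PM2 so the process inherits them
set -a
source .env.production
set +a
pm2 start ecosystem.config.js
```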
## Production Checklist
### Security
- [ ] Strong database passwords set
- [ ] JWT secrets generated (min 32 characters)
- [ ] NEXTAUTH_SECRET generated
- [ ] Environment files secured (not in git)
- [ ] Database not exposed to public internet (see the quick check below)
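A quick spot check for two of the items above (this assumes the Docker Compose setup from Option 1 and a loaded `.env.production`):
```bash
# Secrets should be at least 32 characters
echo -n "$NEXTAUTH_SECRET" | wc -c
echo -n "$JWT_SECRET" | wc -c

# Postgres should not show up here: the compose file publishes no host port for it
ss -tlnp | grep 5432 || echo "Postgres is not listening on a host port"
```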
### Performance
- [ ] Application built with `npm run build`
- [ ] Database optimized for production
- [ ] Proper caching headers configured in proxy
- [ ] Health checks configured
### Monitoring
- [ ] Health endpoint accessible: `/api/health`
- [ ] Database connection monitoring
- [ ] Application logs configured
- [ ] Error tracking setup
### External Services
- [ ] Azure OpenAI configured (if using AI features)
- [ ] Ollama server configured (if using local AI)
- [ ] External nginx proxy manager configured
## Nginx Proxy Manager Configuration
### Proxy Host Settings
- **Domain Names**: `yourdomain.com`
- **Scheme**: `http`
- **Forward Hostname/IP**: `your-server-ip`
- **Forward Port**: `3010`
- **Cache Assets**: `Yes`
- **Block Common Exploits**: `Yes`
- **Websockets Support**: `Yes`
### SSL Configuration
- Enable SSL with Let's Encrypt or your certificate
- Force SSL redirect
- HTTP/2 Support
### Custom Nginx Configuration (Advanced)
```nginx
# Add to Custom Nginx Configuration in Proxy Manager
location /api/health {
    access_log off;
}

location /_next/static {
    expires 1y;
    add_header Cache-Control "public, immutable";
}

client_max_body_size 10M;
```
## Troubleshooting
### Common Issues
1. **Port conflicts**: Ensure port 3010 is available (see the checks below)
2. **Database connection**: Check DATABASE_URL format
3. **Environment variables**: Verify all required vars are set
4. **Build errors**: Check Node.js version compatibility
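For the first two items, these quick checks may help:
```bash
# Anything already listening on port 3010 will show up here
ss -tlnp | grep 3010

# Confirm the DATABASE_URL the app container actually received
docker-compose -f docker-compose.prod.simple.yml exec app env | grep DATABASE_URL
```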
### Health Check Commands
```bash
# Check application health
curl http://localhost:3010/api/health
# Check Docker services
docker-compose -f docker-compose.prod.simple.yml ps
# View logs
docker-compose -f docker-compose.prod.simple.yml logs -f app
```
### Maintenance Commands
```bash
# Update application
git pull
docker-compose -f docker-compose.prod.simple.yml up -d --build
# Database backup
docker-compose -f docker-compose.prod.simple.yml exec postgres pg_dump -U bible_admin bible_chat > backup.sql
# View resource usage
docker stats
```
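To restore from such a backup (a sketch; this assumes the target database already exists and the application is stopped or idle):
```bash
# Feed a previous dump into the running postgres container
cat backup.sql | docker-compose -f docker-compose.prod.simple.yml exec -T postgres psql -U bible_admin -d bible_chat
```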
## Next Steps After Deployment
1. **Configure DNS** to point to your server
2. **Setup SSL certificate** in nginx proxy manager
3. **Configure monitoring** and alerting
4. **Setup automated backups** for database (see the cron sketch below)
5. **Test all functionality** in production environment
6. **Setup log rotation** and monitoring
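For item 4, a minimal cron-based approach could look like this (a sketch; the backup directory and retention policy are placeholders to adapt):
```bash
# Add with `crontab -e`: nightly dump at 02:00 into dated files under /root/backups
0 2 * * * cd /root/ghidul-biblic && docker-compose -f docker-compose.prod.simple.yml exec -T postgres pg_dump -U bible_admin bible_chat > /root/backups/bible_chat_$(date +\%F).sql
```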
## Performance Optimization
### Database
- Regular VACUUM and ANALYZE (see the example after this list)
- Monitor slow queries
- Configure connection pooling if needed
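A manual maintenance run against the compose setup might look like this (scheduling is left to cron or a similar tool; the second command only works if the `pg_stat_statements` extension is enabled):
```bash
# Reclaim space and refresh planner statistics
docker-compose -f docker-compose.prod.simple.yml exec postgres psql -U bible_admin -d bible_chat -c "VACUUM ANALYZE;"

# Inspect the slowest statements (requires the pg_stat_statements extension)
docker-compose -f docker-compose.prod.simple.yml exec postgres psql -U bible_admin -d bible_chat -c "SELECT query, mean_exec_time FROM pg_stat_statements ORDER BY mean_exec_time DESC LIMIT 10;"
```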
### Application
- Monitor memory usage
- Setup proper logging levels
- Configure rate limiting in proxy if needed
### Caching
- Static assets cached by proxy
- API responses cached where appropriate
- Database query optimization