feat: Apple-style donation-focused landing page + Azure OpenAI fixes
Major updates:
- Replace homepage with clean, minimalist Apple-style landing page
- Focus on donation messaging and mission statement
- Add comprehensive AI chat analysis documentation
- Fix Azure OpenAI configuration with correct endpoints
- Update embedding API to use text-embedding-ada-002 (1536 dims)

Landing Page Features:
- Hero section with tagline "Every Scripture. Every Language. Forever Free"
- Mission statement emphasizing free access
- Matthew 10:8 verse highlight
- 6 feature cards (Global Library, Multilingual, Prayer Wall, AI Chat, Privacy, Offline)
- Donation CTA sections with PayPal and card options
- "Why It Matters" section with dark background
- Clean footer with navigation links

Technical Changes:
- Updated .env.local with new Azure credentials
- Fixed vector-search.ts to support separate embed API version
- Integrated AuthModal into Bible reader and prayers page
- Made prayer filters collapsible and mobile-responsive
- Changed language picker to single-select

Documentation Created:
- AI_CHAT_FIX_PLAN.md - Comprehensive implementation plan
- AI_CHAT_VERIFICATION_FINDINGS.md - Database analysis
- AI_CHAT_ANALYSIS_SUMMARY.md - Executive summary
- AI_CHAT_STATUS_UPDATE.md - Current status and next steps
- logo.svg - App logo (MenuBook icon)

Build: ✅ Successful (Next.js 15.5.3)

🤖 Generated with Claude Code

Co-Authored-By: Claude <noreply@anthropic.com>
AI_CHAT_ANALYSIS_SUMMARY.md
# AI Chat System Analysis - Executive Summary

**Date:** 2025-10-10
**Analyst:** Claude Code
**Status:** 🔴 Critical Issues Found - Requires User Action

---

## 🎯 Bottom Line

The AI chat system has **excellent infrastructure** (vector database, search algorithms) but is blocked by **two critical issues**:

1. **❌ Azure OpenAI Not Configured** - No deployments exist or are accessible
2. **❌ Wrong Bible Versions** - Priority languages (Romanian, Spanish, Italian) are NOT in the database

**Good News:**
- ✅ Ollama embedding model is being installed now (alternative to Azure)
- ✅ Vector search code is production-ready
- ✅ Database has 116 fully embedded Bible versions

---

## 📊 System Status Report

### Vector Database: ✅ EXCELLENT (100%)

| Component | Status | Details |
|-----------|--------|---------|
| PostgreSQL Connection | ✅ Working | v17.5 |
| pgvector Extension | ✅ Installed | v0.8.0 |
| Schema `ai_bible` | ✅ Exists | Ready |
| Total Vector Tables | ✅ 116 tables | 100% embedded |
| Languages Supported | ⚠️ 47 languages | BUT missing priority ones |

### AI API Status: ❌ BLOCKED

| Service | Status | Issue |
|---------|--------|-------|
| Azure OpenAI Chat | ❌ Not Working | Deployment `gpt-4o` not found (404) |
| Azure OpenAI Embeddings | ❌ Not Working | Deployment `embed-3` not found (404) |
| Ollama (Local AI) | 🔄 Installing | `nomic-embed-text` downloading now |

### Vector Search Code: ✅ READY

| Feature | Status | Location |
|---------|--------|----------|
| Multi-table search | ✅ Implemented | `/lib/vector-search.ts:109` |
| Hybrid search (vector + text) | ✅ Implemented | `/lib/vector-search.ts:163` |
| Language filtering | ✅ Implemented | Table pattern: `bv_{lang}_{version}` |
| Chat integration | ✅ Implemented | `/app/api/chat/route.ts:190` |
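For orientation, here is a minimal sketch of the kind of per-table similarity query this code performs, assuming the `pg` client and pgvector's cosine-distance operator. The table and column names mirror the schema described under Technical Details below; the actual implementation lives in `/lib/vector-search.ts` and will differ in detail.

```typescript
// Minimal sketch (assumes the `pg` package and a pgvector-enabled database).
// Table/column names follow the documented schema: ai_bible.bv_{lang}_{version},
// columns ref / text_raw / embedding. Not the app's actual code.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

export async function searchVerses(
  table: string,            // e.g. "bv_en_eng_kjv" (validate against a whitelist in real code)
  queryEmbedding: number[], // produced by Azure or Ollama
  limit = 5
) {
  // pgvector accepts a text literal like '[0.1,0.2,...]'; <=> is cosine distance.
  const vectorLiteral = `[${queryEmbedding.join(",")}]`;
  const { rows } = await pool.query(
    `SELECT ref, text_raw, embedding <=> $1::vector AS distance
       FROM ai_bible.${table}
      ORDER BY embedding <=> $1::vector
      LIMIT $2`,
    [vectorLiteral, limit]
  );
  return rows;
}
```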
---

## 🚨 Critical Issue #1: Wrong Bible Versions

### User Requirements vs. Reality

**What You Need:**
- ✅ English
- ❌ Romanian (ro)
- ❌ Spanish (es)
- ❌ Italian (it)

**What's in the Database:**
- ✅ English: 9 versions (KJV, ASV, etc.)
- ❌ Romanian: **NOT FOUND**
- ❌ Spanish: **NOT FOUND**
- ❌ Italian: **NOT FOUND**

### What IS in the Database (47 Languages)

The 116 tables contain mostly obscure languages:
- `ab` (Abkhazian), `ac` (Acholi), `ad` (Adangme), `ag` (Aguacateca), etc.
- German (de), Dutch (nl), French (fr) ✓
- But **NO Romanian, Spanish, or Italian**

### Where These Tables Came From

Looking at your environment variables:
```bash
BIBLE_MD_PATH=./bibles/Biblia-Fidela-limba-romana.md
LANG_CODE=ro
TRANSLATION_CODE=FIDELA
```

You have Romanian Bible data (`Fidela`), but it's **NOT in the vector database yet**.

---

## 🚨 Critical Issue #2: Azure OpenAI Not Configured

### The Problem

Your `.env.local` has:
```bash
AZURE_OPENAI_DEPLOYMENT=gpt-4o
AZURE_OPENAI_EMBED_DEPLOYMENT=embed-3
```

But when we try to access these deployments:
```
❌ Error 404: DeploymentNotFound
"The API deployment for this resource does not exist"
```

### Tested All Common Names - None Work

We automatically tested these deployment names:
- Chat: `gpt-4`, `gpt-4o`, `gpt-35-turbo`, `gpt-4-32k`, `chat`, `gpt4`, `gpt4o`
- Embeddings: `text-embedding-ada-002`, `text-embedding-3-small`, `embed`, `embed-3`, `ada-002`

**Result:** All returned 404
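For reference, a minimal sketch of how such a probe can be made against the standard Azure OpenAI REST endpoint. The endpoint/key env var names and the `api-version` value are illustrative assumptions; the actual discovery logic lives in `/scripts/discover-azure-deployments.ts`.

```typescript
// Sketch of probing one chat deployment name (assumed env vars: AZURE_OPENAI_ENDPOINT,
// AZURE_OPENAI_API_KEY). The real script is /scripts/discover-azure-deployments.ts.
const endpoint = process.env.AZURE_OPENAI_ENDPOINT!; // e.g. https://azureopenaiinstant.openai.azure.com
const apiKey = process.env.AZURE_OPENAI_API_KEY!;

async function probeChatDeployment(name: string): Promise<boolean> {
  const res = await fetch(
    `${endpoint}/openai/deployments/${name}/chat/completions?api-version=2024-02-15-preview`,
    {
      method: "POST",
      headers: { "api-key": apiKey, "Content-Type": "application/json" },
      body: JSON.stringify({ messages: [{ role: "user", content: "ping" }], max_tokens: 1 }),
    }
  );
  // A 404 with code DeploymentNotFound means this name does not exist on the resource.
  return res.status !== 404;
}
```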

### What This Means

Either:
1. **No deployments have been created yet** in your Azure OpenAI resource
2. **Deployments have custom names** that we can't guess
3. **API key doesn't have access** to the deployments

### How to Check

1. Go to Azure Portal: https://portal.azure.com
2. Find your resource: `azureopenaiinstant.openai.azure.com`
3. Click "Deployments" or "Model deployments"
4. **Screenshot what you see** and share deployment names

---

## ✅ The Good News: Ollama Alternative

### Ollama is Available Locally

We found Ollama running on your server:
- URL: `http://localhost:11434`
- Chat model installed: `llama3.1:latest` ✅
- Embedding model: `nomic-embed-text` (downloading now... ~260MB)

### What Ollama Can Do

| Capability | Status |
|------------|--------|
| Generate embeddings | ✅ Yes (once download completes) |
| Vector search queries | ✅ Yes |
| Generate chat responses | ✅ Yes (using llama3.1) |
| **Cost** | ✅ **FREE** (runs locally) |

### Ollama vs. Azure OpenAI

| Feature | Ollama | Azure OpenAI |
|---------|--------|--------------|
| Cost | Free | Pay per token |
| Speed | Fast (local) | Moderate (network) |
| Quality | Good | Excellent |
| Multilingual | Good | Excellent |
| Configuration | ✅ Working now | ❌ Broken |

---

## 🎬 What Happens Next

### Option A: Use Ollama (Can Start Now)

**Pros:**
- ✅ Already working on your server
- ✅ Free (no API costs)
- ✅ Fast (local processing)
- ✅ Can generate embeddings for Romanian/Spanish/Italian Bibles

**Cons:**
- ⚠️ Slightly lower quality than GPT-4
- ⚠️ Requires local compute resources

**Implementation:**
1. Wait for `nomic-embed-text` download to complete (~2 minutes)
2. Update `.env.local` to prefer Ollama:
   ```bash
   OLLAMA_API_URL=http://localhost:11434
   OLLAMA_EMBED_MODEL=nomic-embed-text
   ```
3. Create embeddings for Romanian/Spanish/Italian Bibles (see the sketch below)
4. Chat will use `llama3.1` for responses
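As a rough illustration of step 3, here is a minimal sketch of embedding one verse with Ollama and storing it. The `/api/embeddings` request shape matches the curl test later in this document; the table name `bv_ro_fidela`, the `pg` client, and the helper names are assumptions for illustration only.

```typescript
// Sketch only: embed one Romanian verse with Ollama and store it.
// Assumptions: the `pg` package, DATABASE_URL, and a pre-created table
// ai_bible.bv_ro_fidela following the column layout under "Technical Details".
// Note: the table would need an embedding column of vector(768) for nomic-embed-text
// (see the Embedding Dimensions section below).
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function embedText(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const data = await res.json();
  return data.embedding; // 768-dimensional vector for nomic-embed-text
}

async function insertVerse(ref: string, book: string, chapter: number, verse: number, text: string) {
  const embedding = await embedText(text);
  // Only a subset of columns is filled here; text_norm, tsv, etc. are omitted for brevity.
  await pool.query(
    `INSERT INTO ai_bible.bv_ro_fidela (book, chapter, verse, language, translation, ref, text_raw, embedding)
     VALUES ($1, $2, $3, 'ro', 'FIDELA', $4, $5, $6::vector)`,
    [book, chapter, verse, ref, text, `[${embedding.join(",")}]`]
  );
}
```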
### Option B: Fix Azure OpenAI (Requires Azure Access)

**Pros:**
- ✅ Higher quality responses (GPT-4)
- ✅ Better multilingual support
- ✅ Scalable for many users

**Cons:**
- ❌ Costs money per API call
- ❌ Requires Azure Portal access
- ❌ Blocked until deployments are created

**Implementation:**
1. Log into Azure Portal
2. Go to Azure OpenAI resource
3. Create two deployments:
   - Chat: Deploy `gpt-4` or `gpt-35-turbo` (name it anything)
   - Embeddings: Deploy `text-embedding-ada-002` or `text-embedding-3-small`
4. Update `.env.local` with actual deployment names
5. Test with our verification script

### Option C: Hybrid (Best of Both)

Use Ollama for embeddings (free) + Azure for chat (quality):

```bash
# Use Ollama for embeddings
OLLAMA_API_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text

# Use Azure for chat (once fixed)
AZURE_OPENAI_DEPLOYMENT=<your-deployment-name>
```
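One way the hybrid split could be wired up, as a sketch: pick each provider from whichever environment variables are set. The function and type names (and any env vars beyond those shown above) are illustrative assumptions, not the app's actual configuration logic.

```typescript
// Illustrative sketch of provider selection for Option C (not the app's actual code).
type EmbeddingProvider = "ollama" | "azure";
type ChatProvider = "azure" | "ollama";

export function pickProviders(): { embeddings: EmbeddingProvider; chat: ChatProvider } {
  // Prefer the free local embedder whenever Ollama is configured.
  const embeddings: EmbeddingProvider = process.env.OLLAMA_EMBED_MODEL ? "ollama" : "azure";
  // Prefer Azure chat quality once a working deployment name is set; otherwise fall back to llama3.1.
  const chat: ChatProvider = process.env.AZURE_OPENAI_DEPLOYMENT ? "azure" : "ollama";
  return { embeddings, chat };
}
```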
---

## 📋 Required Actions (In Order)

### Immediate (Today)

1. **Decision:** Choose Option A (Ollama), B (Azure), or C (Hybrid)

2. **If Ollama (Option A or C):**
   - ✅ Download is in progress
   - Wait 2-5 minutes for completion
   - Test with: `curl -X POST http://localhost:11434/api/embeddings -d '{"model":"nomic-embed-text","prompt":"test"}'`

3. **If Azure (Option B or C):**
   - Log into Azure Portal
   - Navigate to Azure OpenAI resource
   - Check/create deployments
   - Share deployment names

### Short-term (This Week)

4. **Get Romanian Bible Data:**
   - Source: `/bibles/Biblia-Fidela-limba-romana.md` (already exists!)
   - Need: Cornilescu version (if available)
   - Action: Create embeddings and import

5. **Get Spanish Bible Data:**
   - Source needed: RVR1960 (Reina-Valera 1960)
   - Optional: NVI (Nueva Versión Internacional)
   - Action: Find source, create embeddings, import

6. **Get Italian Bible Data:**
   - Source needed: Nuova Diodati
   - Optional: Nuova Riveduta
   - Action: Find source, create embeddings, import

### Medium-term (Next 2 Weeks)

7. **Implement English Fallback** (see the sketch after this list):
   - When Romanian/Spanish/Italian searches return poor results
   - Automatically search English versions
   - Add language indicator in citations: `[KJV - English] John 3:16`

8. **Create Version Config Table:**
   - Track which versions are complete
   - Map versions to languages
   - Enable smart fallback logic

9. **Testing:**
   - Test Romanian queries → Romanian results
   - Test Spanish queries → Spanish results
   - Test Italian queries → Italian results
   - Test fallback when needed
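A minimal sketch of the fallback idea from item 7, assuming a `searchVerses`-style helper like the one sketched earlier. The distance threshold and function names are illustrative assumptions, not the final design.

```typescript
// Sketch of the English-fallback idea (item 7). Threshold and helper names are assumptions;
// results carry a language tag so citations can be rendered as "[KJV - English] John 3:16".
interface VerseHit {
  ref: string;
  text_raw: string;
  distance: number; // pgvector cosine distance: lower is closer
}

async function searchWithFallback(
  search: (table: string, embedding: number[], limit?: number) => Promise<VerseHit[]>,
  primaryTable: string,   // e.g. "bv_ro_fidela"
  englishTable: string,   // e.g. "bv_en_eng_kjv"
  embedding: number[],
  maxDistance = 0.35      // illustrative cutoff for a "good" match
) {
  const primary = await search(primaryTable, embedding);
  if (primary.length > 0 && primary[0].distance <= maxDistance) {
    return primary.map((hit) => ({ ...hit, language: "primary" }));
  }
  // Poor or empty results: fall back to the English versions and mark the language.
  const english = await search(englishTable, embedding);
  return english.map((hit) => ({ ...hit, language: "en" }));
}
```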
---

## 🔧 Technical Details

### Current Database Schema

Table naming pattern:
```
ai_bible.bv_{language_code}_{version_abbreviation}

Examples:
- ai_bible.bv_en_eng_kjv       ✅ Exists (English KJV)
- ai_bible.bv_ro_cornilescu    ❌ Needed (Romanian Cornilescu)
- ai_bible.bv_es_rvr1960       ❌ Needed (Spanish RVR1960)
- ai_bible.bv_it_nuovadiodati  ❌ Needed (Italian Nuova Diodati)
```

### Table Structure (All 116 tables have this)

| Column | Type | Description |
|--------|------|-------------|
| `id` | uuid | Primary key |
| `testament` | text | OT/NT |
| `book` | text | Book name |
| `chapter` | integer | Chapter number |
| `verse` | integer | Verse number |
| `language` | text | Language code |
| `translation` | text | Version abbreviation |
| `ref` | text | "Genesis 1:1" format |
| `text_raw` | text | Verse text |
| `text_norm` | text | Normalized text |
| `tsv` | tsvector | Full-text search index |
| **`embedding`** | vector | **Vector embedding (3072 dims)** |
| `created_at` | timestamp | Creation time |
| `updated_at` | timestamp | Update time |
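For illustration, a sketch of creating one of these per-version tables from a Node script, using the column layout above. The exact constraints, defaults, and index settings of the existing tables are not shown in this document, so treat these as assumptions.

```typescript
// Sketch of creating a new per-version table matching the documented layout.
// Constraint/default/index details are assumptions; the 116 existing tables may differ.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function createVersionTable(table: string, dims: number) {
  await pool.query(`
    CREATE TABLE IF NOT EXISTS ai_bible.${table} (
      id          uuid PRIMARY KEY DEFAULT gen_random_uuid(),
      testament   text,
      book        text,
      chapter     integer,
      verse       integer,
      language    text,
      translation text,
      ref         text,
      text_raw    text,
      text_norm   text,
      tsv         tsvector,
      embedding   vector(${dims}),
      created_at  timestamp DEFAULT now(),
      updated_at  timestamp DEFAULT now()
    )
  `);
}

// e.g. createVersionTable("bv_ro_fidela", 768) for nomic-embed-text embeddings
```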
### Embedding Dimensions

Current `.env.local` says:
```bash
EMBED_DIMS=3072
```

Compared with the available models:
- ✅ Azure `text-embedding-3-large` (3072 dims) - matches
- ❌ Azure `text-embedding-3-small` (1536 dims) - **INCOMPATIBLE**
- ❌ Azure `text-embedding-ada-002` (1536 dims) - **INCOMPATIBLE**
- ❌ Ollama `nomic-embed-text` (768 dims) - **INCOMPATIBLE**

**Important:** Only `text-embedding-3-large` produces 3072-dimensional vectors. If we use Ollama (or ada-002), we will need to adjust `EMBED_DIMS` and re-create the embedding columns/tables with the matching dimension.
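If the dimension does have to change, here is a sketch of the kind of migration involved. Existing 3072-dim vectors cannot be converted to a smaller dimension, so the column is dropped and re-added, and all embeddings must be regenerated; the helper name is an assumption.

```typescript
// Sketch of switching a table's embedding column to a new dimension.
// Existing vectors can't be resized, so this clears them; they must be regenerated afterwards.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function resizeEmbeddingColumn(table: string, dims: number) {
  await pool.query(`ALTER TABLE ai_bible.${table} DROP COLUMN IF EXISTS embedding`);
  await pool.query(`ALTER TABLE ai_bible.${table} ADD COLUMN embedding vector(${dims})`);
}

// e.g. resizeEmbeddingColumn("bv_en_eng_kjv", 768) before re-embedding with nomic-embed-text
```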
---

## 💡 Recommendations

### My Recommendation: Start with Ollama

**Why:**
1. ✅ It's already working (or will be in 5 minutes)
2. ✅ Free (no API costs while developing)
3. ✅ Can immediately create Romanian embeddings from your `Fidela` Bible
4. ✅ Unblocks development

**Then:**
- Add Azure OpenAI later for higher quality (when deployments are fixed)
- Use hybrid: Ollama for embeddings, Azure for chat

### Workflow I Suggest

```
Today:
→ Finish installing Ollama embedding model
→ Test embedding generation
→ Create embeddings for Fidela Romanian Bible
→ Import into ai_bible.bv_ro_fidela
→ Test Romanian chat

This Week:
→ Fix Azure deployments (for better chat quality)
→ Find Spanish RVR1960 data
→ Find Italian Nuova Diodati data
→ Create embeddings for both
→ Import into database

Next Week:
→ Implement English fallback
→ Add version metadata table
→ Create test suite
→ Optimize performance
```
---

## 📞 Questions for You

1. **AI Provider:** Do you want to use Ollama (free, local) or fix Azure OpenAI (better quality, costs money)?

2. **Azure Access:** Do you have access to the Azure Portal to check/create deployments?

3. **Bible Data:** Do you have Spanish (RVR1960) and Italian (Nuova Diodati) Bible data, or do we need to source it?

4. **Fidela Bible:** The file `./bibles/Biblia-Fidela-limba-romana.md` exists - should we create embeddings for this now?

5. **Embedding Dimensions:** Are you okay with potentially re-creating embedding tables with different dimensions if we switch from Azure (3072) to Ollama (768)?

---

## 📄 Reference Documents

| Document | Purpose | Location |
|----------|---------|----------|
| Implementation Plan | Detailed technical plan | `/AI_CHAT_FIX_PLAN.md` |
| Verification Findings | Database analysis | `/AI_CHAT_VERIFICATION_FINDINGS.md` |
| This Summary | Executive overview | `/AI_CHAT_ANALYSIS_SUMMARY.md` |
| Verification Script | System health check | `/scripts/verify-ai-system.ts` |
| Deployment Discovery | Find Azure deployments | `/scripts/discover-azure-deployments.ts` |

---

## ✅ Next Action

**Waiting for your decision:**
- Option A: Use Ollama ← **Recommended to start**
- Option B: Fix Azure OpenAI
- Option C: Hybrid approach

Once you decide, I can immediately proceed with implementation.

---

**Status:** Analysis complete. Ready to implement based on your choice. 🚀