feat: implement AI chat with vector search and random loading messages

Major Features:
- AI chat with Azure OpenAI GPT-4o integration
- Vector search across Bible versions (ASV English, RVA 1909 Spanish)
- Multi-language support with automatic English fallback
- Bible version citations in responses: [ASV], [RVA 1909]
- Random Bible-themed loading messages (5 variants)
- Safe build script with memory guardrails
- 8GB swap memory for build safety
- Stripe donation integration with multiple payment methods (see the sketch after this list)
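
The donation endpoint itself is not part of the diff excerpt below; as a rough illustration of "multiple payment methods", a Stripe Checkout session created with the stripe Node SDK might look like this (handler name, amount, and URLs are placeholders, not taken from this commit):

    import Stripe from 'stripe'

    const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!)

    // Hypothetical helper; the real route in this commit may be named differently.
    export async function createDonationSession(amountUsdCents: number) {
      return stripe.checkout.sessions.create({
        mode: 'payment',
        line_items: [
          {
            price_data: {
              currency: 'usd',
              product_data: { name: 'Donation' },
              unit_amount: amountUsdCents, // e.g. 500 = $5.00
            },
            quantity: 1,
          },
        ],
        // Leaving payment_method_types unset lets Stripe offer every method
        // enabled in the dashboard (cards, wallets, bank debits, etc.).
        success_url: 'https://example.com/donate/thanks',
        cancel_url: 'https://example.com/donate',
      })
    }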

AI Chat Improvements:
- Implement vector search with 1536-dim embeddings (Azure text-embedding-ada-002)
- Search all Bible versions in the user's language, falling back to English (see the retrieval sketch after this list)
- Cite Bible versions properly in AI responses
- Add 5 random loading messages ("Searching the Scriptures...", etc.)
- Fix Ollama conflict (Ollama disabled; Azure OpenAI is used exclusively)
- Optimize hybrid search queries to match the actual table schema
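
As a minimal TypeScript sketch of the retrieval flow (table and column names are illustrative, not the real schema): embed the question with the Azure text-embedding-ada-002 deployment, search the 1536-dim pgvector tables for the user's language with an English fallback, and label each hit with its Bible version so the model can cite [ASV] or [RVA 1909].

    import { Pool } from 'pg'

    const pool = new Pool({ connectionString: process.env.DATABASE_URL })

    // Hypothetical mapping from language code to indexed Bible versions.
    const SOURCES: Record<string, { table: string; label: string }[]> = {
      en: [{ table: 'verses_asv', label: 'ASV' }],
      es: [{ table: 'verses_rva1909', label: 'RVA 1909' }],
    }

    async function embed(text: string): Promise<number[]> {
      const res = await fetch(
        `${process.env.AZURE_OPENAI_ENDPOINT}/openai/deployments/text-embedding-ada-002/embeddings?api-version=2023-05-15`,
        {
          method: 'POST',
          headers: { 'api-key': process.env.AZURE_OPENAI_API_KEY!, 'Content-Type': 'application/json' },
          body: JSON.stringify({ input: text }),
        }
      )
      const json = await res.json()
      return json.data[0].embedding // 1536-dim vector
    }

    async function searchVerses(question: string, lang: string, limit = 5) {
      const vector = await embed(question)
      // Fall back to English when the user's language has no indexed version.
      const sources = SOURCES[lang] ?? SOURCES.en
      const hits = []
      for (const { table, label } of sources) {
        const { rows } = await pool.query(
          `SELECT reference, text, embedding <=> $1::vector AS distance
             FROM ${table}
            ORDER BY distance
            LIMIT $2`,
          [`[${vector.join(',')}]`, limit]
        )
        // Tag each verse with its version so the answer can cite [ASV] / [RVA 1909].
        hits.push(...rows.map(r => ({ ...r, version: label })))
      }
      return hits
    }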

Build & Infrastructure:
- Create safe-build.sh with memory monitoring to prevent server crashes (a sketch of the guardrail idea follows this list)
- Add 8GB of swap as an emergency memory buffer
- Document the build process in BUILD_GUIDE.md
- Cap Node.js memory at 4GB during builds
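
safe-build.sh itself is a shell script; the guardrail idea, sketched here in Node/TypeScript purely for illustration, is to run the build under a 4GB heap cap and abort it if free memory falls below a threshold (the `next build` command and the 512MB threshold are assumptions, not taken from the script):

    import { spawn } from 'node:child_process'
    import os from 'node:os'

    const MIN_FREE_MB = 512 // abort threshold (assumption)

    // Run the build with Node's old-space heap capped at 4GB.
    const build = spawn('npx', ['next', 'build'], {
      stdio: 'inherit',
      env: { ...process.env, NODE_OPTIONS: '--max-old-space-size=4096' },
    })

    // Watchdog: kill the build before it exhausts memory and takes the server down.
    const watchdog = setInterval(() => {
      const freeMb = os.freemem() / 1024 / 1024
      if (freeMb < MIN_FREE_MB) {
        console.error(`Free memory below ${MIN_FREE_MB} MB, aborting build`)
        build.kill('SIGTERM')
      }
    }, 5000)

    build.on('exit', code => {
      clearInterval(watchdog)
      process.exit(code ?? 1)
    })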

Database:
- Remove 115 old vector tables with incorrect embedding dimensions (see the audit sketch after this list)
- Keep only the 2 tables with correct 1536-dim embeddings
- Add Stripe schema for donations and subscriptions
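
The dimension cleanup can be sanity-checked with an audit like the sketch below (illustrative script, not part of this commit): find every pgvector column and flag tables whose stored embeddings are not 1536-dimensional.

    import { Pool } from 'pg'

    const pool = new Pool({ connectionString: process.env.DATABASE_URL })

    async function auditVectorTables() {
      // Every regular-table column whose type is pgvector's "vector".
      const { rows: cols } = await pool.query(`
        SELECT c.relname AS tbl, a.attname AS col
          FROM pg_attribute a
          JOIN pg_class c ON c.oid = a.attrelid
          JOIN pg_type t ON t.oid = a.atttypid
         WHERE t.typname = 'vector'
           AND c.relkind = 'r'
           AND a.attnum > 0
           AND NOT a.attisdropped
      `)
      for (const { tbl, col } of cols) {
        const { rows } = await pool.query(
          `SELECT vector_dims("${col}") AS dims FROM "${tbl}" LIMIT 1`
        )
        const dims = rows[0]?.dims
        if (dims !== 1536) console.log(`Drop candidate: ${tbl} (${dims ?? 'empty'} dims)`)
      }
    }

    auditVectorTables().finally(() => pool.end())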

Documentation:
- AI_CHAT_FINAL_STATUS.md - Complete implementation status
- AI_CHAT_IMPLEMENTATION_COMPLETE.md - Technical details
- BUILD_GUIDE.md - Safe building guide with guardrails
- CHAT_LOADING_MESSAGES.md - Loading messages implementation
- STRIPE_IMPLEMENTATION_COMPLETE.md - Stripe integration docs
- STRIPE_SETUP_GUIDE.md - Stripe configuration guide

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
commit a01377b21a
parent b3ec31a265
Date: 2025-10-12 19:37:24 +00:00
20 changed files with 3022 additions and 130 deletions


@@ -4,10 +4,20 @@ import { useState, useRef, useEffect } from 'react'
 import { Send, User } from 'lucide-react'
 import ReactMarkdown from 'react-markdown'
 
+// Random Bible-related loading messages
+const LOADING_MESSAGES = [
+  "Searching the Scriptures...",
+  "Seeking wisdom from God's Word...",
+  "Consulting the Holy Scriptures...",
+  "Finding relevant Bible verses...",
+  "Exploring God's eternal truth..."
+]
+
 export function ChatInterface() {
   const [messages, setMessages] = useState<Array<{ role: string; content: string }>>([])
   const [input, setInput] = useState('')
   const [loading, setLoading] = useState(false)
+  const [loadingMessage, setLoadingMessage] = useState('')
   const [isAuthenticated, setIsAuthenticated] = useState(false)
   const messagesEndRef = useRef<HTMLDivElement>(null)
@@ -42,6 +52,10 @@ export function ChatInterface() {
     const userMessage = { role: 'user', content: input }
     setMessages(prev => [...prev, userMessage])
     setInput('')
+
+    // Pick a random loading message
+    const randomMessage = LOADING_MESSAGES[Math.floor(Math.random() * LOADING_MESSAGES.length)]
+    setLoadingMessage(randomMessage)
     setLoading(true)
 
     try {
@@ -135,11 +149,14 @@ export function ChatInterface() {
         {loading && (
           <div className="flex justify-start">
-            <div className="bg-gray-100 p-3 rounded-lg">
-              <div className="flex space-x-2">
-                <div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce" />
-                <div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce delay-100" />
-                <div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce delay-200" />
+            <div className="bg-gray-100 p-4 rounded-lg">
+              <div className="flex items-center space-x-3">
+                <div className="flex space-x-2">
+                  <div className="w-2 h-2 bg-blue-500 rounded-full animate-bounce" />
+                  <div className="w-2 h-2 bg-blue-500 rounded-full animate-bounce delay-100" />
+                  <div className="w-2 h-2 bg-blue-500 rounded-full animate-bounce delay-200" />
+                </div>
+                <span className="text-sm text-gray-600 italic">{loadingMessage}</span>
               </div>
             </div>
           </div>