Add voice input UI components for hands-free tracking

Implemented complete voice input user interface:

**Voice Recording Hook (useVoiceInput):**
- Browser Web Speech API integration
- Real-time speech recognition
- Continuous and interim results
- 10-second auto-timeout
- Error handling for permissions, network, audio issues
- Graceful fallback for unsupported browsers
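
A minimal sketch of what a hook like this can look like (names and details here are illustrative, not the exact contents of hooks/useVoiceInput.ts):

```tsx
// Illustrative sketch only: a hook wrapping the browser SpeechRecognition API
// with interim results, error reporting, and a 10-second auto-stop.
import { useCallback, useRef, useState } from 'react';

export function useVoiceInputSketch() {
  const [transcript, setTranscript] = useState('');
  const [isListening, setIsListening] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const recognitionRef = useRef<any>(null);

  const start = useCallback(() => {
    const SpeechRecognitionImpl =
      (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
    if (!SpeechRecognitionImpl) {
      // Graceful fallback for browsers without the Web Speech API (e.g. Firefox)
      setError('Speech recognition is not supported in this browser');
      return;
    }
    const recognition = new SpeechRecognitionImpl();
    recognition.continuous = true;
    recognition.interimResults = true;
    recognition.onresult = (event: any) => {
      // Join all results so interim text updates in real time
      const text = Array.from(event.results)
        .map((r: any) => r[0].transcript)
        .join(' ');
      setTranscript(text);
    };
    recognition.onerror = (event: any) => setError(event.error);
    recognition.onend = () => setIsListening(false);
    recognition.start();
    recognitionRef.current = recognition;
    setIsListening(true);
    // Auto-stop after 10 seconds
    setTimeout(() => recognition.stop(), 10_000);
  }, []);

  const stop = useCallback(() => {
    recognitionRef.current?.stop();
  }, []);

  return { transcript, isListening, error, start, stop };
}
```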

**Voice Input Button Component:**
- Modal dialog with microphone button
- Animated pulsing microphone when recording
- Real-time transcript display
- Automatic intent classification on completion
- Structured data visualization
- Example commands for user guidance
- Success/error feedback with MUI Alerts
- Confidence level indicators
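
A simplified sketch of the modal's structure (illustrative only; the real VoiceInputButton also renders example commands, confidence indicators, and MUI Alerts for success/error states):

```tsx
// Illustrative shape of the modal voice input component.
import { useState } from 'react';
import { Dialog, DialogContent, DialogTitle, Fab, Typography } from '@mui/material';
import MicIcon from '@mui/icons-material/Mic';

export function VoiceDialogSketch({ transcript, recording, onRecord }: {
  transcript: string;
  recording: boolean;
  onRecord: () => void;
}) {
  const [open, setOpen] = useState(false);
  return (
    <>
      <Fab size="small" onClick={() => setOpen(true)} aria-label="open voice input">
        <MicIcon />
      </Fab>
      <Dialog open={open} onClose={() => setOpen(false)} fullWidth>
        <DialogTitle>Voice input</DialogTitle>
        <DialogContent>
          {/* The real component animates this button with a pulsing effect while recording */}
          <Fab color={recording ? 'secondary' : 'primary'} onClick={onRecord}>
            <MicIcon />
          </Fab>
          <Typography sx={{ mt: 2 }}>
            {transcript || 'Say something like "Fed baby 120 ml"'}
          </Typography>
        </DialogContent>
      </Dialog>
    </>
  );
}
```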

**Floating Action Button:**
- Always-visible FAB in bottom-right corner
- Quick access from any page
- Auto-navigation to appropriate tracking page
- Snackbar feedback messages
- Mobile-optimized positioning (thumb zone)
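
A rough sketch of the FAB placement (simplified; the real VoiceFloatingButton also opens the voice dialog, shows snackbar feedback, and navigates to the matching tracking page):

```tsx
// Illustrative positioning of the always-visible floating action button.
import { Fab } from '@mui/material';
import MicIcon from '@mui/icons-material/Mic';

export function VoiceFabSketch({ onClick }: { onClick: () => void }) {
  return (
    <Fab
      color="primary"
      aria-label="voice input"
      onClick={onClick}
      // Fixed to the bottom-right "thumb zone" so it stays reachable on mobile
      sx={{ position: 'fixed', bottom: 24, right: 24, zIndex: 1300 }}
    >
      <MicIcon />
    </Fab>
  );
}
```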

**Integration with Tracking Pages:**
- Voice button in feeding page header
- Auto-fills form fields from voice commands
- Seamless voice-to-form workflow
- Example: "Fed baby 120ml" → fills bottle type & amount
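
As a hypothetical illustration of that mapping, a transcript such as "Fed baby 120 ml" could be turned into structured feeding data along these lines (the app's actual intent classifier may work differently):

```ts
// Hypothetical parsing example; not the app's real classification logic.
interface FeedingData {
  type: 'bottle' | 'breast' | 'solid';
  amount?: number; // millilitres for bottle feeds
}

function parseFeedingTranscript(transcript: string): FeedingData | null {
  const text = transcript.toLowerCase();
  const amountMatch = text.match(/(\d+)\s*ml/);
  if (amountMatch) {
    return { type: 'bottle', amount: Number(amountMatch[1]) };
  }
  if (text.includes('breast')) return { type: 'breast' };
  if (text.includes('solid')) return { type: 'solid' };
  return null;
}

// parseFeedingTranscript('Fed baby 120 ml') → { type: 'bottle', amount: 120 }
```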

**Features:**
- Browser speech recognition (Chrome, Edge, Safari)
- Real-time transcription display
- Automatic intent classification
- Auto-fill tracking forms
- Visual feedback (animations, colors)
- Error handling & user guidance
- Mobile-optimized design
- Accessibility support

**User Flow:**
1. Click microphone button (floating or in-page)
2. Speak command: "Fed baby 120 ml"
3. See real-time transcript
4. Auto-classification shows intent & data
5. Click "Use Command"
6. Form auto-fills or activity created

**Browser Support:**
- Chrome: supported
- Edge: supported
- Safari: supported
- Firefox: not supported (Web Speech API unavailable)
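
A simple capability check matching the fallback behaviour described above (sketch; the hook performs an equivalent check internally):

```ts
// Chrome, Edge, and Safari expose (webkit)SpeechRecognition; Firefox exposes neither.
export function isSpeechRecognitionSupported(): boolean {
  if (typeof window === 'undefined') return false; // SSR guard for Next.js
  return 'SpeechRecognition' in window || 'webkitSpeechRecognition' in window;
}
```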

**Files Created:**
- hooks/useVoiceInput.ts - Speech recognition hook
- components/voice/VoiceInputButton.tsx - Modal input component
- components/voice/VoiceFloatingButton.tsx - FAB for quick access
- app/layout.tsx - Added floating button globally
- app/track/feeding/page.tsx - Added voice button to header

Voice input is now accessible from anywhere in the app, providing
true hands-free tracking for parents.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-01 20:24:43 +00:00
parent 79966a6a6d
commit 63a333bba3
5 changed files with 618 additions and 1 deletion

app/layout.tsx

@@ -3,6 +3,7 @@ import { Inter } from 'next/font/google';
import { ThemeRegistry } from '@/components/ThemeRegistry';
import { ErrorBoundary } from '@/components/common/ErrorBoundary';
import { ReduxProvider } from '@/components/providers/ReduxProvider';
import { VoiceFloatingButton } from '@/components/voice/VoiceFloatingButton';
// import { PerformanceMonitor } from '@/components/common/PerformanceMonitor'; // Temporarily disabled
import './globals.css';
@@ -44,6 +45,7 @@ export default function RootLayout({
<ThemeRegistry>
{/* <PerformanceMonitor /> */}
{children}
<VoiceFloatingButton />
</ThemeRegistry>
</ReduxProvider>
</ErrorBoundary>

app/track/feeding/page.tsx

@@ -48,6 +48,7 @@ import { withErrorBoundary } from '@/components/common/ErrorFallbacks';
import { useAuth } from '@/lib/auth/AuthContext';
import { trackingApi, Activity } from '@/lib/api/tracking';
import { childrenApi, Child } from '@/lib/api/children';
import { VoiceInputButton } from '@/components/voice/VoiceInputButton';
import { motion } from 'framer-motion';
import { formatDistanceToNow } from 'date-fns';
@@ -350,9 +351,31 @@ function FeedingTrackPage() {
<IconButton onClick={() => router.back()} sx={{ mr: 2 }}>
<ArrowBack />
</IconButton>
<Typography variant="h4" fontWeight="600">
<Typography variant="h4" fontWeight="600" sx={{ flex: 1 }}>
Track Feeding
</Typography>
<VoiceInputButton
onTranscript={(transcript) => {
console.log('[Feeding] Voice transcript:', transcript);
}}
onClassifiedIntent={(result) => {
if (result.intent === 'feeding' && result.structuredData) {
const data = result.structuredData;
// Auto-fill form with voice data
if (data.type === 'bottle' && data.amount) {
setFeedingType('bottle');
setAmount(data.amount.toString());
} else if (data.type?.includes('breast')) {
setFeedingType('breast');
if (data.side) setSide(data.side);
if (data.duration) setDuration(data.duration.toString());
} else if (data.type === 'solid') {
setFeedingType('solid');
}
}
}}
size="medium"
/>
</Box>
{error && (