Implemented the complete voice input user interface.

**Voice Recording Hook (useVoiceInput)** (a minimal sketch follows this summary):
- Browser Web Speech API integration
- Real-time speech recognition
- Continuous and interim results
- 10-second auto-timeout
- Error handling for permission, network, and audio issues
- Graceful fallback for unsupported browsers

**Voice Input Button Component:**
- Modal dialog with microphone button
- Animated pulsing microphone while recording
- Real-time transcript display
- Automatic intent classification on completion
- Structured data visualization
- Example commands for user guidance
- Success/error feedback via MUI Alerts
- Confidence level indicators

**Floating Action Button** (a positioning sketch appears after the layout file below):
- Always-visible FAB in the bottom-right corner
- Quick access from any page
- Auto-navigation to the appropriate tracking page
- Snackbar feedback messages
- Mobile-optimized positioning (thumb zone)

**Integration with Tracking Pages** (see the parsing sketch below):
- Voice button in the feeding page header
- Auto-fills form fields from voice commands
- Seamless voice-to-form workflow
- Example: "Fed baby 120ml" → fills bottle type & amount

**Features:**
- ✅ Browser speech recognition (Chrome, Edge, Safari)
- ✅ Real-time transcription display
- ✅ Automatic intent classification
- ✅ Auto-fill of tracking forms
- ✅ Visual feedback (animations, colors)
- ✅ Error handling & user guidance
- ✅ Mobile-optimized design
- ✅ Accessibility support

**User Flow:**
1. Click the microphone button (floating or in-page)
2. Speak a command: "Fed baby 120 ml"
3. Watch the real-time transcript
4. Auto-classification shows the intent & extracted data
5. Click "Use Command"
6. The form auto-fills or an activity is created

**Browser Support:**
- Chrome ✅
- Edge ✅
- Safari ✅
- Firefox ❌ (Web Speech API not supported)

**Files Created/Modified:**
- hooks/useVoiceInput.ts - Speech recognition hook
- components/voice/VoiceInputButton.tsx - Modal input component
- components/voice/VoiceFloatingButton.tsx - FAB for quick access
- app/layout.tsx - Added floating button globally
- app/track/feeding/page.tsx - Added voice button to header

Voice input is now accessible from anywhere in the app, providing true hands-free tracking for parents.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
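For reference, a minimal sketch of the hook's core, assuming the behavior described above (continuous + interim results, 10-second auto-timeout, graceful fallback). The real implementation lives in hooks/useVoiceInput.ts and its exact API may differ; the returned shape here is illustrative:

```tsx
// Minimal sketch of a Web Speech API hook (illustrative; see hooks/useVoiceInput.ts for the real one).
import { useCallback, useRef, useState } from 'react';

export function useVoiceInput() {
  const [transcript, setTranscript] = useState('');
  const [isListening, setIsListening] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const timeoutRef = useRef<ReturnType<typeof setTimeout>>();

  const start = useCallback(() => {
    // Graceful fallback: Firefox exposes neither constructor.
    const SpeechRecognitionImpl =
      (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
    if (!SpeechRecognitionImpl) {
      setError('Speech recognition is not supported in this browser.');
      return;
    }

    const recognition = new SpeechRecognitionImpl();
    recognition.continuous = true;     // keep listening across pauses
    recognition.interimResults = true; // stream partial transcripts in real time
    recognition.lang = 'en-US';

    recognition.onresult = (event: any) => {
      // Join all results so interim text updates live as the user speaks.
      const text = Array.from(event.results)
        .map((r: any) => r[0].transcript)
        .join('');
      setTranscript(text);
    };
    recognition.onerror = (event: any) => {
      // e.g. 'not-allowed' (permissions), 'network', 'audio-capture'
      setError(event.error);
      setIsListening(false);
    };
    recognition.onend = () => setIsListening(false);

    recognition.start();
    setIsListening(true);
    setError(null);

    // 10-second auto-timeout so the mic never stays hot indefinitely.
    clearTimeout(timeoutRef.current);
    timeoutRef.current = setTimeout(() => recognition.stop(), 10_000);
  }, []);

  return { transcript, isListening, error, start };
}
```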
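The "Fed baby 120ml" → form-fill flow implies a small classifier that turns a transcript into structured data. A hedged sketch of that mapping, where `classifyIntent` and the `FeedingIntent` shape are assumed names for illustration, not the actual classifier API:

```ts
// Illustrative sketch of voice-to-form parsing; the real classifier's API may differ.
interface FeedingIntent {
  intent: 'feeding';
  feedingType: 'bottle';
  amountMl: number;
}

// "Fed baby 120ml" / "Fed baby 120 ml"
//   → { intent: 'feeding', feedingType: 'bottle', amountMl: 120 }
function classifyIntent(transcript: string): FeedingIntent | null {
  const match = transcript.toLowerCase().match(/fed\s+baby\s+(\d+)\s*ml/);
  if (!match) return null;
  return { intent: 'feeding', feedingType: 'bottle', amountMl: Number(match[1]) };
}
```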
**app/layout.tsx** — the floating button is mounted globally in the root layout:

```tsx
import type { Metadata } from 'next';

import { Inter } from 'next/font/google';

import { ThemeRegistry } from '@/components/ThemeRegistry';
import { ErrorBoundary } from '@/components/common/ErrorBoundary';
import { ReduxProvider } from '@/components/providers/ReduxProvider';
import { VoiceFloatingButton } from '@/components/voice/VoiceFloatingButton';
// import { PerformanceMonitor } from '@/components/common/PerformanceMonitor'; // Temporarily disabled

import './globals.css';

const inter = Inter({ subsets: ['latin'] });

export const metadata: Metadata = {
  title: 'Maternal - AI-Powered Child Care Assistant',
  description:
    'Track, analyze, and get AI-powered insights for your child\'s development, sleep, feeding, and more.',
  manifest: '/manifest.json',
  themeColor: '#FFB6C1',
  viewport: {
    width: 'device-width',
    initialScale: 1,
    maximumScale: 1,
    userScalable: false,
  },
  appleWebApp: {
    capable: true,
    statusBarStyle: 'default',
    title: 'Maternal',
  },
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <head>
        <link rel="manifest" href="/manifest.json" />
        <meta name="theme-color" content="#FFB6C1" />
        <link rel="apple-touch-icon" href="/icon-192x192.png" />
      </head>
      <body className={inter.className}>
        <ErrorBoundary>
          <ReduxProvider>
            <ThemeRegistry>
              {/* <PerformanceMonitor /> */}
              {children}
              <VoiceFloatingButton />
            </ThemeRegistry>
          </ReduxProvider>
        </ErrorBoundary>
      </body>
    </html>
  );
}
```
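For completeness, a sketch of how the globally mounted FAB might place itself in the bottom-right thumb zone. The real component is components/voice/VoiceFloatingButton.tsx; the exact offsets, dialog wiring, and navigation logic below are assumptions for illustration:

```tsx
// Illustrative FAB shell; see components/voice/VoiceFloatingButton.tsx for the real component.
'use client';

import { useState } from 'react';
import Fab from '@mui/material/Fab';
import MicIcon from '@mui/icons-material/Mic';

export function VoiceFloatingButton() {
  const [open, setOpen] = useState(false);

  return (
    <>
      <Fab
        color="primary"
        aria-label="voice input"
        onClick={() => setOpen(true)}
        // Fixed bottom-right "thumb zone" placement (offsets assumed here),
        // kept above page content with a high z-index.
        sx={{ position: 'fixed', bottom: 80, right: 16, zIndex: 1300 }}
      >
        <MicIcon />
      </Fab>
      {/* The real component opens the voice dialog from here and, on a
          classified command, navigates to the matching tracking page. */}
    </>
  );
}
```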