## Voice Command Understanding

This project implements a comprehensive voice command understanding system:

**Intent Classification:**
- Feeding intent (bottle, breastfeeding, solid food)
- Sleep intent (naps, nighttime sleep)
- Diaper intent (wet, dirty, both, dry)
- Unknown intent handling

**Entity Extraction:**
- Amounts with units (ml, oz, tbsp): "120 ml", "4 ounces"
- Durations in minutes: "15 minutes", "for 20 mins"
- Time expressions: "at 3:30 pm", "30 minutes ago", "just now"
- Breastfeeding side: "left", "right", "both"
- Diaper types: "wet", "dirty", "both"
- Sleep types: "nap", "night"

**Structured Data Output:**
- `FeedingData`: type, amount, unit, duration, side, timestamps
- `SleepData`: type, duration, start/end times
- `DiaperData`: type, timestamp
- Ready for direct activity creation

**Pattern Matching:**
- 15+ feeding patterns (bottle, breast, solid)
- 8+ sleep patterns (nap, sleep, woke up)
- 8+ diaper patterns (wet, dirty, bowel movement)
- Robust keyword detection with variations

**Confidence Scoring:**
- High: >= 0.8 (strong match)
- Medium: 0.5-0.79 (probable match)
- Low: < 0.5 (uncertain)
- Minimum threshold for validation: 0.3

**API Endpoint:**
- `POST /api/voice/transcribe` - classify text or audio
- `GET /api/voice/transcribe` - list supported commands
- JSON response with intent, confidence, entities, and structured data
- Audio transcription placeholder (Whisper integration ready)

**Implementation Files:**
- `lib/voice/intentClassifier.ts` - core classification (600+ lines)
- `app/api/voice/transcribe/route.ts` - API endpoint
- `scripts/test-voice-intent.mjs` - test suite (25 tests)
- `lib/voice/README.md` - complete documentation

**Test Coverage:** 25 tests, 100% pass rate
- ✅ Bottle feeding (3 tests)
- ✅ Breastfeeding (3 tests)
- ✅ Solid food (2 tests)
- ✅ Sleep tracking (6 tests)
- ✅ Diaper changes (7 tests)
- ✅ Edge cases (4 tests)

**Example Commands:**
- "Fed baby 120 ml" → bottle, 120 ml
- "Nursed on left breast for 15 minutes" → breast_left, 15 min
- "Changed wet and dirty diaper" → both
- "Napped for 45 minutes" → nap, 45 min

The system converts natural language into structured tracking data with high accuracy for common parenting voice commands. Usage sketches for the classifier and the API endpoint follow below.
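As a reference, here is a minimal sketch of calling the classifier from application code. The import path, the exported name `classifyIntent`, and the exact result fields are assumptions based on the feature list above, not the actual API in `lib/voice/intentClassifier.ts`:

```ts
// Hypothetical usage sketch - function name and result shape are assumptions.
import { classifyIntent } from "@/lib/voice/intentClassifier";

const result = classifyIntent("Nursed on left breast for 15 minutes");

// Expected shape of the result, per the feature list above:
// {
//   intent: "feeding",
//   confidence: 0.9,                          // >= 0.8 is a strong match
//   entities: { side: "left", duration: 15 },
//   data: { type: "breast", side: "left", duration: 15 },  // FeedingData
// }

if (result.confidence >= 0.3) {
  // Above the minimum validation threshold - safe to create an activity.
  console.log(result.intent, result.data);
}
```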
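And a sketch of calling the HTTP endpoint. Only the route and the general response contents (intent, confidence, entities, structured data) come from the description above; the request body key `text` and the exact response field names are assumptions:

```ts
// Hypothetical client-side call to the classification endpoint.
async function classifyUtterance(text: string) {
  const res = await fetch("/api/voice/transcribe", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  return res.json();
}

// Example:
// await classifyUtterance("Fed baby 120 ml");
// => { intent: "feeding", confidence: 0.85,
//      entities: { amount: 120, unit: "ml" },
//      data: { type: "bottle", amount: 120, unit: "ml" } }
```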
This is a Next.js project bootstrapped with `create-next-app`.
## Getting Started
First, run the development server:
```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```
Open http://localhost:3000 with your browser to see the result.
You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.
This project uses `next/font` to automatically optimize and load Inter, a custom Google Font.
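For reference, the standard `next/font` pattern for loading Inter in the App Router looks like the following; this mirrors the default create-next-app template, and the project's actual `app/layout.tsx` may differ in details:

```tsx
// app/layout.tsx - standard next/font usage (not necessarily this project's exact code).
import { Inter } from "next/font/google";
import type { ReactNode } from "react";

// Self-hosts the font at build time, so no request goes to Google at runtime.
const inter = Inter({ subsets: ["latin"] });

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}
```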
## Learn More
To learn more about Next.js, take a look at the following resources:
- Next.js Documentation - learn about Next.js features and API.
- Learn Next.js - an interactive Next.js tutorial.
You can check out the Next.js GitHub repository - your feedback and contributions are welcome!
## Deploy on Vercel
The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.
Check out our Next.js deployment documentation for more details.