- Detailed phase-by-phase implementation guide (0-13)
- Production-grade TypeScript + Prisma + React/Chakra stack
- Docker Compose setup for all services
- 100% backward compatibility guaranteed
- Comprehensive testing and migration strategy
Redirect Intelligence v2 - Detailed Implementation Plan
Based on redirect_intelligence_v2_plan.md and the current application state, this document provides a phase-by-phase implementation guide for upgrading to Redirect Intelligence v2 with Node/Express + Prisma (Postgres) + React/Chakra.
Current State Analysis
From comprehensive_app_documentation.md, the current application has:
- ✅ Express.js server with redirect tracking
- ✅ Rate limiting (100 req/hour/IP)
- ✅ SSL certificate analysis
- ✅ Basic frontend with dark/light mode
- ✅ API endpoints: /api/track, /api/v1/track
- ✅ Security warnings (loops, SSL downgrades)
- ✅ Response body truncation and metadata capture
Implementation Strategy
Backward Compatibility: All existing endpoints (/api/track, /api/v1/track) will be preserved with identical behavior.
Migration Approach: Gradual migration with feature flags to ensure zero downtime.
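Feature flags can stay lightweight: environment-driven gates around the new route modules, so the legacy endpoints keep serving traffic while v2 features roll out per environment. A minimal sketch (the helper module and flag names are assumptions, not part of the current app):
apps/api/src/lib/feature-flags.ts (hypothetical helper)
// Reads boolean flags from the environment so new v2 routes can be enabled
// per environment while the legacy endpoints keep their exact behavior.
export function isEnabled(flag: string): boolean {
  return process.env[`FEATURE_${flag.toUpperCase()}`] === 'true';
}

// Usage sketch inside apps/api/src/index.ts:
//   if (isEnabled('persisted_checks')) {
//     app.use('/api/v1/checks', checkRoutes);
//   }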
Phase 0: Repo & Env (Dockerized)
Goals
- Restructure project for TypeScript
- Add Docker Compose with all services
- Setup development environment
Files to Create/Modify
1. Project Structure
/
├── apps/
│ ├── api/ # Express.js API (TypeScript)
│ │ ├── src/
│ │ │ ├── index.ts
│ │ │ ├── routes/
│ │ │ ├── middleware/
│ │ │ ├── services/
│ │ │ └── types/
│ │ ├── Dockerfile
│ │ ├── package.json
│ │ └── tsconfig.json
│ ├── web/ # React frontend
│ │ ├── src/
│ │ │ ├── components/
│ │ │ ├── pages/
│ │ │ ├── hooks/
│ │ │ └── types/
│ │ ├── Dockerfile
│ │ ├── package.json
│ │ └── tsconfig.json
│ └── worker/ # BullMQ worker
│ ├── src/
│ ├── Dockerfile
│ └── package.json
├── packages/
│ ├── database/ # Prisma schema & migrations
│ │ ├── prisma/
│ │ └── package.json
│ └── shared/ # Shared types & utilities
│ ├── src/
│ └── package.json
├── docker-compose.yml
├── docker-compose.dev.yml
└── package.json # Root workspace
2. Docker Compose Configuration
docker-compose.yml
version: '3.8'
services:
postgres:
image: postgres:15
environment:
POSTGRES_DB: redirect_intelligence
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
ports:
- "5432:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
redis:
image: redis:7-alpine
ports:
- "6379:6379"
volumes:
- redis_data:/data
api:
build:
context: .
dockerfile: apps/api/Dockerfile
ports:
- "3333:3333"
environment:
- DATABASE_URL=postgresql://postgres:postgres@postgres:5432/redirect_intelligence
- REDIS_URL=redis://redis:6379
- NODE_ENV=development
depends_on:
- postgres
- redis
volumes:
- ./apps/api:/app
- /app/node_modules
web:
build:
context: .
dockerfile: apps/web/Dockerfile
ports:
- "3000:3000"
environment:
- REACT_APP_API_URL=http://localhost:3333
depends_on:
- api
volumes:
- ./apps/web:/app
- /app/node_modules
worker:
build:
context: .
dockerfile: apps/worker/Dockerfile
environment:
- DATABASE_URL=postgresql://postgres:postgres@postgres:5432/redirect_intelligence
- REDIS_URL=redis://redis:6379
depends_on:
- postgres
- redis
volumes:
- ./apps/worker:/app
- /app/node_modules
volumes:
postgres_data:
redis_data:
3. TypeScript Configuration
Root package.json
{
"name": "redirect-intelligence-v2",
"private": true,
"workspaces": [
"apps/*",
"packages/*"
],
"scripts": {
"dev": "docker-compose -f docker-compose.yml -f docker-compose.dev.yml up",
"build": "turbo run build",
"test": "turbo run test",
"lint": "turbo run lint",
"db:migrate": "cd packages/database && npx prisma migrate dev",
"db:seed": "cd packages/database && npx prisma db seed"
},
"devDependencies": {
"turbo": "^1.10.0",
"typescript": "^5.0.0",
"@types/node": "^20.0.0"
}
}
4. Migration Script
migrate-existing.ts
// Script to migrate current index.js logic to TypeScript structure
// Preserves all existing functionality while adding new structure
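The plan does not spell out the script's contents. One possible shape, assuming the legacy handlers are extracted verbatim into a module during migration (all names below are illustrative), is a thin adapter that mounts the unchanged routes on the new TypeScript app:
migrate-existing.ts (sketch only)
import type { Express } from 'express';
// Hypothetical module holding the handlers copied verbatim from index.js
import { legacyTrackHandler } from './legacy/track-handler';

// Mounts the legacy routes on the new TypeScript app with identical paths,
// so behavior stays the same while the project structure changes.
export function mountLegacyRoutes(app: Express): void {
  app.post('/api/track', legacyTrackHandler);
  app.post('/api/v1/track', legacyTrackHandler);
  app.get('/api/v1/track', legacyTrackHandler);
}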
Implementation Steps
- Create new directory structure
- Migrate existing index.js to TypeScript in apps/api/src/
- Preserve existing routes with identical behavior
- Setup Docker containers
- Add development scripts
- Test backward compatibility
Commit Message
feat(phase-0): setup Docker Compose with TypeScript structure
- Restructure project with apps/ and packages/
- Add Docker Compose for api, web, db, redis, worker
- Migrate existing Express.js logic to TypeScript
- Preserve all existing API endpoints and behavior
- Setup development environment with hot reload
Phase 1: Postgres + Prisma + Auth
Goals
- Add PostgreSQL with Prisma ORM
- Implement user authentication with Argon2
- Create database schema
- Add JWT-based session management
Files to Create/Modify
1. Database Schema
packages/database/prisma/schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model User {
id String @id @default(cuid())
email String @unique
name String
passwordHash String @map("password_hash")
createdAt DateTime @default(now()) @map("created_at")
lastLoginAt DateTime? @map("last_login_at")
memberships OrgMembership[]
auditLogs AuditLog[]
@@map("users")
}
model Organization {
id String @id @default(cuid())
name String
plan String @default("free")
createdAt DateTime @default(now()) @map("created_at")
memberships OrgMembership[]
projects Project[]
apiKeys ApiKey[]
auditLogs AuditLog[]
@@map("organizations")
}
model OrgMembership {
id String @id @default(cuid())
orgId String @map("org_id")
userId String @map("user_id")
role Role
organization Organization @relation(fields: [orgId], references: [id], onDelete: Cascade)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
@@unique([orgId, userId])
@@map("org_memberships")
}
model Project {
id String @id @default(cuid())
orgId String @map("org_id")
name String
settingsJson Json @map("settings_json") @default("{}")
createdAt DateTime @default(now()) @map("created_at")
organization Organization @relation(fields: [orgId], references: [id], onDelete: Cascade)
checks Check[]
bulkJobs BulkJob[]
@@map("projects")
}
model Check {
id String @id @default(cuid())
projectId String @map("project_id")
inputUrl String @map("input_url")
method String @default("GET")
headersJson Json @map("headers_json") @default("{}")
userAgent String? @map("user_agent")
startedAt DateTime @map("started_at")
finishedAt DateTime? @map("finished_at")
status CheckStatus
finalUrl String? @map("final_url")
totalTimeMs Int? @map("total_time_ms")
reportId String? @map("report_id")
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
hops Hop[]
sslInspections SslInspection[]
seoFlags SeoFlags?
securityFlags SecurityFlags?
reports Report[]
@@index([projectId, startedAt(sort: Desc)])
@@map("checks")
}
model Hop {
id String @id @default(cuid())
checkId String @map("check_id")
hopIndex Int @map("hop_index")
url String
scheme String?
statusCode Int? @map("status_code")
redirectType RedirectType @map("redirect_type")
latencyMs Int? @map("latency_ms")
contentType String? @map("content_type")
reason String?
responseHeadersJson Json @map("response_headers_json") @default("{}")
check Check @relation(fields: [checkId], references: [id], onDelete: Cascade)
@@index([checkId, hopIndex])
@@map("hops")
}
model SslInspection {
id String @id @default(cuid())
checkId String @map("check_id")
host String
validFrom DateTime? @map("valid_from")
validTo DateTime? @map("valid_to")
daysToExpiry Int? @map("days_to_expiry")
issuer String?
protocol String?
warningsJson Json @map("warnings_json") @default("[]")
check Check @relation(fields: [checkId], references: [id], onDelete: Cascade)
@@map("ssl_inspections")
}
model SeoFlags {
id String @id @default(cuid())
checkId String @unique @map("check_id")
robotsTxtStatus String? @map("robots_txt_status")
robotsTxtRulesJson Json @map("robots_txt_rules_json") @default("{}")
metaRobots String? @map("meta_robots")
canonicalUrl String? @map("canonical_url")
sitemapPresent Boolean @default(false) @map("sitemap_present")
noindex Boolean @default(false)
nofollow Boolean @default(false)
check Check @relation(fields: [checkId], references: [id], onDelete: Cascade)
@@map("seo_flags")
}
model SecurityFlags {
id String @id @default(cuid())
checkId String @unique @map("check_id")
safeBrowsingStatus String? @map("safe_browsing_status")
mixedContent MixedContent @map("mixed_content") @default(NONE)
httpsToHttp Boolean @map("https_to_http") @default(false)
check Check @relation(fields: [checkId], references: [id], onDelete: Cascade)
@@map("security_flags")
}
model Report {
id String @id @default(cuid())
checkId String @map("check_id")
markdownPath String? @map("markdown_path")
pdfPath String? @map("pdf_path")
createdAt DateTime @default(now()) @map("created_at")
check Check @relation(fields: [checkId], references: [id], onDelete: Cascade)
@@map("reports")
}
model BulkJob {
id String @id @default(cuid())
projectId String @map("project_id")
uploadPath String @map("upload_path")
status JobStatus
progressJson Json @map("progress_json") @default("{}")
createdAt DateTime @default(now()) @map("created_at")
completedAt DateTime? @map("completed_at")
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
@@map("bulk_jobs")
}
model ApiKey {
id String @id @default(cuid())
orgId String @map("org_id")
name String
tokenHash String @unique @map("token_hash")
permsJson Json @map("perms_json") @default("{}")
rateLimitQuota Int @map("rate_limit_quota") @default(1000)
createdAt DateTime @default(now()) @map("created_at")
organization Organization @relation(fields: [orgId], references: [id], onDelete: Cascade)
@@index([tokenHash])
@@map("api_keys")
}
model AuditLog {
id String @id @default(cuid())
orgId String @map("org_id")
actorUserId String? @map("actor_user_id")
action String
entity String
entityId String @map("entity_id")
metaJson Json @map("meta_json") @default("{}")
createdAt DateTime @default(now()) @map("created_at")
organization Organization @relation(fields: [orgId], references: [id], onDelete: Cascade)
actor User? @relation(fields: [actorUserId], references: [id], onDelete: SetNull)
@@map("audit_logs")
}
enum Role {
OWNER
ADMIN
MEMBER
}
enum CheckStatus {
OK
ERROR
TIMEOUT
LOOP
}
enum RedirectType {
HTTP_301
HTTP_302
HTTP_307
HTTP_308
META_REFRESH
JS
FINAL
OTHER
}
enum MixedContent {
NONE
PRESENT
FINAL_TO_HTTP
}
enum JobStatus {
QUEUED
RUNNING
DONE
ERROR
}
2. Authentication Service
apps/api/src/services/auth.service.ts
import argon2 from 'argon2';
import jwt from 'jsonwebtoken';
import { z } from 'zod';
import { prisma } from '@/lib/prisma';
const loginSchema = z.object({
email: z.string().email(),
password: z.string().min(8),
});
export class AuthService {
async hashPassword(password: string): Promise<string> {
return argon2.hash(password);
}
async verifyPassword(hash: string, password: string): Promise<boolean> {
return argon2.verify(hash, password);
}
async login(data: z.infer<typeof loginSchema>) {
const { email, password } = loginSchema.parse(data);
const user = await prisma.user.findUnique({
where: { email },
include: {
memberships: {
include: { organization: true }
}
}
});
if (!user || !await this.verifyPassword(user.passwordHash, password)) {
throw new Error('Invalid credentials');
}
await prisma.user.update({
where: { id: user.id },
data: { lastLoginAt: new Date() }
});
const token = jwt.sign(
{ userId: user.id, email: user.email },
process.env.JWT_SECRET!,
{ expiresIn: '7d' }
);
return { user, token };
}
async createUser(email: string, name: string, password: string) {
const existingUser = await prisma.user.findUnique({
where: { email }
});
if (existingUser) {
throw new Error('User already exists');
}
const passwordHash = await this.hashPassword(password);
return prisma.user.create({
data: {
email,
name,
passwordHash,
}
});
}
}
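The AuthService above and the middleware below both import a shared Prisma client from @/lib/prisma. That module is not shown in the plan; a minimal sketch (path assumed from the imports) could be:
apps/api/src/lib/prisma.ts
// Single shared PrismaClient instance so modules reuse one connection pool
// instead of each creating their own client.
import { PrismaClient } from '@prisma/client';

export const prisma = new PrismaClient();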
3. Authentication Middleware
apps/api/src/middleware/auth.middleware.ts
import { Request, Response, NextFunction } from 'express';
import jwt from 'jsonwebtoken';
import { prisma } from '@/lib/prisma';
export interface AuthenticatedRequest extends Request {
user?: {
id: string;
email: string;
memberships: Array<{
orgId: string;
role: string;
organization: { name: string; plan: string };
}>;
};
}
export const authenticateToken = async (
req: AuthenticatedRequest,
res: Response,
next: NextFunction
) => {
const authHeader = req.headers.authorization;
const token = req.cookies?.auth_token || (authHeader && authHeader.split(' ')[1]); // HttpOnly cookie (set at login) or Bearer TOKEN
if (!token) {
return res.status(401).json({ error: 'Access token required' });
}
try {
const decoded = jwt.verify(token, process.env.JWT_SECRET!) as {
userId: string;
email: string;
};
const user = await prisma.user.findUnique({
where: { id: decoded.userId },
include: {
memberships: {
include: { organization: true }
}
}
});
if (!user) {
return res.status(401).json({ error: 'User not found' });
}
req.user = {
id: user.id,
email: user.email,
memberships: user.memberships.map(m => ({
orgId: m.orgId,
role: m.role,
organization: {
name: m.organization.name,
plan: m.organization.plan
}
}))
};
next();
} catch (error) {
return res.status(403).json({ error: 'Invalid token' });
}
};
4. Auth Routes
apps/api/src/routes/auth.routes.ts
import express from 'express';
import { z } from 'zod';
import { AuthService } from '@/services/auth.service';
import { authenticateToken, AuthenticatedRequest } from '@/middleware/auth.middleware';
const router = express.Router();
const authService = new AuthService();
const loginSchema = z.object({
email: z.string().email(),
password: z.string().min(8),
});
const registerSchema = z.object({
email: z.string().email(),
name: z.string().min(2),
password: z.string().min(8),
});
// POST /api/v1/auth/login
router.post('/login', async (req, res) => {
try {
const { user, token } = await authService.login(req.body);
// Set HttpOnly cookie
res.cookie('auth_token', token, {
httpOnly: true,
secure: process.env.NODE_ENV === 'production',
sameSite: 'strict',
maxAge: 7 * 24 * 60 * 60 * 1000, // 7 days
});
res.json({
success: true,
user: {
id: user.id,
email: user.email,
name: user.name,
memberships: user.memberships
}
});
} catch (error) {
res.status(400).json({
success: false,
error: error instanceof Error ? error.message : 'Login failed'
});
}
});
// POST /api/v1/auth/register
router.post('/register', async (req, res) => {
try {
const { email, name, password } = registerSchema.parse(req.body);
const user = await authService.createUser(email, name, password);
res.status(201).json({
success: true,
user: {
id: user.id,
email: user.email,
name: user.name
}
});
} catch (error) {
res.status(400).json({
success: false,
error: error instanceof Error ? error.message : 'Registration failed'
});
}
});
// POST /api/v1/auth/logout
router.post('/logout', (req, res) => {
res.clearCookie('auth_token');
res.json({ success: true });
});
// GET /api/v1/auth/me
router.get('/me', authenticateToken, (req: AuthenticatedRequest, res) => {
res.json({
success: true,
user: req.user
});
});
export default router;
5. Updated Main Server
apps/api/src/index.ts (migrate from existing index.js)
import express from 'express';
import cors from 'cors';
import cookieParser from 'cookie-parser';
import rateLimit from 'express-rate-limit';
import { trackRedirects } from '@/services/redirect.service'; // Migrated from existing
import authRoutes from '@/routes/auth.routes';
import { prisma } from '@/lib/prisma';
const app = express();
const PORT = process.env.PORT || 3333;
// Middleware
app.use(cors({
origin: process.env.WEB_URL || 'http://localhost:3000',
credentials: true
}));
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
app.use(cookieParser());
// Rate limiting (preserve existing behavior)
const apiLimiter = rateLimit({
windowMs: 60 * 60 * 1000, // 1 hour
max: 100,
message: { error: 'Too many requests, please try again later.' }
});
// Routes
app.use('/api/v1/auth', authRoutes);
// PRESERVE EXISTING ENDPOINTS - Backward Compatibility
app.post('/api/track', apiLimiter, async (req, res) => {
// Exact same logic as before - no changes
// ... existing implementation
});
app.post('/api/v1/track', apiLimiter, async (req, res) => {
// Exact same logic as before - no changes
// ... existing implementation
});
app.get('/api/v1/track', apiLimiter, async (req, res) => {
// Exact same logic as before - no changes
// ... existing implementation
});
// Health check
app.get('/health', (req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.listen(PORT, () => {
console.log(`Server running on http://localhost:${PORT}`);
});
Implementation Steps
- Setup Prisma with PostgreSQL
- Create database schema and run migrations
- Implement authentication service with Argon2
- Add JWT middleware for protected routes
- Create auth routes (login, register, logout, me)
- Migrate existing server logic to TypeScript
- Ensure all existing endpoints work identically
- Add comprehensive tests
Testing Requirements
- Unit tests for auth service
- Integration tests for auth routes
- Backward compatibility tests for existing endpoints (see the sketch after this list)
- Database migration tests
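A backward-compatibility test could hit the legacy route through supertest and assert the response contract; a sketch, assuming Jest + supertest and that index.ts is refactored to export the Express app:
apps/api/tests/backward-compat.test.ts
import request from 'supertest';
import app from '../src/index'; // assumes the app instance is exported for testing

describe('legacy endpoints', () => {
  it('POST /api/track still responds with the pre-migration contract', async () => {
    const res = await request(app)
      .post('/api/track')
      .send({ url: 'https://example.com' });

    expect(res.status).toBe(200);
    // Field-level assertions would mirror the response shape documented in
    // comprehensive_app_documentation.md; omitted here to avoid guessing names.
  });
});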
Commit Message
feat(phase-1): add Postgres + Prisma + Auth with backward compatibility
- Add PostgreSQL database with Prisma ORM
- Implement user authentication with Argon2 password hashing
- Add JWT-based session management with HttpOnly cookies
- Create comprehensive database schema for all entities
- Migrate existing Express.js logic to TypeScript
- Preserve 100% backward compatibility for existing API endpoints
- Add auth routes: login, register, logout, me
- Include comprehensive test suite
Phase 2: Persisted Checks (Non-JS Chain)
Goals
- Persist redirect chain analysis to database
- Create new /api/v1/checks endpoint that stores history
- Maintain existing endpoints with identical behavior
- Add check retrieval and history endpoints
Files to Create/Modify
1. Enhanced Redirect Service
apps/api/src/services/redirect.service.ts (enhanced from existing)
import axios from 'axios';
import https from 'https';
import { z } from 'zod';
import { prisma } from '@/lib/prisma';
import { CheckStatus, RedirectType } from '@prisma/client';
const createCheckSchema = z.object({
inputUrl: z.string().url(),
method: z.enum(['GET', 'HEAD', 'POST']).default('GET'),
userAgent: z.string().optional(),
headers: z.record(z.string()).default({}),
projectId: z.string().optional(), // Optional for anonymous checks
});
export interface RedirectHop {
url: string;
statusCode?: number;
statusText?: string;
redirectType: RedirectType;
latencyMs: number;
contentType?: string;
reason?: string;
responseHeaders: Record<string, string>;
timestamp: Date;
}
export interface RedirectResult {
checkId?: string; // Only set when persisting
inputUrl: string;
finalUrl: string;
totalTimeMs: number;
status: CheckStatus;
hops: RedirectHop[];
metadata: {
redirectCount: number;
method: string;
userAgent?: string;
};
}
export class RedirectService {
// Existing trackRedirects logic preserved exactly for backward compatibility
async trackRedirectsLegacy(
url: string,
redirects: any[] = [],
options: any = {}
): Promise<any[]> {
// Keep exact existing implementation from index.js
// This ensures 100% backward compatibility
// ... existing logic
}
// New enhanced method that persists to database
async createCheck(
data: z.infer<typeof createCheckSchema>,
userId?: string
): Promise<RedirectResult> {
const validatedData = createCheckSchema.parse(data);
const startTime = Date.now();
// Create check record
const check = await prisma.check.create({
data: {
projectId: validatedData.projectId || await this.getDefaultProjectId(userId),
inputUrl: validatedData.inputUrl,
method: validatedData.method,
headersJson: validatedData.headers,
userAgent: validatedData.userAgent,
startedAt: new Date(),
status: CheckStatus.OK, // Will update later
},
});
try {
// Perform redirect analysis
const result = await this.analyzeRedirectChain(
validatedData.inputUrl,
validatedData.method,
validatedData.userAgent,
validatedData.headers
);
const totalTimeMs = Date.now() - startTime;
// Update check with results
await prisma.check.update({
where: { id: check.id },
data: {
finishedAt: new Date(),
finalUrl: result.finalUrl,
totalTimeMs,
status: result.status,
},
});
// Save hops
await this.saveHops(check.id, result.hops);
return {
checkId: check.id,
inputUrl: validatedData.inputUrl,
finalUrl: result.finalUrl,
totalTimeMs,
status: result.status,
hops: result.hops,
metadata: {
redirectCount: result.hops.length - 1,
method: validatedData.method,
userAgent: validatedData.userAgent,
},
};
} catch (error) {
// Update check with error status
await prisma.check.update({
where: { id: check.id },
data: {
finishedAt: new Date(),
status: CheckStatus.ERROR,
totalTimeMs: Date.now() - startTime,
},
});
throw error;
}
}
private async analyzeRedirectChain(
inputUrl: string,
method: string,
userAgent?: string,
headers: Record<string, string> = {}
): Promise<{
finalUrl: string;
status: CheckStatus;
hops: RedirectHop[];
}> {
const hops: RedirectHop[] = [];
const visitedUrls = new Set<string>();
let currentUrl = inputUrl;
let hopIndex = 0;
while (hopIndex < 20) { // Max 20 hops to prevent infinite loops
if (visitedUrls.has(currentUrl)) {
return {
finalUrl: currentUrl,
status: CheckStatus.LOOP,
hops,
};
}
visitedUrls.add(currentUrl);
const hopStartTime = Date.now();
try {
const config = {
method: hopIndex === 0 ? method : 'GET', // Use specified method only for first request
url: currentUrl,
maxRedirects: 0,
validateStatus: (status: number) => status >= 200 && status < 600,
timeout: 15000,
headers: {
...headers,
...(userAgent ? { 'User-Agent': userAgent } : {}),
},
httpsAgent: new https.Agent({
rejectUnauthorized: false,
}),
};
const response = await axios(config);
const latencyMs = Date.now() - hopStartTime;
const hop: RedirectHop = {
url: currentUrl,
statusCode: response.status,
statusText: response.statusText,
redirectType: this.determineRedirectType(response),
latencyMs,
contentType: response.headers['content-type'],
responseHeaders: response.headers,
timestamp: new Date(),
};
hops.push(hop);
// Check if this is a redirect
if (response.status >= 300 && response.status < 400 && response.headers.location) {
currentUrl = new URL(response.headers.location, currentUrl).href;
hopIndex++;
continue;
}
// Check for meta refresh
if (response.status === 200 && response.headers['content-type']?.includes('text/html')) {
const metaRedirect = this.parseMetaRefresh(response.data);
if (metaRedirect) {
currentUrl = new URL(metaRedirect, currentUrl).href;
hop.redirectType = RedirectType.META_REFRESH;
hopIndex++;
continue;
}
}
// Final destination reached
hop.redirectType = RedirectType.FINAL;
return {
finalUrl: currentUrl,
status: CheckStatus.OK,
hops,
};
} catch (error) {
hops.push({
url: currentUrl,
redirectType: RedirectType.OTHER,
latencyMs: Date.now() - hopStartTime,
responseHeaders: {},
timestamp: new Date(),
reason: error instanceof Error ? error.message : 'Unknown error',
});
return {
finalUrl: currentUrl,
status: CheckStatus.ERROR,
hops,
};
}
}
return {
finalUrl: currentUrl,
status: CheckStatus.TIMEOUT,
hops,
};
}
private determineRedirectType(response: any): RedirectType {
switch (response.status) {
case 301: return RedirectType.HTTP_301;
case 302: return RedirectType.HTTP_302;
case 307: return RedirectType.HTTP_307;
case 308: return RedirectType.HTTP_308;
default: return RedirectType.OTHER;
}
}
private parseMetaRefresh(html: string): string | null {
const metaRefreshRegex = /<meta[^>]*http-equiv=["']?refresh["']?[^>]*content=["']?[^"']*url=([^"';\s>]+)/i;
const match = html.match(metaRefreshRegex);
return match ? match[1] : null;
}
private async saveHops(checkId: string, hops: RedirectHop[]): Promise<void> {
await prisma.hop.createMany({
data: hops.map((hop, index) => ({
checkId,
hopIndex: index,
url: hop.url,
scheme: new URL(hop.url).protocol.replace(':', ''),
statusCode: hop.statusCode,
redirectType: hop.redirectType,
latencyMs: hop.latencyMs,
contentType: hop.contentType,
reason: hop.reason,
responseHeadersJson: hop.responseHeaders,
})),
});
}
private async getDefaultProjectId(userId?: string): Promise<string> {
if (!userId) {
// For anonymous checks, create a default project or use a system project
return 'anonymous';
}
// Get user's first organization's first project, or create one
const user = await prisma.user.findUnique({
where: { id: userId },
include: {
memberships: {
include: {
organization: {
include: { projects: true }
}
}
}
}
});
const firstOrg = user?.memberships[0]?.organization;
if (firstOrg?.projects[0]) {
return firstOrg.projects[0].id;
}
// Create default project
const defaultProject = await prisma.project.create({
data: {
name: 'Default Project',
orgId: firstOrg!.id,
},
});
return defaultProject.id;
}
async getCheck(checkId: string): Promise<any> {
return prisma.check.findUnique({
where: { id: checkId },
include: {
hops: { orderBy: { hopIndex: 'asc' } },
sslInspections: true,
seoFlags: true,
securityFlags: true,
project: {
include: { organization: true }
}
},
});
}
async getProjectChecks(projectId: string, options: {
page?: number;
limit?: number;
status?: CheckStatus;
} = {}): Promise<{
checks: any[];
total: number;
page: number;
limit: number;
}> {
const { page = 1, limit = 20, status } = options;
const skip = (page - 1) * limit;
const where = {
projectId,
...(status ? { status } : {}),
};
const [checks, total] = await Promise.all([
prisma.check.findMany({
where,
include: {
hops: { orderBy: { hopIndex: 'asc' } },
sslInspections: true,
seoFlags: true,
securityFlags: true,
},
orderBy: { startedAt: 'desc' },
skip,
take: limit,
}),
prisma.check.count({ where }),
]);
return { checks, total, page, limit };
}
}
2. Check Routes
apps/api/src/routes/checks.routes.ts
import express from 'express';
import { z } from 'zod';
import { RedirectService } from '@/services/redirect.service';
import { authenticateToken, AuthenticatedRequest } from '@/middleware/auth.middleware';
import rateLimit from 'express-rate-limit';
const router = express.Router();
const redirectService = new RedirectService();
const apiLimiter = rateLimit({
windowMs: 60 * 60 * 1000, // 1 hour
max: 100,
message: { error: 'Too many requests, please try again later.' }
});
const createCheckSchema = z.object({
inputUrl: z.string().url(),
method: z.enum(['GET', 'HEAD', 'POST']).default('GET'),
userAgent: z.string().optional(),
headers: z.record(z.string()).default({}),
projectId: z.string().optional(),
});
// POST /api/v1/checks - Create new check with persistence
router.post('/', apiLimiter, async (req: AuthenticatedRequest, res) => {
try {
const result = await redirectService.createCheck(req.body, req.user?.id);
res.json({
success: true,
status: 200,
data: result,
});
} catch (error) {
console.error('Error creating check:', error);
res.status(500).json({
success: false,
status: 500,
error: error instanceof Error ? error.message : 'Failed to create check',
});
}
});
// GET /api/v1/checks/:id - Get specific check with full details
router.get('/:id', async (req, res) => {
try {
const check = await redirectService.getCheck(req.params.id);
if (!check) {
return res.status(404).json({
success: false,
status: 404,
error: 'Check not found',
});
}
res.json({
success: true,
status: 200,
data: check,
});
} catch (error) {
console.error('Error fetching check:', error);
res.status(500).json({
success: false,
status: 500,
error: 'Failed to fetch check',
});
}
});
export default router;
3. Project Routes
apps/api/src/routes/projects.routes.ts
import express from 'express';
import { z } from 'zod';
import { RedirectService } from '@/services/redirect.service';
import { authenticateToken, AuthenticatedRequest } from '@/middleware/auth.middleware';
const router = express.Router();
const redirectService = new RedirectService();
const getChecksSchema = z.object({
page: z.coerce.number().min(1).default(1),
limit: z.coerce.number().min(1).max(100).default(20),
status: z.enum(['OK', 'ERROR', 'TIMEOUT', 'LOOP']).optional(),
});
// GET /api/v1/projects/:id/checks - Get project check history
router.get('/:id/checks', authenticateToken, async (req: AuthenticatedRequest, res) => {
try {
const { page, limit, status } = getChecksSchema.parse(req.query);
const result = await redirectService.getProjectChecks(req.params.id, {
page,
limit,
status: status as any,
});
res.json({
success: true,
status: 200,
data: result,
});
} catch (error) {
console.error('Error fetching project checks:', error);
res.status(500).json({
success: false,
status: 500,
error: 'Failed to fetch project checks',
});
}
});
export default router;
4. Updated Main Server
apps/api/src/index.ts (add new routes)
// ... existing imports and setup
import checkRoutes from '@/routes/checks.routes';
import projectRoutes from '@/routes/projects.routes';
// ... existing middleware
// New routes
app.use('/api/v1/checks', checkRoutes);
app.use('/api/v1/projects', projectRoutes);
// ... existing routes (preserve exactly)
Implementation Steps
- Create enhanced RedirectService with database persistence
- Implement check creation and retrieval endpoints
- Add project check history endpoint
- Ensure backward compatibility with existing endpoints
- Add comprehensive validation with Zod
- Create database migration for check-related tables
- Add tests for new functionality
Testing Requirements
- Unit tests for RedirectService methods
- Integration tests for new API endpoints
- Backward compatibility tests
- Database persistence tests
- Performance tests for redirect analysis
Commit Message
feat(phase-2): add persisted checks with non-JS redirect chain analysis
- Create new /api/v1/checks endpoint that persists redirect analysis
- Implement enhanced RedirectService with database storage
- Add check retrieval and project history endpoints
- Preserve 100% backward compatibility with existing endpoints
- Add comprehensive validation with Zod schemas
- Support meta refresh redirect detection
- Include project-based check organization
- Add pagination and filtering for check history
Phase 3: SSL/SEO/Security Flags
Goals
- Add SSL certificate inspection and storage
- Implement SEO analysis (robots.txt, meta tags, canonical URLs)
- Add security flags (mixed content, safe browsing)
- Enhance redirect analysis with comprehensive metadata
Files to Create/Modify
1. SSL Inspection Service
apps/api/src/services/ssl.service.ts
import https from 'https';
import tls from 'tls';
import { z } from 'zod';
import { prisma } from '@/lib/prisma';
export interface SslCertificateInfo {
host: string;
validFrom?: Date;
validTo?: Date;
daysToExpiry?: number;
issuer?: string;
subject?: string;
protocol?: string;
valid: boolean;
warnings: string[];
}
export class SslService {
async inspectSslCertificate(url: string): Promise<SslCertificateInfo | null> {
try {
const urlObj = new URL(url);
if (urlObj.protocol !== 'https:') {
return null;
}
return new Promise((resolve, reject) => {
const options = {
host: urlObj.hostname,
port: urlObj.port ? Number(urlObj.port) : 443, // URL.port is a string; tls.connect expects a number
rejectUnauthorized: false,
timeout: 10000,
};
const socket = tls.connect(options, () => {
try {
const cert = socket.getPeerCertificate(true);
const now = new Date();
const validFrom = new Date(cert.valid_from);
const validTo = new Date(cert.valid_to);
const daysToExpiry = Math.floor((validTo.getTime() - now.getTime()) / (1000 * 60 * 60 * 24));
const warnings: string[] = [];
// Check certificate validity
if (now < validFrom) {
warnings.push('Certificate not yet valid');
}
if (now > validTo) {
warnings.push('Certificate expired');
}
if (daysToExpiry <= 30 && daysToExpiry > 0) {
warnings.push(`Certificate expires in ${daysToExpiry} days`);
}
if (!socket.authorized) {
warnings.push('Certificate authorization failed');
}
const result: SslCertificateInfo = {
host: urlObj.hostname,
validFrom,
validTo,
daysToExpiry,
issuer: this.formatCertificateName(cert.issuer),
subject: this.formatCertificateName(cert.subject),
protocol: socket.getProtocol(),
valid: socket.authorized && now >= validFrom && now <= validTo,
warnings,
};
socket.end();
resolve(result);
} catch (error) {
socket.end();
reject(error);
}
});
socket.on('error', (error) => {
reject(error);
});
socket.on('timeout', () => {
socket.end();
reject(new Error('SSL connection timeout'));
});
});
} catch (error) {
console.error('SSL inspection error:', error);
return null;
}
}
private formatCertificateName(certName: any): string {
if (typeof certName === 'string') return certName;
const parts = [];
if (certName.CN) parts.push(`CN=${certName.CN}`);
if (certName.O) parts.push(`O=${certName.O}`);
if (certName.C) parts.push(`C=${certName.C}`);
return parts.join(', ');
}
async saveSslInspection(checkId: string, sslInfo: SslCertificateInfo): Promise<void> {
await prisma.sslInspection.create({
data: {
checkId,
host: sslInfo.host,
validFrom: sslInfo.validFrom,
validTo: sslInfo.validTo,
daysToExpiry: sslInfo.daysToExpiry,
issuer: sslInfo.issuer,
protocol: sslInfo.protocol,
warningsJson: sslInfo.warnings,
},
});
}
}
2. SEO Analysis Service
apps/api/src/services/seo.service.ts
import axios from 'axios';
import { JSDOM } from 'jsdom';
import { z } from 'zod';
import { prisma } from '@/lib/prisma';
export interface SeoAnalysis {
robotsTxtStatus?: string;
robotsTxtRules?: Record<string, any>;
metaRobots?: string;
canonicalUrl?: string;
sitemapPresent: boolean;
noindex: boolean;
nofollow: boolean;
title?: string;
description?: string;
openGraphData?: Record<string, string>;
}
export class SeoService {
async analyzeSeo(finalUrl: string, htmlContent?: string): Promise<SeoAnalysis> {
const analysis: SeoAnalysis = {
sitemapPresent: false,
noindex: false,
nofollow: false,
};
try {
const urlObj = new URL(finalUrl);
const baseUrl = `${urlObj.protocol}//${urlObj.host}`;
// Analyze robots.txt
const robotsAnalysis = await this.analyzeRobotsTxt(baseUrl);
analysis.robotsTxtStatus = robotsAnalysis.status;
analysis.robotsTxtRules = robotsAnalysis.rules;
analysis.sitemapPresent = robotsAnalysis.sitemapPresent;
// Analyze HTML if available
if (htmlContent) {
const htmlAnalysis = this.analyzeHtml(htmlContent, finalUrl);
Object.assign(analysis, htmlAnalysis);
}
return analysis;
} catch (error) {
console.error('SEO analysis error:', error);
return analysis;
}
}
private async analyzeRobotsTxt(baseUrl: string): Promise<{
status: string;
rules: Record<string, any>;
sitemapPresent: boolean;
}> {
try {
const robotsUrl = `${baseUrl}/robots.txt`;
const response = await axios.get(robotsUrl, {
timeout: 5000,
validateStatus: (status) => status < 500,
});
if (response.status === 200) {
const robotsTxt = response.data;
const rules = this.parseRobotsTxt(robotsTxt);
const sitemapPresent = robotsTxt.toLowerCase().includes('sitemap:');
return {
status: 'found',
rules,
sitemapPresent,
};
} else {
return {
status: `not_found_${response.status}`,
rules: {},
sitemapPresent: false,
};
}
} catch (error) {
return {
status: 'error',
rules: {},
sitemapPresent: false,
};
}
}
private parseRobotsTxt(content: string): Record<string, any> {
const rules: Record<string, any> = {
userAgents: {},
sitemaps: [],
};
const lines = content.split('\n');
let currentUserAgent = '*';
for (const line of lines) {
const trimmedLine = line.trim();
if (!trimmedLine || trimmedLine.startsWith('#')) continue;
const [directive, ...valueParts] = trimmedLine.split(':');
const value = valueParts.join(':').trim();
switch (directive.toLowerCase()) {
case 'user-agent':
currentUserAgent = value;
if (!rules.userAgents[currentUserAgent]) {
rules.userAgents[currentUserAgent] = {
allow: [],
disallow: [],
};
}
break;
case 'allow':
if (!rules.userAgents[currentUserAgent]) {
rules.userAgents[currentUserAgent] = { allow: [], disallow: [] };
}
rules.userAgents[currentUserAgent].allow.push(value);
break;
case 'disallow':
if (!rules.userAgents[currentUserAgent]) {
rules.userAgents[currentUserAgent] = { allow: [], disallow: [] };
}
rules.userAgents[currentUserAgent].disallow.push(value);
break;
case 'sitemap':
rules.sitemaps.push(value);
break;
}
}
return rules;
}
private analyzeHtml(htmlContent: string, finalUrl: string): Partial<SeoAnalysis> {
try {
const dom = new JSDOM(htmlContent);
const document = dom.window.document;
const analysis: Partial<SeoAnalysis> = {};
// Meta robots
const metaRobots = document.querySelector('meta[name="robots"]');
if (metaRobots) {
const content = metaRobots.getAttribute('content')?.toLowerCase() || '';
analysis.metaRobots = content;
analysis.noindex = content.includes('noindex');
analysis.nofollow = content.includes('nofollow');
}
// Canonical URL
const canonicalLink = document.querySelector('link[rel="canonical"]');
if (canonicalLink) {
analysis.canonicalUrl = canonicalLink.getAttribute('href') || undefined;
}
// Title
const titleElement = document.querySelector('title');
if (titleElement) {
analysis.title = titleElement.textContent || undefined;
}
// Meta description
const metaDescription = document.querySelector('meta[name="description"]');
if (metaDescription) {
analysis.description = metaDescription.getAttribute('content') || undefined;
}
// Open Graph data
const ogTags = document.querySelectorAll('meta[property^="og:"]');
if (ogTags.length > 0) {
analysis.openGraphData = {};
ogTags.forEach((tag) => {
const property = tag.getAttribute('property');
const content = tag.getAttribute('content');
if (property && content) {
analysis.openGraphData![property] = content;
}
});
}
return analysis;
} catch (error) {
console.error('HTML analysis error:', error);
return {};
}
}
async saveSeoFlags(checkId: string, analysis: SeoAnalysis): Promise<void> {
await prisma.seoFlags.create({
data: {
checkId,
robotsTxtStatus: analysis.robotsTxtStatus,
robotsTxtRulesJson: analysis.robotsTxtRules || {},
metaRobots: analysis.metaRobots,
canonicalUrl: analysis.canonicalUrl,
sitemapPresent: analysis.sitemapPresent,
noindex: analysis.noindex,
nofollow: analysis.nofollow,
},
});
}
}
3. Security Analysis Service
apps/api/src/services/security.service.ts
import axios from 'axios';
import { JSDOM } from 'jsdom';
import { z } from 'zod';
import { prisma } from '@/lib/prisma';
import { MixedContent } from '@prisma/client';
export interface SecurityAnalysis {
safeBrowsingStatus?: string;
mixedContent: MixedContent;
httpsToHttp: boolean;
insecureResources: string[];
securityHeaders: Record<string, string>;
}
export class SecurityService {
async analyzeSecurity(
redirectChain: Array<{ url: string; statusCode?: number }>,
finalHtmlContent?: string
): Promise<SecurityAnalysis> {
const analysis: SecurityAnalysis = {
mixedContent: MixedContent.NONE,
httpsToHttp: false,
insecureResources: [],
securityHeaders: {},
};
try {
// Check for HTTPS to HTTP downgrades
analysis.httpsToHttp = this.detectHttpsDowngrade(redirectChain);
// Analyze mixed content if we have HTML
if (finalHtmlContent) {
const mixedContentAnalysis = this.analyzeMixedContent(finalHtmlContent);
analysis.mixedContent = mixedContentAnalysis.mixedContent;
analysis.insecureResources = mixedContentAnalysis.insecureResources;
}
// Check final URL mixed content status
if (analysis.httpsToHttp) {
analysis.mixedContent = MixedContent.FINAL_TO_HTTP;
} else if (analysis.insecureResources.length > 0) {
analysis.mixedContent = MixedContent.PRESENT;
}
// Optional: Google Safe Browsing API check (requires API key)
if (process.env.GOOGLE_SAFE_BROWSING_API_KEY) {
analysis.safeBrowsingStatus = await this.checkSafeBrowsing(
redirectChain[redirectChain.length - 1]?.url
);
}
return analysis;
} catch (error) {
console.error('Security analysis error:', error);
return analysis;
}
}
private detectHttpsDowngrade(redirectChain: Array<{ url: string }>): boolean {
for (let i = 1; i < redirectChain.length; i++) {
const prevUrl = redirectChain[i - 1].url;
const currUrl = redirectChain[i].url;
if (prevUrl.startsWith('https://') && currUrl.startsWith('http://')) {
return true;
}
}
return false;
}
private analyzeMixedContent(htmlContent: string): {
mixedContent: MixedContent;
insecureResources: string[];
} {
try {
const dom = new JSDOM(htmlContent);
const document = dom.window.document;
const insecureResources: string[] = [];
// Check for insecure resources
const resourceSelectors = [
'img[src^="http://"]',
'script[src^="http://"]',
'link[href^="http://"]',
'iframe[src^="http://"]',
'embed[src^="http://"]',
'object[data^="http://"]',
];
resourceSelectors.forEach(selector => {
const elements = document.querySelectorAll(selector);
elements.forEach(element => {
const url = element.getAttribute('src') || element.getAttribute('href') || element.getAttribute('data');
if (url && url.startsWith('http://')) {
insecureResources.push(url);
}
});
});
return {
mixedContent: insecureResources.length > 0 ? MixedContent.PRESENT : MixedContent.NONE,
insecureResources,
};
} catch (error) {
console.error('Mixed content analysis error:', error);
return {
mixedContent: MixedContent.NONE,
insecureResources: [],
};
}
}
private async checkSafeBrowsing(url: string): Promise<string> {
try {
const apiKey = process.env.GOOGLE_SAFE_BROWSING_API_KEY;
if (!apiKey) return 'not_checked';
const response = await axios.post(
`https://safebrowsing.googleapis.com/v4/threatMatches:find?key=${apiKey}`,
{
client: {
clientId: 'redirect-intelligence',
clientVersion: '1.0',
},
threatInfo: {
threatTypes: ['MALWARE', 'SOCIAL_ENGINEERING', 'UNWANTED_SOFTWARE'],
platformTypes: ['ANY_PLATFORM'],
threatEntryTypes: ['URL'],
threatEntries: [{ url }],
},
},
{ timeout: 5000 }
);
return response.data.matches ? 'unsafe' : 'safe';
} catch (error) {
console.error('Safe browsing check error:', error);
return 'error';
}
}
async saveSecurityFlags(checkId: string, analysis: SecurityAnalysis): Promise<void> {
await prisma.securityFlags.create({
data: {
checkId,
safeBrowsingStatus: analysis.safeBrowsingStatus,
mixedContent: analysis.mixedContent,
httpsToHttp: analysis.httpsToHttp,
},
});
}
}
4. Enhanced Redirect Service
apps/api/src/services/redirect.service.ts (updated)
// ... existing imports
import { SslService } from './ssl.service';
import { SeoService } from './seo.service';
import { SecurityService } from './security.service';
export class RedirectService {
private sslService = new SslService();
private seoService = new SeoService();
private securityService = new SecurityService();
// ... existing methods
// Enhanced createCheck method
async createCheck(
data: z.infer<typeof createCheckSchema>,
userId?: string
): Promise<RedirectResult> {
const validatedData = createCheckSchema.parse(data);
const startTime = Date.now();
// Create check record
const check = await prisma.check.create({
data: {
projectId: validatedData.projectId || await this.getDefaultProjectId(userId),
inputUrl: validatedData.inputUrl,
method: validatedData.method,
headersJson: validatedData.headers,
userAgent: validatedData.userAgent,
startedAt: new Date(),
status: CheckStatus.OK,
},
});
try {
// Perform redirect analysis
const result = await this.analyzeRedirectChain(
validatedData.inputUrl,
validatedData.method,
validatedData.userAgent,
validatedData.headers
);
const totalTimeMs = Date.now() - startTime;
// Update check with results
await prisma.check.update({
where: { id: check.id },
data: {
finishedAt: new Date(),
finalUrl: result.finalUrl,
totalTimeMs,
status: result.status,
},
});
// Save hops
await this.saveHops(check.id, result.hops);
// Perform SSL analysis on HTTPS URLs
const httpsUrls = result.hops
.map(hop => hop.url)
.filter(url => url.startsWith('https://'));
for (const url of [...new Set(httpsUrls)]) {
const sslInfo = await this.sslService.inspectSslCertificate(url);
if (sslInfo) {
await this.sslService.saveSslInspection(check.id, sslInfo);
}
}
// Perform SEO analysis
const finalHop = result.hops[result.hops.length - 1];
let htmlContent: string | undefined;
if (finalHop?.contentType?.includes('text/html')) {
// Fetch final URL content for SEO analysis
try {
const response = await axios.get(result.finalUrl, {
timeout: 10000,
headers: { 'User-Agent': validatedData.userAgent || 'RedirectIntelligence/1.0' }
});
htmlContent = response.data;
} catch (error) {
console.warn('Failed to fetch final URL content for SEO analysis:', error);
}
}
const seoAnalysis = await this.seoService.analyzeSeo(result.finalUrl, htmlContent);
await this.seoService.saveSeoFlags(check.id, seoAnalysis);
// Perform security analysis
const securityAnalysis = await this.securityService.analyzeSecurity(
result.hops.map(hop => ({ url: hop.url, statusCode: hop.statusCode })),
htmlContent
);
await this.securityService.saveSecurityFlags(check.id, securityAnalysis);
return {
checkId: check.id,
inputUrl: validatedData.inputUrl,
finalUrl: result.finalUrl,
totalTimeMs,
status: result.status,
hops: result.hops,
metadata: {
redirectCount: result.hops.length - 1,
method: validatedData.method,
userAgent: validatedData.userAgent,
},
};
} catch (error) {
// Update check with error status
await prisma.check.update({
where: { id: check.id },
data: {
finishedAt: new Date(),
status: CheckStatus.ERROR,
totalTimeMs: Date.now() - startTime,
},
});
throw error;
}
}
// ... rest of existing methods
}
Implementation Steps
- Create SSL inspection service with certificate analysis
- Implement SEO analysis service (robots.txt, meta tags, canonical)
- Add security analysis service (mixed content, safe browsing)
- Enhance redirect service to include all analyses
- Update database schema with new flag tables
- Add comprehensive tests for all analysis services
- Update API responses to include new flag data
Testing Requirements
- Unit tests for SSL, SEO, and Security services
- Integration tests with real URLs
- Mock tests for external API calls such as Safe Browsing (see the sketch after this list)
- Performance tests for analysis pipeline
- Edge case testing (invalid certificates, missing robots.txt, etc.)
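For the external-call mocks, the Safe Browsing check can be exercised by stubbing axios; a sketch assuming Jest with ts-jest:
apps/api/tests/security.service.test.ts
import axios from 'axios';
import { SecurityService } from '../src/services/security.service';

jest.mock('axios');
const mockedAxios = axios as jest.Mocked<typeof axios>;

describe('SecurityService safe browsing', () => {
  it('flags a URL as unsafe when the API returns matches', async () => {
    process.env.GOOGLE_SAFE_BROWSING_API_KEY = 'test-key';
    mockedAxios.post.mockResolvedValue({ data: { matches: [{ threatType: 'MALWARE' }] } });

    const result = await new SecurityService().analyzeSecurity([{ url: 'https://bad.example' }]);

    expect(result.safeBrowsingStatus).toBe('unsafe');
  });
});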
Commit Message
feat(phase-3): add comprehensive SSL/SEO/Security analysis
- Implement SSL certificate inspection with expiry warnings
- Add SEO analysis: robots.txt parsing, meta tags, canonical URLs
- Include security analysis: mixed content detection, HTTPS downgrades
- Optional Google Safe Browsing API integration
- Enhance redirect service with comprehensive metadata collection
- Add database storage for all analysis flags
- Include insecure resource detection in HTML content
- Support certificate chain validation and protocol analysis
Phases 4-13 Summary
Due to length constraints, here's a summary of the remaining phases:
Phase 4: Chakra UI Upgrade Complete
- Migrate from current frontend to React + Chakra UI
- Implement app shell with sidebar navigation
- Create responsive check detail pages with Mermaid diagrams
- Add dark/light mode toggle with Chakra theming
Phase 5: Exports (MD & PDF)
- Implement Markdown report generation with templates
- Add PDF export using Puppeteer with embedded Mermaid
- Create export endpoints and file management
Phase 6: Bulk CSV + Worker
- Implement BullMQ worker for bulk URL processing (a worker sketch follows this list)
- Add CSV upload and batch job management
- Create job progress tracking and result downloads
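A minimal worker for this phase might look like the following sketch (queue name, job payload shape, and reuse of RedirectService are assumptions):
apps/worker/src/index.ts (sketch)
import { Worker } from 'bullmq';
import IORedis from 'ioredis';

const connection = new IORedis(process.env.REDIS_URL ?? 'redis://localhost:6379', {
  maxRetriesPerRequest: null, // required by BullMQ workers
});

// Each job carries one URL from the uploaded CSV plus its bulk job id.
const worker = new Worker(
  'bulk-checks',
  async (job) => {
    const { url, bulkJobId } = job.data as { url: string; bulkJobId: string };
    // The real processor would call RedirectService.createCheck() and update
    // BulkJob.progressJson as rows complete.
    console.log(`processing ${url} for bulk job ${bulkJobId}`);
  },
  { connection, concurrency: 5 }
);

worker.on('failed', (job, err) => {
  console.error(`job ${job?.id} failed:`, err.message);
});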
Phase 7: Request Options UI
- Enhanced frontend for custom headers and options
- Advanced user agent selection
- Request method configuration
Phase 8: API Keys + Public API + Quotas
- Implement API key authentication (a middleware sketch follows this list)
- Add organization-based rate limiting
- Create public API endpoints with key authentication
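Key authentication could hash the presented token and look it up against ApiKey.tokenHash from the Phase 1 schema; a sketch with an assumed header name and hashing scheme:
apps/api/src/middleware/api-key.middleware.ts (sketch)
import { createHash } from 'crypto';
import { Request, Response, NextFunction } from 'express';
import { prisma } from '@/lib/prisma';

// Only SHA-256 hashes are stored, so the raw key from the x-api-key header
// is hashed before the lookup.
export async function authenticateApiKey(req: Request, res: Response, next: NextFunction) {
  const rawKey = req.header('x-api-key');
  if (!rawKey) {
    return res.status(401).json({ error: 'API key required' });
  }

  const tokenHash = createHash('sha256').update(rawKey).digest('hex');
  const apiKey = await prisma.apiKey.findUnique({ where: { tokenHash } });
  if (!apiKey) {
    return res.status(403).json({ error: 'Invalid API key' });
  }

  (req as any).apiKeyOrgId = apiKey.orgId; // for downstream quota checks
  next();
}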
Phase 9: Optional JS Redirects
- Integrate Playwright for JavaScript redirect detection (a sketch follows this list)
- Add browser automation for SPA redirects
- Optional JS analysis with timeout controls
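A Playwright-based detector could record every main-frame navigation to capture window.location changes that plain HTTP clients miss; a sketch under assumed names:
apps/api/src/services/js-redirect.service.ts (sketch)
import { chromium } from 'playwright';

// Loads the URL in a headless browser and records each main-frame navigation,
// including client-side redirects triggered by JavaScript.
export async function traceJsRedirects(startUrl: string, timeoutMs = 15000): Promise<string[]> {
  const browser = await chromium.launch({ headless: true });
  const page = await browser.newPage();
  const visited: string[] = [];

  page.on('framenavigated', (frame) => {
    if (frame === page.mainFrame()) {
      visited.push(frame.url());
    }
  });

  try {
    await page.goto(startUrl, { waitUntil: 'networkidle', timeout: timeoutMs });
  } finally {
    await browser.close();
  }
  return visited;
}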
Phase 10: Monitoring & Alerts
- Add uptime monitoring for URLs
- Implement alert system for status changes
- Create monitoring dashboards
Phase 11: Admin Panel
- User and organization management
- System configuration and monitoring
- Audit log viewing and analytics
Phase 12: Billing (Stripe)
- Implement Stripe integration
- Add subscription plans and usage limits
- Billing dashboard and payment management
Phase 13: Hardening & Perf
- Security audit and penetration testing
- Performance optimization and caching
- Production deployment and monitoring setup
Each phase would follow the same detailed structure as shown in phases 0-3, with specific file implementations, testing requirements, and commit messages.
Next Steps
- Start with Phase 0 to establish the foundation
- Follow phases sequentially to ensure stability
- Maintain backward compatibility throughout
- Implement comprehensive testing at each phase
- Use feature flags for gradual rollout
This implementation plan ensures a systematic upgrade while preserving all existing functionality and providing a clear path to the enhanced v2 system.