A local-first task management system for product and engineering teams. With AI-assisted refactoring, an 8-year-old legacy project was upgraded into a modern full-stack application built with Vite, Vue 3, Bun, and Hono, addressing the bloat, sluggishness, and weak scheduling found in traditional task tools. Keywords: task management, AI refactoring, local deployment.
The technical specification snapshot highlights a modern local-first stack
| Parameter | Details |
|---|---|
| Project Type | Local-first full-stack task management system |
| Frontend Language | TypeScript |
| Backend Language | TypeScript |
| Frontend Framework | Vue 3 + Vite + Pinia |
| Backend Framework | Bun + Hono |
| Data Storage | Official MongoDB driver |
| Communication Protocol | HTTP / WebSocket |
| Deployment Method | One-click local deployment with Docker |
| Code Organization | pnpm workspaces monorepo |
| Core Dependencies | Vue 3, Pinia, Vite, Bun, Hono, MongoDB |
This project redefines the minimum viable shape of personal task management
The original project came from a task manager the author built 8 years ago. It initially handled basic CRUD, but it could not support a production-grade product. With help from AI, the legacy system was refactored into an independent product codenamed age35. The goal was not to build a large, all-in-one collaboration platform, but a task system that is fast, lightweight, and sufficient for real work.
AI Visual Insight: This image shows the product launch visual and its core value proposition. It emphasizes local deployment, real-time synchronization across multiple views, and a modern full stack, reinforcing the positioning of a legacy project upgraded into a product with AI.
The author’s complaints about traditional tools were specific: bloated features, slow cloud response times, weak scheduling capabilities, and fragmented personal context. Those pain points shaped the product direction: keep only high-frequency features, and turn each task card into a container for aggregated context.
A minimal task model matters more than a complex workflow
The system ultimately keeps only five fields: title, content, status, schedule, and note. This design is critical. It avoids the field sprawl common in workflow systems and creates a much denser context layer for AI to read and understand task data.
```typescript
export interface Task {
  title: string;                                   // Task title that describes the goal
  content: string;                                 // Context such as PRDs, design drafts, and solutions
  status: 'todo' | 'doing' | 'done';               // Status drives the global workflow
  schedule: Array<{ start: string; end: string }>; // Supports multiple schedule segments
  note?: string;                                   // Additional notes
}
```
This type definition reflects the product’s core principles: fewer fields, higher expressiveness, and easier sharing.
Real-time synchronization across multiple views is the most valuable engineering design in the system
This project is not just a Kanban board. It builds a unified data source around task status. Individuals, team leads, and project managers see different views, but all of them are backed by the same task data. This avoids duplicate maintenance and prevents state drift.
AI Visual Insight: This image shows the main product interface, centered on task cards and a multi-column layout. It indicates that the system uses Kanban-style interactions to organize task states while emphasizing a lightweight, focused, and highly readable information layout.
AI Visual Insight: This image shows how each task card carries multiple document links and contextual references, demonstrating that a task entity stores more than status and also acts as an aggregation entry point for requirements, design, solutions, and prototypes.
A single task dataset supports three management perspectives
The personal view emphasizes execution efficiency, the lead view emphasizes team member status, and the project view emphasizes overall progress. By managing a unified status flow, the author allows different views to be assembled freely instead of maintaining separate task copies for each perspective.
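The idea of assembling views on top of one dataset can be sketched with plain functions over the five-field model. This is an illustrative example, not the project's actual code: `boardView` and `progressView` are hypothetical names, and the sketch stays within the documented fields.

```typescript
// Illustrative sketch: several views derived from one shared task array.
type Status = 'todo' | 'doing' | 'done';

interface Task {
  title: string;
  content: string;
  status: Status;
  schedule: Array<{ start: string; end: string }>;
  note?: string;
}

// Personal board view: group tasks by status column.
function boardView(tasks: Task[]): Record<Status, Task[]> {
  const columns: Record<Status, Task[]> = { todo: [], doing: [], done: [] };
  for (const task of tasks) columns[task.status].push(task);
  return columns;
}

// Project view: overall progress as the share of completed tasks.
function progressView(tasks: Task[]): number {
  if (tasks.length === 0) return 0;
  return tasks.filter(t => t.status === 'done').length / tasks.length;
}
```

Because each view is a pure function of the same array, adding a new perspective is a new projection, not a new copy of the data.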
AI Visual Insight: This image shows a personal task board with a classic status-column layout, making it easy for developers to manage individual workflows across To Do, In Progress, and Done.
AI Visual Insight: This image shows a personal timeline view that maps tasks into a linear schedule by time range. It is useful for identifying fragmented work hours, conflicting time slots, and phase-based workload density.
AI Visual Insight: This image shows a calendar view, indicating that the task system supports date-based aggregation and is well suited for reviewing weekly or monthly cadence and task distribution around holidays.
AI Visual Insight: This image shows a team management board view that can aggregate by member or task status, highlighting a team lead’s ability to observe execution progress across the team.
AI Visual Insight: This image shows a project manager’s perspective, demonstrating that the system supports aggregating task status at the project level to track progress across multiple execution units.
AI Visual Insight: This animated image shows real-time synchronization across multiple views after a task status change, demonstrating that the system has a unified state source and a real-time update mechanism that helps reduce synchronization overhead.
```typescript
const updateTaskStatus = (taskId: string, status: Task['status']) => {
  store.patch(taskId, { status }); // Update the status in the single source of truth
  syncViews(taskId);               // Broadcast the change to all views
};
```
This logic shows that the system uses a single source of truth to drive synchronization across multiple views instead of copying data into multiple places.
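A minimal sketch of what such a single source of truth could look like in memory. The `TaskStore` class below is illustrative, not the project's implementation; over the network this fan-out role would fall to the WebSocket channel mentioned in the spec table.

```typescript
// Illustrative sketch: one authoritative status map that fans out
// every change to all subscribed views.
type Status = 'todo' | 'doing' | 'done';
type Listener = (taskId: string, status: Status) => void;

class TaskStore {
  private statuses = new Map<string, Status>();
  private listeners: Listener[] = [];

  // Each view registers once and then receives every change.
  subscribe(listener: Listener): void {
    this.listeners.push(listener);
  }

  patch(taskId: string, status: Status): void {
    this.statuses.set(taskId, status);                 // single authoritative copy
    for (const l of this.listeners) l(taskId, status); // notify every view
  }

  get(taskId: string): Status | undefined {
    return this.statuses.get(taskId);
  }
}
```

Views never cache their own copy of a status; they re-render from the store on each notification, which is what prevents state drift across perspectives.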
The scheduling capability is designed to serve real development rhythms
Most task systems offer only a single date field, but real product and engineering scheduling is far more complex. This project’s scheduling component supports weekends, holidays, occupied time slots, half-day granularity, and multi-segment scheduling, making it much closer to real-world development workflows.
AI Visual Insight: This image shows a custom scheduling component. The interface clearly includes a date grid, time-slot selection, and occupied-time indicators, showing that it is more than a date picker and instead functions as a task time planner with constraints.
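The constraint handling can be sketched by mapping each half day to an integer index, so a schedule segment becomes an inclusive index range that is easy to test for overlap. This is a hypothetical illustration: the slot format mirrors the article's `'2026-04-28 AM'` examples, and `slotIndex`/`conflicts` are assumed helper names, not the project's code.

```typescript
// Illustrative sketch: half-day slots as integer indices for conflict checks.
function slotIndex(slot: string): number {
  const [date, half] = slot.split(' ');              // e.g. '2026-04-28', 'AM'
  const day = Date.UTC(
    Number(date.slice(0, 4)),
    Number(date.slice(5, 7)) - 1,
    Number(date.slice(8, 10)),
  ) / 86_400_000;                                    // whole days since epoch
  return day * 2 + (half === 'PM' ? 1 : 0);          // two slots per day
}

interface Segment { start: string; end: string }

// True when a proposed segment overlaps any already-occupied segment.
function conflicts(proposed: Segment, occupied: Segment[]): boolean {
  const s = slotIndex(proposed.start);
  const e = slotIndex(proposed.end);
  return occupied.some(o =>
    s <= slotIndex(o.end) && e >= slotIndex(o.start));
}
```

The same index space could also absorb weekend and holiday rules: a blocked date simply contributes its two slot indices to the occupied set.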
Multi-segment scheduling solves the problem of expressing fragmented work hours
Frontend tasks often split into several phases: development, integration, submission for testing, and bug fixing. Supporting multi-segment scheduling means the system can accurately represent how one task progresses across multiple days and multiple half-day blocks, instead of forcing everything into a vague single date.
```typescript
const schedule = [
  { start: '2026-04-28 AM', end: '2026-04-28 PM' }, // First development time segment
  { start: '2026-04-30 PM', end: '2026-05-01 PM' }  // Second integration time segment
];
```
This data structure reflects the scheduling component’s native support for fragmented work hours and multi-phase tasks.
The point of the stack upgrade is not novelty but lower maintenance cost
On the frontend, the stack moved from vue-cli + Vue 2 + Vuex to Vite + Vue 3 + Pinia. On the backend, it was refactored from Node + Koa + Mongoose to Bun + Hono + the official MongoDB driver. On the surface, this looks like a technology refresh. In practice, it is about reducing historical baggage.
Monorepo organization and shared types improve full-stack collaboration efficiency
The author reorganized the project with pnpm workspaces, allowing the frontend and backend to evolve independently while sharing TypeScript type definitions. This enables unified reuse of API structures, task models, and status enums, greatly reducing alignment cost between frontend and backend.
```
packages/
  web/    # Frontend application
  server/ # Backend service
  shared/ # Shared types and utilities
```
This directory structure reflects a full-stack engineering organization model with clear boundaries and shared types.
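A sketch of what the shared package could export. The package name and identifiers here (`TASK_STATUSES`, `isTaskStatus`, `TaskDto`) are assumptions for illustration, not taken from the project; the point is that both `web/` and `server/` import the same definitions, so a status enum can never drift between frontend and backend.

```typescript
// Illustrative contents of packages/shared: one source of truth for the model.
export const TASK_STATUSES = ['todo', 'doing', 'done'] as const;
export type TaskStatus = (typeof TASK_STATUSES)[number];

export interface TaskDto {
  title: string;
  content: string;
  status: TaskStatus;
  schedule: Array<{ start: string; end: string }>;
  note?: string;
}

// Usable on both sides to validate data at the API boundary.
export function isTaskStatus(value: unknown): value is TaskStatus {
  return typeof value === 'string' &&
    (TASK_STATUSES as readonly string[]).includes(value);
}
```

With pnpm workspaces, both apps would depend on this package by its workspace name, so a change to the model is a single edit that the type checker propagates everywhere.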
Bun and Hono are better suited for maintaining a lightweight personal product backend
The author’s reason for abandoning Mongoose was practical: the abstraction benefits of an ODM did not justify the cost of schema maintenance. By using the official MongoDB driver directly, together with Bun’s built-in capabilities and Hono’s lightweight routing, the backend uses fewer dependencies, starts faster, and becomes easier to deploy.
```typescript
import { Hono } from 'hono';

const app = new Hono();

app.get('/tasks', async (c) => {
  // db is a MongoDB Db handle, obtained from the official driver's
  // MongoClient at startup (initialization omitted in this excerpt)
  const tasks = await db.collection('tasks').find().toArray(); // Query MongoDB directly
  return c.json(tasks); // Return the task list
});
```
This code shows the core style of the refactored backend: lightweight frameworks, minimal wrapping, and lower cognitive overhead.
A local-first deployment strategy fits the target users better than moving to the cloud
This is a very clear product decision. The target users are product and engineering professionals and small teams in China. They care more about response speed, data control, and low operations overhead than SaaS packaging. One-click Docker deployment strikes the right balance between ease of use and technical barrier to entry.
Local deployment brings three immediate benefits: complete control over data, zero-latency access, and availability in intranet scenarios. For individual users, that is often more appealing than creating accounts, binding organizations, and waiting for cloud synchronization.
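In practice, "one-click" local deployment usually means a single `docker compose up`. The compose file below is a hypothetical sketch of what such a setup could look like; the service names, ports, image tags, and environment variables are assumptions, since the project's actual configuration is not shown in the source.

```yaml
# Hypothetical docker-compose sketch -- not the project's actual file.
services:
  app:
    build: .
    ports:
      - "3000:3000"               # Hono API plus the built frontend
    environment:
      - MONGODB_URI=mongodb://mongo:27017/tasks
    depends_on:
      - mongo
  mongo:
    image: mongo:7
    volumes:
      - mongo-data:/data/db       # task data never leaves the local machine
volumes:
  mongo-data:
```

A named volume for MongoDB is what delivers the "complete control over data" benefit: the data directory lives on the user's own disk and survives container restarts.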
AI does more than write code: it also helps drive architecture decisions
The most important takeaway from the original article is not how much code was generated with “$20 of tokens.” It is that AI participated in stack migration, monorepo design, dependency consolidation, and deployment strategy selection. Here, AI acts as a refactoring copilot rather than a simple code generator.
Extensibility is built on top of dense task context
Because each task card already carries requirements, design, solutions, schedule, and status, it naturally supports derived automation capabilities such as one-click weekly reports, periodic summaries, testing notifications, and GitLab integration.
AI Visual Insight: This animated image shows the system automatically generating a weekly report from task data, demonstrating that the structured task model already contains enough information to power document-generation automation and serve as a key entry point for AI-enhanced productivity.
```typescript
function generateWeeklyReport(tasks: Task[]) {
  return tasks
    .filter(task => task.status === 'done') // Only summarize completed tasks
    .map(task => `- ${task.title}`)
    .join('\n');
}
```
This logic shows that weekly reporting is built on structured task data, which makes expansion inexpensive.
FAQ provides structured answers to key design questions
Why does this project emphasize local deployment instead of building a SaaS product directly?
Because the goal is to improve efficiency for individuals and small teams. Compared with cloud-based systems, local deployment is faster, more stable, gives users more control over data, and significantly reduces operations burden for both the author and the users.
Why migrate the backend from Koa and Mongoose to Bun, Hono, and the official MongoDB driver?
The core reason is to reduce maintenance burden. Bun provides faster runtime and installation performance, Hono is lighter, and the official driver removes the schema maintenance cost imposed by an ODM, making the stack better suited for long-term solo maintenance.
Why does this task model keep only five fields?
Because the author is solving high-frequency task management rather than approval workflows. Fewer fields reduce input cost and also make it easier for AI to understand context and generate weekly reports, summaries, or automation actions.
AI Readability Summary: This article examines how an 8-year-old codebase evolved into an independent product. It focuses on how a task management system used AI to complete a stack upgrade, architectural simplification, and a local-first deployment design, covering Vite, Vue 3, Bun, Hono, monorepo organization, and real-time synchronization across multiple views.