[AI Readability Summary] This article breaks down a HarmonyOS 6 cross-device whiteboard system built around distributed state synchronization, device-adaptive floating navigation, immersive light effect coordination, and real-time stroke collaboration. It solves common multi-device issues such as abrupt navigation changes, fragmented theming, and unsynchronized interaction feedback. Keywords: HarmonyOS 6, distributed collaboration, immersive lighting.
The technical specification snapshot outlines the system baseline
| Parameter | Description |
|---|---|
| Language | ArkTS / TypeScript-style ETS |
| UI Framework | ArkUI |
| Collaboration Protocol | HarmonyOS Distributed Soft Bus + distributedDataObject |
| API Version | HarmonyOS 6 / API 23 |
| Core Dependencies | @kit.ArkUI, @kit.AbilityKit, @kit.DistributedServiceKit, @kit.BasicServicesKit |
This system is designed for cross-device collaborative whiteboard scenarios.
This solution centers on a “Light and Shadow Co-Creation” whiteboard. Its goal is not to synchronize pages alone, but to synchronize interaction state, visual feedback, and collaboration context. Phones, tablets, and PCs in the same session share tool state, theme lighting, focused device identity, and stroke data.
Traditional multi-device collaboration usually suffers from three categories of problems: inconsistent layouts, fragmented visual themes, and invisible remote actions. HarmonyOS 6 abstracts these states into broadcastable data objects, allowing both UI state and collaboration semantics to flow across devices at the same time.
AI Visual Insight: The diagram shows a three-endpoint collaboration architecture. At the bottom is a low-latency Distributed Soft Bus network layer, and above it are three device classes: phone, tablet, and PC. Each device uses a different floating navigation layout and a differentiated interaction model for touch, stylus, and keyboard-mouse input, highlighting the system principle of “shared state, heterogeneous presentation.”
The distributed state object acts as the system hub
The system models currentTool, themeColor, navExpanded, activeDeviceId, and lastOperation inside one distributed object. As a result, tool switches, theme changes, and focus transfers on any device can trigger immediate updates on the others.
```typescript
interface DistributedState {
  currentTool: string;      // Current tool
  themeColor: string;       // Theme color
  navExpanded: boolean;     // Whether the navigation is expanded
  activeDeviceId: string;   // Currently active device
  lastOperation: { type: string; timestamp: number; deviceId: string }; // Most recent operation
}
```
This structure defines the minimum state set required for cross-device synchronization and serves as the data contract for the entire whiteboard system.
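A minimal initial instance of this contract might look as follows; the concrete values are illustrative assumptions, and an object of this shape is what the initialization code later passes to distributedDataObject.create.

```typescript
// Illustrative defaults for the shared state contract; values are assumptions.
const initialState: DistributedState = {
  currentTool: 'pen',
  themeColor: '#1E90FF',
  navExpanded: false,
  activeDeviceId: '',
  lastOperation: { type: 'init', timestamp: Date.now(), deviceId: '' }
};
```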
Environment setup must begin with distributed capability initialization
At the dependency level, the system requires at least ArkUI, AbilityKit, DistributedServiceKit, and BasicServicesKit. The first two handle UI and lifecycle management. The latter two handle device discovery, data synchronization, and exception processing.
```json
{
  "dependencies": {
    "@kit.ArkUI": "^6.1.0",
    "@kit.AbilityKit": "^6.1.0",
    "@kit.DistributedServiceKit": "^6.1.0",
    "@kit.BasicServicesKit": "^6.1.0"
  }
}
```
This configuration declares the core module dependencies required to build a cross-device collaborative whiteboard.
The initialization flow sets the upper bound of the collaboration experience
After the application starts, CollaborateAbility first obtains the local device ID, then listens for trusted device online and offline events, and finally creates the distributedDataObject and binds it to a session ID. This object is essentially a shared cross-device state container.
```typescript
// distributedDataObject is provided by ArkData, in addition to the kits listed above
import { distributedDataObject } from '@kit.ArkData';

this.distributedObject = distributedDataObject.create(this.context, initialState);
this.distributedObject.on('change', (_sessionId: string, fields: string[]) => {
  this.handleDistributedDataChange(fields); // Handle remote state changes
});
await this.distributedObject.setSessionId('collaborate_session_001'); // Join the shared collaboration session
```
This code completes the full three-step loop of distributed state object creation, event listening, and session attachment.
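The trusted-device listening step mentioned above is not included in the snippet. Below is a minimal sketch, assuming the distributedDeviceManager API from @kit.DistributedServiceKit and a hypothetical bundle name.

```typescript
// A hedged sketch of device discovery; the bundle name is a placeholder.
import { distributedDeviceManager } from '@kit.DistributedServiceKit';

const dm = distributedDeviceManager.createDeviceManager('com.example.whiteboard');
AppStorage.setOrCreate('local_device_id', dm.getLocalDeviceId()); // Cache the local identity for later use

dm.on('deviceStateChange', (data) => {
  // Trusted devices joining or leaving the network arrive here
  console.info(`device ${data.device.deviceName}: ${data.action}`);
});
```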
Device-adaptive floating navigation is responsible for interaction consistency
The navigation component does not use a fixed layout. Instead, it dynamically selects a form factor based on the physical size of the device. Phones use bottom floating navigation, tablets use side navigation, and PCs enable left-side navigation with an extended toolbar. This approach maps the same toolset to the most natural interaction model for each device.
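The article does not show the selection logic itself. One common approach is to branch on deviceInfo.deviceType from @kit.BasicServicesKit rather than measuring physical size directly; the sketch below follows that variant, and the layout names are illustrative.

```typescript
// A minimal sketch of form-factor selection; layout names are assumptions.
import { deviceInfo } from '@kit.BasicServicesKit';

type NavLayout = 'bottom-floating' | 'side' | 'left-extended';

function selectNavLayout(): NavLayout {
  switch (deviceInfo.deviceType) {
    case 'phone':  return 'bottom-floating'; // Thumb-reachable on handsets
    case 'tablet': return 'side';            // Edge placement suits stylus use
    case '2in1':   return 'left-extended';   // Keyboard-mouse with extended toolbar
    default:       return 'bottom-floating';
  }
}
```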
The key is not responsiveness alone, but synchronizing navExpanded and currentTool into the distributed object. When a user expands navigation on a PC, other devices can perceive the collaboration context instead of rendering in isolation.
```typescript
private switchTool(toolId: string): void {
  if (this.currentTool === toolId) return;
  this.currentTool = toolId; // Switch tool locally
  (this.distributedObj as any).currentTool = toolId; // Broadcast to other devices
  (this.distributedObj as any).lastOperation = {
    type: 'tool_switch', // Mark the operation type
    timestamp: Date.now(),
    deviceId: AppStorage.get<string>('local_device_id') || ''
  };
}
```
This logic allows a tool switch to update both the local UI and the remote collaboration semantics.
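On the receiving side, handleDistributedDataChange (registered in the initialization snippet) applies the changed fields locally. Its body is not shown in the original article; the following is a hedged sketch assuming the changed fields are read back off the distributed object.

```typescript
// A hedged sketch; highlightActiveDevice is a hypothetical handler, and the
// original article does not show this method's actual body.
private handleDistributedDataChange(fields: string[]): void {
  for (const field of fields) {
    const value = (this.distributedObject as any)[field]; // Read the synced value
    switch (field) {
      case 'currentTool':
        this.currentTool = value as string;    // Mirror the remote tool switch
        break;
      case 'navExpanded':
        this.navExpanded = value as boolean;   // Mirror the navigation state
        break;
      case 'activeDeviceId':
        this.highlightActiveDevice(value as string); // Hypothetical focus handler
        break;
    }
  }
}
```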
Immersive lighting establishes visual consistency through the theme synchronization engine
The responsibility of LightSyncEngine is not simply to change colors: it turns theme switching into a smooth 300 ms cross-device transition. When any device initiates a theme change, the others receive the lightTheme update and run an interpolation animation, so a color change on one device is perceived as propagating across the entire collaborative environment.
```typescript
applyTheme(theme: LightTheme, sync: boolean = true): void {
  this.animateThemeTransition(this.currentTheme, theme); // Smooth transition
  this.currentTheme = theme;
  if (sync && this.distributedObj) {
    (this.distributedObj as any).lightTheme = theme; // Synchronize theme
  }
}
```
This code combines local visual transition and cross-device state synchronization into one theme update action.
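The applyTheme snippet calls animateThemeTransition without showing its body. The sketch below is one plausible implementation under assumptions not in the original: LightTheme exposes a primary hex color, and displayColor is a hypothetical field bound to the UI.

```typescript
// A hedged interpolation sketch for the 300 ms transition; LightTheme.primary
// and this.displayColor are assumptions, not the article's actual fields.
private animateThemeTransition(from: LightTheme, to: LightTheme): void {
  const steps = 10;                               // 10 frames * 30 ms = 300 ms
  let step = 0;
  const timer = setInterval(() => {
    step++;
    this.displayColor = this.lerpColor(from.primary, to.primary, step / steps);
    if (step >= steps) {
      clearInterval(timer);
    }
  }, 30);
}

// Linear interpolation between two '#RRGGBB' colors
private lerpColor(a: string, b: string, t: number): string {
  const pa = parseInt(a.slice(1), 16);
  const pb = parseInt(b.slice(1), 16);
  const ch = (shift: number): number => {
    const ca = (pa >> shift) & 0xff;
    const cb = (pb >> shift) & 0xff;
    return Math.round(ca + (cb - ca) * t);
  };
  return `#${((ch(16) << 16) | (ch(8) << 8) | ch(0)).toString(16).padStart(6, '0')}`;
}
```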
The collaborative canvas must balance real-time responsiveness with interpretability
The canvas component splits strokes into two layers: StrokePoint and Stroke. During the real-time phase, it synchronizes partial point sets with a 50 ms throttle. After the stroke ends, it broadcasts the full stroke. This design reduces network load while still ensuring that remote participants see a continuous drawing process.
Remote strokes are not only rendered. They also include device color and an active glow effect. This allows collaborators to immediately understand who is drawing and where they are drawing, reducing cognitive conflict during multi-user editing.
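The article does not list the two type definitions, but their shape can be inferred from the synchronized payload shown after this sketch; the field names here are assumptions based on that payload.

```typescript
// Inferred two-layer stroke model; field names are assumptions drawn from
// the partial-sync payload in syncStrokeThrottled below.
interface StrokePoint {
  x: number;          // Canvas x coordinate
  y: number;          // Canvas y coordinate
  timestamp: number;  // Capture time, useful for replay ordering
}

interface Stroke {
  id: string;            // e.g. `${deviceId}_${timestamp}`
  points: StrokePoint[]; // Full point set, broadcast once the stroke ends
  color: string;         // Theme color at draw time
  width: number;         // Stroke width
  deviceId: string;      // Originating device
  deviceColor: string;   // Per-device identity color for remote rendering
}
```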
```typescript
private syncStrokeThrottled(): void {
  const now = Date.now();
  if (now - this.lastSyncTime > 50) {
    this.lastSyncTime = now;
    AppStorage.setOrCreate('sync_stroke_partial', {
      id: `${this.localDeviceId}_${now}`,
      points: [...this.currentStroke], // Incrementally synchronize the current stroke
      color: this.themeColor,
      width: 3,
      deviceId: this.localDeviceId,
      deviceColor: this.getDeviceColor(this.localDeviceId)
    });
  }
}
```
This code uses a 50 ms throttling mechanism to balance real-time collaboration with transmission cost.
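The end-of-stroke broadcast described above is not shown in the original snippet. A hedged counterpart to syncStrokeThrottled might look like this; the 'sync_stroke_full' key and the this.strokes history list are assumptions mirroring the partial-sync payload.

```typescript
// Hypothetical end-of-stroke broadcast, reusing the Stroke shape sketched earlier.
private finishStroke(): void {
  const stroke: Stroke = {
    id: `${this.localDeviceId}_${Date.now()}`,
    points: [...this.currentStroke],
    color: this.themeColor,
    width: 3,
    deviceId: this.localDeviceId,
    deviceColor: this.getDeviceColor(this.localDeviceId)
  };
  AppStorage.setOrCreate('sync_stroke_full', stroke); // Broadcast the complete stroke
  this.strokes.push(stroke);                          // Commit to local history
  this.currentStroke = [];                            // Reset the live buffer
}
```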
The main page integrates navigation, canvas, and status bar into one unified experience
The main page uses a layered Stack to organize a dynamic ambient light background, collaborative canvas, immersive status bar, and online device indicator. It is not the center of business logic. It serves as the UI orchestration layer that coordinates multiple modules.
The top status bar can display the number of online devices, the current collaboration activity state, and a theme switch entry point. Combined with the focused-device synchronization mechanism, non-active devices reduce light intensity and show remote operation prompts, visually clarifying the current primary operator.
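The original article does not include the page source. The following is a compilable stand-in showing the layered Stack idea with built-in components; in the real page, dedicated components for the ambient light layer, collaborative canvas, and immersive status bar would replace these layers.

```typescript
// A layout sketch only: built-in components stand in for the article's
// ambient light layer, collaborative canvas, and immersive status bar.
@Entry
@Component
struct CollaboratePage {
  @StorageLink('online_device_count') onlineDevices: number = 1;
  @State themeColor: string = '#1E90FF';

  build() {
    Stack({ alignContent: Alignment.Top }) {
      // Layer 1: dynamic ambient light background
      Column()
        .width('100%')
        .height('100%')
        .backgroundColor(this.themeColor)
        .opacity(0.15)
      // Layer 2: the collaborative canvas component would render here
      // Layer 3: immersive status bar with the online device count
      Row() {
        Text(`${this.onlineDevices} device(s) online`)
          .fontSize(14)
          .fontColor(Color.White)
      }
      .padding(12)
    }
    .width('100%')
    .height('100%')
  }
}
```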
The technical value of this solution lies in synchronizing more than data
From an architectural perspective, the system synchronizes five categories of critical state: navigation state, lighting theme, stroke data, focus state, and operation events. Together, they cover the three major collaboration dimensions: what users see, what they are doing, and who is currently operating.
In terms of performance, the original article proposes stroke compression, incremental synchronization, offscreen rendering, and object pool reuse. For further production hardening, it is advisable to add conflict resolution strategies, reconnection recovery, session snapshots, and an evolution path toward CRDT-based collaboration.
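As one concrete starting point for the conflict handling suggested above, a last-write-wins check over the existing lastOperation contract could arbitrate concurrent updates before evolving toward CRDTs. This strategy is a suggestion, not part of the original design.

```typescript
// A minimal last-write-wins sketch built on the DistributedState contract;
// the article does not prescribe a specific conflict strategy.
type Operation = { type: string; timestamp: number; deviceId: string };

function shouldApplyRemote(local: Operation, remote: Operation): boolean {
  if (remote.timestamp !== local.timestamp) {
    return remote.timestamp > local.timestamp; // Newer operation wins
  }
  return remote.deviceId > local.deviceId;     // Deterministic tie-break on ID
}
```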
FAQ
Q1: Why does this whiteboard system synchronize state objects instead of synchronizing the entire UI directly?
A1: Because devices differ in layout and input models. Direct UI synchronization would create rigid presentation behavior. Synchronizing state objects enables adaptive rendering based on the principle of “shared semantics, different forms.”
Q2: Why does stroke synchronization use a 50 ms throttle instead of broadcasting every touch point?
A2: Per-point broadcasting would significantly increase network jitter and rendering pressure. A 50 ms throttle preserves continuity while keeping synchronization cost within an acceptable range for real-time cross-device collaboration.
Q3: What is the core engineering value of immersive lighting?
A3: It is not decorative. It serves as a collaboration feedback layer. Theme synchronization, focus enhancement, and remote pulse prompts together create a perceptible collaboration state that significantly reduces uncertainty during cross-device operations.
Core Summary: This article reconstructs a HarmonyOS 6 (API 23)-based cross-device whiteboard solution focused on distributed state synchronization, device-adaptive floating navigation, immersive light effect coordination, and real-time stroke collaboration. It is well suited for ArkTS and HarmonyOS developers who want to quickly understand the system design and core implementation.