Cosein Stars
An edge-synchronized camera constellation that records time as space—turning motion into live, interactive 4D Gaussian Splatting
Description
Cosein Stars makes streaming spatial—delivering live, interactive 4D Gaussian Splatting.
We built a capture architecture that records the world not as isolated frames but as a coherent, time-aligned field that is natively ready for 4D Gaussian Splatting. Each Star is an edge computer with eyes: a tri-sensor cluster spanning complementary focal ranges to fuse texture and structure in a single pass. The system phases every exposure deterministically, aligns time at the source, and streams reconstruction-ready data over one cable. Orchestrated by our Constellation controller, dozens of Stars behave like one instrument. In practice we achieve sub-frame precision—≤1 ms on 30-camera rigs and ≤3 ms across larger constellations—unlocking live, navigable 4D that feels less like post and more like playback. This is a first-of-its-kind way to shoot: edge-native, millisecond-true, built for the stage as much as the lab.
Keywords
4D Gaussian Splatting
Edge Volumetric Capture
Multicamera Sync
Real-time Renderer
Spatial Media
Year
2025






Technical Details
Capture triad, multi-focal fusion. Each Star integrates three precisely phase-locked sensors with complementary focal coverage. The layout yields high-parallax baselines for robust feature tracks and stable epipolar geometry, while preserving fine surface detail for dense correspondence. Long, clean tracks translate to faster SfM/BA convergence and well-conditioned intrinsics/extrinsics.
Deterministic timing, proven in measurement. We synchronize at the source using hardware-timed phasing and future-timestamp triggering—no drift-prone “best effort” software gates. Field tests: ≤1 ms RMS skew on 30-camera kits; ≤3 ms across multi-node constellations, sustained over hour-scale sessions.
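As a concrete sketch of future-timestamp triggering: on a shared clock, every node can independently compute the same future frame boundary and arm its hardware timer against it, with no round-trip negotiation at fire time. The function below is illustrative only; `schedule_trigger` is a hypothetical name, not the shipped firmware API.

```python
def schedule_trigger(now_ns: int, fps: float, lead_ms: float = 50.0) -> int:
    """Return the next frame-period boundary at least `lead_ms` ahead.

    Because all Stars share a synchronized clock, each node computes the
    same boundary independently, so exposures line up by construction.
    """
    period_ns = round(1e9 / fps)
    earliest_ns = now_ns + int(lead_ms * 1e6)
    # Round up to the next whole multiple of the frame period.
    return ((earliest_ns + period_ns - 1) // period_ns) * period_ns
```

Two nodes that disagree slightly on "now" still agree on the boundary, as long as their clock error is smaller than the scheduling lead.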
Edge compute, renderer-ready output. On-device encoding (H.264/H.265) with monotonic, per-frame timecodes produces three synchronized 4K30 streams per Star without external racks. An onboard AI pipeline (single-digit-TOPS class) handles real-time LUTs, color management, and pre-flight QA, so footage arrives reconstruction-ready.
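One way to realize monotonic per-frame timecodes is to derive them from the shared trigger epoch plus the frame index, so they never depend on a drifting local clock. This is a minimal sketch under that assumption; the `Timecode` type and `stamp` function are illustrative names.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Timecode:
    frame_index: int
    pts_ns: int  # presentation timestamp on the constellation clock


def stamp(epoch_ns: int, fps: float, frame_index: int) -> Timecode:
    """Derive a timecode from the shared epoch and frame index.

    Monotonicity is guaranteed arithmetically: a later frame index
    always yields a strictly later timestamp.
    """
    period_ns = round(1e9 / fps)
    return Timecode(frame_index, epoch_ns + frame_index * period_ns)
```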
One cable, all signals. Gigabit PoE is the backplane: power, control, and data over a single line with studio-length runs. Fewer cables; fewer failure points; tool-free scaling.
Lighting that understands timing. Each Star carries a tri-LED fill module addressable from the Constellation controller. Per-node lighting cues and cross-node animations can be scripted into the capture, improving multi-view consistency for dynamic scenes.
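Because lights and shutters share one schedule, per-node lighting cues can be snapped to frame boundaries so an LED change never straddles an exposure. The snapping policy below is an assumed, illustrative one, not the shipped cue format.

```python
def align_cues(cue_times_ns, epoch_ns: int, fps: float):
    """Snap lighting cue times to the nearest frame boundary.

    Cues then land exactly between exposures, keeping every camera's
    view of a lighting change consistent across the constellation.
    """
    period_ns = round(1e9 / fps)
    return [
        epoch_ns + round((t - epoch_ns) / period_ns) * period_ns
        for t in cue_times_ns
    ]
```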
Constellation control, built to scale. Our conductor discovers nodes, pushes future-time schedules, verifies health/storage, and coordinates start/stop with sub-frame determinism. Tested to 72 cameras today, architected for hundreds tomorrow—same APIs, same mental model.
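The conductor's start sequence can be modeled as: filter nodes by pre-flight health and storage checks, abort if any fail, then push one shared future start time to all of them. This toy model is a simplification (transport, auth, and retries omitted; all names hypothetical).

```python
def coordinate_start(nodes, start_ns: int):
    """Arm every healthy node with the same future start time.

    Returns the start time on success, or None if any node fails
    pre-flight, so a partial constellation never begins recording.
    """
    ready = [n for n in nodes if n["healthy"] and n["free_bytes"] > 0]
    if len(ready) != len(nodes):
        return None  # abort: at least one node failed pre-flight
    for node in ready:
        node["armed_at"] = start_ns  # identical timestamp everywhere
    return start_ns
```

The all-or-nothing check is the important design choice: sub-frame determinism at scale comes from refusing to start until every node has verifiably armed the same schedule.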
From set to splats—fast. The software path favors temporally coherent reconstruction: we feed SfM/MVS with high-parallax, low-jitter tracks; apply motion-aware regularization to preserve identity through pose change; and publish a splat-native stream that supports progressive, tile-based playback on web and headset.
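Progressive, tile-based playback implies a loading policy: fetch the tiles nearest the viewer first, within a bandwidth budget, and refine outward. The sketch below shows one such nearest-first policy; it is illustrative, not the actual splat stream format.

```python
def tiles_to_fetch(camera_pos, tiles, byte_budget: int):
    """Pick nearest tiles first under a byte budget.

    `tiles` is a list of dicts with a spatial `center` and a `bytes`
    payload size; closer tiles are refined before distant ones.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    ordered = sorted(tiles, key=lambda t: dist2(camera_pos, t["center"]))
    chosen, used = [], 0
    for tile in ordered:
        if used + tile["bytes"] > byte_budget:
            break
        chosen.append(tile)
        used += tile["bytes"]
    return chosen
```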






Highlights
Synchronized by design. Millisecond-true capture at the source: ≤1 ms on 30 cameras, ≤3 ms at constellation scale (up to 256 cameras).
Edge-native by default. Three 4K30 streams per node, hardware-encoded with per-frame timecodes—no server farm required.
One-cable deployment. PoE backbone for power, control, and data; studio-length runs with clean topology.
Lighting in lockstep. Integrated fill lights obey the same schedule as shutters for scene-aware, cross-node cues.
Scales without drama. Proven at 72 cameras; designed for hundreds with the same control plane and APIs.
Made for live 4D. Streamable, splat-native output that moves from set to interactive playback in minutes.















Credits
Creative Design
Yueze Zhang
Hardware
Yueze Zhang
Fabrication
Yueze Zhang, Ziyi Li, Zhuo Lou, Xiaoguang Huang, Junyi Chen
Artists
Jiazheng Huang (HK), Xiaoguang Huang
Appendix
Special Thanks
We’re grateful to our partners and friends for their support:
BAFC 湾区时尚联盟 (Bay Area Fashion Alliance)


