Streaming Real-Time Rendered Scenes as 3D Gaussians

Matti Siekkinen, Teemu Kämäräinen

Abstract

Cloud rendering is widely used in gaming and XR to overcome limited client-side GPU resources and to support heterogeneous devices. Existing systems typically deliver the rendered scene as a 2D video stream, which tightly couples the transmitted content to the server-rendered viewpoint and limits latency compensation to image-space reprojection or warping. In this paper, we investigate an alternative approach based on streaming a live 3D Gaussian Splatting (3DGS) scene representation instead of only rendered video. We present a Unity-based prototype in which a server constructs and continuously optimizes a 3DGS model from real-time rendered reference views, while streaming the evolving representation to remote clients as full model snapshots and incremental updates that support relighting and rigid object dynamics. The clients reconstruct the streamed Gaussian model locally and render their current viewpoint from the received representation. This approach aims to improve viewpoint flexibility for latency compensation and to amortize server-side scene modeling across multiple users more effectively than per-user rendering and video streaming. We describe the system design, evaluate it, and compare it with conventional image warping.
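To make the snapshot-plus-delta streaming scheme concrete, below is a minimal Python sketch of a client-side model fed by full snapshots and incremental updates. The paper does not publish its wire format, so every name here (GaussianModel, IncrementalUpdate, apply_update, and the field layout) is a hypothetical illustration of the idea: snapshots carry the whole splat set, while deltas carry relighting (color) changes and rigid per-object motion.

```python
# Hypothetical sketch of a snapshot-plus-delta 3DGS stream; the paper does
# not specify its message format, so all names and layouts here are
# illustrative assumptions, not the authors' protocol.
from dataclasses import dataclass, field

import numpy as np


def quat_mul(q1: np.ndarray, q2: np.ndarray) -> np.ndarray:
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = np.moveaxis(q1, -1, 0)
    w2, x2, y2, z2 = np.moveaxis(q2, -1, 0)
    return np.stack([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ], axis=-1)


def quat_to_mat(q: np.ndarray) -> np.ndarray:
    """3x3 rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y)],
        [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y)],
    ])


@dataclass
class GaussianModel:
    """Client-side copy of the streamed scene, filled by a full snapshot."""
    positions: np.ndarray   # (N, 3) Gaussian means
    rotations: np.ndarray   # (N, 4) unit quaternions (w, x, y, z)
    scales: np.ndarray      # (N, 3) per-axis extents
    opacities: np.ndarray   # (N,)
    colors: np.ndarray      # (N, C) color / spherical-harmonics coefficients
    object_ids: np.ndarray  # (N,) rigid-object membership, for dynamics


@dataclass
class IncrementalUpdate:
    """One delta: relighting recolors splats, rigid moves transform objects."""
    recolor_idx: np.ndarray = field(default_factory=lambda: np.empty(0, int))
    recolor_vals: np.ndarray | None = None
    rigid_moves: dict[int, tuple[np.ndarray, np.ndarray]] = field(
        default_factory=dict)  # object_id -> (quat (4,), translation (3,))


def apply_update(model: GaussianModel, upd: IncrementalUpdate) -> None:
    # Relighting: overwrite the color coefficients of the affected splats.
    if upd.recolor_vals is not None:
        model.colors[upd.recolor_idx] = upd.recolor_vals
    # Rigid dynamics: rotate/translate every Gaussian of a moved object,
    # composing the rotation into each Gaussian's own orientation as well.
    for obj_id, (q, t) in upd.rigid_moves.items():
        mask = model.object_ids == obj_id
        R = quat_to_mat(q)
        model.positions[mask] = model.positions[mask] @ R.T + t
        model.rotations[mask] = quat_mul(q, model.rotations[mask])
```

In a scheme like this, a client would apply a full snapshot when it joins or resynchronizes and then fold in deltas in arrival order, rendering its own viewpoint from the resulting model at any time.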

Figures (10)

  • Figure 1: Overview of the proposed live game engine-driven 3DGS pipeline.
  • Figure 2: Novel-view quality as a function of elapsed iterations in a controlled fixed-reference setting. Box plots are computed across the different viewpoints.
  • Figure 3: Elapsed wall-clock time until FLIP < 0.07.
  • Figure 4: Measured quality when orbiting the view camera around the content.
  • Figure 5: Comparison to depth-assisted image warping (see the sketch after this list).
  • ...and 5 more figures
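For contrast with the baseline compared in Figure 5, below is a minimal sketch of conventional depth-assisted image warping: forward-reprojecting a server-rendered RGB-D frame from the rendered pose to the client's current pose. It assumes a pinhole camera and a server-supplied depth buffer; the function and parameter names are illustrative, not from the paper.

```python
# Minimal sketch of depth-assisted forward warping, assuming a pinhole
# camera model; not the paper's implementation.
import numpy as np


def warp_rgbd(rgb: np.ndarray, depth: np.ndarray, K: np.ndarray,
              T_src_to_dst: np.ndarray) -> np.ndarray:
    """Forward-warp a rendered RGB-D frame to a nearby target pose.

    rgb:          (H, W, 3) color rendered at the source (server) pose
    depth:        (H, W) metric depth for the same frame
    K:            (3, 3) pinhole intrinsics shared by both views
    T_src_to_dst: (4, 4) rigid transform, source to target camera frame
    """
    H, W = depth.shape
    # Back-project every pixel into a 3D point in the source camera frame.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    rays = np.linalg.inv(K) @ np.stack(
        [u, v, np.ones_like(u)], axis=0).reshape(3, -1).astype(float)
    pts = rays * depth.reshape(1, -1)
    # Move the points into the target frame and project them again.
    pts = T_src_to_dst[:3, :3] @ pts + T_src_to_dst[:3, 3:4]
    proj = K @ pts
    z = np.maximum(proj[2], 1e-9)  # guard against division by zero
    u2 = np.round(proj[0] / z).astype(int)
    v2 = np.round(proj[1] / z).astype(int)
    # Z-buffered scatter: the nearest surface wins each target pixel.
    out = np.zeros_like(rgb)
    zbuf = np.full((H, W), np.inf)
    flat_rgb = rgb.reshape(-1, 3)
    ok = (proj[2] > 0) & (u2 >= 0) & (u2 < W) & (v2 >= 0) & (v2 < H)
    for i in np.flatnonzero(ok):
        if proj[2, i] < zbuf[v2[i], u2[i]]:
            zbuf[v2[i], u2[i]] = proj[2, i]
            out[v2[i], u2[i]] = flat_rgb[i]
    return out  # disoccluded regions remain as holes (zeros)
```

The holes this baseline leaves at disocclusions are exactly what a client-side 3DGS model can avoid: with the full Gaussian representation on the client, newly exposed geometry can be rendered rather than left blank.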