Neural Texture Splatting: Expressive 3D Gaussian Splatting for View Synthesis, Geometry, and Dynamic Reconstruction

Yiming Wang, Shaofei Wang, Marko Mihajlovic, Siyu Tang

ETH Zurich

SIGGRAPH Asia 2025 (Conference Track)

Teaser overview of Neural Texture Splatting
Neural Texture Splatting (NTS) augments 3D Gaussian Splatting (3DGS) with a neural RGBA field per primitive, serving as a plug-and-play module that boosts performance across multiple applications. We show that NTS significantly enhances the quality of sparse-view reconstruction for static and dynamic scenes, while also achieving noticeable improvements in geometry reconstruction and novel view synthesis under dense training views.

TL;DR

Neural Texture Splatting (NTS) is an expressive extension of 3D Gaussian Splatting that introduces a local neural RGBA field for each primitive.

  • ✔ Enhances modeling capacity to capture fine-grained local details as well as view- and time-dependent variations
  • ✔ Applicable to static and dynamic scenes, under both sparse-view and dense-view configurations

Method

Overview of Neural Texture Splatting
Overview of our Neural Texture Splatting framework.

Neural Texture Splatting augments 3D Gaussian Splatting (3DGS) by attaching a local RGBA tri-plane texture to each Gaussian primitive; a minimal code sketch of this rendering path follows the list below.

  1. Local RGBA Textures. During rendering, each camera ray intersects a splat and samples its local tri-plane texture. The sampled RGBA value is fused with the original Gaussian attributes via volume rendering.
  2. Limitation of Local Textures. Although local textures improve color and opacity modeling, they tend to overfit and cannot represent view-dependent or time-varying effects.
  3. Global Tri-plane Field. To address this, we introduce a global tri-plane neural field that provides compact, shared texture features and regularizes the per-splat textures.
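
To make the rendering path concrete, the following PyTorch sketch shows one way the per-splat and global tri-planes could be sampled and fused during splatting. All names (sample_triplane, textured_rgba, composite, head) and the specific fusion rules (an additive color residual and a sigmoid-modulated opacity) are illustrative assumptions rather than the paper's exact formulation; only the overall flow (sample the local texture at the ray-splat intersection, add shared features from the global field, decode to RGBA, and alpha-composite) follows the description in the list above.

```python
# Minimal sketch, not the released implementation: per-splat tri-plane sampling,
# fusion with the base Gaussian attributes, and front-to-back compositing.
import torch
import torch.nn.functional as F


def sample_triplane(planes: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
    """Bilinearly sample a tri-plane at 3D points in [-1, 1]^3.

    planes: (3, C, R, R) feature planes for the XY, XZ, and YZ axes.
    coords: (N, 3) query points in the corresponding normalized frame.
    Returns (N, 3 * C) concatenated plane features.
    """
    projections = (coords[:, [0, 1]], coords[:, [0, 2]], coords[:, [1, 2]])
    feats = []
    for plane, uv in zip(planes, projections):
        grid = uv.view(1, -1, 1, 2)                        # (1, N, 1, 2) sampling grid
        f = F.grid_sample(plane.unsqueeze(0), grid,        # (1, C, N, 1)
                          mode="bilinear", align_corners=True)
        feats.append(f[0, :, :, 0].transpose(0, 1))        # (N, C)
    return torch.cat(feats, dim=-1)


def textured_rgba(gauss_rgb, gauss_alpha, local_planes, global_planes, head,
                  x_local, x_world):
    """Fuse texture features with a splat's base color/opacity at ray-splat hits.

    gauss_rgb:     (N, 3) base splat color at the intersection points.
    gauss_alpha:   (N, 1) base opacity (Gaussian falloff already applied).
    local_planes:  (3, C, R, R) this splat's local RGBA tri-plane texture.
    global_planes: (3, C, Rg, Rg) global tri-plane shared by all splats.
    head:          small MLP decoding concatenated features to an RGBA residual.
    x_local:       (N, 3) intersections in the splat's local frame, in [-1, 1].
    x_world:       (N, 3) the same points in normalized scene coordinates.
    """
    feats = torch.cat([sample_triplane(local_planes, x_local),
                       sample_triplane(global_planes, x_world)], dim=-1)
    rgba = head(feats)                                     # (N, 4) texture residual
    rgb = (gauss_rgb + rgba[:, :3]).clamp(0.0, 1.0)        # additive color fusion (assumed)
    alpha = gauss_alpha * torch.sigmoid(rgba[:, 3:4])      # modulated opacity (assumed)
    return rgb, alpha


def composite(rgbs: torch.Tensor, alphas: torch.Tensor) -> torch.Tensor:
    """Standard front-to-back alpha compositing of depth-sorted splats on one ray."""
    trans = torch.cumprod(torch.cat([torch.ones_like(alphas[:1]),
                                     1.0 - alphas[:-1]], dim=0), dim=0)
    weights = trans * alphas                               # (N, 1) blending weights
    return (weights * rgbs).sum(dim=0)                     # (3,) pixel color
```

In the paper, the global field also regularizes the per-splat textures; since the exact coupling is not detailed on this page, the sketch simply concatenates local and global features before decoding, which is one plausible realization of that sharing.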

Results

Quantitative & Qualitative

Static scenes quantitative table
Dense-view Reconstruction.
Representative frames (static)
Qualitative Improvements on MipNeRF360 Dataset.

Video Comparisons

Geometry Reconstruction

BibTeX

@misc{wang2025neuraltexturesplattingexpressive,
    title={Neural Texture Splatting: Expressive 3D Gaussian Splatting for View Synthesis, Geometry, and Dynamic Reconstruction},
    author={Yiming Wang and Shaofei Wang and Marko Mihajlovic and Siyu Tang},
    year={2025},
    eprint={2511.18873},
    archivePrefix={arXiv},
    primaryClass={cs.CV},
    url={https://arxiv.org/abs/2511.18873},
}

Acknowledgments

We would like to express our gratitude to Brian Chao and Qin Han for their helpful discussions.