Gaussian Splatting: Papers #3
Here are the latest papers related to Gaussian Splatting! 🤘
Novel View Synthesis for Cinematic Anatomy on Mobile and Immersive Displays
Simon Niedermayr, Christoph Neuhauser, Kaloian Petkov, Klaus Engel, Rüdiger Westermann
Published on: 2024-04-17
PDF
This paper introduces a method for interactive photorealistic visualization of 3D anatomy, termed "Cinematic Anatomy," that runs on mobile and virtual reality platforms without requiring a high-end GPU. It uses compressed 3D Gaussian splatting to make detailed anatomy teaching accessible even on low-capacity devices.
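Compression of a splat scene typically means shrinking the per-Gaussian attributes (positions, colors, opacities, scales). The paper's actual codec is not detailed in this summary, but a minimal sketch of one common building block, uniform scalar quantization of an attribute array, looks like the following. The function names and the 8-bit choice are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def quantize_attributes(values: np.ndarray, n_bits: int = 8):
    """Uniformly quantize per-Gaussian attributes (e.g. colors, opacities)
    to n_bits integers, keeping the min/max needed to dequantize later.
    This is a generic sketch, not the paper's specific compression scheme."""
    vmin, vmax = float(values.min()), float(values.max())
    scale = (2 ** n_bits - 1) / (vmax - vmin + 1e-12)
    codes = np.round((values - vmin) * scale).astype(np.uint8)
    return codes, vmin, vmax

def dequantize_attributes(codes: np.ndarray, vmin: float, vmax: float,
                          n_bits: int = 8) -> np.ndarray:
    """Invert quantize_attributes up to quantization error."""
    scale = (vmax - vmin) / (2 ** n_bits - 1)
    return codes.astype(np.float32) * scale + vmin
```

At 8 bits per attribute the storage drops 4x versus float32, at the cost of a reconstruction error bounded by half a quantization step.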
DeblurGS: Gaussian Splatting for Camera Motion Blur
Jeongtaek Oh, Jaeyoung Chung, Dongwoo Lee, Kyoung Mu Lee
Published on: 2024-04-17
PDF
DeblurGS presents a technique for reconstructing sharp 3D scenes from motion-blurred images using Gaussian Splatting. The method compensates for inaccurate initial camera pose estimates caused by motion blur, and recovers image clarity and detail through a Gaussian Densification Annealing strategy applied during optimization.
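The summary does not give the annealing schedule itself, but the idea of annealed densification can be sketched as a gradient threshold that starts high (so few Gaussians are split while camera poses are still noisy) and decays as training progresses. The function name and the specific geometric decay below are assumptions for illustration only.

```python
import math

def densify_threshold(step: int, total_steps: int,
                      start: float = 1e-2, end: float = 2e-4) -> float:
    """Hypothetical annealing schedule for the densification gradient
    threshold: hold it high early (coarse, pose-robust reconstruction),
    then decay geometrically toward `end` so fine detail is added once
    poses have stabilized. Not the paper's exact schedule."""
    t = min(max(step / total_steps, 0.0), 1.0)  # training progress in [0, 1]
    return start * (end / start) ** t           # geometric interpolation
```

During optimization, a Gaussian whose accumulated positional gradient exceeds the current threshold would be cloned or split, so the annealing controls when the scene is allowed to grow denser.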
RainyScape: Unsupervised Rainy Scene Reconstruction using Decoupled Neural Rendering
Xianqiang Lyu, Hui Liu, Junhui Hou
Published on: 2024-04-17
PDF
RainyScape takes an unsupervised approach to reconstructing clear scenes from rain-distorted multi-view images via a decoupled neural rendering framework. By separating rain streaks from the underlying scene representation, the method removes the streaks while preserving and improving the scene's detail and clarity.
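A common way to formalize such a decoupling is to model the rainy observation as a clean rendering plus a rain residual, and to train without clean supervision by reconstructing the input while keeping the rain layer sparse. The loss below is a generic sketch of that pattern; the weight and penalty choice are illustrative assumptions, not RainyScape's actual objective.

```python
import numpy as np

def decoupled_loss(rendered_scene: np.ndarray, rain_layer: np.ndarray,
                   observed: np.ndarray, rain_weight: float = 0.01) -> float:
    """Hypothetical decoupled objective: clean rendering + rain residual
    should reproduce the rainy observation (data term), while an L1
    penalty keeps the rain layer sparse so streaks do not absorb scene
    content. A sketch only, not the paper's loss."""
    recon = rendered_scene + rain_layer
    data_term = float(np.mean((recon - observed) ** 2))  # reconstruction MSE
    sparsity = rain_weight * float(np.mean(np.abs(rain_layer)))
    return data_term + sparsity
```

With no ground-truth clean images, the sparsity prior is what pushes rain structure into the residual layer rather than into the scene model.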
Dynamic Gaussians Mesh: Consistent Mesh Reconstruction from Monocular Videos
Isabella Liu, Hao Su, Xiaolong Wang
Published on: 2024-04-18
PDF / Project Website
DG-Mesh is a framework for reconstructing high-fidelity, time-consistent meshes from monocular video. Building on recent advances in 3D Gaussian Splatting, it achieves temporal consistency in mesh reconstruction and enables tracking of mesh vertices over time.
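One simple way to picture vertex tracking driven by Gaussians is to anchor each mesh vertex to its nearest Gaussian centers and advect it by their motion between frames. The sketch below is a deliberately naive stand-in for that idea; DG-Mesh's actual binding and deformation model is more sophisticated, and all names here are hypothetical.

```python
import numpy as np

def track_vertices(vertices: np.ndarray, gaussians_t0: np.ndarray,
                   gaussians_t1: np.ndarray, k: int = 4) -> np.ndarray:
    """Hypothetical vertex-tracking sketch: bind each mesh vertex to its
    k nearest Gaussian centers at time t0 and move it by their average
    displacement to t1. Assumes Gaussians keep persistent identities
    across frames. Illustrative only, not DG-Mesh's method."""
    # pairwise distances from each vertex to each Gaussian center at t0
    d = np.linalg.norm(vertices[:, None, :] - gaussians_t0[None, :, :], axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]   # indices of k nearest Gaussians
    disp = gaussians_t1 - gaussians_t0   # per-Gaussian motion between frames
    return vertices + disp[idx].mean(axis=1)
```

Because the binding is fixed at t0, the same vertex follows the same local group of Gaussians in every frame, which is the property that yields temporally consistent correspondences.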