image: <(From left) Ph.D. candidate Jumin Lee, Ph.D. candidate Woo Jae Kim, Ph.D. candidate Youngju Na, Ph.D. candidate Kyu Beom Han, and Professor Sung-eui Yoon>
Credit: KAIST
Existing 3D scene reconstruction requires a cumbersome process: precisely measuring physical spaces with LiDAR or 3D scanners, or processing thousands of photos together with camera pose information. A research team at KAIST has overcome these limitations and introduced a technology that reconstructs 3D scenes, from tabletop objects to outdoor environments, from just two or three ordinary photographs. The breakthrough suggests a new paradigm in which spaces captured by camera can be immediately transformed into virtual environments.
KAIST announced on November 6 that the research team led by Professor Sung-Eui Yoon from the School of Computing has developed a new technology called SHARE (Shape-Ray Estimation), which can reconstruct high-quality 3D scenes using only ordinary images, without precise camera pose information.
Existing 3D reconstruction technology has been limited by the requirement of precise camera position and orientation information at the time of shooting to reproduce 3D scenes from a small number of images. This has necessitated specialized equipment or complex calibration processes, making real-world applications difficult and slowing widespread adoption.
To solve these problems, the research team developed a technology that constructs accurate 3D models by simultaneously estimating the 3D scene and the camera orientation using just two to three standard photographs. The technology has been recognized for its high efficiency and versatility, enabling rapid and precise reconstruction in real-world environments without additional training or complex calibration processes.
While existing methods calculate 3D structures from known camera poses, SHARE autonomously extracts spatial information from images themselves and infers both camera pose and scene structure. This enables stable 3D reconstruction without shape distortion by aligning multiple images taken from different positions into a single unified space.
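As an illustration of the ray-based camera representation that the name "Shape-Ray Estimation" alludes to, the sketch below computes per-pixel rays in Plücker coordinates from known intrinsics and pose using NumPy. Note that SHARE works in the opposite direction: the network predicts such rays (together with the scene representation) directly from images, without K, R, or t being given. All names in this sketch are illustrative, not taken from the paper's code.

```python
import numpy as np

def pixel_rays(K, R, t, H, W):
    """Per-pixel camera rays as Pluecker coordinates (direction d, moment m).

    Convention: world-to-camera transform x_cam = R @ x_world + t,
    so the camera centre in world coordinates is C = -R.T @ t.
    """
    C = -R.T @ t
    # pixel centres in homogeneous image coordinates
    u, v = np.meshgrid(np.arange(W) + 0.5, np.arange(H) + 0.5)
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3)
    # back-project through the intrinsics, then rotate into world space
    d = (R.T @ np.linalg.inv(K) @ pix.T).T
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    # Pluecker moment C x d encodes the ray's offset from the origin
    m = np.cross(np.broadcast_to(C, d.shape), d)
    return d, m
```

A ray map like (d, m) is a camera representation that lives in image space, one value per pixel, which is what makes it natural for a network to predict directly from photographs instead of requiring an externally calibrated pose.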
"The SHARE technology is a breakthrough that dramatically lowers the barrier to entry for 3D reconstruction," said Professor Sung-Eui Yoon. "It will enable the creation of high-quality content in various industries such as construction, media, and gaming using only a smartphone camera. It also has diverse application possibilities, such as building low-cost simulation environments in the fields of robotics and autonomous driving."
Ph.D. candidate Youngju Na and M.S. candidate Taeyeon Kim participated as co-first authors of the research. The results were presented on September 17 at the IEEE International Conference on Image Processing (ICIP 2025), where the paper received the Best Student Paper Award.
The award, given to only one paper among 643 accepted papers this year—a selection rate of 0.16 percent—once again underscores the excellent research capabilities of the KAIST research team.
- Paper Title: Pose-free 3D Gaussian Splatting via Shape-Ray Estimation (arXiv: https://arxiv.org/abs/2505.22978)
- Award Information: https://www.linkedin.com/posts/ieeeicip_congratulations-to-the-icip-2025-best-activity-7374146976449335297-6hXz
This achievement was carried out with support from the Ministry of Science and ICT's SW Star Lab Project under the task 'Development of Perception, Action, and Interaction Algorithms for Unspecified Environments for Open World Robot Services.'
Authors: Youngju Na, Taeyeon Kim, Jumin Lee, Kyu Beom Han, Woo Jae Kim, Sung-eui Yoon