3D Shape Reconstruction from Free-Hand Sketches
Jiayun Wang and Jierui Lin and Qian Yu and Runtao Liu and Yubei Chen and Stella X. Yu
European Conference on Computer Vision Workshop on Drawings and Abstract Imagery: Representation and Analysis (DIRA), Tel Aviv, Israel, 23 October 2022
Paper | Slides

Abstract

Sketches are arguably the most abstract 2D representations of real-world objects. Although a sketch usually exhibits geometric distortion and lacks visual cues, humans can effortlessly envision the 3D object it depicts. This suggests that sketches encode the information necessary for reconstructing 3D shapes. Despite great progress in 3D reconstruction from distortion-free line drawings, such as CAD drawings and edge maps, little effort has been made to reconstruct 3D shapes from free-hand sketches. We study this task and aim to enhance the power of sketches in 3D-related applications such as interactive design and VR/AR games.

Unlike previous works, which mostly study distortion-free line drawings, our 3D shape reconstruction is based on free-hand sketches. A major challenge in free-hand sketch 3D reconstruction comes from insufficient training data and the diversity of free-hand sketches, e.g., individualized sketching styles. We therefore propose data generation and standardization mechanisms. Instead of distortion-free line drawings, synthesized sketches are adopted as training data. Additionally, we propose a sketch standardization module to handle different sketch distortions and styles. Extensive experiments demonstrate the effectiveness of our model and its strong generalizability to various free-hand sketches. Our code is available at https://github.com/samaonline/3D-Shape-Reconstruction-from-Free-Hand-Sketches.
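To make the two-stage idea concrete, below is a minimal, illustrative PyTorch sketch of the pipeline the abstract describes: a standardization network that normalizes a distorted free-hand sketch, followed by an encoder-decoder that regresses a 3D point cloud. All module names, layer sizes, and the point-cloud output format here are assumptions for illustration only, not the authors' implementation (see the linked code for that).

# Hypothetical, simplified sketch of the pipeline; NOT the paper's code.
import torch
import torch.nn as nn

class SketchStandardizer(nn.Module):
    """Image-to-image network mapping a distorted free-hand sketch to a
    standardized, style-normalized line drawing (assumed architecture)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, sketch):           # sketch: (B, 1, H, W)
        return self.net(sketch)          # standardized sketch, same shape

class ShapeReconstructor(nn.Module):
    """Encoder-decoder regressing a point cloud from the standardized
    sketch (assumed sizes; real models are much deeper)."""
    def __init__(self, num_points=1024):
        super().__init__()
        self.num_points = num_points
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.decoder = nn.Linear(128, num_points * 3)

    def forward(self, std_sketch):
        feat = self.encoder(std_sketch)                  # (B, 128)
        return self.decoder(feat).view(-1, self.num_points, 3)

if __name__ == "__main__":
    sketch = torch.rand(2, 1, 128, 128)  # batch of free-hand sketches
    standardizer, reconstructor = SketchStandardizer(), ShapeReconstructor()
    points = reconstructor(standardizer(sketch))
    print(points.shape)                  # torch.Size([2, 1024, 3])

In this reading, training the reconstructor on synthesized sketches and funneling real free-hand inputs through the standardizer is what bridges the style gap between synthetic training data and diverse human drawings.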


Keywords
unsupervised learning, 3D shape reconstruction, sketch-to-3D synthesis