Generative Sketch Healing
Authors: Yonggang Qi, Guoyao Su, Qiang Wang, Jie Yang, Kaiyue Pang, Yi-Zhe Song
Abstract
To perceive and create a whole from parts is a prime trait of the human visual system. In this paper, we teach machines to perform a similar task by recreating a vectorised human sketch from its incomplete parts, dubbed sketch healing. This is fundamentally different from prior work on image completion since (i) sketches exhibit a severe lack of visual cues and are of a sequential nature, and more importantly (ii) we ask for an agent that does not just fill in a missing part, but recreates from scratch a novel sketch that closely resembles the partial input. We identify two key facets of sketch healing that are fundamental for effective learning. The first is encoding the incomplete sketches in a graph model that leverages the sequential nature of sketches to associate key visual parts centred around stroke junctions. The intuition is then that message passing within the graph topology will naturally provide the healing power when it comes to missing parts (nodes and edges). Second, we show that healing is a trade-off between global semantic preservation and local structure reconstruction, and that it can only be effectively solved when both are taken into account and optimised together. Both qualitative and quantitative results suggest that the proposed method significantly outperforms state-of-the-art alternatives on sketch healing. Last but not least, we show that sketch healing can be re-purposed to support the interesting application of a sketch-based creativity assistant, which aims to generate a novel sketch from two partial sketches, even without being specifically trained to do so.
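The graph-based healing intuition above can be illustrated with a toy example. This is not the authors' implementation: the node features, chain adjacency, and single mean-aggregation step are all illustrative assumptions. A sketch is modelled as a graph whose nodes are local patches around stroke junctions, linked by stroke order; when one node is "missing" (zeroed out), message passing lets its neighbours propagate information into the gap.

```python
import numpy as np

# Toy sketch graph: 5 junction nodes with 8-d patch features (random stand-ins).
rng = np.random.default_rng(0)
num_nodes, dim = 5, 8
features = rng.normal(size=(num_nodes, dim))

# Chain adjacency following stroke order, plus self-loops.
adj = np.eye(num_nodes)
for i in range(num_nodes - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0

# Simulate a missing sketch part by zeroing one node's features.
missing = 2
healed = features.copy()
healed[missing] = 0.0

# One round of degree-normalised message passing: each node becomes the
# mean of itself and its neighbours, so the masked node inherits an
# estimate borrowed from adjacent junctions.
deg = adj.sum(axis=1, keepdims=True)
healed = (adj @ healed) / deg

print(np.linalg.norm(healed[missing]))  # non-zero: the gap has been filled in
```

In the paper's actual model, a learned graph convolutional encoder plays this role and is trained jointly with a generative decoder, balancing the global-semantic and local-structure objectives described above.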
Keywords: Sketch healing, Graph convolutional networks, Perceptual distance, Sketch generative model, Sketch representation
Paper URL: https://doi.org/10.1007/s11263-022-01623-7