eprintid: 10139850
rev_number: 13
eprint_status: archive
userid: 608
dir: disk0/10/13/98/50
datestamp: 2021-12-08 10:32:37
lastmod: 2022-06-05 06:10:28
status_changed: 2021-12-08 10:32:37
type: article
metadata_visibility: show
creators_name: Chen, X
creators_name: Cohen-Or, D
creators_name: Chen, B
creators_name: Mitra, NJ
title: Towards a Neural Graphics Pipeline for Controllable Image Generation
ispublished: pub
divisions: UCL
divisions: B04
divisions: C05
divisions: F48
keywords: CCS Concepts
• Computing methodologies → Rendering; Shape modeling;
note: This version is the author accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions.
abstract: In this paper, we leverage advances in neural networks to form a neural rendering approach for controllable image generation, thereby bypassing the need for detailed modeling in a conventional graphics pipeline. To this end, we present Neural Graphics Pipeline (NGP), a hybrid generative model that brings together neural and traditional image formation models. NGP decomposes the image into a set of interpretable appearance feature maps, uncovering direct control handles for controllable image generation. To form an image, NGP generates coarse 3D models that are fed into neural rendering modules to produce view-specific interpretable 2D maps, which are then composited into the final output image using a traditional image formation model. Our approach offers control over image generation by providing direct handles controlling illumination and camera parameters, in addition to control over shape and appearance variations. The key challenge is to learn these controls through unsupervised training that links generated coarse 3D models with unpaired real images via neural and traditional (e.g., Blinn-Phong) rendering functions, without establishing an explicit correspondence between them. We demonstrate the effectiveness of our approach on controllable image generation of single-object scenes. We evaluate our hybrid modeling framework, compare with neural-only generation methods (namely, DCGAN, LSGAN, WGAN-GP, VON, and SRNs), report improvement in FID scores against real images, and demonstrate that NGP supports direct controls common in traditional forward rendering. Code is available at http://geometry.cs.ucl.ac.uk/projects/2021/ngp.
date: 2021-05-01
date_type: published
publisher: WILEY
official_url: https://doi.org/10.1111/cgf.142620
oa_status: green
full_text_type: other
language: eng
primo: open
primo_central: open_green
verified: verified_manual
elements_id: 1871880
doi: 10.1111/cgf.142620
lyricists_name: Mitra, Niloy
lyricists_id: NMITR19
actors_name: Mitra, Niloy
actors_id: NMITR19
actors_role: owner
full_text_status: public
publication: Computer Graphics Forum
volume: 40
number: 2
pagerange: 127-140
pages: 14
citation: Chen, X; Cohen-Or, D; Chen, B; Mitra, NJ; (2021) Towards a Neural Graphics Pipeline for Controllable Image Generation. Computer Graphics Forum, 40 (2), pp. 127-140. 10.1111/cgf.142620 <https://doi.org/10.1111/cgf.142620>. Green open access
document_url: https://discovery.ucl.ac.uk/id/eprint/10139850/1/neuralGraphics.pdf