
Aδ: Autodiff for Discontinuous Programs – Applied to Shaders

Author(s): Yang, Yuting; Barnes, Connelly; Adams, Andrew; Finkelstein, Adam

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr17659f7d
Full metadata record
dc.contributor.author: Yang, Yuting
dc.contributor.author: Barnes, Connelly
dc.contributor.author: Adams, Andrew
dc.contributor.author: Finkelstein, Adam
dc.date.accessioned: 2023-12-19T21:44:26Z
dc.date.available: 2023-12-19T21:44:26Z
dc.date.issued: 2022-07
dc.identifier.citation: Yuting Yang, Connelly Barnes, Andrew Adams, and Adam Finkelstein. 2022. Aδ: Autodiff for Discontinuous Programs – Applied to Shaders. ACM Trans. Graph. 41, 4, Article 135 (July 2022), 24 pages. https://doi.org/10.1145/3528223.3530125
dc.identifier.issn: 0730-0301
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr17659f7d
dc.description.abstract: Over the last decade, automatic differentiation (AD) has profoundly impacted graphics and vision applications, both broadly via deep learning and specifically for inverse rendering. Traditional AD methods ignore gradients at discontinuities, instead treating functions as continuous. Rendering algorithms intrinsically rely on discontinuities, crucial at object silhouettes and in general for any branching operation. Researchers have proposed fully automatic differentiation approaches for handling discontinuities by restricting to affine functions, or semi-automatic processes restricted either to invertible functions or to specialized applications like vector graphics. This paper describes a compiler-based approach to extend reverse-mode AD so as to accept arbitrary programs involving discontinuities. Our novel gradient rules generalize differentiation to work correctly, assuming there is a single discontinuity in a local neighborhood, by approximating the prefiltered gradient over a box kernel oriented along a 1D sampling axis. We describe when such approximation rules are first-order correct, and show that this correctness criterion applies to a relatively broad class of functions. Moreover, we show that the method is effective in practice for arbitrary programs, including features for which we cannot prove correctness. We evaluate this approach on procedural shader programs, where the task is to optimize unknown parameters in order to match a target image, and our method outperforms baselines in terms of both convergence and efficiency. Our compiler outputs gradient programs in TensorFlow, PyTorch (for quick prototypes) and Halide with an optional auto-scheduler (for efficiency). The compiler also outputs GLSL that renders the target image, allowing users to interactively modify and animate the shader, which would otherwise be cumbersome in other representations such as triangle meshes or vector art.
dc.language: en
dc.language.iso: en_US
dc.relation.ispartof: ACM Transactions on Graphics
dc.rights: Final published version. This is an open access article.
dc.subject: Automatic Differentiation, Differentiable Programming, Differentiable Rendering, Domain-Specific Language
dc.title: Aδ: Autodiff for Discontinuous Programs – Applied to Shaders
dc.type: Journal Article
dc.identifier.doi: 10.1145/3528223.3530125
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article
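
To make the gradient rule in the abstract concrete, here is a minimal sketch, not taken from the paper's compiler: it approximates the prefiltered derivative of a branch discontinuity over a box kernel along a 1D sampling axis, then uses that estimate in the kind of inverse task the paper evaluates (optimizing an unknown shader parameter to match a target). All names (shader, prefiltered_dtheta, the toy 1D setup, and the chosen kernel width) are illustrative assumptions, not the paper's API.

    # Minimal illustrative sketch (not the authors' compiler): approximate the
    # prefiltered gradient of a discontinuous program over a box kernel of
    # width eps, oriented along a 1D sampling axis, as the abstract describes.

    def shader(x, theta):
        """A toy 1D 'shader' with a branch discontinuity at x == theta."""
        return 1.0 if x > theta else 0.0

    def prefiltered_dtheta(f, x, theta, eps):
        """Approximate d/dtheta of f prefiltered over a box kernel of width eps.

        Convolving the step with the box kernel turns it into a ramp with a
        finite slope; for a single discontinuity inside the window, a
        two-sample difference along the sampling axis recovers that slope.
        """
        return -(f(x + eps / 2.0, theta) - f(x - eps / 2.0, theta)) / eps

    # Ordinary AD sees a piecewise-constant function and reports gradient 0
    # almost everywhere; the prefiltered rule instead reports the Dirac
    # delta's unit mass spread over the kernel width (-1/eps in the window).
    print(prefiltered_dtheta(shader, x=0.5, theta=0.5, eps=0.01))  # -100.0

    # Toy inverse task in the spirit of the paper's evaluation: recover an
    # unknown threshold so the shader matches a target "image" (a 1D row of
    # samples rendered with theta_true).
    N = 200
    xs = [(i + 0.5) / N for i in range(N)]
    theta_true = 0.6
    target = [shader(x, theta_true) for x in xs]

    theta, eps, lr = 0.3, 0.05, 1e-4
    for _ in range(50):
        # dL/dtheta for L = sum_i (f(x_i; theta) - target_i)^2, substituting
        # the prefiltered estimate for the (zero) pointwise AD gradient.
        grad = sum(2.0 * (shader(x, theta) - t)
                   * prefiltered_dtheta(shader, x, theta, eps)
                   for x, t in zip(xs, target))
        theta -= lr * grad
    print(round(theta, 3))  # converges near 0.6

The kernel width eps is a free parameter in this sketch: it trades bias against how many samples receive a nonzero gradient. The paper's rules are designed to be first-order correct precisely when a single discontinuity falls inside such a local window.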

Files in This Item:
AutodiffDiscontinuousProgramsAppliedToShaders.pdf (5.67 MB, Adobe PDF)

