I am a PhD student in computer science at Stanford University, advised by
Scott Linderman.
My recent interests include search, sequential Monte Carlo, and discrete latent variable models, especially as applied to tool use and alignment in LLMs.
I'm graduating this year and am on the industry job market. If you'd like to chat about research, please get in touch!
Email: dieterich.lawson@gmail.com
Selected Publications (full list)
SIXO: Smoothing Inference with Twisted Objectives
Dieterich Lawson*, Allan Raventós*, Andrew Warrington*, and Scott Linderman
Oral, NeurIPS 2022
Energy-Inspired Models: Learning with Sampler-Induced Distributions
Dieterich Lawson*, George Tucker*, Bo Dai, and Rajesh Ranganath
NeurIPS 2019
Doubly Reparameterized Gradient Estimators for Monte Carlo Objectives
George Tucker, Dieterich Lawson, Shixiang Gu, and Chris J. Maddison
ICLR 2019
Filtering Variational Objectives
Chris J. Maddison*, Dieterich Lawson*, George Tucker*, Nicolas Heess, Mohammad
Norouzi, Andriy Mnih, Arnaud Doucet, and Yee Whye Teh
NeurIPS 2017, Best Paper at ICML 2017 Deep Structured Prediction Workshop
REBAR: Low-Variance, Unbiased Gradient Estimates for Discrete Latent Variable Models
George Tucker, Andriy Mnih, Chris J. Maddison, Dieterich Lawson, and Jascha Sohl-Dickstein
Oral, NeurIPS 2017
Learning Hard Alignments with Variational Inference
Dieterich Lawson*, Chung-Cheng Chiu*, George Tucker*, Colin Raffel, Kevin Swersky, and Navdeep Jaitly
ICASSP 2018
Particle Value Functions
Chris J. Maddison, Dieterich Lawson, George Tucker, Nicolas Heess, Arnaud Doucet, Andriy Mnih, and Yee Whye Teh
ICLR Workshop Track 2017