Dustin Tran

Computer Science Ph.D. Student at Columbia
dustin@cs.columbia.edu
Blog


I am a Ph.D. student in Computer Science at Columbia, where I am advised by David Blei and Andrew Gelman. I work in the fields of Bayesian statistics, machine learning, and deep learning. My research interests are in complex probabilistic models, general-purpose inference algorithms, and, more generally, the foundations of Bayesian analysis.

I am fortunate to be a member of the Blei Lab and the Stan development team. I recently transferred to Columbia from the Statistics Ph.D. program at Harvard, where I worked with Edo Airoldi and also spent time in the Harvard Intelligent Probabilistic Systems group. As an undergraduate, I studied at Berkeley, double majoring in mathematics and statistics.

Curriculum Vitae

Publications


Preprints

Some of my work is available as preprints on arXiv.

Automatic differentiation variational inference
An automated tool for black box variational inference, available in Stan.
Alp Kucukelbir, Dustin Tran, Rajesh Ranganath, Andrew Gelman, David M. Blei

Stochastic gradient descent methods for estimation with large data sets
Fast and statistically efficient algorithms for generalized linear models and M-estimation.
Dustin Tran, Panos Toulis, Edoardo M. Airoldi

2016

Hierarchical variational models
A Bayesian formalism for constructing expressive variational families.
Rajesh Ranganath, Dustin Tran, David M. Blei
International Conference on Machine Learning, 2016

Spectral M-estimation with application to hidden Markov models
Applying M-estimation for sample efficiency and robustness in moment-based estimators.
Dustin Tran, Minjae Kim, Finale Doshi-Velez
Artificial Intelligence and Statistics, 2016

Towards stability and optimality in stochastic gradient descent
A stochastic gradient method combining numerical stability and statistical efficiency.
Panos Toulis, Dustin Tran, Edoardo M. Airoldi
Artificial Intelligence and Statistics, 2016

The variational Gaussian process
A powerful variational model that can universally approximate any posterior.
Dustin Tran, Rajesh Ranganath, David M. Blei
International Conference on Learning Representations, 2016

2015

Copula variational inference
Posterior approximations using copulas, which find meaningful dependence between latent variables.
Dustin Tran, David M. Blei, Edoardo M. Airoldi
Neural Information Processing Systems, 2015

Software


Edward


Edward is a Python library for probabilistic modeling, inference, and criticism. It is a testbed for fast experimentation and research with probabilistic models, ranging from classical hierarchical models on small data sets to complex deep probabilistic models on large data sets. Check it out on GitHub.
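
As a flavor of the library, here is a minimal sketch that infers the mean of Gaussian data with variational inference. It is written against Edward's Python API, though names such as `Normal(loc, scale)` and `ed.KLqp` track later releases and may differ across versions.

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Normal

# Toy data: 50 draws from a Gaussian with unknown mean.
x_train = np.random.normal(3.0, 1.0, size=50).astype(np.float32)

# Model: mu ~ Normal(0, 1); x_n ~ Normal(mu, 1).
mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=1.0, sample_shape=50)

# Variational family: q(mu) is a Normal with free parameters.
qmu = Normal(loc=tf.Variable(0.0),
             scale=tf.nn.softplus(tf.Variable(0.0)))

# Minimize KL(q || p) with stochastic gradient variational inference.
inference = ed.KLqp({mu: qmu}, data={x: x_train})
inference.run(n_iter=500)
```

The pattern to note is the separation of concerns: the model is fixed, and the inference algorithm is a pluggable choice.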

Stan


Stan is a probabilistic programming language featuring fast, highly optimized inference algorithms. It supports a large class of models and has a user base of roughly 10,000. Alp Kucukelbir and I are developing its variational inference algorithms. I also work on algorithms for optimizing marginal distributions. Check it out on GitHub.
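
To give a sense of the workflow, here is a small sketch that runs ADVI on a toy Stan model from Python. The model is illustrative, and the sketch assumes PyStan's `StanModel.vb()` entry point to the variational algorithm.

```python
import pystan

# A toy Bayesian model for the mean and scale of Gaussian data.
model_code = """
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  mu ~ normal(0, 10);
  y ~ normal(mu, sigma);
}
"""

data = {"N": 5, "y": [2.1, 2.9, 3.3, 2.7, 3.0]}

# Compile the model once, then run ADVI rather than MCMC sampling.
sm = pystan.StanModel(model_code=model_code)
fit = sm.vb(data=data)
```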

sgd


We build large-scale estimation tools in R using stochastic gradient descent, available on CRAN. The library I'm developing with Panos Toulis includes a slew of stochastic gradient methods, built-in models, visualization tools, hypothesis tests, convergence diagnostics, and other cool stuff. Check it out on GitHub.
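
The package itself lives in R; as a language-neutral illustration of the kind of method it implements, here is a short Python sketch of the implicit stochastic gradient update for least squares. All names here are hypothetical, written for exposition rather than mirroring the package's API.

```python
import numpy as np

def implicit_sgd_least_squares(X, y, lr0=0.5):
    """One pass of implicit SGD for linear regression.

    The implicit update solves
        theta' = theta + gamma * (y_i - x_i' theta') * x_i
    for theta', which in the least-squares case has the closed form
    below; the shrinkage factor 1 / (1 + gamma * ||x_i||^2) is what
    keeps the update numerically stable.
    """
    n, p = X.shape
    theta = np.zeros(p)
    for i in range(n):
        x_i, y_i = X[i], y[i]
        gamma = lr0 / (1.0 + i)  # decaying Robbins-Monro learning rate
        resid = y_i - x_i.dot(theta)
        theta = theta + (gamma / (1.0 + gamma * x_i.dot(x_i))) * resid * x_i
    return theta

# Toy check: recover coefficients from simulated data.
rng = np.random.RandomState(0)
X = rng.normal(size=(10000, 3))
y = X.dot([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=10000)
print(implicit_sgd_least_squares(X, y))
```

The closed-form shrinkage is what distinguishes the implicit update from standard SGD: large or poorly scaled gradients are damped automatically instead of causing divergence.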

Talks


Edward: A library for probabilistic modeling, inference, and criticism
Variational models (slides)
Automating inference with Stan (slides)

My blog on statistics, probability, and machine learning