My paper (together with Urs Köster, Jonathan Pillow and Jakob Macke), “Low-dimensional models of neural population activity in sensory cortical circuits”, was accepted to this year’s upcoming Neural Information Processing Systems (NIPS) conference.
Our Cosyne “Scalable Models” workshop exceeded all my expectations. We had neuroscientists packed to the… brim? Well. I don’t know what the best packing metaphor for neuroscientists is. But we sure had lots of them in a room listening to talks on high-dimensional neural data.
Two of my abstracts (together with Jakob Macke, Jonathan Pillow, Memming Park, and Kenneth Latimer) will appear as posters in this year’s COSYNE conference later this month in phantasmagorical Salt Lake City. If you’re around Salt Lake this year, do pop in.
I can tell you all about learning distributions of high-dimensional spike data (in Scalable nonparametric models for binary spike patterns) and learning low-dimensional dynamical models of large-scale neural recordings (in Low-dimensional models of neural population recordings with complex stimulus selectivity). Also swing by Snowbird and catch the Scalable Models workshop, which is sure to be a blast and a half.
And as ever, more information about the abstracts may be found over on the publications page.
Memming Park, Jonathan Pillow and I recently had our workshop proposal, Scalable Models for high-dimensional neural data, accepted to be part of the Computational and Systems Neuroscience (Cosyne) conference, which takes place in February 2014. You can check out the workshop site for more information.
- Bayesian entropy estimation for binary spike train data using parametric prior knowledge
(with Memming Park and Jonathan Pillow)
We formulate new Bayesian estimators for the entropy of binary spike trains, using priors designed to exploit the statistical structure of simultaneously-recorded spike responses. These estimators are computationally efficient, and show excellent performance on empirical data. This paper was selected for a Spotlight Presentation at the main conference!
- Universal models for binary spike patterns using centered Dirichlet processes
(with Memming Park, Kenneth Latimer, and Jonathan Pillow)
We propose a family of models (universal binary models, or UBMs) capable of describing arbitrary distributions over all binary spike patterns. Combined with a good choice of parametric “base measure”, universal models are flexible, parsimonious, and computationally efficient. In application to data, we show UBMs to be a promising tool for studying the statistical structure of large-scale neural populations.
- Spectral methods for neural characterization using generalized quadratic models
(with Memming Park, Nicholas Priebe, and Jonathan Pillow)
We introduce a new class of single-neuron models we call the Generalized Quadratic Model, or GQM. Though similar to the GLM, the GQM is closely related to dimensionality-reduction methods widely used in neuroscience (the STA and STC). A model-based framework, together with a few computational tricks based on a quantity known as the “expected log-likelihood”, allows us to derive fast inference methods for both spike and analog data, and (in the analog case) for experiments with non-Gaussian stimuli.
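To give a flavor of the entropy-estimation setting in the first abstract: a standard generic baseline is the posterior mean entropy under a symmetric Dirichlet prior over binary spike “words”, which has a closed form in terms of digamma functions. The sketch below shows only that generic baseline, not the structured priors from our paper, and the function names are hypothetical:

```python
import numpy as np
from scipy.special import digamma

def word_counts(spikes):
    """Count occurrences of each binary word in a (T, n_neurons) spike array."""
    spikes = np.asarray(spikes, dtype=int)
    # Encode each row (one time bin across neurons) as an integer word.
    words = spikes @ (1 << np.arange(spikes.shape[1]))
    return np.bincount(words, minlength=2 ** spikes.shape[1])

def dirichlet_entropy(counts, alpha=1.0):
    """Posterior mean entropy (in nats) under a symmetric Dirichlet(alpha)
    prior, using the closed-form expected entropy of a Dirichlet posterior."""
    a = np.asarray(counts, dtype=float) + alpha
    A = a.sum()
    return digamma(A + 1.0) - np.sum((a / A) * digamma(a + 1.0))
```

With plentiful, uniform data this estimate approaches log K for K possible words; the interesting regime for our work is the opposite one, where the number of possible words dwarfs the number of samples.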
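For the GQM abstract, the basic shape of the model is a rate of the form f(xᵀCx + bᵀx + a): a quadratic generalization of the GLM’s linear stimulus stage. A minimal sketch, assuming an exponential nonlinearity (names hypothetical):

```python
import numpy as np

def gqm_rate(x, C, b, a):
    """Firing rate of a toy GQM neuron: exp(x' C x + b' x + a).

    x : stimulus vector; C : quadratic filter matrix;
    b : linear filter; a : scalar offset.
    """
    x = np.asarray(x, dtype=float)
    return np.exp(x @ C @ x + b @ x + a)
```

Setting C to zero recovers an ordinary GLM with filter b, while a low-rank C corresponds to a small number of STC-like quadratic filters.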
For more information you can check out my publications page.
My paper (together with Memming Park and Jonathan Pillow), “Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data”, was published online yesterday by the journal Entropy. Entropy is open-access, so anyone can download the article directly from the journal’s website, free of charge.
The fun term “quasi-Bayesian” in the title refers to mutual information estimators which, despite being computed from Bayesian entropy estimators, are not themselves Bayesian. These estimators have some undesirable properties – for instance, they can return negative values, even though true mutual information never can. We introduce a new, “fully” Bayesian mutual information estimator and compare its performance to quasi-Bayesian estimators (among others), with some surprising results. A detailed summary is forthcoming on the Pillow Lab blog. Until then, why not pop yourself some popcorn and savor all the quasi-Bayesian goodness firsthand?
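To see how a quasi-Bayesian estimate can go negative, consider plugging three separate Bayesian entropy estimates into I = H(X) + H(Y) − H(X,Y). The sketch below uses a generic symmetric-Dirichlet posterior mean as the entropy estimator – a stand-in for illustration, not the estimators analyzed in the paper – and the names are hypothetical:

```python
import numpy as np
from scipy.special import digamma

def bayes_entropy(counts, alpha=1.0):
    """Posterior mean entropy (nats) under a symmetric Dirichlet(alpha) prior."""
    a = np.asarray(counts, dtype=float).ravel() + alpha
    A = a.sum()
    return digamma(A + 1.0) - np.sum((a / A) * digamma(a + 1.0))

def quasi_bayes_mi(joint_counts, alpha=1.0):
    """'Quasi-Bayesian' mutual information: three independent Bayesian entropy
    estimates combined as I = H(X) + H(Y) - H(X,Y). The combination is not
    itself a Bayesian estimate of I, and it can come out negative."""
    joint = np.asarray(joint_counts, dtype=float)
    hx = bayes_entropy(joint.sum(axis=1), alpha)
    hy = bayes_entropy(joint.sum(axis=0), alpha)
    hxy = bayes_entropy(joint, alpha)
    return hx + hy - hxy
```

With a single observation in a 4×4 joint table and α = 1, this returns roughly −0.21 nats: the prior over the 16 joint bins pulls H(X,Y) above H(X) + H(Y), even though mutual information itself is never negative.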
We’ve just uploaded the first code release of the Pitman-Yor Mixture (PYM) entropy estimator to GitHub. This is code from our recent NIPS paper “Bayesian estimation of discrete entropy with mixtures of stick-breaking priors”. You can find more details in a longer manuscript which recently appeared on the arXiv.
Two of my abstracts (together with Memming Park and Jonathan Pillow) were accepted to this year’s Computational and Systems Neuroscience (Cosyne) conference! In “Got a moment or two? Neural models and linear dimensionality reduction,” we propose an extension of the Generalized Linear Model (GLM) framework which integrates well-known methods for neural dimensionality reduction with a parametric model of neural responses. In “Semi-parametric Bayesian entropy estimation for binary spike trains,” we extend the work of our recent NIPS paper by using a simple model of spike counts as the “base measure” for a Dirichlet distribution.