Category: news

Scalable models workshop summary

Our Cosyne “Scalable Models” workshop exceeded all my expectations. We had neuroscientists packed to the… brim? Well. I don’t know what the best packing metaphor for neuroscientists is. But we sure had lots of them in a room listening to talks on high-dimensional neural data.

Memming and I just posted a summary to the workshop website. It includes an overview of all the talks, along with lots of references to techniques and ideas that will fill your brain to the… brim.

I’d also like to give a shout-out to Scott Linderman and Eric Jonas, two organizers of a closely related event that made the Cosyne workshops a two-day big-data party.

Two abstracts accepted to #cosyne14

Two of my abstracts (together with Jakob Macke, Jonathan Pillow, Memming Park, and Kenneth Latimer) will appear as posters at this year’s Cosyne conference later this month in phantasmagorical Salt Lake City. If you’re around Salt Lake, do pop in.

I can tell you all about learning distributions of high-dimensional spike data (in “Scalable nonparametric models for binary spike patterns”) and learning low-dimensional dynamical models of large-scale neural recordings (in “Low-dimensional models of neural population recordings with complex stimulus selectivity”). Also swing by Snowbird and catch the “Scalable Models” workshop, which is sure to be a blast and a half.

And as ever, more information about the abstracts may be found over on the publications page.

Three papers accepted to #NIPS2013

Three of my papers (together with Jonathan Pillow, Memming Park, Nicholas Priebe, and Kenneth Latimer) were accepted to this year’s Neural Information Processing Systems (NIPS) conference.

  • Bayesian entropy estimation for binary spike train data using parametric prior knowledge
    (with Memming Park and Jonathan Pillow)

    We formulate new Bayesian estimators for the entropy of binary spike trains, using priors designed to exploit the statistical structure of simultaneously-recorded spike responses. These estimators are computationally efficient, and show excellent performance on empirical data. This paper was selected for a Spotlight Presentation at the main conference!

  • Universal models for binary spike patterns using centered Dirichlet processes
    (with Memming Park, Kenneth Latimer, and Jonathan Pillow)

    We propose a family of models (universal binary models, or UBMs) capable of describing arbitrary distributions over all 2^m binary spike patterns. Combined with a good choice of parametric “base measure”, universal models are flexible, parsimonious, and computationally efficient. In application to data, we show UBMs to be a promising tool for studying the statistical structure of large-scale neural populations. (For a toy illustration of the binary spike-pattern setting, see the sketch just after this list.)

  • Spectral methods for neural characterization using generalized quadratic models
    (with Memming Park, Nicholas Priebe, and Jonathan Pillow)

    We introduce a new class of single-neuron models we call the Generalized Quadratic Model, or GQM. Though similar in form to the GLM, the GQM is also closely related to dimensionality-reduction methods widely used in neuroscience (the STA and STC). The model-based framework, together with a few computational tricks based on a quantity known as the “expected log-likelihood”, lets us derive fast inference methods for both spike and analog data, and (in the analog case) for experiments with non-Gaussian stimuli.
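
To make the binary spike-pattern setting of the first two papers concrete, here is a toy sketch in Python (made-up data, naive plug-in estimate) of how m simultaneously recorded neurons define 2^m possible spike words. The snippet only illustrates the representation; the Bayesian estimators and universal models in the papers are designed to do far better than this plug-in in the undersampled regime.

    import numpy as np

    # Toy data: T time bins of binary spikes from m neurons (made up).
    rng = np.random.default_rng(0)
    m, T = 10, 500
    spikes = (rng.random((T, m)) < 0.1).astype(int)   # T x m binary matrix

    # Each row is one of 2^m possible binary "spike words"; encode it as an integer.
    words = spikes @ (1 << np.arange(m))               # T integers in [0, 2^m)

    # Naive plug-in entropy of the spike-word distribution (in bits).
    _, counts = np.unique(words, return_counts=True)
    p = counts / counts.sum()
    print(f"plug-in entropy: {-np.sum(p * np.log2(p)):.2f} bits (at most {m})")

And for the third paper, a minimal sketch of the kind of model the GQM describes: a rate given by a quadratic-plus-linear function of the stimulus, passed through a nonlinearity and driving Poisson spiking. The parameter names (C, b, a) and the exponential nonlinearity are illustrative choices for this snippet, not necessarily the paper’s notation, and this is not the inference code.

    import numpy as np

    rng = np.random.default_rng(1)
    d = 20                                   # stimulus dimension
    C = 0.01 * rng.standard_normal((d, d))   # quadratic weights (illustrative)
    C = (C + C.T) / 2                        # symmetrize
    b = 0.1 * rng.standard_normal(d)         # linear weights
    a = -1.0                                 # offset

    def gqm_rate(x):
        """Quadratic-plus-linear stimulus drive through an exponential nonlinearity."""
        return np.exp(x @ C @ x + b @ x + a)

    x = rng.standard_normal(d)               # one stimulus frame
    spike_count = rng.poisson(gqm_rate(x))   # Poisson spiking given the GQM rate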

For more information you can check out my publications page.

New paper on Bayesian mutual information estimation

My paper (together with Memming Park and Jonathan Pillow), “Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data”, was published online yesterday by the journal Entropy. Entropy is open-access, so anyone can download the article directly from the journal’s website, free of charge.

The fun term “quasi-Bayesian” in the title refers to mutual information estimators which, despite being computed using Bayesian entropy estimators, are not themselves Bayesian. These estimators have some undesirable properties – for instance, they can be negative. We introduce a new, “fully” Bayesian mutual information estimator and compare its performance to quasi-Bayesian estimators (among others), with some surprising results. A detailed summary is forthcoming on the Pillow Lab blog. Until then, why not pop yourself some popcorn and savor all the quasi-Bayesian goodness firsthand?
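
For those who want the gist before the blog post appears: a quasi-Bayesian estimate plugs three separately computed entropy estimates into the identity I(X;Y) = H(X) + H(Y) - H(X,Y). Below is a toy Python sketch of that recipe, using a stand-in (Miller-Madow) entropy estimator rather than a Bayesian one; because nothing ties the three estimates together, the difference is not constrained to be non-negative, which is exactly the sort of undesirable behavior mentioned above.

    import numpy as np
    from collections import Counter

    def entropy_hat(samples):
        """Stand-in entropy estimator (Miller-Madow corrected plug-in, in bits).
        In the quasi-Bayesian recipe this slot is filled by a Bayesian entropy
        estimator; the construction is the same either way."""
        counts = np.array(list(Counter(samples).values()), dtype=float)
        p = counts / counts.sum()
        H = -np.sum(p * np.log2(p))
        return H + (len(counts) - 1) / (2 * len(samples) * np.log(2))

    def mi_quasi(x, y):
        """Plug-in decomposition I(X;Y) = H(X) + H(Y) - H(X,Y); not guaranteed >= 0."""
        return entropy_hat(x) + entropy_hat(y) - entropy_hat(list(zip(x, y)))

    rng = np.random.default_rng(0)
    x = rng.integers(0, 8, size=200).tolist()   # independent variables, so the
    y = rng.integers(0, 8, size=200).tolist()   # true mutual information is zero
    print(mi_quasi(x, y))                       # the estimate can come out negative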

PYM code now on GitHub

We’ve just uploaded the first code release of the Pitman-Yor Mixture (PYM) entropy estimator to GitHub. This is code from our recent NIPS paper “Bayesian estimation of discrete entropy with mixtures of stick-breaking priors”. You can find more details in a longer manuscript which recently appeared on the arXiv.

If you have some countable discrete distributions hanging around, you’re sure to have a blast estimating their entropy. You can browse the project page or directly download a zip file of the code.
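
If you do give it a spin: estimators of this kind work from a histogram of counts (equivalently, from the multiplicities, i.e. how many symbols were seen once, twice, and so on) rather than from the raw samples. Here is a toy Python snippet, not the released PYM code, showing that representation alongside the naive plug-in estimate PYM is meant to improve on.

    import numpy as np
    from collections import Counter

    # Toy samples from a discrete distribution (any hashable symbols will do).
    samples = ["the", "cat", "sat", "on", "the", "mat", "the", "cat"]

    # Histogram of counts: how many times each distinct symbol appeared.
    counts = np.array(sorted(Counter(samples).values(), reverse=True))
    print(counts)                            # [3 2 1 1 1]

    # Multiplicities ("counts of counts"): the compact summary count-based
    # estimators such as PYM operate on.
    print(dict(Counter(counts.tolist())))    # {3: 1, 2: 1, 1: 3}

    # Naive plug-in entropy in bits; badly biased when sampling is sparse,
    # which is the regime PYM is designed for.
    p = counts / counts.sum()
    print(-np.sum(p * np.log2(p)))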

Two abstracts accepted to Cosyne

Two of my abstracts (together with Memming Park and Jonathan Pillow) were accepted to this year’s Computational and Systems Neuroscience (Cosyne) conference! In “Got a moment or two? Neural models and linear dimensionality reduction,” we propose an extension of the Generalized Linear Model (GLM) framework which integrates well-known methods for neural dimensionality reduction with a parametric model of neural responses. In “Semi-parametric Bayesian entropy estimation for binary spike trains,” we extend the work of our recent NIPS paper by using a simple model of spike counts as the “base measure” for a Dirichlet distribution.
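
The best-known moment-based dimensionality-reduction tools in this setting are the spike-triggered average (STA) and covariance (STC). As a refresher, here is a toy Python sketch of computing them from simulated data; it is background for the first abstract, not the abstract’s method, which folds this kind of information into a parametric model.

    import numpy as np

    rng = np.random.default_rng(0)
    T, d = 5000, 12
    X = rng.standard_normal((T, d))              # stimulus in each time bin (rows)
    w = np.zeros(d)
    w[0] = 1.0                                   # one relevant stimulus direction (toy)
    spikes = rng.poisson(np.exp(-1.0 + X @ w))   # spike counts per bin

    n_sp = spikes.sum()
    sta = (spikes @ X) / n_sp                    # spike-triggered average
    Xc = X - sta                                 # center the stimuli on the STA
    stc = (Xc.T * spikes) @ Xc / n_sp            # spike-triggered covariance

    # Stimulus directions along which the spike-triggered ensemble differs from
    # the raw ensemble appear in the STA and in the eigenvectors of the STC.
    eigvals, eigvecs = np.linalg.eigh(stc)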

More information soon in a forthcoming update to the Pillow Lab blog, but until then you can check out the abstracts on my publications page.