My paper (together with Memming Park and Jonathan Pillow), “Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data”, was published online yesterday by the journal Entropy. Entropy is open-access, so anyone can download the article directly from the journal’s website, free of charge.
The fun term “quasi-Bayesian” in the title refers to mutual information estimators that, despite being computed from Bayesian entropy estimators, are not themselves Bayesian. These estimators have some undesirable properties: for instance, they can be negative, even though true mutual information never is. We introduce a new, “fully” Bayesian mutual information estimator and compare its performance to quasi-Bayesian estimators (among others), with some surprising results. A detailed summary is forthcoming on the Pillow Lab blog. Until then, why not pop yourself some popcorn and savor all the quasi-Bayesian goodness firsthand?
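To see how a quasi-Bayesian estimate can go negative, here is a minimal sketch of the general recipe: estimate each entropy in I(X;Y) = H(X) + H(Y) − H(X,Y) separately and take the difference. For simplicity this sketch uses a Dirichlet-smoothed plug-in entropy estimator as a stand-in for the more sophisticated Bayesian entropy estimators discussed in the paper (the function names and the toy contingency table below are my own, not from the paper):

```python
import numpy as np

def smoothed_entropy(counts, alpha=1.0):
    """Plug-in entropy of counts smoothed with a symmetric Dirichlet
    prior (pseudocount alpha per bin) -- a simple stand-in for a
    full Bayesian entropy estimator. Returns entropy in nats."""
    counts = np.asarray(counts, dtype=float).ravel()
    p = (counts + alpha) / (counts.sum() + alpha * counts.size)
    return -np.sum(p * np.log(p))

def quasi_bayesian_mi(joint_counts, alpha=1.0):
    """'Quasi-Bayesian' MI: plug three separately estimated entropies
    into I = H(X) + H(Y) - H(X,Y). Each entropy estimate is
    Bayesian-flavored, but their difference is not a Bayesian
    estimate of MI -- and it is not guaranteed to be nonnegative."""
    joint = np.asarray(joint_counts, dtype=float)
    hx = smoothed_entropy(joint.sum(axis=1), alpha)   # H(X) from row marginals
    hy = smoothed_entropy(joint.sum(axis=0), alpha)   # H(Y) from column marginals
    hxy = smoothed_entropy(joint, alpha)              # H(X,Y) from the joint table
    return hx + hy - hxy

# Toy 2x2 contingency table with only n = 4 samples: the joint table
# has more bins than the marginals, so smoothing inflates H(X,Y)
# relative to H(X) + H(Y), and the MI estimate comes out negative.
joint = [[2, 1], [1, 0]]
print(quasi_bayesian_mi(joint))   # approximately -0.048 nats
```

Because true mutual information is always nonnegative, a negative output like this immediately shows the estimator is not itself the posterior mean of MI under any sensible prior, which is the gap the fully Bayesian estimator in the paper is designed to close.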