Estimating Mutual Information for Discrete-Continuous Mixtures

Estimating mutual information from observed samples is a basic primitive, useful in several machine learning tasks including correlation mining, information bottleneck clustering, learning a Chow-Liu tree, and conditional independence testing in (causal) graphical models. While mutual information is a well-defined quantity in general probability spaces, existing estimators can only handle two special cases: purely discrete or purely continuous pairs of random variables. The main challenge is that these methods first estimate the (differential) entropies of X, Y, and the pair (X, Y) and add them up with appropriate signs to get an estimate of the mutual information. These 3H-estimators cannot be applied in general mixture spaces, where entropy is not well-defined. In this paper, we design a novel estimator for the mutual information of discrete-continuous mixtures. We prove that the proposed estimator is consistent. We provide numerical experiments suggesting that the proposed estimator outperforms two common heuristics: adding small continuous noise to all the samples and applying a standard estimator for purely continuous variables, or quantizing the samples and applying a standard estimator for purely discrete variables. This significantly widens the applicability of mutual information estimation in real-world applications, where some variables are discrete, some continuous, and others a mixture of continuous and discrete components.
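To make the idea concrete, here is a minimal sketch (not the authors' reference implementation) of a k-nearest-neighbor mutual information estimator in the spirit of the paper: when the k-th nearest-neighbor distance in the joint space is zero (a discrete tie), the estimator replaces k with the number of exactly coincident samples, so mixed discrete-continuous data is handled without adding noise or quantizing. The function name `mixed_mi` and the O(N^2) brute-force distance computation are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.special import digamma

def mixed_mi(x, y, k=3):
    """Sketch of a tie-aware k-NN mutual information estimate (in nats).

    For each sample i, find the k-th nearest-neighbor distance rho in the
    joint space (max norm). If rho == 0 (a discrete tie), replace k with
    the number of samples coinciding exactly with sample i. Then count
    neighbors within rho in each marginal space and average
        digamma(k_i) + log N - log(n_x + 1) - log(n_y + 1).
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    # Pairwise max-norm distances in each marginal and in the joint space.
    dx = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    dy = np.max(np.abs(y[:, None, :] - y[None, :, :]), axis=-1)
    dxy = np.maximum(dx, dy)
    total = 0.0
    for i in range(n):
        d = np.delete(dxy[i], i)          # distances to the other samples
        rho = np.sort(d)[k - 1]           # k-th nearest-neighbor distance
        if rho == 0:
            k_i = np.count_nonzero(d == 0)  # discrete tie: count coincident points
        else:
            k_i = k
        nx = np.count_nonzero(np.delete(dx[i], i) <= rho)
        ny = np.count_nonzero(np.delete(dy[i], i) <= rho)
        total += digamma(k_i) + np.log(n) - np.log(nx + 1) - np.log(ny + 1)
    return total / n

# Sanity check: for X = Y uniform on {0, 1}, the true MI is log 2.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000)
est = mixed_mi(x, x.copy())   # close to log 2 ~ 0.693 nats
```

Note that on purely continuous data the tie branch never fires and this reduces to a standard KSG-style estimate, which is what makes the approach attractive for mixed variables.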
by Weihao Gao, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
https://arxiv.org/pdf/1709.06212v1.pdf