Machine Learning

A generalization of the Jensen divergence: The chord gap divergence



  • arXiv

    We introduce a novel family of distances, called the chord gap divergences, that generalizes the Jensen divergences (also called the Burbea-Rao distances), and study its properties. This family yields a generalization of the celebrated statistical Bhattacharyya distance that is frequently met in applications. We report an iterative concave-convex procedure for computing centroids, and analyze the performance of $k$-means++ clustering with respect to that new dissimilarity measure by introducing the Taylor-Lagrange remainder form of the skew Jensen divergences.

    A generalization of the Jensen divergence: The chord gap divergence
    by Frank Nielsen
    https://arxiv.org/pdf/1709.10498v1.pdf
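
    For readers new to this family, here is a minimal sketch in plain Python/NumPy of the skew Jensen (Burbea-Rao) divergence that the chord gap divergence generalizes, together with k-means++-style D² seeding driven by an arbitrary dissimilarity. The function names, the negative-entropy generator, and the seeding routine are illustrative assumptions on our part, not code or notation taken from the paper; the chord gap divergence itself is not implemented here.

    ```python
    import numpy as np

    def skew_jensen_divergence(F, p, q, alpha=0.5):
        """Skew Jensen (Burbea-Rao) divergence for a convex generator F.

        J_F^alpha(p, q) = alpha*F(p) + (1 - alpha)*F(q) - F(alpha*p + (1 - alpha)*q),
        which is non-negative for 0 < alpha < 1 by Jensen's inequality.
        """
        return alpha * F(p) + (1 - alpha) * F(q) - F(alpha * p + (1 - alpha) * q)

    def kmeans_pp_seeding(X, k, div, seed=None):
        """k-means++-style seeding (D^2 sampling) under an arbitrary dissimilarity `div`."""
        rng = np.random.default_rng(seed)
        centers = [X[rng.integers(len(X))]]
        for _ in range(k - 1):
            # Dissimilarity of each point to its closest already-chosen center.
            gaps = np.array([min(div(x, c) for c in centers) for x in X])
            # Sample the next center proportionally to these gaps.
            probs = gaps / gaps.sum()
            centers.append(X[rng.choice(len(X), p=probs)])
        return np.array(centers)

    # Example generator: negative Shannon entropy on the probability simplex.
    neg_entropy = lambda x: float(np.sum(x * np.log(x)))

    p = np.array([0.6, 0.3, 0.1])
    q = np.array([0.2, 0.5, 0.3])
    # With this generator and alpha = 1/2, the value equals the Jensen-Shannon divergence (in nats).
    print(skew_jensen_divergence(neg_entropy, p, q, alpha=0.5))

    # Seed 2 clusters among a few simplex points using the divergence above as the dissimilarity.
    X = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.1, 0.1, 0.8], [0.3, 0.3, 0.4]])
    centers = kmeans_pp_seeding(X, k=2, div=lambda a, b: skew_jensen_divergence(neg_entropy, a, b), seed=0)
    print(centers)
    ```

    The sketch only illustrates the baseline Jensen divergence and generic D²-style seeding; the paper's contribution is the chord gap generalization, its CCCP centroid algorithm, and the k-means++ analysis via the Taylor-Lagrange remainder form.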
