## Abstract

A wide variety of distortion functions are used for clustering, e.g., squared Euclidean distance, Mahalanobis distance, and relative entropy. In this paper, we propose and analyze parametric hard and soft clustering algorithms based on a large class of distortion functions known as Bregman divergences. The proposed algorithms unify centroid-based parametric clustering approaches, such as classical k-means and information-theoretic clustering, which arise as special choices of the Bregman divergence. The algorithms maintain the simplicity and scalability of the classical k-means algorithm while generalizing the basic idea to a very large class of clustering loss functions. This paper makes two main contributions. First, we pose the hard clustering problem in terms of minimizing the loss in Bregman information, a quantity motivated by rate-distortion theory, and present an algorithm to minimize this loss. Second, we show an explicit bijection between Bregman divergences and exponential families. The bijection enables an alternative interpretation of an efficient EM scheme for learning models involving mixtures of exponential family distributions. This leads to a simple soft clustering algorithm for all Bregman divergences.
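The hard clustering algorithm described above has the same alternating structure as k-means: only the assignment step depends on the chosen Bregman divergence, while the optimal cluster representative is always the arithmetic mean of the assigned points. The following sketch illustrates this (it is an illustrative implementation, not the authors' code; the function names, the `init` parameter, and the convergence check are assumptions):

```python
import numpy as np

def squared_euclidean(x, mu):
    # Bregman divergence of phi(x) = ||x||^2: recovers classical k-means.
    return np.sum((x - mu) ** 2, axis=-1)

def kl_divergence(x, mu):
    # Relative entropy (Bregman divergence of the negative entropy),
    # defined here for points on the probability simplex.
    return np.sum(x * np.log(x / mu), axis=-1)

def bregman_hard_cluster(X, k, divergence, n_iter=100, init=None, seed=0):
    """Bregman hard clustering: a k-means-style alternation in which only
    the assignment step uses the divergence; each cluster's representative
    is the arithmetic mean of its members."""
    rng = np.random.default_rng(seed)
    idx = init if init is not None else rng.choice(len(X), size=k, replace=False)
    centers = X[np.asarray(idx)]
    for _ in range(n_iter):
        # Assignment step: nearest center under the Bregman divergence.
        d = np.stack([divergence(X, c) for c in centers])  # shape (k, n)
        labels = np.argmin(d, axis=0)
        # Update step: the cluster mean minimizes the expected divergence.
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

Passing `squared_euclidean` reproduces ordinary k-means; passing `kl_divergence` on simplex-valued data gives an information-theoretic clustering, with no change to the update step.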

| Original language | English (US) |
|---|---|
| Title of host publication | Proceedings of the Fourth SIAM International Conference on Data Mining |
| Editors | M.W. Berry, U. Dayal, C. Kamath, D. Skillicorn |
| Pages | 234-245 |
| Number of pages | 12 |
| State | Published - Jan 1 2004 |
| Event | Proceedings of the Fourth SIAM International Conference on Data Mining - Lake Buena Vista, FL, United States. Duration: Apr 22 2004 → Apr 24 2004 |

### Other

| Other | Proceedings of the Fourth SIAM International Conference on Data Mining |
|---|---|
| Country | United States |
| City | Lake Buena Vista, FL |
| Period | 4/22/04 → 4/24/04 |