{"date_created":"2018-12-11T12:01:32Z","conference":{"start_date":"2012-08-28","location":"Graz, Austria","end_date":"2012-08-31","name":"DAGM: German Association For Pattern Recognition"},"scopus_import":1,"status":"public","doi":"10.1007/978-3-642-32717-9_21","page":"205 - 215","publication_status":"published","day":"14","alternative_title":["LNCS"],"department":[{"_id":"ChLa"}],"month":"08","date_updated":"2021-01-12T07:41:14Z","user_id":"3E5EF7F0-F248-11E8-B48F-1D18A9856A87","publisher":"Springer","title":"Information theoretic clustering using minimal spanning trees","intvolume":" 7476","date_published":"2012-08-14T00:00:00Z","oa_version":"None","year":"2012","quality_controlled":"1","volume":7476,"language":[{"iso":"eng"}],"type":"conference","_id":"3126","publist_id":"3573","citation":{"apa":"Müller, A., Nowozin, S., & Lampert, C. (2012). Information theoretic clustering using minimal spanning trees (Vol. 7476, pp. 205–215). Presented at the DAGM: German Association For Pattern Recognition, Graz, Austria: Springer. https://doi.org/10.1007/978-3-642-32717-9_21","chicago":"Müller, Andreas, Sebastian Nowozin, and Christoph Lampert. “Information Theoretic Clustering Using Minimal Spanning Trees,” 7476:205–15. Springer, 2012. https://doi.org/10.1007/978-3-642-32717-9_21.","ieee":"A. Müller, S. Nowozin, and C. Lampert, “Information theoretic clustering using minimal spanning trees,” presented at the DAGM: German Association For Pattern Recognition, Graz, Austria, 2012, vol. 7476, pp. 205–215.","ista":"Müller A, Nowozin S, Lampert C. 2012. Information theoretic clustering using minimal spanning trees. DAGM: German Association For Pattern Recognition, LNCS, vol. 7476, 205–215.","ama":"Müller A, Nowozin S, Lampert C. Information theoretic clustering using minimal spanning trees. In: Vol 7476. Springer; 2012:205-215. doi:10.1007/978-3-642-32717-9_21","short":"A. Müller, S. Nowozin, C. Lampert, in:, Springer, 2012, pp. 205–215.","mla":"Müller, Andreas, et al. Information Theoretic Clustering Using Minimal Spanning Trees. Vol. 7476, Springer, 2012, pp. 205–15, doi:10.1007/978-3-642-32717-9_21."},"author":[{"first_name":"Andreas","full_name":"Müller, Andreas","last_name":"Müller"},{"last_name":"Nowozin","full_name":"Nowozin, Sebastian","first_name":"Sebastian"},{"id":"40C20FD2-F248-11E8-B48F-1D18A9856A87","orcid":"0000-0001-8622-7887","full_name":"Lampert, Christoph","last_name":"Lampert","first_name":"Christoph"}],"abstract":[{"lang":"eng","text":"In this work we propose a new information-theoretic clustering algorithm that infers cluster memberships by direct optimization of a non-parametric mutual information estimate between data distribution and cluster assignment. Although the optimization objective has a solid theoretical foundation it is hard to optimize. We propose an approximate optimization formulation that leads to an efficient algorithm with low runtime complexity. The algorithm has a single free parameter, the number of clusters to find. We demonstrate superior performance on several synthetic and real datasets."}]}