{"publisher":"ML Research Press","type":"conference","alternative_title":["PMLR"],"author":[{"last_name":"Wang","full_name":"Wang, Di","first_name":"Di"},{"full_name":"Fountoulakis, Kimon","first_name":"Kimon","last_name":"Fountoulakis"},{"last_name":"Henzinger","orcid":"0000-0002-5008-6530","id":"540c9bbd-f2de-11ec-812d-d04a5be85630","full_name":"Henzinger, Monika H","first_name":"Monika H"},{"first_name":"Michael W.","full_name":"Mahoney, Michael W.","last_name":"Mahoney"},{"first_name":" Satish","full_name":"Rao , Satish","last_name":"Rao "}],"language":[{"iso":"eng"}],"month":"09","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","oa_version":"Published Version","date_created":"2022-07-25T13:59:21Z","title":"Capacity releasing diffusion for speed and locality","quality_controlled":"1","date_published":"2017-09-01T00:00:00Z","extern":"1","conference":{"name":"International Conference on Machine Learning","location":"Sydney, Australia","start_date":"2017-08-06","end_date":"2017-08-11"},"volume":70,"page":"3598-3607","intvolume":" 70","main_file_link":[{"url":"http://proceedings.mlr.press/v70/wang17b/wang17b.pdf","open_access":"1"}],"date_updated":"2023-02-09T09:15:31Z","publication_identifier":{"eissn":["2640-3498"]},"publication_status":"published","article_processing_charge":"No","_id":"11651","citation":{"apa":"Wang, D., Fountoulakis, K., Henzinger, M. H., Mahoney, M. W., & Rao , Satish. (2017). Capacity releasing diffusion for speed and locality. In Proceedings of the 34th International Conference on Machine Learning (Vol. 70, pp. 3598–3607). Sydney, Australia: ML Research Press.","chicago":"Wang, Di, Kimon Fountoulakis, Monika H Henzinger, Michael W. Mahoney, and Satish Rao . “Capacity Releasing Diffusion for Speed and Locality.” In Proceedings of the 34th International Conference on Machine Learning, 70:3598–3607. ML Research Press, 2017.","short":"D. Wang, K. Fountoulakis, M.H. Henzinger, M.W. Mahoney, Satish Rao , in:, Proceedings of the 34th International Conference on Machine Learning, ML Research Press, 2017, pp. 3598–3607.","ista":"Wang D, Fountoulakis K, Henzinger MH, Mahoney MW, Rao Satish. 2017. Capacity releasing diffusion for speed and locality. Proceedings of the 34th International Conference on Machine Learning. International Conference on Machine Learning, PMLR, vol. 70, 3598–3607.","ieee":"D. Wang, K. Fountoulakis, M. H. Henzinger, M. W. Mahoney, and Satish Rao , “Capacity releasing diffusion for speed and locality,” in Proceedings of the 34th International Conference on Machine Learning, Sydney, Australia, 2017, vol. 70, pp. 3598–3607.","mla":"Wang, Di, et al. “Capacity Releasing Diffusion for Speed and Locality.” Proceedings of the 34th International Conference on Machine Learning, vol. 70, ML Research Press, 2017, pp. 3598–607.","ama":"Wang D, Fountoulakis K, Henzinger MH, Mahoney MW, Rao Satish. Capacity releasing diffusion for speed and locality. In: Proceedings of the 34th International Conference on Machine Learning. Vol 70. ML Research Press; 2017:3598-3607."},"publication":"Proceedings of the 34th International Conference on Machine Learning","year":"2017","external_id":{"arxiv":["1706.05826"]},"abstract":[{"text":"Diffusions and related random walk procedures are of central importance in many areas of machine learning, data analysis, and applied mathematics. Because they spread mass agnostically at each step in an iterative manner, they can sometimes spread mass “too aggressively,” thereby failing to find the “right” clusters. 
We introduce a novel Capacity Releasing Diffusion (CRD) Process, which both runs faster and stays more local than the classical spectral diffusion process. As an application, we use our CRD Process to develop an improved local algorithm for graph clustering. Our local graph clustering method can find local clusters in a model of clustering where one begins the CRD Process in a cluster whose vertices are connected better internally than externally by an O(log^2 n) factor, where n is the number of nodes in the cluster. Thus, our CRD Process is the first local graph clustering algorithm that is not subject to the well-known quadratic Cheeger barrier. Our result requires a certain smoothness condition, which we expect to be an artifact of our analysis. Our empirical evaluation demonstrates improved results, in particular for realistic social graphs where there are moderately good—but not very good—clusters.","lang":"eng"}],"day":"01","oa":1,"status":"public"}