{"issue":"Part 1","date_created":"2018-12-11T12:04:41Z","conference":{"name":"ECML: European Conference on Machine Learning"},"doi":"10.1007/978-3-540-87479-9_27","status":"public","page":"133 - 145","day":"21","alternative_title":["LNCS"],"publication_status":"published","date_updated":"2021-01-12T07:49:02Z","month":"10","publisher":"Springer","title":"Semi-supervised Laplacian regularization of kernel canonical correlation analysis","date_published":"2008-10-21T00:00:00Z","intvolume":"5211","extern":1,"year":"2008","volume":5211,"quality_controlled":0,"type":"conference","_id":"3698","citation":{"ista":"Blaschko M, Lampert C, Gretton A. 2008. Semi-supervised Laplacian regularization of kernel canonical correlation analysis. ECML: European Conference on Machine Learning, LNCS, vol. 5211, 133–145.","ieee":"M. Blaschko, C. Lampert, and A. Gretton, “Semi-supervised Laplacian regularization of kernel canonical correlation analysis,” presented at the ECML: European Conference on Machine Learning, 2008, vol. 5211, no. Part 1, pp. 133–145.","ama":"Blaschko M, Lampert C, Gretton A. Semi-supervised Laplacian regularization of kernel canonical correlation analysis. In: Vol 5211. Springer; 2008:133-145. doi:10.1007/978-3-540-87479-9_27","mla":"Blaschko, Matthew, et al. Semi-Supervised Laplacian Regularization of Kernel Canonical Correlation Analysis. Vol. 5211, no. Part 1, Springer, 2008, pp. 133–45, doi:10.1007/978-3-540-87479-9_27.","short":"M. Blaschko, C. Lampert, A. Gretton, in:, Springer, 2008, pp. 133–145.","apa":"Blaschko, M., Lampert, C., & Gretton, A. (2008). Semi-supervised Laplacian regularization of kernel canonical correlation analysis (Vol. 5211, pp. 133–145). Presented at the ECML: European Conference on Machine Learning, Springer. https://doi.org/10.1007/978-3-540-87479-9_27","chicago":"Blaschko, Matthew, Christoph Lampert, and Arthur Gretton. “Semi-Supervised Laplacian Regularization of Kernel Canonical Correlation Analysis,” 5211:133–45. Springer, 2008. https://doi.org/10.1007/978-3-540-87479-9_27."},"publist_id":"2662","author":[{"last_name":"Blaschko","full_name":"Blaschko, Matthew B","first_name":"Matthew"},{"last_name":"Lampert","full_name":"Lampert, Christoph","first_name":"Christoph","id":"40C20FD2-F248-11E8-B48F-1D18A9856A87","orcid":"0000-0001-8622-7887"},{"first_name":"Arthur","last_name":"Gretton","full_name":"Gretton, Arthur"}],"abstract":[{"text":"Kernel canonical correlation analysis (KCCA) is a dimensionality reduction technique for paired data. By finding directions that maximize correlation, KCCA learns representations that are more closely tied to the underlying semantics of the data than to noise. However, meaningful directions are not only those that have high correlation with another modality, but also those that capture the manifold structure of the data. We propose a method that simultaneously finds highly correlated directions that also lie along high-variance directions of the data manifold. This is achieved by semi-supervised Laplacian regularization of KCCA. We show experimentally that Laplacian-regularized training improves class separation over KCCA with only Tikhonov regularization, while causing no degradation in the correlation between modalities. We propose a model selection criterion based on the Hilbert-Schmidt norm of the semi-supervised Laplacian-regularized cross-covariance operator, which we compute in closed form.","lang":"eng"}]}