---
res:
  bibo_abstract:
    - Dynamic tactile sensing is a fundamental ability to recognize materials
      and objects. However, while humans are born with partially developed
      dynamic tactile sensing and quickly master this skill, today's robots
      remain in their infancy. The development of such a sense requires not
      only better sensors but the right algorithms to deal with these sensors'
      data as well. For example, when classifying a material based on touch,
      the data are noisy, high-dimensional, and contain irrelevant signals as
      well as essential ones. Few classification methods from machine learning
      can deal with such problems. In this paper, we propose an efficient
      approach to infer suitable lower dimensional representations of the
      tactile data. In order to classify materials based on only the sense of
      touch, these representations are autonomously discovered using visual
      information of the surfaces during training. However, accurately pairing
      vision and tactile samples in real-robot applications is a difficult
      problem. The proposed approach, therefore, works with weak pairings
      between the modalities. Experiments show that the resulting approach is
      very robust and yields significantly higher classification performance
      based on only dynamic tactile sensing.@eng
  bibo_authorlist:
    - foaf_Person:
        foaf_givenName: Oliver
        foaf_name: Kroemer, Oliver
        foaf_surname: Kroemer
    - foaf_Person:
        foaf_givenName: Christoph
        foaf_name: Lampert, Christoph
        foaf_surname: Lampert
        foaf_workInfoHomepage: http://www.librecat.org/personId=40C20FD2-F248-11E8-B48F-1D18A9856A87
        orcid: 0000-0001-8622-7887
    - foaf_Person:
        foaf_givenName: Jan
        foaf_name: Peters, Jan
        foaf_surname: Peters
  bibo_doi: 10.1109/TRO.2011.2121130
  bibo_issue: '3'
  bibo_volume: 27
  dct_date: 2011^xs_gYear
  dct_language: eng
  dct_publisher: IEEE@
  dct_title: Learning dynamic tactile sensing with robust vision based training@
...