{"external_id":{"isi":["000540384500015"],"arxiv":["1812.01475"]},"publication_status":"published","oa":1,"language":[{"iso":"eng"}],"year":"2019","publisher":"IEEE","day":"01","_id":"7606","publication":"IEEE Information Theory Workshop, ITW 2019","main_file_link":[{"open_access":"1","url":"https://arxiv.org/abs/1812.01475"}],"article_number":"8989292","conference":{"name":"Information Theory Workshop","end_date":"2019-08-28","start_date":"2019-08-25","location":"Visby, Sweden"},"date_updated":"2024-03-06T14:22:51Z","doi":"10.1109/ITW44776.2019.8989292","title":"A tight upper bound on mutual information","status":"public","ec_funded":1,"user_id":"c635000d-4b10-11ee-a964-aac5a93f6ac1","article_processing_charge":"No","oa_version":"Preprint","date_created":"2020-03-22T23:00:47Z","date_published":"2019-08-01T00:00:00Z","month":"08","type":"conference","publication_identifier":{"isbn":["9781538669006"]},"abstract":[{"text":"We derive a tight lower bound on equivocation (conditional entropy), or equivalently a tight upper bound on mutual information between a signal variable and channel outputs. The bound is in terms of the joint distribution of the signals and maximum a posteriori decodes (most probable signals given channel output). As part of our derivation, we describe the key properties of the distribution of signals, channel outputs and decodes, that minimizes equivocation and maximizes mutual information. This work addresses a problem in data analysis, where mutual information between signals and decodes is sometimes used to lower bound the mutual information between signals and channel outputs. Our result provides a corresponding upper bound.","lang":"eng"}],"project":[{"name":"International IST Doctoral Program","call_identifier":"H2020","_id":"2564DBCA-B435-11E9-9278-68D0E5697425","grant_number":"665385"}],"scopus_import":"1","quality_controlled":"1","related_material":{"record":[{"status":"public","id":"15020","relation":"dissertation_contains"}]},"department":[{"_id":"GaTk"}],"author":[{"last_name":"Hledik","full_name":"Hledik, Michal","id":"4171253A-F248-11E8-B48F-1D18A9856A87","first_name":"Michal"},{"full_name":"Sokolowski, Thomas R","last_name":"Sokolowski","first_name":"Thomas R","orcid":"0000-0002-1287-3779","id":"3E999752-F248-11E8-B48F-1D18A9856A87"},{"full_name":"Tkačik, Gašper","last_name":"Tkačik","first_name":"Gašper","id":"3D494DCA-F248-11E8-B48F-1D18A9856A87","orcid":"0000-0002-6699-1455"}],"citation":{"short":"M. Hledik, T.R. Sokolowski, G. Tkačik, in:, IEEE Information Theory Workshop, ITW 2019, IEEE, 2019.","ama":"Hledik M, Sokolowski TR, Tkačik G. A tight upper bound on mutual information. In: IEEE Information Theory Workshop, ITW 2019. IEEE; 2019. doi:10.1109/ITW44776.2019.8989292","ista":"Hledik M, Sokolowski TR, Tkačik G. 2019. A tight upper bound on mutual information. IEEE Information Theory Workshop, ITW 2019. Information Theory Workshop, 8989292.","ieee":"M. Hledik, T. R. Sokolowski, and G. Tkačik, “A tight upper bound on mutual information,” in IEEE Information Theory Workshop, ITW 2019, Visby, Sweden, 2019.","mla":"Hledik, Michal, et al. “A Tight Upper Bound on Mutual Information.” IEEE Information Theory Workshop, ITW 2019, 8989292, IEEE, 2019, doi:10.1109/ITW44776.2019.8989292.","apa":"Hledik, M., Sokolowski, T. R., & Tkačik, G. (2019). A tight upper bound on mutual information. In IEEE Information Theory Workshop, ITW 2019. Visby, Sweden: IEEE. https://doi.org/10.1109/ITW44776.2019.8989292","chicago":"Hledik, Michal, Thomas R Sokolowski, and Gašper Tkačik. “A Tight Upper Bound on Mutual Information.” In IEEE Information Theory Workshop, ITW 2019. IEEE, 2019. https://doi.org/10.1109/ITW44776.2019.8989292."},"isi":1}