{"issue":"1","doi":"10.1073/pnas.1711114115","author":[{"id":"2BAAC544-F248-11E8-B48F-1D18A9856A87","first_name":"Matthew J","last_name":"Chalk","orcid":"0000-0001-7782-4436","full_name":"Chalk, Matthew J"},{"full_name":"Marre, Olivier","last_name":"Marre","first_name":"Olivier"},{"last_name":"Tkacik","orcid":"0000-0002-6699-1455","full_name":"Tkacik, Gasper","id":"3D494DCA-F248-11E8-B48F-1D18A9856A87","first_name":"Gasper"}],"project":[{"_id":"254D1A94-B435-11E9-9278-68D0E5697425","name":"Sensitivity to higher-order statistics in natural scenes","grant_number":"P 25651-N26","call_identifier":"FWF"}],"external_id":{"isi":["000419128700049"]},"department":[{"_id":"GaTk"}],"_id":"543","article_processing_charge":"No","day":"02","oa":1,"date_updated":"2023-09-19T10:16:35Z","publisher":"National Academy of Sciences","date_published":"2018-01-02T00:00:00Z","scopus_import":"1","user_id":"c635000d-4b10-11ee-a964-aac5a93f6ac1","language":[{"iso":"eng"}],"title":"Toward a unified theory of efficient, predictive, and sparse coding","abstract":[{"lang":"eng","text":"A central goal in theoretical neuroscience is to predict the response properties of sensory neurons from first principles. To this end, “efficient coding” posits that sensory neurons encode maximal information about their inputs given internal constraints. There exist, however, many variants of efficient coding (e.g., redundancy reduction, different formulations of predictive coding, robust coding, sparse coding, etc.), differing in their regimes of applicability, in the relevance of signals to be encoded, and in the choice of constraints. It is unclear how these types of efficient coding relate or what is expected when different coding objectives are combined. Here we present a unified framework that encompasses previously proposed efficient coding models and extends to unique regimes. We show that optimizing neural responses to encode predictive information can lead them to either correlate or decorrelate their inputs, depending on the stimulus statistics; in contrast, at low noise, efficiently encoding the past always predicts decorrelation. Later, we investigate coding of naturalistic movies and show that qualitatively different types of visual motion tuning and levels of response sparsity are predicted, depending on whether the objective is to recover the past or predict the future. Our approach promises a way to explain the observed diversity of sensory neural responses, as due to multiple functional goals and constraints fulfilled by different cell types and/or circuits."}],"intvolume":" 115","isi":1,"quality_controlled":"1","oa_version":"Submitted Version","main_file_link":[{"url":"https://doi.org/10.1101/152660 ","open_access":"1"}],"publication_status":"published","date_created":"2018-12-11T11:47:04Z","status":"public","publication":"PNAS","publist_id":"7273","month":"01","type":"journal_article","citation":{"ieee":"M. J. Chalk, O. Marre, and G. Tkačik, “Toward a unified theory of efficient, predictive, and sparse coding,” PNAS, vol. 115, no. 1. National Academy of Sciences, pp. 186–191, 2018.","apa":"Chalk, M. J., Marre, O., & Tkačik, G. (2018). Toward a unified theory of efficient, predictive, and sparse coding. PNAS. National Academy of Sciences. https://doi.org/10.1073/pnas.1711114115","ista":"Chalk MJ, Marre O, Tkačik G. 2018. Toward a unified theory of efficient, predictive, and sparse coding. PNAS. 115(1), 186–191.","mla":"Chalk, Matthew J., et al. 
“Toward a Unified Theory of Efficient, Predictive, and Sparse Coding.” PNAS, vol. 115, no. 1, National Academy of Sciences, 2018, pp. 186–91, doi:10.1073/pnas.1711114115.","ama":"Chalk MJ, Marre O, Tkačik G. Toward a unified theory of efficient, predictive, and sparse coding. PNAS. 2018;115(1):186-191. doi:10.1073/pnas.1711114115","chicago":"Chalk, Matthew J, Olivier Marre, and Gašper Tkačik. “Toward a Unified Theory of Efficient, Predictive, and Sparse Coding.” PNAS. National Academy of Sciences, 2018. https://doi.org/10.1073/pnas.1711114115.","short":"M.J. Chalk, O. Marre, G. Tkačik, PNAS 115 (2018) 186–191."},"year":"2018","volume":115,"page":"186 - 191"}