{"ec_funded":1,"acknowledgement":"This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant agreement No.\r\n101019564 “The Design of Modern Fully Dynamic Data Structures (MoDynStruct)” and from the Austrian Science Fund (FWF) project Z 422-N, and project “Fast Algorithms for a Reactive Network Layer (ReactNet)”, P 33775-N, with additional funding from the netidee SCIENCE Stiftung, 2020–2024. 2020–2024. JU’s research was funded by Decanal Research Grant. A part of this work was done when JU was visiting Indian Statistical Institute, Delhi. The authors would like to thank Rajat Bhatia, Aleksandar Nikolov, Shanta Laisharam, Vern Paulsen, Ryan Rogers, Abhradeep Thakurta, and Sarvagya Upadhyay for useful discussions.","project":[{"name":"The design and evaluation of modern fully dynamic data structures","_id":"bd9ca328-d553-11ed-ba76-dc4f890cfe62","grant_number":"101019564","call_identifier":"H2020"},{"name":"Wittgenstein Award - Monika Henzinger","_id":"34def286-11ca-11ed-8bc3-da5948e1613c","grant_number":"Z00422"},{"grant_number":"P33775 ","name":"Fast Algorithms for a Reactive Network Layer","_id":"bd9e3a2e-d553-11ed-ba76-8aa684ce17fe"}],"department":[{"_id":"MoHe"}],"oa_version":"Published Version","type":"conference","date_updated":"2023-10-31T09:54:05Z","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","conference":{"name":"ICML: International Conference on Machine Learning","start_date":"2023-07-23","end_date":"2023-07-29","location":"Honolulu, Hawaii, HI, United States"},"status":"public","_id":"14462","month":"07","date_created":"2023-10-29T23:01:17Z","alternative_title":["PMLR"],"publication_status":"published","publisher":"ML Research Press","publication":"Proceedings of the 40th International Conference on Machine Learning","citation":{"mla":"Fichtenberger, Hendrik, et al. “Constant Matters: Fine-Grained Error Bound on Differentially Private Continual Observation.” Proceedings of the 40th International Conference on Machine Learning, vol. 202, ML Research Press, 2023, pp. 10072–92.","ista":"Fichtenberger H, Henzinger MH, Upadhyay J. 2023. Constant matters: Fine-grained error bound on differentially private continual observation. Proceedings of the 40th International Conference on Machine Learning. ICML: International Conference on Machine Learning, PMLR, vol. 202, 10072–10092.","ieee":"H. Fichtenberger, M. H. Henzinger, and J. Upadhyay, “Constant matters: Fine-grained error bound on differentially private continual observation,” in Proceedings of the 40th International Conference on Machine Learning, Honolulu, Hawaii, HI, United States, 2023, vol. 202, pp. 10072–10092.","short":"H. Fichtenberger, M.H. Henzinger, J. Upadhyay, in:, Proceedings of the 40th International Conference on Machine Learning, ML Research Press, 2023, pp. 10072–10092.","chicago":"Fichtenberger, Hendrik, Monika H Henzinger, and Jalaj Upadhyay. “Constant Matters: Fine-Grained Error Bound on Differentially Private Continual Observation.” In Proceedings of the 40th International Conference on Machine Learning, 202:10072–92. ML Research Press, 2023.","apa":"Fichtenberger, H., Henzinger, M. H., & Upadhyay, J. (2023). Constant matters: Fine-grained error bound on differentially private continual observation. In Proceedings of the 40th International Conference on Machine Learning (Vol. 202, pp. 10072–10092). Honolulu, Hawaii, HI, United States: ML Research Press.","ama":"Fichtenberger H, Henzinger MH, Upadhyay J. 
Constant matters: Fine-grained error bound on differentially private continual observation. In: Proceedings of the 40th International Conference on Machine Learning. Vol 202. ML Research Press; 2023:10072-10092."},"intvolume":" 202","title":"Constant matters: Fine-grained error bound on differentially private continual observation","article_processing_charge":"No","publication_identifier":{"eissn":["2640-3498"]},"date_published":"2023-07-30T00:00:00Z","volume":202,"quality_controlled":"1","scopus_import":"1","abstract":[{"text":"We study fine-grained error bounds for differentially private algorithms for counting under continual observation. Our main insight is that the matrix mechanism when using lower-triangular matrices can be used in the continual observation model. More specifically, we give an explicit factorization for the counting matrix Mcount and upper bound the error explicitly. We also give a fine-grained analysis, specifying the exact constant in the upper bound. Our analysis is based on upper and lower bounds of the completely bounded norm (cb-norm) of Mcount\r\n. Along the way, we improve the best-known bound of 28 years by Mathias (SIAM Journal on Matrix Analysis and Applications, 1993) on the cb-norm of Mcount for a large range of the dimension of Mcount. Furthermore, we are the first to give concrete error bounds for various problems under continual observation such as binary counting, maintaining a histogram, releasing an approximately cut-preserving synthetic graph, many graph-based statistics, and substring and episode counting. Finally, we note that our result can be used to get a fine-grained error bound for non-interactive local learning and the first lower bounds on the additive error for (ϵ,δ)-differentially-private counting under continual observation. Subsequent to this work, Henzinger et al. (SODA, 2023) showed that our factorization also achieves fine-grained mean-squared error.","lang":"eng"}],"oa":1,"year":"2023","author":[{"first_name":"Hendrik","last_name":"Fichtenberger","full_name":"Fichtenberger, Hendrik"},{"id":"540c9bbd-f2de-11ec-812d-d04a5be85630","full_name":"Henzinger, Monika H","orcid":"0000-0002-5008-6530","last_name":"Henzinger","first_name":"Monika H"},{"full_name":"Upadhyay, Jalaj","first_name":"Jalaj","last_name":"Upadhyay"}],"language":[{"iso":"eng"}],"day":"30","page":"10072-10092","main_file_link":[{"url":"https://proceedings.mlr.press/v202/fichtenberger23a/fichtenberger23a.pdf","open_access":"1"}]}