---
OA_place: publisher
OA_type: diamond
_id: '20298'
abstract:
- lang: eng
  text: "In this paper, we study the problem of estimating the unknown mean θ of a
    unit-variance Gaussian distribution in a locally differentially private (LDP)
    way. In the high-privacy regime (ϵ ≤ 1), we identify an optimal privacy mechanism
    that asymptotically minimizes the variance of the estimator. Our main technical
    contribution is the maximization of the Fisher information of the sanitized data
    with respect to the local privacy mechanism Q. We find that the exact solution
    Q_{θ,ϵ} of this maximization is the sign mechanism, which applies randomized response
    to the sign of X_i − θ, where X_1, …, X_n are the confidential i.i.d. original
    samples. However, since this optimal local mechanism depends on the unknown mean
    θ, we employ a two-stage LDP parameter estimation procedure which requires splitting
    the agents into two groups. The first n_1 observations are used to estimate the
    parameter θ consistently, but not necessarily efficiently, by θ̃_{n_1}. This estimate
    is then updated by applying the sign mechanism with θ̃_{n_1} in place of θ to the
    remaining n − n_1 observations, yielding an LDP and efficient estimator of the
    unknown mean."
acknowledgement: "We would like to express our gratitude to Christoph Lampert for
  his valuable insights and fruitful discussions that significantly contributed to
  the development of this paper. We also thank Salil Vadhan for his constructive
  feedback on an earlier version of this draft. The second author gratefully acknowledges
  support by the Austrian Science Fund (FWF): I 5484-N, as part of the Research Unit
  5381 of the German Research Foundation."
alternative_title:
- PMLR
article_processing_charge: No
arxiv: 1
author:
- first_name: Nikita
  full_name: Kalinin, Nikita
  id: 4b14526e-14d2-11ed-ba64-c14c9553d137
  last_name: Kalinin
- first_name: Lukas
  full_name: Steinberger, Lukas
  last_name: Steinberger
citation:
  ama: 'Kalinin N, Steinberger L. Efficient estimation of a Gaussian mean with local
    differential privacy. In: <i>Proceedings of the 28th International Conference
    on Artificial Intelligence and Statistics</i>. Vol 258. ML Research Press; 2025:118-126.'
  apa: 'Kalinin, N., &#38; Steinberger, L. (2025). Efficient estimation of a Gaussian
    mean with local differential privacy. In <i>Proceedings of the 28th International
    Conference on Artificial Intelligence and Statistics</i> (Vol. 258, pp. 118–126).
    Mai Khao, Thailand: ML Research Press.'
  chicago: Kalinin, Nikita, and Lukas Steinberger. “Efficient Estimation of a Gaussian
    Mean with Local Differential Privacy.” In <i>Proceedings of the 28th International
    Conference on Artificial Intelligence and Statistics</i>, 258:118–26. ML Research
    Press, 2025.
  ieee: N. Kalinin and L. Steinberger, “Efficient estimation of a Gaussian mean with
    local differential privacy,” in <i>Proceedings of the 28th International Conference
    on Artificial Intelligence and Statistics</i>, Mai Khao, Thailand, 2025, vol.
    258, pp. 118–126.
  ista: 'Kalinin N, Steinberger L. 2025. Efficient estimation of a Gaussian mean with
    local differential privacy. Proceedings of the 28th International Conference on
    Artificial Intelligence and Statistics. AISTATS: Conference on Artificial Intelligence
    and Statistics, PMLR, vol. 258, 118–126.'
  mla: Kalinin, Nikita, and Lukas Steinberger. “Efficient Estimation of a Gaussian
    Mean with Local Differential Privacy.” <i>Proceedings of the 28th International
    Conference on Artificial Intelligence and Statistics</i>, vol. 258, ML Research
    Press, 2025, pp. 118–26.
  short: N. Kalinin, L. Steinberger, in:, Proceedings of the 28th International Conference
    on Artificial Intelligence and Statistics, ML Research Press, 2025, pp. 118–126.
conference:
  end_date: 2025-05-05
  location: Mai Khao, Thailand
  name: 'AISTATS: Conference on Artificial Intelligence and Statistics'
  start_date: 2025-05-03
corr_author: '1'
date_created: 2025-09-07T22:01:34Z
date_published: 2025-05-01T00:00:00Z
date_updated: 2025-09-09T08:28:41Z
day: '01'
ddc:
- '000'
department:
- _id: ChLa
external_id:
  arxiv:
  - '2402.04840'
file:
- access_level: open_access
  checksum: 3dcd59988ca974b98662ba09a516e616
  content_type: application/pdf
  creator: dernst
  date_created: 2025-09-09T08:26:44Z
  date_updated: 2025-09-09T08:26:44Z
  file_id: '20316'
  file_name: 2025_AISTATS_Kalinin.pdf
  file_size: 395864
  relation: main_file
  success: 1
file_date_updated: 2025-09-09T08:26:44Z
has_accepted_license: '1'
intvolume: '258'
language:
- iso: eng
month: '05'
oa: 1
oa_version: Published Version
page: 118-126
publication: Proceedings of the 28th International Conference on Artificial Intelligence
  and Statistics
publication_identifier:
  eissn:
  - 2640-3498
publication_status: published
publisher: ML Research Press
quality_controlled: '1'
scopus_import: '1'
status: public
title: Efficient estimation of a Gaussian mean with local differential privacy
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 258
year: '2025'
...
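The record above describes a two-stage LDP procedure: a consistent first-stage estimate θ̃ from the first n_1 samples, then randomized response on sign(X_i − θ̃) for the rest, debiased and inverted through the Gaussian CDF. The following is a minimal illustrative sketch, not the paper's implementation: the Laplace-based first stage, the clipping bound `clip`, and the function names `sign_mechanism` / `two_stage_mean` are all assumptions made for the example.

```python
import math
import random
from statistics import NormalDist

def sign_mechanism(x, theta, eps, rng):
    """Randomized response on sign(x - theta): keep the true sign
    with probability e^eps / (1 + e^eps), flip otherwise."""
    p = math.exp(eps) / (1.0 + math.exp(eps))
    s = 1.0 if x >= theta else -1.0
    return s if rng.random() < p else -s

def two_stage_mean(xs, eps, rng, frac=0.5, clip=10.0):
    """Two-stage LDP estimate of a unit-variance Gaussian mean.
    Stage 1 (first n1 samples): Laplace mechanism on clipped values,
    consistent but not efficient (the bound `clip` is assumed so the
    sensitivity 2*clip is finite).
    Stage 2: sign mechanism around the stage-1 estimate, then invert
    P(X > theta1) = Phi(theta - theta1) to recover theta."""
    n1 = int(frac * len(xs))
    def lap(scale):  # sample Laplace(0, scale) by inverse CDF
        u = rng.random() - 0.5
        return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    noisy = [max(-clip, min(clip, x)) + lap(2.0 * clip / eps) for x in xs[:n1]]
    theta1 = sum(noisy) / n1
    # debias randomized response to estimate q = P(X > theta1)
    p = math.exp(eps) / (1.0 + math.exp(eps))
    n2 = len(xs) - n1
    s_bar = sum(sign_mechanism(x, theta1, eps, rng) for x in xs[n1:]) / n2
    q = (s_bar / (2.0 * p - 1.0) + 1.0) / 2.0
    q = min(max(q, 1e-9), 1.0 - 1e-9)  # guard the CDF inversion
    return theta1 + NormalDist().inv_cdf(q)
```

Note that the stage-2 inversion is exact for any first-stage value θ̃ (since P(X > θ̃) = Φ(θ − θ̃) for a unit-variance Gaussian); a consistent θ̃ is only needed so that the sign mechanism is evaluated near its Fisher-information optimum.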
---
OA_place: publisher
OA_type: gold
_id: '18875'
abstract:
- lang: eng
  text: Current state-of-the-art methods for differentially private model training
    are based on matrix factorization techniques. However, these methods suffer from
    high computational overhead because they require numerically solving a demanding
    optimization problem to determine an approximately optimal factorization prior
    to the actual model training. In this work, we present a new matrix factorization
    approach, BSR, which overcomes this computational bottleneck. By exploiting properties
    of the standard matrix square root, BSR can efficiently handle even large-scale
    problems. For the key scenario of stochastic gradient descent with momentum and
    weight decay, we even derive analytical expressions for BSR that render the computational
    overhead negligible. We prove bounds on the approximation quality that hold both
    in the centralized and in the federated learning setting. Our numerical experiments
    demonstrate that models trained using BSR perform on par with the best existing
    methods, while completely avoiding their computational overhead.
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Nikita
  full_name: Kalinin, Nikita
  id: 4b14526e-14d2-11ed-ba64-c14c9553d137
  last_name: Kalinin
- first_name: Christoph
  full_name: Lampert, Christoph
  id: 40C20FD2-F248-11E8-B48F-1D18A9856A87
  last_name: Lampert
  orcid: 0000-0001-8622-7887
citation:
  ama: 'Kalinin N, Lampert C. Banded square root matrix factorization for differentially
    private model training. In: <i>38th Annual Conference on Neural Information Processing
    Systems</i>. Vol 37. Neural Information Processing Systems Foundation; 2024.'
  apa: 'Kalinin, N., &#38; Lampert, C. (2024). Banded square root matrix factorization
    for differentially private model training. In <i>38th Annual Conference on Neural
    Information Processing Systems</i> (Vol. 37). Vancouver, Canada: Neural Information
    Processing Systems Foundation.'
  chicago: Kalinin, Nikita, and Christoph Lampert. “Banded Square Root Matrix Factorization
    for Differentially Private Model Training.” In <i>38th Annual Conference on Neural
    Information Processing Systems</i>, Vol. 37. Neural Information Processing Systems
    Foundation, 2024.
  ieee: N. Kalinin and C. Lampert, “Banded square root matrix factorization for differentially
    private model training,” in <i>38th Annual Conference on Neural Information Processing
    Systems</i>, Vancouver, Canada, 2024, vol. 37.
  ista: 'Kalinin N, Lampert C. 2024. Banded square root matrix factorization for differentially
    private model training. 38th Annual Conference on Neural Information Processing
    Systems. NeurIPS: Neural Information Processing Systems, Advances in Neural Information
    Processing Systems, vol. 37.'
  mla: Kalinin, Nikita, and Christoph Lampert. “Banded Square Root Matrix Factorization
    for Differentially Private Model Training.” <i>38th Annual Conference on Neural
    Information Processing Systems</i>, vol. 37, Neural Information Processing Systems
    Foundation, 2024.
  short: N. Kalinin, C. Lampert, in:, 38th Annual Conference on Neural Information
    Processing Systems, Neural Information Processing Systems Foundation, 2024.
conference:
  end_date: 2024-12-16
  location: Vancouver, Canada
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2024-12-16
corr_author: '1'
date_created: 2025-01-24T17:58:16Z
date_published: 2024-12-01T00:00:00Z
date_updated: 2025-05-14T11:34:20Z
day: '01'
ddc:
- '000'
department:
- _id: GradSch
- _id: ChLa
external_id:
  arxiv:
  - '2405.13763'
file:
- access_level: open_access
  checksum: a216cab8eddc1fe7840aede0e2c0d41e
  content_type: application/pdf
  creator: dernst
  date_created: 2025-01-27T09:52:15Z
  date_updated: 2025-01-27T09:52:15Z
  file_id: '18888'
  file_name: 2024_NeurIPS_Nikita.pdf
  file_size: 1144656
  relation: main_file
  success: 1
file_date_updated: 2025-01-27T09:52:15Z
has_accepted_license: '1'
intvolume: '37'
language:
- iso: eng
month: '12'
oa: 1
oa_version: Published Version
publication: 38th Annual Conference on Neural Information Processing Systems
publication_identifier:
  eissn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
scopus_import: '1'
status: public
title: Banded square root matrix factorization for differentially private model training
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
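The second record describes a matrix factorization A ≈ BC built from the matrix square root of the workload matrix. As a rough illustrative sketch only (the factor choices and the name `bsr_factor` are assumptions, not the paper's exact construction): for plain prefix sums, A is the lower-triangular all-ones matrix, and sqrt(A) is Toeplitz with the Taylor coefficients of (1 − x)^(−1/2), so it is available in closed form. Banding sqrt(A) to bandwidth b gives one factor B, and C = B⁻¹A keeps the product exact.

```python
import math

def sqrt_coeffs(n):
    """Taylor coefficients of (1 - x)^(-1/2): the square root of the
    lower-triangular all-ones (prefix-sum) matrix is Toeplitz with
    these entries along its diagonals. c_k = binom(2k, k) / 4^k."""
    c = [1.0]
    for k in range(1, n):
        c.append(c[-1] * (2 * k - 1) / (2 * k))
    return c

def bsr_factor(n, b):
    """Sketch of a banded square-root factorization A = B C:
    B is sqrt(A) truncated to bandwidth b, and C = B^{-1} A is the
    (computed-once) complement that makes the product exact.
    Plain-Python lists for clarity; a real implementation would use
    dense or Toeplitz linear algebra."""
    c = sqrt_coeffs(n)
    # B: banded lower-triangular Toeplitz factor, unit diagonal (c[0] = 1)
    B = [[c[i - j] if 0 <= i - j < b else 0.0 for j in range(n)]
         for i in range(n)]
    # A: the prefix-sum workload matrix
    A = [[1.0 if j <= i else 0.0 for j in range(n)] for i in range(n)]
    # C = B^{-1} A, column by column via forward substitution
    C = [[0.0] * n for _ in range(n)]
    for col in range(n):
        for i in range(n):
            s = A[i][col] - sum(B[i][j] * C[j][col]
                                for j in range(max(0, i - b + 1), i))
            C[i][col] = s / B[i][i]
    return B, C
```

With b = n the banding is a no-op and B·B reproduces A exactly, since the coefficient sequence convolves with itself to the all-ones sequence; smaller b trades a little approximation quality in B for the banded structure that keeps the factorization cheap.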
