---
OA_place: publisher
OA_type: hybrid
_id: '19051'
abstract:
- lang: eng
  text: This paper corrects an error in an earlier work of the author.
article_processing_charge: Yes (via OA deal)
article_type: original
author:
- first_name: Timothy D
  full_name: Browning, Timothy D
  id: 35827D50-F248-11E8-B48F-1D18A9856A87
  last_name: Browning
  orcid: 0000-0002-8314-0177
citation:
  ama: Browning TD. The polynomial sieve and equal sums of like polynomials. <i>International
    Mathematics Research Notices</i>. 2024;2024(13):10165-10168. doi:<a href="https://doi.org/10.1093/imrn/rnae066">10.1093/imrn/rnae066</a>
  apa: Browning, T. D. (2024). The polynomial sieve and equal sums of like polynomials.
    <i>International Mathematics Research Notices</i>. Oxford University Press. <a
    href="https://doi.org/10.1093/imrn/rnae066">https://doi.org/10.1093/imrn/rnae066</a>
  chicago: Browning, Timothy D. “The Polynomial Sieve and Equal Sums of Like Polynomials.”
    <i>International Mathematics Research Notices</i>. Oxford University Press, 2024.
    <a href="https://doi.org/10.1093/imrn/rnae066">https://doi.org/10.1093/imrn/rnae066</a>.
  ieee: T. D. Browning, “The polynomial sieve and equal sums of like polynomials,”
    <i>International Mathematics Research Notices</i>, vol. 2024, no. 13. Oxford University
    Press, pp. 10165–10168, 2024.
  ista: Browning TD. 2024. The polynomial sieve and equal sums of like polynomials.
    International Mathematics Research Notices. 2024(13), 10165–10168.
  mla: Browning, Timothy D. “The Polynomial Sieve and Equal Sums of Like Polynomials.”
    <i>International Mathematics Research Notices</i>, vol. 2024, no. 13, Oxford University
    Press, 2024, pp. 10165–68, doi:<a href="https://doi.org/10.1093/imrn/rnae066">10.1093/imrn/rnae066</a>.
  short: T.D. Browning, International Mathematics Research Notices 2024 (2024) 10165–10168.
corr_author: '1'
date_created: 2025-02-18T07:15:50Z
date_published: 2024-07-01T00:00:00Z
date_updated: 2025-09-09T12:16:45Z
day: '01'
ddc:
- '510'
department:
- _id: TiBr
doi: 10.1093/imrn/rnae066
external_id:
  isi:
  - '001196957300001'
file:
- access_level: open_access
  checksum: b625b8adf018d2a97591813c1fc17b96
  content_type: application/pdf
  creator: dernst
  date_created: 2025-02-18T07:56:36Z
  date_updated: 2025-02-18T07:56:36Z
  file_id: '19052'
  file_name: 2024_IMRN_Browning.pdf
  file_size: 205750
  relation: main_file
  success: 1
file_date_updated: 2025-02-18T07:56:36Z
has_accepted_license: '1'
intvolume: '2024'
isi: 1
issue: '13'
language:
- iso: eng
license: https://creativecommons.org/licenses/by/4.0/
month: '07'
oa: 1
oa_version: Published Version
page: 10165-10168
publication: International Mathematics Research Notices
publication_identifier:
  eissn:
  - 1687-0247
  issn:
  - 1073-7928
publication_status: published
publisher: Oxford University Press
quality_controlled: '1'
related_material:
  record:
  - id: '254'
    relation: earlier_version
    status: public
scopus_import: '1'
status: public
title: The polynomial sieve and equal sums of like polynomials
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 317138e5-6ab7-11ef-aa6d-ffef3953e345
volume: 2024
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19063'
abstract:
- lang: eng
  text: "Instruction-tuned Large Language Models (LLMs) show impressive results in
    numerous practical applications, but they lack essential safety features that
    are common in other areas of computer science, particularly an explicit separation
    of instructions and data. This makes them vulnerable to manipulations such as
    indirect prompt injections and generally unsuitable for safety-critical tasks.
    Surprisingly, there is currently no established definition or benchmark to quantify
    this phenomenon. In this work, we close this gap by introducing a formal measure
    for instruction-data separation and an empirical variant that is calculable from
    a model's outputs. We also present a new dataset, SEP, that allows estimating
    the measure for real-world models. Our results on various LLMs show that the problem
    of instruction-data separation is real: all models fail to achieve high separation,
    and canonical mitigation techniques, such as prompt engineering and fine-tuning,
    either fail to substantially improve separation or reduce model utility. The source
    code and SEP dataset are openly accessible at https://github.com/egozverev/Shold-It-Be-Executed-Or-Processed."
acknowledged_ssus:
- _id: ScienComp
acknowledgement: The authors would like to sincerely thank Juan Rocamonde for valuable
  feedback to our manuscript. We acknowledge the support from the Scientific Service
  Units (SSU) of ISTA through resources provided by Scientific Computing (SciComp).
  We thank Dan Alistarh for providing us with computational resources. This work was
  partially funded by the German Federal Ministry of Education and Research (BMBF)
  under the grant AIgenCY (16KIS2012) and ELSA – European Lighthouse on Secure and
  Safe AI funded by the European Union under grant agreement No. 101070617. Views
  and opinions expressed are however those of the authors only and do not necessarily
  reflect those of the European Union or European Commission. Neither the European
  Union nor the European Commission can be held responsible for them.
article_number: '2403.06833'
article_processing_charge: No
arxiv: 1
author:
- first_name: Egor
  full_name: Zverev, Egor
  id: 05162b19-1340-11ed-8f02-fa94e0e8c3bc
  last_name: Zverev
- first_name: Sahar
  full_name: Abdelnabi, Sahar
  last_name: Abdelnabi
- first_name: Soroush
  full_name: Tabesh, Soroush
  id: 06000900-6068-11ef-8d61-c2472ef2e752
  last_name: Tabesh
  orcid: 0009-0003-4119-6281
- first_name: Mario
  full_name: Fritz, Mario
  last_name: Fritz
- first_name: Christoph
  full_name: Lampert, Christoph
  id: 40C20FD2-F248-11E8-B48F-1D18A9856A87
  last_name: Lampert
  orcid: 0000-0001-8622-7887
citation:
  ama: Zverev E, Abdelnabi S, Tabesh S, Fritz M, Lampert C. Can LLMs separate instructions
    from data? And what do we even mean by that? <i>arXiv</i>. 2024. doi:<a href="https://doi.org/10.48550/arXiv.2403.06833">10.48550/arXiv.2403.06833</a>
  apa: Zverev, E., Abdelnabi, S., Tabesh, S., Fritz, M., &#38; Lampert, C. (2024).
    Can LLMs separate instructions from data? And what do we even mean by that? <i>arXiv</i>.
    <a href="https://doi.org/10.48550/arXiv.2403.06833">https://doi.org/10.48550/arXiv.2403.06833</a>
  chicago: Zverev, Egor, Sahar Abdelnabi, Soroush Tabesh, Mario Fritz, and Christoph
    Lampert. “Can LLMs Separate Instructions from Data? And What Do We Even Mean by
    That?” <i>ArXiv</i>, 2024. <a href="https://doi.org/10.48550/arXiv.2403.06833">https://doi.org/10.48550/arXiv.2403.06833</a>.
  ieee: E. Zverev, S. Abdelnabi, S. Tabesh, M. Fritz, and C. Lampert, “Can LLMs separate
    instructions from data? And what do we even mean by that?,” <i>arXiv</i>. 2024.
  ista: Zverev E, Abdelnabi S, Tabesh S, Fritz M, Lampert C. 2024. Can LLMs separate
    instructions from data? And what do we even mean by that? arXiv, 2403.06833.
  mla: Zverev, Egor, et al. “Can LLMs Separate Instructions from Data? And What Do
    We Even Mean by That?” <i>ArXiv</i>, 2403.06833, 2024, doi:<a href="https://doi.org/10.48550/arXiv.2403.06833">10.48550/arXiv.2403.06833</a>.
  short: E. Zverev, S. Abdelnabi, S. Tabesh, M. Fritz, C. Lampert, ArXiv (2024).
corr_author: '1'
date_created: 2025-02-20T10:13:42Z
date_published: 2024-03-01T00:00:00Z
date_updated: 2025-02-24T12:52:23Z
day: '01'
ddc:
- '000'
department:
- _id: GradSch
- _id: ChLa
doi: 10.48550/arXiv.2403.06833
external_id:
  arxiv:
  - '2403.06833'
file:
- access_level: open_access
  checksum: 35eb43968684b87be59144603ef10af0
  content_type: application/pdf
  creator: ezverev
  date_created: 2025-02-20T10:11:45Z
  date_updated: 2025-02-20T10:11:45Z
  file_id: '19064'
  file_name: 2403.06833v3.pdf
  file_size: 530972
  relation: main_file
  success: 1
file_date_updated: 2025-02-20T10:11:45Z
has_accepted_license: '1'
language:
- iso: eng
license: https://creativecommons.org/licenses/by-sa/4.0/
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2403.06833
month: '03'
oa: 1
oa_version: Preprint
publication: arXiv
publication_status: published
related_material:
  link:
  - relation: software
    url: 'https://github.com/egozverev/Shold-It-Be-Executed-Or-Processed'
status: public
title: Can LLMs separate instructions from data? And what do we even mean by that?
tmp:
  image: /images/cc_by_sa.png
  legal_code_url: https://creativecommons.org/licenses/by-sa/4.0/legalcode
  name: Creative Commons Attribution-ShareAlike 4.0 International Public License (CC
    BY-SA 4.0)
  short: CC BY-SA (4.0)
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19307'
abstract:
- lang: eng
  text: "This repository contains the data, scripts, SAM codes and files required
    to reproduce the results of the manuscript \"The Unreasonable Efficiency of Total
    Rain Evaporation Removal in Triggering Convective Self-Aggregation\" submitted
    to the Geophysical Research Letters (GRL).\r\n\r\nBrief description of project:
    This project aims to examine the impact of rain evaporation removal or reduction
    in the planetary boundary layer (PBL) on convective self aggregation (CSA). Non-rotating
    radiative-convective equilibrium (RCE) simulations were conducted with the System
    for Atmospheric Modeling (SAM) cloud resolving model. Rain evaporation in the
    lowest 1 km was progressively reduced and the effect on CSA was investigated.
    The physical processes underlying this type of aggregation (referred to in the
    manuscript as no-evaporation CSA, or NE-CSA) were analyzed and described. \r\nThe
    default SAM code base (version 6.10.8) can be downloaded from here: http://rossby.msrc.sunysb.edu/~marat/SAM.html"
article_processing_charge: No
author:
- first_name: Yi-Ling
  full_name: Hwong, Yi-Ling
  id: 1217aa61-4dd1-11ec-9ac3-f2ba3f17ee22
  last_name: Hwong
  orcid: 0000-0001-9281-3479
- first_name: Caroline J
  full_name: Muller, Caroline J
  id: f978ccb0-3f7f-11eb-b193-b0e2bd13182b
  last_name: Muller
  orcid: 0000-0001-5836-5350
citation:
  ama: Hwong Y-L, Muller CJ. Data - The unreasonable efficiency of total rain evaporation
    removal in triggering convective self-aggregation. 2024. doi:<a href="https://doi.org/10.5281/ZENODO.10687169">10.5281/ZENODO.10687169</a>
  apa: Hwong, Y.-L., &#38; Muller, C. J. (2024). Data - The unreasonable efficiency
    of total rain evaporation removal in triggering convective self-aggregation. Zenodo.
    <a href="https://doi.org/10.5281/ZENODO.10687169">https://doi.org/10.5281/ZENODO.10687169</a>
  chicago: Hwong, Yi-Ling, and Caroline J Muller. “Data - The Unreasonable Efficiency
    of Total Rain Evaporation Removal in Triggering Convective Self-Aggregation.”
    Zenodo, 2024. <a href="https://doi.org/10.5281/ZENODO.10687169">https://doi.org/10.5281/ZENODO.10687169</a>.
  ieee: Y.-L. Hwong and C. J. Muller, “Data - The unreasonable efficiency of total
    rain evaporation removal in triggering convective self-aggregation.” Zenodo, 2024.
  ista: Hwong Y-L, Muller CJ. 2024. Data - The unreasonable efficiency of total rain
    evaporation removal in triggering convective self-aggregation, Zenodo, <a href="https://doi.org/10.5281/ZENODO.10687169">10.5281/ZENODO.10687169</a>.
  mla: Hwong, Yi-Ling, and Caroline J. Muller. <i>Data - The Unreasonable Efficiency
    of Total Rain Evaporation Removal in Triggering Convective Self-Aggregation</i>.
    Zenodo, 2024, doi:<a href="https://doi.org/10.5281/ZENODO.10687169">10.5281/ZENODO.10687169</a>.
  short: Y.-L. Hwong, C.J. Muller (2024).
corr_author: '1'
date_created: 2025-03-07T08:39:40Z
date_published: 2024-02-21T00:00:00Z
date_updated: 2025-09-04T13:16:39Z
day: '21'
ddc:
- '550'
department:
- _id: CaMu
doi: 10.5281/ZENODO.10687169
has_accepted_license: '1'
main_file_link:
- open_access: '1'
  url: https://doi.org/10.5281/zenodo.8369509
month: '02'
oa: 1
oa_version: Published Version
publisher: Zenodo
related_material:
  record:
  - id: '15186'
    relation: used_in_publication
    status: public
status: public
title: Data - The unreasonable efficiency of total rain evaporation removal in triggering
  convective self-aggregation
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: research_data_reference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2024'
...
---
OA_place: publisher
OA_type: diamond
_id: '19408'
abstract:
- lang: eng
  text: 'Continual learning is a subfield of machine learning, which aims to allow
    machine learning models to continuously learn on new data, by accumulating knowledge
    without forgetting what was learned in the past. In this work, we take a step
    back, and ask: "Why should one care about continual learning in the first place?".
    We set the stage by examining recent continual learning papers published at four
    major machine learning conferences, and show that memory-constrained settings
    dominate the field. Then, we discuss five open problems in machine learning, and
    even though they might seem unrelated to continual learning at first sight, we
    show that continual learning will inevitably be part of their solution. These
    problems are model editing, personalization and specialization, on-device learning,
    faster (re-)training and reinforcement learning. Finally, by comparing the desiderata
    from these unsolved problems and the current assumptions in continual learning,
    we highlight and discuss four future directions for continual learning research.
    We hope that this work offers an interesting perspective on the future of continual
    learning, while displaying its potential value and the paths we have to pursue
    in order to make it successful. This work is the result of the many discussions
    the authors had at the Dagstuhl seminar on Deep Continual Learning, in March 2023.'
alternative_title:
- TMLR
article_processing_charge: No
article_type: original
arxiv: 1
author:
- first_name: Eli
  full_name: Verwimp, Eli
  last_name: Verwimp
- first_name: Rahaf
  full_name: Aljundi, Rahaf
  last_name: Aljundi
- first_name: Shai
  full_name: Ben-David, Shai
  last_name: Ben-David
- first_name: Matthias
  full_name: Bethge, Matthias
  last_name: Bethge
- first_name: Andrea
  full_name: Cossu, Andrea
  last_name: Cossu
- first_name: Alexander
  full_name: Gepperth, Alexander
  last_name: Gepperth
- first_name: Tyler L.
  full_name: Hayes, Tyler L.
  last_name: Hayes
- first_name: Eyke
  full_name: Hüllermeier, Eyke
  last_name: Hüllermeier
- first_name: Christopher
  full_name: Kanan, Christopher
  last_name: Kanan
- first_name: Dhireesha
  full_name: Kudithipudi, Dhireesha
  last_name: Kudithipudi
- first_name: Christoph
  full_name: Lampert, Christoph
  id: 40C20FD2-F248-11E8-B48F-1D18A9856A87
  last_name: Lampert
  orcid: 0000-0001-8622-7887
- first_name: Martin
  full_name: Mundt, Martin
  last_name: Mundt
- first_name: Razvan
  full_name: Pascanu, Razvan
  last_name: Pascanu
- first_name: Adrian
  full_name: Popescu, Adrian
  last_name: Popescu
- first_name: Andreas S.
  full_name: Tolias, Andreas S.
  last_name: Tolias
- first_name: Joost
  full_name: Van De Weijer, Joost
  last_name: Van De Weijer
- first_name: Bing
  full_name: Liu, Bing
  last_name: Liu
- first_name: Vincenzo
  full_name: Lomonaco, Vincenzo
  last_name: Lomonaco
- first_name: Tinne
  full_name: Tuytelaars, Tinne
  last_name: Tuytelaars
- first_name: Gido M.
  full_name: Van De Ven, Gido M.
  last_name: Van De Ven
citation:
  ama: 'Verwimp E, Aljundi R, Ben-David S, et al. Continual learning: Applications
    and the road forward. <i>Transactions on Machine Learning Research</i>. 2024;2024.'
  apa: 'Verwimp, E., Aljundi, R., Ben-David, S., Bethge, M., Cossu, A., Gepperth,
    A., … Van De Ven, G. M. (2024). Continual learning: Applications and the road
    forward. <i>Transactions on Machine Learning Research</i>. Transactions on Machine
    Learning Research.'
  chicago: 'Verwimp, Eli, Rahaf Aljundi, Shai Ben-David, Matthias Bethge, Andrea Cossu,
    Alexander Gepperth, Tyler L. Hayes, et al. “Continual Learning: Applications and
    the Road Forward.” <i>Transactions on Machine Learning Research</i>. Transactions
    on Machine Learning Research, 2024.'
  ieee: 'E. Verwimp <i>et al.</i>, “Continual learning: Applications and the road
    forward,” <i>Transactions on Machine Learning Research</i>, vol. 2024. Transactions
    on Machine Learning Research, 2024.'
  ista: 'Verwimp E, Aljundi R, Ben-David S, Bethge M, Cossu A, Gepperth A, Hayes TL,
    Hüllermeier E, Kanan C, Kudithipudi D, Lampert C, Mundt M, Pascanu R, Popescu
    A, Tolias AS, Van De Weijer J, Liu B, Lomonaco V, Tuytelaars T, Van De Ven GM.
    2024. Continual learning: Applications and the road forward. Transactions on Machine
    Learning Research. 2024.'
  mla: 'Verwimp, Eli, et al. “Continual Learning: Applications and the Road Forward.”
    <i>Transactions on Machine Learning Research</i>, vol. 2024, Transactions on Machine
    Learning Research, 2024.'
  short: E. Verwimp, R. Aljundi, S. Ben-David, M. Bethge, A. Cossu, A. Gepperth, T.L.
    Hayes, E. Hüllermeier, C. Kanan, D. Kudithipudi, C. Lampert, M. Mundt, R. Pascanu,
    A. Popescu, A.S. Tolias, J. Van De Weijer, B. Liu, V. Lomonaco, T. Tuytelaars,
    G.M. Van De Ven, Transactions on Machine Learning Research 2024 (2024).
date_created: 2025-03-16T23:01:25Z
date_published: 2024-04-12T00:00:00Z
date_updated: 2025-03-20T09:21:02Z
day: '12'
ddc:
- '000'
department:
- _id: ChLa
external_id:
  arxiv:
  - '2311.11908'
file:
- access_level: open_access
  checksum: 0714e12f7423cd098976ed9974561155
  content_type: application/pdf
  creator: dernst
  date_created: 2025-03-20T09:02:18Z
  date_updated: 2025-03-20T09:02:18Z
  file_id: '19426'
  file_name: 2024_TMLR_Verwimp.pdf
  file_size: 1367966
  relation: main_file
  success: 1
file_date_updated: 2025-03-20T09:02:18Z
has_accepted_license: '1'
intvolume: '2024'
language:
- iso: eng
month: '04'
oa: 1
oa_version: Published Version
publication: Transactions on Machine Learning Research
publication_identifier:
  eissn:
  - 2835-8856
publication_status: published
publisher: Transactions on Machine Learning Research
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'Continual learning: Applications and the road forward'
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 2024
year: '2024'
...
---
OA_type: closed access
_id: '19446'
abstract:
- lang: eng
  text: This Comment explores new approaches to enrich large-scale population data,
    including incorporating macro-environmental and digital health measures.
acknowledgement: Funded by the European Union. Complementary funding was received
  by the UK Research and Innovation (UKRI) under the UK government’s Horizon Europe
  funding guarantee (10041392 and 10038599). Views and opinions expressed are however
  those of the author(s) only and do not necessarily reflect those of the European
  Union, the European Health and Digital Executive Agency (HADEA) or UKRI. The European
  Union, HADEA and UKRI cannot be held responsible for them. This work received also
  support from Chinese Ministry for Science and Technology (MOST), the Horizon 2020-funded
  European Research Council Advanced Grant ‘STRATIFY’ (695313), the German Research
  Foundation (COPE; 675346; NE 1383/15-1 (CoviDrug)) and the National Natural Science
  Foundation of China grant 82150710554.
article_processing_charge: No
article_type: letter_note
author:
- first_name: Frauke
  full_name: Nees, Frauke
  last_name: Nees
- first_name: Paul
  full_name: Renner, Paul
  last_name: Renner
- first_name: Nathalie E.
  full_name: Holz, Nathalie E.
  last_name: Holz
- first_name: Elli
  full_name: Polemiti, Elli
  last_name: Polemiti
- first_name: Sebastian
  full_name: Siehl, Sebastian
  last_name: Siehl
- first_name: Sören
  full_name: Hese, Sören
  last_name: Hese
- first_name: Kerstin
  full_name: Schepanski, Kerstin
  last_name: Schepanski
- first_name: Gunter
  full_name: Schumann, Gunter
  last_name: Schumann
- first_name: Henrik
  full_name: Walter, Henrik
  last_name: Walter
- first_name: Andreas
  full_name: Heinz, Andreas
  last_name: Heinz
- first_name: Markus
  full_name: Ralser, Markus
  last_name: Ralser
- first_name: Sven
  full_name: Twardziok, Sven
  last_name: Twardziok
- first_name: Nilakshi
  full_name: Vaidya, Nilakshi
  last_name: Vaidya
- first_name: Antoine
  full_name: Bernas, Antoine
  last_name: Bernas
- first_name: Emin
  full_name: Serin, Emin
  last_name: Serin
- first_name: Marcel
  full_name: Jentsch, Marcel
  last_name: Jentsch
- first_name: Esther
  full_name: Hitchen, Esther
  last_name: Hitchen
- first_name: Hedi
  full_name: Kebir, Hedi
  last_name: Kebir
- first_name: Tristram A.
  full_name: Lett, Tristram A.
  last_name: Lett
- first_name: Jean Charles
  full_name: Roy, Jean Charles
  last_name: Roy
- first_name: Roland
  full_name: Eils, Roland
  last_name: Eils
- first_name: Ulrike Helene
  full_name: Taron, Ulrike Helene
  last_name: Taron
- first_name: Tatjana
  full_name: Schütz, Tatjana
  last_name: Schütz
- first_name: Jamie
  full_name: Banks, Jamie
  last_name: Banks
- first_name: Tobias
  full_name: Banaschewski, Tobias
  last_name: Banaschewski
- first_name: Karina
  full_name: Jansone, Karina
  last_name: Jansone
- first_name: Nina
  full_name: Christmann, Nina
  last_name: Christmann
- first_name: Andreas
  full_name: Meyer-Lindenberg, Andreas
  last_name: Meyer-Lindenberg
- first_name: Heike
  full_name: Tost, Heike
  last_name: Tost
- first_name: Nathalie
  full_name: Holz, Nathalie
  last_name: Holz
- first_name: Emanuel
  full_name: Schwarz, Emanuel
  last_name: Schwarz
- first_name: Argyris
  full_name: Stringaris, Argyris
  last_name: Stringaris
- first_name: Maja
  full_name: Neidhart, Maja
  last_name: Neidhart
- first_name: Beke
  full_name: Seefried, Beke
  last_name: Seefried
- first_name: Rieke
  full_name: Aden, Rieke
  last_name: Aden
- first_name: Ole A.
  full_name: Andreassen, Ole A.
  last_name: Andreassen
- first_name: Lars T.
  full_name: Westlye, Lars T.
  last_name: Westlye
- first_name: Dennis
  full_name: Van Der Meer, Dennis
  last_name: Van Der Meer
- first_name: Sara
  full_name: Fernandez, Sara
  last_name: Fernandez
- first_name: Rikka
  full_name: Kjelkenes, Rikka
  last_name: Kjelkenes
- first_name: Helga
  full_name: Ask, Helga
  last_name: Ask
- first_name: Michael
  full_name: Rapp, Michael
  last_name: Rapp
- first_name: Mira
  full_name: Tschorn, Mira
  last_name: Tschorn
- first_name: Sarah Jane
  full_name: Böttger, Sarah Jane
  last_name: Böttger
- first_name: Andre
  full_name: Marquand, Andre
  last_name: Marquand
- first_name: Gaia
  full_name: Novarino, Gaia
  id: 3E57A680-F248-11E8-B48F-1D18A9856A87
  last_name: Novarino
  orcid: 0000-0002-7673-7178
- first_name: Lena
  full_name: Marr, Lena
  id: 4406F586-F248-11E8-B48F-1D18A9856A87
  last_name: Marr
- first_name: Mel
  full_name: Slater, Mel
  last_name: Slater
- first_name: Guillem Feixas
  full_name: Viapiana, Guillem Feixas
  last_name: Viapiana
- first_name: Francisco Eiroa
  full_name: Orosa, Francisco Eiroa
  last_name: Orosa
- first_name: Jaime
  full_name: Gallego, Jaime
  last_name: Gallego
- first_name: Alvaro
  full_name: Pastor, Alvaro
  last_name: Pastor
- first_name: Andreas J.
  full_name: Forstner, Andreas J.
  last_name: Forstner
- first_name: Per
  full_name: Hoffmann, Per
  last_name: Hoffmann
- first_name: Markus M.
  full_name: Nöthen, Markus M.
  last_name: Nöthen
- first_name: Isabelle
  full_name: Claus, Isabelle
  last_name: Claus
- first_name: Abigail
  full_name: Miller, Abigail
  last_name: Miller
- first_name: Carina M.
  full_name: Mathey, Carina M.
  last_name: Mathey
- first_name: Stefanie
  full_name: Heilmann-Heimbach, Stefanie
  last_name: Heilmann-Heimbach
- first_name: Peter
  full_name: Sommer, Peter
  last_name: Sommer
- first_name: Myrto
  full_name: Patraskaki, Myrto
  last_name: Patraskaki
- first_name: Johannes
  full_name: Wilbertz, Johannes
  last_name: Wilbertz
- first_name: Karen
  full_name: Schmitt, Karen
  last_name: Schmitt
- first_name: Viktor
  full_name: Jirsa, Viktor
  last_name: Jirsa
- first_name: Spase
  full_name: Petkoski, Spase
  last_name: Petkoski
- first_name: Séverine
  full_name: Pitel, Séverine
  last_name: Pitel
- first_name: Lisa
  full_name: Otten, Lisa
  last_name: Otten
- first_name: Anastasios Polykarpos
  full_name: Athanasiadis, Anastasios Polykarpos
  last_name: Athanasiadis
- first_name: Charlie
  full_name: Pearmund, Charlie
  last_name: Pearmund
- first_name: Bernhard
  full_name: Spanlang, Bernhard
  last_name: Spanlang
- first_name: Elena
  full_name: Alvarez, Elena
  last_name: Alvarez
- first_name: Mavi
  full_name: Sanchez, Mavi
  last_name: Sanchez
- first_name: Arantxa
  full_name: Giner, Arantxa
  last_name: Giner
- first_name: Tianye
  full_name: Jia, Tianye
  last_name: Jia
- first_name: Yanting
  full_name: Gong, Yanting
  last_name: Gong
- first_name: Yunman
  full_name: Xia, Yunman
  last_name: Xia
- first_name: Xiao
  full_name: Chang, Xiao
  last_name: Chang
- first_name: Vince
  full_name: Calhoun, Vince
  last_name: Calhoun
- first_name: Jingyu
  full_name: Liu, Jingyu
  last_name: Liu
- first_name: Ameli
  full_name: Schwalber, Ameli
  last_name: Schwalber
- first_name: Paul
  full_name: Thompson, Paul
  last_name: Thompson
- first_name: Nicholas
  full_name: Clinton, Nicholas
  last_name: Clinton
- first_name: Sylvane
  full_name: Desrivières, Sylvane
  last_name: Desrivières
- first_name: Allan H.
  full_name: Young, Allan H.
  last_name: Young
- first_name: Bernd
  full_name: Stahl, Bernd
  last_name: Stahl
- first_name: George
  full_name: Ogoh, George
  last_name: Ogoh
citation:
  ama: Nees F, Renner P, Holz NE, et al. Large-scale population data enrichment in
    mental health research. <i>Nature Mental Health</i>. 2024;2(10):1124-1127. doi:<a
    href="https://doi.org/10.1038/s44220-024-00316-z">10.1038/s44220-024-00316-z</a>
  apa: Nees, F., Renner, P., Holz, N. E., Polemiti, E., Siehl, S., Hese, S., … Ogoh,
    G. (2024). Large-scale population data enrichment in mental health research. <i>Nature
    Mental Health</i>. Springer Nature. <a href="https://doi.org/10.1038/s44220-024-00316-z">https://doi.org/10.1038/s44220-024-00316-z</a>
  chicago: Nees, Frauke, Paul Renner, Nathalie E. Holz, Elli Polemiti, Sebastian Siehl,
    Sören Hese, Kerstin Schepanski, et al. “Large-Scale Population Data Enrichment
    in Mental Health Research.” <i>Nature Mental Health</i>. Springer Nature, 2024.
    <a href="https://doi.org/10.1038/s44220-024-00316-z">https://doi.org/10.1038/s44220-024-00316-z</a>.
  ieee: F. Nees <i>et al.</i>, “Large-scale population data enrichment in mental health
    research,” <i>Nature Mental Health</i>, vol. 2, no. 10. Springer Nature, pp. 1124–1127,
    2024.
  ista: Nees F, Renner P, Holz NE, Polemiti E, Siehl S, Hese S, Schepanski K, Schumann
    G, Walter H, Heinz A, Ralser M, Twardziok S, Vaidya N, Bernas A, Serin E, Jentsch
    M, Hitchen E, Kebir H, Lett TA, Roy JC, Eils R, Taron UH, Schütz T, Banks J, Banaschewski
    T, Jansone K, Christmann N, Meyer-Lindenberg A, Tost H, Holz N, Schwarz E, Stringaris
    A, Neidhart M, Seefried B, Aden R, Andreassen OA, Westlye LT, Van Der Meer D,
    Fernandez S, Kjelkenes R, Ask H, Rapp M, Tschorn M, Böttger SJ, Marquand A, Novarino
    G, Marr L, Slater M, Viapiana GF, Orosa FE, Gallego J, Pastor A, Forstner AJ,
    Hoffmann P, Nöthen MM, Claus I, Miller A, Mathey CM, Heilmann-Heimbach S, Sommer
    P, Patraskaki M, Wilbertz J, Schmitt K, Jirsa V, Petkoski S, Pitel S, Otten L,
    Athanasiadis AP, Pearmund C, Spanlang B, Alvarez E, Sanchez M, Giner A, Jia T,
    Gong Y, Xia Y, Chang X, Calhoun V, Liu J, Schwalber A, Thompson P, Clinton N,
    Desrivières S, Young AH, Stahl B, Ogoh G. 2024. Large-scale population data enrichment
    in mental health research. Nature Mental Health. 2(10), 1124–1127.
  mla: Nees, Frauke, et al. “Large-Scale Population Data Enrichment in Mental Health
    Research.” <i>Nature Mental Health</i>, vol. 2, no. 10, Springer Nature, 2024,
    pp. 1124–27, doi:<a href="https://doi.org/10.1038/s44220-024-00316-z">10.1038/s44220-024-00316-z</a>.
  short: F. Nees, P. Renner, N.E. Holz, E. Polemiti, S. Siehl, S. Hese, K. Schepanski,
    G. Schumann, H. Walter, A. Heinz, M. Ralser, S. Twardziok, N. Vaidya, A. Bernas,
    E. Serin, M. Jentsch, E. Hitchen, H. Kebir, T.A. Lett, J.C. Roy, R. Eils, U.H.
    Taron, T. Schütz, J. Banks, T. Banaschewski, K. Jansone, N. Christmann, A. Meyer-Lindenberg,
    H. Tost, N. Holz, E. Schwarz, A. Stringaris, M. Neidhart, B. Seefried, R. Aden,
    O.A. Andreassen, L.T. Westlye, D. Van Der Meer, S. Fernandez, R. Kjelkenes, H.
    Ask, M. Rapp, M. Tschorn, S.J. Böttger, A. Marquand, G. Novarino, L. Marr, M.
    Slater, G.F. Viapiana, F.E. Orosa, J. Gallego, A. Pastor, A.J. Forstner, P. Hoffmann,
    M.M. Nöthen, I. Claus, A. Miller, C.M. Mathey, S. Heilmann-Heimbach, P. Sommer,
    M. Patraskaki, J. Wilbertz, K. Schmitt, V. Jirsa, S. Petkoski, S. Pitel, L. Otten,
    A.P. Athanasiadis, C. Pearmund, B. Spanlang, E. Alvarez, M. Sanchez, A. Giner,
    T. Jia, Y. Gong, Y. Xia, X. Chang, V. Calhoun, J. Liu, A. Schwalber, P. Thompson,
    N. Clinton, S. Desrivières, A.H. Young, B. Stahl, G. Ogoh, Nature Mental Health
    2 (2024) 1124–1127.
date_created: 2025-03-23T23:01:28Z
date_published: 2024-10-01T00:00:00Z
date_updated: 2025-03-25T08:28:39Z
day: '01'
department:
- _id: GaNo
doi: 10.1038/s44220-024-00316-z
intvolume: '2'
issue: '10'
language:
- iso: eng
month: '10'
oa_version: None
page: 1124-1127
publication: Nature Mental Health
publication_identifier:
  eissn:
  - 2731-6076
publication_status: published
publisher: Springer Nature
quality_controlled: '1'
scopus_import: '1'
status: public
title: Large-scale population data enrichment in mental health research
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 2
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19510'
abstract:
- lang: eng
  text: "We propose a new variant of the Adam optimizer [Kingma and Ba, 2014] called
    MICROADAM that specifically minimizes memory overheads, while maintaining theoretical
    convergence guarantees. We achieve this by compressing the gradient information
    before it is fed into the optimizer state, thereby reducing its memory footprint
    significantly. We control the resulting compression error via a novel instance
    of the classical error feedback mechanism from distributed optimization [Seide
    et al., 2014, Alistarh et al., 2018, Karimireddy et al., 2019] in which the error
    correction information is itself compressed to allow for practical memory gains.
    We prove that the resulting approach maintains theoretical convergence guarantees
    competitive to those of AMSGrad, while providing good practical performance. Specifically,
    we show that MICROADAM can be implemented efficiently on GPUs: on both million-scale
    (BERT) and billion-scale (LLaMA) models, MICROADAM provides practical convergence
    competitive to that of the uncompressed Adam baseline, with lower memory usage
    and similar running time. Our code is available at https://github.com/IST-DASLab/MicroAdam."
acknowledged_ssus:
- _id: CampIT
acknowledgement: The authors thank Razvan Pascanu, Mahdi Nikdan and Soroush Tabesh
  for their valuable feedback, the IT department from Institute of Science and Technology
  Austria for the hardware support and Weights and Biases for the infrastructure to
  track all our experiments. Mher Safaryan has received funding from the European
  Union’s Horizon 2020 research and innovation program under the Marie Sklodowska-Curie
  grant agreement No 101034413.
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Ionut-Vlad
  full_name: Modoranu, Ionut-Vlad
  id: 449f7a18-f128-11eb-9611-9b430c0c6333
  last_name: Modoranu
- first_name: Mher
  full_name: Safaryan, Mher
  id: dd546b39-0804-11ed-9c55-ef075c39778d
  last_name: Safaryan
- first_name: Grigory
  full_name: Malinovsky, Grigory
  last_name: Malinovsky
- first_name: Eldar
  full_name: Kurtic, Eldar
  id: 47beb3a5-07b5-11eb-9b87-b108ec578218
  last_name: Kurtic
- first_name: Thomas
  full_name: Robert, Thomas
  id: de632733-1457-11f0-ae22-b5914b8c1c41
  last_name: Robert
- first_name: Peter
  full_name: Richtárik, Peter
  last_name: Richtárik
- first_name: Dan-Adrian
  full_name: Alistarh, Dan-Adrian
  id: 4A899BFC-F248-11E8-B48F-1D18A9856A87
  last_name: Alistarh
  orcid: 0000-0003-3650-940X
citation:
  ama: 'Modoranu I-V, Safaryan M, Malinovsky G, et al. MICROADAM: Accurate adaptive
    optimization with low space overhead and provable convergence. In: <i>38th Conference
    on Neural Information Processing Systems</i>. Vol 37. Neural Information Processing
    Systems Foundation; 2024.'
  apa: 'Modoranu, I.-V., Safaryan, M., Malinovsky, G., Kurtic, E., Robert, T., Richtárik,
    P., &#38; Alistarh, D.-A. (2024). MICROADAM: Accurate adaptive optimization with
    low space overhead and provable convergence. In <i>38th Conference on Neural Information
    Processing Systems</i> (Vol. 37). Neural Information Processing Systems Foundation.'
  chicago: 'Modoranu, Ionut-Vlad, Mher Safaryan, Grigory Malinovsky, Eldar Kurtic,
    Thomas Robert, Peter Richtárik, and Dan-Adrian Alistarh. “MICROADAM: Accurate
    Adaptive Optimization with Low Space Overhead and Provable Convergence.” In <i>38th
    Conference on Neural Information Processing Systems</i>, Vol. 37. Neural Information
    Processing Systems Foundation, 2024.'
  ieee: 'I.-V. Modoranu <i>et al.</i>, “MICROADAM: Accurate adaptive optimization
    with low space overhead and provable convergence,” in <i>38th Conference on Neural
    Information Processing Systems</i>, 2024, vol. 37.'
  ista: 'Modoranu I-V, Safaryan M, Malinovsky G, Kurtic E, Robert T, Richtárik P,
    Alistarh D-A. 2024. MICROADAM: Accurate adaptive optimization with low space overhead
    and provable convergence. 38th Conference on Neural Information Processing Systems.
    , Advances in Neural Information Processing Systems, vol. 37.'
  mla: 'Modoranu, Ionut-Vlad, et al. “MICROADAM: Accurate Adaptive Optimization with
    Low Space Overhead and Provable Convergence.” <i>38th Conference on Neural Information
    Processing Systems</i>, vol. 37, Neural Information Processing Systems Foundation,
    2024.'
  short: I.-V. Modoranu, M. Safaryan, G. Malinovsky, E. Kurtic, T. Robert, P. Richtárik,
    D.-A. Alistarh, in:, 38th Conference on Neural Information Processing Systems,
    Neural Information Processing Systems Foundation, 2024.
corr_author: '1'
date_created: 2025-04-06T22:01:32Z
date_published: 2024-12-20T00:00:00Z
date_updated: 2025-05-14T11:32:52Z
day: '20'
department:
- _id: DaAl
ec_funded: 1
external_id:
  arxiv:
  - '2405.15593'
intvolume: '37'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2405.15593
month: '12'
oa: 1
oa_version: Preprint
project:
- _id: fc2ed2f7-9c52-11eb-aca3-c01059dda49c
  call_identifier: H2020
  grant_number: '101034413'
  name: 'IST-BRIDGE: International postdoctoral program'
publication: 38th Conference on Neural Information Processing Systems
publication_identifier:
  issn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
related_material:
  link:
  - relation: software
    url: https://github.com/IST-DASLab/MicroAdam
scopus_import: '1'
status: public
title: 'MICROADAM: Accurate adaptive optimization with low space overhead and provable
  convergence'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19511'
abstract:
- lang: eng
  text: We introduce QuaRot, a new Quantization scheme based on Rotations, which is
    able to quantize LLMs end-to-end, including all weights, activations, and KV cache
    in 4 bits. QuaRot rotates LLMs in a way that removes outliers from the hidden
    state without changing the output, making quantization easier. This computational
    invariance is applied to the hidden state (residual) of the LLM, as well as to
    the activations of the feed-forward components, aspects of the attention mechanism,
    and to the KV cache. The result is a quantized model where all matrix multiplications
    are performed in 4 bits, without any channels identified for retention in higher
    precision. Our 4-bit quantized LLAMA2-70B model has losses of at most 0.47 WikiText-2
    perplexity and retains 99% of the zero-shot performance. We also show that QuaRot
    can provide lossless 6 and 8 bit LLAMA-2 models without any calibration data using
    round-to-nearest quantization. Code is available at github.com/spcl/QuaRot.
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Saleh
  full_name: Ashkboos, Saleh
  last_name: Ashkboos
- first_name: Amirkeivan
  full_name: Mohtashami, Amirkeivan
  last_name: Mohtashami
- first_name: Maximilian L.
  full_name: Croci, Maximilian L.
  last_name: Croci
- first_name: Bo
  full_name: Li, Bo
  last_name: Li
- first_name: Pashmina
  full_name: Cameron, Pashmina
  last_name: Cameron
- first_name: Martin
  full_name: Jaggi, Martin
  last_name: Jaggi
- first_name: Dan-Adrian
  full_name: Alistarh, Dan-Adrian
  id: 4A899BFC-F248-11E8-B48F-1D18A9856A87
  last_name: Alistarh
  orcid: 0000-0003-3650-940X
- first_name: Torsten
  full_name: Hoefler, Torsten
  last_name: Hoefler
- first_name: James
  full_name: Hensman, James
  last_name: Hensman
citation:
  ama: 'Ashkboos S, Mohtashami A, Croci ML, et al. QuaRot: Outlier-free 4-bit inference
    in rotated LLMs. In: <i>38th Conference on Neural Information Processing Systems</i>.
    Vol 37. Neural Information Processing Systems Foundation; 2024.'
  apa: 'Ashkboos, S., Mohtashami, A., Croci, M. L., Li, B., Cameron, P., Jaggi, M.,
    … Hensman, J. (2024). QuaRot: Outlier-free 4-bit inference in rotated LLMs. In
    <i>38th Conference on Neural Information Processing Systems</i> (Vol. 37). Vancouver,
    Canada: Neural Information Processing Systems Foundation.'
  chicago: 'Ashkboos, Saleh, Amirkeivan Mohtashami, Maximilian L. Croci, Bo Li, Pashmina
    Cameron, Martin Jaggi, Dan-Adrian Alistarh, Torsten Hoefler, and James Hensman.
    “QuaRot: Outlier-Free 4-Bit Inference in Rotated LLMs.” In <i>38th Conference
    on Neural Information Processing Systems</i>, Vol. 37. Neural Information Processing
    Systems Foundation, 2024.'
  ieee: 'S. Ashkboos <i>et al.</i>, “QuaRot: Outlier-free 4-bit inference in rotated
    LLMs,” in <i>38th Conference on Neural Information Processing Systems</i>, Vancouver,
    Canada, 2024, vol. 37.'
  ista: 'Ashkboos S, Mohtashami A, Croci ML, Li B, Cameron P, Jaggi M, Alistarh D-A,
    Hoefler T, Hensman J. 2024. QuaRot: Outlier-free 4-bit inference in rotated LLMs.
    38th Conference on Neural Information Processing Systems. NeurIPS: Neural Information
    Processing Systems, Advances in Neural Information Processing Systems, vol. 37.'
  mla: 'Ashkboos, Saleh, et al. “QuaRot: Outlier-Free 4-Bit Inference in Rotated LLMs.”
    <i>38th Conference on Neural Information Processing Systems</i>, vol. 37, Neural
    Information Processing Systems Foundation, 2024.'
  short: S. Ashkboos, A. Mohtashami, M.L. Croci, B. Li, P. Cameron, M. Jaggi, D.-A.
    Alistarh, T. Hoefler, J. Hensman, in:, 38th Conference on Neural Information Processing
    Systems, Neural Information Processing Systems Foundation, 2024.
conference:
  end_date: 2024-12-15
  location: Vancouver, Canada
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2024-12-09
date_created: 2025-04-06T22:01:32Z
date_published: 2024-12-20T00:00:00Z
date_updated: 2025-05-14T11:33:12Z
day: '20'
department:
- _id: DaAl
external_id:
  arxiv:
  - '2404.00456'
intvolume: '37'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2404.00456
month: '12'
oa: 1
oa_version: Preprint
publication: 38th Conference on Neural Information Processing Systems
publication_identifier:
  issn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
related_material:
  link:
  - relation: software
    url: https://github.com/spcl/QuaRot
scopus_import: '1'
status: public
title: 'QuaRot: Outlier-free 4-bit inference in rotated LLMs'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19512'
abstract:
- lang: eng
  text: "Differential privacy with gradual expiration models the setting where data
    items arrive in a stream and at a given time t the privacy loss guaranteed for
    a data item seen at time (t − d) is εg(d), where g is a monotonically non-decreasing
    function. We study the fundamental continual (binary) counting problem where each
    data item consists of a bit, and the algorithm needs to output at each time step
    the sum of all the bits streamed so far. For a stream of length T and privacy
    without expiration, continual counting is possible with maximum (over all time
    steps) additive error O(log²(T)/ε) and the best known lower bound is Ω(log(T)/ε);
    closing this gap is a challenging open problem. We show that the situation is
    very different for privacy with gradual expiration by giving upper and lower bounds
    for a large set of expiration functions g. Specifically, our algorithm achieves
    an additive error of O(log(T)/ε) for a large set of privacy expiration functions.
    We also give a lower bound that shows that if C is the additive error of any ε-DP
    algorithm for this problem, then the product of C and the privacy expiration function
    after 2C steps must be Ω(log(T)/ε). Our algorithm matches this lower bound as
    its additive error is O(log(T)/ε), even when g(2C) = O(1). Our empirical evaluation
    shows that we achieve a slowly growing privacy loss with significantly smaller
    empirical privacy loss for large values of d than a natural baseline algorithm."
acknowledgement: 'Monika Henzinger: This project has received funding from the European
  Research Council (ERC) under the European Union’s Horizon 2020 research and innovation
  programme (Grant agreement No. 101019564) and the Austrian Science Fund (FWF) grant
  DOI 10.55776/Z422, grant DOI 10.55776/I5982, and grant DOI 10.55776/P33775 with
  additional funding from the netidee SCIENCE Stiftung, 2020–2024. Joel Daniel Andersson
  and Rasmus Pagh are affiliated with Basic Algorithms Research Copenhagen (BARC),
  supported by the VILLUM Foundation grant 16582, and are also supported by Providentia,
  a Data Science Distinguished Investigator grant from Novo Nordisk Fonden. Teresa
  Anna Steiner is supported by a research grant (VIL51463) from VILLUM FONDEN. This
  work was done while Teresa Anna Steiner was a Postdoc at the Technical University
  of Denmark. Jalaj Upadhyay’s research was funded by the Rutgers Decanal Grant no.
  302918 and an unrestricted gift from Google.'
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Joel Daniel
  full_name: Andersson, Joel Daniel
  last_name: Andersson
- first_name: Monika H
  full_name: Henzinger, Monika H
  id: 540c9bbd-f2de-11ec-812d-d04a5be85630
  last_name: Henzinger
  orcid: 0000-0002-5008-6530
- first_name: Rasmus
  full_name: Pagh, Rasmus
  last_name: Pagh
- first_name: Teresa Anna
  full_name: Steiner, Teresa Anna
  last_name: Steiner
- first_name: Jalaj
  full_name: Upadhyay, Jalaj
  last_name: Upadhyay
citation:
  ama: 'Andersson JD, Henzinger M, Pagh R, Steiner TA, Upadhyay J. Continual counting
    with gradual privacy expiration. In: <i>38th Conference on Neural Information
    Processing Systems</i>. Vol 37. Neural Information Processing Systems Foundation;
    2024.'
  apa: 'Andersson, J. D., Henzinger, M., Pagh, R., Steiner, T. A., &#38; Upadhyay,
    J. (2024). Continual counting with gradual privacy expiration. In <i>38th Conference
    on Neural Information Processing Systems</i> (Vol. 37). Vancouver, Canada: Neural
    Information Processing Systems Foundation.'
  chicago: Andersson, Joel Daniel, Monika Henzinger, Rasmus Pagh, Teresa Anna Steiner,
    and Jalaj Upadhyay. “Continual Counting with Gradual Privacy Expiration.” In <i>38th
    Conference on Neural Information Processing Systems</i>, Vol. 37. Neural Information
    Processing Systems Foundation, 2024.
  ieee: J. D. Andersson, M. Henzinger, R. Pagh, T. A. Steiner, and J. Upadhyay, “Continual
    counting with gradual privacy expiration,” in <i>38th Conference on Neural Information
    Processing Systems</i>, Vancouver, Canada, 2024, vol. 37.
  ista: 'Andersson JD, Henzinger M, Pagh R, Steiner TA, Upadhyay J. 2024. Continual
    counting with gradual privacy expiration. 38th Conference on Neural Information
    Processing Systems. NeurIPS: Neural Information Processing Systems, Advances in
    Neural Information Processing Systems, vol. 37.'
  mla: Andersson, Joel Daniel, et al. “Continual Counting with Gradual Privacy Expiration.”
    <i>38th Conference on Neural Information Processing Systems</i>, vol. 37, Neural
    Information Processing Systems Foundation, 2024.
  short: J.D. Andersson, M. Henzinger, R. Pagh, T.A. Steiner, J. Upadhyay, in:, 38th
    Conference on Neural Information Processing Systems, Neural Information Processing
    Systems Foundation, 2024.
conference:
  end_date: 2024-12-15
  location: Vancouver, Canada
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2024-12-09
corr_author: '1'
date_created: 2025-04-06T22:01:32Z
date_published: 2024-12-20T00:00:00Z
date_updated: 2025-05-14T11:33:22Z
day: '20'
department:
- _id: MoHe
ec_funded: 1
external_id:
  arxiv:
  - '2406.03802'
intvolume: '37'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2406.03802
month: '12'
oa: 1
oa_version: Preprint
project:
- _id: bd9ca328-d553-11ed-ba76-dc4f890cfe62
  call_identifier: H2020
  grant_number: '101019564'
  name: The design and evaluation of modern fully dynamic data structures
- _id: 34def286-11ca-11ed-8bc3-da5948e1613c
  grant_number: Z00422
  name: Efficient algorithms
- _id: bda196b2-d553-11ed-ba76-8e8ee6c21103
  grant_number: I05982
  name: Static and Dynamic Hierarchical Graph Decompositions
- _id: bd9e3a2e-d553-11ed-ba76-8aa684ce17fe
  grant_number: P33775
  name: Fast Algorithms for a Reactive Network Layer
publication: 38th Conference on Neural Information Processing Systems
publication_identifier:
  issn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
scopus_import: '1'
status: public
title: Continual counting with gradual privacy expiration
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19515'
abstract:
- lang: eng
  text: "Neural models learn data representations that lie on low-dimensional manifolds,
    yet modeling the relation between these representational spaces is an ongoing
    challenge. By integrating spectral geometry principles into neural modeling, we
    show that this problem can be better addressed in the functional domain, mitigating
    complexity, while enhancing interpretability and performance on downstream tasks.
    To this end, we introduce a multi-purpose framework to the representation learning
    community, which allows us to: (i) compare different spaces in an interpretable
    way and measure their intrinsic similarity; (ii) find correspondences between
    them, both in unsupervised and weakly supervised settings; and (iii) effectively
    transfer representations between distinct spaces. We validate our framework on
    various applications, ranging from stitching to retrieval tasks, and on multiple
    modalities, demonstrating that Latent Functional Maps can serve as a swiss-army
    knife for representation alignment."
acknowledgement: MF is supported by the MSCA IST-Bridge fellowship which has received
  funding from the European Union’s Horizon 2020 research and innovation program under
  the Marie Skłodowska-Curie grant agreement No 101034413. ER and VM are supported
  by the PNRR MUR project PE0000013-FAIR. MP is supported by the Sapienza grant "Predicting
  and Explaining Clinical Trial Outcomes", prot. RG12218166FA3F13.
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Marco
  full_name: Fumero, Marco
  id: 1c1593eb-393f-11ef-bb8e-ab4f1e979650
  last_name: Fumero
- first_name: Marco
  full_name: Pegoraro, Marco
  last_name: Pegoraro
- first_name: Valentino
  full_name: Maiorca, Valentino
  last_name: Maiorca
- first_name: Francesco
  full_name: Locatello, Francesco
  id: 26cfd52f-2483-11ee-8040-88983bcc06d4
  last_name: Locatello
  orcid: 0000-0002-4850-0683
- first_name: Emanuele
  full_name: Rodolà, Emanuele
  last_name: Rodolà
citation:
  ama: 'Fumero M, Pegoraro M, Maiorca V, Locatello F, Rodolà E. Latent functional
    maps: A spectral framework for representation alignment. In: <i>38th Conference
    on Neural Information Processing Systems</i>. Vol 37. Neural Information Processing
    Systems Foundation; 2024.'
  apa: 'Fumero, M., Pegoraro, M., Maiorca, V., Locatello, F., &#38; Rodolà, E. (2024).
    Latent functional maps: A spectral framework for representation alignment. In
    <i>38th Conference on Neural Information Processing Systems</i> (Vol. 37). Vancouver,
    Canada: Neural Information Processing Systems Foundation.'
  chicago: 'Fumero, Marco, Marco Pegoraro, Valentino Maiorca, Francesco Locatello,
    and Emanuele Rodolà. “Latent Functional Maps: A Spectral Framework for Representation
    Alignment.” In <i>38th Conference on Neural Information Processing Systems</i>,
    Vol. 37. Neural Information Processing Systems Foundation, 2024.'
  ieee: 'M. Fumero, M. Pegoraro, V. Maiorca, F. Locatello, and E. Rodolà, “Latent
    functional maps: A spectral framework for representation alignment,” in <i>38th
    Conference on Neural Information Processing Systems</i>, Vancouver, Canada, 2024,
    vol. 37.'
  ista: 'Fumero M, Pegoraro M, Maiorca V, Locatello F, Rodolà E. 2024. Latent functional
    maps: A spectral framework for representation alignment. 38th Conference on Neural
    Information Processing Systems. NeurIPS: Neural Information Processing Systems,
    Advances in Neural Information Processing Systems, vol. 37.'
  mla: 'Fumero, Marco, et al. “Latent Functional Maps: A Spectral Framework for Representation
    Alignment.” <i>38th Conference on Neural Information Processing Systems</i>, vol.
    37, Neural Information Processing Systems Foundation, 2024.'
  short: M. Fumero, M. Pegoraro, V. Maiorca, F. Locatello, E. Rodolà, in:, 38th Conference
    on Neural Information Processing Systems, Neural Information Processing Systems
    Foundation, 2024.
conference:
  end_date: 2024-12-15
  location: Vancouver, Canada
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2024-12-09
corr_author: '1'
date_created: 2025-04-06T22:01:32Z
date_published: 2024-12-20T00:00:00Z
date_updated: 2025-05-14T11:36:51Z
day: '20'
department:
- _id: FrLo
ec_funded: 1
external_id:
  arxiv:
  - '2406.14183'
intvolume: '37'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2406.14183
month: '12'
oa: 1
oa_version: Preprint
project:
- _id: fc2ed2f7-9c52-11eb-aca3-c01059dda49c
  call_identifier: H2020
  grant_number: '101034413'
  name: 'IST-BRIDGE: International postdoctoral program'
publication: 38th Conference on Neural Information Processing Systems
publication_identifier:
  issn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'Latent functional maps: A spectral framework for representation alignment'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19517'
abstract:
- lang: eng
  text: "In this paper, we present a novel data-free method for merging neural networks
    in weight space. Differently from most existing works, our method optimizes for
    the permutations of network neurons globally across all layers. This allows us
    to enforce cycle consistency of the permutations when merging n ≥ 3 models, so
    that circular compositions of permutations can be computed without accumulating
    error along the path. We qualitatively and quantitatively motivate the need for
    such a constraint, showing its benefits when merging sets of models in scenarios
    spanning varying architectures and datasets. We finally show that, when coupled
    with activation renormalization, our approach yields the best results in the task."
acknowledgement: "This work is supported by the ERC grant no. 802554 (SPECGEO), PRIN
  2020 project no. 2020TA3K9N (LEGO.AI), and PNRR MUR project PE0000013-FAIR. Marco
  Fumero is supported by the MSCA IST-Bridge fellowship which has received funding
  from the European Union’s Horizon 2020 research and innovation program under the
  Marie Skłodowska-Curie grant agreement No 101034413. We thank Simone Scardapane
  for the helpful feedback on the paper."
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Donato
  full_name: Crisostomi, Donato
  last_name: Crisostomi
- first_name: Marco
  full_name: Fumero, Marco
  id: 1c1593eb-393f-11ef-bb8e-ab4f1e979650
  last_name: Fumero
- first_name: Daniele
  full_name: Baieri, Daniele
  last_name: Baieri
- first_name: Florian
  full_name: Bernard, Florian
  last_name: Bernard
- first_name: Emanuele
  full_name: Rodolà, Emanuele
  last_name: Rodolà
citation:
  ama: 'Crisostomi D, Fumero M, Baieri D, Bernard F, Rodolà E. C2M3: Cycle-consistent
    multi-model merging. In: <i>38th Conference on Neural Information Processing Systems</i>.
    Vol 37. Neural Information Processing Systems Foundation; 2024.'
  apa: 'Crisostomi, D., Fumero, M., Baieri, D., Bernard, F., &#38; Rodolà, E. (2024).
    C2M3: Cycle-consistent multi-model merging. In <i>38th Conference on Neural Information
    Processing Systems</i> (Vol. 37). Vancouver, Canada: Neural Information Processing
    Systems Foundation.'
  chicago: 'Crisostomi, Donato, Marco Fumero, Daniele Baieri, Florian Bernard, and
    Emanuele Rodolà. “C2M3: Cycle-Consistent Multi-Model Merging.” In <i>38th Conference
    on Neural Information Processing Systems</i>, Vol. 37. Neural Information Processing
    Systems Foundation, 2024.'
  ieee: 'D. Crisostomi, M. Fumero, D. Baieri, F. Bernard, and E. Rodolà, “C2M3: Cycle-consistent
    multi-model merging,” in <i>38th Conference on Neural Information Processing Systems</i>,
    Vancouver, Canada, 2024, vol. 37.'
  ista: 'Crisostomi D, Fumero M, Baieri D, Bernard F, Rodolà E. 2024. C2M3: Cycle-consistent
    multi-model merging. 38th Conference on Neural Information Processing Systems.
    NeurIPS: Neural Information Processing Systems, Advances in Neural Information
    Processing Systems, vol. 37.'
  mla: 'Crisostomi, Donato, et al. “C2M3: Cycle-Consistent Multi-Model Merging.” <i>38th
    Conference on Neural Information Processing Systems</i>, vol. 37, Neural Information
    Processing Systems Foundation, 2024.'
  short: D. Crisostomi, M. Fumero, D. Baieri, F. Bernard, E. Rodolà, in:, 38th Conference
    on Neural Information Processing Systems, Neural Information Processing Systems
    Foundation, 2024.
conference:
  end_date: 2024-12-15
  location: Vancouver, Canada
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2024-12-09
corr_author: '1'
date_created: 2025-04-06T22:01:32Z
date_published: 2024-12-20T00:00:00Z
date_updated: 2025-05-14T11:36:59Z
day: '20'
department:
- _id: FrLo
ec_funded: 1
external_id:
  arxiv:
  - '2405.17897'
intvolume: '37'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2405.17897
month: '12'
oa: 1
oa_version: Preprint
project:
- _id: fc2ed2f7-9c52-11eb-aca3-c01059dda49c
  call_identifier: H2020
  grant_number: '101034413'
  name: 'IST-BRIDGE: International postdoctoral program'
publication: 38th Conference on Neural Information Processing Systems
publication_identifier:
  issn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'C2M3: Cycle-consistent multi-model merging'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19518'
abstract:
- lang: eng
  text: "The rising footprint of machine learning has led to a focus on imposing
    model sparsity as a means of reducing computational and memory costs. For deep
    neural networks (DNNs), the state-of-the-art accuracy-vs-sparsity trade-off is
    achieved by heuristics inspired by the classical Optimal Brain Surgeon (OBS) framework
    [LeCun et al., 1989, Hassibi and Stork, 1992, Hassibi et al., 1993], which leverages
    loss curvature information to make better pruning decisions. Yet, these results
    still lack a solid theoretical understanding, and it is unclear whether they can
    be improved by leveraging connections to the wealth of work on sparse recovery
    algorithms. In this paper, we draw new connections between these two areas and
    present new sparse recovery algorithms inspired by the OBS framework that come
    with theoretical guarantees under reasonable assumptions and have strong practical
    performance. Specifically, our work starts from the observation that we can leverage
    curvature information in OBS-like fashion upon the projection step of classic
    iterative sparse recovery algorithms such as IHT. We show for the first time that
    this leads to improved convergence bounds under standard assumptions. Furthermore,
    we present extensions of this approach to the practical task of obtaining accurate
    sparse DNNs, and validate it experimentally at scale for Transformer-based models
    on vision and language tasks."
acknowledged_ssus:
- _id: CampIT
acknowledgement: The authors thank the anonymous NeurIPS reviewers for their useful
  comments and feedback, the IT department from the Institute of Science and Technology
  Austria for the hardware support, and Weights and Biases for the infrastructure
  to track all our experiments. Mher Safaryan has received funding from the European
  Union’s Horizon 2020 research and innovation program under the Marie Skłodowska-Curie
  grant agreement No 101034413.
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Diyuan
  full_name: Wu, Diyuan
  id: 1a5914c2-896a-11ed-bdf8-fb80621a0635
  last_name: Wu
- first_name: Ionut-Vlad
  full_name: Modoranu, Ionut-Vlad
  id: 449f7a18-f128-11eb-9611-9b430c0c6333
  last_name: Modoranu
- first_name: Mher
  full_name: Safaryan, Mher
  id: dd546b39-0804-11ed-9c55-ef075c39778d
  last_name: Safaryan
- first_name: Denis
  full_name: Kuznedelev, Denis
  last_name: Kuznedelev
- first_name: Dan-Adrian
  full_name: Alistarh, Dan-Adrian
  id: 4A899BFC-F248-11E8-B48F-1D18A9856A87
  last_name: Alistarh
  orcid: 0000-0003-3650-940X
citation:
  ama: 'Wu D, Modoranu I-V, Safaryan M, Kuznedelev D, Alistarh D-A. The iterative
    optimal brain surgeon: Faster sparse recovery by leveraging second-order information.
    In: <i>38th Conference on Neural Information Processing Systems</i>. Vol 37. Neural
    Information Processing Systems Foundation; 2024.'
  apa: 'Wu, D., Modoranu, I.-V., Safaryan, M., Kuznedelev, D., &#38; Alistarh, D.-A.
    (2024). The iterative optimal brain surgeon: Faster sparse recovery by leveraging
    second-order information. In <i>38th Conference on Neural Information Processing
    Systems</i> (Vol. 37). Vancouver, Canada: Neural Information Processing Systems
    Foundation.'
  chicago: 'Wu, Diyuan, Ionut-Vlad Modoranu, Mher Safaryan, Denis Kuznedelev, and
    Dan-Adrian Alistarh. “The Iterative Optimal Brain Surgeon: Faster Sparse Recovery
    by Leveraging Second-Order Information.” In <i>38th Conference on Neural Information
    Processing Systems</i>, Vol. 37. Neural Information Processing Systems Foundation,
    2024.'
  ieee: 'D. Wu, I.-V. Modoranu, M. Safaryan, D. Kuznedelev, and D.-A. Alistarh, “The
    iterative optimal brain surgeon: Faster sparse recovery by leveraging second-order
    information,” in <i>38th Conference on Neural Information Processing Systems</i>,
    Vancouver, Canada, 2024, vol. 37.'
  ista: 'Wu D, Modoranu I-V, Safaryan M, Kuznedelev D, Alistarh D-A. 2024. The iterative
    optimal brain surgeon: Faster sparse recovery by leveraging second-order information.
    38th Conference on Neural Information Processing Systems. NeurIPS: Neural Information
    Processing Systems, Advances in Neural Information Processing Systems, vol. 37.'
  mla: 'Wu, Diyuan, et al. “The Iterative Optimal Brain Surgeon: Faster Sparse Recovery
    by Leveraging Second-Order Information.” <i>38th Conference on Neural Information
    Processing Systems</i>, vol. 37, Neural Information Processing Systems Foundation,
    2024.'
  short: D. Wu, I.-V. Modoranu, M. Safaryan, D. Kuznedelev, D.-A. Alistarh, in:, 38th
    Conference on Neural Information Processing Systems, Neural Information Processing
    Systems Foundation, 2024.
conference:
  end_date: 2024-12-15
  location: Vancouver, Canada
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2024-12-09
corr_author: '1'
date_created: 2025-04-06T22:01:32Z
date_published: 2024-12-20T00:00:00Z
date_updated: 2025-05-14T11:37:10Z
day: '20'
department:
- _id: DaAl
- _id: MaMo
ec_funded: 1
external_id:
  arxiv:
  - '2408.17163'
intvolume: '37'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2408.17163
month: '12'
oa: 1
oa_version: Preprint
project:
- _id: fc2ed2f7-9c52-11eb-aca3-c01059dda49c
  call_identifier: H2020
  grant_number: '101034413'
  name: 'IST-BRIDGE: International postdoctoral program'
publication: 38th Conference on Neural Information Processing Systems
publication_identifier:
  issn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'The iterative optimal brain surgeon: Faster sparse recovery by leveraging
  second-order information'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
---
OA_place: publisher
OA_type: gold
_id: '19519'
abstract:
- lang: eng
  text: There has been significant interest in "extreme" compression of large language
    models (LLMs), i.e. to 1-2 bits per parameter, which allows such models to be
    executed efficiently on resource-constrained devices. Existing work focused on
    improved one-shot quantization techniques and weight representations; yet, purely
    post-training approaches are reaching diminishing returns in terms of the accuracy-vs-bit-width
    trade-off. State-of-the-art quantization methods such as QuIP# and AQLM include
    fine-tuning (part of) the compressed parameters over a limited amount of calibration
    data; however, such fine-tuning techniques over compressed weights often make
    exclusive use of straight-through estimators (STE), whose performance is not well-understood
    in this setting. In this work, we question the use of STE for extreme LLM compression,
    showing that it can be sub-optimal, and perform a systematic study of quantization-aware
    fine-tuning strategies for LLMs. We propose PV-Tuning - a representation-agnostic
    framework that generalizes and improves upon existing fine-tuning strategies,
    and provides convergence guarantees in restricted cases. On the practical side,
    when used for 1-2 bit vector quantization, PV-Tuning outperforms prior techniques
    for highly-performant models such as Llama and Mistral. Using PV-Tuning, we achieve
    the first Pareto-optimal quantization for Llama-2 family models at 2 bits per
    parameter.
acknowledgement: "The authors would like to thank Vage Egiazarian, Andrei Panferov
  and Ruslan Svirschevski for their help and advice on the AQLM codebase and running
  large-scale experiments. We also thank Philip Zmushko and Artem Fedorov for helpful
  discussions during the early stages of our research. The research of Kai Yi, Konstantin
  Burlachenko, and Peter Richtárik reported in this publication was supported by funding
  from King Abdullah University of Science and Technology (KAUST) – Center of Excellence
  for Generative AI, under award number 5940. We would also like to thank our NeurIPS
  reviewers for their helpful suggestions; we specifically highlight p3Lv’s suggestions
  to consider smaller codebook sizes and evaluate PV-Tuning with QuIP#, both of which
  produced interesting findings. Finally, we thank the open-source contributors from
  llama.cpp and the LocalLlama community for discussions and inspirations on practical
  use cases of quantized language models, and in particular, Yalda Shabanzadeh and
  Arthur Aardvark for their help with improving the codebase."
alternative_title:
- Advances in Neural Information Processing Systems
article_processing_charge: No
arxiv: 1
author:
- first_name: Vladimir
  full_name: Malinovskii, Vladimir
  last_name: Malinovskii
- first_name: Denis
  full_name: Mazur, Denis
  last_name: Mazur
- first_name: Ivan
  full_name: Ilin, Ivan
  last_name: Ilin
- first_name: Denis
  full_name: Kuznedelev, Denis
  last_name: Kuznedelev
- first_name: Konstantin
  full_name: Burlachenko, Konstantin
  last_name: Burlachenko
- first_name: Kai
  full_name: Yi, Kai
  last_name: Yi
- first_name: Dan-Adrian
  full_name: Alistarh, Dan-Adrian
  id: 4A899BFC-F248-11E8-B48F-1D18A9856A87
  last_name: Alistarh
  orcid: 0000-0003-3650-940X
- first_name: Peter
  full_name: Richtarik, Peter
  last_name: Richtarik
citation:
  ama: 'Malinovskii V, Mazur D, Ilin I, et al. PV-tuning: Beyond straight-through
    estimation for extreme LLM compression. In: <i>38th Conference on Neural Information
    Processing Systems</i>. Vol 37. Neural Information Processing Systems Foundation;
    2024.'
  apa: 'Malinovskii, V., Mazur, D., Ilin, I., Kuznedelev, D., Burlachenko, K., Yi,
    K., … Richtarik, P. (2024). PV-tuning: Beyond straight-through estimation for
    extreme LLM compression. In <i>38th Conference on Neural Information Processing
    Systems</i> (Vol. 37). Vancouver, Canada: Neural Information Processing Systems
    Foundation.'
  chicago: 'Malinovskii, Vladimir, Denis Mazur, Ivan Ilin, Denis Kuznedelev, Konstantin
    Burlachenko, Kai Yi, Dan-Adrian Alistarh, and Peter Richtarik. “PV-Tuning: Beyond
    Straight-through Estimation for Extreme LLM Compression.” In <i>38th Conference
    on Neural Information Processing Systems</i>, Vol. 37. Neural Information Processing
    Systems Foundation, 2024.'
  ieee: 'V. Malinovskii <i>et al.</i>, “PV-tuning: Beyond straight-through estimation
    for extreme LLM compression,” in <i>38th Conference on Neural Information Processing
    Systems</i>, Vancouver, Canada, 2024, vol. 37.'
  ista: 'Malinovskii V, Mazur D, Ilin I, Kuznedelev D, Burlachenko K, Yi K, Alistarh
    D-A, Richtarik P. 2024. PV-tuning: Beyond straight-through estimation for extreme
    LLM compression. 38th Conference on Neural Information Processing Systems. NeurIPS:
    Neural Information Processing Systems, Advances in Neural Information Processing
    Systems, vol. 37.'
  mla: 'Malinovskii, Vladimir, et al. “PV-Tuning: Beyond Straight-through Estimation
    for Extreme LLM Compression.” <i>38th Conference on Neural Information Processing
    Systems</i>, vol. 37, Neural Information Processing Systems Foundation, 2024.'
  short: V. Malinovskii, D. Mazur, I. Ilin, D. Kuznedelev, K. Burlachenko, K. Yi,
    D.-A. Alistarh, P. Richtarik, in:, 38th Conference on Neural Information Processing
    Systems, Neural Information Processing Systems Foundation, 2024.
conference:
  end_date: 2024-12-15
  location: Vancouver, Canada
  name: 'NeurIPS: Neural Information Processing Systems'
  start_date: 2024-12-10
date_created: 2025-04-06T22:01:32Z
date_published: 2024-12-20T00:00:00Z
date_updated: 2025-05-14T10:49:20Z
day: '20'
ddc:
- '000'
department:
- _id: DaAl
external_id:
  arxiv:
  - '2405.14852'
file:
- access_level: open_access
  checksum: 54d36f947887e26d0e568b512167001a
  content_type: application/pdf
  creator: dernst
  date_created: 2025-04-07T09:17:10Z
  date_updated: 2025-04-07T09:17:10Z
  file_id: '19521'
  file_name: 2024_NeurIPS_Malinovskii.pdf
  file_size: 939712
  relation: main_file
  success: 1
file_date_updated: 2025-04-07T09:17:10Z
has_accepted_license: '1'
intvolume: '37'
language:
- iso: eng
month: '12'
oa: 1
oa_version: Published Version
publication: 38th Conference on Neural Information Processing Systems
publication_identifier:
  isbn:
  - '9798331314385'
  issn:
  - 1049-5258
publication_status: published
publisher: Neural Information Processing Systems Foundation
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'PV-tuning: Beyond straight-through estimation for extreme LLM compression'
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 37
year: '2024'
...
---
OA_place: repository
OA_type: green
_id: '19520'
abstract:
- lang: eng
  text: Vertebrates exhibit a wide range of motor behaviors, ranging from swimming
    to complex limb-based movements. Here we take advantage of frog metamorphosis,
    which captures a swim-to-limb-based movement transformation during the development
    of a single organism, to explore changes in the underlying spinal circuits. We
    find that the tadpole spinal cord contains small and largely homogeneous populations
    of motor neurons (MNs) and V1 interneurons (V1s) at early escape swimming stages.
    These neuronal populations only modestly increase in number and subtype heterogeneity
    with the emergence of free swimming. In contrast, during frog metamorphosis and
    the emergence of limb movement, there is a dramatic expansion of MN and V1 interneuron
    number and transcriptional heterogeneity, culminating in cohorts of neurons that
    exhibit striking molecular similarity to mammalian motor circuits. CRISPR/Cas9-mediated
    gene disruption of the limb MN and V1 determinants FoxP1 and Engrailed-1, respectively,
    results in severe but selective deficits in tail and limb function. Our work thus
    demonstrates that neural diversity scales exponentially with increasing behavioral
    complexity and illustrates striking evolutionary conservation in the molecular
    organization and function of motor circuits across species.
acknowledged_ssus:
- _id: Bio
acknowledgement: "We would like to thank the members of the Sweeney Lab (especially
  Stavros Papadopoulos and Sophie Gobeil) for their contributions to this project
  and, in addition to the lab, Graziana Gatto and Mario de Bono, for discussion and
  support. We are also grateful to Tom Jessell and Chris Kintner for their scientific
  insight and mentorship during the conception of this project. This project would
  also not have been possible without the technical support of Matthias Nowak, Verena
  Mayer, and the Aquatics as well as the Imaging and Optics Facility support teams
  (ISTA). In addition, we thank our funding sources for providing the resources to
  do these experiments: FTI Strategy Lower Austria Dissertation Grant Number FT121-D-046
  (D.V.); Horizon Europe ERC Starting Grant Number 101041551 (L.B.S., F.A.T. and
  D.V.); Special Research Program (SFB) of the Austrian Science Fund (FWF) Project
  number F7814-B (L.B.S.); NINDS 5R35NS116858 (J.S.D.); CZI grant DAF2020-225401
  (DOI: 10.37921/120055ratwvi) (R.H.); NIH grant number R01NS123116 (J.B.B.); American
  Lebanese Syrian Associated Charities (ALSAC) (J.B.B.); German Academic Exchange
  Service (DAAD) IFI Grant Number 57515251-91853472 (Z.H.); and Project A.L.S. (S.B-M.)."
article_processing_charge: No
author:
- first_name: David
  full_name: Vijatovic, David
  id: cf391e77-ec3c-11ea-a124-d69323410b58
  last_name: Vijatovic
- first_name: Florina Alexandra
  full_name: Toma, Florina Alexandra
  id: 2f73f876-f128-11eb-9611-b96b5a30cb0e
  last_name: Toma
- first_name: Zoe P
  full_name: Harrington, Zoe P
  id: a8144562-32c9-11ee-b5ce-d9800628bda2
  last_name: Harrington
  orcid: 0009-0008-0158-4032
- first_name: Christoph M
  full_name: Sommer, Christoph M
  id: 4DF26D8C-F248-11E8-B48F-1D18A9856A87
  last_name: Sommer
  orcid: 0000-0003-1216-9105
- first_name: Robert
  full_name: Hauschild, Robert
  id: 4E01D6B4-F248-11E8-B48F-1D18A9856A87
  last_name: Hauschild
  orcid: 0000-0001-9843-3522
- first_name: Alexandra J.
  full_name: Trevisan, Alexandra J.
  last_name: Trevisan
- first_name: Phillip
  full_name: Chapman, Phillip
  last_name: Chapman
- first_name: Mara
  full_name: Julseth, Mara
  id: 1cf464b2-dc7d-11ea-9b2f-f9b1aa9417d1
  last_name: Julseth
- first_name: Susan
  full_name: Brenner-Morton, Susan
  last_name: Brenner-Morton
- first_name: Mariano I.
  full_name: Gabitto, Mariano I.
  last_name: Gabitto
- first_name: Jeremy S.
  full_name: Dasen, Jeremy S.
  last_name: Dasen
- first_name: Jay B.
  full_name: Bikoff, Jay B.
  last_name: Bikoff
- first_name: Lora Beatrice Jaeger
  full_name: Sweeney, Lora Beatrice Jaeger
  id: 56BE8254-C4F0-11E9-8E45-0B23E6697425
  last_name: Sweeney
  orcid: 0000-0001-9242-5601
citation:
  ama: Vijatovic D, Toma FA, Harrington ZP, et al. Spinal neuron diversity scales
    exponentially with swim-to-limb transformation during frog metamorphosis. <i>bioRxiv</i>.
    doi:<a href="https://doi.org/10.1101/2024.09.20.614050">10.1101/2024.09.20.614050</a>
  apa: Vijatovic, D., Toma, F. A., Harrington, Z. P., Sommer, C. M., Hauschild, R.,
    Trevisan, A. J., … Sweeney, L. B. (n.d.). Spinal neuron diversity scales exponentially
    with swim-to-limb transformation during frog metamorphosis. <i>bioRxiv</i>. <a
    href="https://doi.org/10.1101/2024.09.20.614050">https://doi.org/10.1101/2024.09.20.614050</a>
  chicago: Vijatovic, David, Florina Alexandra  Toma, Zoe P Harrington, Christoph
    M Sommer, Robert Hauschild, Alexandra J. Trevisan, Phillip Chapman, et al. “Spinal
    Neuron Diversity Scales Exponentially with Swim-to-Limb Transformation during
    Frog Metamorphosis.” <i>BioRxiv</i>, n.d. <a href="https://doi.org/10.1101/2024.09.20.614050">https://doi.org/10.1101/2024.09.20.614050</a>.
  ieee: D. Vijatovic <i>et al.</i>, “Spinal neuron diversity scales exponentially
    with swim-to-limb transformation during frog metamorphosis,” <i>bioRxiv</i>.
  ista: Vijatovic D, Toma FA, Harrington ZP, Sommer CM, Hauschild R, Trevisan AJ,
    Chapman P, Julseth M, Brenner-Morton S, Gabitto MI, Dasen JS, Bikoff JB, Sweeney
    LB. Spinal neuron diversity scales exponentially with swim-to-limb transformation
    during frog metamorphosis. bioRxiv, <a href="https://doi.org/10.1101/2024.09.20.614050">10.1101/2024.09.20.614050</a>.
  mla: Vijatovic, David, et al. “Spinal Neuron Diversity Scales Exponentially with
    Swim-to-Limb Transformation during Frog Metamorphosis.” <i>BioRxiv</i>, doi:<a
    href="https://doi.org/10.1101/2024.09.20.614050">10.1101/2024.09.20.614050</a>.
  short: D. Vijatovic, F.A. Toma, Z.P. Harrington, C.M. Sommer, R. Hauschild, A.J.
    Trevisan, P. Chapman, M. Julseth, S. Brenner-Morton, M.I. Gabitto, J.S. Dasen,
    J.B. Bikoff, L.B. Sweeney, BioRxiv (n.d.).
corr_author: '1'
date_created: 2025-04-07T08:48:28Z
date_published: 2024-09-27T00:00:00Z
date_updated: 2025-05-14T11:40:13Z
day: '27'
department:
- _id: LoSw
- _id: TiVo
- _id: Bio
- _id: NiBa
doi: 10.1101/2024.09.20.614050
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.1101/2024.09.20.614050
month: '09'
oa: 1
oa_version: Preprint
project:
- _id: bd73af52-d553-11ed-ba76-912049f0ac7a
  grant_number: FTI21-D-046
  name: Development of V1 interneuron diversity during swim-to-walk transition of
    Xenopus metamorphosis
- _id: ebb66355-77a9-11ec-83b8-b8ac210a4dae
  grant_number: '101041551'
  name: Development and Evolution of Tetrapod Motor Circuits
- _id: c08e9ad1-5a5b-11eb-8a69-9d1cf3b07473
  grant_number: CZI01
  name: Tools for automation and feedback microscopy
publication: bioRxiv
publication_status: submitted
status: public
title: Spinal neuron diversity scales exponentially with swim-to-limb transformation
  during frog metamorphosis
type: preprint
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2024'
...
---
_id: '14769'
abstract:
- lang: eng
  text: 'For a set of points in Rd, the Euclidean k-means problem consists of finding
    k centers such that the sum of distances squared from each data point to its closest
    center is minimized. Coresets are one of the main tools developed recently to solve
    this problem in a big data context. They allow one to compress the initial dataset
    while preserving its structure: running any algorithm on the coreset provides
    a guarantee almost equivalent to running it on the full data. In this work, we
    study coresets in a fully-dynamic setting: points are added and deleted with the
    goal to efficiently maintain a coreset with which a k-means solution can be computed.
    Based on an algorithm from Henzinger and Kale [ESA''20], we present an efficient
    and practical implementation of a fully dynamic coreset algorithm that improves
    the running time by up to a factor of 20 compared to our non-optimized implementation
    of the algorithm by Henzinger and Kale, without sacrificing more than 7% on the
    quality of the k-means solution.'
acknowledgement: This project has received funding from the European Research Council
  (ERC) under the European Union’s Horizon 2020 research and innovation programme
  (Grant agreement No. 101019564 “The Design of Modern Fully Dynamic Data Structures
  (MoDynStruct)”) and the Austrian Science Fund (FWF) project Z 422-N, project “Static
  and Dynamic Hierarchical Graph Decompositions”, I 5982-N, and project “Fast Algorithms
  for a Reactive Network Layer (ReactNet)”, P 33775-N, with additional funding from
  the netidee SCIENCE Stiftung, 2020–2024. D. Saulpic has received funding from the
  European Union’s Horizon 2020 research and innovation programme under the Marie
  Sklodowska-Curie grant agreement No 101034413.
article_processing_charge: No
arxiv: 1
author:
- first_name: Monika H
  full_name: Henzinger, Monika H
  id: 540c9bbd-f2de-11ec-812d-d04a5be85630
  last_name: Henzinger
  orcid: 0000-0002-5008-6530
- first_name: David
  full_name: Saulpic, David
  id: f8e48cf0-b0ff-11ed-b0e9-b4c35598f964
  last_name: Saulpic
- first_name: Leonhard
  full_name: Sidl, Leonhard
  id: 8b563fd0-b441-11ee-9101-a3891c61efa6
  last_name: Sidl
citation:
  ama: 'Henzinger M, Saulpic D, Sidl L. Experimental evaluation of fully dynamic k-means
    via coresets. In: <i>2024 Proceedings of the Symposium on Algorithm Engineering
    and Experiments</i>. Society for Industrial and Applied Mathematics; 2024:220-233.
    doi:<a href="https://doi.org/10.1137/1.9781611977929.17">10.1137/1.9781611977929.17</a>'
  apa: 'Henzinger, M., Saulpic, D., &#38; Sidl, L. (2024). Experimental evaluation
    of fully dynamic k-means via coresets. In <i>2024 Proceedings of the Symposium
    on Algorithm Engineering and Experiments</i> (pp. 220–233). Alexandria, VA, United
    States: Society for Industrial and Applied Mathematics. <a href="https://doi.org/10.1137/1.9781611977929.17">https://doi.org/10.1137/1.9781611977929.17</a>'
  chicago: Henzinger, Monika, David Saulpic, and Leonhard Sidl. “Experimental Evaluation
    of Fully Dynamic K-Means via Coresets.” In <i>2024 Proceedings of the Symposium
    on Algorithm Engineering and Experiments</i>, 220–33. Society for Industrial and
    Applied Mathematics, 2024. <a href="https://doi.org/10.1137/1.9781611977929.17">https://doi.org/10.1137/1.9781611977929.17</a>.
  ieee: M. Henzinger, D. Saulpic, and L. Sidl, “Experimental evaluation of fully dynamic
    k-means via coresets,” in <i>2024 Proceedings of the Symposium on Algorithm Engineering
    and Experiments</i>, Alexandria, VA, United States, 2024, pp. 220–233.
  ista: 'Henzinger M, Saulpic D, Sidl L. 2024. Experimental evaluation of fully dynamic
    k-means via coresets. 2024 Proceedings of the Symposium on Algorithm Engineering
    and Experiments. ALENEX: Workshop on Algorithm Engineering and Experiments, 220–233.'
  mla: Henzinger, Monika, et al. “Experimental Evaluation of Fully Dynamic K-Means
    via Coresets.” <i>2024 Proceedings of the Symposium on Algorithm Engineering and
    Experiments</i>, Society for Industrial and Applied Mathematics, 2024, pp. 220–33,
    doi:<a href="https://doi.org/10.1137/1.9781611977929.17">10.1137/1.9781611977929.17</a>.
  short: M. Henzinger, D. Saulpic, L. Sidl, in:, 2024 Proceedings of the Symposium
    on Algorithm Engineering and Experiments, Society for Industrial and Applied Mathematics,
    2024, pp. 220–233.
conference:
  end_date: 2024-01-08
  location: Alexandria, VA, United States
  name: 'ALENEX: Workshop on Algorithm Engineering and Experiments'
  start_date: 2024-01-07
corr_author: '1'
date_created: 2024-01-09T16:22:47Z
date_published: 2024-01-04T00:00:00Z
date_updated: 2025-04-14T13:50:50Z
day: '04'
department:
- _id: MoHe
doi: 10.1137/1.9781611977929.17
ec_funded: 1
external_id:
  arxiv:
  - '2310.18034'
language:
- iso: eng
main_file_link:
- open_access: '1'
  url: https://doi.org/10.48550/arXiv.2310.18034
month: '01'
oa: 1
oa_version: Preprint
page: 220-233
project:
- _id: bd9ca328-d553-11ed-ba76-dc4f890cfe62
  call_identifier: H2020
  grant_number: '101019564'
  name: The design and evaluation of modern fully dynamic data structures
- _id: 34def286-11ca-11ed-8bc3-da5948e1613c
  grant_number: Z00422
  name: Efficient algorithms
- _id: bda196b2-d553-11ed-ba76-8e8ee6c21103
  grant_number: I05982
  name: Static and Dynamic Hierarchical Graph Decompositions
- _id: bd9e3a2e-d553-11ed-ba76-8aa684ce17fe
  grant_number: P33775
  name: Fast Algorithms for a Reactive Network Layer
- _id: fc2ed2f7-9c52-11eb-aca3-c01059dda49c
  call_identifier: H2020
  grant_number: '101034413'
  name: 'IST-BRIDGE: International postdoctoral program'
publication: 2024 Proceedings of the Symposium on Algorithm Engineering and Experiments
publication_identifier:
  eisbn:
  - '9781611977929'
publication_status: published
publisher: Society for Industrial and Applied Mathematics
quality_controlled: '1'
scopus_import: '1'
status: public
title: Experimental evaluation of fully dynamic k-means via coresets
type: conference
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
year: '2024'
...
---
_id: '14794'
abstract:
- lang: eng
  text: "Mosaic analysis with double markers (MADM) technology enables the sparse
    labeling of genetically defined neurons. We present a protocol for time-lapse
    imaging of cortical projection neuron migration in mice using MADM. We describe
    steps for the isolation, culturing, and 4D imaging of neuronal dynamics in MADM-labeled
    brain tissue. While this protocol is compatible with other single-cell labeling
    methods, the MADM approach provides a genetic platform for the functional assessment
    of cell-autonomous candidate gene function and the relative contribution of non-cell-autonomous
    effects.\r\n\r\nFor complete details on the use and execution of this protocol,
    please refer to Hansen et al. (2022), Contreras et al. (2021), and Amberg and
    Hippenmeyer (2021)."
acknowledged_ssus:
- _id: Bio
- _id: PreCl
acknowledgement: We thank Florian Pauler for discussion and his expert technical support.
  This research was supported by the Scientific Service Units (SSU) at IST Austria
  through resources provided by the Imaging and Optics Facility (IOF) and Preclinical
  Facility (PCF). A.H.H. was a recipient of a DOC Fellowship (24812) of the Austrian
  Academy of Sciences.
article_number: '102795'
article_processing_charge: Yes
article_type: review
author:
- first_name: Andi H
  full_name: Hansen, Andi H
  id: 38853E16-F248-11E8-B48F-1D18A9856A87
  last_name: Hansen
- first_name: Simon
  full_name: Hippenmeyer, Simon
  id: 37B36620-F248-11E8-B48F-1D18A9856A87
  last_name: Hippenmeyer
  orcid: 0000-0003-2279-1061
citation:
  ama: Hansen AH, Hippenmeyer S. Time-lapse imaging of cortical projection neuron
    migration in mice using mosaic analysis with double markers. <i>STAR Protocols</i>.
    2024;5(1). doi:<a href="https://doi.org/10.1016/j.xpro.2023.102795">10.1016/j.xpro.2023.102795</a>
  apa: Hansen, A. H., &#38; Hippenmeyer, S. (2024). Time-lapse imaging of cortical
    projection neuron migration in mice using mosaic analysis with double markers.
    <i>STAR Protocols</i>. Elsevier. <a href="https://doi.org/10.1016/j.xpro.2023.102795">https://doi.org/10.1016/j.xpro.2023.102795</a>
  chicago: Hansen, Andi H, and Simon Hippenmeyer. “Time-Lapse Imaging of Cortical
    Projection Neuron Migration in Mice Using Mosaic Analysis with Double Markers.”
    <i>STAR Protocols</i>. Elsevier, 2024. <a href="https://doi.org/10.1016/j.xpro.2023.102795">https://doi.org/10.1016/j.xpro.2023.102795</a>.
  ieee: A. H. Hansen and S. Hippenmeyer, “Time-lapse imaging of cortical projection
    neuron migration in mice using mosaic analysis with double markers,” <i>STAR Protocols</i>,
    vol. 5, no. 1. Elsevier, 2024.
  ista: Hansen AH, Hippenmeyer S. 2024. Time-lapse imaging of cortical projection
    neuron migration in mice using mosaic analysis with double markers. STAR Protocols.
    5(1), 102795.
  mla: Hansen, Andi H., and Simon Hippenmeyer. “Time-Lapse Imaging of Cortical Projection
    Neuron Migration in Mice Using Mosaic Analysis with Double Markers.” <i>STAR Protocols</i>,
    vol. 5, no. 1, 102795, Elsevier, 2024, doi:<a href="https://doi.org/10.1016/j.xpro.2023.102795">10.1016/j.xpro.2023.102795</a>.
  short: A.H. Hansen, S. Hippenmeyer, STAR Protocols 5 (2024).
corr_author: '1'
date_created: 2024-01-14T23:00:56Z
date_published: 2024-03-15T00:00:00Z
date_updated: 2025-04-15T07:32:40Z
day: '15'
ddc:
- '570'
department:
- _id: SiHi
doi: 10.1016/j.xpro.2023.102795
external_id:
  pmid:
  - '38165800'
file:
- access_level: open_access
  checksum: 4644d537451c5c114a9d7c7829b65bba
  content_type: application/pdf
  creator: dernst
  date_created: 2024-07-16T12:04:46Z
  date_updated: 2024-07-16T12:04:46Z
  file_id: '17264'
  file_name: 2024_STARProtoc_Hansen.pdf
  file_size: 3758943
  relation: main_file
  success: 1
file_date_updated: 2024-07-16T12:04:46Z
has_accepted_license: '1'
intvolume: '5'
issue: '1'
language:
- iso: eng
month: '03'
oa: 1
oa_version: Published Version
pmid: 1
project:
- _id: 2625A13E-B435-11E9-9278-68D0E5697425
  grant_number: '24812'
  name: Molecular mechanisms of radial neuronal migration
publication: STAR Protocols
publication_identifier:
  eissn:
  - 2666-1667
publication_status: published
publisher: Elsevier
quality_controlled: '1'
related_material:
  link:
  - relation: software
    url: http://github.com/hippenmeyerlab
scopus_import: '1'
status: public
title: Time-lapse imaging of cortical projection neuron migration in mice using mosaic
  analysis with double markers
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 5
year: '2024'
...
---
_id: '14795'
abstract:
- lang: eng
  text: Metazoan development relies on the formation and remodeling of cell-cell contacts.
    Dynamic reorganization of adhesion receptors and the actomyosin cell cortex in
    space and time plays a central role in cell-cell contact formation and maturation.
    Nevertheless, how this process is mechanistically achieved when new contacts are
    formed remains unclear. Here, by building a biomimetic assay composed of progenitor
    cells adhering to supported lipid bilayers functionalized with E-cadherin ectodomains,
    we show that cortical F-actin flows, driven by the depletion of myosin-2 at the
    cell contact center, mediate the dynamic reorganization of adhesion receptors
    and cell cortex at the contact. E-cadherin-dependent downregulation of the small
    GTPase RhoA at the forming contact leads to both a depletion of myosin-2 and a
    decrease of F-actin at the contact center. At the contact rim, in contrast, myosin-2
    becomes enriched by the retraction of bleb-like protrusions, resulting in a cortical
    tension gradient from the contact rim to its center. This tension gradient, in
    turn, triggers centrifugal F-actin flows, leading to further accumulation of F-actin
    at the contact rim and the progressive redistribution of E-cadherin from the contact
    center to the rim. Eventually, this combination of actomyosin downregulation and
    flows at the contact determines the characteristic molecular organization, with
    E-cadherin and F-actin accumulating at the contact rim, where they are needed
    to mechanically link the contractile cortices of the adhering cells.
acknowledged_ssus:
- _id: Bio
- _id: PreCl
acknowledgement: "We are grateful to Edwin Munro for their feedback and help with
  the single particle analysis. We thank members of the Heisenberg and Loose labs
  for their help and feedback on the manuscript, notably Xin Tong for making the PCS2-mCherry-AHPH
  plasmid. Finally, we thank the Aquatics and Imaging & Optics facilities of ISTA
  for their continuous support, especially Yann Cesbron for assistance with the laser
  cutter. This work was supported by an ERC Advanced Grant (MECSPEC) to C.-P.H."
article_processing_charge: Yes (via OA deal)
article_type: original
author:
- first_name: Feyza N
  full_name: Arslan, Feyza N
  id: 49DA7910-F248-11E8-B48F-1D18A9856A87
  last_name: Arslan
  orcid: 0000-0001-5809-9566
- first_name: Edouard B
  full_name: Hannezo, Edouard B
  id: 3A9DB764-F248-11E8-B48F-1D18A9856A87
  last_name: Hannezo
  orcid: 0000-0001-6005-1561
- first_name: Jack
  full_name: Merrin, Jack
  id: 4515C308-F248-11E8-B48F-1D18A9856A87
  last_name: Merrin
  orcid: 0000-0001-5145-4609
- first_name: Martin
  full_name: Loose, Martin
  id: 462D4284-F248-11E8-B48F-1D18A9856A87
  last_name: Loose
  orcid: 0000-0001-7309-9724
- first_name: Carl-Philipp J
  full_name: Heisenberg, Carl-Philipp J
  id: 39427864-F248-11E8-B48F-1D18A9856A87
  last_name: Heisenberg
  orcid: 0000-0002-0912-4566
citation:
  ama: Arslan FN, Hannezo EB, Merrin J, Loose M, Heisenberg C-PJ. Adhesion-induced
    cortical flows pattern E-cadherin-mediated cell contacts. <i>Current Biology</i>.
    2024;34(1):171-182.e8. doi:<a href="https://doi.org/10.1016/j.cub.2023.11.067">10.1016/j.cub.2023.11.067</a>
  apa: Arslan, F. N., Hannezo, E. B., Merrin, J., Loose, M., &#38; Heisenberg, C.-P.
    J. (2024). Adhesion-induced cortical flows pattern E-cadherin-mediated cell contacts.
    <i>Current Biology</i>. Elsevier. <a href="https://doi.org/10.1016/j.cub.2023.11.067">https://doi.org/10.1016/j.cub.2023.11.067</a>
  chicago: Arslan, Feyza N, Edouard B Hannezo, Jack Merrin, Martin Loose, and Carl-Philipp
    J Heisenberg. “Adhesion-Induced Cortical Flows Pattern E-Cadherin-Mediated Cell
    Contacts.” <i>Current Biology</i>. Elsevier, 2024. <a href="https://doi.org/10.1016/j.cub.2023.11.067">https://doi.org/10.1016/j.cub.2023.11.067</a>.
  ieee: F. N. Arslan, E. B. Hannezo, J. Merrin, M. Loose, and C.-P. J. Heisenberg,
    “Adhesion-induced cortical flows pattern E-cadherin-mediated cell contacts,” <i>Current
    Biology</i>, vol. 34, no. 1. Elsevier, p. 171–182.e8, 2024.
  ista: Arslan FN, Hannezo EB, Merrin J, Loose M, Heisenberg C-PJ. 2024. Adhesion-induced
    cortical flows pattern E-cadherin-mediated cell contacts. Current Biology. 34(1),
    171–182.e8.
  mla: Arslan, Feyza N., et al. “Adhesion-Induced Cortical Flows Pattern E-Cadherin-Mediated
    Cell Contacts.” <i>Current Biology</i>, vol. 34, no. 1, Elsevier, 2024, p. 171–182.e8,
    doi:<a href="https://doi.org/10.1016/j.cub.2023.11.067">10.1016/j.cub.2023.11.067</a>.
  short: F.N. Arslan, E.B. Hannezo, J. Merrin, M. Loose, C.-P.J. Heisenberg, Current
    Biology 34 (2024) 171–182.e8.
corr_author: '1'
date_created: 2024-01-14T23:00:56Z
date_published: 2024-01-08T00:00:00Z
date_updated: 2025-09-04T11:39:10Z
day: '08'
ddc:
- '570'
department:
- _id: CaHe
- _id: EdHa
- _id: MaLo
- _id: NanoFab
doi: 10.1016/j.cub.2023.11.067
ec_funded: 1
external_id:
  isi:
  - '001154500400001'
  pmid:
  - '38134934'
file:
- access_level: open_access
  checksum: 51220b76d72a614208f84bdbfbaf9b72
  content_type: application/pdf
  creator: dernst
  date_created: 2024-01-16T10:53:31Z
  date_updated: 2024-01-16T10:53:31Z
  file_id: '14813'
  file_name: 2024_CurrentBiology_Arslan.pdf
  file_size: 5183861
  relation: main_file
  success: 1
file_date_updated: 2024-01-16T10:53:31Z
has_accepted_license: '1'
intvolume: '34'
isi: 1
issue: '1'
language:
- iso: eng
month: '01'
oa: 1
oa_version: Published Version
page: 171-182.e8
pmid: 1
project:
- _id: 260F1432-B435-11E9-9278-68D0E5697425
  call_identifier: H2020
  grant_number: '742573'
  name: Interaction and feedback between cell mechanics and fate specification in
    vertebrate gastrulation
publication: Current Biology
publication_identifier:
  eissn:
  - 1879-0445
  issn:
  - 0960-9822
publication_status: published
publisher: Elsevier
quality_controlled: '1'
scopus_import: '1'
status: public
title: Adhesion-induced cortical flows pattern E-cadherin-mediated cell contacts
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 317138e5-6ab7-11ef-aa6d-ffef3953e345
volume: 34
year: '2024'
...
---
OA_place: publisher
OA_type: hybrid
_id: '14797'
abstract:
- lang: eng
  text: We study a random matching problem on closed compact 2-dimensional Riemannian
    manifolds (with respect to the squared Riemannian distance), with samples of random
    points whose common law is absolutely continuous with respect to the volume measure,
    with strictly positive and bounded density. We show that, given two sequences of
    numbers n and m = m(n) of points, asymptotically equivalent as n goes to infinity,
    the optimal transport plan between the two empirical measures μ<sub>n</sub> and
    ν<sub>m</sub> is quantitatively well-approximated by (Id, exp(∇h<sub>n</sub>))#μ<sub>n</sub>,
    where h<sub>n</sub> solves a linear elliptic PDE obtained by a regularized first-order
    linearization of the Monge-Ampère equation. This is obtained in the case of samples
    of correlated random points for which a stretched exponential decay of the α-mixing
    coefficient holds, and for a class of discrete-time Markov chains having a unique
    absolutely continuous invariant measure with respect to the volume measure.
acknowledgement: "NC has received funding from the European Research Council (ERC)
  under the European Union’s Horizon 2020 research and innovation programme (Grant
  agreement No 948819).\r\nFM is supported by the Deutsche Forschungsgemeinschaft
  (DFG, German Research Foundation) through the SPP 2265 Random Geometric Systems.
  FM has been funded by the Deutsche Forschungsgemeinschaft (DFG, German Research
  Foundation) under Germany’s Excellence Strategy EXC 2044 -390685587, Mathematics
  Münster: Dynamics–Geometry–Structure. FM has been funded by the Max Planck Institute
  for Mathematics in the Sciences."
article_processing_charge: Yes (in subscription journal)
article_type: original
arxiv: 1
author:
- first_name: Nicolas
  full_name: Clozeau, Nicolas
  id: fea1b376-906f-11eb-847d-b2c0cf46455b
  last_name: Clozeau
- first_name: Francesco
  full_name: Mattesini, Francesco
  last_name: Mattesini
citation:
  ama: Clozeau N, Mattesini F. Annealed quantitative estimates for the quadratic 2D-discrete
    random matching problem. <i>Probability Theory and Related Fields</i>. 2024;190:485-541.
    doi:<a href="https://doi.org/10.1007/s00440-023-01254-0">10.1007/s00440-023-01254-0</a>
  apa: Clozeau, N., &#38; Mattesini, F. (2024). Annealed quantitative estimates for
    the quadratic 2D-discrete random matching problem. <i>Probability Theory and Related
    Fields</i>. Springer Nature. <a href="https://doi.org/10.1007/s00440-023-01254-0">https://doi.org/10.1007/s00440-023-01254-0</a>
  chicago: Clozeau, Nicolas, and Francesco Mattesini. “Annealed Quantitative Estimates
    for the Quadratic 2D-Discrete Random Matching Problem.” <i>Probability Theory
    and Related Fields</i>. Springer Nature, 2024. <a href="https://doi.org/10.1007/s00440-023-01254-0">https://doi.org/10.1007/s00440-023-01254-0</a>.
  ieee: N. Clozeau and F. Mattesini, “Annealed quantitative estimates for the quadratic
    2D-discrete random matching problem,” <i>Probability Theory and Related Fields</i>,
    vol. 190. Springer Nature, pp. 485–541, 2024.
  ista: Clozeau N, Mattesini F. 2024. Annealed quantitative estimates for the quadratic
    2D-discrete random matching problem. Probability Theory and Related Fields. 190,
    485–541.
  mla: Clozeau, Nicolas, and Francesco Mattesini. “Annealed Quantitative Estimates
    for the Quadratic 2D-Discrete Random Matching Problem.” <i>Probability Theory
    and Related Fields</i>, vol. 190, Springer Nature, 2024, pp. 485–541, doi:<a href="https://doi.org/10.1007/s00440-023-01254-0">10.1007/s00440-023-01254-0</a>.
  short: N. Clozeau, F. Mattesini, Probability Theory and Related Fields 190 (2024)
    485–541.
corr_author: '1'
date_created: 2024-01-14T23:00:57Z
date_published: 2024-10-01T00:00:00Z
date_updated: 2025-09-04T11:43:43Z
day: '01'
ddc:
- '510'
department:
- _id: JuFi
doi: 10.1007/s00440-023-01254-0
ec_funded: 1
external_id:
  arxiv:
  - '2303.00353'
  isi:
  - '001136206200002'
file:
- access_level: open_access
  checksum: 34f44cad6a210ff66791ee37e590af2c
  content_type: application/pdf
  creator: dernst
  date_created: 2025-01-09T08:10:54Z
  date_updated: 2025-01-09T08:10:54Z
  file_id: '18788'
  file_name: 2024_ProbTheoryRelatFields_Clozeau.pdf
  file_size: 880117
  relation: main_file
  success: 1
file_date_updated: 2025-01-09T08:10:54Z
has_accepted_license: '1'
intvolume: '       190'
isi: 1
language:
- iso: eng
month: '10'
oa: 1
oa_version: Published Version
page: 485-541
project:
- _id: 0aa76401-070f-11eb-9043-b5bb049fa26d
  call_identifier: H2020
  grant_number: '948819'
  name: Bridging Scales in Random Materials
publication: Probability Theory and Related Fields
publication_identifier:
  eissn:
  - 1432-2064
  issn:
  - 0178-8051
publication_status: published
publisher: Springer Nature
quality_controlled: '1'
scopus_import: '1'
status: public
title: Annealed quantitative estimates for the quadratic 2D-discrete random matching
  problem
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 317138e5-6ab7-11ef-aa6d-ffef3953e345
volume: 190
year: '2024'
...
---
APC_amount: 3393,38 EUR
DOAJ_listed: '1'
OA_place: publisher
OA_type: gold
_id: '14802'
abstract:
- lang: eng
  text: Frequency-stable lasers form the backbone of precision measurements in science
    and technology. Such lasers typically attain their stability through frequency
    locking to reference cavities. State-of-the-art locking performances to date have
    been achieved using frequency-modulation-based methods, complemented with active
    drift cancellation systems. We demonstrate an all-passive, modulation-free laser-cavity
    locking technique (squash locking) that utilizes changes in spatial beam ellipticity
    for error signal generation, and a coherent polarization post-selection for noise
    resilience. By comparing two identically built proof-of-principle systems, we
    show a frequency locking instability of 5×10<sup>−7</sup> relative to
    the cavity linewidth at 10 s averaging. The results surpass the demonstrated performances
    of methods engineered over the last five decades, potentially enabling an advance
    in the precision control of lasers, while creating avenues for bridging the performance
    gap between industrial-grade lasers and scientific ones thanks to the afforded
    simplicity and scalability.
acknowledgement: We thank Rishabh Sahu and Sebastian Wald for technical contributions
  to the experiment. Funded by the Institute of Science and Technology Austria.
article_processing_charge: Yes
article_type: original
author:
- first_name: Fritz R
  full_name: Diorico, Fritz R
  id: 2E054C4C-F248-11E8-B48F-1D18A9856A87
  last_name: Diorico
  orcid: 0000-0002-4947-8924
- first_name: Artem
  full_name: Zhutov, Artem
  id: 0f02ed6a-b514-11ee-b891-8379c5f19cb7
  last_name: Zhutov
- first_name: Onur
  full_name: Hosten, Onur
  id: 4C02D85E-F248-11E8-B48F-1D18A9856A87
  last_name: Hosten
  orcid: 0000-0002-2031-204X
citation:
  ama: 'Diorico FR, Zhutov A, Hosten O. Laser-cavity locking utilizing beam ellipticity:
    accessing the 10<sup>−7</sup> instability scale relative to cavity linewidth.
    <i>Optica</i>. 2024;11(1):26-31. doi:<a href="https://doi.org/10.1364/optica.507451">10.1364/optica.507451</a>'
  apa: 'Diorico, F. R., Zhutov, A., &#38; Hosten, O. (2024). Laser-cavity locking
    utilizing beam ellipticity: accessing the 10<sup>−7</sup> instability scale relative
    to cavity linewidth. <i>Optica</i>. Optica Publishing Group. <a href="https://doi.org/10.1364/optica.507451">https://doi.org/10.1364/optica.507451</a>'
  chicago: 'Diorico, Fritz R, Artem Zhutov, and Onur Hosten. “Laser-Cavity Locking
    Utilizing Beam Ellipticity: Accessing the 10<sup>−7</sup> Instability Scale Relative
    to Cavity Linewidth.” <i>Optica</i>. Optica Publishing Group, 2024. <a href="https://doi.org/10.1364/optica.507451">https://doi.org/10.1364/optica.507451</a>.'
  ieee: 'F. R. Diorico, A. Zhutov, and O. Hosten, “Laser-cavity locking utilizing
    beam ellipticity: accessing the 10<sup>−7</sup> instability scale relative to
    cavity linewidth,” <i>Optica</i>, vol. 11, no. 1. Optica Publishing Group, pp.
    26–31, 2024.'
  ista: 'Diorico FR, Zhutov A, Hosten O. 2024. Laser-cavity locking utilizing beam
    ellipticity: accessing the 10<sup>−7</sup> instability scale relative to cavity linewidth.
    Optica. 11(1), 26–31.'
  mla: 'Diorico, Fritz R., et al. “Laser-Cavity Locking Utilizing Beam Ellipticity:
    Accessing the 10<sup>−7</sup> Instability Scale Relative to Cavity Linewidth.”
    <i>Optica</i>, vol. 11, no. 1, Optica Publishing Group, 2024, pp. 26–31, doi:<a
    href="https://doi.org/10.1364/optica.507451">10.1364/optica.507451</a>.'
  short: F.R. Diorico, A. Zhutov, O. Hosten, Optica 11 (2024) 26–31.
corr_author: '1'
date_created: 2024-01-15T10:25:38Z
date_published: 2024-01-20T00:00:00Z
date_updated: 2025-09-04T12:13:27Z
day: '20'
ddc:
- '530'
department:
- _id: OnHo
doi: 10.1364/optica.507451
external_id:
  isi:
  - '001202817000004'
file:
- access_level: open_access
  checksum: eb99ca7d0fe73e22f121875175546ed7
  content_type: application/pdf
  creator: dernst
  date_created: 2024-01-17T08:53:16Z
  date_updated: 2024-01-17T08:53:16Z
  file_id: '14824'
  file_name: 2023_Optica_Diorico.pdf
  file_size: 4558986
  relation: main_file
  success: 1
file_date_updated: 2024-01-17T08:53:16Z
has_accepted_license: '1'
intvolume: '        11'
isi: 1
issue: '1'
keyword:
- Atomic and Molecular Physics, and Optics
- Electronic, Optical and Magnetic Materials
language:
- iso: eng
month: '01'
oa: 1
oa_version: Published Version
page: 26-31
publication: Optica
publication_identifier:
  issn:
  - 2334-2536
publication_status: published
publisher: Optica Publishing Group
quality_controlled: '1'
scopus_import: '1'
status: public
title: 'Laser-cavity locking utilizing beam ellipticity: accessing the 10<sup>−7</sup>
  instability scale relative to cavity linewidth'
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 317138e5-6ab7-11ef-aa6d-ffef3953e345
volume: 11
year: '2024'
...
---
_id: '14820'
abstract:
- lang: eng
  text: "We consider a natural problem dealing with weighted packet selection across
    a rechargeable link, which, e.g., finds applications in cryptocurrency networks.
    The capacity of a link (u, v) is determined by how much capacity nodes u and v
    allocate for this link. Specifically, the input is a finite ordered sequence of
    packets that arrive in both directions along a link. Given (u, v) and a packet
    of weight x going from u to v, node u can either accept or reject the packet.
    If u accepts the packet, the capacity on link (u, v) decreases by x. Correspondingly,
    v's capacity on the link increases by x. If a node rejects the packet, this entails
    a cost affinely linear in the weight of the packet. A link is “rechargeable” in
    the sense that the total capacity of the link has to remain constant, but the
    allocation of capacity at the ends of the link can depend arbitrarily on the nodes'
    decisions. The goal is to minimise the sum of the capacity injected into the link
    and the cost of rejecting packets. We show that the problem is NP-hard, but can
    be approximated efficiently with a ratio of (1+ε)·(1+√3) for some arbitrary ε>0."
acknowledgement: We thank Mahsa Bastankhah and Mohammad Ali Maddah-Ali for fruitful
  discussions about different variants of the problem. This work is supported by the
  European Research Council (ERC) Consolidator Project 864228 (AdjustNet), 2020-2025,
  the ERC CoG 863818 (ForM-SMArt), and the German Research Foundation (DFG) grant
  470029389 (FlexNets), 2021-2024.
article_number: '114353'
article_processing_charge: Yes (via OA deal)
article_type: original
author:
- first_name: Stefan
  full_name: Schmid, Stefan
  last_name: Schmid
- first_name: Jakub
  full_name: Svoboda, Jakub
  id: 130759D2-D7DD-11E9-87D2-DE0DE6697425
  last_name: Svoboda
  orcid: 0000-0002-1419-3267
- first_name: Michelle X
  full_name: Yeo, Michelle X
  id: 2D82B818-F248-11E8-B48F-1D18A9856A87
  last_name: Yeo
  orcid: 0009-0001-3676-4809
citation:
  ama: 'Schmid S, Svoboda J, Yeo MX. Weighted packet selection for rechargeable links
    in cryptocurrency networks: Complexity and approximation. <i>Theoretical Computer
    Science</i>. 2024;989. doi:<a href="https://doi.org/10.1016/j.tcs.2023.114353">10.1016/j.tcs.2023.114353</a>'
  apa: 'Schmid, S., Svoboda, J., &#38; Yeo, M. X. (2024). Weighted packet selection
    for rechargeable links in cryptocurrency networks: Complexity and approximation.
    <i>Theoretical Computer Science</i>. Elsevier. <a href="https://doi.org/10.1016/j.tcs.2023.114353">https://doi.org/10.1016/j.tcs.2023.114353</a>'
  chicago: 'Schmid, Stefan, Jakub Svoboda, and Michelle X Yeo. “Weighted Packet Selection
    for Rechargeable Links in Cryptocurrency Networks: Complexity and Approximation.”
    <i>Theoretical Computer Science</i>. Elsevier, 2024. <a href="https://doi.org/10.1016/j.tcs.2023.114353">https://doi.org/10.1016/j.tcs.2023.114353</a>.'
  ieee: 'S. Schmid, J. Svoboda, and M. X. Yeo, “Weighted packet selection for rechargeable
    links in cryptocurrency networks: Complexity and approximation,” <i>Theoretical
    Computer Science</i>, vol. 989. Elsevier, 2024.'
  ista: 'Schmid S, Svoboda J, Yeo MX. 2024. Weighted packet selection for rechargeable
    links in cryptocurrency networks: Complexity and approximation. Theoretical Computer
    Science. 989, 114353.'
  mla: 'Schmid, Stefan, et al. “Weighted Packet Selection for Rechargeable Links in
    Cryptocurrency Networks: Complexity and Approximation.” <i>Theoretical Computer
    Science</i>, vol. 989, 114353, Elsevier, 2024, doi:<a href="https://doi.org/10.1016/j.tcs.2023.114353">10.1016/j.tcs.2023.114353</a>.'
  short: S. Schmid, J. Svoboda, M.X. Yeo, Theoretical Computer Science 989 (2024).
corr_author: '1'
date_created: 2024-01-16T13:40:41Z
date_published: 2024-03-21T00:00:00Z
date_updated: 2025-12-02T14:02:37Z
day: '21'
ddc:
- '000'
department:
- _id: KrCh
- _id: KrPi
doi: 10.1016/j.tcs.2023.114353
ec_funded: 1
external_id:
  isi:
  - '001168211400001'
file:
- access_level: open_access
  checksum: efd5b7e738bf845312ba53889a3e13e4
  content_type: application/pdf
  creator: dernst
  date_created: 2024-07-16T12:02:25Z
  date_updated: 2024-07-16T12:02:25Z
  file_id: '17263'
  file_name: 2024_TheorComputerScience_Schmid.pdf
  file_size: 603570
  relation: main_file
  success: 1
file_date_updated: 2024-07-16T12:02:25Z
has_accepted_license: '1'
intvolume: '       989'
isi: 1
keyword:
- General Computer Science
- Theoretical Computer Science
language:
- iso: eng
month: '03'
oa: 1
oa_version: Published Version
project:
- _id: 0599E47C-7A3F-11EA-A408-12923DDC885E
  call_identifier: H2020
  grant_number: '863818'
  name: 'Formal Methods for Stochastic Models: Algorithms and Applications'
publication: Theoretical Computer Science
publication_identifier:
  issn:
  - 0304-3975
publication_status: published
publisher: Elsevier
quality_controlled: '1'
related_material:
  record:
  - id: '19985'
    relation: earlier_version
    status: public
scopus_import: '1'
status: public
title: 'Weighted packet selection for rechargeable links in cryptocurrency networks:
  Complexity and approximation'
tmp:
  image: /images/cc_by.png
  legal_code_url: https://creativecommons.org/licenses/by/4.0/legalcode
  name: Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
  short: CC BY (4.0)
type: journal_article
user_id: 317138e5-6ab7-11ef-aa6d-ffef3953e345
volume: 989
year: '2024'
...
---
_id: '14828'
abstract:
- lang: eng
  text: Production of hydrogen at large scale requires development of non-noble, inexpensive,
    and high-performing catalysts for constructing water-splitting devices. Herein,
    we report the synthesis of Zn-doped NiO heterostructure (ZnNiO) catalysts at room
    temperature via a coprecipitation method followed by drying (at 80 °C, 6 h) and
    calcination at an elevated temperature of 400 °C for 5 h under three distinct
    conditions, namely, air, N<sub>2</sub>, and vacuum. The vacuum-synthesized catalyst
    demonstrates a low overpotential of 88 mV at −10 mA cm<sup>−2</sup> and a small
    Tafel slope of 73 mV dec<sup>−1</sup>, suggesting relatively faster charge-transfer
    kinetics for the hydrogen evolution reaction (HER) compared with the specimens
    synthesized under N<sub>2</sub> or O<sub>2</sub> atmosphere. It also demonstrates
    an oxygen evolution reaction (OER) overpotential of 260 mV at 10 mA cm<sup>−2</sup>
    with a low Tafel slope of 63 mV dec<sup>−1</sup>. In a full-cell water-splitting
    device, the vacuum-synthesized ZnNiO heterostructure demonstrates a cell voltage
    of 1.94 V at 50 mA cm<sup>−2</sup> and shows remarkable stability over 24 h at
    a high current density of 100 mA cm<sup>−2</sup>.
    It is also demonstrated in this study that Zn-doping, surface, and interface engineering
    in transition-metal oxides play a crucial role in efficient electrocatalytic water
    splitting. Also, electronic structure calculations based on density functional
    theory (DFT + U, with U = 0–8 eV, where U is the on-site Coulomb repulsion parameter,
    also known as Hubbard U) confirm that Zn doping constructively modifies the electronic
    structure, in both the valence band and the conduction band, and is suitable for
    tailoring the effective masses of the charge carriers. A decrease in the electron
    effective masses, together with large differences between the effective masses
    of electrons and holes, is observed and found to be mainly responsible for the
    best water-splitting performance, achieved with a 9% Zn-doped NiO sample prepared
    under vacuum.
acknowledgement: This work was supported by the Technology Innovation Program (20011622,
  Development of Battery System Applied High-Efficiency Heat Control Polymer and Part
  Component) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea). The
  authors acknowledge Prof. Tsunehiro Takeuchi from the Toyota Technological Institute,
  Nagoya, Japan, for the support of computational resources.
article_processing_charge: No
article_type: original
author:
- first_name: Gundegowda Kalligowdanadoddi
  full_name: Kiran, Gundegowda Kalligowdanadoddi
  last_name: Kiran
- first_name: Saurabh
  full_name: Singh, Saurabh
  id: 12d625da-9cb3-11ed-9667-af09d37d3f0a
  last_name: Singh
  orcid: 0000-0003-2209-5269
- first_name: Neelima
  full_name: Mahato, Neelima
  last_name: Mahato
- first_name: Thupakula Venkata Madhukar
  full_name: Sreekanth, Thupakula Venkata Madhukar
  last_name: Sreekanth
- first_name: Gowra Raghupathy
  full_name: Dillip, Gowra Raghupathy
  last_name: Dillip
- first_name: Kisoo
  full_name: Yoo, Kisoo
  last_name: Yoo
- first_name: Jonghoon
  full_name: Kim, Jonghoon
  last_name: Kim
citation:
  ama: Kiran GK, Singh S, Mahato N, et al. Interface engineering modulation combined
    with electronic structure modification of Zn-doped NiO heterostructure for efficient
    water-splitting activity. <i>ACS Applied Energy Materials</i>. 2024;7(1):214-229.
    doi:<a href="https://doi.org/10.1021/acsaem.3c02519">10.1021/acsaem.3c02519</a>
  apa: Kiran, G. K., Singh, S., Mahato, N., Sreekanth, T. V. M., Dillip, G. R., Yoo,
    K., &#38; Kim, J. (2024). Interface engineering modulation combined with electronic
    structure modification of Zn-doped NiO heterostructure for efficient water-splitting
    activity. <i>ACS Applied Energy Materials</i>. American Chemical Society. <a href="https://doi.org/10.1021/acsaem.3c02519">https://doi.org/10.1021/acsaem.3c02519</a>
  chicago: Kiran, Gundegowda Kalligowdanadoddi, Saurabh Singh, Neelima Mahato, Thupakula
    Venkata Madhukar Sreekanth, Gowra Raghupathy Dillip, Kisoo Yoo, and Jonghoon Kim.
    “Interface Engineering Modulation Combined with Electronic Structure Modification
    of Zn-Doped NiO Heterostructure for Efficient Water-Splitting Activity.” <i>ACS
    Applied Energy Materials</i>. American Chemical Society, 2024. <a href="https://doi.org/10.1021/acsaem.3c02519">https://doi.org/10.1021/acsaem.3c02519</a>.
  ieee: G. K. Kiran <i>et al.</i>, “Interface engineering modulation combined with
    electronic structure modification of Zn-doped NiO heterostructure for efficient
    water-splitting activity,” <i>ACS Applied Energy Materials</i>, vol. 7, no. 1.
    American Chemical Society, pp. 214–229, 2024.
  ista: Kiran GK, Singh S, Mahato N, Sreekanth TVM, Dillip GR, Yoo K, Kim J. 2024.
    Interface engineering modulation combined with electronic structure modification
    of Zn-doped NiO heterostructure for efficient water-splitting activity. ACS Applied
    Energy Materials. 7(1), 214–229.
  mla: Kiran, Gundegowda Kalligowdanadoddi, et al. “Interface Engineering Modulation
    Combined with Electronic Structure Modification of Zn-Doped NiO Heterostructure
    for Efficient Water-Splitting Activity.” <i>ACS Applied Energy Materials</i>,
    vol. 7, no. 1, American Chemical Society, 2024, pp. 214–29, doi:<a href="https://doi.org/10.1021/acsaem.3c02519">10.1021/acsaem.3c02519</a>.
  short: G.K. Kiran, S. Singh, N. Mahato, T.V.M. Sreekanth, G.R. Dillip, K. Yoo, J.
    Kim, ACS Applied Energy Materials 7 (2024) 214–229.
corr_author: '1'
date_created: 2024-01-17T12:48:35Z
date_published: 2024-01-08T00:00:00Z
date_updated: 2024-10-09T21:07:53Z
day: '08'
department:
- _id: MaIb
doi: 10.1021/acsaem.3c02519
external_id:
  isi:
  - '001138342900001'
intvolume: '         7'
isi: 1
issue: '1'
keyword:
- Electrical and Electronic Engineering
- Materials Chemistry
- Electrochemistry
- Energy Engineering and Power Technology
- Chemical Engineering (miscellaneous)
language:
- iso: eng
month: '01'
oa_version: None
page: 214-229
publication: ACS Applied Energy Materials
publication_identifier:
  issn:
  - 2574-0962
publication_status: published
publisher: American Chemical Society
quality_controlled: '1'
scopus_import: '1'
status: public
title: Interface engineering modulation combined with electronic structure modification
  of Zn-doped NiO heterostructure for efficient water-splitting activity
type: journal_article
user_id: 2DF688A6-F248-11E8-B48F-1D18A9856A87
volume: 7
year: '2024'
...
