Author:

Hu, Jerry Yao-Chieh [1] | Wang, Wei-Po [2] | Gilani, Ammar [3] | Li, Chenyang [4] | Song, Zhao [5] | Liu, Han [6]

Indexed by:

EI

Abstract:

We investigate the statistical and computational limits of prompt tuning for transformer-based foundation models. Our key contributions show that prompt tuning on single-head transformers with only a single self-attention layer (i) is universal, and (ii) supports efficient (even almost-linear time) algorithms under the Strong Exponential Time Hypothesis (SETH). Statistically, we prove that these simplest possible transformers, equipped with prompt tuning, are universal approximators for sequence-to-sequence Lipschitz functions. In addition, we provide a lower bound, exponential in dL and in 1/ϵ, on the number of soft-prompt tokens required for prompt tuning to memorize any dataset with 1-layer, 1-head transformers. Computationally, we identify a phase transition in the efficiency of prompt tuning, determined by the norm of the soft-prompt-induced keys and queries, and provide an upper bound criterion. Beyond this criterion, no sub-quadratic (efficient) algorithm for prompt tuning exists under SETH. Within this criterion, we showcase our theory by proving the existence of almost-linear time prompt tuning inference algorithms. These fundamental limits give practitioners important necessary conditions for designing expressive and efficient prompt tuning methods. © 2025 13th International Conference on Learning Representations, ICLR 2025. All rights reserved.
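As context for the abstract, the following is a minimal illustrative sketch (in PyTorch, which the paper does not prescribe) of the object under study: a frozen 1-layer, 1-head self-attention transformer whose only trainable parameters are soft-prompt token embeddings prepended to the input sequence. All names here are hypothetical choices for illustration, not the authors' code.

    import torch
    import torch.nn as nn

    class OneLayerPromptTuning(nn.Module):
        """Frozen 1-layer, 1-head self-attention model; only the
        soft-prompt embeddings are trainable (illustrative sketch)."""

        def __init__(self, d_model: int, num_prompt_tokens: int):
            super().__init__()
            # Pretrained attention weights, frozen during prompt tuning.
            self.attn = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)
            for p in self.attn.parameters():
                p.requires_grad = False
            # The soft prompt: the only parameters updated by tuning.
            self.prompt = nn.Parameter(0.02 * torch.randn(num_prompt_tokens, d_model))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, d_model)
            prompt = self.prompt.unsqueeze(0).expand(x.size(0), -1, -1)
            h = torch.cat([prompt, x], dim=1)       # prepend soft-prompt tokens
            out, _ = self.attn(h, h, h)             # single self-attention layer
            return out[:, self.prompt.size(0):, :]  # keep only input positions

    model = OneLayerPromptTuning(d_model=64, num_prompt_tokens=8)
    y = model(torch.randn(2, 10, 64))  # output shape: (2, 10, 64)

    # The computational phase transition in the abstract is governed by the
    # norms of the soft-prompt-induced queries/keys. PyTorch packs the q/k/v
    # projections into in_proj_weight; the query slice is the first d_model rows.
    W_q = model.attn.in_proj_weight[:64]
    print((model.prompt @ W_q.T).norm(dim=-1).max())  # prompt-induced query norms

An optimizer for tuning would be built over [model.prompt] alone; the attention weights stay fixed, matching the prompt-tuning regime the abstract analyzes.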

Affiliations:

  • [1] Hu, Jerry Yao-Chieh: Northwestern University, United States
  • [2] Wang, Wei-Po: National Taiwan University, Taiwan
  • [3] Gilani, Ammar: Northwestern University, United States
  • [4] Li, Chenyang: Fuzhou University, China
  • [5] Song, Zhao: UC Berkeley, United States
  • [6] Liu, Han: Northwestern University, United States

Source: 13th International Conference on Learning Representations, ICLR 2025

Year: 2025

Pages: 48389-48441

Language: English
