Tags: in-context learning


  1. This paper provides a theoretical analysis of Transformers' limitations for time series forecasting through the lens of In-Context Learning (ICL) theory, demonstrating that even powerful Transformers often fail to outperform simpler linear models. The study focuses on Linear Self-Attention (LSA) models, showing that they cannot achieve lower expected MSE than classical linear models for in-context forecasting, and that their predictions collapse exponentially to the mean under Chain-of-Thought inference.
  2. Sparse Priming Representations (SPR) is a research project focused on developing and sharing techniques for efficiently representing complex ideas, memories, or concepts using a minimal set of keywords, phrases, or statements, enabling language models or subject matter experts to quickly reconstruct the original idea with minimal context.
  3. Jeff Dean discusses the potential of merging Google Search with large language models (LLMs) using in-context learning, emphasizing enhanced information processing and contextual accuracy while addressing computational challenges.
  4. First, using the demonstrations significantly outperforms the no-demonstrations method even with small k (k = 4), and the performance drop from using gold labels to using random labels is consistently small across varying k, in the range of 0.8–1.6%. Interestingly, model performance does not increase much as k increases when k ≥ 8, both with gold labels and with random labels.
     2024-02-14 by klotz
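The gold-vs-random-labels comparison in item 4 can be sketched as a prompt-construction step. This is a minimal illustration, not the paper's actual pipeline: the sentiment task, the label names, and the `build_prompt` helper are all assumptions chosen for concreteness; only the k-shot demonstration format and the random-label substitution reflect the described experiment.

```python
import random

def build_prompt(demos, query, randomize_labels=False,
                 label_set=("positive", "negative"), seed=0):
    """Build a k-shot in-context prompt from (text, gold_label) pairs.

    With randomize_labels=True, each demonstration's gold label is
    replaced by a uniformly random label from label_set -- the
    'random labels' condition whose small performance cost item 4
    reports (0.8-1.6% below gold labels).
    """
    rng = random.Random(seed)
    lines = []
    for text, gold in demos:
        label = rng.choice(label_set) if randomize_labels else gold
        lines.append(f"Review: {text}\nSentiment: {label}")
    # The query is appended with an empty label slot for the model to fill.
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

# k = 4 demonstrations, matching the smallest k in the quoted result.
demos = [("great movie", "positive"), ("boring plot", "negative"),
         ("loved it", "positive"), ("waste of time", "negative")]
gold_prompt = build_prompt(demos, "a stunning film")
rand_prompt = build_prompt(demos, "a stunning film", randomize_labels=True)
```

Comparing model accuracy on `gold_prompt` versus `rand_prompt` across many queries is, in outline, the experiment the excerpt summarizes.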


SemanticScuttle - klotz.me: tagged with "in-context learning"
