klotz: transformers*

  1. This article explains the LongRoPE method for extending context lengths in LLMs without significant performance degradation. It discusses why context length matters in LLMs, reviews the limitations of earlier positional encoding schemes, introduces Rotary Positional Encoding (RoPE) and its own limitations, and explains how LongRoPE extends RoPE to larger contexts. (A short RoPE sketch follows this list.)
  2. Delving into transformer networks
  3. python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I love you'))"
  4. Exploring the architecture of OpenAI’s Generative Pre-trained Transformers.
    2023-12-10, by klotz
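
A minimal illustration of the rotary encoding referenced in bookmark 1, assuming NumPy; the helper name rope_rotate and the base of 10000 are assumptions of this sketch, not taken from the linked article, and it shows plain RoPE only, not the LongRoPE context-extension step:

    import numpy as np

    def rope_rotate(x, positions, base=10000.0):
        # x: (seq_len, dim) array with an even dim; positions: (seq_len,) integer positions.
        # Each consecutive pair of dimensions is rotated by an angle that grows with the
        # token position and shrinks with the pair index (its frequency).
        seq_len, dim = x.shape
        freqs = base ** (-np.arange(0, dim, 2) / dim)      # (dim/2,) per-pair frequencies
        angles = positions[:, None] * freqs[None, :]       # (seq_len, dim/2) rotation angles
        cos, sin = np.cos(angles), np.sin(angles)
        out = np.empty_like(x)
        out[:, 0::2] = x[:, 0::2] * cos - x[:, 1::2] * sin
        out[:, 1::2] = x[:, 0::2] * sin + x[:, 1::2] * cos
        return out

    # Toy usage: rotate four 8-dimensional query vectors by their positions.
    q = np.random.randn(4, 8)
    q_rot = rope_rotate(q, np.arange(4))

Because the rotation preserves vector norms, relative position information survives the dot products used in attention, which is what extensions such as LongRoPE build on.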
