Delving into transformer networks
Exploring the architecture of OpenAI’s Generative Pre-trained Transformers.
This article explains the LongRoPE method for extending the context length of large language models (LLMs) without significant performance degradation. It covers why context length matters in LLMs and where earlier positional-encoding schemes fall short, then introduces Rotary Position Embedding (RoPE), examines its own limitations, and explains how LongRoPE extends RoPE to much longer contexts.
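As a concrete anchor before the details, here is a minimal NumPy sketch of the rotation that plain RoPE applies to a sequence of query or key vectors. It assumes the standard base-10000 frequency schedule, and the function and parameter names (rope, base) are illustrative, not code from the article or the LongRoPE paper:

```python
import numpy as np

def rope(x, positions, base=10000.0):
    # x: (seq_len, dim) query or key vectors; dim must be even.
    # Each dimension pair (2i, 2i+1) is rotated by the angle
    # position * base**(-2i / dim), so dot products between rotated
    # queries and keys depend only on the relative token offset.
    dim = x.shape[1]
    freqs = base ** (-2.0 * np.arange(dim // 2) / dim)   # per-pair frequencies
    angles = positions[:, None] * freqs[None, :]         # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]                      # split into pairs
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin                   # 2-D rotation of each pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# Toy usage: rotate 8 embeddings of width 16 at positions 0..7
q = rope(np.random.randn(8, 16), np.arange(8, dtype=float))
```

In broad terms, context-extension methods such as LongRoPE keep this rotation mechanism but rescale the per-dimension angles so that positions far beyond the training range remain usable; the rest of the article walks through how that extension is done.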