Microsoft researchers introduce LongRoPE2, a method to extend large language model context windows to 128K tokens while maintaining over 97% short-context accuracy, addressing key limitations in positional embeddings.
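The summary only names the idea at a high level. As a rough illustration of what rescaling rotary positional embeddings for a longer context window means in practice, the sketch below uses plain NumPy with a naive uniform scale factor; it is not LongRoPE2's actual method (which, per the summary, targets limitations such uniform schemes have), and every function and variable name here is invented for the example.

```python
# Illustrative sketch only: how rescaling RoPE frequencies stretches the
# positional range of a model pretrained on short contexts.
# The uniform scale factor below is an assumption for demonstration,
# not the per-dimension factors a method like LongRoPE2 would use.
import numpy as np

def rope_angles(positions, dim, base=10000.0, scale=None):
    """Return rotary angles of shape (len(positions), dim // 2).

    `scale` optionally slows each frequency so that large positions map
    into the angle range the model saw during pretraining.
    """
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))  # (dim // 2,)
    if scale is not None:
        inv_freq = inv_freq / np.asarray(scale)              # rescaled frequencies
    return np.outer(positions, inv_freq)                     # (seq_len, dim // 2)

# Pretrained window 4K, target window 128K: one uniform stretch factor.
orig_len, target_len, head_dim = 4096, 131072, 64
uniform_scale = np.full(head_dim // 2, target_len / orig_len)

angles_long = rope_angles(np.arange(target_len), head_dim, scale=uniform_scale)
angles_orig = rope_angles(np.arange(orig_len), head_dim)

# The largest rescaled angle at position 128K-1 roughly matches the largest
# angle the unmodified model saw at its original 4K limit.
print(angles_long[-1, 0], angles_orig[-1, 0])
```

A single uniform factor like this is the simplest possible choice; my understanding is that the LongRoPE line of work instead searches for non-uniform, per-dimension rescaling factors, which is part of how it preserves short-context accuracy while extending to 128K tokens.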