Optimizing the Learnable RoPE Theta Parameter in Transformers

Rotary Position Embedding (RoPE) enhances Transformer models by encoding relative positions through a frequency parameter $\theta$, but conventional implementations fix $\theta$...
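The record only shows a truncated abstract, so the authors' actual implementation is not available here. As a minimal sketch of the idea the abstract describes, the following numpy code applies standard RoPE with the base frequency `theta` exposed as an explicit scalar, and uses a finite-difference gradient step (standing in for backprop through a learnable parameter) to show that `theta` can be optimized rather than fixed. All function names (`rope`, `grad_step`) and the toy loss are illustrative assumptions, not from the paper.

```python
import numpy as np

def rope(x: np.ndarray, theta: float) -> np.ndarray:
    """Apply rotary position embedding with base frequency `theta`.
    x has shape (seq_len, head_dim), head_dim even. Conventional
    implementations hard-code theta (e.g. 10000); here it is an argument."""
    seq_len, d = x.shape
    assert d % 2 == 0
    # RoPE frequencies: theta^(-2i/d) for pair index i = 0..d/2-1.
    inv_freq = theta ** (-np.arange(0, d, 2) / d)
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    angles = pos * inv_freq                  # (seq_len, d/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin       # rotate each 2-D pair
    out[:, 1::2] = x1 * sin + x2 * cos       # by its position-dependent angle
    return out

def grad_step(x: np.ndarray, theta: float, loss, lr: float = 1.0,
              eps: float = 1e-4) -> float:
    """One gradient-descent step on theta via central finite differences,
    a stand-in for treating theta as a trainable parameter."""
    g = (loss(rope(x, theta + eps)) - loss(rope(x, theta - eps))) / (2 * eps)
    return theta - lr * g
```

Because each 2-D pair is rotated, the transform preserves per-token norms, and position 0 (zero rotation angle) is left unchanged; in a real model `theta` would be an `nn.Parameter` updated by autograd instead of finite differences.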

Bibliographic Details
Main Authors: Zhigao Huang, Musheng Chen
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11084811/