Confidence-Based Knowledge Distillation to Reduce Training Costs and Carbon Footprint for Low-Resource Neural Machine Translation
Transformer-based deep learning represents the current state of the art in machine translation (MT) research, and large-scale pretrained transformer models deliver leading performance across a wide range of MT tasks for many languages. However, such deep neural network (NN) models...
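The abstract names confidence-based knowledge distillation as the cost-reduction technique. As a rough illustration only, the sketch below shows one generic way a confidence-weighted distillation loss can be written in PyTorch: the teacher's soft targets are imitated only on tokens where the teacher itself is confident. The function name, threshold, and weighting scheme are assumptions made for illustration and should not be read as the authors' actual method.

```python
# Hypothetical sketch of a confidence-weighted knowledge-distillation loss.
# All names and hyperparameters here are illustrative assumptions, not
# details taken from the paper.
import torch
import torch.nn.functional as F

def confidence_kd_loss(student_logits, teacher_logits, targets,
                       temperature=2.0, conf_threshold=0.5, alpha=0.5):
    """Blend hard-label cross-entropy with a KD term applied only on
    tokens where the teacher's max softmax probability is high enough."""
    # Standard cross-entropy against the gold target tokens.
    ce = F.cross_entropy(student_logits, targets)

    # Teacher and student distributions softened by the temperature.
    t_prob = F.softmax(teacher_logits / temperature, dim=-1)
    s_logprob = F.log_softmax(student_logits / temperature, dim=-1)

    # Per-token KL divergence between teacher and student distributions.
    kl = F.kl_div(s_logprob, t_prob, reduction="none").sum(dim=-1)

    # Keep the KD signal only where the teacher is confident.
    confident = (t_prob.max(dim=-1).values >= conf_threshold).float()
    kd = (kl * confident).sum() / confident.sum().clamp(min=1.0)

    # The temperature**2 factor is the usual gradient-scale correction.
    return alpha * ce + (1.0 - alpha) * (temperature ** 2) * kd

# Usage with random tensors: 8 target tokens, vocabulary of 100.
s = torch.randn(8, 100)
t = torch.randn(8, 100)
y = torch.randint(0, 100, (8,))
loss = confidence_kd_loss(s, t, y)
```

Masking by teacher confidence is one common way to keep a noisy teacher from propagating uncertain predictions to a smaller student, which is the broad idea the title suggests; the specific gating and loss mixing used in the paper may differ.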
| Main Authors: | Maria Zafar, Patrick J. Wall, Souhail Bakkali, Rejwanul Haque |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-07-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/15/14/8091 |
Similar Items

- Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation
  by: Xinlu Zhang, et al. Published: (2020-01-01)
- An enhanced pivot-based neural machine translation for low-resource languages
  by: Danang Arbian Sulistyo, et al. Published: (2025-05-01)
- Distillation dynamics and control
  by: Deshpande, Pradeep B. Published: (1985)
- N-myristoyltransferase1 regulates biomass accumulation in cucumber (Cucumis sativus L.)
  by: Xin Liu, et al. Published: (2025-05-01)
- Synthesis of the Thermally Coupled Distillation Sequences
  by: E. А. Anokhina, et al. Published: (2017-12-01)