HATL: Hierarchical Adaptive-Transfer Learning Framework for Sign Language Machine Translation
arXiv cs.AI / 3/23/2026
📰 News · Models & Research
Key Points
- HATL introduces dynamic unfreezing, layer-wise learning rate decay, and stability mechanisms to adapt pretrained representations to sign language translation without overfitting.
- The framework progressively unfreezes pretrained layers based on training performance, preserving generic features while adapting to sign-language characteristics (see the sketch after this list).
- HATL is evaluated on Sign2Text and Sign2Gloss2Text using an ST-GCN++ backbone and the Adaptive Transformer (ADAT) across the PHOENIX14T, Isharah, and MedASL datasets, with notable improvements.
- Experimental results show BLEU-4 gains of 15.0% on PHOENIX14T and Isharah, and 37.6% on MedASL when using ADAT, outperforming traditional transfer learning baselines.
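The sketch below illustrates the two transfer-learning ideas the key points describe: layer-wise learning rate decay and performance-driven progressive unfreezing. It is a minimal PyTorch illustration, not the paper's HATL implementation; the toy encoder, the layer grouping, the decay factor, and the patience-based unfreezing rule are all assumptions made for clarity.

```python
# Minimal sketch (assumed, not the HATL code) of layer-wise LR decay and
# progressive unfreezing driven by a validation metric such as BLEU-4.
import torch
import torch.nn as nn


class TinyEncoder(nn.Module):
    """Stand-in for a pretrained backbone (e.g. an ST-GCN++ or Transformer encoder)."""
    def __init__(self, dim=64, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_layers))

    def forward(self, x):
        for layer in self.layers:
            x = torch.relu(layer(x))
        return x


def build_optimizer(model, base_lr=1e-4, decay=0.75):
    """Layer-wise LR decay: lower (more generic) layers get smaller learning rates,
    so pretrained features change more slowly than task-specific ones."""
    groups = []
    num_layers = len(model.layers)
    for i, layer in enumerate(model.layers):
        lr = base_lr * (decay ** (num_layers - 1 - i))  # top layer keeps base_lr
        groups.append({"params": layer.parameters(), "lr": lr})
    return torch.optim.AdamW(groups)


class ProgressiveUnfreezer:
    """Unfreeze one additional layer (top-down) whenever the validation metric
    stops improving; a simple stand-in for a performance-based schedule."""
    def __init__(self, model, patience=2):
        self.model, self.patience = model, patience
        self.best, self.stale = float("-inf"), 0
        # Start with everything frozen except the top layer.
        for layer in model.layers[:-1]:
            for p in layer.parameters():
                p.requires_grad = False
        self.next_to_unfreeze = len(model.layers) - 2

    def step(self, val_metric):
        if val_metric > self.best:
            self.best, self.stale = val_metric, 0
            return
        self.stale += 1
        if self.stale >= self.patience and self.next_to_unfreeze >= 0:
            for p in self.model.layers[self.next_to_unfreeze].parameters():
                p.requires_grad = True
            self.next_to_unfreeze -= 1
            self.stale = 0


model = TinyEncoder()
optimizer = build_optimizer(model)
unfreezer = ProgressiveUnfreezer(model)
# After each validation pass, call e.g. unfreezer.step(bleu4_score)
# to gradually expose more pretrained layers to fine-tuning.
```

In this toy setup, training starts with only the top layer trainable and small learning rates on the lower layers, so generic pretrained features are preserved until the validation signal indicates further adaptation is needed.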
Related Articles
Data Augmentation Using GANs
Dev.to
Zero Shot Deformation Reconstruction for Soft Robots Using a Flexible Sensor Array and Cage Based 3D Gaussian Modeling
arXiv cs.RO
Speculative Policy Orchestration: A Latency-Resilient Framework for Cloud-Robotic Manipulation
arXiv cs.RO
ReMAP-DP: Reprojected Multi-view Aligned PointMaps for Diffusion Policy
arXiv cs.RO
AGILE: A Comprehensive Workflow for Humanoid Loco-Manipulation Learning
arXiv cs.RO