ENTROPY AND INFORMATION THEORY IN TRANSLATION: MEASURING MEANING ACROSS LANGUAGES

Mohammad Fawwaz Alwidyan

Abstract

Translation is inherently a process of encoding and decoding meaning between languages, often marked by ambiguity, redundancy, and information loss. This study explores how concepts from information theory, particularly Shannon entropy, mutual information, and compression algorithms, can be applied to the analysis and evaluation of translation quality. By treating language as a probabilistic system, we model sentences as information-bearing signals and examine how translation affects the entropy of those signals across different languages.
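As a minimal illustration of treating a sentence as an information-bearing signal, the sketch below computes the Shannon entropy of a sentence's empirical unigram distribution. This is a toy unigram model chosen for brevity; the study's actual modeling choices may differ.

```python
import math
from collections import Counter

def shannon_entropy(tokens):
    """Entropy in bits per token of the empirical unigram distribution."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A sentence viewed as a probabilistic signal over its word types.
sentence = "the cat sat on the mat".split()
print(round(shannon_entropy(sentence), 3))  # ≈ 2.252 bits/token
```

Repetition (here, "the") lowers the per-token entropy relative to a sentence of all-distinct words, which is the intuition behind redundancy in the abstract's sense.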


Using aligned parallel corpora from multiple language pairs, we quantify the changes in information content that occur during translation. The research investigates how entropy varies between languages with differing morphosyntactic complexity, and how meaning is preserved or altered when messages are encoded in another linguistic system. Additionally, we explore how information gain and loss can be used as metrics for assessing the fidelity and efficiency of both human and machine translations.
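One simple way to quantify an entropy change across a translation pair is to compare bits-per-token estimates of a source segment and its aligned target. The snippet below is an illustrative sketch only: the sentence pair is invented, and a whitespace unigram estimate is far cruder than what a corpus-scale study would use.

```python
import math
from collections import Counter

def entropy_bits_per_token(text):
    """Unigram entropy estimate (bits/token) from whitespace tokens."""
    tokens = text.lower().split()
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical aligned pair, not drawn from the study's corpora.
source = "the house is on the hill near the river"
target = "das Haus steht auf dem Huegel am Fluss"

delta = entropy_bits_per_token(target) - entropy_bits_per_token(source)
print(round(delta, 3))  # positive: the target packs more bits per token
```

A positive delta here reflects German's richer morphology condensing the same message into fewer, less repetitive tokens; at scale, such per-segment deltas are the kind of signal an entropy-based fidelity metric could aggregate.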


The findings offer a mathematically grounded framework for understanding translation as a constrained communication channel, revealing insights into cross-linguistic meaning transfer and contributing to the development of more information-aware translation systems. This interdisciplinary approach opens new avenues for both theoretical translation studies and computational translation evaluation.
