In this survey, we explore the broad applications of Information Theory in Machine Learning, highlighting how core concepts such as entropy, Mutual Information, and KL divergence are used to enhance learning algorithms. Since its inception by Claude Shannon, Information Theory has provided mathematical tools to quantify uncertainty, optimize decision-making, and manage the trade-off between model flexibility and generalization. These principles have been integrated across various subfields of Machine Learning, including neural networks, where the Information Bottleneck offers insights into data representation, and reinforcement learning, where entropy-based methods improve exploration strategies. Measures such as Mutual Information are also critical in feature selection and unsupervised learning. This survey bridges foundational theory with its practical implementations in modern Machine Learning by providing both historical context and a review of contemporary research. We also discuss open challenges and future directions, such as scalability and interpretability, highlighting the growing importance of these techniques in next-generation models.
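For concreteness, the following is a minimal sketch (not taken from the surveyed paper) of the three quantities the abstract names, entropy, KL divergence, and mutual information, for discrete distributions. All function names and example values are illustrative assumptions; only NumPy is required.

```python
# Illustrative sketch of the core information-theoretic quantities
# mentioned in the abstract, for discrete distributions. Not code from
# the surveyed paper; names and example values are hypothetical.
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p log2 p, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """KL divergence D(p || q) = sum p log2(p / q), in bits.
    Assumes q > 0 wherever p > 0 (absolute continuity)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(joint):
    """Mutual information I(X;Y) = D(p(x,y) || p(x) p(y)),
    computed from a joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y
    return kl_divergence(joint.ravel(), (px * py).ravel())

if __name__ == "__main__":
    p = [0.5, 0.25, 0.25]
    q = [1/3, 1/3, 1/3]
    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
    print(f"H(p)    = {entropy(p):.4f} bits")          # 1.5000
    print(f"D(p||q) = {kl_divergence(p, q):.4f} bits")
    print(f"I(X;Y)  = {mutual_information(joint):.4f} bits")
```

Note that mutual information is computed here as the KL divergence between the joint distribution and the product of its marginals, which is the standard identity connecting the three quantities.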
oai:arar.sci.am:405368
Fundamental Scientific Library of the National Academy of Sciences of the Republic of Armenia
Aug 7, 2025
https://arar.sci.am/publication/437397
Publication name | Date
---|---
Haroutunian, Mariam E., Information Theory Tools and Techniques to Overcome Machine Learning Challenges | Aug 7, 2025
Aghasi S. Poghosyan; Hakob G. Sarukhanyan
Avetisyan, A. A.; Grigoryan, M. T.; Melikyan, A. V. Responsible editors: A. G. Nazarov (1957-1964), M. V. Kasyan (1964-1988), R. M. Martirosyan (1989-2017). Editor-in-chief: V. Sh. Melikyan (2018-)
Haroutunian, Mariam E.
Ghulghazaryan, Ruben G.; Piliposyan, Davit G.; Shoyan, Misak T.; Nersisyan, Hayk V.