Description
Submission date: 14 February 2020
Title: Machine learning: fundamentals and connections with memory-aided communications and computations
Thesis supervisor: Petros ELIA (Eurecom)
Scientific field: Information and communication sciences and technologies
CNRS theme: Not defined
Abstract:
Federico and I have devised an ambitious plan that meets the expectations of the student as well as those of the funding ERC project DUALITY. Crucial to the development of the envisioned research will be a deeper understanding of machine learning and its connections with coded caching and distributed computing. The emphasis on these three areas, and on their intersections, is motivated by the facts that a) machine learning is a powerful tool that can help us solve previously intractable problems, b) machine learning is currently treated as a black box, with very little known about its inner workings, c) the problems of coded caching and distributed computing can gain immensely from learning the network topology and user behavior, and d) these two problems of coded caching and distributed computing are directly related and poised to have a very substantial impact. The thesis will benefit from several technical courses that Federico will take in order to prepare for an extensive study of the basic aspects of machine learning and coding. Part of the work will be to analyze how machine learning relates to caching, to analyze methods that allow for the design of communication schemes for settings with caches and learning, to analyze the fundamentals of machine learning, and to study possible connections to multi-terminal caching. Finally, the theoretical work will be combined with the selection and design of practical caching schemes that can be implemented in the context of distributed computing. Further consolidation of the results will be achieved by revealing the fundamental properties of machine learning and its fundamental connections with cache-aided heterogeneous networks. Emphasis will also be placed on the fundamental limits of distributed computing and learning over computation networks with arbitrary topology.
Doctoral candidate: Brunero Federico