Federated and Split Machine Learning in the Internet of Things
Abstract: In this talk, we discuss approaches to distributed machine learning (ML) in resource-constrained, edge-supported Internet of Things (IoT) networks, with a focus on Federated Learning (FL) and Split Learning (SL), two popular techniques in such wireless edge networks. First, we present Early Exit of Communication (EEoC), which adaptively splits ML inference in an IoT edge-computing environment to meet latency and energy constraints. This layer-based (vertically partitioned) approach has been extended by Distributed Micro-Split Deep Learning in Heterogeneous Dynamic IoT (DISNET), which adds horizontal partitioning to better support flexible, distributed, and parallel execution of neural network models on heterogeneous IoT devices under dynamic conditions. We then turn to the training aspect, presenting and evaluating Adaptive REsource-aware Split-learning (ARES), a scheme for efficient model training in IoT systems. Finally, recent work proposes Dynamic FL (DFL) for heterogeneous IoT, which combines resource-aware SL and FL with similarity-based layer-wise model aggregation.
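To make the layer-based (vertical) split concrete, the following minimal Python/PyTorch sketch partitions a small sequential model into a device-side head and an edge-side tail. The model, the split_model helper, and the fixed split index are illustrative assumptions only, not the actual EEoC, DISNET, or ARES implementations; in those schemes the split point would be chosen adaptively from latency and energy estimates.

    import torch
    import torch.nn as nn

    # Small CNN standing in for the model partitioned across device and edge.
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, 10),
    )

    def split_model(model: nn.Sequential, split_idx: int):
        """Partition a sequential model into a device-side head and an edge-side tail."""
        layers = list(model.children())
        return nn.Sequential(*layers[:split_idx]), nn.Sequential(*layers[split_idx:])

    # Hypothetical fixed split index; an adaptive scheme would pick this at run time.
    head, tail = split_model(model, split_idx=4)

    x = torch.randn(1, 3, 32, 32)       # input captured on the IoT device
    with torch.no_grad():
        activation = head(x)            # executed on the device
        # In a real deployment, `activation` would be transmitted to the edge here.
        logits = tail(activation)       # executed on the edge server
    print(logits.shape)                 # torch.Size([1, 10])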
Bio: Prof. Dr. Torsten Braun is head of the Communication and Distributed Systems (CDS) research group at the Institute of Computer Science, University of Bern, where he has been a full professor since 1998. He received his Ph.D. degree from the University of Karlsruhe (Germany) in 1993. From 1994 to 1995, he was a guest scientist at INRIA Sophia-Antipolis (France). From 1995 to 1997, he worked at the IBM European Networking Centre in Heidelberg (Germany) as a project leader and senior consultant. He served as Vice President of the SWITCH (Swiss Research and Education Network Provider) Foundation from 2011 to 2019, and as Director of the Institute of Computer Science and Applied Mathematics at the University of Bern from 2007 to 2011 and from 2019 to 2021.