New Conference Paper Published
We are pleased to announce that our latest conference paper, co-authored by Enrico Fraccaroli and titled “Enhancing Split Computing and Early Exit Applications through Predefined Sparsity”, has been published in the proceedings of the 2024 Forum on Specification & Design Languages (FDL).
Abstract
In the past decade, Deep Neural Networks (DNNs) have achieved state-of-the-art performance in a broad range of problems, spanning from object classification and action recognition to smart buildings and healthcare. The flexibility that makes DNNs such a pervasive technology comes at a price: their computational requirements preclude deployment on most of the resource-constrained edge devices available today for real-time and real-world tasks. This paper introduces a novel approach to address this challenge by combining the concept of predefined sparsity with Split Computing (SC) and Early Exit (EE). In particular, SC splits a DNN so that part of it is deployed on an edge device and the rest on a remote server. In turn, EE allows the system to skip the remote server and rely solely on the edge device’s computation if the answer is already good enough. How to apply such predefined sparsity within an SC and EE paradigm, however, has not previously been studied. This paper studies this problem and shows how predefined sparsity significantly reduces the computational, storage, and energy burdens during both the training and inference phases, regardless of the hardware platform. This makes it a valuable approach for enhancing the performance of SC and EE applications. Experimental results showcase reductions exceeding 4× in storage and computational complexity without compromising performance. The source code is available on GitHub.
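To make the core idea concrete, the following is a minimal sketch (not taken from the paper's code) of predefined sparsity: a binary mask over a layer's weights is fixed *before* training, so only the masked-in connections are ever stored, trained, or evaluated. The layer sizes and density below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes and density, not the paper's configuration.
in_dim, out_dim = 8, 4
density = 0.25  # keep roughly 25% of the connections

# Predefined sparsity: the binary mask is fixed before training starts.
mask = rng.random((out_dim, in_dim)) < density
weights = rng.standard_normal((out_dim, in_dim)) * mask

def sparse_linear(x, w, m):
    # Only masked-in weights contribute, during both training
    # and inference; the rest need never be stored or computed.
    return (w * m) @ x

x = rng.standard_normal(in_dim)
y = sparse_linear(x, weights, mask)
print(f"{int(mask.sum())} of {mask.size} weights kept")
```

In a real implementation the mask would also be applied to the gradients at each training step, so the pruned-out weights stay exactly zero throughout.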
Details
- Title: Enhancing Split Computing and Early Exit Applications through Predefined Sparsity
- Authors: Luigi Capogrosso, Enrico Fraccaroli, Giulio Petrozziello, Francesco Setti, Samarjit Chakraborty, Franco Fummi, Marco Cristani
- Conference: Forum on Specification & Design Languages (FDL)
- Year: 2024
- Pages: 1-8
- Keywords: Performance evaluation, Training, Smart buildings, Source coding, Artificial neural networks, Medical services, Real-time systems, Split Computing, Early Exit, Deep Neural Networks, Predefined Sparsity, Edge Devices
Links
- DOI: 10.1109/FDL63219.2024.10673767
- Open Access Version: Read Here
- Source Code: GitHub Repository
This paper presents a novel method for improving the performance of Split Computing (SC) and Early Exit (EE) applications by applying predefined sparsity. The approach significantly reduces computational, storage, and energy demands during both training and inference phases, making it an effective solution for deploying DNNs on resource-constrained edge devices. Experimental results show more than a 4× reduction in storage and computational complexity without performance loss.
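The SC and EE interplay described above can be sketched as follows. This is a toy illustration under assumed names (`edge_head`, `server_tail`) and an assumed confidence threshold, not the paper's architecture: the edge-side exit classifier answers locally when it is confident enough, and only otherwise hands the computation off to the server-side remainder of the network.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def edge_head(x):
    # Hypothetical early-exit classifier running on the edge device.
    return softmax(x[:3])

def server_tail(x):
    # Hypothetical remaining layers running on the remote server.
    return softmax(np.tanh(x)[:3])

def infer(x, threshold=0.8):
    probs = edge_head(x)
    if probs.max() >= threshold:
        return probs, "edge"      # early exit: no server round-trip
    return server_tail(x), "server"
```

A confident intermediate prediction thus never leaves the device, which is where the storage and computation savings of predefined sparsity compound with the latency savings of early exit.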
We extend our gratitude to all collaborators and contributors for their efforts. For further details, please explore the links provided above.