
ALOHA: An architectural-aware framework for deep learning at the edge

Palumbo F.
Member of the Collaboration Group
2018-01-01

Abstract

Novel Deep Learning (DL) algorithms show ever-increasing accuracy and precision in multiple application domains. However, further steps are needed towards the ubiquitous adoption of this kind of instrument. First, the effort and skills required to develop new DL models, or to adapt existing ones to new use cases, are hardly available to small- and medium-sized businesses. Second, DL inference must be brought to the edge, to overcome limitations posed by the classically used cloud computing paradigm. This requires implementation on low-energy computing nodes, often heterogeneous and parallel, that are usually more complex to program and to manage. This work describes the ALOHA framework, which proposes a solution to these issues by means of an integrated tool flow that automates most phases of the development process. The framework introduces architecture-awareness, considering the target inference platform very early, already during algorithm selection, and driving the optimal porting of the resulting embedded application. Moreover, it considers security, power efficiency and adaptiveness as main objectives during the whole development process.
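The abstract's notion of architecture-awareness, i.e. weighing candidate DL models against the target platform's constraints already during algorithm selection, can be pictured with a minimal sketch. The snippet below is a hypothetical illustration only, not the ALOHA toolflow or its API; all names (Candidate, PlatformBudget, select_model) and all numbers are assumptions made for the example.

```python
# Hypothetical illustration (not the ALOHA API): rank candidate DL models
# against the constraints of a target edge platform before any porting work.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Candidate:
    name: str
    accuracy: float      # validation accuracy of the candidate model
    latency_ms: float    # estimated inference latency on the target node
    energy_mj: float     # estimated energy per inference (millijoules)
    memory_mb: float     # estimated peak memory footprint


@dataclass
class PlatformBudget:
    max_latency_ms: float
    max_energy_mj: float
    max_memory_mb: float


def select_model(candidates: List[Candidate],
                 budget: PlatformBudget) -> Optional[Candidate]:
    """Return the most accurate candidate that fits the platform budget."""
    feasible = [c for c in candidates
                if c.latency_ms <= budget.max_latency_ms
                and c.energy_mj <= budget.max_energy_mj
                and c.memory_mb <= budget.max_memory_mb]
    return max(feasible, key=lambda c: c.accuracy) if feasible else None


if __name__ == "__main__":
    # Illustrative numbers only: a low-energy heterogeneous edge node budget.
    budget = PlatformBudget(max_latency_ms=50.0, max_energy_mj=20.0,
                            max_memory_mb=8.0)
    candidates = [
        Candidate("large_cnn", 0.95, latency_ms=120.0, energy_mj=60.0, memory_mb=32.0),
        Candidate("pruned_cnn", 0.91, latency_ms=35.0, energy_mj=12.0, memory_mb=6.0),
        Candidate("tiny_cnn", 0.86, latency_ms=10.0, energy_mj=4.0, memory_mb=2.0),
    ]
    best = select_model(candidates, budget)
    print(best.name if best else "no candidate fits the platform budget")
```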
2018
9781450365987
ALOHA: An architectural-aware framework for deep learning at the edge / Meloni, P.; Loi, D.; Deriu, G.; Ripolles, O.; Solans, D.; Pimentel, A. D.; Sapra, D.; Pintor, M.; Biggio, B.; Moser, B.; Shepeleva, N.; Stefanov, T.; Minakova, S.; Conti, F.; Benini, L.; Fragoulis, N.; Theodorakopoulos, I.; Masin, M.; Palumbo, F. - (2018), pp. 19-26. (Paper presented at the 2018 Workshop on INTelligent Embedded Systems Architectures and Applications, INTESA 2018, held in Italy in 2018) [10.1145/3285017.3285019].
Files in this item:
There are no files associated with this item.


Use this identifier to cite or link to this item: https://hdl.handle.net/11388/227272
Citations
  • PMC: not available
  • Scopus: 10
  • Web of Science: 10