
Publication

Designing resource-constrained neural networks using neural architecture search targeting embedded devices

Journal contribution - e-publication

Recent advances in the field of Neural Architecture Search (NAS) have made it possible to develop state-of-the-art deep learning systems without requiring extensive human expertise and hyperparameter tuning. Most previous research, however, gave little consideration to the resources required to run the generated systems. In this paper, we present an improvement on a recent NAS method, Efficient Neural Architecture Search (ENAS). We adapt ENAS to take into account not only the network's performance, but also various constraints that would allow these networks to be ported to embedded devices. Our results show ENAS's ability to comply with these added constraints. To demonstrate the efficacy of our system, we use it to design a Recurrent Neural Network that predicts words as they are spoken and meets the constraints set out for operation on an embedded device, along with a Convolutional Neural Network capable of classifying 32×32 RGB images at a rate of 1 FPS on an embedded device.
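The abstract describes steering the architecture search with resource constraints in addition to accuracy. A common way to do this is to fold a constraint penalty into the controller's reward; the sketch below illustrates that general idea, but the function name, the linear penalty form, and the latency figures are all illustrative assumptions, not the formulation used in the paper.

```python
# Hypothetical sketch of a resource-constrained NAS reward: accuracy is
# rewarded as-is while architectures that exceed a resource budget (here,
# latency) are penalized. The penalty shape is an assumption for illustration.

def constrained_reward(accuracy, latency_ms, latency_budget_ms, penalty_weight=1.0):
    """Reward a candidate architecture's accuracy, penalizing budget overshoot."""
    # Relative overshoot is zero when the candidate stays within budget.
    overshoot = max(0.0, latency_ms - latency_budget_ms) / latency_budget_ms
    return accuracy - penalty_weight * overshoot

# A candidate within a 1000 ms (1 FPS) budget keeps its accuracy as reward:
print(constrained_reward(0.90, latency_ms=800, latency_budget_ms=1000))
# A more accurate but slower candidate is penalized below the faster one:
print(constrained_reward(0.95, latency_ms=1500, latency_budget_ms=1000))
```

Under such a reward, the search controller is pushed toward architectures that trade a little accuracy for compliance with the embedded-device budget.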
Journal: Internet of Things
ISSN: 2542-6605
Volume: 12
Year of publication: 2020
Keywords: A1 Journal article
Accessibility: Open