
Project

Applying Artificial Intelligence on Edge devices using Deep learninG with Embedded optimizations (AI@EDGE)

In deep learning, a model is trained on a large set of data, typically on a compute server. The trained network is then loaded into an application that processes real data; this step is called 'inference'. Inference, however, does not need to run on the same computer system used for training. This makes it possible to apply deep learning on systems that are less powerful, more energy efficient and less dependent on a network. This project studies various deep learning applications for low-cost embedded systems, examining on the one hand the feasibility of such applications and on the other hand the added value of artificial intelligence in embedded systems, such as microcontrollers and system processors.
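The train-on-server, infer-on-device split described above can be illustrated with a minimal sketch. The model, data, and serialization format here are hypothetical illustrations (a single-neuron logistic classifier trained with plain Python, exported as JSON); a real edge deployment would use a framework such as TensorFlow Lite, but the workflow is the same: train once on powerful hardware, then ship only the lightweight inference step.

```python
import json
import math

def train(data, epochs=2000, lr=0.1):
    # "Server-side" phase: fit a tiny logistic-regression model
    # with stochastic gradient descent on a toy dataset.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            g = p - y                         # gradient of the loss
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return {"w": w, "b": b}

# Training phase (compute server): learn an AND-like function.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
model = json.dumps(train(data))  # exported model artifact

def infer(model_json, x):
    # "Edge-side" phase: only the trained weights and this small
    # function need to run on the embedded device.
    m = json.loads(model_json)
    z = m["w"][0] * x[0] + m["w"][1] * x[1] + m["b"]
    return 1.0 / (1.0 + math.exp(-z))
```

The key point is that `infer` is far cheaper than `train`: it needs no gradients, no training data, and no network connection, which is what makes deployment on microcontrollers and other low-cost embedded systems feasible.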

Date: 1 Mar 2020 → 28 Feb 2022
Keywords: inference, low-cost embedded systems, Deep Learning, Edge devices
Disciplines: Artificial intelligence not elsewhere classified, Machine learning and decision making