
Project

Hyper-dimensional scalable sensors: the road forward to always-on context-awareness in electronic devices? (HYPERSCALES)

We humans are masters at constantly capturing tons of sensory information in an “always-on” fashion. Yet crucial to our ability to process this hyperdimensional stream of sensory information is that we do not always devote the same level of mental effort to all sensory inputs. This dynamic scalability allows us to extract the relevant information from the sensory data with our limited human computational bandwidth. Wouldn’t it be great if electronics could also benefit from such scalable processing of a hyperdimensional stream of sensory data? This would enable robots, drones, cars, or buildings to constantly be aware of their complete surroundings. Currently, such devices struggle to process hyperdimensional visual data under the energy and processing constraints of embedded devices. This can be overcome by introducing similar dynamic scalability into the processing of the hyperdimensional data. HYPERSCALES will enable such always-on, hyperdimensional, scalable sensing, focusing on visual sensors. The goal is to demonstrate a ring of many low-cost visual sensors capturing a rich data stream of omnidirectional information. Using a new paradigm of online scalable neural networks, aligned with dynamic scaling of custom-designed hardware, always-on visual awareness will become feasible with an order of magnitude lower energy consumption than the state of the art. Unique to the project is the tight interplay between algorithmic (Prof. Dambre) and hardware (Prof. Verhelst) tunability.
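The abstract does not specify what the "online scalable neural networks" look like. One common way to realize such input-dependent scaling is an early-exit network, in which a cheap classifier handles easy inputs and deeper, more expensive stages run only when needed. The PyTorch sketch below is a hypothetical illustration of that general idea, not the project's actual design: the class name EarlyExitNet, the exit_threshold parameter, and all layer sizes are assumptions.

# Minimal sketch (assumed architecture, not the HYPERSCALES design) of
# dynamically scalable inference via an early exit: confident inputs leave
# at a cheap intermediate classifier, so average compute tracks difficulty.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EarlyExitNet(nn.Module):
    def __init__(self, num_classes: int = 10, exit_threshold: float = 0.9):
        super().__init__()
        self.exit_threshold = exit_threshold
        # Cheap first stage, always executed.
        self.stage1 = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)
        )
        self.exit1 = nn.Linear(16, num_classes)  # early classifier head
        # Expensive second stage, executed only for hard inputs.
        self.stage2 = nn.Sequential(
            nn.Conv2d(16, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)
        )
        self.exit2 = nn.Linear(64, num_classes)  # final classifier head

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.stage1(x)
        # Global-average-pool the cheap features and classify.
        logits1 = self.exit1(h.mean(dim=(2, 3)))
        conf = F.softmax(logits1, dim=1).max(dim=1).values
        # If the cheap head is confident enough, skip the expensive stage.
        if bool((conf >= self.exit_threshold).all()):
            return logits1
        h = self.stage2(h)
        return self.exit2(h.mean(dim=(2, 3)))

net = EarlyExitNet().eval()
with torch.no_grad():
    out = net(torch.randn(1, 3, 32, 32))  # one frame from a visual sensor
print(out.shape)  # torch.Size([1, 10])

In such a scheme the average energy per frame drops because most frames exit after the cheap stage, while hard frames still receive full processing, mirroring the "mental effort" analogy above.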

Date: 1 Jan 2018 → 31 Dec 2021
Keywords: HYPERSCALES, Scalability, Sensors
Disciplines: Computer hardware, Computer theory, Scientific computing, Other computer engineering, information technology and mathematical engineering