News Release

NUS engineers invent tiny vision processing chip for ultra-small smart vision systems and IoT applications

Novel video feature extractor uses 20 times less power than existing chips and could reduce the size of untethered vision systems down to the millimetre range

Business Announcement

National University of Singapore


image: A team of researchers led by Associate Professor Massimo Alioto from the Department of Electrical and Computer Engineering at the NUS Faculty of Engineering has developed a tiny vision processing chip, EQSCALE, which uses 20 times less power than existing technology.

Credit: National University of Singapore

A team of researchers from the National University of Singapore (NUS) has developed a novel microchip, named EQSCALE, which can capture visual details from video frames at extremely low power consumption. The video feature extractor uses 20 times less power than existing best-in-class chips, allowing it to run on a battery that is 20 times smaller, and could reduce the size of smart vision systems down to the millimetre range. For example, it can be powered continuously by a millimetre-sized solar cell without the need for battery replacement.

Led by Associate Professor Massimo Alioto from the Department of Electrical and Computer Engineering at the NUS Faculty of Engineering, the team's discovery is a major step forward in developing millimetre-sized smart cameras with near-perpetual lifespan. It will also pave the way for cost-effective Internet of Things (IoT) applications, such as ubiquitous safety surveillance in airports and key infrastructure, building energy management, workplace safety, and elderly care.

"IoT is a fast-growing technology wave that uses massively distributed sensors to make our environment smarter and human-centric. Vision electronic systems with long lifetime are currently not feasible for IoT applications due to their high power consumption and large size. Our team has addressed these challenges through our tiny EQSCALE chip and we have shown that ubiquitous and always-on smart cameras are viable. We hope that this new capability will accelerate the ambitious endeavour of embedding the sense of sight in the IoT," said Assoc Prof Alioto.

Tiny vision processing chip that works non-stop

A video feature extractor captures visual details taken by a smart camera and turns them into a much smaller set of points of interest and edges for further analysis. Video feature extraction is the basis of any computer vision system that automatically detects, classifies and tracks objects in the visual scene. It needs to be performed on every single frame continuously, thus defining the minimum power of a smart vision system and hence the minimum system size.
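To make the idea concrete, the short sketch below shows how a full frame is reduced to a small set of points of interest and edge pixels. It is not the EQSCALE algorithm; it simply uses OpenCV's off-the-shelf corner and edge detectors on a synthetic frame, and all parameter values are illustrative.

```python
# A minimal sketch of what a video feature extractor computes, using OpenCV as a
# stand-in. This is not the EQSCALE algorithm, only an illustration of turning a
# full frame into a small set of points of interest and edges.
import cv2
import numpy as np

# Synthetic stand-in for one camera frame: a bright rectangle on a dark background
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(frame, (80, 60), (240, 180), color=255, thickness=-1)

# Points of interest: Shi-Tomasi corner detection
corners = cv2.goodFeaturesToTrack(frame, maxCorners=100, qualityLevel=0.01, minDistance=10)

# Edges: Canny edge map
edges = cv2.Canny(frame, 50, 150)

n_corners = 0 if corners is None else len(corners)
print(f"{frame.size} pixels reduced to {n_corners} points of interest "
      f"and {np.count_nonzero(edges)} edge pixels")
```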

The power consumption of previous state-of-the-art chips for feature extraction ranges from several milliwatts to hundreds of milliwatts, comparable to the average power consumption of a smartwatch and a smartphone, respectively. To enable near-perpetual operation, devices can be powered by solar cells that harvest energy from natural lighting in living spaces. However, at those power levels the solar cells would need to be centimetre-sized or larger, posing a fundamental limit to the miniaturisation of such vision systems. Shrinking them down to the millimetre scale requires reducing power consumption to well below one milliwatt.
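As a rough illustration of why the power budget sets the system size, the back-of-the-envelope sketch below relates average power draw to the minimum solar cell area needed to sustain it. The assumed harvesting density is a purely illustrative figure, since real values vary widely with lighting conditions and cell efficiency; the point is that the required cell area scales linearly with power.

```python
# Back-of-the-envelope link between average power draw and the size of the solar
# cell needed to sustain it. The harvesting density is an illustrative assumption
# only; real values vary widely with lighting and cell efficiency.
HARVEST_DENSITY_UW_PER_MM2 = 10.0   # assumed harvested power density, in µW per mm^2

def min_cell_area_mm2(avg_power_mw: float) -> float:
    """Smallest solar cell area (mm^2) that sustains the given average power (mW)."""
    return (avg_power_mw * 1000.0) / HARVEST_DENSITY_UW_PER_MM2

# Hundreds of mW (smartphone-class), a few mW (smartwatch-class), and 0.2 mW (EQSCALE)
for power_mw in (100.0, 5.0, 0.2):
    area = min_cell_area_mm2(power_mw)
    print(f"{power_mw:6.1f} mW -> at least {area:7.1f} mm^2 of solar cell")
```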

The NUS Engineering team's microchip, EQSCALE, can perform continuous feature extraction at 0.2 milliwatts - 20 times lower in power consumption than any existing technology. This translates into a major advancement in the level of miniaturisation for smart vision systems. The novel feature extractor is smaller than a millimetre on each side, and can be powered continuously by a solar cell that is only a few millimetres in size.

Assoc Prof Alioto explained, "This technological breakthrough is achieved through the concept of energy-quality scaling, where the trade-off between energy consumption and quality in the extraction of features is adjusted. This mimics the dynamic change in the level of attention with which humans observe the visual scene, processing it with different levels of detail and quality depending on the task at hand. Energy-quality scaling allows correct object recognition even when a substantial number of points of interest are missed due to the degraded quality of the target."
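The sketch below is a conceptual illustration of energy-quality scaling rather than the EQSCALE circuit: a single quality knob subsamples each frame more or less aggressively, so fewer points of interest are found at low quality in exchange for proportionally less energy per frame. The function names, interest measure, and energy figures are hypothetical.

```python
# Conceptual sketch of energy-quality scaling, not the EQSCALE implementation.
# A quality knob controls how finely the frame is scanned: lower quality misses
# some points of interest but spends proportionally less energy per frame.
import numpy as np

FULL_QUALITY_ENERGY_UJ = 10.0   # assumed energy per frame at full quality (illustrative)

def extract_points(frame: np.ndarray, quality: float):
    """quality in (0, 1]: lower values subsample the frame more aggressively."""
    stride = max(1, round(1.0 / quality))
    coarse = frame[::stride, ::stride].astype(float)
    gy, gx = np.gradient(coarse)                      # crude interest measure: strong gradients
    n_points = int(np.count_nonzero(np.hypot(gx, gy) > 30))
    energy_uj = FULL_QUALITY_ENERGY_UJ * coarse.size / frame.size   # energy ~ pixels processed
    return n_points, energy_uj

frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)   # stand-in for a camera frame
for q in (0.1, 0.5, 1.0):   # low attention ... full attention
    n, e = extract_points(frame, q)
    print(f"quality={q:.1f}: {n:5d} candidate points, ~{e:5.2f} µJ per frame")
```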

Next steps

The development of EQSCALE is a crucial step towards the future demonstration of millimetre-sized vision systems that could operate indefinitely. The NUS research team is looking into developing a miniaturised computer vision system that comprises smart cameras equipped with vision capabilities enabled by the microchip, as well as a machine learning engine that comprehends the visual scene. The ultimate goal of the NUS research team is to enable massively distributed vision systems for wide-area and ubiquitous visual monitoring, vastly exceeding the traditional concept of cameras.

###

