Article Highlight | 13-Aug-2025

Near‑sensor edge computing system enabled by a CMOS compatible photonic integrated circuit platform using bilayer AlN/Si waveguides

Shanghai Jiao Tong University Journal Center

A groundbreaking article in Nano-Micro Letters by Chengkuo Lee and co-workers from the National University of Singapore presents a hybrid photonic-electronic Near-Sensor Edge Computing (NSEC) chip that fuses aluminum nitride (AlN) microrings with silicon Mach–Zehnder interferometers to deliver real-time AI at the edge, consuming less than 0.34 pJ and responding within 10 ns per inference.

Why This Research Matters
Overcoming Cloud Limitations: As ChatGPT-scale models drive data-center overload, latency, privacy, and energy become bottlenecks for wearables, robotics, and immersive AR/VR. The NSEC chip processes data right at the sensor, eliminating cloud hops and cutting energy/latency by orders of magnitude.
Enabling Multimodal Edge AI: Beyond traditional electronics, the system natively handles both electrical (TENG, MEMS, bio-signals) and optical (spectroscopic, LiDAR) inputs, opening pathways for mixed-reality human–machine interfaces and agentic physical AI.

Innovative Design & Mechanisms
Bilayer AlN/Si PIC Platform: Monolithic 8-inch foundry fabrication stacks a 220 nm Si waveguide layer beneath a 500 nm AlN layer; adiabatic couplers achieve 0.04 dB inter-layer loss across the telecom band.
Photonic Feature Extraction: AlN microrings exploit the Pockels effect (0.26 pm V⁻¹ tuning) to convert analog TENG force/pressure signals into wavelength-shift-encoded optical features, equivalent to on-the-fly integration that preserves temporal dynamics (an illustrative estimate follows this list).
On-Chip Neural Networks: Si thermo-optic MZIs perform 4×4 matrix–vector multiplication with ~30 dB extinction ratio and a π phase shift at 5.6 V; weights are updated in situ via back-propagation with 4-bit quantization to suppress noise (σ = 0.011), as sketched numerically after this list.
Ultra-Low Latency & Power: The entire inference loop, from triboelectric sensor to classified output, completes within 10 ns and dissipates 0.34 pJ, outperforming CMOS ASICs by more than 100× in energy-delay product (worked out below).
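A rough estimate makes the wavelength-shift encoding concrete. Only the 0.26 pm V⁻¹ tuning efficiency below is taken from the article; the 10 V pulse amplitude is a hypothetical triboelectric signal chosen purely for illustration:

\[
\Delta\lambda \;\approx\; \frac{\partial \lambda}{\partial V}\, V_{\mathrm{TENG}}
= 0.26\ \mathrm{pm\,V^{-1}} \times 10\ \mathrm{V} \approx 2.6\ \mathrm{pm},
\]

i.e. a ten-volt sensor pulse shifts the microring resonance by a few picometers, which a fixed-wavelength probe would register as an intensity change near resonance.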
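A minimal numerical sketch of the 4×4 matrix-vector step is given below in Python/NumPy. The weight matrix, input vector, and additive-Gaussian noise model are illustrative stand-ins rather than the authors' trained values; only the 4-bit weight resolution and σ = 0.011 come from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_4bit(w, w_max=1.0):
    """Map weights onto 2**4 = 16 evenly spaced levels spanning [-w_max, w_max]."""
    levels = 2**4
    idx = np.round((np.clip(w, -w_max, w_max) + w_max) / (2 * w_max) * (levels - 1))
    return idx / (levels - 1) * 2 * w_max - w_max

# Hypothetical 4x4 weight matrix standing in for the trained MZI mesh settings.
W = rng.uniform(-1.0, 1.0, size=(4, 4))
W_q = quantize_4bit(W)

# Hypothetical 4-channel feature vector from the TENG/microring front end.
x = rng.uniform(0.0, 1.0, size=4)

# Analog non-idealities modeled here as additive Gaussian noise with sigma = 0.011.
sigma = 0.011
y_ideal = W @ x                                # noiseless, full-precision reference
y_chip = W_q @ x + rng.normal(0.0, sigma, 4)   # 4-bit weights plus noise

print("full-precision output:", np.round(y_ideal, 3))
print("4-bit + noise output: ", np.round(y_chip, 3))
```

Comparing the two outputs gives a feel for the combined effect of coarse weights and analog noise, the non-idealities that the in-situ back-propagation described above is trained around.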
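For reference, the quoted per-inference figures imply an energy-delay product of

\[
\mathrm{EDP} = E \times t = 0.34\ \mathrm{pJ} \times 10\ \mathrm{ns} = 3.4 \times 10^{-21}\ \mathrm{J\,s},
\]

the figure of merit behind the reported >100× advantage over CMOS ASICs.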

Applications & Future Outlook
Gesture Recognition: Sensor gloves with four TENG channels achieve 96.77 % accuracy on 13 American Sign Language gestures after 4-bit quantization, versus 74.56 % with raw analog inputs.
Gait Analysis: Sensor socks map seven stance/swing phases with 98.31 % accuracy, enabling real-time fall detection for mixed-reality (MR) safety.
Metaverse & Health: Demonstrated VR control (“turn on light/fan”) and avatar gait tracking showcase a plug-and-play path for smart wearables, tele-rehabilitation, and ubiquitous IoT.
Next Steps: Scale network depth via cascaded AlN/Si tiles, integrate on-chip phase-change memories for non-volatile weights, and develop standardized PIC-to-CMOS co-packaging for mass production.

Conclusions
This NSEC paradigm leapfrogs von Neumann bottlenecks by embedding photonic intelligence directly on wearable sensors. By uniting electro-photonic co-design, self-powered triboelectric sensing, and scalable foundry processes, the work paves the way for ubiquitous, privacy-preserving, and energy-efficient edge AI that will redefine human–machine interaction in the metaverse era.
