Visualization of blood flow sharpens artificial heart
Peer-Reviewed Publication
Last Updated: 2-Jan-2026 23:11 ET (3-Jan-2026 04:11 GMT/UTC)
This paper proposes F-GCN, a deep learning framework that integrates multiple wavelet bases and extracts motor imagery (MI) EEG features based on the functional topological relationships between electrodes. The fused features reach an average accuracy of 92.44%, significantly higher than any single wavelet basis alone (coif4: 67.67%; db4: 82.93%; sym4: 73.10%). The method also shows good stability and per-subject convergence under leave-one-out validation, demonstrating its effectiveness.
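The core fusion idea in the summary above, extracting a feature vector under each of several wavelet bases and combining them rather than relying on one basis, can be sketched as follows. This is a minimal NumPy-only illustration under stated assumptions: the log-energy features, the Haar/db2 filter pairs standing in for the paper's coif4/db4/sym4 bases, and fusion by simple concatenation are all placeholders, not the F-GCN implementation.

```python
import numpy as np

def qmf(lo):
    # Quadrature-mirror highpass derived from a lowpass filter.
    return np.array([(-1) ** k * c for k, c in enumerate(lo[::-1])])

def dwt_level(x, lo):
    # One discrete-wavelet analysis level: filter, then downsample by 2.
    hi = qmf(lo)
    return np.convolve(x, lo)[::2], np.convolve(x, hi)[::2]

def basis_features(x, lo, levels=3):
    # Log-energy of the detail coefficients at each decomposition level.
    feats = []
    for _ in range(levels):
        x, d = dwt_level(x, lo)
        feats.append(np.log(np.sum(d ** 2) + 1e-12))
    return np.array(feats)

# Lowpass filters standing in for different wavelet bases (Haar and db2
# shown; db4/sym4/coif4 coefficients would be substituted in practice).
s3 = np.sqrt(3.0)
BASES = {
    "haar": np.array([1.0, 1.0]) / np.sqrt(2.0),
    "db2":  np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0)),
}

rng = np.random.default_rng(0)
eeg = rng.standard_normal(256)  # stand-in for one single-channel MI-EEG trial

# Fusion by concatenation: one feature vector per basis, joined end to end.
fused = np.concatenate([basis_features(eeg, lo) for lo in BASES.values()])
print(fused.shape)  # (6,) = 2 bases x 3 levels
```

The fused vector would then feed a classifier; in the paper, a graph convolutional network over the electrode topology plays that role.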
A research team introduces a scalable, drone-based 3D reconstruction pipeline combined with SegVoteNet, a novel deep learning framework, to phenotype sorghum panicles in field trials.
The rapid advancement of artificial intelligence (AI) presents significant opportunities for both societies and industries. At the same time, however, it raises growing concerns about the increasing frequency and sophistication of cyberattacks. These cyber risks can lead not only to substantial direct financial losses for firms, but also to indirect losses stemming from reputational damage and cascading effects within interconnected systems. To enhance resilience in the face of such events, it is essential for scholars across disciplines to engage in rigorous analysis and interdisciplinary dialogue on the assessment and management of cyber risks.
New machine learning models developed by University of South Australia (UniSA) researchers could help clinicians identify when patients can successfully stop long-term antidepressant use.