News Release

Large-scale phase retrieval

Peer-Reviewed Publication

Light Publishing Center, Changchun Institute of Optics, Fine Mechanics and Physics, CAS

The schematic of the reported technique for large-scale phase retrieval

image: The reported technique decomposes the large-scale phase retrieval problem into two subproblems under the PNP-GAP framework, and introduces efficient alternating projection (AP) and enhancing-network solvers for alternating optimization. The workflow realizes robust phase retrieval with low computational complexity and strong generalization across different imaging modalities.

Credit: by Xuyang Chang, Liheng Bian, and Jun Zhang

Wide field of view and high resolution are both desirable for various imaging applications, providing multi-dimensional and multi-scale target information. With the recent development of phase imaging, large-scale detection has been widely employed in a variety of imaging modalities, extending the spatial-bandwidth product (SBP) of optical systems from the million scale to the billion scale. Such a large amount of data poses a great challenge for subsequent phase retrieval (PR) processing. Large-scale PR techniques with low computational complexity and high fidelity are therefore of great significance for imaging and perception applications across dimensions. However, existing PR algorithms suffer from a tradeoff among low computational complexity, robustness to measurement noise and strong generalization, making them inapplicable to general large-scale phase retrieval.

In a newly published research article in eLight, a team of scientists led by Professor Jun Zhang from Beijing Institute of Technology, China, has developed an efficient large-scale phase retrieval technique for high-fidelity complex-domain phase imaging. By combining a conventional optimization algorithm with deep learning, the technique achieves low computational complexity, robustness to measurement noise and strong generalization. The team compared the reported method with existing PR methods on three imaging modalities: coherent diffraction imaging (CDI), coded diffraction pattern imaging (CDP) and Fourier ptychographic microscopy (FPM). The results validate that, compared to the alternating projection (AP) algorithm, the reported technique is robust to measurement noise, with up to a 17 dB enhancement in signal-to-noise ratio. Compared with optimization-based algorithms, the running time is reduced by more than an order of magnitude. In addition, the team demonstrated, for the first time, ultra-large-scale phase retrieval at the 8K level within minutes.

The reported PR technique builds on the plug-and-play (PNP) optimization framework, and extends the efficient generalized-alternating-projection (GAP) strategy from real space to nonlinear space. The scientists summarize the characteristics of their technique:

“The complex-field PNP-GAP scheme ensures strong generalization of our technique on various imaging modalities, and outperforms conventional PNP techniques with fewer auxiliary variables, lower computational complexity and faster convergence.”

“Under the GAP framework, the phase retrieval problem is decomposed into two sub-problems. We introduce an alternating projection solver and an enhancing neural network to solve them, respectively. These two solvers compensate for each other's shortcomings, allowing the optimization to bypass the poor generalization of deep learning and the poor noise robustness of AP.”

“Benefiting from the flexible optimization framework, our technique can incorporate better solvers in the future to keep improving. It would also be interesting to investigate the influence of other image-enhancing solvers, such as super-resolution, deblurring and distortion-removal networks. This may open new insights for phase retrieval with further boosted quality,” the scientists forecast.
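
To make the described alternation concrete, the sketch below mimics the workflow on a toy coded-diffraction model: a data-fidelity sub-problem handled by an alternating-projection step and an enhancement sub-problem handled by a stand-in enhancer (a simple Gaussian smoother in place of the authors' learned network). The forward model, the function and parameter names, and the simplified GAP-style update are illustrative assumptions, not the published implementation.

```python
# Minimal, illustrative sketch of the alternating workflow described above:
# an alternating-projection (AP) data-fidelity step followed by an enhancing
# step, iterated in a simplified GAP-style loop. The forward model and the
# Gaussian-blur "enhancer" are placeholders, not the authors' implementation.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def forward(x, mask):
    """Coded-diffraction-style forward model: masked 2-D Fourier transform."""
    return np.fft.fft2(mask * x, norm="ortho")

def backward(z, mask):
    """Adjoint of the forward model."""
    return np.conj(mask) * np.fft.ifft2(z, norm="ortho")

def ap_projection(x, y_mag, mask):
    """AP data step: keep the current phase in the measurement domain,
    but replace magnitudes with the measured ones."""
    z = forward(x, mask)
    z = y_mag * np.exp(1j * np.angle(z))
    return backward(z, mask)

def enhance(x, sigma=1.0):
    """Stand-in for the enhancing neural network: smooth magnitude and phase.
    In the reported technique this would be a learned enhancement network."""
    mag = gaussian_filter(np.abs(x), sigma)
    pha = gaussian_filter(np.angle(x), sigma)
    return mag * np.exp(1j * pha)

def pnp_gap_pr(y_mag, mask, iters=100):
    """Alternate between the two sub-problem solvers (simplified GAP loop)."""
    x = backward(y_mag.astype(complex), mask)   # crude initialization
    v = x.copy()
    for _ in range(iters):
        x = ap_projection(v, y_mag, mask)       # sub-problem 1: data fidelity (AP)
        v = enhance(x)                          # sub-problem 2: image enhancement
    return v

# Toy usage: recover a smooth complex field from coded-diffraction magnitudes.
truth = np.exp(1j * gaussian_filter(rng.standard_normal((64, 64)), 3))
mask = np.exp(1j * 2 * np.pi * rng.random((64, 64)))
y_mag = np.abs(forward(truth, mask))
estimate = pnp_gap_pr(y_mag, mask)
```

This sketch omits the auxiliary-variable bookkeeping of the full complex-field GAP update and uses a hand-crafted smoother, so it only conveys the structure of the alternation: swapping in a learned enhancement network and the proper GAP weighting is what the reported technique contributes.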
