Citation: Yin W, Che YX, Li XS et al. Physics-informed deep learning for fringe pattern analysis. Opto-Electron Adv 7, 230034 (2024). doi: 10.29026/oea.2024.230034
Supplementary information for Physics-informed deep learning for fringe pattern analysis
Supplementary movie
Diagrams of the physics-driven method, physics-informed deep learning approach, and data-driven deep learning approach for fringe pattern analysis.
Overview of the proposed PI-FPA. (a) PI-FPA including a LeFTP module and a lightweight network. (b) Net head and Net tail. (c) The phase retrieval process of the LeFTP module.
Comparative results for single-shot fringe pattern analysis of the David model. (a–e) The phase retrieval process, wrapped phases, phase errors, and magnified views of the phase errors using FTP, LeFTP, Net head + LeFTP, U-Net, and PI-FPA.
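FTP in this comparison denotes classical Fourier transform profilometry, which recovers a wrapped phase from a single fringe image by isolating the fundamental (carrier) lobe in the Fourier spectrum. As a rough orientation only, the following Python sketch illustrates that Takeda-style procedure; the function name, the assumed carrier direction (along x), the carrier period, and the band-pass half-width are illustrative assumptions rather than parameters taken from the paper.

```python
import numpy as np

def ftp_wrapped_phase(fringe, period_px, halfwidth=None):
    """Takeda-style FTP sketch: isolate the +1 carrier lobe in the spectrum,
    shift it to DC, and take the angle of the inverse transform."""
    _, cols = fringe.shape
    spectrum = np.fft.fftshift(np.fft.fft2(fringe))
    fc = cols / period_px                     # carrier offset from DC, in spectrum bins
    if halfwidth is None:
        halfwidth = max(1, int(fc / 2))       # crude band-pass half-width (illustrative)
    cx = cols // 2
    mask = np.zeros(spectrum.shape)
    mask[:, int(round(cx + fc - halfwidth)):int(round(cx + fc + halfwidth))] = 1.0
    lobe = np.roll(spectrum * mask, -int(round(fc)), axis=1)  # shift lobe to DC, removing the carrier
    baseband = np.fft.ifft2(np.fft.ifftshift(lobe))
    return np.angle(baseband)                 # wrapped phase in (-pi, pi]; unwrapping is a separate step
```

For example, a 1024-pixel-wide image with a 16-pixel fringe period puts the carrier about 64 bins from the spectrum center.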
Comparative fringe analysis results of the industrial part. (a) The industrial part and the phase errors using FTP, U-Net, and PI-FPA. (b) The magnified views of the phase errors. (c) Single-shot 3D imaging results using different methods. (d) The magnified views of (c). (e) The line profiles in (d).
Precision analysis for a ceramic plane and a standard sphere moving along the Z axis. (a) 3D reconstruction results using PI-FPA at different time points. (b–c) The error distributions of the sphere and plane. (d–e) Temporal precision analysis results of the plane and sphere over a 1.62 s period using 3-step PS, FTP, U-Net, and PI-FPA. (f–i) The color-coded 3D reconstructions and the corresponding error distributions of the plane and the standard sphere using different methods at T = 0.81 s.
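The 3-step PS baseline in this figure refers to three-step phase-shifting profilometry. For orientation, a minimal sketch of its wrapped-phase computation is given below; it assumes equally spaced phase shifts of -2π/3, 0, and +2π/3, which is one common convention and not necessarily the sequence used in the paper.

```python
import numpy as np

def three_step_ps_phase(i1, i2, i3):
    # Wrapped phase from three phase-shifted fringe images, assuming equal
    # shifts of -2*pi/3, 0, +2*pi/3 (a common convention; the paper's own
    # shift sequence may differ).
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
```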
Fast 3D measurement results using different fringe pattern analysis methods. (a) The representative fringe images at different time points and the corresponding color-coded 3D reconstruction results for the rotated workpiece model using 3-step PS, FTP, U-Net, and PI-FPA. (b) The representative fringe images at different time points and the corresponding color-coded 3D reconstruction results for a non-rigid dynamic face using 3-step PS, FTP, U-Net, and PI-FPA. (c) 360-degree 3D reconstruction of the workpiece model using PI-FPA. (d) 3D measurement results of the non-rigid dynamic face using PI-FPA.