-
1.
Publication No.: US20240037765A1
Publication Date: 2024-02-01
Application No.: US17769230
Application Date: 2020-08-27
Applicant: NANJING UNIVERSITY OF SCIENCE AND TECHNOLOGY
Inventor: Chao ZUO , Jiaming QIAN , Qian CHEN , Shijie FENG , Tianyang TAO , Yan HU , Wei YIN , Liang ZHANG , Kai LIU , Shuaijie WU , Mingzhu XU , Jiaye WANG
CPC classification number: G06T7/337 , H04N23/90 , G06T7/85 , G06T3/0068 , G06T2207/10028 , G06T2207/10012
Abstract: Disclosed is a high-precision dynamic real-time 360-degree omnidirectional point cloud acquisition method based on fringe projection. The method comprises: firstly, by means of fringe projection technology based on a stereoscopic phase unwrapping method, and with the assistance of an adaptive dynamic depth constraint mechanism, acquiring high-precision three-dimensional (3D) data of an object in real time without any additional auxiliary fringe pattern; and then, after two-dimensional (2D) matching points optimized by means of the corresponding 3D information are rapidly acquired, carrying out, via a two-thread parallel mechanism, coarse registration based on Simultaneous Localization and Mapping (SLAM) technology and fine registration based on Iterative Closest Point (ICP) technology. The invention makes low-cost, high-speed, high-precision, unconstrained and rapid-feedback omnidirectional 3D real-time modeling possible, and opens a new door to fields such as 360-degree workpiece 3D surface defect detection and rapid reverse forming.
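The fine-registration step named in the abstract is standard Iterative Closest Point: alternately match each point to its nearest neighbour in the target cloud and solve for the best rigid transform. The patent does not disclose its implementation; the following is a minimal illustrative sketch of that generic ICP loop (brute-force nearest neighbours, Kabsch/SVD alignment), with all function names being our own.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst,
    given row-wise corresponding Nx3 point arrays."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=20):
    """Brute-force nearest-neighbour ICP refining src toward dst.
    Returns the accumulated rotation R and translation t."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # nearest point in dst for every current source point
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matches = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matches)
        cur = cur @ R.T + t
        # compose: x -> R (R_total x + t_total) + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

A real pipeline would seed this loop with the SLAM-based coarse pose and use a k-d tree for the correspondence search; the O(N²) distance matrix here is only for brevity.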
-
2.
Publication No.: US20230122985A1
Publication Date: 2023-04-20
Application No.: US17909780
Application Date: 2020-08-27
Applicant: Nanjing University of Science and Technology
Inventor: Shijie FENG , Qian CHEN , Chao ZUO , Yuzhen ZHANG , Jiasong SUN , Yan HU , Wei YIN , Jiaming QIAN
IPC: G06N3/094 , G06N3/0475 , G06N3/048 , G06V10/44
Abstract: The invention discloses a single-frame fringe pattern analysis method based on a multi-scale generative adversarial network. A multi-scale generative adversarial neural network model is constructed and a comprehensive loss function is applied. Next, training data are collected to train the multi-scale generative adversarial network. During prediction, a fringe pattern is fed into the trained multi-scale network, where the generator outputs the sine term, cosine term, and modulation image of the input pattern. Finally, the arctangent function is applied to compute the phase. Once the network is trained, its parameters do not need to be manually tuned during the calculation. Since the input of the neural network is only a single fringe pattern, the invention provides an efficient and high-precision phase calculation method for moving objects.
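The final step described above, recovering the phase from the predicted sine and cosine terms via the arctangent, is the standard wrapped-phase computation from fringe analysis. As an illustrative sketch (not the patent's code; the fringe model and names below are our own assumptions), for a fringe image I = A + B·cos(φ), a network predicting M = B·sin(φ) and D = B·cos(φ) yields the wrapped phase directly:

```python
import numpy as np

def wrapped_phase(sin_term, cos_term):
    """Wrapped phase in (-pi, pi] from predicted numerator/denominator.
    The shared modulation amplitude B cancels inside the arctangent."""
    return np.arctan2(sin_term, cos_term)

# Synthetic example: a linear phase ramp with modulation B = 0.5.
x = np.linspace(0.0, 4.0 * np.pi, 100)
phi = x                                  # ground-truth (unwrapped) phase
B = 0.5
M = B * np.sin(phi)                      # what the generator would output
D = B * np.cos(phi)
phase = wrapped_phase(M, D)              # equals phi wrapped into (-pi, pi]
```

Using `arctan2` rather than `arctan(M / D)` resolves the full 2π quadrant ambiguity and avoids division by zero where cos(φ) vanishes; a subsequent phase-unwrapping step would then remove the remaining 2π jumps.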
-