Patents by Inventor Pei-Jung LIANG
Pei-Jung LIANG has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250136105
Abstract: A device and a method for steering a vehicle are provided. The method includes the following steps: obtaining point cloud data of the vehicle through a lidar, obtaining an RGB image of the vehicle through a camera, and obtaining a current speed of the vehicle through a wheel speed sensor; using the current speed and local path waypoints associated with the point cloud data to obtain a target angle; using the current speed and a central lane distance error associated with the RGB image to obtain a compensator angle; and using the target angle and the compensator angle to obtain a steering command, and steering the vehicle to drive in a lane according to the steering command.
Type: Application
Filed: January 4, 2024
Publication date: May 1, 2025
Applicant: Industrial Technology Research Institute
Inventors: Jheng-Rong Wu, Chiung-Hung Chen, Hong-Xian Tsai, Pei-Jung Liang
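A minimal Python sketch of the two-term steering idea in this abstract, assuming a pure-pursuit-style target angle from the lidar-derived waypoints and a proportional compensator from the camera-derived lane-center error; the function name, gains, and lookahead heuristic are illustrative assumptions, not the patented control law.

```python
import math

def steering_command(speed, waypoints, lane_center_error,
                     wheelbase=2.7, k_comp=0.35):
    """Two-term steering: a target angle from local path waypoints plus a
    compensator angle from the camera-derived lane-center error.

    Names, gains, and the pure-pursuit geometry are illustrative assumptions,
    not the patented control law.
    """
    # Pick the waypoint closest to a speed-dependent lookahead distance.
    lookahead = max(3.0, 0.5 * speed)
    tx, ty = min(waypoints, key=lambda p: abs(math.hypot(p[0], p[1]) - lookahead))
    # Pure-pursuit-style target angle (vehicle frame: x forward, y left).
    target_angle = math.atan2(2.0 * wheelbase * ty, tx * tx + ty * ty)
    # Compensator angle: proportional correction of the lane-center distance
    # error, softened at higher speeds (assumed gain schedule).
    compensator_angle = k_comp * lane_center_error / max(speed, 1.0)
    # The steering command combines both angles, as described in the abstract.
    return target_angle + compensator_angle

# Example: 10 m/s, three waypoints in the vehicle frame, 0.2 m lane offset.
print(steering_command(10.0, [(3.0, 0.1), (6.0, 0.4), (9.0, 0.9)], 0.2))
```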
-
Publication number: 20240142237
Abstract: A localization device and a localization method for a vehicle are provided. The localization device includes an inertia measurer, an encoder, an image capturing device, and a processor. The processor obtains encoded data from the encoder to generate first odometer data, obtains inertial data from the inertia measurer to generate heading angle estimation data, and obtains environmental image data from the image capturing device to generate second odometer data. In a first fusion stage, the processor fuses the heading angle estimation data and the first odometer data to generate first fusion data. In a second fusion stage, the processor fuses the first fusion data, the heading angle estimation data, and the second odometer data to generate pose estimation data corresponding to the localization device.
Type: Application
Filed: January 5, 2023
Publication date: May 2, 2024
Applicant: Industrial Technology Research Institute
Inventors: Yu-Jhong Chen, Pei-Jung Liang, Ren-Yi Huang
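A hedged sketch of the two fusion stages described above, over (x, y, yaw) estimates; the simple weighted blending and the weight values are assumptions for illustration, not the patented fusion filter.

```python
import numpy as np

def two_stage_fusion(enc_odom, imu_heading, vis_odom,
                     w_heading=0.8, w_visual=0.5):
    """Two fusion stages over (x, y, yaw) estimates.

    enc_odom comes from the wheel encoder (first odometer data), imu_heading
    from the inertia measurer (heading angle estimation data), and vis_odom
    from the camera (second odometer data). The weighted blending and the
    weights themselves are illustrative assumptions, not the patented filter.
    """
    enc_odom = np.asarray(enc_odom, dtype=float)
    vis_odom = np.asarray(vis_odom, dtype=float)

    # Stage 1: fuse the heading estimate with the encoder odometry.
    first_fusion = enc_odom.copy()
    first_fusion[2] = w_heading * imu_heading + (1.0 - w_heading) * enc_odom[2]

    # Stage 2: fuse the stage-1 result, the heading estimate, and the visual
    # odometry into the final pose estimate.
    pose = (1.0 - w_visual) * first_fusion + w_visual * vis_odom
    pose[2] = w_heading * imu_heading + (1.0 - w_heading) * pose[2]
    return pose  # pose estimation data: (x, y, yaw)

# Example: encoder (1.00, 0.05, 0.02), IMU yaw 0.03 rad, camera (0.98, 0.07, 0.04).
print(two_stage_fusion([1.00, 0.05, 0.02], 0.03, [0.98, 0.07, 0.04]))
```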
-
Patent number: 10852420
Abstract: In one of the exemplary embodiments, the disclosure is directed to an object detection system including a first type of sensor for generating first sensor data; a second type of sensor for generating second sensor data; and a processor coupled to the first type of sensor and the second type of sensor and configured at least for: processing the first sensor data by using a first plurality of object detection algorithms and processing the second sensor data by using a second plurality of object detection algorithms, wherein each of the first plurality of object detection algorithms and each of the second plurality of object detection algorithms includes environmental parameters calculated from a plurality of parameter detection algorithms; and determining, for each detected object, a bounding box resulting from processing the first sensor data and processing the second sensor data.
Type: Grant
Filed: June 15, 2018
Date of Patent: December 1, 2020
Assignee: Industrial Technology Research Institute
Inventors: Peter Chondro, Pei-Jung Liang
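A sketch of one way a single bounding box per detected object could be decided from the two sensor branches. IoU matching plus box averaging is an assumed merging rule for illustration; the abstract only states that each branch's detectors use environmental parameters computed by separate parameter detection algorithms.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def fuse_detections(boxes_a, boxes_b, iou_thr=0.5):
    """Decide one bounding box per detected object from the two sensor branches.

    IoU matching plus averaging is an assumed fusion rule for illustration,
    not the patented decision step.
    """
    fused, used_b = [], set()
    for a in boxes_a:
        match = next((j for j, b in enumerate(boxes_b)
                      if j not in used_b and iou(a, b) >= iou_thr), None)
        if match is None:
            fused.append(a)                                   # branch A only
        else:
            used_b.add(match)
            b = boxes_b[match]
            fused.append(tuple((x + y) / 2 for x, y in zip(a, b)))  # averaged box
    fused.extend(b for j, b in enumerate(boxes_b) if j not in used_b)
    return fused
```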
-
Patent number: 10748033
Abstract: The disclosure is directed to an object detection method using a CNN model and an object detection apparatus thereof. In an aspect, the object detection method includes generating sensor data; processing the sensor data by using a first object detection algorithm to generate a first object detection result; processing the first object detection result by using a plurality of stages of a sparse update mapping algorithm to generate a plurality of stages of an updated first object detection result; processing a first stage of the stages of the updated first object detection result by using a plurality of stages of a spatial pooling algorithm between each of the stages of the sparse update mapping algorithm; executing a plurality of stages of a deep convolution layer algorithm to extract a plurality of feature results; and performing a detection prediction based on a last-stage feature result.
Type: Grant
Filed: December 11, 2018
Date of Patent: August 18, 2020
Assignee: Industrial Technology Research Institute
Inventors: Wei-Hao Lai, Pei-Jung Liang, Peter Chondro, Tse-Min Chen, Shanq-Jang Ruan
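A hedged sketch of the staged sparse-update idea: build a per-pixel change mask from consecutive inputs and spatially pool it between stages, so deeper convolution stages only recompute regions that changed. The threshold, pooling factor, and stage count are illustrative assumptions standing in for the patented sparse update mapping and spatial pooling stages.

```python
import numpy as np

def sparse_update_masks(prev_frame, curr_frame, num_stages=3,
                        threshold=0.05, pool=2):
    """Build a change mask and pool it between stages so deeper convolution
    stages only update regions that changed (illustrative assumption)."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    mask = diff.max(axis=-1) > threshold          # per-pixel "changed" mask
    masks = [mask]
    for _ in range(num_stages - 1):
        h, w = mask.shape
        # Spatial pooling between stages: a block is dirty if any pixel in it is.
        mask = mask[:h - h % pool, :w - w % pool]
        mask = mask.reshape(h // pool, pool, w // pool, pool).any(axis=(1, 3))
        masks.append(mask)
    return masks   # one update mask per stage, coarser at deeper stages
```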
-
Patent number: 10699430
Abstract: In one of the exemplary embodiments, the disclosure is directed to a depth estimation apparatus including a first type of sensor for generating first sensor data; a second type of sensor for generating second sensor data; and a processor coupled to the first type of sensor and the second type of sensor and configured at least for: processing the first sensor data by using two-stage segmentation algorithms to generate a first segmentation result and a second segmentation result; synchronizing parameters of the first segmentation result and parameters of the second sensor data to generate synchronized second sensor data; and fusing the first segmentation result, the synchronized second sensor data, and the second segmentation result by using two-stage depth estimation algorithms to generate a first depth result and a second depth result.
Type: Grant
Filed: October 9, 2018
Date of Patent: June 30, 2020
Assignee: Industrial Technology Research Institute
Inventors: Peter Chondro, Wei-Hao Lai, Pei-Jung Liang
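A sketch of the coarse-then-fine flow described above, assuming the first sensor is a camera and the second a lidar whose returns are projected into the image plane. The per-segment median fill and the helper functions (coarse_seg_fn, fine_seg_fn, project_fn) are assumptions for illustration, not the patented two-stage depth estimation algorithms.

```python
import numpy as np

def two_stage_depth(image, lidar_points, coarse_seg_fn, fine_seg_fn, project_fn):
    """Coarse-then-fine depth fill guided by two segmentation results
    (illustrative stand-in for the patented method)."""
    seg1 = coarse_seg_fn(image)        # first segmentation result (H x W labels)
    seg2 = fine_seg_fn(image)          # second segmentation result (H x W labels)
    sparse = project_fn(lidar_points)  # synchronized second sensor data (H x W depth)

    depth1 = np.zeros_like(sparse, dtype=float)
    for label in np.unique(seg1):      # first depth result: per coarse segment
        region = seg1 == label
        vals = sparse[region & (sparse > 0)]
        depth1[region] = np.median(vals) if vals.size else 0.0

    depth2 = depth1.copy()
    for label in np.unique(seg2):      # second depth result: refine per fine segment
        region = seg2 == label
        vals = sparse[region & (sparse > 0)]
        if vals.size:
            depth2[region] = np.median(vals)
    return depth1, depth2
```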
-
Publication number: 20200184260
Abstract: The disclosure is directed to an object detection method using a CNN model and an object detection apparatus thereof. In an aspect, the object detection method includes generating sensor data; processing the sensor data by using a first object detection algorithm to generate a first object detection result; processing the first object detection result by using a plurality of stages of a sparse update mapping algorithm to generate a plurality of stages of an updated first object detection result; processing a first stage of the stages of the updated first object detection result by using a plurality of stages of a spatial pooling algorithm between each of the stages of the sparse update mapping algorithm; executing a plurality of stages of a deep convolution layer algorithm to extract a plurality of feature results; and performing a detection prediction based on a last-stage feature result.
Type: Application
Filed: December 11, 2018
Publication date: June 11, 2020
Applicant: Industrial Technology Research Institute
Inventors: Wei-Hao Lai, Pei-Jung Liang, Peter Chondro, Tse-Min Chen, Shanq-Jang Ruan
-
Publication number: 20200111225
Abstract: In one of the exemplary embodiments, the disclosure is directed to a depth estimation apparatus including a first type of sensor for generating first sensor data; a second type of sensor for generating second sensor data; and a processor coupled to the first type of sensor and the second type of sensor and configured at least for: processing the first sensor data by using two-stage segmentation algorithms to generate a first segmentation result and a second segmentation result; synchronizing parameters of the first segmentation result and parameters of the second sensor data to generate synchronized second sensor data; and fusing the first segmentation result, the synchronized second sensor data, and the second segmentation result by using two-stage depth estimation algorithms to generate a first depth result and a second depth result.
Type: Application
Filed: October 9, 2018
Publication date: April 9, 2020
Applicant: Industrial Technology Research Institute
Inventors: Peter Chondro, Wei-Hao Lai, Pei-Jung Liang
-
Patent number: 10600208
Abstract: An object detecting device, an object detecting method, and a non-transitory computer-readable medium are provided. The object detecting method includes the following steps: A classifier generates a current color image and a current gray scale image. The classifier generates an initial characteristic pattern from the current color image via a neural network algorithm. The classifier adjusts a current dimension of the initial characteristic pattern to generate an adjusted characteristic pattern according to a gray scale image dimension of the current gray scale image. The classifier concatenates the adjusted characteristic pattern and the current gray scale image to calculate a class confidence. The classifier determines whether the class confidence is larger than a confidence threshold, and outputs a current classification result if the class confidence is larger than the confidence threshold. A storage device stores the current classification result.
Type: Grant
Filed: June 13, 2018
Date of Patent: March 24, 2020
Assignee: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE
Inventors: Pei-Jung Liang, Wei-Hao Lai
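A hedged sketch of the adjust-then-concatenate flow in this abstract: resize the color-image characteristic pattern to the gray-scale image dimensions, concatenate the two, and only output a result when the class confidence clears the threshold. The backbone_fn and head_fn callables and the nearest-neighbour resize are illustrative assumptions, not the patented classifier internals.

```python
import numpy as np

def classify(color_img, gray_img, backbone_fn, head_fn, conf_thr=0.6):
    """Resize the color-image feature map to the gray-scale image dimensions,
    concatenate the two, and emit a result only above the confidence threshold
    (illustrative stand-in for the patented classifier)."""
    features = backbone_fn(color_img)          # initial characteristic pattern (H' x W' x C)
    h, w = gray_img.shape
    fh, fw = features.shape[:2]
    # Adjust the pattern to the gray-scale image dimensions (nearest neighbour).
    rows = np.arange(h) * fh // h
    cols = np.arange(w) * fw // w
    adjusted = features[rows][:, cols]
    # Concatenate the adjusted pattern with the gray-scale image.
    stacked = np.concatenate([adjusted, gray_img[..., None]], axis=-1)
    confidence, label = head_fn(stacked)       # class confidence and label
    # Output a classification result only if the confidence clears the threshold.
    return (label, confidence) if confidence > conf_thr else (None, confidence)
```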
-
Publication number: 20190353774
Abstract: In one of the exemplary embodiments, the disclosure is directed to an object detection system including a first type of sensor for generating first sensor data; a second type of sensor for generating second sensor data; and a processor coupled to the first type of sensor and the second type of sensor and configured at least for: processing the first sensor data by using a first plurality of object detection algorithms and processing the second sensor data by using a second plurality of object detection algorithms, wherein each of the first plurality of object detection algorithms and each of the second plurality of object detection algorithms includes environmental parameters calculated from a plurality of parameter detection algorithms; and determining, for each detected object, a bounding box resulting from processing the first sensor data and processing the second sensor data.
Type: Application
Filed: June 15, 2018
Publication date: November 21, 2019
Applicant: Industrial Technology Research Institute
Inventors: Peter Chondro, Pei-Jung Liang
-
Publication number: 20190197729
Abstract: An object detecting device, an object detecting method, and a non-transitory computer-readable medium are provided. The object detecting method includes the following steps: A classifier generates a current color image and a current gray scale image. The classifier generates an initial characteristic pattern from the current color image via a neural network algorithm. The classifier adjusts a current dimension of the initial characteristic pattern to generate an adjusted characteristic pattern according to a gray scale image dimension of the current gray scale image. The classifier concatenates the adjusted characteristic pattern and the current gray scale image to calculate a class confidence. The classifier determines whether the class confidence is larger than a confidence threshold, and outputs a current classification result if the class confidence is larger than the confidence threshold. A storage device stores the current classification result.
Type: Application
Filed: June 13, 2018
Publication date: June 27, 2019
Inventors: Pei-Jung LIANG, Wei-Hao LAI