Patents by Inventor Zhiheng Niu
Zhiheng Niu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220188582
Abstract: A method is provided for classifying a tracked object in an environment of a vehicle. The vehicle includes a plurality of radar sensors and a processing device configured to establish a neural network. According to the method, local radar detections are captured from an object in the environment of the vehicle via the radar sensors. Based on the local radar detections, point features and tracker features are determined. The point features are encoded via point encoding layers of the neural network, whereas the tracker features are encoded via track encoding layers of the neural network. A temporal fusion of the encoded point features and the encoded tracker features is performed via temporal fusion layers of the neural network. The tracked object is classified based on the fused encoded point and tracker features via classifying layers of the neural network.
Type: Application
Filed: December 9, 2021
Publication date: June 16, 2022
Inventors: Weimeng Zhu, Florian Kaestner, Zhiheng Niu, Arne Grumpe
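The pipeline in this abstract can be outlined in code. The following is a minimal NumPy sketch, not the patented implementation: the layer sizes, the choice of 4 point features and 3 tracker features, the random weights, and the use of a time-average as the "temporal fusion" step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_relu(x, w, b):
    # One dense layer with ReLU, standing in for an encoding stage.
    return np.maximum(x @ w + b, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Randomly initialised weights stand in for a trained network.
w_pt, b_pt = rng.normal(size=(4, 8)), np.zeros(8)    # point encoding layers
w_tr, b_tr = rng.normal(size=(3, 8)), np.zeros(8)    # track encoding layers
w_cls, b_cls = rng.normal(size=(16, 3)), np.zeros(3) # classifying layers

def classify_track(point_seq, track_seq):
    """point_seq: (T, 4) per-scan point features; track_seq: (T, 3) tracker features."""
    enc_pts = dense_relu(point_seq, w_pt, b_pt)
    enc_trk = dense_relu(track_seq, w_tr, b_tr)
    # Fuse the two encoded streams, then average over the T time steps
    # (a stand-in for the temporal fusion layers).
    fused = np.concatenate([enc_pts, enc_trk], axis=1).mean(axis=0)
    return softmax(fused @ w_cls + b_cls)  # class probabilities

probs = classify_track(rng.normal(size=(5, 4)), rng.normal(size=(5, 3)))
```

With trained weights, `probs` would give the class distribution (e.g. pedestrian / bicycle / car) for the tracked object.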
-
Publication number: 20220026568
Abstract: A computer implemented method for detection of objects in a vicinity of a vehicle comprises the following steps carried out by computer hardware components: acquiring radar data from a radar sensor; determining a plurality of features based on the radar data; providing the plurality of features to a single detection head; and determining a plurality of properties of an object based on an output of the single detection head.
Type: Application
Filed: July 23, 2021
Publication date: January 27, 2022
Inventors: Mirko Meuter, Jittu Kurian, Yu Su, Jan Siegemund, Zhiheng Niu, Stephanie Lessmann, Saeid Khalili Dehkordi, Florian Kästner, Igor Kossaczky, Sven Labusch, Arne Grumpe, Markus Schoeler, Moritz Luszek, Weimeng Zhu, Adrian Becker, Alessandro Cennamo, Kevin Kollek, Marco Braun, Dominic Spata, Simon Roesler
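The key idea here is that one detection head emits a single output vector from which several object properties are read off, rather than one head per property. A minimal sketch under assumed dimensions (16 input features; 3 class scores, a 2-D position offset, and a 2-D velocity as the properties):

```python
import numpy as np

rng = np.random.default_rng(1)

# A single dense "detection head": one weight matrix, one forward pass,
# whose output vector is sliced into several object properties.
w_head = rng.normal(size=(16, 3 + 2 + 2))
b_head = np.zeros(3 + 2 + 2)

def single_head_detect(features):
    out = features @ w_head + b_head  # one head produces everything at once
    return {
        "class_scores": out[:3],
        "position_offset": out[3:5],
        "velocity": out[5:7],
    }

props = single_head_detect(rng.normal(size=16))
```

The property names and sizes are illustrative assumptions; the point is only that all properties share one head's output.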
-
Patent number: 9465979
Abstract: A measurement-target-selecting device that is capable of estimating a face shape with high precision and at low computational cost. In this device, a face texture assessment value calculating part (103) calculates a face texture assessment value representing the degree of match between an input face image and the texture of a face shape candidate, a facial-expression-change-likelihood-calculating part (104) calculates a first likelihood between a face shape constituting a reference and a face shape candidate, a correlation assessment part (105) calculates a first correlation assessment value representing the strength of the correlation between the face texture assessment value and the first likelihood, and a selection part (107) selects, from among the plurality of face shape candidates, as a measurement target a face shape candidate having a first correlation assessment value that is lower than a first threshold.
Type: Grant
Filed: December 4, 2012
Date of Patent: October 11, 2016
Assignee: PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA
Inventors: Sotaro Tsukizawa, Hiroyuki Kubotani, ZhiHeng Niu, Sugiri Pranata
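The selection rule boils down to: for each face shape candidate, measure how strongly its texture assessment values correlate with its expression-change likelihoods, and keep candidates whose correlation falls below a threshold. A hedged sketch, assuming per-candidate histories of both quantities are available as arrays (the threshold value and the use of Pearson correlation are illustrative assumptions):

```python
import numpy as np

def select_targets(texture_vals, likelihoods, threshold=0.5):
    """texture_vals, likelihoods: (num_candidates, num_samples) histories per
    face shape candidate. Returns indices of candidates whose correlation
    assessment value is below the threshold."""
    selected = []
    for i, (t, l) in enumerate(zip(texture_vals, likelihoods)):
        corr = abs(np.corrcoef(t, l)[0, 1])  # first correlation assessment value
        if corr < threshold:
            selected.append(i)  # weakly correlated: keep as measurement target
    return selected

# Candidate 0 is perfectly correlated, candidate 1 is uncorrelated.
texture = np.array([[1, 2, 3, 4], [1, 2, 3, 4]], dtype=float)
likelihood = np.array([[2, 4, 6, 8], [3, 1, 4, 2]], dtype=float)
chosen = select_targets(texture, likelihood, threshold=0.5)
```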
-
Patent number: 9367758
Abstract: Device (10) comprises: a comparison object pixel acquisition unit (433) which, when each pixel of an image is designated as a pixel of interest and a ring-shaped region centered on the pixel of interest is designated a vicinity region, acquires the pixel values of a plurality of comparison object pixels included in the vicinity region; a pixel difference calculation unit (434) which calculates the difference between the pixel value of the pixel of interest and the pixel value of each comparison object pixel; and a local binary pattern generation unit (435) which generates a local binary pattern for each pixel. A plurality of vicinity regions are present for each pixel of interest, and the distances of the vicinity regions are established on the basis of the spatial frequency characteristics of the lens with which the image is photographed.
Type: Grant
Filed: January 11, 2013
Date of Patent: June 14, 2016
Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
Inventors: Yunyun Cao, Hirofumi Nishimura, Sugiri Pranata, Zhiheng Niu
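The core operation is a local binary pattern computed over rings at more than one distance from the pixel of interest. A minimal sketch, assuming an 8-point square ring per radius; in the patented device the radii would be derived from the lens's spatial frequency characteristics, whereas here they are simply fixed values:

```python
import numpy as np

def ring_lbp(img, y, x, radius):
    """8-bit local binary pattern over an 8-point ring at the given radius:
    each bit records whether a comparison pixel is >= the pixel of interest."""
    offsets = [(-radius, 0), (-radius, radius), (0, radius), (radius, radius),
               (radius, 0), (radius, -radius), (0, -radius), (-radius, -radius)]
    center = img[y, x]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy, x + dx] >= center:
            code |= 1 << bit
    return code

# Several vicinity regions per pixel of interest: one LBP code per radius.
radii = [1, 2]  # illustrative; the device derives these from lens characteristics
img = np.arange(49, dtype=float).reshape(7, 7)
codes = [ring_lbp(img, 3, 3, r) for r in radii]
```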
-
Patent number: 9082002
Abstract: A detection device capable of reliably detecting an object to be detected. An intersection region pattern setting unit (106) sets a configuration pattern of a first intersection region pattern group in sequence for each unit image pair. Each intersection region pattern is defined by set image information which denotes the locations and sizes of n regions (where n is a natural number greater than 1) within the respective unit images (e.g., unit image plane coordinates), as well as whether each region is set within either or both of a first unit image and a second unit image. A detection unit (108) detects the object to be detected based on a total feature value relating to each configuration pattern of the first intersection region pattern group, computed by a feature value computation unit (107), and on a strong identification apparatus configured from a plurality of weak identification apparatuses and stored in an identification apparatus storage unit (112).
Type: Grant
Filed: December 20, 2011
Date of Patent: July 14, 2015
Assignee: PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA
Inventors: Sotaro Tsukizawa, Hiroyuki Kubotani, Zhiheng Niu, Sugiri Pranata
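The "strong identification apparatus configured from a plurality of weak identification apparatuses" follows the usual boosting pattern: each weak classifier is a threshold test on one region feature, and the strong classifier is their weighted vote. A sketch under assumed parameters (the thresholds, polarities, and weights below are hypothetical; in practice they come from training):

```python
def weak_classify(feature_value, threshold, polarity):
    # One weak identification apparatus: a threshold stump on a region feature.
    return 1 if polarity * feature_value < polarity * threshold else -1

def strong_classify(feature_values, weak_params):
    # Strong identification apparatus: weighted vote of the weak classifiers.
    score = sum(alpha * weak_classify(feature_values[i], th, pol)
                for i, th, pol, alpha in weak_params)
    return score > 0  # True if the object to be detected is present

# Hypothetical parameters: (feature index, threshold, polarity, vote weight).
weak_params = [(0, 0.5, 1, 0.6), (1, 0.2, -1, 0.4)]
detected = strong_classify([0.3, 0.7], weak_params)
```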
-
Patent number: 9053384
Abstract: Provided is a feature extraction device which, while using local binary patterns, makes it possible to extract image features enabling object detection that is robust against disparities in the photographic environment. A feature extraction unit (440) comprises: a binary pattern generation unit (443) which generates, for each of all pixels or partial pixels in an image, local binary patterns which denote, by bit values, whether the difference in pixel values between the pixel and its surrounding adjacent pixels is greater than or equal to a threshold value; a weighting generation unit (444) which determines, for each generated local binary pattern, a weighting according to the pixel value difference; and a histogram generation unit (445) which applies the determined weightings to the corresponding local binary patterns and generates a histogram which denotes the distribution of the local binary patterns generated from the image.
Type: Grant
Filed: January 13, 2012
Date of Patent: June 9, 2015
Assignee: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD.
Inventors: Yunyun Cao, Hirofumi Nishimura, Sugiri Pranata, Zhiheng Niu
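The three units described here (binary pattern generation, weighting generation, histogram generation) compose naturally into one pass over the image. A minimal sketch, assuming a 3x3 neighborhood and using the summed pixel differences directly as the weighting (the exact weighting function is an assumption, not taken from the patent):

```python
import numpy as np

def weighted_lbp_histogram(img, diff_threshold=0.0):
    """Histogram of 8-bit local binary patterns where each pattern's vote is
    weighted according to the pixel value differences, instead of counting 1."""
    hist = np.zeros(256)
    h, w = img.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            code, weight = 0, 0.0
            for bit, (dy, dx) in enumerate(offsets):
                diff = img[y + dy, x + dx] - img[y, x]
                if diff >= diff_threshold:
                    code |= 1 << bit   # binary pattern generation
                    weight += diff     # weighting from the pixel difference
            hist[code] += weight       # weighted histogram accumulation
    return hist

img = np.arange(16, dtype=float).reshape(4, 4)  # uniform gradient image
hist = weighted_lbp_histogram(img)
```

On the uniform gradient every interior pixel yields the same pattern, so the histogram concentrates all its weighted mass in a single bin.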
-
Publication number: 20150016679
Abstract: Device (10) comprises: a comparison object pixel acquisition unit (433) which, when each pixel of an image is designated as a pixel of interest and a ring-shaped region centered on the pixel of interest is designated a vicinity region, acquires the pixel values of a plurality of comparison object pixels included in the vicinity region; a pixel difference calculation unit (434) which calculates the difference between the pixel value of the pixel of interest and the pixel value of each comparison object pixel; and a local binary pattern generation unit (435) which generates a local binary pattern for each pixel. A plurality of vicinity regions are present for each pixel of interest, and the distances of the vicinity regions are established on the basis of the spatial frequency characteristics of the lens with which the image is photographed.
Type: Application
Filed: January 11, 2013
Publication date: January 15, 2015
Inventors: Yunyun Cao, Hirofumi Nishimura, Sugiri Pranata, Zhiheng Niu
-
Publication number: 20140369571
Abstract: A measurement-target-selecting device that is capable of estimating a face shape with high precision and at low computational cost. In this device, a face texture assessment value calculating part (103) calculates a face texture assessment value representing the degree of match between an input face image and the texture of a face shape candidate, a facial-expression-change-likelihood-calculating part (104) calculates a first likelihood between a face shape constituting a reference and a face shape candidate, a correlation assessment part (105) calculates a first correlation assessment value representing the strength of the correlation between the face texture assessment value and the first likelihood, and a selection part (107) selects, from among the plurality of face shape candidates, as a measurement target a face shape candidate having a first correlation assessment value that is lower than a first threshold.
Type: Application
Filed: December 4, 2012
Publication date: December 18, 2014
Inventors: Sotaro Tsukizawa, Hiroyuki Kubotani, ZhiHeng Niu, Sugiri Pranata
-
Publication number: 20130163870
Abstract: Provided is a feature extraction device which, while using local binary patterns, makes it possible to extract image features enabling object detection that is robust against disparities in the photographic environment. A feature extraction unit (440) comprises: a binary pattern generation unit (443) which generates, for each of all pixels or partial pixels in an image, local binary patterns which denote, by bit values, whether the difference in pixel values between the pixel and its surrounding adjacent pixels is greater than or equal to a threshold value; a weighting generation unit (444) which determines, for each generated local binary pattern, a weighting according to the pixel value difference; and a histogram generation unit (445) which applies the determined weightings to the corresponding local binary patterns and generates a histogram which denotes the distribution of the local binary patterns generated from the image.
Type: Application
Filed: January 13, 2012
Publication date: June 27, 2013
Applicant: Panasonic Corporation
Inventors: Yunyun Cao, Hirofumi Nishimura, Sugiri Pranata, Zhiheng Niu
-
Publication number: 20130142416
Abstract: A detection device capable of reliably detecting an object to be detected. An intersection region pattern setting unit (106) sets a configuration pattern of a first intersection region pattern group in sequence for each unit image pair. Each intersection region pattern is defined by set image information which denotes the locations and sizes of n regions (where n is a natural number greater than 1) within the respective unit images (e.g., unit image plane coordinates), as well as whether each region is set within either or both of a first unit image and a second unit image. A detection unit (108) detects the object to be detected based on a total feature value relating to each configuration pattern of the first intersection region pattern group, computed by a feature value computation unit (107), and on a strong identification apparatus configured from a plurality of weak identification apparatuses and stored in an identification apparatus storage unit (112).
Type: Application
Filed: December 20, 2011
Publication date: June 6, 2013
Applicant: PANASONIC CORPORATION
Inventors: Sotaro Tsukizawa, Hiroyuki Kubotani, Zhiheng Niu, Sugiri Pranata