Patents by Inventor Youding Zhu

Youding Zhu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8795077
    Abstract: A personal control mechanism is disclosed. The personal control mechanism includes at least one controller with full functionality, in the sense that the controller needs no attachment and is operable with one hand to fully control an interactive environment being shown. According to one embodiment, the controller has a top surface including a number of buttons and a joystick operable by a finger of one hand, and a bottom surface including a trigger operable by another finger of the same hand. A housing of the controller is sized to fit comfortably into either hand of a user and has a holding portion designed so that the trigger, the buttons, and the joystick are readily operated while the holding portion is held in either hand.
    Type: Grant
    Filed: October 24, 2010
    Date of Patent: August 5, 2014
    Assignee: AiLive Inc.
    Inventors: Charles Musick, Jr., Robert Kay, William Robert Powers, III, Dana Wilkinson, Youding Zhu
  • Patent number: 8795078
    Abstract: Techniques for providing compatibility between two different game controllers are disclosed. When a new or more advanced controller is introduced, it is important that the new controller work with a system originally configured for an existing or old controller. The new controller may provide more functionalities than the old one, and in some cases provides more sensing signals. The new controller is configured to work with the system by transforming its sensing signals to masquerade as though they were coming from the old controller. The transforming of the sensing signals comprises replicating operational characteristics of the old controller, and virtually relocating the sensing signals so that they appear to have been generated by inertial sensors at a certain location in the new controller corresponding to the location of the inertial sensors in the old controller.
    Type: Grant
    Filed: October 24, 2010
    Date of Patent: August 5, 2014
    Assignee: AiLive Inc.
    Inventors: Charles Musick, Jr., Robert Kay, William Robert Powers, III, Dana Wilkinson, Youding Zhu
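The virtual relocation of inertial readings mentioned in the abstract can be illustrated with the standard rigid-body relation: an accelerometer reading measured at one mounting point can be re-expressed at an offset point given the body's angular velocity and angular acceleration. The sketch below is a minimal illustration under that assumption; the function names are hypothetical, and the actual patented transform is not specified in the abstract.

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def relocate_accel(a_old, omega, alpha, r):
    """Re-express an accelerometer reading measured at one point of a rigid
    body as if the sensor were mounted at offset r (same orientation), using
    a_new = a_old + alpha x r + omega x (omega x r), where omega and alpha
    are the body's angular velocity and angular acceleration."""
    tangential = cross(alpha, r)
    centripetal = cross(omega, cross(omega, r))
    return tuple(a + t + c for a, t, c in zip(a_old, tangential, centripetal))
```

For example, a sensor offset 10 cm along x on a body spinning at 2 rad/s about z reads an extra centripetal term even when the reference sensor reads zero.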
  • Patent number: 8781162
    Abstract: Techniques for performing accurate and automatic head pose estimation are disclosed. According to one aspect of the techniques, head pose estimation is integrated with a scale-invariant head tracking method and with facial features detected from the located head in images. Thus, the head pose estimation works efficiently even when there are large translational movements resulting from head motion. Various computation techniques are used to optimize the estimation process so that the head pose estimation can be applied to control one or more objects in a virtual environment and for virtual character gaze control.
    Type: Grant
    Filed: January 5, 2011
    Date of Patent: July 15, 2014
    Assignee: AiLive Inc.
    Inventors: Youding Zhu, Charles Musick, Jr., Robert Kay, William Robert Powers, III, Dana Wilkinson, Stuart Reynolds
  • Patent number: 8351646
    Abstract: A method and apparatus for estimating poses of a subject by grouping data points generated by a depth image into groups representing labeled parts of the subject, and then fitting a model representing the subject to the data points using the grouping of the data points. The grouping of the data points is performed by grouping the data points to segments based on proximity of the data points, and then using constraint conditions to assign the segments to the labeled parts. The model is fitted to the data points by using the grouping of the data points to the labeled parts.
    Type: Grant
    Filed: October 9, 2007
    Date of Patent: January 8, 2013
    Assignee: Honda Motor Co., Ltd.
    Inventors: Kikuo Fujimura, Youding Zhu
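The proximity-based grouping step described in the abstract above can be sketched as transitive distance-threshold clustering: two depth points belong to the same segment if they are within a threshold of each other, directly or through a chain of neighbours. This is a plain BFS illustration; the patent's actual segmentation and constraint-based labeling may differ.

```python
from collections import deque

def group_by_proximity(points, eps):
    """Group 3-D data points into segments: points i and j share a segment
    if they are within eps of each other (transitively). Returns one
    segment label per point. O(n^2) BFS sketch, adequate for small clouds."""
    n = len(points)

    def near(i, j):
        return sum((a - b) ** 2 for a, b in zip(points[i], points[j])) <= eps * eps

    labels = [-1] * n
    seg = 0
    for start in range(n):
        if labels[start] != -1:
            continue
        labels[start] = seg
        queue = deque([start])
        while queue:
            i = queue.popleft()
            for j in range(n):
                if labels[j] == -1 and near(i, j):
                    labels[j] = seg
                    queue.append(j)
        seg += 1
    return labels
```

The resulting segments would then be assigned to labeled body parts using constraint conditions, as the abstract describes.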
  • Publication number: 20120256835
    Abstract: Techniques for using a motion sensitive device as a controller are disclosed. A motion controller as an input/control device is used to control an existing electronic device (a.k.a. controlled device) previously configured to take inputs from a pre-defined controlling device. The signals from the input device are in a different form from those of the pre-defined controlling device. According to one aspect of the present invention, the controlled device was designed to respond to signals from a pre-defined controlling device (e.g., a touch-screen device). The inputs from the motion controller are converted into touch-screen-like signals that are then sent to the controlled device, or to programs being executed in the controlled device, to cause the behavior of the controlled device to change or respond, without reconfiguration of the applications running on the controlled device.
    Type: Application
    Filed: September 30, 2011
    Publication date: October 11, 2012
    Applicant: AiLive Inc.
    Inventors: Charles Musick, Jr., Robert Kay, Stuart Reynolds, Dana Wilkinson, Anupam Chakravorty, William Robert Powers, III, Wei Yen, Youding Zhu
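The conversion of motion-controller input into touch-screen-like signals, as described in the abstract above, can be illustrated with a simple pointer mapping from controller yaw and pitch to a clamped pixel coordinate. All names and the specific yaw/pitch-to-pixel formula below are hypothetical; the patent does not specify this mapping.

```python
def motion_to_touch(yaw, pitch, width=1280, height=720, fov=0.5):
    """Map controller yaw/pitch (radians) to a touch-screen coordinate.
    The screen centre corresponds to (yaw=0, pitch=0); `fov` is the
    half-angle at which the pointer reaches the screen edge. Out-of-range
    angles are clamped to the screen bounds."""
    x = (yaw / fov + 1.0) * 0.5 * (width - 1)
    y = (1.0 - pitch / fov) * 0.5 * (height - 1)

    def clamp(v, hi):
        return max(0, min(hi, int(round(v))))

    return clamp(x, width - 1), clamp(y, height - 1)
```

Synthetic touch events built from such coordinates could then be injected into the controlled device's input stream, which is the masquerading idea both AiLive publications describe.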
  • Publication number: 20120208639
    Abstract: Techniques for using a variety of motion sensitive signals to remotely control an existing electronic device or system are described. Output signals from a motion sensitive device may be in a different form from those of a pre-defined controlling device. According to one aspect of the present invention, a controlled device is designed to respond to signals from a touch screen or touch-screen-like signals. Motion sensitive inputs from the motion sensitive device are converted into touch-screen-like signals that are coupled to the controlled device, or to programs being executed in the controlled device, subsequently causing the behavior of the controlled device to change or respond, without reconfiguration of the applications running on the controlled device.
    Type: Application
    Filed: December 1, 2011
    Publication date: August 16, 2012
    Applicant: AiLive Inc.
    Inventors: Stuart Reynolds, Charles Musick, Jr., Dana Wilkinson, Wei Yen, Youding Zhu
  • Publication number: 20120169887
    Abstract: Techniques for performing accurate and automatic head pose estimation are disclosed. According to one aspect of the techniques, head pose estimation is integrated with a scale-invariant head tracking method and with facial features detected from the located head in images. Thus, the head pose estimation works efficiently even when there are large translational movements resulting from head motion. Various computation techniques are used to optimize the estimation process so that the head pose estimation can be applied to control one or more objects in a virtual environment and for virtual character gaze control.
    Type: Application
    Filed: January 5, 2011
    Publication date: July 5, 2012
    Applicant: AiLive Inc.
    Inventors: Youding Zhu, Charles Musick, Jr., Robert Kay, William Robert Powers, III, Dana Wilkinson, Stuart Reynolds
  • Patent number: 8170287
    Abstract: A system, method, and computer program product for avoiding collision of a body segment with unconnected structures in an articulated system are described. A virtual surface is constructed surrounding the actual surface of the body segment. Distances between the body segment and unconnected structures are monitored. In response to an unconnected structure penetrating the virtual surface, a redirected joint motion is determined that prevents the unconnected structure from penetrating deeper into the virtual surface. The body segment is redirected based on the redirected joint motion to avoid colliding with the unconnected structure.
    Type: Grant
    Filed: October 24, 2008
    Date of Patent: May 1, 2012
    Assignee: Honda Motor Co., Ltd.
    Inventors: Behzad Dariush, Youding Zhu, Kikuo Fujimura
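One common way to realize a "redirected motion" of the kind the abstract above describes is, in Cartesian terms, to cancel the velocity component directed toward the penetrating structure once it is inside the virtual surface. The sketch below is a simplified point-wise illustration under that assumption, not the patented whole-body joint-space algorithm.

```python
def redirect_velocity(p_seg, p_obs, v, margin):
    """If an obstacle point p_obs penetrates the virtual surface (a shell of
    thickness `margin` around the segment point p_seg), remove the component
    of the segment velocity v that carries it toward the obstacle.
    Otherwise the velocity is returned unchanged."""
    d = [a - b for a, b in zip(p_obs, p_seg)]          # segment -> obstacle
    dist = sum(c * c for c in d) ** 0.5
    if dist >= margin or dist == 0.0:
        return list(v)                                  # outside the shell
    n = [c / dist for c in d]                           # unit direction to obstacle
    toward = sum(a * b for a, b in zip(v, n))           # inward speed
    if toward <= 0.0:
        return list(v)                                  # already moving away
    return [a - toward * b for a, b in zip(v, n)]       # cancel inward motion
```

In the articulated setting of the patent, a correction like this would be mapped back to joint velocities rather than applied directly in Cartesian space.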
  • Patent number: 8031906
    Abstract: A system for estimating orientation of a target based on real-time video data uses depth data included in the video to determine the estimated orientation. The system includes a time-of-flight camera capable of depth sensing within a depth window. The camera outputs hybrid image data (color and depth). Segmentation is performed to determine the location of the target within the image. Tracking is used to follow the target location from frame to frame. During a training mode, a target-specific training image set is collected with a corresponding orientation associated with each frame. During an estimation mode, a classifier compares new images with the stored training set to determine an estimated orientation. A motion estimation approach uses an accumulated rotation/translation parameter calculation based on optical flow and depth constraints. The parameters are reset to a reference value each time the image corresponds to a dominant orientation.
    Type: Grant
    Filed: October 2, 2009
    Date of Patent: October 4, 2011
    Assignee: Honda Motor Co., Ltd.
    Inventors: Kikuo Fujimura, Youding Zhu
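The estimation mode described in the abstract above compares new images against a stored training set with known orientations. A minimal nearest-neighbour sketch of such a classifier is shown below; feature extraction is omitted and all names are hypothetical, since the abstract does not specify the distance measure or feature representation.

```python
def estimate_orientation(query, training):
    """Nearest-neighbour sketch of the estimation mode: `training` is a list
    of (feature_vector, orientation) pairs collected during training mode.
    The orientation of the closest stored frame is returned."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(training, key=lambda pair: dist2(query, pair[0]))[1]
```

The accumulated rotation/translation parameters mentioned in the abstract would then refine this coarse estimate between the stored reference orientations.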
  • Publication number: 20100034427
    Abstract: A system for estimating orientation of a target based on real-time video data uses depth data included in the video to determine the estimated orientation. The system includes a time-of-flight camera capable of depth sensing within a depth window. The camera outputs hybrid image data (color and depth). Segmentation is performed to determine the location of the target within the image. Tracking is used to follow the target location from frame to frame. During a training mode, a target-specific training image set is collected with a corresponding orientation associated with each frame. During an estimation mode, a classifier compares new images with the stored training set to determine an estimated orientation. A motion estimation approach uses an accumulated rotation/translation parameter calculation based on optical flow and depth constraints. The parameters are reset to a reference value each time the image corresponds to a dominant orientation.
    Type: Application
    Filed: October 2, 2009
    Publication date: February 11, 2010
    Inventors: Kikuo Fujimura, Youding Zhu
  • Patent number: 7620202
    Abstract: A system for estimating orientation of a target based on real-time video data uses depth data included in the video to determine the estimated orientation. The system includes a time-of-flight camera capable of depth sensing within a depth window. The camera outputs hybrid image data (color and depth). Segmentation is performed to determine the location of the target within the image. Tracking is used to follow the target location from frame to frame. During a training mode, a target-specific training image set is collected with a corresponding orientation associated with each frame. During an estimation mode, a classifier compares new images with the stored training set to determine an estimated orientation. A motion estimation approach uses an accumulated rotation/translation parameter calculation based on optical flow and depth constraints. The parameters are reset to a reference value each time the image corresponds to a dominant orientation.
    Type: Grant
    Filed: June 14, 2004
    Date of Patent: November 17, 2009
    Assignees: Honda Motor Co., Ltd., The Ohio State University Research Foundation
    Inventors: Kikuo Fujimura, Youding Zhu
  • Publication number: 20090175540
    Abstract: A system, method, and computer program product for estimating upper body human pose are described. According to one aspect, a plurality of anatomical features are detected in a depth image of the human actor. The method detects a head, neck, and torso (H-N-T) template in the depth image, and detects the features in the depth image based on the H-N-T template. A pose of a human model is estimated based on the detected features and kinematic constraints of the human model.
    Type: Application
    Filed: December 19, 2008
    Publication date: July 9, 2009
    Applicant: Honda Motor Co., Ltd.
    Inventors: Behzad Dariush, Youding Zhu, Kikuo Fujimura
  • Publication number: 20090074252
    Abstract: A system, method, and computer program product for avoiding collision of a body segment with unconnected structures in an articulated system are described. A virtual surface is constructed surrounding the actual surface of the body segment. Distances between the body segment and unconnected structures are monitored. In response to an unconnected structure penetrating the virtual surface, a redirected joint motion is determined that prevents the unconnected structure from penetrating deeper into the virtual surface. The body segment is redirected based on the redirected joint motion to avoid colliding with the unconnected structure.
    Type: Application
    Filed: October 24, 2008
    Publication date: March 19, 2009
    Applicant: Honda Motor Co., Ltd.
    Inventors: Behzad Dariush, Youding Zhu, Kikuo Fujimura
  • Publication number: 20080152191
    Abstract: A method and apparatus for estimating poses of a subject by grouping data points generated by a depth image into groups representing labeled parts of the subject, and then fitting a model representing the subject to the data points using the grouping of the data points. The grouping of the data points is performed by grouping the data points to segments based on proximity of the data points, and then using constraint conditions to assign the segments to the labeled parts. The model is fitted to the data points by using the grouping of the data points to the labeled parts.
    Type: Application
    Filed: October 9, 2007
    Publication date: June 26, 2008
    Applicant: Honda Motor Co., Ltd.
    Inventors: Kikuo Fujimura, Youding Zhu
  • Patent number: 7317836
    Abstract: Methods and systems for estimating a pose of a subject are described. The subject can be a human, an animal, a robot, or the like. A camera receives depth information associated with the subject, a pose estimation module determines a pose or action of the subject from images, and an interaction module outputs a response to the perceived pose or action. The pose estimation module separates portions of the image containing the subject into classified and unclassified portions. The portions can be segmented using k-means clustering. The classified portions can be known objects, such as a head and a torso, that are tracked across the images. The unclassified portions are swept across the x and y axes to identify local minimums and local maximums. The critical points are derived from the local minimums and local maximums. Potential joint sections are identified by connecting various critical points, and the joint sections having sufficient probability of corresponding to an object on the subject are selected.
    Type: Grant
    Filed: March 17, 2006
    Date of Patent: January 8, 2008
    Assignee: Honda Motor Co., Ltd.
    Inventors: Kikuo Fujimura, Youding Zhu
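The sweep for local minimums and maximums described in the abstract above can be illustrated on a 1-D silhouette profile, such as the body width per scanline. The sketch below finds strict interior extrema only, with boundary handling simplified; it is an illustration of the idea, not the patented procedure.

```python
def critical_points(profile):
    """Sweep a 1-D silhouette profile (e.g. body width per scanline) and
    return the indices of strict local minima and maxima, the 'critical
    points' from which candidate joint sections would be formed."""
    minima, maxima = [], []
    for i in range(1, len(profile) - 1):
        if profile[i] < profile[i - 1] and profile[i] < profile[i + 1]:
            minima.append(i)
        elif profile[i] > profile[i - 1] and profile[i] > profile[i + 1]:
            maxima.append(i)
    return minima, maxima
```

Connecting such critical points yields candidate joint sections, which the method then filters by their probability of corresponding to a body part.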
  • Publication number: 20060274947
    Abstract: Methods and systems for estimating a pose of a subject are described. The subject can be a human, an animal, a robot, or the like. A camera receives depth information associated with the subject, a pose estimation module determines a pose or action of the subject from images, and an interaction module outputs a response to the perceived pose or action. The pose estimation module separates portions of the image containing the subject into classified and unclassified portions. The portions can be segmented using k-means clustering. The classified portions can be known objects, such as a head and a torso, that are tracked across the images. The unclassified portions are swept across the x and y axes to identify local minimums and local maximums. The critical points are derived from the local minimums and local maximums. Potential joint sections are identified by connecting various critical points, and the joint sections having sufficient probability of corresponding to an object on the subject are selected.
    Type: Application
    Filed: March 17, 2006
    Publication date: December 7, 2006
    Inventors: Kikuo Fujimura, Youding Zhu
  • Publication number: 20050058337
    Abstract: A system for estimating orientation of a target based on real-time video data uses depth data included in the video to determine the estimated orientation. The system includes a time-of-flight camera capable of depth sensing within a depth window. The camera outputs hybrid image data (color and depth). Segmentation is performed to determine the location of the target within the image. Tracking is used to follow the target location from frame to frame. During a training mode, a target-specific training image set is collected with a corresponding orientation associated with each frame. During an estimation mode, a classifier compares new images with the stored training set to determine an estimated orientation. A motion estimation approach uses an accumulated rotation/translation parameter calculation based on optical flow and depth constraints. The parameters are reset to a reference value each time the image corresponds to a dominant orientation.
    Type: Application
    Filed: June 14, 2004
    Publication date: March 17, 2005
    Inventors: Kikuo Fujimura, Youding Zhu