Patents by Inventor Xiujuan Chai

Xiujuan Chai has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250191361
    Abstract: A factory mushroom picking robot includes a mobile platform, a chassis (100), a lifting device (200), a mushroom stick take-up device (300) and a mushroom picking device (400). The mushroom stick take-up device (300) is configured to take mushroom sticks out of a mushroom rack (4), and mushroom sticks from different layers are retrieved with the aid of the lifting device (200). The mushroom picking device (400) includes a mechanical arm (402) and an execution end (403). Image data of the mushroom sticks are acquired through three depth cameras. Mushroom targets to be picked are identified through multi-view target matching and graded by quality, so as to achieve graded picking. A vision-based graded picking method, a mushroom detection and grading method based on multi-view fusion, and a non-transitory storage medium for executing the corresponding methods are also provided.
    Type: Application
    Filed: February 13, 2025
    Publication date: June 12, 2025
    Inventors: Xiujuan CHAI, Shulong ZHAO, Shuo ZHOU, Ning ZHANG, Qixin SUN, Tan SUN
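The entry above describes identifying mushrooms through multi-view target matching and then grading them for picking. Below is a minimal, hypothetical Python sketch of that idea, not the patented method: matching detections from the three depth cameras by 3D proximity and grading by fused cap diameter are assumptions made for illustration, and all thresholds are invented.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    view: int          # camera index (0, 1, 2)
    x: float           # world-frame centroid, metres
    y: float
    z: float
    diameter: float    # estimated cap diameter, metres

def match_across_views(detections, max_dist=0.02):
    """Greedy matching: detections within max_dist of a cluster's mean
    centroid are treated as the same mushroom seen from another view."""
    clusters = []
    for det in detections:
        for cluster in clusters:
            cx = sum(d.x for d in cluster) / len(cluster)
            cy = sum(d.y for d in cluster) / len(cluster)
            cz = sum(d.z for d in cluster) / len(cluster)
            if math.dist((det.x, det.y, det.z), (cx, cy, cz)) < max_dist:
                cluster.append(det)
                break
        else:
            clusters.append([det])
    return clusters

def grade(cluster):
    """Toy quality grade from the fused cap diameter (thresholds are made up)."""
    mean_d = sum(d.diameter for d in cluster) / len(cluster)
    if mean_d >= 0.045:
        return "A"
    if mean_d >= 0.030:
        return "B"
    return "C"

detections = [
    Detection(0, 0.10, 0.20, 0.35, 0.047),
    Detection(1, 0.11, 0.20, 0.34, 0.049),   # same mushroom seen from view 1
    Detection(2, 0.30, 0.05, 0.33, 0.028),   # a different, smaller mushroom
]
for i, cluster in enumerate(match_across_views(detections)):
    print(f"target {i}: seen in {len(cluster)} view(s), grade {grade(cluster)}")
```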
  • Publication number: 20250176482
    Abstract: This application relates to a collaborative control method for a picking robot based on collaborative picking and collection of mushrooms, including the following steps: picking, by a suction cup, the mushrooms, and conveying, by a built-in conveying apparatus, the mushrooms to a discharge port; dynamically adjusting a movement path of the picking robot; moving a receiving mechanism synchronously with the picking robot and ensuring that the receiving mechanism is aligned with the discharge port; separating, by a separating apparatus, mushrooms that meet a quality standard from unqualified mushrooms, where the qualified and unqualified mushrooms enter a first dropping hopper and a second dropping hopper, respectively; grading the mushrooms in the first dropping hopper a second time, classifying them by volume and diameter; and, when either dropping hopper is about to be fully loaded, triggering the receiving mechanism to operate alternately.
    Type: Application
    Filed: February 12, 2025
    Publication date: June 5, 2025
    Inventors: Ning ZHANG, Xiujuan CHAI, Zhiyu SONG, Shuo ZHOU, Tan SUN
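The abstract above walks through routing qualified and unqualified mushrooms into two dropping hoppers, grading the qualified ones by volume and diameter, and alternating the receiving mechanism when a hopper is nearly full. The sketch below mirrors only that routing logic; the hopper capacity, the near-full margin, and the grading thresholds are hypothetical values, and a real controller would act on sensor feedback.

```python
class Hopper:
    def __init__(self, name, capacity=50):
        self.name, self.capacity, self.count = name, capacity, 0

    def near_full(self, margin=0.9):
        return self.count >= self.capacity * margin

class ReceivingMechanism:
    def __init__(self):
        self.active_bin = 0

    def alternate(self):
        self.active_bin = 1 - self.active_bin   # switch to the other (empty) bin

def route(mushroom, first_hopper, second_hopper, receiver):
    """Send qualified mushrooms to the first hopper (with a second grading step)
    and unqualified ones to the second hopper; alternate when either is nearly full."""
    if mushroom["defect"]:
        second_hopper.count += 1
    else:
        # second-stage grading by volume and diameter (toy thresholds)
        mushroom["grade"] = ("large"
                             if mushroom["diameter"] > 0.04 or mushroom["volume"] > 3e-5
                             else "small")
        first_hopper.count += 1
    if first_hopper.near_full() or second_hopper.near_full():
        receiver.alternate()

first, second, receiver = Hopper("qualified"), Hopper("rejects"), ReceivingMechanism()
route({"defect": False, "diameter": 0.05, "volume": 4e-5}, first, second, receiver)
print(first.count, second.count, receiver.active_bin)
```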
  • Patent number: 10817716
    Abstract: Embodiments provide a process to identify one or more areas containing a hand or hands of one or more subjects in an image. The detection process can start by coarsely locating one or more segments in the image that contain portions of the hand(s) of the subject(s) using a coarse CNN. The detection process can then combine these segments to obtain the one or more areas capturing the hand(s) of the subject(s) in the image. The combined area(s) can then be fed to a grid-based deep neural network to finely detect the area(s) in the image that contain only the hand(s) of the subject(s) captured.
    Type: Grant
    Filed: December 20, 2018
    Date of Patent: October 27, 2020
    Assignees: MIDEA GROUP CO., LTD., SEETATECH (BEIJING) TECHNOLOGY CO., LTD.
    Inventors: Zixuan Yang, Dahai Yu, Zhuang Liu, Junyang Zhou, Xiujuan Chai, Shiguang Shan, Xilin Chen
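The abstract above (and the related application published below) describes a coarse-to-fine pipeline: a coarse CNN proposes segments that may each cover part of a hand, the segments are merged into candidate areas, and a grid-based network refines those areas. The stub below illustrates only that control flow; both networks are stand-in functions and the box coordinates are fabricated.

```python
def coarse_segments(image):
    """Stand-in for the coarse CNN: boxes (x0, y0, x1, y1) that may each
    cover only part of a hand."""
    return [(40, 60, 90, 110), (80, 70, 130, 120)]

def merge_boxes(boxes):
    """Combine the coarse segments into one enclosing candidate area.
    (Simplified: a real merger would group only overlapping segments.)"""
    if not boxes:
        return []
    return [(min(b[0] for b in boxes), min(b[1] for b in boxes),
             max(b[2] for b in boxes), max(b[3] for b in boxes))]

def fine_detect(image, area):
    """Stand-in for the grid-based network: tightens the area around the hand."""
    x0, y0, x1, y1 = area
    pad = 5
    return (x0 + pad, y0 + pad, x1 - pad, y1 - pad)

def detect_hands(image):
    areas = merge_boxes(coarse_segments(image))
    return [fine_detect(image, a) for a in areas]

print(detect_hands(image=None))   # -> [(45, 65, 125, 115)]
```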
  • Publication number: 20190122041
    Abstract: Embodiments provide a process to identify one or more areas containing a hand or hands of one or more subjects in an image. The detection process can start by coarsely locating one or more segments in the image that contain portions of the hand(s) of the subject(s) using a coarse CNN. The detection process can then combine these segments to obtain the one or more areas capturing the hand(s) of the subject(s) in the image. The combined area(s) can then be fed to a grid-based deep neural network to finely detect the area(s) in the image that contain only the hand(s) of the subject(s) captured.
    Type: Application
    Filed: December 20, 2018
    Publication date: April 25, 2019
    Inventors: Zixuan Yang, Dahai Yu, Zhuang Liu, Junyang Zhou, Xiujuan Chai, Shiguang Shan, Xilin Chen
  • Patent number: 8433138
    Abstract: A computer interface may use touch- and non-touch-based gesture detection systems to detect touch and non-touch gestures on a computing device. The systems may each capture an image, and interpret the image as corresponding to a predetermined gesture. The systems may also generate similarity values to indicate the strength of a match between a captured image and corresponding gesture, and the system may combine gesture identifications from both touch- and non-touch-based gesture identification systems to ultimately determine the gesture. A threshold comparison algorithm may be used to apply different thresholds for different gesture detection systems and gesture types.
    Type: Grant
    Filed: October 29, 2008
    Date of Patent: April 30, 2013
    Assignee: Nokia Corporation
    Inventors: Hao Wang, Xiujuan Chai, Kun Yu
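The abstract above describes per-system similarity scores, per-gesture thresholds, and fusion of the touch- and non-touch-based identifications. The snippet below is one plausible reading of that threshold-comparison idea; the gesture names, score values, and thresholds are all invented for illustration.

```python
# thresholds per (detection system, gesture) pair -- all values are invented
THRESHOLDS = {
    ("touch", "swipe"): 0.60, ("touch", "pinch"): 0.70,
    ("camera", "swipe"): 0.75, ("camera", "pinch"): 0.80,
}

def accepted(system, scores):
    """Keep only gestures whose similarity clears that system's own threshold."""
    return {g: s for g, s in scores.items() if s >= THRESHOLDS[(system, g)]}

def fuse(touch_scores, camera_scores):
    """Combine both systems: sum the similarities of gestures accepted by either
    system and pick the strongest candidate."""
    combined = {}
    for system, scores in (("touch", touch_scores), ("camera", camera_scores)):
        for gesture, score in accepted(system, scores).items():
            combined[gesture] = combined.get(gesture, 0.0) + score
    return max(combined, key=combined.get) if combined else None

print(fuse({"swipe": 0.65, "pinch": 0.40}, {"swipe": 0.78, "pinch": 0.30}))  # swipe
```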
  • Patent number: 8379987
    Abstract: A method for providing hand segmentation for gesture analysis may include determining a target region based at least in part on depth range data corresponding to an intensity image. The intensity image may include data descriptive of a hand. The method may further include determining a point of interest of a hand portion of the target region, determining a shape corresponding to a palm region of the hand, and removing a selected portion of the target region to identify a portion of the target region corresponding to the hand. An apparatus and computer program product corresponding to the method are also provided.
    Type: Grant
    Filed: December 30, 2008
    Date of Patent: February 19, 2013
    Assignee: Nokia Corporation
    Inventors: Xiujuan Chai, Yikai Fang, Hao Wang, Kongqiao Wang
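The abstract above names four steps: a depth-based target region, a point of interest on the hand, a palm-region shape, and removal of a selected portion so only the hand remains. The NumPy sketch below follows those steps with illustrative choices (the topmost pixel as point of interest, a fixed-radius circular palm, and a "cut below the palm" rule) that are not taken from the patent; only the depth channel is actually used here.

```python
import numpy as np

def segment_hand(intensity, depth, depth_near=0.3, depth_far=0.6):
    # 1. target region: pixels whose depth lies in the hand's assumed depth range
    target = (depth > depth_near) & (depth < depth_far)
    ys, xs = np.nonzero(target)
    if ys.size == 0:
        return target

    # 2. point of interest: topmost target pixel (fingertips assumed upward)
    top_y = ys.min()

    # 3. palm region: a fixed-radius circle assumed just below the fingertips
    radius = 40
    palm_center_y = top_y + radius

    # 4. remove the selected portion (e.g. the forearm) below the palm circle
    hand = target.copy()
    hand[palm_center_y + radius:, :] = False
    return hand

depth = np.random.uniform(0.2, 1.0, size=(240, 320))
intensity = np.random.rand(240, 320)   # appearance data; unused in this toy version
print("hand pixels:", int(segment_hand(intensity, depth).sum()))
```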
  • Patent number: 8325978
    Abstract: A method for providing adaptive gesture analysis may include dividing a distance range into a plurality of depth ranges, generating a plurality of intensity images for at least two image frames, each of the intensity images providing image data indicative of a presence of objects at a corresponding depth range for a respective image frame, determining motion variation between the two image frames for each corresponding depth range, and determining depth of a target based at least in part on the motion variation. An apparatus and computer program product corresponding to the method are also provided.
    Type: Grant
    Filed: October 30, 2008
    Date of Patent: December 4, 2012
    Assignee: Nokia Corporation
    Inventors: Xiujuan Chai, Kongqiao Wang
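The method above divides a distance range into depth ranges, forms an intensity image per range for two frames, and picks the target depth from the range with the greatest motion variation. A small NumPy sketch of that selection step follows; the number of ranges and the absolute-difference motion measure are illustrative choices rather than details from the patent.

```python
import numpy as np

def depth_slices(depth, intensity, edges):
    """One intensity image per depth range; pixels outside the range are zeroed."""
    return [np.where((depth >= lo) & (depth < hi), intensity, 0.0)
            for lo, hi in zip(edges[:-1], edges[1:])]

def target_depth(frame1, frame2, num_ranges=8, max_dist=2.0):
    """Pick the depth range whose intensity content changes most between frames."""
    edges = np.linspace(0.0, max_dist, num_ranges + 1)
    slices1 = depth_slices(*frame1, edges)
    slices2 = depth_slices(*frame2, edges)
    variation = [np.abs(a - b).sum() for a, b in zip(slices1, slices2)]
    k = int(np.argmax(variation))            # depth range with the most motion
    return edges[k], edges[k + 1]

h, w = 120, 160
frame1 = (np.random.uniform(0, 2, (h, w)), np.random.rand(h, w))  # (depth, intensity)
frame2 = (np.random.uniform(0, 2, (h, w)), np.random.rand(h, w))
print("estimated target depth range:", target_depth(frame1, frame2))
```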
  • Publication number: 20100166258
    Abstract: A method for providing hand segmentation for gesture analysis may include determining a target region based at least in part on depth range data corresponding to an intensity image. The intensity image may include data descriptive of a hand. The method may further include determining a point of interest of a hand portion of the target region, determining a shape corresponding to a palm region of the hand, and removing a selected portion of the target region to identify a portion of the target region corresponding to the hand. An apparatus and computer program product corresponding to the method are also provided.
    Type: Application
    Filed: December 30, 2008
    Publication date: July 1, 2010
    Inventors: Xiujuan Chai, Yikai Fang, Hao Wang, Kongqiao Wang
  • Publication number: 20100111358
    Abstract: A method for providing adaptive gesture analysis may include dividing a distance range into a plurality of depth ranges, generating a plurality of intensity images for at least two image frames, each of the intensity images providing image data indicative of a presence of objects at a corresponding depth range for a respective image frame, determining motion variation between the two image frames for each corresponding depth range, and determining depth of a target based at least in part on the motion variation. An apparatus and computer program product corresponding to the method are also provided.
    Type: Application
    Filed: October 30, 2008
    Publication date: May 6, 2010
    Inventors: Xiujuan Chai, Kongqiao Wang
  • Publication number: 20100104134
    Abstract: A computer interface may use touch- and non-touch-based gesture detection systems to detect touch and non-touch gestures on a computing device. The systems may each capture an image, and interpret the image as corresponding to a predetermined gesture. The systems may also generate similarity values to indicate the strength of a match between a captured image and corresponding gesture, and the system may combine gesture identifications from both touch- and non-touch-based gesture identification systems to ultimately determine the gesture. A threshold comparison algorithm may be used to apply different thresholds for different gesture detection systems and gesture types.
    Type: Application
    Filed: October 29, 2008
    Publication date: April 29, 2010
    Applicant: Nokia Corporation
    Inventors: Hao Wang, Xiujuan Chai, Kun Yu
  • Publication number: 20090167760
    Abstract: Embodiments are directed to creating a triangle mesh by using a distance-minimum criterion on a plurality of feature points detected from an image, computing, based on the triangle mesh, global features that describe a global representation of content of the image, and computing, based on the triangle mesh, local features that describe a local representation of content of the image. The global features may include a triangle distribution scatter of mesh that shows a texture density of the content of the image and a color histogram of mesh region that represents image color information corresponding to a mesh region of interest. The local features may include a definition of each mesh triangle shape via its three angles and a color histogram of each mesh triangle to represent image color information corresponding to each triangle region.
    Type: Application
    Filed: December 27, 2007
    Publication date: July 2, 2009
    Applicant: Nokia Corporation
    Inventors: Kongqiao Wang, Xiujuan Chai
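The abstract above builds a triangle mesh over detected feature points and derives global features (a triangle distribution scatter and mesh-region color histograms) and local features (each triangle's three angles and a per-triangle color histogram). The sketch below uses SciPy's Delaunay triangulation as a stand-in for the distance-minimum criterion and computes only the geometric parts (size scatter and per-triangle angles); the color histograms and any real feature-point detector are omitted.

```python
import numpy as np
from scipy.spatial import Delaunay

def triangle_angles(p0, p1, p2):
    """Interior angles (radians) of one mesh triangle -- the local shape feature."""
    a = np.linalg.norm(p1 - p2)
    b = np.linalg.norm(p0 - p2)
    c = np.linalg.norm(p0 - p1)
    ang_a = np.arccos(np.clip((b**2 + c**2 - a**2) / (2 * b * c), -1.0, 1.0))
    ang_b = np.arccos(np.clip((a**2 + c**2 - b**2) / (2 * a * c), -1.0, 1.0))
    return ang_a, ang_b, np.pi - ang_a - ang_b

points = np.random.rand(50, 2) * 100      # detected feature points (toy data)
mesh = Delaunay(points)                   # stand-in for the distance-minimum mesh

areas, local_angles = [], []
for tri in mesh.simplices:
    p0, p1, p2 = points[tri]
    # shoelace formula for the triangle's area
    areas.append(0.5 * abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                           - (p1[1] - p0[1]) * (p2[0] - p0[0])))
    local_angles.append(triangle_angles(p0, p1, p2))

# global cue: scatter of triangle sizes as a proxy for texture density
print("triangle distribution scatter:", float(np.std(areas)))
print("first triangle's angles (rad):", local_angles[0])
```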