Patents by Inventor Crystal Chao

Crystal Chao is named as an inventor on the following filings. This listing includes patent applications that are still pending as well as patents that have been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11957807
    Abstract: A cleaning robot may determine a three-dimensional model of a physical environment based on data collected from one or more sensors. The cleaning robot may then identify a surface within the physical environment to clean. Having identified that surface, the robot may autonomously navigate to a location proximate to the surface, position an ultraviolet light source in proximity to the surface, and activate the ultraviolet light source for a period of time.
    Type: Grant
    Filed: March 22, 2021
    Date of Patent: April 16, 2024
    Assignee: Robust AI, Inc.
    Inventors: Rodney Allen Brooks, Dylan Bourgeois, Crystal Chao, Alexander Jay Bruen Trevor, Mohamed Rabie Amer, Anthony Sean Jules, Gary Fred Marcus
  • Patent number: 11717587
    Abstract: A model of a physical environment may be determined based at least in part on sensor data collected by one or more sensors at a robot. The model may include a plurality of constraints and a plurality of data values. A trajectory through the physical environment may be determined for an ultraviolet end effector coupled with the robot to clean one or more surfaces in the physical environment. The ultraviolet end effector may include one or more ultraviolet light sources. The ultraviolet end effector may be moved along the trajectory.
    Type: Grant
    Filed: March 19, 2021
    Date of Patent: August 8, 2023
    Assignee: Robust AI, Inc.
    Inventors: Alexander Jay Bruen Trevor, Dylan Bourgeois, Marina Kollmitz, Crystal Chao
  • Publication number: 20220347860
    Abstract: An interactive robot having a mechanical torso, limbs, and a head assembled for movement with multiple degrees of freedom to enable life-like movements and responses. The robot's head may further include LED displays serving as the eyes and mouth, and a speaker associated with the mouth. These features enable life-like audio and visual responses, including complex facial expressions and conversational audio interaction.
    Type: Application
    Filed: July 15, 2022
    Publication date: November 3, 2022
    Inventors: Crystal Chao, Kristopher Li
  • Publication number: 20210346557
    Abstract: A robot may identify a human located proximate to the robot in a physical environment based on sensor data captured from one or more sensors on the robot. A trajectory of the human through space may be predicted. When the predicted trajectory of the human intersects with a current path of the robot, an updated path to a destination location in the environment may be determined so as to avoid a collision between the robot and the human along the predicted trajectory. The robot may then move along the determined path.
    Type: Application
    Filed: March 19, 2021
    Publication date: November 11, 2021
    Applicant: Robust AI, Inc.
    Inventors: Rodney Allen Brooks, Dylan Bourgeois, Crystal Chao, Alexander Jay Bruen Trevor, Mohamed Rabie Amer, Anthony Sean Jules, Gary Fred Marcus, Michelle Ho
  • Publication number: 20210346543
    Abstract: A cleaning robot may determine a three-dimensional model of a physical environment based on data collected from one or more sensors. The cleaning robot may then identify a surface within the physical environment to clean. Having identified that surface, the robot may autonomously navigate to a location proximate to the surface, position an ultraviolet light source in proximity to the surface, and activate the ultraviolet light source for a period of time.
    Type: Application
    Filed: March 22, 2021
    Publication date: November 11, 2021
    Applicant: Robust AI, Inc.
    Inventors: Rodney Allen Brooks, Dylan Bourgeois, Crystal Chao, Alexander Jay Bruen Trevor, Mohamed Rabie Amer, Anthony Sean Jules, Gary Fred Marcus
  • Publication number: 20210347048
    Abstract: A model of a physical environment may be determined based at least in part on sensor data collected by one or more sensors at a robot. The model may include a plurality of constraints and a plurality of data values. A trajectory through the physical environment may be determined for an ultraviolet end effector coupled with the robot to clean one or more surfaces in the physical environment. The ultraviolet end effector may include one or more ultraviolet light sources. The ultraviolet end effector may be moved along the trajectory.
    Type: Application
    Filed: March 19, 2021
    Publication date: November 11, 2021
    Applicant: Robust AI, Inc.
    Inventors: Alexander Jay Bruen Trevor, Dylan Bourgeois, Marina Kollmitz, Crystal Chao
  • Patent number: 11027425
    Abstract: Methods, apparatus, systems, and computer-readable media are provided for enabling users to approximately identify a space within an environment inhabited by a plurality of objects that the user wishes for a robot to manipulate. In various implementations, an approximation of a space within an environment may be identified based on user input. The actual space within the environment may then be extrapolated based at least in part on the approximation and one or more attributes of the environment. A plurality of objects that are co-present within the space and that are to be manipulated by a robot may be identified. The robot may then be operated to manipulate the identified plurality of objects.
    Type: Grant
    Filed: July 20, 2018
    Date of Patent: June 8, 2021
    Assignee: X DEVELOPMENT LLC
    Inventor: Crystal Chao
  • Patent number: 10166676
    Abstract: Some implementations are directed to methods and apparatus for determining, based on sensor data generated during physical manipulation of a robot by a user, one or more grasp parameters to associate with an object model. Some implementations are directed to methods and apparatus for determining control commands to provide to actuator(s) of a robot to attempt a grasp of an object, where those control commands are determined based on grasp parameters associated with an object model that conforms to the object. The grasp parameter(s) associated with an object model may include end effector pose(s) that each define a pose of a grasping end effector relative to the object model and/or translational force measure(s) that each indicate force applied to an object by a grasping end effector, where the force is at least partially the result of translation of an entirety of the grasping end effector.
    Type: Grant
    Filed: June 8, 2016
    Date of Patent: January 1, 2019
    Assignee: X DEVELOPMENT LLC
    Inventors: Nicolas Henry Hudson, Crystal Chao
  • Patent number: 10058997
    Abstract: Methods, apparatus, systems, and computer-readable media are provided for enabling users to approximately identify a space within an environment inhabited by a plurality of objects that the user wishes for a robot to manipulate. In various implementations, an approximation of a space within an environment may be identified based on user input. The actual space within the environment may then be extrapolated based at least in part on the approximation and one or more attributes of the environment. A plurality of objects that are co-present within the space and that are to be manipulated by a robot may be identified. The robot may then be operated to manipulate the identified plurality of objects.
    Type: Grant
    Filed: June 16, 2016
    Date of Patent: August 28, 2018
    Assignee: X DEVELOPMENT LLC
    Inventor: Crystal Chao
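
The surface-disinfection loop described in patent 11957807 (and its application, publication 20210346543) can be summarized as: build a three-dimensional model from sensor data, identify a surface to clean, navigate near it, position the ultraviolet light source, and activate it for a period of time. The minimal Python sketch below illustrates that loop under stated assumptions; the class and method names, the stand-in "model" (a flat list of surfaces), and the 30-second dwell time are invented for illustration and are not specified by the patent.

    # Hypothetical sketch of the disinfection loop; names and the dwell time
    # are illustrative assumptions, not taken from patent 11957807.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    Point = Tuple[float, float, float]

    @dataclass
    class Surface:
        name: str
        centroid: Point
        cleaned: bool = False

    @dataclass
    class CleaningRobot:
        pose: Point = (0.0, 0.0, 0.0)
        log: List[str] = field(default_factory=list)

        def identify_surfaces(self, model: List[Surface]) -> List[Surface]:
            # In the patent, surfaces come from a 3-D model built from sensor
            # data; here the "model" is simply a list of known surfaces.
            return [s for s in model if not s.cleaned]

        def clean(self, surface: Surface, dwell_s: float = 30.0) -> None:
            self.pose = surface.centroid   # stand-in for navigating proximate to the surface
            self.log.append(f"UV on at {surface.name} for {dwell_s:.0f}s")
            surface.cleaned = True         # stand-in for activating the UV source for a dwell period

    model = [Surface("door handle", (1.0, 2.0, 1.1)), Surface("desk", (3.5, 0.5, 0.8))]
    robot = CleaningRobot()
    for s in robot.identify_surfaces(model):
        robot.clean(s)
    print(robot.log)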
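
Patent 11717587 (and publication 20210347048) describes planning a trajectory through the environment for an ultraviolet end effector so that it passes over the surfaces to be cleaned. The sketch below shows one simple way such a trajectory could be generated, assuming an axis-aligned rectangular surface and a raster sweep at a fixed standoff; the sweep strategy, standoff distance, and step size are assumptions, not taken from the patent.

    # Hypothetical raster-sweep trajectory for a UV end effector.
    from typing import List, Tuple

    Pose = Tuple[float, float, float]  # x, y, z position of the end effector

    def plan_uv_sweep(surface_corners: Tuple[Pose, Pose],
                      standoff_m: float = 0.05,
                      step_m: float = 0.10) -> List[Pose]:
        """Raster over an axis-aligned rectangular surface at a fixed standoff."""
        (x0, y0, z), (x1, y1, _) = surface_corners
        waypoints: List[Pose] = []
        y, row = y0, 0
        while y <= y1 + 1e-9:
            xs = [x0, x1] if row % 2 == 0 else [x1, x0]   # alternate sweep direction
            for x in xs:
                waypoints.append((x, y, z + standoff_m))
            y, row = y + step_m, row + 1
        return waypoints

    # The robot would then move the UV end effector along these waypoints
    # while the light source is active.
    trajectory = plan_uv_sweep(((0.0, 0.0, 0.8), (0.6, 0.4, 0.8)))
    print(len(trajectory), "waypoints, first:", trajectory[0])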
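
Publication 20210346557 describes predicting a nearby human's trajectory and computing an updated path when the prediction intersects the robot's current path. A minimal sketch of that decision follows, assuming a constant-velocity predictor, a fixed clearance radius, and a toy "replanner" that simply offsets the remaining path; none of these specifics come from the application.

    # Hypothetical predict-check-replan loop for human-aware navigation.
    from typing import List, Tuple

    XY = Tuple[float, float]

    def predict_human(pos: XY, vel: XY, horizon_s: float = 3.0, dt: float = 0.5) -> List[XY]:
        """Constant-velocity extrapolation of the observed human motion."""
        steps = int(horizon_s / dt)
        return [(pos[0] + vel[0] * dt * k, pos[1] + vel[1] * dt * k) for k in range(steps + 1)]

    def paths_conflict(robot_path: List[XY], human_path: List[XY], clearance_m: float = 0.7) -> bool:
        # Conflict if any robot waypoint comes within the clearance radius of
        # any predicted human position.
        return any((rx - hx) ** 2 + (ry - hy) ** 2 < clearance_m ** 2
                   for rx, ry in robot_path for hx, hy in human_path)

    def replan(robot_path: List[XY], detour_m: float = 1.0) -> List[XY]:
        """Toy replanner: shift the remaining path sideways by a fixed detour."""
        return [(x, y + detour_m) for x, y in robot_path]

    robot_path = [(float(x), 0.0) for x in range(6)]
    human_path = predict_human(pos=(5.0, -2.0), vel=(0.0, 1.0))
    if paths_conflict(robot_path, human_path):
        robot_path = replan(robot_path)
    print(robot_path)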
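
Patent 10166676 associates grasp parameters, namely an end-effector pose relative to an object model and a translational force measure, with that model after a user physically guides the robot, and reuses them when an object conforming to the model is later detected. The sketch below shows one hypothetical way such parameters might be recorded and looked up; the data structures and the in-memory "database" are assumptions, not a format defined by the patent.

    # Hypothetical storage of grasp parameters learned from user demonstration.
    from dataclasses import dataclass
    from typing import Dict, Tuple

    Pose = Tuple[float, float, float, float, float, float]  # x, y, z, roll, pitch, yaw

    @dataclass
    class GraspParameters:
        end_effector_pose: Pose       # pose of the gripper relative to the object model
        translational_force_n: float  # force applied by translating the whole gripper

    grasp_db: Dict[str, GraspParameters] = {}

    def record_demonstration(object_model_id: str, pose: Pose, force_n: float) -> None:
        """Store grasp parameters measured during a user-guided (kinesthetic) grasp."""
        grasp_db[object_model_id] = GraspParameters(pose, force_n)

    def grasp_parameters_for(detected_object_model_id: str) -> GraspParameters:
        """Look up grasp parameters for an object that conforms to a known model."""
        return grasp_db[detected_object_model_id]

    record_demonstration("mug_v1", (0.0, 0.0, 0.12, 0.0, 1.57, 0.0), force_n=4.0)
    print(grasp_parameters_for("mug_v1"))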
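
Patents 11027425 and 10058997 describe letting a user roughly indicate a space, extrapolating the actual space from one or more attributes of the environment, and then operating the robot on the objects co-present in that space. The sketch below illustrates the idea in two dimensions, assuming the environment attribute used for extrapolation is a known table top to which the rough user-drawn region is clamped; the geometry, object names, and clamping rule are illustrative assumptions only.

    # Hypothetical extrapolation of a rough user-indicated region and selection
    # of the objects inside it.
    from typing import Dict, List, Tuple

    Rect = Tuple[float, float, float, float]  # x_min, y_min, x_max, y_max

    def extrapolate_space(user_sketch: Rect, table: Rect) -> Rect:
        """Clamp the rough user-drawn region to the table surface it overlaps."""
        return (max(user_sketch[0], table[0]), max(user_sketch[1], table[1]),
                min(user_sketch[2], table[2]), min(user_sketch[3], table[3]))

    def objects_in_space(objects: Dict[str, Tuple[float, float]], space: Rect) -> List[str]:
        x0, y0, x1, y1 = space
        return [name for name, (x, y) in objects.items() if x0 <= x <= x1 and y0 <= y <= y1]

    table = (0.0, 0.0, 1.0, 0.6)
    rough_region = (0.3, -0.2, 1.4, 0.5)          # sloppy user gesture spilling off the table
    objects = {"cup": (0.5, 0.3), "plate": (0.9, 0.4), "phone": (0.1, 0.1)}

    space = extrapolate_space(rough_region, table)
    for name in objects_in_space(objects, space):
        print("manipulate", name)   # the robot would then be operated on each selected object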