Patents by Inventor Paul Bechard

Paul Bechard has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11938638
Abstract: Active utilization of a robotic simulator in control of one or more real world robots. A simulated environment of the robotic simulator can be configured to reflect a real world environment in which a real robot is currently disposed, or will be disposed. The robotic simulator can then be used to determine a sequence of robotic actions for use by the real world robot(s) in performing at least part of a robotic task. The sequence of robotic actions can be applied to a simulated robot of the robotic simulator to generate a sequence of anticipated simulated state data instances. The real robot can be controlled to implement the sequence of robotic actions. The implementation of one or more of the robotic actions can be contingent on a real state data instance having at least a threshold degree of similarity to a corresponding one of the anticipated simulated state data instances.
    Type: Grant
    Filed: June 3, 2021
    Date of Patent: March 26, 2024
    Assignee: GOOGLE LLC
    Inventors: Yunfei Bai, Tigran Gasparian, Brent Austin, Andreas Christiansen, Matthew Bennice, Paul Bechard
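The similarity-gated execution described in this abstract can be sketched as follows. All names, the state representation, and the cosine-style similarity metric are illustrative assumptions, not details from the patent:

```python
import numpy as np

def similarity(real_state, sim_state):
    """Hypothetical similarity metric between two state vectors (cosine-style)."""
    denom = float(np.linalg.norm(real_state) * np.linalg.norm(sim_state)) or 1.0
    return float(np.dot(real_state, sim_state)) / denom

def execute_gated(actions, anticipated_states, get_real_state, apply_action,
                  threshold=0.95):
    """Apply each planned action only while the real robot's state stays
    sufficiently similar to the state the simulator anticipated."""
    for action, expected in zip(actions, anticipated_states):
        if similarity(get_real_state(), expected) < threshold:
            return False  # diverged from simulation: halt, e.g. for replanning
        apply_action(action)
    return True
```

The key design point is that each real-world action is contingent on the current real state matching the corresponding anticipated simulated state; when the check fails, execution stops rather than continuing blindly.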
  • Publication number: 20240058954
Abstract: Implementations are provided for training robot control policies using augmented reality (AR) sensor data comprising physical sensor data injected with virtual objects. In various implementations, physical pose(s) of physical sensor(s) of a physical robot operating in a physical environment may be determined. Virtual pose(s) of virtual object(s) in the physical environment may also be determined. Based on the physical poses and virtual poses, the virtual object(s) may be injected into sensor data generated by the one or more physical sensors to generate AR sensor data. The physical robot may be operated in the physical environment based on the AR sensor data and a robot control policy. The robot control policy may be trained based on virtual interactions between the physical robot and the one or more virtual objects.
    Type: Application
    Filed: August 18, 2022
    Publication date: February 22, 2024
    Inventors: Matthew Bennice, Paul Bechard, Joséphine Simon, Jiayi Lin
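The injection step can be illustrated with a minimal 2D sketch: given a physical sensor pose and a virtual object pose, the virtual object is composited into a physical depth scan wherever it would occlude the real scene. The geometry and all names are illustrative assumptions:

```python
import math
import numpy as np

def inject_virtual_object(depth_scan, fov, camera_pose, object_pos):
    """Overlay a virtual point object into a 1-D physical depth scan.

    depth_scan: physical depths per ray (meters), spanning the field of view.
    camera_pose: (x, y, yaw) of the physical sensor in the world frame.
    object_pos: (x, y) of the virtual object in the world frame.
    """
    cx, cy, yaw = camera_pose
    dx, dy = object_pos[0] - cx, object_pos[1] - cy
    rng = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx) - yaw  # angle relative to sensor heading
    half = fov / 2.0
    if abs(bearing) > half:
        return depth_scan  # virtual object outside the field of view
    idx = int((bearing + half) / fov * (len(depth_scan) - 1))
    out = np.array(depth_scan, dtype=float)
    out[idx] = min(out[idx], rng)  # virtual surface occludes anything farther
    return out
```

A policy consuming `out` instead of `depth_scan` then "sees" the virtual object as if it were physically present, which is what allows training on virtual interactions.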
  • Publication number: 20240033904
    Abstract: Implementations are provided for operably coupling multiple robot controllers to a single virtual environment, e.g., to generate training examples for training machine learning model(s). In various implementations, a virtual environment may be simulated that includes an interactive object and a plurality of robot avatars that are controlled independently and contemporaneously by a corresponding plurality of robot controllers that are external from the virtual environment. Sensor data generated from a perspective of each robot avatar of the plurality of robot avatars may be provided to a corresponding robot controller. Joint commands that cause actuation of one or more joints of each robot avatar may be received from the corresponding robot controller. Joint(s) of each robot avatar may be actuated pursuant to corresponding joint commands. The actuating may cause two or more of the robot avatars to act upon the interactive object in the virtual environment.
    Type: Application
    Filed: October 12, 2023
    Publication date: February 1, 2024
    Inventors: Matthew Bennice, Paul Bechard
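The coupling pattern in this abstract, one shared virtual environment driven by several external controllers, might be sketched as below. The environment, avatar state, and controller interface are all illustrative assumptions:

```python
class VirtualEnvironment:
    """Minimal sketch: one shared simulated scene, many external controllers."""

    def __init__(self, controllers):
        self.controllers = controllers
        self.avatars = [{"joints": [0.0, 0.0]} for _ in controllers]
        self.object_pos = 0.0  # the shared interactive object

    def sense(self, avatar):
        # Sensor data rendered from this avatar's own perspective.
        return {"joints": list(avatar["joints"]), "object": self.object_pos}

    def step(self):
        # Each external controller receives its avatar's sensor data and
        # returns joint commands; the avatars then act on the shared object.
        for controller, avatar in zip(self.controllers, self.avatars):
            commands = controller.act(self.sense(avatar))
            avatar["joints"] = [j + c for j, c in zip(avatar["joints"], commands)]
            self.object_pos += sum(commands) * 0.1
```

Because every avatar acts on the same `object_pos`, independently controlled robots can jointly manipulate one interactive object, which is what makes the setup useful for generating multi-robot training examples.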
  • Patent number: 11845190
    Abstract: Implementations are provided for increasing realism of robot simulation by injecting noise into various aspects of the robot simulation. In various implementations, a three-dimensional (3D) environment may be simulated and may include a simulated robot controlled by an external robot controller. Joint command(s) issued by the robot controller and/or simulated sensor data passed to the robot controller may be intercepted. Noise may be injected into the joint command(s) to generate noisy commands. Additionally or alternatively, noise may be injected into the simulated sensor data to generate noisy sensor data. Joint(s) of the simulated robot may be operated in the simulated 3D environment based on the one or more noisy commands. Additionally or alternatively, the noisy sensor data may be provided to the robot controller to cause the robot controller to generate joint commands to control the simulated robot in the simulated 3D environment.
    Type: Grant
    Filed: June 2, 2021
    Date of Patent: December 19, 2023
    Assignee: GOOGLE LLC
    Inventors: Matthew Bennice, Paul Bechard, Joséphine Simon, Chuyuan Fu, Wenlong Lu
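The noise-injection idea is straightforward to sketch: the same perturbation routine can sit between the controller and the simulator on the command path, or between the simulator and the controller on the sensor path. Gaussian noise and the function name are illustrative assumptions:

```python
import random

def inject_noise(values, stddev, rng=None):
    """Perturb joint commands or simulated sensor readings with Gaussian noise.

    values: a list of floats (e.g. joint targets or sensor measurements).
    stddev: noise magnitude; 0.0 leaves the values unchanged.
    rng: optional random.Random for reproducible noise.
    """
    rng = rng or random.Random()
    return [v + rng.gauss(0.0, stddev) for v in values]
```

Intercepting both directions with the same primitive lets the simulation mimic actuation error (noisy commands) and measurement error (noisy sensor data), which is the source of the added realism.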
  • Patent number: 11813748
    Abstract: Implementations are provided for operably coupling multiple robot controllers to a single virtual environment, e.g., to generate training examples for training machine learning model(s). In various implementations, a virtual environment may be simulated that includes an interactive object and a plurality of robot avatars that are controlled independently and contemporaneously by a corresponding plurality of robot controllers that are external from the virtual environment. Sensor data generated from a perspective of each robot avatar of the plurality of robot avatars may be provided to a corresponding robot controller. Joint commands that cause actuation of one or more joints of each robot avatar may be received from the corresponding robot controller. Joint(s) of each robot avatar may be actuated pursuant to corresponding joint commands. The actuating may cause two or more of the robot avatars to act upon the interactive object in the virtual environment.
    Type: Grant
    Filed: October 13, 2020
    Date of Patent: November 14, 2023
    Assignee: GOOGLE LLC
    Inventors: Matthew Bennice, Paul Bechard
  • Patent number: 11654550
    Abstract: Implementations are described herein for single iteration, multiple permutation robot simulation. In various implementations, one or more poses of a simulated object may be determined across one or more virtual environments. A plurality of simulated robots may be operated across the one or more virtual environments. For each simulated robot of the plurality of simulated robots, a camera transformation may be determined based on respective poses of the simulated robot and simulated object in the particular virtual environment. The camera transformation may be applied to the simulated object in the particular virtual environment of the one or more virtual environments in which the simulated robot operates. Based on the camera transformation, simulated vision data may be rendered that depicts the simulated object from a perspective of the simulated robot. Each of the plurality of simulated robots may be operated based on corresponding simulated vision data.
    Type: Grant
    Filed: November 13, 2020
    Date of Patent: May 23, 2023
    Assignee: X DEVELOPMENT LLC
    Inventors: Paul Bechard, Matthew Bennice
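The per-robot camera transformation can be illustrated in 2D: one shared object pose is re-expressed in each simulated robot's camera frame, so a single simulation iteration yields one rendered view per robot. The planar geometry and names are illustrative assumptions:

```python
import math

def camera_transform(robot_pose, object_pose):
    """Express an object's world position in a robot's camera frame (2D sketch).

    robot_pose: (x, y, yaw); object_pose: (x, y) in the world frame.
    Returns the object's (x, y) relative to the robot, x pointing forward.
    """
    rx, ry, ryaw = robot_pose
    dx, dy = object_pose[0] - rx, object_pose[1] - ry
    c, s = math.cos(-ryaw), math.sin(-ryaw)
    return (c * dx - s * dy, s * dx + c * dy)

def render_all(robot_poses, object_pose):
    """One shared object pose, one per-robot view: the 'single iteration,
    multiple permutation' idea in miniature."""
    return [camera_transform(pose, object_pose) for pose in robot_poses]
```

Each element of `render_all`'s result stands in for the simulated vision data a particular robot would receive, even though the object itself was posed only once.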
  • Publication number: 20220288782
Abstract: Implementations are provided for controlling a plurality of simulated robots in a virtual environment using a single robot controller. In various implementations, a three-dimensional (3D) environment may be simulated that includes a plurality of simulated robots controlled by a single robot controller. Multiple instances of an interactive object may be rendered in the simulated 3D environment. Each instance of the interactive object may have a simulated physical characteristic, such as a pose, that is unique among the multiple instances of the interactive object. A common set of joint commands may be received from the single robot controller. The common set of joint commands may be issued to each of the plurality of simulated robots. For each simulated robot of the plurality of simulated robots, the common set of joint commands may cause actuation of one or more joints of the simulated robot to interact with a respective instance of the interactive object in the simulated 3D environment.
    Type: Application
    Filed: March 10, 2021
    Publication date: September 15, 2022
    Inventors: Matthew Bennice, Paul Bechard
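The broadcast pattern in this abstract is the mirror image of the multi-controller design: one command set fans out to every simulated robot, while each robot faces its own uniquely posed instance of the interactive object. A minimal sketch, with all names and state illustrative assumptions:

```python
def broadcast_step(robots, common_commands):
    """Issue one common set of joint commands to every simulated robot.

    robots: list of dicts, each with its own "joints" and a uniquely posed
    "object_pose" for its instance of the interactive object.
    """
    for robot in robots:
        robot["joints"] = [j + c for j, c in zip(robot["joints"], common_commands)]
    return robots
```

Because the joint commands are identical but the object poses differ, one controller output produces many distinct robot-object interactions per step, which is useful for cheaply diversifying training data.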
  • Publication number: 20220203535
Abstract: Active utilization of a robotic simulator in control of one or more real world robots. A simulated environment of the robotic simulator can be configured to reflect a real world environment in which a real robot is currently disposed, or will be disposed. The robotic simulator can then be used to determine a sequence of robotic actions for use by the real world robot(s) in performing at least part of a robotic task. The sequence of robotic actions can be applied to a simulated robot of the robotic simulator to generate a sequence of anticipated simulated state data instances. The real robot can be controlled to implement the sequence of robotic actions. The implementation of one or more of the robotic actions can be contingent on a real state data instance having at least a threshold degree of similarity to a corresponding one of the anticipated simulated state data instances.
    Type: Application
    Filed: June 3, 2021
    Publication date: June 30, 2022
    Inventors: Yunfei Bai, Tigran Gasparian, Brent Austin, Andreas Christiansen, Matthew Bennice, Paul Bechard
  • Publication number: 20220111517
    Abstract: Implementations are provided for operably coupling multiple robot controllers to a single virtual environment, e.g., to generate training examples for training machine learning model(s). In various implementations, a virtual environment may be simulated that includes an interactive object and a plurality of robot avatars that are controlled independently and contemporaneously by a corresponding plurality of robot controllers that are external from the virtual environment. Sensor data generated from a perspective of each robot avatar of the plurality of robot avatars may be provided to a corresponding robot controller. Joint commands that cause actuation of one or more joints of each robot avatar may be received from the corresponding robot controller. Joint(s) of each robot avatar may be actuated pursuant to corresponding joint commands. The actuating may cause two or more of the robot avatars to act upon the interactive object in the virtual environment.
    Type: Application
    Filed: October 13, 2020
    Publication date: April 14, 2022
    Inventors: Matthew Bennice, Paul Bechard
  • Patent number: 10922889
Abstract: Systems and methods for drawing attention to points of interest within inserted content are provided. For example, the inserted content may include augmented reality content that is inserted into a physical space or a representation of the physical space such as an image. An example system and method may include receiving an image and identifying content to display over the image. The system and method may also include identifying a location within the image to display the content and identifying a point of interest of the content. Additionally, the example system and method may also include triggering display of the content overlaid on the image by identifying a portion of the content based on the point of interest, rendering the portion of the content using first shading parameters, and rendering the content other than the portion using second shading parameters.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: February 16, 2021
    Assignee: Google LLC
    Inventors: Xavier Benavides Palos, Brett Barros, Paul Bechard
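The two-parameter rendering split can be sketched simply: samples near the point of interest get the first shading parameters, everything else gets the second. The radius-based selection and all names are illustrative assumptions rather than the patent's actual rendering pipeline:

```python
def shade_content(pixels, poi, radius, highlight_shading, base_shading):
    """Shade the neighborhood of a point of interest differently from the rest.

    pixels: list of (x, y) sample positions within the inserted content.
    poi: (x, y) point of interest; radius: extent of the highlighted portion.
    Returns one shading value per pixel.
    """
    px, py = poi
    return [
        highlight_shading
        if (x - px) ** 2 + (y - py) ** 2 <= radius ** 2
        else base_shading
        for x, y in pixels
    ]
```

Rendering the point-of-interest region brighter (or otherwise distinctly) than the surrounding content is what draws the viewer's attention to it.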
  • Publication number: 20200273251
Abstract: Systems and methods for drawing attention to points of interest within inserted content are provided. For example, the inserted content may include augmented reality content that is inserted into a physical space or a representation of the physical space such as an image. An example system and method may include receiving an image and identifying content to display over the image. The system and method may also include identifying a location within the image to display the content and identifying a point of interest of the content. Additionally, the example system and method may also include triggering display of the content overlaid on the image by identifying a portion of the content based on the point of interest, rendering the portion of the content using first shading parameters, and rendering the content other than the portion using second shading parameters.
    Type: Application
    Filed: November 19, 2018
    Publication date: August 27, 2020
    Inventors: Xavier Benavides Palos, Brett Barros, Paul Bechard
  • Publication number: 20200030260
    Abstract: In certain embodiments, the present disclosure is directed to methods and uses for treating a mammal having an epileptic seizure disorder or being at risk for having an epileptic seizure disorder, comprising administering certain herein disclosed isolated fenfluramine enantiomers that are surprisingly effective as anti-epilepsy drugs (AEDs), despite having lower anti-seizure potency than fenfluramine racemate, by virtue of also being less cardiotoxic than fenfluramine racemate. Preferred embodiments contemplate treatment of Dravet syndrome; other preferred embodiments contemplate treatment of other epileptic seizure disorders.
    Type: Application
    Filed: July 26, 2019
    Publication date: January 30, 2020
    Inventors: Robin Paul Sherrington, Jean-Jacques Alexandre Cadieux, Parisa Karimi Tari, Jeffrey Paul Bechard