Patents by Inventor Jeffrey Bingham

Jeffrey Bingham has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11975446
    Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
    Type: Grant
    Filed: July 5, 2022
    Date of Patent: May 7, 2024
    Assignee: Google LLC
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
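The abstract above describes a palm that carries both a time-of-flight sensor and an IR camera aimed at the region between the digits. The following is a minimal sketch, not taken from the patent, of how such a sensor suite might be modeled in software; all class and method names (`PalmSensorSuite`, `read_tof`, `read_ir_grayscale`) are hypothetical, and the readings are placeholders where real drivers would go.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TofReading:
    """Distance (meters) measured along the grasp axis between the digits."""
    distance_m: float

@dataclass
class PalmSensorSuite:
    """Hypothetical model of the palm-mounted sensors from the abstract."""
    num_digits: int = 3

    def read_tof(self) -> TofReading:
        # Placeholder: a real driver would query the time-of-flight chip.
        return TofReading(distance_m=0.12)

    def read_ir_grayscale(self) -> List[List[int]]:
        # Placeholder: the IR camera (with its illumination source) would
        # return a grayscale frame of the region between the digits.
        return [[0] * 64 for _ in range(64)]

suite = PalmSensorSuite()
frame = suite.read_ir_grayscale()
print(len(frame), suite.read_tof().distance_m)
```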
  • Patent number: 11752625
    Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
    Type: Grant
    Filed: August 28, 2020
    Date of Patent: September 12, 2023
    Assignee: Google LLC
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
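The method above feeds two different sensing modalities into an object-in-hand classifier. As a toy stand-in for that learned classifier (the patent does not disclose thresholds or model details; the rule below is purely illustrative), one could combine a time-of-flight distance with an IR intensity feature:

```python
def object_in_hand(tof_distance_m, ir_mean_intensity,
                   dist_thresh=0.05, intensity_thresh=40.0):
    """Toy stand-in for a learned object-in-hand classifier: fuses two
    sensing modalities into one grasp-success decision. A short
    time-of-flight distance suggests something fills the gap between the
    digits; a bright IR return corroborates it."""
    return tof_distance_m < dist_thresh and ir_mean_intensity > intensity_thresh

print(object_in_hand(0.02, 85.0))  # both modalities agree: grasp succeeded
print(object_in_hand(0.20, 85.0))  # ToF sees empty space: grasp failed
```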
  • Patent number: 11745332
    Abstract: Methods, apparatus, and computer readable media applicable to balancing robots. Some implementations are directed to maintaining a given end effector pose (relative to a world frame) of an end effector of a balancing robot when there is a disturbance to a balancing base of the balancing robot. Some implementations are additionally or alternatively directed to transitioning a balancing robot from a fallen configuration to a balanced configuration. Some implementations are additionally or alternatively directed to mitigating the risk that a balancing robot will fall when interacting with actuable environmental objects (e.g., doors) and/or to lessen the disturbance to a balancing base when interacting with actuable environmental objects.
    Type: Grant
    Filed: January 11, 2022
    Date of Patent: September 5, 2023
    Assignee: Google LLC
    Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
  • Patent number: 11607802
    Abstract: Generating and utilizing action image(s) that represent a candidate pose (e.g., a candidate end effector pose), in determining whether to utilize the candidate pose in performance of a robotic task. The action image(s) and corresponding current image(s) can be processed, using a trained critic network, to generate a value that indicates a probability of success of the robotic task if component(s) of the robot are traversed to the particular pose. When the value satisfies one or more conditions (e.g., satisfies a threshold), the robot can be controlled to cause the component(s) to traverse to the particular pose in performing the robotic task.
    Type: Grant
    Filed: May 28, 2020
    Date of Patent: March 21, 2023
    Assignee: X Development LLC
    Inventors: Seyed Mohammad Khansari Zadeh, Daniel Kappler, Jianlan Luo, Jeffrey Bingham, Mrinal Kalakrishnan
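The abstract above scores candidate poses with a trained critic network and acts only when the value clears a threshold. A minimal sketch of that thresholding step follows; the critic here is a toy lambda, not the patent's network, and `select_pose` is a hypothetical name:

```python
def select_pose(candidate_poses, current_image, critic, threshold=0.9):
    """Score each candidate end-effector pose with a critic and return the
    first pose whose predicted success probability passes the threshold,
    mirroring the condition-checking step described in the abstract."""
    for pose in candidate_poses:
        value = critic(current_image, pose)  # probability of task success
        if value >= threshold:
            return pose
    return None  # no candidate was deemed likely enough to succeed

# Toy critic: prefers poses close to the origin of a 1-D workspace.
critic = lambda image, pose: 1.0 / (1.0 + abs(pose))
print(select_pose([2.0, 0.05, 1.0], None, critic))
```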
  • Publication number: 20220339803
    Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
    Type: Application
    Filed: July 5, 2022
    Publication date: October 27, 2022
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
  • Patent number: 11407125
    Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
    Type: Grant
    Filed: May 6, 2020
    Date of Patent: August 9, 2022
    Assignee: X Development LLC
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
  • Patent number: 11285604
    Abstract: Cascading variational autoencoder (“VAE”) models can be used to detect robot collisions while a robot is performing a task. For a current state of the robot, various implementations include a VAE used to generate a latent space of the current state, and a predictor network used to generate a predicted latent space for the current state. A collision can be determined based on a difference between the latent space for the current state and the predicted latent space for the current state.
    Type: Grant
    Filed: December 23, 2019
    Date of Patent: March 29, 2022
    Assignee: X Development LLC
    Inventors: Jeffrey Bingham, Jianlan Luo
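The collision-detection idea above compares the encoded latent of the current state against the latent a predictor network expected. A simplified sketch of that comparison, with toy encoder and predictor functions standing in for the trained VAE and predictor (names and the threshold are assumptions, not disclosed values):

```python
def detect_collision(state, encode, predict, prev_latent, threshold=1.0):
    """Sketch of the cascading-VAE idea: flag a collision when the latent
    of the current state diverges from the latent the predictor expected."""
    actual = encode(state)              # latent space of the current state
    expected = predict(prev_latent)     # predicted latent for this step
    error = sum((a - e) ** 2 for a, e in zip(actual, expected)) ** 0.5
    return error > threshold

# Toy 2-D latent space: the predictor expects the latent to be unchanged.
encode = lambda s: [s[0], s[1]]
predict = lambda z: list(z)
print(detect_collision([0.1, 0.1], encode, predict, [0.0, 0.0]))  # small drift
print(detect_collision([5.0, 5.0], encode, predict, [0.0, 0.0]))  # large jump
```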
  • Patent number: 11253990
    Abstract: Methods, apparatus, and computer readable media applicable to balancing robots. Some implementations are directed to maintaining a given end effector pose (relative to a world frame) of an end effector of a balancing robot when there is a disturbance to a balancing base of the balancing robot. Some implementations are additionally or alternatively directed to transitioning a balancing robot from a fallen configuration to a balanced configuration. Some implementations are additionally or alternatively directed to mitigating the risk that a balancing robot will fall when interacting with actuable environmental objects (e.g., doors) and/or to lessen the disturbance to a balancing base when interacting with actuable environmental objects.
    Type: Grant
    Filed: October 16, 2019
    Date of Patent: February 22, 2022
    Assignee: X Development LLC
    Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
  • Publication number: 20210316455
    Abstract: Cascading variational autoencoder (“VAE”) models can be used to detect robot collisions while a robot is performing a task. For a current state of the robot, various implementations include a VAE used to generate a latent space of the current state, and a predictor network used to generate a predicted latent space for the current state. A collision can be determined based on a difference between the latent space for the current state and the predicted latent space for the current state.
    Type: Application
    Filed: December 23, 2019
    Publication date: October 14, 2021
    Inventors: Jeffrey Bingham, Jianlan Luo
  • Publication number: 20210187735
    Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
    Type: Application
    Filed: March 5, 2021
    Publication date: June 24, 2021
    Inventors: Bianca Homberg, Jeffrey Bingham
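The method above is a two-stage pipeline: identify an object type with a first sensor, move a second sensor to the predetermined pose that type's classifier expects, then classify. A hypothetical sketch of that flow follows; the registry, pose values, and the "mug fullness" classifier are all invented for illustration:

```python
# Hypothetical registry mapping object types to the sensor pose their
# classifier expects, per the abstract's two-stage pipeline.
CLASSIFIER_POSES = {"mug": (0.0, 0.0, 0.3)}  # e.g. 30 cm above the object

def classify_object(obj_type, move_sensor_to, capture, classifiers):
    pose = CLASSIFIER_POSES[obj_type]   # predetermined pose for this type
    move_sensor_to(pose)                # position the second sensor
    data = capture()                    # second sensor data at that pose
    return classifiers[obj_type](data)  # determine a property of the object

result = classify_object(
    "mug",
    move_sensor_to=lambda pose: None,   # stub for the robot's motion command
    capture=lambda: 0.8,                # stub sensor reading
    classifiers={"mug": lambda d: "full" if d > 0.5 else "empty"},
)
print(result)
```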
  • Patent number: 10987813
    Abstract: Methods, apparatus, and computer readable media applicable to robots, such as balancing robots. Some implementations are directed to determining multiple measures of a property of a robot for a given time and determining a final measure of the property of the robot for the given time based on the multiple measures. One or more control commands may be generated based on the final measure of the property and provided to one or more actuators of the robot.
    Type: Grant
    Filed: April 3, 2019
    Date of Patent: April 27, 2021
    Assignee: X Development LLC
    Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
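The abstract above determines multiple measures of one robot property and combines them into a final measure before generating control commands. A weighted mean is one simple instance of such a fusion step; the sketch below (function name, weights, and example values are assumptions) shows the idea:

```python
def fuse_measures(measures, weights=None):
    """Combine multiple redundant estimates of one robot property
    (e.g. tilt angle) into a single final measure via a weighted mean."""
    if weights is None:
        weights = [1.0] * len(measures)
    total = sum(weights)
    return sum(m * w for m, w in zip(measures, weights)) / total

# Tilt estimates from three sources; trust the first two sources most.
print(round(fuse_measures([0.10, 0.12, 0.20], weights=[2.0, 2.0, 1.0]), 3))
```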
  • Patent number: 10967507
    Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
    Type: Grant
    Filed: May 2, 2018
    Date of Patent: April 6, 2021
    Assignee: X Development LLC
    Inventors: Bianca Homberg, Jeffrey Bingham
  • Publication number: 20210078167
    Abstract: Generating and utilizing action image(s) that represent a candidate pose (e.g., a candidate end effector pose), in determining whether to utilize the candidate pose in performance of a robotic task. The action image(s) and corresponding current image(s) can be processed, using a trained critic network, to generate a value that indicates a probability of success of the robotic task if component(s) of the robot are traversed to the particular pose. When the value satisfies one or more conditions (e.g., satisfies a threshold), the robot can be controlled to cause the component(s) to traverse to the particular pose in performing the robotic task.
    Type: Application
    Filed: May 28, 2020
    Publication date: March 18, 2021
    Inventors: Seyed Mohammad Khansari Zadeh, Daniel Kappler, Jianlan Luo, Jeffrey Bingham, Mrinal Kalakrishnan
  • Publication number: 20200391378
    Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
    Type: Application
    Filed: August 28, 2020
    Publication date: December 17, 2020
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
  • Patent number: 10792809
    Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
    Type: Grant
    Filed: December 12, 2017
    Date of Patent: October 6, 2020
    Assignee: X Development LLC
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
  • Publication number: 20200262089
    Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
    Type: Application
    Filed: May 6, 2020
    Publication date: August 20, 2020
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
  • Patent number: 10682774
    Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
    Type: Grant
    Filed: December 12, 2017
    Date of Patent: June 16, 2020
    Assignee: X Development LLC
    Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
  • Patent number: 10556630
    Abstract: An example implementation includes a robotic system including a first wheel and a second wheel configured to rotate about a first axis. Each wheel of the first wheel and the second wheel includes a contact surface and a motor coupled to a rotatable component. Each motor is configured to rotate the rotatable component about a respective second axis. The rotatable component is frictionally engaged with the contact surface such that a rotation of the rotatable component about the respective second axis is translated to a rotation of the wheel about the first axis. The robotic system further includes a controller configured to operate the motor of the first wheel and the motor of the second wheel in order to cause the robotic system to maintain its balance and navigate within an environment based on data received from one or more sensors.
    Type: Grant
    Filed: June 29, 2016
    Date of Patent: February 11, 2020
    Assignee: X Development LLC
    Inventors: Jeffrey Bingham, Ben Berkowitz, Benjamin Holson
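In the design above, a motor-driven rotatable component is frictionally engaged with the wheel's contact surface, so its rotation is translated to the wheel's. Assuming no slip, the angular velocities relate by the ratio of the contact radii; the function below is an illustrative kinematic sketch, not a formula from the patent:

```python
def wheel_angular_velocity(motor_omega, roller_radius, wheel_radius):
    """Friction-drive kinematics: a motor-driven roller pressed against
    the wheel's contact surface turns the wheel, with angular velocities
    related by the ratio of radii (no-slip assumption)."""
    return motor_omega * roller_radius / wheel_radius

# A 1 cm roller spinning at 100 rad/s drives a 10 cm wheel at 10 rad/s.
print(wheel_angular_velocity(100.0, 0.01, 0.10))
```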
  • Patent number: 10493617
    Abstract: Methods, apparatus, and computer readable media applicable to balancing robots. Some implementations are directed to maintaining a given end effector pose (relative to a world frame) of an end effector of a balancing robot when there is a disturbance to a balancing base of the balancing robot. Some implementations are additionally or alternatively directed to transitioning a balancing robot from a fallen configuration to a balanced configuration. Some implementations are additionally or alternatively directed to mitigating the risk that a balancing robot will fall when interacting with actuable environmental objects (e.g., doors) and/or to lessen the disturbance to a balancing base when interacting with actuable environmental objects.
    Type: Grant
    Filed: October 21, 2016
    Date of Patent: December 3, 2019
    Assignee: X Development LLC
    Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
  • Publication number: 20190337152
    Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
    Type: Application
    Filed: May 2, 2018
    Publication date: November 7, 2019
    Inventors: Bianca Homberg, Jeffrey Bingham