Patents by Inventor Jeffrey Bingham
Jeffrey Bingham has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11975446
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Grant
Filed: July 5, 2022
Date of Patent: May 7, 2024
Assignee: Google LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Patent number: 11752625
Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
Type: Grant
Filed: August 28, 2020
Date of Patent: September 12, 2023
Assignee: Google LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
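The grasp-verification idea in this abstract can be sketched in a few lines. This is an illustrative toy, not the patented implementation: the fused modalities (a time-of-flight distance and a mean infrared intensity), the thresholds, and the voting rule are all assumptions chosen for clarity; the patent specifies only that a classifier consumes both sensor streams.

```python
# Illustrative sketch (not the patented classifier): fuse two non-contact
# sensing modalities read from a gripper's palm to decide whether a grasp
# attempt captured an object. Thresholds are hypothetical.

def object_in_hand(tof_distance_m: float, ir_mean_intensity: float) -> bool:
    """Classify grasp success from two sensing modalities.

    tof_distance_m: time-of-flight distance between the digits (metres).
    ir_mean_intensity: mean grayscale intensity from the IR camera, 0..1.
    """
    # A nearby ToF return suggests something occupies the region between
    # the digits; a bright IR image (lit by the on-palm illumination
    # source) corroborates that a surface sits close to the palm.
    tof_vote = tof_distance_m < 0.05      # hypothetical 5 cm threshold
    ir_vote = ir_mean_intensity > 0.6     # hypothetical brightness threshold
    # Require agreement between the two modalities before reporting success.
    return tof_vote and ir_vote
```

Using two modalities guards against single-sensor failure cases, e.g. a transparent object that returns little IR light but still produces a short ToF reading.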
-
Patent number: 11745332
Abstract: Methods, apparatus, and computer readable media applicable to balancing robots. Some implementations are directed to maintaining a given end effector pose (relative to a world frame) of an end effector of a balancing robot when there is a disturbance to a balancing base of the balancing robot. Some implementations are additionally or alternatively directed to transitioning a balancing robot from a fallen configuration to a balanced configuration. Some implementations are additionally or alternatively directed to mitigating the risk that a balancing robot will fall when interacting with actuable environmental objects (e.g., doors) and/or to lessen the disturbance to a balancing base when interacting with actuable environmental objects.
Type: Grant
Filed: January 11, 2022
Date of Patent: September 5, 2023
Assignee: GOOGLE LLC
Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
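The first goal above, holding an end-effector pose fixed in the world frame while the base is disturbed, reduces in one dimension to a simple compensation rule. The sketch below is a 1-DOF assumption for illustration only; the patent's actual control law is not given in this abstract.

```python
# Minimal 1-DOF sketch of world-frame pose maintenance on a balancing base:
# if the base pitches by theta, rotating the arm joint back by theta keeps
# world_angle = base_pitch + arm_angle constant. This compensation rule is
# an illustrative assumption, not the patented controller.

def compensate_arm_angle(base_pitch_rad: float, desired_world_rad: float) -> float:
    """Return the arm joint angle that preserves the desired world-frame angle."""
    return desired_world_rad - base_pitch_rad

# The end effector's world-frame angle equals the desired angle regardless
# of the disturbance applied to the base:
world_angle = 0.1 + compensate_arm_angle(0.1, 0.5)
```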
-
Patent number: 11607802
Abstract: Generating and utilizing action image(s) that represent a candidate pose (e.g., a candidate end effector pose), in determining whether to utilize the candidate pose in performance of a robotic task. The action image(s) and corresponding current image(s) can be processed, using a trained critic network, to generate a value that indicates a probability of success of the robotic task if component(s) of the robot are traversed to the particular pose. When the value satisfies one or more conditions (e.g., satisfies a threshold), the robot can be controlled to cause the component(s) to traverse to the particular pose in performing the robotic task.
Type: Grant
Filed: May 28, 2020
Date of Patent: March 21, 2023
Assignee: X DEVELOPMENT LLC
Inventors: Seyed Mohammad Khansari Zadeh, Daniel Kappler, Jianlan Luo, Jeffrey Bingham, Mrinal Kalakrishnan
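The gating logic around the critic can be sketched as follows. The rasterization scheme and the critic body here are stand-in stubs (a real critic would be a trained network over both images); only the shape of the pipeline, render an action image, score it with the current image, act when the score clears a threshold, comes from the abstract.

```python
# Hedged sketch of critic-gated pose execution: render a candidate pose as
# an "action image", score it jointly with the current camera image, and
# execute only when the predicted success value clears a threshold.
# The critic below is a toy stub, not the trained network from the patent.

def render_action_image(pose, image_shape):
    """Hypothetical rasterization: mark the candidate pose's pixel."""
    img = [[0.0] * image_shape[1] for _ in range(image_shape[0])]
    row, col = pose
    img[row][col] = 1.0
    return img

def critic(current_image, action_image) -> float:
    """Stub critic: a fake success probability in [0, 1] from pixel overlap."""
    overlap = sum(
        c * a
        for row_c, row_a in zip(current_image, action_image)
        for c, a in zip(row_c, row_a)
    )
    return min(1.0, overlap)

def should_execute(current_image, pose, threshold=0.5) -> bool:
    shape = (len(current_image), len(current_image[0]))
    action_image = render_action_image(pose, shape)
    return critic(current_image, action_image) >= threshold
```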
-
Publication number: 20220339803
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Application
Filed: July 5, 2022
Publication date: October 27, 2022
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Patent number: 11407125
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Grant
Filed: May 6, 2020
Date of Patent: August 9, 2022
Assignee: X Development LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Patent number: 11285604
Abstract: Cascading variational autoencoder (“VAE”) models can be used to detect robot collisions while a robot is performing a task. For a current state of the robot, various implementations include a VAE used to generate a latent space of the current state, and a predictor network used to generate a predicted latent space for the current state. A collision can be determined based on a difference between the latent space for the current state and the predicted latent space for the current state.
Type: Grant
Filed: December 23, 2019
Date of Patent: March 29, 2022
Assignee: X DEVELOPMENT LLC
Inventors: Jeffrey Bingham, Jianlan Luo
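The collision test described in this abstract, compare the latent encoding of the current state against a latent predicted from the previous state, and flag a collision when they diverge, can be sketched as below. The encoder and predictor are toy stand-ins for the VAE and the predictor network, and the Euclidean error with a fixed threshold is an assumption; the abstract commits only to "a difference between" the two latents.

```python
# Sketch of latent-prediction collision detection: a large gap between the
# encoded current state and the predicted latent indicates the state evolved
# in a way the predictor did not anticipate, e.g. an unexpected contact.
# encode() and predict_latent() are toy stand-ins for the trained networks.

def encode(state):
    """Stand-in for the VAE encoder's latent mean."""
    return [s * 0.5 for s in state]

def predict_latent(prev_state):
    """Stand-in for the predictor network: assumes smooth, contact-free motion."""
    return encode(prev_state)

def collision_detected(prev_state, current_state, threshold=0.1) -> bool:
    z = encode(current_state)
    z_hat = predict_latent(prev_state)
    # Euclidean distance in latent space as the divergence measure
    # (an illustrative choice).
    error = sum((a - b) ** 2 for a, b in zip(z, z_hat)) ** 0.5
    return error > threshold
```

Detecting collisions in a learned latent space, rather than on raw joint torques, lets one detector generalize across tasks without hand-tuned per-joint thresholds.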
-
Patent number: 11253990
Abstract: Methods, apparatus, and computer readable media applicable to balancing robots. Some implementations are directed to maintaining a given end effector pose (relative to a world frame) of an end effector of a balancing robot when there is a disturbance to a balancing base of the balancing robot. Some implementations are additionally or alternatively directed to transitioning a balancing robot from a fallen configuration to a balanced configuration. Some implementations are additionally or alternatively directed to mitigating the risk that a balancing robot will fall when interacting with actuable environmental objects (e.g., doors) and/or to lessen the disturbance to a balancing base when interacting with actuable environmental objects.
Type: Grant
Filed: October 16, 2019
Date of Patent: February 22, 2022
Assignee: X DEVELOPMENT LLC
Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
-
Publication number: 20210316455
Abstract: Cascading variational autoencoder (“VAE”) models can be used to detect robot collisions while a robot is performing a task. For a current state of the robot, various implementations include a VAE used to generate a latent space of the current state, and a predictor network used to generate a predicted latent space for the current state. A collision can be determined based on a difference between the latent space for the current state and the predicted latent space for the current state.
Type: Application
Filed: December 23, 2019
Publication date: October 14, 2021
Inventors: Jeffrey Bingham, Jianlan Luo
-
Publication number: 20210187735
Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
Type: Application
Filed: March 5, 2021
Publication date: June 24, 2021
Inventors: Bianca Homberg, Jeffrey Bingham
-
Patent number: 10987813
Abstract: Methods, apparatus, and computer readable media applicable to robots, such as balancing robots. Some implementations are directed to determining multiple measures of a property of a robot for a given time and determining a final measure of the property of the robot for the given time based on the multiple measures. One or more control commands may be generated based on the final measure of the property and provided to one or more actuators of the robot.
Type: Grant
Filed: April 3, 2019
Date of Patent: April 27, 2021
Assignee: X DEVELOPMENT LLC
Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
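Combining several simultaneous measures of one robot property (for instance, base pitch estimated from an IMU and from wheel odometry) into a single final measure can be sketched with a standard fusion rule. The inverse-variance weighted average used here is an assumption for illustration; the abstract does not commit to a particular fusion method.

```python
# Sketch of fusing redundant measurements of one robot property into a
# final measure; the inverse-variance weighting is an illustrative choice,
# not necessarily the patented rule.

def fuse_measures(measures, variances):
    """Inverse-variance weighted average: trust low-variance sensors more."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * m for w, m in zip(weights, measures)) / total
```

With equal variances this reduces to a plain average; a noisier sensor (larger variance) pulls the final measure toward the cleaner one.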
-
Patent number: 10967507
Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
Type: Grant
Filed: May 2, 2018
Date of Patent: April 6, 2021
Assignee: X Development LLC
Inventors: Bianca Homberg, Jeffrey Bingham
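The two-sensor pipeline in this entry, detect an object's type with a first sensor, move a second sensor to that type's predetermined viewing pose, then run a per-type classifier on the close-up data, can be sketched as below. The object types, poses, and classifier behavior are hypothetical examples; only the pipeline shape comes from the abstract.

```python
# Sketch of two-stage inspection: a wide-view sensor finds an object of a
# known type, the robot moves a second sensor to the pose that type's
# classifier expects, and the classifier yields the object's property.
# Types, poses, and the stub classifier are hypothetical.

PREDETERMINED_POSE = {
    "cup": ("above", 0.15),          # viewpoint name, standoff distance (m)
    "door_handle": ("side", 0.10),
}

def classify_property(object_type, second_sensor_data):
    """Per-type classifier stub, e.g. 'is the cup full?'."""
    if object_type == "cup":
        return "full" if second_sensor_data > 0.5 else "empty"
    return "unknown"

def inspect(object_type, move_sensor_to, read_second_sensor):
    pose = PREDETERMINED_POSE[object_type]  # pose the classifier was trained for
    move_sensor_to(pose)                    # position the second sensor
    data = read_second_sensor()             # capture from the expected pose
    return classify_property(object_type, data)
```

Fixing the viewing pose per object type keeps the classifier's input distribution narrow, so it only ever sees data captured from the viewpoint it expects.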
-
Publication number: 20210078167
Abstract: Generating and utilizing action image(s) that represent a candidate pose (e.g., a candidate end effector pose), in determining whether to utilize the candidate pose in performance of a robotic task. The action image(s) and corresponding current image(s) can be processed, using a trained critic network, to generate a value that indicates a probability of success of the robotic task if component(s) of the robot are traversed to the particular pose. When the value satisfies one or more conditions (e.g., satisfies a threshold), the robot can be controlled to cause the component(s) to traverse to the particular pose in performing the robotic task.
Type: Application
Filed: May 28, 2020
Publication date: March 18, 2021
Inventors: Seyed Mohammad Khansari Zadeh, Daniel Kappler, Jianlan Luo, Jeffrey Bingham, Mrinal Kalakrishnan
-
Publication number: 20200391378
Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
Type: Application
Filed: August 28, 2020
Publication date: December 17, 2020
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
-
Patent number: 10792809
Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
Type: Grant
Filed: December 12, 2017
Date of Patent: October 6, 2020
Assignee: X Development LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
-
Publication number: 20200262089
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Application
Filed: May 6, 2020
Publication date: August 20, 2020
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Patent number: 10682774
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Grant
Filed: December 12, 2017
Date of Patent: June 16, 2020
Assignee: X Development LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Patent number: 10556630
Abstract: An example implementation includes a robotic system including a first wheel and a second wheel configured to rotate about a first axis. Each wheel of the first wheel and the second wheel includes a contact surface and a motor coupled to a rotatable component. Each motor is configured to rotate the rotatable component about a respective second axis. The rotatable component is frictionally engaged with the contact surface such that a rotation of the rotatable component about the respective second axis is translated to a rotation of the wheel about the first axis. The robotic system further includes a controller configured to operate the motor of the first wheel and the motor of the second wheel in order to cause the robotic system to maintain its balance and navigate within an environment based on data received from one or more sensors.
Type: Grant
Filed: June 29, 2016
Date of Patent: February 11, 2020
Assignee: X Development LLC
Inventors: Jeffrey Bingham, Ben Berkowitz, Benjamin Holson
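The balance loop implied by this entry's controller, read a tilt estimate from the sensors and command both wheel motors to drive the base back under the center of mass, can be sketched with a textbook PD law. The control law and gains are assumptions; the abstract says only that the controller operates both wheel motors from sensor data to maintain balance and navigate.

```python
# Toy sketch of a two-wheel balance loop: a PD law produces a torque that
# opposes the measured tilt, applied to both wheel motors. Gains and the
# PD structure are illustrative assumptions, not the patented controller.

def balance_command(pitch_rad, pitch_rate, kp=20.0, kd=2.0):
    """PD law: torque opposing the tilt, sent to both wheel motors."""
    torque = kp * pitch_rad + kd * pitch_rate
    # Both motors receive the same command for pure balancing; navigation
    # would add a common forward term and a differential steering term.
    return torque, torque
```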
-
Patent number: 10493617
Abstract: Methods, apparatus, and computer readable media applicable to balancing robots. Some implementations are directed to maintaining a given end effector pose (relative to a world frame) of an end effector of a balancing robot when there is a disturbance to a balancing base of the balancing robot. Some implementations are additionally or alternatively directed to transitioning a balancing robot from a fallen configuration to a balanced configuration. Some implementations are additionally or alternatively directed to mitigating the risk that a balancing robot will fall when interacting with actuable environmental objects (e.g., doors) and/or to lessen the disturbance to a balancing base when interacting with actuable environmental objects.
Type: Grant
Filed: October 21, 2016
Date of Patent: December 3, 2019
Assignee: X DEVELOPMENT LLC
Inventors: Benjamin Holson, Jeffrey Bingham, Ben Berkowitz
-
Publication number: 20190337152
Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
Type: Application
Filed: May 2, 2018
Publication date: November 7, 2019
Inventors: Bianca Homberg, Jeffrey Bingham