Patents by Inventor Bianca Homberg
Bianca Homberg has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11975446
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Grant
Filed: July 5, 2022
Date of Patent: May 7, 2024
Assignee: Google LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
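The abstract above pairs two palm-mounted sensing modalities: time-of-flight distance data and actively illuminated infrared grayscale imagery, both aimed between the digits. A minimal sketch of how such readings could be combined to detect an object in the grasp region follows; the function, thresholds, and feature choices are hypothetical illustrations, not the patented implementation.

```python
def object_between_digits(tof_distance_m, ir_frame, max_reach_m=0.12, brightness_floor=40):
    """Return True if the palm sensors suggest an object between the digits.

    tof_distance_m: distance reported by the time-of-flight sensor (meters).
    ir_frame: 2D list of grayscale pixel values (0-255) from the IR camera,
              captured under the camera's own IR illumination.
    """
    # A ToF reading shorter than the digits' reach implies something in the way.
    tof_hit = 0.0 < tof_distance_m < max_reach_m

    # Under active IR illumination, a nearby object reflects brightly.
    pixels = [p for row in ir_frame for p in row]
    mean_brightness = sum(pixels) / len(pixels)
    ir_hit = mean_brightness > brightness_floor

    # Require agreement between the two modalities to reduce false positives.
    return tof_hit and ir_hit

bright = [[200] * 8 for _ in range(8)]  # close, reflective object in view
dark = [[10] * 8 for _ in range(8)]     # empty scene
print(object_between_digits(0.05, bright))  # True
print(object_between_digits(0.50, dark))    # False
```

Requiring both modalities to agree is one plausible reason to co-locate the two sensors: either alone can be fooled (ToF by a digit in view, IR by ambient light).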
-
Patent number: 11752625
Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
Type: Grant
Filed: August 28, 2020
Date of Patent: September 12, 2023
Assignee: Google LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
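The object-in-hand classifier described here takes features from two different sensing modalities as joint input. A toy sketch of such a classifier appears below, using a logistic score over concatenated features; the weights, bias, and feature meanings are invented placeholders (a real classifier would be learned from labeled grasp attempts, and the patent does not specify a model form).

```python
import math

def object_in_hand(first_sensor_data, second_sensor_data, weights=None, bias=-2.0):
    """Logistic score over concatenated features from two sensing modalities."""
    features = list(first_sensor_data) + list(second_sensor_data)
    if weights is None:
        # Placeholder weights; a trained object-in-hand classifier would
        # learn these from labeled successful/failed grasp attempts.
        weights = [1.0] * len(features)
    z = sum(w * x for w, x in zip(weights, features)) + bias
    probability = 1.0 / (1.0 + math.exp(-z))
    return probability >= 0.5

# Hypothetical normalized features: e.g. ToF-derived closeness values and an
# IR-derived brightness value, for a successful vs. a failed grasp attempt.
print(object_in_hand([0.9, 0.8], [0.9]))  # True
print(object_in_hand([0.1, 0.0], [0.1]))  # False
```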
-
Patent number: 11691277
Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
Type: Grant
Filed: July 19, 2021
Date of Patent: July 4, 2023
Assignee: X DEVELOPMENT LLC
Inventors: Umashankar Nagarajan, Bianca Homberg
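The selection scheme above amounts to scoring a discrete set of candidate strategies and picking one. A sketch of that structure follows; the strategy fields mirror the values the abstract names (direction, grasp type, force), but the candidate list, the scoring heuristic standing in for the machine learning model, and all numbers are hypothetical.

```python
# Hypothetical candidate grasp strategies; fields mirror the values the
# abstract says a strategy can define (direction, grasp type, force).
CANDIDATE_STRATEGIES = [
    {"direction": "top", "grip": "pinch", "force_n": 5.0},
    {"direction": "side", "grip": "power", "force_n": 20.0},
    {"direction": "top", "grip": "power", "force_n": 15.0},
]

def score_strategy(object_features, strategy):
    """Stand-in for the learned scoring model: favors pinch grips for small
    objects and power grips for heavy ones. Purely illustrative."""
    size = object_features["size_cm"]
    weight = object_features["weight_kg"]
    if strategy["grip"] == "pinch":
        return max(0.0, 1.0 - size / 10.0)
    return min(1.0, weight / 2.0)

def select_strategy(object_features):
    """Pick the highest-scoring candidate strategy for this object."""
    return max(CANDIDATE_STRATEGIES,
               key=lambda s: score_strategy(object_features, s))

print(select_strategy({"size_cm": 3.0, "weight_kg": 0.2})["grip"])   # pinch
print(select_strategy({"size_cm": 15.0, "weight_kg": 3.0})["grip"])  # power
```

Keeping strategies as explicit candidates (rather than regressing a free-form grasp pose) makes each selection interpretable: the chosen entry directly names the direction, grip type, and force to execute.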
-
Publication number: 20220339803
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Application
Filed: July 5, 2022
Publication date: October 27, 2022
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Patent number: 11407125
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Grant
Filed: May 6, 2020
Date of Patent: August 9, 2022
Assignee: X Development LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Publication number: 20220007590
Abstract: One variation of a method for automating transfer of plants within an agricultural facility includes: dispatching a loader to autonomously deliver a first module—defining a first array of plant slots at a first density and loaded with a first set of plants at a first growth stage—from a first grow location within an agricultural facility to a transfer station within the agricultural facility; dispatching the loader to autonomously deliver a second module—defining a second array of plant slots at a second density less than the first density and empty of plants—to the transfer station; recording a module-level optical scan of the first module; extracting a viability parameter of the first set of plants from features detected in the module-level optical scan; and if the viability parameter falls outside of a target viability range, rejecting transfer of the first set of plants from the first module.
Type: Application
Filed: July 23, 2021
Publication date: January 13, 2022
Inventors: Brandon Ace Alexander, Jonathan Binney, Winnie Ding, Bianca Homberg, Warren Huffman, Tom Kendall, Saqib Naveed, Pete Turner, Adrian Canoso, Ansgar Lorenz
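The gating step in this abstract — extract a viability parameter from scan features and reject the transfer if it falls outside a target range — can be sketched as a simple threshold check. The viability formula, feature names, and range below are invented for illustration; the publication does not specify how the parameter is computed.

```python
def viability_from_scan(leaf_area_cm2, discoloration_frac):
    # Hypothetical viability parameter extracted from module-level scan
    # features: leaf area discounted by the discolored fraction.
    return leaf_area_cm2 * (1.0 - discoloration_frac)

def decide_transfer(scan_features, target_range=(50.0, 500.0)):
    """Transfer the plants to the lower-density module only if the
    viability parameter falls inside the target range; otherwise reject."""
    viability = viability_from_scan(**scan_features)
    low, high = target_range
    return "transfer" if low <= viability <= high else "reject"

print(decide_transfer({"leaf_area_cm2": 120.0, "discoloration_frac": 0.1}))  # transfer
print(decide_transfer({"leaf_area_cm2": 40.0, "discoloration_frac": 0.5}))   # reject
```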
-
Publication number: 20210347040
Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
Type: Application
Filed: July 19, 2021
Publication date: November 11, 2021
Inventors: Umashankar Nagarajan, Bianca Homberg
-
Patent number: 11097418
Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
Type: Grant
Filed: January 4, 2018
Date of Patent: August 24, 2021
Assignee: X DEVELOPMENT LLC
Inventors: Umashankar Nagarajan, Bianca Homberg
-
Publication number: 20210187735
Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
Type: Application
Filed: March 5, 2021
Publication date: June 24, 2021
Inventors: Bianca Homberg, Jeffrey Bingham
-
Patent number: 10967507
Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
Type: Grant
Filed: May 2, 2018
Date of Patent: April 6, 2021
Assignee: X Development LLC
Inventors: Bianca Homberg, Jeffrey Bingham
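The pipeline in this abstract hinges on each object type's classifier expecting input captured from one predetermined pose, which the robot must reproduce with its second sensor before classifying. A dependency-injected sketch of that flow follows; the object type, pose, classifier, and property names are hypothetical stand-ins, and the real system's interfaces are not described at this level in the patent.

```python
POSE_FOR_TYPE = {
    # Each classifier is assumed trained on views from one fixed pose
    # relative to the object, so the robot reproduces that pose at runtime.
    "cup": {"offset_m": (0.0, 0.0, 0.20)},  # e.g. 20 cm directly above
}

def classify_property(object_type, position_sensor, capture, classifiers):
    """Move the second sensor to the pose the object type's classifier
    expects, capture data there, and classify the object's property."""
    pose = POSE_FOR_TYPE[object_type]
    position_sensor(pose)                 # robot positions the second sensor
    second_sensor_data = capture()        # view from the predetermined pose
    return classifiers[object_type](second_sensor_data)

# Dry run with stubbed robot interfaces.
moves = []
result = classify_property(
    "cup",
    position_sensor=moves.append,
    capture=lambda: {"fill_level": 0.7},
    classifiers={"cup": lambda d: "full" if d["fill_level"] > 0.5 else "empty"},
)
print(result)  # full
```

Fixing the viewing pose per object type narrows the classifier's input distribution, which is the apparent motivation for moving the sensor rather than classifying from wherever the object was first seen.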
-
Publication number: 20200391378
Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
Type: Application
Filed: August 28, 2020
Publication date: December 17, 2020
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
-
Patent number: 10792809
Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
Type: Grant
Filed: December 12, 2017
Date of Patent: October 6, 2020
Assignee: X Development LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto
-
Publication number: 20200262089
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Application
Filed: May 6, 2020
Publication date: August 20, 2020
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Patent number: 10682774
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Grant
Filed: December 12, 2017
Date of Patent: June 16, 2020
Assignee: X Development LLC
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Publication number: 20190337152
Abstract: In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
Type: Application
Filed: May 2, 2018
Publication date: November 7, 2019
Inventors: Bianca Homberg, Jeffrey Bingham
-
Publication number: 20190248003
Abstract: Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a manner that is unique relative to the other grasp strategies. For example, value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
Type: Application
Filed: January 4, 2018
Publication date: August 15, 2019
Inventors: Umashankar Nagarajan, Bianca Homberg
-
Publication number: 20190176348
Abstract: A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Type: Application
Filed: December 12, 2017
Publication date: June 13, 2019
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto, Alex Shafer
-
Publication number: 20190176326
Abstract: A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
Type: Application
Filed: December 12, 2017
Publication date: June 13, 2019
Inventors: Jeffrey Bingham, Taylor Alexander, Bianca Homberg, Joseph DelPreto