Patents by Inventor Kurt Konolige
Kurt Konolige has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9707682
Abstract: Methods and systems for recognizing machine-readable information on three-dimensional (3D) objects are described. A robotic manipulator may move at least one physical object through a designated area in space. As the at least one physical object is being moved through the designated area, one or more optical sensors may determine a location of a machine-readable code on the at least one physical object and, based on the determined location, scan the machine-readable code so as to determine information associated with the at least one physical object encoded in the machine-readable code. Based on the information associated with the at least one physical object, a computing device may then determine a respective location in a physical environment of the robotic manipulator at which to place the at least one physical object. The robotic manipulator may then be directed to place the at least one physical object at the respective location.
Type: Grant
Filed: November 24, 2015
Date of Patent: July 18, 2017
Assignee: X Development LLC
Inventors: Kurt Konolige, Ethan Rublee, Gary Bradski
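The flow this abstract describes (decode a machine-readable code, then choose a placement from the decoded information) can be sketched as a simple lookup. The `DESTINATIONS` table, field names, and `place_for` helper below are illustrative assumptions, not the patented implementation.

```python
# Hypothetical mapping from decoded barcode fields to a drop-off location,
# standing in for the "determine a respective location" step.
DESTINATIONS = {
    "fragile": "padded-bin",
    "heavy": "floor-pallet",
    "standard": "shelf-A",
}

def place_for(decoded_fields):
    """Map information decoded from a machine-readable code to a
    placement location in the manipulator's environment."""
    category = decoded_fields.get("category", "standard")
    return DESTINATIONS.get(category, DESTINATIONS["standard"])

print(place_for({"category": "fragile"}))  # padded-bin
print(place_for({"sku": "12345"}))         # shelf-A (default category)
```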
-
Patent number: 9694498
Abstract: Methods and systems for depth sensing are provided. A system includes a first and second optical sensor each including a first plurality of photodetectors configured to capture visible light interspersed with a second plurality of photodetectors configured to capture infrared light within a particular infrared band. The system also includes a computing device configured to (i) identify first corresponding features of the environment between a first visible light image captured by the first optical sensor and a second visible light image captured by the second optical sensor; (ii) identify second corresponding features of the environment between a first infrared light image captured by the first optical sensor and a second infrared light image captured by the second optical sensor; and (iii) determine a depth estimate for at least one surface in the environment based on the first corresponding features and the second corresponding features.
Type: Grant
Filed: March 30, 2015
Date of Patent: July 4, 2017
Assignee: X Development LLC
Inventor: Kurt Konolige
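Both the visible-light and infrared correspondences above feed the same stereo geometry: depth follows from disparity via the pinhole relation Z = f·B/d. The sketch below combines the two channels with a plain average; the focal length, baseline, and fusion rule are illustrative assumptions, not the patented method.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic pinhole-stereo relation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def fused_depth(vis_disparity_px, ir_disparity_px,
                focal_px=600.0, baseline_m=0.1):
    """Average the depth estimates from visible-light and infrared
    correspondences, as a stand-in for combining the two feature sets."""
    z_vis = depth_from_disparity(vis_disparity_px, focal_px, baseline_m)
    z_ir = depth_from_disparity(ir_disparity_px, focal_px, baseline_m)
    return 0.5 * (z_vis + z_ir)

print(fused_depth(30.0, 30.0))  # 2.0 metres at f=600 px, B=0.1 m
```

Using both channels helps because textureless surfaces that defeat visible-light matching may still show matchable structure under the projected or ambient infrared band.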
-
Patent number: 9630321
Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining a virtual environment by one or more processors based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects, developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment, causing the robotic manipulator to perform a first action according to the plan, receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action, modifying the virtual environment based on the updated sensor data, determining one or more modifications to the plan based on the modified virtual environment, and causing the robotic manipulator to perform a second action according to the modified plan.
Type: Grant
Filed: December 10, 2015
Date of Patent: April 25, 2017
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
-
Patent number: 9630320
Abstract: Methods and systems for detecting and reconstructing environments to facilitate robotic interaction with such environments are described. An example method may involve determining a three-dimensional (3D) virtual environment representative of a physical environment of the robotic manipulator including a plurality of 3D virtual objects corresponding to respective physical objects in the physical environment. The method may then involve determining two-dimensional (2D) images of the virtual environment including 2D depth maps. The method may then involve determining portions of the 2D images that correspond to a given one or more physical objects. The method may then involve determining, based on the portions and the 2D depth maps, 3D models corresponding to the portions. The method may then involve, based on the 3D models, selecting a physical object from the given one or more physical objects. The method may then involve providing an instruction to the robotic manipulator to move that object.
Type: Grant
Filed: July 7, 2015
Date of Patent: April 25, 2017
Assignee: Industrial Perception, Inc.
Inventors: Kurt Konolige, Ethan Rublee, Stefan Hinterstoisser, Troy Straszheim, Gary Bradski, Hauke Malte Strasdat
-
Patent number: 9630316
Abstract: Example systems and methods may be used to determine a trajectory for moving an object using a robotic device. One example method includes determining a plurality of possible trajectories for moving an object with an end effector of a robotic manipulator based on a plurality of possible object measurements. The method may further include causing the robotic manipulator to pick up the object with the end effector. After causing the robotic manipulator to pick up the object with the end effector, the method may also include receiving sensor data from one or more sensors indicative of one or more measurements of the object. Based on the received sensor data, the method may additionally include selecting a trajectory for moving the object from the plurality of possible trajectories. The method may further include causing the robotic manipulator to move the object through the selected trajectory.
Type: Grant
Filed: January 26, 2016
Date of Patent: April 25, 2017
Assignee: X Development LLC
Inventors: Kurt Konolige, Ethan Rublee, Mrinal Kalakrishnan
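The pattern above (plan several trajectories up front for a range of possible measurements, then commit to one once the object is sensed) can be sketched with a small candidate table. The mass bounds, speed labels, and `select_trajectory` helper are illustrative assumptions rather than anything from the patent claims.

```python
# Hypothetical precomputed candidates, one per range of possible
# object mass; selection happens only after the pick, once the
# sensors report an actual measurement.
CANDIDATE_TRAJECTORIES = [
    {"max_mass_kg": 1.0,  "speed": "fast"},
    {"max_mass_kg": 5.0,  "speed": "medium"},
    {"max_mass_kg": 20.0, "speed": "slow"},
]

def select_trajectory(measured_mass_kg):
    """Return the first precomputed trajectory whose measurement
    bound covers the sensed value."""
    for trajectory in CANDIDATE_TRAJECTORIES:
        if measured_mass_kg <= trajectory["max_mass_kg"]:
            return trajectory
    raise ValueError("object exceeds every planned trajectory's bound")

print(select_trajectory(3.2)["speed"])  # medium
```

Precomputing candidates lets the arm start moving immediately after the pick instead of pausing to replan from scratch.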
-
Patent number: 9507995
Abstract: Methods and systems for determining depth information using a combination of stereo and structured-light processing are provided. An example method involves receiving a plurality of images captured with at least two optical sensors, and determining a first depth estimate for at least one surface based on corresponding features between a first image and a second image. Further, the method involves causing a texture projector to project a known texture pattern, and determining, based on the first depth estimate, at least one region of at least one image of the plurality of images within which to search for a particular portion of the known texture pattern. And the method involves determining points corresponding to the particular portion of the known texture pattern within the at least one region, and determining a second depth estimate for the at least one surface based on the determined points corresponding to the known texture pattern.
Type: Grant
Filed: December 29, 2014
Date of Patent: November 29, 2016
Assignee: X Development LLC
Inventors: Kurt Konolige, Ethan Rublee
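The key idea here, using a coarse stereo depth to bound where the projected pattern is searched for, maps directly to a disparity window: a depth estimate plus a tolerance band implies a range of disparities via d = f·B/Z. The function below is a minimal sketch under assumed camera parameters, not the patented search procedure.

```python
def disparity_search_window(first_depth_m, focal_px, baseline_m,
                            tolerance_m=0.05):
    """Convert a coarse stereo depth estimate (plus a tolerance band)
    into the disparity interval within which to search for the
    projected texture pattern."""
    # Nearer surfaces have larger disparity, so the near edge of the
    # tolerance band gives the upper disparity bound.
    d_max = focal_px * baseline_m / (first_depth_m - tolerance_m)
    d_min = focal_px * baseline_m / (first_depth_m + tolerance_m)
    return d_min, d_max

lo, hi = disparity_search_window(2.0, focal_px=600.0, baseline_m=0.1)
print(round(lo, 2), round(hi, 2))
```

Restricting the pattern search to this interval both speeds up matching and reduces false correspondences, which is the practical payoff of running stereo first.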
-
Patent number: 9498887
Abstract: An example two-faced linearly actuated suction gripper includes a first gripping surface having one or more first suction cups arranged to provide suction in a first direction. The suction gripper also includes a second gripping surface comprising one or more second suction cups arranged to provide suction in a second direction which is perpendicular to the first direction. The suction gripper further includes a linear actuator configured to provide movement of the second gripping surface parallel to the second direction towards a face of an object. The suction gripper includes a sensor configured to generate data indicating that the face of the object is adjacent to the second gripping surface; and an engageable brake that, when engaged, stops the movement of the linear actuator in response to the data from the sensor indicating that the second gripping surface is adjacent to the face of the object.
Type: Grant
Filed: July 24, 2014
Date of Patent: November 22, 2016
Assignee: X Development LLC
Inventors: John Zevenbergen, Ethan Rublee, Kurt Konolige, Troy Straszheim
-
Patent number: 9492924
Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
Type: Grant
Filed: June 15, 2016
Date of Patent: November 15, 2016
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
-
Patent number: 9465390
Abstract: Example methods and systems may provide for a system that includes a control system communicatively coupled to a first robotic device and a second robotic device. The control system may identify a collaborative operation to be performed by a first robotic device and a second robotic device that is based on a relative positioning between the first robotic device and the second robotic device. The control system may also determine respective locations of the first robotic device and the second robotic device. The control system may further initiate a movement of the first robotic device along a path from the determined location of the first robotic device towards the determined location of the second robotic device. The first robotic device and the second robotic device may then establish a visual handshake that indicates the relative positioning between the first robotic device and the second robotic device for the collaborative operation.
Type: Grant
Filed: November 11, 2014
Date of Patent: October 11, 2016
Assignee: Google Inc.
Inventors: Julian Mason, Kurt Konolige
-
Publication number: 20160288330
Abstract: Methods and systems for depth sensing are provided. A system includes a first and second optical sensor each including a first plurality of photodetectors configured to capture visible light interspersed with a second plurality of photodetectors configured to capture infrared light within a particular infrared band. The system also includes a computing device configured to (i) identify first corresponding features of the environment between a first visible light image captured by the first optical sensor and a second visible light image captured by the second optical sensor; (ii) identify second corresponding features of the environment between a first infrared light image captured by the first optical sensor and a second infrared light image captured by the second optical sensor; and (iii) determine a depth estimate for at least one surface in the environment based on the first corresponding features and the second corresponding features.
Type: Application
Filed: March 30, 2015
Publication date: October 6, 2016
Inventor: Kurt Konolige
-
Publication number: 20160288324
Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
Type: Application
Filed: June 15, 2016
Publication date: October 6, 2016
Applicant: Industrial Perception, Inc.
Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
-
Patent number: 9457477
Abstract: An example suction gripper is disclosed that includes a contacting pillow including a plurality of particles inside a non-rigid membrane that allow the contacting pillow to conform to a shape of an object when the contacting pillow is pressed against the object, a plurality of suction cups arranged on the non-rigid membrane of the contacting pillow, and a vacuum system coupled to the contacting pillow and to the plurality of suction cups. The vacuum system may be configured to apply suction to the object through at least one of the plurality of suction cups that is in contact with the object when the contacting pillow is pressed against the object and increase stiffness of the contacting pillow by removing air between the plurality of particles inside the non-rigid membrane of the contacting pillow.
Type: Grant
Filed: December 29, 2014
Date of Patent: October 4, 2016
Assignee: Google Inc.
Inventors: Ethan Rublee, John Zevenbergen, Kurt Konolige
-
Publication number: 20160221187
Abstract: Example embodiments may relate to methods and systems for selecting a grasp point on an object. In particular, a robotic manipulator may identify characteristics of a physical object within a physical environment. Based on the identified characteristics, the robotic manipulator may determine potential grasp points on the physical object corresponding to points at which a gripper attached to the robotic manipulator is operable to grip the physical object. Subsequently, the robotic manipulator may determine a motion path for the gripper to follow in order to move the physical object to a drop-off location for the physical object and then select a grasp point, from the potential grasp points, based on the determined motion path. After selecting the grasp point, the robotic manipulator may grip the physical object at the selected grasp point with the gripper and move the physical object through the determined motion path to the drop-off location.
Type: Application
Filed: April 7, 2016
Publication date: August 4, 2016
Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser, Steve Croft, John Zevenbergen
-
Patent number: 9393686
Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
Type: Grant
Filed: March 14, 2014
Date of Patent: July 19, 2016
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
-
Publication number: 20160136808
Abstract: Example systems and methods may be used to determine a trajectory for moving an object using a robotic device. One example method includes determining a plurality of possible trajectories for moving an object with an end effector of a robotic manipulator based on a plurality of possible object measurements. The method may further include causing the robotic manipulator to pick up the object with the end effector. After causing the robotic manipulator to pick up the object with the end effector, the method may also include receiving sensor data from one or more sensors indicative of one or more measurements of the object. Based on the received sensor data, the method may additionally include selecting a trajectory for moving the object from the plurality of possible trajectories. The method may further include causing the robotic manipulator to move the object through the selected trajectory.
Type: Application
Filed: January 26, 2016
Publication date: May 19, 2016
Inventors: Kurt Konolige, Ethan Rublee, Mrinal Kalakrishnan
-
Publication number: 20160132059
Abstract: Example methods and systems may provide for a system that includes a control system communicatively coupled to a first robotic device and a second robotic device. The control system may identify a collaborative operation to be performed by a first robotic device and a second robotic device that is based on a relative positioning between the first robotic device and the second robotic device. The control system may also determine respective locations of the first robotic device and the second robotic device. The control system may further initiate a movement of the first robotic device along a path from the determined location of the first robotic device towards the determined location of the second robotic device. The first robotic device and the second robotic device may then establish a visual handshake that indicates the relative positioning between the first robotic device and the second robotic device for the collaborative operation.
Type: Application
Filed: November 11, 2014
Publication date: May 12, 2016
Inventors: Julian Mason, Kurt Konolige
-
Patent number: 9333649
Abstract: Example embodiments may relate to methods and systems for selecting a grasp point on an object. In particular, a robotic manipulator may identify characteristics of a physical object within a physical environment. Based on the identified characteristics, the robotic manipulator may determine potential grasp points on the physical object corresponding to points at which a gripper attached to the robotic manipulator is operable to grip the physical object. Subsequently, the robotic manipulator may determine a motion path for the gripper to follow in order to move the physical object to a drop-off location for the physical object and then select a grasp point, from the potential grasp points, based on the determined motion path. After selecting the grasp point, the robotic manipulator may grip the physical object at the selected grasp point with the gripper and move the physical object through the determined motion path to the drop-off location.
Type: Grant
Filed: March 14, 2014
Date of Patent: May 10, 2016
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser, Steve Croft, John Zevenbergen
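Selecting a grasp point "based on the determined motion path" amounts to scoring each candidate against the planned path and taking the best. The scoring rule below (prefer the candidate nearest the path's start) is a purely illustrative assumption; the patent does not disclose this particular criterion.

```python
import math

def grasp_score(grasp_xy, path_start_xy):
    """Hypothetical score: prefer grasp points closest to where the
    planned motion path begins (higher is better)."""
    return -math.dist(grasp_xy, path_start_xy)

def select_grasp(candidate_points, path_start_xy):
    """Pick the candidate grasp point with the best score against
    the planned path."""
    return max(candidate_points,
               key=lambda point: grasp_score(point, path_start_xy))

candidates = [(0.0, 0.0), (1.0, 1.0), (0.2, 0.1)]
print(select_grasp(candidates, (0.25, 0.0)))  # (0.2, 0.1)
```

A real system would fold in reachability, gripper clearance, and payload stability along the whole path, but the select-by-score structure stays the same.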
-
Patent number: 9327406
Abstract: One or more images of a physical environment may be received, where the one or more images may include one or more objects. A type of surface feature predicted to be contained on a portion of one or more surfaces of a single object may be determined. Surface features of the type within regions of the one or more images may then be identified. The regions may then be associated to corresponding objects in the physical environment based on the identified surface features. Based at least in part on the regions associated to the corresponding objects, a virtual representation of the physical environment may be determined, the representation including at least one distinct object segmented from a remaining portion of the physical environment so as to virtually distinguish a boundary of the at least one distinct object from boundaries of objects present in the remaining portion of the physical environment.
Type: Grant
Filed: August 19, 2014
Date of Patent: May 3, 2016
Assignee: Google Inc.
Inventors: Stefan Hinterstoisser, Kurt Konolige
-
Publication number: 20160089791
Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining a virtual environment by one or more processors based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects, developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment, causing the robotic manipulator to perform a first action according to the plan, receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action, modifying the virtual environment based on the updated sensor data, determining one or more modifications to the plan based on the modified virtual environment, and causing the robotic manipulator to perform a second action according to the modified plan.
Type: Application
Filed: December 10, 2015
Publication date: March 31, 2016
Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
-
Publication number: 20160084642
Abstract: Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. And based on the determined corresponding features, the computing device may determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
Type: Application
Filed: December 7, 2015
Publication date: March 24, 2016
Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee