Patents by Inventor Kurt Konolige

Kurt Konolige has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9707682
    Abstract: Methods and systems for recognizing machine-readable information on three-dimensional (3D) objects are described. A robotic manipulator may move at least one physical object through a designated area in space. As the at least one physical object is being moved through the designated area, one or more optical sensors may determine a location of a machine-readable code on the at least one physical object and, based on the determined location, scan the machine-readable code so as to determine information associated with the at least one physical object encoded in the machine-readable code. Based on the information associated with the at least one physical object, a computing device may then determine a respective location in a physical environment of the robotic manipulator at which to place the at least one physical object. The robotic manipulator may then be directed to place the at least one physical object at the respective location.
    Type: Grant
    Filed: November 24, 2015
    Date of Patent: July 18, 2017
    Assignee: X Development LLC
    Inventors: Kurt Konolige, Ethan Rublee, Gary Bradski
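The routing step this abstract implies (decode information from a scanned machine-readable code, then choose a placement location) can be sketched as a simple lookup. This is an illustrative stand-in, not the patented method; the class names and destinations below are invented for the example.

```python
# Hypothetical mapping from the class decoded out of an object's
# machine-readable code to a placement location in the workspace.
DESTINATIONS = {
    "fragile": "padded_bin",
    "heavy": "floor_pallet",
    "standard": "conveyor_2",
}

def placement_for(decoded_class):
    """Return the placement location for a decoded object class,
    falling back to manual review for unrecognized codes."""
    return DESTINATIONS.get(decoded_class, "manual_review")

print(placement_for("fragile"))      # padded_bin
print(placement_for("unknown_tag"))  # manual_review
```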
  • Patent number: 9694498
    Abstract: Methods and systems for depth sensing are provided. A system includes a first and second optical sensor each including a first plurality of photodetectors configured to capture visible light interspersed with a second plurality of photodetectors configured to capture infrared light within a particular infrared band. The system also includes a computing device configured to (i) identify first corresponding features of the environment between a first visible light image captured by the first optical sensor and a second visible light image captured by the second optical sensor; (ii) identify second corresponding features of the environment between a first infrared light image captured by the first optical sensor and a second infrared light image captured by the second optical sensor; and (iii) determine a depth estimate for at least one surface in the environment based on the first corresponding features and the second corresponding features.
    Type: Grant
    Filed: March 30, 2015
    Date of Patent: July 4, 2017
    Assignee: X Development LLC
    Inventor: Kurt Konolige
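The depth estimation described above rests on standard stereo triangulation: matched features in two rectified views yield a pixel disparity, and depth follows from Z = f·B/d. The sketch below is a minimal illustration under assumed camera parameters, not the patented system; the simple averaging of visible-light and infrared estimates is likewise a stand-in.

```python
# Illustrative stereo triangulation: depth from matched-feature
# disparity in a rectified pair with known focal length and baseline.

def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Depth (meters) for a feature at pixel columns x_left/x_right:
    Z = f * B / d, where d is the disparity in pixels."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity

def fused_depth(visible_depths, infrared_depths):
    """Combine co-located depth estimates from visible-light and
    infrared feature matches by simple averaging (illustrative)."""
    return [(v + i) / 2.0 for v, i in zip(visible_depths, infrared_depths)]

# Example: 600 px focal length, 10 cm baseline, 12 px disparity -> 5.0 m
print(depth_from_disparity(412, 400, 600.0, 0.1))
```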
  • Patent number: 9630321
    Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining a virtual environment by one or more processors based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects, developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment, causing the robotic manipulator to perform a first action according to the plan, receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action, modifying the virtual environment based on the updated sensor data, determining one or more modifications to the plan based on the modified virtual environment, and causing the robotic manipulator to perform a second action according to the modified plan.
    Type: Grant
    Filed: December 10, 2015
    Date of Patent: April 25, 2017
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
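The sense-plan-act loop this abstract describes (act, re-sense, fold updated data into the virtual environment, recompute the plan) can be sketched as below. All function names, the distance-based planning rule, and the object data are illustrative assumptions, not the claimed implementation.

```python
# Hypothetical dynamic re-planning loop: the virtual environment maps
# object names to a distance-like cost, and the plan moves objects
# nearest-first. Both choices are stand-ins for the example.

def plan_moves(env):
    """Derive an ordered plan from the virtual environment."""
    return sorted(env, key=lambda name: env[name])

def execute_and_replan(env, plan, observed):
    """Perform the first planned action, then modify the virtual
    environment from updated sensing and recompute the plan."""
    moved = plan[0]
    env = {k: v for k, v in env.items() if k != moved}
    env.update(observed)           # fold in updated sensor data
    return env, plan_moves(env)    # modified plan

env = {"box_a": 1.0, "box_b": 2.0}
plan = plan_moves(env)                                  # ["box_a", "box_b"]
env, plan = execute_and_replan(env, plan, {"box_c": 0.5})
# a newly sensed nearer object now comes first in the modified plan
```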
  • Patent number: 9630320
    Abstract: Methods and systems for detecting and reconstructing environments to facilitate robotic interaction with such environments are described. An example method may involve determining a three-dimensional (3D) virtual environment representative of a physical environment of the robotic manipulator including a plurality of 3D virtual objects corresponding to respective physical objects in the physical environment. The method may then involve determining two-dimensional (2D) images of the virtual environment including 2D depth maps. The method may then involve determining portions of the 2D images that correspond to a given one or more physical objects. The method may then involve determining, based on the portions and the 2D depth maps, 3D models corresponding to the portions. The method may then involve, based on the 3D models, selecting a physical object from the given one or more physical objects. The method may then involve providing an instruction to the robotic manipulator to move that object.
    Type: Grant
    Filed: July 7, 2015
    Date of Patent: April 25, 2017
    Assignee: Industrial Perception, Inc.
    Inventors: Kurt Konolige, Ethan Rublee, Stefan Hinterstoisser, Troy Straszheim, Gary Bradski, Hauke Malte Strasdat
  • Patent number: 9630316
    Abstract: Example systems and methods may be used to determine a trajectory for moving an object using a robotic device. One example method includes determining a plurality of possible trajectories for moving an object with an end effector of a robotic manipulator based on a plurality of possible object measurements. The method may further include causing the robotic manipulator to pick up the object with the end effector. After causing the robotic manipulator to pick up the object with the end effector, the method may also include receiving sensor data from one or more sensors indicative of one or more measurements of the object. Based on the received sensor data, the method may additionally include selecting a trajectory for moving the object from the plurality of possible trajectories. The method may further include causing the robotic manipulator to move the object through the selected trajectory.
    Type: Grant
    Filed: January 26, 2016
    Date of Patent: April 25, 2017
    Assignee: X Development LLC
    Inventors: Kurt Konolige, Ethan Rublee, Mrinal Kalakrishnan
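The idea of precomputing trajectories for a range of possible object measurements and selecting one after sensing can be illustrated with a simple mass-binned lookup. The mass bins and trajectory labels below are invented for the example and are not from the patent.

```python
# Illustrative trajectory selection: candidate trajectories are
# precomputed for bands of possible object mass, and the measured
# mass picks one after the object is lifted.
CANDIDATES = [
    (0.5, "fast_arc"),    # objects up to 0.5 kg
    (2.0, "medium_arc"),  # up to 2 kg
    (10.0, "slow_arc"),   # up to 10 kg
]

def select_trajectory(measured_mass_kg):
    """Return the precomputed trajectory whose mass bound first
    covers the measured mass."""
    for max_mass, trajectory in CANDIDATES:
        if measured_mass_kg <= max_mass:
            return trajectory
    raise ValueError("object too heavy for any candidate trajectory")

print(select_trajectory(1.2))  # medium_arc
```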
  • Patent number: 9507995
    Abstract: Methods and systems for determining depth information using a combination of stereo and structured-light processing are provided. An example method involves receiving a plurality of images captured with at least two optical sensors, and determining a first depth estimate for at least one surface based on corresponding features between a first image and a second image. Further, the method involves causing a texture projector to project a known texture pattern, and determining, based on the first depth estimate, at least one region of at least one image of the plurality of images within which to search for a particular portion of the known texture pattern. And the method involves determining points corresponding to the particular portion of the known texture pattern within the at least one region, and determining a second depth estimate for the at least one surface based on the determined points corresponding to the known texture pattern.
    Type: Grant
    Filed: December 29, 2014
    Date of Patent: November 29, 2016
    Assignee: X Development LLC
    Inventors: Kurt Konolige, Ethan Rublee
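The key step in combining stereo with structured light, as described above, is using the coarse stereo depth to bound where the projected pattern is searched for. A minimal sketch, assuming rectified cameras and illustrative parameters (the window margin and camera numbers are not from the patent):

```python
# Use a first (stereo) depth estimate to predict the disparity of the
# projected texture pattern, and search only a window around it.

def expected_disparity(depth_m, focal_px, baseline_m):
    """Disparity implied by a depth estimate: d = f * B / Z."""
    return focal_px * baseline_m / depth_m

def search_window(depth_m, focal_px, baseline_m, margin_px=8):
    """Pixel-disparity interval to scan for the known pattern,
    centered on the disparity predicted by the coarse estimate."""
    d = expected_disparity(depth_m, focal_px, baseline_m)
    return (d - margin_px, d + margin_px)

# Coarse stereo says the surface is ~2 m away; predicted disparity is
# 30 px, so the refined search covers only disparities 22..38 px.
lo, hi = search_window(2.0, 600.0, 0.1)
```

Restricting the correspondence search this way is what makes the second, pattern-based depth estimate cheap and robust relative to an unconstrained search.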
  • Patent number: 9498887
    Abstract: An example two-faced linearly actuated suction gripper includes a first gripping surface having one or more first suction cups arranged to provide suction in a first direction. The suction gripper also includes a second gripping surface comprising one or more second suction cups arranged to provide suction in a second direction which is perpendicular to the first direction. The suction gripper further includes a linear actuator configured to provide movement of the second gripping surface parallel to the second direction towards a face of an object. The suction gripper includes a sensor configured to generate data indicating that the face of the object is adjacent to the second gripping surface; and an engageable brake that, when engaged, stops the movement of the linear actuator in response to the data from the sensor indicating that the second gripping surface is adjacent to the face of the object.
    Type: Grant
    Filed: July 24, 2014
    Date of Patent: November 22, 2016
    Assignee: X Development LLC
    Inventors: John Zevenbergen, Ethan Rublee, Kurt Konolige, Troy Straszheim
  • Patent number: 9492924
    Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
    Type: Grant
    Filed: June 15, 2016
    Date of Patent: November 15, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
  • Patent number: 9465390
    Abstract: Example methods and systems may provide for a system that includes a control system communicatively coupled to a first robotic device and a second robotic device. The control system may identify a collaborative operation to be performed by a first robotic device and a second robotic device that is based on a relative positioning between the first robotic device and the second robotic device. The control system may also determine respective locations of the first robotic device and the second robotic device. The control system may further initiate a movement of the first robotic device along a path from the determined location of the first robotic device towards the determined location of the second robotic device. The first robotic device and the second robotic device may then establish a visual handshake that indicates the relative positioning between the first robotic device and the second robotic device for the collaborative operation.
    Type: Grant
    Filed: November 11, 2014
    Date of Patent: October 11, 2016
    Assignee: Google Inc.
    Inventors: Julian Mason, Kurt Konolige
  • Publication number: 20160288330
    Abstract: Methods and systems for depth sensing are provided. A system includes a first and second optical sensor each including a first plurality of photodetectors configured to capture visible light interspersed with a second plurality of photodetectors configured to capture infrared light within a particular infrared band. The system also includes a computing device configured to (i) identify first corresponding features of the environment between a first visible light image captured by the first optical sensor and a second visible light image captured by the second optical sensor; (ii) identify second corresponding features of the environment between a first infrared light image captured by the first optical sensor and a second infrared light image captured by the second optical sensor; and (iii) determine a depth estimate for at least one surface in the environment based on the first corresponding features and the second corresponding features.
    Type: Application
    Filed: March 30, 2015
    Publication date: October 6, 2016
    Inventor: Kurt Konolige
  • Publication number: 20160288324
    Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
    Type: Application
    Filed: June 15, 2016
    Publication date: October 6, 2016
    Applicant: Industrial Perception, Inc.
    Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
  • Patent number: 9457477
    Abstract: An example suction gripper is disclosed that includes a contacting pillow including a plurality of particles inside a non-rigid membrane that allow the contacting pillow to conform to a shape of an object when the contacting pillow is pressed against the object, a plurality of suction cups arranged on the non-rigid membrane of the contacting pillow, and a vacuum system coupled to the contacting pillow and to the plurality of suction cups. The vacuum system may be configured to apply suction to the object through at least one of the plurality of suction cups that is in contact with the object when the contacting pillow is pressed against the object and increase stiffness of the contacting pillow by removing air between the plurality of particles inside the non-rigid membrane of the contacting pillow.
    Type: Grant
    Filed: December 29, 2014
    Date of Patent: October 4, 2016
    Assignee: Google Inc.
    Inventors: Ethan Rublee, John Zevenbergen, Kurt Konolige
  • Publication number: 20160221187
    Abstract: Example embodiments may relate to methods and systems for selecting a grasp point on an object. In particular, a robotic manipulator may identify characteristics of a physical object within a physical environment. Based on the identified characteristics, the robotic manipulator may determine potential grasp points on the physical object corresponding to points at which a gripper attached to the robotic manipulator is operable to grip the physical object. Subsequently, the robotic manipulator may determine a motion path for the gripper to follow in order to move the physical object to a drop-off location for the physical object and then select a grasp point, from the potential grasp points, based on the determined motion path. After selecting the grasp point, the robotic manipulator may grip the physical object at the selected grasp point with the gripper and move the physical object through the determined motion path to the drop-off location.
    Type: Application
    Filed: April 7, 2016
    Publication date: August 4, 2016
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser, Steve Croft, John Zevenbergen
  • Patent number: 9393686
    Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: July 19, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
  • Publication number: 20160136808
    Abstract: Example systems and methods may be used to determine a trajectory for moving an object using a robotic device. One example method includes determining a plurality of possible trajectories for moving an object with an end effector of a robotic manipulator based on a plurality of possible object measurements. The method may further include causing the robotic manipulator to pick up the object with the end effector. After causing the robotic manipulator to pick up the object with the end effector, the method may also include receiving sensor data from one or more sensors indicative of one or more measurements of the object. Based on the received sensor data, the method may additionally include selecting a trajectory for moving the object from the plurality of possible trajectories. The method may further include causing the robotic manipulator to move the object through the selected trajectory.
    Type: Application
    Filed: January 26, 2016
    Publication date: May 19, 2016
    Inventors: Kurt Konolige, Ethan Rublee, Mrinal Kalakrishnan
  • Publication number: 20160132059
    Abstract: Example methods and systems may provide for a system that includes a control system communicatively coupled to a first robotic device and a second robotic device. The control system may identify a collaborative operation to be performed by a first robotic device and a second robotic device that is based on a relative positioning between the first robotic device and the second robotic device. The control system may also determine respective locations of the first robotic device and the second robotic device. The control system may further initiate a movement of the first robotic device along a path from the determined location of the first robotic device towards the determined location of the second robotic device. The first robotic device and the second robotic device may then establish a visual handshake that indicates the relative positioning between the first robotic device and the second robotic device for the collaborative operation.
    Type: Application
    Filed: November 11, 2014
    Publication date: May 12, 2016
    Inventors: Julian Mason, Kurt Konolige
  • Patent number: 9333649
    Abstract: Example embodiments may relate to methods and systems for selecting a grasp point on an object. In particular, a robotic manipulator may identify characteristics of a physical object within a physical environment. Based on the identified characteristics, the robotic manipulator may determine potential grasp points on the physical object corresponding to points at which a gripper attached to the robotic manipulator is operable to grip the physical object. Subsequently, the robotic manipulator may determine a motion path for the gripper to follow in order to move the physical object to a drop-off location for the physical object and then select a grasp point, from the potential grasp points, based on the determined motion path. After selecting the grasp point, the robotic manipulator may grip the physical object at the selected grasp point with the gripper and move the physical object through the determined motion path to the drop-off location.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: May 10, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser, Steve Croft, John Zevenbergen
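Selecting a grasp point based on the determined motion path, as this abstract describes, can be illustrated by scoring each candidate by a path cost to the drop-off location. The 2-D geometry and straight-line cost below are stand-ins for a real motion planner, not the patented method.

```python
# Illustrative grasp-point choice: among candidate grasp points,
# pick the one whose motion path to the drop-off is cheapest.
import math

def path_cost(grasp, drop_off):
    """Straight-line stand-in for the planned motion path cost."""
    return math.dist(grasp, drop_off)

def select_grasp_point(candidates, drop_off):
    """Return the candidate grasp point with the cheapest path."""
    return min(candidates, key=lambda g: path_cost(g, drop_off))

candidates = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
print(select_grasp_point(candidates, (1.0, 1.0)))  # (0.5, 0.5)
```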
  • Patent number: 9327406
    Abstract: One or more images of a physical environment may be received, where the one or more images may include one or more objects. A type of surface feature predicted to be contained on a portion of one or more surfaces of a single object may be determined. Surface features of the type within regions of the one or more images may then be identified. The regions may then be associated to corresponding objects in the physical environment based on the identified surface features. Based at least in part on the regions associated to the corresponding objects, a virtual representation of the physical environment may be determined, the representation including at least one distinct object segmented from a remaining portion of the physical environment so as to virtually distinguish a boundary of the at least one distinct object from boundaries of objects present in the remaining portion of the physical environment.
    Type: Grant
    Filed: August 19, 2014
    Date of Patent: May 3, 2016
    Assignee: Google Inc.
    Inventors: Stefan Hinterstoisser, Kurt Konolige
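The segmentation idea above (associate image regions to objects by the type of surface feature they carry) can be sketched as a filter over detected regions. The region records and feature labels are invented for the example; a real system would match richer descriptors.

```python
# Illustrative region-to-object association: regions carrying the
# surface-feature type predicted for objects are kept as object
# candidates; the rest are treated as background.

def segment(regions, expected_type):
    """Split regions into object candidates (matching the expected
    surface-feature type) and background (everything else)."""
    objects = [r for r in regions if r["feature"] == expected_type]
    background = [r for r in regions if r["feature"] != expected_type]
    return objects, background

regions = [
    {"id": 1, "feature": "barcode"},
    {"id": 2, "feature": "plain"},
    {"id": 3, "feature": "barcode"},
]
objs, bg = segment(regions, "barcode")  # regions 1 and 3 are objects
```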
  • Publication number: 20160089791
    Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining a virtual environment by one or more processors based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects, developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment, causing the robotic manipulator to perform a first action according to the plan, receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action, modifying the virtual environment based on the updated sensor data, determining one or more modifications to the plan based on the modified virtual environment, and causing the robotic manipulator to perform a second action according to the modified plan.
    Type: Application
    Filed: December 10, 2015
    Publication date: March 31, 2016
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
  • Publication number: 20160084642
    Abstract: Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. And based on the determined corresponding features, the computing device may determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
    Type: Application
    Filed: December 7, 2015
    Publication date: March 24, 2016
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee