Patents by Inventor Ethan Rublee

Ethan Rublee has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9233470
    Abstract: Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. And based on the determined corresponding features, the computing device may determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: January 12, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee
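The abstract above describes recovering depth by finding corresponding features between two viewpoints of a scene onto which random texture has been projected. The general idea can be sketched with brute-force block matching: the projected random texture gives every patch a distinctive appearance, so the horizontal shift (disparity) that best aligns a patch between the two views can be found unambiguously, and depth follows from disparity via the stereo baseline. This is a minimal illustrative sketch of that standard stereo-correspondence technique, not the patented method itself; the function names and parameters are invented for the example.

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=16):
    """Brute-force SAD block matching: for each interior pixel of the left
    view, find the horizontal shift of the right view that minimizes the
    sum of absolute differences over a surrounding block. Projected random
    texture makes these matches distinctive even on featureless surfaces."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [
                np.abs(patch - right[y - half:y + half + 1,
                                     x - d - half:x - d + half + 1]).sum()
                for d in range(max_disp)
            ]
            disp[y, x] = int(np.argmin(costs))
    return disp

def disparity_to_depth(disp, focal_px, baseline_m):
    """Convert disparity to metric depth via depth = f * B / d,
    leaving unmatched pixels (zero disparity) at depth 0."""
    return np.where(disp > 0,
                    focal_px * baseline_m / np.maximum(disp, 1e-6),
                    0.0)
```

In practice a production system would use an optimized matcher and calibrated, rectified cameras; the two-wavelength projection in the abstract lets each camera see its own texture pattern without interference.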
  • Patent number: 9227323
    Abstract: Methods and systems for recognizing machine-readable information on three-dimensional (3D) objects are described. A robotic manipulator may move at least one physical object through a designated area in space. As the at least one physical object is being moved through the designated area, one or more optical sensors may determine a location of a machine-readable code on the at least one physical object and, based on the determined location, scan the machine-readable code so as to determine information associated with the at least one physical object encoded in the machine-readable code. Based on the information associated with the at least one physical object, a computing device may then determine a respective location in a physical environment of the robotic manipulator at which to place the at least one physical object. The robotic manipulator may then be directed to place the at least one physical object at the respective location.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: January 5, 2016
    Assignee: Google Inc.
    Inventors: Kurt Konolige, Ethan Rublee, Gary Bradski
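The final step the abstract describes — using the information decoded from a machine-readable code to choose where the robotic manipulator should place the object — amounts to a lookup from decoded content to a placement location. The sketch below shows one plausible shape for that step; the `ScanEvent` type, the routing table, and all names are hypothetical illustrations, not anything specified in the patent.

```python
from dataclasses import dataclass

@dataclass
class ScanEvent:
    """One machine-readable code decoded while an object is moved
    through the designated scan area."""
    object_id: int
    code: str

# Hypothetical routing table: decoded code prefixes -> placement locations.
ROUTES = {
    "FRG": "padded_bin",
    "CLD": "chilled_bin",
}

def placement_for(event, routes=ROUTES, default="general_bin"):
    """Map the information encoded in the scanned code to a location
    in the physical environment at which to place the object."""
    for prefix, location in routes.items():
        if event.code.startswith(prefix):
            return location
    return default
```

The interesting part of the claimed system is upstream of this lookup: locating the code on a moving object with optical sensors and scanning it mid-transfer, so the robot never has to pause to present the code to a fixed scanner.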
  • Patent number: 9205558
    Abstract: Example embodiments may provide for control of a suction gripper with multiple suction cups. One example system includes a suction gripper and a control system. The suction gripper may include a vacuum pump, a plurality of suction cups coupled to the vacuum pump, and a plurality of sensors corresponding to the suction cups, where a sensor is positioned between the vacuum pump and a suction cup and measures a vacuum pressure of the suction cup. The control system may be configured to activate the vacuum pump to cause the suction gripper to apply suction to an object through one or more active suction cups, receive sensor data indicative of the vacuum pressure of the one or more active suction cups from the corresponding sensors, identify at least one suction cup to deactivate from the one or more active suction cups, and deactivate the at least one identified suction cup.
    Type: Grant
    Filed: July 16, 2014
    Date of Patent: December 8, 2015
    Assignee: Google Inc.
    Inventors: John Zevenbergen, Ethan Rublee
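The control loop the abstract describes — read per-cup vacuum sensors, identify cups to deactivate, and shut them off — can be sketched as a small state machine. A cup whose measured vacuum stays below a seal threshold is likely leaking (not sealed against the object) and wastes pump capacity, so it is a natural candidate for deactivation. The class, threshold, and pressure convention below are assumptions for illustration only.

```python
class SuctionGripper:
    """Minimal sketch of per-cup suction control: each cup has a sensor
    between it and the vacuum pump, and cups with weak vacuum readings
    (poor seal) are deactivated so the pump's capacity is concentrated
    on the cups that are actually gripping the object."""

    def __init__(self, num_cups, seal_threshold):
        self.active = set(range(num_cups))
        self.seal_threshold = seal_threshold  # minimum acceptable vacuum level

    def update(self, pressures):
        """Given one vacuum reading per cup (higher = stronger vacuum),
        deactivate leaking active cups and return the set shut off."""
        leaking = {i for i in self.active
                   if pressures[i] < self.seal_threshold}
        self.active -= leaking
        return leaking
```

A usage example: with four cups and readings `[0.9, 0.2, 0.8, 0.1]` against a threshold of `0.5`, cups 1 and 3 are deactivated and cups 0 and 2 remain active.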
  • Patent number: 9102055
    Abstract: Methods and systems for detecting and reconstructing environments to facilitate robotic interaction with such environments are described. An example method may involve determining a three-dimensional (3D) virtual environment representative of a physical environment of the robotic manipulator including a plurality of 3D virtual objects corresponding to respective physical objects in the physical environment. The method may then involve determining two-dimensional (2D) images of the virtual environment including 2D depth maps. The method may then involve determining portions of the 2D images that correspond to a given one or more physical objects. The method may then involve determining, based on the portions and the 2D depth maps, 3D models corresponding to the portions. The method may then involve, based on the 3D models, selecting a physical object from the given one or more physical objects. The method may then involve providing an instruction to the robotic manipulator to move that object.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: August 11, 2015
    Assignee: Industrial Perception, Inc.
    Inventors: Kurt Konolige, Ethan Rublee, Stefan Hinterstoisser, Troy Straszheim, Gary Bradski, Hauke Strasdat
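The selection step in the abstract — choosing which physical object the manipulator should move, based on per-object 3D models derived from 2D images and depth maps — needs some scoring rule. One common heuristic for bin picking, used here purely as an illustrative stand-in (the patent does not specify this rule), is to pick the object whose surface is closest to the depth sensor, i.e. the one on top of the pile and least occluded.

```python
import numpy as np

def select_object(depth_map, object_masks):
    """Given a 2D depth map and one boolean mask per candidate object
    (the portions of the image corresponding to each physical object),
    return the index of the object with the smallest mean depth --
    the candidate nearest the sensor, hence easiest to grasp first."""
    mean_depths = [float(depth_map[mask].mean()) for mask in object_masks]
    return int(np.argmin(mean_depths))
```

In the full pipeline described above, the chosen index would select the corresponding 3D model, from which a grasp pose and a move instruction for the robotic manipulator are derived.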