Patents Assigned to Perception, Inc.
  • Patent number: 11658818
    Abstract: Anonymizing systems and methods comprising a native configurations database including a set of configurations, a key management database including a plurality of private keys, a processor in communication with the native configurations database and the key management database, and a memory coupled to the processor. The set of configurations includes one or more ranges, wherein each range includes a contiguous sequence comprised of IP addresses, port numbers, or IP addresses and port numbers. The processor is configured to retrieve the set of configurations from the native configurations database, wherein the set of configurations includes a plurality of objects; retrieve a private key from the key management database; assign a unique cryptographically secure identity to each object; and anonymize the plurality of objects based on the cryptographically secure identities and the private key.
    Type: Grant
    Filed: January 25, 2021
    Date of Patent: May 23, 2023
    Assignee: Network Perception, Inc.
    Inventor: David M. Nicol
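The keyed, deterministic pseudonymization this abstract describes can be illustrated with a small sketch. This is not the patented method; the HMAC construction and all names here are illustrative assumptions:

```python
import hashlib
import hmac

def anonymize_objects(objects, private_key):
    """Assign each configuration object (an IP address or port number,
    given as a string) a cryptographically secure identity keyed on a
    private key.  The mapping is deterministic, so cross-references
    inside a configuration survive anonymization, but it cannot be
    inverted without the key."""
    mapping = {}
    for obj in objects:
        identity = hmac.new(private_key, obj.encode(), hashlib.sha256)
        mapping[obj] = identity.hexdigest()[:16]  # truncated for display
    return mapping

key = b"example-private-key"
anon = anonymize_objects(["10.0.0.1", "10.0.0.2", "443"], key)
```

Under the same key the same object always maps to the same pseudonym, while a different key yields an unrelated mapping.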
  • Patent number: 11568511
    Abstract: A sensing and computing system and method for capturing images and data regarding an object and calculating one or more parameters regarding the object using an internal, integrated CPU/GPU. The system comprises an imaging system, including a depth imaging system, color camera, and light source, that captures images of the object and sends data or signals relating to the images to the CPU/GPU, which performs calculations based on those signals/data according to pre-programmed algorithms to determine the parameters. The CPU/GPU and imaging system are contained within a protective housing. The CPU/GPU transmits information regarding the parameters, rather than raw data/signals, to one or more external devices to perform tasks in an industrial environment related to the object imaged.
    Type: Grant
    Filed: January 27, 2021
    Date of Patent: January 31, 2023
    Assignee: Cloud 9 Perception, Inc.
    Inventors: Christopher D. McMurrough, James Francis Staud
  • Patent number: 10903998
    Abstract: Anonymizing systems and methods comprising a native configurations database including a set of configurations, a key management database including a plurality of private keys, a processor in communication with the native configurations database and the key management database, and a memory coupled to the processor. The set of configurations includes one or more ranges, wherein each range includes a contiguous sequence comprised of IP addresses, port numbers, or IP addresses and port numbers. The processor is configured to retrieve the set of configurations from the native configurations database, wherein the set of configurations includes a plurality of objects; retrieve a private key from the key management database; assign a unique cryptographically secure identity to each object; and anonymize the plurality of objects based on the cryptographically secure identities and the private key.
    Type: Grant
    Filed: October 15, 2018
    Date of Patent: January 26, 2021
    Assignee: Network Perception, Inc.
    Inventor: David M. Nicol
  • Patent number: 10055892
    Abstract: Some augmented reality (AR) and virtual reality (VR) applications may require that an “activity region” be defined prior to their use. For example, a user running a video conferencing application or playing a game may need to identify an appropriate space in which they may walk and gesture while wearing a Head Mounted Display without causing injury. This may be particularly important in VR applications where, e.g., the user's vision is completely obscured by the VR display, and/or the user will not see their actual environment as the user moves around. Various embodiments provide systems and methods for anticipating, defining, and applying the activity region. In some embodiments, the system may represent real-world obstacles to the user in the user's field of view, e.g., outlining the contour of the problematic object to call the user's attention to the object's presence in the activity region.
    Type: Grant
    Filed: January 13, 2017
    Date of Patent: August 21, 2018
    Assignee: Eonite Perception Inc.
    Inventors: Anna Petrovskaya, Peter Varvak, Anton Geraschenko, Dylan Koenig, Youssri Helmy
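As a toy illustration of flagging obstacles that intrude into a defined region, here is an axis-aligned sketch under assumed names; the patent's contour-outlining approach is not reproduced:

```python
def obstacles_in_region(region_min, region_max, obstacles):
    """Return the obstacle points that fall inside an axis-aligned
    activity region, i.e. the points a system like the one described
    would need to highlight in the user's field of view."""
    def inside(point):
        return all(lo <= c <= hi
                   for lo, c, hi in zip(region_min, point, region_max))
    return [p for p in obstacles if inside(p)]

# Region is a 4 m x 3 m x 2.5 m box anchored at the origin:
flagged = obstacles_in_region((0, 0, 0), (4, 3, 2.5),
                              [(1, 1, 1), (5, 1, 1), (2, 2, 2)])
```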
  • Patent number: 10043319
    Abstract: While many augmented reality systems provide “see-through” transparent or translucent displays upon which to project virtual objects, many virtual reality systems instead employ opaque, enclosed screens. Indeed, eliminating the user's perception of the real-world may be integral to some successful virtual reality experiences. Thus, head mounted displays designed exclusively for virtual reality experiences may not be easily repurposed to capture significant portions of the augmented reality market. Various of the disclosed embodiments facilitate the repurposing of a virtual reality device for augmented reality use. Particularly, by anticipating user head motion, embodiments may facilitate scene renderings better aligned with user expectations than naïve renderings generated within the enclosed field of view. In some embodiments, the system may use procedural mapping methods to generate a virtual model of the environment. The system may then use this model to supplement the anticipatory rendering.
    Type: Grant
    Filed: January 13, 2017
    Date of Patent: August 7, 2018
    Assignee: Eonite Perception Inc.
    Inventors: Anna Petrovskaya, Peter Varvak
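The "anticipating user head motion" idea can be sketched as constant-angular-velocity extrapolation over the display latency. This is a simplifying assumption for illustration; the abstract does not disclose a particular predictor:

```python
def predict_yaw(yaw_prev, yaw_now, dt, latency):
    """Extrapolate head yaw (radians) to the moment the frame will be
    displayed, assuming the angular velocity observed over the last dt
    seconds holds for the next `latency` seconds."""
    omega = (yaw_now - yaw_prev) / dt    # observed angular velocity
    return yaw_now + omega * latency     # pose to render the scene for

# Head turned 0.1 rad over the last 10 ms; predict 20 ms ahead:
future_yaw = predict_yaw(0.0, 0.1, dt=0.010, latency=0.020)
```

Rendering for the predicted pose rather than the last measured one reduces the perceived lag between head motion and scene update.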
  • Patent number: 9972137
    Abstract: Various of the disclosed embodiments provide systems and methods for acquiring and applying a depth determination of an environment in e.g., various augmented reality applications. A user may passively or actively scan a device (e.g., a tablet device, a mobile phone device, etc.) about the environment acquiring depth data for various regions. The system may integrate these scans into an internal three-dimensional model. This model may then be used in conjunction with subsequent data acquisitions to determine a device's location and orientation within the environment with high fidelity. In some embodiments, these determinations may be accomplished in real-time or near-real-time. Using the high-fidelity orientation and position determination, various augmented reality applications may then be possible using the same device used to acquire the depth data or a new device.
    Type: Grant
    Filed: July 14, 2017
    Date of Patent: May 15, 2018
    Assignee: Eonite Perception Inc.
    Inventors: Anna Petrovskaya, Peter Varvak
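Integrating successive depth scans into an internal three-dimensional model can be sketched as accumulating points into a sparse voxel set. This is an illustrative stand-in, with assumed names, for the model the abstract describes:

```python
def integrate_scan(model, scan, resolution=0.05):
    """Fold one depth scan (a list of 3D points in world coordinates)
    into a sparse voxel-set model.  Points that revisit a region map to
    the same voxel, so overlapping scans merge naturally."""
    for x, y, z in scan:
        model.add((int(x // resolution),
                   int(y // resolution),
                   int(z // resolution)))
    return model

model = set()
integrate_scan(model, [(0.01, 0.01, 0.01), (0.30, 0.0, 0.0)])
integrate_scan(model, [(0.02, 0.02, 0.02)])  # lands in an existing voxel
```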
  • Patent number: 9916002
    Abstract: Augmented and virtual reality systems are becoming increasingly popular. Unfortunately, their potential for social interaction is difficult to realize with existing techniques. Various of the disclosed embodiments facilitate social augmented and virtual reality experiences using, e.g., topologies connecting disparate device types, shared-environments, messaging systems, virtual object placements, etc. Some embodiments employ pose-search systems and methods that provide more granular pose determinations than were previously possible. Such granularity may facilitate functionality that would otherwise be difficult or impossible to achieve.
    Type: Grant
    Filed: February 25, 2016
    Date of Patent: March 13, 2018
    Assignee: Eonite Perception Inc.
    Inventors: Anna Petrovskaya, Peter Varvak
  • Patent number: 9754419
    Abstract: Various of the disclosed embodiments provide systems and methods for acquiring and applying a depth determination of an environment in e.g., various augmented reality applications. A user may passively or actively scan a device (e.g., a tablet device, a mobile phone device, etc.) about the environment acquiring depth data for various regions. The system may integrate these scans into an internal three-dimensional model. This model may then be used in conjunction with subsequent data acquisitions to determine a device's location and orientation within the environment with high fidelity. In some embodiments, these determinations may be accomplished in real-time or near-real-time. Using the high-fidelity orientation and position determination, various augmented reality applications may then be possible using the same device used to acquire the depth data or a new device.
    Type: Grant
    Filed: November 13, 2015
    Date of Patent: September 5, 2017
    Assignee: Eonite Perception Inc.
    Inventors: Anna Petrovskaya, Peter Varvak
  • Patent number: 9630320
    Abstract: Methods and systems for detecting and reconstructing environments to facilitate robotic interaction with such environments are described. An example method may involve determining a three-dimensional (3D) virtual environment representative of a physical environment of the robotic manipulator including a plurality of 3D virtual objects corresponding to respective physical objects in the physical environment. The method may then involve determining two-dimensional (2D) images of the virtual environment including 2D depth maps. The method may then involve determining portions of the 2D images that correspond to a given one or more physical objects. The method may then involve determining, based on the portions and the 2D depth maps, 3D models corresponding to the portions. The method may then involve, based on the 3D models, selecting a physical object from the given one or more physical objects. The method may then involve providing an instruction to the robotic manipulator to move that object.
    Type: Grant
    Filed: July 7, 2015
    Date of Patent: April 25, 2017
    Assignee: Industrial Perception, Inc.
    Inventors: Kurt Konolige, Ethan Rublee, Stefan Hinterstoisser, Troy Straszheim, Gary Bradski, Hauke Malte Strasdat
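The 2D depth maps mentioned in this abstract can be sketched as a nearest-depth projection of 3D samples already expressed in pixel coordinates. This is a simplified model; the abstract does not spell out the rendering pipeline:

```python
def depth_map(points, width, height):
    """Build a 2D depth map from (u, v, z) samples: each pixel keeps
    the depth of the nearest sample projecting onto it; pixels with no
    sample stay None."""
    dmap = [[None] * width for _ in range(height)]
    for u, v, z in points:
        if 0 <= u < width and 0 <= v < height:
            if dmap[v][u] is None or z < dmap[v][u]:
                dmap[v][u] = z  # keep the nearer (occluding) surface
    return dmap

dm = depth_map([(1, 0, 2.0), (1, 0, 1.5), (3, 1, 0.9)], width=4, height=2)
```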
  • Patent number: 9630321
    Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining a virtual environment by one or more processors based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects, developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment, causing the robotic manipulator to perform a first action according to the plan, receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action, modifying the virtual environment based on the updated sensor data, determining one or more modifications to the plan based on the modified virtual environment, and causing the robotic manipulator to perform a second action according to the modified plan.
    Type: Grant
    Filed: December 10, 2015
    Date of Patent: April 25, 2017
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
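The sense-plan-act-replan cycle this abstract walks through can be sketched as a loop; the sensing, planning, and actuation functions below are toy stand-ins:

```python
def control_loop(sense, plan, act, max_steps):
    """Repeatedly execute the next planned action, re-sense the
    environment, and rebuild the plan from the updated model."""
    current_plan = plan(sense())
    for _ in range(max_steps):
        if not current_plan:
            break
        act(current_plan[0])          # perform the first planned action
        current_plan = plan(sense())  # modify the plan from new data

# Toy environment: a pile of boxes the manipulator moves one at a time.
env = {"boxes": ["a", "b", "c"]}
sense = lambda: list(env["boxes"])
plan = lambda world: [("move", b) for b in world]
act = lambda action: env["boxes"].remove(action[1])
control_loop(sense, plan, act, max_steps=10)
```

Replanning after every action lets the loop absorb surprises (a box that shifted, a failed grasp) instead of blindly executing a stale plan.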
  • Patent number: 9492924
    Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
    Type: Grant
    Filed: June 15, 2016
    Date of Patent: November 15, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
  • Publication number: 20160288324
    Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
    Type: Application
    Filed: June 15, 2016
    Publication date: October 6, 2016
    Applicant: Industrial Perception, Inc.
    Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
  • Patent number: 9393686
    Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: July 19, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
  • Patent number: 9333649
    Abstract: Example embodiments may relate to methods and systems for selecting a grasp point on an object. In particular, a robotic manipulator may identify characteristics of a physical object within a physical environment. Based on the identified characteristics, the robotic manipulator may determine potential grasp points on the physical object corresponding to points at which a gripper attached to the robotic manipulator is operable to grip the physical object. Subsequently, the robotic manipulator may determine a motion path for the gripper to follow in order to move the physical object to a drop-off location for the physical object and then select a grasp point, from the potential grasp points, based on the determined motion path. After selecting the grasp point, the robotic manipulator may grip the physical object at the selected grasp point with the gripper and move the physical object through the determined motion path to the drop-off location.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: May 10, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser, Steve Croft, John Zevenbergen
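A minimal sketch of path-based grasp selection, assuming straight-line paths and Euclidean cost (an illustration, not the patented criterion):

```python
import math

def select_grasp_point(candidates, dropoff):
    """From the potential grasp points, pick the one whose straight-line
    motion path to the drop-off location is shortest."""
    return min(candidates, key=lambda p: math.dist(p, dropoff))

best = select_grasp_point([(0.0, 0.0), (3.0, 4.0), (1.0, 1.0)],
                          dropoff=(2.0, 2.0))
```

A real planner would also score reachability and collision clearance along each path, but the structure (enumerate candidates, cost each motion path, take the minimum) is the same.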
  • Patent number: 9238304
    Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining a virtual environment by one or more processors based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects, developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment, causing the robotic manipulator to perform a first action according to the plan, receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action, modifying the virtual environment based on the updated sensor data, determining one or more modifications to the plan based on the modified virtual environment, and causing the robotic manipulator to perform a second action according to the modified plan.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: January 19, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
  • Patent number: 9233470
    Abstract: Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. And based on the determined corresponding features, the computing device may determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: January 12, 2016
    Assignee: Industrial Perception, Inc.
    Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee
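Once corresponding features are matched between the two viewpoints, depth follows from the standard stereo triangulation relation Z = f·B/d. The formula is generic textbook stereo, shown for illustration; the abstract itself does not spell out the math:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo: a feature observed with disparity d pixels by two
    cameras with focal length f pixels and baseline B meters lies at
    depth Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 20 px disparity, f = 600 px, 10 cm baseline -> 3 m away:
z = depth_from_disparity(20, 600, 0.10)
```

Projecting random texture (as the abstract describes) exists precisely to make these correspondences findable on otherwise featureless surfaces.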
  • Patent number: 9102055
    Abstract: Methods and systems for detecting and reconstructing environments to facilitate robotic interaction with such environments are described. An example method may involve determining a three-dimensional (3D) virtual environment representative of a physical environment of the robotic manipulator including a plurality of 3D virtual objects corresponding to respective physical objects in the physical environment. The method may then involve determining two-dimensional (2D) images of the virtual environment including 2D depth maps. The method may then involve determining portions of the 2D images that correspond to a given one or more physical objects. The method may then involve determining, based on the portions and the 2D depth maps, 3D models corresponding to the portions. The method may then involve, based on the 3D models, selecting a physical object from the given one or more physical objects. The method may then involve providing an instruction to the robotic manipulator to move that object.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: August 11, 2015
    Assignee: Industrial Perception, Inc.
    Inventors: Kurt Konolige, Ethan Rublee, Stefan Hinterstoisser, Troy Straszheim, Gary Bradski, Hauke Strasdat
  • Patent number: 8121345
    Abstract: A system and method of identifying a position of a crop row in a field, where an image of two or more crop rows is transmitted to a vision data processor. The vision data processor defines a candidate scan line profile for a corresponding heading and pitch associated with a directional movement of a vehicle, for example, traversing the two or more crop rows. The candidate scan line profile comprises an array of vector quantities, where each vector quantity comprises an intensity value and a corresponding position datum. A preferential scan line profile in a search space about the candidate scan line profile is determined, and the candidate scan line profile is identified as a preferential scan line profile for estimating a position (e.g., peak variation) of one or more crop rows if a variation in the intensity level of the candidate scan line profile exceeds a threshold variation value.
    Type: Grant
    Filed: February 23, 2007
    Date of Patent: February 21, 2012
    Assignee: Applied Perception, Inc.
    Inventors: Todd Jochem, Parag Batavia, Mark Ollis
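The scan-line test described here (accept a candidate profile when its intensity variation exceeds a threshold, and prefer the strongest) can be sketched as follows, using plain variance as an illustrative variation measure; the patent's exact metric is not reproduced:

```python
def preferential_profile(profiles, threshold):
    """Each profile is an array of (intensity, position) vector
    quantities.  A candidate qualifies when the variance of its
    intensity values exceeds the threshold; among qualifying candidates
    the one with the largest variation wins."""
    def variation(profile):
        vals = [intensity for intensity, _ in profile]
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals) / len(vals)
    qualifying = [p for p in profiles if variation(p) > threshold]
    return max(qualifying, key=variation) if qualifying else None

flat = [(10, 0), (11, 1), (10, 2)]   # bare soil: low contrast
row = [(10, 0), (80, 1), (12, 2)]    # crop row: strong intensity peak
chosen = preferential_profile([flat, row], threshold=50.0)
```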
  • Patent number: 8019513
    Abstract: A system and method of identifying a position of a crop row in a field, where an image of two or more crop rows is transmitted to a vision data processor. A preferential scan line profile in a search space about a candidate scan line profile is determined, and the candidate scan line profile is identified as a preferential scan line profile for estimating a position (e.g., peak variation) of one or more crop rows if a variation in the intensity level of the candidate scan line profile exceeds a threshold variation value. Alternatively, a position datum associated with a highest intensity value within the array of vector quantities can be selected as being indicative of a candidate position of a crop row. The candidate position is then identified as a preliminary row position if a variation in intensity level of the candidate scan line profile exceeds a threshold variation value.
    Type: Grant
    Filed: February 23, 2007
    Date of Patent: September 13, 2011
    Assignee: Applied Perception, Inc.
    Inventors: Todd Jochem, Parag Batavia, Mark Ollis
  • Patent number: D669339
    Type: Grant
    Filed: March 2, 2011
    Date of Patent: October 23, 2012
    Assignee: Perception, Inc.
    Inventor: David Panik