Patents Assigned to Perception, Inc.
-
Patent number: 11997203
Abstract: Anonymizing systems and methods comprising a native configurations database including a set of configurations, a key management database including a plurality of private keys, a processor in communication with the native configurations database and the key management database, and a memory coupled to the processor. The set of configurations includes one or more textual descriptions and one or more ranges, wherein each range includes a contiguous sequence of IP addresses, port numbers, or IP addresses and port numbers. The processor is configured to retrieve the set of configurations from the native configurations database, wherein the set of configurations includes a plurality of objects; retrieve a private key from the key management database; assign a unique cryptographically secure identity to each object; and anonymize the plurality of objects based on the cryptographically secure identities and the private key.
Type: Grant
Filed: April 12, 2023
Date of Patent: May 28, 2024
Assignee: Network Perception, Inc.
Inventor: David M. Nicol
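As an illustration of the general idea (a sketch under assumed details, not the patented implementation), each configuration object can be given a deterministic, keyed pseudonym with an HMAC; the function and key names below are hypothetical.

```python
# Hypothetical sketch of keyed anonymization: each object (an IP address or
# port number) gets a unique, cryptographically secure identity derived from
# a private key, so equal objects map to equal tokens without revealing them.
import hashlib
import hmac

def anonymize_object(obj: str, private_key: bytes) -> str:
    """Derive a deterministic pseudonym for one configuration object."""
    return hmac.new(private_key, obj.encode(), hashlib.sha256).hexdigest()[:16]

def anonymize_configurations(objects: list, private_key: bytes) -> dict:
    """Map every object in a configuration to its anonymized identity."""
    return {obj: anonymize_object(obj, private_key) for obj in objects}

# In the abstract's terms, the key would be retrieved from a key management
# database; here it is a placeholder literal.
key = b"example-private-key"
mapping = anonymize_configurations(["192.168.1.10", "10.0.0.0", "443"], key)
```

Because the mapping is keyed, the same object anonymizes consistently across a configuration, yet the original values cannot be recovered without the private key.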
-
Patent number: 11860269
Abstract: A centralized object detection sensor network system comprises a central unit configured to generate one or more probing signals for detecting one or more objects in an environment, and one or more transponders configured to receive the one or more probing signals and convert them into free space waves for detecting the one or more objects in the environment. The one or more transponders are communicatively coupled to the central unit through one or more communication links.
Type: Grant
Filed: May 23, 2022
Date of Patent: January 2, 2024
Assignee: Perceptive Inc.
Inventor: Alberto Stochino
-
Patent number: 11762064
Abstract: A lidar sensor comprising a laser, an optical sensor, and a processor. The lidar sensor can determine a distance to one or more objects. The lidar sensor can optionally embed a code in beams transmitted into the environment such that those beams can be individually identified when their corresponding reflection is received.
Type: Grant
Filed: August 20, 2020
Date of Patent: September 19, 2023
Assignee: Perceptive Inc.
Inventor: Alberto Stochino
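A rough sketch of the code-embedding idea (details are assumed, not taken from the patent): each beam carries a distinct pseudorandom code, and the receiver cross-correlates the return signal against every known code to recover both the beam identity and the round-trip delay.

```python
import numpy as np

# Hypothetical illustration: two beams, each tagged with a 64-chip
# pseudorandom code; correlation tells us which beam an echo belongs to
# and how many samples later it arrived.
rng = np.random.default_rng(0)
CODE_LEN = 64
codes = {beam: rng.choice([-1.0, 1.0], size=CODE_LEN) for beam in ("A", "B")}

def simulate_echo(beam: str, delay: int, length: int = 256) -> np.ndarray:
    """An echo of `beam`'s code arriving after `delay` samples, plus noise."""
    sig = np.zeros(length)
    sig[delay:delay + CODE_LEN] = codes[beam]
    return sig + rng.normal(0.0, 0.1, length)

def identify_echo(sig: np.ndarray) -> tuple:
    """Return (beam, delay) at the strongest cross-correlation peak."""
    best_beam, best_delay, best_peak = None, None, -np.inf
    for beam, code in codes.items():
        corr = np.correlate(sig, code, mode="valid")
        if corr.max() > best_peak:
            best_beam, best_delay, best_peak = beam, int(corr.argmax()), corr.max()
    return best_beam, best_delay

beam, delay = identify_echo(simulate_echo("B", delay=40))
```

The matched code correlates strongly only at the true delay, which is what lets individually coded beams be told apart after reflection.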
-
Patent number: 11675051
Abstract: A lidar sensor comprising a laser, an optical sensor, and a processor. The lidar sensor can determine a distance to one or more objects. The lidar sensor can optionally embed a code in beams transmitted into the environment such that those beams can be individually identified when their corresponding reflection is received.
Type: Grant
Filed: February 23, 2022
Date of Patent: June 13, 2023
Assignee: Perceptive Inc.
Inventor: Alberto Stochino
-
Patent number: 11658818
Abstract: Anonymizing systems and methods comprising a native configurations database including a set of configurations, a key management database including a plurality of private keys, a processor in communication with the native configurations database and the key management database, and a memory coupled to the processor. The set of configurations includes one or more ranges, wherein each range includes a contiguous sequence of IP addresses, port numbers, or IP addresses and port numbers. The processor is configured to retrieve the set of configurations from the native configurations database, wherein the set of configurations includes a plurality of objects; retrieve a private key from the key management database; assign a unique cryptographically secure identity to each object; and anonymize the plurality of objects based on the cryptographically secure identities and the private key.
Type: Grant
Filed: January 25, 2021
Date of Patent: May 23, 2023
Assignee: Network Perception, Inc.
Inventor: David M. Nicol
-
Patent number: 11568511
Abstract: A sensing and computing system and method for capturing images and data regarding an object and calculating one or more parameters regarding the object using an internal, integrated CPU/GPU. The system comprises an imaging system, including a depth imaging system, color camera, and light source, that captures images of the object and sends data or signals relating to the images to the CPU/GPU, which performs calculations based on those signals/data according to pre-programmed algorithms to determine the parameters. The CPU/GPU and imaging system are contained within a protective housing. The CPU/GPU transmits information regarding the parameters, rather than raw data/signals, to one or more external devices to perform tasks in an industrial environment related to the object imaged.
Type: Grant
Filed: January 27, 2021
Date of Patent: January 31, 2023
Assignee: Cloud 9 Perception, Inc.
Inventors: Christopher D. McMurrough, James Francis Staud
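A minimal sketch of on-device parameter computation, under assumed details (a top-down depth camera measuring the height of a box): only the computed parameter, never the raw frame, would be forwarded to external devices.

```python
import numpy as np

# Hypothetical example: the integrated CPU/GPU computes one parameter
# (object height) from a depth frame and transmits only that number.
def object_height(depth: np.ndarray, floor_depth: float) -> float:
    """Height of the tallest surface above the floor, for a top-down camera
    whose depth readings are distances from the camera in meters."""
    return float(floor_depth - depth.min())

depth_frame = np.full((4, 4), 2.0)  # camera mounted 2.0 m above the floor
depth_frame[1:3, 1:3] = 1.4         # a box whose top is 1.4 m from the camera
height = object_height(depth_frame, floor_depth=2.0)  # 0.6 m is all that is sent
```

Sending a few bytes of parameters instead of full frames is what makes the single-housing, on-device design practical on an industrial network.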
-
Patent number: 11340346
Abstract: A centralized object detection sensor network system comprises a central unit configured to generate one or more probing signals for detecting one or more objects in an environment, and one or more transponders configured to receive the one or more probing signals and convert them into free space waves for detecting the one or more objects in the environment. The one or more transponders are communicatively coupled to the central unit through one or more communication links.
Type: Grant
Filed: February 9, 2021
Date of Patent: May 24, 2022
Assignee: Perceptive Inc.
Inventor: Alberto Stochino
-
Patent number: 10754011
Abstract: A lidar sensor comprising a laser, an optical sensor, and a processor. The lidar sensor can determine a distance to one or more objects. The lidar sensor can optionally embed a code in beams transmitted into the environment such that those beams can be individually identified when their corresponding reflection is received.
Type: Grant
Filed: October 11, 2019
Date of Patent: August 25, 2020
Assignee: Perceptive Inc.
Inventor: Alberto Stochino
-
Patent number: 10444366
Abstract: A lidar sensor comprising a laser, an optical sensor, and a processor. The lidar sensor can determine a distance to one or more objects. The lidar sensor can optionally embed a code in beams transmitted into the environment such that those beams can be individually identified when their corresponding reflection is received.
Type: Grant
Filed: March 11, 2019
Date of Patent: October 15, 2019
Assignee: Perceptive Inc.
Inventor: Alberto Stochino
-
Patent number: 10055892
Abstract: Some augmented reality (AR) and virtual reality (VR) applications may require that an “activity region” be defined prior to their use. For example, a user running a video conferencing application or playing a game may need to identify an appropriate space in which they may walk and gesture while wearing a head-mounted display without causing injury. This may be particularly important in VR applications where, e.g., the user's vision is completely obscured by the VR display, and/or the user will not see their actual environment as the user moves around. Various embodiments provide systems and methods for anticipating, defining, and applying the active region. In some embodiments, the system may represent real-world obstacles to the user in the user's field of view, e.g., outlining the contour of the problematic object to call the user's attention to the object's presence in the active region.
Type: Grant
Filed: January 13, 2017
Date of Patent: August 21, 2018
Assignee: Eonite Perception Inc.
Inventors: Anna Petrovskaya, Peter Varvak, Anton Geraschenko, Dylan Koenig, Youssri Helmy
-
Patent number: 10043319
Abstract: While many augmented reality systems provide “see-through” transparent or translucent displays upon which to project virtual objects, many virtual reality systems instead employ opaque, enclosed screens. Indeed, eliminating the user's perception of the real world may be integral to some successful virtual reality experiences. Thus, head-mounted displays designed exclusively for virtual reality experiences may not be easily repurposed to capture significant portions of the augmented reality market. Various of the disclosed embodiments facilitate the repurposing of a virtual reality device for augmented reality use. Particularly, by anticipating user head motion, embodiments may facilitate scene renderings better aligned with user expectations than naïve renderings generated within the enclosed field of view. In some embodiments, the system may use procedural mapping methods to generate a virtual model of the environment. The system may then use this model to supplement the anticipatory rendering.
Type: Grant
Filed: January 13, 2017
Date of Patent: August 7, 2018
Assignee: Eonite Perception Inc.
Inventors: Anna Petrovskaya, Peter Varvak
-
Patent number: 9972137
Abstract: Various of the disclosed embodiments provide systems and methods for acquiring and applying a depth determination of an environment in, e.g., various augmented reality applications. A user may passively or actively scan a device (e.g., a tablet device, a mobile phone device, etc.) about the environment, acquiring depth data for various regions. The system may integrate these scans into an internal three-dimensional model. This model may then be used in conjunction with subsequent data acquisitions to determine a device's location and orientation within the environment with high fidelity. In some embodiments, these determinations may be accomplished in real time or near-real time. Using the high-fidelity orientation and position determination, various augmented reality applications may then be possible using the same device used to acquire the depth data or a new device.
Type: Grant
Filed: July 14, 2017
Date of Patent: May 15, 2018
Assignee: Eonite Perception Inc.
Inventors: Anna Petrovskaya, Peter Varvak
-
Patent number: 9916002
Abstract: Augmented and virtual reality systems are becoming increasingly popular. Unfortunately, their potential for social interaction is difficult to realize with existing techniques. Various of the disclosed embodiments facilitate social augmented and virtual reality experiences using, e.g., topologies connecting disparate device types, shared environments, messaging systems, virtual object placements, etc. Some embodiments employ pose-search systems and methods that provide more granular pose determinations than were previously possible. Such granularity may facilitate functionality that would otherwise be difficult or impossible to achieve.
Type: Grant
Filed: February 25, 2016
Date of Patent: March 13, 2018
Assignee: Eonite Perception Inc.
Inventors: Anna Petrovskaya, Peter Varvak
-
Patent number: 9754419
Abstract: Various of the disclosed embodiments provide systems and methods for acquiring and applying a depth determination of an environment in, e.g., various augmented reality applications. A user may passively or actively scan a device (e.g., a tablet device, a mobile phone device, etc.) about the environment, acquiring depth data for various regions. The system may integrate these scans into an internal three-dimensional model. This model may then be used in conjunction with subsequent data acquisitions to determine a device's location and orientation within the environment with high fidelity. In some embodiments, these determinations may be accomplished in real time or near-real time. Using the high-fidelity orientation and position determination, various augmented reality applications may then be possible using the same device used to acquire the depth data or a new device.
Type: Grant
Filed: November 13, 2015
Date of Patent: September 5, 2017
Assignee: Eonite Perception Inc.
Inventors: Anna Petrovskaya, Peter Varvak
-
Patent number: 9630320
Abstract: Methods and systems for detecting and reconstructing environments to facilitate robotic interaction with such environments are described. An example method may involve determining a three-dimensional (3D) virtual environment representative of the physical environment of a robotic manipulator, including a plurality of 3D virtual objects corresponding to respective physical objects in the physical environment. The method may then involve determining two-dimensional (2D) images of the virtual environment, including 2D depth maps; determining portions of the 2D images that correspond to a given one or more physical objects; determining, based on those portions and the 2D depth maps, 3D models corresponding to the portions; selecting, based on the 3D models, a physical object from the given one or more physical objects; and providing an instruction to the robotic manipulator to move that object.
Type: Grant
Filed: July 7, 2015
Date of Patent: April 25, 2017
Assignee: Industrial Perception, Inc.
Inventors: Kurt Konolige, Ethan Rublee, Stefan Hinterstoisser, Troy Straszheim, Gary Bradski, Hauke Malte Strasdat
-
Patent number: 9630321
Abstract: Example systems and methods allow for dynamic updating of a plan to move objects using a robotic device. One example method includes determining, by one or more processors, a virtual environment based on sensor data received from one or more sensors, the virtual environment representing a physical environment containing a plurality of physical objects; developing a plan, based on the virtual environment, to cause a robotic manipulator to move one or more of the physical objects in the physical environment; causing the robotic manipulator to perform a first action according to the plan; receiving updated sensor data from the one or more sensors after the robotic manipulator performs the first action; modifying the virtual environment based on the updated sensor data; determining one or more modifications to the plan based on the modified virtual environment; and causing the robotic manipulator to perform a second action according to the modified plan.
Type: Grant
Filed: December 10, 2015
Date of Patent: April 25, 2017
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser
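The sense-plan-act cycle this abstract describes can be sketched schematically; every name and data structure below is illustrative, not drawn from the patent.

```python
# Illustrative replanning loop: act, re-sense, update the virtual
# environment, then derive a modified plan from the modified environment.
def update_virtual_environment(env: dict, sensor_data: dict) -> dict:
    """Merge fresh sensor readings (object -> still present?) into the model."""
    merged = dict(env)
    merged.update(sensor_data)
    return merged

def plan_moves(env: dict) -> list:
    """A stand-in planner: move every object still present, in name order."""
    return sorted(obj for obj, present in env.items() if present)

env = {"box1": True, "box2": True}
plan = plan_moves(env)
first_action = plan[0]                  # the robot moves box1 first
# After acting, updated sensor data arrives: box1 is gone, box3 was uncovered.
env = update_virtual_environment(env, {first_action: False, "box3": True})
modified_plan = plan_moves(env)         # the second action follows this plan
```

The point of the loop is that the plan is never fixed: each action changes the scene, so the virtual environment and the plan are revised before the next action.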
-
Patent number: 9492924
Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
Type: Grant
Filed: June 15, 2016
Date of Patent: November 15, 2016
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
-
Publication number: 20160288324
Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
Type: Application
Filed: June 15, 2016
Publication date: October 6, 2016
Applicant: Industrial Perception, Inc.
Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
-
Patent number: 9393686
Abstract: Example embodiments provide for robotic apparatuses that facilitate moving objects within an environment, such as to load or unload boxes or to construct or deconstruct pallets (e.g., from a container or truck bed). One example apparatus includes a horizontal conveyor and a robotic manipulator that are both provided on a moveable cart. A first end of the robotic manipulator is mounted to the moveable cart and a second end of the robotic manipulator has an end effector, such as a grasper. The apparatus also includes a control system configured to receive sensor data indicative of an environment containing a plurality of objects, and then cause the robotic manipulator to place an object from the plurality of objects on the horizontal conveyor.
Type: Grant
Filed: March 14, 2014
Date of Patent: July 19, 2016
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Steve Croft, Kurt Konolige, Ethan Rublee, Troy Straszheim, John Zevenbergen
-
Patent number: 9333649
Abstract: Example embodiments may relate to methods and systems for selecting a grasp point on an object. In particular, a robotic manipulator may identify characteristics of a physical object within a physical environment. Based on the identified characteristics, the robotic manipulator may determine potential grasp points on the physical object corresponding to points at which a gripper attached to the robotic manipulator is operable to grip the physical object. Subsequently, the robotic manipulator may determine a motion path for the gripper to follow in order to move the physical object to a drop-off location, and then select a grasp point, from the potential grasp points, based on the determined motion path. After selecting the grasp point, the robotic manipulator may grip the physical object at the selected grasp point with the gripper and move the physical object through the determined motion path to the drop-off location.
Type: Grant
Filed: March 14, 2014
Date of Patent: May 10, 2016
Assignee: Industrial Perception, Inc.
Inventors: Gary Bradski, Kurt Konolige, Ethan Rublee, Troy Straszheim, Hauke Strasdat, Stefan Hinterstoisser, Steve Croft, John Zevenbergen
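One way to picture path-based grasp selection (a hypothetical rule, not the patented criterion): score each candidate grasp point by the length of the motion path to the drop-off location and pick the shortest.

```python
from math import dist

# Hypothetical selection rule: among candidate grasp points, choose the one
# whose motion path to the drop-off is shortest. Straight-line distance
# stands in here for a real planned path from a motion planner.
def select_grasp_point(candidates: list, drop_off: tuple) -> tuple:
    """Return the grasp point minimizing path length to the drop-off."""
    return min(candidates, key=lambda point: dist(point, drop_off))

candidates = [(0.2, 0.5, 0.3), (0.4, 0.1, 0.3), (0.6, 0.7, 0.2)]
chosen = select_grasp_point(candidates, drop_off=(1.0, 0.0, 0.0))
```

A production system would also weigh grasp stability and collision avoidance, but this shows why the motion path is determined before the grasp point is chosen.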