Patents by Inventor Nick Cherukuri

Nick Cherukuri has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240071003
    Abstract: To provide an improved experience in generating and experiencing augmented reality training, a system and method may be provided. The process generally involves: selecting a digital twin of an apparatus or system to be used as part of a procedure that a trainee is to be trained to perform; generating, on a first processor, an object-detection model based on the digital twin; receiving the digital twin at a second processor configured to provide a virtual reality (VR) authoring environment, and allowing a user to generate a training module based on the digital twin, the training module defining the procedure the trainee is to be trained to perform; and receiving, at a third processor, the object-detection model and the training module. Augmented reality (AR) headsets and/or other AR-capable devices can then use the object-detection model and the training module to provide an enhanced AR training experience.
    Type: Application
    Filed: August 31, 2022
    Publication date: February 29, 2024
    Applicant: ThirdEye Gen, Inc.
    Inventor: Nick Cherukuri
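The three-processor flow this abstract describes can be sketched as a minimal pipeline. This is an illustrative sketch only, not the patented implementation; all class and function names (`DigitalTwin`, `deploy_to_headset`, etc.) are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of the three-stage pipeline from the abstract:
# twin -> object-detection model, twin -> authored training module,
# both consumed by the AR headset.

@dataclass
class DigitalTwin:
    apparatus_name: str
    mesh_views: list          # rendered views used to train detection

@dataclass
class ObjectDetectionModel:
    target: str

@dataclass
class TrainingModule:
    procedure_steps: list

def generate_detection_model(twin: DigitalTwin) -> ObjectDetectionModel:
    """First processor: derive an object-detection model from the twin."""
    return ObjectDetectionModel(target=twin.apparatus_name)

def author_training_module(twin: DigitalTwin, steps: list) -> TrainingModule:
    """Second processor: VR authoring environment producing the procedure."""
    return TrainingModule(procedure_steps=steps)

def deploy_to_headset(model: ObjectDetectionModel,
                      module: TrainingModule) -> dict:
    """Third processor (AR headset): combine model and module for training."""
    return {"detects": model.target, "steps": module.procedure_steps}

twin = DigitalTwin("pump_assembly", mesh_views=[])
session = deploy_to_headset(
    generate_detection_model(twin),
    author_training_module(twin, ["remove cover", "inspect seal"]),
)
```

The split mirrors the claim structure: detection-model generation, authoring, and delivery each live on a separate processor.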
  • Publication number: 20240020626
    Abstract: An augmented reality (AR) system may be provided. The AR system may include multiple Internet of Things (IoT) tags, each IoT tag being coupled to an inventory item, each inventory item located in a supply chain operating environment. The AR system may include at least one remote processor, and an AR headset configured to communicate with the IoT tags and the remote processor(s). The AR headset may be configured to receive a first transmission of first information from at least one IoT tag, transmit a second transmission comprising the first information to the remote processor(s), receive a third transmission comprising second information from the remote processor(s), display at least a portion of the second information on a display coupled to a frame of the AR headset, and transmit fourth information comprising video or images of a field of view of the AR headset to the at least one remote processor.
    Type: Application
    Filed: July 14, 2022
    Publication date: January 18, 2024
    Inventor: Nick Cherukuri
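The four transmissions in this abstract can be sketched as a simple message flow. This is a hypothetical illustration, assuming nothing about the real protocol; the `RemoteProcessor.lookup` payload and all names are invented for the sketch.

```python
# Hypothetical sketch of the four-transmission flow between an IoT tag,
# the AR headset, and a remote processor.

class RemoteProcessor:
    def lookup(self, tag_info: dict) -> dict:
        # "Second information": e.g. inventory metadata for the tagged item.
        return {"item": tag_info["item_id"], "location": "aisle 4", "qty": 12}

class ARHeadset:
    def __init__(self, remote: RemoteProcessor):
        self.remote = remote
        self.display_lines = []   # shown on the frame-mounted display
        self.uplink_frames = []   # field-of-view video sent upstream

    def handle_tag(self, tag_info: dict) -> None:
        # Transmission 1 arrives as tag_info; 2 and 3 are the round trip.
        enriched = self.remote.lookup(tag_info)
        self.display_lines.append(f"{enriched['item']}: qty {enriched['qty']}")

    def stream_field_of_view(self, frame: bytes) -> None:
        # Transmission 4: video/images of the headset's field of view.
        self.uplink_frames.append(frame)

headset = ARHeadset(RemoteProcessor())
headset.handle_tag({"item_id": "SKU-1042"})
headset.stream_field_of_view(b"\x00frame")
```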
  • Publication number: 20240012090
    Abstract: A tracking system that works through walls, below grade, and at long range is provided, which may be utilized as a first responder tracking system. The system combines UWB (ultra-wideband) ranging with AI-enhanced IMU motion tracking for use on, e.g., Android-based smart glasses, phones, and tablets. The UWB tracking provides a stable reference point in GPS-denied environments, offering 3D tracking under the most challenging conditions. When walls or distance make UWB untenable, the disclosed tags stream back IMU data processed by machine learning algorithms which remove, e.g., the effects of drift, noise, and error common to motion-based tracking. The system may process the UWB and IMU input data and produce a platform- and language-agnostic serialized message stream with position data. This published stream allows any visualization platform to subscribe and consume 3D position data, be it a local graph for engineering debug, cloud-based first responder software, or a third-party solution.
    Type: Application
    Filed: July 7, 2022
    Publication date: January 11, 2024
    Applicant: ThirdEye Gen, Inc
    Inventor: Nick Cherukuri
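The UWB-first, IMU-fallback fusion and the serialized position stream can be sketched as follows. This is a minimal illustration, assuming JSON for the serialized message and a trivial dead-reckoning step in place of the ML drift correction; all names are hypothetical.

```python
import json

# Hypothetical sketch: prefer a UWB fix when available; otherwise fall back
# to (drift-corrected) IMU dead reckoning, then publish a serialized,
# platform- and language-agnostic position message for any subscriber.

def fuse(uwb_fix, imu_delta, last_pos):
    if uwb_fix is not None:                 # stable UWB reference point
        return uwb_fix
    # IMU fallback: integrate the corrected displacement onto the last fix.
    return tuple(p + d for p, d in zip(last_pos, imu_delta))

def publish(responder_id: str, pos) -> str:
    # Serialized message (JSON here, for illustration only).
    return json.dumps({"id": responder_id,
                       "x": pos[0], "y": pos[1], "z": pos[2]})

pos = (0.0, 0.0, 0.0)
pos = fuse((1.0, 2.0, 0.5), None, pos)      # UWB available
msg = publish("ff-17", pos)
pos = fuse(None, (0.2, 0.0, 0.0), pos)      # UWB lost: IMU fallback
```

Any consumer (local debug graph, cloud software, third-party tool) only needs to parse the published messages, not the sensor formats.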
  • Publication number: 20220179209
    Abstract: An augmented reality eyewear device is configured to operate augmented reality applications and provides a wide-angle field of view that can be utilized by users with low vision, visual impairments, or blindness. The software utilizes custom firmware to enable features such as greyscale viewing and the use of the smart glasses both outdoors and indoors. The hardware is specifically designed to be entirely hands-free with no wires, thereby enabling easy use by the low-vision community. Further, the augmented reality eyewear device can be used for glaucoma, macular degeneration, and other vision impairments.
    Type: Application
    Filed: December 7, 2020
    Publication date: June 9, 2022
    Applicant: ThirdEye Gen, Inc
    Inventor: Nick Cherukuri
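The greyscale-viewing feature mentioned in the abstract amounts to a luminance conversion of each frame. A minimal sketch, assuming standard Rec. 601 luma weights rather than whatever the custom firmware actually uses:

```python
# Hypothetical sketch of greyscale viewing: convert an RGB frame to
# luminance with Rec. 601 weights (0.299 R + 0.587 G + 0.114 B), a
# common contrast aid for low-vision users.

def to_greyscale(frame):
    """frame: rows of (r, g, b) tuples -> rows of 0-255 luminance ints."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
            for row in frame]

frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
grey = to_greyscale(frame)
```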
  • Patent number: 11353708
    Abstract: An augmented reality eyewear device is configured to operate augmented reality applications and provides a wide-angle field of view that can be utilized by users with low vision, visual impairments, or blindness. The software utilizes custom firmware to enable features such as greyscale viewing and the use of the smart glasses both outdoors and indoors. The hardware is specifically designed to be entirely hands-free with no wires, thereby enabling easy use by the low-vision community. Further, the augmented reality eyewear device can be used for glaucoma, macular degeneration, and other vision impairments.
    Type: Grant
    Filed: December 7, 2020
    Date of Patent: June 7, 2022
    Assignee: THIRDEYE GEN, INC.
    Inventor: Nick Cherukuri
  • Patent number: 11009698
    Abstract: A system and method for providing a gaze-based user interface for an augmented and mixed reality device is disclosed. The system comprises a plurality of sensors and modules stored in a memory and executable by processors. The sensors are configured to track the gaze direction and head motion of a user. A display module is configured to display a virtual menu comprising interactive contents and a cursor for selecting them. A gaze and head motion detector module is configured to receive input representing the gaze direction and head motion of the user from the sensors to determine a target point on the virtual display. A mapping module is configured to map the target point to the interactive content on the virtual display. An output module is configured to execute functions of the mapped interactive content. The present invention utilizes a plurality of sensor data to manipulate interactive content with high accuracy.
    Type: Grant
    Filed: March 13, 2019
    Date of Patent: May 18, 2021
    Inventor: Nick Cherukuri
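The module chain in this abstract (sensors → target point → mapping → output) can be sketched in a few lines. This is an illustrative sketch only; the simple additive target-point model and the menu layout are invented for the example.

```python
# Hypothetical sketch of the gaze-based UI pipeline: combine gaze and head
# motion into a target point, map it to menu content, execute the action.

def target_point(gaze_dir, head_motion):
    """Detector module: fuse gaze direction and head motion into a
    2D point on the virtual display (trivial additive model here)."""
    return (gaze_dir[0] + head_motion[0], gaze_dir[1] + head_motion[1])

# Virtual menu: bounding box ((x0, y0), (x1, y1)) -> action name.
MENU = {
    ((0, 0), (50, 20)): "open_settings",
    ((0, 30), (50, 50)): "start_call",
}

def map_to_content(point):
    """Mapping module: find the interactive content under the target point."""
    for (x0, y0), (x1, y1) in MENU:
        if x0 <= point[0] <= x1 and y0 <= point[1] <= y1:
            return MENU[((x0, y0), (x1, y1))]
    return None

# Output module would then execute the function of the mapped content.
action = map_to_content(target_point((10, 30), (2, 5)))
```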
  • Patent number: 10922888
    Abstract: An augmented reality eyewear device configured to operate augmented reality applications and provide a wide-angle field of view is disclosed. The eyewear device comprises a frame associated with a processor, and a sensor assembly, a camera assembly, and a user interface control assembly coupled to the processor. The sensor assembly comprises at least two inertial measurement unit (IMU) sensors and transmits raw IMU data from at least one IMU sensor and Android-connected IMU data from at least one IMU sensor. The camera assembly comprises at least two wide-angle cameras synchronized with one another and is configured to transmit camera feed data to the processor. The processor is configured to dually synchronize the raw IMU data and the Android-connected IMU data with the camera feed data, providing a seamless display of 3D content in the augmented reality applications.
    Type: Grant
    Filed: November 25, 2018
    Date of Patent: February 16, 2021
    Inventor: Nick Cherukuri
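The dual synchronization this abstract describes can be sketched as aligning both IMU streams to each camera frame. A minimal illustration, assuming nearest-timestamp matching stands in for the actual synchronization scheme; all field names are hypothetical.

```python
# Hypothetical sketch of dual synchronization: pair each camera frame with
# the nearest-in-time sample from BOTH the raw-IMU stream and the
# Android-connected-IMU stream.

def nearest(samples, t):
    """Pick the sample whose timestamp is closest to t."""
    return min(samples, key=lambda s: abs(s["t"] - t))

def synchronize(camera_frames, raw_imu, android_imu):
    return [{
        "frame": frame["id"],
        "raw": nearest(raw_imu, frame["t"]),
        "android": nearest(android_imu, frame["t"]),
    } for frame in camera_frames]

frames = [{"id": 0, "t": 0.000}, {"id": 1, "t": 0.033}]
raw = [{"t": 0.001, "gyro": (0.1, 0, 0)},
       {"t": 0.032, "gyro": (0.2, 0, 0)}]
android = [{"t": 0.002, "gyro": (0.1, 0, 0)},
           {"t": 0.034, "gyro": (0.2, 0, 0)}]
pairs = synchronize(frames, raw, android)
```

Each entry in `pairs` carries one camera frame plus a time-aligned sample from each IMU path, which is what lets the renderer keep 3D content stable.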
  • Publication number: 20200292813
    Abstract: A system and method for providing a gaze-based user interface for an augmented and mixed reality device is disclosed. The system comprises a plurality of sensors and modules stored in a memory and executable by processors. The sensors are configured to track the gaze direction and head motion of a user. A display module is configured to display a virtual menu comprising interactive contents and a cursor for selecting them. A gaze and head motion detector module is configured to receive input representing the gaze direction and head motion of the user from the sensors to determine a target point on the virtual display. A mapping module is configured to map the target point to the interactive content on the virtual display. An output module is configured to execute functions of the mapped interactive content. The present invention utilizes a plurality of sensor data to manipulate interactive content with high accuracy.
    Type: Application
    Filed: March 13, 2019
    Publication date: September 17, 2020
    Applicant: ThirdEye Gen, Inc.
    Inventor: Nick Cherukuri
  • Publication number: 20200242835
    Abstract: A system and method to create an augmented reality 3D model in an editable format using a video feed from an augmented reality device is disclosed. The augmented reality device comprises one or more image capturing devices to capture the video feed data, and a processing component. The processing component is configured to process the video feed data and produce scan data comprising a sparse point cloud, wherein the scan data comprises a list of keyframes and each keyframe comprises an image. The system further comprises one or more computing devices in communication with the augmented reality device via a network. The computing device is configured to create a dense point cloud utilizing the scan data and convert the dense point cloud into an editable 3D model.
    Type: Application
    Filed: January 30, 2019
    Publication date: July 30, 2020
    Applicant: ThirdEye Gen, Inc.
    Inventor: Nick Cherukuri
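The device/compute split in this abstract — sparse cloud on the headset, densification and model export on a networked machine — can be sketched as follows. This is an illustrative stand-in: real densification would use multi-view stereo over the keyframe images, and "editable 3D model" is represented here by Wavefront-OBJ vertex lines.

```python
# Hypothetical sketch: the AR device emits a sparse point cloud; a
# networked computing device densifies it and converts it into an
# editable model (OBJ-style vertex lines, for illustration).

def densify(sparse_points):
    """Insert a midpoint between consecutive sparse points — a toy
    stand-in for multi-view-stereo densification."""
    dense = []
    for a, b in zip(sparse_points, sparse_points[1:]):
        mid = tuple((u + v) / 2 for u, v in zip(a, b))
        dense.extend([a, mid])
    dense.append(sparse_points[-1])
    return dense

def to_obj(points):
    """Editable 3D model as Wavefront-OBJ vertex lines."""
    return "\n".join(f"v {x} {y} {z}" for x, y, z in points)

sparse = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]   # from the device's scan data
obj = to_obj(densify(sparse))
```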
  • Publication number: 20200168000
    Abstract: An augmented reality eyewear device configured to operate augmented reality applications and provide a wide-angle field of view is disclosed. The eyewear device comprises a frame associated with a processor, and a sensor assembly, a camera assembly, and a user interface control assembly coupled to the processor. The sensor assembly comprises at least two inertial measurement unit (IMU) sensors and transmits raw IMU data from at least one IMU sensor and Android-connected IMU data from at least one IMU sensor. The camera assembly comprises at least two wide-angle cameras synchronized with one another and is configured to transmit camera feed data to the processor. The processor is configured to dually synchronize the raw IMU data and the Android-connected IMU data with the camera feed data, providing a seamless display of 3D content in the augmented reality applications.
    Type: Application
    Filed: November 25, 2018
    Publication date: May 28, 2020
    Applicant: ThirdEye Gen, Inc.
    Inventor: Nick Cherukuri