Eye Movement Detection Patents (Class 351/209)
  • Patent number: 11983309
    Abstract: A processing device includes a memory configured to store executable instructions, and processing circuitry configured to execute the executable instructions stored in the memory to thereby realize: a first acquisition unit configured to acquire the timing of a blink motion performed by a dialogue device, a second acquisition unit configured to acquire the blink timing of a user engaged in a dialogue with the dialogue device, and a processing unit configured to perform processing according to the difference between the timing of the blink motion performed by the dialogue device and the blink timing of the user.
    Type: Grant
    Filed: September 17, 2020
    Date of Patent: May 14, 2024
    Inventor: Tamami Nakano
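The blink-timing comparison this abstract describes can be sketched in a few lines. The following is an illustrative Python sketch, not the patent's method; the function name, the matching window, and the list-of-timestamps representation are all assumptions:

```python
def blink_timing_offsets(device_blinks, user_blinks, window=1.0):
    """For each device blink time, find the nearest user blink within
    `window` seconds and return the signed offset (user - device).

    Hypothetical helper: timestamps are in seconds; downstream
    processing would act on the resulting offsets."""
    offsets = []
    for t_dev in device_blinks:
        candidates = [t_usr - t_dev for t_usr in user_blinks
                      if abs(t_usr - t_dev) <= window]
        if candidates:
            # Keep the offset with the smallest magnitude.
            offsets.append(min(candidates, key=abs))
    return offsets
```

A device blink with no user blink inside the window simply produces no offset, which a processing unit could treat as a non-response.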
  • Patent number: 11966048
    Abstract: A head-mounted device may have a head-mounted support structure. Gaze tracking systems may be supported by the support structure so that the gaze of a user may be monitored. Lenses may be supported by the support structure. Display systems may provide computer-generated images to the user while the user is viewing real-world objects through the lenses. The gaze tracking systems may include image-sensor-based systems such as glint-based systems. A glint-based gaze tracking system may include light-emitting devices that emit light beams that create eye glints on the surface of a user's eyes and may include an image sensor that measures the eye glints to gather information on the user's gaze. A low-power gaze tracking system may be included in the head-mounted device. The low-power gaze tracking system may use light detectors to measure the magnitudes of respective light reflections of the light beams.
    Type: Grant
    Filed: June 9, 2021
    Date of Patent: April 23, 2024
    Assignee: Apple Inc.
    Inventors: Michael J. Oudenhoven, Brian S. Lau, David A. Kalinowski
  • Patent number: 11931171
    Abstract: The present disclosure is related to a method and apparatus for determining drug usage or a physiological characteristic of a patient. The present disclosure describes acquiring a video sequence of an eye of a patient, the video sequence being a plurality of video frames, determining a frequency spectrum from pupillary data of the video sequence, and determining, based on the frequency spectrum, the physiological characteristic or drug usage of the patient. In an embodiment, at least one frequency can be probed based on which physiological characteristic is being explored.
    Type: Grant
    Filed: January 15, 2019
    Date of Patent: March 19, 2024
    Inventor: Julia C. Finkel
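The spectral analysis of pupillary data described above amounts to an FFT of a pupil-diameter time series plus probing specific frequency bands. A minimal sketch, assuming NumPy and hypothetical function names (the patent does not specify an implementation):

```python
import numpy as np

def pupil_spectrum(pupil_diameters, fps):
    """Magnitude spectrum of a pupil-diameter time series extracted
    from video frames sampled at `fps` Hz (illustrative only)."""
    x = np.asarray(pupil_diameters, dtype=float)
    x = x - x.mean()                      # remove the DC component
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    return freqs, mag

def band_power(freqs, mag, lo, hi):
    """Summed spectral magnitude in the probed band [lo, hi] Hz."""
    sel = (freqs >= lo) & (freqs <= hi)
    return float(mag[sel].sum())
```

Probing "at least one frequency" then reduces to comparing `band_power` over the band of interest against other bands or a reference value.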
  • Patent number: 11933975
    Abstract: An optical system includes an optical waveguide, and a first optical element configured to direct a first ray, having a first circular polarization and impinging on the first optical element at a first incidence angle, in a first direction so that the first ray propagates through the optical waveguide via total internal reflection toward a second optical element. The first optical element is configured to also direct a second ray, having a second circular polarization that is distinct from the first circular polarization and impinging on the first optical element at the first incidence angle, in a second direction that is distinct from the first direction so that the second ray propagates away from the second optical element. The second optical element is configured to direct the first ray propagating through the optical waveguide toward a detector.
    Type: Grant
    Filed: February 22, 2022
    Date of Patent: March 19, 2024
    Inventors: Babak Amirsolaimani, Pasi Saarikko, Ying Geng, Yusufu Njoni Bamaxam Sulai, Scott Charles McEldowney
  • Patent number: 11910865
    Abstract: The disclosed embodiments illustrate methods and systems for training users in sports using mixed reality. The method includes retrieving data from athletes wearing helmets, wearable glasses, and/or motion-capture suits in real time. The helmets and the wearable glasses are integrated with mixed-reality technology. Further, physical performance data of the athletes is captured using a variety of time-synchronized measurement techniques. Thereafter, the athletes are trained using the captured data and audio, visual and haptic feedback.
    Type: Grant
    Filed: August 19, 2023
    Date of Patent: February 27, 2024
    Assignee: RobotArmy Corp.
    Inventors: Damien Phelan Stolarz, Alan Gary Brown
  • Patent number: 11911108
    Abstract: A blood flow measuring unit of a blood flow measurement apparatus of an embodiment includes an optical system for applying an OCT scan to an eye fundus and obtains blood flow information based on data acquired by the OCT scan. A movement mechanism moves the optical system. A controller applies a first movement control to the movement mechanism to move the optical system in a first direction orthogonal to an optical axis of the optical system by a predetermined distance from the optical axis. A judging unit judges occurrence of vignetting based on a detection result of return light of light incident on the eye through the optical system after the first movement control. The controller applies a second movement control to the movement mechanism to further move the optical system based on a judgement result obtained by the judging unit.
    Type: Grant
    Filed: December 26, 2018
    Date of Patent: February 27, 2024
    Inventors: Jun Sakai, Shunsuke Nakamura, Kana Nishikawa, Masahiro Akiba
  • Patent number: 11907419
    Abstract: Systems and methods disclosed herein are related to an intelligent UI element selection system using eye-gaze technology. In some example aspects, a UI element selection zone may be determined. The selection zone may be defined as an area surrounding a boundary of the UI element. Gaze input may be received and the gaze input may be compared with the selection zone to determine an intent of the user. The gaze input may comprise one or more gaze locations. Each gaze location may be assigned a value according to its proximity to the UI element and/or its relation to the UI element's selection zone. Each UI element may be assigned a threshold. If the aggregated value of gaze input is equal to or greater than the threshold for the UI element, then the UI element may be selected.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: February 20, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Narasimhan Raghunath, Austin B. Hodges, Fei Su, Akhilesh Kaza, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
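The selection-zone scoring in this abstract (per-gaze-sample values weighted by proximity, aggregated against a per-element threshold) can be illustrated concretely. All names, zone sizes, and score values below are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class UIElement:
    """Axis-aligned UI element whose selection zone extends `margin`
    pixels beyond its boundary (illustrative parameters)."""
    x: float
    y: float
    w: float
    h: float
    margin: float = 20.0
    threshold: float = 5.0

    def gaze_value(self, gx, gy):
        """Score one gaze sample: 1.0 inside the element, 0.5 inside
        the surrounding selection zone, 0.0 otherwise."""
        if self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h:
            return 1.0
        if (self.x - self.margin <= gx <= self.x + self.w + self.margin and
                self.y - self.margin <= gy <= self.y + self.h + self.margin):
            return 0.5
        return 0.0

def is_selected(element, gaze_points):
    """Select when the aggregated gaze value reaches the threshold."""
    return sum(element.gaze_value(gx, gy)
               for gx, gy in gaze_points) >= element.threshold
```

Lowering `threshold` or widening `margin` makes an element easier to select, which is the kind of per-element tuning the abstract alludes to.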
  • Patent number: 11877799
    Abstract: The present invention relates to the field of instruments for visualizing the inner structures of the human body, and in particular of the eye. More specifically, its object is an optical coherence tomography system and method of the “Fourier-domain” type with the removal of unwanted artifacts through digital image processing.
    Type: Grant
    Filed: July 30, 2019
    Date of Patent: January 23, 2024
    Inventors: Stefano Faini, Lorenzo Rosellini, Francesco Versaci, Gabriele Vestri
  • Patent number: 11874962
    Abstract: A first determination unit determines whether a movement of an eyelid or an eyebrow of a wearer has occurred, based on an electric signal measured by a measurement electrode with a common electrode as a ground potential. A second determination unit determines that a movement of a face of the wearer has occurred when a distortion detection unit detects distortion of the spectacle frame. An output unit outputs a command due to operation of the spectacle frame by the wearer when the first determination unit determines that the movement of the eyelid or the eyebrow of the wearer has occurred and at the same time the second determination unit determines that the movement of the face of the wearer has occurred.
    Type: Grant
    Filed: June 24, 2020
    Date of Patent: January 16, 2024
    Assignee: Nippon Telegraph and Telephone Corporation
    Inventors: Kazuyoshi Ono, Shin Toyota
  • Patent number: 11864831
    Abstract: A visual field test device includes: a first probability density function acquisition unit that performs step (1) of obtaining a probability density function f(x1) for a result value x1 obtained in a first visual field test; a stimulation threshold determination unit that performs step (2) of setting a stimulation threshold t1 to a value in the range of x1 in f(x1); a test result acquisition unit that performs step (3) of obtaining test results indicating whether a result greater than or equal to t1 was obtained in the first visual field test; a second probability density function acquisition unit that performs step (4) of obtaining a probability density function f(x2) by removing from f(x1) the probability density values that are greater than or equal to t1, or the values that are less than t1; and a determination unit that performs step (5) of determining whether a standard deviation σ of f(x2) is below a predetermined value.
    Type: Grant
    Filed: March 6, 2019
    Date of Patent: January 9, 2024
    Inventors: Satoshi Inoue, Shinji Kimura, Kenzo Yamanaka
  • Patent number: 11860358
    Abstract: Angular sensors that may be used in eye-tracking systems are disclosed. An eye-tracking system may include a plurality of light sources to emit illumination light and a plurality of angular light sensors to receive returning light that is the illumination light reflecting from an eyebox region. The angular light sensors may output angular signals representing an angle of incidence of the returning light.
    Type: Grant
    Filed: August 1, 2022
    Date of Patent: January 2, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Liliana Ruiz Diaz, Ruiting Huang, Robin Sharma, Jonathan Robert Peterson, Christopher Yuan Ting Liao, Andrew John Ouderkirk, Giancarlo Sigurd Sante Nucci, Clare Joyce Robinson
  • Patent number: 11849999
    Abstract: Computer-implemented methods for a mobile device, mobile devices, and computer programs are utilized for determining the center of rotation of the eye. An image of an eye of a person is captured in at least two positions of the mobile device, and the position of the center of rotation is determined based on the images and the positions. In a similar manner, optionally a pupil position may be determined.
    Type: Grant
    Filed: January 12, 2023
    Date of Patent: December 26, 2023
    Assignee: Carl Zeiss Vision International GmbH
    Inventors: Mario Berger, Steffen Urban
  • Patent number: 11850508
    Abstract: A system for simulating an output in a virtual reality gaming environment including a pod for being entered by a user. A virtual reality device is located in the pod for being worn by the user. A controller unit is electrically connected with the virtual reality device and the pod and defines an electronic three-dimensional grid representative of the pod. At least one motion sensor is electrically connected to the controller unit for detecting movement within the pod and providing the controller unit input data. The input data includes three-dimensional coordinates representative of a location of the detected movement in the pod. At least one output device is disposed in the pod and electrically connected with the controller unit. The controller unit receives the input data and activates the output device at three-dimensional coordinates in the pod that correspond with the three-dimensional coordinates of input data.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: December 26, 2023
    Assignee: OSIRIUS GROUP, LLC
    Inventors: Richard Matthew Cieszkowski, III, Timothy David Smith
  • Patent number: 11850730
    Abstract: Eye gaze measurements are used to give input to a surgical robotic system. Eye tracking input is enhanced using a pair of eye trackers positioned to track the gaze of a user observing an endoscopic image on a display. At least one of the eye trackers is moveable relative to the display in a horizontal and/or vertical direction relative to the image display.
    Type: Grant
    Filed: July 17, 2020
    Date of Patent: December 26, 2023
    Assignee: Asensus Surgical US, Inc.
    Inventor: Andrea D'Ambrosio
  • Patent number: 11841502
    Abstract: A head-mounted display device includes a first reflective polarizer having a first optical surface and a second optical surface that is opposite to the first optical surface. The first optical surface of the first reflective polarizer is curved, and the first reflective polarizer is configured to reflect light having a first polarization. The head-mounted display device also includes a first electronic display configured to project toward the first reflective polarizer a light pattern. At least a portion of the light pattern is reflected by the first reflective polarizer. Also disclosed is a method, which includes directing a light pattern from the first electronic display toward the first reflective polarizer and reflecting at least a portion of the light pattern toward an eye of a user.
    Type: Grant
    Filed: July 1, 2020
    Date of Patent: December 12, 2023
    Inventors: Youngshik Yoon, Richard Han-Soo Cho
  • Patent number: 11826158
    Abstract: A system and associated method for a computerized rotational head impulse test (crHIT) to clinically assess the semicircular canals of the human vestibular system in patients with balance disorders. The system utilizes a rotary chair combined with a head-mounted VOG system with head tracking sensors. The crHIT protocol uses the same physiologic principles as the known video head impulse test (vHIT). The crHIT utilizes whole-body rotation via the chair so that a persistent, controlled, repeatable, comfortable, and reliable stimulus can be delivered while recording eye movements with video-oculography.
    Type: Grant
    Filed: January 27, 2020
    Date of Patent: November 28, 2023
    Assignee: NEURO KINETICS, INC.
    Inventors: Alexander D Kiderman, Ian A Shirey, Joseph M Furman
  • Patent number: 11822077
    Abstract: A virtual reality or augmented reality vision system including: a first emission module, emitting light in the visible range; a second emission module, emitting light in the infrared range; an infrared-sensitive photodetection module; and a micromirror array. Each micromirror is capable of assuming a first position and a second position. Each micromirror of the micromirror array is configured to, in its first position, receive light from the first emission module and reflect it along an axis of interest. Each micromirror of the micromirror array is configured to, in its second position, receive light from the second emission module and reflect it substantially along the axis of interest, and receive light propagating along the axis of interest and reflect it towards the photodetection module.
    Type: Grant
    Filed: October 6, 2021
    Date of Patent: November 21, 2023
    Inventor: Jean-François Mainguet
  • Patent number: 11810486
    Abstract: An electronic device may have a display and a camera. Control circuitry in the device can gather information on a user's point of gaze using a gaze tracking system and other sensors, can gather information on the real-world image such as information on content, motion, and other image attributes by analyzing the real-world image, can gather user vision information such as user acuity, contrast sensitivity, field of view, and geometrical distortions, can gather user input such as user preferences and user mode selection commands, and can gather other input. Based on the point-of-gaze information and/or other gathered information, the control circuitry can display the real-world image and supplemental information on the display. The supplemental information can include augmentations such as icons, text labels, and other computer-generated text and graphics overlaid on the real world image and can include enhanced image content such as magnified portions of the real-world image.
    Type: Grant
    Filed: July 13, 2020
    Date of Patent: November 7, 2023
    Assignee: Apple Inc.
    Inventors: Ramin Samadani, Christina G. Gambacorta, Elijah H. Kleeman, Nicolas P. Bonnier
  • Patent number: 11809621
    Abstract: A state of a user of a mobile terminal having a display for displaying an image may be estimated. A sightline detector detects a sightline of the user of a mobile terminal and processing circuitry is configured to estimate a state of the user. The processing circuitry is configured to determine a top-down index value that is correlated with an attention source amount allocated to top-down attention of the user with respect to an image displayed on a display of the mobile terminal. The processing circuitry is configured to determine a bottom-up index value correlated with the attention source amount allocated to bottom-up attention of the user with respect to the image displayed on the display. The processing circuitry is configured to estimate the user state including an attention function degraded state based on the top-down index value and the bottom-up index value.
    Type: Grant
    Filed: February 18, 2021
    Date of Patent: November 7, 2023
    Inventors: Koji Iwase, Kazuo Sakamoto, Akihide Takami
  • Patent number: 11803237
    Abstract: Disclosed herein is utilization of photosensor-oculography along with a video camera to implement a novel form of eye tracking, which can be important for many applications, such as augmented reality (AR) on smartglasses. In one embodiment, an eye tracking system includes a photosensor-oculography device (PSOG) that emits light and takes measurements of reflections of the light from an eye of a user, and a camera that captures images of the eye. A computer calculates values indicative of eye movement velocity (EMV) based on the measurements of the reflections obtained by the PSOG. These values are then used to determine how data is read from the camera, which can save power in some cases: the computer reads data from the camera at a higher bitrate when the values indicative of the EMV are below a threshold than when they are above it.
    Type: Grant
    Filed: September 30, 2021
    Date of Patent: October 31, 2023
    Assignee: Facense Ltd.
    Inventors: Arie Tzvieli, Ari M Frank, Gil Thieberger
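The power-saving scheme in this abstract gates the camera read bitrate on a PSOG-derived velocity estimate. A minimal sketch of that control logic, with all constants, names, and the velocity proxy being assumptions for illustration:

```python
HIGH_BITRATE = 2_000_000   # bits/s read from the camera (illustrative)
LOW_BITRATE = 250_000

def estimate_emv(psog_samples):
    """Crude eye-movement-velocity proxy: mean absolute difference
    between consecutive PSOG reflection measurements."""
    diffs = [abs(b - a) for a, b in zip(psog_samples, psog_samples[1:])]
    return sum(diffs) / len(diffs) if diffs else 0.0

def camera_bitrate(psog_samples, emv_threshold):
    """Per the abstract's scheme: read the camera at the higher
    bitrate while the eye is relatively still (EMV below threshold),
    and drop to the lower bitrate during fast movement."""
    velocity = estimate_emv(psog_samples)
    return HIGH_BITRATE if velocity < emv_threshold else LOW_BITRATE
```

The intuition is that detailed camera frames are most useful during fixations, while during saccades the cheap PSOG signal suffices, so the camera can be read sparsely.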
  • Patent number: 11782513
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system. Users' need to switch the AR/VR presentation on or off to interact with the real world around them, for example to drink some soda, can be addressed with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR-enabled device.
    Type: Grant
    Filed: June 11, 2021
    Date of Patent: October 10, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Samuel Holz
  • Patent number: 11778149
    Abstract: An apparatus for mounting on a head is provided, including a frame, face-wearable near-ocular optics, and a micro-display for displaying data in front of the eyes. A computing device is coupled to the micro-display. At least one sensor is coupled to the computing device for receiving biometric human information.
    Type: Grant
    Filed: August 9, 2021
    Date of Patent: October 3, 2023
    Assignee: Snap Inc.
    Inventors: Erick Miller, Jonathan Rodriguez
  • Patent number: 11771908
    Abstract: A stimulation system stimulates anatomical targets in a patient for treatment of dry eye. The system may include a controller and a microstimulator. The controller may be implemented externally to or internally within the microstimulator. The components of the controller and microstimulator may be implemented in a single unit or in separate devices. When implemented separately, the controller and microstimulator may communicate wirelessly or via a wired connection. The microstimulator may generate pulses from a controller signal and apply the signal via one or more electrodes to an anatomical target. The microstimulator may not have any intelligence or logic to shape or modify a signal. The microstimulator may be a passive device configured to generate a pulse based on a signal received from the controller. The microstimulator may shape or modify a signal. Waveforms having different frequency, amplitude and period characteristics may stimulate different anatomical targets in a patient.
    Type: Grant
    Filed: June 24, 2020
    Date of Patent: October 3, 2023
    Assignee: The Board of Trustees of the Leland Stanford Junior University
    Inventors: Douglas Michael Ackermann, Daniel Palanker, James Donald Loudin, Garrett Cale Smith, Victor Wayne McCray, Brandon McNary Felkins
  • Patent number: 11756335
    Abstract: An apparatus for providing gaze tracking in a near-eye display. Certain examples provide an apparatus including a light modulator configured to receive light of a first range of wavelengths and generate an image beam therefrom. The light modulator is further configured to receive light of a second range of wavelengths and generate a probe beam therefrom. The apparatus also includes one or more light guides including one or more in-coupling element areas, and one or more out-coupling element areas. The one or more in-coupling diffractive element areas are configured to receive and in-couple the image beam and the probe beam into the one or more light guides. The one or more out-coupling element areas are configured to out-couple, from the one or more light guides: the image beam to a user's eye for user viewing, and the probe beam to the user's eye for detection of reflection therefrom.
    Type: Grant
    Filed: April 5, 2022
    Date of Patent: September 12, 2023
    Assignee: Magic Leap, Inc.
    Inventor: Toni Jarvenpaa
  • Patent number: 11709364
    Abstract: A projector for illuminating a target area is presented. The projector includes an array of emitters positioned on a substrate according to a distribution. Each emitter in the array of emitters has a non-circular emission area. Operation of at least a portion of the array of emitters is controlled based in part on emission instructions to emit light. The light from the projector is configured to illuminate the target area. The projector can be part of a depth camera assembly for depth sensing of a local area, or part of an eye tracker for determining a gaze direction for an eye.
    Type: Grant
    Filed: April 25, 2022
    Date of Patent: July 25, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Zhaoming Zhu, Mark Timothy Sullivan, Jonatan Ginzburg
  • Patent number: 11705232
    Abstract: A method, computer program product, and computing system for receiving audio-based content from a user who is reviewing an image on a display screen; receiving gaze information that defines a gaze location of the user; and temporally aligning the audio-based content and the gaze information to form location-based content.
    Type: Grant
    Filed: February 10, 2022
    Date of Patent: July 18, 2023
    Assignee: Nuance Communications, Inc.
    Inventor: Joel Praveen Pinto
  • Patent number: 11690510
    Abstract: Systems and methods are disclosed for evaluating human eye tracking. One method includes receiving data representing the location of and/or information tracked by an individual's eye or eyes before, during, or after the individual performs a task; identifying a temporal phase or a biomechanical phase of the task performed by the individual; identifying a visual cue in the identified temporal phase or biomechanical phase; and scoring the tracking of the individual's eye or eyes by comparing the data to the visual cue.
    Type: Grant
    Filed: November 1, 2021
    Date of Patent: July 4, 2023
    Assignee: RightEye LLC
    Inventors: Adam Todd Gross, Melissa Hunfalvay
  • Patent number: 11669162
    Abstract: A method for detecting an eye event of a user using an eye tracking system, the method comprising capturing a first image of a first eye of a user, capturing an image of a second eye of the user a first period after capturing the first image of the first eye and a second period before capturing a next image of the first eye, capturing a second image of the first eye the second period after capturing the image of the second eye, determining that an eye event has occurred based on a difference between the first and second images of the first eye, and performing at least one action if it is determined that an eye event has occurred.
    Type: Grant
    Filed: April 5, 2022
    Date of Patent: June 6, 2023
    Assignee: Tobii AB
    Inventor: Andreas Klingström
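The core detection step in this claim, deciding that an eye event occurred from the difference between two images of the same eye, can be sketched simply. This is an illustrative stand-in (flat grayscale lists, mean absolute difference, a fixed threshold), not the patented method:

```python
def frame_difference(img_a, img_b):
    """Mean absolute pixel difference between two grayscale frames
    given as equal-length flat lists (illustrative representation)."""
    return sum(abs(a - b) for a, b in zip(img_a, img_b)) / len(img_a)

def detect_eye_event(first_eye_img1, first_eye_img2, diff_threshold):
    """Flag an eye event (e.g. a saccade or blink) when consecutive
    images of the same eye differ by more than the threshold."""
    return frame_difference(first_eye_img1, first_eye_img2) > diff_threshold
```

The interleaved left/right capture schedule described in the claim means each eye is imaged at half the camera rate, while events can still be caught from either eye's image pair.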
  • Patent number: 11660031
    Abstract: A method is disclosed for testing hearing in infants based on pupil dilation response. It does not require sedation or depend on subjective judgments of human testers. Pupil dilation response to sound is measured by presenting on a display a visually engaging video containing periodic changes; presenting sounds synchronized with the periodic changes of the video; recording images from a camera directed in front of the display, where the camera is sensitive to infrared wavelengths; processing the images to measure pupil sizes; and processing the measured pupil sizes to determine, statistically, the presence of a pupil dilation response to the sounds.
    Type: Grant
    Filed: February 7, 2020
    Date of Patent: May 30, 2023
    Assignee: University of Oregon
    Inventors: Avinash Deep Bala Singh, Terry Takeshi Takahashi
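The final step of this method, statistically deciding whether a pupil dilation response to the sounds is present, could be approximated with a simple z-style comparison. This is a stand-in for whatever statistical test the patent actually uses; the function name, inputs, and criterion are assumptions:

```python
from statistics import mean, stdev

def dilation_response_present(baseline_sizes, post_sound_sizes, z_crit=2.0):
    """Illustrative test: flag a pupil dilation response when the mean
    post-sound pupil size exceeds the baseline mean by more than
    z_crit baseline standard deviations."""
    base_mean = mean(baseline_sizes)
    base_sd = stdev(baseline_sizes)
    if base_sd == 0:
        return mean(post_sound_sizes) > base_mean
    z = (mean(post_sound_sizes) - base_mean) / base_sd
    return z > z_crit
```

Because the sounds are synchronized with periodic changes in the video, baseline and post-sound windows can be cut from the same recording at known offsets.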
  • Patent number: 11633099
    Abstract: Disclosed are methods for diagnosing declarative memory loss using mouse tracking to follow the visual gaze of a subject taking a visual paired comparison test. Also disclosed are methods for diagnosing dementia such as mild cognitive impairment and Alzheimer's disease.
    Type: Grant
    Filed: May 29, 2020
    Date of Patent: April 25, 2023
    Inventors: Yevgeny E. Agichtein, Elizabeth A. Buffalo, Dmitry Lagun, Cecelia Manzanares, Stuart Zola
  • Patent number: 11612342
    Abstract: Provided is a control system that interfaces with an individual by tracking the eyes and/or other physiological signals generated by the individual. The system is configured to classify the captured eye images into gestures that emulate joystick-like control of the computer. These gestures permit the user to operate, for instance, a computer or a system with menu items.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: March 28, 2023
    Inventors: Itai Kornberg, Or Retzkin
  • Patent number: 11602296
    Abstract: A mental impairment detection system and non-invasive method of detecting mental impairment of a user are provided. A test (e.g., an inhibitory reflex test or a sustained attention test) is administered to the user, brain activity in a frontal lobe of the user is non-invasively detected while the test is administered to the user, and a level of mental impairment of the user is determined based on the brain activity detected in the frontal lobe of the user.
    Type: Grant
    Filed: January 27, 2022
    Date of Patent: March 14, 2023
    Assignee: HI LLC
    Inventors: Husam Katnani, Daniel Sobek, Antonio H. Lara
  • Patent number: 11602273
    Abstract: A system and method for determining a subject's attentional response to a stimulus. The method includes measuring microsaccadic eye movement dynamics of the subject, detecting whether a microsaccadic signature (a suppression in microsaccadic rate) is present in the measured microsaccadic eye movement relative to a time of the stimulus, and correlating the subject's attentional response to the stimulus based on the detection. The method further includes determining that the stimulus was sensed if the microsaccadic signature is present and determining that the stimulus was not sensed if the microsaccadic signature is absent.
    Type: Grant
    Filed: December 29, 2017
    Date of Patent: March 14, 2023
    Assignee: DIGNITY HEALTH
    Inventors: Jorge Otero-Millan, Stephen L. Macknik, Susana Martinez-Conde
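The microsaccadic-signature test in this abstract, a suppression in microsaccade rate time-locked to the stimulus, lends itself to a short sketch. The window length and suppression ratio below are illustrative assumptions, not values from the patent:

```python
def microsaccade_rate(event_times, t_start, t_end):
    """Microsaccades per second within [t_start, t_end)."""
    n = sum(t_start <= t < t_end for t in event_times)
    return n / (t_end - t_start)

def stimulus_sensed(event_times, stimulus_time, window=0.5,
                    suppression_ratio=0.5):
    """Per the abstract: judge the stimulus 'sensed' when the
    microsaccade rate just after the stimulus drops below
    `suppression_ratio` times the pre-stimulus rate."""
    pre = microsaccade_rate(event_times, stimulus_time - window, stimulus_time)
    post = microsaccade_rate(event_times, stimulus_time, stimulus_time + window)
    if pre == 0:
        return False          # no baseline rate to compare against
    return post < suppression_ratio * pre
```

An absent signature (no post-stimulus drop) correspondingly classifies the stimulus as not sensed.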
  • Patent number: 11589797
    Abstract: Embodiments of the presently-disclosed subject matter include methods and systems for measuring a level of a neurotransmitter in a subject. Embodiments of the present methods comprise displaying a fixation point, a reward target, and a non-reward target, and measuring one or more saccade movement parameters for reward saccades and non-reward saccades. The saccade movement parameters can include velocity, amplitude, reaction time, or a combination thereof. The present methods can further include determining a reward modulation of the subject, the reward modulation being equal to a difference between the reward and the non-reward values for a respective saccade movement parameter. Some embodiments further include identifying the subject as including a deficiency of the neurotransmitter if there is a statistically measurable difference between the reward modulation of the subject and a reference reward modulation and/or if the non-reward and the reward saccade movement parameters are statistically equivalent.
    Type: Grant
    Filed: June 2, 2014
    Date of Patent: February 28, 2023
    Assignee: University of Mississippi Medical Center
    Inventor: Lewis Chen
  • Patent number: 11580874
    Abstract: The subject matter described herein includes methods, systems, and computer readable media for automated attention assessment. According to one method, a method for automated attention assessment includes obtaining head and iris positions of a user using a camera while the user watches a display screen displaying a video containing dynamic region-based stimuli designed for identifying a neurodevelopmental and/or psychiatric (neurodevelopmental/psychiatric) disorder; analyzing the head and iris positions of the user to detect attention assessment information associated with the user, wherein the attention assessment information indicates how often and/or how long the user attended to one or more regions of the display screen while watching the video; determining that the attention assessment information is indicative of the neurodevelopmental/psychiatric disorder; and providing, via a communications interface, the attention assessment information, a diagnosis, or related data.
    Type: Grant
    Filed: November 8, 2019
    Date of Patent: February 14, 2023
    Assignee: Duke University
    Inventors: Guillermo Sapiro, Geraldine Dawson, Matthieu Bovery, Jordan Hashemi
  • Patent number: 11573634
    Abstract: A display method for displaying an image by an HMD mounted on a head of a user includes: an identifying step for identifying a direction in which the user gazes; and an adjusting step for adjusting a display aspect of a display image so that a gaze region that the user gazes at in the display image displayed by the HMD approaches a predetermined position corresponding to a front of the user.
    Type: Grant
    Filed: January 24, 2022
    Date of Patent: February 7, 2023
    Inventor: Fusashi Kimura
  • Patent number: 11559243
    Abstract: In described embodiments, a device and method for diagnosing brain and neurological issues is provided. The device measures the performance of Convergence, Divergence, and binocular tracking capabilities of a subject's eyes, which can be used to determine whether a subject has experienced a brain or other neurological event.
    Type: Grant
    Filed: July 23, 2020
    Date of Patent: January 24, 2023
    Inventor: Mansour Zarreii
  • Patent number: 11559197
    Abstract: A Progressive Lens Simulator comprises an Eye Tracker for tracking an eye axis direction to determine a gaze distance, an Off-Axis Progressive Lens Simulator for generating an off-axis progressive lens simulation, and an Axial Power-Distance Simulator for simulating a progressive lens power in the eye axis direction. The Progressive Lens Simulator can alternatively include an integrated Progressive Lens Simulator for creating a Comprehensive Progressive Lens Simulation, and can be head-mounted. A Guided Lens Design Exploration System for the Progressive Lens Simulator can include a Progressive Lens Simulator, a Feedback-Control Interface, and a Progressive Lens Design processor, to generate a modified progressive lens simulation for the patient after a guided modification of the progressive lens design.
    Type: Grant
    Filed: March 6, 2019
    Date of Patent: January 24, 2023
    Assignee: Neurolens, Inc.
    Inventor: Gergely T. Zimanyi
  • Patent number: 11561392
    Abstract: The invention relates to a method for generating and displaying a virtual object to an individual user by an optical system consisting of gaze-tracking glasses and at least one display unit connected to the gaze-tracking glasses, the display unit having a first display, the gaze-tracking glasses having a first eye-tracking camera, the first display being arranged in a first viewing region of the gaze-tracking glasses. According to the invention, the optical system is adapted to an individual user, a first target value for adaptation of a display control unit of the display unit for controlling the first display being determined, a current viewing direction of the first eye being determined by the gaze-tracking glasses, a virtual object being generated and, taking account of the first target value, the virtual object being displayed in the first display at a position in the determined viewing direction of the first eye.
    Type: Grant
    Filed: December 19, 2019
    Date of Patent: January 24, 2023
    Inventors: Julian Grahsl, Michael Mörtenhuber, Frank Linsenmaier
  • Patent number: 11537202
    Abstract: A system and method for generating data suitable for calibrating a head-wearable device is disclosed. In one example, the device includes a first eye camera and a scene camera. The first eye camera is used to generate first images of at least a portion of a first eye of the user while the user is expected to look at the object and mimic the non-translatory movement. Respective positions of the object in the field images are determined. The determined positions of the object are used to determine, for the first images, respective ground truth values of at least one gaze-direction related parameter of the user.
    Type: Grant
    Filed: January 16, 2019
    Date of Patent: December 27, 2022
    Assignee: Pupil Labs GmbH
    Inventors: Moritz Kassner, Marc Tonsen, Kai Dierkes
  • Patent number: 11503998
    Abstract: The present disclosure relates to a method and a system for detecting a neurological disease and an eye gaze-pattern abnormality related to the neurological disease of a user. The method comprises displaying stimulus videos on a screen of an electronic device and simultaneously filming with a camera of the electronic device to generate a video of the user's face for each one of the stimulus videos, each one of the stimulus videos corresponding to a task. The method further comprises providing a machine learning model for gaze predictions, generating the gaze predictions for each video frame of the recorded video, and determining features for each task to detect the neurological disease using a pre-trained machine learning model.
    Type: Grant
    Filed: May 5, 2021
    Date of Patent: November 22, 2022
    Inventors: Etienne De Villers-Sidani, Paul Alexandre Drouin-Picaro, Yves Desgagne
  • Patent number: 11509816
    Abstract: An image processing apparatus includes a display unit configured to display an image captured via an optical system, an estimation unit configured to estimate a gazing point position of a user on the display unit, and a control unit configured to change a zoom position of an image displayed on the display unit. When the start of image pickup assist control is instructed, the control unit zooms out the image displayed on the display unit from a first zoom position to a second zoom position on a more wide-angle side than the first zoom position. When the stop of the image pickup assist control is instructed, the control unit zooms in on the zoomed-out image from the second zoom position to a third zoom position on a more telephoto side than the second zoom position, based on the gazing point position estimated by the estimation unit.
    Type: Grant
    Filed: February 2, 2021
    Date of Patent: November 22, 2022
    Inventors: Yuya Ebata, Hiroyuki Yaguchi
  • Patent number: 11490809
    Abstract: A system and/or method for measuring a human ocular parameter comprises a human-wearable face shield which has an eye sensor, a head orientation sensor, and an electronic circuit. The eye sensor comprises a video camera that measures horizontal eye movement, vertical eye movement, pupillometry, and/or eyelid movement. The head orientation sensor measures pitch and/or yaw of the wearer's face. The electronic circuit is responsive to the eye sensor and the head orientation sensor and measures an ocular parameter such as vestibulo-ocular reflex, ocular saccades, pupillometry, pursuit tracking during visual pursuit, vergence, eye closure, focused position of the eyes, dynamic visual acuity, kinetic visual acuity, virtual retinal stability, retinal image stability, foveal fixation stability, or nystagmus.
    Type: Grant
    Filed: June 16, 2020
    Date of Patent: November 8, 2022
    Inventor: Wesley W. O. Krueger
  • Patent number: 11487358
    Abstract: A display apparatus including: light source(s); camera(s); and processor(s) configured to: display extended-reality image for presentation to user, whilst capturing eye image(s) of user's eyes; analyse eye image(s) to detect eye features; employ existing calibration model to determine gaze directions of user's eyes; determine gaze location of user; identify three-dimensional bounding box at gaze location within extended-reality environment, based on position and optical depth of gaze location; identify inlying pixels of extended-reality image lying within three-dimensional bounding box, based on optical depths of pixels in extended-reality image; compute probability of user focussing on given inlying pixel and generate probability distribution of probabilities computed for inlying pixels; identify at least one inlying pixel calibration target, based on probability distribution; and map position of calibration target to eye features, to update existing calibration model to generate new calibration model.
    Type: Grant
    Filed: April 19, 2021
    Date of Patent: November 1, 2022
    Assignee: Varjo Technologies Oy
    Inventors: Ville Miettinen, Mikko Strandborg
  • Patent number: 11478141
    Abstract: Methods and systems for assessing a visual field of a person are provided. Information can be presented to a person undergoing visual field testing in a manner that utilizes the person's natural tendency to look at an object that is displayed so that it attracts the person's attention. A fixation target can be displayed on a display viewed by a user. Once it is determined that the user has viewed the fixation target and the location of the person's eye(s) is determined, a test target is displayed on the display in a location corresponding to a location in the user's visual field. The test target is determined to be either detected or missed based on user input acquired as the user is viewing the display.
    Type: Grant
    Filed: November 14, 2018
    Date of Patent: October 25, 2022
    Assignee: Vivid Vision, Inc.
    Inventors: James J. Blaha, Benjamin T. Backus, Manish Z. Gupta
  • Patent number: 11454811
    Abstract: The present disclosure provides a method and apparatus for unlocking a head-mounted display device based on gaze point information. The method includes: acquiring gaze point information of a user who wears a locked head-mounted display device; generating unlocking information according to the gaze point information; performing a matching operation on the unlocking information and a pre-stored unlocking key; and unlocking the head-mounted display device when the unlocking information matches the unlocking key. With this method, convenience and safety of the unlocking operation may be improved, and user's experience may be enhanced.
    Type: Grant
    Filed: September 6, 2019
    Date of Patent: September 27, 2022
    Inventors: Bing Xiao, Chi Xu
  • Patent number: 11426107
    Abstract: The present disclosure relates generally to a system and method for detecting or indicating a state of impairment of a test subject or user due to drugs or alcohol, and more particularly to a method, system, and application or software program configured to create a virtual-reality ("VR") environment that implements drug and alcohol impairment tests, and which utilizes eye tracking technology to detect or indicate impairment.
    Type: Grant
    Filed: October 17, 2019
    Date of Patent: August 30, 2022
    Inventors: Scott M. Gibbons, Aaron Frank, Matthew Kromer, Bohdan Paselsky
  • Patent number: 11375891
    Abstract: Methods, systems, and devices to improve the assessment of visual function and overcome limitations of current methods to identify the visual function that potentially could be reached by a given eye. Multiple eye tests including visual stimuli plus optical measurement components, and methods to combine the results, identify and quantify sources of decreased vision resulting from optical sources, such as the lens and cornea, as distinguished from retinal sources. By identifying potentially correctable optical sources of decreased vision, and overcoming physiological limitations such as the size of the eye's pupil, the visual benefits of treatment such as cataract or corneal surgery are distinguished from retinal pathology that requires medical intervention. The devices and methods provide metrics that include an expected value of the visual function and sources of variability, including both optical and neural components, to guide treatment and improve clinical trials.
    Type: Grant
    Filed: December 17, 2021
    Date of Patent: July 5, 2022
    Assignee: The Trustees of Indiana University
    Inventor: Ann E. Elsner
  • Patent number: 11369262
    Abstract: The invention relates to a method for determining the likelihood of being at or beyond a certain rate and/or risk of visual field progression (VFP) of a user, comprising the following steps: an installing step (S100) comprising placing or implanting a continuous-wear sensor on or in an eye of a user; a measuring step (S101) comprising measuring ocular biomechanical properties (OBP) through the continuous-wear sensor placed on or implanted in the eye, with said measurement comprising repeated data capture at regular time intervals; a recording step (S102) comprising recording the user's ocular biomechanical properties in the form of at least one OBP time series plot in a recorder; a processing step (S103) wherein at least one of a plurality of OBP parameters are extracted from the at least one recorded OBP time series plot; a calculation step (S104) wherein the at least one of a plurality of OBP parameters are associated to VFP; a determining step (S105) wherein one determines whether the visual field progress
    Type: Grant
    Filed: June 14, 2017
    Date of Patent: June 28, 2022
    Assignee: Sensimed SA
    Inventors: Mario Schlund, Thierry Varidel, Raphael Fritschi, Carlos Gustavo De Moraes
  • Patent number: 11368609
    Abstract: The present disclosure relates to technology that enables downsizing of an electronic instrument having a function of imaging the surroundings of a user. In an electronic instrument worn or used by a user, the electronic instrument includes an imaging unit arranged at a position where the surroundings of the user wearing or using the electronic instrument are capturable, the imaging unit including a plurality of pixel output units that each receive incident light from a subject without it passing through either an imaging lens or a pinhole and output one detection signal indicating an output pixel value modulated depending on an incident angle of the incident light. The present disclosure can be applied to, for example, a wearable device.
    Type: Grant
    Filed: October 19, 2018
    Date of Patent: June 21, 2022
    Inventors: Akira Tokuse, Yoshitaka Miyatani, Noriaki Kozuka