Patents by Inventor Sergio ORTIZ EGEA
Sergio ORTIZ EGEA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20200301239

Abstract: A near-eye display system comprises a display projector configured to emit display light, an optical waveguide, a fixed-focus lens, and a variable-focus lens of variable optical power. The optical waveguide is configured to receive the display light and to release the display light toward an observer. The fixed-focus lens is arranged to adjust a vergence of the display light released from the optical waveguide. The variable-focus lens is arranged in series with the fixed-focus lens and configured to vary, responsive to a focusing bias, the vergence of the display light released from the optical waveguide.

Type: Application
Filed: March 18, 2019
Publication date: September 24, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Onur Can AKKAYA, Bernard Charles KRESS, Sergio ORTIZ EGEA, Alfonsus D. LUNARDHI
-
Patent number: 10785422

Abstract: A camera is configured to output a test depth+multi-spectral image including a plurality of pixels. Each pixel corresponds to one of the plurality of sensors of a sensor array of the camera and includes at least a depth value and a spectral value for each spectral light sub-band of a plurality of spectral illuminators of the camera. A face recognition machine is previously trained with a set of labeled training depth+multi-spectral images having a same structure as the test depth+multi-spectral image. The face recognition machine is configured to output a confidence value indicating a likelihood that the test depth+multi-spectral image includes a face.

Type: Grant
Filed: May 29, 2018
Date of Patent: September 22, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sergio Ortiz Egea, Michael Scott Fenton, Abdelrehim Ahmed
-
Publication number: 20200271583

Abstract: A camera system includes one or more spectral illuminators, a tunable optical filter, and a sensor array. Active spectral light emitted from the one or more spectral illuminators towards a scene is dynamically tuned to an illumination sub-band selected from a plurality of different illumination sub-bands. Sequentially for each of a plurality of fluorescing light sub-bands different than the selected illumination sub-band, the tunable optical filter is adjusted to block light from being transmitted from the scene to the sensor array in all but a tested fluorescing light sub-band from the plurality of different fluorescing light sub-bands, and the sensor array is addressed to acquire one or more images of the scene in the tested fluorescing light sub-band.

Type: Application
Filed: February 27, 2019
Publication date: August 27, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sergio ORTIZ EGEA, Maria Esther PACE, Onur Can AKKAYA, Michael Scott FENTON
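The acquisition loop this abstract describes — illuminate in one sub-band, then sweep the tunable filter across the remaining fluorescing sub-bands, capturing per band — can be sketched as follows. This is a minimal illustration only; the `capture` callable is a hypothetical stand-in for tuning the filter and addressing the sensor array, not an API from the patent.

```python
def acquire_fluorescence_images(sub_bands, illumination_band, capture):
    """Capture one image per fluorescing sub-band.

    Fluorescence is measured outside the excitation band, so the
    illumination sub-band itself is skipped. `capture(band)` stands in
    for tuning the optical filter to `band` and reading out the sensor.
    """
    images = {}
    for band in sub_bands:
        if band == illumination_band:
            continue  # the filter never selects the excitation band
        images[band] = capture(band)
    return images


# Example with a fake capture function that just labels each frame.
bands = ["450nm", "550nm", "650nm"]
frames = acquire_fluorescence_images(bands, "450nm", capture=lambda b: f"frame@{b}")
```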
-
Publication number: 20200240781

Abstract: A time-of-flight (ToF) system disclosed herein provides a method of separating a direct component of light collected by a ToF detector from a global component of light collected by the ToF detector. The method comprises acquiring three or more images, represented by three or more matrices, in response to illuminating a target with a light source using a first spatial pattern at three or more different modulation frequencies; acquiring an additional image, represented by an additional matrix, in response to illuminating the target with the light source using a second spatial pattern, the second spatial pattern being different than the first spatial pattern; and determining one or more parameters of the direct component of light and the global component of light based on analysis of the three or more matrices and the additional matrix.

Type: Application
Filed: January 30, 2019
Publication date: July 30, 2020
Inventors: Sergio ORTIZ EGEA, Hung-Ming LIN
-
Patent number: 10718942

Abstract: An eye tracking system for a NED device includes sensors that are directed toward and angularly offset from a user's eyes in a manner that causes circular features (e.g., irises and/or pupils) of the user's eyes to appear elliptical within sensor planes that correspond to the individual sensors. The eye tracking system determines parameters associated with detected ellipses and then uses these parameters to generate 3D propagations from the detected ellipses back to the user's eyes. By analyzing these 3D propagations with respect to ocular rotation models that represent how the pupil and iris rotate about the center of the eye, the eye tracking system determines pupil orientation parameters that define physical characteristics of the user's eyes. The pupil orientation parameters may then be used to determine interpupillary distance and/or vergence of the user's visual axes, which extend from the user's foveae.

Type: Grant
Filed: October 23, 2018
Date of Patent: July 21, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sergio Ortiz Egea, Venkata Satya Raghavendra Bulusu, Bernard C. Kress, Alfonsus D. Lunardhi, Onur Can Akkaya
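One geometric fact underlying this approach: a circular pupil viewed off-axis projects to an ellipse whose minor-to-major axis ratio equals the cosine of the viewing angle. The sketch below shows only that single relation, as a simplification; it is not the patent's full 3D propagation or ocular rotation model, and the function name is illustrative.

```python
import math

def gaze_angle_from_ellipse(major_axis, minor_axis):
    """Angle (radians) between the pupil's normal and the sensor's view
    direction, from the projected ellipse: cos(angle) = minor / major."""
    ratio = max(0.0, min(1.0, minor_axis / major_axis))  # clamp for noise
    return math.acos(ratio)
```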
-
Publication number: 20200228753

Abstract: A camera system is configured to automatically monitor an area. Depth image(s) of the area are acquired based on active IR light emitted by the camera system and reflected from the area to a sensor array of the camera system. The depth image(s) are computer analyzed to identify a human subject. For each spectral illuminator of the camera system, spectral light image(s) of the area are acquired based on active spectral light in the spectral light sub-band of the spectral illuminator reflected from the area to the sensor array. The spectral light image(s) for the spectral illuminators are computer analyzed to identify an interaction between the human subject and an object in the area. In response to identifying the interaction between the human subject and the object in the area, an action to be performed for the object in the area is computer issued.

Type: Application
Filed: January 15, 2019
Publication date: July 16, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sergio ORTIZ EGEA, Riaz Imdad ALI, Michael Scott FENTON, Onur Can AKKAYA
-
Publication number: 20200132277

Abstract: A camera includes one or more spectral illuminators, a tunable optical filter optically intermediate the one or more spectral illuminators and a scene, and a sensor array. The one or more spectral illuminators are configured to emit active spectral light. The tunable optical filter is dynamically adjustable to change a selected sub-band of the active spectral light that illuminates the scene. The sensor array includes a plurality of sensors each configured to measure spectral light reflected from the scene in the selected sub-band.

Type: Application
Filed: November 12, 2019
Publication date: April 30, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Onur Can AKKAYA, Sergio ORTIZ EGEA, Cyrus Soli BAMJI
-
Publication number: 20200137324

Abstract: A camera includes one or more spectral illuminators, a tunable optical filter optically intermediate the one or more spectral illuminators and a scene, and a sensor array. The one or more spectral illuminators are configured to emit active spectral light. The tunable optical filter is dynamically adjustable to change a selected sub-band of the active spectral light that illuminates the scene. The sensor array includes a plurality of sensors each configured to measure spectral light reflected from the scene in the selected sub-band.

Type: Application
Filed: October 31, 2018
Publication date: April 30, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Onur Can AKKAYA, Sergio ORTIZ EGEA, Cyrus Soli BAMJI
-
Publication number: 20200121183

Abstract: Techniques for implementing eye tracking using various real-time computational solutions to a three-dimensional eye tracking framework. An exemplary eye tracking system for a NED device includes sensors that are directed toward and angularly offset from a user's eyes in a manner that causes circular features (e.g., irises and/or pupils) of the user's eyes to appear elliptical within sensor planes of the individual sensors. An iris and/or pupil of an eye will appear circular when the eye is viewed straight on (i.e., perpendicular to an optical axis of the eye's lens) but elliptical when observed from an angular offset. The eye tracking systems and methods disclosed herein exploit these principles to track movements of the user's eyes with a higher degree of accuracy than conventional eye tracking systems.

Type: Application
Filed: May 16, 2019
Publication date: April 23, 2020
Inventors: Sergio ORTIZ EGEA, Jian Feng GAO, Alfonsus D. LUNARDHI, Venkata Satya Raghavendra BULUSU
-
Publication number: 20200124844

Abstract: An eye tracking system for a NED device includes sensors that are directed toward and angularly offset from a user's eyes in a manner that causes circular features (e.g., irises and/or pupils) of the user's eyes to appear elliptical within sensor planes that correspond to the individual sensors. The eye tracking system determines parameters associated with detected ellipses and then uses these parameters to generate 3D propagations from the detected ellipses back to the user's eyes. By analyzing these 3D propagations with respect to ocular rotation models that represent how the pupil and iris rotate about the center of the eye, the eye tracking system determines pupil orientation parameters that define physical characteristics of the user's eyes. The pupil orientation parameters may then be used to determine interpupillary distance and/or vergence of the user's visual axes, which extend from the user's foveae.

Type: Application
Filed: October 23, 2018
Publication date: April 23, 2020
Inventors: Sergio ORTIZ EGEA, Venkata Satya Raghavendra BULUSU, Bernard C. KRESS, Alfonsus D. LUNARDHI, Onur Can AKKAYA
-
Publication number: 20200125166

Abstract: Technologies for performing user-specific calibration of eye tracking systems for Near-Eye-Display (NED) devices. The NED device may sequentially present different virtual stimuli to a user while concurrently capturing instances of eye tracking data. The eye tracking data reveals calibration ellipse centers that uniquely correspond to individual virtual stimuli. The calibration ellipse centers may be used to define a polygon grid in association with a sensor plane. The resulting polygon grid is used during operation to interpolate the real-time gaze direction of the user. For example, a real-time instance of eye tracking data may be analyzed to determine which particular polygon of the polygon grid a real-time ellipse center falls within. Then, distances between the real-time ellipse center and the vertices of the particular polygon may be determined. A proportionality factor is then determined based on these distances and is used to interpolate the real-time eye gaze of the user.

Type: Application
Filed: May 16, 2019
Publication date: April 23, 2020
Inventors: Sergio ORTIZ EGEA, Jian Feng GAO, Alfonsus D. LUNARDHI, Venkata Satya Raghavendra BULUSU
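The interpolation step — weight the known gaze directions at the containing polygon's vertices by distance to the real-time ellipse center — can be illustrated with inverse-distance weighting. This is one plausible reading of the distance-based proportionality factor; the exact weighting scheme and all names here are illustrative assumptions, not taken from the patent.

```python
import math

def interpolate_gaze(ellipse_center, vertices, vertex_gazes, eps=1e-9):
    """Inverse-distance-weighted gaze estimate inside one polygon.

    vertices:     (x, y) calibration ellipse centers forming the polygon
    vertex_gazes: (gx, gy) gaze directions associated with each vertex
    """
    weights = [1.0 / (math.hypot(ellipse_center[0] - vx,
                                 ellipse_center[1] - vy) + eps)
               for vx, vy in vertices]
    total = sum(weights)
    gx = sum(w * g[0] for w, g in zip(weights, vertex_gazes)) / total
    gy = sum(w * g[1] for w, g in zip(weights, vertex_gazes)) / total
    return gx, gy
```

A center coinciding with a calibration vertex recovers that vertex's gaze; a center equidistant from all vertices averages them.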
-
Publication number: 20200128231

Abstract: A Near-Eye-Display (NED) device utilizes eye tracking to interpret gaze direction as user input for hands-free positioning of virtual items. The NED device includes an eye tracking system to monitor gaze direction and a display component that renders virtual items within a user's view of a physical real-world environment. The NED device receives an indication that the user desires to adjust a position of the virtual item. In response, the NED device tracks the user's eye movements to dynamically change the position at which the display component is rendering the virtual item. In this way, a NED device may identify when a user desires to adjust the position at which a virtual item is being rendered and then may enable the user to make the desired adjustments simply by "dragging" the virtual item with deliberate and controlled eye movements.

Type: Application
Filed: April 12, 2019
Publication date: April 23, 2020
Inventors: Maria Esther PACE, Sergio ORTIZ EGEA
-
Publication number: 20200125165

Abstract: A Near-Eye-Display (NED) device translates combinations of user gaze direction and predetermined facial gestures into user input instructions. The NED device includes an eye tracking system and a display that renders computer-generated images within a user's field-of-view. The eye tracking system may continually track the user's eye movements with a high degree of accuracy to identify specific computer-generated images that a user is focused on. The eye tracking system may also identify various facial gestures such as, for example, left-eye blinks and/or right-eye blinks that are performed while the specific computer-generated images are being focused on. In this way, NED devices are enabled to identify combinations of user gaze direction and predetermined facial gestures and to translate these identified combinations into user input instructions that correspond to specific computer-generated images.

Type: Application
Filed: April 12, 2019
Publication date: April 23, 2020
Inventors: Maria Esther PACE, Sergio ORTIZ EGEA
-
Patent number: 10607352

Abstract: A time-of-flight (ToF) camera is configured to operate in a manner that reduces power consumption of the ToF camera. For a key frame, a key-frame depth image is generated based on a plurality of sets of key-frame IR images. Each set of key-frame IR images is acquired using a different modulation frequency of active IR light. For a P-frame after the key frame, a P-frame depth image is generated based on a set of P-frame IR images acquired using a single modulation frequency of active IR light.

Type: Grant
Filed: May 17, 2018
Date of Patent: March 31, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sergio Ortiz Egea, Hung-Ming Lin
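The key-frame/P-frame idea reduces to choosing, per frame, how many modulation frequencies to capture. A minimal scheduling sketch, where the key-frame interval, the frequency values, and the function name are illustrative assumptions rather than parameters from the patent:

```python
def plan_capture(frame_index, key_interval, frequencies):
    """Frequencies to use for this frame: all of them on a key frame,
    a single one on the intermediate P-frames (lower illumination power)."""
    if frame_index % key_interval == 0:
        return list(frequencies)   # key frame: multi-frequency depth solve
    return [frequencies[0]]        # P-frame: single modulation frequency

# Six-frame schedule with a key frame every fourth frame.
schedule = [plan_capture(i, 4, [20e6, 50e6, 100e6]) for i in range(6)]
```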
-
Patent number: 10598768

Abstract: A time-of-flight (ToF) system disclosed herein provides a method of separating a direct component of light collected by a ToF detector from a global component of light collected by the ToF detector by: acquiring a first image, represented by a first matrix, in response to illuminating a target with a light source using a first spatial pattern; acquiring a second image, represented by a second matrix, in response to illuminating the target with the light source using a second spatial pattern, the second spatial pattern being different than the first spatial pattern; and determining one or more parameters of the direct component of light and the global component of light based on analysis of the first matrix and the second matrix.

Type: Grant
Filed: May 24, 2017
Date of Patent: March 24, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventor: Sergio Ortiz Egea
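Two-pattern direct/global separation is in the spirit of classic decomposition with complementary high-frequency illumination (cf. Nayar et al.): at each pixel, the brighter of the two measurements is approximately direct + global/2 and the darker is global/2. The sketch below shows that generic decomposition, not the patented ToF-specific analysis:

```python
import numpy as np

def separate_direct_global(img_a, img_b):
    """Separate direct and global components from two images taken under
    complementary spatial illumination patterns."""
    hi = np.maximum(img_a, img_b)   # lit pixel:   direct + global/2
    lo = np.minimum(img_a, img_b)   # unlit pixel: global/2
    return hi - lo, 2.0 * lo        # direct, global
```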
-
Patent number: 10592753

Abstract: The described implementations relate to managing depth cameras. One example can include a depth camera that includes an emitter for illuminating a scene with light and a sensor for sensing light reflected from the scene. The example can also include a resource-conserving camera control component configured to determine when the scene is static by comparing captures and/or frames of the scene from the sensor. The resource-conserving camera control component can operate the depth camera in resource-constrained modes while the scene remains static.

Type: Grant
Filed: March 1, 2019
Date of Patent: March 17, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Sergio Ortiz Egea, Onur C. Akkaya, Cyrus Bamji
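Static-scene detection by comparing frames can be sketched as a simple frame difference; the mean-absolute-difference metric and the threshold value here are illustrative choices, not the patented comparison:

```python
import numpy as np

def is_static(prev_frame, curr_frame, threshold=1.0):
    """Treat the scene as static (eligible for a resource-constrained
    mode) when the mean absolute per-pixel change is below threshold."""
    diff = np.abs(curr_frame.astype(np.float64) - prev_frame.astype(np.float64))
    return float(diff.mean()) < threshold
```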
-
Publication number: 20200036880

Abstract: Examples are disclosed that relate to methods and systems for determining if a fluid is present on a surface. In one example, a method comprises illuminating the surface with narrow-band light and using an image sensor comprising a narrow-bandpass filter matching the bandwidth of the narrow-band light to obtain a first image of the surface. A second image of the surface with the narrow-band light deactivated is obtained. A third image is generated by subtracting the second image from the first image. The third image is thresholded, one or more contrasting regions are detected, and the presence of fluid on the surface is determined.

Type: Application
Filed: July 25, 2018
Publication date: January 30, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sergio ORTIZ EGEA, Michael Scott FENTON, Venkata Satya Raghavendra BULUSU, Riaz Imdad ALI
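The subtract-threshold-detect pipeline in this abstract can be sketched as follows. The threshold and minimum region size are illustrative assumptions, and region detection is reduced to a pixel count rather than the contour-based detection the abstract implies:

```python
import numpy as np

def fluid_present(lit_image, dark_image, threshold=30, min_pixels=10):
    """Subtract the light-off image from the illuminated image, threshold
    the difference, and report fluid if enough contrasting pixels remain."""
    diff = lit_image.astype(np.int32) - dark_image.astype(np.int32)
    mask = diff > threshold          # contrasting regions survive subtraction
    return int(mask.sum()) >= min_pixels
```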
-
Publication number: 20190373186

Abstract: A camera is configured to output a test depth+multi-spectral image including a plurality of pixels. Each pixel corresponds to one of the plurality of sensors of a sensor array of the camera and includes at least a depth value and a spectral value for each spectral light sub-band of a plurality of spectral illuminators of the camera. A face recognition machine is previously trained with a set of labeled training depth+multi-spectral images having a same structure as the test depth+multi-spectral image. The face recognition machine is configured to output a confidence value indicating a likelihood that the test depth+multi-spectral image includes a face.

Type: Application
Filed: May 29, 2018
Publication date: December 5, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sergio ORTIZ EGEA, Michael Scott FENTON, Abdelrehim AHMED
-
Publication number: 20190364254

Abstract: A camera includes an optical filter for a sensor array of the camera. The optical filter includes a plurality of liquid crystal layers switchable between a reflection state and a transmission state. Each liquid crystal layer is configured to block spectral light in a different spectral light sub-band and transmit spectral light outside of the spectral light sub-band in the reflection state, and to transmit spectral light in the spectral light sub-band in the transmission state. An active illuminator of the camera is configured to emit active light in a selected spectral light sub-band. One or more of the plurality of liquid crystal layers is switched from the transmission state to the reflection state to tune the optical filter to block spectral light in all but the selected spectral light sub-band. The sensors of the sensor array are addressed to measure spectral light in the selected spectral light sub-band.

Type: Application
Filed: May 23, 2018
Publication date: November 28, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sergio ORTIZ EGEA, Onur Can AKKAYA, Cyrus Soli BAMJI
-
Publication number: 20190355136

Abstract: A time-of-flight (ToF) camera is configured to operate in a manner that reduces power consumption of the ToF camera. For a key frame, a key-frame depth image is generated based on a plurality of sets of key-frame IR images. Each set of key-frame IR images is acquired using a different modulation frequency of active IR light. For a P-frame after the key frame, a P-frame depth image is generated based on a set of P-frame IR images acquired using a single modulation frequency of active IR light.

Type: Application
Filed: May 17, 2018
Publication date: November 21, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Sergio ORTIZ EGEA, Hung-Ming LIN