Patents by Inventor Andrew C. Goris

Andrew C. Goris has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11513605
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Grant
    Filed: October 22, 2020
    Date of Patent: November 29, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
  • Patent number: 11314321
    Abstract: One disclosed example provides a head-mounted device configured to control a plurality of light sources of a handheld object and acquire image data comprising a sequence of environmental tracking exposures in which the plurality of light sources are controlled to have a lower integrated intensity and handheld object tracking exposures in which the plurality of light sources are controlled to have a higher integrated intensity. The instructions are further executable to detect, via an environmental tracking exposure, one or more features of the surrounding environment, determine a pose of the head-mounted device based upon the one or more features of the surrounding environment detected, detect via a handheld object tracking exposure the plurality of light sources of the handheld object, determine a pose of the handheld object relative to the head-mounted device based upon the plurality of light sources detected, and output the pose of the handheld object.
    Type: Grant
    Filed: July 7, 2020
    Date of Patent: April 26, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, William Douglas Guyman, Vuk Jovanovic, Taras Khapko, Ivan Razumenic, Vladimir Carapic, Martin Thomas Shetter, Jelena Mojasevic, Andrew C. Goris, Marko Bezulj
  • Publication number: 20210034161
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Application
    Filed: October 22, 2020
    Publication date: February 4, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
  • Patent number: 10908694
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Grant
    Filed: February 1, 2016
    Date of Patent: February 2, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
  • Publication number: 20200333878
    Abstract: One disclosed example provides a head-mounted device configured to control a plurality of light sources of a handheld object and acquire image data comprising a sequence of environmental tracking exposures in which the plurality of light sources are controlled to have a lower integrated intensity and handheld object tracking exposures in which the plurality of light sources are controlled to have a higher integrated intensity. The instructions are further executable to detect, via an environmental tracking exposure, one or more features of the surrounding environment, determine a pose of the head-mounted device based upon the one or more features of the surrounding environment detected, detect via a handheld object tracking exposure the plurality of light sources of the handheld object, determine a pose of the handheld object relative to the head-mounted device based upon the plurality of light sources detected, and output the pose of the handheld object.
    Type: Application
    Filed: July 7, 2020
    Publication date: October 22, 2020
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, William Douglas Guyman, Vuk Jovanovic, Taras Khapko, Ivan Razumenic, Vladimir Carapic, Martin Thomas Shetter, Jelena Mojasevic, Andrew C. Goris, Marko Bezulj
  • Patent number: 10719125
    Abstract: One disclosed example provides a head-mounted device configured to control a plurality of light sources of a handheld object and acquire image data comprising a sequence of environmental tracking exposures in which the plurality of light sources are controlled to have a lower integrated intensity and handheld object tracking exposures in which the plurality of light sources are controlled to have a higher integrated intensity. The instructions are further executable to detect, via an environmental tracking exposure, one or more features of the surrounding environment, determine a pose of the head-mounted device based upon the one or more features of the surrounding environment detected, detect via a handheld object tracking exposure the plurality of light sources of the handheld object, determine a pose of the handheld object relative to the head-mounted device based upon the plurality of light sources detected, and output the pose of the handheld object.
    Type: Grant
    Filed: November 29, 2017
    Date of Patent: July 21, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, William Douglas Guyman, Vuk Jovanovic, Taras Khapko, Ivan Razumenic, Vladimir Carapic, Martin Thomas Shetter, Jelena Mojasevic, Andrew C. Goris, Marko Bezulj
  • Publication number: 20190340317
    Abstract: Systems and methods are disclosed for using a synthetic world interface to model environments, sensors, and platforms, such as for computer vision sensor platform design. Digital models may be passed through a simulation service to generate synthetic experiment data. Systematic sweeps of parameters for various components of the sensor or platform design under test, under multiple environmental conditions, can facilitate time- and cost-efficient engineering efforts by revealing parameter sensitivities and environmental effects for multiple proposed configurations. Searches through the generated synthetic experimental data results can permit rapid identification of desirable design configuration candidates.
    Type: Application
    Filed: May 7, 2018
    Publication date: November 7, 2019
    Inventors: Jonathan Chi Hang Chan, Michael Ebstyne, Alex A. Kipman, Pedro U. Escos, Andrew C. Goris
  • Patent number: 10228561
    Abstract: An example see-through head-mounted display system includes a freeform prism and a display device configured to emit display light through the freeform prism to an eye of a user. The see-through head-mounted display system may also include an imaging device configured to receive gaze-detection light reflected from the eye and directed through the freeform prism.
    Type: Grant
    Filed: June 25, 2013
    Date of Patent: March 12, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Steve Robbins, Scott McEldowney, Xinye Lou, David Nister, Drew Steedly, Quentin Simon Charles Miller, David D. Bohn, James Peele Terrell, Jr., Andrew C. Goris, Nathan Ackerman
  • Publication number: 20180329484
    Abstract: One disclosed example provides a head-mounted device configured to control a plurality of light sources of a handheld object and acquire image data comprising a sequence of environmental tracking exposures in which the plurality of light sources are controlled to have a lower integrated intensity and handheld object tracking exposures in which the plurality of light sources are controlled to have a higher integrated intensity. The instructions are further executable to detect, via an environmental tracking exposure, one or more features of the surrounding environment, determine a pose of the head-mounted device based upon the one or more features of the surrounding environment detected, detect via a handheld object tracking exposure the plurality of light sources of the handheld object, determine a pose of the handheld object relative to the head-mounted device based upon the plurality of light sources detected, and output the pose of the handheld object.
    Type: Application
    Filed: November 29, 2017
    Publication date: November 15, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Drew Steedly, Michael Edward Samples, Alexandru Octavian Balan, William Douglas Guyman, Vuk Jovanovic, Taras Khapko, Ivan Razumenic, Vladimir Carapic, Martin Thomas Shetter, Jelena Mojasevic, Andrew C. Goris, Marko Bezulj
  • Publication number: 20170220119
    Abstract: Examples of mixed reality computing devices that utilize remote sensors and local sensors as input devices are disclosed. In one example, a mixed reality computing device comprises an image sensor, a remote input device, a processor, and storage comprising stored instructions. The stored instructions are executable by the processor to perform object motion tracking and environmental tracking based on output from the image sensor, and in response to detecting that the remote input device is in use, adjust a parameter of the motion tracking while maintaining the environmental tracking.
    Type: Application
    Filed: February 1, 2016
    Publication date: August 3, 2017
    Inventors: Lori Ann Potts, Lev Cherkashin, David Rohn, Steven James Velat, Andrew C. Goris, Scott Francis Fullam, Travis Scott Legg, Craig Haskins, James Tichenor
  • Patent number: 9639985
    Abstract: A system and method are disclosed for detecting angular displacement of a display element relative to a reference position on a head mounted display device for presenting a mixed reality or virtual reality experience. Once the displacement is detected, it may be corrected for to maintain the proper binocular disparity of virtual images displayed to the left and right display elements of the head mounted display device. In one example, the detection system uses an optical assembly including collimated LEDs and a camera which together are insensitive to linear displacement. Such a system provides a true measure of angular displacement of one or both display elements on the head mounted display device.
    Type: Grant
    Filed: June 24, 2013
    Date of Patent: May 2, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Steven John Robbins, Drew Steedly, Nathan Ackerman, Quentin Simon Charles Miller, Andrew C. Goris
  • Patent number: 9330464
    Abstract: Embodiments are disclosed that relate to controlling a depth camera. In one example, a method comprises emitting light from an illumination source toward a scene through an optical window, selectively routing at least a portion of the light emitted from the illumination source to an image sensor such that the portion of the light is not transmitted through the optical window, receiving an output signal generated by the image sensor based on light reflected by the scene, the output signal including at least one depth value of the scene, and adjusting the output signal based on the selectively routed portion of the light.
    Type: Grant
    Filed: December 12, 2014
    Date of Patent: May 3, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Nathan Ackerman, Andrew C. Goris, Amir Nevet, David Mandelboum, Asaf Pellman
  • Publication number: 20150003819
    Abstract: Technology disclosed herein automatically focuses a camera based on eye tracking. Techniques include tracking the gaze of a user's eyes to determine a location at which the user is focusing. Then, a camera lens may be focused on that location. In one aspect, a first vector that corresponds to a first direction in which a first eye of a user is gazing at a point in time is determined. A second vector that corresponds to a second direction in which a second eye of the user is gazing at the point in time is determined. A location of an intersection of the first vector and the second vector is determined. A distance between the location of intersection and a location of a lens of the camera is determined. The lens is focused based on the distance. The lens could also be focused based on a single eye vector and a depth image.
    Type: Application
    Filed: June 28, 2013
    Publication date: January 1, 2015
    Inventors: Nathan Ackerman, Andrew C. Goris, Bruno Silva
  • Publication number: 20140375681
    Abstract: A system and method are disclosed for detecting angular displacement of a display element relative to a reference position on a head mounted display device for presenting a mixed reality or virtual reality experience. Once the displacement is detected, it may be corrected for to maintain the proper binocular disparity of virtual images displayed to the left and right display elements of the head mounted display device. In one example, the detection system uses an optical assembly including collimated LEDs and a camera which together are insensitive to linear displacement. Such a system provides a true measure of angular displacement of one or both display elements on the head mounted display device.
    Type: Application
    Filed: June 24, 2013
    Publication date: December 25, 2014
    Inventors: Steven John Robbins, Drew Steedly, Nathan Ackerman, Quentin Simon Charles Miller, Andrew C. Goris
  • Publication number: 20140375790
    Abstract: Embodiments are disclosed for a see-through head-mounted display system. In one embodiment, the see-through head-mounted display system comprises a freeform prism, and a display device configured to emit display light through the freeform prism to an eye of a user. The see-through head-mounted display system may also comprise an imaging device having an entrance pupil positioned at a back focal plane of the freeform prism, the imaging device configured to receive gaze-detection light reflected from the eye and directed through the freeform prism.
    Type: Application
    Filed: June 25, 2013
    Publication date: December 25, 2014
    Inventors: Steve Robbins, Scott McEldowney, Xinye Lou, David Nister, Drew Steedly, Quentin Simon Charles Miller, David D. Bohn, James Peele Terrell, Jr., Andrew C. Goris, Nathan Ackerman
  • Publication number: 20140253790
    Abstract: A portable device including a light source, an image capture component with an image sensor, a filter with a first polarized portion and a second polarized portion, a mechanism to reposition the filter to overlap the light source with the first polarized portion of the filter and overlap the image sensor with the second polarized portion of the filter, and a controller to capture visual media with the image capture component.
    Type: Application
    Filed: June 30, 2011
    Publication date: September 11, 2014
    Applicant: Hewlett-Packard Development Company, L.P.
    Inventors: Kevin James Matherson, Andrew C. Goris, Dan L. Dalton
  • Patent number: 8767007
    Abstract: An apparatus for processing images of a display includes an image sequence receiving module adapted to receive a first sequence of images of the display. The first sequence of images includes at least one image corrupted due to partial refreshing of the display. The apparatus also includes an image processing module operable to process the first sequence of images on an image by image basis to eliminate the at least one corrupted image from at least one of the first sequence of images and a second sequence of images captured subsequent to the first sequence of images.
    Type: Grant
    Filed: October 22, 2008
    Date of Patent: July 1, 2014
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Timothy Paul James Kindberg, Andrew C. Goris, Timothy Louis Kohler
  • Patent number: 8500004
    Abstract: In a method for obtaining a resource to read a symbol, an image containing at least one of the symbol and an indicator of the symbol is obtained. At least a portion of the obtained image is sent to a symbol identification service and the resource from the symbol identification service is received, wherein the resource enables the symbol to be read.
    Type: Grant
    Filed: October 22, 2008
    Date of Patent: August 6, 2013
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Timothy Paul James Kindberg, Andrew C. Goris, Timothy Louis Kohler
  • Patent number: 8488010
    Abstract: A machine-implemented method of generating a stabilized video sequence includes receiving an input video sequence captured by an image capture device. The input video sequence includes a plurality of pairs of successive frames. Motion sensor data indicative of motion of the image capture device while the input video sequence was being captured is received. A set of matching features for each pair of successive frames is identified. Global motion features are identified in each set of matching features and qualified based on the motion sensor data. The global motion features are indicative of movement of the image capture device. A stabilized video sequence is generated based on the input video sequence and the identified global motion features.
    Type: Grant
    Filed: September 21, 2010
    Date of Patent: July 16, 2013
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Shane D. Voss, Wilfred F. Brake, Andrew C. Goris
  • Patent number: 8391698
    Abstract: Systems and methods for implementing Z-buffer generation in a camera are disclosed. In an exemplary embodiment the method may comprise exposing a plurality of pixels on an image capture device to a modulated light signal reflected from different regions of a scene adjacent a camera after different delays. The method may also comprise correlating intensity of the modulated light signal received by the image capture device for each the different delays to determine a flight time of the modulated light signal. The method may also comprise generating a Z-buffer corresponding to the different regions of the scene based on the correlation.
    Type: Grant
    Filed: October 28, 2005
    Date of Patent: March 5, 2013
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Richard Turley, Andrew C. Goris, David Branson, David R. Lawson, Donald J. Stavely, David K. Campbell
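The gaze-based autofocus abstract (publication 20140003819, listed above as 20150003819) determines where the two eye vectors intersect and focuses the lens at that distance. As a rough illustration of the geometry only, and not the patented method itself, the sketch below finds the closest point between two gaze rays (real gaze rays are rarely exactly coplanar, so a least-squares midpoint is used) and returns its distance from a hypothetical lens position. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def gaze_focus_distance(p1, d1, p2, d2, lens_pos):
    """Estimate a focus distance from two gaze rays.

    p1, p2: eye positions; d1, d2: gaze direction vectors (need not be unit);
    lens_pos: camera lens position. Returns the distance from the lens to the
    point where the two rays pass closest to each other.
    """
    p1, d1, p2, d2, lens_pos = map(np.asarray, (p1, d1, p2, d2, lens_pos))
    w0 = p2 - p1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b  # zero only when the rays are parallel
    if abs(denom) < 1e-12:
        raise ValueError("gaze rays are parallel; no convergence point")
    t1 = (c * d - b * e) / denom
    t2 = (b * d - a * e) / denom
    # Midpoint of the shortest segment between the two rays
    target = 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
    return float(np.linalg.norm(target - lens_pos))
```

For eyes 6 cm apart both gazing at a point 1 m straight ahead, the function recovers a focus distance of 1 m from a lens at the midpoint between the eyes.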
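The Z-buffer abstract (patent 8391698) describes correlating the intensity of a modulated light signal at different delays to determine its flight time. The patent states this only in general terms; one common concrete instance is four-phase time-of-flight demodulation, sketched below under that assumption. The four-sample layout and function names are illustrative, not taken from the patent.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a90, a180, a270, mod_freq_hz):
    """Recover depth from four correlation samples at 0/90/180/270 degree delays.

    The phase shift of the returned modulated signal encodes the round-trip
    flight time; depth = c * phase / (4 * pi * f) for modulation frequency f.
    """
    phase = math.atan2(a90 - a270, a0 - a180)
    if phase < 0:
        phase += 2 * math.pi  # unwrap into [0, 2*pi)
    return C * phase / (4 * math.pi * mod_freq_hz)
```

Running this per pixel over the image sensor yields the Z-buffer the abstract describes; the unambiguous range is limited to c / (2f), about 7.5 m at a 20 MHz modulation frequency.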