Patents by Inventor Otmar Hilliges

Otmar Hilliges has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240029333
    Abstract: A method including selecting a first point from a 3D model representing an avatar, the first point being associated with an eye, selecting a second point from the 3D model, the second point being associated with a periocular region associated with the eye, generating an albedo and spherical harmonics (SH) coefficients based on the first point and the second point, and generating an image point based on the albedo and the SH coefficients.
    Type: Application
    Filed: July 19, 2023
    Publication date: January 25, 2024
    Inventors: Abhimitra Meka, Thabo Beeler, Franziska Müller, Gengyan Li, Marcel Bühler, Otmar Hilliges
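The final step of the method above, shading a point from an albedo and SH coefficients, can be sketched as follows. This is a minimal illustration, not the patent's formulation: it assumes the common 9-term (second-order) real SH basis and a simple diffuse model where the output color is the per-channel albedo times the SH-encoded irradiance at the surface normal.

```python
def sh_basis(n):
    """Second-order (9-term) real spherical harmonics basis at unit normal n."""
    x, y, z = n
    return [
        0.282095,                    # Y_00 (constant band)
        0.488603 * y,                # Y_1,-1
        0.488603 * z,                # Y_1,0
        0.488603 * x,                # Y_1,1
        1.092548 * x * y,            # Y_2,-2
        1.092548 * y * z,            # Y_2,-1
        0.315392 * (3 * z * z - 1),  # Y_2,0
        1.092548 * x * z,            # Y_2,1
        0.546274 * (x * x - y * y),  # Y_2,2
    ]

def shade(albedo, sh_coeffs, normal):
    """Diffuse shading: per-channel albedo scaled by SH-encoded irradiance."""
    irradiance = sum(c * b for c, b in zip(sh_coeffs, sh_basis(normal)))
    return [a * irradiance for a in albedo]
```

With coefficients chosen so the irradiance evaluates to 1, the shaded color reduces to the albedo itself, which makes the decomposition easy to sanity-check.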
  • Patent number: 11709506
    Abstract: According to the present invention there is provided a drone (1) comprising one or more propellers (2) and one or more actuators (3) for actuating said one or more propellers (2) to generate a thrust force which enables the drone (1) to fly; a controller (4) which is configured such that it can control the flight of the drone (1), wherein the controller (4) comprises a memory (6) having stored therein a plurality of predefined sets of positions which define a virtual rail which can be used to guide the flight of the drone (1) so that the drone can avoid collision with a subject; and wherein the controller further comprises a mathematical model (7) of the drone; wherein the controller (4) is configured to control the flight of the drone by performing at least the following steps, (a) approximating lag error based on the position of the drone (1) measured by a sensor (5) and the virtual rail, wherein the lag error is the distance between a point along the virtual rail which is closest to the drone (1) and an
    Type: Grant
    Filed: July 12, 2018
    Date of Patent: July 25, 2023
    Assignee: ETH Zurich
    Inventors: Otmar Hilliges, Tobias Nägeli
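Step (a) above hinges on finding the point along the virtual rail closest to the measured drone position. The abstract is truncated mid-definition, so the sketch below makes an assumption: it models the rail as a polyline of waypoints and computes the drone's distance to its nearest point on that polyline. The patent's exact lag-error definition may differ.

```python
import math

def closest_point_on_segment(p, a, b):
    """Closest point to p on the segment from a to b (3D tuples)."""
    ab = tuple(bi - ai for ai, bi in zip(a, b))
    ap = tuple(pi - ai for ai, pi in zip(a, p))
    denom = sum(c * c for c in ab)
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(u * v for u, v in zip(ap, ab)) / denom))
    return tuple(ai + t * abi for ai, abi in zip(a, ab))

def lag_error(drone_pos, rail):
    """Distance from the drone to the nearest point on the virtual rail,
    where the rail is a list of 3D waypoints forming a polyline."""
    best = None
    for a, b in zip(rail, rail[1:]):
        d = math.dist(drone_pos, closest_point_on_segment(drone_pos, a, b))
        if best is None or d < best:
            best = d
    return best
```

A controller would evaluate this each cycle against the sensed position and feed the error into its model-based flight correction.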
  • Publication number: 20220196840
    Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting, and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
    Type: Application
    Filed: December 29, 2021
    Publication date: June 23, 2022
    Inventors: Otmar HILLIGES, Malte Hanno WEISS, Shahram IZADI, David KIM, Carsten Curt Eckard ROTHER
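The photometric-stereo relationship the abstract describes — image intensity determined by lighting direction, surface normal, and albedo — can be sketched for a single Lambertian surface patch. The sketch assumes the textbook model I_k = rho (l_k . n) with three known, linearly independent light directions, solved here by Cramer's rule; the patented system's pipeline is considerably richer than this.

```python
import math

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def photometric_stereo(lights, intensities):
    """Recover albedo rho and unit normal n from I_k = rho * (l_k . n),
    given three light directions (rows of `lights`) and three intensities."""
    D = det3(lights)
    g = []
    for col in range(3):                 # Cramer's rule: solve L g = I
        m = [row[:] for row in lights]
        for r in range(3):
            m[r][col] = intensities[r]
        g.append(det3(m) / D)
    rho = math.sqrt(sum(v * v for v in g))   # albedo is the magnitude of g
    n = tuple(v / rho for v in g)            # normal is its direction
    return rho, n
```

With axis-aligned lights the intensities are simply the components of rho·n, so the recovery is easy to verify by hand.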
  • Patent number: 11215711
    Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting, and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
    Type: Grant
    Filed: December 18, 2017
    Date of Patent: January 4, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Otmar Hilliges, Malte Hanno Weiss, Shahram Izadi, David Kim, Carsten Curt Eckard Rother
  • Publication number: 20200409395
    Abstract: According to the present invention there is provided a drone (1) comprising one or more propellers (2) and one or more actuators (3) for actuating said one or more propellers (2) to generate a thrust force which enables the drone (1) to fly; a controller (4) which is configured such that it can control the flight of the drone (1), wherein the controller (4) comprises a memory (6) having stored therein a plurality of predefined sets of positions which define a virtual rail which can be used to guide the flight of the drone (1) so that the drone can avoid collision with a subject; and wherein the controller further comprises a mathematical model (7) of the drone; wherein the controller (4) is configured to control the flight of the drone by performing at least the following steps, (a) approximating lag error based on the position of the drone (1) measured by a sensor (5) and the virtual rail, wherein the lag error is the distance between a point along the virtual rail which is closest to the drone (1) and an
    Type: Application
    Filed: July 12, 2018
    Publication date: December 31, 2020
    Inventors: Otmar HILLIGES, Tobias NÄGELI
  • Patent number: 10399327
    Abstract: Embodiments herein describe deformable controllers that rely on piezoelectric material embedded in the controllers to detect when the input device is being manipulated into a particular deformation or gesture. The computing system may perform different actions depending on which deformation is detected. The embodiments herein describe design techniques for optimizing the placement of the piezoelectric material in the controller to improve the accuracy of a mapping function that maps sensor responses of the material to different controller deformations. In one embodiment, the user specifies the different deformations of the controller she wishes to be recognized by the computing system (e.g., raising a leg, twisting a torso, squeezing a hand, etc.). The design optimizer uses the locations of the desired deformations to move the location of the piezoelectric material such that the sensor response of the material can be uniquely mapped to these locations.
    Type: Grant
    Filed: April 22, 2016
    Date of Patent: September 3, 2019
    Assignees: Disney Enterprises, Inc., ETH Zurich (Eidgenoessische Technische Hochschule Zurich)
    Inventors: Moritz Niklaus Bächer, Benjamin Hepp, Fabrizio Pece, Paul Gregory Kry, Bernd Bickel, Bernhard Steffen Thomaszewski, Otmar Hilliges
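The mapping function described above — from piezoelectric sensor responses to recognized deformations — can be sketched as a nearest-neighbor lookup against calibrated response vectors. The calibration table, labels, and three-element response vectors below are hypothetical; the patent's design optimizer additionally chooses sensor *placement* so that these responses become uniquely separable, which this sketch takes as given.

```python
import math

# Hypothetical calibration: mean sensor response (one value per piezo
# element) recorded while the controller is held in each target deformation.
CALIBRATION = {
    "squeeze":   (0.9, 0.1, 0.2),
    "twist":     (0.2, 0.8, 0.7),
    "bend_left": (0.1, 0.9, 0.1),
}

def classify(response):
    """Map a raw sensor-response vector to the nearest calibrated deformation."""
    return min(CALIBRATION, key=lambda label: math.dist(response, CALIBRATION[label]))
```

The computing system would then dispatch a different action per returned label.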
  • Patent number: 10234941
    Abstract: A wearable sensor for tracking articulated body parts is described, such as a wrist-worn device which enables 3D tracking of fingers and optionally also the arm and hand without the need to wear a glove or markers on the hand. In an embodiment a camera captures images of an articulated part of a body of a wearer of the device and an articulated model of the body part is tracked in real time to enable gesture-based control of a separate computing device such as a smart phone, laptop computer or other computing device. In examples the device has a structured illumination source and a diffuse illumination source for illuminating the articulated body part.
    Type: Grant
    Filed: October 4, 2012
    Date of Patent: March 19, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David Kim, Shahram Izadi, Otmar Hilliges, David Alexander Butler, Stephen Hodges, Patrick Luke Olivier, Jiawen Chen, Iason Oikonomidis
  • Patent number: 10049458
    Abstract: Systems and methods for reducing interference between multiple infra-red depth cameras are described. In an embodiment, the system comprises multiple infra-red sources, each of which projects a structured light pattern into the environment. A controller is used to control the sources in order to reduce the interference caused by overlapping light patterns. Various methods are described including: cycling between the different sources, where the cycle used may be fixed or may change dynamically based on the scene detected using the cameras; setting the wavelength of each source so that overlapping patterns are at different wavelengths; moving source-camera pairs in independent motion patterns; and adjusting the shape of the projected light patterns to minimize overlap. These methods may also be combined in any way. In another embodiment, the system comprises a single source and a mirror system is used to cast the projected structured light pattern around the environment.
    Type: Grant
    Filed: January 20, 2016
    Date of Patent: August 14, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
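The first interference-reduction method listed above, cycling between sources, amounts to time-multiplexing: each capture frame activates exactly one IR projector so patterns never overlap in time. The sketch below shows only the fixed round-robin variant; the dynamic, scene-driven cycling and the wavelength/motion/shape methods from the abstract are not modeled.

```python
from itertools import cycle

def round_robin_schedule(num_sources, num_frames):
    """Assign each capture frame to exactly one active IR source so that
    structured-light patterns from different sources never coincide."""
    order = cycle(range(num_sources))
    return [next(order) for _ in range(num_frames)]
```

A controller would gate each projector on only during its assigned frames, with each camera exposing in sync with its paired source.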
  • Publication number: 20180106905
    Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting, and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
    Type: Application
    Filed: December 18, 2017
    Publication date: April 19, 2018
    Inventors: Otmar HILLIGES, Malte Hanno WEISS, Shahram IZADI, David KIM, Carsten Curt Eckard ROTHER
  • Patent number: 9891704
    Abstract: Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
    Type: Grant
    Filed: December 26, 2016
    Date of Patent: February 13, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
  • Patent number: 9857470
    Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting, and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
    Type: Grant
    Filed: December 28, 2012
    Date of Patent: January 2, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Otmar Hilliges, Malte Hanno Weiss, Shahram Izadi, David Kim, Carsten Curt Eckard Rother
  • Publication number: 20170308061
    Abstract: Embodiments herein describe deformable controllers that rely on piezoelectric material embedded in the controllers to detect when the input device is being manipulated into a particular deformation or gesture. The computing system may perform different actions depending on which deformation is detected. The embodiments herein describe design techniques for optimizing the placement of the piezoelectric material in the controller to improve the accuracy of a mapping function that maps sensor responses of the material to different controller deformations. In one embodiment, the user specifies the different deformations of the controller she wishes to be recognized by the computing system (e.g., raising a leg, twisting a torso, squeezing a hand, etc.). The design optimizer uses the locations of the desired deformations to move the location of the piezoelectric material such that the sensor response of the material can be uniquely mapped to these locations.
    Type: Application
    Filed: April 22, 2016
    Publication date: October 26, 2017
    Inventors: Moritz Niklaus BÄCHER, Benjamin HEPP, Fabrizio PECE, Paul Gregory KRY, Bernd BICKEL, Bernhard Steffen THOMASZEWSKI, Otmar HILLIGES
  • Publication number: 20170199580
    Abstract: An augmented reality system which enables grasping of virtual objects is described, for example to stack virtual cubes or to manipulate virtual objects in other ways. In various embodiments a user's hand or another real object is tracked in an augmented reality environment. In examples, the shape of the tracked real object is approximated using at least two different types of particles and the virtual objects are updated according to simulated forces exerted between the augmented reality environment and at least some of the particles. In various embodiments 3D positions of a first one of the types of particles, kinematic particles, are updated according to the tracked real object; and passive particles move with linked kinematic particles without penetrating virtual objects. In some examples a real-time optic flow process is used to track motion of the real object.
    Type: Application
    Filed: January 24, 2017
    Publication date: July 13, 2017
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, Malte Hanno Weiss
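The kinematic/passive particle split described above can be sketched for a single passive particle and a single virtual object. The sketch assumes the virtual object is a sphere and the passive particle rides at a fixed offset from its linked kinematic particle, being projected to the sphere surface whenever it would end up inside; the patent's force-based simulation is more general.

```python
import math

def update_passive(kinematic_pos, offset, sphere_center, sphere_radius):
    """Move a passive particle with its linked kinematic particle, then
    project it out of a virtual sphere so it cannot penetrate the object."""
    p = tuple(k + o for k, o in zip(kinematic_pos, offset))
    d = math.dist(p, sphere_center)
    if d >= sphere_radius or d == 0.0:
        return p                          # outside the object: no correction
    scale = sphere_radius / d             # inside: push to the surface
    return tuple(c + (v - c) * scale for c, v in zip(sphere_center, p))
```

Running this per frame keeps the hand-shaped particle shell flush against virtual objects, which is what lets simulated contact forces support grasping.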
  • Publication number: 20170109938
    Abstract: Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
    Type: Application
    Filed: December 26, 2016
    Publication date: April 20, 2017
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
  • Patent number: 9552673
    Abstract: An augmented reality system which enables grasping of virtual objects is described, for example to stack virtual cubes or to manipulate virtual objects in other ways. In various embodiments a user's hand or another real object is tracked in an augmented reality environment. In examples, the shape of the tracked real object is approximated using at least two different types of particles and the virtual objects are updated according to simulated forces exerted between the augmented reality environment and at least some of the particles. In various embodiments 3D positions of a first one of the types of particles, kinematic particles, are updated according to the tracked real object; and passive particles move with linked kinematic particles without penetrating virtual objects. In some examples a real-time optic flow process is used to track motion of the real object.
    Type: Grant
    Filed: October 17, 2012
    Date of Patent: January 24, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, Malte Hanno Weiss
  • Patent number: 9529424
    Abstract: Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
    Type: Grant
    Filed: November 5, 2010
    Date of Patent: December 27, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
  • Publication number: 20160163054
    Abstract: Systems and methods for reducing interference between multiple infra-red depth cameras are described. In an embodiment, the system comprises multiple infra-red sources, each of which projects a structured light pattern into the environment. A controller is used to control the sources in order to reduce the interference caused by overlapping light patterns. Various methods are described including: cycling between the different sources, where the cycle used may be fixed or may change dynamically based on the scene detected using the cameras; setting the wavelength of each source so that overlapping patterns are at different wavelengths; moving source-camera pairs in independent motion patterns; and adjusting the shape of the projected light patterns to minimize overlap. These methods may also be combined in any way. In another embodiment, the system comprises a single source and a mirror system is used to cast the projected structured light pattern around the environment.
    Type: Application
    Filed: January 20, 2016
    Publication date: June 9, 2016
    Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
  • Patent number: 9247238
    Abstract: Systems and methods for reducing interference between multiple infra-red depth cameras are described. In an embodiment, the system comprises multiple infra-red sources, each of which projects a structured light pattern into the environment. A controller is used to control the sources in order to reduce the interference caused by overlapping light patterns. Various methods are described including: cycling between the different sources, where the cycle used may be fixed or may change dynamically based on the scene detected using the cameras; setting the wavelength of each source so that overlapping patterns are at different wavelengths; moving source-camera pairs in independent motion patterns; and adjusting the shape of the projected light patterns to minimize overlap. These methods may also be combined in any way. In another embodiment, the system comprises a single source and a mirror system is used to cast the projected structured light pattern around the environment.
    Type: Grant
    Filed: January 31, 2011
    Date of Patent: January 26, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
  • Patent number: 9242171
    Abstract: Real-time camera tracking using depth maps is described. In an embodiment depth map frames are captured by a mobile depth camera at over 20 frames per second and used to dynamically update in real-time a set of registration parameters which specify how the mobile depth camera has moved. In examples the real-time camera tracking output is used for computer game applications and robotics. In an example, an iterative closest point process is used with projective data association and a point-to-plane error metric in order to compute the updated registration parameters. In an example, a graphics processing unit (GPU) implementation is used to optimize the error metric in real-time. In some embodiments, a dense 3D model of the mobile camera environment is used.
    Type: Grant
    Filed: February 23, 2013
    Date of Patent: January 26, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
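The point-to-plane error metric named in the abstract above is the quantity the tracker minimizes each frame to update the registration parameters (rotation R and translation t): for each associated point pair it takes the residual (R p + t − q) · n, the displacement projected onto the destination point's normal, and sums the squares. A minimal sketch of evaluating that metric (not the GPU-optimized solver itself):

```python
def point_to_plane_error(src, dst, normals, R, t):
    """Sum of squared point-to-plane residuals ((R p + t - q) . n)^2 over
    associated point pairs (p in src, q in dst, n the normal at q)."""
    def mat_vec(M, v):
        return tuple(sum(M[i][j] * v[j] for j in range(3)) for i in range(3))
    err = 0.0
    for p, q, n in zip(src, dst, normals):
        Rp = mat_vec(R, p)
        r = sum((Rp[i] + t[i] - q[i]) * n[i] for i in range(3))
        err += r * r
    return err
```

Projective data association supplies the (p, q, n) pairs; the ICP loop then perturbs R and t to drive this error down.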
  • Patent number: 9053571
    Abstract: Generating computer models of 3D objects is described. In one example, depth images of an object captured by a substantially static depth camera are used to generate the model, which is stored in a memory device in a three-dimensional volume. Portions of the depth image determined to relate to the background are removed to leave a foreground depth image. The position and orientation of the object in the foreground depth image is tracked by comparison to a preceding depth image, and the foreground depth image is integrated into the volume by using the position and orientation to determine where to add data derived from the foreground depth image into the volume. In examples, the object is hand-rotated by a user before the depth camera. Hands that occlude the object are integrated out of the model as they do not move in sync with the object due to re-gripping.
    Type: Grant
    Filed: June 6, 2011
    Date of Patent: June 9, 2015
    Assignee: Microsoft Corporation
    Inventors: Jamie Daniel Joseph Shotton, Shahram Izadi, Otmar Hilliges, David Kim, David Molyneaux, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges
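The background-removal step described above — discarding depth pixels outside the working volume so only the hand-held object remains — can be sketched as a simple depth band-pass. This is an assumed simplification: the patent's segmentation, and the subsequent integration that removes occluding hands, involves tracking and volumetric fusion beyond this thresholding.

```python
def extract_foreground(depth, near, far):
    """Keep only depth pixels inside the working volume [near, far];
    background pixels are cleared (set to 0.0) before integration."""
    return [[d if near <= d <= far else 0.0 for d in row] for row in depth]
```

Each surviving frame would then be aligned to the previous one and fused into the 3D volume at the tracked pose.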