Patents by Inventor Otmar Hilliges
Otmar Hilliges has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12235363
Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
Type: Grant
Filed: December 29, 2021
Date of Patent: February 25, 2025
Assignee: Microsoft Technology Licensing, LLC
Inventors: Otmar Hilliges, Malte Hanno Weiss, Shahram Izadi, David Kim, Carsten Curt Eckard Rother
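The photometric-stereo relationship the abstract refers to can be illustrated with the classic Lambertian formulation (a generic sketch, not the patented method): with known lighting directions and observed intensities for one surface patch, a least-squares solve recovers both the surface normal and the albedo.

```python
import numpy as np

# Classic Lambertian photometric stereo: for k known lighting directions
# L (k x 3) and k observed intensities I at one patch, I = albedo * (L @ n).
# Solving the least-squares system gives g = albedo * n; the norm of g is
# the albedo and its direction is the surface normal.
def photometric_stereo(L, I):
    g, *_ = np.linalg.lstsq(L, I, rcond=None)
    albedo = np.linalg.norm(g)
    normal = g / albedo
    return albedo, normal

# Synthetic check: a patch with albedo 0.8 facing straight up.
true_n = np.array([0.0, 0.0, 1.0])
L = np.array([[0.0, 0.0, 1.0],
              [0.6, 0.0, 0.8],
              [0.0, 0.6, 0.8]])
I = 0.8 * (L @ true_n)
albedo, normal = photometric_stereo(L, I)
print(round(albedo, 3), np.round(normal, 3))  # recovers 0.8 and (0, 0, 1)
```

With three or more non-coplanar light directions the system is fully determined, which is why the device captures images under several known lighting conditions.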
-
Publication number: 20240029333
Abstract: A method including selecting a first point from a 3D model representing an avatar, the first point being associated with an eye, selecting a second point from the 3D model, the second point being associated with a periocular region associated with the eye, generating an albedo and spherical harmonics (SH) coefficients based on the first point and the second point, and generating an image point based on the albedo and the SH coefficients.
Type: Application
Filed: July 19, 2023
Publication date: January 25, 2024
Inventors: Abhimitra Meka, Thabo Beeler, Franziska Müller, Gengyan Li, Marcel Bühler, Otmar Hilliges
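The final step in the abstract, generating an image point from an albedo and SH coefficients, typically means evaluating an SH lighting approximation at the surface normal and modulating it by the albedo. The sketch below uses the standard first-order (4-term) real SH basis; it is an illustration of that general idea, not the claimed method, and the coefficient values are hypothetical.

```python
import numpy as np

# First-order real spherical-harmonics basis evaluated at a unit normal.
def sh_basis(n):
    x, y, z = n
    return np.array([0.282095,        # Y_0^0  (constant)
                     0.488603 * y,    # Y_1^-1
                     0.488603 * z,    # Y_1^0
                     0.488603 * x])   # Y_1^1

def shade(albedo, sh_coeffs, normal):
    # Irradiance is approximated as the dot product of the SH lighting
    # coefficients with the basis at the normal, clamped to non-negative.
    irradiance = max(float(sh_coeffs @ sh_basis(normal)), 0.0)
    return albedo * irradiance

normal = np.array([0.0, 0.0, 1.0])
coeffs = np.array([1.0, 0.0, 0.5, 0.0])   # hypothetical lighting
value = shade(0.9, coeffs, normal)
print(round(value, 4))
```

Higher-quality renderings commonly use second-order SH (9 coefficients), but the structure of the computation is the same.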
-
Patent number: 11709506
Abstract: According to the present invention there is provided a drone (1) comprising one or more propellers (2) and one or more actuators (3) for actuating said one or more propellers (2) to generate a thrust force which enables the drone (1) to fly; a controller (4) which is configured such that it can control the flight of the drone (1), wherein the controller (4) comprises a memory (6) having stored therein a plurality of predefined sets of positions which define a virtual rail which can be used to guide the flight of the drone (1) so that the drone can avoid collision with a subject; and wherein the controller further comprises a mathematical model (7) of the drone; wherein the controller (4) is configured to control the flight of the drone by performing at least the following steps, (a) approximating lag error based on the position of the drone (1) measured by a sensor (5) and the virtual rail, wherein the lag error is the distance between a point along the virtual rail which is closest to the drone (1) and an…
Type: Grant
Filed: July 12, 2018
Date of Patent: July 25, 2023
Assignee: ETH Zurich
Inventors: Otmar Hilliges, Tobias Nägeli
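Step (a) of the abstract, approximating the lag error as the distance between the drone and the closest point on the virtual rail, can be sketched as a closest-point query against a polyline through the stored positions. This is a minimal geometric illustration; the controller's model-based flight control is beyond its scope.

```python
import numpy as np

# Closest point to p on the segment from a to b.
def closest_point_on_segment(p, a, b):
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return a + t * ab

# Lag error: distance from the drone to the nearest point on the virtual
# rail, taken as the minimum over all segments of the rail polyline.
def lag_error(drone_pos, rail_points):
    candidates = [closest_point_on_segment(drone_pos, rail_points[i], rail_points[i + 1])
                  for i in range(len(rail_points) - 1)]
    return min(np.linalg.norm(drone_pos - c) for c in candidates)

rail = [np.array([0.0, 0.0, 5.0]), np.array([10.0, 0.0, 5.0])]
drone = np.array([4.0, 3.0, 5.0])
err = lag_error(drone, rail)
print(err)  # the drone is 3 m off the rail
```

A real controller would feed this error back into its drone model to generate corrective thrust commands each control cycle.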
-
Publication number: 20220196840
Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
Type: Application
Filed: December 29, 2021
Publication date: June 23, 2022
Inventors: Otmar HILLIGES, Malte Hanno WEISS, Shahram IZADI, David KIM, Carsten Curt Eckard ROTHER
-
Patent number: 11215711
Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
Type: Grant
Filed: December 18, 2017
Date of Patent: January 4, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Otmar Hilliges, Malte Hanno Weiss, Shahram Izadi, David Kim, Carsten Curt Eckard Rother
-
Publication number: 20200409395
Abstract: According to the present invention there is provided a drone (1) comprising one or more propellers (2) and one or more actuators (3) for actuating said one or more propellers (2) to generate a thrust force which enables the drone (1) to fly; a controller (4) which is configured such that it can control the flight of the drone (1), wherein the controller (4) comprises a memory (6) having stored therein a plurality of predefined sets of positions which define a virtual rail which can be used to guide the flight of the drone (1) so that the drone can avoid collision with a subject; and wherein the controller further comprises a mathematical model (7) of the drone; wherein the controller (4) is configured to control the flight of the drone by performing at least the following steps, (a) approximating lag error based on the position of the drone (1) measured by a sensor (5) and the virtual rail, wherein the lag error is the distance between a point along the virtual rail which is closest to the drone (1) and an…
Type: Application
Filed: July 12, 2018
Publication date: December 31, 2020
Inventors: Otmar HILLIGES, Tobias NÄGELI
-
Patent number: 10399327
Abstract: Embodiments herein describe deformable controllers that rely on piezoelectric material embedded in the controllers to detect when the input device is being manipulated into a particular deformation or gesture. The computing system may perform different actions depending on which deformation is detected. The embodiments herein describe design techniques for optimizing the placement of the piezoelectric material in the controller to improve the accuracy of a mapping function that maps sensor responses of the material to different controller deformations. In one embodiment, the user specifies the different deformations of the controller she wishes to be recognized by the computing system (e.g., raising a leg, twisting a torso, squeezing a hand, etc.). The design optimizer uses the locations of the desired deformations to move the location of the piezoelectric material such that the sensor response of the material can be uniquely mapped to these locations.
Type: Grant
Filed: April 22, 2016
Date of Patent: September 3, 2019
Assignees: Disney Enterprises, Inc., ETH Zurich (Eidgenoessische Technische Hochschule Zurich)
Inventors: Moritz Niklaus Bächer, Benjamin Hepp, Fabrizio Pece, Paul Gregory Kry, Bernd Bickel, Bernhard Steffen Thomaszewski, Otmar Hilliges
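Once the sensor placement has been optimized so that each deformation produces a distinguishable response, the mapping function itself can be as simple as a nearest-neighbor lookup against a calibration table. The sketch below illustrates only that final mapping step; the placement optimization is the patent's subject and is not reproduced, and all names and numbers are hypothetical.

```python
import numpy as np

# Hypothetical calibration table: one recorded piezo response vector
# (three sensors) per named deformation.
calibration = {
    "raise_leg":   np.array([0.9, 0.1, 0.0]),
    "twist_torso": np.array([0.2, 0.8, 0.3]),
    "squeeze":     np.array([0.1, 0.2, 0.9]),
}

# Classify a new sensor response by nearest calibrated deformation.
def classify(response):
    return min(calibration, key=lambda name: np.linalg.norm(response - calibration[name]))

result = classify(np.array([0.85, 0.15, 0.05]))
print(result)  # closest to the "raise_leg" response
```

The optimizer's job, as the abstract describes, is to place the material so these calibration vectors stay far apart, which keeps this kind of lookup unambiguous.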
-
Patent number: 10234941
Abstract: A wearable sensor for tracking articulated body parts is described such as a wrist-worn device which enables 3D tracking of fingers and optionally also the arm and hand without the need to wear a glove or markers on the hand. In an embodiment a camera captures images of an articulated part of a body of a wearer of the device and an articulated model of the body part is tracked in real time to enable gesture-based control of a separate computing device such as a smart phone, laptop computer or other computing device. In examples the device has a structured illumination source and a diffuse illumination source for illuminating the articulated body part.
Type: Grant
Filed: October 4, 2012
Date of Patent: March 19, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: David Kim, Shahram Izadi, Otmar Hilliges, David Alexander Butler, Stephen Hodges, Patrick Luke Olivier, Jiawen Chen, Iason Oikonomidis
-
Patent number: 10049458
Abstract: Systems and methods for reducing interference between multiple infra-red depth cameras are described. In an embodiment, the system comprises multiple infra-red sources, each of which projects a structured light pattern into the environment. A controller is used to control the sources in order to reduce the interference caused by overlapping light patterns. Various methods are described including: cycling between the different sources, where the cycle used may be fixed or may change dynamically based on the scene detected using the cameras; setting the wavelength of each source so that overlapping patterns are at different wavelengths; moving source-camera pairs in independent motion patterns; and adjusting the shape of the projected light patterns to minimize overlap. These methods may also be combined in any way. In another embodiment, the system comprises a single source and a mirror system is used to cast the projected structured light pattern around the environment.
Type: Grant
Filed: January 20, 2016
Date of Patent: August 14, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
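The first interference-reduction method listed, cycling between the sources, amounts to time-multiplexing so that only one structured-light pattern is active per frame. The sketch below shows a fixed round-robin schedule; the abstract's dynamic, scene-driven reordering is not modeled, and the source names are placeholders.

```python
from itertools import cycle

# Fixed round-robin time-multiplexing: exactly one infra-red source is
# active on each frame, so projected patterns never overlap in time.
def schedule(sources, frames):
    rotation = cycle(sources)
    return [next(rotation) for _ in range(frames)]

plan = schedule(["cam_A", "cam_B", "cam_C"], 6)
print(plan)  # ['cam_A', 'cam_B', 'cam_C', 'cam_A', 'cam_B', 'cam_C']
```

The trade-off is effective frame rate: with n sources cycled, each camera sees its own pattern only every n-th frame, which is why the abstract also lists wavelength separation and motion-based alternatives.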
-
Publication number: 20180106905
Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
Type: Application
Filed: December 18, 2017
Publication date: April 19, 2018
Inventors: Otmar HILLIGES, Malte Hanno WEISS, Shahram IZADI, David KIM, Carsten Curt Eckard ROTHER
-
Patent number: 9891704
Abstract: Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
Type: Grant
Filed: December 26, 2016
Date of Patent: February 13, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
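The processing loop the abstract describes, tracking an object's movement and calculating a corresponding movement in the virtual environment, reduces at its simplest to applying the tracked frame-to-frame displacement to the object's virtual representation. The sketch below is a toy illustration of that correspondence only; the scale factor mapping real to virtual space is an assumption for illustration.

```python
import numpy as np

# Apply the real object's frame-to-frame displacement to its virtual
# representation. `scale` maps real-world metres to virtual units.
def corresponding_movement(prev_real, curr_real, virtual_pos, scale=1.0):
    delta = np.asarray(curr_real, float) - np.asarray(prev_real, float)
    return np.asarray(virtual_pos, float) + scale * delta

# The tracked hand moves 5 cm in x; its virtual counterpart follows.
v = corresponding_movement([0.0, 0.0, 0.4], [0.05, 0.0, 0.4], [1.0, 2.0, 0.0])
print(v)
```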
-
Patent number: 9857470
Abstract: Detecting material properties such as reflectivity, true color and other properties of surfaces in a real world environment is described in various examples using a single hand-held device. For example, the detected material properties are calculated using a photometric stereo system which exploits known relationships between lighting conditions, surface normals, true color and image intensity. In examples, a user moves around in an environment capturing color images of surfaces in the scene from different orientations under known lighting conditions. In various examples, surface normals of patches of surfaces are calculated using the captured data to enable fine detail such as human hair, netting and textured surfaces to be modeled. In examples, the modeled data is used to render images depicting the scene with realism or to superimpose virtual graphics on the real world in a realistic manner.
Type: Grant
Filed: December 28, 2012
Date of Patent: January 2, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Otmar Hilliges, Malte Hanno Weiss, Shahram Izadi, David Kim, Carsten Curt Eckard Rother
-
Publication number: 20170308061
Abstract: Embodiments herein describe deformable controllers that rely on piezoelectric material embedded in the controllers to detect when the input device is being manipulated into a particular deformation or gesture. The computing system may perform different actions depending on which deformation is detected. The embodiments herein describe design techniques for optimizing the placement of the piezoelectric material in the controller to improve the accuracy of a mapping function that maps sensor responses of the material to different controller deformations. In one embodiment, the user specifies the different deformations of the controller she wishes to be recognized by the computing system (e.g., raising a leg, twisting a torso, squeezing a hand, etc.). The design optimizer uses the locations of the desired deformations to move the location of the piezoelectric material such that the sensor response of the material can be uniquely mapped to these locations.
Type: Application
Filed: April 22, 2016
Publication date: October 26, 2017
Inventors: Moritz Niklaus BÄCHER, Benjamin HEPP, Fabrizio PECE, Paul Gregory KRY, Bernd BICKEL, Bernhard Steffen THOMASZEWSKI, Otmar HILLIGES
-
Publication number: 20170199580
Abstract: An augmented reality system which enables grasping of virtual objects is described, for example to stack virtual cubes or to manipulate virtual objects in other ways. In various embodiments a user's hand or another real object is tracked in an augmented reality environment. In examples, the shape of the tracked real object is approximated using at least two different types of particles and the virtual objects are updated according to simulated forces exerted between the augmented reality environment and at least some of the particles. In various embodiments 3D positions of a first one of the types of particles, kinematic particles, are updated according to the tracked real object; and passive particles move with linked kinematic particles without penetrating virtual objects. In some examples a real-time optic flow process is used to track motion of the real object.
Type: Application
Filed: January 24, 2017
Publication date: July 13, 2017
Inventors: Otmar Hilliges, David Kim, Shahram Izadi, Malte Hanno Weiss
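The two particle types in the abstract can be illustrated with a toy per-frame update: kinematic particles snap to the tracked real object, and passive particles follow their linked kinematic particles. This sketch omits the collision handling that keeps passive particles from penetrating virtual objects; the fixed-offset linkage is an assumption for illustration.

```python
import numpy as np

# One simulation frame: kinematic particles take the tracked positions
# directly, and each passive particle follows its linked kinematic
# particle at a fixed offset.
def update_particles(tracked_positions, passive_offsets):
    kinematic = np.asarray(tracked_positions, dtype=float)
    passive = kinematic + np.asarray(passive_offsets, dtype=float)
    return kinematic, passive

tracked = [[0.1, 0.0, 0.5], [0.2, 0.0, 0.5]]   # e.g. two fingertip samples
offsets = [[0.0, -0.02, 0.0], [0.0, -0.02, 0.0]]
kin, pas = update_particles(tracked, offsets)
print(pas)
```

In the full system the passive particles are the ones that exert simulated forces on virtual objects, which is what makes grasping and stacking feel physically plausible.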
-
Publication number: 20170109938
Abstract: Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
Type: Application
Filed: December 26, 2016
Publication date: April 20, 2017
Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
-
Patent number: 9552673
Abstract: An augmented reality system which enables grasping of virtual objects is described, for example to stack virtual cubes or to manipulate virtual objects in other ways. In various embodiments a user's hand or another real object is tracked in an augmented reality environment. In examples, the shape of the tracked real object is approximated using at least two different types of particles and the virtual objects are updated according to simulated forces exerted between the augmented reality environment and at least some of the particles. In various embodiments 3D positions of a first one of the types of particles, kinematic particles, are updated according to the tracked real object; and passive particles move with linked kinematic particles without penetrating virtual objects. In some examples a real-time optic flow process is used to track motion of the real object.
Type: Grant
Filed: October 17, 2012
Date of Patent: January 24, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Otmar Hilliges, David Kim, Shahram Izadi, Malte Hanno Weiss
-
Patent number: 9529424
Abstract: Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.
Type: Grant
Filed: November 5, 2010
Date of Patent: December 27, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Otmar Hilliges, David Kim, Shahram Izadi, David Molyneaux, Stephen Edward Hodges, David Alexander Butler
-
Publication number: 20160163054
Abstract: Systems and methods for reducing interference between multiple infra-red depth cameras are described. In an embodiment, the system comprises multiple infra-red sources, each of which projects a structured light pattern into the environment. A controller is used to control the sources in order to reduce the interference caused by overlapping light patterns. Various methods are described including: cycling between the different sources, where the cycle used may be fixed or may change dynamically based on the scene detected using the cameras; setting the wavelength of each source so that overlapping patterns are at different wavelengths; moving source-camera pairs in independent motion patterns; and adjusting the shape of the projected light patterns to minimize overlap. These methods may also be combined in any way. In another embodiment, the system comprises a single source and a mirror system is used to cast the projected structured light pattern around the environment.
Type: Application
Filed: January 20, 2016
Publication date: June 9, 2016
Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli
-
Patent number: 9242171
Abstract: Real-time camera tracking using depth maps is described. In an embodiment depth map frames are captured by a mobile depth camera at over 20 frames per second and used to dynamically update in real-time a set of registration parameters which specify how the mobile depth camera has moved. In examples the real-time camera tracking output is used for computer game applications and robotics. In an example, an iterative closest point process is used with projective data association and a point-to-plane error metric in order to compute the updated registration parameters. In an example, a graphics processing unit (GPU) implementation is used to optimize the error metric in real-time. In some embodiments, a dense 3D model of the mobile camera environment is used.
Type: Grant
Filed: February 23, 2013
Date of Patent: January 26, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Richard Newcombe, Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Pushmeet Kohli, Andrew Fitzgibbon, Stephen Edward Hodges, David Alexander Butler
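The point-to-plane error metric named in the abstract measures, for each associated point pair, the distance from the new frame's point to the tangent plane of its corresponding model point. The sketch below evaluates just that metric on already-associated pairs; the full ICP loop, projective data association and GPU optimization are beyond its scope.

```python
import numpy as np

# Point-to-plane ICP error: for associated pairs (p_i from the new depth
# frame, q_i on the model with unit normal n_i), sum the squared signed
# distances of p_i from the tangent plane at q_i, i.e. ((p_i - q_i) . n_i)^2.
def point_to_plane_error(p, q, n):
    residuals = np.einsum("ij,ij->i", p - q, n)
    return float(np.sum(residuals ** 2))

p = np.array([[0.0, 0.0, 1.2], [1.0, 0.0, 0.9]])   # new-frame points
q = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.0]])   # model points
n = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])   # model normals
err = point_to_plane_error(p, q, n)
print(round(err, 6))  # 0.2**2 + (-0.1)**2 = 0.05
```

ICP minimizes this error over a rigid transform of the new frame; point-to-plane is preferred over point-to-point here because it lets surfaces slide along each other and typically converges in fewer iterations.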
-
Patent number: 9247238
Abstract: Systems and methods for reducing interference between multiple infra-red depth cameras are described. In an embodiment, the system comprises multiple infra-red sources, each of which projects a structured light pattern into the environment. A controller is used to control the sources in order to reduce the interference caused by overlapping light patterns. Various methods are described including: cycling between the different sources, where the cycle used may be fixed or may change dynamically based on the scene detected using the cameras; setting the wavelength of each source so that overlapping patterns are at different wavelengths; moving source-camera pairs in independent motion patterns; and adjusting the shape of the projected light patterns to minimize overlap. These methods may also be combined in any way. In another embodiment, the system comprises a single source and a mirror system is used to cast the projected structured light pattern around the environment.
Type: Grant
Filed: January 31, 2011
Date of Patent: January 26, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Shahram Izadi, David Molyneaux, Otmar Hilliges, David Kim, Jamie Daniel Joseph Shotton, Stephen Edward Hodges, David Alexander Butler, Andrew Fitzgibbon, Pushmeet Kohli