Patents by Inventor Eyal Ofek

Eyal Ofek has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190216340
    Abstract: A sensor device is described herein. The sensor device includes a multi-dimensional optical sensor and processing circuitry, wherein the multi-dimensional optical sensor generates images and the processing circuitry is configured to output data that is indicative of hemodynamics of a user based upon the images. The sensor device is non-invasive, and is able to be incorporated into wearable devices, thereby allowing for continuous output of the data that is indicative of the hemodynamics of the user.
    Type: Application
    Filed: January 15, 2018
    Publication date: July 18, 2019
    Inventors: Christian HOLZ, Eyal OFEK, Michael J. SINCLAIR
  • Publication number: 20190201784
    Abstract: A controller is provided that can provide haptic feedback to a user by controlling a separation of a stationary portion and a moveable portion, such as a moveable arm, which can include one or more mounts for one or more of a user's fingers. A sensor can be included on the stationary portion to sense whether the user's thumb is proximate a thumb rest. Different haptic interaction modes can be set depending on whether the user's thumb is not proximate the sensor, such as a touch mode, or is proximate the sensor, such as a grasping or trigger mode. When grasping and trigger modes are provided, they can be determined based on the nature of a virtual object grasped by a user. Additional haptic sensations can be provided, such as to a user's fingertip, such as by a vibratory component or a rotatable object of one or more haptic elements.
    Type: Application
    Filed: December 29, 2017
    Publication date: July 4, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Christian Holz, Eyal Ofek, Michael Jack Sinclair, Hrvoje Benko, Inrak Choi, Eric Whitmire
  • Patent number: 10304251
    Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
    Type: Grant
    Filed: June 15, 2017
    Date of Patent: May 28, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
  • Patent number: 10290153
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Grant
    Filed: September 13, 2017
    Date of Patent: May 14, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian, Mark Hancock
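The "body warping" the abstract above describes can be pictured as a gradual offset applied to the rendered hand. The sketch below is a minimal illustration of that idea under assumed names and an assumed linear blending scheme; it is not the patented method itself. The rendered virtual hand is displaced by a fraction of the gap between the physical prop and its virtual counterpart, growing from zero at the start of the reach to the full offset at contact, so the real hand is steered to arrive at the prop exactly when the virtual hand reaches the virtual object.

```python
# Illustrative body-warping sketch (assumed names and blending scheme).
# Positions are 3-tuples of floats.

def distance(a, b):
    """Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def warp_ratio(hand, start, target):
    """Progress of the hand along the start->target reach, clamped to [0, 1]."""
    total = distance(start, target)
    if total == 0:
        return 1.0
    travelled = distance(start, hand)
    return max(0.0, min(1.0, travelled / total))

def body_warp(real_hand, real_start, physical_target, virtual_target):
    """Render position for the virtual hand.

    The offset between the virtual and physical targets is blended in
    gradually, so the warp is imperceptible near the start of the reach and
    the virtual hand meets the virtual target exactly when the real hand
    meets the physical prop.
    """
    alpha = warp_ratio(real_hand, real_start, physical_target)
    offset = tuple(v - p for v, p in zip(virtual_target, physical_target))
    return tuple(h + alpha * o for h, o in zip(real_hand, offset))
```

For example, with the reach starting at the origin, a physical prop at (1, 0, 0), and a virtual object at (1, 0.2, 0), the rendered hand coincides with the real hand at the start and with the virtual object at contact.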
  • Patent number: 10289239
    Abstract: A sensing device, such as a user-wearable device (UWD) worn by a user of a touchscreen, may provide kinematic data of the sensing device or UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Grant
    Filed: December 12, 2016
    Date of Patent: May 14, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, William Buxton, Kenneth P. Hinckley, Andrew M. Webb, Eyal Ofek
  • Patent number: 10216982
    Abstract: Various systems and methods for projecting a remote object are described herein. In one example, a method includes collecting environment data corresponding to a local environment in which a system is located and detecting a remote object corresponding to a remote user in a remote environment. The method can also include detecting a viewpoint of a local user in the local environment, and projecting the remote object corresponding to the remote user in the local environment based on the viewpoint of the local user, the virtual copy of the remote object to be positioned in the local environment by taking into account geometry of local objects in the local environment.
    Type: Grant
    Filed: March 12, 2015
    Date of Patent: February 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tomislav Pejsa, Andrew Wilson, Hrvoje Benko, Eyal Ofek, Julian Kantor
  • Patent number: 10215585
    Abstract: Various embodiments provide techniques for geographic navigation via one or more block views. According to some embodiments, a block view can include a visual image of a geographic location that is visually similar to a panoramic image. In some example implementations, a block view can be scrolled to navigate images of a geographic location. In one or more embodiments, a bubble view can be displayed of one or more locations within the block view. The bubble view can include a zoomed image of one or more aspects of a block view. Further to some embodiments, a map view can be utilized along with the block view and/or the bubble view. The map view can include a two-dimensional representation of the geographic location from an aerial perspective, and can include a more general level of detail concerning the geographic location, such as streets, cities, states, bodies of water, and so on.
    Type: Grant
    Filed: February 12, 2016
    Date of Patent: February 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Pasquale DeMaio, Yonatan Wexler
  • Publication number: 20190004683
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection input associated with a virtual object is accessed. The selection input is based on real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display that displays the virtual object and the mixed-reality device has a second display that displays the virtual object. A selection profile is generated based on the selection input. The selection profile comprises one or more selection profile attributes for isolating a portion of the virtual object. A selected portion of the virtual object is determined based on the selection profile. The selected portion of the virtual reality object is caused to be displayed on the first display of the selection device.
    Type: Application
    Filed: June 30, 2017
    Publication date: January 3, 2019
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Publication number: 20190004684
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Application
    Filed: June 30, 2017
    Publication date: January 3, 2019
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Publication number: 20190005724
    Abstract: Methods and systems for rendering augmented reality display data to locations of a physical presentation environment based on a presentation configuration are provided. A physical presentation environment configuration may be accessed that includes locations of a physical presentation environment for mapping augmented reality display data. The augmented reality display data may include a plurality of augmented reality objects that are rendered for display. Presentation attributes of the augmented reality display data may be used in conjunction with the presentation configuration for mapping and rendering the augmented reality display data. The rendered augmented reality display data may be dynamically interactive, and may be generated based on previous presentation configurations, mapping preferences, mapping limitations, and/or other factors.
    Type: Application
    Filed: June 30, 2017
    Publication date: January 3, 2019
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Publication number: 20180365897
    Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
    Type: Application
    Filed: June 15, 2017
    Publication date: December 20, 2018
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
  • Publication number: 20180364808
    Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
    Type: Application
    Filed: June 15, 2017
    Publication date: December 20, 2018
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
  • Publication number: 20180364853
    Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
    Type: Application
    Filed: June 15, 2017
    Publication date: December 20, 2018
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Sasa JUNUZOVIC
  • Publication number: 20180333088
    Abstract: Embodiments relate to using a display and camera of a computing device to perform pulse oximetry. The display of the device is used as an illuminant: a finger is placed over a portion of the display and over a camera facing in the same direction as the display. One or more colors are selected to enhance hemoglobin-deoxyhemoglobin contrast in view of display and camera sensitivities. The one or more colors are displayed while a body part covers the displayed color and the camera. The camera captures images of light that has passed through the finger and been internally reflected to the camera. The light reaching the camera has been absorbed by arterial hemoglobin and deoxyhemoglobin at different rates in respective different wavebands. Differences in attenuation of display light at the different wavebands provide sufficient contrast to compute an accurate pulse oxygenation estimate.
    Type: Application
    Filed: May 17, 2017
    Publication date: November 22, 2018
    Inventors: Christian Holz, Eyal Ofek
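The differential attenuation the abstract above relies on is conventionally turned into an oxygenation estimate with a "ratio of ratios" computation: the pulsatile (AC) component of each waveband is normalized by its steady (DC) baseline, and the ratio of the two normalized amplitudes feeds an empirical linear calibration. The sketch below illustrates only that standard computation; the channel names, the AC/DC extraction, and the calibration constants are illustrative assumptions, not values from the patent.

```python
# Illustrative ratio-of-ratios pulse-oximetry sketch (assumed names and
# calibration constants). Each channel is a list of per-frame camera
# intensity samples for one color/waveband.

def ac_dc(samples):
    """Pulsatile amplitude (AC) and baseline (DC) of one color channel."""
    dc = sum(samples) / len(samples)
    ac = max(samples) - min(samples)
    return ac, dc

def spo2_estimate(channel_a, channel_b, cal=(110.0, 25.0)):
    """Estimate SpO2 (%) from two camera color channels.

    cal holds empirical calibration constants (A, B) for the conventional
    linear model SpO2 = A - B * R, where R is the ratio of the two
    channels' normalized pulsatile amplitudes (AC/DC).
    """
    ac_a, dc_a = ac_dc(channel_a)
    ac_b, dc_b = ac_dc(channel_b)
    ratio = (ac_a / dc_a) / (ac_b / dc_b)
    a, b = cal
    return a - b * ratio
```

With identical signals in both channels the ratio is 1 and the assumed calibration yields 85%; real devices fit the constants against a reference oximeter.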
  • Publication number: 20180322701
    Abstract: In various embodiments, computerized methods and systems for syndicating direct and indirect interactions with objects in a computer-mediated environment to facilitate precise interactions with the objects in the computer-mediated environment are provided. The system detects a direct interaction with an object in the computer-mediated reality environment. The direct interaction may be a natural or hypernatural interaction. Subsequently, the system may determine various options of indirect interaction with the object related to the direct interaction. The indirect interaction may be generated by a controller. Upon receiving an indirect interaction, the system may modify the object based on the syndication of the direct interaction and the indirect interaction.
    Type: Application
    Filed: May 4, 2017
    Publication date: November 8, 2018
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
  • Publication number: 20180321737
    Abstract: In various embodiments, methods and systems for implementing an integrated mixed-input system are provided. The integrated mixed-input system includes paired mixed-input devices for interacting and controlling virtual space input interfaces using real inputs and virtual inputs, sensors, and passive and active haptic feedback associated with the paired mixed-input devices. Real device space tracker data and virtual device space tracker data are accessed via the paired mixed-input devices to determine real input and virtual input that are processed to determine virtual space input. The real device space tracker data and virtual device space tracker data are also used to generate different interaction contexts. In one embodiment, the integrated mixed-input system supports interface deviation, where a physical mixed-input device interface is a different size from a size of the virtual space input interface. The virtual space input is communicated to control the virtual space input interface.
    Type: Application
    Filed: June 30, 2017
    Publication date: November 8, 2018
    Inventors: Michel PAHUD, Eyal OFEK
  • Publication number: 20180314484
    Abstract: Various systems and techniques for intuitively and non-obstructively obtaining awareness of fellow collaborator views and/or statuses in an augmented reality environment are disclosed. An interaction is detected between a first and second HMD, both collaborative participants of a shared dataset represented in an augmented reality environment. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request in accordance with a privacy policy to generate a set of visual data as a response to the request. The second HMD can communicate the generated set of visual data to the first HMD, such that the first HMD can render the received set of visual data as additional computer-generated object(s) to be displayed concurrently with its own augmented view.
    Type: Application
    Filed: April 28, 2017
    Publication date: November 1, 2018
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
  • Publication number: 20180232050
    Abstract: A computing system including a head mounted display device with a processor and an associated display is provided. A sensor in communication with the processor is configured to detect a movable body part of a user. A plurality of physical haptic feedback structures are configured to be contacted by the movable body part. The processor is configured to operate the display device, receive data from the sensor, and determine an intended virtual target of the movable body part and a target physical structure having haptic characteristics corresponding to the intended virtual target. Also, the processor is configured to compute a path in real three-dimensional space from the movable body part to the target physical structure, compute a spatial warping pattern, and display, via the display, the virtual space and the virtual reality representation according to the spatial warping pattern.
    Type: Application
    Filed: February 14, 2017
    Publication date: August 16, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Andrew Wilson, Hrvoje Benko, Christian Holz, Lung-Pan Cheng
  • Patent number: 10013065
    Abstract: An example system includes a plurality of moveable light emitters, each moveable light emitter configured to independently emit a display light from a current display location within that moveable light emitter's range of motion responsive to activation from a corresponding light activator. The system also includes a location engine to determine, for each light emitter, the current display location of that light emitter, and a mapping engine to map, for each current display location, the light activator activating the light emitter currently located at that current display location.
    Type: Grant
    Filed: February 13, 2015
    Date of Patent: July 3, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Sasa Junuzovic, Michel Pahud, Eyal Ofek, Yoichi Ochiai, Michael J. Sinclair
  • Patent number: 9959675
    Abstract: A “Layout Optimizer” provides various real-time iterative constraint-satisfaction methodologies that use constraint-based frameworks to generate optimized layouts that map or embed virtual objects into environments. The term environment refers to combinations of environmental characteristics, including, but not limited to, 2D or 3D scene geometry or layout, scene colors, patterns, and/or textures, scene illumination, scene heat sources, fixed or moving people, objects or fluids, etc., any of which may evolve or change over time. A set of parameters are specified or selected for each object. Further, the environmental characteristics are determined automatically or specified by users. Relationships between objects and/or the environment derived from constraints associated with objects and the environment are then used to iteratively determine optimized self-consistent and scene-consistent object layouts.
    Type: Grant
    Filed: June 9, 2014
    Date of Patent: May 1, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ran Gal, Pushmeet Kohli, Eyal Ofek, Lior Shapira
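The iterative constraint-satisfaction approach the "Layout Optimizer" abstract describes can be pictured as scoring candidate object placements against soft constraints and refining them by local search. The sketch below is a minimal illustration under assumed constraints (stay inside the scene, keep a minimum gap between objects) and an assumed greedy perturbation scheme; the actual patented framework's constraint set, environmental characteristics, and solver are not specified here.

```python
# Illustrative constraint-based layout sketch (assumed constraints and
# search scheme). Objects are 2-D points placed in a square scene.
import random

def layout_cost(positions, scene_size, min_gap):
    """Soft-constraint cost: keep objects inside the scene and apart."""
    cost = 0.0
    for x, y in positions:
        # Out-of-bounds penalty (scene-consistency constraint).
        cost += max(0.0, -x) + max(0.0, x - scene_size)
        cost += max(0.0, -y) + max(0.0, y - scene_size)
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            gap = (dx * dx + dy * dy) ** 0.5
            # Pairwise-spacing penalty (self-consistency constraint).
            cost += max(0.0, min_gap - gap)
    return cost

def optimize_layout(n, scene_size=10.0, min_gap=2.0, iters=2000, seed=0):
    """Greedy local search: keep random perturbations that lower the cost."""
    rng = random.Random(seed)
    pos = [(rng.uniform(0, scene_size), rng.uniform(0, scene_size))
           for _ in range(n)]
    best = layout_cost(pos, scene_size, min_gap)
    for _ in range(iters):
        i = rng.randrange(n)
        old = pos[i]
        pos[i] = (old[0] + rng.uniform(-1, 1), old[1] + rng.uniform(-1, 1))
        cost = layout_cost(pos, scene_size, min_gap)
        if cost <= best:
            best = cost
        else:
            pos[i] = old  # Reject the move; restore the previous layout.
    return pos, best
```

Because only non-worsening moves are accepted, the final cost never exceeds that of the initial random layout; richer versions would add constraints for illumination, textures, moving people, and other environmental characteristics named in the abstract.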