Patents by Inventor Eyal Ofek

Eyal Ofek has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190244434
    Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
    Type: Application
    Filed: April 17, 2019
    Publication date: August 8, 2019
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
  • Patent number: 10373381
    Abstract: A method, computing device and head-mounted display device for manipulating a virtual object displayed via a display device are disclosed. In one example, image data of a physical environment comprising physical features is received. A three dimensional model of at least a portion of the environment is generated. Candidate anchor features that each correspond to one of the physical features are extracted from the image data. User input is received that manipulates the virtual object as displayed within the environment. Based on the manipulation, a correspondence between a virtual anchor feature of the virtual object and a corresponding candidate anchor feature is identified. An indication of the corresponding candidate anchor feature at its corresponding physical feature within the environment is displayed to the user.
    Type: Grant
    Filed: March 30, 2016
    Date of Patent: August 6, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Benjamin Nuernberger, Hrvoje Benko, Andrew Wilson, Eyal Ofek
  • Publication number: 20190216340
    Abstract: A sensor device is described herein. The sensor device includes a multi-dimensional optical sensor and processing circuitry, wherein the multi-dimensional optical sensor generates images and the processing circuitry is configured to output data that is indicative of hemodynamics of a user based upon the images. The sensor device is non-invasive, and is able to be incorporated into wearable devices, thereby allowing for continuous output of the data that is indicative of the hemodynamics of the user.
    Type: Application
    Filed: January 15, 2018
    Publication date: July 18, 2019
    Inventors: Christian HOLZ, Eyal OFEK, Michael J. SINCLAIR
  • Publication number: 20190201784
    Abstract: A controller is provided that can provide haptic feedback to a user by controlling a separation of a stationary portion and a moveable portion, such as a moveable arm, which can include one or more mounts for one or more of a user's fingers. A sensor can be included on the stationary portion to sense whether the user's thumb is proximate a thumb rest. Different haptic interaction modes can be set depending on whether the user's thumb is not proximate the sensor, such as a touch mode, or is proximate the sensor, such as a grasping or trigger mode. When grasping and trigger modes are provided, they can be determined based on the nature of a virtual object grasped by a user. Additional haptic sensations can be provided, such as to a user's fingertip, such as by a vibratory component or a rotatable object of one or more haptic elements.
    Type: Application
    Filed: December 29, 2017
    Publication date: July 4, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Christian Holz, Eyal Ofek, Michael Jack Sinclair, Hrvoje Benko, Inrak Choi, Eric Whitmire
  • Patent number: 10325335
    Abstract: A method for using a health information exchange system which stores patient record data regarding a multiplicity of patients, to serve a first plurality of EMRs each interacting with an EMR community including a set of at least one EMR, the method comprising: for each individual EMR within the first plurality of EMRs, performing a computerized context interception process using a processor to intercept context from the individual EMR and to identify therewithin an event whereby a health provider using the individual EMR calls up an individual patient's record from said individual EMR; and responsive to identification of the event, using a computerized output device for providing patient record data, pertaining to the individual patient, to the health provider.
    Type: Grant
    Filed: December 31, 2013
    Date of Patent: June 18, 2019
    Assignees: Allscripts Software, LLC; dbMotion Ltd.
    Inventors: Robert Wartenfeld, Ziv Ofek, Eyal Greenberg, Ziv Gome, Shiri Ben-Tal
  • Patent number: 10304251
    Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
    Type: Grant
    Filed: June 15, 2017
    Date of Patent: May 28, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
  • Patent number: 10297343
    Abstract: A method for using a health information exchange system which stores patient record data regarding a multiplicity of patients, to serve a first plurality of EMRs each interacting with an EMR community including a set of at least one EMR, the method comprising: for each individual EMR within the first plurality of EMRs, performing a computerized context interception process using a processor to intercept context from the individual EMR and to identify therewithin an event whereby a health provider using the individual EMR calls up an individual patient's record from said individual EMR; and responsive to identification of the event, using a computerized output device for providing patient record data, pertaining to the individual patient, to the health provider.
    Type: Grant
    Filed: December 31, 2013
    Date of Patent: May 21, 2019
    Assignee: dbMotion Ltd.
    Inventors: Robert Wartenfeld, Ziv Ofek, Eyal Greenberg, Ziv Gome, Shiri Ben-Tal
  • Patent number: 10290153
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Grant
    Filed: September 13, 2017
    Date of Patent: May 14, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian, Mark Hancock
  • Patent number: 10289239
    Abstract: A sensing device, such as a user-wearable device (UWD) worn by a user of a touchscreen, may provide kinematic data of the sensing device or UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Grant
    Filed: December 12, 2016
    Date of Patent: May 14, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, William Buxton, Kenneth P. Hinckley, Andrew M. Webb, Eyal Ofek
  • Patent number: 10216982
    Abstract: Various systems and methods for projecting a remote object are described herein. In one example, a method includes collecting environment data corresponding to a local environment in which a system is located and detecting a remote object corresponding to a remote user in a remote environment. The method can also include detecting a viewpoint of a local user in the local environment, and projecting the remote object corresponding to the remote user in the local environment based on the viewpoint of the local user, the virtual copy of the remote object to be positioned in the local environment by taking into account geometry of local objects in the local environment.
    Type: Grant
    Filed: March 12, 2015
    Date of Patent: February 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Tomislav Pejsa, Andrew Wilson, Hrvoje Benko, Eyal Ofek, Julian Kantor
  • Patent number: 10215585
    Abstract: Various embodiments provide techniques for geographic navigation via one or more block views. According to some embodiments, a block view can include a visual image of a geographic location that is visually similar to a panoramic image. In some example implementations, a block view can be scrolled to navigate images of a geographic location. In one or more embodiments, a bubble view can be displayed of one or more locations within the block view. The bubble view can include a zoomed image of one or more aspects of a block view. Further to some embodiments, a map view can be utilized along with the block view and/or the bubble view. The map view can include a two-dimensional representation of the geographic location from an aerial perspective, and can include a more general level of detail concerning the geographic location, such as streets, cities, states, bodies of water, and so on.
    Type: Grant
    Filed: February 12, 2016
    Date of Patent: February 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Pasquale DeMaio, Yonatan Wexler
  • Publication number: 20190004683
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection input associated with a virtual object is accessed. The selection input is based on real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display that displays the virtual object and the mixed-reality device has a second display that displays the virtual object. A selection profile is generated based on the selection input. The selection profile comprises one or more selection profile attributes for isolating a portion of the virtual object. A selected portion of the virtual object is determined based on the selection profile. The selected portion of the virtual reality object is caused to be displayed on the first display of the selection device.
    Type: Application
    Filed: June 30, 2017
    Publication date: January 3, 2019
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Publication number: 20190004684
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Application
    Filed: June 30, 2017
    Publication date: January 3, 2019
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Publication number: 20190005724
    Abstract: Methods and systems for rendering augmented reality display data to locations of a physical presentation environment based on a presentation configuration are provided. A physical presentation environment configuration may be accessed that includes locations of a physical presentation environment for mapping augmented reality display data. The augmented reality display data may include a plurality of augmented reality objects that are rendered for display. Presentation attributes of the augmented reality display data may be used in conjunction with the presentation configuration for mapping and rendering the augmented reality display data. The rendered augmented reality display data may be dynamically interactive, and may be generated based on previous presentation configurations, mapping preferences, mapping limitations, and/or other factors.
    Type: Application
    Filed: June 30, 2017
    Publication date: January 3, 2019
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Publication number: 20180364853
    Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
    Type: Application
    Filed: June 15, 2017
    Publication date: December 20, 2018
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Sasa JUNUZOVIC
  • Publication number: 20180364808
    Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
    Type: Application
    Filed: June 15, 2017
    Publication date: December 20, 2018
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
  • Publication number: 20180365897
    Abstract: Systems and techniques for displaying virtual representations of real-world spaces and objects in various environments are disclosed. A source environment at a first location can be scanned by a head-mounted display (HMD) device to generate three-dimensional datasets corresponding to the physical environment at the first location. The three-dimensional datasets can include detected physical properties associated with the physical environment. At a second location, the HMD can re-create the source environment, and render for display a virtual representation of the physical environment based on the three-dimensional datasets, where the virtual representation of the source environment is rendered to maintain any one of the detected physical properties associated with the physical environment.
    Type: Application
    Filed: June 15, 2017
    Publication date: December 20, 2018
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
  • Publication number: 20180333088
    Abstract: Embodiments relate to using a display and camera of a computing device to perform pulse oximetry. The display of the device is used as an illuminant, a finger is placed over a portion of the display and a camera facing in the same direction as the display. One or more colors are selected to enhance hemoglobin-deoxyhemoglobin contrast in view of display and camera sensitivities. The one or more colors are displayed while a body part covers the displayed color and the camera. The camera captures images of light that has passed through the finger and been internally reflected to the camera. The light reaching the camera has been absorbed by arterial hemoglobin and deoxyhemoglobin at different rates in respective different wavebands. Differences in attenuation of display light at the different wavebands provide sufficient contrast to compute an accurate pulse oxygenation estimate.
    Type: Application
    Filed: May 17, 2017
    Publication date: November 22, 2018
    Inventors: Christian Holz, Eyal Ofek
  • Publication number: 20180321737
    Abstract: In various embodiments, methods and systems for implementing an integrated mixed-input system are provided. The integrated mixed-input system includes paired mixed-input devices for interacting and controlling virtual space input interfaces using real inputs and virtual inputs, sensors, and passive and active haptic feedback associated with the paired mixed-input devices. Real device space tracker data and virtual device space tracker data are accessed via the paired mixed-input devices to determine real input and virtual input that are processed to determine virtual space input. The real device space tracker data and virtual device space tracker data are also used to generate different interaction contexts. In one embodiment, the integrated mixed-input system supports interface deviation, where a physical mixed-input device interface is a different size from the size of the virtual space input interface. The virtual space input is communicated to control the virtual space input interface.
    Type: Application
    Filed: June 30, 2017
    Publication date: November 8, 2018
    Inventors: Michel PAHUD, Eyal OFEK
  • Publication number: 20180322701
    Abstract: In various embodiments, computerized methods and systems for syndicating direct and indirect interactions with objects in a computer-mediated environment to facilitate precise interactions with the objects in the computer-mediated environment are provided. The system detects a direct interaction with an object in the computer-mediated reality environment. The direct interaction may be a natural or hypernatural interaction. Subsequently, the system may determine various options of indirect interaction with the object related to the direct interaction. The indirect interaction may be generated by a controller. Upon receiving an indirect interaction, the system may modify the object based on the syndication of the direct interaction and the indirect interaction.
    Type: Application
    Filed: May 4, 2017
    Publication date: November 8, 2018
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER