Patents by Inventor Eyal Ofek

Eyal Ofek has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220221928
    Abstract: The present concepts relate to devices that can employ graspable controllers that can be employed in various scenarios, such as virtual reality scenarios and augmented reality scenarios. One example device can include multiple expansion assemblies having independently adjustable girths. The multiple expansion assemblies can be stacked adjacent to one another along an axis. A controller can be configured to expand or contract the girths of the expansion assemblies to collectively approximate girths of an object.
    Type: Application
    Filed: January 11, 2021
    Publication date: July 14, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mar GONZALEZ FRANCO, Michael Jack SINCLAIR, Eyal OFEK, Eric Jordan GONZALEZ
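The abstract above describes a stack of expansion assemblies whose girths are independently adjusted to collectively approximate an object's shape. A minimal sketch of that idea, assuming hypothetical girth limits and function names (none of which come from the patent):

```python
# Illustrative sketch (not the patented implementation): approximate an
# object's girth profile with a stack of independently adjustable
# expansion assemblies. The girth range and all names are assumptions.

MIN_GIRTH_MM = 60.0   # assumed fully contracted circumference of one assembly
MAX_GIRTH_MM = 220.0  # assumed fully expanded circumference

def target_girths(object_profile_mm):
    """Clamp each sampled object girth to the actuator's physical range.

    object_profile_mm: the object's girth sampled once per stacked
    assembly, from bottom to top along the stacking axis.
    """
    return [min(MAX_GIRTH_MM, max(MIN_GIRTH_MM, g)) for g in object_profile_mm]

# A tapering bottle-like object sampled at four assembly heights:
print(target_girths([250.0, 180.0, 120.0, 40.0]))  # → [220.0, 180.0, 120.0, 60.0]
```

The clamping step reflects the abstract's point that the assemblies can only approximate girths: shapes outside the hardware's range saturate at the nearest achievable girth.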
  • Publication number: 20220129060
    Abstract: In some examples, a surface, such as a desktop, in front of or around a portable electronic device may be used as a relatively large surface for interacting with the portable electronic device, which typically has a small display screen. A user may write or draw on the surface using any object such as a finger, pen, or stylus. The surface may also be used to simulate a partial or full-size keyboard. The use of a camera to sense the three-dimensional (3D) location or motion of the object may enable use of above-the-surface gestures, entry of directionality, and capture of real objects into a document being processed or stored by the electronic device. One or more objects may be used to manipulate elements displayed by the portable electronic device.
    Type: Application
    Filed: November 4, 2021
    Publication date: April 28, 2022
    Inventors: Eyal Ofek, Michel Pahud, Pourang P. Irani
  • Publication number: 20220100278
    Abstract: The present concepts relate to haptic controllers. In one example the haptic controller can include first and second capstans rotationally secured to a base and an energy storage mechanism connected between the first and second capstans. The example haptic controller can also include a user engagement assembly secured to the first capstan and a controller configured to control rotational forces imparted on the user engagement assembly by controlling rotational friction experienced by the first and second capstans.
    Type: Application
    Filed: December 13, 2021
    Publication date: March 31, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Michael Jack SINCLAIR, Mar GONZALEZ FRANCO, Christian HOLZ, Eyal OFEK
  • Patent number: 11226685
    Abstract: The present concepts relate to haptic controllers. In one example the haptic controller can include first and second capstans rotationally secured to a base and an energy storage mechanism connected between the first and second capstans. The example haptic controller can also include a user engagement assembly secured to the first capstan and a controller configured to control rotational forces imparted on the user engagement assembly by controlling rotational friction experienced by the first and second capstans.
    Type: Grant
    Filed: June 12, 2019
    Date of Patent: January 18, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Jack Sinclair, Mar Gonzalez Franco, Christian Holz, Eyal Ofek
  • Publication number: 20220011924
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Application
    Filed: June 1, 2021
    Publication date: January 13, 2022
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER, Steven Mark DRUCKER
  • Patent number: 11188143
    Abstract: In some examples, a surface, such as a desktop, in front of or around a portable electronic device may be used as a relatively large surface for interacting with the portable electronic device, which typically has a small display screen. A user may write or draw on the surface using any object such as a finger, pen, or stylus. The surface may also be used to simulate a partial or full-size keyboard. The use of a camera to sense the three-dimensional (3D) location or motion of the object may enable use of above-the-surface gestures, entry of directionality, and capture of real objects into a document being processed or stored by the electronic device. One or more objects may be used to manipulate elements displayed by the portable electronic device.
    Type: Grant
    Filed: January 4, 2016
    Date of Patent: November 30, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Michel Pahud, Pourang P. Irani
  • Patent number: 11119581
    Abstract: In various embodiments, computerized systems and methods for displacement oriented interaction with objects in a computer-mediated environment are provided. In one embodiment, the system detects a wearable device moved with a displacement transversal to a longitudinal axis of the wearable device. If the system determines that the displacement is within a displacement range associated with an actionable item, the system may select the actionable item or activate an operation associated with the actionable item, such as modifying an object in the computer-mediated reality environment.
    Type: Grant
    Filed: March 6, 2020
    Date of Patent: September 14, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
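The abstract above describes checking whether a wearable's transversal displacement falls within a range associated with an actionable item, and activating that item if so. A minimal sketch of that lookup, with hypothetical ranges, units, and item names (none specified by the patent):

```python
# Illustrative sketch (not the patented implementation): map a wearable's
# displacement transversal to its longitudinal axis onto actionable items.
# The ranges (in mm) and item names are hypothetical assumptions.

ACTION_RANGES = [
    ((5.0, 20.0), "select_item"),      # small sideways displacement
    ((20.0, 50.0), "modify_object"),   # larger sideways displacement
]

def item_for_displacement(d_mm):
    """Return the actionable item whose displacement range contains d_mm."""
    for (lo, hi), item in ACTION_RANGES:
        if lo <= d_mm < hi:
            return item
    return None  # displacement outside every actionable range

print(item_for_displacement(12.0))  # → "select_item"
print(item_for_displacement(3.0))   # → None
```

Displacements below the first threshold return no item, matching the abstract's condition that an operation activates only when the displacement is within a range associated with an actionable item.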
  • Patent number: 11086398
    Abstract: Examples are disclosed that relate to haptic rendering. One disclosed example provides a haptic rendering device including a patterned layer that exhibits auxetic behavior, and a plurality of actuators configured to move the patterned layer, each actuator being individually controllable to cooperatively change a curvature of the patterned layer in two dimensions.
    Type: Grant
    Filed: June 7, 2019
    Date of Patent: August 10, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mar Gonzalez Franco, Eyal Ofek, Michael Jack Sinclair, Anthony James Steed
  • Patent number: 11087518
    Abstract: The claimed subject matter relates to an architecture that can provide for a second-person avatar. The second-person avatar can rely upon a second-person-based perspective such that the avatar is displayed to appear to encompass all or portions of a target user. Accordingly, actions or a configuration of the avatar can serve as a model or demonstration for the user in order to aid the user in accomplishing a particular task. Updates to avatar activity or configuration can be provided by a dynamic virtual handbook. The virtual handbook can be constructed based upon a set of instructions associated with accomplishing the desired task and further based upon features or aspects of the user as well as those of the local environment.
    Type: Grant
    Filed: August 4, 2016
    Date of Patent: August 10, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Avi Bar-Zeev, Gur Kimchi, Jason Szabo
  • Publication number: 20210241536
    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
    Type: Application
    Filed: April 9, 2021
    Publication date: August 5, 2021
    Inventors: Andrew D. WILSON, Christian HOLZ, Eyal OFEK, Jeremy HARTMANN
  • Patent number: 11068111
    Abstract: Systems and methods for enabling user-interactions with virtual objects (VOs) included in immersive environments (IEs) are provided. A head-mounted display (HMD) device is communicatively coupled with a hover-sensing (HS) device, via a communication session. The HMD device provides an IE to a wearer by displaying a field-of-view (FOV) that includes a VO. The user executes user-interactions, such as 2D and/or 3D hand gestures, fingertip gestures, multi-fingertip gestures, stylus gestures, hover gestures, and the like. The HS device detects the user-interactions and generates interaction data. The interaction data is provided to the HMD device via the communication session. The HMD device updates the FOV and/or the VO based on the interaction data. A physical overlay that includes a 3D protrusion is coupled with the HS device. The overlay is transparent to the hover-sensing capabilities of the HS device. The protrusion provides tactile feedback to the user for the user-interactions.
    Type: Grant
    Filed: November 26, 2019
    Date of Patent: July 20, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Sasa Junuzovic
  • Patent number: 11055891
    Abstract: Examples of the present disclosure describe systems and methods for providing real-time motion styling in virtual reality (VR), augmented reality (AR), and/or mixed reality (MR) environments. In aspects, input data corresponding to user interaction with a VR, an AR, or an MR environment may be received. The input data may be featurized to generate a feature set. The feature set may be compared to a set of stored motion data comprising motion capture data representing one or more motion styles for executing an action or activity. Based on the comparison, the feature set may be matched to feature data for one or more motions styles in the stored motion data. The one or more motions styles may then be executed by a virtual avatar or a virtual object in the VR/AR/MR environment.
    Type: Grant
    Filed: March 10, 2020
    Date of Patent: July 6, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Mar Gonzalez Franco, Andrew D. Wilson, Karan Ahuja, Christian Holz
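The abstract above describes featurizing input data and matching the feature set against stored motion-capture data representing motion styles. A minimal sketch of one plausible matching step, nearest-neighbor distance over feature vectors, with hypothetical features and style names (the patent does not specify the matching method):

```python
import math

# Illustrative sketch (not the patented system): match a featurized input
# to the closest stored motion-style feature vector. The feature values
# and style names are hypothetical assumptions.

STORED_STYLES = {
    "energetic_walk": [0.9, 0.8, 0.1],  # e.g. speed, arm swing, slouch
    "tired_walk":     [0.2, 0.3, 0.9],
}

def match_style(feature_set):
    """Return the stored style with minimum Euclidean distance to the input."""
    def dist(stored):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(feature_set, stored)))
    return min(STORED_STYLES, key=lambda name: dist(STORED_STYLES[name]))

print(match_style([0.85, 0.75, 0.2]))  # → "energetic_walk"
```

The matched style would then drive the avatar or virtual object, as the abstract outlines.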
  • Patent number: 11054894
    Abstract: In various embodiments, methods and systems for implementing an integrated mixed-input system are provided. The integrated mixed-input system includes paired mixed-input devices for interacting and controlling virtual space input interfaces using real inputs and virtual inputs, sensors, and passive and active haptic feedback associated with the paired mixed-input devices. Real device space tracker data and virtual device space tracker data are accessed via the paired mixed-input devices to determine real input and virtual input that are processed to determine virtual space input. The real device space tracker data and virtual device space tracker data also are used to generate different interaction contexts. In one embodiment, integrated mixed-input system supports interface deviation, where a physical mixed-input device interface is a different size from a size of the virtual space input interface. The virtual space input is communicated to control the virtual space input interface.
    Type: Grant
    Filed: June 30, 2017
    Date of Patent: July 6, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Eyal Ofek
  • Patent number: 11023109
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Grant
    Filed: June 30, 2017
    Date of Patent: June 1, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
  • Patent number: 11003247
    Abstract: The present concepts relate to devices that can employ deployable controllers. In one example the device can include a base assembly configured to ground the device to a non-hand body part of a user. The example can also include an engagement assembly configured to receive tactile input from a hand of the user or to deliver tactile output to the hand of the user. The device can further include a deployment assembly extending from the base assembly to the engagement assembly and configured to deploy the engagement assembly from a storage orientation proximate to the base assembly to a deployed orientation proximate to the hand of the user.
    Type: Grant
    Filed: March 20, 2020
    Date of Patent: May 11, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Jack Sinclair, Robert Kovacs, Eyal Ofek, Mar Gonzalez Franco
  • Patent number: 11004269
    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to an HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
    Type: Grant
    Filed: April 22, 2019
    Date of Patent: May 11, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Christian Holz, Eyal Ofek, Jeremy Hartmann
  • Patent number: 10976816
    Abstract: Various embodiments are provided herein for modifying a virtual scene based on eye-tracking. A computing device coupled to an HMD can provide a virtual scene for display on the HMD. The computing device can receive sensor data from a set of eye-tracking sensors coupled to the HMD. Based on the received sensor data, the computing device can determine a set of focal regions of the displayed virtual scene, including a perifoveal region of the displayed virtual scene. A portion of the virtual scene can then be modified based, in part, on a determination that the portion is outside of the determined perifoveal region.
    Type: Grant
    Filed: June 25, 2019
    Date of Patent: April 13, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Sebastian Lennard Marwecki, Eyal Ofek, Christian Holz
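The abstract above hinges on deciding which portions of the scene lie outside a perifoveal region around the tracked gaze point. A minimal sketch of that test, assuming a circular region in screen pixels and hypothetical tile coordinates (the patent does not specify the region's shape or size):

```python
import math

# Illustrative sketch (not the patented method): flag scene tiles that
# fall outside a perifoveal region around the gaze point, so only those
# are candidates for modification. The radius and tiling are assumptions.

PERIFOVEAL_RADIUS_PX = 300.0  # assumed region size in screen pixels

def tiles_to_modify(gaze_xy, tile_centers):
    """Return centers of tiles outside the perifoveal region around gaze_xy."""
    gx, gy = gaze_xy
    return [c for c in tile_centers
            if math.hypot(c[0] - gx, c[1] - gy) > PERIFOVEAL_RADIUS_PX]

# Gaze near screen center; one nearby tile and two peripheral tiles:
print(tiles_to_modify((500, 400), [(520, 410), (900, 400), (500, 50)]))
# → [(900, 400), (500, 50)]
```

Tiles inside the region are left untouched, matching the abstract's condition that modification applies only to portions determined to be outside the perifoveal region.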
  • Publication number: 20210100459
    Abstract: A sensor device is described herein. The sensor device includes a multi-dimensional optical sensor and processing circuitry, wherein the multi-dimensional optical sensor generates images and the processing circuitry is configured to output data that is indicative of hemodynamics of a user based upon the images. The sensor device is non-invasive, and is able to be incorporated into wearable devices, thereby allowing for continuous output of the data that is indicative of the hemodynamics of the user.
    Type: Application
    Filed: November 25, 2020
    Publication date: April 8, 2021
    Inventors: Christian HOLZ, Eyal OFEK, Michael J. SINCLAIR
  • Patent number: 10895966
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection input associated with a virtual object is accessed. The selection input is based on real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display that displays the virtual object and the mixed-reality device has a second display that displays the virtual object. A selection profile is generated based on the selection input. The selection profile comprises one or more selection profile attributes for isolating a portion of the virtual object. A selected portion of the virtual object is determined based on the selection profile. The selected portion of the virtual reality object is caused to be displayed on the first display of the selection device.
    Type: Grant
    Filed: June 30, 2017
    Date of Patent: January 19, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
  • Patent number: 10885710
    Abstract: In various embodiments, computerized methods and systems are provided for dynamically updating a fully-immersive virtual environment based on tracked physical environment data. A computing device coupled to an HMD receives sensor data from a variety of sensors. The computing device can generate a virtual scene based on the received sensor data, whereby the virtual scene includes at least a portion of a virtual path that corresponds to at least a portion of a navigable path determined based on the received sensor data. The computing device can modify the virtual scene to include a virtual obstruction that corresponds to a physical object detected based on additional sensor data received from the sensors. The modified virtual scene is presented to the user for display, so that the user can safely traverse the physical environment while staying fully-immersed in the virtual environment.
    Type: Grant
    Filed: March 14, 2019
    Date of Patent: January 5, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Christian Holz, Eyal Ofek, Andrew D. Wilson, Lung-Pan Cheng, Junrui Yang
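The abstract above describes modifying the virtual scene to include obstructions corresponding to newly detected physical objects. A minimal sketch of that update step, with hypothetical data shapes and identifiers (the patent does not specify a representation):

```python
# Illustrative sketch (not the patented system): add a virtual obstruction
# to the scene for each physical object not yet represented, as new sensor
# detections arrive. Object IDs and the dict layout are assumptions.

def update_scene(scene_obstructions, detected_objects, known_ids):
    """Append an obstruction for every not-yet-represented detected object.

    detected_objects: iterable of (object_id, (x, y, z)) detections.
    known_ids: set of object IDs already represented in the scene.
    """
    for obj_id, position in detected_objects:
        if obj_id not in known_ids:
            scene_obstructions.append({"id": obj_id, "pos": position})
            known_ids.add(obj_id)
    return scene_obstructions

scene = update_scene([], [("chair-1", (1.0, 0.0, 2.5)),
                          ("wall-1", (0.0, 0.0, 4.0))], set())
print(len(scene))  # → 2
```

Re-running the update with the same detections adds nothing, so the scene stays consistent as sensor data repeats across frames.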