Patents by Inventor Eyal Ofek

Eyal Ofek has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240005579
    Abstract: Systems and methods for representing two-dimensional representations as three-dimensional avatars are provided herein. In some examples, one or more input video streams are received. A first subject, within the one or more input video streams, is identified. Based on the one or more input video streams, a first view of the first subject is identified. Based on the one or more input video streams, a second view of the first subject is identified. The first subject is segmented into a plurality of planar objects. The plurality of planar objects are transformed with respect to each other. The plurality of planar objects are based on the first and second views of the first subject. The plurality of planar objects are output in an output video stream. The plurality of planar objects provide perspective of the first subject to one or more viewers.
    Type: Application
    Filed: June 30, 2022
    Publication date: January 4, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mar GONZALEZ FRANCO, Payod PANDA, Andrew D. WILSON, Kori M. INKPEN, Eyal OFEK, William Arthur Stewart BUXTON
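    Illustrative sketch: the abstract above describes splitting a subject into textured planar "billboards" whose poses are adjusted relative to one another and to the viewer. A minimal Python sketch of that idea follows; the class and function names are assumptions for illustration, not taken from the patent.
      from dataclasses import dataclass
      import numpy as np

      @dataclass
      class BillboardPlane:
          center: np.ndarray      # 3-D position of the plane's centre
          half_size: tuple        # (half-width, half-height) in metres
          texture_id: str         # handle to the segmented video crop for this body part

      def corners_facing(plane: BillboardPlane, viewer: np.ndarray) -> np.ndarray:
          """World-space corners of the plane rotated so it faces the viewer."""
          normal = viewer - plane.center
          normal = normal / np.linalg.norm(normal)
          up = np.array([0.0, 1.0, 0.0])
          right = np.cross(up, normal)
          right = right / np.linalg.norm(right)
          up = np.cross(normal, right)                   # re-orthogonalise
          w, h = plane.half_size
          return np.array([plane.center + sx * w * right + sy * h * up
                           for sx in (-1, 1) for sy in (-1, 1)])

      # A subject segmented into two planes, e.g. head and torso crops from two views.
      subject = [BillboardPlane(np.array([0.0, 1.6, 0.0]), (0.15, 0.20), "head_crop"),
                 BillboardPlane(np.array([0.0, 1.1, 0.0]), (0.25, 0.40), "torso_crop")]
      viewer_position = np.array([1.0, 1.5, 2.0])
      for part in subject:
          print(part.texture_id, corners_facing(part, viewer_position).round(2))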
  • Publication number: 20230421722
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: September 7, 2023
    Publication date: December 28, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
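    Illustrative sketch: a minimal Python model of the virtual presence described above, assuming the headset reports head orientation and the user explicitly sets an emotion type and intensity; the class and method names are assumptions, not the patented implementation.
      from dataclasses import dataclass

      @dataclass
      class VirtualPresence:
          yaw: float = 0.0
          pitch: float = 0.0
          roll: float = 0.0
          emotion: str = "neutral"
          intensity: float = 0.0      # 0.0 .. 1.0

          def apply_head_motion(self, yaw, pitch, roll):
              """Mirror headset movement onto the avatar so it turns and nods with the user."""
              self.yaw, self.pitch, self.roll = yaw, pitch, roll

          def set_emotion(self, emotion, intensity):
              """Explicit user input selects what the avatar conveys and how strongly."""
              self.emotion = emotion
              self.intensity = max(0.0, min(1.0, intensity))

      avatar = VirtualPresence()
      avatar.apply_head_motion(yaw=15.0, pitch=-5.0, roll=0.0)   # sample from the headset IMU
      avatar.set_emotion("agreement", 0.7)                        # e.g. an emphatic nod
      print(avatar)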
  • Patent number: 11822712
    Abstract: The present concepts relate to devices that can employ graspable controllers that can be employed in various scenarios, such as virtual reality scenarios and augmented reality scenarios. One example device can include multiple expansion assemblies having independently adjustable girths. The multiple expansion assemblies can be stacked adjacent to one another along an axis. A controller can be configured to expand or contract the girths of the expansion assemblies to collectively approximate girths of an object.
    Type: Grant
    Filed: December 14, 2022
    Date of Patent: November 21, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mar Gonzalez Franco, Michael Jack Sinclair, Eyal Ofek, Eric Jordan Gonzalez
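    Illustrative sketch: the grant above describes stacked expansion assemblies whose girths are set individually so that, together, they approximate the girths of an object. A minimal Python sketch under that reading, with invented girth limits and an invented object profile:
      def interpolate(profile, t):
          """Linearly interpolate girth at normalised height t from sorted (t, girth) samples."""
          for (t0, g0), (t1, g1) in zip(profile, profile[1:]):
              if t0 <= t <= t1:
                  return g0 + (g1 - g0) * (t - t0) / (t1 - t0)
          return profile[-1][1]

      def girth_commands(profile, num_assemblies, min_girth, max_girth):
          """One clamped girth target per stacked assembly, sampled along the stacking axis."""
          commands = []
          for i in range(num_assemblies):
              t = i / max(num_assemblies - 1, 1)       # assembly position along the stack
              commands.append(max(min_girth, min(max_girth, interpolate(profile, t))))
          return commands

      # Circumference (mm) of a virtual bottle at normalised heights 0..1.
      bottle = [(0.0, 220.0), (0.6, 210.0), (0.8, 120.0), (1.0, 90.0)]
      print(girth_commands(bottle, num_assemblies=4, min_girth=80.0, max_girth=260.0))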
  • Publication number: 20230367824
    Abstract: Aspects relate to observing various activities, interactions, behaviors, and other factors associated with a data exchange and creating one or more markers based on significant details associated with the observation. The one or more markers are retained and selectively rendered as a function of one or more conditions that should be satisfied before the marker is presented to the user. Some markers can contain parameters that should be satisfied in order for the marker to be considered complete. If a parameter is not satisfied, subsequent markers can be created as a function of the rendered marker. The subsequent markers can be rendered when a condition associated with the subsequent marker is satisfied.
    Type: Application
    Filed: July 26, 2023
    Publication date: November 16, 2023
    Inventors: Gur KIMCHI, Stephen L. LAWLER, Blaise H. AGUERA Y ARCAS, Eyal OFEK
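    Illustrative sketch: a minimal Python reading of the marker idea above, assuming a marker pairs a noted detail with a condition that gates when it is shown and parameters that mark it complete; the names and the follow-up rule are illustrative assumptions.
      from dataclasses import dataclass, field
      from typing import Callable, Dict

      @dataclass
      class Marker:
          note: str
          condition: Callable[[dict], bool]             # context -> should this marker render?
          parameters: Dict[str, bool] = field(default_factory=dict)

          def is_complete(self):
              return all(self.parameters.values())

      def render_due_markers(markers, context):
          """Render markers whose conditions hold; spawn a follow-up for incomplete ones."""
          follow_ups = []
          for m in markers:
              if not m.condition(context):
                  continue
              print("render:", m.note)
              if not m.is_complete():
                  follow_ups.append(Marker(f"follow up: {m.note}",
                                           lambda ctx: ctx.get("day") == "tomorrow",
                                           dict(m.parameters)))
          return follow_ups

      reminder = Marker("discussed invoice with Alice",
                        condition=lambda ctx: ctx.get("talking_to") == "Alice",
                        parameters={"invoice_sent": False})
      print(render_due_markers([reminder], {"talking_to": "Alice"}))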
  • Publication number: 20230350628
    Abstract: Users can intuitively and non-obstructively obtain awareness of fellow collaborator views and/or statuses in an augmented reality environment. An interaction is detected between a first and second HMD. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request to generate a set of visual data as a response to the request. The second HMD can communicate the set of visual data to the first HMD, so that the first HMD can render the received set of visual data to be displayed concurrently with its own augmented view. Additional computer-generated object(s) can be positioned in accordance with a real-world or virtualized position of the second HMD, such that the now-visible state information associated with the second HMD does not obstruct a view of the first or second HMD's user.
    Type: Application
    Filed: July 12, 2023
    Publication date: November 2, 2023
    Inventors: Michel PAHUD, Nathalie RICHE, Eyal OFEK, Christophe HURTER
  • Patent number: 11792364
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: October 17, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Patent number: 11782669
    Abstract: Various systems and techniques for intuitively and non-obstructively obtaining awareness of fellow collaborator views and/or statuses in an augmented reality environment are disclosed. An interaction is detected between a first and second HMD, both collaborative participants of a shared dataset represented in an augmented reality environment. The detected interaction can cause the first HMD to request state information associated with the second HMD. The second HMD can process the request in accordance with a privacy policy to generate a set of visual data as a response to the request. The second HMD can communicate the generated set of visual data to the first HMD, such that the first HMD can render the received set of visual data as additional computer-generated object(s) to be displayed concurrently with its own augmented view.
    Type: Grant
    Filed: April 28, 2017
    Date of Patent: October 10, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter
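    Illustrative sketch: a minimal Python version of the request/response flow in the grant above, assuming each HMD holds a small state dictionary and a privacy policy that whitelists which fields may be shared; the interface is an assumption for illustration.
      class Hmd:
          def __init__(self, name, state, shareable_fields):
              self.name = name
              self.state = state                         # e.g. current view, selection, status
              self.shareable = set(shareable_fields)     # privacy policy: fields allowed out

          def handle_state_request(self, requester):
              """Return only the fields the privacy policy allows the requester to see."""
              return {k: v for k, v in self.state.items() if k in self.shareable}

          def request_state(self, other):
              """Triggered by a detected interaction, e.g. gazing at the other collaborator."""
              visual_data = other.handle_state_request(self)
              # Rendered near the other HMD's position so it does not obstruct either view.
              print(f"{self.name} renders near {other.name}: {visual_data}")

      alice = Hmd("alice_hmd", {"viewing": "chart_3", "status": "busy"}, ["viewing"])
      bob = Hmd("bob_hmd", {"viewing": "chart_1", "status": "free"}, ["viewing", "status"])
      alice.request_state(bob)   # Alice sees Bob's view and status
      bob.request_state(alice)   # Bob sees only what Alice's policy permits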
  • Publication number: 20230263403
    Abstract: A sensor device is described herein. The sensor device includes a multi-dimensional optical sensor and processing circuitry, wherein the multi-dimensional optical sensor generates images and the processing circuitry is configured to output data that is indicative of hemodynamics of a user based upon the images. The sensor device is non-invasive, and is able to be incorporated into wearable devices, thereby allowing for continuous output of the data that is indicative of the hemodynamics of the user.
    Type: Application
    Filed: May 1, 2023
    Publication date: August 24, 2023
    Inventors: Christian HOLZ, Eyal OFEK, Michael J. SINCLAIR
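    Illustrative sketch: a minimal Python reduction of the idea above, assuming the optical sensor can be treated as a stream of small intensity frames and that spatially averaging each frame yields a photoplethysmography-like signal; the real processing circuitry is far more involved.
      import numpy as np

      def hemodynamic_signal(frames):
          """frames: iterable of 2-D intensity arrays; one mean-intensity sample per frame."""
          return np.array([frame.mean() for frame in frames])

      # Synthetic frames whose brightness pulses at 1.2 Hz (about 72 beats per minute).
      fps, seconds = 30, 5
      t = np.arange(fps * seconds) / fps
      frames = [np.full((8, 8), 0.5 + 0.05 * np.sin(2 * np.pi * 1.2 * ti)) for ti in t]

      signal = hemodynamic_signal(frames)
      spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
      freqs = np.fft.rfftfreq(len(signal), d=1 / fps)
      print(f"estimated pulse: {freqs[spectrum.argmax()] * 60:.0f} beats/min")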
  • Publication number: 20230266830
    Abstract: Aspects of the present disclosure relate to semantic user input for a computing device. In examples, user input is identified and processed to identify and automatically perform an associated semantic action. The semantic action may be determined based at least in part on an environmental context associated with the user input. Thus, an action determined for a given user input may change according to the environmental context in which the input was received. For example, an association between user input, an environmental context, and an action may be used to affect the behavior of a computing device as a result of identifying the user input in a scenario that has the environmental context. Such associations may be dynamically determined as a result of user interactions associated with manually provided input, for example to create, update, and/or remove semantic actions associated with a variety of user inputs.
    Type: Application
    Filed: February 22, 2022
    Publication date: August 24, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Eyal OFEK, Michel PAHUD, Edward Sean Lloyd RINTEL, Mar Gonzalez FRANCO, Payod PANDA
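    Illustrative sketch: a minimal Python table mapping (user input, environmental context) pairs to semantic actions, with associations learned from manually performed actions, in the spirit of the abstract above; the class and action names are assumptions.
      class SemanticInputMap:
          def __init__(self):
              self.associations = {}    # (input, context) -> action

          def learn(self, user_input, context, action):
              """Create or update an association after the user performs the action manually."""
              self.associations[(user_input, context)] = action

          def act(self, user_input, context):
              action = self.associations.get((user_input, context))
              return f"perform {action!r}" if action else f"no action for {user_input!r} in {context!r}"

      mapping = SemanticInputMap()
      mapping.learn("nod", "video_call", "unmute_microphone")
      mapping.learn("nod", "music_playing", "skip_track")
      print(mapping.act("nod", "video_call"))      # same gesture, different context...
      print(mapping.act("nod", "music_playing"))   # ...different action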
  • Patent number: 11734366
    Abstract: Aspects relate to observing various activities, interactions, behaviors, and other factors associated with a data exchange and creating one or more markers based on significant details associated with the observation. The one or more markers are retained and selectively rendered as a function of one or more conditions that should be satisfied before the marker is presented to the user. Some markers can contain parameters that should be satisfied in order for the marker to be considered complete. If a parameter is not satisfied, subsequent markers can be created as a function of the rendered marker. The subsequent markers can be rendered when a condition associated with the subsequent marker is satisfied.
    Type: Grant
    Filed: October 11, 2013
    Date of Patent: August 22, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Gur Kimchi, Stephen L. Lawler, Blaise H. Aguera y Arcas, Eyal Ofek
  • Publication number: 20230259322
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: April 26, 2023
    Publication date: August 17, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
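    Illustrative sketch: a minimal Python pipeline in the spirit of the abstract above, in which IMU pitch samples from a headset are classified into an implicit gesture, and the gesture plus the current context selects an action; the thresholds, gesture, and action table are invented for illustration.
      def detect_nod(pitch_samples, threshold_deg=10.0):
          """Implicit input: a nod is a dip and recovery in head pitch."""
          dipped = min(pitch_samples) < -threshold_deg
          recovered = pitch_samples[-1] > -threshold_deg / 2
          return dipped and recovered

      def action_for(gesture, context):
          """Map the identified input plus the device context to an action."""
          table = {("nod", "incoming_call"): "answer call",
                   ("nod", "reading_notification"): "dismiss notification"}
          return table.get((gesture, context), "no-op")

      samples = [0.0, -4.0, -14.0, -12.0, -3.0, 1.0]   # head pitch (degrees) over time
      if detect_nod(samples):
          print(action_for("nod", context="incoming_call"))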
  • Patent number: 11672429
    Abstract: A sensor device is described herein. The sensor device includes a multi-dimensional optical sensor and processing circuitry, wherein the multi-dimensional optical sensor generates images and the processing circuitry is configured to output data that is indicative of hemodynamics of a user based upon the images. The sensor device is non-invasive, and is able to be incorporated into wearable devices, thereby allowing for continuous output of the data that is indicative of the hemodynamics of the user.
    Type: Grant
    Filed: November 25, 2020
    Date of Patent: June 13, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Christian Holz, Eyal Ofek, Michael J. Sinclair
  • Patent number: 11669294
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: June 6, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Publication number: 20230115959
    Abstract: The present concepts relate to devices that can employ graspable controllers that can be employed in various scenarios, such as virtual reality scenarios and augmented reality scenarios. One example device can include multiple expansion assemblies having independently adjustable girths. The multiple expansion assemblies can be stacked adjacent to one another along an axis. A controller can be configured to expand or contract the girths of the expansion assemblies to collectively approximate girths of an object.
    Type: Application
    Filed: December 14, 2022
    Publication date: April 13, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mar GONZALEZ FRANCO, Michael Jack SINCLAIR, Eyal OFEK, Eric Jordan GONZALEZ
  • Patent number: 11580704
    Abstract: Various embodiments are provided herein for tracking a user's physical environment, to facilitate on-the-fly blending of a virtual environment with detected aspects of the physical environment. Embodiments can be employed to facilitate virtual roaming by compositing virtual representations of detected physical objects into virtual environments. A computing device coupled to a HMD can select portions of a depth map generated based on the user's physical environment, to generate virtual objects that correspond to the selected portions. The computing device can composite the generated virtual objects into an existing virtual environment, such that the user can traverse the virtual environment while remaining aware of their physical environment. Among other things, the computing device can employ various blending techniques for compositing, and further provide image pass-through techniques for selective viewing of the physical environment while remaining fully-immersed in virtual reality.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: February 14, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Christian Holz, Eyal Ofek, Jeremy Hartmann
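    Illustrative sketch: a minimal Python version of the compositing step above, assuming the physical environment arrives as a per-pixel depth map in metres and that nearby objects are alpha-blended from a pass-through image into the virtual frame; real systems generate actual geometry from the selected depth-map portions.
      import numpy as np

      def obstacle_mask(depth_m, near=0.3, far=1.5):
          """Select depth-map portions close enough to matter while roaming."""
          return (depth_m > near) & (depth_m < far)

      def composite(virtual_rgb, passthrough_rgb, mask, alpha=0.6):
          """Blend pass-through pixels of detected physical objects into the virtual frame."""
          out = virtual_rgb.copy()
          out[mask] = alpha * passthrough_rgb[mask] + (1 - alpha) * virtual_rgb[mask]
          return out

      h, w = 4, 6
      depth = np.full((h, w), 3.0)
      depth[1:3, 2:4] = 1.0                    # a chair-sized obstacle about 1 m away
      virtual = np.zeros((h, w, 3))            # placeholder virtual scene
      passthrough = np.ones((h, w, 3))         # placeholder camera pass-through
      print(composite(virtual, passthrough, obstacle_mask(depth))[..., 0])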
  • Patent number: 11556168
    Abstract: The present concepts relate to devices that can employ graspable controllers that can be employed in various scenarios, such as virtual reality scenarios and augmented reality scenarios. One example device can include multiple expansion assemblies having independently adjustable girths. The multiple expansion assemblies can be stacked adjacent to one another along an axis. A controller can be configured to expand or contract the girths of the expansion assemblies to collectively approximate girths of an object.
    Type: Grant
    Filed: January 11, 2021
    Date of Patent: January 17, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mar Gonzalez Franco, Michael Jack Sinclair, Eyal Ofek, Eric Jordan Gonzalez
  • Patent number: 11543947
    Abstract: In various embodiments, methods and systems for implementing a multi-device mixed interactivity system are provided. The interactivity system includes paired mixed-input devices for interacting and controlling virtual objects. In operation, a selection profile associated with a virtual object is accessed. The selection profile is generated based on a selection input determined using real input associated with a selection device and virtual input associated with a mixed-reality device. The selection device has a first display and the mixed-reality device has a second display that both display the virtual object. An annotation input for the virtual object based on a selected portion corresponding to the selection profile is received. An annotation profile based on the annotation input is generated. The annotation profile includes annotation profile attributes for annotating a portion of the virtual object. An annotation of the selected portion of the virtual reality object is caused to be displayed.
    Type: Grant
    Filed: June 1, 2021
    Date of Patent: January 3, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michel Pahud, Nathalie Riche, Eyal Ofek, Christophe Hurter, Steven Mark Drucker
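    Illustrative sketch: minimal Python data shapes for the selection and annotation profiles described above, assuming the selection combines real input from the paired selection device with virtual input from the mixed-reality device; the field names are assumptions.
      from dataclasses import dataclass

      @dataclass
      class SelectionProfile:
          object_id: str
          portion: str          # which part of the virtual object was selected
          real_input: str       # e.g. a stylus stroke on the selection device's display
          virtual_input: str    # e.g. a gaze ray from the mixed-reality device

      @dataclass
      class AnnotationProfile:
          selection: SelectionProfile
          text: str
          color: str = "yellow"

      sel = SelectionProfile("engine_model", portion="valve_cover",
                             real_input="stylus_circle", virtual_input="gaze_ray")
      note = AnnotationProfile(selection=sel, text="check torque spec here")
      print(f"annotate {note.selection.object_id}/{note.selection.portion}: {note.text!r}")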
  • Publication number: 20220409996
    Abstract: The present concepts relate to devices that can approximate virtual objects when touched by a user. One example device can include a location assembly configured to sense a location of the device on a surface and configured to move the robotic device on the surface, and a shape assembly secured to the location assembly and configured to adjust a height and a pitch of an upper surface of the robotic device.
    Type: Application
    Filed: June 24, 2021
    Publication date: December 29, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mar GONZALEZ FRANCO, Eyal OFEK, Michael Jack SINCLAIR, Ryo SUZUKI
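    Illustrative sketch: a minimal Python stand-in for the device above, assuming the location assembly can drive to a predicted touch point and the shape assembly sets the height and pitch of the upper surface to match the virtual object at that spot; the interface and the sampled ramp are invented.
      from dataclasses import dataclass

      @dataclass
      class ShapeRobot:
          x: float = 0.0
          y: float = 0.0
          height_mm: float = 20.0
          pitch_deg: float = 0.0

          def move_to(self, x, y):
              """Location assembly: reposition under where the user's hand will land."""
              self.x, self.y = x, y

          def set_surface(self, height_mm, pitch_deg):
              """Shape assembly: raise and tilt the upper surface to match the virtual object."""
              self.height_mm, self.pitch_deg = height_mm, pitch_deg

      def virtual_ramp(x):
          """Height (mm) and pitch (deg) of an illustrative virtual ramp at position x."""
          return 20.0 + 30.0 * x, 15.0

      robot = ShapeRobot()
      hand_x, hand_y = 0.4, 0.2                # predicted touch point on the table
      robot.move_to(hand_x, hand_y)
      robot.set_surface(*virtual_ramp(hand_x))
      print(robot)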
  • Publication number: 20220382506
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
  • Publication number: 20220385855
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA