Patents by Inventor Eyal Ofek

Eyal Ofek has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9916003
    Abstract: A controller device for a virtual environment includes a handle and a contact device having a substantially planar surface. A position of the contact device relative to the handle is adjustable. An actuator module is arranged to adjust the position of the contact device relative to the handle. A control module in communication with the virtual environment selectively controls the actuator module to adjust the position of the contact device in response to data received from the virtual environment. The data includes an indication of an interaction between a user and an object represented within the virtual environment.
    Type: Grant
    Filed: September 2, 2016
    Date of Patent: March 13, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Jack Sinclair, Eyal Ofek, Hrvoje Benko, Christian Holz
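The control loop described in the abstract above can be sketched as follows; all class names, field names, and actuator travel limits here are hypothetical illustrations, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class ContactEvent:
    """Hypothetical interaction data sent by the virtual environment."""
    surface_angle_deg: float   # tilt of the virtual surface under the fingertip
    penetration_mm: float      # how far the virtual finger passed into the object

class ContactDeviceController:
    """Sketch of a control module that repositions a planar contact device
    relative to the handle. The travel limits are made-up numbers."""
    MAX_EXTENSION_MM = 20.0
    MAX_TILT_DEG = 30.0

    def __init__(self) -> None:
        self.extension_mm = 0.0
        self.tilt_deg = 0.0

    def on_interaction(self, event: ContactEvent) -> None:
        # Clamp the commanded pose to the actuator's physical range.
        self.extension_mm = min(event.penetration_mm, self.MAX_EXTENSION_MM)
        self.tilt_deg = max(-self.MAX_TILT_DEG,
                            min(event.surface_angle_deg, self.MAX_TILT_DEG))

ctrl = ContactDeviceController()
ctrl.on_interaction(ContactEvent(surface_angle_deg=45.0, penetration_mm=5.0))
print(ctrl.extension_mm, ctrl.tilt_deg)  # 5.0 30.0
```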
  • Publication number: 20180067543
    Abstract: A controller device for a virtual environment includes a handle and a contact device having a substantially planar surface. A position of the contact device relative to the handle is adjustable. An actuator module is arranged to adjust the position of the contact device relative to the handle. A control module in communication with the virtual environment selectively controls the actuator module to adjust the position of the contact device in response to data received from the virtual environment. The data includes an indication of an interaction between a user and an object represented within the virtual environment.
    Type: Application
    Filed: September 2, 2016
    Publication date: March 8, 2018
    Inventors: Michael Jack Sinclair, Eyal Ofek, Hrvoje Benko, Christian Holz
  • Publication number: 20180005451
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Application
    Filed: September 13, 2017
    Publication date: January 4, 2018
    Inventors: Hrvoje Benko, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian, Mark Hancock
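One plausible reading of the body-warping technique above is a rendering offset that grows as the physical hand approaches the physical prop; the linear interpolation ratio below is an assumption for illustration, not the patent's exact formula:

```python
import math

def warp_hand(physical_hand, start, physical_target, virtual_target):
    """Body-warping sketch: offset the rendered (virtual) hand so that it
    arrives at the virtual target exactly when the physical hand reaches
    the physical prop. Points are (x, y, z) tuples."""
    total = math.dist(start, physical_target)
    progress = 0.0 if total == 0 else 1.0 - math.dist(physical_hand, physical_target) / total
    progress = max(0.0, min(1.0, progress))  # clamp to [0, 1]
    # Blend in the prop-to-virtual-target offset proportionally to progress.
    return tuple(h + progress * (v - p)
                 for h, v, p in zip(physical_hand, virtual_target, physical_target))

# At the start, no warp; at the prop, the hand is drawn on the virtual target.
print(warp_hand((0, 0, 0), (0, 0, 0), (1, 0, 0), (1, 0.2, 0)))  # (0.0, 0.0, 0.0)
print(warp_hand((1, 0, 0), (0, 0, 0), (1, 0, 0), (1, 0.2, 0)))  # (1.0, 0.2, 0.0)
```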
  • Publication number: 20170371964
    Abstract: Aspects relate to observing various activities, interactions, behaviors, and other factors associated with a data exchange and creating one or more markers based on significant details associated with the observance. The one or more markers are retained and selectively rendered as a function of one or more conditions that should be satisfied before the marker is presented to the user. Some markers can contain parameters that should be satisfied in order for the marker to be considered complete. If a parameter is not satisfied, subsequent markers can be created as a function of the rendered marker. The subsequent markers can be rendered when a condition associated with the subsequent marker is satisfied.
    Type: Application
    Filed: October 11, 2013
    Publication date: December 28, 2017
    Inventors: Gur Kimchi, Stephen L. Lawler, Blaise H. Aguera y Arcas, Eyal Ofek
  • Patent number: 9805514
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Grant
    Filed: April 21, 2016
    Date of Patent: October 31, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Mark Hancock, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian
  • Publication number: 20170309071
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Application
    Filed: April 21, 2016
    Publication date: October 26, 2017
    Inventors: Hrvoje Benko, Mark Hancock, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian
  • Publication number: 20170287218
    Abstract: A method, computing device and head-mounted display device for manipulating a virtual object displayed via a display device are disclosed. In one example, image data of a physical environment comprising physical features is received. A three-dimensional model of at least a portion of the environment is generated. Candidate anchor features that each correspond to one of the physical features are extracted from the image data. User input is received that manipulates the virtual object as displayed within the environment. Based on the manipulation, a correspondence between a virtual anchor feature of the virtual object and a corresponding candidate anchor feature is identified. An indication of the corresponding candidate anchor feature at its corresponding physical feature within the environment is displayed to the user.
    Type: Application
    Filed: March 30, 2016
    Publication date: October 5, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Benjamin Nuernberger, Hrvoje Benko, Andrew Wilson, Eyal Ofek
  • Publication number: 20170192493
    Abstract: In some examples, a surface, such as a desktop, in front or around a portable electronic device may be used as a relatively large surface for interacting with the portable electronic device, which typically has a small display screen. A user may write or draw on the surface using any object such as a finger, pen, or stylus. The surface may also be used to simulate a partial or full size keyboard. The use of a camera to sense the three-dimensional (3D) location or motion of the object may enable use of above-the-surface gestures, entry of directionality, and capture of real objects into a document being processed or stored by the electronic device. One or more objects may be used to manipulate elements displayed by the portable electronic device.
    Type: Application
    Filed: January 4, 2016
    Publication date: July 6, 2017
    Inventors: Eyal Ofek, Michel Pahud, Pourang P. Irani
  • Patent number: 9679144
    Abstract: An “AR Privacy API” provides an API that allows applications and web browsers to use various content rendering abstractions to protect user privacy in a wide range of web-based immersive augmented reality (AR) scenarios. The AR Privacy API extends the traditional concept of “web pages” to immersive “web rooms” wherein any desired combination of existing or new 2D and 3D content is rendered within a user's room or other space. Advantageously, the AR Privacy API and associated rendering abstractions are useable by a wide variety of applications and web content for enhancing the user's room or other space with web-based immersive AR content. Further, the AR Privacy API is implemented using any existing or new web page coding platform, including, but not limited to HTML, XML, CSS, JavaScript, etc., thereby enabling existing web content and coding techniques to be smoothly integrated into a wide range of web room AR scenarios.
    Type: Grant
    Filed: November 15, 2013
    Date of Patent: June 13, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David Molnar, John Vilk, Eyal Ofek, Alexander Moshchuk, Jiahe Wang, Ran Gal, Lior Shapira, Douglas Christopher Burger, Blair MacIntyre, Benjamin Livshits
  • Publication number: 20170153741
    Abstract: A camera of a display device may be used to capture images of a hovering finger or stylus. Image processing techniques may be applied to the captured image to sense right-left position of the hovering finger or stylus. To measure distance to the hovering finger or stylus from the camera, a pattern may be displayed by the display so that the hovering finger or stylus is illuminated by a particular portion or color of the pattern over which the finger or stylus hovers. The image processing techniques may be used to determine, from the captured image, which particular portion or color of the pattern illuminates the finger or stylus. This determination, in conjunction with the known displayed pattern, may provide the 3D location or the distance to the hovering finger or stylus from the camera.
    Type: Application
    Filed: December 1, 2015
    Publication date: June 1, 2017
    Inventors: Eyal Ofek, Michel Pahud
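The displayed-pattern idea above can be illustrated with a hypothetical color encoding (red ramp for the column, green ramp for the row); the actual patent may use any pattern, and the encoding below is purely an assumption:

```python
def pattern_color(x, y, width, height):
    """Color the display would show at pixel (x, y): red encodes the
    column and green the row (a hypothetical encoding for illustration)."""
    return (round(x / (width - 1) * 255), round(y / (height - 1) * 255), 0)

def locate_finger(sampled_rgb, width, height):
    """Invert the known pattern: recover the (x, y) position the fingertip
    hovers over from the color that illuminates it in the camera image."""
    r, g, _ = sampled_rgb
    return (round(r / 255 * (width - 1)), round(g / 255 * (height - 1)))

# Round-trip: the color illuminating a finger over (120, 45) maps back to it.
print(locate_finger(pattern_color(120, 45, 640, 480), 640, 480))  # (120, 45)
```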
  • Publication number: 20170090666
    Abstract: A sensing device, such as a user-wearable device (UWD) worn by a user of a touchscreen, may provide kinematic data of the sensing device or UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 12, 2016
    Publication date: March 30, 2017
    Inventors: Michel Pahud, William Buxton, Kenneth P. Hinckley, Andrew M. Webb, Eyal Ofek
  • Patent number: 9594960
    Abstract: Video from a video camera can be integrated into a still image, with which it shares common elements, to provide greater context and understandability. Pre-processing can derive transformation parameters for transforming and aligning the video to be integrated into the still image in a visually fluid manner. The transformation parameters can then be utilized to transform and align the video in real-time and display it within the still image. Pre-processing can comprise stabilization of video, if the video camera is moveable, and can comprise identification of areas of motion and of static elements. Transformation parameters can be derived by fitting the static elements of the video to portions of one or more existing images. Display of the video in real-time in the still image can include display of the entire transformed and aligned video image, or of only selected sections, to provide for a smoother visual integration.
    Type: Grant
    Filed: September 14, 2010
    Date of Patent: March 14, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Billy Chen
  • Publication number: 20170010695
    Abstract: A user-wearable device (UWD) worn by a user of a touchscreen may provide kinematic data of the UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 8, 2015
    Publication date: January 12, 2017
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton, Eyal Ofek, Andrew M. Webb
  • Publication number: 20170010733
    Abstract: A user-wearable device (UWD) worn by a user of a touchscreen may provide kinematic data of the UWD and/or identification data of the user to a processor that operates the touchscreen. Such data may allow the processor to perform a number of user-touchscreen interactions, such as displaying user-specific windows or menus, processing user-manipulation of displayed objects, and determining which hand of a user performs a touch event, just to name a few examples.
    Type: Application
    Filed: December 8, 2015
    Publication date: January 12, 2017
    Inventors: Michel Pahud, Kenneth P. Hinckley, William Buxton, Eyal Ofek, Andrew M. Webb
  • Publication number: 20160371884
    Abstract: The described implementations relate to complementary augmented reality. One implementation is manifest as a system including a projector that can project a base image from an ancillary viewpoint into an environment. The system also includes a camera that can provide spatial mapping data for the environment and a display device that can display a complementary three-dimensional (3D) image to a user in the environment. In this example, the system can generate the complementary 3D image based on the spatial mapping data and the base image so that the complementary 3D image augments the base image and is dependent on a perspective of the user. The system can also update the complementary 3D image as the perspective of the user in the environment changes.
    Type: Application
    Filed: June 17, 2015
    Publication date: December 22, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Andrew D. Wilson, Eyal Ofek, Feng Zheng
  • Publication number: 20160342432
    Abstract: The claimed subject matter relates to an architecture that can provide for a second-person avatar. The second-person avatar can rely upon a second-person-based perspective such that the avatar is displayed to appear to encompass all or portions of a target user. Accordingly, actions or a configuration of the avatar can serve as a model or demonstration for the user in order to aid the user in accomplishing a particular task. Updates to avatar activity or configuration can be provided by a dynamic virtual handbook. The virtual handbook can be constructed based upon a set of instructions associated with accomplishing the desired task and further based upon features or aspects of the user as well as those of the local environment.
    Type: Application
    Filed: August 4, 2016
    Publication date: November 24, 2016
    Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Avi Bar-Zeev, Gur Kimchi, Jason Szabo
  • Patent number: 9480907
    Abstract: A primary display displays a primary image. A peripheral illusion is displayed around the primary display by an environmental display so that the peripheral illusion appears as an extension of the primary image.
    Type: Grant
    Filed: May 9, 2013
    Date of Patent: November 1, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Brett R. Jones, Eyal Ofek, Andrew Wilson, Gritsko Perez
  • Publication number: 20160267642
    Abstract: Various systems and methods for projecting a remote object are described herein. In one example, a method includes collecting environment data corresponding to a local environment in which a system is located and detecting a remote object corresponding to a remote user in a remote environment. The method can also include detecting a viewpoint of a local user in the local environment, and projecting the remote object corresponding to the remote user in the local environment based on the viewpoint of the local user, the virtual copy of the remote object to be positioned in the local environment by taking into account geometry of local objects in the local environment.
    Type: Application
    Filed: March 12, 2015
    Publication date: September 15, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Tomislav Pejsa, Andrew Wilson, Hrvoje Benko, Eyal Ofek, Julian Kantor
  • Patent number: 9436276
    Abstract: The claimed subject matter relates to an architecture that can provide for a second-person avatar. The second-person avatar can rely upon a second-person-based perspective such that the avatar is displayed to appear to encompass all or portions of a target user. Accordingly, actions or a configuration of the avatar can serve as a model or demonstration for the user in order to aid the user in accomplishing a particular task. Updates to avatar activity or configuration can be provided by a dynamic virtual handbook. The virtual handbook can be constructed based upon a set of instructions associated with accomplishing the desired task and further based upon features or aspects of the user as well as those of the local environment.
    Type: Grant
    Filed: February 25, 2009
    Date of Patent: September 6, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Avi Bar-Zeev, Gur Kimchi, Jason Szabo
  • Patent number: 9424676
    Abstract: Technologies are described herein for transitioning between a top-down map display of a reconstructed structure within a 3-D scene and an associated local-navigation display. An application transitions between the top-down map display and the local-navigation display by animating a view in a display window over a period of time while interpolating camera parameters from values representing a starting camera view to values representing an ending camera view. In one embodiment, the starting camera view is the top-down map display view and the ending camera view is the camera view associated with a target photograph. In another embodiment, the starting camera view is the camera view associated with a currently-viewed photograph in the local-navigation display and the ending camera view is the top-down map display.
    Type: Grant
    Filed: December 10, 2013
    Date of Patent: August 23, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Billy Chen, Eyal Ofek, David Maxwell Gedye, Jonathan Robert Dughi, Mark Ruane Dawson, Joshua Podolak
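The camera-parameter interpolation described in the last abstract can be sketched as a plain linear blend between the starting and ending views; the parameter names and the lerp choice are illustrative assumptions (a real system would typically ease the time value and slerp orientations):

```python
def lerp(a, b, t):
    """Linear interpolation between scalars a and b at time t in [0, 1]."""
    return a + t * (b - a)

def interpolate_camera(start, end, t):
    """Blend every camera parameter from the starting view toward the
    ending view; tuple-valued parameters (positions, look-at points) are
    blended component-wise."""
    out = {}
    for key, a in start.items():
        b = end[key]
        out[key] = (tuple(lerp(x, y, t) for x, y in zip(a, b))
                    if isinstance(a, tuple) else lerp(a, b, t))
    return out

# Hypothetical views: a top-down map camera and a photograph's camera.
top_down = {"pos": (0.0, 100.0, 0.0), "look": (0.0, 0.0, 0.0), "fov": 60.0}
photo    = {"pos": (5.0, 1.7, -3.0), "look": (5.0, 1.7, -4.0), "fov": 45.0}
mid = interpolate_camera(top_down, photo, 0.5)
print(mid["fov"])  # 52.5
```

Animating the display window then amounts to calling `interpolate_camera` once per frame with `t` advancing from 0 to 1 over the transition period.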