Patents by Inventor Hrvoje Benko

Hrvoje Benko has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180003990
    Abstract: An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head-mounted display.
    Type: Application
    Filed: May 6, 2013
    Publication date: January 4, 2018
    Inventors: Steven N. Bathiche, Hrvoje Benko, Stephen E. Hodges, Shahram Izadi, David Alexander Butler, William Ben Kunz, Shawn R. LeProwse
  • Patent number: 9857938
    Abstract: A unique system and method is provided that facilitates pixel-accurate targeting with respect to multi-touch sensitive displays when selecting or viewing content with a cursor. In particular, the system and method can track dual inputs from a primary finger and a secondary finger, for example. The primary finger can control movement of the cursor while the secondary finger can adjust a control-display ratio of the screen. As a result, cursor steering and selection of an assistance mode can be performed at about the same time or concurrently. In addition, the system and method can stabilize a cursor position at a top middle point of a user's finger in order to mitigate clicking errors when making a selection.
    Type: Grant
    Filed: December 20, 2013
    Date of Patent: January 2, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Andrew D. Wilson, Patrick M. Baudisch
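The dual-finger technique in patent 9857938 above lends itself to a brief illustration. The following sketch is a minimal, hypothetical rendering of the idea, assuming a three-level control-display (CD) gain selected by the secondary finger's vertical position; the function names, gain values, and zone layout are illustrative assumptions, not the patented design.

```python
# Hypothetical dual-finger precise-targeting sketch: the primary finger steers
# the cursor while the secondary finger selects a control-display (CD) gain
# that slows the cursor down for pixel-accurate selection.

def cd_gain_from_secondary(secondary_y, screen_height):
    """Map the secondary finger's vertical position to a CD gain:
    top third -> normal speed, middle third -> slow, bottom third -> very slow."""
    third = screen_height / 3
    if secondary_y < third:
        return 1.0
    if secondary_y < 2 * third:
        return 0.25
    return 0.05

def update_cursor(cursor, primary_delta, secondary_pos, screen_height):
    """Move the cursor by the primary finger's motion, scaled by the CD gain."""
    gain = cd_gain_from_secondary(secondary_pos[1], screen_height)
    dx, dy = primary_delta
    return (cursor[0] + dx * gain, cursor[1] + dy * gain)

# Example: in the finest mode a 10-pixel finger movement moves the cursor only
# half a pixel, which is what makes single-pixel targets reachable.
cursor = (100.0, 100.0)
print(update_cursor(cursor, (10, 0), (50, 700), screen_height=768))  # (100.5, 100.0)
```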
  • Patent number: 9857915
    Abstract: Described herein is an apparatus that includes a curved display surface that has an interior and an exterior. The curved display surface is configured to display images thereon. The apparatus also includes an emitter that emits light through the interior of the curved display surface. A detector component analyzes light reflected from the curved display surface to detect a position on the curved display surface where a first member is in physical contact with the exterior of the curved display surface.
    Type: Grant
    Filed: May 19, 2008
    Date of Patent: January 2, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Andrew Wilson, Ravin Balakrishnan
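Patent 9857915 describes detecting touch from light reflected back through the curved surface. The sketch below is a simplified, assumed realization for a single touch: threshold an infrared camera frame and take the centroid of the bright region as the contact position in image coordinates. Mapping that centroid onto the curved surface would additionally require a calibration step (for example, a lookup table), which is omitted here.

```python
# Assumed single-touch detector: infrared light reflected back from a fingertip
# appears as a bright region in the detector's camera frame; its centroid gives
# the contact position in image coordinates.
import numpy as np

def detect_touch(ir_frame, threshold=200):
    """Return the (row, col) centroid of bright reflected-light pixels,
    or None if nothing exceeds the threshold."""
    mask = ir_frame > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Example with a synthetic frame containing one bright spot.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:105, 300:306] = 255
print(detect_touch(frame))  # (102.0, 302.5)
```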
  • Publication number: 20170329446
    Abstract: A method, system, and one or more computer-readable storage media for providing multi-dimensional haptic touch screen interaction are provided herein. The method includes detecting a force applied to a touch screen by an object and determining a magnitude, direction, and location of the force. The method also includes determining a haptic force feedback to be applied by the touch screen on the object based on the magnitude, direction, and location of the force applied to the touch screen, and displacing the touch screen in a specified direction such that the haptic force feedback is applied by the touch screen on the object.
    Type: Application
    Filed: June 20, 2017
    Publication date: November 16, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Michael J. Sinclair, Michel Pahud, Hrvoje Benko
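To make the force-to-feedback pipeline in publication 20170329446 (and the granted patent 9715300 further down) more concrete, here is a hedged sketch in which the screen is displaced against the applied force in proportion to its magnitude. The Force data structure, the proportional gain, and the travel limit are assumptions for illustration, not values from the disclosure.

```python
# Assumed data structure and feedback rule: measure the applied force, then
# command a screen displacement that pushes back against the touching object.
from dataclasses import dataclass

@dataclass
class Force:
    magnitude: float        # newtons
    direction: tuple        # unit vector (x, y, z); +z points into the screen
    location: tuple         # (x, y) contact location in screen coordinates

def feedback_displacement(force, gain=0.002, max_travel=0.003):
    """Return a screen displacement (in metres) opposing the applied force,
    clipped to an assumed actuator travel range."""
    # (The contact location could additionally modulate the gain, e.g. near edges.)
    travel = min(gain * force.magnitude, max_travel)
    dx, dy, dz = force.direction
    return (-dx * travel, -dy * travel, -dz * travel)

press = Force(magnitude=1.5, direction=(0.0, 0.0, 1.0), location=(120, 340))
print(feedback_displacement(press))  # pushes back 3 mm along -z, toward the user
```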
  • Publication number: 20170330031
    Abstract: The cross-modal sensor fusion technique described herein tracks mobile devices and the users carrying them. The technique matches motion features from sensors on a mobile device to image motion features obtained from images of the device. For example, the acceleration of a mobile device, as measured by an onboard inertial measurement unit, is compared to similar acceleration observed in the color and depth images of a depth camera. The technique does not require a model of the appearance of either the user or the device, nor in many cases a direct line of sight to the device. The technique can operate in real time and can be applied to a wide variety of ubiquitous computing scenarios.
    Type: Application
    Filed: May 11, 2017
    Publication date: November 16, 2017
    Inventors: Andrew D. Wilson, Hrvoje Benko
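The matching step at the heart of publication 20170330031 (and patent 9679199 below) can be illustrated with a small amount of code. The sketch below correlates the acceleration magnitude reported by a device's inertial sensors with the acceleration derived from each camera-tracked position history and picks the best-correlated track; the function names and the use of a simple normalized correlation are assumptions, not the disclosed algorithm. In practice the two signals would also need to be resampled to a common rate and temporally aligned, which is omitted here.

```python
# Simplified matching sketch: correlate acceleration magnitude from the
# device's inertial sensors with the acceleration of each camera-tracked
# position history; the best-correlated track is taken to carry the device.
import numpy as np

def accel_magnitude_from_positions(positions, dt):
    """Acceleration magnitude series from a (T, 3) array of tracked positions."""
    velocity = np.diff(positions, axis=0) / dt
    acceleration = np.diff(velocity, axis=0) / dt
    return np.linalg.norm(acceleration, axis=1)

def match_device_to_track(imu_accel_mag, tracked_positions, dt):
    """imu_accel_mag: 1-D numpy array of IMU acceleration magnitudes.
    tracked_positions: list of (T, 3) position arrays, one per tracked person.
    Returns (index of the best-matching track, correlation scores)."""
    scores = []
    for positions in tracked_positions:
        cam_mag = accel_magnitude_from_positions(positions, dt)
        n = min(len(cam_mag), len(imu_accel_mag))
        a = imu_accel_mag[:n] - imu_accel_mag[:n].mean()
        b = cam_mag[:n] - cam_mag[:n].mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        scores.append(float(a @ b / denom) if denom > 0 else 0.0)
    return int(np.argmax(scores)), scores
```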
  • Patent number: 9805514
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Grant
    Filed: April 21, 2016
    Date of Patent: October 31, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Mark Hancock, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian
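The body-warping idea in patent 9805514 (and the related publication 20170309071 below) can be sketched as a simple interpolation: the rendered hand is offset toward the virtual object by an amount that grows as the physical hand approaches the physical prop, so both arrive together. The linear ramp below is an assumed realization, not the claimed method.

```python
# Assumed linear body-warping rule: the rendered (virtual) hand is offset
# toward the virtual object by a fraction that grows as the physical hand
# closes in on the physical prop.
import numpy as np

def warped_hand(physical_hand, physical_target, virtual_target, start_distance):
    """Position at which to render the virtual hand."""
    remaining = np.linalg.norm(physical_target - physical_hand)
    # Progress ramps from 0 at the start of the reach to 1 at the physical prop.
    alpha = np.clip(1.0 - remaining / start_distance, 0.0, 1.0)
    return physical_hand + alpha * (virtual_target - physical_target)

physical_target = np.array([0.4, 0.0, 0.3])   # where the physical prop sits
virtual_target = np.array([0.5, 0.1, 0.3])    # where the virtual object is shown
start = np.array([0.0, 0.0, 0.0])
start_distance = np.linalg.norm(physical_target - start)

# Halfway to the prop, half of the offset has been applied; at the prop the
# rendered hand coincides with the virtual object.
print(warped_hand(np.array([0.2, 0.0, 0.15]), physical_target, virtual_target, start_distance))
```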
  • Publication number: 20170309071
    Abstract: Dynamic haptic retargeting can be implemented using world warping techniques and body warping techniques. World warping is applied to improve an alignment between a virtual object and a physical object, while body warping is applied to redirect a user's motion to increase a likelihood that a physical hand will reach the physical object at the same time a virtual representation of the hand reaches the virtual object. Threshold values and/or a combination of world warping and body warping can be used to mitigate negative impacts that may be caused by using either technique excessively or independently.
    Type: Application
    Filed: April 21, 2016
    Publication date: October 26, 2017
    Inventors: Hrvoje Benko, Mark Hancock, Andrew D. Wilson, Eyal Ofek, Mahdi Azmandian
  • Publication number: 20170300170
    Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated based on that context. The context can also be used to label metadata.
    Type: Application
    Filed: July 1, 2017
    Publication date: October 19, 2017
    Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
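As a concrete, if toy, illustration of the grip-and-touch correlation described in publication 20170300170 (and patent 9727161 below), the rules below map a few combined pen and tablet sensor readings to coarse usage contexts. The grip labels, touch-size categories, and rules are invented for illustration; the actual correlation logic is not reproduced here.

```python
# Invented rule set: combine concurrent pen and tablet readings into a coarse
# usage context, which can then drive a context-appropriate interface action.

def infer_context(pen_grip, tablet_touch_size):
    """pen_grip: 'writing', 'tucked', or 'none'; tablet_touch_size: 'small' or 'large'."""
    if pen_grip == "writing" and tablet_touch_size == "large":
        return "palm_rest_while_writing"         # e.g. reject the palm contact
    if pen_grip == "tucked" and tablet_touch_size == "small":
        return "finger_touch_while_holding_pen"  # e.g. allow pan/zoom gestures
    if pen_grip == "none" and tablet_touch_size == "small":
        return "bare_hand_touch"
    return "unknown"

print(infer_context("writing", "large"))  # palm_rest_while_writing
```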
  • Publication number: 20170287218
    Abstract: A method, computing device and head-mounted display device for manipulating a virtual object displayed via a display device are disclosed. In one example, image data of a physical environment comprising physical features is received. A three-dimensional model of at least a portion of the environment is generated. Candidate anchor features that each correspond to one of the physical features are extracted from the image data. User input is received that manipulates the virtual object as displayed within the environment. Based on the manipulation, a correspondence between a virtual anchor feature of the virtual object and a corresponding candidate anchor feature is identified. An indication of the corresponding candidate anchor feature at its corresponding physical feature within the environment is displayed to the user.
    Type: Application
    Filed: March 30, 2016
    Publication date: October 5, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Benjamin Nuernberger, Hrvoje Benko, Andrew Wilson, Eyal Ofek
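One plausible way to realize the anchor-feature matching in publication 20170287218 is a nearest-feature heuristic, sketched below: each candidate physical feature extracted from the 3-D model is scored by its distance to the manipulated virtual anchor and by how well their orientations agree. The scoring weights and the heuristic itself are assumptions rather than the disclosed method.

```python
# Nearest-feature heuristic (an assumption): score each candidate physical
# anchor by distance to the virtual anchor and by orientation agreement.
import numpy as np

def best_candidate(virtual_pos, virtual_normal, candidates,
                   distance_weight=1.0, orientation_weight=0.2):
    """candidates: list of (position, unit_normal) pairs extracted from the 3-D model."""
    best_index, best_score = None, float("inf")
    for index, (position, normal) in enumerate(candidates):
        distance = np.linalg.norm(position - virtual_pos)
        misalignment = 1.0 - float(np.dot(normal, virtual_normal))
        score = distance_weight * distance + orientation_weight * misalignment
        if score < best_score:
            best_index, best_score = index, score
    return best_index

# Example: a tabletop (normal +z) is preferred over a nearby wall (normal +x)
# for a virtual object whose anchor face points down onto horizontal surfaces.
table = (np.array([0.0, 0.0, 0.7]), np.array([0.0, 0.0, 1.0]))
wall = (np.array([0.2, 0.0, 1.0]), np.array([1.0, 0.0, 0.0]))
print(best_candidate(np.array([0.1, 0.0, 0.75]), np.array([0.0, 0.0, 1.0]), [table, wall]))  # 0
```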
  • Publication number: 20170285344
    Abstract: Various technologies described herein pertain to a head mounted display device having a display with a central portion and a periphery portion. Graphical content can be displayed on the central portion of the display. The central portion can be a primary display that provides a field of view and displays the graphical content, and the periphery portion can be a peripheral display. The peripheral display can be positioned relative to the primary display such that an overall field of view provided by the primary display and the peripheral display is extended compared to the field of view of the primary display. Further, complementary content can be rendered based on the graphical content and caused to be displayed on the periphery portion (e.g., the peripheral display). The complementary content can include a countervection visualization viewable in a far periphery region of a field of view of human vision.
    Type: Application
    Filed: March 29, 2016
    Publication date: October 5, 2017
    Inventors: Hrvoje Benko, Bo Robert Xiao
  • Patent number: 9727161
    Abstract: Pen and computing device sensor correlation technique embodiments correlate sensor signals received from various grips on a touch-sensitive pen and touches to a touch-sensitive computing device in order to determine the context of such grips and touches and to issue context-appropriate commands to the touch-sensitive pen or the touch-sensitive computing device. A combination of concurrent sensor inputs received from both a touch-sensitive pen and a touch-sensitive computing device is correlated. How the touch-sensitive pen and the touch-sensitive computing device are touched or gripped is used to determine the context of their use and the user's intent. A context-appropriate user interface action can then be initiated based on that context. The context can also be used to label metadata.
    Type: Grant
    Filed: June 12, 2014
    Date of Patent: August 8, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Ken Hinckley, Hrvoje Benko, Michel Pahud, Andrew D. Wilson, Pourang Polad Irani, Francois Guimbretiere
  • Patent number: 9715300
    Abstract: A method, system, and one or more computer-readable storage media for providing multi-dimensional haptic touch screen interaction are provided herein. The method includes detecting a force applied to a touch screen by an object and determining a magnitude, direction, and location of the force. The method also includes determining a haptic force feedback to be applied by the touch screen on the object based on the magnitude, direction, and location of the force applied to the touch screen, and displacing the touch screen in a specified direction such that the haptic force feedback is applied by the touch screen on the object.
    Type: Grant
    Filed: March 4, 2013
    Date of Patent: July 25, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael J. Sinclair, Michel Pahud, Hrvoje Benko
  • Publication number: 20170201722
    Abstract: A tele-immersive environment is described that provides interaction among participants of a tele-immersive session. The environment includes two or more set-ups, each associated with a participant. Each set-up, in turn, includes mirror functionality for presenting a three-dimensional virtual space for viewing by a local participant. The virtual space shows at least some of the participants as if the participants were physically present at a same location and looking into a mirror. The mirror functionality can be implemented as a combination of a semi-transparent mirror and a display device, or just a display device acting alone. According to another feature, the environment may present a virtual object in a manner that allows any of the participants of the tele-immersive session to interact with the virtual object.
    Type: Application
    Filed: March 28, 2017
    Publication date: July 13, 2017
    Inventors: Andrew D. Wilson, Zhengyou Zhang, Philip A. Chou, Neil S. Fishman, Donald M. Gillett, Hrvoje Benko
  • Patent number: 9703398
    Abstract: A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement relative to the pointing device of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device are generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device.
    Type: Grant
    Filed: June 16, 2009
    Date of Patent: July 11, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David Alexander Butler, Nicolas Villar, John Helmes, Shahram Izadi, Stephen E. Hodges, Daniel Rosenfeld, Hrvoje Benko
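The image-analysis step in patent 9703398 can be illustrated with a minimal sketch: detect the nearby object in each proximity-sensor frame as a thresholded blob and report the frame-to-frame motion of its centroid relative to the device. The thresholding and centroid tracking are a simplification of this listing's editor, not the patented analysis.

```python
# Simplified analysis of successive proximity-sensor images: find the nearby
# object as a thresholded blob and report the motion of its centroid relative
# to the pointing device.
import numpy as np

def object_centroid(sensor_image, threshold=0.5):
    """Centroid (row, col) of pixels whose proximity response exceeds the threshold."""
    mask = sensor_image > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return np.array([rows.mean(), cols.mean()])

def object_motion(previous_image, current_image):
    """Frame-to-frame motion of the detected object, in sensor pixels."""
    previous = object_centroid(previous_image)
    current = object_centroid(current_image)
    if previous is None or current is None:
        return None
    return current - previous
```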
  • Patent number: 9696427
    Abstract: Embodiments for a depth sensing camera with a wide field of view are disclosed. In one example, a depth sensing camera comprises an illumination light projection subsystem, an image detection subsystem configured to acquire image data having a wide angle field of view, a logic subsystem configured to execute instructions, and a data-holding subsystem comprising stored instructions executable by the logic subsystem to control projection of illumination light and to determine depth values from image data acquired via the image sensor. The image detection subsystem comprises an image sensor and one or more lenses.
    Type: Grant
    Filed: August 14, 2012
    Date of Patent: July 4, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Wilson, Hrvoje Benko, Jay Kapur, Stephen Edward Hodges
  • Patent number: 9679199
    Abstract: The cross-modal sensor fusion technique described herein tracks mobile devices and the users carrying them. The technique matches motion features from sensors on a mobile device to image motion features obtained from images of the device. For example, the acceleration of a mobile device, as measured by an onboard inertial measurement unit, is compared to similar acceleration observed in the color and depth images of a depth camera. The technique does not require a model of the appearance of either the user or the device, nor in many cases a direct line of sight to the device. The technique can operate in real time and can be applied to a wide variety of ubiquitous computing scenarios.
    Type: Grant
    Filed: December 4, 2013
    Date of Patent: June 13, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Hrvoje Benko
  • Patent number: 9641805
    Abstract: A tele-immersive environment is described that provides interaction among participants of a tele-immersive session. The environment includes two or more set-ups, each associated with a participant. Each set-up, in turn, includes mirror functionality for presenting a three-dimensional virtual space for viewing by a local participant. The virtual space shows at least some of the participants as if the participants were physically present at a same location and looking into a mirror. The mirror functionality can be implemented as a combination of a semi-transparent mirror and a display device, or just a display device acting alone. According to another feature, the environment may present a virtual object in a manner that allows any of the participants of the tele-immersive session to interact with the virtual object.
    Type: Grant
    Filed: March 18, 2016
    Date of Patent: May 2, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew D. Wilson, Zhengyou Zhang, Philip A. Chou, Neil S. Fishman, Donald M. Gillett, Hrvoje Benko
  • Publication number: 20170115782
    Abstract: By correlating user grip information with micro-mobility events, electronic devices can provide support for a broad range of interactions and contextually dependent techniques. Such correlation allows electronic devices to better identify device usage contexts, and in turn provide a more responsive and helpful user experience, especially in the context of reading and task performance. To allow for accurate and efficient device usage context identification, a model may be used to make device usage context determinations based on the correlated gesture and micro-mobility data. Once a context, device usage context, or gesture is identified, an action can be taken on one or more electronic devices.
    Type: Application
    Filed: October 23, 2015
    Publication date: April 27, 2017
    Inventors: Kenneth P. Hinckley, Hrvoje Benko, Michel Pahud, Dongwook Yoon
  • Patent number: 9569094
    Abstract: An input device has both a touch sensor and a position sensor. A computer using data from the input device uses the relative motion of a contact on a touch sensor with respect to motion from a position detector to disambiguate intentional from incidental motion. The input device provides synchronized position sensor and touch sensor data to the computer to permit processing the relative motion and performing other computations on both position sensor and touch sensor data. The input device can encode the magnitude and direction of motion from the position sensor, combine it with the touch sensor data from the same time frame, and output the synchronized data to the computer.
    Type: Grant
    Filed: July 21, 2014
    Date of Patent: February 14, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Christopher Stoumbos, John Miller, Robert Young, Hrvoje Benko, David Perek, Peter Ansell, Oyvind Haehre
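A rough rendering of the disambiguation rule in patent 9569094: treat a finger movement on the touch surface as intentional only if it is large relative to the device's own motion over the same synchronized time frame. The thresholds and the specific test below are assumptions for illustration.

```python
# Assumed disambiguation rule: a contact movement on the touch sensor counts as
# intentional only if it is large relative to how far the device itself moved
# over the same, synchronized time frame.
import math

def classify_contact_motion(contact_delta, device_delta,
                            min_contact_px=3.0, ratio_threshold=0.5):
    """contact_delta: finger motion on the touch sensor (pixels);
    device_delta: device motion from the position sensor, same time frame."""
    contact_mag = math.hypot(*contact_delta)
    device_mag = math.hypot(*device_delta)
    if contact_mag < min_contact_px:
        return "no_gesture"
    if device_mag > 0 and contact_mag < ratio_threshold * device_mag:
        return "incidental"   # the finger dragged slightly because the device moved
    return "intentional"      # a deliberate swipe or scroll on the touch surface

print(classify_contact_motion((2.0, 1.0), (40.0, 0.0)))   # no_gesture
print(classify_contact_motion((6.0, 0.0), (40.0, 0.0)))   # incidental
print(classify_contact_motion((25.0, 5.0), (4.0, 1.0)))   # intentional
```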
  • Publication number: 20170024370
    Abstract: An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.
    Type: Application
    Filed: May 18, 2016
    Publication date: January 26, 2017
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Björn U. Hartmann, Andrew D. Wilson, Hrvoje Benko, Meredith J. Morris
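Finally, the per-device frame of reference mentioned in publication 20170024370 can be sketched as a rotation of each mouse's motion from the device's local frame into tabletop coordinates, so that "forward" on the mouse moves that user's cursor away from them regardless of where they sit around the table. The rotation-based mapping is an assumed realization, not the disclosed mechanism.

```python
# Assumed rotation-based mapping: interpret each mouse's motion in the
# coordinate frame of the physical device as it sits on the tabletop.
import math

def to_surface_coords(mouse_delta, device_heading_radians):
    """Rotate a mouse delta from the device's local frame into table coordinates."""
    dx, dy = mouse_delta
    c, s = math.cos(device_heading_radians), math.sin(device_heading_radians)
    return (dx * c - dy * s, dx * s + dy * c)

# A user seated across the table (device rotated 180 degrees) pushes the mouse
# forward; in table coordinates the cursor moves toward that user's far edge.
print(to_surface_coords((0.0, 10.0), math.pi))  # approximately (0.0, -10.0)
```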