Patents by Inventor Hrvoje Benko
Hrvoje Benko has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 8619049
Abstract: One or more techniques and/or systems are provided for monitoring interactions by an input object with an interactive interface projected onto an interface object. That is, an input object (e.g., a finger) and an interface object (e.g., a wall, a hand, a notepad, etc.) may be identified and tracked in real-time using depth data (e.g., depth data extracted from images captured by a depth camera). An interactive interface (e.g., a calculator, an email program, a keyboard, etc.) may be projected onto the interface object, such that the input object may be used to interact with the interactive interface. For example, the input object may be tracked to determine whether the input object is touching or hovering above the interface object and/or a projected portion of the interactive interface. If the input object is in a touch state, then a corresponding event associated with the interactive interface may be invoked.
Type: Grant
Filed: May 17, 2011
Date of Patent: December 31, 2013
Assignee: Microsoft Corporation
Inventors: Chris Harrison, Hrvoje Benko, Andrew David Wilson
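The touch-versus-hover decision described in this abstract reduces to comparing the depth of the input object against the depth of the interface surface at the same location. A minimal sketch of that idea, assuming per-pixel depths in millimeters and hypothetical threshold values that do not come from the patent:

```python
def classify_input_state(finger_depth_mm, surface_depth_mm,
                         touch_threshold_mm=10, hover_threshold_mm=60):
    """Classify an input object's state relative to a projected interface.

    The input object (e.g., a fingertip) is 'touch' when its depth is
    within touch_threshold_mm of the surface, 'hover' when it is above
    the surface but within hover_threshold_mm, and 'away' otherwise.
    """
    gap = surface_depth_mm - finger_depth_mm  # distance above the surface
    if gap < 0:
        gap = 0  # sensor noise can place the finger "behind" the surface
    if gap <= touch_threshold_mm:
        return "touch"
    if gap <= hover_threshold_mm:
        return "hover"
    return "away"
```

A touch result would then invoke the event bound to whatever interface element is projected at that point.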
-
Publication number: 20130342525
Abstract: Methods, systems, and computer-readable media providing focal feedback and control in a three-dimensional display. Focal anchors are provided at different depths and used to determine at what depth the user is currently focusing. The focal anchors are also used to receive input from the user. By looking at a focal anchor, the user can cause the portion of content associated with the focal anchor to be displayed more prominently relative to content displayed at different depths. In one embodiment, predictive feedback is provided at a depth associated with one of the focal anchors.
Type: Application
Filed: June 22, 2012
Publication date: December 26, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Hrvoje Benko, Daniel Chaim Robbins
-
Publication number: 20130343639
Abstract: An automatic handwriting morphing and modification system and method for digitally altering the handwriting of a user while maintaining the overall appearance and style of the user's handwriting. Embodiments of the system and method do not substitute or replace characters or words but instead morph and modify the user's handwritten strokes to retain a visual correlation between the original user's handwriting and the morphed and modified version of the user's handwriting. Embodiments of the system and method take as input the user's handwriting and a set of morph rules that determine what the user's handwritten strokes should more closely resemble after processing. Morphs, which are a specific type or appearance of a handwritten stroke, are selected based on the target handwriting. The selected morphs are applied using geometric tuning, semantic tuning, or both. The result is a morphed and modified version of the user's handwriting.
Type: Application
Filed: June 20, 2012
Publication date: December 26, 2013
Applicant: Microsoft Corporation
Inventors: Hrvoje Benko, Benoit Barabe
-
Publication number: 20130335594
Abstract: Captured data is obtained, including various types of captured or recorded data (e.g., image data, audio data, video data, etc.) and/or metadata describing various aspects of the capture device and/or the manner in which the data is captured. One or more elements of the captured data that can be replaced by one or more substitute elements are determined, the replaceable elements are removed from the captured data, and links to the substitute elements are associated with the captured data. Links to additional elements to enhance the captured data are also associated with the captured data. Enhanced content can subsequently be constructed based on the captured data as well as the links to the substitute elements and additional elements.
Type: Application
Filed: June 18, 2012
Publication date: December 19, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Hrvoje Benko, Paul Henry Dietz, Stephen G. Latta, Kevin Geisner, Steven Nabil Bathiche
-
Publication number: 20130300668
Abstract: Grip-based device adaptations are described in which a touch-aware skin of a device is employed to adapt device behavior in various ways. The touch-aware skin may include a plurality of sensors from which a device may obtain input and decode the input to determine grip characteristics indicative of a user's grip. On-screen keyboards and other input elements may then be configured and located in a user interface according to a determined grip. In at least some embodiments, a gesture defined to facilitate selective launch of an on-screen input element may be recognized and used in conjunction with grip characteristics to launch the on-screen input element in dependence upon grip. Additionally, touch and gesture recognition parameters may be adjusted according to a determined grip to reduce misrecognition.
Type: Application
Filed: May 20, 2013
Publication date: November 14, 2013
Inventors: Anatoly Churikov, Catherine N. Boulanger, Hrvoje Benko, Luis E. Cabrera-Cordon, Paul Henry Dietz, Steven Nabil Bathiche, Kenneth P. Hinckley
-
Patent number: 8581856
Abstract: Described herein is a system that includes a receiver component that receives gesture data from a sensor unit that is coupled to a body of a gloveless user, wherein the gesture data is indicative of a bodily gesture of the user, wherein the bodily gesture comprises movement pertaining to at least one limb of the gloveless user. The system further includes a location determiner component that determines location of the bodily gesture with respect to a touch-sensitive display apparatus. The system also includes a display component that causes the touch-sensitive display apparatus to display an image based at least in part upon the received gesture data and the determined location of the bodily gesture with respect to the touch-sensitive display apparatus.
Type: Grant
Filed: May 27, 2009
Date of Patent: November 12, 2013
Assignee: Microsoft Corporation
Inventors: Hrvoje Benko, Desney S. Tan, Daniel Morris, Timothy Scott Saponas
-
Publication number: 20130295539
Abstract: Physical movement of a human subject may be guided by a visual cue. A physical environment may be observed to identify a current position of a body portion of the human subject. A model path of travel may be obtained for the body portion of the human subject. The visual cue may be projected onto the human subject and/or into a field of view of the human subject. The visual cue may indicate the model path of travel for the body portion of the human subject.
Type: Application
Filed: May 3, 2012
Publication date: November 7, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Andrew Wilson, Hrvoje Benko, Rajinder Sodhi
-
Publication number: 20130286223
Abstract: Photos are shared among devices that are in close proximity to one another and for which there is a connection among the devices. The photos can be shared automatically, or alternatively based on various user inputs. Various different controls can also be placed on sharing photos to restrict the other devices with which photos can be shared, the manner in which photos can be shared, and/or how the photos are shared.
Type: Application
Filed: April 25, 2012
Publication date: October 31, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Stephen G. Latta, Kenneth P. Hinckley, Kevin Geisner, Steven Nabil Bathiche, Hrvoje Benko, Vivek Pradeep
-
Publication number: 20130257777
Abstract: A "Motion and Context Sharing Technique" uses a pen or stylus enhanced to incorporate multiple sensors, i.e., a "sensor pen," and a power supply to enable various input techniques and gestures. Various combinations of pen stroke, pressure, motion, and other sensor pen inputs are used to enable various hybrid input techniques that incorporate simultaneous, concurrent, sequential, and/or interleaved sensor pen inputs and touch inputs (i.e., finger, palm, hand, etc.) on displays or other touch sensitive surfaces. This enables a variety of motion-gesture inputs relating to the context of how the sensor pen is used or held, even when the pen is not in contact or within sensing range of the computing device digitizer. In other words, any particular touch inputs or combinations of touch inputs are correlated with any desired sensor pen inputs, with those correlated inputs then being used to initiate any desired action by the computing device.
Type: Application
Filed: May 28, 2013
Publication date: October 3, 2013
Applicant: Microsoft Corporation
Inventors: Hrvoje Benko, Xiang Chen, Kenneth Paul Hinckley
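The core of the correlation step this abstract describes is a mapping from (touch input, sensor-pen context) pairs to actions. A minimal sketch of that dispatch, where every name and binding is hypothetical and chosen only to illustrate the idea:

```python
# Hypothetical bindings from (touch input, pen context) pairs to actions.
# None of these names or pairings come from the patent itself.
ACTION_TABLE = {
    ("one_finger_drag", "pen_held_ready"): "highlight",
    ("one_finger_drag", "pen_stowed"): "pan",
    ("two_finger_pinch", "pen_stowed"): "zoom",
}

def resolve_action(touch_input, pen_context):
    """Correlate a touch input with the sensor pen's current context.

    Returns the action bound to the (touch, pen) combination, falling
    back to a default touch behavior when no binding exists.
    """
    return ACTION_TABLE.get((touch_input, pen_context), "default_touch")
```

Because the pen's context is sensed independently of the digitizer, such bindings can differ even when the pen is out of sensing range of the display.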
-
Publication number: 20130241806
Abstract: An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head mounted display.
Type: Application
Filed: May 6, 2013
Publication date: September 19, 2013
Applicant: Microsoft Corporation
Inventors: Steven N. Bathiche, Hrvoje Benko, Stephen E. Hodges, Shahram Izadi, David Alexander Butler, William Ben Kunz, Shawn R. LeProwse
-
Publication number: 20130234992
Abstract: In some implementations, a touch point on a surface of a touchscreen device may be determined. An image of a region of space above the surface and surrounding the touch point may be determined. The image may include a brightness gradient that captures a brightness of objects above the surface. A binary image that includes one or more binary blobs may be created based on a brightness of portions of the image. A determination may be made as to which of the one or more binary blobs are connected to each other to form portions of a particular user. A determination may be made that the particular user generated the touch point.
Type: Application
Filed: April 22, 2013
Publication date: September 12, 2013
Applicant: Microsoft Corporation
Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
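The blob-connectivity step in this abstract is a connected-component question: two touch points belong to the same user when the binary blobs containing them are connected. A minimal sketch, assuming a brightness image as a 2D list of intensities and a hypothetical binarization threshold:

```python
def connected_component(binary, start):
    """Return the set of on-pixels 4-connected to `start` in a binary grid."""
    rows, cols = len(binary), len(binary[0])
    if not binary[start[0]][start[1]]:
        return set()
    seen, stack = {start}, [start]
    while stack:
        r, c = stack.pop()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and binary[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return seen

def same_user(brightness, touch_a, touch_b, threshold=128):
    """Decide whether two touch points trace back to one connected body.

    Pixels brighter than `threshold` become binary blobs; two touches
    are attributed to the same user when their blobs are connected.
    """
    binary = [[px > threshold for px in row] for row in brightness]
    return touch_b in connected_component(binary, touch_a)
```

A bright arm and hand above the surface form one chain of blobs, so a touch point at the fingertip connects back to the body that produced it.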
-
Publication number: 20130229353
Abstract: An interaction management module (IMM) is described for allowing users to engage an interactive surface in a collaborative environment using various input devices, such as keyboard-type devices and mouse-type devices. The IMM displays digital objects on the interactive surface that are associated with the devices in various ways. The digital objects can include input display interfaces, cursors, soft-key input mechanisms, and so on. Further, the IMM provides a mechanism for establishing a frame of reference for governing the placement of each cursor on the interactive surface. Further, the IMM provides a mechanism for allowing users to make a digital copy of a physical article placed on the interactive surface. The IMM also provides a mechanism which duplicates actions taken on the digital copy with respect to the physical article, and vice versa.
Type: Application
Filed: March 27, 2013
Publication date: September 5, 2013
Applicant: Microsoft Corporation
Inventors: Björn U. Hartmann, Andrew D. Wilson, Hrvoje Benko, Meredith J. Morris
-
Publication number: 20130215454
Abstract: Three-dimensional printing techniques are described. In one or more implementations, a system includes a three-dimensional printer and a computing device. The three-dimensional printer has a three-dimensional printing mechanism that is configured to form a physical object in three dimensions. The computing device is communicatively coupled to the three-dimensional printer and includes a three-dimensional printing module implemented at least partially in hardware to cause the three-dimensional printer to form the physical object in three dimensions as having functionality configured to communicate with a computing device.
Type: Application
Filed: February 21, 2012
Publication date: August 22, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Desney S. Tan, Hrvoje Benko, Stephen G. Latta, Steven Nabil Bathiche, Kevin Geisner, Kenneth P. Hinckley
-
Patent number: 8514188
Abstract: A method of controlling a virtual object within a virtual workspace includes recognizing a hand posture of an initial touch gesture directed to a touch-input receptor, and a mode constraint is set based on the hand posture. The mode constraint specifies a constrained parameter of a virtual object that is to be maintained responsive to a subsequent touch gesture. The method further includes recognizing a subsequent touch gesture directed to the touch-input receptor. An unconstrained parameter of the virtual object is modulated responsive to the subsequent touch gesture while the constrained parameter of the virtual object is maintained in accordance with the mode constraint.
Type: Grant
Filed: December 30, 2009
Date of Patent: August 20, 2013
Assignee: Microsoft Corporation
Inventors: Paul Armistead Hoover, Maxim Oustiogov, Daniel J. Wigdor, Hrvoje Benko, Jarrod Lombardo
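The mode-constraint mechanism this abstract describes can be sketched as a small state machine: the initial posture pins one parameter, and subsequent gestures modulate only the others. The posture names and parameter set below are hypothetical, chosen only to make the pattern concrete:

```python
# Hypothetical posture-to-constraint bindings, for illustration only.
POSTURE_CONSTRAINTS = {
    "flat_hand": "rotation",  # flat hand pins rotation; drags only translate
    "pinch": "position",      # pinch pins position; drags only rotate
}

class VirtualObject:
    def __init__(self):
        self.params = {"position": 0.0, "rotation": 0.0}
        self.constrained = None

    def begin_gesture(self, hand_posture):
        """Set the mode constraint from the initial touch's hand posture."""
        self.constrained = POSTURE_CONSTRAINTS.get(hand_posture)

    def apply_gesture(self, param, delta):
        """Modulate a parameter unless the mode constraint pins it."""
        if param != self.constrained:
            self.params[param] += delta
```

The constrained parameter is simply never updated while the constraint is active, which is all "maintained in accordance with the mode constraint" requires.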
-
Publication number: 20130201095
Abstract: Techniques involving presentations are described. In one or more implementations, a user interface is output by a computing device that includes a slide of a presentation, the slide having an object that is output for display in three dimensions. Responsive to receipt of one or more inputs by the computing device, how the object in the slide is output for display in the three dimensions is altered.
Type: Application
Filed: February 7, 2012
Publication date: August 8, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Paul Henry Dietz, Vivek Pradeep, Stephen G. Latta, Kenneth P. Hinckley, Hrvoje Benko, Alice Jane Bernheim Brush
-
Publication number: 20130181902
Abstract: Skinnable touch device grip pattern techniques are described herein. A touch-aware skin may be configured to substantially cover the outer surfaces of a computing device. The touch-aware skin may include a plurality of skin sensors configured to detect interaction with the skin at defined locations. The computing device may include one or more modules operable to obtain input from the plurality of skin sensors and decode the input to determine grip patterns that indicate how the computing device is being held by a user. Various functionality provided by the computing device may be selectively enabled and/or adapted based on a determined grip pattern such that the provided functionality may change to match the grip pattern.
Type: Application
Filed: January 17, 2012
Publication date: July 18, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Kenneth P. Hinckley, Paul Henry Dietz, Hrvoje Benko, Desney S. Tan, Steven Nabil Bathiche
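The decode-then-adapt pipeline in this abstract can be sketched in two steps: map activated skin-sensor zones to a grip pattern, then map the grip pattern to a UI adaptation. The zone layout, grip labels, and adaptations below are hypothetical placeholders, not the patent's actual scheme:

```python
def decode_grip(sensor_hits):
    """Decode a grip pattern from the set of activated skin-sensor zones.

    `sensor_hits` is a set of zone names such as "left_edge" or
    "right_edge" (a hypothetical zone layout).
    """
    left = "left_edge" in sensor_hits
    right = "right_edge" in sensor_hits
    if left and right:
        return "two_handed"
    if left:
        return "left_hand"
    if right:
        return "right_hand"
    return "not_held"

def adapt_ui(grip):
    """Select a hypothetical UI adaptation matching the decoded grip."""
    return {
        "two_handed": "split_keyboard",
        "left_hand": "controls_on_left",
        "right_hand": "controls_on_right",
        "not_held": "docked_layout",
    }[grip]
```

A real skin would report many sensels rather than coarse zones, but the decode/adapt split is the same.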
-
Publication number: 20130182892
Abstract: Methods, systems, and computer-readable media for establishing an ad hoc network of devices that can be used to interpret gestures. Embodiments of the invention use a network of sensors with an ad hoc spatial configuration to observe physical objects in a performance area. The performance area may be a room or other area within range of the sensors. Initially, devices within the performance area, or with a view of the performance area, are identified. Once identified, the sensors go through a discovery phase to locate devices within an area. Once the discovery phase is complete and the devices within the ad hoc network are located, the combined signals received from the devices may be used to interpret gestures made within the performance area.
Type: Application
Filed: January 18, 2012
Publication date: July 18, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Eric Horvitz, Kenneth P. Hinckley, Hrvoje Benko
-
Patent number: 8446367
Abstract: Technologies for a camera-based multi-touch input device operable to provide conventional mouse movement data as well as three-dimensional multi-touch data. Such a device is based on an internal camera focused on a mirror or set of mirrors enabling the camera to image the inside of a working surface of the device. The working surface allows light to pass through. An internal light source illuminates the inside of the working surface and reflects off of any objects proximate to the outside of the device. This reflected light is received by the mirror and then directed to the camera. Imaging from the camera can be processed to extract touch points corresponding to the position of one or more objects outside the working surface as well as to detect gestures performed by the objects. Thus the device can provide conventional mouse functionality as well as three-dimensional multi-touch functionality.
Type: Grant
Filed: April 17, 2009
Date of Patent: May 21, 2013
Assignee: Microsoft Corporation
Inventors: Hrvoje Benko, Daniel Allen Rosenfeld, Eyal Ofek, Billy Chen, Shahram Izadi, Nicolas Villar, John Helmes
-
Patent number: 8436789
Abstract: An image orientation system is provided wherein images (rays of light) are projected to a user based on the user's field of view or viewing angle. As the rays of light are projected, streams of air can be produced that bend or focus the rays of light toward the user's field of view. The streams of air can be cold air, hot air, or combinations thereof. Further, an image receiver can be utilized to receive the produced image/rays of light directly in line with the user's field of view. The image receiver can be a wearable device, such as a head mounted display.
Type: Grant
Filed: January 16, 2009
Date of Patent: May 7, 2013
Assignee: Microsoft Corporation
Inventors: Steven N. Bathiche, Hrvoje Benko, Stephen E. Hodges, Shahram Izadi, David Alexander Butler, William Ben Kunz, Shawn R. LeProwse
-
Patent number: 8432366
Abstract: The claimed subject matter provides a system and/or a method that facilitates distinguishing input among one or more users in a surface computing environment. A variety of information can be obtained and analyzed to infer an association between a particular input and a particular user. Touch point information can be acquired from a surface, wherein the touch point information relates to a touch point. In addition, one or more environmental sensors can monitor the surface computing environment and provide environmental information. The touch point information and the environmental information can be analyzed to determine direction of inputs, location of users, movement of users, and so on. Individual analysis results can be correlated and/or aggregated to generate an inference of association between a touch point and user.
Type: Grant
Filed: March 3, 2009
Date of Patent: April 30, 2013
Assignee: Microsoft Corporation
Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
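The final correlation step this abstract describes, attributing a touch point to a user, can be sketched with a single sensed cue. The sketch below uses only user position and nearest-distance assignment; a real system would aggregate more cues (input direction, user movement), and all names here are hypothetical:

```python
import math

def associate_touch(touch_xy, users):
    """Associate a touch point with the nearest tracked user.

    `users` maps a user id to that user's sensed (x, y) position around
    the surface. Distance to the touch point stands in for the fuller
    aggregation of environmental cues described above.
    """
    def dist(uid):
        ux, uy = users[uid]
        return math.hypot(touch_xy[0] - ux, touch_xy[1] - uy)
    return min(users, key=dist)
```

Replacing `dist` with a weighted score over several cues turns this nearest-neighbor rule into the correlated/aggregated inference the abstract describes.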