Patents by Inventor Relja Markovic
Relja Markovic has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9063566
Abstract: Various embodiments are provided for a shared collaboration system and related methods for enabling an active user to interact with one or more additional users and with collaboration items. In one embodiment a head-mounted display device is operatively connected to a computing device that includes a collaboration engine program. The program receives observation information of a physical space from the head-mounted display device along with a collaboration item. The program visually augments an appearance of the physical space as seen through the head-mounted display device to include an active user collaboration item representation of the collaboration item. The program populates the active user collaboration item representation with additional user collaboration item input from an additional user.
Type: Grant
Filed: November 30, 2011
Date of Patent: June 23, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Daniel McCulloch, Stephen Latta, Darren Bennett, Ryan Hastings, Jason Scott, Relja Markovic, Kevin Geisner, Jonathan Steed
-
Publication number: 20150154782
Abstract: In applications that display a representation of a user, it may be reasonable to insert a pre-canned animation rather than animating a user's captured motion. For example, in a tennis swing, the ball toss and take-back in a serve could be a pre-canned animation, whereas the actual forward swing may be mapped from the user's gestures. Animations of a user's gestures can be chained together into sequences with pre-canned animations, where animation blending techniques can provide a smoother transition between the animation types. Techniques for blending animations, which may comprise determining boundaries and transition points between pre-canned animations and animations based on captured motion, may improve animation efficiency. Gesture history, including joint position, velocity, and acceleration, can be used to determine user intent, seed parameters for subsequent animations and game control, and determine the subsequent gestures to initiate.
Type: Application
Filed: February 9, 2015
Publication date: June 4, 2015
Inventors: Kevin Geisner, Relja Markovic, Stephen Gilchrist Latta
-
Patent number: 9025832
Abstract: A method of finding a new social network service friend for a player belonging to a social network service and having a friend group including one or more player-accepted friends includes recognizing the player, automatically identifying an observer within a threshold proximity to the player, and adding the observer to the friend group of the player in the social network service if the observer satisfies a friending criterion of the player.
Type: Grant
Filed: June 1, 2011
Date of Patent: May 5, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Stephen Latta, Relja Markovic, Kevin Geisner, A. Dylan Vance, Brian Scott Murphy, Matt Coohill
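The proximity-based friending described in this abstract can be pictured with a minimal Python sketch. Nothing here is from the patent itself: the class and function names, the 3-meter threshold, and the pass-through default criteria are all illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Player:
    name: str
    position: tuple              # (x, y) location in meters
    friends: set = field(default_factory=set)

def maybe_friend(player, observer, threshold_m=3.0, criteria=lambda o: True):
    """Add `observer` to `player`'s friend group if the observer is within
    `threshold_m` of the player and satisfies the player's friending criteria."""
    dx = player.position[0] - observer.position[0]
    dy = player.position[1] - observer.position[1]
    within = (dx * dx + dy * dy) ** 0.5 <= threshold_m
    if within and criteria(observer):
        player.friends.add(observer.name)
        return True
    return False
```

A real system would of course gate this on identity recognition and privacy settings; the sketch only shows the threshold-proximity check and the criteria hook.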
-
Publication number: 20150110354
Abstract: A system may receive image data and capture motion with respect to a target in a physical space and recognize a gesture from the captured motion. It may be desirable to isolate aspects of captured motion to differentiate random and extraneous motions. For example, a gesture may comprise motion of a user's right arm, and it may be desirable to isolate the motion of the user's right arm and exclude an interpretation of any other motion. Thus, the isolated aspect may be the focus of the received data for gesture recognition. Alternatively, the isolated aspects may be an aspect of the captured motion that is removed from consideration when identifying a gesture from the captured motion. For example, gesture filters may be modified to correspond to the user's natural lean to eliminate the effect the lean has on the registration of a motion with a gesture filter.
Type: Application
Filed: December 22, 2014
Publication date: April 23, 2015
Inventors: Gregory Nelson Snook, Relja Markovic, Stephen Gilchrist Latta, Kevin Geisner
-
Patent number: 9005029
Abstract: One or more physical characteristics of each of multiple users are detected. These physical characteristics of a user can include physical attributes of the user (e.g., the user's height, length of the user's legs) and/or physical skills of the user (e.g., how high the user can jump). Based on these detected physical characteristics, two or more of the multiple users are identified to share an online experience (e.g., play a multi-player game).
Type: Grant
Filed: September 14, 2012
Date of Patent: April 14, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Brian Scott Murphy, Stephen G. Latta, Darren Alexander Bennett, Pedro Perez, Shawn C. Wright, Relja Markovic, Joel B. Deaguero, Christopher H. Willoughby, Ryan Lucas Hastings, Kevin Geisner
-
Patent number: 9008355
Abstract: Automatic depth camera aiming is provided by a method which includes receiving from the depth camera one or more observed depth images of a scene. The method further includes, if a point of interest of a target is found within the scene, determining if the point of interest is within a far range relative to the depth camera. The method further includes, if the point of interest of the target is within the far range, operating the depth camera with a far logic, or if the point of interest of the target is not within the far range, operating the depth camera with a near logic.
Type: Grant
Filed: June 4, 2010
Date of Patent: April 14, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Relja Markovic, Stephen Latta, Kyungsuk David Lee, Oscar Omar Garza Santos, Kevin Geisner
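The far/near mode selection in this abstract amounts to a range test on the point of interest's depth. The Python sketch below is an assumption-laden illustration, not the patented method: the 2.5 m threshold, the hysteresis band (added here so the camera does not flicker between logics at the boundary), and the mode names are all invented for the example.

```python
class DepthCameraAimer:
    """Choose between 'far' and 'near' aiming logic from the depth of the
    target's point of interest. Thresholds are illustrative assumptions."""

    def __init__(self, far_threshold_m=2.5, hysteresis_m=0.2):
        self.far_threshold_m = far_threshold_m
        self.hysteresis_m = hysteresis_m
        self.mode = "near"

    def update(self, poi_depth_m):
        # Switch modes only when the depth clears a band around the
        # threshold, so small jitter near the boundary is ignored.
        if self.mode == "near" and poi_depth_m > self.far_threshold_m + self.hysteresis_m:
            self.mode = "far"
        elif self.mode == "far" and poi_depth_m < self.far_threshold_m - self.hysteresis_m:
            self.mode = "near"
        return self.mode
```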
-
Patent number: 8994718
Abstract: A virtual skeleton includes a plurality of joints and provides a machine readable representation of a human target observed with a three-dimensional depth camera. A relative position of a hand joint of the virtual skeleton is translated as a gestured control, and a three-dimensional virtual world is controlled responsive to the gestured control.
Type: Grant
Filed: December 21, 2010
Date of Patent: March 31, 2015
Assignee: Microsoft Technology Licensing, LLCInventors: Stephen Latta, Darren Bennett, Kevin Geisner, Relja Markovic
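Translating a hand joint's relative position into a control signal can be sketched as normalizing the hand's offset from a reference joint. This is a hypothetical illustration: the joint names, the shoulder as reference, and the 0.7 m reach used for normalization are assumptions, not details from the patent.

```python
def hand_control_vector(skeleton, reach_m=0.7):
    """Map the right hand's position relative to the right shoulder to a
    control vector with components clamped to [-1, 1]."""
    hand = skeleton["hand_right"]
    shoulder = skeleton["shoulder_right"]
    return tuple(
        max(-1.0, min(1.0, (h - s) / reach_m))
        for h, s in zip(hand, shoulder)
    )
```

A virtual-world controller could then read the components as, say, steering and throttle axes.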
-
Patent number: 8988437
Abstract: In applications that display a representation of a user, it may be reasonable to insert a pre-canned animation rather than animating a user's captured motion. For example, in a tennis swing, the ball toss and take-back in a serve could be a pre-canned animation, whereas the actual forward swing may be mapped from the user's gestures. Animations of a user's gestures can be chained together into sequences with pre-canned animations, where animation blending techniques can provide a smoother transition between the animation types. Techniques for blending animations, which may comprise determining boundaries and transition points between pre-canned animations and animations based on captured motion, may improve animation efficiency. Gesture history, including joint position, velocity, and acceleration, can be used to determine user intent, seed parameters for subsequent animations and game control, and determine the subsequent gestures to initiate.
Type: Grant
Filed: March 20, 2009
Date of Patent: March 24, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kevin Geisner, Relja Markovic, Stephen Gilchrist Latta, Gregory Nelson Snook
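The simplest form of the blending this family of filings describes is a linear crossfade between a pre-canned pose and a captured-motion pose over a transition window. The Python sketch below is a deliberately simplified assumption (single blend parameter, poses as dicts of joint positions); the actual patent covers richer boundary and transition-point determination.

```python
def blend_pose(canned_pose, captured_pose, t):
    """Crossfade between two poses at blend parameter t in [0, 1]:
    0 yields the pre-canned pose, 1 the captured-motion pose.
    Poses are dicts mapping joint name -> (x, y, z)."""
    return {
        joint: tuple(c * (1.0 - t) + m * t
                     for c, m in zip(canned_pose[joint], captured_pose[joint]))
        for joint in canned_pose
    }
```

Stepping t from 0 to 1 across the transition window produces the smooth handoff between animation types that the abstract mentions.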
-
Patent number: 8954947
Abstract: In a state management system of an effects system implemented in a Graphics Processing Unit (GPU), techniques and technologies are provided for setting a value to particular variables at application run-time without validating the variables. For example, a compiled effects file comprising a number of variables can be loaded at application load time, and a generic, variable interface pointer for a particular variable of the effects file can be retrieved. A specialized variable interface pointer can then be generated which is associated with the particular variable by specifying a desired type of access that will be performed on the particular variable. At application run-time, the specialized variable interface can be used to set a value to each of the particular variables without validating the particular variables at application run-time.
Type: Grant
Filed: June 29, 2006
Date of Patent: February 10, 2015
Assignee: Microsoft Corporation
Inventors: Relja Markovic, Ramanujan Srinivasan, Samuel Glassenberg
-
Patent number: 8942428
Abstract: A system may receive image data and capture motion with respect to a target in a physical space and recognize a gesture from the captured motion. It may be desirable to isolate aspects of captured motion to differentiate random and extraneous motions. For example, a gesture may comprise motion of a user's right arm, and it may be desirable to isolate the motion of the user's right arm and exclude an interpretation of any other motion. Thus, the isolated aspect may be the focus of the received data for gesture recognition. Alternatively, the isolated aspects may be an aspect of the captured motion that is removed from consideration when identifying a gesture from the captured motion. For example, gesture filters may be modified to correspond to the user's natural lean to eliminate the effect the lean has on the registration of a motion with a gesture filter.
Type: Grant
Filed: May 29, 2009
Date of Patent: January 27, 2015
Assignee: Microsoft Corporation
Inventors: Gregory Nelson Snook, Relja Markovic, Stephen Gilchrist Latta, Kevin Geisner
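The lean-compensation example in this abstract can be pictured as subtracting the torso's sideways drift from every joint before the pose reaches the gesture filters. The Python sketch below is an assumed 2-D simplification (joint names and the spine-slope model are invented for illustration; the patent operates on richer captured-motion data).

```python
def remove_lean(joints, spine_base="spine_base", spine_top="spine_top"):
    """Shear the skeleton so the spine is vertical, removing the user's
    natural lean. `joints` maps joint name -> (x, y) in meters."""
    bx, by = joints[spine_base]
    tx, ty = joints[spine_top]
    height = ty - by
    # Sideways drift per unit of height along the spine.
    slope = (tx - bx) / height if height else 0.0
    return {name: (x - slope * (y - by), y) for name, (x, y) in joints.items()}
```

After this correction, arm motion registers against a gesture filter without the lean biasing every horizontal coordinate.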
-
Patent number: 8933884
Abstract: In a motion capture system, a unitary input is provided to an application based on detected movement and/or location of a group of people. Audio information from the group can also be used as an input. The application can provide real-time feedback to the person or group via a display and audio output. The group can control the movement of an avatar in a virtual space based on the movement of each person in the group, such as in a steering or balancing game. To avoid a discontinuous or confusing output by the application, missing data can be generated for a person who is occluded or partially out of the field of view. A wait time can be set for activating a new person and deactivating a currently-active person. The wait time can be adaptive based on a first detected position or a last detected position of the person.
Type: Grant
Filed: January 15, 2010
Date of Patent: January 13, 2015
Assignee: Microsoft Corporation
Inventors: Relja Markovic, Stephen G. Latta, Kevin A. Geisner, David Hill, Darren A. Bennett, David C. Haley, Jr., Brian S. Murphy, Shawn C. Wright
-
Publication number: 20150009135
Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
Type: Application
Filed: September 26, 2014
Publication date: January 8, 2015
Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
-
Publication number: 20140380254
Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. From that, the data is parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs.
Type: Application
Filed: September 4, 2014
Publication date: December 25, 2014
Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
-
Publication number: 20140375683
Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
Type: Application
Filed: June 25, 2013
Publication date: December 25, 2014
Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
-
Patent number: 8885878
Abstract: Interactive secret sharing includes receiving video data from a source and interpreting the video data to track an observed path of a device. In addition, position information is received from the device, and the position information is interpreted to track a self-reported path of the device. If the observed path is within a threshold tolerance of the self-reported path, access is provided to a restricted resource.
Type: Grant
Filed: July 22, 2011
Date of Patent: November 11, 2014
Assignee: Microsoft Corporation
Inventors: Bradley Robert Pettit, Eric Soldan, Relja Markovic
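The core comparison here, checking whether the camera-observed path stays within a tolerance of the device's self-reported path, can be sketched in a few lines of Python. The pointwise comparison over time-aligned samples and the 0.5 m tolerance are assumptions for illustration; the patent does not specify this particular distance measure.

```python
def paths_match(observed, reported, tolerance_m=0.5):
    """Return True if every time-aligned sample of the observed path lies
    within `tolerance_m` of the corresponding self-reported sample.
    Each path is a sequence of (x, y) positions in meters."""
    if len(observed) != len(reported):
        return False
    return all(
        ((ox - rx) ** 2 + (oy - ry) ** 2) ** 0.5 <= tolerance_m
        for (ox, oy), (rx, ry) in zip(observed, reported)
    )
```

Access to the restricted resource would be granted only when this check passes, proving the device reporting its position is the same device the camera sees.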
-
Patent number: 8884968
Abstract: A method for modeling an object from image data comprises identifying, in an image from the video, a set of reference points on the object, and, for each reference point identified, observing a displacement of that reference point in response to a motion of the object. The method further comprises grouping together those reference points for which a common translational or rotational motion of the object results in the observed displacement, and fitting the grouped-together reference points to a shape.
Type: Grant
Filed: December 15, 2010
Date of Patent: November 11, 2014
Assignee: Microsoft Corporation
Inventors: Relja Markovic, Stephen Latta, Kevin Geisner
-
Patent number: 8869072
Abstract: Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture, such as an arm acceleration for a throwing gesture, may be set on a per-application level, or multiple times within a single application. Each filter may output, to an application using it, a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
Type: Grant
Filed: August 2, 2011
Date of Patent: October 21, 2014
Assignee: Microsoft Corporation
Inventors: Stephen G. Latta, Relja Markovic, Arthur Charles Tomlin, Gregory N. Snook
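The architecture this abstract describes, an engine fanning motion data out to per-gesture filters with application-tunable parameters and confidence outputs, can be sketched as follows. Everything concrete in the sketch (class names, the `arm_accel` field, the confidence ramp) is an assumption invented for the example, not the patented design.

```python
class ThrowFilter:
    """One gesture filter: reports a confidence that a throw occurred.
    `min_arm_accel` is the kind of per-application tunable the abstract
    mentions (arm acceleration for a throwing gesture)."""

    def __init__(self, min_arm_accel=10.0):
        self.min_arm_accel = min_arm_accel

    def evaluate(self, motion):
        accel = motion.get("arm_accel", 0.0)
        # Confidence ramps from 0 at the threshold up to 1 at twice it.
        return max(0.0, min(1.0, accel / self.min_arm_accel - 1.0))


class RecognizerEngine:
    """Fans user motion data out to registered filters and returns each
    gesture's confidence level."""

    def __init__(self):
        self.filters = {}

    def register(self, name, filt):
        self.filters[name] = filt

    def recognize(self, motion):
        return {name: f.evaluate(motion) for name, f in self.filters.items()}
```

An application would register the filters it cares about, retune their parameters (even mid-game), and act on whichever gestures cross its own confidence bar.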
-
Patent number: 8856691
Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. From that, the data is parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs.
Type: Grant
Filed: May 29, 2009
Date of Patent: October 7, 2014
Assignee: Microsoft Corporation
Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
-
Publication number: 20140267311
Abstract: Embodiments are disclosed that relate to interacting with a user interface via feedback provided by an avatar. One embodiment provides a method comprising receiving depth data, locating a person in the depth data, and mapping a physical space in front of the person to a screen space of a display device. The method further comprises forming an image of an avatar representing the person, outputting to a display an image of a user interface comprising an interactive user interface control, and outputting to the display device the image of the avatar such that the avatar faces the user interface control. The method further comprises detecting a motion of the person via the depth data, forming an animated representation of the avatar interacting with the user interface control based upon the motion of the person, and outputting the animated representation of the avatar interacting with the control.
Type: Application
Filed: May 29, 2014
Publication date: September 18, 2014
Applicant: Microsoft Corporation
Inventors: Jeffrey Evertt, Joel Deaguero, Darren Bennett, Dylan Vance, David Galloway, Relja Markovic, Stephen Latta, Oscar Omar Garza Santos, Kevin Geisner
-
Patent number: 8814693
Abstract: In accordance with one or more aspects, for a particular user one or more other users associated with that particular user are identified based on a social graph of that particular user. An avatar of at least one of the other users is obtained and included as a non-player character in a game being played by that particular user. The particular user can provide requests to interact with the avatar of another user (e.g., calling out the name of that user, tapping the avatar on the shoulder, etc.), these requests being invitations for that user to join in a game with the particular user. An indication of such an invitation is presented to the other user, who can, for example, accept the invitation to join in a game with the particular user.
Type: Grant
Filed: May 27, 2011
Date of Patent: August 26, 2014
Assignee: Microsoft Corporation
Inventors: Brian Scott Murphy, Stephen G. Latta, Darren Alexander Bennett, Kevin Geisner, Shawn C. Wright, Relja Markovic, Joel B. Deaguero, Christopher H. Willoughby, Ryan Lucas Hastings