Patents by Inventor Yasaman Sheri
Yasaman Sheri has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10445935
Abstract: Optimizations are provided for facilitating interactions with virtual objects included within an augmented-reality scene. Initially, an augmented-reality scene is rendered for a user. Within that scene, an interactive virtual object of an application is rendered. Then, the position of the user's actual hand is determined relative to the interactive virtual object. When the user's actual hand is within a target threshold distance to the interactive virtual object, then a target visual cue is projected onto the interactive virtual object. When the user's actual hand is within an input threshold distance to the interactive virtual object, then an input visual cue is projected onto the interactive virtual object. Once the user's hand is within the input threshold distance to the interactive virtual object, then input may be provided to the application via the interactive object.
Type: Grant
Filed: May 26, 2017
Date of Patent: October 15, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, David J. Calabrese, Yasaman Sheri, Daniel B. Witriol
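As a rough illustration of the two-threshold scheme this abstract describes, the sketch below selects which visual cue to project from the hand-to-object distance. The threshold values and the cue names are assumptions for illustration, not taken from the patent.

```python
import math

# Assumed threshold values, in meters; the patent does not specify them.
TARGET_THRESHOLD = 0.30
INPUT_THRESHOLD = 0.10

def select_cue(hand_pos, object_pos):
    """Return which visual cue to project onto the interactive virtual object."""
    d = math.dist(hand_pos, object_pos)
    if d <= INPUT_THRESHOLD:
        return "input"   # close enough that input may be provided to the application
    if d <= TARGET_THRESHOLD:
        return "target"  # hand is approaching; show the targeting cue
    return None          # no cue projected
```

Because the input threshold lies inside the target threshold, the checks run nearest-first so the more specific cue wins.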
-
Patent number: 10234935
Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
Type: Grant
Filed: August 11, 2016
Date of Patent: March 19, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
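A minimal sketch of proximity-gated selection of an interaction methodology, in the spirit of this abstract; the reach threshold and the methodology names are illustrative assumptions.

```python
def select_interaction(target_distance, reach=0.7):
    """Choose how the user interacts with the intended target based on its
    relative proximity; reach is an assumed arm's-length value in meters."""
    if target_distance <= reach:
        return "direct-manipulation"  # near target: touch it directly
    return "ray-cast"                 # far target: point-and-select at a distance
```

A real system would then surface an indication of the selected methodology to the user, e.g. by changing the cursor.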
-
Publication number: 20180342103
Abstract: Optimizations are provided for facilitating interactions with virtual objects included within an augmented-reality scene. Initially, an augmented-reality scene is rendered for a user. Within that scene, an interactive virtual object of an application is rendered. Then, the position of the user's actual hand is determined relative to the interactive virtual object. When the user's actual hand is within a target threshold distance to the interactive virtual object, then a target visual cue is projected onto the interactive virtual object. When the user's actual hand is within an input threshold distance to the interactive virtual object, then an input visual cue is projected onto the interactive virtual object. Once the user's hand is within the input threshold distance to the interactive virtual object, then input may be provided to the application via the interactive object.
Type: Application
Filed: May 26, 2017
Publication date: November 29, 2018
Inventors: Julia Schwarz, David J. Calabrese, Yasaman Sheri, Daniel B. Witriol
-
Patent number: 10140776
Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
Type: Grant
Filed: June 13, 2016
Date of Patent: November 27, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert C J Pengelly, Yasaman Sheri, James Tichenor
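The control-point flow in this abstract can be sketched as follows. The nearest-point targeting rule and the names are assumptions; the patent covers more general identification and property-binding schemes.

```python
import math
from dataclasses import dataclass

@dataclass
class ControlPoint:
    name: str        # e.g. a scale or rotate handle on a rendered object
    position: tuple  # (x, y, z) in the mixed reality environment

def pick_target(points, control_obj_pos):
    """On detecting a gesture, identify the target control point; here,
    simply the one nearest the control object (e.g. the user's hand)."""
    return min(points, key=lambda p: math.dist(p.position, control_obj_pos))

def track(target, control_obj_pos):
    """While the gesture is held, the target control point tracks the control
    object; a property bound to it would be altered from this movement and
    the rendering modified to reflect the alteration."""
    target.position = control_obj_pos
```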
-
Patent number: 9983684
Abstract: Methods and devices for displaying a virtual affordance with a virtual target are disclosed. In one example, the virtual target is displayed to a user via a display device. The user's point of gaze is determined to be at a gaze location within a target zone including the virtual target. The user's hand is determined to be at a hand location within a designated tracking volume. Based on at least determining that the user's gaze is at the gaze location and the user's hand is at the hand location, the virtual affordance is displayed at a landing location corresponding to the virtual target, where the landing location is independent of both the gaze location and the user's hand location. Movement of the user's hand is tracked and the virtual affordance is modified in response to the movement.
Type: Grant
Filed: November 2, 2016
Date of Patent: May 29, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Jia Wang, Yasaman Sheri, Julia Schwarz, David J. Calabrese, Daniel B. Witriol
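The gaze-and-hand gating described here can be sketched with axis-aligned boxes standing in for the target zone and tracking volume; the box containment test and the choice of the target's center as the landing location are illustrative assumptions.

```python
def inside_box(box_min, box_max, p):
    """Axis-aligned containment test, an assumed stand-in for the target
    zone and tracking volume checks."""
    return all(lo <= c <= hi for lo, c, hi in zip(box_min, p, box_max))

def affordance_landing(gaze_pos, target_zone, hand_pos, tracking_volume, target_center):
    """If the gaze is within the target zone and the hand is within the
    tracking volume, return the landing location: a point tied to the
    virtual target, independent of both the gaze and hand locations."""
    if inside_box(*target_zone, gaze_pos) and inside_box(*tracking_volume, hand_pos):
        return target_center  # depends only on the target, not on gaze or hand
    return None
```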
-
Publication number: 20180143693
Abstract: A method for moving a virtual object includes detecting a position of two input objects. A position of a centroid that is equidistant from the two input objects and located between the two input objects is dynamically calculated, such that a reference line running between the two input objects intersects the centroid. Upon detecting a movement of the two input objects, the movement is translated into a change in one or both of a position and an orientation of the virtual object. Movement of the centroid caused by movement of the two input objects causes movement of the virtual object in a direction corresponding to the movement of the centroid. Rotation of the reference line about the centroid caused by the movement of the two input objects causes rotation of the virtual object about its center in a direction corresponding to the rotation of the reference line.
Type: Application
Filed: November 21, 2016
Publication date: May 24, 2018
Inventors: David J. Calabrese, Julia Schwarz, Yasaman Sheri, Daniel B. Witriol
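The centroid-and-reference-line scheme in this abstract maps cleanly onto a short computation. The sketch below works in 2-D for clarity (the application targets 3-D input objects); function names are illustrative.

```python
import math

def centroid(a, b):
    """Dynamically calculated point equidistant from the two input objects;
    the reference line running between them intersects it."""
    return tuple((ai + bi) / 2 for ai, bi in zip(a, b))

def reference_angle(a, b):
    """Orientation of the reference line between the two input objects."""
    return math.atan2(b[1] - a[1], b[0] - a[0])

def apply_two_hand_move(obj_pos, obj_angle, old_a, old_b, new_a, new_b):
    """Translate centroid movement into object translation, and rotation of
    the reference line into object rotation about its own center."""
    old_c, new_c = centroid(old_a, old_b), centroid(new_a, new_b)
    new_pos = tuple(p + (n - o) for p, n, o in zip(obj_pos, new_c, old_c))
    new_angle = obj_angle + reference_angle(new_a, new_b) - reference_angle(old_a, old_b)
    return new_pos, new_angle
```

Moving both input objects together translates the object; pivoting one about the other rotates it, matching the abstract's two cases.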
-
Publication number: 20180120944
Abstract: Methods and devices for displaying a virtual affordance with a virtual target are disclosed. In one example, the virtual target is displayed to a user via a display device. The user's point of gaze is determined to be at a gaze location within a target zone including the virtual target. The user's hand is determined to be at a hand location within a designated tracking volume. Based on at least determining that the user's gaze is at the gaze location and the user's hand is at the hand location, the virtual affordance is displayed at a landing location corresponding to the virtual target, where the landing location is independent of both the gaze location and the user's hand location. Movement of the user's hand is tracked and the virtual affordance is modified in response to the movement.
Type: Application
Filed: November 2, 2016
Publication date: May 3, 2018
Inventors: Jia Wang, Yasaman Sheri, Julia Schwarz, David J. Calabrese, Daniel B. Witriol
-
Publication number: 20180046245
Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
Type: Application
Filed: August 11, 2016
Publication date: February 15, 2018
Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
-
Publication number: 20170358144
Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
Type: Application
Filed: June 13, 2016
Publication date: December 14, 2017
Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert CJ Pengelly, Yasaman Sheri, James Tichenor