Patents by Inventor Julia Schwarz
Julia Schwarz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10445935
Abstract: Optimizations are provided for facilitating interactions with virtual objects included within an augmented-reality scene. Initially, an augmented-reality scene is rendered for a user. Within that scene, an interactive virtual object of an application is rendered. Then, the position of the user's actual hand is determined relative to the interactive virtual object. When the user's actual hand is within a target threshold distance to the interactive virtual object, then a target visual cue is projected onto the interactive virtual object. When the user's actual hand is within an input threshold distance to the interactive virtual object, then an input visual cue is projected onto the interactive virtual object. Once the user's hand is within the input threshold distance to the interactive virtual object, then input may be provided to the application via the interactive object.
Type: Grant
Filed: May 26, 2017
Date of Patent: October 15, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, David J. Calabrese, Yasaman Sheri, Daniel B. Witriol
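The two-threshold cue behavior this abstract describes can be sketched in a few lines. This is an illustration only; the threshold values, function name, and cue labels are assumptions, not taken from the patent.

```python
# Illustrative sketch of the two-threshold visual-cue logic in the abstract.
# Threshold values and names are assumptions, not from the patent.

TARGET_THRESHOLD = 0.30  # meters: show a "target" cue inside this distance
INPUT_THRESHOLD = 0.05   # meters: show an "input" cue and accept input

def classify_hand_proximity(hand_to_object_distance):
    """Map the hand-to-object distance onto the cue states in the abstract."""
    if hand_to_object_distance <= INPUT_THRESHOLD:
        return "input_cue"    # input visual cue; input may be provided
    if hand_to_object_distance <= TARGET_THRESHOLD:
        return "target_cue"   # target visual cue projected onto the object
    return "no_cue"
```

Checking the smaller (input) threshold first means the input cue takes precedence when the hand satisfies both distances, matching the abstract's progression from target cue to input cue as the hand approaches.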
-
Publication number: 20190302879
Abstract: A virtual reality experience is provided to one or more users by a computing system through the use of a special-purpose virtual reality mat. The computing system receives image data from an optical sensor imaging a physical environment. The mat includes one or more fiducial markers that are recognizable by the computing system. A presence of these fiducial markers is detected based on the image data. An activity region within the physical environment is defined based, at least in part, on the detected fiducial markers. A positioning of a physical subject is identified within the physical environment relative to the activity region. The virtual reality experience is selectively augmented based on the positioning of the physical subject identified relative to the activity region.
Type: Application
Filed: April 2, 2018
Publication date: October 3, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Jason Michael Ray
-
Publication number: 20190163683
Abstract: Described herein are various technologies pertaining to presenting search results to a user, wherein the search results are messages generated by way of social networking applications. An interactive graphical object is presented together with retrieved messages, and messages are filtered responsive to interactions with the interactive graphical object. Additionally, a graphical object that is indicative of credibility of a message is presented together with the message.
Type: Application
Filed: January 30, 2019
Publication date: May 30, 2019
Inventors: Meredith June Morris, Scott Joseph Counts, Asta Jane Roseway, Julia Schwarz
-
Patent number: 10290152
Abstract: Methods, computing devices and head-mounted display devices for displaying user interface elements with virtual objects are disclosed. In one example, a virtual object and one or more user interface elements are displayed within a physical environment. User input is received that moves one or more of the virtual object and the one or more user interface elements. One or more of the virtual object and the one or more user interface elements are determined to be within a predetermined distance of a physical surface. Based at least on this determination, the one or more user interface elements are displayed on the surface.
Type: Grant
Filed: April 3, 2017
Date of Patent: May 14, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Bo Robert Xiao, Hrvoje Benko, Andrew Wilson
-
Patent number: 10234935
Abstract: In various embodiments, computerized methods and systems for mediating interaction methodologies with virtual objects rendered in an immersive environment are provided. An intended target is identified from one or more virtual objects rendered in an at least partially-virtual environment. A relative proximity of the intended target to the user, or an extension of the user, is determined. An interaction methodology is selected for interaction with the intended target based on the determined relative proximity to the intended target, among other things. An indication of the selected interaction methodology is then provided to the user.
Type: Grant
Filed: August 11, 2016
Date of Patent: March 19, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, James Tichenor, Yasaman Sheri, David J. Calabrese, Bharat Ahluwalia, Robert Pengelly
-
Patent number: 10216797
Abstract: Described herein are various technologies pertaining to presenting search results to a user, wherein the search results are messages generated by way of social networking applications. An interactive graphical object is presented together with retrieved messages, and messages are filtered responsive to interactions with the interactive graphical object. Additionally, a graphical object that is indicative of credibility of a message is presented together with the message.
Type: Grant
Filed: February 12, 2016
Date of Patent: February 26, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Meredith June Morris, Scott Joseph Counts, Asta Jane Roseway, Julia Schwarz
-
Publication number: 20180348956
Abstract: The disclosed subject matter is a palm rejection technique utilizing temporal features, iterative classification, and probabilistic voting. Touch events are classified based on features periodically extracted from time windows of increasing size, always centered at the birth of the event. The classification process uses a series of decision trees acting on said features.
Type: Application
Filed: July 23, 2018
Publication date: December 6, 2018
Inventors: Julia Schwarz, Chris Harrison
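The iterative-classification-with-voting scheme this abstract outlines can be sketched as follows. A touch event is re-classified over successively larger time windows and the per-window results are combined by a probabilistic vote; the averaging rule and 0.5 cutoff here are illustrative assumptions, and the per-window probabilities stand in for the decision-tree outputs the patent describes.

```python
# Sketch of probabilistic voting over repeated classifications of one touch
# event. Each entry in `window_probabilities` is P(touch) from one time
# window, each window centered at the birth of the event and larger than the
# last. The combination rule is an assumption, not taken from the patent.

def vote_touch_vs_palm(window_probabilities):
    """Combine per-window P(touch) estimates by averaging, then threshold."""
    if not window_probabilities:
        raise ValueError("need at least one window classification")
    mean_p = sum(window_probabilities) / len(window_probabilities)
    return "touch" if mean_p >= 0.5 else "palm"
```

Because later (larger) windows see more of the event's history, re-voting as windows grow lets an early misclassification be corrected once more evidence arrives.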
-
Publication number: 20180342103
Abstract: Optimizations are provided for facilitating interactions with virtual objects included within an augmented-reality scene. Initially, an augmented-reality scene is rendered for a user. Within that scene, an interactive virtual object of an application is rendered. Then, the position of the user's actual hand is determined relative to the interactive virtual object. When the user's actual hand is within a target threshold distance to the interactive virtual object, then a target visual cue is projected onto the interactive virtual object. When the user's actual hand is within an input threshold distance to the interactive virtual object, then an input visual cue is projected onto the interactive virtual object. Once the user's hand is within the input threshold distance to the interactive virtual object, then input may be provided to the application via the interactive object.
Type: Application
Filed: May 26, 2017
Publication date: November 29, 2018
Inventors: Julia Schwarz, David J. Calabrese, Yasaman Sheri, Daniel B. Witriol
-
Patent number: 10140776
Abstract: Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
Type: Grant
Filed: June 13, 2016
Date of Patent: November 27, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Bharat Ahluwalia, David Calabrese, Robert C J Pengelly, Yasaman Sheri, James Tichenor
-
Publication number: 20180329567
Abstract: Methods and apparatus of embodiments of the present invention include a classification system configured to treat edge contact of a touch screen as a separate class of touch events such that any touches occurring near the edge of the touch screen are to be processed by a classifier that is configured to process edge contacts as compared to a classifier that is configured to process other contacts that may occur in the approximate middle of the touch screen which may be wholly digitized. An apparatus may employ two separate and distinct classifiers, including a full touch classifier and an edge touch classifier. The touch screen may be configured to have two different sensing regions to determine which of the two classifiers is appropriate for a touch event.
Type: Application
Filed: November 16, 2017
Publication date: November 15, 2018
Inventors: Taihei Munemoto, Julia Schwarz, Chris Harrison
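The routing decision this abstract describes, sending a touch to either a full touch classifier or an edge touch classifier depending on which sensing region it lands in, can be sketched as below. The screen dimensions, edge margin, and classifier names are illustrative assumptions.

```python
# Sketch of routing a touch event to one of two classifiers based on where it
# lands: an edge band around the screen perimeter vs the fully digitized
# middle. All constants here are assumptions, not values from the patent.

SCREEN_W, SCREEN_H = 1080, 1920   # screen size in pixels (assumed)
EDGE_MARGIN = 40                  # width of the edge sensing region (assumed)

def pick_classifier(x, y):
    """Return which classifier should process a touch at pixel (x, y)."""
    near_edge = (x < EDGE_MARGIN or y < EDGE_MARGIN or
                 x >= SCREEN_W - EDGE_MARGIN or y >= SCREEN_H - EDGE_MARGIN)
    return "edge_touch_classifier" if near_edge else "full_touch_classifier"
```

Splitting the two cases lets the edge classifier be trained on the partially sensed contacts that occur at the bezel, rather than forcing one model to cover both regimes.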
-
Patent number: 10127886
Abstract: A computing system, such as a head mounted display, is configured for dynamically modifying an occlusion, such as a hand occlusion, that is presented and moved within a mixed reality environment. The occlusion is associated with a movement attribute, such as a velocity or acceleration, corresponding with movement of the occlusion within the mixed reality environment. Upon detecting a movement of the occlusion, it is determined whether the movement attribute meets or exceeds a predetermined threshold. When the threshold is at least met, the visual appearance of the occlusion is modified by at least one of modifying a transparency attribute of the occlusion to cause increased transparency of the occlusion or by modifying an edge display attribute of the occlusion to cause feathering of one or more occlusion edges.
Type: Grant
Filed: October 14, 2016
Date of Patent: November 13, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Robert Charles Johnstone Pengelly
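The threshold check this abstract describes, modifying the occlusion's transparency and edge feathering once its movement attribute meets or exceeds a threshold, can be sketched as follows. The speed threshold, step size, and dictionary field names are assumptions for illustration.

```python
# Sketch of the movement-threshold rule in the abstract: when the occlusion
# (e.g., a hand) moves fast enough, raise its transparency and feather its
# edges. The threshold value and field names are assumptions.

SPEED_THRESHOLD = 0.5  # meters/second (assumed)

def update_occlusion_appearance(occlusion, speed):
    """Apply the abstract's appearance changes when speed >= threshold."""
    if speed >= SPEED_THRESHOLD:
        occlusion["transparency"] = min(1.0, occlusion["transparency"] + 0.3)
        occlusion["edge_feathering"] = True
    return occlusion
```

Tying the appearance change to a movement threshold means a stationary hand stays fully rendered, while a fast-moving one is softened to reduce visual artifacts during tracking lag.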
-
Patent number: 10095402
Abstract: Systems and methods are provided that determine when an initial stroke and a subsequent stroke track may be part of a common user input action. A method may include receiving a signal from which an initial stroke track representing an initial movement of a user controlled indicator against a touch sensitive surface and sensing a subsequent stroke track representing subsequent movement of the user controlled indicator against the touch sensitive surface can be determined. The method further includes determining that the initial stroke track and the subsequent stroke track comprise portions of common user input action when the initial stroke track is followed by the subsequent stroke track within a predetermined period of time and a trajectory of the initial stroke track is consistent with a trajectory of the subsequent stroke track.
Type: Grant
Filed: October 1, 2014
Date of Patent: October 9, 2018
Assignee: Qeexo, Co.
Inventors: Robert Xiao, Julia Schwarz, Christopher Harrison
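The two conditions this abstract names, that the second stroke starts within a predetermined period and that its trajectory is consistent with the first, can be sketched as a simple predicate. The time limit, angle tolerance, and heading-based consistency test are illustrative assumptions.

```python
# Sketch of the stroke-continuation test in the abstract: two strokes belong
# to one input action if the gap between them is short and their trajectories
# agree. The limits and the heading comparison are assumptions.
import math

MAX_GAP_SECONDS = 0.3         # assumed time window between strokes
MAX_ANGLE_DIFF = math.pi / 6  # assumed trajectory tolerance (30 degrees)

def same_input_action(gap_seconds, heading1, heading2):
    """Compare the end heading of stroke 1 with the start heading of stroke 2
    (both in radians) and the time gap between the strokes."""
    if gap_seconds > MAX_GAP_SECONDS:
        return False
    diff = abs(heading1 - heading2) % (2 * math.pi)
    diff = min(diff, 2 * math.pi - diff)  # shortest angular difference
    return diff <= MAX_ANGLE_DIFF
```

Joining strokes this way lets a brief lift of the finger mid-gesture be treated as one continuous action rather than two separate inputs.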
-
Publication number: 20180286126
Abstract: Methods, computing devices and head-mounted display devices for displaying user interface elements with virtual objects are disclosed. In one example, a virtual object and one or more user interface elements are displayed within a physical environment. User input is received that moves one or more of the virtual object and the one or more user interface elements. One or more of the virtual object and the one or more user interface elements are determined to be within a predetermined distance of a physical surface. Based at least on this determination, the one or more user interface elements are displayed on the surface.
Type: Application
Filed: April 3, 2017
Publication date: October 4, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Julia Schwarz, Bo Robert Xiao, Hrvoje Benko, Andrew Wilson
-
Patent number: 10082935
Abstract: An electronic device includes a touch-sensitive surface, for example a touch pad or touch screen. The user interacts with the touch-sensitive surface, producing touch interactions. Some of these touch interactions may be detected as indicative of a grasp for manipulating a physical tool (e.g., the grasp for holding a pen). When these touch interactions are encountered, a corresponding virtual tool is instantiated. The virtual tool controls an action on the electronic device that is similar to an action that can be performed by the physical tool. For example, the virtual pen can be used to draw on the display, whereas the physical pen draws on paper. A representation of the virtual tool is also displayed on a display for the electronic device, possibly providing additional affordances, at a location that corresponds to a location of the detected touch interaction.
Type: Grant
Filed: April 15, 2013
Date of Patent: September 25, 2018
Assignee: Carnegie Mellon University
Inventors: Christopher Harrison, Julia Schwarz, Robert Bo Xiao, Scott E. Hudson
-
Patent number: 10031619
Abstract: The present invention is a palm rejection technique utilizing temporal features, iterative classification, and probabilistic voting. Touch events are classified based on features periodically extracted from time windows of increasing size, always centered at the birth of the event. The classification process uses a series of decision trees acting on said features.
Type: Grant
Filed: April 14, 2015
Date of Patent: July 24, 2018
Assignee: Carnegie Mellon University
Inventors: Julia Schwarz, Christopher Harrison
-
Publication number: 20180173300
Abstract: Disclosed are an apparatus and a method of detecting a user interaction with a virtual object. In some embodiments, a depth sensing device of an NED device receives a plurality of depth values. The depth values correspond to depths of points in a real-world environment relative to the depth sensing device. The NED device overlays an image of a 3D virtual object on a view of the real-world environment, and identifies an interaction limit in proximity to the 3D virtual object. Based on depth values of points that are within the interaction limit, the NED device detects a body part or a user device of a user interacting with the 3D virtual object.
Type: Application
Filed: December 19, 2016
Publication date: June 21, 2018
Inventors: Julia Schwarz, Hrvoje Benko, Andrew D. Wilson, Robert Charles Johnstone Pengelly, Bo Robert Xiao
-
Patent number: 9983684
Abstract: Methods and devices for displaying a virtual affordance with a virtual target are disclosed. In one example, the virtual target is displayed to a user via a display device. The user's point of gaze is determined to be at a gaze location within a target zone including the virtual target. The user's hand is determined to be at a hand location within a designated tracking volume. Based on at least determining that the user's gaze is at the gaze location and the user's hand is at the hand location, the virtual affordance is displayed at a landing location corresponding to the virtual target, where the landing location is independent of both the gaze location and the user's hand location. Movement of the user's hand is tracked and the virtual affordance is modified in response to the movement.
Type: Grant
Filed: November 2, 2016
Date of Patent: May 29, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Jia Wang, Yasaman Sheri, Julia Schwarz, David J. Calabrese, Daniel B. Witriol
-
Publication number: 20180143693
Abstract: A method for moving a virtual object includes detecting a position of two input objects. A position of a centroid that is equidistant from the two input objects and located between the two input objects is dynamically calculated, such that a reference line running between the two input objects intersects the centroid. Upon detecting a movement of the two input objects, the movement is translated into a change in one or both of a position and an orientation of the virtual object. Movement of the centroid caused by movement of the two input objects causes movement of the virtual object in a direction corresponding to the movement of the centroid. Rotation of the reference line about the centroid caused by the movement of the two input objects causes rotation of the virtual object about its center in a direction corresponding to the rotation of the reference line.
Type: Application
Filed: November 21, 2016
Publication date: May 24, 2018
Inventors: David J. Calabrese, Julia Schwarz, Yasaman Sheri, Daniel B. Witriol
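The geometry this abstract describes reduces to two quantities: the centroid equidistant from the two input objects is their midpoint, and the reference line between them has an angle whose change drives rotation. A minimal sketch (the 2D tuple layout is an assumption; the patent is not limited to this representation):

```python
# Sketch of the centroid and reference-line angle from the abstract. Moving
# both input objects moves the centroid (translating the virtual object);
# rotating one input about the other changes the angle (rotating the object).
import math

def centroid_and_angle(p1, p2):
    """Return the midpoint of two (x, y) input positions and the angle of
    the reference line running from p1 to p2, in radians."""
    cx = (p1[0] + p2[0]) / 2
    cy = (p1[1] + p2[1]) / 2
    angle = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    return (cx, cy), angle
```

Tracking the frame-to-frame change in the returned centroid and angle would give the translation and rotation to apply to the virtual object, per the abstract.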
-
Publication number: 20180120944
Abstract: Methods and devices for displaying a virtual affordance with a virtual target are disclosed. In one example, the virtual target is displayed to a user via a display device. The user's point of gaze is determined to be at a gaze location within a target zone including the virtual target. The user's hand is determined to be at a hand location within a designated tracking volume. Based on at least determining that the user's gaze is at the gaze location and the user's hand is at the hand location, the virtual affordance is displayed at a landing location corresponding to the virtual target, where the landing location is independent of both the gaze location and the user's hand location. Movement of the user's hand is tracked and the virtual affordance is modified in response to the movement.
Type: Application
Filed: November 2, 2016
Publication date: May 3, 2018
Inventors: Jia Wang, Yasaman Sheri, Julia Schwarz, David J. Calabrese, Daniel B. Witriol
-
Publication number: 20180108325
Abstract: A computing system, such as a head mounted display, is configured for dynamically modifying an occlusion, such as a hand occlusion, that is presented and moved within a mixed reality environment. The occlusion is associated with a movement attribute, such as a velocity or acceleration, corresponding with movement of the occlusion within the mixed reality environment. Upon detecting a movement of the occlusion, it is determined whether the movement attribute meets or exceeds a predetermined threshold. When the threshold is at least met, the visual appearance of the occlusion is modified by at least one of modifying a transparency attribute of the occlusion to cause increased transparency of the occlusion or by modifying an edge display attribute of the occlusion to cause feathering of one or more occlusion edges.
Type: Application
Filed: October 14, 2016
Publication date: April 19, 2018
Inventors: Julia Schwarz, Robert Charles Johnstone Pengelly