Patents by Inventor Nuria M. Oliver
Nuria M. Oliver has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9652042
Abstract: Architecture for implementing a perceptual user interface. The architecture comprises alternative modalities for controlling computer application programs and manipulating on-screen objects through hand gestures or a combination of hand gestures and verbal commands. The perceptual user interface system includes a tracking component that detects object characteristics of at least one of a plurality of objects within a scene, and tracks the respective object. Detection of object characteristics is based at least in part upon image comparison of a plurality of images relative to a coarse mapping of the images. A seeding component iteratively seeds the tracking component with object hypotheses based upon the presence of the object characteristics and the image comparison. A filtering component selectively removes the tracked object from the object hypotheses and/or at least one object hypothesis from the set of object hypotheses based upon predetermined removal criteria.
Type: Grant
Filed: February 12, 2010
Date of Patent: May 16, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Andrew David Wilson, Nuria M. Oliver
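The seed/track/filter loop this abstract describes can be illustrated with a minimal sketch. All class names, parameters, and thresholds below are invented for illustration and are not taken from the patent:

```python
# Minimal sketch of the seed/track/filter hypothesis loop described in the
# abstract. All names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    position: tuple          # last observed (x, y)
    confidence: float        # detection strength in [0, 1]
    misses: int = 0          # consecutive frames without supporting detection

class HypothesisTracker:
    def __init__(self, seed_threshold=0.5, max_misses=3):
        self.seed_threshold = seed_threshold
        self.max_misses = max_misses
        self.hypotheses = []

    def seed(self, detections):
        """Seeding component: add one hypothesis per strong detection."""
        for pos, conf in detections:
            if conf >= self.seed_threshold:
                self.hypotheses.append(Hypothesis(pos, conf))

    def filter(self):
        """Filtering component: drop hypotheses meeting the removal criterion."""
        self.hypotheses = [h for h in self.hypotheses if h.misses <= self.max_misses]

    def step(self, detections):
        self.seed(detections)
        for h in self.hypotheses:
            # A real tracker would associate detections with hypotheses here;
            # this sketch just counts unsupported hypotheses as misses.
            if not any(pos == h.position for pos, _ in detections):
                h.misses += 1
        self.filter()
        return self.hypotheses
```

A hypothesis seeded by a strong detection survives a few unsupported frames and is then removed, which is the essence of the filtering component's "predetermined removal criteria."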
-
Publication number: 20160239157
Abstract: A framework is provided for obtaining window information. The window information can be applied to different assignment models to assign windows to different groups. A group may correspond to a task being performed by a user. The window information can be semantic or temporal information captured as window events and properties of windows whose events are captured. Temporal information can be information about switches between windows. Semantic information can be window titles. Temporal information, semantic information, or both, can be used to assign windows to groups.
Type: Application
Filed: April 21, 2016
Publication date: August 18, 2016
Inventors: Nuria M. Oliver, Arungunram Surendran, Chintan S. Thakkar, Gregory Smith
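The abstract's idea of combining semantic signals (window titles) and temporal signals (switches between windows) to group windows can be sketched as follows. The similarity scoring, thresholds, and union-find grouping are assumptions chosen for illustration, not the patent's actual assignment models:

```python
# Illustrative sketch: group windows into tasks using title overlap
# (semantic) and switch frequency (temporal). Thresholds are assumptions.
from collections import Counter

def title_similarity(a, b):
    """Semantic signal: Jaccard overlap of title words."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / max(len(ta | tb), 1)

def assign_groups(titles, switch_events, sim_threshold=0.3, switch_threshold=2):
    """Union windows that share title words or are switched between often.

    titles: list of window titles (index = window id)
    switch_events: list of (from_id, to_id) switch pairs
    Returns a group label per window.
    """
    switch_counts = Counter(frozenset(pair) for pair in switch_events)
    parent = list(range(len(titles)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i in range(len(titles)):
        for j in range(i + 1, len(titles)):
            temporal = switch_counts[frozenset((i, j))] >= switch_threshold
            semantic = title_similarity(titles[i], titles[j]) >= sim_threshold
            if temporal or semantic:
                union(i, j)
    return [find(i) for i in range(len(titles))]
```

Two document windows with overlapping titles land in one group, while an unrelated window stays separate; frequent back-and-forth switching can merge windows even when their titles share nothing.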
-
Patent number: 8745541
Abstract: A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system includes a 3-D imaging system that performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, performs a lookup for the user command associated with the captured image(s), and executes the user command(s) to effect control of the computer, programs, and connected devices.
Type: Grant
Filed: December 1, 2003
Date of Patent: June 3, 2014
Assignee: Microsoft Corporation
Inventors: Andrew D. Wilson, Nuria M. Oliver
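The core lookup step the abstract describes — a per-user mapping from recognized pose to command — reduces to a keyed table. The pose labels and commands below are invented for illustration; recognition itself (the 3-D imaging) is out of scope for this sketch:

```python
# Sketch of the per-user pose-to-command lookup the abstract describes.
# Pose labels and commands are illustrative, not from the patent.
command_map = {
    ("alice", "open_palm"): "play",
    ("alice", "fist"): "pause",
    ("bob", "open_palm"): "mute",
}

def execute_gesture(user, recognized_pose):
    """Look up the command mapped to this user's recognized pose, if any."""
    command = command_map.get((user, recognized_pose))
    if command is None:
        return "no-op"   # unmapped gesture for this user: ignore
    return command       # a real system would dispatch this to the OS/device
```

Keying on the (user, pose) pair captures the abstract's point that the mapping is established "for a given user": the same pose can mean different commands for different users.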
-
Publication number: 20130275911
Abstract: A framework is provided for obtaining window information. The window information can be applied to different assignment models to assign windows to different groups. A group may correspond to a task being performed by a user. The window information can be semantic or temporal information captured as window events and properties of windows whose events are captured. Temporal information can be information about switches between windows. Semantic information can be window titles. Temporal information, semantic information, or both, can be used to assign windows to groups.
Type: Application
Filed: June 4, 2013
Publication date: October 17, 2013
Inventors: Nuria M. Oliver, Arungunram C. Surendran, Chintan S. Thakkar, Gregory R. Smith
-
Publication number: 20130190089
Abstract: A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system includes a 3-D imaging system that performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, performs a lookup for the user command associated with the captured image(s), and executes the user command(s) to effect control of the computer, programs, and connected devices.
Type: Application
Filed: October 20, 2008
Publication date: July 25, 2013
Inventors: Andrew Wilson, Nuria M. Oliver
-
Patent number: 8484577
Abstract: A framework is provided for obtaining window information. The window information can be applied to different assignment models to assign windows to different groups. A group may correspond to a task being performed by a user. The window information can be semantic or temporal information captured as window events and properties of windows whose events are captured. Temporal information can be information about switches between windows. Semantic information can be window titles. Temporal information, semantic information, or both, can be used to assign windows to groups.
Type: Grant
Filed: February 26, 2010
Date of Patent: July 9, 2013
Assignee: Microsoft Corporation
Inventors: Nuria M. Oliver, Arungunram C. Surendran, Chintan S. Thakkar, Gregory R. Smith
-
Patent number: 8392229
Abstract: A system that can enable the atomization of application functionality in connection with an activity-centric system is provided. The system can be utilized as a programmatic tool that decomposes an application's constituent functionality into atoms, thereafter monitoring and aggregating atoms with respect to a particular activity. In doing so, the functionality of the system can be scaled based upon complexity and needs of the activity. Additionally, the system can be employed to monetize the atoms or activity capabilities based upon respective use.
Type: Grant
Filed: June 24, 2011
Date of Patent: March 5, 2013
Assignee: Microsoft Corporation
Inventors: Steven W. Macbeth, Roland L. Fernandez, Brian R. Meyers, Desney S. Tan, George G. Robertson, Nuria M. Oliver, Oscar E. Murillo, Elin R. Pedersen
-
Patent number: 8364514
Abstract: A unique monitoring system and method is provided that involves monitoring user activity in order to facilitate managing and optimizing the utilization of various system resources. In particular, the system can monitor user activity, detect when users need assistance with their specific activities, and identify at least one other user that can assist them. Assistance can be in the form of answering questions, providing guidance to the user as the user completes the activity, or completing the activity such as in the case of taking on an assigned activity. In addition, the system can aggregate activity data across users and/or devices. As a result, problems with activity templates or activities themselves can be more readily identified, user performance can be readily compared, and users can communicate and exchange information regarding similar activity experiences. Furthermore, synchronicity and time-sensitive scheduling of activities between users can be facilitated and improved overall.
Type: Grant
Filed: June 27, 2006
Date of Patent: January 29, 2013
Assignee: Microsoft Corporation
Inventors: Steven W. Macbeth, Roland L. Fernandez, Brian R. Meyers, Desney S. Tan, George G. Robertson, Nuria M. Oliver, Oscar E. Murillo, Mary P. Czerwinski
-
Publication number: 20110264484
Abstract: A system that can enable the atomization of application functionality in connection with an activity-centric system is provided. The system can be utilized as a programmatic tool that decomposes an application's constituent functionality into atoms, thereafter monitoring and aggregating atoms with respect to a particular activity. In doing so, the functionality of the system can be scaled based upon complexity and needs of the activity. Additionally, the system can be employed to monetize the atoms or activity capabilities based upon respective use.
Type: Application
Filed: June 24, 2011
Publication date: October 27, 2011
Applicant: MICROSOFT CORPORATION
Inventors: Steven W. Macbeth, Roland L. Fernandez, Brian R. Meyers, Desney S. Tan, George G. Robertson, Nuria M. Oliver, Oscar E. Murillo, Elin R. Pedersen
-
Patent number: 7970637
Abstract: A system that can enable the atomization of application functionality in connection with an activity-centric system is provided. The system can be utilized as a programmatic tool that decomposes an application's constituent functionality into atoms, thereafter monitoring and aggregating atoms with respect to a particular activity. In doing so, the functionality of the system can be scaled based upon complexity and needs of the activity. Additionally, the system can be employed to monetize the atoms or activity capabilities based upon respective use.
Type: Grant
Filed: June 27, 2006
Date of Patent: June 28, 2011
Assignee: Microsoft Corporation
Inventors: Steven W. Macbeth, Roland L. Fernandez, Brian R. Meyers, Desney S. Tan, George G. Robertson, Nuria M. Oliver, Oscar E. Murillo, Elin R. Pedersen
-
Patent number: 7908151
Abstract: The claimed subject matter provides a system and/or a method that facilitates dynamically providing a question to ask a medical professional during an appointment. An interface can receive a portion of medical data. A counselor component can generate a question based on the portion of medical data, wherein the question is generated to elicit an answer from a medical professional during an appointment. Moreover, the counselor component can dynamically generate a second question directed toward the medical professional based upon at least one of the answer or a value of information (VOI) computation.
Type: Grant
Filed: September 28, 2007
Date of Patent: March 15, 2011
Assignee: Microsoft Corporation
Inventors: David E. Heckerman, Pablo Argon, Behrooz Chitsaz, Hong L. Choing, James R. Hamilton, Nuria M. Oliver, Vladimir G. Sadovsky, Chris Demetrios Karkanias, Hurbert Van Hoof, Oren Rosenbloom
-
Patent number: 7873724
Abstract: The present invention leverages analysis methods, such as expected value of information techniques, rate-based techniques, and random selection techniques, to provide a fusion of low-level streams of input data (e.g., raw data) from multiple sources to facilitate in inferring human-centric notions of context while reducing computational resource burdens. In one instance of the present invention, the method utilizes real-time computations of expected value of information in a greedy, one-step look ahead approach to compute a next best set of observations to make at each step, producing "EVI-based perception." By utilizing dynamically determined input data, the present invention provides utility-directed information gathering to enable a significant reduction in system resources. Thus, of the possible input combinations, the EVI-based system can automatically determine which sources are required for real-time computation relating to a particular context.
Type: Grant
Filed: December 5, 2003
Date of Patent: January 18, 2011
Assignee: Microsoft Corporation
Inventors: Eric J. Horvitz, Nuria M. Oliver
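The greedy, one-step look-ahead selection the abstract describes can be sketched in a few lines: at each step, pick the observation whose expected value of information, net of its acquisition cost, is highest, and gather nothing when no observation pays for itself. The value and cost tables below are invented for illustration:

```python
# Hedged sketch of greedy one-step EVI selection: choose the observation
# whose expected value of information net of cost is highest. The value
# model (numbers per source) is an assumption for illustration.

def greedy_evi_step(candidates, expected_value, cost):
    """Return the best candidate observation, or None if none pays off.

    candidates: iterable of observation-source names
    expected_value: dict name -> expected gain from observing that source
    cost: dict name -> computational cost of making the observation
    """
    best, best_net = None, 0.0
    for c in candidates:
        net = expected_value[c] - cost[c]
        if net > best_net:       # strictly positive net value required
            best, best_net = c, net
    return best
```

Returning None when every candidate's net value is non-positive is what lets the system reduce resource use: expensive sources are simply not polled when they are not expected to change the inferred context enough to justify their cost.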
-
Patent number: 7868786
Abstract: A location history is a collection of locations over time for an object. A stay is a single instance of an object spending some time in one place, and a destination is any place where one or more objects have experienced a stay. Location histories are parsed using stays and destinations. In a described implementation, each location of a location history is recorded as a spatial position and a corresponding time at which the spatial position is acquired. Stays are extracted from a location history by analyzing locations thereof with regard to a temporal threshold and a spatial threshold. Specifically, two or more locations are considered a stay if they exceed a minimum stay duration and are within a maximum roaming distance. Each stay includes a location, a starting time, and an ending time. Destinations are produced from the extracted stays using a clustering operation and a predetermined scaling factor.
Type: Grant
Filed: October 19, 2004
Date of Patent: January 11, 2011
Assignee: Microsoft Corporation
Inventors: Kentaro Toyama, Ramaswamy Hariharan, Ross G. Cutler, John R. Douceur, Nuria M. Oliver, Eric K. Ringger, Daniel C. Robbins, Matthew T. Uyttendaele
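The stay-extraction rule stated in the abstract — consecutive locations form a stay if they span at least a minimum duration while remaining within a maximum roaming distance — can be sketched directly. The threshold values and Euclidean distance metric are assumptions; the patent does not fix them here:

```python
# Sketch of stay extraction per the abstract: a run of locations is a stay
# if it exceeds a minimum duration while staying within a maximum roaming
# distance. Thresholds and the distance metric are assumed for illustration.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def extract_stays(history, min_duration=300, max_roam=50.0):
    """history: list of ((x, y), timestamp) pairs sorted by time.
    Returns stays as (location, start_time, end_time) tuples."""
    stays, i = [], 0
    while i < len(history):
        j = i
        # Grow the window while points remain within roaming distance
        # of the anchor point that opened the candidate stay.
        while j + 1 < len(history) and distance(history[j + 1][0], history[i][0]) <= max_roam:
            j += 1
        start, end = history[i][1], history[j][1]
        if end - start >= min_duration:        # temporal threshold
            stays.append((history[i][0], start, end))
            i = j + 1
        else:
            i += 1
    return stays
```

Destinations would then be produced by clustering the stay locations, per the abstract's final step; that clustering is omitted from this sketch.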
-
Patent number: 7836002
Abstract: A system that can automatically narrow the search space or recognition scope within an activity-centric environment based upon a current activity or set of activities is provided. In addition, the activity and context data can also be used to rank the results of the recognition or search activity. In accordance with the domain scoping, natural language processing (NLP) as well as other types of conversion and recognition systems can dynamically adjust to the scope of the activity or group of activities, thereby increasing the recognition system's accuracy and usefulness. In operation, a user context, activity context, environment context and/or device profile can be employed to effectuate the scoping. As well, the system can combine context with extrinsic data, including but not limited to, calendar, profile, historical activity data, etc. in order to define the parameters for an appropriate scoping.
Type: Grant
Filed: June 27, 2006
Date of Patent: November 16, 2010
Assignee: Microsoft Corporation
Inventors: Steven W. Macbeth, Roland L. Fernandez, Brian R. Meyers, Desney S. Tan, George G. Robertson, Nuria M. Oliver, Oscar E. Murillo
-
Patent number: 7761393
Abstract: A system that can identify, create, update and/or process a workflow based upon a current, past or future activity is disclosed. A 'workflow' can be defined as an activity flow that includes interaction with, or assignment of work to, people, devices, or services by a single individual or a group of individuals. Once a workflow is determined in accordance with the innovation, the system can inform other users or groups that are performing, or intend to perform, a similar or like activity. In establishing the workflow, the innovation can operate in an ad hoc or authored manner. As well, the system can employ a combination of either ad hoc or authored mechanisms in establishment of the workflow.
Type: Grant
Filed: June 27, 2006
Date of Patent: July 20, 2010
Assignee: Microsoft Corporation
Inventors: Steven W. Macbeth, Roland L. Fernandez, Brian R. Meyers, Desney S. Tan, George G. Robertson, Nuria M. Oliver, Oscar E. Murillo, Elin R. Pedersen, Mary P. Czerwinski, Jeanine E. Spence
-
Publication number: 20100153399
Abstract: A framework is provided for obtaining window information. The window information can be applied to different assignment models to assign windows to different groups. A group may correspond to a task being performed by a user. The window information can be semantic or temporal information captured as window events and properties of windows whose events are captured. Temporal information can be information about switches between windows. Semantic information can be window titles. Temporal information, semantic information, or both, can be used to assign windows to groups.
Type: Application
Filed: February 26, 2010
Publication date: June 17, 2010
Applicant: MICROSOFT CORPORATION
Inventors: Nuria M. Oliver, Arungunram C. Surendran, Chintan S. Thakkar, Gregory R. Smith
-
Publication number: 20100151946
Abstract: A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system includes a 3-D imaging system that performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, performs a lookup for the user command associated with the captured image(s), and executes the user command(s) to effect control of the computer, programs, and connected devices.
Type: Application
Filed: June 30, 2009
Publication date: June 17, 2010
Inventors: Andrew D. Wilson, Nuria M. Oliver
-
Publication number: 20100146455
Abstract: Architecture for implementing a perceptual user interface. The architecture comprises alternative modalities for controlling computer application programs and manipulating on-screen objects through hand gestures or a combination of hand gestures and verbal commands. The perceptual user interface system includes a tracking component that detects object characteristics of at least one of a plurality of objects within a scene, and tracks the respective object. Detection of object characteristics is based at least in part upon image comparison of a plurality of images relative to a coarse mapping of the images. A seeding component iteratively seeds the tracking component with object hypotheses based upon the presence of the object characteristics and the image comparison. A filtering component selectively removes the tracked object from the object hypotheses and/or at least one object hypothesis from the set of object hypotheses based upon predetermined removal criteria.
Type: Application
Filed: February 12, 2010
Publication date: June 10, 2010
Applicant: Microsoft Corporation
Inventors: Andrew David Wilson, Nuria M. Oliver
-
Publication number: 20100146464
Abstract: Architecture for implementing a perceptual user interface. The architecture comprises alternative modalities for controlling computer application programs and manipulating on-screen objects through hand gestures or a combination of hand gestures and verbal commands. The perceptual user interface system includes a tracking component that detects object characteristics of at least one of a plurality of objects within a scene, and tracks the respective object. Detection of object characteristics is based at least in part upon image comparison of a plurality of images relative to a coarse mapping of the images. A seeding component iteratively seeds the tracking component with object hypotheses based upon the presence of the object characteristics and the image comparison. A filtering component selectively removes the tracked object from the object hypotheses and/or at least one object hypothesis from the set of object hypotheses based upon predetermined removal criteria.
Type: Application
Filed: February 12, 2010
Publication date: June 10, 2010
Applicant: Microsoft Corporation
Inventors: Andrew D. Wilson, Nuria M. Oliver
-
Publication number: 20100138798
Abstract: A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system includes a 3-D imaging system that performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, performs a lookup for the user command associated with the captured image(s), and executes the user command(s) to effect control of the computer, programs, and connected devices.
Type: Application
Filed: June 17, 2009
Publication date: June 3, 2010
Inventors: Andrew D. Wilson, Nuria M. Oliver