Patents by Inventor Oscar Murillo

Oscar Murillo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150193107
    Abstract: A method to decode natural user input from a human subject. The method includes detection of a gesture and concurrent grip state of the subject. If the grip state is closed during the gesture, then a user-interface (UI) canvas of the computer system is transformed based on the gesture. If the grip state is open during the gesture, then a UI object arranged on the UI canvas is activated based on the gesture.
    Type: Application
    Filed: January 9, 2014
    Publication date: July 9, 2015
    Applicant: MICROSOFT CORPORATION
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Sergio Paolantonio, Christian Klein, Oscar Murillo
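
As a rough illustration of the grip-gated routing described in the abstract for publication 20150193107 above, the Python sketch below transforms a hypothetical UI canvas when the grip is closed during a gesture and activates a UI object when the grip is open. The Gesture, Canvas, and decode names are invented for this example, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    dx: float          # horizontal hand displacement for the detected gesture
    dy: float          # vertical hand displacement for the detected gesture
    grip_closed: bool  # grip state detected concurrently with the gesture

class Canvas:
    """Hypothetical UI canvas that holds positioned UI objects."""
    def __init__(self):
        self.offset = [0.0, 0.0]

    def transform(self, dx, dy):
        # A closed-grip gesture transforms (here, pans) the canvas itself.
        self.offset[0] += dx
        self.offset[1] += dy

    def activate_object_at(self, dx, dy):
        # An open-grip gesture is routed to a UI object arranged on the canvas.
        print(f"activate object targeted by gesture ({dx:.2f}, {dy:.2f})")

def decode(gesture, canvas):
    """Route a detected gesture according to the concurrent grip state."""
    if gesture.grip_closed:
        canvas.transform(gesture.dx, gesture.dy)
    else:
        canvas.activate_object_at(gesture.dx, gesture.dy)

canvas = Canvas()
decode(Gesture(dx=0.2, dy=-0.1, grip_closed=True), canvas)   # closed grip: pan the canvas
decode(Gesture(dx=0.0, dy=0.0, grip_closed=False), canvas)   # open grip: activate an object
```
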
  • Patent number: 9063578
    Abstract: Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
    Type: Grant
    Filed: July 31, 2013
    Date of Patent: June 23, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
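
The mapping described in the abstract above can be pictured with a short sketch: a tracked 3D hand position inside a user-relative interaction zone is normalized to 2D cursor coordinates, and travel along z becomes a press amount. The zone bounds, the Phiz and map_hand_to_cursor names, and the press threshold below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Phiz:
    """Hypothetical physical interaction zone, in meters, positioned
    relative to the user so that it tracks the user's natural reach."""
    x_min: float = -0.35
    x_max: float = 0.35
    y_min: float = -0.25
    y_max: float = 0.25
    z_min: float = 0.10   # closest to the body
    z_max: float = 0.45   # fully extended toward the screen

def _clamp01(v):
    return max(0.0, min(1.0, v))

def map_hand_to_cursor(hand, phiz, screen_w, screen_h):
    """Map a 3D hand position (x, y, z) inside the PHIZ to a 2D cursor
    position plus a normalized press amount derived from z motion."""
    x, y, z = hand
    u = _clamp01((x - phiz.x_min) / (phiz.x_max - phiz.x_min))
    v = _clamp01((y - phiz.y_min) / (phiz.y_max - phiz.y_min))
    press = _clamp01((z - phiz.z_min) / (phiz.z_max - phiz.z_min))
    # Screen y grows downward, so the vertical axis is inverted.
    return int(u * screen_w), int((1.0 - v) * screen_h), press

cursor_x, cursor_y, press = map_hand_to_cursor((0.10, 0.05, 0.40), Phiz(), 1920, 1080)
print(cursor_x, cursor_y, "pressing" if press > 0.9 else "hovering")
```
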
  • Publication number: 20150070263
    Abstract: A system and method enabling dynamic interaction between users and displays. Interaction states for a user are determined by tracking the user's motion and position within the field of view of one or more capture devices. Interaction states are defined by any number of factors, including one or more of the user's body position and body orientation. Once a user occupies an interaction state, an associated application layout is applied to a display. Application layout states may include which application objects are displayed for a given interaction state. Triggering an application layout state is driven by a transition event and a determination that the user occupies an interaction state. Monitoring of user motion and position may be performed continuously, so that changes in interaction state can be detected and corresponding changes to the application layout state can be applied to the display, keeping the experience responsive to user movement.
    Type: Application
    Filed: September 9, 2013
    Publication date: March 12, 2015
    Applicant: Microsoft Corporation
    Inventors: Oscar Murillo, Richard Bailey
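
A small sketch of the idea in publication 20150070263 above, assuming invented state names, distance thresholds, and layouts: the tracked user's position and orientation are bucketed into an interaction state, and a layout is applied on each state transition.

```python
# Hypothetical layout states: which UI objects are shown for each interaction state.
LAYOUTS = {
    "far_ambient":  ["headline", "clock"],
    "mid_glance":   ["headline", "summary", "weather"],
    "near_engaged": ["full_article", "navigation", "related_items"],
}

def interaction_state(distance_m, facing_display):
    """Bucket a tracked user into an interaction state from position and orientation."""
    if not facing_display or distance_m > 4.0:
        return "far_ambient"
    if distance_m > 1.5:
        return "mid_glance"
    return "near_engaged"

def apply_layout(state, previous_state):
    """Apply the associated layout only on a transition event (a state change)."""
    if state != previous_state:
        print(f"transition -> {state}: showing {LAYOUTS[state]}")
    return state

# Continuous monitoring, here reduced to a few sampled frames.
previous = None
for distance, facing in [(5.0, False), (3.0, True), (1.0, True), (1.1, True)]:
    previous = apply_layout(interaction_state(distance, facing), previous)
```
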
  • Publication number: 20150035750
    Abstract: Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
    Type: Application
    Filed: July 31, 2013
    Publication date: February 5, 2015
    Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
  • Patent number: 8943420
    Abstract: The claimed subject matter relates to an architecture that can enhance an experience associated with indicia related to a local environment. In particular, the architecture can receive an image that depicts a view of the local environment including a set of entities represented in the image. One or more of the entities can be matched or correlated to modeled entities included in a geospatial model of the environment, potentially based upon location and direction, in order to scope or frame the view depicted in the image to a modeled view. In addition, the architecture can select additional content that can be presented. The additional content typically relates to services or data associated with modeled entities included in the geospatial model or associated with modeled entities included in an image-based data store.
    Type: Grant
    Filed: June 18, 2009
    Date of Patent: January 27, 2015
    Assignee: Microsoft Corporation
    Inventors: Flora P. Goldthwaite, Brett D. Brewer, Eric I-Chao Chang, Jonathan C. Cluts, Karim T. Farouki, Gary W. Flake, Janet Galore, Jason Garms, Abhiram G. Khune, Oscar Murillo, Sven Pleyer
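
As a greatly simplified sketch of the matching step described in the abstract above, the snippet below scopes a hypothetical geospatial model to entities that fall within the camera's location and viewing direction; the model data, the flat-earth bearing approximation, and the field-of-view value are all invented for illustration.

```python
import math

# Hypothetical geospatial model: modeled entities with position and linked content.
GEO_MODEL = [
    {"name": "clock_tower", "lat": 47.6205, "lon": -122.3493, "content": "history, visiting hours"},
    {"name": "fountain",    "lat": 47.6215, "lon": -122.3490, "content": "events schedule"},
]

def bearing_deg(lat1, lon1, lat2, lon2):
    """Approximate bearing from the camera to a modeled entity (flat-earth approximation)."""
    return math.degrees(math.atan2(lon2 - lon1, lat2 - lat1)) % 360

def match_entities(cam_lat, cam_lon, cam_heading_deg, fov_deg=60):
    """Scope the geospatial model to entities that fall inside the camera's view."""
    matches = []
    for entity in GEO_MODEL:
        b = bearing_deg(cam_lat, cam_lon, entity["lat"], entity["lon"])
        delta = min(abs(b - cam_heading_deg), 360 - abs(b - cam_heading_deg))
        if delta <= fov_deg / 2:          # entity lies within the modeled view
            matches.append(entity)
    return matches

for m in match_entities(47.6200, -122.3495, cam_heading_deg=10):
    print(m["name"], "->", m["content"])
```
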
  • Publication number: 20140282223
    Abstract: A user interface is output to a display device. If an element of a human subject is in a first conformation, the user interface scrolls responsive to movement of the element. If the element is in a second conformation, different than the first conformation, objects of the user interface are targeted responsive to movement of the element without scrolling the user interface.
    Type: Application
    Filed: March 13, 2013
    Publication date: September 18, 2014
    Applicant: Microsoft Corporation
    Inventors: David Bastien, Oscar Murillo, Mark Schwesinger
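
The two-conformation behavior in publication 20140282223 above can be sketched as a simple routing rule: movement scrolls the interface in one conformation and targets objects without scrolling in the other. The conformation labels and the dictionary-based UI state are hypothetical.

```python
def handle_movement(conformation, dx, dy, ui):
    """Route element movement according to the element's conformation."""
    if conformation == "first_conformation":      # e.g. closed hand
        # The user interface scrolls with the movement.
        ui["scroll_offset"] = (ui["scroll_offset"][0] + dx,
                               ui["scroll_offset"][1] + dy)
    elif conformation == "second_conformation":   # e.g. open hand
        # Objects are targeted by the movement; the UI does not scroll.
        ui["target"] = (ui["target"][0] + dx, ui["target"][1] + dy)
    return ui

ui = {"scroll_offset": (0, 0), "target": (0, 0)}
ui = handle_movement("first_conformation", 5, 0, ui)    # scrolls
ui = handle_movement("second_conformation", 0, 3, ui)   # targets without scrolling
print(ui)
```
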
  • Publication number: 20140225820
    Abstract: A natural user input (NUI) system to provide user input to a computer system. The NUI system includes a logic machine and an instruction-storage machine. The instruction-storage machine holds instructions that, when executed by the logic machine, cause the logic machine to detect an engagement gesture from a human subject or to compute an engagement metric reflecting the degree of the subject's engagement. The instructions also cause the logic machine to direct gesture-based user input from the subject to the computer system as soon as the engagement gesture is detected or the engagement metric exceeds a threshold.
    Type: Application
    Filed: February 11, 2013
    Publication date: August 14, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Mark Schwesinger, Eduardo Escardo Raffo, Oscar Murillo, David Bastien, Matthew H. Ahn, Mauro Giusti, Kevin Endres, Christian Klein, Julia Schwarz, Charles Claudius Marais
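
A compact sketch of the gating behavior described above: gesture input is directed to the system only once an explicit engagement gesture is detected or a computed engagement metric crosses a threshold. The cues, weights, and threshold value here are invented, not taken from the application.

```python
def engagement_metric(facing_camera, hand_raised, distance_m):
    """Hypothetical engagement score in [0, 1] combining simple tracked cues."""
    score = 0.0
    score += 0.5 if facing_camera else 0.0
    score += 0.3 if hand_raised else 0.0
    score += 0.2 if distance_m < 2.5 else 0.0
    return score

def should_direct_input(engagement_gesture_detected, metric, threshold=0.7):
    """Direct gesture-based input to the computer system once either condition is met."""
    return engagement_gesture_detected or metric >= threshold

m = engagement_metric(facing_camera=True, hand_raised=True, distance_m=3.0)
print(should_direct_input(engagement_gesture_detected=False, metric=m))  # True: 0.8 >= 0.7
```
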
  • Publication number: 20140173524
    Abstract: A cursor is moved in a user interface based on a position of a joint of a virtual skeleton modeling a human subject. If a cursor position engages an object in the user interface, and all immediately-previous cursor positions within a mode-testing period are located within a timing boundary centered around the cursor position, operation in a pressing mode commences. If a cursor position remains within a constraining shape and exceeds a threshold z-distance while in the pressing mode, the object is activated.
    Type: Application
    Filed: December 14, 2012
    Publication date: June 19, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Mark Schwesinger, David Bastien, Oscar Murillo, Oscar Kozlowski, Richard Bailey, Julia Schwarz
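
A rough sketch of the press-detection flow in publication 20140173524 above, under invented radii, frame counts, and thresholds: the cursor must dwell inside a small timing boundary for a mode-testing period before pressing mode starts, and activation then requires staying within a constraining region while exceeding a z-distance threshold.

```python
from collections import deque

MODE_TEST_FRAMES = 10       # hypothetical mode-testing period, in frames
TIMING_RADIUS = 0.03        # timing boundary around the current cursor position
CONSTRAINT_RADIUS = 0.05    # constraining region allowed while pressing
PRESS_Z_THRESHOLD = 0.15    # z travel required to activate the engaged object

class PressDetector:
    def __init__(self):
        self.history = deque(maxlen=MODE_TEST_FRAMES)
        self.pressing = False
        self.press_origin = None

    def update(self, x, y, z):
        """Feed one cursor sample; returns 'activated' when a press completes."""
        if not self.pressing:
            self.history.append((x, y))
            steady = (len(self.history) == MODE_TEST_FRAMES and
                      all(abs(px - x) <= TIMING_RADIUS and abs(py - y) <= TIMING_RADIUS
                          for px, py in self.history))
            if steady:                        # cursor stayed inside the timing boundary
                self.pressing = True
                self.press_origin = (x, y, z)
            return "targeting"
        ox, oy, oz = self.press_origin
        if abs(x - ox) > CONSTRAINT_RADIUS or abs(y - oy) > CONSTRAINT_RADIUS:
            self.pressing = False             # left the constraining region: abort the press
            self.history.clear()
            return "targeting"
        if z - oz > PRESS_Z_THRESHOLD:        # pushed far enough along z: activate
            self.pressing = False
            self.history.clear()
            return "activated"
        return "pressing"

detector = PressDetector()
for i in range(18):
    state = detector.update(0.50, 0.50, 0.02 * i)  # hold still while pushing toward the screen
print(state)   # 'activated'
```
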
  • Publication number: 20140132499
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Application
    Filed: November 12, 2012
    Publication date: May 15, 2014
    Applicant: MICROSOFT CORPORATION
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
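
The adjustment described above can be sketched in a few lines: a distance derived from depth data selects features of the displayed interface, such as text scale and the amount of detail. The distance tiers and feature values below are assumptions for illustration.

```python
def ui_features_for_distance(distance_m):
    """Pick UI features from the user's distance to the display device."""
    if distance_m > 3.5:
        return {"font_scale": 2.0, "detail": "minimal"}
    if distance_m > 1.5:
        return {"font_scale": 1.4, "detail": "summary"}
    return {"font_scale": 1.0, "detail": "full"}

def locate_user_distance(depth_frame):
    """Stand-in for locating the user in a depth frame: here, the nearest sample."""
    return min(depth_frame)

depth_frame = [4.1, 3.8, 2.9, 2.7, 3.0]          # hypothetical per-pixel depths (meters)
print(ui_features_for_distance(locate_user_distance(depth_frame)))
```
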
  • Publication number: 20100332313
    Abstract: The claimed subject matter provides a system and/or a method that facilitates user-selectable advertising networks. Advertising content can be formed into cohesive subsets of advertising. These subsets can be related to criteria to facilitate selection among the available subsets of advertising content. A selection component can facilitate selection of the available subsets of advertising content based on these criteria. The criteria can be related to user preferences; further, the criteria can relate to explicit user preferences such as opt-in or opt-out indicia. Where the user selects among advertising networks, more relevant advertising content can be presented.
    Type: Application
    Filed: June 25, 2009
    Publication date: December 30, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: John M. Miller, Janet Galore, Alexander Gounares, Eric Horvitz, Karim Farouki, Patrick Nguyen, Brett Brewer, Jayaram N.M. Nanduri, Milind Mahajan, Oscar Murillo
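
As a loose illustration of the selection idea above, the snippet filters hypothetical subsets of advertising content against explicit user preferences, honoring opt-in and opt-out indications; the subset names and preference structure are invented.

```python
# Hypothetical cohesive subsets of advertising content, keyed by category.
AD_SUBSETS = {
    "outdoor_gear": ["ad_1", "ad_2"],
    "electronics":  ["ad_3"],
    "travel":       ["ad_4", "ad_5"],
}

def select_subsets(user_prefs):
    """Return only the advertising subsets the user has explicitly opted into."""
    return {name: ads for name, ads in AD_SUBSETS.items()
            if user_prefs.get(name) == "opt_in"}

prefs = {"outdoor_gear": "opt_in", "electronics": "opt_out", "travel": "opt_in"}
print(select_subsets(prefs))
```
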
  • Publication number: 20100325563
    Abstract: The claimed subject matter relates to an architecture that can enhance an experience associated with indicia related to a local environment. In particular, the architecture can receive an image that depicts a view of the local environment including a set of entities represented in the image. One or more of the entities can be matched or correlated to modeled entities included in a geospatial model of the environment, potentially based upon location and direction, in order to scope or frame the view depicted in the image to a modeled view. In addition, the architecture can select additional content that can be presented. The additional content typically relates to services or data associated with modeled entities included in the geospatial model or associated with modeled entities included in an image-based data store.
    Type: Application
    Filed: June 18, 2009
    Publication date: December 23, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: Flora P. Goldthwaite, Brett D. Brewer, Eric I-Chao Chang, Jonathan C. Cluts, Karim T. Farouki, Gary W. Flake, Janet Galore, Jason Garms, Abhiram G. Khune, Oscar Murillo, Sven Pleyer
  • Publication number: 20070218955
    Abstract: A portable wireless-enabled system includes an input transducer (for example, a microphone), an output transducer (for example, a speaker), and a wireless transceiver system. Also included is a memory having a programmable user speech profile. A processor system controls operation of the input transducer, the output transducer, the wireless transceiver system, and the memory.
    Type: Application
    Filed: March 17, 2006
    Publication date: September 20, 2007
    Applicant: Microsoft Corporation
    Inventors: Daniel Cook, David Mowatt, Oliver Scholz, Oscar Murillo, Robert Chambers, Ryan Bickel
  • Publication number: 20070219802
    Abstract: A method of interacting with a speech recognition (SR)-enabled personal computer (PC) is provided in which a user SR profile is transferred from a wireless-enabled device to the SR-enabled PC. Interaction with SR applications, on the SR-enabled PC, is carried out by transmitting speech signals wirelessly to the SR-enabled PC. The transmitted speech signals are recognized with the help of the transferred user SR profile.
    Type: Application
    Filed: March 17, 2006
    Publication date: September 20, 2007
    Applicant: Microsoft Corporation
    Inventors: Daniel Cook, David Mowatt, Oliver Scholz, Oscar Murillo
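
A toy sketch of the interaction described in publication 20070219802 above: a user speech-recognition profile is transferred from a device to the PC, and subsequently transmitted speech is recognized with the help of that profile. The profile contents and the word-matching recognizer are stand-ins, not an actual recognition pipeline.

```python
class SpeechRecognizerPC:
    """Stand-in for the SR-enabled personal computer."""
    def __init__(self):
        self.user_profile = None

    def receive_profile(self, profile):
        # Step 1: the wireless-enabled device transfers the user's SR profile to the PC.
        self.user_profile = profile

    def recognize(self, speech_signal):
        # Step 2: wirelessly transmitted speech is recognized with the help of the profile.
        if self.user_profile is None:
            raise RuntimeError("no user SR profile transferred yet")
        vocabulary = self.user_profile["custom_vocabulary"]
        return [word for word in speech_signal.split() if word in vocabulary]

pc = SpeechRecognizerPC()
pc.receive_profile({"user": "oscar", "custom_vocabulary": {"open", "mail", "calendar"}})
print(pc.recognize("please open calendar"))   # ['open', 'calendar']
```
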
  • Publication number: 20060111916
    Abstract: A system and method for positioning a software User Interface (UI) window on a display screen are provided, wherein the method includes displaying the software UI window on the display screen and identifying at least one suitable location on the display screen responsive to an active target window area of a target application UI window. The method further includes determining whether the software UI window is disposed at the at least one suitable location and, if the software UI window is disposed in a location other than the at least one suitable location, positioning the software UI window at the at least one suitable location on the display screen.
    Type: Application
    Filed: November 24, 2004
    Publication date: May 25, 2006
    Applicant: Microsoft Corporation
    Inventors: Robert Chambers, David Mowatt, Oscar Murillo
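
A minimal sketch of the placement rule described above, assuming a rectangle-based layout: candidate screen locations are tested against the active area of the target application's window, and the software UI window is repositioned only if it is not already at a suitable location. The corner candidates and overlap test are illustrative choices.

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rectangles are (x, y, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def suitable_locations(screen, ui_size, active_target_area):
    """Screen corners where the UI window would not cover the active target area."""
    sw, sh = screen
    w, h = ui_size
    corners = [(0, 0), (sw - w, 0), (0, sh - h), (sw - w, sh - h)]
    return [c for c in corners if not overlaps((c[0], c[1], w, h), active_target_area)]

def position_ui_window(current_pos, screen, ui_size, active_target_area):
    """Leave the window alone if it already sits at a suitable location; otherwise move it."""
    candidates = suitable_locations(screen, ui_size, active_target_area)
    if not candidates or current_pos in candidates:
        return current_pos
    return candidates[0]

# 1920x1080 screen, 300x200 UI window, target window's active area near the top-left.
print(position_ui_window((0, 0), (1920, 1080), (300, 200), (50, 50, 800, 600)))
```
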
  • Publication number: 20060074687
    Abstract: A numbering scheme is disclosed for implementation in the context of an application display. A user is able to select an item on the display by speaking a number corresponding to a desired control item. In some cases, the screen can include so many numbers that the user loses context and is unable to identify which number they want to select. For this reason, in one embodiment, a temporal switching mechanism is implemented wherein periodic switches (e.g., second-long intervals) occur between showing numbered items and showing a non-numbered screen. In one embodiment, an optional secondary confirmation step is implemented wherein the user sees only the item they just selected and has the chance to (a) learn the programmatic name of the item they selected and/or (b) either confirm and proceed with their selection, or cancel. In one embodiment, the optional secondary confirmation step is omitted if the user speaks a number followed by a predetermined command word.
    Type: Application
    Filed: September 24, 2004
    Publication date: April 6, 2006
    Applicant: Microsoft Corporation
    Inventors: Ryan Bickel, Oscar Murillo, David Mowatt
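
A small sketch of the numbering scheme and temporal switching described in publication 20060074687 above: the display alternates between a numbered overlay and the plain screen at a fixed interval, a spoken number selects the corresponding item, and a confirmation step follows unless a command word accompanies the number. The interval, item names, and command words are invented.

```python
import itertools

ITEMS = ["File", "Edit", "View", "Help"]          # hypothetical selectable controls
COMMAND_WORDS = {"click", "press"}                # speaking these skips confirmation

def overlay_frames(interval_s=1.0):
    """Alternate between the numbered overlay and the plain (non-numbered) screen."""
    for show_numbers in itertools.cycle([True, False]):
        if show_numbers:
            yield interval_s, [f"{i + 1}: {name}" for i, name in enumerate(ITEMS)]
        else:
            yield interval_s, list(ITEMS)

def handle_utterance(utterance):
    """Resolve a spoken number, with an optional secondary confirmation step."""
    words = utterance.lower().split()
    number = int(words[0])
    item = ITEMS[number - 1]
    if len(words) > 1 and words[1] in COMMAND_WORDS:
        return f"selected {item} (confirmation skipped by command word)"
    return f"showing only '{item}' for confirmation: say 'ok' to proceed or 'cancel'"

frames = overlay_frames()
for _ in range(2):
    duration, screen = next(frames)
    print(f"show for {duration}s: {screen}")
print(handle_utterance("3"))
print(handle_utterance("2 click"))
```
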