Patents by Inventor Oscar Murillo

Oscar Murillo has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11743215
    Abstract: Aspects of the present disclosure are directed to an XR messaging system that can conduct a message thread between multiple users, where individual messages can be designated for delivery to particular artificial reality locations or devices. When sending a message, a user can choose to send the message to a particular destination associated with one or more other users on the message thread. When such a destination selection is made, the message can be formatted for viewing at the selected destination by applying a template, to the message, selected based on the template being configured for the types of data defined in the message and for the type of the destination.
    Type: Grant
    Filed: June 28, 2021
    Date of Patent: August 29, 2023
    Inventors: Oscar Murillo, Fang-Yu Yang, Annika Rodrigues
  • Publication number: 20230072623
    Abstract: Aspects of the present disclosure are directed to an artificial reality capture and sharing system. The artificial reality capture and sharing system can provide an output view showing a world-view from an artificial reality device or a view of the user's point-of-view. The world-view can show the complete surrounding area that is being captured by the artificial reality capture and sharing system, whether or not the user of the artificial reality device is viewing that portion of the surroundings. The point-of-view version can show the portion of the surrounding area that is in the artificial reality device's display area. The artificial reality capture and sharing system can also apply filters to people depicted in its captured sensor data. This can include applying filters to identified users in on-device or shared output views or to live views of people in the surrounding area of the artificial reality device.
    Type: Application
    Filed: September 3, 2021
    Publication date: March 9, 2023
    Inventors: Andrea Zeller, Oscar Murillo, Matthew James Schoenholz
  • Publication number: 20230045759
    Abstract: A 3D calling system can provide 3D calls in various modes according to transitions and can provide affordances (i.e., visual or auditory cues) to improve 3D call image capturing. The 3D calling system of a recipient in a 3D call can display a hologram (from images captured by an external capture device (ECD)) or avatar of a sending call participant in a variety of ways, such as by making them "world-locked," "ECD-locked," or "body-locked." The selection of a 3D call mode can be based on factors such as whether an ECD is active, whether the ECD is in motion, and user selections. In various cases, the 3D calling system can trigger various affordances to improve the quality of the images captured by the sender's ECD, such as a displayed virtual object and/or an auditory cue, signaling to the user that the current ECD configuration is non-optimal and/or providing instructions for an improved ECD configuration.
    Type: Application
    Filed: August 4, 2021
    Publication date: February 9, 2023
    Inventors: Jing Ma, Paul Armistead Hoover, Oscar Murillo, Fang-Yu Yang
  • Patent number: 11099637
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Grant
    Filed: August 23, 2019
    Date of Patent: August 24, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
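The method in the abstract above (locate the user in depth-camera data, measure their distance from the display, and adjust UI features accordingly) can be sketched in a few lines. This is purely an illustrative sketch: the function names, the nearest-blob stand-in for skeletal tracking, and the distance breakpoints are all invented here, not taken from the patent.

```python
# Illustrative sketch of distance-driven UI adjustment. A real system would
# run person detection/skeletal tracking on the depth frame; here the nearest
# non-zero depth value stands in for the located user.

def locate_user(depth_frame):
    """Return the depth (meters) of the nearest valid sample, or None."""
    valid = [d for row in depth_frame for d in row if d > 0]
    return min(valid) if valid else None

def adjust_ui(distance_m):
    """Map user distance to UI feature sizes (breakpoints are assumptions)."""
    if distance_m is None or distance_m < 1.0:
        return {"font_pt": 12, "button_px": 48}   # close (or unknown): fine detail
    if distance_m < 2.5:
        return {"font_pt": 18, "button_px": 72}   # mid-range
    return {"font_pt": 28, "button_px": 120}      # far: large targets

frame = [[0.0, 3.2], [1.8, 2.9]]   # toy 2x2 depth frame, meters
print(adjust_ui(locate_user(frame)))   # -> {'font_pt': 18, 'button_px': 72}
```

The key property claimed is the feedback loop itself (depth in, UI features out), not any particular breakpoint values.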
  • Publication number: 20190377408
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Application
    Filed: August 23, 2019
    Publication date: December 12, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Patent number: 10489984
    Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
    Type: Grant
    Filed: June 26, 2018
    Date of Patent: November 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
  • Patent number: 10394314
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Grant
    Filed: August 3, 2016
    Date of Patent: August 27, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Publication number: 20180300955
    Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
    Type: Application
    Filed: June 26, 2018
    Publication date: October 18, 2018
    Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
  • Patent number: 10032314
    Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
    Type: Grant
    Filed: October 11, 2016
    Date of Patent: July 24, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
  • Patent number: 9971491
    Abstract: A method to decode natural user input from a human subject. The method includes detection of a gesture and concurrent grip state of the subject. If the grip state is closed during the gesture, then a user-interface (UI) canvas of the computer system is transformed based on the gesture. If the grip state is open during the gesture, then a UI object arranged on the UI canvas is activated based on the gesture.
    Type: Grant
    Filed: January 9, 2014
    Date of Patent: May 15, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Sergio Paolantonio, Christian Klein, Oscar Murillo
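The decode rule in the abstract above is a simple two-way branch on grip state: closed grip transforms the canvas, open grip activates the targeted object. A minimal sketch (all names and the pan-only transform are illustrative assumptions, not the patented implementation):

```python
# Illustrative sketch: route a detected gesture based on concurrent grip state.

def decode_gesture(grip_state, gesture_delta, canvas_offset, target_object):
    """Decode natural user input per the claimed rule.

    grip_state    -- "closed" or "open" during the gesture
    gesture_delta -- (dx, dy) hand movement over the gesture
    canvas_offset -- current (x, y) offset of the UI canvas
    target_object -- UI object under the hand
    """
    if grip_state == "closed":
        # Closed grip: transform (here, pan) the UI canvas by the gesture.
        dx, dy = gesture_delta
        x, y = canvas_offset
        return ("canvas_transformed", (x + dx, y + dy))
    # Open grip: activate the UI object arranged on the canvas.
    return ("object_activated", target_object)

print(decode_gesture("closed", (10, -5), (0, 0), "play_button"))
# -> ('canvas_transformed', (10, -5))
print(decode_gesture("open", (10, -5), (0, 0), "play_button"))
# -> ('object_activated', 'play_button')
```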
  • Publication number: 20180101988
    Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
    Type: Application
    Filed: October 11, 2016
    Publication date: April 12, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
  • Patent number: 9785228
    Abstract: An NUI system to provide user input to a computer system. The NUI system includes a logic machine and an instruction-storage machine. The instruction-storage machine holds instructions that, when executed by the logic machine, cause the logic machine to detect an engagement gesture from a human subject or to compute an engagement metric reflecting the degree of the subject's engagement. The instructions also cause the logic machine to direct gesture-based user input from the subject to the computer system as soon as the engagement gesture is detected or the engagement metric exceeds a threshold.
    Type: Grant
    Filed: February 11, 2013
    Date of Patent: October 10, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Eduardo Escardo Raffo, Oscar Murillo, David Bastien, Matthew H. Ahn, Mauro Giusti, Kevin Endres, Christian Klein, Julia Schwarz, Charles Claudius Marais
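The gating described in this abstract (forward gesture input only once an engagement gesture is detected or an engagement metric crosses a threshold) can be sketched as follows. The cue list, weights, and threshold here are invented for illustration; the patent does not prescribe them.

```python
# Illustrative sketch of engagement-gated NUI input.

def engagement_metric(facing_screen, hand_raised, speaking_to_system):
    """Combine simple binary cues into a 0..1 engagement score (weights assumed)."""
    return 0.5 * facing_screen + 0.3 * hand_raised + 0.2 * speaking_to_system

def route_input(gesture, engagement_gesture_seen, metric, threshold=0.6):
    """Forward gesture-based input only for an engaged subject."""
    if engagement_gesture_seen or metric >= threshold:
        return ("forwarded", gesture)
    return ("ignored", None)

m = engagement_metric(facing_screen=1, hand_raised=1, speaking_to_system=0)
print(m)                                     # -> 0.8
print(route_input("swipe_left", False, m))   # -> ('forwarded', 'swipe_left')
```

Either condition alone suffices, matching the "as soon as ... or ..." phrasing of the claim.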
  • Publication number: 20160342203
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Application
    Filed: August 3, 2016
    Publication date: November 24, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Patent number: 9423939
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Grant
    Filed: November 12, 2012
    Date of Patent: August 23, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Patent number: 9383894
    Abstract: Embodiments are disclosed that relate to providing feedback for a level of completion of a user gesture via a cursor displayed on a user interface. One disclosed embodiment provides a method comprising displaying a cursor having a visual property and moving a screen-space position of the cursor responsive to the user gesture. The method further comprises changing the visual property of the cursor in proportion to a level of completion of the user gesture. In this way, the level of completion of the user gesture may be presented to the user in a location to which the attention of the user is directed during performance of the gesture.
    Type: Grant
    Filed: January 8, 2014
    Date of Patent: July 5, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Christian Klein, Oscar Murillo, Sergio Paolantonio
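The feedback scheme in this abstract (a cursor visual property changing in proportion to gesture completion) can be sketched with a ring-fill property. The particular property and field names are assumptions for illustration only:

```python
# Illustrative sketch: the cursor's ring fill tracks gesture completion, so
# progress feedback appears where the user is already looking.

def cursor_state(screen_x, screen_y, gesture_completion):
    """Return a cursor description; gesture_completion is clamped to [0, 1]."""
    c = min(max(gesture_completion, 0.0), 1.0)
    return {
        "pos": (screen_x, screen_y),
        "ring_fill": c,          # visual property proportional to completion
        "committed": c >= 1.0,   # the gesture commits at full completion
    }

print(cursor_state(400, 300, 0.5))
# -> {'pos': (400, 300), 'ring_fill': 0.5, 'committed': False}
```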
  • Patent number: 9342230
    Abstract: A user interface is output to a display device. If an element of a human subject is in a first conformation, the user interface scrolls responsive to movement of the element. If the element is in a second conformation, different than the first conformation, objects of the user interface are targeted responsive to movement of the element without scrolling the user interface.
    Type: Grant
    Filed: March 13, 2013
    Date of Patent: May 17, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David Bastien, Oscar Murillo, Mark Schwesinger
  • Patent number: 9342160
    Abstract: Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
    Type: Grant
    Filed: May 26, 2015
    Date of Patent: May 17, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
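The core mapping in the PHIZ abstract above — normalize the tracked hand position inside a user-relative 3D zone, drive the 2D cursor from x/y, and treat motion along z as a press — can be sketched directly. Zone bounds, screen size, and the press threshold below are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch of PHIZ-style 3D-to-2D cursor mapping.

def phiz_to_cursor(hand_xyz, zone_min, zone_max, screen=(1920, 1080),
                   press_depth=0.8):
    """Map a 3D hand position in the PHIZ to (cursor_x, cursor_y, pressed).

    The position is clamped into the zone so the cursor stays on screen
    even when the hand strays slightly outside the ergonomic volume.
    """
    def norm(v, lo, hi):
        return min(max((v - lo) / (hi - lo), 0.0), 1.0)

    nx = norm(hand_xyz[0], zone_min[0], zone_max[0])
    ny = norm(hand_xyz[1], zone_min[1], zone_max[1])
    nz = norm(hand_xyz[2], zone_min[2], zone_max[2])
    # x/y drive the cursor (y flipped to screen coordinates); pushing past
    # press_depth along z counts as a "press" interaction.
    return (round(nx * screen[0]), round((1 - ny) * screen[1]),
            nz >= press_depth)

# Hand centered in a 60 x 40 x 30 cm zone in front of the user:
zone_lo, zone_hi = (-0.3, -0.2, 0.0), (0.3, 0.2, 0.3)
print(phiz_to_cursor((0.0, 0.0, 0.15), zone_lo, zone_hi))
# -> (960, 540, False)
```

The ergonomic shaping claimed in the patent would show up in how `zone_min`/`zone_max` (and, more generally, a curved zone surface) are fitted to the user's natural range of motion.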
  • Publication number: 20150370349
    Abstract: Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
    Type: Application
    Filed: May 26, 2015
    Publication date: December 24, 2015
    Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
  • Publication number: 20150199017
    Abstract: A method to be enacted in a computer system operatively coupled to a vision system and to a listening system. The method applies natural user input to control the computer system. It includes the acts of detecting verbal and non-verbal touchless input from a user of the computer system, selecting one of a plurality of user-interface objects based on coordinates derived from the non-verbal, touchless input, decoding the verbal input to identify a selected action from among a plurality of actions supported by the selected object, and executing the selected action on the selected object.
    Type: Application
    Filed: January 10, 2014
    Publication date: July 16, 2015
    Applicant: Microsoft Corporation
    Inventors: Oscar Murillo, Lisa Stifelman, Margaret Song, David Bastien, Mark Schwesinger
  • Publication number: 20150193124
    Abstract: Embodiments are disclosed that relate to providing feedback for a level of completion of a user gesture via a cursor displayed on a user interface. One disclosed embodiment provides a method comprising displaying a cursor having a visual property and moving a screen-space position of the cursor responsive to the user gesture. The method further comprises changing the visual property of the cursor in proportion to a level of completion of the user gesture. In this way, the level of completion of the user gesture may be presented to the user in a location to which the attention of the user is directed during performance of the gesture.
    Type: Application
    Filed: January 8, 2014
    Publication date: July 9, 2015
    Applicant: Microsoft Corporation
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Christian Klein, Oscar Murillo, Sergio Paolantonio