Patents by Inventor Oscar Murillo
Oscar Murillo has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11743215
Abstract: Aspects of the present disclosure are directed to an XR messaging system that can conduct a message thread between multiple users, where individual messages can be designated for delivery to particular artificial reality locations or devices. When sending a message, a user can choose to send the message to a particular destination associated with one or more other users on the message thread. When such a destination selection is made, the message can be formatted for viewing at the selected destination by applying, to the message, a template selected based on the template being configured for the types of data defined in the message and for the type of the destination.
Type: Grant
Filed: June 28, 2021
Date of Patent: August 29, 2023
Inventors: Oscar Murillo, Fang-Yu Yang, Annika Rodrigues
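The template-selection step this abstract describes — choosing a template configured both for the message's data types and for the destination's type — can be sketched as follows. This is an illustrative reconstruction; all class, field, and template names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Template:
    name: str
    data_types: frozenset    # message data types this template can render
    destination_type: str    # e.g. "headset", "smart_display", "phone"

@dataclass
class Message:
    data_types: frozenset    # e.g. {"text", "image"}
    destination_type: str

def select_template(message, templates):
    """Pick a template configured for the message's data types
    and for the destination's type."""
    for t in templates:
        if (t.destination_type == message.destination_type
                and message.data_types <= t.data_types):
            return t
    return None

templates = [
    Template("text-card", frozenset({"text"}), "headset"),
    Template("media-panel", frozenset({"text", "image"}), "headset"),
]
msg = Message(frozenset({"text", "image"}), "headset")
print(select_template(msg, templates).name)  # media-panel
```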
-
Publication number: 20230072623
Abstract: Aspects of the present disclosure are directed to an artificial reality capture and sharing system. The artificial reality capture and sharing system can provide an output view showing a world-view from an artificial reality device or a view of the user's point-of-view. The world-view can show the complete surrounding area that is being captured by the artificial reality capture and sharing system, whether or not the user of the artificial reality device is viewing that portion of the surroundings. The point-of-view version can show the portion of the surrounding area that is in the artificial reality device's display area. The artificial reality capture and sharing system can also apply filters to people depicted in its captured sensor data. This can include applying filters to identified users in on-device or shared output views or to live views of people in the surrounding area of the artificial reality device.
Type: Application
Filed: September 3, 2021
Publication date: March 9, 2023
Inventors: Andrea Zeller, Oscar Murillo, Matthew James Schoenholz
-
Publication number: 20230045759
Abstract: A 3D calling system can provide 3D calls in various modes according to transitions and can provide affordances (i.e., visual or auditory cues) to improve 3D call image capturing. The 3D calling system of a recipient in a 3D call can display a hologram (from images captured by an external capture device (ECD)) or avatar of a sending call participant in a variety of ways, such as by making them "world-locked," "ECD-locked," or "body-locked." The selection of a 3D call mode can be based on factors such as whether an ECD is active, whether the ECD is in motion, and user selections. In various cases, the 3D calling system can trigger various affordances to improve the quality of the images captured by the sender's ECD, such as a displayed virtual object and/or an auditory cue, either signaling to the user that a current ECD configuration is non-optimal and/or providing instructions for an improved ECD configuration.
Type: Application
Filed: August 4, 2021
Publication date: February 9, 2023
Inventors: Jing Ma, Paul Armistead Hoover, Oscar Murillo, Fang-Yu Yang
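The abstract names three factors for selecting a 3D call mode: whether an ECD is active, whether it is in motion, and user selections. A minimal sketch of one plausible rule set, assuming a priority ordering (explicit user choice first, then device state) that the patent does not specify:

```python
def select_call_mode(ecd_active, ecd_in_motion, user_choice=None):
    """Pick a display mode for the remote participant's hologram/avatar."""
    if user_choice is not None:
        return user_choice        # an explicit user selection wins
    if not ecd_active:
        return "body-locked"      # no capture device: anchor to the viewer
    if ecd_in_motion:
        return "body-locked"      # unstable capture: don't anchor to the ECD
    return "world-locked"         # stable ECD: pin the hologram in the room
```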
-
Patent number: 11099637
Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
Type: Grant
Filed: August 23, 2019
Date of Patent: August 24, 2021
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
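The method in this abstract reduces to: derive a user-to-display distance from depth data, then adjust UI features as a function of that distance. A hedged sketch of the final step — the distance bands and scale factors below are illustrative assumptions, not values from the patent:

```python
def ui_scale_for_distance(distance_m):
    """Return a UI scale factor (e.g. for text and touch targets)
    that grows with the user's distance from the display."""
    if distance_m < 1.0:
        return 1.0    # near: default density
    if distance_m < 2.5:
        return 1.5    # mid-range: larger targets
    return 2.0        # far: simplified, large-format UI
```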
-
Publication number: 20190377408
Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
Type: Application
Filed: August 23, 2019
Publication date: December 12, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
-
Patent number: 10489984
Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
Type: Grant
Filed: June 26, 2018
Date of Patent: November 26, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
-
Patent number: 10394314
Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
Type: Grant
Filed: August 3, 2016
Date of Patent: August 27, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
-
Publication number: 20180300955
Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
Type: Application
Filed: June 26, 2018
Publication date: October 18, 2018
Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
-
Patent number: 10032314
Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
Type: Grant
Filed: October 11, 2016
Date of Patent: July 24, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
-
Patent number: 9971491
Abstract: A method to decode natural user input from a human subject. The method includes detection of a gesture and concurrent grip state of the subject. If the grip state is closed during the gesture, then a user-interface (UI) canvas of the computer system is transformed based on the gesture. If the grip state is open during the gesture, then a UI object arranged on the UI canvas is activated based on the gesture.
Type: Grant
Filed: January 9, 2014
Date of Patent: May 15, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Sergio Paolantonio, Christian Klein, Oscar Murillo
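The decode rule in this abstract branches on grip state: a closed grip during a gesture transforms the UI canvas (here sketched as panning), while an open grip activates the targeted UI object. A minimal illustration, with the canvas/target representations assumed for the example:

```python
def decode_gesture(grip_closed, gesture_delta, canvas, target):
    """Route a gesture by concurrent grip state: closed grip moves the
    canvas; open grip activates the object under the gesture."""
    if grip_closed:
        ox, oy = canvas["offset"]
        canvas["offset"] = (ox + gesture_delta[0], oy + gesture_delta[1])
        return "canvas-transformed"
    target["active"] = True
    return "object-activated"
```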
-
Publication number: 20180101988
Abstract: A virtual reality headset system having a region configured to change its opacity is disclosed. The virtual reality headset system is configured to provide an immersive experience, but also to allow a user to see at least some portion of the outside world, at least some of the time. The headset has a casing that partially surrounds a display. The casing has an opening configured to receive the user's face. The casing has a region between the display and the user's face configured to change a degree of opacity. The region may be a window located in a non-front facing or lateral side of the casing.
Type: Application
Filed: October 11, 2016
Publication date: April 12, 2018
Applicant: Microsoft Technology Licensing, LLC
Inventors: Oscar Murillo, Roger Ibars Martinez, Jean-Louis Villecroze, Nema Rao
-
Patent number: 9785228
Abstract: An NUI system to provide user input to a computer system. The NUI system includes a logic machine and an instruction-storage machine. The instruction-storage machine holds instructions that, when executed by the logic machine, cause the logic machine to detect an engagement gesture from a human subject or to compute an engagement metric reflecting the degree of the subject's engagement. The instructions also cause the logic machine to direct gesture-based user input from the subject to the computer system as soon as the engagement gesture is detected or the engagement metric exceeds a threshold.
Type: Grant
Filed: February 11, 2013
Date of Patent: October 10, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Eduardo Escardo Raffo, Oscar Murillo, David Bastien, Matthew H. Ahn, Mauro Giusti, Kevin Endres, Christian Klein, Julia Schwarz, Charles Claudius Marais
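The gating logic here is a simple disjunction: forward gesture input once an engagement gesture is detected or the engagement metric crosses a threshold. Sketched below; the threshold value is an assumption for illustration.

```python
def should_direct_input(engagement_gesture_detected, engagement_metric,
                        threshold=0.8):
    """Gate gesture-based input on the subject's engagement."""
    return engagement_gesture_detected or engagement_metric > threshold
```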
-
Publication number: 20160342203
Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
Type: Application
Filed: August 3, 2016
Publication date: November 24, 2016
Applicant: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
-
Patent number: 9423939
Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
Type: Grant
Filed: November 12, 2012
Date of Patent: August 23, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
-
Patent number: 9383894
Abstract: Embodiments are disclosed that relate to providing feedback for a level of completion of a user gesture via a cursor displayed on a user interface. One disclosed embodiment provides a method comprising displaying a cursor having a visual property and moving a screen-space position of the cursor responsive to the user gesture. The method further comprises changing the visual property of the cursor in proportion to a level of completion of the user gesture. In this way, the level of completion of the user gesture may be presented to the user in a location to which the attention of the user is directed during performance of the gesture.
Type: Grant
Filed: January 8, 2014
Date of Patent: July 5, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Christian Klein, Oscar Murillo, Sergio Paolantonio
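"Changing the visual property of the cursor in proportion to a level of completion" can be sketched as a linear mapping from completion to, say, cursor opacity. The choice of opacity as the property and the 0.25-1.0 range are assumptions; the patent covers visual properties generally.

```python
def cursor_alpha(completion):
    """Map gesture completion in [0, 1] to cursor opacity in [0.25, 1.0],
    so progress feedback appears where the user is already looking."""
    completion = max(0.0, min(1.0, completion))
    return 0.25 + 0.75 * completion
```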
-
Patent number: 9342230
Abstract: A user interface is output to a display device. If an element of a human subject is in a first conformation, the user interface scrolls responsive to movement of the element. If the element is in a second conformation, different than the first conformation, objects of the user interface are targeted responsive to movement of the element without scrolling the user interface.
Type: Grant
Filed: March 13, 2013
Date of Patent: May 17, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: David Bastien, Oscar Murillo, Mark Schwesinger
-
Patent number: 9342160
Abstract: Users move their hands in a three dimensional ("3D") physical interaction zone ("PHIZ") to control a cursor in a user interface ("UI") shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
Type: Grant
Filed: May 26, 2015
Date of Patent: May 17, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
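The core mapping described here — x/y position inside the PHIZ drives the 2D cursor, while z (back and forth) drives additional interactions such as pressing — can be sketched as below. The normalization to [0, 1], the screen size, and the press threshold are illustrative assumptions.

```python
def phiz_to_screen(hand, screen_w=1920, screen_h=1080):
    """Map a hand position (x, y, z), each normalized to [0, 1] within
    the user's PHIZ, to a 2D cursor position plus a press state."""
    x, y, z = hand
    cursor = (round(x * screen_w), round(y * screen_h))
    pressing = z > 0.7    # far enough "forward" to count as a press
    return cursor, pressing
```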
-
Publication number: 20150370349
Abstract: Users move their hands in a three dimensional ("3D") physical interaction zone ("PHIZ") to control a cursor in a user interface ("UI") shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
Type: Application
Filed: May 26, 2015
Publication date: December 24, 2015
Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
-
Publication number: 20150199017
Abstract: A method to be enacted in a computer system operatively coupled to a vision system and to a listening system. The method applies natural user input to control the computer system. It includes the acts of detecting verbal and non-verbal touchless input from a user of the computer system, selecting one of a plurality of user-interface objects based on coordinates derived from the non-verbal, touchless input, decoding the verbal input to identify a selected action from among a plurality of actions supported by the selected object, and executing the selected action on the selected object.
Type: Application
Filed: January 10, 2014
Publication date: July 16, 2015
Applicant: Microsoft Corporation
Inventors: Oscar Murillo, Lisa Stifelman, Margaret Song, David Bastien, Mark Schwesinger
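The claimed flow pairs the two modalities: pointing coordinates select an object, then the verbal input is decoded against that object's supported actions. A minimal sketch — the object representation and the keyword-matching "decoder" are simplifying assumptions:

```python
def handle_multimodal_input(coords, utterance, objects):
    """Select an object by coordinates, decode the utterance into one of
    the actions that object supports, and return what would execute."""
    # 1. Select the object whose bounds contain the pointing coordinates.
    selected = next(
        (o for o in objects
         if o["x0"] <= coords[0] <= o["x1"]
         and o["y0"] <= coords[1] <= o["y1"]),
        None)
    if selected is None:
        return None
    # 2. Decode the verbal input against the actions this object supports.
    action = next((a for a in selected["actions"] if a in utterance.lower()),
                  None)
    # 3. "Execute" the selected action on the selected object.
    return (selected["name"], action) if action else None
```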
-
Publication number: 20150193124
Abstract: Embodiments are disclosed that relate to providing feedback for a level of completion of a user gesture via a cursor displayed on a user interface. One disclosed embodiment provides a method comprising displaying a cursor having a visual property and moving a screen-space position of the cursor responsive to the user gesture. The method further comprises changing the visual property of the cursor in proportion to a level of completion of the user gesture. In this way, the level of completion of the user gesture may be presented to the user in a location to which the attention of the user is directed during performance of the gesture.
Type: Application
Filed: January 8, 2014
Publication date: July 9, 2015
Applicant: Microsoft Corporation
Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Christian Klein, Oscar Murillo, Sergio Paolantonio