Patents by Inventor Mark Schwesinger

Mark Schwesinger has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11099637
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Grant
    Filed: August 23, 2019
    Date of Patent: August 24, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
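The distance-based adjustment this abstract describes can be illustrated with a short sketch. This is a hypothetical reconstruction, not code from the patent: the function name, the near/far bounds, and the linear scaling rule are all assumptions.

```python
def adjust_ui_scale(distance_m, near=1.0, far=4.0, min_scale=1.0, max_scale=2.5):
    """Scale user-interface features up as the user stands farther from the display.

    distance_m is the user-to-display distance derived from depth-camera data.
    The near/far range and linear interpolation are illustrative choices.
    """
    # Clamp the measured distance into the supported range, then interpolate.
    t = (min(max(distance_m, near), far) - near) / (far - near)
    return min_scale + t * (max_scale - min_scale)
```

A UI layer could multiply font and button sizes by the returned factor each frame, so on-screen elements stay legible as the user moves away.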
  • Publication number: 20190377408
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Application
    Filed: August 23, 2019
    Publication date: December 12, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Patent number: 10394314
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Grant
    Filed: August 3, 2016
    Date of Patent: August 27, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Patent number: 9971491
    Abstract: A method to decode natural user input from a human subject. The method includes detection of a gesture and concurrent grip state of the subject. If the grip state is closed during the gesture, then a user-interface (UI) canvas of the computer system is transformed based on the gesture. If the grip state is open during the gesture, then a UI object arranged on the UI canvas is activated based on the gesture.
    Type: Grant
    Filed: January 9, 2014
    Date of Patent: May 15, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Sergio Paolantonio, Christian Klein, Oscar Murillo
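The grip-state dispatch described in this abstract amounts to a two-way branch. The sketch below is an illustrative reconstruction; the function name and command tuples are assumptions, not the patent's actual interfaces.

```python
def decode_gesture(gesture, grip_state):
    """Route a detected gesture according to the subject's concurrent grip state.

    A closed grip means the gesture transforms the UI canvas (e.g. pan or zoom);
    an open grip means the gesture activates a UI object arranged on the canvas.
    """
    if grip_state == "closed":
        return ("transform_canvas", gesture)
    if grip_state == "open":
        return ("activate_object", gesture)
    raise ValueError(f"unknown grip state: {grip_state!r}")
```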
  • Patent number: 9823764
    Abstract: A method to identify a targeted object based on eye tracking and gesture recognition. The method is enacted in a compute system controlled by a user and operatively coupled to a machine vision system. In this method, the compute system receives, from the machine vision system, video imaging a head and pointer of the user. Based on the video, the compute system computes a geometric line of sight of the user, which is partly occluded by the pointer. Then, with reference to position data for one or more objects, the compute system identifies the targeted object, situated along the geometric line of sight.
    Type: Grant
    Filed: December 3, 2014
    Date of Patent: November 21, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Tommer Leyvand, Szymon Stachniak
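One way to picture the geometric line-of-sight test is as a ray cast from the head through the pointer, with the targeted object being the one nearest that ray. This is a simplified sketch under assumed names and a hypothetical tolerance; the patent's actual computation may differ.

```python
import math

def identify_target(head, pointer, objects, tol=0.2):
    """Return the object lying closest to the head-through-pointer sight line.

    head and pointer are (x, y, z) positions from the machine vision system;
    objects maps names to (x, y, z) positions. tol is an assumed hit radius.
    """
    # Unit direction of the sight line from the head through the pointer.
    d = tuple(p - h for p, h in zip(pointer, head))
    norm = math.sqrt(sum(c * c for c in d))
    d = tuple(c / norm for c in d)
    best, best_dist = None, tol
    for name, pos in objects.items():
        v = tuple(p - h for p, h in zip(pos, head))
        t = sum(vc * dc for vc, dc in zip(v, d))  # projection onto the ray
        if t <= 0:
            continue  # object is behind the user
        closest = tuple(h + t * dc for h, dc in zip(head, d))
        dist = math.dist(pos, closest)
        if dist < best_dist:
            best, best_dist = name, dist
    return best
```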
  • Patent number: 9785228
    Abstract: A natural user interface (NUI) system to provide user input to a computer system. The NUI system includes a logic machine and an instruction-storage machine. The instruction-storage machine holds instructions that, when executed by the logic machine, cause the logic machine to detect an engagement gesture from a human subject or to compute an engagement metric reflecting the degree of the subject's engagement. The instructions also cause the logic machine to direct gesture-based user input from the subject to the computer system as soon as the engagement gesture is detected or the engagement metric exceeds a threshold.
    Type: Grant
    Filed: February 11, 2013
    Date of Patent: October 10, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Eduardo Escardo Raffo, Oscar Murillo, David Bastien, Matthew H. Ahn, Mauro Giusti, Kevin Endres, Christian Klein, Julia Schwarz, Charles Claudius Marais
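The engagement gate described above reduces to a simple predicate. This sketch is illustrative only; the 0.7 threshold and function name are assumptions.

```python
def should_direct_input(gesture_detected, engagement_metric, threshold=0.7):
    """Begin routing gesture-based input once the subject appears engaged.

    Input is directed to the computer system as soon as an engagement gesture
    is detected or the engagement metric exceeds the threshold.
    """
    return bool(gesture_detected) or engagement_metric > threshold
```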
  • Publication number: 20160342203
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Application
    Filed: August 3, 2016
    Publication date: November 24, 2016
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Patent number: 9423939
    Abstract: Embodiments related to dynamically adjusting a user interface based upon depth information are disclosed. For example, one disclosed embodiment provides a method including receiving depth information of a physical space from a depth camera, locating a user within the physical space from the depth information, determining a distance between the user and a display device from the depth information, and adjusting one or more features of a user interface displayed on the display device based on the distance.
    Type: Grant
    Filed: November 12, 2012
    Date of Patent: August 23, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Oscar Murillo, Emily Yang, Richard Bailey, Jay Kapur
  • Patent number: 9383894
    Abstract: Embodiments are disclosed that relate to providing feedback for a level of completion of a user gesture via a cursor displayed on a user interface. One disclosed embodiment provides a method comprising displaying a cursor having a visual property and moving a screen-space position of the cursor responsive to the user gesture. The method further comprises changing the visual property of the cursor in proportion to a level of completion of the user gesture. In this way, the level of completion of the user gesture may be presented to the user in a location to which the attention of the user is directed during performance of the gesture.
    Type: Grant
    Filed: January 8, 2014
    Date of Patent: July 5, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Christian Klein, Oscar Murillo, Sergio Paolantonio
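The proportional cursor feedback might look like the sketch below. The specific visual properties (fill fraction, alpha ramp) are illustrative assumptions; the patent only requires that some visual property change in proportion to gesture completion.

```python
def cursor_feedback(completion):
    """Map gesture completion in [0, 1] to visual properties of the cursor.

    Clamping keeps over- or under-reported completion values in range.
    """
    completion = min(max(completion, 0.0), 1.0)
    return {"fill_fraction": completion, "alpha": 0.3 + 0.7 * completion}
```

A renderer could redraw the cursor with these properties each frame, so the user sees progress exactly where their attention already is.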
  • Publication number: 20160182814
    Abstract: An example computer-implemented method for following a target comprises receiving digital image information from a digital camera having an adjustable field of view of an environment, displaying via a display device a plurality of candidate targets that are followable within the environment, computer-recognizing user selection of a candidate target to be followed in the imaged environment, and machine-adjusting the field of view of the camera to follow the user-selected candidate target.
    Type: Application
    Filed: December 19, 2014
    Publication date: June 23, 2016
    Inventors: Mark Schwesinger, Simon P. Stachniak, Tim Franklin
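The machine-adjustment step could be sketched as a proportional follow controller. This is a hypothetical illustration; the gain-based update and all names are assumptions, not the patent's method.

```python
def adjust_field_of_view(fov_center, target_pos, gain=0.5):
    """Nudge the camera's field-of-view center toward the followed target.

    Moving a fraction (gain) of the remaining offset each update yields a
    smooth follow rather than a jarring snap.
    """
    return tuple(c + gain * (t - c) for c, t in zip(fov_center, target_pos))
```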
  • Publication number: 20160162082
    Abstract: A method to identify a targeted object based on eye tracking and gesture recognition. The method is enacted in a compute system controlled by a user and operatively coupled to a machine vision system. In this method, the compute system receives, from the machine vision system, video imaging a head and pointer of the user. Based on the video, the compute system computes a geometric line of sight of the user, which is partly occluded by the pointer. Then, with reference to position data for one or more objects, the compute system identifies the targeted object, situated along the geometric line of sight.
    Type: Application
    Filed: December 3, 2014
    Publication date: June 9, 2016
    Inventors: Mark Schwesinger, Tommer Leyvand, Szymon Stachniak
  • Patent number: 9342230
    Abstract: A user interface is output to a display device. If an element of a human subject is in a first conformation, the user interface scrolls responsive to movement of the element. If the element is in a second conformation, different than the first conformation, objects of the user interface are targeted responsive to movement of the element without scrolling the user interface.
    Type: Grant
    Filed: March 13, 2013
    Date of Patent: May 17, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David Bastien, Oscar Murillo, Mark Schwesinger
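The conformation-dependent behavior above is another two-way dispatch. The sketch below is an assumed reconstruction; the command dictionaries and conformation labels are illustrative.

```python
def handle_element_movement(conformation, delta):
    """Interpret movement of a tracked body element according to its conformation.

    In the first conformation the UI scrolls with the movement; in the second,
    the movement targets objects without scrolling the UI.
    """
    if conformation == "first":
        return {"action": "scroll", "by": delta}
    if conformation == "second":
        return {"action": "target", "move_cursor_by": delta}
    raise ValueError(f"unknown conformation: {conformation!r}")
```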
  • Patent number: 9342160
    Abstract: Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
    Type: Grant
    Filed: May 26, 2015
    Date of Patent: May 17, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
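The core of the PHIZ idea is a mapping from a 3D hand position inside the zone to a 2D cursor position plus a press depth. This sketch assumes an axis-aligned box for the PHIZ, which is a simplification: the abstract describes a zone shaped and positioned ergonomically around the user.

```python
def phiz_to_screen(hand, phiz_min, phiz_max, screen_w, screen_h):
    """Map a 3D hand position in the PHIZ to screen coordinates plus press depth.

    hand, phiz_min, phiz_max are (x, y, z) points bounding an assumed box-shaped
    zone. Returns (px, py, press), where press in [0, 1] captures z-direction
    motion for pressing, zooming, and similar interactions.
    """
    def norm(v, lo, hi):
        # Normalize into [0, 1], clamping hands that stray outside the zone.
        return min(max((v - lo) / (hi - lo), 0.0), 1.0)

    nx = norm(hand[0], phiz_min[0], phiz_max[0])
    ny = norm(hand[1], phiz_min[1], phiz_max[1])
    nz = norm(hand[2], phiz_min[2], phiz_max[2])
    # Screen y grows downward, so the vertical axis is flipped.
    return (nx * screen_w, (1.0 - ny) * screen_h, nz)
```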
  • Publication number: 20150370349
    Abstract: Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
    Type: Application
    Filed: May 26, 2015
    Publication date: December 24, 2015
    Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
  • Publication number: 20150199017
    Abstract: A method to be enacted in a computer system operatively coupled to a vision system and to a listening system. The method applies natural user input to control the computer system. It includes the acts of detecting verbal and non-verbal touchless input from a user of the computer system, selecting one of a plurality of user-interface objects based on coordinates derived from the non-verbal, touchless input, decoding the verbal input to identify a selected action from among a plurality of actions supported by the selected object, and executing the selected action on the selected object.
    Type: Application
    Filed: January 10, 2014
    Publication date: July 16, 2015
    Applicant: Microsoft Corporation
    Inventors: Oscar Murillo, Lisa Stifelman, Margaret Song, David Bastien, Mark Schwesinger
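The select-then-act flow combining touchless pointing with speech could be sketched as below. The object registry, nearest-object selection rule, and callable actions are all illustrative assumptions.

```python
import math

def apply_natural_input(pointing_coords, verbal_command, ui_objects):
    """Select the UI object nearest the pointed coordinates, then run the spoken action.

    ui_objects maps names to {"pos": (x, y), "actions": {verb: callable}}.
    Returns the selected object's name and the action's result.
    """
    # Coordinates derived from non-verbal, touchless input pick the object.
    selected = min(ui_objects,
                   key=lambda n: math.dist(ui_objects[n]["pos"], pointing_coords))
    actions = ui_objects[selected]["actions"]
    # The decoded verbal input names an action the selected object supports.
    if verbal_command not in actions:
        raise ValueError(f"{selected!r} does not support {verbal_command!r}")
    return selected, actions[verbal_command]()
```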
  • Publication number: 20150193124
    Abstract: Embodiments are disclosed that relate to providing feedback for a level of completion of a user gesture via a cursor displayed on a user interface. One disclosed embodiment provides a method comprising displaying a cursor having a visual property and moving a screen-space position of the cursor responsive to the user gesture. The method further comprises changing the visual property of the cursor in proportion to a level of completion of the user gesture. In this way, the level of completion of the user gesture may be presented to the user in a location to which the attention of the user is directed during performance of the gesture.
    Type: Application
    Filed: January 8, 2014
    Publication date: July 9, 2015
    Applicant: Microsoft Corporation
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Christian Klein, Oscar Murillo, Sergio Paolantonio
  • Publication number: 20150193107
    Abstract: A method to decode natural user input from a human subject. The method includes detection of a gesture and concurrent grip state of the subject. If the grip state is closed during the gesture, then a user-interface (UI) canvas of the computer system is transformed based on the gesture. If the grip state is open during the gesture, then a UI object arranged on the UI canvas is activated based on the gesture.
    Type: Application
    Filed: January 9, 2014
    Publication date: July 9, 2015
    Applicant: Microsoft Corporation
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Sergio Paolantonio, Christian Klein, Oscar Murillo
  • Patent number: 9063578
    Abstract: Users move their hands in a three dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
    Type: Grant
    Filed: July 31, 2013
    Date of Patent: June 23, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Richard Bailey, David Bastien, Mark Schwesinger, Emily Yang, Adam Smith, Oscar Murillo, Tim Franklin, Jordan Andersen, Christian Klein
  • Publication number: 20150123901
    Abstract: Embodiments are disclosed that relate to controlling a computing device based upon gesture input. In one embodiment, orientation information of the human subject is received, wherein the orientation information includes information regarding an orientation of a first body part and an orientation of a second body part. A gesture performed by the first body part is identified based on the orientation information, and an orientation of the second body part is identified based on the orientation information. A mapping of the gesture to an action performed by the computing device is determined based on the orientation of the second body part.
    Type: Application
    Filed: November 4, 2013
    Publication date: May 7, 2015
    Applicant: Microsoft Corporation
    Inventors: Mark Schwesinger, Emily Yang, Jay Kapur, Sergio Paolantonio, Christian Klein
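The orientation-dependent mapping described above can be modeled as a lookup keyed on both body parts. The gesture names, orientations, and actions in this sketch are invented examples, not from the patent.

```python
def map_gesture_to_action(gesture, second_part_orientation):
    """Map a first-body-part gesture to an action, conditioned on a second body part.

    The same hand gesture can mean different things depending on, e.g., whether
    the head is oriented toward the display.
    """
    table = {
        ("swipe", "facing_display"): "scroll_page",
        ("swipe", "facing_away"): "no_action",  # likely incidental motion
        ("push", "facing_display"): "select",
    }
    return table.get((gesture, second_part_orientation), "no_action")
```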
  • Publication number: 20150123890
    Abstract: Embodiments are disclosed which relate to two hand natural user input. For example, one disclosed embodiment provides a method comprising receiving first hand tracking data regarding a first hand of a user and second hand tracking data regarding a second hand of the user from a sensor system. The first hand tracking data and the second hand tracking data temporally overlap. A gesture is then detected based on the first hand tracking data and the second hand tracking data, and one or more aspects of the computing device are controlled based on the gesture detected.
    Type: Application
    Filed: November 4, 2013
    Publication date: May 7, 2015
    Applicant: Microsoft Corporation
    Inventors: Jay Kapur, Mark Schwesinger, Emily Yang, Sergio Paolantonio, Christian Klein
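The temporal-overlap requirement for two-hand input could be checked as in the sketch below. The interval representation, motion labels, and zoom semantics are illustrative assumptions.

```python
def detect_two_hand_gesture(first_track, second_track):
    """Detect a two-hand gesture only when the two tracking streams overlap in time.

    Each track is (start_time, end_time, motion); returns a gesture name or None.
    """
    overlap = (min(first_track[1], second_track[1])
               - max(first_track[0], second_track[0]))
    if overlap <= 0:
        return None  # hands were not tracked concurrently; no two-hand gesture
    motions = {first_track[2], second_track[2]}
    if motions == {"move_apart"}:
        return "zoom_in"   # hands spreading: stretch the view
    if motions == {"move_together"}:
        return "zoom_out"  # hands pinching: shrink the view
    return None
```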