Patents by Inventor William Arthur Stewart Buxton

William Arthur Stewart Buxton has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240121280
    Abstract: Systems, methods, and computer-readable storage devices are disclosed for simulated choral audio chatter in communication systems. One method includes: receiving audio data from each of a plurality of users participating in a first group of a plurality of groups for an event using a communication system; generating first simulated choral audio chatter based on the audio data received from each of the plurality of users in the first group; and providing the generated first simulated choral audio chatter to at least one user of the plurality of users of the event. (See the illustrative code sketch after this listing.)
    Type: Application
    Filed: October 7, 2022
    Publication date: April 11, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Edward Sean Lloyd RINTEL, Amos MILLER, Andrew D. WILSON, Sasa JUNUZOVIC
  • Publication number: 20240005579
    Abstract: Systems and methods for representing two-dimensional representations as three-dimensional avatars are provided herein. In some examples, one or more input video streams are received. A first subject, within the one or more input video streams, is identified. Based on the one or more input video streams, a first view of the first subject is identified. Based on the one or more input video streams, a second view of the first subject is identified. The first subject is segmented into a plurality of planar objects. The plurality of planar objects are transformed with respect to each other. The plurality of planar objects are based on the first and second views of the first subject. The plurality of planar objects are output in an output video stream. The plurality of planar objects provide perspective of the first subject to one or more viewers. (See the illustrative code sketch after this listing.)
    Type: Application
    Filed: June 30, 2022
    Publication date: January 4, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Mar GONZALEZ FRANCO, Payod PANDA, Andrew D. WILSON, Kori M. INKPEN, Eyal OFEK, William Arthur Stewart BUXTON
  • Patent number: 11841998
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode. (See the illustrative code sketch after this listing.)
    Type: Grant
    Filed: July 20, 2022
    Date of Patent: December 12, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
  • Publication number: 20230236713
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room. (See the illustrative code sketch after this listing.)
    Type: Application
    Filed: March 17, 2023
    Publication date: July 27, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
  • Patent number: 11678006
    Abstract: The description relates to cooperatively controlling devices based upon their location and pose. One example can receive first sensor data associated with a first device and second sensor data associated with a second device. This example can analyze the first sensor data and the second sensor data to determine the location and pose of the first device relative to the second device, and can supply these relative locations and poses to enable content to be presented collectively across the devices. (See the illustrative code sketch after this listing.)
    Type: Grant
    Filed: June 17, 2021
    Date of Patent: June 13, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Michel Pahud, Nathalie Maryline Riche, William Arthur Stewart Buxton, Nicolai Marquardt
  • Patent number: 11656747
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Grant
    Filed: September 21, 2021
    Date of Patent: May 23, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: John C. Tang, William Arthur Stewart Buxton, Andrew D. Wilson, Kori M. Inkpen, Sasa Junuzovic, Abigail J. Sellen, Edward Sean Lloyd Rintel
  • Publication number: 20230086906
    Abstract: Aspects of the present disclosure relate to a hybrid conference user interface. The hybrid conference interface provides an establishing shot before the meeting begins that places meeting attendees in a specific spatial arrangement, such as in specific seats around a conference table. Upon starting the conference, the hybrid user interface renders an appropriate perspective view of the meeting that is tailored to each attendee's perspective while also being spatially consistent for the entire group of attendees. Allowing attendees to place themselves where they want gives attendees a sense of physical space that helps them stay spatially oriented relative to the other people and resources in the room.
    Type: Application
    Filed: September 21, 2021
    Publication date: March 23, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: John C. TANG, William Arthur Stewart BUXTON, Andrew D. WILSON, Kori M. INKPEN, Sasa JUNUZOVIC, Abigail J. SELLEN, Edward Sean Lloyd RINTEL
  • Publication number: 20230046470
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode.
    Type: Application
    Filed: July 20, 2022
    Publication date: February 16, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Hugo Karl Denis ROMAT, Christopher Mervin COLLINS, Nathalie RICHE, Michel PAHUD, Adam Samuel RIDDLE, William Arthur Stewart BUXTON
  • Publication number: 20220408142
    Abstract: The description relates to cooperatively controlling devices based upon their location and pose. One example can receive first sensor data associated with a first device and second sensor data associated with a second device. This example can analyze the first sensor data and the second sensor data to determine relative locations and poses of the first device relative to the second device and can supply the locations and poses to enable content to be collectively presented across the devices based upon the relative locations and poses.
    Type: Application
    Filed: June 17, 2021
    Publication date: December 22, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Michel PAHUD, Nathalie Maryline RICHE, William Arthur Stewart BUXTON, Nicolai MARQUARDT
  • Patent number: 11429203
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
    Type: Grant
    Filed: June 19, 2020
    Date of Patent: August 30, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
  • Publication number: 20210397274
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. A sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
    Type: Application
    Filed: June 19, 2020
    Publication date: December 23, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Hugo Karl Denis ROMAT, Christopher Mervin COLLINS, Nathalie RICHE, Michel PAHUD, Adam Samuel RIDDLE, William Arthur Stewart BUXTON
  • Patent number: 10732759
    Abstract: Various technologies described herein pertain to utilizing sensed pre-touch interaction to control a mobile computing device. A pre-touch interaction of a user with the mobile computing device is detected. The pre-touch interaction includes a grip of the user on the mobile computing device and/or a hover of one or more fingers of the user with respect to a touchscreen of the mobile computing device. The finger(s) of the user can be in proximity to, but not touching, the touchscreen as part of the hover. Parameter(s) of the pre-touch interaction of the user with the mobile computing device are identified, and a touch of the user on the touchscreen of the mobile computing device is detected. A computing operation is executed responsive to the touch, where the computing operation is based on the touch and the parameter(s) of the pre-touch interaction of the user with the mobile computing device. (See the illustrative code sketch after this listing.)
    Type: Grant
    Filed: June 30, 2016
    Date of Patent: August 4, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Hrvoje Benko, William Arthur Stewart Buxton, Seongkook Heo
  • Patent number: 10684758
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects. (See the illustrative code sketch after this listing.)
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: June 16, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
  • Patent number: 10635291
    Abstract: Thumb+pen inputs are described herein to improve the functionality of touch-enabled devices for accepting bimanual input in situations where the device is gripped or supported by one of the user's hands, leaving only one hand free. The thumb of an engaging hand is identified and controls are placed within its range of motion to enhance the functionality provided by the free hand. The actions of the thumb can be used to determine how pen actions made using the other hand are interpreted. Alternatively, the pen can indicate an object through pointing, while the thumb indirectly manipulates one or more of its parameters through touch controls. Marking menus, spring-loaded modes, indirect input, and conventional multi-touch interfaces are applied to this bimanual input mode, in which one hand holds or supports the device and the other hand is free, to improve device operability and accessibility. (See the illustrative code sketch after this listing.)
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: April 28, 2020
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Ken Pfeuffer
  • Patent number: 10558341
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: February 11, 2020
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
  • Publication number: 20180239520
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, William Arthur Stewart Buxton, Michel Pahud, Haijun Xia
  • Publication number: 20180239509
    Abstract: Use of pre-interaction context associated with gesture and touch interactions is provided. A user interface is configured to receive an interaction, and a touch or gesture input may be received from a user via a touchscreen. Prior to receiving an interaction on the user interface from the user, pre-interaction context is detected. Based on the pre-interaction context, the user's interaction with the user interface is interpreted, and the corresponding response is provided on the user interface. (See the illustrative code sketch after this listing.)
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Seongkook Heo
  • Publication number: 20180239519
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom, or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
  • Publication number: 20180239482
    Abstract: Thumb+pen inputs are described herein to improve the functionality of touch-enabled devices for accepting bimanual input in situations where the device is gripped or supported by one of the user's hands, leaving only one hand free. The thumb of an engaging hand is identified and controls are placed within its range of motion to enhance the functionality provided by the free hand. The actions of the thumb can be used to determine how pen actions made using the other hand are interpreted. Alternatively, the pen can indicate an object through pointing, while the thumb indirectly manipulates one or more of its parameters through touch controls. Marking menus, spring-loaded modes, indirect input, and conventional multi-touch interfaces are applied to this bimanual input mode, in which one hand holds or supports the device and the other hand is free, to improve device operability and accessibility.
    Type: Application
    Filed: February 20, 2017
    Publication date: August 23, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Ken Pfeuffer
  • Publication number: 20180004386
    Abstract: Various technologies described herein pertain to utilizing sensed pre-touch interaction to control a mobile computing device. A pre-touch interaction of a user with the mobile computing device is detected. The pre-touch interaction includes a grip of the user on the mobile computing device and/or a hover of one or more fingers of the user with respect to a touchscreen of the mobile computing device. The finger(s) of the user can be in proximity to, but not touching, the touchscreen as part of the hover. Parameter(s) of the pre-touch interaction of the user with the mobile computing device are identified, and a touch of the user on the touchscreen of the mobile computing device is detected. A computing operation is executed responsive to the touch, where the computing operation is based on the touch and the parameter(s) of the pre-touch interaction of the user with the mobile computing device.
    Type: Application
    Filed: June 30, 2016
    Publication date: January 4, 2018
    Inventors: Kenneth P. Hinckley, Michel Pahud, Hrvoje Benko, William Arthur Stewart Buxton, Seongkook Heo
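
Illustrative code sketches

Each distinct invention above is illustrated once below with a short, hedged Python sketch. Every function name, threshold, and data layout in these sketches is an assumption made for illustration; none of it is taken from the patent documents themselves.

Simulated choral audio chatter (publication 20240121280). A minimal sketch, assuming each group member's audio arrives as a NumPy float array scaled to [-1, 1]; the mixing strategy, an attenuated per-group sum, is a guess at one plausible way to generate the chatter track, not the claimed method.

```python
import numpy as np

def simulate_choral_chatter(group_streams: list[np.ndarray],
                            gain: float = 0.5) -> np.ndarray:
    """Blend the audio of one group's users into a single chatter track."""
    if not group_streams:
        raise ValueError("group has no audio streams")
    n = min(len(s) for s in group_streams)        # align stream lengths
    mix = np.zeros(n)
    for stream in group_streams:
        mix += stream[:n]                         # sum the voices
    mix *= gain / len(group_streams)              # attenuate to avoid clipping
    return np.clip(mix, -1.0, 1.0)

# The blended track would then be provided to listeners elsewhere in the event:
# chatter = simulate_choral_chatter([user_a_pcm, user_b_pcm, user_c_pcm])
```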
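
Planar-object avatars (publication 20240005579). A sketch of a plausible data layout only, assuming each segmented region of the subject becomes a quad posed by a 4x4 transform; the segmentation and view estimation from the two input views are stubbed out entirely.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PlanarObject:
    name: str               # e.g. "head", "torso" (hypothetical segmentation)
    corners: np.ndarray     # 4x3 quad corners in the subject's local space
    transform: np.ndarray   # 4x4 pose of the quad relative to the avatar root

def pose_planes(planes: list, yaw: float) -> list:
    """Rotate each quad about the vertical axis so the flat cutouts
    present some perspective to a viewer standing off-axis."""
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, 0, s, 0],
                    [0, 1, 0, 0],
                    [-s, 0, c, 0],
                    [0, 0, 0, 1]], dtype=float)
    for p in planes:
        p.transform = rot @ p.transform
    return planes

head = PlanarObject("head", np.zeros((4, 3)), np.eye(4))
torso = PlanarObject("torso", np.zeros((4, 3)), np.eye(4))
pose_planes([head, torso], yaw=0.4)    # viewer roughly 23 degrees off-axis
```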
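
Tilt-driven user interfaces (patents 11841998 and 11429203 and their related publications). A minimal sketch, assuming a sensor reporting tilt in degrees from horizontal; the three angle bands and the mode names are invented for illustration.

```python
def ui_for_tilt(tilt_degrees: float, app: str = "sketch") -> str:
    """Map the drawing board's sensed tilt position to a UI mode."""
    if tilt_degrees < 15:
        return f"{app}:flat-drafting"     # board lying nearly flat
    if tilt_degrees < 60:
        return f"{app}:easel"             # comfortable pen angle
    return f"{app}:presentation"          # board standing nearly vertical

class DrawingBoard:
    def __init__(self) -> None:
        self.current_ui = None

    def on_tilt_sensor(self, tilt_degrees: float) -> None:
        ui = ui_for_tilt(tilt_degrees)
        if ui != self.current_ui:         # stable tilt position changed
            self.current_ui = ui
            print(f"switching to {ui}")

board = DrawingBoard()
board.on_tilt_sensor(5.0)     # switching to sketch:flat-drafting
board.on_tilt_sensor(75.0)    # switching to sketch:presentation
```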
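
Hybrid conference seating and perspective (patent 11656747 / publications 20230236713 and 20230086906). A minimal sketch, assuming eight seats around a virtual round table; walking the table clockwise from each viewer's seat keeps left/right relationships consistent for everyone. The seat count and names are illustrative.

```python
SEATS = 8    # assumed seat count around the virtual table

def establishing_shot(seat_choices: dict) -> dict:
    """Place attendees in the seats they picked before the meeting starts."""
    table: dict = {}
    for person, seat in seat_choices.items():
        if seat in table:
            raise ValueError(f"seat {seat} is already taken")
        table[seat] = person
    return table

def perspective_view(table: dict, viewer: str) -> list:
    """Order the other attendees clockwise as seen from the viewer's seat,
    so spatial relations stay consistent across everyone's views."""
    my_seat = next(s for s, p in table.items() if p == viewer)
    order = []
    for offset in range(1, SEATS):
        seat = (my_seat + offset) % SEATS
        if seat in table:
            order.append(table[seat])
    return order

table = establishing_shot({"Ann": 0, "Ben": 2, "Cam": 5})
print(perspective_view(table, "Ann"))   # ['Ben', 'Cam']
print(perspective_view(table, "Cam"))   # ['Ann', 'Ben']
```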
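
Cooperative control from relative location and pose (patent 11678006 / publication 20220408142). A minimal sketch, assuming some earlier sensing stage already yields each device's pose as a 3x3 homogeneous 2D transform in a shared frame; the patent's actual sensing pipeline is not shown.

```python
import numpy as np

def pose2d(x: float, y: float, theta: float) -> np.ndarray:
    """Homogeneous 2D pose: rotation by theta, then translation (x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def relative_pose(pose_a: np.ndarray, pose_b: np.ndarray) -> np.ndarray:
    """Pose of device B expressed in device A's frame."""
    return np.linalg.inv(pose_a) @ pose_b

# If B sits 0.3 m to A's right with the same heading, a shared canvas can
# span both screens by offsetting B's viewport by the relative translation:
rel = relative_pose(pose2d(0, 0, 0), pose2d(0.3, 0, 0))
offset_x, offset_y = rel[0, 2], rel[1, 2]
print(offset_x, offset_y)    # 0.3 0.0
```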
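
Pre-touch interaction (patent 10732759 / publication 20180004386). A minimal sketch, assuming grip and hover parameters are available as plain values; the thresholds and the one-handed heuristic are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class PreTouch:
    grip: str               # "left", "right", "both", or "none"
    hover_x: float          # last hovered finger position (screen px)
    hover_y: float
    approach_speed: float   # px/s while the finger closed in on the screen

def interpret_touch(x: float, y: float, ctx: PreTouch) -> str:
    """Pick a computing operation from the touch plus its pre-touch prelude."""
    if ctx.approach_speed > 2000:
        return "fast approach: fire the default action immediately"
    if ctx.grip in ("left", "right"):
        # one-handed grip: favor targets reachable by the gripping thumb
        return "one-handed: expand nearby controls before committing"
    return f"two-handed: precise tap at ({x:.0f}, {y:.0f})"

ctx = PreTouch(grip="right", hover_x=310.0, hover_y=95.0, approach_speed=800.0)
print(interpret_touch(305.0, 98.0, ctx))   # one-handed branch
```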
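
Bimanual pan/zoom/frame differentiation (patents 10684758 and 10558341 and their related publications). A minimal sketch, assuming touch points arrive already labeled by finger; the wedge test for the focus area between the two outward vectors uses plain 2D angles with a guessed spread.

```python
import math

def classify(touches: list) -> str:
    """Differentiate pan, zoom, and frame from labeled touch points."""
    fingers = {t["finger"] for t in touches}
    if fingers == {"thumb", "index"}:
        return "frame"        # thumb+forefinger define a focus area
    if len(touches) >= 2:
        return "zoom"
    return "pan"

def in_focus_area(px: float, py: float, thumb, index,
                  spread_deg: float = 35.0) -> bool:
    """Is (px, py) inside the wedge opening out from the two fingers?"""
    ox, oy = (thumb[0] + index[0]) / 2, (thumb[1] + index[1]) / 2
    # wedge axis: perpendicular to the thumb->forefinger segment
    axis = math.atan2(index[1] - thumb[1], index[0] - thumb[0]) + math.pi / 2
    ang = math.atan2(py - oy, px - ox)
    diff = abs((ang - axis + math.pi) % (2 * math.pi) - math.pi)
    return diff <= math.radians(spread_deg)

print(classify([{"finger": "thumb"}, {"finger": "index"}]))          # frame
print(in_focus_area(0.0, 5.0, thumb=(-1.0, 0.0), index=(1.0, 0.0)))  # True
```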
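
Thumb+pen input (patent 10635291 / publication 20180239482). A minimal sketch of the spring-loaded mode idea: the thumb of the gripping hand holds a mode while it stays down, and pen strokes from the free hand are interpreted under that mode. The control names are invented.

```python
class ThumbPenController:
    """Thumb of the gripping hand holds a spring-loaded mode for the pen."""

    def __init__(self) -> None:
        self.mode = "ink"                  # default: the pen just draws

    def on_thumb_down(self, control: str) -> None:
        self.mode = control                # e.g. "lasso", "highlight"

    def on_thumb_up(self) -> None:
        self.mode = "ink"                  # spring-loaded: mode ends on lift

    def on_pen_stroke(self, stroke: list) -> str:
        return f"apply {self.mode} to a stroke of {len(stroke)} points"

pen = ThumbPenController()
pen.on_thumb_down("lasso")                   # thumb rests on a corner control
print(pen.on_pen_stroke([(0, 0), (5, 5)]))   # apply lasso to a stroke of 2 points
pen.on_thumb_up()
```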
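
Pre-interaction context (publication 20180239509). A minimal sketch, assuming the context sensed before contact includes which implement is approaching and from where; the mapping from context to interpretation is an illustrative guess.

```python
def interpret(gesture: str, context: dict) -> str:
    """Resolve an ambiguous gesture using what was sensed before contact."""
    if context.get("implement") == "stylus":
        return f"{gesture}: inking action"     # a pen approached the glass
    if context.get("approach_edge") == "bottom":
        return f"{gesture}: summon toolbar"    # a thumb reached in from the bezel
    return f"{gesture}: direct manipulation"

print(interpret("tap", {"implement": "stylus"}))
print(interpret("swipe", {"approach_edge": "bottom"}))
```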