Patents by Inventor Kenneth P. Hinckley

Kenneth P. Hinckley has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11880626
    Abstract: In embodiments of multi-device pairing and combined display, mobile devices have device touchscreens, and housing bezels of the mobile devices can be positioned proximate each other to form a combined display from the respective device touchscreens. An input recognition system can recognize an input to initiate pairing the mobile devices, enabling a cross-screen display of an interface on the respective device touchscreens, such as with different parts of the interface displayed on different ones of the device touchscreens. The input to initiate pairing the mobile devices can be recognized as the proximity of the mobile devices, as a gesture input, as a sensor input, and/or as a voice-activated input to initiate the pairing of the mobile devices for the cross-screen display of the interface.
    Type: Grant
    Filed: July 1, 2021
    Date of Patent: January 23, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
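
A minimal sketch of the pairing flow described in the abstract above: any one of several recognized inputs (proximity, gesture, sensor, or voice) initiates pairing, after which the interface is split across the paired touchscreens. All names here (PairingInput, split_interface, the example panels) are hypothetical illustrations, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class PairingInput:
    devices_proximate: bool = False     # bezels detected near each other
    gesture_detected: bool = False      # e.g., a pinch spanning both screens
    sensor_triggered: bool = False      # e.g., a bump sensed by accelerometers
    voice_command: bool = False         # e.g., "pair these devices"

def should_pair(event: PairingInput) -> bool:
    """Any recognized pairing input initiates pairing of the mobile devices."""
    return any([event.devices_proximate, event.gesture_detected,
                event.sensor_triggered, event.voice_command])

def split_interface(interface: list[str], screen_count: int) -> list[list[str]]:
    """Assign different parts of the interface to different device touchscreens."""
    return [interface[i::screen_count] for i in range(screen_count)]

if __name__ == "__main__":
    event = PairingInput(devices_proximate=True)
    if should_pair(event):
        panels = split_interface(["toolbar", "canvas", "palette", "status"], 2)
        print(panels)  # [['toolbar', 'palette'], ['canvas', 'status']]
```
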
  • Publication number: 20230421722
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: September 7, 2023
    Publication date: December 28, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
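
A minimal sketch, under stated assumptions, of the virtual-presence behavior in the abstract above: headset-identified motion moves the representation of a non-video participant, while separate user input sets an emotion type and intensity. The class and method names (VirtualPresence, HeadPose, apply_head_motion) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float = 0.0    # degrees, from the headset's motion sensors
    pitch: float = 0.0
    roll: float = 0.0

@dataclass
class VirtualPresence:
    pose: HeadPose
    emotion: str = "neutral"
    intensity: float = 0.0  # 0.0 (none) .. 1.0 (strong)

    def apply_head_motion(self, pose: HeadPose) -> None:
        """Movements identified by the headset device move the virtual presence."""
        self.pose = pose

    def set_emotion(self, emotion: str, intensity: float) -> None:
        """User input specifies the emotion type and intensity to convey."""
        self.emotion = emotion
        self.intensity = max(0.0, min(1.0, intensity))

if __name__ == "__main__":
    avatar = VirtualPresence(pose=HeadPose())
    avatar.apply_head_motion(HeadPose(yaw=15.0, pitch=-5.0))
    avatar.set_emotion("happy", 0.7)
    print(avatar)
```
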
  • Patent number: 11792364
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: October 17, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Publication number: 20230308505
    Abstract: Aspects of the present disclosure relate to multi-user, multi-device gaze tracking. In examples, a system includes at least one processor and memory storing instructions that, when executed by the at least one processor, cause the system to perform a set of operations. The set of operations includes identifying a plurality of computing devices and identifying one or more users. The set of operations may further include receiving gaze input data and load data from two or more of the plurality of computing devices. The set of operations may further include performing load balancing among the plurality of computing devices, wherein the load balancing comprises assigning one or more tasks from a first of the plurality of computing devices to a second of the plurality of computing devices based upon the gaze input data.
    Type: Application
    Filed: March 22, 2022
    Publication date: September 28, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Jatin SHARMA, Kenneth P. HINCKLEY, Jay C. BEAVERS, Michel PAHUD
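
An illustrative sketch of the load-balancing idea in the abstract above: tasks move from an overloaded device toward a lighter device the user is gazing at. The specific policy, threshold, and names are assumptions for illustration, not the claimed method.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    load: float                      # reported load, 0.0 .. 1.0
    gazed_at: bool = False           # derived from gaze input data
    tasks: list[str] = field(default_factory=list)

def balance(devices: list[Device], load_threshold: float = 0.8) -> None:
    """Move one task from each overloaded device to the lightest gazed-at device."""
    targets = [d for d in devices if d.gazed_at and d.load < load_threshold]
    if not targets:
        return
    target = min(targets, key=lambda d: d.load)
    for source in devices:
        if source is not target and source.load >= load_threshold and source.tasks:
            target.tasks.append(source.tasks.pop(0))

if __name__ == "__main__":
    laptop = Device("laptop", load=0.9, tasks=["render", "transcribe"])
    tablet = Device("tablet", load=0.2, gazed_at=True)
    balance([laptop, tablet])
    print(tablet.tasks)  # ['render']
```
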
  • Publication number: 20230259322
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: April 26, 2023
    Publication date: August 17, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
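
A minimal sketch of the pipeline the abstract above outlines: headset sensor data is classified as implicit or explicit input, combined with a context, and mapped to an action that affects a computing device. The classification rules, contexts, and names here are hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    nod_detected: bool = False        # explicit gesture
    wearer_speaking: bool = False     # implicit signal
    removed_from_head: bool = False   # implicit signal

def classify(sample: SensorSample) -> str:
    """Identify whether the sensed input is explicit, implicit, or absent."""
    if sample.nod_detected:
        return "explicit"
    if sample.wearer_speaking or sample.removed_from_head:
        return "implicit"
    return "none"

def generate_action(sample: SensorSample, context: str) -> str:
    """Map the identified input, interpreted in context, to a device action."""
    kind = classify(sample)
    if kind == "explicit" and context == "incoming_call":
        return "answer_call"
    if kind == "implicit" and sample.removed_from_head and context == "media_playing":
        return "pause_playback"
    if kind == "implicit" and sample.wearer_speaking and context == "meeting":
        return "unmute_microphone"
    return "no_op"

if __name__ == "__main__":
    print(generate_action(SensorSample(nod_detected=True), "incoming_call"))       # answer_call
    print(generate_action(SensorSample(removed_from_head=True), "media_playing"))  # pause_playback
```
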
  • Patent number: 11669294
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: June 6, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Mar Gonzalez Franco, Edward Sean Lloyd Rintel, Eyal Ofek, Jaron Zepel Lanier, Molly Jane Nicholas, Payod Panda
  • Publication number: 20230094527
    Abstract: Methods and systems are disclosed for sharing a content item from a secondary computing device to a primary computing device based on a tilt position of the secondary computing device. A sensor on the secondary computing device determines that the secondary computing device has a first tilt position that is associated with a non-sharing mode. The sensor senses that the secondary computing device is tilting from the first tilt position towards a second tilt position that is associated with a full sharing mode. In response, the content item is shared with the primary computing device. The amount of the content item that is shared continues to increase as long as the sensor continues to sense that the secondary computing device is tilting. Once the secondary computing device has reached the second tilt position, the entire content item is shared with the primary computing device.
    Type: Application
    Filed: December 7, 2022
    Publication date: March 30, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Nathalie M. RICHE, Molly NICHOLAS, Chunjong PARK, Nicolai MARQUARDT
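
A minimal sketch of the tilt-to-share mapping described in the abstract above: as the secondary device tilts from the non-sharing position toward the full-sharing position, a growing portion of the content item is shared. The specific angles, linear mapping, and names are illustrative assumptions, not the claimed method.

```python
NON_SHARING_ANGLE = 0.0    # degrees: first tilt position (non-sharing mode)
FULL_SHARING_ANGLE = 60.0  # degrees: second tilt position (full sharing mode)

def shared_fraction(tilt_angle: float) -> float:
    """Fraction of the content item shared at the current tilt angle."""
    span = FULL_SHARING_ANGLE - NON_SHARING_ANGLE
    fraction = (tilt_angle - NON_SHARING_ANGLE) / span
    return max(0.0, min(1.0, fraction))

def share(content: str, tilt_angle: float) -> str:
    """Share a prefix of the content proportional to the tilt progress."""
    count = round(shared_fraction(tilt_angle) * len(content))
    return content[:count]

if __name__ == "__main__":
    note = "Quarterly results: revenue up 12%, churn down 3%."
    for angle in (0.0, 30.0, 60.0):
        print(f"{angle:>4} deg -> {share(note, angle)!r}")
```
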
  • Patent number: 11550404
    Abstract: Methods and systems are disclosed for sharing a content item from a secondary computing device to a primary computing device based on a tilt position of the secondary computing device. A sensor on the secondary computing device determines that the secondary computing device has a first tilt position that is associated with a non-sharing mode. The sensor senses that the secondary computing device is tilting from the first tilt position towards a second tilt position that is associated with a full sharing mode. In response, sharing of the content item with the primary computing device begins. The amount of the content item that is shared with the primary computing device continues to increase as long as the sensor continues to sense that the secondary computing device is tilting from the first tilt position towards the second tilt position. When the sensor senses that the secondary computing device has reached the second tilt position, the entire content item is shared with the primary computing device.
    Type: Grant
    Filed: May 14, 2021
    Date of Patent: January 10, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Nathalie M. Riche, Molly Nicholas, Chunjong Park, Nicolai Marquardt
  • Publication number: 20220382506
    Abstract: Aspects of the present disclosure relate to computing device headset input. In examples, sensor data from one or more sensors of a headset device are processed to identify implicit and/or explicit user input. A context may be determined for the user input, which may be used to process the identified input and generate an action that affects the behavior of a computing device accordingly. As a result, the headset device is usable to control one or more computing devices. As compared to other wearable devices, headset devices may be more prevalent and may therefore enable more convenient and more intuitive user input beyond merely providing audio output.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
  • Publication number: 20220385855
    Abstract: Aspects of the present disclosure relate to headset virtual presence techniques. For example, a participant of a communication session may not have an associated video feed, for example as a result of a user preference to disable video communication or a lack of appropriate hardware. Accordingly, a virtual presence may be generated for such a non-video participant, such that the non-video participant may be represented within the communication session similar to video participants. The virtual presence may be controllable using a headset device, for example such that movements identified by the headset device cause the virtual presence to move. In some instances, user input may be received to control emotions conveyed by the virtual presence, for example specifying an emotion type and/or intensity.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Mar Gonzalez FRANCO, Edward Sean Lloyd RINTEL, Eyal OFEK, Jaron Zepel LANIER, Molly Jane NICHOLAS, Payod PANDA
  • Publication number: 20220365606
    Abstract: Methods and systems are disclosed for sharing a content item from a secondary computing device to a primary computing device based on a tilt position of the secondary computing device. A sensor on the secondary computing device determines that the secondary computing device has a first tilt position that is associated with a non-sharing mode. The sensor senses that the secondary computing device is tilting from the first tilt position towards a second tilt position that is associated with a full sharing mode. In response, sharing of the content item with the primary computing device begins. The amount of the content item that is shared with the primary computing device continues to increase as long as the sensor continues to sense that the secondary computing device is tilting from the first tilt position towards the second tilt position. When the sensor senses that the secondary computing device has reached the second tilt position, the entire content item is shared with the primary computing device.
    Type: Application
    Filed: May 14, 2021
    Publication date: November 17, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Nathalie M. RICHE, Molly NICHOLAS, Chunjong PARK, Nicolai MARQUARDT
  • Publication number: 20220027118
    Abstract: Systems and methods relate to sharing content and data across users and devices. A shared folio includes media primitives and tools as its constituents. The use of shared folios addresses the issue of reliably and efficiently transferring user activities along with their data, covering both single-user work across multiple devices and collaborative work among users across multiple devices. A media primitive includes content in various data types. A tool includes data and governs access to that data by devices and applications. A data distributor manages automatic synchronization of the folios across devices using centralized and distributed transaction logs. The folios are synchronized with resiliency against failures in client devices. A folio and its constituents are interactively accessible through a top-level, semi-transparent user interface. The media primitives and the tools may programmatically access local applications to automatically transfer user activities among users and devices.
    Type: Application
    Filed: October 5, 2021
    Publication date: January 27, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. HINCKLEY, Michel PAHUD, Jonathan D. GOLDSTEIN, Frederik Martin BRUDY
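
A minimal sketch, under assumptions, of the folio model outlined in the abstract above: a folio holds media primitives and tools, and a distributor replays a shared transaction log to bring device-local replicas up to date, which also recovers replicas that missed updates. Class and method names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class MediaPrimitive:
    kind: str       # e.g., "image", "text", "ink"
    content: str

@dataclass
class Folio:
    name: str
    primitives: list[MediaPrimitive] = field(default_factory=list)
    tools: list[str] = field(default_factory=list)   # tools mediate data access for apps/devices

class DataDistributor:
    """Synchronizes folios across devices by replaying a shared transaction log."""

    def __init__(self) -> None:
        self.log: list[tuple[str, MediaPrimitive]] = []

    def record(self, folio_name: str, primitive: MediaPrimitive) -> None:
        """Append an update to the log so every device can apply it."""
        self.log.append((folio_name, primitive))

    def replay(self, replica: Folio) -> None:
        """Rebuild a device-local replica from the log (e.g., after a client failure)."""
        replica.primitives = [p for name, p in self.log if name == replica.name]

if __name__ == "__main__":
    distributor = DataDistributor()
    distributor.record("trip-plan", MediaPrimitive("text", "Day 1: museum"))
    phone_copy = Folio("trip-plan")
    distributor.replay(phone_copy)
    print(phone_copy.primitives)
```
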
  • Publication number: 20210326093
    Abstract: In embodiments of multi-device pairing and combined display, mobile devices have device touchscreens, and housing bezels of the mobile devices can be positioned proximate each other to form a combined display from the respective device touchscreens. An input recognition system can recognize an input to initiate pairing the mobile devices, enabling a cross-screen display of an interface on the respective device touchscreens, such as with different parts of the interface displayed on different ones of the device touchscreens. The input to initiate pairing the mobile devices can be recognized as the proximity of the mobile devices, as a gesture input, as a sensor input, and/or as a voice-activated input to initiate the pairing of the mobile devices for the cross-screen display of the interface.
    Type: Application
    Filed: July 1, 2021
    Publication date: October 21, 2021
    Inventors: Kenneth P. Hinckley, Koji Yatani
  • Patent number: 11144275
    Abstract: Systems and methods relate to sharing content and data across users and devices. A shared folio includes media primitives and tools as its constituents. The use of shared folios addresses the issue of reliably and efficiently transferring user activities along with their data, covering both single-user work across multiple devices and collaborative work among users across multiple devices. A media primitive includes content in various data types. A tool includes data and governs access to that data by devices and applications. A data distributor manages automatic synchronization of the folios across devices using centralized and distributed transaction logs. The folios are synchronized with resiliency against failures in client devices. A folio and its constituents are interactively accessible through a top-level, semi-transparent user interface. The media primitives and the tools may programmatically access local applications to automatically transfer user activities among users and devices.
    Type: Grant
    Filed: June 29, 2020
    Date of Patent: October 12, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Jonathan D. Goldstein, Frederik Martin Brudy
  • Patent number: 11055050
    Abstract: In embodiments of multi-device pairing and combined display, mobile devices have device touchscreens, and housing bezels of the mobile devices can be positioned proximate each other to form a combined display from the respective device touchscreens. An input recognition system can recognize an input to initiate pairing the mobile devices, enabling a cross-screen display of an interface on the respective device touchscreens, such as with different parts of the interface displayed on different ones of the device touchscreens. The input to initiate pairing the mobile devices can be recognized as the proximity of the mobile devices, as a gesture input, as a sensor input, and/or as a voice-activated input to initiate the pairing of the mobile devices for the cross-screen display of the interface.
    Type: Grant
    Filed: September 26, 2016
    Date of Patent: July 6, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Koji Yatani
  • Patent number: 10983694
    Abstract: An apparatus includes a keyboard engine that operates a keyboard that accepts shape-writing input and radial entry input. A keyboard input module obtains input data from at least one input sensor of the keyboard. An intention disambiguation engine enables a user of the keyboard to use the shape-writing input and the radial entry input simultaneously.
    Type: Grant
    Filed: February 23, 2018
    Date of Patent: April 20, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: William A. S. Buxton, Richard L. Hughes, Kenneth P. Hinckley, Michel Pahud, Irina Spiridonova
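
An illustrative sketch of one way a stroke could be disambiguated between shape-writing and radial entry, as the abstract above describes an intention disambiguation engine doing. The heuristic (short straight flicks as radial entries, longer traces as shape writing) and the threshold are assumptions, not the patented engine.

```python
import math

def stroke_length(points: list[tuple[float, float]]) -> float:
    """Total path length of a touch stroke given its sampled points."""
    return sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def disambiguate(points: list[tuple[float, float]], flick_threshold: float = 40.0) -> str:
    """Classify one stroke on the keyboard as a tap, radial entry, or shape writing."""
    if len(points) < 2:
        return "tap"
    if stroke_length(points) <= flick_threshold:
        return "radial_entry"    # short directional flick from a key
    return "shape_writing"       # longer trace across several keys

if __name__ == "__main__":
    print(disambiguate([(0, 0), (25, 5)]))                          # radial_entry
    print(disambiguate([(0, 0), (60, 10), (120, -5), (200, 30)]))   # shape_writing
```
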
  • Patent number: 10754490
    Abstract: The description relates to a shared digital workspace. One example includes a display device and sensors. The sensors are configured to detect users proximate the display device and to detect that an individual user is performing an individual user command relative to the display device. The system also includes a graphical user interface configured to be presented on the display device that allows multiple detected users to simultaneously interact with the graphical user interface via user commands.
    Type: Grant
    Filed: June 6, 2017
    Date of Patent: August 25, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Desney S. Tan, Kenneth P. Hinckley, Steven N. Bathiche, Ronald O. Pessner, Bongshin Lee, Anoop Gupta, Amir Netz, Brett D. Brewer
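
A minimal sketch, with hypothetical names, of the shared-workspace behavior the abstract above describes: sensors report which detected user issued a command, and the interface applies commands from any detected user so that multiple users can interact simultaneously.

```python
from dataclasses import dataclass, field

@dataclass
class SharedWorkspace:
    detected_users: set[str] = field(default_factory=set)
    activity: dict[str, list[str]] = field(default_factory=dict)

    def user_detected(self, user: str) -> None:
        """Record a user sensed proximate to the display device."""
        self.detected_users.add(user)

    def apply_command(self, user: str, command: str) -> None:
        """Apply a command from any detected user to the shared interface."""
        if user in self.detected_users:
            self.activity.setdefault(user, []).append(command)

if __name__ == "__main__":
    board = SharedWorkspace()
    board.user_detected("alice")
    board.user_detected("bob")
    board.apply_command("alice", "open(sales.xlsx)")
    board.apply_command("bob", "annotate(chart_1)")
    print(board.activity)
```
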
  • Patent number: 10732759
    Abstract: Various technologies described herein pertain to utilizing sensed pre-touch interaction to control a mobile computing device. A pre-touch interaction of a user with the mobile computing device is detected. The pre-touch interaction includes a grip of the user on the mobile computing device and/or a hover of one or more fingers of the user with respect to a touchscreen of the mobile computing device. The finger(s) of the user can be within proximity but not touching the touchscreen as part of the hover. Parameter(s) of the pre-touch interaction of the user with the mobile computing device are identified, and a touch of the user on the touchscreen of the mobile computing device is detected. A computing operation is executed responsive to the touch, where the computing operation is based on the touch and the parameter(s) of the pre-touch interaction of the user with the mobile computing device.
    Type: Grant
    Filed: June 30, 2016
    Date of Patent: August 4, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, Hrvoje Benko, William Arthur Stewart Buxton, Seongkook Heo
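
A minimal sketch, assuming hypothetical names and rules, of the pre-touch idea in the abstract above: grip and hover parameters sensed before the finger lands change how the eventual touch is interpreted.

```python
from dataclasses import dataclass

@dataclass
class PreTouch:
    grip: str = "right_hand"      # how the device is held
    hovering_fingers: int = 0     # fingers near, but not touching, the touchscreen
    approach_speed: float = 0.0   # cm/s toward the screen

def handle_touch(target: str, pre: PreTouch) -> str:
    """Choose the computing operation from the touch target plus pre-touch parameters."""
    if pre.hovering_fingers >= 2:
        return f"show_multi_select_handles({target})"
    if pre.approach_speed > 20.0:
        return f"activate({target})"           # fast, ballistic touch: act immediately
    return f"show_context_preview({target})"   # slow, hesitant touch: preview first

if __name__ == "__main__":
    print(handle_touch("email_row_3", PreTouch(approach_speed=35.0)))
    print(handle_touch("email_row_3", PreTouch(hovering_fingers=2)))
```
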
  • Patent number: 10684758
    Abstract: The unified system for bimanual interactions provides a lightweight and integrated interface that allows the user to efficiently interact with and manipulate content in the user interface. The system is configured to detect a multi-finger interaction on the touchscreen and to differentiate whether the user intends to pan, zoom or frame a portion of the user interface. Generally, the framing interaction is identified by detection of the user's thumb and forefinger on the touchscreen, which cooperate to define a focus area between vectors extending outwards from the user's thumb and forefinger. Upon a determination that the user intends to interact with or manipulate content within the focus area, the unified system for bimanual interactions provides an indication of the objects that are located within the focus area and contextual menus for interacting with the objects.
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: June 16, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Haijun Xia
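
A minimal geometric sketch, not the patented system, of the framing gesture in the abstract above: the thumb and forefinger contacts define a wedge-shaped focus area between two vectors, and the objects inside it are reported for contextual interaction. The wedge test below is deliberately simplified (it ignores angle wrap-around) and all names are illustrative.

```python
import math

def angle(origin: tuple[float, float], point: tuple[float, float]) -> float:
    """Angle of the vector from origin to point, in radians."""
    return math.atan2(point[1] - origin[1], point[0] - origin[0])

def objects_in_focus(origin, thumb, forefinger, objects):
    """Return objects lying between the vectors toward the thumb and forefinger."""
    lo, hi = sorted((angle(origin, thumb), angle(origin, forefinger)))
    return [name for name, pos in objects.items() if lo <= angle(origin, pos) <= hi]

if __name__ == "__main__":
    hand_origin = (0.0, 0.0)
    thumb_tip = (10.0, 1.0)        # near-horizontal vector
    forefinger_tip = (2.0, 10.0)   # near-vertical vector
    scene = {"photo": (6.0, 5.0), "note": (-4.0, 2.0)}
    print(objects_in_focus(hand_origin, thumb_tip, forefinger_tip, scene))  # ['photo']
```
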
  • Patent number: 10635291
    Abstract: Thumb+pen inputs are described herein, to improve the functionality of touch enabled devices for accepting bimanual input in situations where the device is gripped or supported by one of the user's hands, leaving only one hand free. The thumb of an engaging hand is identified and controls are placed within its range of motion to enhance the functionality provided by the free hand. The actions of the thumb can be used to determine how pen actions made using the other hand are interpreted. Alternatively, the pen can indicate an object through pointing, while the thumb indirectly manipulates one or more of its parameters through touch controls. Marking menus, spring-loaded modes, indirect input, and conventional multi-touch interfaces are applied with respect to the bimanual input mode in which one hand is positioned to hold or support the device, and the other hand is free to improve device operability and accessibility.
    Type: Grant
    Filed: February 20, 2017
    Date of Patent: April 28, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth P. Hinckley, Michel Pahud, William Arthur Stewart Buxton, Ken Pfeuffer
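
A minimal sketch, under hypothetical names, of the thumb+pen division of labor in the abstract above: a thumb control reachable by the gripping hand sets a spring-loaded mode (or adjusts a parameter), and that mode determines how the pen action made by the free hand is interpreted.

```python
def interpret_pen_stroke(thumb_mode: str, pen_action: str) -> str:
    """The thumb's held mode changes the meaning of the pen action."""
    modes = {
        "none": f"ink({pen_action})",             # default: the pen simply draws
        "lasso": f"select({pen_action})",         # thumb held on a lasso control
        "highlight": f"highlight({pen_action})",  # thumb held on a highlighter control
    }
    return modes.get(thumb_mode, f"ink({pen_action})")

def adjust_parameter(pen_target: str, thumb_delta: float) -> str:
    """Alternatively, the pen points at an object while the thumb adjusts a parameter."""
    return f"set_opacity({pen_target}, delta={thumb_delta:+.2f})"

if __name__ == "__main__":
    print(interpret_pen_stroke("lasso", "stroke_42"))   # select(stroke_42)
    print(adjust_parameter("stroke_42", 0.25))          # set_opacity(stroke_42, delta=+0.25)
```
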