Patents by Inventor Kenneth Paul Hinckley

Kenneth Paul Hinckley has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents that have been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11841998
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode. (A minimal, hypothetical code sketch of this tilt-driven switching follows this entry.)
    Type: Grant
    Filed: July 20, 2022
    Date of Patent: December 12, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
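
The tilt-driven switching described in the abstract above boils down to mapping a sensed tilt angle to a use mode and re-rendering only when the angle crosses into a new range. The Python sketch below illustrates that idea only; it is not the patented implementation, and all mode names, angle thresholds, and class names are assumptions.

```python
# A minimal sketch of tilt-driven UI switching, assuming a hypothetical tilt
# sensor that reports an angle in degrees. Mode names, thresholds, and toolbar
# labels are invented; the patent does not disclose code at this level.
from dataclasses import dataclass

@dataclass
class UiMode:
    name: str            # e.g. a drafting mode vs. a presentation mode
    toolbar_layout: str  # which toolbar this mode displays

# Hypothetical mapping from tilt ranges (degrees from horizontal) to UI modes.
TILT_MODES = [
    ((0, 30), UiMode("drafting", "full-edit-toolbar")),
    ((30, 70), UiMode("sketching", "pen-and-ink-toolbar")),
    ((70, 91), UiMode("presentation", "minimal-toolbar")),
]

def mode_for_tilt(angle_deg: float) -> UiMode:
    """Return the UI mode associated with the sensed tilt position."""
    for (low, high), mode in TILT_MODES:
        if low <= angle_deg < high:
            return mode
    return TILT_MODES[-1][1]

class TiltAwareUi:
    """Switches the displayed interface only when the tilt crosses into a new range."""

    def __init__(self, initial_angle: float):
        self.current = mode_for_tilt(initial_angle)

    def on_tilt_sample(self, angle_deg: float) -> None:
        new_mode = mode_for_tilt(angle_deg)
        if new_mode.name != self.current.name:
            self.current = new_mode
            print(f"switching to {new_mode.name} UI ({new_mode.toolbar_layout})")

if __name__ == "__main__":
    ui = TiltAwareUi(initial_angle=15.0)   # board starts nearly flat: drafting mode
    for sample in (18.0, 45.0, 80.0):      # simulated tilt-sensor readings
        ui.on_tilt_sample(sample)
```
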
  • Patent number: 11678006
    Abstract: The description relates to cooperatively controlling devices based upon their location and pose. One example can receive first sensor data associated with a first device and second sensor data associated with a second device. This example can analyze the first sensor data and the second sensor data to determine relative locations and poses of the first device relative to the second device and can supply the locations and poses to enable content to be collectively presented across the devices based upon the relative locations and poses. (A minimal, hypothetical code sketch of this idea follows this entry.)
    Type: Grant
    Filed: June 17, 2021
    Date of Patent: June 13, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Michel Pahud, Nathalie Maryline Riche, William Arthur Stewart Buxton, Nicolai Marquardt
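
As a rough, hypothetical illustration of the cooperative-control idea in the entry above, the sketch below takes sensed poses for two devices, expresses one device's pose in the other's local frame, and uses that relative pose to decide how shared content is split across the two screens. The pose representation and the splitting rule are assumptions, not the claimed method.

```python
# Minimal sketch (hypothetical, not the claimed method): combining two devices'
# sensed positions and orientations to decide how shared content spans them.
import math
from dataclasses import dataclass

@dataclass
class DevicePose:
    x: float        # metres, in a common reference frame
    y: float
    heading: float  # radians

def relative_pose(a: DevicePose, b: DevicePose) -> tuple[float, float, float]:
    """Pose of device b expressed in device a's local frame."""
    dx, dy = b.x - a.x, b.y - a.y
    cos_h, sin_h = math.cos(-a.heading), math.sin(-a.heading)
    local_x = dx * cos_h - dy * sin_h
    local_y = dx * sin_h + dy * cos_h
    return local_x, local_y, b.heading - a.heading

def assign_canvas_halves(a: DevicePose, b: DevicePose) -> dict[str, str]:
    """If b sits to a's right, a shows the left half of the content and b the right."""
    local_x, _, _ = relative_pose(a, b)
    if local_x >= 0:
        return {"device_a": "left half", "device_b": "right half"}
    return {"device_a": "right half", "device_b": "left half"}

if __name__ == "__main__":
    tablet = DevicePose(x=0.0, y=0.0, heading=0.0)
    phone = DevicePose(x=0.35, y=0.02, heading=0.1)   # simulated sensor estimates
    print(assign_canvas_halves(tablet, phone))
```
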
  • Publication number: 20230046470
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode, and may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface may be associated with a second use mode.
    Type: Application
    Filed: July 20, 2022
    Publication date: February 16, 2023
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Hugo Karl Denis ROMAT, Christopher Mervin COLLINS, Nathalie RICHE, Michel PAHUD, Adam Samuel RIDDLE, William Arthur Stewart BUXTON
  • Publication number: 20220408142
    Abstract: The description relates to cooperatively controlling devices based upon their location and pose. One example can receive first sensor data associated with a first device and second sensor data associated with a second device. This example can analyze the first sensor data and the second sensor data to determine relative locations and poses of the first device relative to the second device and can supply the locations and poses to enable content to be collectively presented across the devices based upon the relative locations and poses.
    Type: Application
    Filed: June 17, 2021
    Publication date: December 22, 2022
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Michel PAHUD, Nathalie Maryline RICHE, William Arthur Stewart BUXTON, Nicolai MARQUARDT
  • Patent number: 11429203
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
    Type: Grant
    Filed: June 19, 2020
    Date of Patent: August 30, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hugo Karl Denis Romat, Christopher Mervin Collins, Nathalie Riche, Michel Pahud, Adam Samuel Riddle, William Arthur Stewart Buxton
  • Publication number: 20210406578
    Abstract: A “Stroke Untangler” composes handwritten messages from handwritten strokes, representing overlapping letters or partial letter segments, that are drawn on a touchscreen device or touch-sensitive surface. These overlapping strokes are automatically untangled and then segmented and combined into one or more letters, words, or phrases. Advantageously, segmentation and composition are performed without requiring user gestures, timeouts, or other inputs to delimit characters within words, and without using handwriting recognition-based techniques to guide the untangling and composing of the overlapping strokes to form characters. In other words, the user draws multiple overlapping strokes. Those strokes are then automatically segmented and combined into one or more corresponding characters. Text recognition of the resulting characters is then performed. Further, the segmentation and combination are performed in real time, thereby enabling real-time rendering of the resulting characters in a user interface window. (A minimal, hypothetical code sketch of this pipeline follows this entry.)
    Type: Application
    Filed: September 14, 2021
    Publication date: December 30, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Wolf Kienzle, Mudit Agrawal
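
The Stroke Untangler abstract above describes a pipeline: group overlapping strokes into characters without timeouts or recognition, lay the characters out, and only then run text recognition. The sketch below illustrates that pipeline shape under heavy assumptions; the segmentation decision is a deliberately naive placeholder (a fixed stroke count) standing in for whatever model the real system uses, and every name is invented.

```python
# Minimal sketch of the segment-then-lay-out pipeline the abstract describes.
# The grouping rule here is a placeholder, not the patented segmentation.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stroke:
    points: list[tuple[float, float]]   # (x, y) samples from the touch surface

# A segmenter answers: "does this stroke start a new character?"
Segmenter = Callable[[list[Stroke], Stroke], bool]

def naive_segmenter(current_char: list[Stroke], stroke: Stroke) -> bool:
    """Placeholder: start a new character every two strokes. A real system
    would decide this from stroke geometry, not a fixed count."""
    return len(current_char) >= 2

def untangle(strokes: list[Stroke], is_new_char: Segmenter = naive_segmenter,
             advance: float = 1.2) -> list[list[Stroke]]:
    """Group overlapping strokes into characters, then shift each character
    rightwards so the overlapped input becomes a readable left-to-right word."""
    characters: list[list[Stroke]] = []
    for stroke in strokes:
        if not characters or is_new_char(characters[-1], stroke):
            characters.append([])
        characters[-1].append(stroke)
    # "Untangle": translate each character along x by its index; text
    # recognition of the laid-out characters would follow this step.
    laid_out = []
    for i, char in enumerate(characters):
        laid_out.append([
            Stroke([(x + i * advance, y) for (x, y) in s.points]) for s in char
        ])
    return laid_out

if __name__ == "__main__":
    # Four strokes drawn on top of one another become two characters,
    # each shifted along x by the "advance" spacing.
    overlapped = [Stroke([(0.0, 0.0), (0.5, 1.0)]) for _ in range(4)]
    print([[s.points for s in char] for char in untangle(overlapped)])
```
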
  • Publication number: 20210397274
    Abstract: Methods and systems are provided that are directed to automatically adjusting a user interface based on the tilt position of a digital drawing board. The digital drawing board has a tiltable screen with a sensor. The tiltable screen may be fixed in a stable tilt position. The sensor is used to determine that the digital drawing board has a first tilt position. The digital drawing board displays a first user interface associated with the first tilt position. The first user interface may be associated with a first use mode. The first user interface may also be based on an application running on the digital drawing board. When the sensor senses that the digital drawing board has moved from the first tilt position to a second tilt position, the digital drawing board automatically displays a second user interface associated with the second tilt position. The second user interface has different functionality than the first user interface.
    Type: Application
    Filed: June 19, 2020
    Publication date: December 23, 2021
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul HINCKLEY, Hugo Karl Denis ROMAT, Christopher Mervin COLLINS, Nathalie RICHE, Michel PAHUD, Adam Samuel RIDDLE, William Arthur Stewart BUXTON
  • Publication number: 20180129897
    Abstract: A “Stroke Untangler” composes handwritten messages from handwritten strokes, representing overlapping letters or partial letter segments, that are drawn on a touchscreen device or touch-sensitive surface. These overlapping strokes are automatically untangled and then segmented and combined into one or more letters, words, or phrases. Advantageously, segmentation and composition are performed without requiring user gestures, timeouts, or other inputs to delimit characters within words, and without using handwriting recognition-based techniques to guide the untangling and composing of the overlapping strokes to form characters. In other words, the user draws multiple overlapping strokes. Those strokes are then automatically segmented and combined into one or more corresponding characters. Text recognition of the resulting characters is then performed. Further, the segmentation and combination are performed in real time, thereby enabling real-time rendering of the resulting characters in a user interface window.
    Type: Application
    Filed: January 8, 2018
    Publication date: May 10, 2018
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Wolf Kienzle, Mudit Agrawal
  • Patent number: 9886088
    Abstract: A physically modulated friction stylus system and method for physically modulating friction between a stylus tip and a surface of a computing device to emulate the “feel” of different types of writing instruments writing on different types of surfaces (such as a pen on paper or a paintbrush on canvas). The actual friction between the stylus and the surface is modulated to produce the “feel.” The friction is physically modulated “on the fly,” meaning that friction can be modulated while the stylus tip is in contact with the surface and while the stylus is moving. The friction is modulated depending on the location of the stylus on the surface and on the posture and orientation of the stylus. In addition, the friction can be modulated based on the direction and velocity with which the stylus tip is moving across the surface. Audio may also be used to improve the emulation experience. (A minimal, hypothetical code sketch of this idea follows this entry.)
    Type: Grant
    Filed: August 8, 2012
    Date of Patent: February 6, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Paul Henry Dietz, Kenneth Paul Hinckley, Anoop Gupta, Tristan Thomas Trutna
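
A hypothetical sketch of the friction-modulation idea in the entry above: pick a friction level from the emulated surface under the stylus tip, the tip's tilt, and its speed, then hand the value to an actuator stub. The surface table, the coefficients, and the blending formula are all invented for illustration and do not come from the patent.

```python
# Minimal sketch (hypothetical): choosing a friction level for a stylus tip and
# passing it to an actuator stub. None of these names or numbers are from the patent.
from dataclasses import dataclass

# Hypothetical base friction coefficients for emulated surfaces.
SURFACE_FRICTION = {"paper": 0.55, "canvas": 0.75, "glass": 0.15}

@dataclass
class StylusState:
    x: float
    y: float
    tilt_deg: float        # angle from the surface normal
    speed: float           # tip speed across the surface, mm/s

def surface_at(x: float, y: float) -> str:
    """Stand-in for a lookup of which emulated material lies under the tip."""
    return "canvas" if x > 100 else "paper"

def friction_level(state: StylusState) -> float:
    """Blend base surface friction with tilt and speed effects (invented model)."""
    base = SURFACE_FRICTION[surface_at(state.x, state.y)]
    tilt_factor = 1.0 + 0.3 * (state.tilt_deg / 90.0)      # more drag when held flatter
    speed_factor = max(0.6, 1.0 - state.speed / 2000.0)    # slightly less drag at speed
    return base * tilt_factor * speed_factor

def set_actuator(level: float) -> None:
    """Stub for whatever hardware interface would physically modulate tip friction."""
    print(f"friction actuator set to {level:.2f}")

if __name__ == "__main__":
    set_actuator(friction_level(StylusState(x=120.0, y=40.0, tilt_deg=35.0, speed=300.0)))
```
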
  • Patent number: 9881224
    Abstract: A “Stroke Untangler” composes handwritten messages from handwritten strokes, representing overlapping letters or partial letter segments, that are drawn on a touchscreen device or touch-sensitive surface. These overlapping strokes are automatically untangled and then segmented and combined into one or more letters, words, or phrases. Advantageously, segmentation and composition are performed without requiring user gestures, timeouts, or other inputs to delimit characters within words, and without using handwriting recognition-based techniques to guide the untangling and composing of the overlapping strokes to form characters. In other words, the user draws multiple overlapping strokes. Those strokes are then automatically segmented and combined into one or more corresponding characters. Text recognition of the resulting characters is then performed. Further, the segmentation and combination are performed in real time, thereby enabling real-time rendering of the resulting characters in a user interface window.
    Type: Grant
    Filed: December 17, 2013
    Date of Patent: January 30, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Wolf Kienzle, Kenneth Paul Hinckley, Mudit Agrawal
  • Patent number: 9846536
    Abstract: Technologies pertaining to composing, displaying, and/or transmitting handwritten content through utilization of a touch-sensitive display screen of a mobile computing device are described herein. A user of the mobile computing device can set forth strokes on the touch-sensitive display screen, one on top of another, wherein such strokes correspond to different handwritten characters. Stroke segmentation can be undertaken to determine which strokes correspond to which characters in a handwritten sequence of characters.
    Type: Grant
    Filed: December 17, 2012
    Date of Patent: December 19, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Wolf Kienzle, Kenneth Paul Hinckley
  • Patent number: 9244545
    Abstract: A “Contact Discriminator” provides various techniques for differentiating between valid and invalid contacts received from any input methodology by one or more touch-sensitive surfaces of a touch-sensitive computing device. Examples of contacts include single, sequential, concurrent, or simultaneous user finger touches (including gesture-type touches), pen or stylus touches or inputs, hover-type inputs, or any combination thereof. The Contact Discriminator then acts on valid contacts (i.e., contacts intended as inputs) while rejecting or ignoring invalid contacts or inputs. Advantageously, the Contact Discriminator is further capable of disabling or ignoring regions of input surfaces, such as tablet touch screens, that are expected to receive unintentional contacts, or intentional contacts not intended as inputs, for device or application control purposes. (A minimal, hypothetical code sketch of this idea follows this entry.)
    Type: Grant
    Filed: June 21, 2012
    Date of Patent: January 26, 2016
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kenneth Paul Hinckley, Hrvoje Benko, Steven Bathiche, Paul Henry Dietz, Anoop Gupta
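
A minimal sketch of contact discrimination in the spirit of the entry above: each touch contact is accepted or rejected using simple heuristics such as whether it falls in a region marked to ignore, how large the contact patch is, and whether a pen was down at the same time. The thresholds and the Contact fields are assumptions, not the patented classifier.

```python
# Minimal sketch (hypothetical heuristics): decide whether a touch contact should
# be acted on or rejected, e.g. as a resting palm or a touch in an ignored region.
from dataclasses import dataclass

@dataclass
class Contact:
    x: float
    y: float
    area_mm2: float       # contact patch size reported by the digitizer
    while_pen_down: bool  # was a stylus in contact at the same time?

# Rectangles (x0, y0, x1, y1) the application has asked to ignore, e.g. the
# bezel strip where a hand grips the tablet.
IGNORED_REGIONS = [(0.0, 0.0, 15.0, 200.0)]

PALM_AREA_THRESHOLD = 250.0   # invented: large patches are treated as palms

def in_ignored_region(c: Contact) -> bool:
    return any(x0 <= c.x <= x1 and y0 <= c.y <= y1
               for (x0, y0, x1, y1) in IGNORED_REGIONS)

def is_valid_input(c: Contact) -> bool:
    """Reject likely-unintentional contacts; accept the rest as input."""
    if in_ignored_region(c):
        return False
    if c.area_mm2 > PALM_AREA_THRESHOLD:
        return False                      # probably a palm resting on the screen
    if c.while_pen_down and c.area_mm2 > 120.0:
        return False                      # large touch while writing: likely the writing hand
    return True

if __name__ == "__main__":
    palm = Contact(x=60.0, y=90.0, area_mm2=400.0, while_pen_down=True)
    finger = Contact(x=60.0, y=90.0, area_mm2=40.0, while_pen_down=True)
    print(is_valid_input(palm), is_valid_input(finger))   # False True
```
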
  • Patent number: 9219804
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes state. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input. (A minimal, hypothetical state-machine sketch of this flow follows this entry.)
    Type: Grant
    Filed: August 8, 2013
    Date of Patent: December 22, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park
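
The auto-answer flow in the entry above reads naturally as a small state machine over proximity-sensor readings: ringing while covered (in a pocket), then uncovered (pulled out), then covered again (held to the ear) triggers the answer. The sketch below is a hypothetical rendering of that flow; the states, the class, and the sensor interface are invented.

```python
# Minimal sketch of an answer-without-tapping flow as a state machine over
# proximity readings. States, transitions, and names are invented for illustration.
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()
    RINGING_COVERED = auto()    # call arriving while the sensor is blocked (in a pocket)
    RINGING_UNCOVERED = auto()  # user has pulled the phone out
    ANSWERED = auto()

class AutoAnswer:
    def __init__(self):
        self.phase = Phase.IDLE

    def on_incoming_call(self, proximity_blocked: bool) -> None:
        self.phase = Phase.RINGING_COVERED if proximity_blocked else Phase.IDLE

    def on_proximity_change(self, blocked: bool) -> None:
        if self.phase is Phase.RINGING_COVERED and not blocked:
            self.phase = Phase.RINGING_UNCOVERED      # phone left the pocket
        elif self.phase is Phase.RINGING_UNCOVERED and blocked:
            self.phase = Phase.ANSWERED               # now held to the ear
            print("answering call without user input")

if __name__ == "__main__":
    phone = AutoAnswer()
    phone.on_incoming_call(proximity_blocked=True)   # ringing in the pocket
    phone.on_proximity_change(blocked=False)         # taken out of the pocket
    phone.on_proximity_change(blocked=True)          # raised to the ear -> answered
```
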
  • Patent number: 9201520
    Abstract: A “Motion and Context Sharing Technique” uses a pen or stylus enhanced to incorporate multiple sensors, i.e., a “sensor pen,” and a power supply to enable various input techniques and gestures. Various combinations of pen stroke, pressure, motion, and other sensor pen inputs are used to enable hybrid input techniques that incorporate simultaneous, concurrent, sequential, and/or interleaved sensor pen inputs and touch inputs (i.e., finger, palm, hand, etc.) on displays or other touch-sensitive surfaces. This enables a variety of motion-gesture inputs relating to the context of how the sensor pen is used or held, even when the pen is not in contact with, or within sensing range of, the computing device digitizer. In other words, any particular touch inputs or combinations of touch inputs are correlated with any desired sensor pen inputs, with those correlated inputs then being used to initiate any desired action by the computing device. (A minimal, hypothetical code sketch of this idea follows this entry.)
    Type: Grant
    Filed: May 28, 2013
    Date of Patent: December 1, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Hrvoje Benko, Xiang Chen, Kenneth Paul Hinckley
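
As a hypothetical illustration of correlating sensor-pen state with concurrent touch input, the sketch below maps a (pen sample, touch sample) pair to a named command. The gesture names, fields, and thresholds are assumptions made for the example, not gestures disclosed in the patent.

```python
# Minimal sketch (hypothetical): correlating a sensor pen's readings with a
# concurrent touch on the display to pick an action. All names are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PenSample:
    held: bool             # pen is gripped, even if not near the digitizer
    shake_magnitude: float # motion-sensor estimate of how hard the pen was shaken
    barrel_pressure: float # grip/squeeze pressure on the barrel, 0..1

@dataclass
class TouchSample:
    fingers: int           # number of finger contacts on the display

def combined_gesture(pen: PenSample, touch: TouchSample) -> Optional[str]:
    """Map a (pen state, touch state) pair to a hypothetical command."""
    if pen.held and pen.shake_magnitude > 2.5 and touch.fingers == 0:
        return "shake-to-undo"
    if pen.held and pen.barrel_pressure > 0.8 and touch.fingers == 1:
        return "squeeze-plus-touch: open radial menu at the finger"
    if not pen.held and touch.fingers == 2:
        return "two-finger pan (pen stowed)"
    return None

if __name__ == "__main__":
    print(combined_gesture(PenSample(held=True, shake_magnitude=3.0, barrel_pressure=0.1),
                           TouchSample(fingers=0)))
```
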
  • Publication number: 20150169975
    Abstract: A “Stroke Untangler” composes handwritten messages from handwritten strokes, representing overlapping letters or partial letter segments, that are drawn on a touchscreen device or touch-sensitive surface. These overlapping strokes are automatically untangled and then segmented and combined into one or more letters, words, or phrases. Advantageously, segmentation and composition are performed without requiring user gestures, timeouts, or other inputs to delimit characters within words, and without using handwriting recognition-based techniques to guide the untangling and composing of the overlapping strokes to form characters. In other words, the user draws multiple overlapping strokes. Those strokes are then automatically segmented and combined into one or more corresponding characters. Text recognition of the resulting characters is then performed. Further, the segmentation and combination are performed in real time, thereby enabling real-time rendering of the resulting characters in a user interface window.
    Type: Application
    Filed: December 17, 2013
    Publication date: June 18, 2015
    Applicant: Microsoft Corporation
    Inventors: Wolf Kienzle, Kenneth Paul Hinckley, Mudit Agrawal
  • Publication number: 20140171153
    Abstract: Technologies pertaining to composing, displaying, and/or transmitting handwritten content through utilization of a touch-sensitive display screen of a mobile computing device are described herein. A user of the mobile computing device can set forth strokes on the touch-sensitive display screen, one on top of another, wherein such strokes correspond to different handwritten characters. Stroke segmentation can be undertaken to determine which strokes correspond to which characters in a handwritten sequence of characters.
    Type: Application
    Filed: December 17, 2012
    Publication date: June 19, 2014
    Applicant: Microsoft Corporation
    Inventors: Wolf Kienzle, Kenneth Paul Hinckley
  • Publication number: 20140043242
    Abstract: A physically modulated friction stylus system and method for physically modulating friction between a stylus tip and a surface of a computing device to emulate the “feel” of different types of writing instruments writing on different types of surfaces (such as a pen on paper or a paintbrush on canvas). The actual friction between the stylus and the surface is modulated to produce the “feel.” The friction is physically modulated “on the fly,” meaning that friction can be modulated while the stylus tip is in contact with the surface and while the stylus is moving. The friction is modulated depending on the location of the stylus on the surface and on the posture and orientation of the stylus. In addition, the friction can be modulated based on the direction and velocity with which the stylus tip is moving across the surface. Audio may also be used to improve the emulation experience.
    Type: Application
    Filed: August 8, 2012
    Publication date: February 13, 2014
    Applicant: Microsoft Corporation
    Inventors: Paul Henry Dietz, Kenneth Paul Hinckley, Anoop Gupta, Tristan Thomas Trutna
  • Publication number: 20130324194
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes states. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input.
    Type: Application
    Filed: August 8, 2013
    Publication date: December 5, 2013
    Applicant: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park
  • Publication number: 20130257777
    Abstract: A “Motion and Context Sharing Technique” uses a pen or stylus enhanced to incorporate multiple sensors, i.e., a “sensor pen,” and a power supply to enable various input techniques and gestures. Various combinations of pen stroke, pressure, motion, and other sensor pen inputs are used to enable hybrid input techniques that incorporate simultaneous, concurrent, sequential, and/or interleaved sensor pen inputs and touch inputs (i.e., finger, palm, hand, etc.) on displays or other touch-sensitive surfaces. This enables a variety of motion-gesture inputs relating to the context of how the sensor pen is used or held, even when the pen is not in contact with, or within sensing range of, the computing device digitizer. In other words, any particular touch inputs or combinations of touch inputs are correlated with any desired sensor pen inputs, with those correlated inputs then being used to initiate any desired action by the computing device.
    Type: Application
    Filed: May 28, 2013
    Publication date: October 3, 2013
    Applicant: Microsoft Corporation
    Inventors: Hrvoje Benko, Xiang Chen, Kenneth Paul Hinckley
  • Patent number: 8509842
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes states. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input.
    Type: Grant
    Filed: February 18, 2011
    Date of Patent: August 13, 2013
    Assignee: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park