Patents by Inventor Kenton M. Lyons

Kenton M. Lyons has filed for patents to protect the following inventions. The listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130318458
    Abstract: In accordance with some embodiments, the depiction of chrome in user interfaces may be modified based on ambient conditions. There are a number of ambient conditions that can be used as a trigger to modify the chrome. For example, the current direction from which light hits the computer may be used to change the chrome shadow depictions on user selectable image elements, such as button icons.
    Type: Application
    Filed: November 21, 2011
    Publication date: November 28, 2013
    Inventors: Kenton M. Lyons, Joshua J. Ratcliff
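
A minimal sketch, not taken from the filing, of one way the abstract's example could be realized: deriving a button's drop-shadow offset from the direction of incident ambient light. The sensor reading, function name, and fixed shadow length are illustrative assumptions.

```python
import math

def shadow_offset(light_azimuth_deg, shadow_length_px=4):
    """Place a drop shadow opposite the incoming light direction (illustrative).

    light_azimuth_deg: direction the ambient light arrives from, measured
    clockwise from the top of the screen (hypothetical sensor output).
    Returns a (dx, dy) pixel offset for the chrome's shadow layer.
    """
    # The shadow falls on the side of the button facing away from the light.
    theta = math.radians((light_azimuth_deg + 180) % 360)
    dx = shadow_length_px * math.sin(theta)
    dy = -shadow_length_px * math.cos(theta)
    return round(dx), round(dy)

# Light arriving from the upper-left pushes the shadow toward the lower-right.
print(shadow_offset(315))   # (3, 3)
```
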
  • Publication number: 20130271351
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In the context of a method, an orientation of each of a plurality of segments of a multi-segment wearable accessory is determined relative to an axis through the multi-segment wearable accessory, such as by determining an angle of each of the plurality of segments relative to the axis through the multi-segment wearable accessory. A relative ordering of the plurality of segments of the multi-segment wearable accessory may then be determined based upon the orientation of each of the plurality of segments relative to the axis.
    Type: Application
    Filed: September 6, 2012
    Publication date: October 17, 2013
    Applicant: NOKIA CORPORATION
    Inventors: Kenton M. Lyons, David H. Nguyen, Sean White
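
A small sketch of the ordering step described in the abstract above, assuming each segment reports its angle about the accessory's long axis (for example, from an inertial sensor); sorting those angles yields the relative ordering of the segments. The names and example angles are illustrative, not from the application.

```python
def order_segments(segment_angles):
    """Order segments around the accessory from their per-segment angles.

    segment_angles: dict mapping a segment id to its measured angle (degrees)
    about the axis running through the wearable accessory.
    Returns segment ids sorted by angle, i.e. their relative ordering
    around the wearer's wrist.
    """
    return sorted(segment_angles, key=lambda seg: segment_angles[seg] % 360)

# Hypothetical readings from four display segments on a bracelet.
angles = {"seg_a": 200.0, "seg_b": 20.0, "seg_c": 110.0, "seg_d": 290.0}
print(order_segments(angles))   # ['seg_b', 'seg_c', 'seg_a', 'seg_d']
```
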
  • Publication number: 20130271454
    Abstract: The presentation of a computer display may be modified based on what the user is viewing on the display. In one embodiment, gaze detection technology may be used to determine what the user is looking at. Based on what the user is looking at, the display may be altered to either improve or degrade the display at the region the user is looking at.
    Type: Application
    Filed: September 8, 2011
    Publication date: October 17, 2013
    Inventors: Kenton M. Lyons, Joshua J. Ratcliff, Horst W. Haussecker
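
The abstract describes altering rendering at the gazed-at region to either improve or degrade it. A minimal sketch under the assumption that a gaze tracker supplies a fixation point and each screen tile is rendered at a quality level; the function, radius, and quality values are hypothetical.

```python
def quality_for_tile(tile_center, gaze_point, radius_px=150, foveate=True):
    """Pick a render quality for one screen tile from the current gaze point.

    If `foveate` is True the gazed-at region is improved (full quality) and
    the periphery degraded; False inverts that, e.g. to obscure the viewed
    region from onlookers. Illustrative only.
    """
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    near_gaze = (dx * dx + dy * dy) ** 0.5 <= radius_px
    high, low = 1.0, 0.25          # relative render quality levels
    if foveate:
        return high if near_gaze else low
    return low if near_gaze else high

print(quality_for_tile((960, 540), gaze_point=(950, 560)))  # 1.0 (improved)
print(quality_for_tile((100, 100), gaze_point=(950, 560)))  # 0.25 (degraded)
```
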
  • Publication number: 20130271355
    Abstract: A method, apparatus, and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In particular, a method, apparatus, and computer program product are provided that determine an angle between at least two segments of the wearable accessory and cause one or more of the segments to operate in a manner that is at least partially dependent upon the angle between the at least two segments. In some cases, a determination may be made as to whether the wearable accessory is in a worn state or in an unworn state. For example, in the worn state, the segments may have separate functionality, whereas in the unworn state the segments may act as a single display or present an idle screen.
    Type: Application
    Filed: July 16, 2012
    Publication date: October 17, 2013
    Applicant: NOKIA CORPORATION
    Inventors: Kenton M. Lyons, David H. Nguyen, Sean White
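
A sketch of the angle-dependent behavior described above. It assumes the inter-segment angle is already measured (for instance, by hinge sensors) and uses an invented heuristic: on a wrist the segments bend around it, so a clearly non-flat angle is treated as "worn". Names, thresholds, and the per-mode functions are assumptions.

```python
def accessory_mode(inter_segment_angle_deg, flat_tolerance_deg=15):
    """Infer worn/unworn state from the angle between two adjacent segments.

    Roughly flat (near 180 degrees) suggests the accessory is laid out
    unworn; a noticeably bent angle suggests it is wrapped around a wrist.
    Heuristic is illustrative.
    """
    if abs(inter_segment_angle_deg - 180) <= flat_tolerance_deg:
        return "unworn"
    return "worn"

def configure_segments(inter_segment_angle_deg):
    """Choose per-segment behavior based on the inferred worn state."""
    if accessory_mode(inter_segment_angle_deg) == "worn":
        return ["clock", "notifications"]        # separate functions per segment
    return ["single_display", "single_display"]  # act as one display / idle screen

print(configure_segments(135))  # worn: ['clock', 'notifications']
print(configure_segments(178))  # unworn: ['single_display', 'single_display']
```
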
  • Publication number: 20130271389
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In particular, a method, apparatus, and computer program product are provided that receive touch input via at least first and second segments of a multi-segment wearable accessory and determine that the touch input associated with the second segment is moving relative to the touch input associated with the first segment. A presentation of content displayed at least partially by the first segment may be modified based upon movement of the touch input associated with the second segment, such as in an expand operation, in which content is displayed on a second segment, or a collapse operation, in which content is removed from the second segment following an expand operation.
    Type: Application
    Filed: August 15, 2012
    Publication date: October 17, 2013
    Applicant: NOKIA CORPORATION
    Inventors: Kenton M. Lyons, David H. Nguyen, Daniel L. Ashbrook
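
A compact sketch of the expand/collapse decision described above, assuming the device reports a touch position per segment over time; the threshold, data layout, and function name are illustrative assumptions.

```python
def classify_two_segment_gesture(first_track, second_track, threshold_px=30):
    """Classify a two-segment touch gesture as expand, collapse, or neither.

    first_track / second_track: x positions (px) of the touch on the first
    and second segments over time. The second touch moving away from the
    first suggests an expand; moving toward it suggests a collapse.
    Illustrative heuristic.
    """
    first_motion = first_track[-1] - first_track[0]
    second_motion = second_track[-1] - second_track[0]
    relative = second_motion - first_motion   # movement of 2nd relative to 1st
    if relative > threshold_px:
        return "expand"      # start showing content on the second segment
    if relative < -threshold_px:
        return "collapse"    # remove content from the second segment
    return "none"

print(classify_two_segment_gesture([100, 102, 101], [300, 340, 390]))  # expand
print(classify_two_segment_gesture([100, 101, 103], [390, 340, 300]))  # collapse
```
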
  • Publication number: 20130271350
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In particular, a method, apparatus, and computer program product are provided that determine an orientation of a plurality of segments of a multi-segment wearable accessory, identify one of the segments to be associated with a peripheral device at least partially based upon the orientations determined, and provide for an association of the segment identified with the peripheral device. In this way, a particular segment may be selected to present content relating to the peripheral device or vice versa, for example, based on the orientation of the segment with respect to the peripheral device and/or the other segments.
    Type: Application
    Filed: August 22, 2012
    Publication date: October 17, 2013
    Applicant: NOKIA CORPORATION
    Inventor: Kenton M. Lyons
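
A sketch of the association step, assuming each segment's facing direction and a bearing toward the peripheral are available as angles; the "most closely facing" rule is one illustrative reading of the abstract, and all names are hypothetical.

```python
def pick_segment_for_peripheral(segment_orientations, peripheral_bearing_deg):
    """Choose the segment whose facing direction best matches the peripheral.

    segment_orientations: dict of segment id -> facing direction (degrees).
    peripheral_bearing_deg: direction from the accessory toward the peripheral.
    Returns the id of the segment to associate with (e.g., present content
    for) the peripheral device. Illustrative selection rule.
    """
    def angular_distance(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    return min(segment_orientations,
               key=lambda seg: angular_distance(segment_orientations[seg],
                                                peripheral_bearing_deg))

orientations = {"seg_a": 10.0, "seg_b": 95.0, "seg_c": 200.0, "seg_d": 280.0}
print(pick_segment_for_peripheral(orientations, peripheral_bearing_deg=270))
# 'seg_d' faces closest to the peripheral
```
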
  • Publication number: 20130271392
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In particular, a method, apparatus, and computer program product are provided that provide for a determination of an initial angle defined between first and second segments in response to receipt of an initial contact component of a touch input via the first and second segments. A change in the initial angle can then be determined in response to movement of the segments with respect to each other, and an operation may be executed at least partially based upon the change in the initial angle determined. Thus, a continuous input parameter may be supplied by a user so as to provide another dimension of input to the wearable accessory.
    Type: Application
    Filed: September 21, 2012
    Publication date: October 17, 2013
    Applicant: NOKIA CORPORATION
    Inventor: Kenton M. Lyons
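
A short sketch of the continuous-input idea described above, assuming an initial angle is captured when the touch begins and the running difference from that angle is mapped onto a parameter such as volume; the mapping, gain, and class name are assumptions.

```python
class BendInput:
    """Map the change in the inter-segment angle, measured from the angle at
    initial touch contact, onto a continuous 0..1 parameter (illustrative)."""

    def __init__(self, initial_angle_deg, gain=0.01):
        self.initial_angle_deg = initial_angle_deg   # captured on touch-down
        self.gain = gain                             # parameter units per degree

    def value(self, current_angle_deg):
        # Positive when the user bends the segments further than at touch-down.
        delta = current_angle_deg - self.initial_angle_deg
        return max(0.0, min(1.0, 0.5 + self.gain * delta))

# Touch begins with the two segments at 150 degrees; bending to 170 degrees
# raises the controlled value, flattening to 130 degrees lowers it.
ctrl = BendInput(initial_angle_deg=150)
print(ctrl.value(170))   # 0.7
print(ctrl.value(130))   # 0.3
```
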
  • Publication number: 20130271495
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. Each screen presented on or capable of being presented on the display of a segment of the multi-segment wearable accessory may be considered a virtual segment, and the number of virtual segments may be greater than the number of physical segments of the accessory. One or more of the virtual segments may be associated with one or more of the segments in an overlaid configuration, such that a topmost virtual segment is presented for viewing while another virtual segment lies below the topmost virtual segment, hidden from the user's view. A virtual segment may be replaced with presentation of another virtual segment in response to rotation of the accessory.
    Type: Application
    Filed: August 20, 2012
    Publication date: October 17, 2013
    Applicant: NOKIA CORPORATION
    Inventors: David H. Nguyen, Kenton M. Lyons
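
A sketch of the virtual-segment mapping, assuming there are more virtual screens than physical segments and that a rotation event shifts which virtual segment is topmost on each physical display; the class, screen names, and rotation step are illustrative.

```python
class VirtualSegments:
    """More virtual screens than physical segments; rotating the accessory
    shifts which virtual segment is topmost on each physical display."""

    def __init__(self, virtual, physical_count):
        self.virtual = list(virtual)          # e.g. app screens
        self.physical_count = physical_count
        self.offset = 0                       # advanced by rotation events

    def rotate(self, steps=1):
        self.offset = (self.offset + steps) % len(self.virtual)

    def visible(self):
        """Topmost virtual segment shown on each physical segment."""
        return [self.virtual[(self.offset + i) % len(self.virtual)]
                for i in range(self.physical_count)]

screens = VirtualSegments(["clock", "mail", "music", "map", "weather"],
                          physical_count=3)
print(screens.visible())   # ['clock', 'mail', 'music']
screens.rotate()
print(screens.visible())   # ['mail', 'music', 'map']
```
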
  • Publication number: 20130271390
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In particular, a method, apparatus, and computer program product are provided that provide for a determination that touch input received via at least first and second segments of a multi-segment wearable accessory represents a rotational gesture. As a result, information that is presented upon a display of the first segment prior to receipt of the touch input is presented upon the display of the second segment and information that is presented upon a display of the second segment prior to receipt of the touch input is presented upon the display of the first segment in a “swap” operation following receipt of the touch input.
    Type: Application
    Filed: August 27, 2012
    Publication date: October 17, 2013
    Applicant: NOKIA CORPORATION
    Inventors: Kenton M. Lyons, David H. Nguyen, Daniel L. Ashbrook
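
A minimal sketch of the swap operation described above: once touch input on two segments is classified as a rotational gesture, their displayed content is exchanged. The gesture test here (the two touches moving in opposite directions far enough) is an illustrative stand-in, not the filing's method.

```python
def is_rotational_gesture(first_delta_px, second_delta_px, min_travel_px=20):
    """Treat touches on two segments moving in opposite directions, each far
    enough, as a rotational gesture (illustrative heuristic)."""
    opposite = first_delta_px * second_delta_px < 0
    far_enough = (abs(first_delta_px) >= min_travel_px
                  and abs(second_delta_px) >= min_travel_px)
    return opposite and far_enough

def handle_touch(content_by_segment, first_delta_px, second_delta_px):
    """Swap what the first and second segments display after the gesture."""
    if is_rotational_gesture(first_delta_px, second_delta_px):
        content_by_segment["first"], content_by_segment["second"] = (
            content_by_segment["second"], content_by_segment["first"])
    return content_by_segment

print(handle_touch({"first": "map", "second": "music"}, 45, -50))
# {'first': 'music', 'second': 'map'}
```
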
  • Patent number: 8560484
    Abstract: In some embodiments a controller is adapted to receive textual representation of content experienced by a user, to receive information about an interaction by the user with the content, and to determine a user model in response to the textual representation and the interaction. Other embodiments are described and claimed.
    Type: Grant
    Filed: December 17, 2010
    Date of Patent: October 15, 2013
    Assignee: Intel Corporation
    Inventors: Kenton M. Lyons, Barbara Rosario, Trevor Pering, Roy Want
  • Publication number: 20130258089
    Abstract: Gaze detection technology may be used to aim aimable optics on an imaging device. As a result, the user need not do anything more to direct the camera's line of sight than to look at something. In some embodiments, the camera may then adjust the focus and exposure based on the gaze target. In addition, the camera may keep track of how long the user looks at a given area within a scene and, if a time threshold is exceeded, the camera may zoom in to that gaze target.
    Type: Application
    Filed: November 3, 2011
    Publication date: October 3, 2013
    Applicant: Intel Corporation
    Inventors: Kenton M. Lyons, Joshua J. Ratcliff
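
A sketch of the dwell-to-zoom behavior in the abstract, assuming the camera exposes focus, exposure, and zoom controls and the gaze tracker reports a target region each frame; the timing threshold and all names are hypothetical.

```python
import time

class GazeAimedCamera:
    """Aim focus/exposure at the gaze target; zoom in after a long dwell
    (illustrative sketch)."""

    def __init__(self, dwell_threshold_s=1.5):
        self.dwell_threshold_s = dwell_threshold_s
        self.current_target = None
        self.dwell_start = None

    def on_gaze(self, target_region, now=None):
        now = time.monotonic() if now is None else now
        if target_region != self.current_target:
            # Gaze moved to a new area: restart the dwell timer.
            self.current_target = target_region
            self.dwell_start = now
            return {"focus": target_region, "expose": target_region}
        if now - self.dwell_start >= self.dwell_threshold_s:
            # Looked at the same area long enough: also zoom to it.
            return {"focus": target_region, "expose": target_region,
                    "zoom_to": target_region}
        return {"focus": target_region, "expose": target_region}

cam = GazeAimedCamera()
print(cam.on_gaze("boat", now=0.0))   # focus/expose on the boat
print(cam.on_gaze("boat", now=2.0))   # dwell exceeded: also zoom in
```
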
  • Publication number: 20130259312
    Abstract: In response to the detection of what the user is looking at on a display screen, the playback of audio or visual media associated with that region may be modified. For example, video in the region the user is looking at may be sped up or slowed down. A still image in the region of interest may be transformed into a moving picture. Audio associated with an object depicted in the region of interest on the display screen may be activated in response to user gaze detection.
    Type: Application
    Filed: September 8, 2011
    Publication date: October 3, 2013
    Inventors: Kenton M. Lyons, Joshua J. Ratcliff, Trevor Pering
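
A small dispatch sketch for the behavior above, assuming each screen region carries a media descriptor and the gaze tracker reports which region is being viewed. The actions mirror the abstract's examples; the data layout and names are assumptions.

```python
def on_gaze_region(region):
    """Adjust playback of the media in the region the user is looking at.

    region: dict describing the media at the gazed-at screen region, e.g.
    {"kind": "video"}, {"kind": "still"}, or {"kind": "object", "audio": ...}.
    Returns the playback action to apply (illustrative).
    """
    kind = region.get("kind")
    if kind == "video":
        # e.g. speed up (or slow down) the video the user is watching
        return {"action": "set_rate", "rate": region.get("rate", 1.5)}
    if kind == "still":
        # turn the still image into a moving picture while it is viewed
        return {"action": "animate"}
    if kind == "object" and "audio" in region:
        # activate audio associated with the depicted object
        return {"action": "play_audio", "clip": region["audio"]}
    return {"action": "none"}

print(on_gaze_region({"kind": "video"}))                     # set_rate
print(on_gaze_region({"kind": "object", "audio": "birds"}))  # play_audio
```
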
  • Publication number: 20120250952
    Abstract: In one embodiment an electronic device comprises an input/output module, a memory coupled to the input/output module, and logic to store a first image of a face in the memory module, associate an identity with the first image of a face, subsequently collect a second image of a face, determine a correlation between features on the first image of a face and the second image of a face, and store the correlation between the first image and the second image. Other embodiments may be described.
    Type: Application
    Filed: April 1, 2011
    Publication date: October 4, 2012
    Inventors: Branislav Kveton, Kenton M. Lyons
  • Publication number: 20120197630
    Abstract: Methods and systems to summarize a source text as a function of contextual information, including to fit a summary within a context-based allotted time. The context-based allotted time may be apportioned amongst multiple portions of the source text, such as by relevance. The context-based allotted time and/or relevance may be user-specified and/or determined, such as by look-up, rule, computation, inference, and/or machine learning. During summary presentation, one or more portions of the source text may be re-summarized, such as to adjust a level of detail. A presentation rate may be user-controllable. Where new and/or changed contextual information affects an available time to review a remaining portion of the summary, the summary presentation may be automatically adjusted, and/or one or more portions of the source text may be re-summarized based on a revised context-based allotted time.
    Type: Application
    Filed: January 28, 2011
    Publication date: August 2, 2012
    Inventors: Kenton M. LYONS, Barbara ROSARIO, Trevor PERING, Roy WANT
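
A sketch of the time-apportionment step, assuming the allotted review time and per-portion relevance weights are already known (by look-up, rule, or learning, as the abstract notes); converting seconds into a word budget via a reading rate is an illustrative assumption, as are all names.

```python
def apportion_summary_budget(portions, allotted_time_s, words_per_second=3.0):
    """Split a context-based allotted time across portions of the source text
    by relevance, then express each share as a word budget for summarization.

    portions: dict of portion name -> relevance weight (higher = more time).
    Returns portion name -> target summary length in words. Illustrative.
    """
    total_relevance = sum(portions.values())
    budgets = {}
    for name, relevance in portions.items():
        share_s = allotted_time_s * relevance / total_relevance
        budgets[name] = int(share_s * words_per_second)
    return budgets

# E.g. a 90-second review window for a three-section report.
print(apportion_summary_budget(
    {"findings": 5.0, "methods": 3.0, "appendix": 1.0}, allotted_time_s=90))
# {'findings': 150, 'methods': 90, 'appendix': 30}
```
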
  • Publication number: 20120162091
    Abstract: Methods and systems to allow users to gain the advantages of a large-format touch display by using smaller, cost-effective touch displays. Given two adjacent displays, regions may be created on both sides of the boundary between the displays. These regions may grow and shrink based on the user's movement, i.e., the velocity of a stylus or finger towards the boundary. If the user lifts his stylus within a region on one display, he may finish the tracking on the other by landing within the corresponding region of the latter display. This may allow a user to begin a drag on one display, drag towards another display, and “flyover” to the second display without slowing. The lift event may be removed when the first display detects the stylus being lifted as it moves towards the second. The landing on the second display may be removed.
    Type: Application
    Filed: December 23, 2010
    Publication date: June 28, 2012
    Inventors: Kenton M. Lyons, Nirmal Patel, Trevor Pering, Roy Want
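
A sketch of the boundary-region logic described above: the hand-off region near the shared edge grows with the stylus's velocity toward the boundary, and a lift inside it followed by a landing in the partner region continues the drag on the second display. The constants, growth rule, and names are illustrative.

```python
def handoff_region_width(velocity_toward_boundary, base_px=40, gain=0.15,
                         max_px=300):
    """Region near the display boundary grows as the stylus moves toward it."""
    grow = max(0.0, velocity_toward_boundary) * gain
    return min(max_px, base_px + grow)

def continue_drag(lift_distance_px, land_distance_px, lift_velocity):
    """Decide whether a lift on display 1 and a landing on display 2 should
    be joined into one continuous 'flyover' drag (illustrative rule).

    lift_distance_px / land_distance_px: distance of the lift and landing
    points from the shared boundary on their respective displays.
    """
    width = handoff_region_width(lift_velocity)
    return lift_distance_px <= width and land_distance_px <= width

# Fast drag toward the boundary: lift 120 px before the edge, land 90 px after.
print(continue_drag(120, 90, lift_velocity=800))   # True: suppress lift/landing
# Slow movement keeps the regions small, so the same lift ends the drag.
print(continue_drag(120, 90, lift_velocity=100))   # False
```
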
  • Publication number: 20120158390
    Abstract: In some embodiments a controller is adapted to receive textual representation of content experienced by a user, to receive information about an interaction by the user with the content, and to determine a user model in response to the textual representation and the interaction. Other embodiments are described and claimed.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Inventors: Kenton M. Lyons, Barbara Rosario, Trevor Pering, Roy Want
  • Publication number: 20100164839
    Abstract: In general, in one aspect, the disclosure describes an apparatus that includes memory to store application window images of applications running thereon and a composite display generator to generate a composite display based on at least some of the application window images stored in the memory and any application window images received from other wireless devices. The apparatus may also include a remote display to transmit at least some of the application window images stored in the memory to the other wireless devices. The apparatus may include a frame buffer to store the composite display and a display to display the composite display stored in the frame buffer. The apparatus may include a communication interface to establish wireless communication links with the other wireless devices. Other embodiments are described and claimed.
    Type: Application
    Filed: December 31, 2008
    Publication date: July 1, 2010
    Inventors: Kenton M. Lyons, Roy Want, Trevor A. Pering
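
A sketch of the compositing step described above, assuming application window images (local ones and those received from other wireless devices) are kept as position/size records and merged into one draw list that a frame buffer would be filled from; the painter's-order approach and all names are assumptions.

```python
def compose(local_windows, remote_windows, screen=(1920, 1080)):
    """Build a composite display description from local application window
    images and any received from other wireless devices (illustrative).

    Each window is a dict like {"id": ..., "pos": (x, y), "size": (w, h)}.
    Returns the back-to-front draw list for the composite display.
    """
    draw_list = []
    for source, windows in (("local", local_windows), ("remote", remote_windows)):
        for win in windows:
            draw_list.append({**win, "source": source})
    # Painter's order: draw in arrival order, remote windows on top here.
    return {"screen": screen, "draw": draw_list}

composite = compose(
    local_windows=[{"id": "editor", "pos": (0, 0), "size": (960, 1080)}],
    remote_windows=[{"id": "phone_map", "pos": (960, 0), "size": (960, 1080)}])
print([w["id"] for w in composite["draw"]])   # ['editor', 'phone_map']
```
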
  • Publication number: 20100169791
    Abstract: A Remote Display Protocol (RDP) server, such as a hand-held mobile device, may control various aspects of the RDP client, such as a large display. For example, if a user wishes to use a RDP system to project the display from their laptop computer onto a large-screen display mounted in a conference room, embodiments would allow the user to control aspects of the remote client such as location, size, and full-screen treatment.
    Type: Application
    Filed: December 31, 2008
    Publication date: July 1, 2010
    Inventors: Trevor Pering, Roy Want, Kenton M. Lyons
  • Publication number: 20100164970
    Abstract: In general, in one aspect, the disclosure describes an apparatus having a frame buffer having a first area allocated to buffer display information for content to be displayed on a local display. A frame buffer allocator is to dynamically expand the frame buffer to include one or more additional areas allocated to buffer display information for content to be displayed on one or more remote displays associated with one or more wireless devices. A dynamic geometry manager is to configure the one or more additional areas of said frame buffer. One or more remote-frame-buffer protocols are to transfer the display information from the one or more additional areas to the one or more wireless devices. Other embodiments are described and claimed.
    Type: Application
    Filed: December 31, 2008
    Publication date: July 1, 2010
    Inventors: Kenton M. Lyons, Roy Want, Trevor A. Pering
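
A sketch of the frame-buffer bookkeeping described in the last abstract, assuming the local area exists first and additional areas are appended as wireless displays attach; the actual transfer over a remote-frame-buffer protocol is elided, and all names and the pixel-offset scheme are hypothetical.

```python
class DynamicFrameBuffer:
    """Track a local display area plus dynamically added areas for remote
    displays, each to be served over a remote-frame-buffer protocol
    (illustrative sketch)."""

    def __init__(self, local_size):
        self.areas = {"local": {"size": local_size, "offset": 0}}
        self._next_offset = local_size[0] * local_size[1]

    def add_remote(self, name, size):
        """Expand the buffer with an area for a newly attached wireless display."""
        self.areas[name] = {"size": size, "offset": self._next_offset}
        self._next_offset += size[0] * size[1]
        return self.areas[name]

    def remove_remote(self, name):
        """Release a remote display's area when the device disconnects."""
        self.areas.pop(name, None)

fb = DynamicFrameBuffer(local_size=(1280, 800))
fb.add_remote("projector", (1920, 1080))
print(sorted(fb.areas))          # ['local', 'projector']
print(fb.areas["projector"])     # size and offset of the projector's area
```
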