Patents by Inventor Kenton M. Lyons

Kenton M. Lyons has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10701114
    Abstract: Techniques for augmented social networking may include receiving an image. After receiving an image, in real time, an identity of a person in the image may be determined. Association information for the person based on the identity and one or more defined parameters may be determined. The defined parameters may represent electronic communication. Location information of the person may be determined. The association information may be presented proximate to the person in an augmented reality view using the location information. Other embodiments are described and claimed.
    Type: Grant
    Filed: February 12, 2018
    Date of Patent: June 30, 2020
    Assignee: Intel Corporation
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons
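    Sketch: a minimal Python outline of the flow this abstract describes (identify the person in the image, look up association information derived from electronic communication, locate the person, overlay the result in the AR view); every function, class, and data value below is assumed for illustration and is not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Association:
    name: str
    shared_contacts: int
    last_message: str

def identify_person(image) -> str:
    """Stand-in for a real-time face recognizer that returns an identity key."""
    return "alice@example.com"

def lookup_association(identity: str, parameters: tuple) -> Association:
    """Stand-in for a query over e-mail/social-graph data ("electronic communication")."""
    return Association(name="Alice", shared_contacts=12, last_message="2020-05-14")

def locate_person(image) -> tuple:
    """Stand-in for estimating where the person appears in the frame."""
    return (240, 120)  # pixel coordinates

def render_overlay(info: Association, position: tuple) -> None:
    """Draw the association info next to the person in the AR view."""
    print(f"overlay at {position}: {info.name}, {info.shared_contacts} shared contacts")

def augment_frame(image, parameters=("email", "social")):
    identity = identify_person(image)
    info = lookup_association(identity, parameters)
    render_overlay(info, locate_person(image))

augment_frame(image=None)
```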
  • Publication number: 20190036987
    Abstract: Techniques for augmented social networking may include receiving an image. After receiving an image, in real time, an identity of a person in the image may be determined. Association information for the person based on the identity and one or more defined parameters may be determined. The defined parameters may represent electronic communication. Location information of the person may be determined. The association information may be presented proximate to the person in an augmented reality view using the location information. Other embodiments are described and claimed.
    Type: Application
    Filed: February 12, 2018
    Publication date: January 31, 2019
    Applicant: Intel Corporation
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons
  • Patent number: 10108316
    Abstract: An embodiment of the invention includes a system that tracks a user's pupillary response to content located on a web page. The system then determines a cognitive load for the user that is based on the measured response. Cognitive load refers to the total amount of mental activity imposed on working memory in any one instant. Further, the system may aggregate the cognitive load data for one user over time, for many different users, and/or for many different users over time. The cognitive load may be determined for different portions of a displayed page, such as a document object model (DOM) included on the page. The cognitive load may be specified for different elements that make up the DOM. Also, cognitive load may be apportioned over several different DOM elements at one moment in time or over a period of time. Other embodiments are described herein.
    Type: Grant
    Filed: December 30, 2011
    Date of Patent: October 23, 2018
    Assignee: Intel Corporation
    Inventor: Kenton M. Lyons
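    Sketch: a small Python illustration of the bookkeeping this abstract describes, attributing pupil-diameter samples to the DOM element under the user's gaze and aggregating a per-element load score; the samples and the use of mean pupil diameter as the load measure are assumptions for the example, not the patent's model.

```python
from collections import defaultdict
from statistics import mean

# (timestamp, gazed DOM element id, pupil diameter in mm) -- fabricated samples
samples = [
    (0.0, "#article", 3.1),
    (0.5, "#article", 3.4),
    (1.0, "#sidebar", 2.9),
    (1.5, "#article", 3.6),
]

def cognitive_load_by_element(samples):
    """Aggregate a simple per-element load score from pupillary response."""
    by_element = defaultdict(list)
    for _, element, diameter in samples:
        by_element[element].append(diameter)
    # Mean pupil diameter stands in for a real cognitive-load model here.
    return {element: mean(values) for element, values in by_element.items()}

print(cognitive_load_by_element(samples))
```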
  • Patent number: 9894116
    Abstract: Techniques for augmented social networking may include receiving an image. After receiving an image, in real time, an identity of a person in the image may be determined. Association information for the person based on the identity and one or more defined parameters may be determined. The defined parameters may represent electronic communication. Location information of the person may be determined. The association information may be presented proximate to the person in an augmented reality view using the location information. Other embodiments are described and claimed.
    Type: Grant
    Filed: April 12, 2012
    Date of Patent: February 13, 2018
    Assignee: Intel Corporation
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons
  • Patent number: 9766700
    Abstract: A gaze activated data unit transfer system is described. An apparatus may comprise a gaze interface application operative on a processor circuit to manage data unit transfer operations based on eye movements of a human user. The gaze interface application may comprise a gaze interpreter component operative to receive eye movement information of a human eye from one or more eye gaze trackers, and interpret the eye movement information as a data unit transfer request to transfer a data unit from a source device to a target device, a data connection component operative to establish a data connection between the source and target devices using the transceiver in response to the data unit transfer request, and a data transfer component operative to send the data unit from the source device to the target device over the data connection. Other embodiments are described and claimed.
    Type: Grant
    Filed: December 14, 2011
    Date of Patent: September 19, 2017
    Assignee: Intel Corporation
    Inventors: Kenton M. Lyons, Trevor Pering
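    Sketch: a minimal Python outline of the gaze-driven pipeline this abstract describes (interpret eye movement as a transfer request, establish a connection, send the data unit); the classes, method names, and gaze gesture below are assumptions for illustration, not the patent's API.

```python
class GazeInterpreter:
    def interpret(self, eye_events):
        """Map a gaze gesture (e.g. dwell on source, then target) to a transfer request."""
        return {"source": "laptop", "target": "tablet", "data_unit": "report.pdf"}

class DataConnection:
    def establish(self, source, target):
        print(f"connecting {source} -> {target}")
        return (source, target)

class DataTransfer:
    def send(self, connection, data_unit):
        source, target = connection
        print(f"sending {data_unit} from {source} to {target}")

def handle_gaze(eye_events):
    request = GazeInterpreter().interpret(eye_events)
    connection = DataConnection().establish(request["source"], request["target"])
    DataTransfer().send(connection, request["data_unit"])

handle_gaze(eye_events=["dwell:laptop", "dwell:tablet"])
```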
  • Patent number: 9696690
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. Each screen presented on or capable of being presented on the display of a segment of the multi-segment wearable accessory may be considered a virtual segment, and the number of virtual segments may be greater than the number of physical segments of the accessory. One or more of the virtual segments may be associated with one or more of the segments in an overlaid configuration, such that a topmost virtual segment is presented for viewing while another virtual segment lies below the topmost virtual segment, hidden from the user's view. A virtual segment may be replaced with presentation of another virtual segment in response to rotation of the accessory.
    Type: Grant
    Filed: August 20, 2012
    Date of Patent: July 4, 2017
    Assignee: Nokia Technologies Oy
    Inventors: David H. Nguyen, Kenton M. Lyons
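    Sketch: a toy Python model of the virtual-segment idea in this abstract, with more virtual screens than physical display segments and rotation of the accessory bringing hidden virtual segments into view; the counts and names are assumptions for the example.

```python
from collections import deque

physical_segments = 4
virtual_segments = deque([f"screen-{i}" for i in range(8)])  # 8 virtual > 4 physical

def visible_segments():
    """The virtual segments currently mapped onto the physical displays."""
    return list(virtual_segments)[:physical_segments]

def rotate_accessory(steps=1):
    """Rotating the bracelet shifts which virtual segments are on top."""
    virtual_segments.rotate(-steps)

print(visible_segments())   # ['screen-0', 'screen-1', 'screen-2', 'screen-3']
rotate_accessory()
print(visible_segments())   # ['screen-1', 'screen-2', 'screen-3', 'screen-4']
```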
  • Patent number: 9662569
    Abstract: Software gaming applications may be written to accept input from one or more standard controllers, such as a joystick with input buttons. However, multiple wearable sensors or handheld MID devices may enhance the gaming experience. These sensors may include devices such as accelerometers that detect movement of the extremity they are attached to and wirelessly communicate this information to a receiver. Embodiments are directed to using sensor fusion to combine sensor data from multiple wireless input devices, such as wearable sensors and MID devices, into one logical input stream that is presented to an application and that the application sees as a standard controller.
    Type: Grant
    Filed: December 31, 2008
    Date of Patent: May 30, 2017
    Assignee: Intel Corporation
    Inventors: Trevor Pering, Roy Want, Kenton M. Lyons, Barbara Rosario
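    Sketch: a small Python illustration of the sensor-fusion idea in this abstract, merging readings from several wearable accelerometers into one logical controller state that a game could consume as an ordinary controller; the limb-to-control mapping and thresholds are made up for the example, not the patent's algorithm.

```python
def fuse_to_controller(sensor_readings):
    """Combine per-limb accelerometer samples into one controller state."""
    state = {"stick_x": 0.0, "stick_y": 0.0, "button_a": False}
    for sensor, (ax, ay, az) in sensor_readings.items():
        if sensor == "wrist":
            state["stick_x"], state["stick_y"] = ax, ay   # wrist tilt steers
        elif sensor == "ankle" and az > 9.0:
            state["button_a"] = True                      # a stomp presses A
    return state

readings = {"wrist": (0.3, -0.1, 9.8), "ankle": (0.0, 0.2, 12.4)}
print(fuse_to_controller(readings))
```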
  • Patent number: 9646522
    Abstract: Information is delivered about a particular external environment using a transparent display. In one embodiment, a method includes determining a position of a mobile transparent display, determining an orientation of the display, retrieving information about the environment of the display using the determined position and orientation, and overlaying the retrieved information over a view on the transparent display.
    Type: Grant
    Filed: June 29, 2012
    Date of Patent: May 9, 2017
    Assignee: Intel Corporation
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons
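    Sketch: a minimal Python version of the flow this abstract describes, using the display's position and orientation to retrieve nearby environment information and overlay it on the see-through view; the point-of-interest data, bearing math, and field of view are assumptions for the example.

```python
import math

points_of_interest = {
    (37.7749, -122.4194): "Ferry Building",
    (37.8024, -122.4058): "Coit Tower",
}

def retrieve_environment_info(position, heading_deg, fov_deg=60.0):
    """Return labels for points of interest roughly within the display's field of view."""
    lat, lon = position
    visible = []
    for (plat, plon), label in points_of_interest.items():
        bearing = math.degrees(math.atan2(plon - lon, plat - lat)) % 360
        if abs((bearing - heading_deg + 180) % 360 - 180) <= fov_deg / 2:
            visible.append(label)
    return visible

def overlay(labels):
    for label in labels:
        print(f"overlaying '{label}' on the transparent display")

overlay(retrieve_environment_info((37.79, -122.41), heading_deg=20.0))
```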
  • Patent number: 9361718
    Abstract: The presentation of a computer display may be modified based on what the user is viewing on the display. In one embodiment, gaze detection technology may be used to determine what the user is looking at. Based on that determination, the display may be altered to either improve or degrade presentation in the region the user is looking at.
    Type: Grant
    Filed: September 8, 2011
    Date of Patent: June 7, 2016
    Assignee: Intel Corporation
    Inventors: Kenton M. Lyons, Joshua J. Ratcliff, Horst W. Haussecker
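    Sketch: a tiny Python example of gaze-contingent display modification as described above, rendering the gazed region at higher (or deliberately lower) quality than the rest of the screen; the radius and quality factors are assumptions for illustration.

```python
def quality_for_pixel(pixel, gaze_point, radius=100, mode="improve"):
    """Return a render-quality factor for a pixel given the current gaze location."""
    px, py = pixel
    gx, gy = gaze_point
    inside = (px - gx) ** 2 + (py - gy) ** 2 <= radius ** 2
    if mode == "improve":
        return 1.0 if inside else 0.5   # full detail only around the gaze
    # "degrade": e.g. blur or dim what is being looked at
    return 0.5 if inside else 1.0

print(quality_for_pixel((120, 80), gaze_point=(100, 90)))   # 1.0, near the gaze
print(quality_for_pixel((600, 400), gaze_point=(100, 90)))  # 0.5, periphery
```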
  • Publication number: 20160034029
    Abstract: A gaze activated data unit transfer system is described. An apparatus may comprise a gaze interface application operative on a processor circuit to manage data unit transfer operations based on eye movements of a human user. The gaze interface application may comprise a gaze interpreter component operative to receive eye movement information of a human eye from one or more eye gaze trackers, and interpret the eye movement information as a data unit transfer request to transfer a data unit from a source device to a target device, a data connection component operative to establish a data connection between the source and target devices using the transceiver in response to the data unit transfer request, and a data transfer component operative to send the data unit from the source device to the target device over the data connection. Other embodiments are described and claimed.
    Type: Application
    Filed: December 14, 2011
    Publication date: February 4, 2016
    Inventors: Kenton M. Lyons, Trevor Pering
  • Patent number: 9122249
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In particular, a method, apparatus, and computer program product are provided that receive touch input via at least first and second segments of a multi-segment wearable accessory and determine that the touch input associated with the second segment is moving relative to the touch input associated with the first segment. A presentation of content displayed at least partially by the first segment may be modified based upon movement of the touch input associated with the second segment, such as in an expand operation, in which content is displayed on a second segment, or a collapse operation, in which content is removed from the second segment following an expand operation.
    Type: Grant
    Filed: August 15, 2012
    Date of Patent: September 1, 2015
    Assignee: Nokia Technologies Oy
    Inventors: Kenton M. Lyons, David H. Nguyen, Daniel L. Ashbrook
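    Sketch: a short Python classifier for the two-segment gesture this abstract describes, where a touch on a second segment moving relative to a touch on the first triggers an expand or collapse of the displayed content; the decision rule is an assumption for the example.

```python
def interpret_gesture(second_touch_moving: bool, second_moves_away: bool) -> str:
    """Classify the two-segment gesture from the relative motion of the touches."""
    if not second_touch_moving:
        return "no-op"           # the second touch must move relative to the first
    return "expand" if second_moves_away else "collapse"

print(interpret_gesture(True, second_moves_away=True))   # expand: content spills onto segment two
print(interpret_gesture(True, second_moves_away=False))  # collapse: content pulled back to segment one
```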
  • Publication number: 20140368538
    Abstract: Techniques for augmented social networking may include receiving an image. After receiving an image, in real time, an identity of a person in the image may be determined. Association information for the person based on the identity and one or more defined parameters may be determined. The defined parameters may represent electronic communication. Location information of the person may be determined. The association information may be presented proximate to the person in an augmented reality view using the location information. Other embodiments are described and claimed.
    Type: Application
    Filed: April 12, 2012
    Publication date: December 18, 2014
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons
  • Patent number: 8872729
    Abstract: A method, apparatus and computer program product are provided to facilitate the use of a multi-segment wearable accessory. In this regard, methods, apparatus and computer program products are provided for controlling and, in some instances, interacting with a multi-segment wearable accessory. In the context of a method, an orientation of each of a plurality of segments of a multi-segment wearable accessory is determined relative to an axis through the multi-segment wearable accessory, such as by determining an angle of each of the plurality of segments relative to the axis through the multi-segment wearable accessory. A relative ordering of the plurality of segments of the multi-segment wearable accessory may then be determined based upon the orientation of each of the plurality of segments relative to the axis.
    Type: Grant
    Filed: September 6, 2012
    Date of Patent: October 28, 2014
    Assignee: Nokia Corporation
    Inventors: Kenton M. Lyons, David H. Nguyen, Sean White
  • Publication number: 20140208226
    Abstract: An embodiment of the invention includes a system that tracks a user's pupillary response to content located on a web page. The system then determines a cognitive load for the user that is based on the measured response. Cognitive load refers to the total amount of mental activity imposed on working memory in any one instant. Further, the system may aggregate the cognitive load data for one user over time, for many different users, and/or for many different users over time. The cognitive load may be determined for different portions of a displayed page, such as a document object model (DOM) included on the page. The cognitive load may be specified for different elements that make up the DOM. Also, cognitive load may be apportioned over several different DOM elements at one moment in time or over a period of time. Other embodiments are described herein.
    Type: Application
    Filed: December 30, 2011
    Publication date: July 24, 2014
    Inventor: Kenton M. Lyons
  • Patent number: 8686921
    Abstract: In general, in one aspect, the disclosure describes an apparatus having a frame buffer having a first area allocated to buffer display information for content to be displayed on a local display. A frame buffer allocator is to dynamically expand the frame buffer to include one or more additional areas allocated to buffer display information for content to be displayed on one or more remote displays associated with one or more wireless devices. A dynamic geometry manager is to configure the one or more additional areas of said frame buffer. One or more remote-frame-buffer protocols are to transfer the display information from the one or more additional areas to the one or more wireless devices. Other embodiments are described and claimed.
    Type: Grant
    Filed: December 31, 2008
    Date of Patent: April 1, 2014
    Assignee: Intel Corporation
    Inventors: Kenton M. Lyons, Roy Want, Trevor A. Pering
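    Sketch: a compact Python model of the frame-buffer allocation this abstract describes, with a local display area plus dynamically added areas for each remote wireless display; the class, layout scheme, and device names are assumptions, and a real system would push each added area out over a remote-frame-buffer protocol.

```python
class FrameBuffer:
    def __init__(self, local_width, local_height):
        self.areas = {"local": (0, 0, local_width, local_height)}
        self._next_x = local_width

    def add_remote_area(self, device_id, width, height):
        """Dynamically expand the buffer with an area for a remote display."""
        self.areas[device_id] = (self._next_x, 0, width, height)
        self._next_x += width
        return self.areas[device_id]

fb = FrameBuffer(1920, 1080)
fb.add_remote_area("phone-1", 480, 800)
fb.add_remote_area("tablet-1", 1024, 600)
print(fb.areas)
```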
  • Publication number: 20140085177
    Abstract: A method, apparatus and computer program product are provided to facilitate user input based upon the relative position of a user's fingers. In the context of a method, sensor information is received that is indicative of the position of a first finger relative to a second finger. The first finger may be a thumb such that the sensor information is indicative of the position of the thumb relative to the second finger. In conjunction with the receipt of sensor information, the method receives sensor information from a sensor that is offset from an interface between the first and second fingers so as not to be positioned between the first and second fingers. The method also determines, with a processor, the relative position of the first and second fingers based upon the sensor information and causes performance of an operation in response to the relative position of the first and second fingers.
    Type: Application
    Filed: September 21, 2012
    Publication date: March 27, 2014
    Applicant: Nokia Corporation
    Inventors: Kenton M. Lyons, Ke-Yu Chen, Sean White, Daniel L. Ashbrook
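    Sketch: a tiny Python mapping from the sensed thumb position along a finger to an operation, in the spirit of the abstract above; the normalization and thresholds are assumptions for the example, not the patent's mapping.

```python
def operation_for_thumb_position(position_along_finger: float) -> str:
    """Map a normalized thumb position (0 = fingertip, 1 = knuckle) to an action."""
    if position_along_finger < 0.33:
        return "select"
    if position_along_finger < 0.66:
        return "scroll"
    return "back"

print(operation_for_thumb_position(0.2))   # select
print(operation_for_thumb_position(0.8))   # back
```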
  • Patent number: 8625860
    Abstract: In one embodiment an electronic device comprises an input/output module, a memory coupled to the input/output module, and logic to store a first image of a face in the memory module, associate an identity with the first image of a face, subsequently collect a second image of a face, determine a correlation between features on the first image of a face and the second image of a face, and store the correlation between the first image and the second image. Other embodiments may be described.
    Type: Grant
    Filed: April 1, 2011
    Date of Patent: January 7, 2014
    Assignee: Intel Corporation
    Inventors: Branislav Kveton, Kenton M. Lyons
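    Sketch: a small Python version of the enroll-then-correlate flow this abstract describes, storing a first face image's features with an identity and later recording how well a second image correlates; the feature vectors and similarity measure are trivial stand-ins assumed for the example.

```python
faces = {}          # identity -> stored feature vector
correlations = []   # (identity, correlation score) history

def enroll(identity, features):
    """Store the first image's features under the given identity."""
    faces[identity] = features

def correlate(features_a, features_b):
    """Toy similarity: fraction of feature values that roughly match."""
    matches = sum(1 for a, b in zip(features_a, features_b) if abs(a - b) < 0.1)
    return matches / len(features_a)

def observe(identity, new_features):
    """Compare a later image against the stored one and record the correlation."""
    score = correlate(faces[identity], new_features)
    correlations.append((identity, score))
    return score

enroll("alice", [0.2, 0.5, 0.9])
print(observe("alice", [0.22, 0.48, 0.95]))  # high correlation
```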
  • Publication number: 20140007148
    Abstract: A system and method for adapting data processing of media having video content based, at least in part on, characteristics of a viewer captured from a sensor during presentation of the media to the viewer. During presentation of video content, a sensor may capture a viewer's eye movement and the focus of the viewer's gaze relative to a display upon which the video content is being displayed, wherein regions of the display in which the viewer's gaze is focused may be indicative of viewer interest in corresponding subject matter and regions of the display in which the viewer's gaze is not focused may be indicative of lack of viewer interest in corresponding subject matter. The system is configured to prioritize processing of the media file based, at least in part, on identified regions of interest and non-interest, wherein regions of interest are processed with higher priority than regions of non-interest.
    Type: Application
    Filed: June 28, 2012
    Publication date: January 2, 2014
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons
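    Sketch: a minimal Python illustration of gaze-prioritized processing as described above, ordering screen regions so that the ones the viewer attended to are processed before the ones they ignored; the tiling and fixation counts are assumptions for the example.

```python
def prioritize_tiles(tiles, gaze_hits):
    """Order tiles so that gaze-attended regions are processed first."""
    return sorted(tiles, key=lambda tile: gaze_hits.get(tile, 0), reverse=True)

tiles = ["top-left", "top-right", "bottom-left", "bottom-right"]
gaze_hits = {"top-right": 14, "bottom-left": 3}   # fixation counts per tile

for tile in prioritize_tiles(tiles, gaze_hits):
    print(f"processing {tile}")
```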
  • Publication number: 20140002629
    Abstract: A system and method for enhancing the peripheral vision of a user is disclosed. In some embodiments, the systems and methods image objects outside the field of view of the user with at least one sensor. The sensor may be coupled to eyewear that is configured to be worn over the eye of a user. Upon detection of said object(s), an indicator may be displayed in a display coupled to or integral with a lens of the eyewear. The indicator may be produced in a region of the display that is detectable by the user's peripheral vision. As a result, the user may be alerted to the presence of objects outside his/her field of view. Because the indicator is configured for detection by the user's peripheral vision, impacts on the user's foveal vision may be limited, minimized, or even eliminated.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons
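    Sketch: a small Python example of the peripheral-alert idea in this abstract, where an object detected outside the wearer's field of view produces an indicator toward the matching edge of the eyewear display; the field-of-view angle and cue placement are assumptions for the example.

```python
def peripheral_indicator(object_bearing_deg, fov_deg=110):
    """Decide where to draw an indicator for an object relative to the field of view."""
    half_fov = fov_deg / 2
    if abs(object_bearing_deg) <= half_fov:
        return None                      # already visible, no indicator needed
    side = "right edge" if object_bearing_deg > 0 else "left edge"
    return f"flash indicator at {side} of display"

print(peripheral_indicator(150))    # object behind-right -> cue at right edge
print(peripheral_indicator(30))     # within the field of view -> None
```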
  • Publication number: 20140002486
    Abstract: Information is delivered about a particular external environment using a transparent display. In one embodiment, a method includes determining a position of a mobile transparent display, determining an orientation of the display, retrieving information about the environment of the display using the determined position and orientation, and overlaying the retrieved information over a view on the transparent display.
    Type: Application
    Filed: June 29, 2012
    Publication date: January 2, 2014
    Inventors: Joshua J. Ratcliff, Kenton M. Lyons