Patents by Inventor Kenneth M. Karakotsios

Kenneth M. Karakotsios has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9201585
    Abstract: Various embodiments enable a user to navigate between hierarchical or functional levels of a computing device by providing a hand gesture, such as a multiple-finger pinch or splay. For example, a user viewing an application page on an interface of a computing device can leave the application page and navigate to the home page with a single hand gesture. The same hand gesture could subsequently be used to navigate the user to a higher functional level, such as a network level, a disk utility level, and the like. A multiple-finger pinch or splay can also be utilized as a trigger to reveal an application's chrome, reveal running applications, provide a shortcut for accessing favorite applications or notes, or provide alternative views or organization schemes for applications, documents, and the like.
    Type: Grant
    Filed: September 17, 2012
    Date of Patent: December 1, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, David W. Stafford
  • Patent number: 9196219
    Abstract: The implementations described include automatically detecting an object in a series of images, identifying that object as having a hand shape, obtaining color value samples from the object, and utilizing those color values to generate a custom color spectrum for use in detecting and tracking a user's hand (see the first sketch after this listing). In some implementations, the custom color spectrum may be periodically updated by obtaining additional color value samples of the user's hand and updating the custom color spectrum to include those color value samples.
    Type: Grant
    Filed: July 18, 2012
    Date of Patent: November 24, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Dong Zhou, Kenneth M. Karakotsios
  • Patent number: 9152185
    Abstract: A back touch sensor positioned on a back surface of a device accepts user input in the form of touches. The touches on the back touch sensor map to keys on a virtual keyboard, a pointer input, and so forth. Touches on a touch sensor positioned on a front surface provide additional input while also allowing the user to grasp and hold the device.
    Type: Grant
    Filed: March 4, 2014
    Date of Patent: October 6, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Bradley J. Bozarth, Hannah Rebecca Lewbel
  • Patent number: 9123272
    Abstract: An electronic device can utilize one or more sensors and/or imaging elements to determine the position of at least one light source relative to the device. In various embodiments, occlusions can be used to cause shadows to be cast on certain sensors. By determining the position of each occlusion relative to the sensor, the device can determine an approximate direction of the light source (see the sketch after this listing). Utilizing the relative position of a light source, the electronic device can properly light or shade a graphical object to be rendered by the device or otherwise process image information captured by the device.
    Type: Grant
    Filed: May 13, 2011
    Date of Patent: September 1, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Leo B. Baldwin, Kenneth M. Karakotsios, Volodymyr V. Ivanchenko, Isaac S. Noble, Gregory M. Hart, Jeffrey P. Bezos
  • Publication number: 20150149961
    Abstract: Various interfaces and control schemes are provided that enable a user to enter text or select specific elements of an interface based at least in part upon relative motion. In some embodiments, a user can enter text by tilting a device to locate a specific character and pressing a button or specified area of the device to select that character. In other embodiments, a user can select a character by looking at it and performing a selection action, with the user device able to determine the character by tracking the user's gaze direction. Any relative movement of a device and/or a user can be used as input, allowing for one-handed or even hands-free entry of text or other selection of elements of a graphical user interface.
    Type: Application
    Filed: January 30, 2015
    Publication date: May 28, 2015
    Inventor: Kenneth M. Karakotsios
  • Patent number: 9041734
    Abstract: Image information displayed on an electronic device can be modified based at least in part upon the relative position of a user with respect to the device. Mapping, topological, or other types of positional data can be used to render image content from a perspective that is consistent with the viewing angle for the current relative position of the user. As that viewing angle changes, as a result of movement of the user and/or the device, the content can be re-rendered or otherwise updated to display the image content from a perspective that reflects the change in viewing angle (see the parallax sketch after this listing). Simulations of effects such as parallax and occlusion can be used with the change in perspective to provide a consistent user experience that conveys a sense of three-dimensional content even when that content is rendered on a two-dimensional display. Lighting, shading, and/or other effects can be used to enhance the experience.
    Type: Grant
    Filed: August 12, 2011
    Date of Patent: May 26, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Howard D. Look, Leo B. Baldwin, Kenneth M. Karakotsios, Dennis Hodge, Isaac S. Noble, Volodymyr V. Ivanchenko, Jeffrey P. Bezos
  • Patent number: 8988556
    Abstract: A user attempting to obtain information about an object can capture image information including a view of that object, and the image information can be used with a matching or identification process to provide information about that type of object to the user. Information about the orientation of the camera and/or device used to capture the image can be provided in order to limit an initial search space for the matching or identification process. In some embodiments, images can be selected for matching based at least in part upon having a view matching the orientation of the camera or device. In other embodiments, images of objects corresponding to the orientation can be selected. Such a process can increase the average speed and efficiency in locating matching images. If a match cannot be found in the initial space, images of other views and categories can be analyzed as well.
    Type: Grant
    Filed: June 15, 2012
    Date of Patent: March 24, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Volodymyr V. Ivanchenko, Isaac S. Noble, Dong Zhou
  • Publication number: 20150078623
    Abstract: The computational resources needed to perform processes such as image recognition can be reduced by determining appropriate frames of image information to use for the processing. In some embodiments, infrared imaging can be used to determine when a person is looking substantially towards a device, such that an image frame captured at that time will likely be adequate for facial recognition. In other embodiments, sound triangulation or motion sensing can be used to assist in determining which captured image frames to discard and which to select for processing based on any of a number of factors indicative of a proper frame for processing.
    Type: Application
    Filed: November 24, 2014
    Publication date: March 19, 2015
    Inventors: Kenneth M. Karakotsios, Kah Kuen Fu, Volodymyr V. Ivanchenko, Mingjing Huang
  • Publication number: 20150062121
    Abstract: Instances of content, such as search results or browse items, can be displayed using a plurality of three-dimensional elements, with selected pieces of information for each instance placed upon faces, sides, or other portions of those elements. A user can view similar information for each of the instances of content by rotating the elements, such as by interacting with an input element or rotating a portable computing device rendering the elements. The user can apply various filtering criteria or value ranges, whereby the relative position of the elements in three-dimensional space can be adjusted based at least in part upon the applied values. By rotating the elements, applying criteria, and changing the camera view of the elements, a user can quickly compare a large number of instances of content according to a number of different criteria, and can quickly locate items of interest from a large selection of items.
    Type: Application
    Filed: September 8, 2014
    Publication date: March 5, 2015
    Inventors: Kenneth M. Karakotsios, Bradley J. Bozarth
  • Publication number: 20150062006
    Abstract: A user can emulate touch screen events with motions and gestures that the user performs at a distance from a computing device. A user can utilize specific gestures, such as a pinch gesture, to designate portions of motion that are to be interpreted as input, to differentiate from other portions of the motion. A user can then perform actions such as text input by performing motions with the pinch gesture that correspond to words or other selections recognized by a text input program. A camera-based detection approach can be used to recognize the location of features performing the motions and gestures, such as a hand, finger, and/or thumb of the user.
    Type: Application
    Filed: November 5, 2014
    Publication date: March 5, 2015
    Inventors: Kenneth M. Karakotsios, Dong Zhou
  • Patent number: 8947355
    Abstract: Various interfaces and control schemes are provided that enable a user to enter text or select specific elements of an interface based at least in part upon relative motion. In some embodiments, a user can enter text by tilting a device to locate a specific character and pressing a button or specified area of the device to select that character (see the tilt-selection sketch after this listing). In other embodiments, a user can select a character by looking at it and performing a selection action, with the user device able to determine the character by tracking the user's gaze direction. Any relative movement of a device and/or a user can be used as input, allowing for one-handed or even hands-free entry of text or other selection of elements of a graphical user interface.
    Type: Grant
    Filed: March 25, 2010
    Date of Patent: February 3, 2015
    Assignee: Amazon Technologies, Inc.
    Inventor: Kenneth M. Karakotsios
  • Patent number: 8942434
    Abstract: The pupil locations of a user with respect to a computing device can be determined by capturing one or more images of the user and analyzing those images using a set of pupil detection algorithms. Each algorithm can produce at least one estimated position with an associated confidence value, and this information from each algorithm can be used to determine a probable location of each pupil (see the fusion sketch after this listing). In some embodiments, one or more environmental factors can be used to adjust the confidence values or to select algorithms based on how the corresponding algorithms perform under those conditions. Similarly, the degree of independence between the various algorithms can be utilized in some embodiments to adjust the confidence levels or to weight the results.
    Type: Grant
    Filed: December 20, 2011
    Date of Patent: January 27, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Isaac S. Noble, Edwin Joseph Selker
  • Patent number: 8943582
    Abstract: Data on a first computing device can be represented by a graphical object displayed on a screen of the first device. A user can initiate an “attach event” (e.g., a pinching gesture with respect to the object) to enable the object (e.g., the data represented by the object) to be associated and/or virtually attached to him/her. One or more cameras can view/track the user's hand/finger movement(s). Based on the viewed/tracked movement(s), the object representing the data can be moved on a screen of the first device to correspond to the movement of the user's hand/finger. The object can also be moved to a position on a screen of a second computing device when the user moves his/her hand/finger to an area corresponding to the position. A user initiated “release event” (e.g., an unpinching gesture) can end the association and enable the data to be shared with the second device.
    Type: Grant
    Filed: July 18, 2012
    Date of Patent: January 27, 2015
    Assignee: Amazon Technologies, Inc.
    Inventors: Dong Zhou, Kenneth M. Karakotsios
  • Patent number: 8922662
    Abstract: Image capture can be improved by capturing a sequence of images and analyzing the images to select the image with the least blur and/or an acceptable amount of blur. Gradients can be calculated for at least a portion of the images, and gradient histograms generated. Two or more component curves can be fit to each histogram, such as by using a Gaussian mixture model, and the curves can be compared to determine an amount of variation between the curves (see the sketch after this listing). The image with the smallest differences between component curves, or with differences less than a specified blur threshold, can be selected as a sufficiently sharp image and provided for viewing, processing, or another intended purpose.
    Type: Grant
    Filed: July 25, 2012
    Date of Patent: December 30, 2014
    Assignee: Amazon Technologies, Inc.
    Inventors: Kah Kuen Fu, Kenneth M. Karakotsios, Volodymyr V. Ivanchenko
  • Patent number: 8913004
    Abstract: The determination of a gaze direction or field of view of a user with respect to a computing device can control aspects of the device, such as to reduce power or resource consumption. A computing device can include an imaging element and software for locating aspects of a user's facial features relative to the device, such that an orientation of the user's features relative to the device can be determined. Various actions on the device can be executed based at least in part upon a determined gaze direction of the user. In some embodiments, a display element of the device can turn off, and one or more inputs can be disabled, when the user is not looking at the device.
    Type: Grant
    Filed: March 5, 2010
    Date of Patent: December 16, 2014
    Assignee: Amazon Technologies, Inc.
    Inventors: Bradley J. Bozarth, Kenneth M. Karakotsios, Jeffrey P. Bezos
  • Patent number: 8902198
    Abstract: A user can emulate touch screen events with motions and gestures that the user performs at a distance from a computing device. A user can utilize specific gestures, such as a pinch gesture, to designate portions of motion that are to be interpreted as input, to differentiate from other portions of the motion. A user can then perform actions such as text input by performing motions with the pinch gesture that correspond to words or other selections recognized by a text input program. A camera-based detection approach can be used to recognize the location of features performing the motions and gestures, such as a hand, finger, and/or thumb of the user.
    Type: Grant
    Filed: January 27, 2012
    Date of Patent: December 2, 2014
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Dong Zhou
  • Patent number: 8897510
    Abstract: The computational resources needed to perform processes such as image recognition can be reduced by determining appropriate frames of image information to use for the processing. In some embodiments, infrared imaging can be used to determine when a person is looking substantially towards a device, such that an image frame captured at that time will likely be adequate for facial recognition. In other embodiments, sound triangulation or motion sensing can be used to assist in determining which captured image frames to discard and which to select for processing based on any of a number of factors indicative of a proper frame for processing.
    Type: Grant
    Filed: January 9, 2014
    Date of Patent: November 25, 2014
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Kah Kuen Fu, Volodymyr V. Ivanchenko, Mingjing Huang
  • Patent number: 8861310
    Abstract: The relative positions of two or more electronic devices can be determined utilizing ultrasonic beacons. Each device can have a unique signature that can be included in the beacon broadcast by that device. A device having an array of ultrasonic detectors can receive a beacon and correlate the beacon received at each detector. The time of arrival can then be used to determine the relative position of the source of the beacon (see the time-difference sketch after this listing). The signature in that beacon can also be used to identify the device that broadcast it, or a user of that device, at the determined relative position. The devices can be configured to transmit signals over the air or through a specific transmission medium, such as a propagating surface. Further, a dedicated detector array can be used for determining multiple relative positions.
    Type: Grant
    Filed: March 31, 2011
    Date of Patent: October 14, 2014
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Andrew D. Price
  • Publication number: 20140253440
    Abstract: A back touch sensor positioned on a back surface of a device accepts user input in the form of touches. The touches on the back touch sensor map to keys on a virtual keyboard, a pointer input, and so forth. Touches on a touch sensor positioned on a front surface provide additional input while also allowing the user to grasp and hold the device.
    Type: Application
    Filed: March 4, 2014
    Publication date: September 11, 2014
    Applicant: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Bradley J. Bozarth, Hannah Rebecca Lewbel
  • Patent number: 8830225
    Abstract: Instances of content, such as search results or browse items, can be displayed using a plurality of three-dimensional elements, with selected pieces of information for each instance placed upon faces, sides, or other portions of those elements. A user can view similar information for each of the instances of content by rotating the elements, such as by interacting with an input element or rotating a portable computing device rendering the elements. The user can apply various filtering criteria or value ranges, whereby the relative position of the elements in three-dimensional space can be adjusted based at least in part upon the applied values. By rotating the elements, applying criteria, and changing the camera view of the elements, a user can quickly compare a large number of instances of content according to a number of different criteria, and can quickly locate items of interest from a large selection of items.
    Type: Grant
    Filed: March 25, 2010
    Date of Patent: September 9, 2014
    Assignee: Amazon Technologies, Inc.
    Inventors: Kenneth M. Karakotsios, Bradley J. Bozarth
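
Code sketches for several of the techniques described above follow. For patent 9196219, the sketch below shows one way a custom color spectrum could be built from sampled hand pixels and then used to score new pixels. It is a minimal illustration only: the HSV histogram representation, the bin count, the blending factor, and the function names are assumptions, not details taken from the patent.

```python
import numpy as np

def build_color_model(hand_pixels_hsv, bins=32):
    """Build a normalized hue/saturation histogram from an (N, 3) array of HSV
    values (in [0, 1]) sampled from a region already identified as hand-shaped."""
    hist, _, _ = np.histogram2d(
        hand_pixels_hsv[:, 0], hand_pixels_hsv[:, 1],
        bins=bins, range=[[0, 1], [0, 1]])
    return hist / max(hist.sum(), 1.0)

def hand_likelihood(pixel_hsv, model, bins=32):
    """Score a single pixel against the custom color model (higher = more hand-like)."""
    h_bin = min(int(pixel_hsv[0] * bins), bins - 1)
    s_bin = min(int(pixel_hsv[1] * bins), bins - 1)
    return model[h_bin, s_bin]

def update_color_model(model, new_samples_hsv, alpha=0.1, bins=32):
    """Periodically blend newly sampled hand pixels into the existing model."""
    return (1.0 - alpha) * model + alpha * build_color_model(new_samples_hsv, bins)
```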
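
For patent 9123272, a sketch of the underlying geometry: if an occlusion at a known position casts a shadow at a measured spot on a sensor, the ray from the shadow toward the occlusion points back toward the light source, and averaging such rays over several sensors gives an approximate direction. The coordinate conventions and the averaging step are illustrative assumptions.

```python
import numpy as np

def light_direction(occlusion_positions, shadow_positions):
    """Estimate a unit vector pointing toward the light source.

    occlusion_positions: (N, 3) positions of the occlusions above the sensors.
    shadow_positions:    (N, 3) measured shadow centers on the sensor plane.
    Each shadow-to-occlusion ray points toward the light; averaging the
    normalized rays over the sensors gives an approximate direction.
    """
    rays = (np.asarray(occlusion_positions, dtype=float)
            - np.asarray(shadow_positions, dtype=float))
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)
    mean_ray = rays.mean(axis=0)
    return mean_ray / np.linalg.norm(mean_ray)
```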
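
For patent 9041734, a sketch of one plausible way a parallax offset could be derived from the viewer's position relative to the display, with deeper content layers shifting by a larger fraction of the head offset. The simple pinhole-style model and the parameter names are assumptions rather than the patent's stated method.

```python
def parallax_offset(head_offset_xy, head_distance, layer_depth):
    """Screen-space shift to apply to a content layer at a simulated depth.

    head_offset_xy: (x, y) position of the viewer's head relative to the screen center.
    head_distance:  viewer-to-screen distance, in the same units as layer_depth.
    layer_depth:    simulated depth of the layer behind the screen plane.
    A layer in the screen plane (depth 0) does not move; deeper layers shift by a
    larger fraction of the head offset, producing the look-around, pseudo-3D effect.
    """
    factor = layer_depth / (head_distance + layer_depth)
    return head_offset_xy[0] * factor, head_offset_xy[1] * factor
```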
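
For patent 8947355 (and the related publication 20150149961), a sketch of mapping device tilt to a highlighted character that the user can then confirm with a button press. The character grid, the tilt range, and the clamping behavior are hypothetical choices made for illustration.

```python
# Hypothetical three-row character grid; the abstract does not specify a layout.
CHAR_GRID = ["abcdefghij", "klmnopqrst", "uvwxyz.,?!"]

def character_under_tilt(roll_deg, pitch_deg, max_tilt_deg=30.0):
    """Map device roll/pitch (in degrees) to the currently highlighted character.

    Roll selects the column and pitch selects the row; the user then confirms
    the highlighted character with a button press or other selection action.
    """
    def clamp01(value):
        return min(max(value, 0.0), 1.0)

    col_frac = clamp01((roll_deg + max_tilt_deg) / (2.0 * max_tilt_deg))
    row_frac = clamp01((pitch_deg + max_tilt_deg) / (2.0 * max_tilt_deg))
    row = min(int(row_frac * len(CHAR_GRID)), len(CHAR_GRID) - 1)
    col = min(int(col_frac * len(CHAR_GRID[row])), len(CHAR_GRID[row]) - 1)
    return CHAR_GRID[row][col]
```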
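
For patent 8942434, a sketch of fusing pupil-position estimates from several detection algorithms using their confidence values as weights, optionally scaled by how well each algorithm performs under the current conditions. The weighted-mean fusion and the parameter names are assumptions about one plausible realization.

```python
import numpy as np

def fuse_pupil_estimates(estimates, condition_weights=None):
    """Combine (x, y, confidence) estimates from multiple pupil detectors.

    estimates:         list of (x, y, confidence) tuples, one per algorithm.
    condition_weights: optional per-algorithm multipliers reflecting how well
                       each algorithm performs under the current conditions
                       (e.g. low light); defaults to 1.0 for every algorithm.
    Returns the confidence-weighted mean position and the total weight.
    """
    estimates = np.asarray(estimates, dtype=float)
    weights = estimates[:, 2]
    if condition_weights is not None:
        weights = weights * np.asarray(condition_weights, dtype=float)
    total = weights.sum()
    if total <= 0:
        return None, 0.0
    position = (estimates[:, :2] * weights[:, None]).sum(axis=0) / total
    return position, float(total)
```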
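
For patent 8922662, a sketch that follows the abstract's stated criterion: fit a two-component Gaussian mixture to each frame's gradient magnitudes, compare the fitted components, and select the frame with the smallest difference between them (or any frame whose difference falls below a blur threshold). Using the gap between component means as the comparison metric, and scikit-learn's GaussianMixture as the fitting tool, are illustrative simplifications of "the amount of variation between the curves."

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def component_separation(gray_image):
    """Fit a two-component Gaussian mixture to the frame's gradient magnitudes
    and return the difference between the fitted component means."""
    gy, gx = np.gradient(gray_image.astype(float))
    magnitudes = np.hypot(gx, gy).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(magnitudes)
    means = np.sort(gmm.means_.ravel())
    return means[1] - means[0]

def select_frame(frames, blur_threshold=None):
    """Apply the abstract's criterion: prefer the frame whose fitted components
    differ least, or accept the first frame whose difference is below the
    specified blur threshold."""
    best_frame, best_diff = None, np.inf
    for frame in frames:
        diff = component_separation(frame)
        if blur_threshold is not None and diff < blur_threshold:
            return frame
        if diff < best_diff:
            best_frame, best_diff = frame, diff
    return best_frame
```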
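
For patent 8861310, a sketch of a far-field time-difference-of-arrival estimate: given an array of ultrasonic detectors at known positions and the measured arrival times of a beacon, a least-squares solve over the pairwise path-length differences recovers the direction toward the beacon. The plane-wave assumption, the speed-of-sound constant, and the solver choice are assumptions not stated in the abstract.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air; an assumed constant, not from the abstract

def beacon_direction(detector_positions, arrival_times, c=SPEED_OF_SOUND):
    """Estimate a unit vector from the detector array toward the beacon source.

    detector_positions: (N, 3) known positions of the ultrasonic detectors.
    arrival_times:      (N,) measured arrival times of the beacon, N >= 4.
    Under a far-field (plane-wave) assumption, each detector pair constrains
    the direction u via (p_i - p_0) . u = c * (t_0 - t_i); a least-squares
    solve over all pairs involving the first detector recovers u.
    """
    p = np.asarray(detector_positions, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    a = p[1:] - p[0]                 # baseline vectors relative to detector 0
    b = c * (t[0] - t[1:])           # path-length differences
    u, *_ = np.linalg.lstsq(a, b, rcond=None)
    norm = np.linalg.norm(u)
    return u / norm if norm > 0 else u
```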