Patents by Inventor Christopher Brian Fleizach

Christopher Brian Fleizach has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20110298723
    Abstract: An electronic device with a display and a touch-sensitive surface displays a plurality of user-selectable objects. A respective user-selectable object has a corresponding activation region on the touch-sensitive surface with an activation region size. The activation region size has a respective default size when a representative point for a finger contact is located outside the activation region. The activation region size has a respective expanded size when the representative point is located within the activation region. The device: detects movement of the finger contact across the touch-sensitive surface; in response, changes the size of the activation region for the respective user-selectable object between the respective default size and the respective expanded size in accordance with the movement of the finger contact; detects a user input when the representative point is located within the activation region for the respective user-selectable object; and, in response, performs a predefined operation.
    Type: Application
    Filed: June 7, 2010
    Publication date: December 8, 2011
    Inventors: Christopher Brian Fleizach, Reginald Hudson
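    Code sketch: A minimal sketch, not taken from the patent, of the expanding activation region described in the abstract above; the Swift type names and padding values are assumptions for illustration. The activation region for a user-selectable object has a default size while the representative point is outside it and an expanded size once the point has moved inside it.

        struct Point { var x: Double; var y: Double }

        struct Rect {
            var x: Double, y: Double, width: Double, height: Double

            // A copy of this rectangle grown outward by the given padding on every side.
            func expanded(by padding: Double) -> Rect {
                Rect(x: x - padding, y: y - padding,
                     width: width + 2 * padding, height: height + 2 * padding)
            }

            func contains(_ p: Point) -> Bool {
                p.x >= x && p.x <= x + width && p.y >= y && p.y <= y + height
            }
        }

        // A user-selectable object whose activation region size changes as the
        // finger contact moves across the touch-sensitive surface.
        struct SelectableObject {
            let frame: Rect
            let defaultPadding = 4.0      // assumed default activation margin, in points
            let expandedPadding = 12.0    // assumed expanded activation margin, in points
            private(set) var isExpanded = false

            // Called for each movement of the finger contact; updates the region size
            // and reports whether the representative point is inside the region.
            mutating func trackRepresentativePoint(_ point: Point) -> Bool {
                let padding = isExpanded ? expandedPadding : defaultPadding
                let inside = frame.expanded(by: padding).contains(point)
                isExpanded = inside
                return inside
            }
        }

    In use, trackRepresentativePoint would run on every touch-move event, and a subsequent user input would perform the predefined operation only if the (possibly expanded) region still contains the representative point.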
  • Publication number: 20100309147
Abstract: An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display; in response to detecting a first user interface navigation gesture by a finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with a current navigable unit type; in response to detecting a first user interface navigation setting gesture on the touch-sensitive surface: changing the current navigable unit type from a first navigable unit type to a second navigable unit type; and outputting accessibility information about the second navigable unit type; after changing the current navigable unit type, in response to detecting a second user interface navigation gesture by the finger on the touch-sensitive surface, navigating in the plurality of user interface elements in accordance with the current navigable unit type.
    Type: Application
    Filed: September 23, 2009
    Publication date: December 9, 2010
    Inventors: Christopher Brian Fleizach, Eric Taylor Seymour
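    Code sketch: A minimal sketch, with invented names, of the navigation-setting behavior described in the abstract above: a setting gesture rotates the unit by which subsequent navigation gestures move, and the new unit type is announced.

        enum NavigableUnit: String, CaseIterable {
            case element, character, word, line
        }

        final class AccessibleNavigator {
            private(set) var currentUnit: NavigableUnit = .element
            var announce: (String) -> Void = { print($0) }   // stand-in for speech output

            // Navigation setting gesture: change the current navigable unit type to the
            // next type and output accessibility information about it.
            func handleNavigationSettingGesture() {
                let all = NavigableUnit.allCases
                let index = all.firstIndex(of: currentUnit)!
                currentUnit = all[(index + 1) % all.count]
                announce("Navigating by \(currentUnit.rawValue)")
            }

            // Navigation gesture: move through the displayed user interface elements in
            // accordance with whatever the current navigable unit type is.
            func handleNavigationGesture(forward: Bool) {
                announce("Move to \(forward ? "next" : "previous") \(currentUnit.rawValue)")
                // A real implementation would move focus among the elements here.
            }
        }

    The same navigation gesture therefore moves by element, character, word, or line depending on the most recent setting gesture, which is the behavior the abstract attributes to the current navigable unit type.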
  • Publication number: 20100309148
    Abstract: An accessibility method is performed by an electronic device with a display and a touch-sensitive surface. The method includes: mapping at least a first portion of the display to the touch-sensitive surface; concurrently displaying a plurality of user interface containers on the display; detecting a user interface container selection event that selects a first user interface container in the plurality of user interface containers; and, in response to detecting the user interface container selection event: ceasing to map the first portion of the display to the touch-sensitive surface, and proportionally mapping the first user interface container to be substantially coextensive with the touch-sensitive surface.
    Type: Application
    Filed: September 23, 2009
    Publication date: December 9, 2010
    Inventors: Christopher Brian Fleizach, Eric Taylor Seymour
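    Code sketch: An illustrative sketch, with invented type names, of the proportional mapping described in the abstract above: after a container selection event, the whole touch-sensitive surface maps onto the selected container's bounds rather than onto the first portion of the display.

        struct Point { var x: Double; var y: Double }
        struct Size { var width: Double; var height: Double }
        struct Rect { var origin: Point; var size: Size }

        struct ContainerMapper {
            let touchSurface: Size     // dimensions of the touch-sensitive surface
            var mappedRegion: Rect     // display region the surface currently maps to

            // Container selection event: cease mapping the previous region and map the
            // selected container to be substantially coextensive with the surface.
            mutating func select(container: Rect) {
                mappedRegion = container
            }

            // Convert a touch location on the surface into display coordinates within
            // the currently mapped region, preserving relative position.
            func displayPoint(forTouch touch: Point) -> Point {
                Point(x: mappedRegion.origin.x + (touch.x / touchSurface.width) * mappedRegion.size.width,
                      y: mappedRegion.origin.y + (touch.y / touchSurface.height) * mappedRegion.size.height)
            }
        }

    For example, with a 100 x 100 surface and a selected container at (200, 300) sized 50 x 80, a touch at (50, 50) maps to the container's center at (225, 340).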
  • Publication number: 20100313125
    Abstract: A method is performed by an accessible electronic device with a display and a touch-sensitive surface. The method includes: displaying a plurality of user interface elements on the display, wherein a current focus is on a first user interface element; detecting a first finger gesture on the touch-sensitive surface, wherein the first finger gesture is independent of contacting a location on the touch-sensitive surface that corresponds to a second user interface element; and, in response to detecting the first finger gesture: changing the current focus from the first user interface element in the plurality of user interface elements to the second user interface element in the plurality of user interface elements; and outputting accessibility information associated with the second user interface element.
    Type: Application
    Filed: September 23, 2009
    Publication date: December 9, 2010
    Inventors: Christopher Brian Fleizach, Eric Taylor Seymour, Reginald Dean Hudson
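    Code sketch: A minimal, hypothetical sketch of the focus model described in the abstract above: a location-independent finger gesture (such as a flick) moves the current accessibility focus to another element without touching that element, and information about the newly focused element is output.

        struct UIElementInfo {
            let label: String
            let role: String
        }

        final class AccessibilityFocusManager {
            private let elements: [UIElementInfo]
            private(set) var focusIndex = 0
            var output: (String) -> Void = { print($0) }   // stand-in for audible output

            init(elements: [UIElementInfo]) { self.elements = elements }

            // Handle a gesture that is independent of the touched location: change the
            // current focus and output accessibility information for the new element.
            func handleFlick(forward: Bool) {
                let offset = forward ? 1 : -1
                focusIndex = (focusIndex + offset + elements.count) % elements.count
                let element = elements[focusIndex]
                output("\(element.label), \(element.role)")
            }
        }

        // Example: a forward flick moves focus from "Play" to "Next Track" and
        // outputs "Next Track, button" even though neither element was touched.
        let manager = AccessibilityFocusManager(elements: [
            UIElementInfo(label: "Play", role: "button"),
            UIElementInfo(label: "Next Track", role: "button"),
        ])
        manager.handleFlick(forward: true)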
  • Publication number: 20090254345
Abstract: Techniques for improved text-to-speech processing are disclosed. The improved text-to-speech processing can convert text from an electronic document into an audio output that includes speech associated with the text as well as audio contextual cues. One aspect provides audio contextual cues to the listener when outputting speech (spoken text) pertaining to a document. The audio contextual cues can be based on an analysis of the document prior to text-to-speech conversion. Another aspect can produce an audio summary for a document. The audio summary can thereafter be presented to a user so that the user can hear a summary of the document without having to convert the full document to spoken text.
    Type: Application
    Filed: April 5, 2008
    Publication date: October 8, 2009
    Inventors: Christopher Brian Fleizach, Reginald Dean Hudson
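    Code sketch: A hedged sketch, with invented types and cue names, of the ideas in the abstract above: the document is analyzed before text-to-speech conversion so that audio contextual cues can be interleaved with the spoken text, and a shorter audio summary can be produced instead of speaking the whole document.

        enum DocumentPart {
            case heading(String)
            case paragraph(String)
            case listItem(String)
        }

        struct SpeechSegment {
            let cue: String?    // contextual cue spoken or played before the text, if any
            let text: String
        }

        // Full conversion: every part keeps its text, and structural parts gain a cue
        // so the listener hears, e.g., "Heading" before a heading is spoken.
        func speechSegments(for parts: [DocumentPart]) -> [SpeechSegment] {
            parts.map { part -> SpeechSegment in
                switch part {
                case .heading(let text):   return SpeechSegment(cue: "Heading", text: text)
                case .listItem(let text):  return SpeechSegment(cue: "Bullet", text: text)
                case .paragraph(let text): return SpeechSegment(cue: nil, text: text)
                }
            }
        }

        // A rough stand-in for the audio summary: speak only the headings, so a
        // listener hears the document's outline without a full conversion.
        func audioSummary(for parts: [DocumentPart]) -> [SpeechSegment] {
            parts.compactMap { part -> SpeechSegment? in
                if case .heading(let text) = part {
                    return SpeechSegment(cue: "Heading", text: text)
                }
                return nil
            }
        }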