Spelling, Phonics, Word Recognition, Or Sentence Formation Patents (Class 434/167)
  • Patent number: 9747272
    Abstract: A computing device is described that outputs, for display at a presence-sensitive screen, a graphical keyboard having keys. The computing device receives an indication of a selection of one or more of the keys. Based on the selection, the computing device determines a character string from which it determines one or more candidate words. Based at least in part on the candidate words and a plurality of features, the computing device determines a spelling probability that the character string represents an incorrect spelling of at least one candidate word. The plurality of features includes a spatial model probability associated with at least one of the candidate words. If the spelling probability satisfies a threshold, the computing device outputs the at least one candidate word for display.
    Type: Grant
    Filed: March 6, 2014
    Date of Patent: August 29, 2017
    Assignee: Google Inc.
    Inventors: Yu Ouyang, Shumin Zhai
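The decision flow in this abstract — blend candidate-word evidence with a spatial model probability, then compare against a threshold — can be sketched as follows. This is a minimal illustration only; the blending formula, weights, and threshold below are hypothetical assumptions, not taken from the patent.

```python
# Hypothetical sketch: decide whether a typed string should be
# auto-corrected by blending a language-model probability with a
# spatial-model probability and testing the result against a threshold.

def spelling_probability(lm_prob, spatial_prob, lm_weight=0.6):
    """Blend language-model and spatial-model evidence into one score."""
    return lm_weight * lm_prob + (1.0 - lm_weight) * spatial_prob

def suggest_correction(typed, candidates, threshold=0.5):
    """candidates maps word -> (lm_prob, spatial_prob). Return the best
    candidate if its blended score satisfies the threshold, else None."""
    best_word, best_p = None, 0.0
    for word, (lm_p, spatial_p) in candidates.items():
        p = spelling_probability(lm_p, spatial_p)
        if p > best_p:
            best_word, best_p = word, p
    return best_word if best_p >= threshold else None

# "thw" with nearby-key (spatial) evidence strongly favoring "the":
cands = {"the": (0.7, 0.9), "thaw": (0.2, 0.3)}
print(suggest_correction("thw", cands))  # the  (blended score 0.78)
```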
  • Patent number: 9679496
    Abstract: Reverse Language Resonance methods are described for instructing a target language to a learner who speaks a native language. The methods may include providing to the learner a predetermined lesson comprising a lesson text of a plurality of lesson words that are exclusively in the target language. The methods may further include priming implicit memory of the learner. The methods may further include displaying the lesson text on a display and playing a recorded version of spoken words of the lesson text on an audio output while the lesson text is displayed. The methods may further include instructing the learner to perform Concurrent Triple Activity including simultaneously reading the lesson text on the display, listening to the spoken words from the audio output, and repeating the spoken words along with the recorded version into an audio input while the recorded version is playing.
    Type: Grant
    Filed: November 30, 2012
    Date of Patent: June 13, 2017
    Inventor: Arkady Zilberman
  • Patent number: 9671946
    Abstract: Content is displayed on a touchscreen display of a computing system such as an electronic book reader. The content is displayed according to a setting for a first attribute (e.g., level of brightness) and a setting for a second attribute (e.g., day mode or night mode). In response to sensing a motion proximate to the touchscreen, the setting for the first attribute is changed to a different value. In response to a value for the setting for the first attribute crossing a threshold value (e.g., while the motion is being performed), the setting for the second attribute is changed.
    Type: Grant
    Filed: February 6, 2014
    Date of Patent: June 6, 2017
    Assignee: RAKUTEN KOBO, INC.
    Inventors: Sneha Patel, Anthony O'Donoghue
  • Patent number: 9582913
    Abstract: Various embodiments enable a computing device to perform tasks such as highlighting words in an augmented reality view that are important to a user. For example, word lists can be generated and the user, by pointing a camera of a computing device at a volume of text, can cause words from the word list within the volume of text to be highlighted in a live field of view of the camera displayed thereon. Accordingly, users can quickly identify textual information that is meaningful to them in an augmented reality view, to aid them in sifting through real-world text.
    Type: Grant
    Filed: September 25, 2013
    Date of Patent: February 28, 2017
    Assignee: A9.com, Inc.
    Inventors: Adam Wiggen Kraft, Arnab Sanat Kumar Dhua, Douglas Ryan Gray, Xiaofan Lin, Yu Lou, Sunil Ramesh, Colin Jon Taylor, David Creighton Mott
  • Patent number: 9472113
    Abstract: A computing device may provide a visual cue to items of content (for example, words in a book) synchronized with the playback of companion content (for example, audio content corresponding to the book). For example, embodiments of the present disclosure are directed to a content playback synchronization system for use with physical books (or other physical media). In an embodiment, the computing device may display a visual cue (for example, an underline, box, dot, cursor, or the like) to identify a current location in textual content of the physical book corresponding to a current output position of companion audio content. As the audio content is presented (i.e., as it “plays back”), the highlight and/or visual cue may be advanced to maintain synchronization between the output position within the audio content and a corresponding position in the physical textual content.
    Type: Grant
    Filed: February 5, 2013
    Date of Patent: October 18, 2016
    Assignee: AUDIBLE, INC.
    Inventors: Douglas Cho Hwang, Guy Ashley Story, Jr.
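The synchronization step described here — advancing a visual cue so it tracks the audio playback position — can be sketched with a per-word timing map. The data layout and function name below are illustrative assumptions, not the patent's actual mechanism.

```python
# Minimal sketch of keeping a visual cue synchronized with audio
# playback: a timing map gives, for each word index, the audio time at
# which that word begins; the cue advances to the last word whose start
# time has been reached.

import bisect

def cue_position(word_start_times, playback_s):
    """Return the index of the word the cue should sit on at the given
    playback time (seconds)."""
    # bisect_right counts how many start times are <= playback_s.
    return max(0, bisect.bisect_right(word_start_times, playback_s) - 1)

starts = [0.0, 0.4, 0.9, 1.5]  # word i begins at starts[i] seconds
print(cue_position(starts, 1.0))  # 2: the cue sits on the third word
```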
  • Patent number: 9116880
    Abstract: A processing system is described which generates stimulus information (SI) having one or more stimulus components (SCs) selected from an inventory of such components. The processing system then presents the SI to a group of human recipients, inviting those recipients to provide linguistic descriptions of the SI. The linguistic information that is received thereby has an implicit link to the SCs. Further, each linguistic component is associated with at least one feature of a target environment, such as a target computer system. Hence, the linguistic information also maps to the features of the target environment. These relationships allow applications to use the linguistic information to interact with the target environment in different ways. In one case, the processing system uses a challenge-response authentication task presentation to convey the stimulus information to the recipients.
    Type: Grant
    Filed: November 30, 2012
    Date of Patent: August 25, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: William B. Dolan, Christopher I. Charla, Christopher B. Quirk, Christopher J. Brockett, Noelle M. Sophy, Nicole Beaudry, Vikram Reddy Dendi, Pallavi Choudhury, Scott T. Laufer, Robert A. Sim, Thomas E. Woolsey, David Molnar
  • Patent number: 9058751
    Abstract: Disclosed herein, in certain embodiments, are computer-based language phoneme practice systems, products, programs, and methods comprising a digital processing device and a computer program that creates a language phoneme practice engine, wherein said engine comprises: a taxonomy of phonemes of a target language; a software module for providing an interface for practicing each said phoneme in said taxonomy, wherein said interface allows a learner to optionally access a visual representation and an auditory representation of each said phoneme in said taxonomy; and a software module for providing an interface for practicing each said phoneme in the context of the beginning, middle, and end of words of said target language, wherein said interface allows a learner to optionally access a visual and an auditory representation of each said word and each said phoneme in each said word.
    Type: Grant
    Filed: November 21, 2011
    Date of Patent: June 16, 2015
    Assignee: AGE OF LEARNING, INC.
    Inventors: Doug Dohring, David Hendry, Stephanie Yost, Jerry Chiawei Chen
  • Patent number: 9043195
    Abstract: A system to teach phonemic awareness uses a plurality of phonemes and a plurality of graphemes. Each phoneme is a unique sound and an indivisible unit of sound in a spoken language, and each grapheme is a written representation of one of the plurality of phonemes. A plurality of distinct graphical images and a plurality of unique names are provided where each unique name is associated with one of the graphical images and represents a grouping of graphemes selected from the plurality of graphemes. The system uses a plurality of sets of display pieces having a plurality of individual display pieces. Each individual display piece includes at least a portion of one of the graphical images and the graphemes from the grouping of graphemes constituting the associated unique name. A predefined instructional environment defines a predefined spatial context and predefined rules governing the acquisition and utilization of individual display pieces.
    Type: Grant
    Filed: September 26, 2011
    Date of Patent: May 26, 2015
    Inventor: Jaclyn Paris
  • Publication number: 20150140521
    Abstract: An apparatus comprises a bag used by individuals with special needs (children and adults) and by typically developing children. The bag serves as a shopping list and a schedule for community outings, easing the transition into these activities for anyone who may need assistance. The inside of the bag contains storage for cards holding many product pieces. The front of the bag holds a small shopping list for those with decreased cognitive abilities. The inside of the bag may have additional hook sew-on fastener tape to provide a longer shopping list for those with higher cognitive abilities. The back of the bag has a large pocket that holds product pieces once the user finds a product on the shopping list. A small pocket with a flap holds money, gift cards, credit cards, etc.
    Type: Application
    Filed: November 19, 2014
    Publication date: May 21, 2015
    Inventor: Tarryl Susan Zdanky
  • Patent number: 9028257
    Abstract: A method and system are presented to address quantitative assessment of word recognition sensitivity of a subject, where the method comprises the steps of: (1) presenting at least one scene, comprising a plurality of letters and a background, to a subject on a display; (2) moving the plurality of letters relative to the scene; (3) receiving feedback from the subject via at least one input device; (4) quantitatively refining the received feedback; (5) modulating the saliency of the plurality of letters relative to the accuracy of the quantitatively refined feedback; (6) calculating a critical threshold parameter; and (7) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: May 12, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
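The adaptive loop these steps describe (modulate saliency relative to response accuracy, then derive a critical threshold) resembles a psychophysical staircase. The sketch below assumes a simple 1-up/1-down rule with made-up step sizes and threshold estimation; it is an illustration of that general technique, not the patented procedure.

```python
# Hedged sketch of an adaptive staircase: saliency is lowered after a
# correct response and raised after an error, and the critical threshold
# is estimated as the mean saliency over the later trials, once the
# staircase has begun reversing around the subject's limit.

def run_staircase(responses, start=1.0, step=0.1):
    """responses: per-trial correctness (list of bool). Returns the
    estimated threshold: mean saliency over the final half of trials."""
    saliency, history = start, []
    for correct in responses:
        history.append(saliency)
        saliency += -step if correct else step
        saliency = min(1.0, max(0.0, saliency))  # clamp to [0, 1]
    tail = history[len(history) // 2:]
    return sum(tail) / len(tail)

# A subject who responds correctly until saliency drops near 0.65:
resp = [True, True, True, True, False, True, False, True]
print(round(run_staircase(resp), 3))  # 0.65
```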
  • Patent number: 9028256
    Abstract: A method and system are presented to address quantitative assessment of visual motor response in a subject, where the method comprises the steps of: (1) presenting at least one scene to a subject on a display; (2) modulating the contrast of a predetermined section of the scene; (3) moving the predetermined section relative to the scene with the movement being tracked by the subject via at least one input device; (4) measuring a kinematic parameter of the tracked movement; (5) quantitatively refining the tracked movement; (6) determining the relationship between at least one of the scene and the quantitatively refined tracked movement; (7) adjusting the modulated contrast relative to the quantitatively refined tracked movement; (8) calculating a critical threshold parameter for the subject; and (9) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: May 12, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Publication number: 20150111182
    Abstract: These magnetic, write-on/wipe-off letters, numbers, and geometric shapes enable a child to learn to write and spell quickly. They are individual pieces packaged as a set, each with a grooved write-on/wipe-off surface and a magnetic base so that it adheres to any magnetic surface and can be manipulated quickly and easily, accelerating the process of learning to write and spell.
    Type: Application
    Filed: October 20, 2013
    Publication date: April 23, 2015
    Inventor: Sonya Janine Nelson
  • Patent number: 8997004
    Abstract: Techniques for real-time observation assessment are provided. The techniques, which are designed for educators, take advantage of handheld computers, desktop/laptop computers and Internet access in order to reduce the paperwork associated with conventional educational assessments. An array of instructional assessment applications are designed to run on handheld computers. The instructional assessment applications may be based on existing and widely used paper methodologies. A common Web-based platform for assessment application distribution, selection, download, data management and reporting is also provided. Users can then periodically synchronize instructional data (assessments, diagnostic results, notes and/or schedules) to the Web site. At the Web site, browser-based reports and analysis can be viewed, administered and shared via electronic mail.
    Type: Grant
    Filed: January 16, 2014
    Date of Patent: March 31, 2015
    Assignee: Amplify Education, Inc.
    Inventors: Lawrence Jason Berger, Gregory M. Gunn, John D. Stewart, Kenneth M. Gunn, Elizabeth Lynn, Nicole M. Adams, Anouk Markovits, Aaron Boyd
  • Patent number: 8986015
    Abstract: A method and system are presented to address quantitative assessment of social cues sensitivity of a subject, where the method comprises the steps of: (1) presenting at least one scene, comprising a single facial expression and a background, to a subject on a display; (2) adjusting the facial expression on the scene; (3) receiving feedback from the subject via at least one input device; (4) quantitatively refining the received feedback; (5) modulating the adjusted facial expression relative to the accuracy of the quantitatively refined feedback; (6) transforming the modulated facial expression; (7) calculating a critical threshold parameter; and (8) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 24, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8979542
    Abstract: A method and system are presented to address quantitative assessment of functional impairment in a subject, comprising: presenting a plurality of tests to a subject on a display, the tests comprising a series of scenes, the scene comprising at least one stimulus, wherein the scene may be associated with visual, auditory, and tactile stimulus arrays; modulating at least one spatial characteristic of at least one stimulus; receiving feedback from the subject via at least one input device; quantitatively refining the scene relative to the received feedback; calculating the relationship between at least one of the scene, the stimulus, and the quantitatively refined scene; adjusting the scene relative to the calculated relationship; determining an equilibrated scene parameter of the subject; recording the equilibrated scene parameter onto a tangible computer readable medium; generating at least one diagnosis of the functional impairment associated with the equilibrated scene parameter; and recommending treatment to the subject.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 17, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8979546
    Abstract: A method and system are presented to address quantitative assessment of facial emotion nulling of a subject, where the method comprises the steps of: (1) presenting at least one scene, comprising a single facial expression and a background, to a subject on a display; (2) adjusting the facial expression on the scene; (3) receiving feedback from the subject via at least one input device; (4) quantitatively refining the received feedback; (5) modulating the adjusted facial expression relative to the accuracy of the quantitatively refined feedback; (6) transforming the modulated facial expression; (7) calculating a critical threshold parameter; and (8) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 17, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8979547
    Abstract: A method and system are presented to address quantitative assessment of verbal memory of a subject, where the method comprises the steps of: (1) presenting at least one scene to the subject on a display, the scene comprising a plurality of symbols and a background; (2) moving the plurality of symbols relative to the scene, the movement being tracked by the subject via at least one input device; (3) adjusting the saliency of the plurality of symbols relative to the tracked movement; (4) increasing the number of said plurality of symbols; (5) receiving feedback from the subject via said input device; (6) quantitatively refining the received feedback; (7) modulating the movement of the plurality of symbols relative to the accuracy of the quantitatively refined feedback; (8) calculating a critical threshold parameter; and (9) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 17, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8979545
    Abstract: A method and system are presented to address quantitative assessment of social interactions nulling testing of a subject, where the method comprises the steps of: (1) presenting at least one scene to the subject on a display, the scene comprising a plurality of body images and a background; (2) receiving feedback from the subject via at least one input device; (3) quantitatively refining the received feedback; (4) modulating the adjusted body image relative to the accuracy of the received feedback; (5) calculating a critical threshold parameter for the subject; (6) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 17, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8979541
    Abstract: A method and system are presented to address quantitative assessment of spatial sequence memory in a subject, where the method comprises the steps of: (1) presenting at least one scene to the subject on a display, the scene comprising a plurality of elements and a background; (2) modulating the saliency of a predetermined section of said scene; (3) receiving feedback from the subject via at least one input device; (4) modifying the saliency of the predetermined section; (5) adjusting a functional assessment parameter relative to the scene; (6) moving the predetermined section relative to the scene; (7) receiving refined feedback from the subject via the input device; (8) quantitatively refining the refined feedback; (9) calculating a critical threshold parameter for the subject; and (10) recording the critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 17, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8979544
    Abstract: A method and system are presented to address quantitative assessment of word recognition sensitivity of a subject, where the method comprises the steps of: (1) presenting at least one scene to a subject on a display, the scene comprising a plurality of letters and a background; (2) distorting the scene; (3) moving the plurality of letters relative to the scene, the movement being tracked by the subject via at least one input device; (4) receiving feedback from the subject via the input device; (5) quantitatively refining the received feedback; (6) modulating the movement of the plurality of letters relative to the accuracy of the received feedback; (7) calculating a critical threshold parameter; and (8) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 17, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8979543
    Abstract: A method and system are presented to address quantitative assessment of word identification latency of a subject, where the method comprises the steps of: (1) presenting at least one scene, comprising a plurality of letters and a background, to a subject on a display; (2) moving the plurality of letters relative to the scene; (3) receiving feedback from the subject via at least one input device; (4) quantitatively refining the received feedback; (5) modulating the movement of the plurality of letters relative to the accuracy of the quantitatively refined feedback; (6) calculating a critical threshold parameter; and (7) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 17, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8974231
    Abstract: A method and system are presented to address quantitative assessment of letter identification latency of a subject, where the method comprises the steps of: (1) presenting at least one scene to a subject on a display, the scene comprising a plurality of letters and a background; (2) moving the plurality of letters relative to the scene, the movement being tracked by the subject via at least one input device; (3) receiving feedback from the subject via the input device; (4) quantitatively refining the received feedback; (5) modulating the movement of the plurality of letters relative to the accuracy of the received feedback; (6) calculating a critical threshold parameter; and (7) recording a critical threshold parameter onto a tangible computer readable medium.
    Type: Grant
    Filed: August 21, 2014
    Date of Patent: March 10, 2015
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles Joseph Duffy
  • Patent number: 8961183
    Abstract: Embodiments of the present invention are generally directed to an Audio-Story Engine that includes a repository of prerecorded audio files that, when played in a certain sequence, with user provided recordings placed throughout, tell a story. To obtain the user provided recordings, the Audio-Story Engine asks the user to make audio recordings of various words or phrases. For example, the Audio-Story Engine may ask the user a series of questions in order to record and store the user's audible responses. Upon completion, the Audio-Story Engine plays back a completed story that incorporates the user's audio recordings by playing an appropriate user recording after playing a prerecorded audio file. This is repeated several times in sequence to form a seamless, customized, audio story. In addition, the Audio-Story Engine may alter the pitch or sound of the user's recorded words to match the pitch of the prerecorded story.
    Type: Grant
    Filed: June 4, 2012
    Date of Patent: February 24, 2015
    Assignee: Hallmark Cards, Incorporated
    Inventors: Anne Catherine Bates, Jason Paul Gahr, Adam Samuel Scheff, Jason Blake Penrod, Stephanie Farris Young, Timothy Jay Lien, Michael Anthony Monaco, Jr.
  • Patent number: 8918718
    Abstract: A system for enhancing reading performance operates on a network-connected server, with software executing from a non-transitory medium at the server providing an interactive interface for a user connected to the server via a browser link. A data repository is coupled to the server. The interactive interface provides a word search exercise for improving the user's reading performance: it displays a passage comprising a first number of words and a search list with a second, smaller number of words that each appear at least once in the passage. When the user clicks on a word in the passage that appears in the search list, that word is indicated in the list as found, until all the words in the search list have been indicated as found.
    Type: Grant
    Filed: February 27, 2012
    Date of Patent: December 23, 2014
    Assignee: John Burgess Reading Performance System
    Inventor: John Burgess
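The click-and-mark bookkeeping this exercise describes can be sketched directly. The class and method names below are illustrative, not from the patent.

```python
# Minimal sketch of the word-search exercise: clicking a passage word
# marks it "found" if it appears in the search list; the exercise is
# complete once every search-list word has been found.

class WordSearchExercise:
    def __init__(self, passage, search_list):
        self.passage_words = passage.lower().split()
        self.search_list = [w.lower() for w in search_list]
        self.found = set()

    def click_word(self, word):
        """Register a click; return True if the word is newly marked found."""
        w = word.lower()
        if w in self.passage_words and w in self.search_list and w not in self.found:
            self.found.add(w)
            return True
        return False

    def complete(self):
        return set(self.search_list) == self.found

ex = WordSearchExercise("the quick brown fox jumps over the lazy dog",
                        ["fox", "dog"])
ex.click_word("fox")
ex.click_word("dog")
print(ex.complete())  # True
```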
  • Publication number: 20140335483
    Abstract: Social networking applications may be improved by incorporating a user's language proficiency to make content suggestions to the user. A language preference of a user, which may represent one of a plurality of signals, may be received. A signal may be, for example, an online activity of the user, a text generated or received by the user, or content requested by the user. At least one of the plurality of signals may be analyzed using a machine learning program. A machine learning program may be trained on data for a test group of users with a known language proficiency. A user-assigned language proficiency may be incorporated as a signal in training a machine learning program. The language proficiency of the user may be determined based upon the analysis of at least one of the plurality of signals. Content may be presented to the user based upon the language proficiency of the user.
    Type: Application
    Filed: May 13, 2013
    Publication date: November 13, 2014
    Inventors: Kirill BURYAK, Luke Hiro SWARTZ, Andrew SWERDLOW, Cibu JOHNY
  • Publication number: 20140302465
    Abstract: A method for teaching a person, such as a child, to read aloud is disclosed. The method is implemented as a voice activated, visual software application that provides a visual display of a story as the person reads the words of the story. The method provides pronunciation assistance to the person, which in the software includes a simulated “teacher” that helps the person with assistance in reading the story. The method progresses through a sequence of story passages which together represent the entire story. The method progresses to each story passage upon successful completion of the current story passage by the person reading the story, such that the method will not allow the visual story to progress for the reader if a word is not read correctly out loud.
    Type: Application
    Filed: April 4, 2014
    Publication date: October 9, 2014
    Inventors: Jonathan Andrew WATERHOUSE, Mary Allen MARSHALL
  • Publication number: 20140278376
    Abstract: Computer-implemented systems and methods are provided for automatically generating recitation items. For example, a computer performing the recitation item generation can receive one or more text sets that each includes one or more texts. The computer can determine a value for each text set using one or more metrics, such as a vocabulary difficulty metric, a syntactic complexity metric, a phoneme distribution metric, a phonetic difficulty metric, and a prosody distribution metric. Then the computer can select a final text set based on the value associated with each text set. The selected final text set can be used as the recitation items for a speaking assessment test.
    Type: Application
    Filed: March 17, 2014
    Publication date: September 18, 2014
    Applicant: Educational Testing Service
    Inventors: Su-Youn Yoon, Lei Chen, Keelan Evanini, Klaus Zechner
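One way to realize the selection step above is to combine the named metrics into a single value per text set and pick the best-valued set. The metric weights and values below are placeholders for illustration; the patent does not specify this formula.

```python
# Illustrative sketch: score each candidate text set with a weighted
# combination of the metrics named in the abstract, then select the
# best-scoring set as the recitation items.

METRIC_WEIGHTS = {
    "vocabulary_difficulty": 0.25,
    "syntactic_complexity": 0.20,
    "phoneme_distribution": 0.20,
    "phonetic_difficulty": 0.15,
    "prosody_distribution": 0.20,
}

def set_value(metrics):
    """Weighted sum of metric scores; missing metrics count as 0."""
    return sum(METRIC_WEIGHTS[name] * metrics.get(name, 0.0)
               for name in METRIC_WEIGHTS)

def select_final_set(text_sets):
    """text_sets maps set name -> metric scores; return the best set."""
    return max(text_sets, key=lambda name: set_value(text_sets[name]))

sets = {
    "set_a": {"vocabulary_difficulty": 0.8, "syntactic_complexity": 0.5,
              "phoneme_distribution": 0.9, "phonetic_difficulty": 0.4,
              "prosody_distribution": 0.7},
    "set_b": {"vocabulary_difficulty": 0.6, "syntactic_complexity": 0.6,
              "phoneme_distribution": 0.5, "phonetic_difficulty": 0.5,
              "prosody_distribution": 0.5},
}
print(select_final_set(sets))  # set_a
```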
  • Publication number: 20140272822
    Abstract: Systems and methods for learning a high-level visual vocabulary generate inter-visual-word relationships between a plurality of visual words based on visual word-label relationships, map the visual words to a vector space based on the inter-visual word relationships, and generate high-level visual words in the vector space.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: CANON KABUSHIKI KAISHA
    Inventors: Yang Yang, Bradley Scott Denney, Juwei Lu, Dariusz Dusberger, Hung Khei Huang
  • Patent number: 8827713
    Abstract: The present invention measures reading fluency, which is simultaneous decoding and comprehension. Whether or not a person is a fluent reader is determined by the size of the visual unit, or string of letters, used in word recognition. In order to measure the size of the visual unit used in word recognition, a lexical decision task (“LDT”) is used in which short and long words are presented on a display device. The person determines if the string of letters forms a word, enters their response on an input device, and the results are recorded. A score is calculated that measures reading fluency. The ability to correctly identify a string of letters as a word using holistic processing, rather than letter-by-letter processing, is the hallmark of a fluent reader.
    Type: Grant
    Filed: February 27, 2013
    Date of Patent: September 9, 2014
    Assignee: University of Minnesota
    Inventor: Jay Samuels
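A fluency score built on the lexical decision task could compare response latency on long versus short words, since a holistic reader shows little extra latency for longer strings. The scoring formula below is a hypothetical illustration of that idea, not the patent's actual score.

```python
# Hypothetical LDT scoring sketch: compute the ratio of mean response
# time on short words to mean response time on long words, over correct
# trials only. A ratio near 1.0 suggests holistic (fluent) word
# recognition; a much smaller ratio suggests letter-by-letter decoding.

from statistics import mean

def fluency_score(trials):
    """trials: dicts with 'length' ('short'|'long'), 'correct' (bool),
    and 'rt_ms' (response time in ms). Returns short/long mean-RT ratio."""
    short = [t["rt_ms"] for t in trials if t["correct"] and t["length"] == "short"]
    long_ = [t["rt_ms"] for t in trials if t["correct"] and t["length"] == "long"]
    return mean(short) / mean(long_)

trials = [
    {"length": "short", "correct": True, "rt_ms": 520},
    {"length": "short", "correct": True, "rt_ms": 540},
    {"length": "long", "correct": True, "rt_ms": 545},
    {"length": "long", "correct": True, "rt_ms": 555},
]
print(round(fluency_score(trials), 3))  # 0.964 — near 1.0: fluent profile
```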
  • Patent number: 8825492
    Abstract: The language-based video game places a player avatar into a game environment contained within a display field following a story narrative or an adventure for completing an objective. The gameplay reinforces pronunciation and writing of a given language. The display field includes a minor head graphic, as can be highlighted text in the given language, interactive text objects, and can include a control icon and a progress icon. The minor head graphic is a representation of a human head, or portion thereof, animated to show pronunciation of the highlighted text. As the player progresses through the game, the player encounters the interactive text objects that, upon activation, transform into useful objects for overcoming challenges present in the game environment, the interactive text object being the same as, substantially the same as, or corresponding to the highlighted text. Avatar movement and interactions are controlled through a control scheme via an interface.
    Type: Grant
    Filed: October 28, 2013
    Date of Patent: September 2, 2014
    Inventor: Yousef A. E. S. M. Buhadi
  • Publication number: 20140220517
    Abstract: A group of vocabulary words is selected for students at an educational institution. The vocabulary words, together with corresponding definitions, are put on clothing items worn by different students at the institution. The students are tested on the group of vocabulary words and their definitions before the clothing is worn and again some number of days after the clothing started being worn. Words are selected for removal from the group based on the results of the second test. New clothing items, with new vocabulary words and corresponding definitions, are then worn by students at the institution in place of the clothing items bearing the words selected for removal.
    Type: Application
    Filed: June 28, 2013
    Publication date: August 7, 2014
    Inventor: Michael Bailey
  • Patent number: 8784114
    Abstract: A computer system contains and displays a cartoon-like story with a plurality of displayed components (such as cartoon characters, play items, etc.) recognizable by a preschool student on a first portion of the display screen along with an accompanying voice message on the speaker. Written instructions for teachers to follow in a personal one-to-one interaction with one or more students are displayed on a second portion of the display screen. For some activities, the student may actually respond using the computer such as by touching a touch screen or using a mouse to select one of the displayed components in response to a request to select a displayed component having a predetermined aspect. For other activities the teacher enters the response of the student as observed by the teacher. When one or more correct responses are entered by the student or teacher, achievement of a goal can automatically be entered into the educational management system.
    Type: Grant
    Filed: July 29, 2013
    Date of Patent: July 22, 2014
    Assignee: Assessment Technology, Inc.
    Inventor: David W. Bergan
  • Publication number: 20140199667
    Abstract: A method of automatically converting alphabetic text written in a text based language into a non-text based language. The method can include parsing the text to identify at least one word. The method also can include, via a processor, identifying within a lexicon database data corresponding to the word, wherein the data corresponding to the word identifies at least one pictograph or symbol selected from a group of pictographs or symbols consisting of between twenty-seven and thirty-three distinct pictographs or symbols that visually look different than the text, wherein each pictograph or symbol corresponds to a unique speech sound of the text based language. The pictograph or symbol can be rendered in a view.
    Type: Application
    Filed: March 18, 2014
    Publication date: July 17, 2014
    Inventor: Howard A. Engelsen
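The word-to-symbol conversion described in the abstract above (parse text, look each word up in a lexicon, and render one pictograph per speech sound) can be sketched as follows. The lexicon entries, phoneme keys, and symbol glyphs here are hypothetical stand-ins, not the patent's actual inventory:

```python
# Hypothetical lexicon mapping words to phoneme keys (stand-ins for the
# patent's lexicon database of speech sounds).
LEXICON = {
    "cat": ["K", "AE", "T"],
    "ship": ["SH", "IH", "P"],
}

# One pictograph per unique speech sound; the patent describes an
# inventory of twenty-seven to thirty-three distinct symbols.
SYMBOLS = {"K": "◆", "AE": "▲", "T": "●", "SH": "■", "IH": "○", "P": "□"}

def to_pictographs(text: str) -> list:
    """Parse the text into words and render each known word as symbols."""
    rendered = []
    for word in text.lower().split():
        phonemes = LEXICON.get(word)
        if phonemes is None:
            rendered.append(word)  # leave unknown words as plain text
        else:
            rendered.append("".join(SYMBOLS[p] for p in phonemes))
    return rendered

print(to_pictographs("cat ship"))  # ['◆▲●', '■○□']
```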
  • Patent number: 8777626
    Abstract: An interactive computer based system and method for multi-sensory learning to teach students to read, to write, to improve their fluency, and to improve their reading comprehension. The multi-sensory system combines phonics and reading comprehension to teach students how to decode the alphabet and understand what they read. The multi-sensory system can be individually customized and tailored to meet needs of an individual student.
    Type: Grant
    Filed: May 3, 2013
    Date of Patent: July 15, 2014
    Assignee: Maxscholar, LLC
    Inventors: Daniel M. Levy, Deborah L. Levy, Elliot G. Levy
  • Patent number: 8777630
    Abstract: A method is presented to address quantitative assessment of facial emotion sensitivity of a subject, where the method comprises the steps of: (1) presenting at least one scene to the subject on a display, the scene comprising a plurality of faces and a background on a display, the plurality of faces comprising a plurality of facial expressions; (2) adjusting at least one facial expression on the scene; (3) receiving feedback from said subject via at least one input device; (4) quantitatively refining the received feedback; (5) modulating the adjusted facial expression relative to the accuracy of the received feedback; (7) transforming the modulated facial expression; (8) calculating a critical threshold parameter; and (9) recording the critical threshold parameter onto a tangible computer readable medium. An apparatus for quantitative assessment of facial emotion sensitivity of a subject comprises a display device, an input device, a control device, and a tangible computer readable medium.
    Type: Grant
    Filed: September 16, 2009
    Date of Patent: July 15, 2014
    Assignee: Cerebral Assessment Systems, Inc.
    Inventor: Charles J. Duffy
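The adjust-respond-modulate loop in the abstract above, ending in a calculated critical threshold, resembles a standard adaptive staircase from psychophysics. A minimal sketch, assuming a one-up/one-down rule, integer intensity units, and a hypothetical `respond()` callback; none of these specifics comes from the patent:

```python
def staircase_threshold(respond, start=10, step=1, reversals_needed=6):
    """One-up/one-down adaptive staircase: lower the expression intensity
    after a correct response, raise it after an incorrect one, and estimate
    the critical threshold as the mean intensity at the reversal points."""
    intensity, direction, reversals, levels = start, -1, 0, []
    while reversals < reversals_needed:
        correct = respond(intensity)
        new_direction = -1 if correct else +1
        if new_direction != direction:      # subject's accuracy flipped
            reversals += 1
            levels.append(intensity)
        direction = new_direction
        intensity = max(0, intensity + direction * step)
    return sum(levels) / len(levels)

# Simulated subject who is accurate only above intensity 5:
print(staircase_threshold(lambda i: i > 5))  # 5.5
```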
  • Publication number: 20140186806
    Abstract: The present invention is a method for assessing a patient's linguistic comprehension using a pupil response system comprising at least one pupillometer configured to measure the patient's pupil responses. The method includes (a) providing the patient with a list of verbal stimuli comprising at least two sets of verbal stimuli, each set of verbal stimuli comprising one or more verbal stimuli; wherein the two sets of the verbal stimuli differ substantially from each other in terms of the difficulty level; (b) presenting to the patient one verbal stimulus at a time from the list of verbal stimuli; (c) measuring and recording the patient's pupil response data for a period of time ranging from 200 milliseconds to 10 seconds during the presentation of each stimulus; and (d) analyzing the pupil response data to assess the patient's linguistic comprehension.
    Type: Application
    Filed: August 9, 2012
    Publication date: July 3, 2014
    Applicant: OHIO UNIVERSITY
    Inventors: Brooke Hallowell, Laura Roche Chapman
  • Patent number: 8758018
    Abstract: EEG-based acceleration of second language learning is accomplished by measuring via single-trial EEG a learner's cognitive response to the presentation (visual or auditory) of language learning materials and updating a user model of latent traits related to language-learning skills in accordance with the cognitive response. The user model is suitably updated with each trial, each trial being triggered by learner fixation on a portion of visual materials and/or a next phoneme in auditory materials. Additional discrimination may be achieved through the use of saccades or fixation duration features.
    Type: Grant
    Filed: December 31, 2009
    Date of Patent: June 24, 2014
    Assignee: Teledyne Scientific & Imaging, LLC
    Inventors: Mark Peot, Mario Aguilar, Aaron T. Hawkins
  • Patent number: 8753125
    Abstract: The game, with a folding board stylized like a baseball field, is uniquely associated with baseball, providing entertainment or an original learning method to sustain interest in and retention of any substituted subject. Rules in an instructive copyrighted manual simultaneously enhance English use. Boxed for two to twenty-two players aged five and up, the game features two teams alternating at bat once an inning; laminated picture cards with removable holders represent the team manager, umpire, and nine members. Each of the nine members has a playing position originally correlated with the nine parts of speech. The defense requires the offense to use the parts of speech designated at the bases to fit the nine designated sentence parts for sentence construction, winning with runs and walks. The subject's vocabulary is listed by parts of speech, kindergarten through grade twelve and beyond. After the pitcher selects a verb for each batter, the offense chooses vocabulary as designated to make and record sentences for reteaching.
    Type: Grant
    Filed: August 4, 2009
    Date of Patent: June 17, 2014
    Inventor: Arnot Dawn Havis Libby
  • Patent number: 8719035
    Abstract: Techniques are disclosed for recognizing user personality in accordance with a speech recognition system. For example, a technique for recognizing a personality trait associated with a user interacting with a speech recognition system includes the following steps/operations. One or more decoded spoken utterances of the user are obtained. The one or more decoded spoken utterances are generated by the speech recognition system. The one or more decoded spoken utterances are analyzed to determine one or more linguistic attributes (morphological and syntactic filters) that are associated with the one or more decoded spoken utterances. The personality trait associated with the user is then determined based on the analyzing step/operation.
    Type: Grant
    Filed: March 26, 2008
    Date of Patent: May 6, 2014
    Assignee: Nuance Communications, Inc.
    Inventors: Osamuyimen Thompson Stewart, Liwei Dai
  • Publication number: 20140093846
    Abstract: A system and method for providing an educational grade-level based word game that incorporates kinesthetic, tactile, and auditory learning styles for use in teaching children and other learners how to spell in an entertaining manner. The game correlates grade appropriate words to the common core language arts standards. The game may be portable and uses direct social interaction between real people. The game can incorporate a foreign language as well as sign language. The physical game set may include, but is not limited to, playing cards, one regular numbered game die, one six-sided graphical cube, and a kinesthetic implement, such as an elastic rope or a bouncy ball.
    Type: Application
    Filed: September 27, 2013
    Publication date: April 3, 2014
    Inventor: Tammy Bird
  • Patent number: 8684746
    Abstract: An exemplary proficiency examination system includes a collaborative system allowing multiple administrators across multiple educational institutions to add questions to a bank of questions from which an administrator at a single educational institution may select for a proficiency examination. The questions may be used in a placement examination for placing students into a class of an appropriate skill level.
    Type: Grant
    Filed: August 23, 2011
    Date of Patent: April 1, 2014
    Assignee: Saint Louis University
    Inventor: Daniel Nickolai
  • Patent number: 8678828
    Abstract: An apparatus for language instruction including at least one vowel card object having text corresponding to a vowel sound for a language and at least one consonant card object having text corresponding to a consonant sound for the language. Combining the text of at least one of the consonant card objects with the text of at least one of the vowel card objects defines a phonetic sound in the language based on the vowel sound and the consonant sound.
    Type: Grant
    Filed: February 22, 2010
    Date of Patent: March 25, 2014
    Inventor: Jennifer Liegh Gray
  • Patent number: 8672682
    Abstract: A method of automatically converting alphabetic text written in a text based language into a non-text based language. The method can include parsing the text to identify at least one word. The method also can include, via a processor, identifying within a lexicon database data corresponding to the word, wherein the data corresponding to the word identifies at least one pictograph or symbol selected from a group of pictographs or symbols consisting of between twenty-seven and thirty-three distinct pictographs or symbols that visually look different than the text, wherein each pictograph or symbol corresponds to a unique speech sound of the text based language. The pictograph or symbol can be rendered in a view.
    Type: Grant
    Filed: October 20, 2011
    Date of Patent: March 18, 2014
    Inventor: Howard A. Engelsen
  • Patent number: 8667400
    Abstract: Techniques for real-time observation assessment are provided. The techniques, which are designed for educators, take advantage of handheld computers, desktop/laptop computers and Internet access in order to reduce the paperwork associated with conventional educational assessments. An array of instructional assessment applications are designed to run on handheld computers. The instructional assessment applications may be based on existing and widely used paper methodologies. A common Web-based platform for assessment application distribution, selection, download, data management and reporting is also provided. Users can then periodically synchronize instructional data (assessments, diagnostic results, notes and/or schedules) to the Web site. At the Web site, browser-based reports and analysis can be viewed, administered and shared via electronic mail.
    Type: Grant
    Filed: June 22, 2009
    Date of Patent: March 4, 2014
    Assignee: Amplify Education, Inc.
    Inventors: Lawrence Jason Berger, Gregory M. Gunn, John D. Stewart, Kenneth M. Gunn, Elizabeth Lynn, Nicole M. Adams, Anouk Markovits, Aaron Boyd
  • Patent number: 8628329
    Abstract: A function of storing character string information added with vowel or nasal sound information; a function of displaying the character string information from a dictionary function or the character string information to be input; a function of displaying read number information of the displayed character string information; a function of designating and inputting desired positional information to the displayed read number information; a function of changeably setting the positional information regarding the designated and input positional information; a function of retrieving and extracting the character string information from the dictionary function at a position corresponding to the positional information obtained by referring to the changeable positional information, wherein the character string information is coincident with the vowel or the nasal sound information regarding a read of a character corresponding to the designated and input positional information; and a function of outputting and displaying
    Type: Grant
    Filed: November 17, 2011
    Date of Patent: January 14, 2014
    Assignee: Tachikogi Rider Inc.
    Inventor: Daisuke Doi
  • Patent number: 8602789
    Abstract: Methods for assessing cognitive and linguistic abilities by tracking and recording the eye movements of a patient in response to predetermined verbal and visual stimuli. The methods incorporate conventional eye-tracking technology to acquire eye-fixation location and duration measures for testing linguistic comprehension, working memory, attention allocation, and the effect of semantic associative priming. Visual stimuli presented in the methods are carefully designed to reduce visually distracting features. Verbal stimuli are carefully designed to control for numerous linguistic features.
    Type: Grant
    Filed: October 14, 2009
    Date of Patent: December 10, 2013
    Assignee: Ohio University
    Inventors: Brooke Hallowell, Hans Kruse
  • Publication number: 20130323690
    Abstract: An uninterrupted reading experience can be provided by calculating a vocabulary level for a user in a first language and comparing difficulty levels of words within a document in the first language to the vocabulary level of the user in the first language. Each word of the document having a difficulty level that exceeds the vocabulary level of the user in the first language can be selected.
    Type: Application
    Filed: May 23, 2013
    Publication date: December 5, 2013
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Ankur Gandhe, Rashmi Gangadharaiah, Ananthakrishnan Ramanathan, Karthik Visweswariah
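The selection step described above (compare each word's difficulty level against the user's vocabulary level and pick the words that exceed it) can be sketched directly. The difficulty scores and vocabulary-level scale here are hypothetical:

```python
def select_difficult_words(document_words, difficulty, user_level):
    """Return each word of the document whose difficulty level exceeds
    the user's vocabulary level in the first language. Words missing
    from the difficulty table are treated as easy (level 0)."""
    return [w for w in document_words if difficulty.get(w, 0) > user_level]

# Hypothetical per-word difficulty scores on a 1-10 scale:
difficulty = {"the": 1, "ubiquitous": 9, "cat": 1, "ephemeral": 8}
print(select_difficult_words(["the", "ubiquitous", "cat"], difficulty, 5))
# ['ubiquitous']
```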
  • Publication number: 20130323689
    Abstract: Embodiments of the present invention are generally directed to an Audio-Story Engine that includes a repository of prerecorded audio files that, when played in a certain sequence, with user provided recordings placed throughout, tell a story. To obtain the user provided recordings, the Audio-Story Engine asks the user to make audio recordings of various words or phrases. For example, the Audio-Story Engine may ask the user a series of questions in order to record and store the user's audible responses. Upon completion, the Audio-Story Engine plays back a completed story that incorporates the user's audio recordings by playing an appropriate user recording after playing a prerecorded audio file. This is repeated several times in sequence to form a seamless, customized, audio story. In addition, the Audio-Story Engine may alter the pitch or sound of the user's recorded words to match the pitch of the prerecorded story.
    Type: Application
    Filed: June 4, 2012
    Publication date: December 5, 2013
    Applicant: HALLMARK CARDS, INCORPORATED
    Inventors: ANNE CATHERINE BATES, JASON PAUL GAHR, ADAM SAMUEL SCHEFF, JASON BLAKE PENROD, STEPHANIE FARRIS YOUNG, TIMOTHY JAY LIEN, MICHAEL ANTHONY MONACO, JR.
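The playback sequencing described above (a prerecorded segment, then the matching user recording, repeated to form one continuous story) can be sketched as a simple interleave. The segment and recording values here are hypothetical placeholders for audio clips:

```python
def assemble_story(prerecorded, user_recordings):
    """Interleave prerecorded story segments with the user's recorded
    answers: segment, answer, segment, answer, ..., final segment."""
    story = []
    for i, segment in enumerate(prerecorded):
        story.append(segment)
        if i < len(user_recordings):
            story.append(user_recordings[i])
    return story

print(assemble_story(
    ["Once upon a time,", "went to", "The end."],
    ["Alice", "the moon."],
))
# ['Once upon a time,', 'Alice', 'went to', 'the moon.', 'The end.']
```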
  • Patent number: 8596640
    Abstract: A method of play, wherein play emphasizes storytelling and story recounting abilities, and wherein the method comprises: (a) providing a gaming device which provides for: a plurality of subject elements, wherein each subject element from the plurality of subject elements comprises a topic for a particular story; and a plurality of situational elements, wherein each situational element from the plurality of situational elements qualifies the subject element; (b) pairing a subject element from the plurality of subject elements with a situational element from the plurality of situational elements; and (c) providing a story based on the paired subject element and selected situational element.
    Type: Grant
    Filed: October 31, 2012
    Date of Patent: December 3, 2013
    Inventor: Jacob G. R. Kramlich
  • Patent number: RE45322
    Abstract: A method of interpreting keypad input includes identifying a first letter of a target word from activation of an initial key, identifying a set of possible intermediate letters of the target word in response to non-activating traversal of associated keys of the keypad following activation of the initial key, identifying a last letter of the target word from activation of a final key following the non-activating traversal, and then determining the target word based upon the identified first, intermediate and last letters. The method is particularly useful in key input devices sensitive to non-activating finger position above the keys.
    Type: Grant
    Filed: November 30, 2012
    Date of Patent: January 6, 2015
    Assignee: Nuance Communications, Inc.
    Inventor: David H. Levy
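The word-determination step in the RE45322 abstract above (first and last letters fixed by key activations, interior letters constrained to the keys traversed without activation) can be sketched as a dictionary filter. The dictionary and traversal string here are hypothetical examples:

```python
def match_target_words(first, last, traversed, dictionary):
    """Candidate target words whose first and last letters match the
    activated keys and whose interior letters all lie on the set of keys
    crossed during the non-activating traversal."""
    path = set(traversed)
    return [
        w for w in dictionary
        if len(w) >= 2
        and w[0] == first
        and w[-1] == last
        and set(w[1:-1]) <= path
    ]

# First key 'h' activated, finger passed over 'e', 'l', 'z', last key 'o':
print(match_target_words("h", "o", "elz", ["hello", "halo", "hollow", "hero"]))
# ['hello']
```

Note that the interior letters are treated as a set, so repeated letters ("ll" in "hello") need only one traversed key.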