Patents Assigned to Educational Testing Service
-
Publication number: 20140278376
Abstract: Computer-implemented systems and methods are provided for automatically generating recitation items. For example, a computer performing the recitation item generation can receive one or more text sets that each includes one or more texts. The computer can determine a value for each text set using one or more metrics, such as a vocabulary difficulty metric, a syntactic complexity metric, a phoneme distribution metric, a phonetic difficulty metric, and a prosody distribution metric. Then the computer can select a final text set based on the value associated with each text set. The selected final text set can be used as the recitation items for a speaking assessment test.
Type: Application
Filed: March 17, 2014
Publication date: September 18, 2014
Applicant: Educational Testing Service
Inventors: Su-Youn Yoon, Lei Chen, Keelan Evanini, Klaus Zechner
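As a rough illustration of the selection step this abstract describes, the sketch below scores candidate text sets with placeholder metric functions and keeps the highest-valued set. The specific metrics and weights are assumptions for illustration only, not the patented method.

```python
# Minimal sketch: value each candidate text set and pick the best one.
# The metric functions and weights below are illustrative placeholders.

def vocabulary_difficulty(texts):
    # Placeholder: average word length as a crude vocabulary-difficulty proxy.
    words = [w for t in texts for w in t.split()]
    return sum(len(w) for w in words) / max(len(words), 1)

def syntactic_complexity(texts):
    # Placeholder: average sentence length in words.
    sentences = [s for t in texts for s in t.split(".") if s.strip()]
    return sum(len(s.split()) for s in sentences) / max(len(sentences), 1)

def text_set_value(texts, weights=(0.5, 0.5)):
    metrics = (vocabulary_difficulty(texts), syntactic_complexity(texts))
    return sum(w * m for w, m in zip(weights, metrics))

def select_final_set(text_sets):
    # Select the text set whose combined metric value is highest.
    return max(text_sets, key=text_set_value)

candidate_sets = [["The cat sat."], ["Phonetically balanced passages are often preferred."]]
print(select_final_set(candidate_sets))
```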
-
Publication number: 20140279763
Abstract: In accordance with the teachings described herein, systems and methods are provided for measuring a user's comprehension of subject matter of a text. A summary generated by the user is received, where the summary summarizes the text. The summary is processed to determine a first numerical measure indicative of a similarity between the summary and a reference summary. The summary is processed to determine a second numerical measure indicative of a degree to which a single sentence of the summary summarizes an entirety of the text. The summary is processed to determine a third numerical measure indicative of a degree of copying in the summary of multi-word sequences present in the text. A numerical model is applied to the first numerical measure, the second numerical measure and the third numerical measure to determine a score for the summary indicative of the user's comprehension of the subject matter of the text.
Type: Application
Filed: March 18, 2014
Publication date: September 18, 2014
Applicant: Educational Testing Service
Inventors: Nitin Madnani, Jill Burstein
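A minimal sketch of how the three measures might be combined by a numerical model, assuming simple word-overlap and n-gram features and a hand-set linear model; the actual features and model in the patent are not specified here.

```python
# Minimal sketch: three summary measures combined by a linear model.
# The feature computations, weights, and bias are illustrative assumptions.

def ngram_copy_rate(summary, source, n=4):
    # Share of summary n-grams that also appear verbatim in the source text.
    toks, src = summary.split(), source.split()
    grams = [tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)]
    src_grams = {tuple(src[i:i + n]) for i in range(len(src) - n + 1)}
    return sum(g in src_grams for g in grams) / max(len(grams), 1)

def word_overlap(a, b):
    # Crude similarity proxy: Jaccard overlap of word types.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def score_summary(summary, reference_summary, source_text,
                  weights=(2.0, 1.0, -1.5), bias=1.0):
    m1 = word_overlap(summary, reference_summary)          # similarity to reference summary
    m2 = max(word_overlap(s, source_text)                  # best single-sentence coverage
             for s in summary.split(".") if s.strip())
    m3 = ngram_copy_rate(summary, source_text)             # verbatim copying from the text
    return bias + sum(w * m for w, m in zip(weights, (m1, m2, m3)))
```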
-
Publication number: 20140255886
Abstract: Computer-implemented systems and methods are provided for automatically scoring the content of moderately predictable responses. For example, a computer performing the content scoring analysis can receive a response (either in text or spoken form) to a prompt. The computer can determine the content correctness of the response by analyzing one or more content features. One of the content features is analyzed by applying one or more regular expressions, determined based on training responses associated with the prompt. Another content feature is analyzed by applying one or more context free grammars, determined based on training responses associated with the prompt. Another content feature is analyzed by applying a keyword list, determined based on the test prompt eliciting the response and/or stimulus material. Another content feature is analyzed by applying one or more probabilistic n-gram models, determined based on training responses associated with the prompt.
Type: Application
Filed: March 6, 2014
Publication date: September 11, 2014
Applicant: Educational Testing Service
Inventors: Xinhao Wang, Klaus Zechner, Shasha Xie
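Two of the content features named above (the regular-expression feature and the keyword-list feature) are easy to illustrate. The patterns and keywords below are made-up placeholders; in the patented approach they would be derived from training responses and the prompt.

```python
import re

# Minimal sketch of a regular-expression content feature and a keyword-list
# content feature for one prompt. Patterns and keywords are placeholders.

PROMPT_PATTERNS = [r"\bwater\s+boils\b", r"\b100\s*degrees\b"]
PROMPT_KEYWORDS = {"water", "boils", "temperature", "celsius"}

def regex_feature(response):
    # Count how many prompt-specific patterns the response matches.
    return sum(bool(re.search(p, response, re.IGNORECASE)) for p in PROMPT_PATTERNS)

def keyword_feature(response):
    # Fraction of prompt keywords covered by the response.
    tokens = set(re.findall(r"[a-z]+", response.lower()))
    return len(PROMPT_KEYWORDS & tokens) / len(PROMPT_KEYWORDS)

response = "Water boils at 100 degrees Celsius."
print(regex_feature(response), keyword_feature(response))
```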
-
Publication number: 20140234810
Abstract: Computer-implemented systems and methods are provided for determining a document's complexity. For example, a computer performing the complexity analysis can receive a document. The computer can determine the content words within the document and determine an association measure for each group of content words. An association profile can be created for the document using the association measures. The computer can use the association profile to determine the complexity of the document. The complexity of the document may correspond to the document's suitable reading level or, if the document is an essay, an essay score.
Type: Application
Filed: February 14, 2014
Publication date: August 21, 2014
Applicant: Educational Testing Service
Inventors: Michael Flor, Beata Beigman Klebanov
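As a sketch of what an association profile might look like, the code below uses pointwise mutual information over sentence co-occurrence as the association measure for content-word pairs. PMI is one common choice; the abstract does not say which measure or grouping the patented method actually uses.

```python
import math
from itertools import combinations
from collections import Counter

# Minimal sketch: an association profile built from PMI values of
# content-word pairs that co-occur within sentences.

def association_profile(sentences):
    word_counts, pair_counts, n = Counter(), Counter(), 0
    for sent in sentences:
        words = sorted(set(sent.lower().split()))
        n += 1
        word_counts.update(words)
        pair_counts.update(combinations(words, 2))
    profile = []
    for (w1, w2), c in pair_counts.items():
        pmi = math.log((c / n) / ((word_counts[w1] / n) * (word_counts[w2] / n)))
        profile.append(pmi)
    return profile  # summarized downstream, e.g., by its mean or a histogram

sents = ["solar panels convert sunlight", "panels convert sunlight into power"]
prof = association_profile(sents)
print(round(sum(prof) / len(prof), 3))
```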
-
Patent number: 8798518
Abstract: A method and system for estimating uncalibrated task performance are disclosed.
Type: Grant
Filed: June 29, 2005
Date of Patent: August 5, 2014
Assignee: Educational Testing Service
Inventors: Russell G. Almond, Sandip Sinharay, Linda Steinberg, Robert J. Mislevy
-
Publication number: 20140199676
Abstract: Computer-implemented systems and methods are provided for scoring content of a spoken response to a prompt. A scoring model is generated for a prompt, where generating the scoring model includes generating a transcript for each of a plurality of training responses to the prompt, dividing the plurality of training responses into clusters based on the transcripts of the training responses, selecting a subset of the training responses in each cluster for scoring, scoring the selected subset of training responses for each cluster, and generating content training vectors using the transcripts from the scored subset. A transcript is generated for a received spoken response to be scored, and a similarity metric is computed between the transcript of the spoken response to be scored and the content training vectors. A score is assigned to the spoken response based on the determined similarity metric.
Type: Application
Filed: January 10, 2014
Publication date: July 17, 2014
Applicant: Educational Testing Service
Inventors: Lei Chen, Klaus Zechner, Anastassia Loukina
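The final scoring step can be illustrated as below: a response transcript is compared against content training vectors and takes the score of the most similar one. The clustering and subset-sampling steps are omitted, and bag-of-words vectors with cosine similarity stand in for whatever vectors and similarity metric the patent uses.

```python
import math
from collections import Counter

# Minimal sketch: score a response transcript by cosine similarity against
# content training vectors (here, bag-of-words centroids per score point).

def bow(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# One content training vector per score point, built from scored training transcripts.
training_vectors = {
    3: bow("the company should expand because demand is growing"),
    1: bow("I like pizza and movies"),
}

def score_response(transcript):
    return max(training_vectors, key=lambda s: cosine(bow(transcript), training_vectors[s]))

print(score_response("demand is growing so the company should expand"))
```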
-
Publication number: 20140193792
Abstract: A Portal Assessment Design System includes a system for designing assessment models and assessments (e.g., standardized tests). In assessment design, specifications are organized around three basic elements: 1) what knowledge, skills, or abilities are important to measure; 2) what evidence of that knowledge, skill, or ability is necessary; and 3) what performance situations are needed to elicit the required evidence from students and other assessment users. The system includes three modules (i.e., Domain Analysis, Domain Modeling, Conceptual Assessment Framework), each of which targets a specific phase of the design process, for designing an assessment in accordance with the three basic elements. The sequence of modules represents the structure of a design process that helps assessment developers work through successive stages or phases of assessment design, from the gathering of raw information to the creation of a complete set of specifications for an assessment product.
Type: Application
Filed: February 14, 2014
Publication date: July 10, 2014
Applicant: Educational Testing Service
Inventors: Linda Steinberg, Robert J. Mislevy, Russell G. Almond
-
Publication number: 20140141392
Abstract: Systems and methods are provided for assigning a difficulty score to a speech sample. Speech recognition is performed on a digitized version of the speech sample using an acoustic model to generate word hypotheses for the speech sample. Time alignment is performed between the speech sample and the word hypotheses to associate the word hypotheses with corresponding sounds of the speech sample. A first difficulty measure is determined based on the word hypotheses, and a second difficulty measure is determined based on acoustic features of the speech sample. A difficulty score for the speech sample is generated based on the first difficulty measure and the second difficulty measure.
Type: Application
Filed: November 15, 2013
Publication date: May 22, 2014
Applicant: Educational Testing Service
Inventors: Su-Youn Yoon, Yeonsuk Cho, Klaus Zechner, Diane Napolitano
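A minimal sketch of combining the two difficulty measures; the individual measures and weights below are illustrative assumptions, not the patented computation.

```python
# Minimal sketch: combine a word-hypothesis-based measure with an
# acoustic-feature-based measure into one difficulty score.

def word_based_difficulty(word_hypotheses):
    # Placeholder: mean word length over the recognizer's word hypotheses.
    return sum(len(w) for w in word_hypotheses) / max(len(word_hypotheses), 1)

def acoustic_difficulty(speaking_rate_wps, mean_pause_sec):
    # Placeholder: faster speech and longer pauses both treated as harder.
    return 0.7 * speaking_rate_wps + 0.3 * mean_pause_sec

def difficulty_score(word_hypotheses, speaking_rate_wps, mean_pause_sec, w1=0.5, w2=0.5):
    return (w1 * word_based_difficulty(word_hypotheses)
            + w2 * acoustic_difficulty(speaking_rate_wps, mean_pause_sec))

print(difficulty_score(["the", "economy", "is", "improving"], 3.2, 0.4))
```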
-
Patent number: 8706022
Abstract: An on-line teaching and learning system is disclosed that provides rapid, change-by-change or real-time reinforcement signals to students while the teacher simultaneously monitors the actual responses as well as their correctness. The system may include at least one teacher computer and a plurality of student computers operably connected to the at least one teacher computer by a communications network. The system may be used in a classroom setting or in a distance-learning environment.
Type: Grant
Filed: July 22, 2008
Date of Patent: April 22, 2014
Assignee: Educational Testing Service
Inventors: Michael F. Dunk, Mary Crist
-
Patent number: 8651873
Abstract: A Portal Assessment Design System includes a system for designing assessment models and assessments (e.g., standardized tests). In assessment design, specifications are organized around three basic elements: 1) what knowledge, skills, or abilities are important to measure; 2) what evidence of that knowledge, skill, or ability is necessary; and 3) what performance situations are needed to elicit the required evidence from students and other assessment users. The system includes three modules (i.e., Domain Analysis, Domain Modeling, Conceptual Assessment Framework), each of which targets a specific phase of the design process, for designing an assessment in accordance with the three basic elements. The sequence of modules represents the structure of a design process that helps assessment developers work through successive stages or phases of assessment design, from the gathering of raw information to the creation of a complete set of specifications for an assessment product.
Type: Grant
Filed: August 6, 2004
Date of Patent: February 18, 2014
Assignee: Educational Testing Service
Inventors: Linda S. Steinberg, Robert J. Mislevy, Russell W. Almond
-
Patent number: 8632344
Abstract: A method and system for customizing an automated essay scoring system are disclosed. Relative weights may be assigned to a plurality of features for a plurality of benchmark essay responses. Based on the relative weights, automated scores may be determined for one or more first essay responses to a first prompt. The automated scores may be scaled based on one or more scoring standards for the first prompt. A scaled automated score may be assigned for a second essay response to the first prompt based on the scaled automated scores for the one or more first essay responses.
Type: Grant
Filed: May 25, 2012
Date of Patent: January 21, 2014
Assignee: Educational Testing Service
Inventor: Yigal Attali
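The two steps described above (a weighted-feature raw score, then scaling to a prompt's scoring standards) can be sketched as follows. The features, weights, and benchmark statistics are placeholders, and mean/standard-deviation scaling is an assumption about what "scaled based on scoring standards" could look like.

```python
import statistics

# Minimal sketch: raw weighted-feature score, then linear scaling so that
# automated scores match the distribution of human scores on benchmark
# responses to the same prompt.

FEATURE_WEIGHTS = {"grammar_errors": -0.8, "word_count": 0.002, "vocab_diversity": 1.5}

def raw_score(features):
    return sum(FEATURE_WEIGHTS[name] * value for name, value in features.items())

def scale_to_prompt(raw, benchmark_raw_scores, benchmark_human_scores):
    z = (raw - statistics.mean(benchmark_raw_scores)) / statistics.stdev(benchmark_raw_scores)
    return statistics.mean(benchmark_human_scores) + z * statistics.stdev(benchmark_human_scores)

essay = {"grammar_errors": 3, "word_count": 412, "vocab_diversity": 0.61}
print(round(scale_to_prompt(raw_score(essay), [0.2, 0.9, 1.4], [2.0, 3.5, 4.0]), 2))
```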
-
Patent number: 8626054
Abstract: To automatically annotate an essay, a sentence of the essay is identified and a feature associated with the sentence is determined. In addition, a probability of the sentence being a discourse element is determined by mapping the feature to a model, the model having been generated by a machine learning application based on at least one annotated essay. Furthermore, the essay is annotated based on the probability.
Type: Grant
Filed: July 20, 2010
Date of Patent: January 7, 2014
Assignee: Educational Testing Service
Inventors: Jill Burstein, Daniel Marcu
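One way to picture "mapping the feature to a model" is a logistic model that turns sentence features into a probability of a discourse label. The features and coefficients below are made-up placeholders; the patent describes learning the model from annotated essays rather than setting it by hand.

```python
import math

# Minimal sketch: map sentence features to a probability that the sentence is
# a given discourse element (e.g., a thesis statement), then annotate.

COEFS = {"bias": -2.0, "is_first_paragraph": 1.5,
         "contains_opinion_word": 1.2, "relative_position": -0.8}

def discourse_probability(features):
    z = COEFS["bias"] + sum(COEFS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))   # logistic link

def annotate(sentence, features, label="THESIS", threshold=0.5):
    p = discourse_probability(features)
    return f"<{label} p={p:.2f}>{sentence}</{label}>" if p >= threshold else sentence

feats = {"is_first_paragraph": 1, "contains_opinion_word": 1, "relative_position": 0.1}
print(annotate("I believe school uniforms should be required.", feats))
```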
-
Publication number: 20130297294
Abstract: Systems and methods are provided for non-monotonic recognition of phrasal terms. Phrasal terms are identified from a corpus of written materials and ranked based on, for example, a mutual rank ratio. The phrasal terms are sequentially selected and a determination is made as to whether to accept or reject the selected phrasal term based on at least one predetermined criterion. The ranking of the phrasal terms may also rely on linguistic support to reduce duplication of phrasal terms and to distinguish different confidence levels for identified and accepted phrasal terms.
Type: Application
Filed: March 14, 2013
Publication date: November 7, 2013
Applicant: Educational Testing Service
Inventors: Robert Krovetz, Paul Deane
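The sketch below is a rough, simplified rank-based association score for bigrams, in the spirit of the mutual rank ratio mentioned in the abstract; the exact statistic used in the patent may differ. Each bigram is ranked among the collocates of each of its words, and better (lower) ranks in both directions give a higher score.

```python
from collections import Counter, defaultdict

# Simplified sketch of a rank-based association score for candidate phrasal terms.

def rank_based_scores(tokens):
    bigrams = Counter(zip(tokens, tokens[1:]))
    right, left = defaultdict(Counter), defaultdict(Counter)
    for (w1, w2), c in bigrams.items():
        right[w1][w2] = c
        left[w2][w1] = c

    def rank(counter, key):
        # 1-based rank of `key` by descending frequency among the collocates.
        return sorted(counter, key=counter.get, reverse=True).index(key) + 1

    scores = {}
    for (w1, w2) in bigrams:
        r1 = rank(right[w1], w2)   # rank of w2 among right-collocates of w1
        r2 = rank(left[w2], w1)    # rank of w1 among left-collocates of w2
        scores[(w1, w2)] = 2.0 / (r1 + r2)
    return scores

text = "hot dog stand near the hot dog cart sells a hot dog".split()
scores = rank_based_scores(text)
print(max(scores, key=scores.get))   # ('hot', 'dog') ranks highest
```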
-
Patent number: 8572126
Abstract: A computer memory stores a data structure representing a ternary search tree (TST) representing multiple word n-grams for a corpus of documents. The data structure includes plural records in a first memory, each record representing a node of the TST and comprising plural fields. At least some n-grams have a sequence of units. The plurality of fields includes one for identifying a given unit of the sequence for a given node, one reserved for storing payload information for the given node, and plural child fields reserved for storing information for first, second, and third child nodes of the given node. The child fields store a null value indicating the absence of the child node or an identifier identifying a memory location of the child node. For at least one record, at least one of the child fields stores an identifier identifying a memory location of a memory different than the first memory.
Type: Grant
Filed: June 24, 2011
Date of Patent: October 29, 2013
Assignee: Educational Testing Service
Inventor: Michael Flor
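A minimal sketch of the node record described above: one field for the unit (e.g., a word of an n-gram), one reserved for payload (e.g., the n-gram's count), and three child fields holding either a null value or an identifier of the child's location. Integer indices into a list stand in for memory locations here; the patent additionally allows a child field to reference a different memory than the one holding the record.

```python
from dataclasses import dataclass
from typing import Any, Optional

# Minimal sketch of a ternary-search-tree node record for word n-grams.

@dataclass
class TSTNode:
    unit: str                       # the word (unit) stored at this node
    payload: Optional[Any] = None   # e.g., frequency once an n-gram ends here
    lo: Optional[int] = None        # child for units ordered before `unit`
    eq: Optional[int] = None        # child continuing the n-gram sequence
    hi: Optional[int] = None        # child for units ordered after `unit`

nodes = [TSTNode("the"), TSTNode("cat", payload=42)]
nodes[0].eq = 1   # "the" -> "cat" continues the sequence; ("the", "cat") has count 42
print(nodes[nodes[0].eq].payload)
```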
-
Publication number: 20130275461
Abstract: Systems and methods are provided for identifying factual information in a written document. Named entities and corresponding noun phrases are identified in the written document. A query is built by combining one of the named entities with a respective one of the noun phrases. The query represents an assertion of a potential fact. The query is submitted for comparison with a fact repository which assesses whether the query presents a factual assertion. If the query presents a factual assertion (e.g., it matches a fact within the fact repository), a match is returned. Various modifications may be made to the queries to return additional matches and various combinations of filters may be applied to the matches to filter out less relevant matches.
Type: Application
Filed: March 12, 2013
Publication date: October 17, 2013
Applicant: Educational Testing Service
Inventors: Beata Beigman Klebanov, Derrick Higgins
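The query-building and matching steps can be sketched as below. The named-entity and noun-phrase extraction and the fact repository are stubbed with placeholder data; a real system would use an NER/parsing pipeline and a large fact database, and would add the query modifications and filters the abstract mentions.

```python
# Minimal sketch: pair named entities with noun phrases to form candidate
# factual assertions, then check them against a (toy) fact repository.

FACT_REPOSITORY = {("Marie Curie", "Nobel Prize"), ("Paris", "capital of France")}

def build_queries(named_entities, noun_phrases):
    # Each query pairs one named entity with one noun phrase from the document.
    return [(ne, np) for ne in named_entities for np in noun_phrases]

def match_facts(queries):
    # Return the queries whose asserted pairing is found in the repository.
    return [q for q in queries if q in FACT_REPOSITORY]

entities = ["Marie Curie"]
phrases = ["Nobel Prize", "her laboratory"]
print(match_facts(build_queries(entities, phrases)))
```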
-
Patent number: 8550822
Abstract: A method of determining a mastery level for an examinee from an assessment is disclosed. The method includes receiving one or more of an overall skill level for an examinee, a weight for the overall skill level, a covariate vector for an examinee, and a weight for the covariate vector. An examinee attribute value is computed using one or more of the received values for each examinee and each attribute. The computation of the examinee attribute values can include estimating the value using a Markov Chain Monte Carlo estimation technique. Examinee mastery levels are then assigned based on each examinee attribute value. Dichotomous or polytomous levels can be assigned based on requirements for the assessment.
Type: Grant
Filed: November 5, 2009
Date of Patent: October 8, 2013
Assignee: Educational Testing Service
Inventor: Jonathan Templin
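To make the Markov Chain Monte Carlo step concrete, here is a toy Metropolis sampler for a single examinee attribute under a simple logistic (Rasch-like) model with dichotomous item responses. The patent's model also involves overall skill levels, covariate vectors, and weights, all of which are omitted from this illustration.

```python
import math
import random

# Toy Metropolis sampler for one examinee attribute value (theta) given
# dichotomous item responses and known item difficulties.

def log_posterior(theta, responses, difficulties):
    ll = 0.0
    for x, b in zip(responses, difficulties):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        ll += math.log(p) if x else math.log(1.0 - p)
    return ll - 0.5 * theta * theta   # standard-normal prior on theta

def sample_attribute(responses, difficulties, n_iter=5000):
    theta, draws = 0.0, []
    for _ in range(n_iter):
        proposal = theta + random.gauss(0.0, 0.5)
        if math.log(random.random()) < (log_posterior(proposal, responses, difficulties)
                                        - log_posterior(theta, responses, difficulties)):
            theta = proposal
        draws.append(theta)
    return sum(draws[1000:]) / len(draws[1000:])   # posterior mean after burn-in

estimate = sample_attribute([1, 1, 0, 1], [-0.5, 0.0, 0.5, 1.0])
print("mastery" if estimate > 0.0 else "non-mastery", round(estimate, 2))
```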
-
Patent number: 8554129
Abstract: A computer-based testing system includes testing stations connected to a testing service center and backend via the Internet for providing testing services. The system is operable to perform state management to implement fault recovery due to a computing device failure while a test is being administered. The system is also operable to utilize multiple caching techniques for mitigating network latency while administering tests.
Type: Grant
Filed: November 13, 2003
Date of Patent: October 8, 2013
Assignee: Educational Testing Service
Inventors: Darshan Timbadia, Steve Hendershott, Ken Berger
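A minimal sketch of the state-management idea: the testing station periodically persists session state so that, after a device failure, the test can resume where it left off. File-based persistence is used purely for illustration; the patented system persists state via the testing service backend, and its caching techniques are not shown here.

```python
import json
import pathlib

# Minimal sketch: checkpoint and recover a test session's state.

STATE_DIR = pathlib.Path("sessions")

def save_state(session_id, state):
    STATE_DIR.mkdir(exist_ok=True)
    (STATE_DIR / f"{session_id}.json").write_text(json.dumps(state))

def recover_state(session_id):
    path = STATE_DIR / f"{session_id}.json"
    return json.loads(path.read_text()) if path.exists() else None

save_state("abc123", {"current_item": 17, "answers": {"1": "B", "2": "D"}, "time_left": 2400})
print(recover_state("abc123")["current_item"])   # resume at item 17 after a crash
```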
-
Publication number: 20130262110
Abstract: Systems and methods are provided for generating a transcript of a speech sample response to a test question. The speech sample response to the test question is provided to a language model, where the language model is configured to perform an automated speech recognition function. The language model is adapted to the test question to improve the automated speech recognition function by providing to the language model automated speech recognition data related to the test question, Internet data related to the test question, or human-generated transcript data related to the test question. The transcript of the speech sample is generated using the adapted language model.
Type: Application
Filed: March 14, 2013
Publication date: October 3, 2013
Applicant: Educational Testing Service
Inventors: Shasha Xie, Lei Chen
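One common way to adapt a language model to a test question is to interpolate a background model with one estimated from question-related text (ASR output, web data, or human transcripts for that question). The unigram simplification and interpolation weight below are illustrative assumptions; production ASR language models are n-gram or neural models.

```python
from collections import Counter

# Minimal sketch: linear interpolation of a background unigram model with a
# question-specific unigram model estimated from question-related text.

def unigram_model(text):
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def adapt(background, question_specific, lam=0.3):
    vocab = set(background) | set(question_specific)
    return {w: (1 - lam) * background.get(w, 0.0) + lam * question_specific.get(w, 0.0)
            for w in vocab}

background = unigram_model("the a of to and in is it for on")
topic = unigram_model("describe your favorite holiday and why you enjoy it")
adapted = adapt(background, topic)
print(adapted["holiday"] > background.get("holiday", 0.0))   # topic words gain probability
```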
-
Publication number: 20130254216
Abstract: Systems and methods are provided for scoring a response to a character-by-character highlighting task. A similarity value for the response is calculated by comparing the response to one or more correct responses to the task to determine the similarity or dissimilarity of the response to the one or more correct responses to the task. A threshold similarity value is calculated for the task, where the threshold similarity value is indicative of an amount of similarity or dissimilarity to the one or more correct responses required for the response to be scored at a certain level. The similarity value for the response is compared to the threshold similarity value. A score is assigned at, above, or below the certain level based on the comparison.
Type: Application
Filed: March 22, 2013
Publication date: September 26, 2013
Applicant: Educational Testing Service
Inventors: Kentaro Yamamoto, Jana Sukkarieh, Matthias Von Davier
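A minimal sketch of the comparison described above, treating a highlighting response as the set of character positions the test taker highlighted. Jaccard overlap with the correct highlighting and a fixed threshold stand in for the similarity measure and the threshold-calculation procedure in the patent.

```python
# Minimal sketch: similarity of a highlighting response to correct responses,
# then a threshold decision on the score level.

def similarity(response_positions, correct_positions):
    r, c = set(response_positions), set(correct_positions)
    return len(r & c) / len(r | c) if r | c else 1.0

def score(response_positions, correct_responses, threshold=0.8, level=1):
    best = max(similarity(response_positions, correct) for correct in correct_responses)
    return level if best >= threshold else level - 1

correct_variants = [range(10, 25), range(10, 27)]
print(score(range(11, 25), correct_variants))   # close enough -> scored at the level
```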
-
Patent number: 8517738
Abstract: A computer-implemented method, system, and computer program product for automatically assessing text difficulty. Text reading difficulty predictions are expressed on a scale that is aligned with published reading standards. Two distinct difficulty models are provided for informational and literary texts. A principal components analysis implemented on a large collection of texts is used to develop independent variables accounting for strong intercorrelations exhibited by many important linguistic features. Multiple dimensions of text variation are addressed, including new dimensions beyond syntactic complexity and semantic difficulty. Feedback about text difficulty is provided in a hierarchically structured format designed to support successful text adaptation efforts.
Type: Grant
Filed: January 30, 2009
Date of Patent: August 27, 2013
Assignee: Educational Testing Service
Inventors: Kathleen Marie Sheehan, Irene Kostin, Yoko Futagi
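The principal components step can be sketched as follows: strongly intercorrelated linguistic features measured on many texts are reduced to a few independent components, which can then serve as predictors in separate difficulty models for informational and literary texts. The feature matrix below is random placeholder data, not the features used in the patent.

```python
import numpy as np
from sklearn.decomposition import PCA

# Minimal sketch: reduce correlated linguistic features to independent
# principal components for use as difficulty-model predictors.

rng = np.random.default_rng(0)
n_texts, n_features = 200, 12            # e.g., sentence length, rare-word rate, ...
features = rng.normal(size=(n_texts, n_features))
features[:, 1] = features[:, 0] * 0.9 + rng.normal(scale=0.1, size=n_texts)  # correlated pair

pca = PCA(n_components=4)
components = pca.fit_transform(features)  # independent predictors for a difficulty model
print(pca.explained_variance_ratio_.round(2))
```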