Patents Assigned to Educational Testing Service
-
Patent number: 9087519
Abstract: Systems and methods are provided for scoring speech. A speech sample is received, where the speech sample is associated with a script. The speech sample is aligned with the script. An event recognition metric of the speech sample is extracted, and locations of prosodic events are detected in the speech sample based on the event recognition metric. The locations of the detected prosodic events are compared with locations of model prosodic events, where the locations of model prosodic events identify expected locations of prosodic events of a fluent, native speaker speaking the script. A prosodic event metric is calculated based on the comparison, and the speech sample is scored using a scoring model based upon the prosodic event metric.
Type: Grant
Filed: March 20, 2012
Date of Patent: July 21, 2015
Assignee: Educational Testing Service
Inventors: Klaus Zechner, Xiaoming Xi
-
Patent number: 8990082
Abstract: A method for scoring non-native speech includes receiving a speech sample spoken by a non-native speaker and performing automatic speech recognition and metric extraction on the speech sample to generate a transcript of the speech sample and a speech metric associated with the speech sample. The method further includes determining whether the speech sample is scorable or non-scorable based upon the transcript and speech metric, where the determination is based on an audio quality of the speech sample, an amount of speech of the speech sample, a degree to which the speech sample is off-topic, whether the speech sample includes speech from an incorrect language, or whether the speech sample includes plagiarized material. When the sample is determined to be non-scorable, an indication of non-scorability is associated with the speech sample. When the sample is determined to be scorable, the sample is provided to a scoring model for scoring.
Type: Grant
Filed: March 23, 2012
Date of Patent: March 24, 2015
Assignee: Educational Testing Service
Inventors: Su-Youn Yoon, Derrick Higgins, Klaus Zechner, Shasha Xie, Je Hun Jeon, Keelan Evanini
-
Patent number: 8908998
Abstract: A computer-implemented method and system for ensuring quality control over a group of test reports, comprising identifying a plurality of areas within a test report that contain data elements, performing optical character recognition on the plurality of areas for each test report in the group of test reports to generate text corresponding to the content in the areas, comparing the content in the areas to corresponding data in a data file from a trusted source, and creating an output report based on the result of the comparison. The test reports may indicate educational test score results and personal identification information for test takers.
Type: Grant
Filed: December 5, 2008
Date of Patent: December 9, 2014
Assignee: Educational Testing Service
Inventor: Gary Driscoll
-
Patent number: 8909127
Abstract: Systems and methods are provided for carrying out an examination with a client computer. A client computer is booted using a secondary operating system other than a primary operating system of the client computer that is accessed from a computer readable medium provided by an administrator of an examination only on the day of the examination. A broadcast message is sent from the client computer to check for the existence of a local server and to establish communication with the local server. A client application is received from the local server for presenting the examination at the client computer. Assessment content is received for the examination from the local server. The examination including the assessment content is presented to a user of the client computer with the client application, and user responses are received at the client computer and transmitted to the local server.
Type: Grant
Filed: September 26, 2012
Date of Patent: December 9, 2014
Assignee: Educational Testing Service
Inventors: Wilson Gerald Toussaint, Jr., Kenneth H. Berger, Diana Wright Cano, Debra Pisacreta, Brent Bridgeman
-
Publication number: 20140343923
Abstract: A computer-implemented method of training an assessment model for assessing constructed texts expressing opinions on subjects includes accessing a plurality of training texts, which are constructed texts. The training texts are analyzed with the processing system to derive values of a plurality of linguistic features of an assessment model. At least one of the plurality of linguistic features relates to sentiment and at least one of the plurality of linguistic features relates to specificity. The assessment model is trained with the processing system based on the values of the plurality of linguistic features. Based on the training, a weight for each of the plurality of linguistic features is determined. The assessment model is calibrated to include the weights for at least some of the plurality of linguistic features such that the assessment model is configured to generate assessment measures for constructed texts expressing opinions on subjects.
Type: Application
Filed: May 16, 2014
Publication date: November 20, 2014
Applicant: Educational Testing Service
Inventors: Michael Heilman, F. Jay Breyer, Michael Flor
-
Patent number: 8892421
Abstract: Systems and methods are provided for determining a difficulty level of a text. A determination is made as to a number of cohesive devices present in a text. A further determination is made as to a number of cohesive devices expected in the text. A cohesiveness metric is calculated based on the number of cohesive devices present in the text and the number of cohesive devices expected in the text, where the cohesiveness metric is used to identify a difficulty level of the text.
Type: Grant
Filed: December 7, 2011
Date of Patent: November 18, 2014
Assignee: Educational Testing Service
Inventors: Kathleen Marie Sheehan, Irene Kostin, Yoko Futagi
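The present-versus-expected comparison in this abstract can be sketched as a simple ratio. Everything specific below (the device list, the expected rate per 100 words, the function names) is an illustrative assumption, not taken from the patent:

```python
# Illustrative sketch: compare cohesive devices observed in a text with
# the count expected from its length. The device list and the expected
# rate are invented for this example.
COHESIVE_DEVICES = {"however", "therefore", "moreover", "because",
                    "although", "thus", "furthermore", "consequently"}

EXPECTED_DEVICES_PER_100_WORDS = 2.0  # assumed baseline rate


def cohesiveness_metric(text: str) -> float:
    """Ratio of observed to expected cohesive devices; values far from 1
    suggest unusually sparse or dense cohesion."""
    words = text.lower().split()
    present = sum(1 for w in words if w.strip(".,;:") in COHESIVE_DEVICES)
    expected = len(words) / 100.0 * EXPECTED_DEVICES_PER_100_WORDS
    return present / expected if expected else 0.0
```

A downstream difficulty model could then treat this ratio as one feature among many.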
-
Patent number: 8888493
Abstract: A computer-implemented method, system, and computer program product for automatically assessing text difficulty. Text reading difficulty predictions are expressed on a scale that is aligned with published reading standards. Two distinct difficulty models are provided for informational and literary texts. A principal components analysis implemented on a large collection of texts is used to develop independent variables accounting for strong intercorrelations exhibited by many important linguistic features. Multiple dimensions of text variation are addressed, including new dimensions beyond syntactic complexity and semantic difficulty. Feedback about text difficulty is provided in a hierarchically structured format designed to support successful text adaptation efforts.
Type: Grant
Filed: July 24, 2013
Date of Patent: November 18, 2014
Assignee: Educational Testing Service
Inventors: Kathleen Marie Sheehan, Irene Kostin, Yoko Futagi
-
Publication number: 20140317032
Abstract: Systems and methods are described for generating a scoring model for responses. A computer-implemented method of calibrating a scoring model using a processing system for scoring examinee responses includes accessing a plurality of training responses for training the scoring model. The plurality of training responses are analyzed to derive values of multiple features (variables) of the training responses. The scoring model is trained based on the values of the multiple features of the training responses and one or more external measures of proficiency for each individual associated with a training response utilized in the training. The one or more external measures are not derived from the training responses. Based on the training, a weight for each of the multiple features is determined. The scoring model is calibrated to include the weights for at least some of the features for scoring examinee responses.
Type: Application
Filed: April 21, 2014
Publication date: October 23, 2014
Applicant: Educational Testing Service
Inventors: Shelby J. Haberman, Mo Zhang
-
Publication number: 20140302469
Abstract: Systems and methods are described for providing a multi-modal evaluation of a presentation. A system includes a motion capture device configured to detect motion of an examinee giving a presentation and an audio recording device configured to capture audio of the examinee giving the presentation. One or more data processors are configured to extract a non-verbal feature of the presentation based on data collected by the motion capture device and an audio feature of the presentation based on data collected by the audio recording device. The one or more data processors are further configured to generate a presentation score based on the non-verbal feature and the audio feature.
Type: Application
Filed: April 8, 2014
Publication date: October 9, 2014
Applicant: Educational Testing Service
Inventors: Lei Chen, Gary Feng, Chee Wee Leong, Christopher Kitchen, Chong Min Lee
-
Publication number: 20140297277
Abstract: Systems and methods are provided for scoring spoken language in multiparty conversations. A computer receives a conversation between an examinee and at least one interlocutor. The computer selects a portion of the conversation. The portion includes one or more examinee utterances and one or more interlocutor utterances. The computer assesses the portion using one or more metrics, such as: a pragmatic metric for measuring a pragmatic fit of the one or more examinee utterances; a speech act metric for measuring a speech act appropriateness of the one or more examinee utterances; a speech register metric for measuring a speech register appropriateness of the one or more examinee utterances; and an accommodation metric for measuring a level of accommodation of the one or more examinee utterances. The computer computes a final score for the portion of the conversation based on the one or more metrics applied.
Type: Application
Filed: March 26, 2014
Publication date: October 2, 2014
Applicant: Educational Testing Service
Inventors: Klaus Zechner, Keelan Evanini
-
Publication number: 20140295387
Abstract: Systems and methods are provided for scoring a constructed response. The constructed response is processed according to a set of grammar rules to generate a data structure. The grammar rules specify a set of preferred responses for the item. The grammar rules utilize a plurality of variables that specify legitimate word patterns for the constructed response. It is determined whether the data structure indicates that the constructed response is included in the set of preferred responses, and if so, a maximum score is assigned to the constructed response. If the data structure indicates that the constructed response is not included in the set of preferred responses, a partial credit score for the constructed response is determined by assessing from the data structure which ones of the concepts represented by the variables are present in the constructed response. The partial credit score is assigned based on the presence of the concepts.
Type: Application
Filed: March 27, 2014
Publication date: October 2, 2014
Applicant: Educational Testing Service
Inventors: Michael Heilman, Daniel Blanchard
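The full-credit-then-partial-credit logic above can be illustrated with a much-simplified sketch. The concept patterns, sample item, and scoring scale below are invented for illustration; the patent's grammar-rule data structures are far richer than plain regular expressions:

```python
# Illustrative sketch: exact match against preferred responses gives the
# maximum score; otherwise partial credit accrues per concept detected.
import re

CONCEPTS = {  # variable name -> pattern of legitimate wordings (assumed)
    "cause": re.compile(r"\b(greenhouse gases|co2|carbon dioxide)\b", re.I),
    "effect": re.compile(r"\b(warming|temperature rise)\b", re.I),
}
PREFERRED = {"greenhouse gases cause global warming"}  # assumed item key
MAX_SCORE = 2


def score_response(response: str) -> int:
    normalized = response.lower().strip()
    if normalized in PREFERRED:  # preferred response -> maximum score
        return MAX_SCORE
    # partial credit: one point per concept present in the response
    return sum(1 for pat in CONCEPTS.values() if pat.search(response))
```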
-
Publication number: 20140295400
Abstract: Systems and methods are described for providing an assessment of a conversational aptitude of a test taker. A system includes a computer-readable medium configured for storage of a conversational aptitude assessment data structure. A conversational aptitude assessment data structure includes conversation cycle data records describing a plurality of conversation cycles between a virtual personality and the test taker, where a conversation cycle data record includes a virtual personality script and a plurality of model test taker responses and associated cycle links, where each cycle link identifies a next conversation cycle data record. A data processor is configured to access a first conversation cycle data record, determine the model test taker response with which a test taker response is most similar, and select a next conversation cycle data record identified with the cycle link associated with the most similar model test taker response.
Type: Application
Filed: March 27, 2014
Publication date: October 2, 2014
Applicant: Educational Testing Service
Inventors: Diego Zapata-Rivera, Youngsoon So, Lei Liu
-
Publication number: 20140295399
Abstract: Systems and methods are provided for determining a susceptibility of a computer-implemented automated scoring engine to gaming strategies. A plurality of responses to a prompt are provided to a computer-implemented automated scoring engine to receive a first set of scores. A first transformation is performed on each of the plurality of responses to generate a first set of transformed responses. The first set of transformed responses is provided to the computer-implemented automated scoring engine to receive a second set of scores, and a gaming susceptibility metric is determined based on the first set of scores and the second set of scores.
Type: Application
Filed: March 26, 2014
Publication date: October 2, 2014
Applicant: Educational Testing Service
Inventors: Derrick Higgins, Isaac Bejar, Michael Heilman, Yoko Futagi, Michael Flor
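The score-before/score-after comparison described above can be sketched as follows. The toy word-count engine, the word-shuffle transformation, and the mean-absolute-change metric are all illustrative assumptions standing in for a real scoring engine and for whatever metric the patent specifies:

```python
# Illustrative sketch: apply a meaning-destroying transformation to each
# response, rescore, and measure how little the scores move. A change
# near zero suggests the engine ignores meaning and is easy to game.
import random


def toy_engine(response: str) -> float:
    return float(len(response.split()))  # length-only engine (assumed)


def shuffle_words(response: str, seed: int = 0) -> str:
    words = response.split()
    random.Random(seed).shuffle(words)  # destroys meaning, keeps length
    return " ".join(words)


def gaming_susceptibility(responses, engine, transform) -> float:
    """Mean absolute score change between original and transformed sets."""
    first = [engine(r) for r in responses]
    second = [engine(transform(r)) for r in responses]
    return sum(abs(a - b) for a, b in zip(first, second)) / len(responses)
```

Here the length-only engine scores shuffled responses identically, so the metric is zero: maximally susceptible to word-salad gaming.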
-
Publication number: 20140288915
Abstract: Systems and methods are provided for correcting a grammatical error in a text sequence. A first text sequence in a first language is received. The first text sequence is translated to a second language to provide a first translated text. The first text sequence is translated to a third language to provide a second translated text. The third language is different from the second language. The first translated text is translated to the first language to provide a first back translation. The second translated text is translated to the first language to provide a second back translation. A plurality of candidate text sequences that include features of the first back translation and the second back translation are determined. The plurality of candidate text sequences include alternative grammatical options for the first text sequence. The plurality of candidate text sequences are scored with the processing system.Type: Application
Filed: March 19, 2014
Publication date: September 25, 2014
Applicant: Educational Testing Service
Inventors: Nitin Madnani, Joel Tetreault, Martin Chodorow
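The candidate-generation step, combining features of two back translations, can be sketched in miniature. Real translation systems are replaced here by precomputed back-translation strings, the equal-length alignment is a simplifying assumption, and the scoring function is a placeholder for the patent's scoring model:

```python
# Illustrative sketch: build candidate corrections by crossing the word
# choices of two back translations wherever they disagree, then pick the
# candidate a (stand-in) scorer prefers.
from itertools import product


def candidates_from_backtranslations(bt1: str, bt2: str):
    """Cross word options at each position (assumes equal token counts)."""
    options = [{a, b} for a, b in zip(bt1.split(), bt2.split())]
    return {" ".join(choice) for choice in product(*options)}


def best_candidate(candidates, score):
    """Return the candidate the scoring function rates highest."""
    return max(candidates, key=score)
```

In practice the scorer would be something like a language model; here any callable that ranks candidates will do.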
-
Publication number: 20140278376
Abstract: Computer-implemented systems and methods are provided for automatically generating recitation items. For example, a computer performing the recitation item generation can receive one or more text sets that each includes one or more texts. The computer can determine a value for each text set using one or more metrics, such as a vocabulary difficulty metric, a syntactic complexity metric, a phoneme distribution metric, a phonetic difficulty metric, and a prosody distribution metric. Then the computer can select a final text set based on the value associated with each text set. The selected final text set can be used as the recitation items for a speaking assessment test.
Type: Application
Filed: March 17, 2014
Publication date: September 18, 2014
Applicant: Educational Testing Service
Inventors: Su-Youn Yoon, Lei Chen, Keelan Evanini, Klaus Zechner
-
Publication number: 20140279763
Abstract: In accordance with the teachings described herein, systems and methods are provided for measuring a user's comprehension of subject matter of a text. A summary generated by the user is received, where the summary summarizes the text. The summary is processed to determine a first numerical measure indicative of a similarity between the summary and a reference summary. The summary is processed to determine a second numerical measure indicative of a degree to which a single sentence of the summary summarizes an entirety of the text. The summary is processed to determine a third numerical measure indicative of a degree of copying in the summary of multi-word sequences present in the text. A numerical model is applied to the first numerical measure, the second numerical measure and the third numerical measure to determine a score for the summary indicative of the user's comprehension of the subject matter of the text.
Type: Application
Filed: March 18, 2014
Publication date: September 18, 2014
Applicant: Educational Testing Service
Inventors: Nitin Madnani, Jill Burstein
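Two of the three measures above, reference similarity and multi-word copying, can be sketched with simple set operations. The Jaccard word overlap and the trigram window size are illustrative choices; the patent does not specify these particular formulas:

```python
# Illustrative sketch of two summary measures:
#   word_overlap  - similarity between summary and reference summary
#   copying_rate  - fraction of the summary's n-grams copied verbatim
#                   from the source text (n=3 is an assumed window)


def word_overlap(summary: str, reference: str) -> float:
    s, r = set(summary.lower().split()), set(reference.lower().split())
    return len(s & r) / len(s | r) if s | r else 0.0


def copying_rate(summary: str, source: str, n: int = 3) -> float:
    def ngrams(text):
        w = text.lower().split()
        return {tuple(w[i:i + n]) for i in range(len(w) - n + 1)}
    s_ngrams = ngrams(summary)
    return len(s_ngrams & ngrams(source)) / len(s_ngrams) if s_ngrams else 0.0
```

A trained numerical model would then combine these measures (plus the single-sentence coverage measure) into a comprehension score.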
-
Publication number: 20140255886
Abstract: Computer-implemented systems and methods are provided for automatically scoring the content of moderately predictable responses. For example, a computer performing the content scoring analysis can receive a response (either in text or spoken form) to a prompt. The computer can determine the content correctness of the response by analyzing one or more content features. One of the content features is analyzed by applying one or more regular expressions, determined based on training responses associated with the prompt. Another content feature is analyzed by applying one or more context free grammars, determined based on training responses associated with the prompt. Another content feature is analyzed by applying a keyword list, determined based on the test prompt eliciting the response and/or stimulus material. Another content feature is analyzed by applying one or more probabilistic n-gram models, determined based on training responses associated with the prompt.
Type: Application
Filed: March 6, 2014
Publication date: September 11, 2014
Applicant: Educational Testing Service
Inventors: Xinhao Wang, Klaus Zechner, Shasha Xie
-
Publication number: 20140234810
Abstract: Computer-implemented systems and methods are provided for determining a document's complexity. For example, a computer performing the complexity analysis can receive a document. The computer can determine the content words within the document and determine an association measure for each group of content words. An association profile can be created for the document using the association measures. The computer can use the association profile to determine the complexity of the document. The complexity of the document may correspond to the document's suitable reading level or, if the document is an essay, an essay score.
Type: Application
Filed: February 14, 2014
Publication date: August 21, 2014
Applicant: Educational Testing Service
Inventors: Michael Flor, Beata Beigman Klebanov
-
Patent number: 8798518
Abstract: A method and system for estimating uncalibrated task performance are disclosed.
Type: Grant
Filed: June 29, 2005
Date of Patent: August 5, 2014
Assignee: Educational Testing Service
Inventors: Russell G. Almond, Sandip Sinharay, Linda Steinberg, Robert J. Mislevy
-
Publication number: 20140199676
Abstract: Computer-implemented systems and methods are provided for scoring content of a spoken response to a prompt. A scoring model is generated for a prompt, where generating the scoring model includes generating a transcript for each of a plurality of training responses to the prompt, dividing the plurality of training responses into clusters based on the transcripts of the training responses, selecting a subset of the training responses in each cluster for scoring, scoring the selected subset of training responses for each cluster, and generating content training vectors using the transcripts from the scored subset. A transcript is generated for a received spoken response to be scored, and a similarity metric is computed between the transcript of the spoken response to be scored and the content training vectors. A score is assigned to the spoken response based on the determined similarity metric.
Type: Application
Filed: January 10, 2014
Publication date: July 17, 2014
Applicant: Educational Testing Service
Inventors: Lei Chen, Klaus Zechner, Anastassia Loukina
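The final similarity-and-scoring step above can be sketched with term-frequency vectors and cosine similarity. The ASR transcription and clustering steps are assumed to have already produced one content training vector per score point; the vector representation and cosine metric are illustrative choices, not necessarily the patent's:

```python
# Illustrative sketch: represent transcripts as term-frequency vectors
# and assign the score whose content training vector is most similar.
from collections import Counter
from math import sqrt


def tf_vector(transcript: str) -> Counter:
    return Counter(transcript.lower().split())


def cosine(u: Counter, v: Counter) -> float:
    dot = sum(u[t] * v[t] for t in u)
    nu = sqrt(sum(c * c for c in u.values()))
    nv = sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0


def assign_score(transcript: str, training_vectors: dict) -> int:
    """training_vectors maps a score point to its content training vector."""
    v = tf_vector(transcript)
    return max(training_vectors, key=lambda s: cosine(v, training_vectors[s]))
```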