Title: A many-facet Rasch analysis of rater effects on an Oral English Proficiency Test.
Author: Yang, Rui.
Record Type: Language materials, printed : Monograph/item
Description: 149 p.
Notes: Source: Dissertation Abstracts International, Volume: 71-09, Section: A, page: 3252.
Contained By: Dissertation Abstracts International, 71-09A.
Subject: Education, Tests and Measurements.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3413993
ISBN: 9781124157115
Thesis (Ph.D.)--Purdue University, 2010.
This study investigates the impact of rater severity, and the stability of rater severity over time, on the scores examinees receive on an Oral English Proficiency Test used to certify International Teaching Assistants at a North American university. Ratings of 434 examinees by 9 raters from the August 2007 test administration and 10 raters from the August 2008 test administration were analyzed using FACETS, a many-facet Rasch analysis program (Linacre, 2008). The study found that the raters demonstrated different levels of severity; however, the impact of rater severity on the test scores was small. About 4% of examinees from the two administrations tested out with observed averages higher than their fair averages. The majority of raters used the scale consistently. One rater, however, demonstrated inconsistency, with an infit statistic slightly above the upper control limit of 1.2. The level of severity for most raters was not invariant, but was acceptable, across the two sessions a year apart. Two raters showed more drift in severity than the Rasch model expects. New raters did not show more variation in severity, or in the stability of their severity, than experienced raters. Somewhat large gaps between adjacent rating categories were identified, suggesting a scale revision that would allow raters to better distinguish the oral proficiency levels of examinees. The study shows that FACETS is a useful tool for studying rater performance; its results can be used to target individual raters in follow-up rater training to help improve rater accuracy. In this way, examinees, as stakeholders, are protected against errors introduced by human raters.
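The analysis the abstract describes can be sketched in code. The many-facet Rasch model in its rating-scale form is log(P_k / P_{k-1}) = ability − rater severity − threshold_k, and a rater's infit mean-square compares squared residuals against model variance, with values above roughly 1.2 flagged as inconsistent. The sketch below is a minimal illustration under those assumptions, not the FACETS program; all function names and the numpy-based implementation are the editor's, not the study's.

```python
import numpy as np

def category_probs(theta, severity, thresholds):
    """Category probabilities under a rating-scale many-facet Rasch model:
    log(P_k / P_{k-1}) = theta - severity - thresholds[k-1]."""
    # Cumulative sums of the step logits give the unnormalized log-probabilities
    # of categories 0..K (category 0 anchored at 0).
    steps = theta - severity - np.asarray(thresholds, dtype=float)
    logits = np.concatenate([[0.0], np.cumsum(steps)])
    p = np.exp(logits - logits.max())  # subtract max for numerical stability
    return p / p.sum()

def infit_msq(observed, abilities, severity, thresholds):
    """Information-weighted mean-square fit for one rater across examinees:
    sum of squared residuals divided by the sum of model variances."""
    cats = np.arange(len(thresholds) + 1)
    resid_sq, var = 0.0, 0.0
    for x, theta in zip(observed, abilities):
        p = category_probs(theta, severity, thresholds)
        expected = np.dot(cats, p)                 # model-expected rating
        variance = np.dot((cats - expected) ** 2, p)
        resid_sq += (x - expected) ** 2
        var += variance
    return resid_sq / var
```

A rater whose infit mean-square from such a computation exceeds the 1.2 control limit would be flagged for inconsistent use of the scale, as in the study.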
MARC Record:
LDR    02752nam 2200313 4500
001    1393886
005    20110415112009.5
008    130515s2010 ||||||||||||||||| ||eng d
020    $a 9781124157115
035    $a (UMI)AAI3413993
035    $a AAI3413993
040    $a UMI $c UMI
100 1  $a Yang, Rui. $3 610549
245 12 $a A many-facet Rasch analysis of rater effects on an Oral English Proficiency Test.
300    $a 149 p.
500    $a Source: Dissertation Abstracts International, Volume: 71-09, Section: A, page: 3252.
500    $a Adviser: April Ginther.
502    $a Thesis (Ph.D.)--Purdue University, 2010.
520    $a This study investigates the impact of rater severity and the stability of rater severity over time on the scores examinees receive on an Oral English Proficiency Test used to certify International Teaching Assistants at a North American university. Ratings from 434 examinees by 9 raters from August 2007 testing administration and 10 raters from August 2008 testing administration were analyzed using FACETS, a multi-faceted Rasch analysis program (Linacre, 2008). The study found that the raters demonstrated different levels of severity. However, the impact of rater severity on the test scores was small. About 4% of examinees from the two testing administrations tested out with observed averages higher than their fair averages. The majority of raters used the scale in a consistent fashion. One rater, however, demonstrated inconsistency with slightly larger infit statistic than the upper control limit of 1.2. The level of severity for most raters was not invariant but acceptable across the two sessions a year apart. Two raters showed drift of severity more than the Rasch model expects. New raters do not show more variation with respect to their severity and stability of severity than experienced raters. Slightly larger gaps between adjacent rating categories were identified, which invites the opportunity for scale revision that would allow raters to better distinguish the oral proficiency levels of the examinees. The study shows that FACETS is a useful tool for studying rater performance. The FACETS results can be used to target individual raters in follow-up rater trainings to help improve rater accuracy. In this way, examinees as stakeholders are protected against errors introduced by human raters.
590    $a School code: 0183.
650  4 $a Education, Tests and Measurements. $3 1017589
650  4 $a Language, Linguistics. $3 1018079
690    $a 0288
690    $a 0290
710 2  $a Purdue University. $b English. $3 1026432
773 0  $t Dissertation Abstracts International $g 71-09A.
790 10 $a Ginther, April, $e advisor
790 10 $a Weiser, Irwin $e committee member
790 10 $a Niepokuj, Mary $e committee member
790 10 $a Silva, Tony $e committee member
790    $a 0183
791    $a Ph.D.
792    $a 2010
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3413993
Holdings (1 record):
Inventory Number: W9157025
Location Name: Electronic Resources
Item Class: 11. Online Reading
Material Type: E-book
Call Number: EB
Usage Class: Normal
Loan Status: On shelf
No. of Reservations: 0