
Journal Update | SSCI Journal Language Assessment Quarterly, 2022, Issue 1

Followed by 40,000 scholars → 语言学心得 2022-12-05

LANGUAGE ASSESSMENT QUARTERLY

Volume 19, Issue 1, 2022

LANGUAGE ASSESSMENT QUARTERLY (SSCI Q2, 2020 IF: 1.667) published six pieces in its 2022 Issue 1: three research articles, one interview, one article commentary, and one book review. The research articles cover differential item functioning (DIF), the evolution of English pronunciation assessment, and a multi-stage adaptive testing approach.

Contents


RESEARCH ARTICLES

■ The Evolution of Assessment in English Pronunciation: The Case of Hong Kong (1978-2018), by Jim Yee Him Chan, Pages 1–26.

■ A Revisit of Zumbo’s Third Generation DIF: How Are We Doing in Language Testing?, by Hongli Li, Charles Vincent Hunter, Jacquelyn Anne Bialo, Pages 27–53.

■ Using Multistage Testing to Enhance Measurement of an English Language Proficiency Test, by David MacGregor, Shu Jing Yen, Xin Yu, Pages 54–75.


INTERVIEW

■ At the Intersection of Language Testing Policy, Practice, and Research: An Interview with Yan Jin, by Jason Fan, Kellie Frost, Pages 76–89.


ARTICLE COMMENTARY

■ The Relationship between Word Difficulty and Frequency: A Response to Hashimoto (2021), by Jeffrey Stewart, Joseph P. Vitta, Christopher Nicklin, Stuart McLean, Geoffrey G. Pinchbeck & Brandon Kramer, Pages 90–101.


BOOK REVIEW

■ Fairness, Justice, and Language Assessment, by Sondoss Elnegahy, Haeyun Jin, Haeun Kim, Pages 102–105.


Abstracts

The Evolution of Assessment in English Pronunciation: The Case of Hong Kong (1978-2018)

Jim Yee Him Chan, The University of Hong Kong

Abstract This study tracked the development of Hong Kong’s assessment practices for English pronunciation over the past four decades, with reference to the nativeness and intelligibility principles in L2 pronunciation research and pedagogy. Specifically, it evaluated changes in assessors’ comments on candidates’ English pronunciation performance in school-exit public examinations between 1978 and 2018. Qualitative and quantitative content analyses were conducted on the examination report for each year to identify themes related to candidates’ pronunciation ‘problems’, including ‘word-based’ features (word pronunciation, word stress, segmentals), ‘discourse-based’ features (suprasegmentals) and ‘delivery’ (clarity, fluency, loudness, naturalness, pacing). In the examination reports, candidates’ problems with word-based features (particularly word pronunciation) received the most attention across the decades. Most of the comments in later reports were aligned with the intelligibility principle, particularly at the segmental level (e.g., missing consonants, simplification of consonant clusters, word pronunciation). These assessment practices were potentially influenced by the teaching methods recommended in the different ELT curricula over time (i.e., from an oral-structural to a communicative/task-based language teaching approach), and also by the assessors’ judgements. The paper concludes by proposing a research agenda for the promotion of an evidence-based approach that can inform future assessment practices.



A Revisit of Zumbo’s Third Generation DIF: How Are We Doing in Language Testing?

Hongli Li, Georgia State University, Atlanta, Georgia, USA

Charles Vincent Hunter, Georgia State University, Atlanta, Georgia, USA

Jacquelyn Anne Bialo, Georgia State University, Atlanta, Georgia, USA

Abstract The purpose of this study is to review the status of differential item functioning (DIF) research in language testing, particularly as it relates to the investigation of sources (or causes) of DIF, which is a defining characteristic of the third generation DIF. This review included 110 DIF studies of language tests dated from 1985 to 2019. We found that DIF researchers did not address sources of DIF more frequently in recent years than in earlier years. Nevertheless, DIF research in language testing has expanded with new DIF analysis procedures, more grouping variables, and more diversified methods for investigating sources of DIF. In addition, in the early years of DIF research, methods to identify sources of DIF relied heavily on content analysis. This review showed that while more sophisticated statistical procedures have been adopted in recent years to address sources of DIF, understanding sources of DIF still remains a challenging task. We also discuss the pros and cons of existing methods to detect sources of DIF and implications for future investigations.



Using Multistage Testing to Enhance Measurement of an English Language Proficiency Test

David MacGregor, WIDA, Wisconsin Center for Education Research, University of Wisconsin, Madison, WI

Shu Jing Yen, Center for Applied Linguistics, Washington DC, United States

Xin Yu, Center for Applied Linguistics, Washington DC, United States

Abstract How can one construct a test that provides accurate measurements across the range of performance levels while providing adequate coverage of all of the critical areas of the domain, yet that is not unmanageably long? This paper discusses the approach taken in a linear test of academic English language, and how the transition to a computer-based test allowed for a design that better fit the demands of the test. It also describes the multi-stage adaptive approach that was devised. This approach allows for a test that covers a broad range of performance levels while including items that assess the language of the content areas as described in the English language development standards underpinning the test. The design also allows for a test that is closely tailored to the ability level of the English learner taking the test, and that therefore produces a more precise measure. The efficacy of the design in enhancing measurement of two versions of a high-stakes English language assessment is explored, and the implications of the results are discussed.



The Relationship between Word Difficulty and Frequency: A Response to Hashimoto (2021)

Jeffrey Stewart, Tokyo University of Science, Tokyo, Japan

Joseph P. Vitta, Kyushu University, Fukuoka, Japan

Christopher Nicklin, Rikkyo University, Tokyo, Japan

Stuart McLean, Momoyamagakuin University, Osaka, Japan

Geoffrey G. Pinchbeck, Carleton University, Ottawa, Canada

Brandon Kramer, Kwansei Gakuin University, Nishinomiya, Japan

Abstract Hashimoto (2021) reported a correlation of −.50 (r^2 = .25) between word frequency rank and difficulty, concluding the construct of modern vocabulary size tests is questionable. In this response we show that the relationship between frequency and difficulty is clear albeit non-linear and demonstrate that if a wider range of frequencies is tested and log transformations are applied, the correlation can approach .80. Finally, while we acknowledge the great promise of knowledge-based word lists, we note that a strong correlation between difficulty and frequency is not, in fact, the primary reason size tests are organized by frequency.
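The log-transformation point in this abstract can be illustrated with a small sketch. The data below are synthetic (not from the paper): difficulty is deliberately made to grow with the logarithm of frequency rank, so correlating difficulty against raw ranks understates the linear association, while correlating against log-transformed ranks recovers it.

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic illustration: difficulty rises with log(rank), i.e. the
# frequency-difficulty relationship is monotonic but non-linear.
ranks = list(range(1, 5001, 50))
difficulty = [math.log(r) for r in ranks]

r_raw = pearson_r(ranks, difficulty)                      # understates the relation
r_log = pearson_r([math.log(r) for r in ranks], difficulty)  # linearized

print(round(r_raw, 2), round(r_log, 2))
```

Under this construction the log-transformed correlation is higher than the raw-rank one, which is the qualitative pattern the response describes.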




About the Journal

Language Assessment Quarterly: An International Journal (LAQ) is dedicated to the advancement of theory, research, and practice in first, second, and foreign language assessment for school, college, and university students; for employment; and for immigration and citizenship. LAQ publishes original articles addressing theoretical issues, empirical research, and professional standards and ethics related to language assessment, as well as interdisciplinary articles on related topics, and reports of language test development and testing practice. All articles are peer-reviewed. Language Assessment Quarterly accepts the following types of article: Full-length articles, Commentary, Book Reviews, Test Reviews, Interviews and Practical Advice. The journal is directed to an international audience. 




Examples of topic areas appropriate for LAQ include:

  • assessment from around the world at all instructional levels including specific purposes;

  • assessment for immigration and citizenship and other ‘gate-keeping’ contexts;

  • issues of validity, reliability, fairness, access, accommodations, administration, and legal remedies;

  • assessment in culturally and/or linguistically diverse populations;

  • professional standards and ethical practices for assessment professionals;

  • interdisciplinary interfaces between language assessment and learning;

  • issues related to technology and computer-based assessment;

  • innovative and practical methods and techniques in developing assessment instruments;

  • recent trends in analysis of performance; and

  • issues of social-political and socio-economic concern to assessment professionals.




Official website:

https://www.tandfonline.com/journals/hlaq20

Source: the LANGUAGE ASSESSMENT QUARTERLY official website







Past Recommendations

Journal Update | 《世界汉语教学》 2022 No. 2

Journal Update | SSCI journal System, 2022, Vol. 105

Journal Update | 《四川师范大学学报》 "International Chinese Education" column (2021)

Journal Update | 《中国文字》 2022 No. 1

Journal Update | SSCI journal Linguistics and Philosophy, 2022, Issue 1

Journal Update | 《解放军外国语学院学报》 2022 No. 2 (comment to receive a free copy)

Journal Update | SSCI journal Linguistics and Education, 2022, Vol. 67

Journal Update | 《国际汉语文化研究》 2021, Vol. 6

Journal Update | 《天津师范大学学报》 "International Chinese Education" column (2021)


Join Us

The "语言学心得交流分享群" (Linguistics Insights sharing group) and the "语言学考博/考研/保研交流群" (PhD/postgraduate admissions group)


To join, add "心得君" on WeChat and include the note "school + research area"

Today's editor: 栗子

Reviewer: 心得小蔓

For reposting and collaboration, please contact

"心得君"

WeChat: xindejun_yyxxd

