  Subjects -> EDUCATION (Total: 1742 journals)
    - ADULT EDUCATION (24 journals)
    - COLLEGE AND ALUMNI (9 journals)
    - E-LEARNING (22 journals)
    - EDUCATION (1455 journals)
    - HIGHER EDUCATION (117 journals)
    - INTERNATIONAL EDUCATION PROGRAMS (4 journals)
    - ONLINE EDUCATION (28 journals)
    - SCHOOL ORGANIZATION (12 journals)
    - SPECIAL EDUCATION AND REHABILITATION (34 journals)
    - TEACHING METHODS AND CURRICULUM (37 journals)

EDUCATION (1455 journals)

Showing 1 - 200 of 857 Journals sorted alphabetically
#Tear : Revista de Educação, Ciência e Tecnologia     Open Access   (Followers: 1)
(Pensamiento), (palabra) y obra     Open Access  
@tic. revista d'innovació educativa     Open Access  
Abant İzzet Baysal Üniversitesi Eğitim Fakültesi Dergisi     Open Access  
About Campus     Hybrid Journal   (Followers: 5)
Academic Medicine     Full-text available via subscription   (Followers: 58)
Academic Psychiatry     Full-text available via subscription   (Followers: 22)
Academic Questions     Hybrid Journal   (Followers: 7)
Academy of Educational Leadership Journal     Full-text available via subscription   (Followers: 53)
Academy of Management Learning and Education     Full-text available via subscription   (Followers: 49)
Accounting & Finance     Hybrid Journal   (Followers: 46)
Accounting Education: An International Journal     Hybrid Journal   (Followers: 14)
ACM Transactions on Computing Education (TOCE)     Hybrid Journal   (Followers: 3)
Across the Disciplines     Open Access   (Followers: 7)
Acta Didactica Norge     Open Access  
Acta Scientiarum. Education     Open Access  
Acta Technologica Dubnicae     Open Access  
Action in Teacher Education     Hybrid Journal   (Followers: 56)
Action Learning: Research and Practice     Hybrid Journal   (Followers: 38)
Action Research     Hybrid Journal   (Followers: 39)
Active Learning in Higher Education     Hybrid Journal   (Followers: 238)
Actualidades Pedagógicas     Open Access  
Administration & Society     Hybrid Journal   (Followers: 11)
Administrative Science Quarterly     Full-text available via subscription   (Followers: 152)
Adult Education Quarterly     Hybrid Journal   (Followers: 173)
Advanced Education     Open Access   (Followers: 4)
Advances in Health Sciences Education     Hybrid Journal   (Followers: 22)
Advances in High Energy Physics     Open Access   (Followers: 19)
Advances in School Mental Health Promotion     Partially Free   (Followers: 9)
AERA Open     Open Access   (Followers: 6)
Africa Education Review     Partially Free   (Followers: 24)
African Journal of Chemical Education     Open Access   (Followers: 2)
African Journal of Educational Studies in Mathematics and Sciences     Full-text available via subscription   (Followers: 5)
African Journal of Health Professions Education     Open Access   (Followers: 5)
African Journal of Research in Mathematics, Science and Technology Education     Full-text available via subscription   (Followers: 8)
Agora     Full-text available via subscription   (Followers: 3)
AGORA Magazine     Open Access  
Ahmad Dahlan Journal of English Studies     Open Access   (Followers: 1)
AIDS Education and Prevention     Full-text available via subscription   (Followers: 7)
Akadémiai Értesítő     Full-text available via subscription  
AKSIOMA Journal of Mathematics Education     Open Access   (Followers: 1)
Al-Jabar : Jurnal Pendidikan Matematika     Open Access  
Alexandria : Revista de Educação em Ciência e Tecnologia     Open Access  
Alsic     Open Access   (Followers: 18)
Alteridad     Open Access  
Amasya Universitesi Egitim Fakültesi Dergisi     Open Access  
American Annals of the Deaf     Full-text available via subscription   (Followers: 13)
American Biology Teacher     Full-text available via subscription   (Followers: 13)
American Educational Research Journal     Hybrid Journal   (Followers: 143)
American Journal of Business Education     Open Access   (Followers: 10)
American Journal of Distance Education     Hybrid Journal   (Followers: 28)
American Journal of Education     Full-text available via subscription   (Followers: 166)
American Journal of Educational Research     Open Access   (Followers: 55)
American Journal of Health Education     Hybrid Journal   (Followers: 28)
American Journal of Physics     Full-text available via subscription   (Followers: 54)
ANALES de la Universidad Central del Ecuador     Open Access   (Followers: 1)
Annali dell'Universita di Ferrara     Hybrid Journal  
Annals of Dyslexia     Hybrid Journal   (Followers: 9)
Annals of Modern Education     Full-text available via subscription   (Followers: 3)
Apertura. Revista de innovación educativa     Open Access   (Followers: 2)
Applied Environmental Education & Communication     Hybrid Journal   (Followers: 13)
Applied Measurement in Education     Hybrid Journal   (Followers: 9)
Arabia     Open Access  
Art Design & Communication in Higher Education     Hybrid Journal   (Followers: 19)
Arts and Humanities in Higher Education     Hybrid Journal   (Followers: 33)
Arts Education Policy Review     Hybrid Journal   (Followers: 4)
ASHE Higher Education Reports     Hybrid Journal   (Followers: 14)
Asia Pacific Education Review     Hybrid Journal   (Followers: 9)
Asia Pacific Journal of Education     Hybrid Journal   (Followers: 18)
Asia-Pacific Education Researcher     Hybrid Journal   (Followers: 11)
Asia-Pacific Journal of Health, Sport and Physical Education     Hybrid Journal   (Followers: 9)
Asia-Pacific Journal of Teacher Education     Hybrid Journal   (Followers: 18)
Asian Association of Open Universities Journal     Open Access  
Asian Education and Development Studies     Hybrid Journal   (Followers: 5)
Asian Journal of English Language Teaching     Full-text available via subscription   (Followers: 11)
Asian Journal of Legal Education     Full-text available via subscription   (Followers: 5)
ASp     Open Access   (Followers: 2)
Assessing Writing     Hybrid Journal   (Followers: 11)
Assessment & Evaluation in Higher Education     Hybrid Journal   (Followers: 125)
Assessment for Effective Intervention     Hybrid Journal   (Followers: 15)
Assessment in Education: Principles, Policy & Practice     Hybrid Journal   (Followers: 36)
Assessment Update     Hybrid Journal   (Followers: 4)
AStA Wirtschafts- und Sozialstatistisches Archiv     Hybrid Journal   (Followers: 5)
At-Ta'dib Jurnal Kependidikan Islam     Open Access  
At-Tajdid : Jurnal Ilmu Tarbiyah     Open Access   (Followers: 2)
At-Turats     Open Access  
Athenea Digital     Open Access  
Aula Abierta     Open Access   (Followers: 1)
Australasian Journal of Educational Technology     Open Access   (Followers: 14)
Australasian Journal of Gifted Education     Full-text available via subscription   (Followers: 4)
Australasian Marketing Journal (AMJ)     Hybrid Journal   (Followers: 8)
Australian Art Education     Full-text available via subscription   (Followers: 6)
Australian Educational and Developmental Psychologist, The     Full-text available via subscription   (Followers: 6)
Australian Educational Computing     Open Access  
Australian Educational Researcher     Hybrid Journal   (Followers: 19)
Australian Journal of Adult Learning     Full-text available via subscription   (Followers: 12)
Australian Journal of Career Development     Hybrid Journal   (Followers: 2)
Australian Journal of Dyslexia and Learning Difficulties     Full-text available via subscription   (Followers: 8)
Australian Journal of Education     Hybrid Journal   (Followers: 30)
Australian Journal of Learning Difficulties     Hybrid Journal   (Followers: 4)
Australian Journal of Music Education     Full-text available via subscription   (Followers: 3)
Australian Journal of Public Administration     Hybrid Journal   (Followers: 404)
Australian Journal of Teacher Education     Open Access   (Followers: 21)
Australian Mathematics Teacher, The     Full-text available via subscription   (Followers: 7)
Australian Primary Mathematics Classroom     Full-text available via subscription   (Followers: 2)
Australian Screen Education Online     Full-text available via subscription   (Followers: 2)
Australian TAFE Teacher     Full-text available via subscription   (Followers: 2)
Australian Universities' Review, The     Full-text available via subscription   (Followers: 3)
Autism     Hybrid Journal   (Followers: 192)
Avaliação : Revista da Avaliação da Educação Superior (Campinas)     Open Access  
Azalea: Journal of Korean Literature & Culture     Full-text available via subscription   (Followers: 4)
Balkan Region Conference on Engineering and Business Education     Open Access   (Followers: 1)
BELIA : Early Childhood Education Papers     Open Access   (Followers: 4)
BELT - Brazilian English Language Teaching Journal     Open Access   (Followers: 4)
Berkeley Review of Education     Open Access   (Followers: 4)
Biblioteca Escolar em Revista     Open Access  
Biblioteka i Edukacja     Open Access   (Followers: 4)
Bildung und Erziehung     Hybrid Journal   (Followers: 2)
Bioedukasi : Jurnal Pendidikan Biologi FKIP UM Metro     Open Access  
Biosaintifika : Journal of Biology & Biology Education     Open Access   (Followers: 7)
Biosfer : Jurnal Biologi dan Pendidikan Biologi     Open Access  
BMC Medical Education     Open Access   (Followers: 41)
BMJ Simulation & Technology Enhanced Learning     Full-text available via subscription   (Followers: 7)
BoEM - Boletim online de Educação Matemática     Open Access  
Boletim Cearense de Educação e História da Matemática     Open Access  
Boletim de Educação Matemática     Open Access  
British Educational Research Journal     Hybrid Journal   (Followers: 168)
British Journal of Educational Studies     Hybrid Journal   (Followers: 189)
British Journal of Educational Technology     Hybrid Journal   (Followers: 125)
British Journal of Religious Education     Hybrid Journal   (Followers: 8)
British Journal of Sociology of Education     Hybrid Journal   (Followers: 45)
British Journal of Special Education     Hybrid Journal   (Followers: 39)
British Journal of Visual Impairment     Hybrid Journal   (Followers: 12)
Brookings Trade Forum     Full-text available via subscription   (Followers: 3)
Business, Management and Education     Open Access   (Followers: 18)
Caderno Brasileiro de Ensino de Física     Open Access  
Caderno Intersabares     Open Access  
Cadernos CEDES     Open Access   (Followers: 1)
Cadernos de Educação     Open Access  
Cadernos de Educação, Tecnologia e Sociedade     Open Access  
Cadernos de Pesquisa     Open Access  
Cadernos de Pesquisa     Open Access   (Followers: 2)
Cadernos de Pesquisa em Educação     Open Access   (Followers: 1)
Cadmo     Full-text available via subscription  
Cahiers de la recherche sur l'éducation et les savoirs     Open Access   (Followers: 4)
Calidad en la educación     Open Access   (Followers: 1)
Cambridge Journal of Education     Hybrid Journal   (Followers: 96)
Campus Legal Advisor     Hybrid Journal   (Followers: 2)
Campus Security Report     Hybrid Journal  
Canadian and International Education     Open Access   (Followers: 8)
Canadian Journal for New Scholars in Education/ Revue canadienne des jeunes chercheures et chercheurs en éducation     Open Access   (Followers: 7)
Canadian Journal for the Scholarship of Teaching and Learning     Open Access   (Followers: 13)
Canadian Journal of Education : Revue canadienne de l'éducation     Open Access   (Followers: 5)
Canadian Journal of Higher Education     Open Access   (Followers: 23)
Canadian Journal of Learning and Technology / La revue canadienne de l’apprentissage et de la technologie     Open Access   (Followers: 14)
Canadian Journal of School Psychology     Hybrid Journal   (Followers: 9)
Canadian Journal of Science, Mathematics and Technology Education     Hybrid Journal   (Followers: 18)
Career Development and Transition for Exceptional Individuals     Hybrid Journal   (Followers: 7)
Catalejos. Revista sobre lectura, formación de lectores y literatura para niños     Open Access  
Catharsis : Journal of Arts Education     Open Access  
CELE Exchange, Centre for Effective Learning Environments     Full-text available via subscription   (Followers: 1)
Cendekia : Jurnal Kependidikan dan Kemasyarakatan     Open Access  
Change: The Magazine of Higher Learning     Hybrid Journal   (Followers: 14)
Changing English: Studies in Culture and Education     Hybrid Journal   (Followers: 6)
Charrette     Open Access  
Chemical Engineering Education     Full-text available via subscription  
Chemistry Education Research and Practice     Free   (Followers: 5)
Chemistry in Education     Open Access   (Followers: 9)
Chi'e : Journal of Japanese Learning and Teaching     Open Access   (Followers: 2)
Child Language Teaching and Therapy     Hybrid Journal   (Followers: 25)
Child Psychiatry & Human Development     Hybrid Journal   (Followers: 9)
Childhood Education     Hybrid Journal   (Followers: 15)
Children's Literature in Education     Hybrid Journal   (Followers: 8)
Chinese Education & Society     Full-text available via subscription   (Followers: 2)
Christian Higher Education     Hybrid Journal   (Followers: 2)
Christian Perspectives in Education     Open Access   (Followers: 7)
Ciência & Educação (Bauru)     Open Access  
Ciência & Saúde Coletiva     Open Access   (Followers: 2)
Ciencia en Desarrollo     Open Access  
Ciencias Sociales y Educación     Open Access   (Followers: 2)
Citizenship, Social and Economics Education     Full-text available via subscription   (Followers: 5)
Classroom Discourse     Hybrid Journal   (Followers: 8)
Clinical Child and Family Psychology Review     Hybrid Journal   (Followers: 7)
Clio y Asociados     Open Access  
CME     Hybrid Journal   (Followers: 1)
Coaching: An International Journal of Theory, Research and Practice     Hybrid Journal   (Followers: 9)
Cogent Education     Open Access   (Followers: 1)
College Athletics and The Law     Hybrid Journal   (Followers: 1)
College Teaching     Hybrid Journal   (Followers: 12)
Colóquio Internacional de Educação e Seminário de Estratégias e Ações Multidisciplinares     Open Access  
Communication Disorders Quarterly     Hybrid Journal   (Followers: 14)
Communication Education     Hybrid Journal   (Followers: 19)
Communication Methods and Measures     Hybrid Journal   (Followers: 11)
Community College Journal of Research and Practice     Hybrid Journal   (Followers: 8)
Community College Review     Hybrid Journal   (Followers: 7)
Community Development     Hybrid Journal   (Followers: 18)
Community Literacy Journal     Partially Free   (Followers: 2)
Comparative Education     Hybrid Journal   (Followers: 27)
Comparative Education Review     Full-text available via subscription   (Followers: 32)
Comparative Professional Pedagogy     Open Access   (Followers: 2)


Assessing Writing
  [SJR: 0.962]   [H-I: 20]   [11 followers]
   Hybrid Journal (may contain Open Access articles)
   ISSN (Print): 1075-2935
   Published by Elsevier  [3044 journals]
  • Exploring the relationship between textual characteristics and rating
           quality in rater-mediated writing assessments: An illustration with L1 and
           L2 writing assessments
    • Authors: Stefanie A. Wind; Catanya Stager; Yogendra J. Patil
      Pages: 1 - 15
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Stefanie A. Wind, Catanya Stager, Yogendra J. Patil
      Numerous researchers have explored the degree to which specific textual characteristics of student compositions are associated with high and low ratings, as well as differences in these relationships across subgroups of students (e.g., English language learners). These studies provide insight into rater judgments and the development of writing proficiency. However, the degree to which textual characteristics are associated with the psychometric quality of ratings is relatively unexplored. This study illustrates a procedure for exploring the influence of textual characteristics of essays on rating quality in the context of rater-mediated writing performance assessments, in order to gain a more complete understanding of rating quality. Two illustrative datasets are used that reflect writing assessments for native English speakers and English language learners. The Coh-Metrix software program was used to obtain measures of textual characteristics, and the Partial Credit model was used to obtain indicators of rating quality. The relationship between essay features and rating quality was explored using correlation and profile analyses. Results suggested that rating quality varies across essays with different features, and that the relationship between rating quality and essay features is unique to individual writing assessments. Implications are discussed as they relate to research and practice for rater-mediated writing assessments.

      PubDate: 2017-08-31T04:08:02Z
      DOI: 10.1016/j.asw.2017.08.003
      Issue No: Vol. 34 (2017)
       
  • Automated formative writing assessment using a levels of language
           framework
    • Authors: Joshua Wilson; Rod Roscoe; Yusra Ahmed
      Pages: 16 - 36
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Joshua Wilson, Rod Roscoe, Yusra Ahmed
      This study investigates a novel approach to conducting formative writing assessment that involves evaluating students' writing skills across three levels of language (word, sentence, and discourse) using automated measures of word choice, syntax, and cohesion. Writing from students in Grades 6 and 8 (n=240 each) was analyzed using Coh-Metrix. Multigroup confirmatory factor analysis evaluated a hypothesized three-factor levels-of-language model, and multigroup structural equation modeling determined whether these factors predicted performance on a state writing achievement test comprising a Direct Assessment of Writing (DAW) and an Editing and Revising test (ER). Results indicated that a subset of 9 Coh-Metrix measures successfully modeled three latent levels-of-language factors at each grade level. Results also indicated that the DAW test was predicted by the latent Discourse factor and the ER test was predicted by the latent Discourse and Sentence factors. Findings provide a proof of concept for automated formative assessment using a levels of language framework. Furthermore, although not the primary goal of the study, results may lay the groundwork for new levels-of-language detection algorithms that could be incorporated within automated writing evaluation software programs to expand automated + teacher assessment and feedback approaches.

      PubDate: 2017-09-12T04:55:20Z
      DOI: 10.1016/j.asw.2017.08.002
      Issue No: Vol. 34 (2017)
       
  • Assessing C2 writing ability on the Certificate of English Language
           Proficiency: Rater and examinee age effects
    • Authors: Daniel R. Isbell
      Pages: 37 - 49
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Daniel R. Isbell
      Differentiating between advanced L2 writers at the higher levels of the Common European Framework of Reference (CEFR) presents a challenge in assessment. The distinctions between descriptors at the C1 and C2 levels are fine-grained, and even native speakers of a language may not consistently achieve them. At the same time, the CEFR has generally been conceived with the language abilities and contexts of use of adults in mind, thus making CEFR-based interpretations of young language learners’ abilities problematic. This study examines two issues in the assessment of C2-level writing in the context of the Certificate of English Language Proficiency (CELP) writing task: rater effects and examinee age. Interrater reliability and many-facet Rasch analysis showed that raters varied substantially in severity. CELP scoring procedures for rater disagreement partially mitigated severity differences. Contrary to expectations, age differentiated examinee abilities minimally and defied hypothesized ordering (i.e., that writing ability would increase with age). Additionally, some raters were found to demonstrate bias towards the youngest examinees. Specific implications for the CELP’s validity argument and broader implications for assessing young writers in CEFR terms are discussed.

      PubDate: 2017-09-12T04:55:20Z
      DOI: 10.1016/j.asw.2017.08.004
      Issue No: Vol. 34 (2017)
       
  • The TOEFL iBT writing: Korean students’ perceptions of the TOEFL iBT
           writing test
    • Authors: Eun-Young Julia Kim
      Pages: 1 - 11
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Eun-Young Julia Kim
      The TOEFL is one of the most widely recognized language proficiency tests developed to measure international students’ level of readiness for degree study. Whereas a number of correlational studies have been conducted by various affiliates of ETS based on large-scale quantitative data, there is a dearth of studies that explore test-takers’ perceptions and experiences concerning the TOEFL iBT. Writing skills are of paramount importance for academic success, and high-stakes tests such as the TOEFL have a tendency to influence test-takers’ perceptions of what defines good academic writing. To date, no research has specifically focused on test-takers’ perceptions of the writing section of the TOEFL iBT. To fill this gap, this study explores Korean students’ perceptions of effective strategies for preparing for the TOEFL iBT writing test, the challenges they face in the test-taking and test-preparation processes, and the implications such findings have for various stakeholders, by analyzing online forum data. Findings indicate that scores on the writing section of the TOEFL iBT, albeit helpful as an initial benchmarking tool, may conceal more than they reveal about Korean students’ academic writing ability. The study suggests that the format, questions, and scoring of the TOEFL iBT writing test be critically examined from test-takers’ perspectives.

      PubDate: 2017-03-05T12:58:57Z
      DOI: 10.1016/j.asw.2017.02.001
      Issue No: Vol. 33 (2017)
       
  • Assessing peer and instructor response to writing: A corpus analysis from
           an expert survey
    • Authors: Ian G. Anson; Chris M. Anson
      Pages: 12 - 24
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Ian G. Anson, Chris M. Anson
      Over the past 30 years, considerable scholarship has critically examined the nature of instructor response on written assignments in the context of higher education (see Straub, 2006). However, as Haswell (2008) has noted, less is currently known about the nature of peer response, especially as it compares with instructor response. In this study, we critically examine some of the properties of instructor and peer response to student writing. Using the results of an expert survey that provided a lexically-based index of high-quality response, we evaluate a corpus of nearly 50,000 peer responses produced at a four-year public university. Combined with the results of this survey, a large-scale automated content analysis shows first that instructors have adopted some of the field's lexical estimation of high-quality response, and second that student peer response reflects the early acquisition of this lexical estimation, although at a further remove from their instructors. The results suggest promising directions for the parallel improvement of both instructor and peer response.

      PubDate: 2017-03-18T08:19:07Z
      DOI: 10.1016/j.asw.2017.03.001
      Issue No: Vol. 33 (2017)
       
  • Understanding university students’ peer feedback practices in EFL
           writing: Insights from a case study
    • Authors: Shulin Yu; Guangwei Hu
      Pages: 25 - 35
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Shulin Yu, Guangwei Hu
      While research on peer feedback in the L2 writing classroom has proliferated over the past three decades, only limited attention has been paid to how students respond to their peers’ writing in specific contexts and why they respond in the ways they do. As a result, much remains to be known about how individual differences and contextual influences shape L2 students’ peer feedback practices. To bridge the research gap, this case study examines two Chinese EFL university students’ peer feedback practices and the factors influencing their feedback practices. Analyses of multiple sources of data including interviews, video recordings of peer feedback sessions, stimulated recalls, and texts reveal that the students took markedly different approaches when responding to their peers’ writing. The findings also indicate that their peer feedback practices were situated in their own distinct sociocultural context and mediated by a myriad of factors including beliefs and values, motives and goals, secondary school learning and feedback experience, teacher feedback practices, feedback training, feedback group dynamics, as well as learning and assessment culture.

      PubDate: 2017-04-08T21:03:50Z
      DOI: 10.1016/j.asw.2017.03.004
      Issue No: Vol. 33 (2017)
       
  • Similarities and differences in constructs represented by U.S. States’
           middle school writing tests and the 2007 national assessment of
           educational progress writing assessment
    • Authors: Ya Mo; Gary A. Troia
      Pages: 48 - 67
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Ya Mo, Gary A. Troia
      Little is known regarding the underlying constructs of writing tests used by U.S. state education authorities and national governments to evaluate the writing performance of their students, especially in middle school grades. Through a content analysis of 78 prompts and 35 rubrics from 27 states’ middle school writing assessments from 2001 to 2007, and three representative prompts and rubrics from the United States’ 2007 National Assessment of Educational Progress (NAEP) writing test, this study illuminates the writing constructs underlying large-scale writing assessments through examination of features in prompts and rubrics and investigation of the connections between prompts and rubrics in terms of genre demands. We found the content of state writing assessments and the NAEP align with respect to measurement parameters associated with (a) emphasis on writing process, audience awareness, and topic knowledge, (b) availability of procedural facilitators (e.g., checklists, rubrics, dictionaries) to assist students in their writing, and (c) inclusion of assessment criteria focused on organization, structure, content, details, sentence fluency, semantics, and general conventions. However, the NAEP’s writing assessment differs from many state tests of writing by including explicit directions for students to review their writing, giving students two timed writing tasks rather than one, making informational text production one of the three genres assessed, and including genre-specific evaluative components in rubrics. This study contributes to our understanding of the direction and path that large-scale writing assessments in the US are taking and how writing assessments are continually evolving.

      PubDate: 2017-06-27T08:22:38Z
      DOI: 10.1016/j.asw.2017.06.001
      Issue No: Vol. 33 (2017)
       
  • To make a long story short: A rubric for assessing graduate students’
           academic and popular science writing skills
    • Authors: Tzipora Rakedzon; Ayelet Baram-Tsabari
      Pages: 28 - 42
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Tzipora Rakedzon, Ayelet Baram-Tsabari
      Graduate students are future scientists, and as such, being able to communicate science is imperative for their integration into the scientific community. This is primarily achieved through scientific papers, mostly published in English; however, interactions outside of academia are also beneficial for future scientists. Therefore, academic writing courses are prevalent and popular science communication courses are on the rise. Nevertheless, no rubrics exist for assessing students' writing in academic and science communication courses. This article describes the development and testing of a rubric for assessing advanced L2 STEM graduate students’ writing in academic (abstract) and popular science writing (press release). The rubric was developed as part of a longstanding academic writing course, but was modified to include a module on science communication with the lay public. Analysis of student needs and the literature inspired a pre-pilot that assessed 16 descriptors on 60 student works. A subsequent, adjusted pilot version on 30 students resulted in adaptations to fit each genre and course goals. In the third round, a modified, final rubric tested on 177 graduate students was created that can be used for both assessment and comparison of the genres. This rubric can assess scientific genres at the graduate level and can be adapted for other genres and levels.

      PubDate: 2017-01-07T04:12:57Z
      DOI: 10.1016/j.asw.2016.12.004
      Issue No: Vol. 32 (2017)
       
  • Checking assumed proficiency: Comparing L1 and L2 performance on a
           university entrance test
    • Authors: Bart Deygers; Kris Van den Branden; Elke Peters
      Pages: 43 - 56
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Bart Deygers, Kris Van den Branden, Elke Peters
      This study compares the results of three groups of participants on the writing component of a centralised L2 university entrance test at the B2 level in Flanders, Belgium. The study investigates whether all Flemish candidates have a B2-level in Dutch upon university entrance, and whether L1 test takers outperform L2 candidates who learned Dutch at home or in Flanders. The results show that, even though the Flemish group outperformed both groups of L2 candidates, not all Flemish candidates reached the B2 level. Additionally, the study compares the results of two groups of L2 users on the same test and shows that candidates who studied Dutch in a Dutch-speaking context do not necessarily outscore candidates who did not. The primary methods of analysis include non-parametric regression and Multi-Faceted Rasch. The results are interpreted in terms of Hulstijn’s conceptualisation of Higher Language Competence, and the study abroad literature. Implications for the university entrance policy are discussed at the end of the paper.

      PubDate: 2017-01-07T04:12:57Z
      DOI: 10.1016/j.asw.2016.12.005
      Issue No: Vol. 32 (2017)
       
  • The effectiveness of instructor feedback for learning-oriented language
           assessment: Using an integrated reading-to-write task for English for
           academic purposes
    • Authors: Ah-Young (Alicia) Kim; Hyun Jung Kim
      Pages: 57 - 71
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Ah-Young (Alicia) Kim, Hyun Jung Kim
      Learning-oriented language assessment (LOLA) can be effective in promoting learning through assessment by creating a link between the two. Although previous studies have examined the effectiveness of feedback – a major element of LOLA – in L2 writing, few have examined how LOLA could be implemented using an integrated reading-to-write task in English for academic purposes (EAP) contexts, which was the objective of this study. Participants were ten Korean TESOL graduate students taking a research methods course and their professor. During a seven-week period, each student completed a weekly integrated reading-to-write task as part of their classroom assessment – they read an academic research paper on a topic of their choice and wrote a review of it. After receiving feedback from the instructor, students revised their work and resubmitted it the following week. Students and the instructor also participated in a semi-structured interview to discuss the effectiveness of learning-oriented feedback on academic reading-to-write tasks. Learners displayed varying developmental patterns, with some students showing more improvement than others. The findings highlighted two participants’ progress in the content domain. Qualitative analysis results suggest that the students reacted differently to the instructor feedback, leading to varying degrees of writing enhancement. The results provide pedagogical implications for using integrated academic reading-to-write tasks and sustained feedback for LOLA.

      PubDate: 2017-01-22T04:23:23Z
      DOI: 10.1016/j.asw.2016.12.001
      Issue No: Vol. 32 (2017)
       
  • Textual voice elements and voice strength in EFL argumentative writing
    • Authors: Hyung-Jo Yoon
      Pages: 72 - 84
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Hyung-Jo Yoon
      This study examined how the quantity and diversity of textual voice elements contribute to holistic voice strength and essay quality. For the quantification of voice elements, this study used an automated processing tool, the Authorial Voice Analyzer (AVA), which was developed based on categories from Hyland’s voice model (i.e., hedges, boosters, attitude markers, self-mentions, reader pronouns, and directives). To explore the relationship between textual voice elements and holistic voice strength, as well as between voice elements and essay quality, this study analyzed 219 argumentative essays written by L1 Greek-speaking EFL students. The results suggested positive, but weak to moderate, correlations between textual voice and holistic voice strength; a regression model with three textual voice features explained 26% of the variance in voice strength scores. The results also indicated weak correlations between textual voice and essay quality. Interestingly, the textual voice features contributing to voice strength (boosters, attitude markers, and self-mentions) were different from those contributing to essay quality (hedges). Interpreting these findings in relation to the setting (timed argumentative writing in an EFL context), this study suggests implications for L2 writing assessment and pedagogy.
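The count-then-correlate approach described in this abstract can be illustrated with a minimal sketch. The marker lists, essays, and scores below are invented for illustration; they are not the AVA's actual categories or the study's data:

```python
import math

# Tiny illustrative marker lists; Hyland's model has fuller categories
# (hedges, boosters, attitude markers, self-mentions, reader pronouns, directives).
HEDGES = {"might", "perhaps", "possibly", "may"}
BOOSTERS = {"clearly", "definitely", "certainly", "obviously"}

def count_markers(text, markers):
    """Count voice-marker tokens in an essay (naive whitespace tokenization)."""
    return sum(1 for t in text.lower().split() if t.strip(".,!?") in markers)

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented mini-corpus: (essay text, holistic voice score on a 1-5 scale).
essays = [
    ("This is clearly true and definitely right", 5),
    ("Perhaps this might be true", 2),
    ("It is certainly and obviously the case", 4),
    ("This may possibly hold", 1),
]
booster_counts = [count_markers(text, BOOSTERS) for text, _ in essays]
voice_scores = [score for _, score in essays]
r = pearson(booster_counts, voice_scores)
```

A weak-to-moderate r on a real corpus, as the study reports, would mean marker frequency tracks holistic voice judgments only partially.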

      PubDate: 2017-03-05T12:58:57Z
      DOI: 10.1016/j.asw.2017.02.002
      Issue No: Vol. 32 (2017)
       
  • Writing Assessment and the Revolution in Digital Texts and Technologies,
           M. Neal, Teachers College Press, New York (2010). 152 pp., ISBN
           978-0-8077-5140-4
    • Authors: Les Perelman
      Pages: 126 - 129
      Abstract: Publication date: January 2017
      Source:Assessing Writing, Volume 31
      Author(s): Les Perelman


      PubDate: 2017-01-07T04:12:57Z
      DOI: 10.1016/j.asw.2016.12.003
      Issue No: Vol. 31 (2017)
       
  • Tools and Tech: A New Forum
    • Authors: Laura L. Aull
      Abstract: Publication date: Available online 24 July 2017
      Source:Assessing Writing
      Author(s): Laura L. Aull


      PubDate: 2017-07-27T11:06:25Z
      DOI: 10.1016/j.asw.2017.06.003
       
  • Social Justice and Educational Measurement: John Rawls, the History of
           Testing, and the Future of Education, Z. Stein. Routledge (2016). 220 pp.,
           ISBN 978-1138947009
    • Authors: J.W. Hammond
      Abstract: Publication date: Available online 11 July 2017
      Source:Assessing Writing
      Author(s): J.W. Hammond


      PubDate: 2017-07-21T10:40:56Z
       
  • Ed.Board/Aims and scope
    • Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32


      PubDate: 2017-04-29T22:00:21Z
       
  • First Year University Writing: A Corpus Based Study with Implications for
           Pedagogy, L. Aull. Palgrave Macmillan (2015), 239
    • Authors: Mark Chapman
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Mark Chapman


      PubDate: 2017-04-29T22:00:21Z
       
  • Evaluating rater accuracy and perception for integrated writing
           assessments using a mixed-methods approach
    • Authors: Jue Wang; George Engelhard; Kevin Raczynski; Tian Song; Edward W. Wolfe
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Jue Wang, George Engelhard, Kevin Raczynski, Tian Song, Edward W. Wolfe
      Integrated writing (IW) assessments underscore the connections between reading comprehension and writing skills. These assessments typically include rater-mediated components. Our study identified IW essays that are difficult to score accurately, and then investigated the reasons based on rater perceptions and judgments. The IW assessments in our data are used formatively, to provide information on students’ developing literacy. We used a mixed-methods approach: rater accuracy was defined quantitatively based on Rasch measurement theory, and a survey-based qualitative method was used to investigate rater perceptions of and judgments about student essays within the context of IW assessments. The quantitative analyses suggest that the essays and raters vary along a continuum designed to represent rating accuracy. The qualitative analyses suggest that raters’ perceptions of certain essay features, such as the amount of textual borrowing, the development of ideas, and the consistency of focus, were inconsistent with the experts’. The implications of this study for the research and practice of IW assessments are discussed.
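A stripped-down version of the quantitative side, defining accuracy simply as exact agreement with expert scores, can be sketched as follows. The Rasch-based definition in the study is model-based and more nuanced; the rater names and scores below are invented:

```python
# Hypothetical ratings (1-4 scale): expert benchmark scores vs. two raters
# scoring the same five essays.
expert = [3, 2, 4, 1, 3]
raters = {
    "rater_A": [3, 2, 4, 2, 3],
    "rater_B": [2, 2, 3, 1, 4],
}

def exact_agreement(rater_scores, expert_scores):
    """Proportion of essays on which the rater matches the expert score exactly."""
    matches = sum(r == e for r, e in zip(rater_scores, expert_scores))
    return matches / len(expert_scores)

accuracy = {name: exact_agreement(scores, expert) for name, scores in raters.items()}
```

Ordering raters (and essays) by such an accuracy index gives the kind of continuum the abstract describes; a Rasch model additionally adjusts for essay difficulty and rater severity.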

      PubDate: 2017-04-15T21:25:34Z
       
  • Ed.Board/Aims and scope
    • Abstract: Publication date: January 2017
      Source:Assessing Writing, Volume 31


      PubDate: 2017-01-07T04:12:57Z
       
  • Thank you to reviewers, 2016
    • Abstract: Publication date: January 2017
      Source:Assessing Writing, Volume 31


      PubDate: 2017-01-07T04:12:57Z
       
  • Placement of multilingual writers: Is there a role for student voices?
    • Authors: Dana R. Ferris; Katherine Evans; Kendon Kurzer
      Pages: 1 - 11
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Dana R. Ferris, Katherine Evans, Kendon Kurzer
      Directed Self-Placement (DSP) is one placement model that has been implemented in various composition programs in the U.S. but has yet to be investigated thoroughly in second language writing settings. Central to DSP is the belief that, if students are given agency to help determine their educational trajectory, they will be empowered and more motivated to succeed (Crusan, 2011; Royer & Gilles, 1998). In this study, 1067 university L2 students completed both a voluntary self-assessment survey and the locally administered placement examination. We statistically compared the students’ placement exam scores and their responses to the final question as to which level of a four-course writing program they thought would best meet their needs. We also examined a stratified random sample of 100 students’ standardized test scores to see if there was a statistical relationship between those tests, our locally designed and administered placement test, and students’ own self-placement scores. We conclude that student self-assessment might have a legitimate role in our placement process, but it probably cannot be used by itself to accurately place large numbers of multilingual students into a four-level sequence.

      PubDate: 2016-11-14T00:08:36Z
      DOI: 10.1016/j.asw.2016.10.001
      Issue No: Vol. 32 (2016)
       
  • Improvement of writing skills during college: A multi-year cross-sectional
           and longitudinal study of undergraduate writing performance
    • Authors: Daniel Oppenheimer; Franklin Zaromb; James R. Pomerantz; Jean C. Williams; Yoon Soo Park
      Pages: 12 - 27
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Daniel Oppenheimer, Franklin Zaromb, James R. Pomerantz, Jean C. Williams, Yoon Soo Park
      We examined persuasive and expository writing samples collected from more than 300 college students as part of a nine-year cross-sectional and longitudinal study of undergraduate writing performance, conducted between 2000 and 2008. Using newly developed scoring rubrics, longitudinal analyses of writing scores revealed statistically significant growth in writing performance over time. These findings held for both persuasive and expository writing. Although writing performance was better among women than men, and better among students majoring in the humanities and social sciences than in natural sciences and engineering, neither women nor humanities and social science majors showed differential improvement over time from freshman to senior year. Our findings showed reliable increases in writing performance during a student’s college years, and moreover demonstrated that such longitudinal changes can be effectively measured. We call for more such outcome assessment in higher education as an essential tool to enhance student learning.

      PubDate: 2016-12-05T02:20:19Z
      DOI: 10.1016/j.asw.2016.11.001
      Issue No: Vol. 32 (2016)
       
  • Validation of a locally created and rated writing test used for placement
           in a higher education EFL program
    • Authors: Robert C. Johnson; A. Mehdi Riazi
      Abstract: Publication date: Available online 4 October 2016
      Source:Assessing Writing
      Author(s): Robert C. Johnson, A. Mehdi Riazi
      This paper reports a study conducted to validate a locally created and rated writing test. The test was used to inform a higher education institution’s decisions regarding placement of entering students into appropriate preparatory English program courses. An amalgam of two influential models – Kane’s (1992, 1994) interpretive model and Bachman’s (2005) and Bachman and Palmer’s (2010) assessment use argument – was used to build a validation framework. A mixed methods approach incorporating a diverse array of quantitative and qualitative data from various stakeholders, including examinees, students, instructors, staff, and administrators, guided the collection and analysis of evidence informing the validation. Results raised serious doubts about the writing test, not only in terms of interpreted score meaning, but also the impact of its use on various stakeholders, and on teaching and learning. The study reinforces the importance of comprehensive validation efforts, particularly by test users, for all instruments informing decisions about test-takers, including writing tests and other types of direct performance assessments. Results informed a number of suggested changes regarding the rubric and rater training, among others, thus demonstrating the potential of validation studies as ‘road maps’ for immediate opportunities to improve both testing and decisions made based on testing.

      PubDate: 2016-10-10T08:05:10Z
      DOI: 10.1016/j.asw.2016.09.002
       
  • Narrative and expository genre effects on students, raters, and
           performance criteria
    • Authors: Heejeong Jeong
      Abstract: Publication date: Available online 27 August 2016
      Source:Assessing Writing
      Author(s): Heejeong Jeong
      The effects of genre play an important role in the assessment of student writing. This study examines the effects of narrative and expository genres on student language proficiency, raters, and performance criteria. For this study, EFL students (n=180) from three proficiency levels (novice, intermediate, and advanced) wrote a narrative and an expository essay that were assessed by raters using four performance criteria: paragraph structure, content, form, and vocabulary. A many-facet Rasch measurement (MFRM) analysis showed that differences in students’ scores between genres were not statistically significant overall, but genre effects differed significantly by writing proficiency level. Novice students received significantly higher scores on narratives, while advanced students received significantly higher scores on expository essays; there was no score difference for intermediate students. Raters showed greater variance when rating narratives than expository texts. Narrative essays covered a wider range of student writing ability, while expository essays showed more centralization in writing scores. Of the four performance criteria, vocabulary interacted with genre: expository essays were given significantly higher vocabulary scores than narrative essays. The results of this study have implications for the use of narrative and expository genres in writing assessment.

      PubDate: 2016-08-28T12:15:21Z
      DOI: 10.1016/j.asw.2016.08.006
       
  • Exploring the relationship of organization and connection with scores in
           integrated writing assessment
    • Authors: Lia Plakans; Atta Gebril
      Abstract: Publication date: Available online 25 August 2016
      Source:Assessing Writing
      Author(s): Lia Plakans, Atta Gebril
      Traditionally, second language writing assessment has employed writing tasks that require only a single skill; however, in many academic contexts, writing requires the integration of several abilities, including reading and listening. To improve authenticity, integrated tasks are increasingly used in the research and assessment of second language writing. Scholars have proposed discourse synthesis as an underlying construct for these tasks. This study investigated performances on integrated reading-listening-writing tasks to consider how organization and connection, subprocesses in discourse synthesis, are reflected in scores. Four hundred eighty responses to an integrated writing prompt were analyzed for organizational patterns, coherence, and cohesion in relation to test scores. Raters coded essays for type and appropriateness of organization and coherence quality, while computational analysis was used to look at cohesion features. The results indicate that organization and coherence were related to writing score, with quality improving as score increased. However, the cohesion markers analyzed in this study yielded no statistical differences across the score levels.

      PubDate: 2016-08-28T12:15:21Z
      DOI: 10.1016/j.asw.2016.08.005
       
  • Taking stock of portfolio assessment scholarship: From research to
           practice
    • Authors: Ricky Lam
      Abstract: Publication date: Available online 21 August 2016
      Source:Assessing Writing
      Author(s): Ricky Lam
      Portfolio assessment has been extensively investigated over the past two decades. Nonetheless, it remains underused in first and second language writing classrooms. This paper argues that theoretical and empirical research evidence can inform the classroom-based implementation of portfolio assessment. The paper first introduces the origin, definitions, rationale, applications and characteristics of portfolio assessment, and then historicizes writing portfolio assessment scholarship according to the evolving trends of portfolio assessment development in both first and second language writing contexts. A method section then describes how the theoretical and empirical scholarship was screened, selected and categorized for review in terms of three key themes: (1) research that supports classroom applications of portfolio assessment; (2) research identifying factors that inhibit classroom-based portfolio assessment practices; and (3) questions needing future investigation into how to promote portfolio implementation. The review is followed by three pedagogical recommendations for how teachers, administrators and programme directors can better develop learning-supportive portfolio assessment practices and gain greater exposure to pertinent professional learning. It is hoped that the paper advances portfolio assessment scholarship, particularly with a view to using research evidence to inform classroom practice.

      PubDate: 2016-08-24T12:15:14Z
      DOI: 10.1016/j.asw.2016.08.003
       
  • Voice in timed L2 argumentative essay writing
    • Authors: Cecilia Guanfang Zhao
      Abstract: Publication date: Available online 18 August 2016
      Source:Assessing Writing
      Author(s): Cecilia Guanfang Zhao
      The concept of voice is included in various writing textbooks, learning standards, and assessment rubrics, indicating the importance of this element in writing instruction and assessment at both secondary and postsecondary levels. Researchers in second language (L2) writing, however, often debate the importance of voice in L2 writing. Due to the elusiveness of this concept, much of such debate is still at the theoretical level; few empirical studies exist that provide solid evidence to either support or refute the proposition that voice is an important concept to teach in L2 writing classrooms. To fill this gap, the present study empirically investigated the relationship between voice salience, as captured by an analytic rubric, and official TOEFL iBT argumentative essay scores in 200 timed L2 essays. Results showed that voice was a significant predictor of TOEFL essay scores, explaining about 25% of the score variances. Moreover, while each individual voice dimension was found to be strongly or moderately correlated with essay scores when examined in isolation, only the ideational dimension became a significant predictor of text quality, when the effect of other dimensions was controlled for. Implications of such results for L2 writing instruction are discussed.

      PubDate: 2016-08-24T12:15:14Z
      DOI: 10.1016/j.asw.2016.08.004
       
  • “I feel disappointed”: EFL university students’ emotional responses
           towards teacher written feedback
    • Authors: Omer Hassan Ali Mahfoodh
      Abstract: Publication date: Available online 7 August 2016
      Source:Assessing Writing
      Author(s): Omer Hassan Ali Mahfoodh
      Studies on teacher written feedback in Second Language (L2) contexts have not given adequate attention to learners’ emotional responses towards teacher written feedback. Thus, this study examined the relationship between emotional responses of EFL university students towards teacher written feedback and students’ success of revisions. Data were collected using think-aloud protocols, students’ written texts, and semi-structured interviews. To obtain students’ emotional responses towards teacher written feedback, grounded theory was employed to analyse think-aloud protocols and semi-structured interviews. Teacher written feedback was tabulated and categorised using a coding scheme which was developed based on Straub and Lunsford (1995) and Ferris (1997). Students’ success of revisions was analysed using an analytical scheme based on Conrad and Goldstein (1999). The results revealed that EFL university students’ emotional responses include acceptance of feedback, rejection of feedback, surprise, happiness, dissatisfaction, disappointment, frustration, and satisfaction. Some emotional responses could be attributed to harsh criticism, negative evaluation, and miscommunication between teachers and their students. The study also revealed that emotional responses can affect students’ understanding and utilisation of teacher written feedback.

      PubDate: 2016-08-08T12:14:54Z
      DOI: 10.1016/j.asw.2016.07.001
       
  • Responding to student writing online: Tracking student interactions with
           instructor feedback in a Learning Management System
    • Authors: Angela Laflen; Michelle Smith
      Abstract: Publication date: Available online 1 August 2016
      Source:Assessing Writing
      Author(s): Angela Laflen, Michelle Smith
      Instructor response to student writing increasingly takes place within Learning Management Systems (LMSs), which often make grades visible apart from instructor feedback by default. Previous studies indicate that students generally ascribe more value to grades than to instructor feedback, while instructors believe that feedback is most important. This study investigated how students interact with an LMS interface—an instance of Sakai—to access instructor feedback on their writing. Our blind study analyzed data from 334 students in 16 courses at a medium-sized comprehensive private college to investigate the question: Does the rate at which students open attachments with instructor feedback differ if students can see their grades without opening the attachment? We compared two response methodologies: mode 1 made grades visible apart from feedback, and mode 2 required students to open attached feedback files to find their grades. The data for each mode were collected automatically by the LMS, retrieved, and retrospectively analyzed. The results show that making grades visible separately from feedback significantly reduced the rate at which students opened instructor feedback files, and that timing also affected students’ rate of access. These findings provide the basis for empirically informed best practices for grading and returning papers online.
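The core comparison in this design, attachment-open rates under two grade-visibility modes, can be sketched as a two-proportion z-test. The counts below are hypothetical, not the study's data:

```python
import math

def two_proportion_z(opened1, n1, opened2, n2):
    """Two-sided two-proportion z-test for a difference in open rates."""
    p1, p2 = opened1 / n1, opened2 / n2
    pooled = (opened1 + opened2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: mode 1 shows grades apart from feedback,
# mode 2 requires opening the feedback file to see the grade.
z, p = two_proportion_z(opened1=60, n1=160, opened2=120, n2=160)
```

A large negative z with a small p here would mirror the study's finding that seeing the grade without opening the attachment depresses the open rate.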

      PubDate: 2016-08-03T12:14:49Z
      DOI: 10.1016/j.asw.2016.07.003
       
  • K-12 multimodal assessment and interactive audiences: An exploratory
           analysis of existing frameworks
    • Authors: Ewa McGrail; Nadia Behizadeh
      Abstract: Publication date: Available online 30 July 2016
      Source:Assessing Writing
      Author(s): Ewa McGrail, Nadia Behizadeh
      Multimodal writing today often occurs through membership in an online, participatory culture; based on the affordances of online composition, the audience for student writers has thus shifted from imagined readers to actual, accessible readers and responders. Additionally, recent content and technology standards for students in US schools emphasize the importance of distributing multimodal compositions to wider audiences. In this article, we closely examine attention to interactive audience and collaboration in a purposive sample of kindergarten through 12th grade (K-12) assessment frameworks, as well as how these frameworks define multimodal composition. We found that multimodal composition is defined consistently across all frameworks as composition that includes multiple ways of communicating. However, many of the multimodal composition examples were non-interactive texts, even though many framework authors acknowledged the emergence of interactive online composition types that afford the writer the ability to communicate and collaborate with an audience. In addition, the frameworks reviewed tended to focus on the final product and less often on the process or on dynamic collaboration with the audience. In the discussion, implications for classroom teachers as well as considerations for researchers exploring the construct of online multimodal writing are offered.

      PubDate: 2016-08-03T12:14:49Z
      DOI: 10.1016/j.asw.2016.06.005
       
  • How students' ability levels influence the relevance and accuracy of their
           feedback to peers: A case study
    • Authors: Ivan Chong
      Abstract: Publication date: Available online 25 July 2016
      Source:Assessing Writing
      Author(s): Ivan Chong
      Traditionally, teachers play a central role in creating a learning environment that favors the implementation of peer assessment in writing. Nevertheless, students’ writing ability, and how it factors into their provision of relevant (content-related) and accurate (language-related) written feedback, is rarely considered. This is largely because most studies of peer assessment were conducted in tertiary settings, where researchers assume university students have attained a basic level of cognitive and linguistic development that empowers them to make judgments about their peers’ work. The present study, conducted in a Hong Kong secondary school, investigated this research gap by analyzing first drafts produced by a class of 16 Secondary 1 (Grade 7) students in a writing unit. The first section of the study reports students’ writing abilities in terms of content development and linguistic accuracy; findings in the subsequent section suggest a strong, positive relationship between students’ writing abilities and the relevance and accuracy of their written feedback. The paper ends with two pedagogical implications for implementing peer assessment: alignment with pre-writing instruction, and the development of marking foci based on students’ abilities.

      PubDate: 2016-07-28T12:14:43Z
      DOI: 10.1016/j.asw.2016.07.002
       
  • Are TOEFL iBT® writing test scores related to keyboard type? A survey of
           keyboard-related practices at testing centers
    • Authors: Guangming Ling
      Abstract: Publication date: Available online 18 July 2016
      Source:Assessing Writing
      Author(s): Guangming Ling
      The strength of a computer-based writing test, such as the TOEFL iBT ® Writing Test, lies in its capability to assess all examinees under the same conditions, so that scores reflect the targeted writing abilities rather than differences in testing conditions, such as the type of keyboard. The familiarity and proficiency examinees have with a specific type of keyboard could affect their efficiency in writing essays and introduce construct-irrelevant variance, although little research on this is available in the literature. To explore this, we surveyed 2214 TOEFL iBT testing centers in 134 countries on practices related to keyboard type and analyzed the centers’ responses and the TOEFL iBT scores of examinees from these centers. Results revealed that (a) most testing centers used the U.S. standard English keyboard (USKB) for the test, while a small proportion of centers used a country-specific keyboard (CSKB) converted to the USKB layout; (b) TOEFL iBT Writing scores appeared to be significantly associated with the type of keyboard and overlay in only 10 countries, with trivial or small score differences associated with keyboard type. These findings suggest that current practices related to keyboard type have little or no practical effect on examinees’ TOEFL iBT Writing scores.

      PubDate: 2016-07-23T12:14:37Z
      DOI: 10.1016/j.asw.2016.04.001
       
 
 
JournalTOCs
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Email: journaltocs@hw.ac.uk
Tel: +00 44 (0)131 4513762
Fax: +00 44 (0)131 4513327
 

JournalTOCs © 2009-2016