  Subjects -> EDUCATION (Total: 1805 journals)
    - ADULT EDUCATION (24 journals)
    - COLLEGE AND ALUMNI (9 journals)
    - E-LEARNING (22 journals)
    - EDUCATION (1513 journals)
    - HIGHER EDUCATION (121 journals)
    - ONLINE EDUCATION (28 journals)
    - SCHOOL ORGANIZATION (13 journals)

EDUCATION (1513 journals)

Showing 1 - 200 of 857 Journals sorted alphabetically
#Tear : Revista de Educação, Ciência e Tecnologia     Open Access   (Followers: 2)
(Pensamiento), (palabra) y obra     Open Access   (Followers: 1)
@tic. revista d'innovació educativa     Open Access  
Abant İzzet Baysal Üniversitesi Eğitim Fakültesi Dergisi     Open Access  
About Campus     Hybrid Journal   (Followers: 5)
Academic Medicine     Full-text available via subscription   (Followers: 60)
Academic Psychiatry     Full-text available via subscription   (Followers: 23)
Academic Questions     Hybrid Journal   (Followers: 8)
Academy of Educational Leadership Journal     Full-text available via subscription   (Followers: 60)
Academy of Management Learning and Education     Full-text available via subscription   (Followers: 53)
Accounting & Finance     Hybrid Journal   (Followers: 49)
Accounting Education: An International Journal     Hybrid Journal   (Followers: 17)
ACM Transactions on Computing Education (TOCE)     Hybrid Journal   (Followers: 6)
Across the Disciplines     Open Access   (Followers: 8)
Acta Didactica Norge     Open Access   (Followers: 1)
Acta Scientiarum. Education     Open Access  
Acta Technologica Dubnicae     Open Access  
Action in Teacher Education     Hybrid Journal   (Followers: 60)
Action Learning: Research and Practice     Hybrid Journal   (Followers: 41)
Action Research     Hybrid Journal   (Followers: 44)
Active Learning in Higher Education     Hybrid Journal   (Followers: 284)
Actualidades Pedagógicas     Open Access  
Adelphi series     Hybrid Journal   (Followers: 15)
Administration & Society     Hybrid Journal   (Followers: 13)
Administrative Science Quarterly     Full-text available via subscription   (Followers: 170)
Adult Education Quarterly     Hybrid Journal   (Followers: 157)
Advanced Education     Open Access   (Followers: 8)
Advances in Building Education     Open Access   (Followers: 3)
Advances in Health Sciences Education     Hybrid Journal   (Followers: 25)
Advances in High Energy Physics     Open Access   (Followers: 19)
Advances in School Mental Health Promotion     Partially Free   (Followers: 9)
AERA Open     Open Access   (Followers: 7)
Africa Education Review     Partially Free   (Followers: 25)
African Journal of Chemical Education     Open Access   (Followers: 3)
African Journal of Educational Studies in Mathematics and Sciences     Full-text available via subscription   (Followers: 5)
African Journal of Health Professions Education     Open Access   (Followers: 6)
African Journal of Research in Mathematics, Science and Technology Education     Full-text available via subscription   (Followers: 8)
Agora     Full-text available via subscription   (Followers: 3)
AGORA Magazine     Open Access  
Ahmad Dahlan Journal of English Studies     Open Access   (Followers: 1)
AIDS Education and Prevention     Full-text available via subscription   (Followers: 7)
Akadémiai Értesítö     Full-text available via subscription  
Aksiologiya : Jurnal Pengabdian Kepada Masyarakat     Open Access  
AKSIOMA Journal of Mathematics Education     Open Access   (Followers: 1)
Al-Idarah : Jurnal Kependidikan Islam     Open Access  
Al-Jabar : Jurnal Pendidikan Matematika     Open Access  
Alexandria : Revista de Educação em Ciência e Tecnologia     Open Access  
Alsic     Open Access   (Followers: 20)
Alteridad     Open Access  
Amasya Universitesi Egitim Fakültesi Dergisi     Open Access  
American Annals of the Deaf     Full-text available via subscription   (Followers: 13)
American Biology Teacher     Full-text available via subscription   (Followers: 13)
American Educational Research Journal     Hybrid Journal   (Followers: 163)
American Journal of Business Education     Open Access   (Followers: 10)
American Journal of Distance Education     Hybrid Journal   (Followers: 29)
American Journal of Education     Full-text available via subscription   (Followers: 190)
American Journal of Educational Research     Open Access   (Followers: 60)
American Journal of Health Education     Hybrid Journal   (Followers: 32)
American Journal of Physics     Full-text available via subscription   (Followers: 56)
American String Teacher     Full-text available via subscription   (Followers: 1)
ANALES de la Universidad Central del Ecuador     Open Access   (Followers: 2)
Annali dell'Universita di Ferrara     Hybrid Journal  
Annals of Dyslexia     Hybrid Journal   (Followers: 10)
Annals of Modern Education     Full-text available via subscription   (Followers: 3)
Apertura. Revista de innovación educativa‏     Open Access   (Followers: 2)
Applied Environmental Education & Communication     Hybrid Journal   (Followers: 15)
Applied Measurement in Education     Hybrid Journal   (Followers: 9)
Arabia     Open Access  
Art Design & Communication in Higher Education     Hybrid Journal   (Followers: 20)
Arts and Humanities in Higher Education     Hybrid Journal   (Followers: 34)
Arts Education Policy Review     Hybrid Journal   (Followers: 5)
ASHE Higher Education Reports     Hybrid Journal   (Followers: 15)
Asia Pacific Education Review     Hybrid Journal   (Followers: 12)
Asia Pacific Journal of Education     Hybrid Journal   (Followers: 21)
Asia-Pacific Education Researcher     Hybrid Journal   (Followers: 12)
Asia-Pacific Journal of Health, Sport and Physical Education     Hybrid Journal   (Followers: 9)
Asia-Pacific Journal of Teacher Education     Hybrid Journal   (Followers: 20)
Asian Association of Open Universities Journal     Open Access  
Asian Education and Development Studies     Hybrid Journal   (Followers: 5)
Asian Journal of English Language Teaching     Full-text available via subscription   (Followers: 14)
Asian Journal of Legal Education     Full-text available via subscription   (Followers: 5)
ASp     Open Access   (Followers: 2)
Assessing Writing     Hybrid Journal   (Followers: 11)
Assessment & Evaluation in Higher Education     Hybrid Journal   (Followers: 146)
Assessment for Effective Intervention     Hybrid Journal   (Followers: 15)
Assessment in Education: Principles, Policy & Practice     Hybrid Journal   (Followers: 40)
Assessment Update     Hybrid Journal   (Followers: 4)
AStA Wirtschafts- und Sozialstatistisches Archiv     Hybrid Journal   (Followers: 5)
At-Ta'dib Jurnal Kependidikan Islam     Open Access  
At-Taqaddum     Open Access  
At-Turats     Open Access  
Athenea Digital     Open Access  
Aula Abierta     Open Access   (Followers: 1)
Australasian Journal of Educational Technology     Open Access   (Followers: 19)
Australasian Journal of Gifted Education     Full-text available via subscription   (Followers: 4)
Australasian Marketing Journal (AMJ)     Hybrid Journal   (Followers: 9)
Australian Art Education     Full-text available via subscription   (Followers: 7)
Australian Educational and Developmental Psychologist, The     Full-text available via subscription   (Followers: 8)
Australian Educational Computing     Open Access   (Followers: 1)
Australian Educational Researcher     Hybrid Journal   (Followers: 24)
Australian Journal of Adult Learning     Full-text available via subscription   (Followers: 15)
Australian Journal of Career Development     Hybrid Journal   (Followers: 3)
Australian Journal of Dyslexia and Learning Difficulties     Full-text available via subscription   (Followers: 8)
Australian Journal of Education     Hybrid Journal   (Followers: 34)
Australian Journal of Environmental Education     Full-text available via subscription   (Followers: 10)
Australian Journal of Indigenous Education, The     Full-text available via subscription   (Followers: 10)
Australian Journal of Learning Difficulties     Hybrid Journal   (Followers: 4)
Australian Journal of Music Education     Full-text available via subscription   (Followers: 3)
Australian Journal of Public Administration     Hybrid Journal   (Followers: 441)
Australian Journal of Teacher Education     Open Access   (Followers: 22)
Australian Mathematics Teacher, The     Full-text available via subscription   (Followers: 7)
Australian Primary Mathematics Classroom     Full-text available via subscription   (Followers: 3)
Australian Screen Education Online     Full-text available via subscription   (Followers: 2)
Australian TAFE Teacher     Full-text available via subscription   (Followers: 3)
Australian Universities' Review, The     Full-text available via subscription   (Followers: 4)
Autism     Hybrid Journal   (Followers: 234)
Avaliação : Revista da Avaliação da Educação Superior (Campinas)     Open Access  
Azalea: Journal of Korean Literature & Culture     Full-text available via subscription   (Followers: 5)
Balkan Region Conference on Engineering and Business Education     Open Access   (Followers: 1)
BELIA : Early Childhood Education Papers     Open Access   (Followers: 6)
BELT - Brazilian English Language Teaching Journal     Open Access   (Followers: 5)
Biblioteca Escolar em Revista     Open Access  
Biblioteka i Edukacja     Open Access   (Followers: 4)
Bildung und Erziehung     Hybrid Journal   (Followers: 4)
Bioedukasi : Jurnal Pendidikan Biologi FKIP UM Metro     Open Access  
Bioma : Jurnal Ilmiah Biologi     Open Access  
Biosaintifika : Journal of Biology & Biology Education     Open Access   (Followers: 7)
Biosfer : Jurnal Biologi dan Pendidikan Biologi     Open Access  
BMC Medical Education     Open Access   (Followers: 42)
BMJ Simulation & Technology Enhanced Learning     Full-text available via subscription   (Followers: 8)
BoEM - Boletim online de Educação Matemática     Open Access  
Boletim Cearense de Educação e História da Matemática     Open Access  
Boletim de Educação Matemática     Open Access  
British Educational Research Journal     Hybrid Journal   (Followers: 187)
British Journal of Educational Studies     Hybrid Journal   (Followers: 160)
British Journal of Educational Technology     Hybrid Journal   (Followers: 150)
British Journal of Music Education     Hybrid Journal   (Followers: 21)
British Journal of Religious Education     Hybrid Journal   (Followers: 8)
British Journal of Sociology of Education     Hybrid Journal   (Followers: 51)
British Journal of Special Education     Hybrid Journal   (Followers: 47)
British Journal of Visual Impairment     Hybrid Journal   (Followers: 12)
Brookings Trade Forum     Full-text available via subscription   (Followers: 3)
Business, Management and Education     Open Access   (Followers: 18)
Caderno Brasileiro de Ensino de Física     Open Access  
Caderno Intersabares     Open Access  
Cadernos CEDES     Open Access   (Followers: 1)
Cadernos de Educação     Open Access  
Cadernos de Educação, Tecnologia e Sociedade     Open Access  
Cadernos de Pesquisa     Open Access  
Cadernos de Pesquisa     Open Access   (Followers: 2)
Cadernos de Pesquisa em Educação     Open Access   (Followers: 1)
Cadmo     Full-text available via subscription   (Followers: 1)
Cahiers de la recherche sur l'éducation et les savoirs     Open Access   (Followers: 4)
Calidad en la educación     Open Access   (Followers: 1)
Cambridge Journal of Education     Hybrid Journal   (Followers: 98)
Campus Legal Advisor     Hybrid Journal   (Followers: 2)
Campus Security Report     Hybrid Journal  
Canadian and International Education     Open Access   (Followers: 9)
Canadian Journal for New Scholars in Education/ Revue canadienne des jeunes chercheures et chercheurs en éducation     Open Access   (Followers: 8)
Canadian Journal for the Scholarship of Teaching and Learning     Open Access   (Followers: 16)
Canadian Journal of Education : Revue canadienne de l'éducation     Open Access   (Followers: 8)
Canadian Journal of Higher Education     Open Access   (Followers: 24)
Canadian Journal of Learning and Technology / La revue canadienne de l’apprentissage et de la technologie     Open Access   (Followers: 15)
Canadian Journal of School Psychology     Hybrid Journal   (Followers: 10)
Canadian Journal of Science, Mathematics and Technology Education     Hybrid Journal   (Followers: 20)
Career Development and Transition for Exceptional Individuals     Hybrid Journal   (Followers: 7)
Catalejos. Revista sobre lectura, formación de lectores y literatura para niños     Open Access  
Catharsis : Journal of Arts Education     Open Access  
CELE Exchange, Centre for Effective Learning Environments     Full-text available via subscription   (Followers: 1)
Cendekia : Jurnal Kependidikan dan Kemasyarakatan     Open Access  
Change: The Magazine of Higher Learning     Hybrid Journal   (Followers: 16)
Changing English: Studies in Culture and Education     Hybrid Journal   (Followers: 6)
Charrette     Open Access  
Chemical Engineering Education     Full-text available via subscription   (Followers: 1)
Chemistry Education Research and Practice     Free   (Followers: 5)
Chemistry in Education     Open Access   (Followers: 9)
Chi'e : Journal of Japanese Learning and Teaching     Open Access   (Followers: 3)
Child Language Teaching and Therapy     Hybrid Journal   (Followers: 30)
Child Psychiatry & Human Development     Hybrid Journal   (Followers: 9)
Childhood Education     Hybrid Journal   (Followers: 17)
Children's Literature in Education     Hybrid Journal   (Followers: 9)
Chinese Education & Society     Full-text available via subscription   (Followers: 3)
Christian Higher Education     Hybrid Journal   (Followers: 2)
Christian Perspectives in Education     Open Access   (Followers: 7)
Ciência & Educação (Bauru)     Open Access  
Ciência & Saúde Coletiva     Open Access   (Followers: 2)
Ciencia en Desarrollo     Open Access  
Ciencias Sociales y Educación     Open Access   (Followers: 2)
Citizenship, Social and Economics Education     Full-text available via subscription   (Followers: 5)
Classroom Discourse     Hybrid Journal   (Followers: 8)
Clinical Child and Family Psychology Review     Hybrid Journal   (Followers: 7)
Clio y Asociados     Open Access  
CME     Hybrid Journal   (Followers: 1)
Coaching: An International Journal of Theory, Research and Practice     Hybrid Journal   (Followers: 10)
Cogent Education     Open Access   (Followers: 2)
College Athletics and The Law     Hybrid Journal   (Followers: 1)
College Teaching     Hybrid Journal   (Followers: 13)
Colóquio Internacional de Educação e Seminário de Estratégias e Ações Multidisciplinares     Open Access  
Communication Disorders Quarterly     Hybrid Journal   (Followers: 14)
Communication Education     Hybrid Journal   (Followers: 20)


Assessing Writing
  [SJR: 0.962]   [H-I: 20]   [11 followers]
   Hybrid Journal (may contain Open Access articles)
   ISSN (Print) 1075-2935
   Published by Elsevier
  • 'Not to scale': An argument-based inquiry into the validity of an L2
           writing rating scale
    • Authors: Anthony Becker
      Pages: 1 - 12
      Abstract: Publication date: July 2018
      Source:Assessing Writing, Volume 37
      Author(s): Anthony Becker
      In second language (L2) writing, rating scales are often used to measure a variety of discourse and linguistic features. When developing scales, the scoring criteria need to provide a clear and credible basis for scoring judgments, as well as for differentiating levels of writing performance (Weigle, 2002). Often, the criteria used to evaluate the L2 writing of students at intensive English programs (IEPs) are adopted from textbooks or developed as an ad hoc solution, and their adequacy or relevance to classroom-based writing is not always considered. The inclusion of poor evaluative criteria can lead to scores with low reliability and problems with validity (Jonsson & Svingby, 2007). This study sought to investigate the quality of a rating scale used to assess L2 students’ writing ability at an intermediate level of an IEP in the US. Using a mixed-methods approach, several sources of data were collected and analyzed. The results indicated that the scale largely functioned as intended, although it could have benefited from some revisions. The implications suggest that L2 practitioners must make principled and justified decisions about the scoring criteria that they include in scales when assessing ESL students’ writing.

      PubDate: 2018-02-25T18:35:25Z
      DOI: 10.1016/j.asw.2018.01.001
      Issue No: Vol. 37 (2018)
  • Editorial
    • Authors: David Slomp; Martin East
      Abstract: Publication date: January 2018
      Source:Assessing Writing, Volume 35
      Author(s): David Slomp, Martin East

      PubDate: 2018-02-25T18:35:25Z
      DOI: 10.1016/j.asw.2018.02.001
      Issue No: Vol. 35 (2018)
  • Effects of indirect coded corrective feedback with and without short
           affective teacher comments on L2 writing performance, learner uptake and
           motivation
    • Authors: Chiachieh Tang; Yeu-Ting Liu
      Pages: 26 - 40
      Abstract: Publication date: January 2018
      Source:Assessing Writing, Volume 35
      Author(s): Chiachieh Tang, Yeu-Ting Liu
      Though studies have shown the benefits of oral corrective feedback (CF), there is a paucity of research exploring the potency of indirect written CF. Studies have indicated the need for further research on indirect written CF and teacher comments as a way to encourage L2 learners to be better writers. To this end, this study investigated whether indirect coded corrective feedback (ICCF) and short affective comments were more effective than ICCF alone in enhancing L2 learners’ writing performance, uptake, and motivation. L2 learner participants (n = 56) received the two aforementioned feedback modes and completed three writing tasks at successive times. Analyses of the writing showed a significant improvement in overall writing performance and learner uptake irrespective of the feedback mode received. This seems to indicate that adding affective comments to ICCF did not significantly boost L2 learners’ writing; however, further analysis of the participants’ questionnaire data showed that the addition did foster a positive mindset motivating them to take further actions to improve their writing and that the pedagogical potency of ICCF and short affective comments seems to be complementary. Pedagogical implications and applications for how ICCF and short affective teacher comments can impact L2 writing are provided.

      PubDate: 2018-02-05T16:45:38Z
      DOI: 10.1016/j.asw.2017.12.002
      Issue No: Vol. 35 (2018)
  • Examining the validity of an analytic rating scale for a Spanish test for
           academic purposes using the argument-based approach to validation
    • Authors: Arturo Mendoza; Ute Knoch
      Pages: 41 - 55
      Abstract: Publication date: January 2018
      Source:Assessing Writing, Volume 35
      Author(s): Arturo Mendoza, Ute Knoch
      Rating scales are used to assess the performance of examinees presented with open-ended tasks. Drawing on an argument-based approach to validation, this study reports on the development of an analytic rating scale designed for a Spanish test for academic purposes. The study is one of the first that sets out the detailed scale development and validation activities for a rating scale for Spanish as a second language. The rating scale was grounded in a communicative competence model and developed and validated over two phases. The first version was trialed by five raters, and its quality was analyzed by means of many-facet Rasch measurement. Based on the raters’ experience and on the statistical results, the rating scale was modified and a second version was trialed by six raters. After the rating process, raters were sent an online questionnaire in order to collect their opinions and perceptions of the rating scale, the training and the feedback provided during the rating process. The results suggest the rating scale was of good quality and raters’ comments were generally positive, although they mentioned that more samples and training were needed. The study has implications for rating scale development and validation for languages other than English.

      PubDate: 2018-02-05T16:45:38Z
      DOI: 10.1016/j.asw.2017.12.003
      Issue No: Vol. 35 (2018)
  • Call for papers 25th Anniversary Themed Issue: Framing the Future of
           Writing Assessment
    • Abstract: Publication date: January 2018
      Source:Assessing Writing, Volume 35

      PubDate: 2018-02-25T18:35:25Z
  • Analysis of syntactic complexity in secondary education EFL writers at
           different proficiency levels
    • Authors: Ana Cristina Lahuerta Martínez
      Pages: 1 - 11
      Abstract: Publication date: January 2018
      Source:Assessing Writing, Volume 35
      Author(s): Ana Cristina Lahuerta Martínez
      The present study examines differences in syntactic complexity in English writing among lower intermediate and intermediate secondary education writers by means of quantitative measures of syntactic complexity, and compares the scores on the selected syntactic complexity measures with holistic ratings of learners’ overall writing quality. We examined the writing of 188 students at years 3 (lower intermediate) and 4 (intermediate) of secondary education including gender in the analysis. Essays were evaluated by holistic ratings of writing quality and quantitative measures gauging complexification at the sentential, the clausal, and the phrasal level of syntactic organisation. Data revealed significant strong correlations between the holistic ratings and all but one of the complexity metrics. The scores on the general quality of the writings and on all syntactic complexity measures increased from grade 3 to grade 4 and for all but one sentential complexity measure (compound-complex sentence ratio) the increase was statistically significant. Girls obtained a higher score in the general quality of the compositions and in all the measures examined, and for four measures the difference in score was significant.

      PubDate: 2017-12-13T09:30:53Z
      DOI: 10.1016/j.asw.2017.11.002
      Issue No: Vol. 35 (2017)
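The correlations the abstract above reports between holistic ratings and syntactic complexity metrics are ordinary Pearson product-moment correlations. A minimal pure-Python sketch, using hypothetical ratings and a hypothetical clauses-per-T-unit measure (not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical holistic ratings and a clausal-complexity measure for six essays
ratings = [2, 3, 3, 4, 4, 5]
clauses_per_t_unit = [1.2, 1.5, 1.4, 1.9, 1.7, 2.1]
print(round(pearson_r(ratings, clauses_per_t_unit), 3))
```

A "significant strong correlation" in the abstract's sense would be a value near +1 with a small p-value; the p-value computation is omitted here for brevity.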
  • From independent ratings to communal ratings: A study of CWA raters’
           decision-making behaviors
    • Authors: Vivian Lindhardsen
      Pages: 12 - 25
      Abstract: Publication date: January 2018
      Source:Assessing Writing, Volume 35
      Author(s): Vivian Lindhardsen
      The present exploratory study maps the decision-making behaviors of raters in a well-established communal writing assessment (CWA) context, tracing their behaviors all the way from independent rating sessions, where initial images and judgments are formed, to communal rating sessions, where final scores are assigned on the basis of collaboration between two raters. Results from think-aloud protocols, recorded discussions, and retrospective reports from 20 experienced raters rating 15 EFL essays showed that when moving from independent ratings to communal ratings, raters gradually refined their assessments and balanced their attention more evenly among the official assessment criteria to reach what they believed to be more accurate scores. These interpretations support a hermeneutic rather than a psychometric approach to establishing the validity of the CWA practices.

      PubDate: 2017-12-27T12:17:19Z
      DOI: 10.1016/j.asw.2017.12.004
      Issue No: Vol. 35 (2017)
    • Authors: Liz Hamp-Lyons
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Liz Hamp-Lyons

      PubDate: 2017-12-13T09:30:53Z
      DOI: 10.1016/j.asw.2017.11.001
      Issue No: Vol. 34 (2017)
  • Exploring the relationship between textual characteristics and rating
           quality in rater-mediated writing assessments: An illustration with L1 and
           L2 writing assessments
    • Authors: Stefanie A. Wind; Catanya Stager; Yogendra J. Patil
      Pages: 1 - 15
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Stefanie A. Wind, Catanya Stager, Yogendra J. Patil
      Numerous researchers have explored the degree to which specific textual characteristics of student compositions are associated with high and low ratings, as well as differences in these relationships across subgroups of students (e.g., English language learners). These studies provide insight into rater judgments and the development of writing proficiency. However, the degree to which textual characteristics are associated with the psychometric quality of ratings is relatively unexplored. This study illustrates a procedure for exploring the influence of textual characteristics of essays on rating quality in the context of rater-mediated writing performance assessments in order to gain a more complete understanding of rating quality. Two illustrative datasets are used that reflect writing assessments for native English speakers and English language learners. The Coh-Metrix software program was used to obtain measures of textual characteristics, and the Partial Credit model was used to obtain indicators of rating quality. The relationship between essay features and rating quality was explored using correlation and profile analyses. Results suggested that rating quality varies across essays with different features, and the relationship between rating quality and essay features is unique to individual writing assessments. Implications are discussed as they relate to research and practice for rater-mediated writing assessments.

      PubDate: 2017-08-31T04:08:02Z
      DOI: 10.1016/j.asw.2017.08.003
      Issue No: Vol. 34 (2017)
  • Automated formative writing assessment using a levels of language
           framework
    • Authors: Joshua Wilson; Rod Roscoe; Yusra Ahmed
      Pages: 16 - 36
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Joshua Wilson, Rod Roscoe, Yusra Ahmed
      This study investigates a novel approach to conducting formative writing assessment that involves evaluating students' writing skills across three levels of language (word, sentence, and discourse) using automated measures of word choice, syntax, and cohesion. Writing from students in Grades 6 and 8 (n = 240 each) was analyzed using Coh-Metrix. Multigroup confirmatory factor analysis evaluated a hypothesized three-factor levels-of-language model, and multigroup structural equation modeling determined whether these factors predicted performance on a state writing achievement test comprising a Direct Assessment of Writing (DAW) and an Editing and Revising test (ER). Results indicated that a subset of 9 Coh-Metrix measures successfully modeled three latent levels of language factors at each grade level. Results also indicated that the DAW test was predicted by the latent Discourse factor and the ER test was predicted by the latent Discourse and Sentence factors. Findings provide a proof of concept for automated formative assessment using a levels of language framework. Furthermore, although not the primary goal of the study, results may lay the groundwork for new levels of language detection algorithms that could be incorporated within automated writing evaluation software programs to expand automated + teacher assessment and feedback approaches.

      PubDate: 2017-09-12T04:55:20Z
      DOI: 10.1016/j.asw.2017.08.002
      Issue No: Vol. 34 (2017)
  • Assessing C2 writing ability on the Certificate of English Language
           Proficiency: Rater and examinee age effects
    • Authors: Daniel R. Isbell
      Pages: 37 - 49
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Daniel R. Isbell
      Differentiating between advanced L2 writers at the higher levels of the Common European Framework of Reference (CEFR) presents a challenge in assessment. The distinctions between descriptors at the C1 and C2 levels are fine-grained, and even native speakers of a language may not consistently achieve them. At the same time, the CEFR has generally been conceived with the language abilities and contexts of use of adults in mind, thus making CEFR-based interpretations of young language learners’ abilities problematic. This study examines two issues in the assessment of C2-level writing in the context of the Certificate of English Language Proficiency (CELP) writing task: rater effects and examinee age. Interrater reliability and many-facet Rasch analysis showed that raters varied substantially in severity. CELP scoring procedures for rater disagreement partially mitigated severity differences. Contrary to expectations, age differentiated examinee abilities minimally and defied hypothesized ordering (i.e., that writing ability would increase with age). Additionally, some raters were found to demonstrate bias towards the youngest examinees. Specific implications for the CELP’s validity argument and broader implications for assessing young writers in CEFR terms are discussed.

      PubDate: 2017-09-12T04:55:20Z
      DOI: 10.1016/j.asw.2017.08.004
      Issue No: Vol. 34 (2017)
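Interrater reliability of the kind examined in the abstract above is often summarized with a chance-corrected agreement index such as Cohen's kappa (one common option; the study itself relies on many-facet Rasch analysis). A minimal sketch with hypothetical band scores from two raters:

```python
def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters' categorical scores.

    Assumes the raters jointly use at least two categories (otherwise the
    chance-agreement denominator is zero).
    """
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n  # raw agreement rate
    categories = set(r1) | set(r2)
    # Agreement expected by chance if each rater scored at their own base rates
    expected = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical C1/C2 level judgments from two raters for eight scripts
rater_a = ["C1", "C2", "C1", "C1", "C2", "C2", "C1", "C2"]
rater_b = ["C1", "C2", "C1", "C2", "C2", "C2", "C1", "C1"]
print(round(cohens_kappa(rater_a, rater_b), 3))
```

Kappa of 1.0 means perfect agreement; 0 means agreement no better than chance, which is why raw percent agreement alone can overstate reliability.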
  • Integrating assessment as, for, and of learning in a large-scale exam
           preparation course
    • Authors: Karim Sadeghi; Teymour Rahmati
      Pages: 50 - 61
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Karim Sadeghi, Teymour Rahmati
      This empirical study examined the validity of arguments regarding assessment integration tensions, strategies, and the potential of an integrated assessment model in enhancing students’ writing ability. To this end, an integrated assessment as, for, and of learning model was trialed with a group of learners preparing to take the Cambridge English: Preliminary English Test. Moreover, an assessment for and of learning (non-integrated) model was used with another group of candidates as the control group. Subsequently, the candidates’ writing performances measured by Cambridge Assessment in terms of overall band descriptions were converted into numerical indices. The Mann-Whitney U Test comparison of the participants’ converted scores revealed that the integrated assessment group performed better than the non-integrated assessment candidates. Furthermore, classroom observations and a focus-group interview with the integrated assessment group clarified a number of issues concerning assessment integration and assessment as learning (AaL) implementation tensions and strategies. The results indicated that an integrated assessment model tailored to contextual specifications can contribute both theoretically and practically to teaching and assessing writing.

      PubDate: 2017-10-02T07:45:10Z
      DOI: 10.1016/j.asw.2017.09.003
      Issue No: Vol. 34 (2017)
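The Mann-Whitney U statistic used in the group comparison above simply counts, over all cross-group pairs of scores, how often one group's score exceeds the other's, with ties counting half. A pure-Python sketch with hypothetical converted band scores (not the study's data; a real analysis would also need the p-value, e.g. from scipy.stats.mannwhitneyu):

```python
def mann_whitney_u(a, b):
    """U statistic for sample a: count of pairs (x, y) with x > y; ties add 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical converted writing scores for the two groups
integrated = [4, 5, 5, 4, 5, 3, 4, 5]
non_integrated = [3, 4, 3, 3, 4, 2, 3, 4]
u = mann_whitney_u(integrated, non_integrated)
print(u)  # a U well above n*m/2 = 32 favors the integrated group
```

Note that the two one-sided statistics always sum to n*m (here 64), which is a handy sanity check on any implementation.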
  • Design and evaluation of automated writing evaluation models:
           Relationships with writing in naturalistic settings
    • Authors: Brent Bridgeman; Chaitanya Ramineni
      Pages: 62 - 71
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Brent Bridgeman, Chaitanya Ramineni
      Automated Writing Evaluation (AWE) systems are built by extracting features from a 30-minute essay and using a statistical model that weights those features to optimally predict human scores on the 30-minute essays. But the goal of AWE should be to predict performance in real-world naturalistic tasks, not just to predict human scores on 30-minute essays. Therefore, a more meaningful way of creating the feature weights in the AWE model is to select weights that are optimized to predict the real-world criterion. This unique new approach was used in a sample of 194 graduate students who supplied two examples of their writing from required graduate school coursework. Contrary to results from a prior study predicting portfolio scores, the experimental model was no more effective than the traditional model in predicting scores on actual writing done in graduate school. Importantly, when the new weights were evaluated in large samples of international students, the population subgroups that were advantaged or disadvantaged by the new weights were different from the groups advantaged/disadvantaged by the traditional weights. It is critically important for any developer of AWE models to recognize that models that are equally effective in predicting an external criterion may advantage/disadvantage different groups.

      PubDate: 2017-11-01T05:59:31Z
      DOI: 10.1016/j.asw.2017.10.001
      Issue No: Vol. 34 (2017)
  • College student perceptions of writing errors, text quality, and author
    • Authors: Adam C. Johnson; Joshua Wilson; Rod D. Roscoe
      Pages: 72 - 87
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Adam C. Johnson, Joshua Wilson, Rod D. Roscoe
      Both conventional wisdom and empirical research suggest that errors in writing affect perceptions of both writing quality and characteristics of the author. Texts that exhibit poor spelling and grammar, or lack compelling arguments and clear structure, are perceived as lower quality. Moreover, the authors themselves may be perceived as less intelligent, creative, hardworking, or trustworthy. Using a within-subjects design, the current study systematically examined the effects of lower-level and higher-level errors on college students’ (n = 70) perceptions of multiple aspects of writing quality and author characteristics. Results demonstrated that students noticed both kinds of errors but were much more sensitive to lower-level errors than higher-level errors. Nearly identical patterns were observed for judgments of text quality and authors, and the sensitivity to lower-level errors was stronger for more-skilled readers. Implications for challenges and biases in peer assessment are discussed.

      PubDate: 2017-11-09T07:22:37Z
      DOI: 10.1016/j.asw.2017.10.002
      Issue No: Vol. 34 (2017)
  • Student and instructor perceptions of writing tasks and performance on
           TOEFL iBT versus university writing courses
    • Authors: Lorena Llosa; Margaret E. Malone
      Pages: 88 - 99
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Lorena Llosa, Margaret E. Malone
      This study examined student and instructor perceptions of writing tasks and performance on the TOEFL iBT versus university writing courses. Participants included 103 international, nonnative-English-speaking undergraduate students enrolled in required university writing courses and their writing instructors (n = 18). Students completed a background questionnaire, two TOEFL iBT writing tasks (one Integrated and one Independent), and a questionnaire about their perceptions of the writing tasks. The 18 instructors also completed a questionnaire, and six participated in an extended interview. Students and instructors reported that neither the Independent nor the Integrated task alone was representative of the types of writing they do in their writing course. However, the Independent and Integrated tasks together represented many of the characteristics of course assignments. Additionally, instructors perceived the criteria in the TOEFL iBT writing rubrics to be very similar to the criteria that they use in class to assess student writing, suggesting that TOEFL iBT tasks and course assignments are based on a similar operationalization of the writing construct. Finally, students and instructors generally perceived the quality of the writing produced for the TOEFL iBT writing tasks to be comparable to the quality of writing produced in course assignments.

      PubDate: 2017-11-16T13:50:59Z
      DOI: 10.1016/j.asw.2017.09.004
      Issue No: Vol. 34 (2017)
  • Reclaiming Accountability: Improving Writing Programs through
           Accreditation and Large-Scale Assessments, W. Sharer, T.A. Morse, M.F.
           Eble, W.P. Banks. Utah State University Press (2016), ISBN:
    • Authors: Ashley Velazquez
      Pages: 100 - 102
      Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34
      Author(s): Ashley Velazquez

      PubDate: 2017-12-13T09:30:53Z
      DOI: 10.1016/j.asw.2017.09.001
      Issue No: Vol. 34 (2017)
  • The TOEFL iBT writing: Korean students’ perceptions of the TOEFL iBT
           writing test
    • Authors: Eun-Young Julia Kim
      Pages: 1 - 11
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Eun-Young Julia Kim
      The TOEFL is one of the most widely recognized language proficiency tests developed to measure international students’ level of readiness for degree study. Whereas a number of correlational studies have been conducted by various affiliates of ETS based on large-scale quantitative data, there is a dearth of studies that explore test-takers’ perceptions and experiences concerning the TOEFL iBT. Writing skills are of paramount importance for academic success, and high-stakes tests such as the TOEFL tend to influence test-takers’ perceptions of what defines good academic writing. To date, no research has specifically focused on test-takers’ perceptions of the writing section of the TOEFL iBT. To fill this gap, this study analyzes online forum data to explore Korean students’ perceptions of effective strategies for preparing for the TOEFL iBT writing test, the challenges they face in the test-taking and test-preparation processes, and the implications such findings have for various stakeholders. Findings indicate that scores on the writing section of the TOEFL iBT, albeit helpful as an initial benchmarking tool, may conceal more than they reveal about Korean students’ academic writing ability. The study suggests that the format, questions, and scoring of the TOEFL iBT writing test be critically examined from test-takers’ perspectives.

      PubDate: 2017-03-05T12:58:57Z
      DOI: 10.1016/j.asw.2017.02.001
      Issue No: Vol. 33 (2017)
  • Assessing peer and instructor response to writing: A corpus analysis from
           an expert survey
    • Authors: Ian G. Anson; Chris M. Anson
      Pages: 12 - 24
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Ian G. Anson, Chris M. Anson
      Over the past 30 years, considerable scholarship has critically examined the nature of instructor response on written assignments in the context of higher education (see Straub, 2006). However, as Haswell (2008) has noted, less is currently known about the nature of peer response, especially as it compares with instructor response. In this study, we critically examine some of the properties of instructor and peer response to student writing. Using the results of an expert survey that provided a lexically-based index of high-quality response, we evaluate a corpus of nearly 50,000 peer responses produced at a four-year public university. Combined with the results of this survey, a large-scale automated content analysis shows first that instructors have adopted some of the field's lexical estimation of high-quality response, and second that student peer response reflects the early acquisition of this lexical estimation, although at further remove from their instructors. The results suggest promising directions for the parallel improvement of both instructor and peer response.

      PubDate: 2017-03-18T08:19:07Z
      DOI: 10.1016/j.asw.2017.03.001
      Issue No: Vol. 33 (2017)
  • Understanding university students’ peer feedback practices in EFL
           writing: Insights from a case study
    • Authors: Shulin Yu; Guangwei Hu
      Pages: 25 - 35
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Shulin Yu, Guangwei Hu
      While research on peer feedback in the L2 writing classroom has proliferated over the past three decades, only limited attention has been paid to how students respond to their peers’ writing in specific contexts and why they respond in the ways they do. As a result, much remains to be known about how individual differences and contextual influences shape L2 students’ peer feedback practices. To bridge the research gap, this case study examines two Chinese EFL university students’ peer feedback practices and the factors influencing their feedback practices. Analyses of multiple sources of data including interviews, video recordings of peer feedback sessions, stimulated recalls, and texts reveal that the students took markedly different approaches when responding to their peers’ writing. The findings also indicate that their peer feedback practices were situated in their own distinct sociocultural context and mediated by a myriad of factors including beliefs and values, motives and goals, secondary school learning and feedback experience, teacher feedback practices, feedback training, feedback group dynamics, as well as learning and assessment culture.

      PubDate: 2017-04-08T21:03:50Z
      DOI: 10.1016/j.asw.2017.03.004
      Issue No: Vol. 33 (2017)
  • Similarities and differences in constructs represented by U.S. States’
           middle school writing tests and the 2007 national assessment of
           educational progress writing assessment
    • Authors: Ya Mo; Gary A. Troia
      Pages: 48 - 67
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Ya Mo, Gary A. Troia
      Little is known regarding the underlying constructs of writing tests used by U.S. state education authorities and national governments to evaluate the writing performance of their students, especially in middle school grades. Through a content analysis of 78 prompts and 35 rubrics from 27 states’ middle school writing assessments from 2001 to 2007, and three representative prompts and rubrics from the United States’ 2007 National Assessment of Educational Progress (NAEP) writing test, this study illuminates the writing constructs underlying large-scale writing assessments through examination of features in prompts and rubrics and investigation of the connections between prompts and rubrics in terms of genre demands. We found the content of state writing assessments and the NAEP align with respect to measurement parameters associated with (a) emphasis on writing process, audience awareness, and topic knowledge, (b) availability of procedural facilitators (e.g., checklists, rubrics, dictionaries) to assist students in their writing, and (c) inclusion of assessment criteria focused on organization, structure, content, details, sentence fluency, semantics, and general conventions. However, the NAEP’s writing assessment differs from many state tests of writing by including explicit directions for students to review their writing, giving students two timed writing tasks rather than one, making informational text production one of the three genres assessed, and including genre-specific evaluative components in rubrics. This study contributes to our understanding of the direction and path that large-scale writing assessments in the US are taking and how writing assessments are continually evolving.

      PubDate: 2017-06-27T08:22:38Z
      DOI: 10.1016/j.asw.2017.06.001
      Issue No: Vol. 33 (2017)
  • To make a long story short: A rubric for assessing graduate students’
           academic and popular science writing skills
    • Authors: Tzipora Rakedzon; Ayelet Baram-Tsabari
      Pages: 28 - 42
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Tzipora Rakedzon, Ayelet Baram-Tsabari
      Graduate students are future scientists, and as such, being able to communicate science is imperative for their integration into the scientific community. This is primarily achieved through scientific papers, mostly published in English; however, interactions outside of academia are also beneficial for future scientists. Therefore, academic writing courses are prevalent and popular science communication courses are on the rise. Nevertheless, no rubrics exist for assessing students' writing in academic and science communication courses. This article describes the development and testing of a rubric for assessing advanced L2 STEM graduate students’ writing in academic (abstract) and popular science writing (press release). The rubric was developed as part of a longstanding academic writing course, but was modified to include a module on science communication with the lay public. Analysis of student needs and the literature inspired a pre-pilot that assessed 16 descriptors on 60 student works. A subsequent, adjusted pilot version on 30 students resulted in adaptations to fit each genre and course goals. In the third round, a modified, final rubric tested on 177 graduate students was created that can be used for both assessment and comparison of the genres. This rubric can assess scientific genres at the graduate level and can be adapted for other genres and levels.

      PubDate: 2017-01-07T04:12:57Z
      DOI: 10.1016/j.asw.2016.12.004
      Issue No: Vol. 32 (2017)
  • Checking assumed proficiency: Comparing L1 and L2 performance on a
           university entrance test
    • Authors: Bart Deygers; Kris Van den Branden; Elke Peters
      Pages: 43 - 56
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Bart Deygers, Kris Van den Branden, Elke Peters
      This study compares the results of three groups of participants on the writing component of a centralised L2 university entrance test at the B2 level in Flanders, Belgium. The study investigates whether all Flemish candidates have a B2 level in Dutch upon university entrance, and whether L1 test takers outperform L2 candidates who learned Dutch at home or in Flanders. The results show that, even though the Flemish group outperformed both groups of L2 candidates, not all Flemish candidates reached the B2 level. Additionally, the study compares the results of two groups of L2 users on the same test and shows that candidates who studied Dutch in a Dutch-speaking context do not necessarily outscore candidates who did not. The primary methods of analysis include non-parametric regression and many-facet Rasch measurement. The results are interpreted in terms of Hulstijn’s conceptualisation of Higher Language Competence and the study-abroad literature. Implications for university entrance policy are discussed at the end of the paper.

      PubDate: 2017-01-07T04:12:57Z
      DOI: 10.1016/j.asw.2016.12.005
      Issue No: Vol. 32 (2017)
  • The effectiveness of instructor feedback for learning-oriented language
           assessment: Using an integrated reading-to-write task for English for
           academic purposes
    • Authors: Ah-Young (Alicia) Kim; Hyun Jung Kim
      Pages: 57 - 71
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Ah-Young (Alicia) Kim, Hyun Jung Kim
      Learning-oriented language assessment (LOLA) can be effective in promoting learning through assessment by creating a link between the two. Although previous studies have examined the effectiveness of feedback – a major element of LOLA – in L2 writing, few have examined how LOLA could be implemented using an integrated reading-to-write task in English for academic purposes (EAP) contexts, which was the objective of this study. Participants were ten Korean TESOL graduate students taking a research methods course and their professor. During a seven-week period, each student completed a weekly integrated reading-to-write task as part of their classroom assessment – they read an academic research paper on a topic of their choice and wrote a review of it. After receiving feedback from the instructor, students revised their work and resubmitted it the following week. Students and the instructor also participated in a semi-structured interview to discuss the effectiveness of learning-oriented feedback on academic reading-to-write tasks. Learners displayed varying developmental patterns, with some students showing more improvement than others. The findings highlighted two participants’ progress in the content domain. Qualitative analysis results suggest that the students reacted differently to the instructor feedback, leading to varying degrees of writing enhancement. The results provide pedagogical implications for using integrated academic reading-to-write tasks and sustained feedback for LOLA.

      PubDate: 2017-01-22T04:23:23Z
      DOI: 10.1016/j.asw.2016.12.001
      Issue No: Vol. 32 (2017)
  • Textual voice elements and voice strength in EFL argumentative writing
    • Authors: Hyung-Jo Yoon
      Pages: 72 - 84
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Hyung-Jo Yoon
      This study examined how the quantity and diversity of textual voice elements contribute to holistic voice strength and essay quality. For the quantification of voice elements, this study used an automated processing tool, the Authorial Voice Analyzer (AVA), which was developed based on categories from Hyland’s voice model (i.e., hedges, boosters, attitude markers, self-mentions, reader pronouns, and directives). To explore the relationship between textual voice elements and holistic voice strength, as well as between voice elements and essay quality, this study analyzed 219 argumentative essays written by L1 Greek-speaking EFL students. The results suggested positive, but weak to moderate, correlations between textual voice and holistic voice strength; a regression model with three textual voice features explained 26% of the variance in voice strength scores. The results also indicated weak correlations between textual voice and essay quality. Interestingly, the textual voice features contributing to voice strength (boosters, attitude markers, and self-mentions) were different from those contributing to essay quality (hedges). Interpreting these findings in relation to the context (timed argumentative writing in an EFL context), this study suggests implications for L2 writing assessment and pedagogy.
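The regression reported above (three textual voice features predicting holistic voice-strength scores, R² = 0.26) can be sketched as follows. The feature matrix and scores are synthetic stand-ins; only the feature names (boosters, attitude markers, self-mentions) and the sample size of 219 essays come from the abstract:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_essays = 219  # same sample size as the study; all values below are synthetic

# Columns: per-essay counts of boosters, attitude markers, self-mentions
X = rng.poisson(lam=[6.0, 4.0, 3.0], size=(n_essays, 3)).astype(float)

# Synthetic holistic voice-strength scores, loosely tied to the features plus noise
y = 0.25 * X[:, 0] + 0.20 * X[:, 1] + 0.30 * X[:, 2] + rng.normal(scale=2.0, size=n_essays)

model = LinearRegression().fit(X, y)
r_squared = model.score(X, y)  # proportion of variance explained, as in the abstract
print(f"R^2 = {r_squared:.2f}")
```

A weak-to-moderate R² like the study's 0.26 means most variance in holistic voice judgments is left unexplained by the counted features, which is consistent with the abstract's cautious interpretation.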

      PubDate: 2017-03-05T12:58:57Z
      DOI: 10.1016/j.asw.2017.02.002
      Issue No: Vol. 32 (2017)
  • Assessing Writing, Teaching Writers: Putting the Analytic Writing
           Continuum to Work in Your Classroom, M.A. Smith, S.S. Swain. Teachers
           College Press, New York (2017)
    • Authors: Les Perelman
      Pages: 126 - 129
      Abstract: Publication date: Available online 22 September 2017
      Source:Assessing Writing
      Author(s): Les Perelman

      PubDate: 2017-09-25T07:08:06Z
      DOI: 10.1016/j.asw.2016.12.003
      Issue No: Vol. 31 (2017)
  • Writing Assessment and the Revolution in Digital Texts and Technologies,
           M. Neal., Teachers College Press, New York (2010). 152 pp., ISBN
    • Authors: Les Perelman
      Pages: 126 - 129
      Abstract: Publication date: January 2017
      Source:Assessing Writing, Volume 31
      Author(s): Les Perelman

      PubDate: 2017-01-07T04:12:57Z
      DOI: 10.1016/j.asw.2016.12.003
      Issue No: Vol. 31 (2017)
  • Miroslaw Pawlak (ed.) Error Correction in the Foreign Language Classroom:
           Reconsidering the Issues. Springer-Verlag, Berlin, Heidelberg (2015).
    • Authors: Shima Ghahari
      Abstract: Publication date: Available online 12 December 2017
      Source:Assessing Writing
      Author(s): Shima Ghahari

      PubDate: 2017-12-13T09:30:53Z
  • Ed.Board/Aims and scope
    • Abstract: Publication date: October 2017
      Source:Assessing Writing, Volume 34

      PubDate: 2017-12-13T09:30:53Z
  • Ed.Board/Aims and scope
    • Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33

      PubDate: 2017-09-25T07:08:06Z
  • Editorial for ASSESSING WRITING Vol 33 2017
    • Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33

      PubDate: 2017-09-25T07:08:06Z
  • Tools and Tech: A New Forum
    • Authors: Laura L. Aull
      Abstract: Publication date: Available online 24 July 2017
      Source:Assessing Writing
      Author(s): Laura L. Aull

      PubDate: 2017-07-27T11:06:25Z
      DOI: 10.1016/j.asw.2017.06.003
  • Social Justice and Educational Measurement: John Rawls, the History of
           Testing, and the Future of Education, Z. Stein. Routledge (2016). 220 pp.,
           ISBN 978-1138947009
    • Authors: J.W. Hammond
      Abstract: Publication date: Available online 11 July 2017
      Source:Assessing Writing
      Author(s): J.W. Hammond

      PubDate: 2017-07-21T10:40:56Z
  • Ed.Board/Aims and scope
    • Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32

      PubDate: 2017-04-29T22:00:21Z
  • First Year University Writing: A Corpus Based Study with Implications for
           Pedagogy, L. Aull. Palgrave Macmillan (2015), 239
    • Authors: Mark Chapman
      Abstract: Publication date: April 2017
      Source:Assessing Writing, Volume 32
      Author(s): Mark Chapman

      PubDate: 2017-04-29T22:00:21Z
  • Evaluating rater accuracy and perception for integrated writing
           assessments using a mixed-methods approach
    • Authors: Jue Wang; George Engelhard; Kevin Raczynski; Tian Song; Edward W. Wolfe
      Abstract: Publication date: July 2017
      Source:Assessing Writing, Volume 33
      Author(s): Jue Wang, George Engelhard, Kevin Raczynski, Tian Song, Edward W. Wolfe
      Integrated writing (IW) assessments underscore the connections between reading comprehension and writing skills. These assessments typically include rater-mediated components. Our study identified IW essays that are difficult to score accurately and then investigated the reasons based on rater perceptions and judgments. The IW assessments in our data are used as formative assessments designed to provide information on the developing literacy of students. We used a mixed-methods approach, with rater accuracy defined quantitatively based on Rasch measurement theory, and a survey-based qualitative method designed to investigate rater perceptions of and judgments about student essays within the context of IW assessments. The quantitative analyses suggest that the essays and raters vary along a continuum designed to represent rating accuracy. The qualitative analyses suggest that raters' perceptions of certain features of essays, such as the amount of textual borrowing, the development of ideas, and the consistency of the focus, were inconsistent with those of the experts. The implications of this study for research on and practice of IW assessments are discussed.

      PubDate: 2017-04-15T21:25:34Z
  • Ed.Board/Aims and scope
    • Abstract: Publication date: January 2017
      Source:Assessing Writing, Volume 31

      PubDate: 2017-01-07T04:12:57Z