Hybrid journal (may contain Open Access articles). ISSN (Print): 2049-5986; ISSN (Online): 2049-5994. Published by Inderscience Publishers.
Authors: Lan Luong, Arthur Poropat, Helen Klieve, Kate Thompson
Pages: 269-292
Abstract: The massification of higher education in Vietnam has brought both achievements and problems. Efforts have been made to improve the quality of teaching and learning in higher education, but very little attention has been paid to student motivation. This study contributes to this area of knowledge by testing the applicability of the Academic Motivation Scale (AMS) for assessing Vietnamese university students' motivation. This was achieved by evaluating the scale's psychometric properties using data from 648 first-year students at a high-ranking university. Taking a different approach to model testing, the results indicated that a revised seven-factor AMS with 23 items best fitted the data, and all subscales showed satisfactory reliability. The revised AMS can therefore be used to study Vietnamese university students' motivation.
Keywords: academic motivation; academic motivation scale; Vietnam; university; self-determination theory; Vietnamese university students
Citation: International Journal of Quantitative Research in Education, Vol. 4, No. 4 (2019), pp. 269-292
PubDate: 2019-06-14T23:20:50-05:00
DOI: 10.1504/IJQRE.2019.100167
Issue No.: Vol. 4, No. 4 (2019)
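This entry reports satisfactory subscale reliabilities but, being an abstract, gives no analysis code. For reference, subscale reliability of this kind is most commonly summarised with Cronbach's alpha; the sketch below is a minimal illustration of that statistic on simulated Likert data, not the authors' implementation. The cronbach_alpha function name, the four-item subscale, and the simulated 1-7 responses are assumptions for demonstration only.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of subscale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the subscale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 648 respondents, one 4-item subscale on a 1-7 Likert scale,
# generated from a shared latent trait so the items are positively correlated.
rng = np.random.default_rng(0)
trait = rng.normal(4, 1, size=(648, 1))
scores = np.clip(np.rint(trait + rng.normal(0, 1, size=(648, 4))), 1, 7)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```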
Authors: Ki Cole, Sohee Kim, Mwarumba Mwavita
Pages: 293-309
Abstract: The aim of this study was to compare linear equating (Levine and Tucker), equipercentile equating, and true and observed score IRT equating with fixed item parameter calibration methods when applied to non-equivalent groups taking forms which have various multidimensional structures: equal or unequal total test difficulty and similar or dissimilar difficulty within dimensions across forms. This situation may be common for large-scale test forms that are composed of multiple sub-content areas and are being administered to examinees of mixed abilities at different times. Within the specifications of this study, when forms differ in total difficulty but the difference in difficulty within dimensions is consistent across forms, the linear methods may be preferred. When forms differ in average item difficulty within dimensions, regardless of equal or unequal total test difficulty, the IRT method with fixed item parameter calibration is favoured when data are correlated.
Keywords: equating; psychometrics; test construction; fixed item parameter calibration; multidimensional data; non-equivalent anchor test
Citation: International Journal of Quantitative Research in Education, Vol. 4, No. 4 (2019), pp. 293-309
PubDate: 2019-06-14T23:20:50-05:00
DOI: 10.1504/IJQRE.2019.100168
Issue No.: Vol. 4, No. 4 (2019)
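For readers unfamiliar with the linear methods compared in this study, Tucker and Levine equating share the same final functional form: once synthetic-population moments have been estimated from the anchor test, a Form X score is mapped onto the Form Y scale so that standardised scores match. The sketch below illustrates only that final transformation with made-up moments; it is not the study's code, and the moment-estimation step (where Tucker and Levine differ) is omitted.

```python
import numpy as np

def linear_equate(x, mu_x, sd_x, mu_y, sd_y):
    """Map a Form X score onto the Form Y scale so that standardised scores
    agree: (x - mu_x) / sd_x = (y - mu_y) / sd_y, solved for y."""
    return sd_y / sd_x * (np.asarray(x, dtype=float) - mu_x) + mu_y

# Hypothetical synthetic-population moments; in Tucker or Levine equating these
# would be estimated from the two non-equivalent groups via the anchor test.
print(linear_equate(x=25, mu_x=22.0, sd_x=5.0, mu_y=24.0, sd_y=5.5))  # 27.3
```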
Authors: Iftikhar Yusuf Al-Ariqi, Jagannath K. Dange, Mir Mohsin
Pages: 310-331
Abstract: Studies have shown that the best way to test students' reading comprehension ability is with multiple-choice questions (MCQs), owing to their validity and reliability. This paper assesses item and test quality in order to explore the relationship of the difficulty index (p-value) and discrimination index (DI) with distractor efficiency (DE). The study was conducted among 134 second-year Yemeni EFL students at Sana'a University, Yemen. Twenty MCQs were analysed for p-value, DI and DE. Results indicated that the mean p-value and DI were 61.92 ± 25.1% and 0.31 ± 0.27, respectively. DI was maximal for p-values between 40% and 60%. Combining the two indices, 19 items could be called 'good', having a p-value between 20% and 90% and a DI ≥ 0.40. Overall, 75% of items had two non-functional distractors (NFDs), while 20% of items had three functional distractors and 5% had only one functional distractor.
Keywords: difficulty index; discrimination index; distractor efficiency; item analysis; multiple-choice questions; MCQs; non-functional distractor; NFD
Citation: International Journal of Quantitative Research in Education, Vol. 4, No. 4 (2019), pp. 310-331
PubDate: 2019-06-14T23:20:50-05:00
DOI: 10.1504/IJQRE.2019.100169
Issue No.: Vol. 4, No. 4 (2019)
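As context for the indices reported in this abstract: the difficulty index is the proportion of examinees answering an item correctly, the discrimination index contrasts upper and lower scoring groups (conventionally the top and bottom 27%), and a distractor is usually treated as non-functional when chosen by fewer than 5% of examinees. The sketch below applies these standard definitions to simulated responses; the item_analysis function, its default thresholds, and the data are illustrative assumptions, not the study's procedure.

```python
import numpy as np

def item_analysis(responses, key, options=("A", "B", "C", "D"),
                  group_frac=0.27, nfd_threshold=0.05):
    """Classical item analysis for one MCQ.

    responses: option chosen by each examinee, ordered from highest to lowest
    total test score. Returns the difficulty index p (proportion correct), the
    discrimination index DI (upper-group minus lower-group proportion correct),
    and the count of non-functional distractors (chosen by < 5% of examinees).
    """
    responses = np.asarray(responses)
    g = max(1, int(round(group_frac * len(responses))))  # upper/lower group size
    correct = responses == key
    p = correct.mean()
    di = correct[:g].mean() - correct[-g:].mean()
    nfd = sum((responses == opt).mean() < nfd_threshold
              for opt in options if opt != key)
    return p, di, nfd

# Hypothetical data: 134 examinees ordered by total score; higher-ranked
# examinees are more likely to choose the key 'A'.
rng = np.random.default_rng(1)
n = 134
p_correct = np.linspace(0.9, 0.3, n)
resp = np.where(rng.random(n) < p_correct, "A",
                rng.choice(list("BCD"), size=n, p=[0.6, 0.3, 0.1]))
p, di, nfd = item_analysis(resp, key="A")
print(f"p = {p:.2f}, DI = {di:.2f}, non-functional distractors = {nfd}")
```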
Authors: Samantha E. Robinson, Joon Jin Song
Pages: 332-353
Abstract: Monitoring student performance throughout an academic semester can benefit students and educators alike by motivating invaluable course redesign, effective student intervention, and practical methodologies for classroom enrichment. Student academic performance systems (SAPS), analytical tools to track student progress, can enhance both learning and academic development. However, these monitoring systems need to be effective, easy to implement, clear to interpret, and based on a framework flexible enough to adjust to any educational level or course style. We propose a SAPS designed to monitor student performance, demonstrate that it is an efficient and complementary tool for educators, and argue that some form of SAPS should be incorporated into every classroom at all instructional levels.
Keywords: educational strategies; student performance monitoring; student evaluation; statistical surveillance; statistical classification techniques; hybrid classification; logistic regression; classification and regression trees; CART methods
Citation: International Journal of Quantitative Research in Education, Vol. 4, No. 4 (2019), pp. 332-353
PubDate: 2019-06-14T23:20:50-05:00
DOI: 10.1504/IJQRE.2019.100170
Issue No.: Vol. 4, No. 4 (2019)
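The abstract does not describe the SAPS machinery, but its keywords point to logistic regression and CART as the underlying classifiers. The sketch below shows one way such a hybrid early-warning flag might look on simulated in-semester data; the features, the simulated labels, and the probability-averaging rule are assumptions for illustration only, not the authors' proposed system.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical in-semester signals for 300 students: quiz average, attendance
# rate, and homework completion rate, plus a simulated "at risk" label.
rng = np.random.default_rng(2)
n = 300
X = np.column_stack([
    rng.uniform(0, 100, n),    # quiz average
    rng.uniform(0.5, 1.0, n),  # attendance rate
    rng.uniform(0.4, 1.0, n),  # homework completion rate
])
y = ((0.02 * X[:, 0] + 2 * X[:, 1] + 2 * X[:, 2]
      + rng.normal(0, 0.5, n)) < 3.4).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

# A simple hybrid rule: average the two models' predicted risk probabilities
# and flag students whose combined risk exceeds 0.5 for early intervention.
risk = (logit.predict_proba(X_te)[:, 1] + cart.predict_proba(X_te)[:, 1]) / 2
print(f"flagged {(risk > 0.5).sum()} of {len(X_te)} students for intervention")
```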