Modeling the Discrimination Power of Physics Items
Tailoring physics instruction to students' needs and abilities requires exploring the knowledge structure of students at different ability levels. Precisely differentiating the successive, characteristic states of student achievement in turn requires test items with appropriate discrimination power. By identifying the cognitive factors that account for differences and similarities between high achievers and low achievers, we can evaluate the efficacy of developing various aspects of physics competence within physics instruction. Furthermore, knowing the predictors of physics item discrimination power makes it possible to systematically modify physics items so as to improve their psychometric characteristics. In this study, we conducted a secondary analysis of data from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. First, we performed a content analysis of the 123 physics items included in these assessments. We then created an item database in which items were described mainly by variables intended to reflect basic cognitive-domain characteristics of high and low achievers. For each item, we calculated the item discrimination power. Finally, we created a regression model of physics item discrimination power. The model shows that 43.6% of the variance in item discrimination power can be explained by factors reflecting the automaticity, complexity, and modality of the knowledge structure relevant for generating the most probable correct solution, as well as by the constructs of cognitive load and retention. Interference effects between intuitive and formal physics knowledge structures also proved to influence item discrimination power.
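The abstract does not specify which discrimination statistic was used. A common classical choice, which the sketch below assumes (the data and the 27% cutoff are illustrative, following Kelley's upper-lower convention, not the authors' actual procedure), is the upper-lower discrimination index D: the difference between the item's proportion correct in the top-scoring and bottom-scoring groups of examinees.

```python
# Illustrative sketch, NOT the authors' code: classical upper-lower
# discrimination index for a dichotomously scored (0/1) item.
# Assumption: examinees are split into top and bottom 27% by total
# test score (Kelley's convention); D = p_upper - p_lower.

def discrimination_index(total_scores, item_scores, fraction=0.27):
    """Upper-lower discrimination index for one item.

    total_scores: total test score per examinee
    item_scores:  0/1 score on this item per examinee (same order)
    """
    n = len(total_scores)
    k = max(1, round(n * fraction))  # size of upper/lower group
    # Rank examinees by total test score, best first
    order = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    upper, lower = order[:k], order[-k:]
    p_upper = sum(item_scores[i] for i in upper) / k
    p_lower = sum(item_scores[i] for i in lower) / k
    return p_upper - p_lower

# Hypothetical data: 10 examinees' total scores and their 0/1 item scores
totals = [18, 15, 14, 13, 12, 10, 9, 8, 6, 4]
item   = [ 1,  1,  1,  1,  0,  1, 0, 0, 0, 0]
print(round(discrimination_index(totals, item), 2))  # -> 1.0
```

An index near 0 (or negative) flags an item that fails to separate high from low achievers; regressing such indices on content-analysis variables is what yields a variance-explained figure like the 43.6% reported above.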