DOES TASK FORMAT MATTER? AN EMPIRICAL STUDY OF THE USE OF MULTIPLE-CHOICE AND OPEN-ENDED TASKS IN GEOMETRY TEACHING
Abstract
This paper examines the use of multiple-choice and open-ended tasks in geometry teaching in the upper grades of primary school, with particular attention to the strategies students use to solve them. The sample consisted of 889 students from the fifth to the eighth grade. The results show that, as expected, students perform better on multiple-choice tasks than on open-ended tasks, which can be attributed to the use of guessing strategies. At the same time, students largely fail to exploit the range of solution strategies that this task format allows. The study indicates that both task formats have distinct advantages in the learning process and that a deliberate choice of format can contribute to more effective teaching.
