Developing the Rubric for Evaluating Problem Posing (REPP) (Pages: 8 - 25)

Year-Number: 2016-Volume 8, Issue 1

Abstract

Problem posing means either posing new problem situations from given conditions or reformulating an already written problem. Evaluating the posed problems is equally important, and rubrics can be used for these evaluations. Rubrics are graded scoring keys typically used for assessing students' work. Accordingly, the purpose of this study is to develop a rubric that can be used to evaluate the problems posed by students. To this end, the study was carried out with 29 seventh-grade students attending a middle school. The data collection tools were worksheets, in-class discussions, and the related literature. The data were analyzed descriptively. As a result of the analysis, an analytical rubric composed of four dimensions was developed: the text of the problem (language and expression), the compatibility of the problem with mathematical principles, the type/structure of the problem, and the solvability of the problem. In the reliability analysis, the inter-rater agreement index was 0.720, the Kappa value 0.700, the Pearson correlation coefficient r = 0.873, the Spearman correlation coefficient r = 0.865, and the Cronbach's alpha coefficient 0.932; the intra-rater agreement index was 0.830, the Kappa value 0.817, the Pearson correlation coefficient r = 0.921, the Spearman correlation coefficient r = 0.912, and the Cronbach's alpha coefficient 0.959.
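The abstract reports several agreement and consistency measures for the rubric's raters. As a rough illustration of how such statistics relate to one another, the sketch below computes a simple agreement index, Cohen's kappa, and the Pearson and Spearman correlations for two hypothetical raters scoring ten posed problems on a four-level rubric scale. The rater scores are invented for illustration only; they are not the study's data, and the study's own computation details are not reproduced here.

```python
# Two hypothetical raters scoring the same ten posed problems (1-4 scale).
# These values are invented for illustration; they are NOT the study's data.

def mean(xs):
    return sum(xs) / len(xs)

def pearson(x, y):
    # Pearson product-moment correlation between two equal-length lists.
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(xs):
    # Assign 1-based ranks, averaging ranks over tied values.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Spearman's rho is Pearson's r computed on the ranks.
    return pearson(ranks(x), ranks(y))

def cohen_kappa(x, y):
    # Chance-corrected agreement between two raters over categorical scores.
    n = len(x)
    cats = sorted(set(x) | set(y))
    po = sum(a == b for a, b in zip(x, y)) / n          # observed agreement
    pe = sum((x.count(c) / n) * (y.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

rater1 = [4, 3, 2, 4, 1, 3, 4, 2, 3, 1]
rater2 = [4, 3, 2, 3, 1, 3, 4, 2, 2, 1]

# Raw proportion of identical scores (the simplest agreement index).
agreement = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)
print(f"agreement index: {agreement:.3f}")
print(f"Cohen's kappa:   {cohen_kappa(rater1, rater2):.3f}")
print(f"Pearson r:       {pearson(rater1, rater2):.3f}")
print(f"Spearman rho:    {spearman(rater1, rater2):.3f}")
```

Note how kappa is lower than the raw agreement index: it discounts agreement that would be expected by chance alone, which is why studies like this one report both.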

Keywords



