The Analysis of Rubric Feasibility Using Video Snippets of Learning Process

Ghullam Hamdu, Iis Suryani

Abstract


Multiple interpretations of a rubric's descriptors can lower its quality. This research aims to develop appropriate rubric descriptions for measuring students' scientific attitudes. The rubric was developed using video snippets of the learning process. A total of 23 observers, all final-year students, analyzed the video snippets and assessed eight indicators of scientific attitude for ten students. A rubric descriptor was considered feasible when at least 85% of the observers gave the same assessment of a student's scientific attitude as shown in the video snippets. The rubric descriptors were analyzed descriptively, and two descriptors fell short of the 85% agreement threshold; this research therefore focuses on those two descriptors. The findings imply that rubric descriptions need to clearly specify the time of observation as well as the number of students and behaviors to be observed.


Keywords


feasibility analysis; learning process; rubrics; scientific attitude; video snippets






DOI: https://doi.org/10.53400/mimbar-sd.v6i2.14150





Copyright (c) 2019 Mimbar Sekolah Dasar

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
