Research in Science & Technology Education

Development and Pilot Implementation of an Instrument Measuring the Scientific Literacy of Primary School Students


Aikaterini Sargioti
Anastasios Emvalotis
Abstract

This paper presents part of the methodology and validation procedures for an instrument measuring the scientific literacy of students in the final grade of primary school, together with the prospect of identifying their levels of scientific literacy. The pilot study, whose preliminary results are reported here, involved a sample of 260 students who completed a purpose-built electronic questionnaire. The results of the validation procedures establish the instrument's face validity and content validity and classify the students into three distinct levels of scientific literacy (low, medium, high).
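The content-validity evidence mentioned above is commonly quantified with a Content Validity Index (CVI) computed from expert relevance ratings. The abstract does not state the authors' exact procedure, so the following is a minimal illustrative sketch of the standard I-CVI / S-CVI/Ave calculation, with made-up ratings:

```python
# Illustrative sketch (not the authors' actual data or procedure):
# Content Validity Index from expert relevance ratings on a 1-4 scale,
# where ratings of 3 or 4 count as "relevant".

def i_cvi(item_ratings):
    """Item-level CVI: fraction of experts rating the item 3 or 4."""
    relevant = sum(1 for r in item_ratings if r >= 3)
    return relevant / len(item_ratings)

def s_cvi_ave(ratings):
    """Scale-level CVI: average of the item-level CVIs."""
    cvis = [i_cvi(item) for item in ratings]
    return sum(cvis) / len(cvis)

# Hypothetical ratings: rows = items, columns = five expert judges.
ratings = [
    [4, 4, 3, 4, 3],  # item 1: all five experts rate it relevant
    [4, 3, 2, 4, 4],  # item 2: one expert rates it not relevant
    [3, 4, 4, 3, 4],  # item 3: all five experts rate it relevant
]
print([round(i_cvi(item), 2) for item in ratings])  # → [1.0, 0.8, 1.0]
print(round(s_cvi_ave(ratings), 2))                 # → 0.93
```

Items whose I-CVI falls below a chosen threshold (often 0.78 for panels of this size) would be revised or dropped before the pilot administration.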

Article Details
  • Section: Research Article