- Year 2023
- NSF Noyce Award # 1849948
- First Name Jeremy
- Last Name Zelkowski
- Discipline Mathematics
Jim Gleason, Martha Makowski, Bill Bergeron
Melinda Williams, Hillcrest Middle School; Alicia Ware, Northridge High School; Andy Hamric, Northside High School; Jeremy Zelkowski
Our poster will present early results relating the first cohort’s National Board outcomes to our classroom observation protocol results. These results are important in guiding teachers pursuing National Board Certification toward the practices that are most important to attend to as they grow their practice. There is little in the literature using quantitative instruments with validity evidence about teachers’ classroom practices as they relate to improved teaching outcomes. Our early work addresses parts (a)-(c) with this poster.
The research questions for the early analyses within the project seek to understand the relationship between multiple classroom observations of mathematical practice and the outcomes on National Board Components 2 and 3. Specifically, we ask: What is the relationship between the MCOP2 Student Engagement and Teacher Facilitation factors from observation ratings and National Board Component 2 and 3 scores?
Over the last decade and more, a group of nearly 50 mathematics education researchers from across the country has been examining the validity of measures in mathematics education. A subset of this national group has compiled and coded the validity evidence of “teacher education instruments” in articles published in 24 field journals that aimed to measure construct domains of affect (e.g., beliefs, practices, ability) but not knowledge. The framework of their work is predicated on the joint standards publication of AERA, APA, & NCME (2014) for measurement in education, in which six types of validity evidence are included and nicely summarized in Krupa et al. (2019) (i.e., content, reliability, internal structure, response processes, relationships to other variables, and consequences of use). This subset of scholars raised concerns that more than 80% of affect-based instruments published since 2000 in mathematics education journals lack multiple categories of validity evidence, with most affect instruments in mathematics teacher education presenting only one or two areas of validity evidence (Gallagher et al., 2022). The MCOP2 was developed and framed in the notion that the classroom is a community of practice involving both teachers and students (Gleason et al., 2017). Moreover, the MCOP2 was designed explicitly to assess the degree to which student engagement [SE] with the SMPs is actively present during classroom instruction, while implicitly assessing the degree to which teachers facilitate [TF] effective MTPs (Zelkowski et al., 2020). Gleason et al. (2017) presented validity evidence of content, reliability, response processes, and internal structure in the original publication, while later publications have demonstrated consequences of use (Authors, 20XX) and relationships to other variables (Authors, 20XX). The instrument is scored 0-1-2-3, without the use of fractional points, depending on the level of observed classroom actions.
The National Board for Professional Teaching Standards (NBPTS, 2016) has seen a tremendous increase in newly National Board Certified Teachers and candidates as states have incentivized teachers to participate in National Boards with salary increases for achieving certification. States, districts, and schools have developed support mechanisms, professional development, and peer writing mentorship groups to support teachers in their pursuit of certification. The NBPTS assessment measures four domains, three of which are affective (i.e., differentiation of instruction for students, teaching practices related to the classroom learning environment, and effective/reflective professional practices) via Components 2, 3, and 4 respectively, while the fourth is the content knowledge domain via Component 1. The portfolio for the three affective domains is assessed against established rubrics scored 0-1-2-3-4 using quarter-point (+/-) values, for which multiple areas of validity evidence for the instructional practices of accomplished teaching have been demonstrated (NBPTS, 2016).
The Spearman correlational analyses initially identified the items that were statistically significantly related to NBCT component scores. Given the high correlations among multiple variables (rho > 0.5), we then explored which variables were the stronger predictors of NBCT component scores. The MCOP2 scoring can be interpreted as follows: mean SE and TF factor scores of 2.5-3.0 are “excellent,” 2.0-2.49 “very good,” 1.5-1.99 “good to average,” and below 1.5 reflects below-average teaching practices. Based on EQs (1-5), we find that Component 2 is strongly related to the level of student engagement observed across the year’s observations during initial NBCT submission. The model explains nearly 60% of the variation in the outcome scores. Importantly, a mean score of ~1.8 on the MCOP2 SE factor corresponds to a Component 2 score of 3 (clear evidence of accomplished teaching). This is a lower score than anticipated; given that the differentiation component is mostly about how teachers plan to engage all students in their planning and enactment, and is a purely written component of portfolio artifacts, it is not so surprising in hindsight. Component 3, though, tells a different story: a mean MCOP2 TF factor score of slightly more than 2.0 is required to warrant clear evidence of accomplished teaching on Component 3 (teaching practices). This is an expected score range. The key deliverables for the next year continue with the next cohort progressing and the first cohort finishing National Board Certification. MTFs who finished National Boards are mentoring and leading the other MTFs during this year’s project.
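As an illustrative sketch of the kind of analysis and score banding described above (not the project’s actual code or data), the snippet below computes a Spearman rank correlation between hypothetical MCOP2 SE factor means and Component 2 scores and maps factor means to the interpretive bands; all names and values are fabricated assumptions for demonstration.

```python
# Illustrative sketch only -- hypothetical data and helper names, NOT the
# project's analysis code. Shows a Spearman rank correlation between MCOP2
# Student Engagement (SE) factor means and NBCT Component 2 scores, plus
# the interpretive bands for MCOP2 factor means described above.

def mcop2_band(mean_score: float) -> str:
    """Map a mean MCOP2 factor score (0-3 scale) to its interpretive band."""
    if mean_score >= 2.5:
        return "excellent"
    if mean_score >= 2.0:
        return "very good"
    if mean_score >= 1.5:
        return "good to average"
    return "below average"

def spearman_rho(x, y):
    """Spearman rank correlation for tie-free data: rho = 1 - 6*sum(d^2) / (n(n^2-1))."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Fabricated example scores for six hypothetical teachers.
se_means = [1.6, 2.1, 1.9, 2.4, 1.3, 2.7]        # MCOP2 SE factor means (0-3 scale)
component2 = [2.75, 3.0, 3.25, 3.5, 2.25, 3.75]  # NBCT Component 2 scores (0-4 scale)

print(mcop2_band(1.8))                           # -> "good to average"
print(round(spearman_rho(se_means, component2), 3))
```

With a real data set, the same banding logic and a library routine (e.g., scipy.stats.spearmanr, which also handles ties and reports a p-value) would replace this hand-rolled version.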
Our initial findings with the first cohort of MTFs demonstrate significant relationships even with a small sample of fewer than 15 teachers. Going deeper into the item-level analyses, we see the importance of a “conceptual teaching focus,” with TF item 6 predicting teaching practice construct outcomes (a mean above 2.0 on item 6 is needed), as well as SE item 13 on differentiation of instruction to engage students in a high-quality classroom culture (ultimately a 3.0 mean is needed) to warrant accomplished teaching on the differentiation construct. That is not to say the other significant Spearman correlations are not worth considering; Items 4 and 7 are also important. Item 4 loads on both the TF and SE factors, pointing to a teacher who facilitates lessons with a focus on students critically examining mathematical strategies. Item 7, modeling, is also important, as we found a relationship to content knowledge. Lastly, although we were not initially examining the relationship of Component 1 content knowledge to MCOP2 observations, we found, as a secondary result, some significance for the modeling item 7. We believe this finding suggests that modeling may be more present in the classrooms of teachers with greater content knowledge. Holistically, the TF, SE, and total MCOP2 scores were also important. We view this as indicating that most items on each factor matter, but as our data set grows, we expect to see more granular findings in future research analyses.