Classroom Observation: A New Kind of Tool

Classroom observation instruments are not used all that regularly in higher education, but when they are, the focus tends to be on high-level abstractions (“The teacher was organized.”) or aggregated behaviors (“The teacher treated students with respect.”). Items like these are appropriate, but they do not identify the specific, concrete behaviors that led observers to those more comprehensive conclusions.

It’s difficult to react to this kind of feedback with anything other than a generalized response (“Next semester I’ll work to be more organized.”). Moreover, the links between items like these and improved learning outcomes tend to be loose and indirect. A faculty research team of biologists at the University of Washington has developed an instrument, Practical Observation Rubric to Assess Active Learning, or PORTAAL, that addresses these issues. Their objective was to create “a tool that translates the research-based best practices into explicit and approachable practices.” (p. 13) The instrument “is intended to provide easy-to-implement, research-supported recommendations to STEM instructors [they’re relevant to other instructors as well] trying to move from instructor-centered to more active learning-based instruction.” (p. 2)

The items to be observed, and the practices the developers recommend faculty adopt, are those research has identified as having a significant impact on student learning. They are specific and concrete. However, PORTAAL doesn’t claim to be a comprehensive list of research-supported best practices. Moreover, some best practices, like organization and showing respect for students, are not communicated by single actions but by collections of them. That’s why the instrument’s developers offer this caveat: “Following the suggestions outlined in this tool does not guarantee greater student learning, but the tool is a solid, research-supported first step.” (p. 13)

The instrument includes 21 items that cluster around four dimensions, each briefly described and illustrated here.

Practice—These items measure the amount and quality of practice students do during class, as well as how those practice opportunities are distributed across the class session. Items here relate to the number of minutes during a given class period when students have the opportunity to talk about course content, the percent of activities in which the instructor reminds students to use their prior knowledge, and the frequency with which instructors provide feedback on student explanations. For each of the four dimensions, the article includes a table that references the studies justifying inclusion of the selected items.

Logic development—These items use very specific behaviors to gauge whether students get practice with higher-order thinking. “To provide students with opportunities to practice their logic development, it is necessary for instructors to formulate questions that require a higher level of thinking.” One simple way to encourage better answers to those questions is to remind students to provide a rationale for their answers; hence, there is an item on how often teachers do this. Research documents the value of giving students time to think before they answer or discuss answers. The logic behind both right and wrong answers should be discussed, and there are items for this as well.

Accountability—Students must be motivated to participate in active learning classrooms. Teachers typically get students engaged in activities by giving points—for participation and/or for correct answers. More students have the opportunity to participate if teachers use group work. And correct answers should not always be provided by the same students. These research findings are represented by items on the instrument.

Reducing student apprehension—Here teachers increase participation not by giving points but by addressing the fear that keeps students from participating and getting actively involved. The specific behaviors on the instrument include praising the efforts of the whole class and letting students know that contributions are appreciated. Student fear can also be reduced by framing errors as a positive part of the learning process and showing what can be learned from them.

PORTAAL can be used for classroom observation, although the research team cautions that it’s difficult to use in real time; it works more reliably when the class session is videotaped and the instrument is used to analyze the recording afterward. They also see it as a promising self-reflection tool.

The research team used PORTAAL to analyze the teaching of 25 biology instructors, all of whom taught in a three-quarter introductory biology series. In addition, they identified two instructors whose implementations of active learning had been shown in earlier research to increase student exam scores. These “reference” faculty were also taped, providing the team with a comparative set of benchmark data.

The researchers found “large variation” in the extent to which individual instructors used the items on the instrument; the two reference faculty, by contrast, had values in the top quartile for 52 percent of the items. For example, in the practice dimension, “more than half the instructors allowed students less than 6 minutes per 50-minute class session to engage in practice.” (p. 10) Notably, the reference faculty allowed 17 and 31 minutes, respectively. Instructors cued students to use prior knowledge in only 4.2 percent of the activities, and only 15 percent of activities involved higher-order cognitive skills. On the other hand, in 60 percent of the activities the instructors heard student explanations, and over 65 percent of these faculty used some form of accountability (points, random calling on students, or small-group work). Based on these results, the researchers recommend improvements in the following areas: more opportunities for in-class practice, reminding students to explain answers and giving them time to think before answering, increased participation by calling on students, and better communication about the role of errors in learning.

Reference: Eddy, S. L., Converse, M., and Wenderoth, M. P. (2015). PORTAAL: A classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE-Life Sciences Education, 14 (Summer), 1-16.
