
How to Learn About How Students Learn


Scientists explore new ways to analyze classroom interactions with only video.

Image credit: Yuen Yiu, Staff Writer, altered from an original image provided by user Xbxg32000 through Wikimedia Commons. Rights: CC BY-SA 3.0.

Tuesday, December 13, 2016 - 14:45

Yuen Yiu, Staff Writer

(Inside Science) -- How well do students learn? Today's policy makers tend to base their assessments on standardized tests, and not much else. But some education researchers suggest it's just as important to ask a more fundamental question: How do students learn?

"We have this whole suite of standardized tests that are intended to measure learning in some way," said Rosemary Russ, a physics education researcher from the University of Wisconsin–Madison, "but we also need to look at student behaviors and look for patterns between that and teaching methodologies and other measures of learning."

One way to do this is to observe classroom behaviors. Are students falling asleep when their instructor turns down the lights for a PowerPoint presentation, for example, or are they taking notes and asking questions?

Toward this goal, researchers from the University of California, Irvine; Seattle Pacific University; and Harvard University in Cambridge, Massachusetts, investigated new ways to observe and analyze general classroom interactions. According to a paper they published in the journal Physical Review Physics Education Research in late November, meaningful classroom analyses can be done with video data alone, without audio. This insight could drastically reduce the cost and effort needed for future classroom research across all education levels.

Look, not listen

"Right now when people are doing these analyses, they generally collect both audio and video data," said Laura Tucker, a science education researcher from UC Irvine who worked on the project.

To obtain clearly audible recordings from a classroom, researchers often have to set up multiple microphones at multiple angles and positions. This becomes difficult for larger classes, where students' voices start to blend together, and individual remarks become indistinguishable.

"Logistically, audio data is difficult to collect and analyze, and matching audio to specific students in large studies is even harder. Putting a camera in a classroom (without collecting audio) is much easier," said Tucker.

Most smartphones already have two cameras built in, just one example of how cheap and easy it has become to collect and store video data.

To test the possibility of analyzing classrooms with only video, the researchers trained a group of volunteers to tag classroom activities using a classification system published in 2009 by Rachel Scherr and David Hammer, a pair of researchers from the University of Maryland, College Park.

By looking at video samples taken inside a medium-sized university class, the volunteers were able to identify several types of activities -- such as whether the students were filling out tests, listening to a lecture, having discussions or just joking around. They did this by observing the students' gestures, body language, head movements and other nonverbal clues, using the protocols provided by the classification system.

Ultimately, the researchers found that the volunteers could distinguish between these activities equally accurately whether or not audio recordings were available.
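A common way to check whether two coding conditions agree — for instance, labels assigned with audio versus without — is a chance-corrected agreement statistic such as Cohen's kappa. The sketch below is purely illustrative: the segment labels are hypothetical, not data from the study, and the paper's actual statistical method is not described here.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two sets of categorical labels."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of segments given the same label
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if each coder labeled at random at their own rates
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical activity codes for ten video segments (not from the study)
with_audio    = ["lecture", "discussion", "test", "lecture", "joking",
                 "discussion", "lecture", "test", "discussion", "lecture"]
without_audio = ["lecture", "discussion", "test", "lecture", "discussion",
                 "discussion", "lecture", "test", "discussion", "lecture"]

print(round(cohens_kappa(with_audio, without_audio), 2))  # → 0.85
```

A kappa near 1 indicates the two conditions produce essentially the same codes; values near 0 mean agreement is no better than chance.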

Next step -- computers

Eventually the researchers would like to automate the process with computers, so that more data can be processed. Nonverbal communication -- body language -- is a natural component of human interaction. But before computers can recognize it, researchers need to articulate those visual cues in a way that the computers can understand.

"Humans are absolutely essential to doing the initial work," said Tucker. "We first need to figure out if and how these analyses can be done, then we can teach a computer how to do it."

From rudimentary spam email filters to the more sophisticated photo-tagging algorithms such as those used by Facebook, universities and companies alike have made big strides in pattern recognition software over the past decade. The technology is now also taking over video analysis, largely driven by its application in self-driving cars.

"If the process could be automated, then we could analyze substantially more data and start to look for patterns between student behaviors and other measures of learning," said Russ.

Given the diversity of learning environments in different districts, states and countries, video analysis could provide classroom information that standardized tests just can't deliver.

© American Institute of Physics

Author Bio & Story Archive


Yuen Yiu covers the Physics beat for Inside Science. He's a Ph.D. physicist and fluent in Cantonese and Mandarin. Follow Yuen on Twitter: @fromyiutoyou.