Here’s how it would work. Using facial recognition software called EngageSense, computers would apply algorithms to what the cameras have recorded during a lecture or discussion to interpret how engaged the students have been. Were the kids’ eyes focused on the teacher? Or were they looking everywhere but the front of the class? Were they smiling or frowning? Or did they just seem confused? Or bored?
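The mapping from facial cues to an engagement estimate might be sketched roughly like this. To be clear, EngageSense's actual model is proprietary and not described in detail anywhere public, so every name, cue, and weight below is invented purely for illustration:

```python
# Hypothetical sketch only: EngageSense's real algorithm is not public,
# so the cues and weights here are invented to illustrate the idea.
from dataclasses import dataclass

@dataclass
class FrameAnalysis:
    """Facial cues a vision system might extract from one video frame."""
    gaze_on_teacher: bool  # eyes focused toward the front of the class?
    smiling: bool
    frowning: bool
    confused: bool

def engagement_score(frame: FrameAnalysis) -> float:
    """Map one frame's facial cues to a 0-1 engagement estimate."""
    score = 0.5  # start neutral
    # Where the student is looking is weighted most heavily here.
    score += 0.3 if frame.gaze_on_teacher else -0.3
    if frame.smiling:
        score += 0.2
    if frame.frowning or frame.confused:
        score -= 0.2
    return max(0.0, min(1.0, score))  # clamp to [0, 1]
```

Averaging such per-frame scores over each minute of class would yield the kind of engagement-over-time curve the company describes.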
Teachers would be provided a report that, based on facial analysis, would tell them when student interest was highest or lowest. Says SensorStar co-founder Sean Montgomery, himself a former teacher: “By looking at maybe just a couple of high points and a couple of low points, you get enough takeaway. The next day you can try to do more of the good stuff and less of the less-good stuff.”
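Montgomery's "couple of high points and a couple of low points" amounts to picking the extremes out of an engagement-over-time series. A minimal sketch of that step, assuming the class period has already been reduced to one score per minute (the function name and data shape are illustrative, not from SensorStar):

```python
def high_and_low_points(scores, k=2):
    """Return the k highest- and k lowest-scoring minutes as (minute, score) pairs.

    `scores` is a hypothetical list of per-minute engagement scores in [0, 1].
    """
    ranked = sorted(enumerate(scores), key=lambda pair: pair[1])
    highs = ranked[-k:][::-1]  # best minutes first
    lows = ranked[:k]          # worst minutes first
    return highs, lows
```

A teacher's report built this way would flag, say, minute 12 as a peak and minute 40 as a lull, which is the "takeaway" Montgomery describes acting on the next day.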