Credit: UC San Diego Jacobs School of Engineering
The science behind reading faces is fascinating. Sometimes it's obvious when someone is confused, happy, sad, angry, or distracted, and other times what you can see in someone's face is subtle and hard to read. Given the panoply of human emotions we all experience and observe in others, distilling facial expressions into mathematical formulas that can be put into a computer program must be really hard.
A computer scientist at UC San Diego is trying to do just that. Jacob Whitehill hopes to create a facial expression recognition system that can determine whether students understand, or are confused by, what they are hearing.
The goal is facial expression recognition software that can tell robotic teachers when to slow down or speed up their instruction.
There may be a market for robotic teachers someday in some capacities, but I'll always prefer being taught by humans like John Ludgate, who I had for math my sophomore year of high school. He used to stand at the front of the room, bouncing a piece of chalk in his hand as he patiently tried to explain concepts in trigonometry, like the differences between sine and cosine.
Like a lot of teachers, he had to spend far too much time maintaining order in a classroom of about 30 hyperactive, hormonal and at times hyena-like 15-year-old kids. It might have been good training for him: he ended up joining the Peace Corps and going to Africa. I don't know what happened to him after that.
There's more on Whitehill's work here.