Nov 29th, 2013, 3:00 pm in Huxley 217/218

Understanding human faces

Powered by ever more sophisticated sensors, the machines of the future will interact with humans far more naturally than is possible today. Most of us find face-to-face interaction the most comfortable, and for good reason: the brain does an outstanding job of interpreting the identity and emotional state of another human, all from just looking at their face.

With this in mind, we seek to develop algorithmic approaches to understanding the human face. I'll explain how powerful generative models of the face can be constructed, and what we can learn from them. I'll then demonstrate the usefulness of such models in identity and emotion recognition, and highlight how our collaboration with Great Ormond Street Hospital is helping advance craniofacial surgery techniques. Finally, I'll talk about where we can take statistical facial modelling in the future, and discuss some of the challenges that we must overcome in order to advance.

James Booth is a PhD student in the Intelligent Behaviour Understanding Group at Imperial College London. His research revolves around building and fitting 3D statistical models of the human face. He is also an honorary member of the Craniofacial Unit at Great Ormond Street Hospital, where his research is being applied to help advance corrective surgery for children with facial deformities. When not knee-deep in Python code, James enjoys swimming and cycling, having recently completed his first triathlon.