Children with autism spectrum disorder (ASD) often find social activities difficult, including maintaining eye contact and reading facial expressions. But a novel application of Google Glass could help.
The research appeared in the journal npj Digital Medicine.
For kids on the autism spectrum, early treatment works best. But late diagnoses and long therapy waitlists mean key developmental windows can close before they get help.
Now, a group at Stanford University has built a tool to bridge those gaps: an Android app that takes images from Google Glass's camera, analyzes them for faces and facial expressions, and then displays a matching emoticon on the Glass display.
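The researchers' code is not reproduced in the article, but the per-frame loop it describes, camera image in, face and expression out, emoticon shown, might look roughly like the Python sketch below. The names detect_faces and classify_expression are placeholders standing in for whatever face detector and expression model the Stanford team actually used.

```python
from typing import Optional

def detect_faces(frame) -> list:
    """Placeholder: a real app would run a face detector over the camera frame."""
    return []

def classify_expression(face) -> str:
    """Placeholder: a real app would run its expression-recognition model here."""
    return "neutral"

def process_frame(frame) -> Optional[str]:
    """Analyze one Glass camera frame and return the expression label to show,
    or None when no face is in view."""
    faces = detect_faces(frame)
    if not faces:
        return None
    # A simple heuristic: give feedback on the first (most prominent) face only.
    return classify_expression(faces[0])
```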
"The augmented reality really acts as a Jiminy Cricket on a child's shoulder, an audible at the line of scrimmage saying, 'That's a happy face,'" said co-author Dennis Wall, associate professor of pediatrics, psychiatry and biomedical data sciences at Stanford Medical School.
The machine-learning-assisted software recognizes eight emotions considered by many researchers to be universal among humans: happiness, sadness, anger, disgust, surprise, fear, neutral and contempt (relabeled "meh" in the app to make the emotion more child-friendly).
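The exact wording the app shows is not given in the article, so the mapping below is only illustrative: it pairs the eight labels listed above with a child-friendly display name, with "contempt" surfaced as "meh". In a loop like the sketch above, the label returned by process_frame would be looked up in a table like this before being shown on the Glass display.

```python
# Illustrative only: display names other than "meh" are assumptions, not the
# app's actual wording.
DISPLAY_NAMES = {
    "happiness": "happy",
    "sadness": "sad",
    "anger": "angry",
    "disgust": "disgusted",
    "surprise": "surprised",
    "fear": "scared",
    "neutral": "neutral",
    "contempt": "meh",
}
```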
In addition to empowering children with ASD to engage in therapy in their home environment, the software lets parents review color-coded videos to see which facial expressions their children caught or missed.
It also includes three games that reinforce socialization lessons.
The 14 children in the pilot study, who used the system for an average of 10 weeks, showed significantly improved social skills afterward.
A recently completed randomized controlled trial of 74 children, not yet published, found similar results.