As Google retools its Glass experiment, researchers at Stanford are using the device to help autistic children recognize and classify emotions.
In a small office buried inside an administrative building at Stanford, Catalin Voss and Nick Haber are pairing face-tracking technology with machine learning to build at-home treatments for autism. The Autism Glass Project, a part of the Wall Lab in the Stanford School of Medicine, launches the second phase of its study Monday morning.
The software uses machine learning for feature extraction, detecting what Voss calls ‘action units’ in faces — the individual facial movements that combine to form recognizable expressions.
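To give a flavor of the idea, here is a minimal sketch of action-unit-style detection from facial landmarks. This is purely illustrative and is not the Autism Glass Project's code: the landmark names, thresholds, and unit labels are all invented for the example, and a real system would learn its decision rules from labeled training data rather than hand-tuned cutoffs.

```python
from dataclasses import dataclass

@dataclass
class Landmarks:
    """A few (x, y) facial landmark positions, normalized to the face
    bounding box. Image convention: y increases downward."""
    left_mouth: tuple
    right_mouth: tuple
    upper_lip: tuple
    lower_lip: tuple
    left_brow: tuple
    left_eye: tuple

def detect_action_units(lm: Landmarks) -> dict:
    """Map simple geometric features to coarse action-unit labels.

    Thresholds here are invented for illustration; a trained model
    would replace these hand-written rules."""
    units = {}
    # Smile-like unit: mouth corners sit above the lip midline
    lip_mid_y = (lm.upper_lip[1] + lm.lower_lip[1]) / 2
    units["lip_corner_puller"] = (
        lm.left_mouth[1] < lip_mid_y and lm.right_mouth[1] < lip_mid_y
    )
    # Surprise-like unit: brow raised far above the eye
    units["brow_raiser"] = (lm.left_eye[1] - lm.left_brow[1]) > 0.15
    # Open-mouth unit: large vertical gap between the lips
    units["jaw_drop"] = (lm.lower_lip[1] - lm.upper_lip[1]) > 0.2
    return units

# Example: a smiling face, mouth corners raised above the lip midline
smile = Landmarks(
    left_mouth=(0.3, 0.68), right_mouth=(0.7, 0.68),
    upper_lip=(0.5, 0.70), lower_lip=(0.5, 0.78),
    left_brow=(0.35, 0.25), left_eye=(0.35, 0.35),
)
print(detect_action_units(smile))
# → {'lip_corner_puller': True, 'brow_raiser': False, 'jaw_drop': False}
```

In practice, pipelines like this feed the detected units into a classifier that maps combinations of units to emotion labels shown to the wearer.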
“With the portable power we now have available, we’re only scratching the surface of what’s possible.”