What do neuroscientists and salesmen have in common? Virtually nothing would make them happier than the ability to read your mind: the former in order to understand how the brain works, the latter to leverage that information to sell you things (apologies if you were expecting a punch line to the opening question). People are generally very good at reading emotions in others based on factors such as tone of voice, facial expression, and body language. But what about machines? Through artificial intelligence and principles of human-computer interaction, can they be taught to read emotions too?
This is precisely what a team of researchers at the Samsung Advanced Institute of Technology in South Korea is working toward by developing smart phones capable of inferring user emotion. As reported in MIT's Technology Review:
Rather than relying on specialized sensors or cameras, the phone infers a user’s emotional state based on how he’s using the phone.
For example, it monitors certain inputs, such as the speed at which a user types, how often the “backspace” or “special symbol” buttons are pressed, and how much the device shakes. These measures let the phone postulate whether the user is happy, sad, surprised, fearful, angry, or disgusted, says Hosub Lee, a researcher with Samsung Electronics and the Samsung Advanced Institute of Technology’s Intelligence Group, in South Korea. Lee led the work on the new system. He says that such inputs may seem to have little to do with emotions, but there are subtle correlations between these behaviors and one’s mental state, which the software’s machine-learning algorithms can detect with an accuracy of 67.5 percent.
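The article does not spell out how Samsung's software maps these behaviors to emotions, but the general idea can be pictured with a toy sketch: extract a handful of behavioral features from a typing session and hand them to an off-the-shelf classifier. The feature set, the sample values, and the choice of a random forest below are illustrative assumptions for the sake of the sketch, not the published system.

```python
# Hypothetical sketch (not Samsung's actual system): inferring emotion from
# typing-behavior features with a generic machine-learning classifier.
# Feature names and training samples are invented for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["happy", "sad", "surprised", "fearful", "angry", "disgusted"]

# Each sample: [typing speed (chars/sec), backspace rate, special-symbol rate,
#               device shake magnitude]. All values below are made up.
X_train = np.array([
    [4.1, 0.05, 0.02, 0.1],   # fast, fluent typing, steady hand
    [1.8, 0.20, 0.01, 0.05],  # slow, hesitant typing
    [3.0, 0.35, 0.10, 0.8],   # many corrections, lots of shaking
    # ... a real system would need many labeled sessions per emotion
])
y_train = ["happy", "sad", "angry"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Infer an emotional state for a new burst of typing activity.
new_sample = np.array([[2.5, 0.30, 0.08, 0.6]])
print(model.predict(new_sample))  # e.g. ['angry']
```

In practice the hard part is not the classifier but collecting honestly labeled training data and choosing features that actually correlate with mood, which is presumably where the reported 67.5 percent accuracy figure comes from.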
Once a phone infers an emotional state, it can then change how it interacts with the user:
The system could trigger different ringtones on a phone to convey the caller’s emotional state or cheer up someone who’s feeling low. “The smart phone might show a funny cartoon to make the user feel better,” he says.
As another example, one could imagine that our phones and Siri-esque assistants might peek at our calendars and tell when we are busy, and therefore more likely to be stressed. Contextual cues could likewise be combined with behavioral data to predict emotional state, e.g. whether the user is stuck in bad weather or a traffic jam based on geolocation. We at Medgadget are excited to see how this innovative application of technology develops. Maybe in the not-too-distant future our phones will also double as our psychiatrists.
Technology Review Article: A Smart Phone that Knows You’re Angry
Company profile: Affectiva, makers of emotion measurement technologies (skin conductance sensors, facial expression analysis, etc.)