The new Kinect app acts as your virtual therapist. But just how helpful is SimSensei?
Written by Michael Thomsen (@mike_thomsen)
Depression is epidemic in the United States, with roughly nine percent of Americans meeting the clinical criteria for depression in any given month, according to Harvard Medical School. University of Toronto psychiatrist Edward Shorter describes this phenomenon as distinct from major depression, or melancholia: a condition most people endure without reaching the level of serious debilitation. "[Non-melancholic people] get out of bed to go to work every day, they don't cry all the time," he explained to The Daily Beast. "But instead they have other stuff, and this is the crucial point. They're anxious, for example. They're fatigued. They have all kinds of somatic pains, body pains that come and go. And they tend to obsess about the whole thing."
A new system called SimSensei proposes to help identify people suffering from these depressive symptoms and offers preliminary conversational exchanges with a computer character. The program relies on Microsoft's Kinect camera to take full 3D video of its subjects, analyzing speech patterns and facial expressions to detect symptoms of sadness. The interface uses a polygonal model of a therapist who asks leading questions of users. "I'm not a therapist," the computer avatar says, "but I love to learn about people and I'd love to learn about you."
The questions posed are simple and unobtrusive—"How are you doing?" "Where are you from originally?"—but any sign of potentially unpleasant emotions can trigger a branch in the line of questioning, prompting more pointed questions about the subject's current state. The program can detect prolonged downward glances and overly animated hand movements, and when answers are inconclusive it will pause to give the user time to keep talking through an uncomfortable subject.
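The branching behavior described above can be pictured as a simple decision rule: nonverbal signals and terse answers steer the dialogue from neutral small talk toward a more pointed line of questioning. The sketch below is purely illustrative—the function names, signals, and thresholds are assumptions, not SimSensei's actual design.

```python
# Hypothetical sketch of SimSensei-style question branching.
# All names and thresholds here are illustrative assumptions,
# not the real system's API or tuning.

def detect_distress(gaze_down_secs, gesture_energy, answer_words):
    """Flag possible distress from crude, hypothetical signal thresholds:
    prolonged downward glances, overly animated gestures, or terse answers."""
    return gaze_down_secs > 2.0 or gesture_energy > 0.8 or answer_words < 3

def next_question(distressed):
    """Branch from neutral small talk to a more pointed question."""
    neutral = ["How are you doing?", "Where are you from originally?"]
    pointed = ["How have you been feeling lately?"]
    return pointed[0] if distressed else neutral[0]

# A subject who looks down for long stretches and gives a two-word
# answer triggers the pointed branch.
print(next_question(detect_distress(3.5, 0.2, 2)))
```

In a real system each signal would come from continuous video and audio analysis rather than hand-fed numbers, but the control flow—score the signals, then pick a branch—is the same shape.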
SimSensei was originally designed for military use with funding support from DARPA, which hoped to create a mental health support portal that would be accessible to veterans in any number of health clinics. The model would work like a therapeutic photo booth, offering full privacy and customization, with patients able to choose their therapist's model. The kiosk technology would work in conjunction with a care regimen laid out by a human therapist, and would also connect to SimCoach, a version of the technology for the home that would monitor the patient and collect data for the patient's file to help the care provider chart his or her progress.
The technology has been designed to be as neutral as possible, but there is something innately strange about talking about your feelings to a computer avatar, an experience that in some cases may exacerbate feelings of isolation or alienation. When one subject in the demonstration videos answers that he's from Los Angeles, SimSensei says, "Me too," which feels jarringly untrue. Even if it is a playful abstraction about where the avatar's programmers are from, inferring that meaning from context makes one even more self-conscious about the fakery of the avatar setup. Likewise, pathologizing Shorter's non-melancholic depression, which may reflect healthy and normal responses to uncontrollable events—being laid off, having a home foreclosed on, going through a divorce—risks prolonging and intensifying those feelings.
A number of studies have suggested that mimicking facial expressions and emotive gestures can lead to increased activity in the areas of the brain associated with feelings. It's been argued the primary reason babies smile is to mimic their parents, and not necessarily to express happiness. A number of researchers have outlined theories on the principle of emotional contagion, which Gerald Schoenewolf defined as the emotional influence over others "through the conscious or unconscious induction of emotion states and behavioral attitudes."
These ideas suggest that humans use emotions both as the basis of socialization and as corrective social exchanges that bond people to one another. But using the same conversational mechanisms between a computer and a human, where bonds are replaced by liability waivers and end-user agreements, seems like a doomed approach. Looking to conversational A.I. to solve problems that are largely connected to social and political conditions leaves all of those problems in place and, in the best case scenario, offers only momentary relief to the person suffering through them. In the case of supporting former service members with major depression, SimSensei is less a solution to the shortage of mental health professionals available to them, compounded by a systemic hostility to diagnosing mental health conditions, than a sedative that simulates progress by appearing to quell the symptoms of a dysfunctional system. It risks even further destabilization by suggesting that the dysfunction lies in the individual rather than the system, a sentiment most people will come to view with suspicion over time.