Avatars can be a useful addition to therapy, providing a degree of distance for either the patient or the clinician. A computer can pick up on very slight nuances that a human might miss, and because everything is recorded, future sessions can draw heavily on what has occurred before. So does this mean an avatar is better than a human therapist? Or is it simply a very convenient addition?
Avatar... The word evokes different thoughts in different people. Blue aliens, for one. For those who have seen the film, or any of the others that use motion-capture technology (The Polar Express, A Christmas Carol), you will know what I am talking about when I describe the modern-day avatar: a computer-generated image of a human being that interacts, and uses expressions and movements, in a somewhat disconcerting way. The uneasiness we feel when confronted by these wannabe humans is known as ‘the uncanny valley’ - a phrase coined to describe the feeling we get from a robot that nearly, but not quite, has the appearance of a human. Whilst we may see ourselves in the knots of a tree or the stars in the sky (a common experience known as pareidolia), if these representations cross over the line from similarity to resemblance the experience becomes uncomfortable for us. Some virtual worlds, such as Second Life, appear to bridge that gap, but it isn’t often you see a character there that accurately represents what a human actually looks like - think more Barbie.
The use of avatars in the treatment of schizophrenia is a recent and interesting development. Instead of hearing a disembodied voice, the patient chooses a face to go with the delusion - and the therapist then guides them through communication with it. This computer-generated avatar will heckle and hound them whilst the therapist encourages them to stand up to it, argue back, and tell it that it is wrong. Through this opposition the patient becomes more aware of, and better able to disagree with, the auditory hallucinations. The results of the pilot study are positive, and funding has been granted to continue with a much larger research project. The man behind it, Julian Leff, believes the patients are able to maintain awareness that they created the avatars themselves, and that the avatars therefore pose no actual threat. It is easy to see how the uncanny valley, that feeling of unease, can make a positive contribution. I wonder, though, whether giving a face to the voice makes it more or less human. And if the patient becomes aware that the words are coming from the therapist, might this damage progress?
Therapy more broadly also takes advantage of avatars - stress, PTSD and anxiety all have their proponents of computer-generated therapists or support. Interstress is a recent EU-funded project using biosensors that can measure stress in real-world situations. These feed data into your mobile device, and you can interact with a virtual world designed to relax you. Support is provided by other people in a shared 3D environment in which you are all represented as avatars. The Online Therapy Institute provides a similar environment in Second Life - people can interact with a support system, or with qualified therapists, in a virtual-reality institute. These all have one thing in common, however: each avatar represents, and is controlled by, a real human. The avatars are connected to others rather than acting ‘alone’.
What does use an avatar as an entirely new entity - a persona in itself rather than a representation of someone else - is MACH. MACH stands for My Automated Conversation coacH, and it was created by an MIT student. So far used for improving interview skills, it follows a person’s vocalizations, smiles, head movements and eye contact through a webcam. The avatar then responds - for example, if a person is not looking directly at it, it asks where they have gone. Preliminary findings suggest that this is a much more effective way to train people in interview skills than getting them to watch videos, and its use for people with social anxiety has begun to take centre stage. The avatar is not some person behind the scenes, and its responses are limited to those it is programmed to give, but with the data we can collect and the continuing sophistication and capability of computing, it is reasonable to see this as having real benefit in the future. Either that or it will simply replace real-world human interactions!
Speaking of which: SimCoach is a virtual support programme, in the beta-testing stages, for those affected by PTSD. It provides the first stage of help for those who are unsure where to go or who feel stigmatized. The avatars, of which there are four, all have calming voices, and they are programmed to reply to what is being typed - the words used trigger an appropriate response, such as tips for better sleep for those with insomnia. Past responses and answers to questionnaires are kept for future interactions. With the upcoming SimSensei, which, similar to MACH, can respond to vocalizations, movements and expressions, the breadth of these avatars’ abilities is phenomenal. All past interactions are retained and used, as are small cues within the session that might have been missed by a human. But is a computer able to make intuitive decisions, or provide the tactile responses that may be needed? Will it remain in the uncanny valley? Or are we looking at a future where those who find it difficult to talk about their feelings can use a computer program to get the help they need?
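For readers curious what "words used trigger an appropriate response" might look like in practice, here is a minimal sketch of keyword-triggered replies. This is purely illustrative - the keywords and canned responses below are invented, and the real system is far more sophisticated than a simple lookup.

```python
# A toy sketch of keyword-triggered responses, loosely in the spirit of the
# typed-input stage described above. All keywords and replies are invented
# for illustration; this is not the actual system's implementation.

RESPONSES = {
    "sleep": "It sounds like rest has been hard to come by. Here are some tips for better sleep...",
    "nightmare": "Recurring nightmares are common after trauma. Talking them through can help...",
    "alone": "You don't have to face this alone. Here are some people you can reach out to...",
}

DEFAULT = "Can you tell me a little more about what's been going on?"

def reply(message: str) -> str:
    """Return the first canned response whose keyword appears in the message."""
    text = message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in text:
            return response
    return DEFAULT
```

A real programme would layer session memory, questionnaire answers and (with SimSensei) nonverbal cues on top of this, but the basic idea - mapping what the user says to a suitable response - is the same.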
In each of these therapies I see the same words again and again: “its aim is not to replace human interactions or face-to-face therapy”. I’m sure we are all aware of the many ways in which online communication is replacing other means. Social networking applies not only to our personal lives but also to our business connections. It has added another layer to what we are able to do, rather than diminishing what was already there. Sure, some people may abuse it, but the majority have healthy online interactions with others through Facebook, Twitter, LinkedIn, etc. Apply this experience to avatar therapy and, if people are open to it and willing to try, I don’t see why it can’t be a positive addition. Think of the applications to social disorders, exposure therapy and more. If there is a chance that this could improve people’s lives, why not jump at it? Just be wary of the uncanny valley and consider its implications.