We live in a world where virtual assistants are a stone's throw away from us. You pick up your smartphone and talk to Bixby, the Google Assistant, or Siri. You are about to step out of the house and want to know if it will rain in the evening, so the natural instinct is to ask Alexa whether you should carry an umbrella. Virtual assistants have become part and parcel of our everyday lives. Now think of the evolution of these assistants into virtual humans. This is the technology being developed by Samsung NEON, and I got a chance to talk to Anil Unnikrishnan, one of the technical leads at the lab.
To start off with, what is the difference between a digital avatar and a digital human?
From cartoon-like caricatures to technologies like Epic's MetaHumans, digital avatars span a wide range of fidelity. A classic example of an avatar is Clippy, which Microsoft released back in 1997. Technology has progressed a long way since then. Digital humans will look near identical to actual human beings. That's the fundamental difference between a digital avatar and a digital human.
These creations look, behave, and can be interacted with like actual humans.
Human beings are social creatures. We crave interaction with others. The word interaction is an overloaded term here: it includes a combination of words, voice, and visuals. In his 1971 book Silent Messages, Albert Mehrabian, Professor Emeritus at the University of California, Los Angeles, argues that only 7% of meaning is communicated through spoken words. The rest is conveyed 38% through tone of voice and 55% through body language.
(Anil Unnikrishnan)
Evolution Into Digital Humans
Now going back to the assistants of today. There is always a sense of commanding when it comes to speaking with these devices. The reason is that, at the end of the day, you know it's a piece of hardware. It's non-emotive, and it's there to listen to everything you say and tend to all your needs. But now think of a scenario where these systems start to manifest as actual humans. They won't all look the same: the digital human yoga instructor will look different from the concierge at a restaurant, who will look different from the personal assistant at home. When you bring photorealistic digital humans into this conversation, you start to bring in the other two modes of interaction, and more importantly, people start to develop a sense of empathy.
The human brain contains what is called a "mirror neuron system". When we observe actions we are familiar with, this part of the brain internally starts to mimic the other person. This mimicry is part of how we develop emotions toward others. In our work here at NEON, we tackle problems that try to bridge that gap between humans and machines.
I’ve seen quite a few of these digital personas in different forms. But sometimes my brain tells me they are not real even if I want to believe they are.
This is a common topic of discussion around digital humans. The effect is what is called the Uncanny Valley. If you take the spectrum from things that are completely unreal (take Clippy as an example) all the way to an actual human, the closer you get to the human form, the more the brain starts to question whether it's real. The slightest deviation from the norm will send signals to your brain reinforcing that what you are seeing is not an actual human. There is never a benefit of the doubt given to these creations.
What are some of the digital humans that are out there in the world?
Even though this technology is nascent, there are a few companies already doing this. A good example is Lil Miquela, a fictional character that exists only on social media and collaborates with brands to promote products, primarily in fashion. The Instagram handle as of now has about 2.7 million followers. The character was modeled on an actual human, but the persona that is Lil Miquela does not exist in the real world. The combination of her visual appearance, relatable personality, and ability to create content resonates with the audience and has helped her gain popularity. People know she isn't real, but they continue to be influenced by her.
What are some of the use cases of digital humans?
As we discussed before, digital humans can be used for social media marketing and promotions. One of the most common places where digital humans will be used is the service industry. The fact that a digital human can be available 24/7 and can scale quickly is a big advantage over humans. The time and cost incurred to train a human and get them ready for a job in the service industry is far higher than for a digital human. Another important advantage these beings have is the fact that their emotional engagement can be programmed. We humans are never perfect; that's what makes us human. But the service industry demands that we be. Having a bad day outside of work is not an excuse when you are working at a concierge desk or in an airport. Deploying artificial humans in these scenarios is a "best of both worlds" situation: they have the knowledge to assist the customer and the emotional quotient to deal with the customer's frustrations without becoming frustrated themselves. More importantly, they can be trained to speak multiple languages and interact with customers more easily.
Slowly but steadily, you can think of digital humans not only as an interface between humans and computers, but also as an interface between human beings. Digital humans can start to interact with other humans on your behalf. Think of them as assistants helping you navigate unfamiliar surroundings. Take the example of an elderly person traveling abroad to meet their grandkids, with a transit at an airport where their native language is not spoken. The digital human can converse with the flight attendants and the people at the airport terminal on their behalf and help them find their way through the airport.