Neon Is Far From the Avatar of Your Sci-Fi Dreams

Neon Life has big plans for its artificial humans, but they aren’t ready

What: Neon Life is working on life-like avatars that can converse and react like real humans. The beta avatars are unimpressive.

How: They’re scanning in humans and breaking them into bits and pieces so the system can create avatar interactions and responses on the fly.

Why Do You Care: The company claims to have technology that can help it scale up quickly and, with constant innovations in AI and silicon, we could be talking to hyper-realistic avatars in the not-too-distant future.

I wanted Neon to be something more. The unbelievable, leaked videos of totally life-like avatars made me wonder if we’d finally crossed the Rubicon from the uncanny valley to virtual human beings indistinguishable from flesh and blood.

That’s not what Neon is, though. Not at all.

Neon Life Booth
The majority of Neon Life CES 2020 booth avatars were just video demos, and not true Neon avatars.  Lifewire / Lance Ulanoff

The videos were just “for illustration purposes only,” designed, I assume, to excite and inspire. What Neon Life, a labs division of Samsung, has built and unveiled this week at CES 2020 in Las Vegas is a collection of photorealistic avatars that move and respond as if they’ve taken a sedative, offering slow responses, off-putting twitches, and eyes that often don’t blink in sync.

Neon Life has a worthy goal: To transform how we communicate with technology. To move us beyond our stilted conversations with Siri and Alexa and build avatars that can communicate with all the emotion, movement, incredulity, worries, and more that we use to talk to each other every day.

If all goes as planned, Neon avatars will someday live on all our screens, ready to interact, answer questions, and connect with us in ways traditional digital assistants and avatars never could.

This is an actual Neon avatar. It's built from a live model scan, but creates interactions on the fly, frame by frame.  Lifewire / Lance Ulanoff

Using something called Core R3, Neon hopes to eventually generate artificial humans that look 100 percent real to us.

In a brief demo on the CES show floor, however, Neon gave us just a glimpse of what’s currently possible with its very early beta. Using a pair of life-sized avatars on the same large screens where Neon Life had been looping the fake demo videos, Neon reps walked one of the avatars, which do resemble the original models, through a short conversation, and even had one of them take unscripted questions from the audience. The responses were slow, didn’t sound much like the avatars they were coming from (at least not what I expected to hear), and were out of sync with their mouths.

Neon Life’s Bo Moon explained to me that the voice was coming from a different source than the avatar image, which led to the discontinuity.

The Kernel of Something Fresh

To create the functioning avatars, Neon Life scanned in human models, then trained the artificial intelligence system on how the avatar should look. The process is fast enough, Moon claimed, that Neon Life can do the scanning at scale.

A finished avatar is not a series of cobbled-together image fragments. Instead, every one of the frames is generated in real time by the Neon engine to deliver not only, say, a smile, but all the permutations leading up to that smile.

Neon Life wants to create fully realistic avatars that do not necessarily have to look like anyone. "The body parts that make up a nose will be real; the configuration is unique," said Moon. In fact, they would prefer it that way, though they can’t guarantee that some won’t end up resembling real people.

The Voice

Along with the jerky movements, the voice was one of the least impressive parts of Neon. I wondered if Neon Life would consider marrying its avatar system with, say, an Alexa. Moon said the company is “not closed off to any idea.” The goal for Neon is realism, and that could include working with third-party voice systems.

This was from the live demo of an early Neon avatar.  Lifewire / Lance Ulanoff

Perhaps sensing my disappointment over the current state of Neon Life, Moon repeated that what we’re seeing right now is an early beta. It’s unclear how quickly they can ramp up and start putting these avatars out in the wild on our phones, computers, and TVs, but based on what I saw, it won’t be anytime soon.

The body parts that make up a nose will be real; the configuration is unique.

What’s also unknown is where the intelligence will live. Is Neon’s AI cloud-based? Is its data encrypted? I could see avatars like this in airports, malls, and other public spaces, but what would encourage consumers to adopt them into their homes and pockets?

Ultimately, the current state of Neon isn’t much better than the dozens of other human-like avatars I’ve seen in the past.

When Neon is so good that I no longer realize I’m talking to a digital being, Neon Life will have achieved something. Until then, I’ll take a pass on this hype machine.