
'Artificial humans' are company bet for the future of interaction between …

by ace
The way we interact with voice assistants based on artificial intelligence today is very different from what movies have predicted in recent years, often romanticizing the relationship between humans and machines.

The options available, created by giants like Google, Amazon and Apple, are still limited and quite robotic in terms of interaction, despite recent advances.

That's what Neon, a project linked to the Samsung-backed Star Labs group, wants to change. Just four months into operation, the team came to the Consumer Electronics Show (CES), the technology fair in Las Vegas, to introduce the "artificial human."

  • Smart door and anti-snoring pillow: CES gadgets
  • See full event coverage

The idea behind Neon is to provide a more human interface to our interactions with technology – one that has genuine expressions, can show emotions and even react to certain situations, such as a crowded conference room, for example.

Neon Virtual Assistants, shown at CES – Photo: Thiago Lavado / G1

The name Neon itself is a kind of acronym for "Neo humaN," or new human.

According to Pranav Mistry, head of the project and of Star Labs, Neon doesn't have a brain yet, meaning it can't answer questions, hold a conversation, or remember a person. The technology is based on video analysis and modeling to create a simulation with the features a person would have.

“Voice and intelligence could come from partner companies. We have the power of real time,” Mistry said during a presentation.

Although the project itself is recent, Neon's team has been working since 2018 to develop a rendering algorithm called Core R3. The "R3" stands for reality, real time and response: characteristics the team considers critical for an "artificial human" to come as close as possible to the interactions we have with other people.

If the "virtual humans" presented by Neon look like real people, it's because they were based on real people. The company has scanned the faces and reactions of people who exist to feed the neural algorithm, which must now be able to replicate millions of expressions and emotions – and also to build new faces.

Neon used real human figures to create virtual assistants – Photo: Thiago Lavado / G1

All this is theoretical for now: in the images presented by the company, fine print says the scenarios are for illustrative purposes only.

Nevertheless, the company has shown that it can control the expressions of its artificial models with ease and precision, such as partially closing the eyes, producing different types of smiles, or raising the eyebrows in different ways.

According to neuroscientist Angie Chiang, who is part of the project, the hardest part is getting the technology to learn behaviors. "I don't have a single smile; tomorrow I may not say hi to you the same way," she said.

During the presentation, a connection map was shown, illustrating the millions of interaction options generated by the algorithm.

For Chiang, it is possible to change the way we currently communicate with personal assistants, which is usually limited to mechanical tasks such as checking the weather or setting an alarm.

Chiang says the company envisions several applications for the platform, in areas such as hospitality, medicine, elderly care, and customer service.

"Imagine what it would be like to go to a coffee shop with a virtual attendant who has a memory and interact in a more humane way, to the point of saying, 'Hey, Angie, are you going to want the same coffee today?'" Chiang said.

However, nothing is concrete yet and we are unlikely to see such a virtual assistant with human features anytime soon.

The technology analyzes the features of various people to create a virtual model with the characteristics of a person, such as human expressions. According to the company, this differs from editing tools, which manipulate existing people's faces to create new expressions.

It can also be integrated with other platforms to speak other languages or serve specific applications, for example.

Neon also has plans for the future: another project, called Spectra. This platform should add a new layer to the technology, allowing it to include emotions, memory, learning and intelligence, precisely the traits that define our interaction with other people.

Neon hopes that in the future, when you talk to an assistant, instead of saying "hey, Neon," you will call it by name. A human name, like Frank, Monica, Maia or Natasha.
