This AI storyteller fools humans 3 out of 5 times

A newly developed neural network can caption a series of images in a way that mimics human storytelling. Rather than simply identifying and describing objects, the AI makes inferences about what is happening in an image. And it's strangely good at its job.

The team, researchers at UC Santa Barbara, developed the AI to determine whether a neural network could be used to infer new, abstract stories from images.

According to a paper published by the team:

Different from captions, stories have more expressive language styles and contain many imaginary concepts that do not appear in the images.

The framework the researchers developed is called Adversarial REward Learning (AREL). What makes it different from similar AI is that it doesn't optimize against a fixed automatic evaluation metric; instead, it learns its own reward from human examples, avoiding the cloning (and regurgitation) of human efforts.
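The adversarial loop described above can be sketched in miniature: a reward model is trained to score human stories above generated ones, while the generator is updated to maximize that learned reward. The linear bag-of-words models, function names, and training dynamics below are illustrative assumptions for intuition only, not the team's actual implementation.

```python
import random

def train_arel(human_stories, vocab, steps=200, lr=0.1, seed=0):
    """Toy AREL-style loop: alternately train a learned reward model
    (to separate human from generated stories) and a generator policy
    (to maximize that reward). All modeling choices here are
    simplified stand-ins for the paper's neural networks."""
    rng = random.Random(seed)
    reward = {w: 0.0 for w in vocab}   # learned reward: per-word score
    policy = {w: 0.0 for w in vocab}   # generator: per-word preference

    for _ in range(steps):
        # Generator samples a 5-word "story" under its current policy
        # (random noise keeps early sampling exploratory).
        ranked = sorted(vocab, key=lambda w: policy[w] + rng.random(),
                        reverse=True)
        fake = ranked[:5]
        real = rng.choice(human_stories)

        # Reward-model step: push human stories up, generated ones down.
        for w in real:
            reward[w] += lr
        for w in fake:
            reward[w] -= lr

        # Generator step: drift toward words the learned reward favors.
        for w in vocab:
            policy[w] += lr * reward[w]

    return reward, policy
```

Run on toy data, the generator ends up preferring the human vocabulary over distractor words, even though no hand-written metric ever defined what a "good" story looks like; the reward was learned adversarially from the human examples.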

Teaching a neural network to come up with abstract stories that make sense is no trivial matter, but AREL goes a step further. Not only can it invent its own stories, but those stories are convincing enough to fool humans into believing a person wrote them.

To test AREL, the team used human workers on Amazon's Mechanical Turk for two separate tests. The first was a Turing test that simply asked the Turkers to determine whether a story was created by a person or a computer.

According to the research, AREL passed the Turing test three out of five times.

In a separate test, the researchers asked the Turkers to choose between a story from AREL, one written by a human, and one created by an older AI. Nearly half the time, the workers chose AREL's.

The implications for AI storytelling are exciting. As developers figure out how to make a neural network's output align more closely with human thinking, we'll begin to see substantial advances beyond simple language processing.

Sports referees, for example, could be replaced or augmented by an AI capable of understanding and explaining a series of events. Do we really need to pay someone $188,322 to determine whether or not Tom Brady is cheating?

It goes without saying that once AI is robust enough to explain its decision-making by telling "stories" about real-time images, like "Number 66, defense, offsides. That's a 5-yard penalty; repeat first down," we won't need people to do rules-based jobs that require an agent to do nothing more than observe and report.

And let's not forget that there's a real market for on-the-fly narration. If this technology fell into the hands of the developers at Telltale Games or the designers at Wizards of the Coast (the company behind Dungeons & Dragons), it could be used to generate an endless stream of unique, personalized entertainment.

AREL isn't ready for prime time yet; this research only lays the groundwork for future efforts to build a better neural network. According to the researchers:

We believe that there is still a lot of room for improvement in narrative paragraph generation tasks, such as how to better simulate human imagination to create more vivid and diversified stories.

But ultimately, barring some as-yet-undiscovered roadblock, neural networks like AREL will mature and attain a level of social intelligence that could become broadly comparable to that of an average human.

If this AI can fool half the people right now, imagine what it will do in five years.

