The next evolution of artificial intelligence (AI) may be agents that can communicate directly and teach each other to perform tasks, research shows.
Scientists have modeled an AI network that can learn and perform tasks based solely on written instructions. This AI then explained what it had learned to its “sister” AI, which performed the same task despite having no prior training or experience.
The first AI used natural language processing (NLP) to communicate with its sister, the scientists said in a paper published March 18 in the journal Nature Neuroscience.
NLP is a subfield of AI that aims to recreate human language within computers, allowing machines to understand and reproduce written text and speech naturally. NLP systems are built on neural networks: collections of machine learning algorithms modeled on the arrangement of neurons in the brain.
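To make the idea concrete, here is a minimal sketch of such a network in Python using PyTorch; the layer sizes are arbitrary illustrative choices, not details from the study:

```python
# A minimal sketch of a neural network in this sense (illustrative only,
# not the model from the study): layers of simple units whose learned
# weights loosely mirror connections between neurons.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(16, 32),  # hidden layer: each unit computes a weighted sum of inputs
    nn.ReLU(),          # nonlinearity, a crude stand-in for neural firing
    nn.Linear(32, 4),   # output layer, e.g. one unit per possible response
)

x = torch.randn(1, 16)  # a dummy 16-dimensional input
print(net(x))           # the untrained network's output
```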
After learning these tasks, the network was able to describe them to a second network (a copy of the first) so that it could reproduce them. “To our knowledge, this is the first time that two AIs have been able to talk to each other in a purely linguistic way,” said the paper’s lead author, Alexandre Pouget, leader of the Neurocenter at the University of Geneva, in a statement.
The scientists achieved this knowledge transfer by starting with an NLP model called S-BERT, which was pre-trained to understand human language. They connected S-BERT to a small neural network focused on interpreting sensory inputs and simulating motor actions in response.
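For a sense of what that pre-trained component does in practice, here is a minimal sketch using the open-source sentence-transformers library, which implements Sentence-BERT; the checkpoint named below is a common default chosen for illustration, not necessarily the one the researchers used:

```python
# Sketch of encoding a written instruction into a fixed-size vector with a
# pre-trained sentence encoder, in the spirit of the S-BERT component.
# The checkpoint is an assumed stand-in, not the study's exact model.
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")
embedding = encoder.encode("Respond when the light turns on.")
print(embedding.shape)  # (384,): a dense vector a downstream network can consume
```

A downstream network never sees the words themselves, only a dense vector like this one.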
This composite AI, a “sensorimotor recurrent neural network” (RNN), was trained on a set of 50 psychophysical tasks, each centered on responding to a stimulus (such as reacting to a light) according to instructions supplied through the S-BERT language model.
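The article does not spell out the architecture, but a composite network of this kind might be sketched as follows; the class name, dimensions, and wiring below are all illustrative assumptions:

```python
# Illustrative sketch of a "sensorimotor RNN": a recurrent core that reads
# a sensory stream alongside a fixed instruction embedding and emits motor
# commands. All sizes and wiring are assumptions, not the paper's design.
import torch
import torch.nn as nn

class SensorimotorRNN(nn.Module):
    def __init__(self, sensory_dim=8, instr_dim=384, hidden_dim=64, motor_dim=4):
        super().__init__()
        # Each time step sees the sensory input concatenated with the
        # (constant) instruction embedding from the language model.
        self.rnn = nn.GRU(sensory_dim + instr_dim, hidden_dim, batch_first=True)
        self.motor_head = nn.Linear(hidden_dim, motor_dim)

    def forward(self, sensory, instruction):
        # sensory: (batch, time, sensory_dim); instruction: (batch, instr_dim)
        steps = sensory.shape[1]
        instr = instruction.unsqueeze(1).expand(-1, steps, -1)
        hidden, _ = self.rnn(torch.cat([sensory, instr], dim=-1))
        return self.motor_head(hidden)  # motor output at every time step

net = SensorimotorRNN()
stimulus = torch.randn(1, 10, 8)         # dummy 10-step stimulus, e.g. a light trace
instruction = torch.randn(1, 384)        # stand-in for an S-BERT sentence embedding
print(net(stimulus, instruction).shape)  # torch.Size([1, 10, 4])
```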
Thanks to the embedded language model, the RNN could understand complete written sentences. This allowed it to perform tasks from natural-language instructions alone, achieving an average accuracy of 83% even though it had never seen any training examples or performed the tasks before.
That understanding was then inverted: the RNN used linguistic instructions to communicate the results of its sensorimotor learning to an identical “sibling” AI, which in turn performed the tasks, despite also never having done them before.
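As a toy illustration of that transfer (an assumption-laden sketch, not the authors' procedure), a trained network can map its internal state to an instruction vector that an identical but untrained copy then consumes in place of a language-model embedding:

```python
# Toy sketch of linguistic transfer between two identical networks. Every
# name and dimension here is a hypothetical stand-in, not the study's code.
import torch
import torch.nn as nn

torch.manual_seed(0)
sensory_dim, instr_dim, hidden_dim = 8, 32, 16

# Two copies of the same recurrent task network: A (notionally trained)
# and B (never exposed to the task).
net_a = nn.GRU(sensory_dim + instr_dim, hidden_dim, batch_first=True)
net_b = nn.GRU(sensory_dim + instr_dim, hidden_dim, batch_first=True)

# Hypothetical "production" head: maps A's internal state to an instruction
# vector (in the real system, this step passes through actual words).
produce_instruction = nn.Linear(hidden_dim, instr_dim)

# A runs the task it knows, then "describes" it from its final hidden state.
stimulus = torch.randn(1, 5, sensory_dim)
no_instr = torch.zeros(1, 5, instr_dim)
_, h_a = net_a(torch.cat([stimulus, no_instr], dim=-1))
instruction = produce_instruction(h_a[-1])  # A's description of the task

# B performs the task guided only by A's description.
instr_seq = instruction.unsqueeze(1).expand(-1, 5, -1)
out_b, _ = net_b(torch.cat([stimulus, instr_seq], dim=-1))
print(out_b.shape)  # torch.Size([1, 5, 16]): B's activity while doing the task
```

In the study itself, this exchange stayed purely linguistic: the first network's description was expressed in language that the second network read back through its own language model.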
Do as we humans do
The inspiration for this research came from the way humans learn to perform tasks from verbal or written instructions, even when they have never performed the action before. This cognitive ability distinguishes humans from animals: you have to show a dog what to do, for example, before you can train it to respond to a verbal command.
AI-powered chatbots can interpret linguistic instructions and generate images or text, but they cannot translate written or verbal instructions into physical actions, much less explain instructions to another AI.
But by simulating the areas of the human brain responsible for perceiving language, interpreting it, and acting on instructions, researchers have created an AI with human-like learning and communication skills.
This alone will not lead to artificial general intelligence (AGI), in which AI agents would be able to reason like humans and perform tasks across multiple domains. But the researchers noted that AI models like the one they created can help us understand how human brains work.
Robots with built-in AI could also communicate with each other to learn and perform tasks: if only one robot had to receive initial instructions, this could make manufacturing and training in other automated industries much more efficient.
“The network we developed is very small,” the researchers explained in the statement. “Nothing prevents the development, on this basis, of more complex networks that could be integrated into humanoid robots capable of understanding not only us, but also each other.”