Existential questions are also emerging as Anthropic battles OpenAI and other challengers in the growing artificial intelligence industry: can large language models, and the systems they enable, keep growing in size and capability? For CEO and co-founder Dario Amodei, the answer is a simple "yes."
Speaking on stage at TechCrunch Disrupt, Amodei explained that he doesn’t see any barriers looming for his company’s key technologies.
“Over the past decade, there has been an impressive increase in the scale used to train neural networks. We continue to scale up neural networks, and they continue to perform better and better,” he said. “That’s the basis of my feeling that what we’re going to see in the next two, three, four years… what we’re seeing today is going to pale in comparison.”
When asked whether he thought 1,000-trillion-parameter models would appear next year (there are rumors that 100-trillion-parameter models will appear this year), he replied that such a jump is outside the range predicted by scaling laws, since the required compute grows roughly with the square of the model's size. But the models can certainly still be expected to grow, he said.
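As a rough illustration of the quadratic relationship Amodei alludes to: if training compute grows with the square of parameter count, every 10x jump in parameters costs roughly 100x more compute. The baseline below is purely hypothetical and chosen only to make the ratios visible.

```python
def relative_compute(params: float, base_params: float = 1e11) -> float:
    """Training compute relative to a hypothetical 100B-parameter baseline,
    assuming compute scales with the square of parameter count."""
    return (params / base_params) ** 2

# 100B -> 1T -> 100T -> 1,000T parameters
for params in (1e11, 1e12, 1e14, 1e15):
    print(f"{params:.0e} params -> {relative_compute(params):.0e}x baseline compute")
```

Under this (very rough) assumption, a 1,000-trillion-parameter model would need on the order of a hundred million times the compute of a 100-billion-parameter one, which is why Amodei places it outside the near-term scaling trajectory.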
However, some researchers have suggested that no matter how large these transformer-based models get, certain tasks may remain difficult, if not impossible, for them. Researcher Yejin Choi has pointed out that some LLMs struggle even to multiply two 3-digit numbers, implying that some capability is missing at the heart of these otherwise high-performing models.
“Do you think there are fundamental limitations like these that need to be identified?” the moderator (me) asked.
“Yeah, so I don’t know if there are,” Amodei replied.
“And even if there are, I’m not sure there’s a good way to measure them,” he continued. “I think years of scaling experience have taught me to be very skeptical of claims that LLMs can’t do something. It may simply be that the model wasn’t trained in quite the right way, and that it could do it if trained slightly differently. So no, I’m skeptical of these hard lists of limitations, but I’m also skeptical of the skeptics.”
At the very least, Amodei suggested, there will be no diminishing returns from scaling over the next three to four years. Beyond that point, he joked, you might need an AI of over a quadrillion parameters to make predictions.