How AI and ML Are Impacting Physics

Physics 16, 166

The more physicists use artificial intelligence and machine learning, the more important it becomes to understand why the technology works and when it fails.

J. Horsthuis

Digital artist Julius Horsthuis created this piece, Gravitational Waves, using fractals and artificial intelligence.

The advent of ChatGPT, Bard, and other large language models (LLMs) has naturally excited everyone, including the entire physics community. For physicists there are many evolving questions regarding LLMs in particular and artificial intelligence (AI) in general. What do these incredible developments in big-data technologies mean for physics? How can they be incorporated into physics? What role does machine learning (ML) itself play in the process of physics discovery?

Before exploring the implications of these questions, it is important to point out that there is no doubt that AI and ML will become integral parts of physics research and education. Still, as with the role of AI in human society, we do not know how this new and rapidly evolving technology will affect physics in the long term, just as our predecessors did not know how transistors and computers, developed in the early 1950s, would come to affect physics. What we do know is that the impact of AI/ML on physics will be significant and will continue to grow as the technology develops.

Its impact is already being felt. A quick search I did of the Physical Review journals for "machine learning" in the article title, abstract, or both returned 1,456 hits since 2015, but only 64 hits in the entire period from the journals' debut in 1893 through 2014! The use of ML within articles is also increasing: the same search returned 310 Physical Review articles with ML in the title, abstract, or both in 2022 alone, and in the first six months of 2023 there were already 189 such publications.

That ML is already widely used in physics is not surprising, since physics often deals with very large data sets, such as those generated by some high-energy physics and astrophysics experiments. In fact, physicists have been using some form of ML for a long time, even before the term became popular. Neural networks, a fundamental pillar of AI, also have a long history in theoretical physics, as evidenced by the term "neural network" appearing in hundreds of Physical Review paper titles and abstracts going back to 1985, when it was first used in the context of models for understanding spin glasses. Although the use of neural networks in AI/ML is quite different from how they are represented in spin-glass models, the basic idea of using neural networks to represent complex systems is the same in both cases. ML and neural networks have thus been woven into the fabric of physics for more than 40 years.
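To make that shared idea concrete (an illustrative aside of mine, not part of the original article), the energy function of the Hopfield network, an early neural-network model from the 1980s, has exactly the form of a spin-glass Hamiltonian, with binary "neurons" s_i = ±1 playing the role of Ising spins and with couplings built from stored patterns ξ^μ:

    E = -\frac{1}{2} \sum_{i \neq j} J_{ij}\, s_i s_j, \qquad
    J_{ij} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu} \xi_j^{\mu}

When the couplings J_{ij} are instead taken to be random, the same quadratic energy describes a spin glass, which is why the analytical machinery of spin-glass physics transferred so directly to early neural-network theory.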

What has changed is the availability of very large computer clusters with vast amounts of computing power, which makes it possible to apply ML to many physical systems in a practical way. For my own field of condensed-matter physics, these advances mean that ML will increasingly be used to analyze large data sets of materials properties and to make predictions in such complex situations. AI/ML will become an everyday tool for all professional physicists, alongside vector calculus, differential geometry, and group theory. In fact, the use of AI/ML will soon become so widespread that we will no longer remember why it was ever such a big deal. At that point, this opinion piece of mine will seem a bit naive, like a 1940s pronouncement about using computers to do physics.
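As a rough illustration of what such everyday use might look like (a minimal sketch of mine with made-up data and a hypothetical target property, not an example from the article), a few lines of standard ML code can fit a surrogate model to a table of materials descriptors and predict a property for unseen compositions:

    # Minimal sketch: fit a surrogate ML model to a synthetic materials data set
    # and check how well it predicts a hypothetical property (illustrative only).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical descriptors, e.g., mean atomic number, electronegativity difference, lattice constant
    X = rng.uniform(size=(500, 3))
    # Hypothetical target property (say, a band gap) with a nonlinear dependence plus noise
    y = 1.5 * X[:, 0] ** 2 + np.sin(3 * X[:, 1]) - 0.5 * X[:, 2] + 0.05 * rng.normal(size=500)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("held-out R^2:", model.score(X_test, y_test))

Nothing in such a workflow is specific to physics, which is precisely what makes it an everyday tool.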

But what about deeper uses of AI/ML in physics, beyond their role as everyday tools? Can they help solve profound problems with far-reaching implications? For example, if physicists had had access to AI/ML in the 1950s, would it have come up with the Bardeen-Cooper-Schrieffer theory of superconductivity? Could such discoveries revolutionize the practice of theoretical physics? Most physicists I have spoken to firmly believe that this is impossible. Mathematicians feel the same way; I do not know of any mathematician who believes that AI/ML could prove something like the Riemann hypothesis or the Goldbach conjecture. I, on the other hand, am not so sure. All ideas are deeply rooted in some way in accumulated knowledge, and I am not willing to claim that we already know what AI/ML cannot do. After all, I remember a time when there was a widespread feeling that AI would never beat the great champions of the complex game of Go. A celebrated example is DeepMind's AlphaFold, which can predict the structure that a protein's amino-acid sequence will fold into, something thought to be impossible 20 years ago. That is a remarkable feat.

This brings me to my final point. AI/ML is being used to do physics, and that use will soon become commonplace. But what about understanding the effectiveness of AI/ML itself, and of LLMs in particular? If we think of an LLM as a complex system that suddenly becomes highly predictive after being trained on vast amounts of data, the natural question for a physicist to ask is: What is the nature of that change? Is it a true dynamical phase transition that occurs at some threshold amount of training? Or is it a routine consequence of interpolation between known data, something that works only empirically and perhaps occasionally even when extrapolating? The latter view, which seems to be held by most professional statisticians, involves no deep principles. The former, however, involves what we might call the physics of AI/ML, and it constitutes the most important intellectual question in my mind: why AI/ML works and when it fails. Is there a phase transition at some threshold beyond which the AI/ML algorithm predicts everything correctly? Or is the algorithm just a giant interpolation machine that works simply because the amount of data being interpolated is so huge that most questions fall within its scope? As physicists, we need to dig deeper into these questions rather than remain passive users of AI/ML. To paraphrase a famous quote from a former US president, we should ask not only what AI/ML can do for us (a lot, as it turns out) but also what we can do for AI/ML.
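One can get a feel for the interpolation-versus-extrapolation side of this question from a toy example (my own illustration, not from the article): a flexible model fit to data drawn from a limited range can be accurate inside that range yet fail badly outside it.

    # Toy illustration: a flexible fit interpolates well inside the training range
    # but can fail badly when extrapolating beyond it (illustrative only).
    import numpy as np

    rng = np.random.default_rng(1)
    x_train = rng.uniform(-1.0, 1.0, size=200)
    y_train = np.sin(3 * x_train) + 0.05 * rng.normal(size=200)

    # A high-degree polynomial stands in for a flexible ML model
    coeffs = np.polyfit(x_train, y_train, deg=12)

    for x in (0.5, 0.9, 1.5, 2.0):  # the first two interpolate, the last two extrapolate
        pred, truth = np.polyval(coeffs, x), np.sin(3 * x)
        print(f"x = {x:4.1f}   prediction = {pred:10.3f}   truth = {truth:6.3f}")

Whether anything like a genuine phase transition underlies the abrupt capabilities of LLMs is, of course, exactly the open question.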

About the author

Image of Sankar Das Sarma

Sankar Das Sarma is the Richard E. Prange Professor of Physics and Distinguished University Professor at the University of Maryland, College Park. He is also a Fellow of the Joint Quantum Institute and director of the Condensed Matter Theory Center at the University of Maryland. Das Sarma received his Ph.D. from Brown University in Rhode Island and has been a member of the physics department at the University of Maryland since 1980. His research interests include condensed-matter physics, quantum computing, and statistical mechanics.

