Nature’s speed limits aren’t posted on road signs, but Rice University physicists have discovered a new way to deduce them that is better — infinitely better, in some cases — than previous methods.
“The big question is, ‘How fast can anything — information, mass, energy — move in nature?’” said Kaden Hazzard, a theoretical quantum physicist at Rice. “It turns out that if somebody hands you a material, it is incredibly difficult, in general, to answer the question.”
In a study published today in the American Physical Society journal PRX Quantum, Hazzard and Rice graduate student Zhiyuan Wang describe a new method for calculating the upper bound of speed limits in quantum matter.
“At a fundamental level, these bounds are much better than what was previously available,” said Hazzard, an assistant professor of physics and astronomy and member of the Rice Center for Quantum Materials. “This method frequently produces bounds that are 10 times more accurate, and it’s not unusual for them to be 100 times more accurate. In some cases, the improvement is so dramatic that we find finite speed limits where previous approaches predicted infinite ones.”
Nature’s ultimate speed limit is the speed of light, but in nearly all matter around us, the speed of energy and information is much slower. Frequently, it is impossible to describe this speed without accounting for the large role of quantum effects.
In the 1970s, physicists Elliott Lieb and Derek Robinson proved that information in quantum materials must move much slower than the speed of light. Although the speeds themselves could not be computed exactly, the pair pioneered mathematical methods for calculating upper bounds on them.
“The idea is that even if I can’t tell you the exact top speed, can I tell you that the top speed must be less than a particular value,” Hazzard said. “If I can give a 100% guarantee that the real value is less than that upper bound, that can be extremely useful.”
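In its standard textbook form (a general statement of the 1970s result, not the specific bounds derived in the new paper), a Lieb-Robinson bound says that a measurement made in one region of a material can barely be affected by anything done in a distant region until enough time has passed for a signal traveling at some finite velocity to cover the distance between them. Up to constants absorbed into $C$, it reads

$$\big\| [A_X(t),\, B_Y] \big\| \;\le\; C\, e^{-\mu\,\left(d(X,Y) \,-\, v\,t\right)},$$

where $A_X(t)$ and $B_Y$ are operations confined to regions $X$ and $Y$, $d(X,Y)$ is the distance between the regions, and $C$, $\mu$ and the velocity $v$ depend on the material's microscopic details. The "speed limit" is this $v$, and the new work is about computing much smaller, and therefore more informative, values for it.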
Hazzard said physicists have long known that some of the bounds produced by the Lieb-Robinson method are “ridiculously imprecise.”
“It might say that information must move less than 100 miles per hour in a material when the real speed was measured at 0.01 miles per hour,” he said. “It’s not wrong, but it’s not very helpful.”
The more accurate bounds described in the PRX Quantum paper were calculated by a method Wang created.
“We invented a new graphical tool that lets us account for the microscopic interactions in the material instead of relying only on cruder properties such as its lattice structure,” Wang said.
Hazzard said Wang, a third-year graduate student, has an incredible talent for synthesizing mathematical relationships and recasting them in new terms.
“When I check his calculations, I can go step by step, churn through the calculations and see that they’re valid,” Hazzard said. “But to actually figure out how to get from point A to point B, what set of steps to take when there’s an infinite variety of things you could try at each step, the creativity is just amazing to me.”
The Wang-Hazzard method can be applied to any material made of particles moving in a discrete lattice. That includes oft-studied quantum materials like high-temperature superconductors, topological materials, heavy-fermion compounds and others. In each of these, the material's behavior arises from the interactions of billions upon billions of particles, a complexity that puts direct calculation out of reach.
Hazzard said he expects the new method to be used in several ways.
“Besides the fundamental nature of this, it could be useful for understanding the performance of quantum computers, in particular in understanding how long they take to solve important problems in materials and chemistry,” he said.
Hazzard said he is certain the method will also be used to develop numerical algorithms because Wang has shown it can put rigorous bounds on the errors produced by oft-used numerical techniques that approximate the behavior of large systems.
A popular technique physicists have used for more than 60 years is to approximate a large system by a small one that can be simulated by a computer.
“We draw a small box around a finite chunk, simulate that and hope that’s enough to approximate the gigantic system,” Hazzard said. “But there has not been a rigorous way of bounding the errors in these approximations.”
The Wang-Hazzard method of calculating bounds could lead to just that.
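To make the "small box" idea concrete, here is a minimal sketch in Python, assuming nothing from the paper itself: a single quantum particle hopping on a toy one-dimensional chain, simulated once on a small box and once on a much larger chain standing in for the infinite system. The chain lengths, hopping strength and times are invented values chosen purely for illustration.

```python
# Illustrative sketch only (not the Wang-Hazzard method): a single particle hopping
# on a 1D tight-binding chain.  We evolve the same initial state on a small box and
# on a much larger "reference" chain, and watch the difference stay tiny until the
# spreading wavefront reaches the small box's edge and echoes back.
import numpy as np

def evolve_center_site(L, t, hop=1.0):
    """Amplitudes at time t for a particle starting on the middle site of an
    open chain of L sites with nearest-neighbor hopping `hop`."""
    H = -hop * (np.eye(L, k=1) + np.eye(L, k=-1))    # tight-binding Hamiltonian
    psi0 = np.zeros(L, dtype=complex)
    psi0[L // 2] = 1.0                               # start in the center
    vals, vecs = np.linalg.eigh(H)
    return vecs @ (np.exp(-1j * vals * t) * (vecs.conj().T @ psi0))

small, big = 21, 201            # finite "box" vs. a stand-in for the infinite system
for t in [1, 3, 5, 8, 12]:
    psi_s = evolve_center_site(small, t)
    psi_b = evolve_center_site(big, t)
    lo = big // 2 - small // 2  # align the small box with the center of the big chain
    err = np.max(np.abs(np.abs(psi_s)**2 - np.abs(psi_b[lo:lo + small])**2))
    print(f"t = {t:4.1f}   max local error = {err:.2e}")
```

Running it, the local error is negligible at early times and then grows by many orders of magnitude once the wavefront hits the small box's edge, which is exactly when the "echo" Wang describes below begins to contaminate the finite simulation.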
“There is an intrinsic relationship between the error of a numerical algorithm and the speed of information propagation,” Wang explained, using the sound of his voice and the walls in his room to illustrate the link.
“The finite chunk has edges, just as my room has walls. When I speak, the sound will get reflected by the wall and echo back to me. In an infinite system, there is no edge, so there is no echo.”
In numerical algorithms, errors are the mathematical equivalent of echoes. They reverberate from the edges of the finite box, and the reflections undermine an algorithm's ability to simulate the infinite case. The faster information moves through the finite system, the shorter the time the algorithm faithfully represents the infinite one. Hazzard said he, Wang and others in his research group are using their method to craft numerical algorithms with guaranteed error bars.
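Schematically, and again as a statement of the general light-cone argument rather than of the paper's specific result, the echo picture translates into an error estimate of the form

$$\big|\langle O(t)\rangle_{\text{box}} - \langle O(t)\rangle_{\infty}\big| \;\lesssim\; C\, e^{-\mu\,(L/2 \,-\, v\,t)},$$

for an observable $O$ measured at the center of a simulated box of linear size $L$: the finite simulation tracks the infinite system faithfully for times up to roughly $L/(2v)$, after which the reflected "echo" arrives. A tighter bound on the velocity $v$ therefore directly lengthens the window of time over which a given simulation carries a guarantee.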
“We don’t even have to change the existing algorithms to put strict, guaranteed error bars on the calculations,” he said. “But you can also flip it around and use this to make better numerical algorithms. We’re exploring that, and other people are interested in using these as well.”