More stories

  • Quantum theory needs complex numbers

    Physicists construct theories to describe nature. Let us explain it through an analogy with something that we can do in our everyday life, like going on a hike in the mountains. To avoid getting lost, we generally use a map. The map is a representation of the mountain, with its houses, rivers, paths, etc. By using it, it is rather easy to find our way to the top of the mountain. But the map is not the mountain. The map constitutes the theory we use to represent the mountain’s reality.
    Physical theories are expressed in terms of mathematical objects, such as equations, integrals or derivatives. Throughout history, physical theories evolved, making use of more elaborate mathematical concepts to describe more complicated physical phenomena. Introduced in the early 20th century to represent the microscopic world, quantum theory was a game changer. Among the many drastic changes it brought, it was the first theory phrased in terms of complex numbers.
    Invented by mathematicians centuries ago, complex numbers are made of a real and an imaginary part. It was Descartes, the famous philosopher considered the father of rational sciences, who coined the term “imaginary,” to contrast it strongly with what he called “real” numbers. Despite their fundamental role in mathematics, complex numbers were not expected to play a similar role in physics because of this imaginary part. And in fact, before quantum theory, Newton’s mechanics and Maxwell’s electromagnetism used real numbers to describe, say, how objects move and how electromagnetic fields propagate. These theories sometimes employ complex numbers to simplify some calculations, but their axioms make use of real numbers only.
    Schrödinger’s bewilderment
    Quantum theory radically challenged this state of affairs because its founding postulates were phrased in terms of complex numbers. The new theory, although very useful for predicting the results of experiments (it perfectly explains the hydrogen atom’s energy levels, for instance), went against the intuition in favor of real numbers. Looking for a description of electrons, Schrödinger was the first to introduce complex numbers into quantum theory through his famous equation. However, he could not conceive that complex numbers could actually be necessary in physics at that fundamental level. It was as though he had found a map to represent the mountains, but this map was made of abstract and non-intuitive drawings. Such was his bewilderment that he wrote a letter to Lorentz on June 6, 1926, stating: “What is unpleasant here, and indeed directly to be objected to, is the use of complex numbers. ψ is surely fundamentally a real function.” Several decades later, in 1960, Prof. E.C.G. Stueckelberg, from the University of Geneva, demonstrated that all predictions of quantum theory for single-particle experiments could equally be derived using only real numbers. Since then, the consensus was that complex numbers in quantum theory were only a convenient tool.
    However, in a recent study published in Nature, ICFO researchers Marc-Olivier Renou and ICREA Prof. at ICFO Antonio Acín, in collaboration with Prof. Nicolas Gisin from the University of Geneva and the Schaffhausen Institute of Technology, Armin Tavakoli from the Vienna University of Technology, and David Trillo, Mirjam Weilenmann and Thinh P. Le, led by Prof. Miguel Navascués, from the Institute of Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences in Vienna, have proven that if the quantum postulates were phrased in terms of real numbers instead of complex ones, some predictions about quantum networks would necessarily differ. Indeed, the team came up with a concrete experimental proposal, involving three parties connected by two sources of particles, in which the prediction of standard complex quantum theory cannot be reproduced by its real counterpart.
    Two sources and three nodes
    To do this, they thought of a specific scenario that involves two independent sources (S and R), placed between three measurement nodes (A, B and C) in an elementary quantum network. The source S emits two particles, say photons, one to A and the second to B. The two photons are prepared in an entangled state, say in polarization; that is, their polarizations are correlated in a way which is allowed by (both complex and real) quantum theory but impossible classically. The source R does exactly the same: it emits two other photons prepared in an entangled state and sends them to B and C, respectively. The key point in this study was to find the appropriate way to measure these four photons at the nodes A, B and C in order to obtain predictions that cannot be explained when quantum theory is restricted to real numbers.
    As ICFO researcher Marc-Olivier Renou comments: “When we found this result, the challenge was to see if our thought experiment could be done with current technologies. After discussing with colleagues from Shenzhen, China, we found a way to adapt our protocol to make it feasible with their state-of-the-art devices. And, as expected, the experimental results match the predictions!” This remarkable experiment, realized in collaboration with Zheng-Da Li, Ya-Li Mao, Hu Chen, Lixin Feng, Sheng-Jun Yang and Jingyun Fan from the Southern University of Science and Technology, and Zizhu Wang from the University of Electronic Science and Technology, is published at the same time as the Nature paper in Physical Review Letters.
    The results published in Nature can be seen as a generalization of Bell’s theorem, which provides a quantum experiment whose results cannot be explained by any local physics formalism. Bell’s experiment involves a single quantum source S that emits two entangled photons, one to A and the second to B. Here, in contrast, one needs two independent sources; their assumed independence is crucial and was carefully enforced in the experiment.
    The study also shows how powerful predictions can become when the concept of a quantum network is combined with Bell’s ideas. Certainly, the tools developed to obtain this first result will allow physicists to achieve a better understanding of quantum theory, and may one day enable the realization of applications for the quantum internet that are so far unfathomable.

  • Quantum physics requires imaginary numbers to explain reality

    Imaginary numbers might seem like unicorns and goblins — interesting but irrelevant to reality. 

    But for describing matter at its roots, imaginary numbers turn out to be essential. They seem to be woven into the fabric of quantum mechanics, the math describing the realm of molecules, atoms and subatomic particles. A theory obeying the rules of quantum physics needs imaginary numbers to describe the real world, two new experiments suggest.

    Imaginary numbers result from taking the square root of a negative number. They often pop up in equations as a mathematical tool to make calculations easier. But everything we can actually measure about the world is described by real numbers, the normal, nonimaginary figures we’re used to (SN: 5/8/18). That’s true in quantum physics too. Although imaginary numbers appear in the inner workings of the theory, all possible measurements generate real numbers.

    Quantum theory’s prominent use of complex numbers — sums of imaginary and real numbers — was disconcerting to its founders, including physicist Erwin Schrödinger. “From the early days of quantum theory, complex numbers were treated more as a mathematical convenience than a fundamental building block,” says physicist Jingyun Fan of the Southern University of Science and Technology in Shenzhen, China.

    Some physicists have attempted to build quantum theory using real numbers only, avoiding the imaginary realm with versions called “real quantum mechanics.” But without an experimental test of such theories, the question remained whether imaginary numbers were truly necessary in quantum physics, or just a useful computational tool.

    A type of experiment known as a Bell test resolved a different quantum quandary, proving that quantum mechanics really requires strange quantum linkages between particles called entanglement (SN: 8/28/15). “We started thinking about whether an experiment of this sort could also refute real quantum mechanics,” says theoretical physicist Miguel Navascués of the Institute for Quantum Optics and Quantum Information Vienna. He and colleagues laid out a plan for an experiment in a paper posted online at arXiv.org in January 2021 and published December 15 in Nature.

    In this plan, researchers would send pairs of entangled particles from two different sources to three different people, named according to conventional physics lingo as Alice, Bob and Charlie. Alice receives one particle, and can measure it using various settings that she chooses. Charlie does the same. Bob receives two particles and performs a special type of measurement to entangle the particles that Alice and Charlie receive. A real quantum theory, with no imaginary numbers, would predict different results than standard quantum physics, allowing the experiment to distinguish which one is correct.

    Fan and colleagues performed such an experiment using photons, or particles of light, they report in a paper to be published in Physical Review Letters. By studying how Alice, Charlie and Bob’s results compare across many measurements, Fan, Navascués and colleagues show that the data could be described only by a quantum theory with complex numbers.

    Another team of physicists conducted an experiment based on the same concept using a quantum computer made with superconductors, materials which conduct electricity without resistance. Those researchers, too, found that quantum physics requires complex numbers, they report in another paper to be published in Physical Review Letters. “We are curious about why complex numbers are necessary and play a fundamental role in quantum mechanics,” says quantum physicist Chao-Yang Lu of the University of Science and Technology of China in Hefei, a coauthor of the study.

    But the results don’t rule out all theories that eschew imaginary numbers, notes theoretical physicist Jerry Finkelstein of Lawrence Berkeley National Laboratory in California, who was not involved with the new studies. The study eliminated certain theories based on real numbers, namely those that still follow the conventions of quantum mechanics. It’s still possible to explain the results without imaginary numbers by using a theory that breaks standard quantum rules. But those theories run into other conceptual issues, making them “ugly,” he says. Still, “if you’re willing to put up with the ugliness, then you can have a real quantum theory.”

    Despite the caveat, other physicists agree that the quandaries raised by the new findings are compelling. “I find it intriguing when you ask questions about why is quantum mechanics the way it is,” says physicist Krister Shalm of the National Institute of Standards and Technology in Boulder, Colo. Asking whether quantum theory could be simpler or if it contains anything unnecessary, “these are very interesting and thought-provoking questions.”

  • E-waste recycling emits emerging synthetic antioxidants

    Manufacturers add synthetic antioxidants to plastics, rubbers and other polymers to make them last longer. However, the health effects of these compounds, and how readily they migrate into the environment, are largely unknown. Now, researchers reporting in ACS’ Environmental Science & Technology Letters have detected a broad range of emerging synthetic antioxidants, called hindered phenol and sulfur antioxidants, in dust from electronic waste (e-waste) recycling workshops, possibly posing risks for the workers inside.
    Previous studies revealed widespread environmental pollution and human exposure to a class of compounds called low-molecular weight synthetic phenolic antioxidants. In lab experiments, some of these compounds were toxic to rodents or human cells. Recently, manufacturers introduced a class of high-molecular weight synthetic phenolic antioxidants, also known as hindered phenol antioxidants (HPAs), with improved performance and slower migration from products. In addition to HPAs, compounds called sulfur antioxidants (SAs) are often added to rubber and plastic polymers as “helper” antioxidants. The toxicological effects and environmental occurrence of most of these new compounds are unknown. Therefore, Lixi Zeng and colleagues wanted to investigate the occurrence of emerging HPAs and SAs in dust from e-waste recycling centers — workshops where large amounts of discarded electronics, such as laptop computers, cell phones, tablets, wires and cables are dismantled and processed.
    In August 2020, the researchers collected 45 dust samples from three categories of e-waste recycling workshops in an industrial park in Yichun City, China: wire and cable dismantling, electronic plastic processing, and general e-waste dismantling. Then, they used liquid chromatography/tandem mass spectrometry to screen for 18 emerging HPAs and 6 emerging SAs. All 24 compounds were detected in the dust: 22 for the first time, and some at relatively high levels compared with other e-waste pollutants. Although dust concentrations of SAs were similar for the different categories of workshops, centers that dismantled wires and cables and processed electronic plastics had significantly higher levels of dust HPAs than those that dismantled general e-wastes. Given the ubiquitous occurrence of emerging HPAs and SAs in e-waste dust, further research is needed on their environmental behaviors, fates, toxicities and risks, the researchers say.
    The authors acknowledge funding from the National Natural Science Foundation of China, the Guangdong Special Support Program, the Guangdong (China) Innovative and Entrepreneurial Research Team Program, the Special Fund Project for Science and Technology Innovation Strategy of Guangdong Province and the Fundamental Research Funds for the Central Universities.
    Story Source:
    Materials provided by American Chemical Society. Note: Content may be edited for style and length.

  • Creating the human-robotic dream team

    Using autonomous vehicle guidelines, a team of UBC Okanagan researchers has developed a system to improve interactions between people and robots.
    The way people interact safely with robots is at the forefront of today’s research related to automation and manufacturing, explains Debasita Mukherjee, a doctoral student and lead author of a recently published study. She is one of several researchers at UBC’s Advanced Control and Intelligent Systems Laboratory who are working to develop systems that allow humans and robots to interact safely and efficiently.
    “It is incredibly important for robots in manufacturing to perform their tasks in the safest and most efficient method possible,” Mukherjee says. “In order to make these automated machines as smart as possible, we are developing systems that perceive their environments and carry out tasks in a similar manner as their human partners.”
    To develop such systems, researchers are using artificial intelligence and machine learning to help guide the machines. Mechanical Engineering Professor Homayoun Najjaran says the process is not as straightforward as it seems.
    “Robots don’t think or feel, so they need systems that capture and analyze their environment enabling them to respond,” says Dr. Najjaran. “Often those responses need to be in hundredths of a second to ensure the safety of humans in their vicinity.”
    Traditionally, industrial robots have been fixed and programmed to operate at high speeds and perform tasks such as welding, painting, assembly, pick-and-place and material handling. Social robots, on the other hand, are built to assist people in service industries. They are typically mobile, lightweight and programmed to work in a variety of environments.

  • Cancer-spotting AI and human experts can be fooled by image-tampering attacks

    Artificial intelligence (AI) models that evaluate medical images have the potential to speed up and improve the accuracy of cancer diagnoses, but they may also be vulnerable to cyberattacks. In a new study, University of Pittsburgh researchers simulated an attack that falsified mammogram images, fooling both an AI breast cancer diagnosis model and human breast imaging radiologist experts.
    The study, published today in Nature Communications, brings attention to a potential safety issue for medical AI known as “adversarial attacks,” which seek to alter images or other inputs to make models arrive at incorrect conclusions.
    “What we want to show with this study is that this type of attack is possible, and it could lead AI models to make the wrong diagnosis — which is a big patient safety issue,” said senior author Shandong Wu, Ph.D., associate professor of radiology, biomedical informatics and bioengineering at Pitt. “By understanding how AI models behave under adversarial attacks in medical contexts, we can start thinking about ways to make these models safer and more robust.”
    AI-based image recognition technology for cancer detection has advanced rapidly in recent years, and several breast cancer models have U.S. Food and Drug Administration (FDA) approval. According to Wu, these tools can rapidly screen mammogram images and identify those most likely to be cancerous, helping radiologists be more efficient and accurate.
    But such technologies are also at risk from cyberthreats, such as adversarial attacks. Potential motivations for such attacks include insurance fraud from health care providers looking to boost revenue or companies trying to adjust clinical trial outcomes in their favor. Adversarial attacks on medical images range from tiny manipulations that change the AI’s decision, but are imperceptible to the human eye, to more sophisticated versions that target sensitive contents of the image, such as cancerous regions — making them more likely to fool a human.
    To understand how AI would behave under this more complex type of adversarial attack, Wu and his team used mammogram images to develop a model for detecting breast cancer. First, the researchers trained a deep learning algorithm to distinguish cancerous and benign cases with more than 80% accuracy. Next, they developed a so-called “generative adversarial network” (GAN), a computer program that generates false images by inserting cancerous regions into negative images or removing them from positive ones, and then tested how the model classified these adversarial images.

  • First optical oscilloscope

    A team from UCF has developed the world’s first optical oscilloscope, an instrument that is able to measure the electric field of light. The device converts light oscillations into electrical signals, much like hospital monitors convert a patient’s heartbeat into electrical oscillations.
    Until now, reading the electric field of light has been a challenge because of the high speeds at which light waves oscillate. The most advanced techniques, which power our phone and internet communications, can currently clock electric fields at up to gigahertz frequencies — covering the radio frequency and microwave regions of the electromagnetic spectrum. Light waves oscillate at much higher rates, allowing a higher density of information to be transmitted. However, the current tools for measuring light fields could resolve only an average signal associated with a ‘pulse’ of light, and not the peaks and valleys within the pulse. Measuring those peaks and valleys within a single pulse is important because it is in that space that information can be packed and delivered.
    “Fiber optic communications have taken advantage of light to make things faster, but we are still functionally limited by the speed of the oscilloscope,” says Physics Associate Professor Michael Chini, who worked on the research at UCF. “Our optical oscilloscope may be able to increase that speed by a factor of about 10,000.”
    The team’s findings are published in this week’s Nature Photonics journal.
    The team developed the device and demonstrated its capability for real-time measurement of the electric fields of individual laser pulses in Chini’s lab at UCF. The next step for the team is to see how far they can push the speed limits of the technique.
    The lead author of the paper is UCF postdoctoral scholar Yangyang Liu. Other authors include physics alums Jonathan Nesper ’19 ’21MS, who earned his bachelor’s in math and master’s in physics; Shima Gholam-Mirzaei ’18MS ’20PhD; and John E. Beetar ’15 ’17MS ’20PhD.
    Gholam-Mirzaei is now a postdoctoral researcher at the Joint Attosecond Science Laboratory at the National Research Council of Canada and University of Ottawa and Beetar is completing a postdoc at the University of California at Berkeley.
    Chini had the idea for the single-shot waveform measurement scheme and oversaw the research team. Liu led the experimental effort and performed most of the measurements and simulations. Beetar assisted with the measurements of the carrier-envelope phase dependence. Nesper and Gholam-Mirzaei assisted with the construction of the experimental setup and with the data collection. All authors contributed to the data analysis and wrote the journal article.
    The work was supported primarily through a grant from the Air Force Office of Scientific Research under Award No. FA9550-20-1-0284, while Gholam-Mirzaei was supported by the Army Research Office under Award No. W911NF-19-1-0211.
    Story Source:
    Materials provided by University of Central Florida. Original written by Zenaida Gonzalez Kotala. Note: Content may be edited for style and length.

  • Antarctica’s Thwaites Glacier ice shelf could collapse within five years

    The demise of a West Antarctic glacier poses the world’s biggest threat of raising sea levels before 2100 — and an ice shelf that’s holding it back from the sea could collapse within three to five years, scientists reported December 13 at the American Geophysical Union’s fall meeting in New Orleans.

    Thwaites Glacier is “one of the largest, highest glaciers in Antarctica — it’s huge,” Ted Scambos, a glaciologist at the Boulder, Colo.–based Cooperative Institute for Research in Environmental Sciences, told reporters. At 120 kilometers across, the glacier is roughly the size of Florida, and were the whole thing to fall into the ocean, it would raise sea levels by 65 centimeters, or more than two feet. Right now, its melting is responsible for about 4 percent of global sea level rise.

    But a large portion of the glacier is about to lose its tenuous grip on the seafloor, and that will dramatically speed up its seaward slide, the researchers said. Since about 2004, the eastern third of Thwaites has been braced by a floating ice shelf, an extension of the glacier that juts out into the sea. Right now, the underbelly of that ice shelf is lodged against an underwater mountain located about 50 kilometers offshore. That pinning point is essentially helping to hold the whole mass of ice in place.

    But data collected by researchers beneath and around the shelf in the last two years suggests that brace won’t hold much longer. Warm ocean waters are inexorably eating away at the ice from below (SN: 4/9/21; SN: 9/9/20). As the glacier’s ice shelf loses mass, it’s retreating inland, and will eventually retreat completely behind the underwater mountain pinning it in place. Meanwhile, fractures and crevasses, widened by these waters, are swiftly snaking through the ice like cracks in a car’s windshield, shattering and weakening it. 

    This deadly punch-jab-uppercut combination of melting from below, ice shattering and losing its grip on the pinning point is pushing the ice shelf to imminent collapse, within as little as three to five years, said Erin Pettit, a glaciologist at Oregon State University in Corvallis. And “the collapse of this ice shelf will result in a direct increase in sea level rise, pretty rapidly,” Pettit added. “It’s a little bit unsettling.”

    Satellite data show that over the last 30 years, the flow of Thwaites Glacier across land and toward the sea has nearly doubled in pace. The collapse of this “Doomsday Glacier” alone would alter sea levels significantly, but its fall would also destabilize other West Antarctic glaciers, dragging more ice into the ocean and raising sea levels even more.

    That makes Thwaites “the most important place to study for near-term sea level rise,” Scambos said. So in 2018, researchers from the United States and the United Kingdom embarked on a joint five-year project to intensively study the glacier and try to anticipate its imminent future by planting instruments atop, within and below the glacier, as well as offshore of it.

    This pull-out-all-the-stops approach to studying Thwaites is leading to other rapid discoveries, including the first observations of ocean and melting conditions right at a glacier’s grounding zone, where the land-based glacier begins to jut out into a floating ice shelf. Scientists have also spotted how the rise and fall of ocean tides can speed up melting, by pumping warm waters farther beneath the ice and creating new melt channels and crevasses in the underside of the ice.

    Image caption: To better understand the rapid retreat of Thwaites Glacier, scientists drilled a hole through the ice at the glacier’s grounding zone, where the land-based glacier juts out into the sea to become a floating ice shelf. Heated water carved a borehole through the ice down to the grounding zone, allowing scientists to take the first ever measurements of ocean conditions in the region. Credit: Peter Davis/BAS

    As Thwaites and other glaciers retreat inland, some scientists have pondered whether they might form very tall cliffs of ice along the edge of the ocean — and the potential tumble of such massive blocks of ice into the sea could lead to devastatingly rapid sea level rise, a hypothesis known as marine ice cliff instability (SN: 2/6/19). How likely researchers consider such a collapse depends on their understanding of the physics and dynamics of ice behavior, something about which scientists have historically known very little (SN: 9/23/20).

    The Thwaites collaboration is also tackling this problem. In simulations of the further retreat of Thwaites, glaciologist Anna Crawford of the University of St. Andrews in Scotland and her colleagues found that if the shape of the land beneath the glacier dips deep enough in some places, that could lead to some very tall ice cliffs — but, they found, the ice itself might also deform and thin enough to make tall ice cliff formation difficult.

    The collaboration is only at its halfway point now, but these data already promise to help scientists better estimate the near-term future of Thwaites, including how quickly and dramatically it might fall, Scambos said. “We’re watching a world that’s doing things we haven’t really seen before, because we’re pushing on the climate extremely rapidly with carbon dioxide emissions,” he added. “It’s daunting.”

  • Losing isn’t always bad: Gaining topology from loss

    Losing particles can lead to positive, robust effects.
    An international collaboration has demonstrated a novel topology arising from losses in hybrid light-matter particles, introducing a new avenue to induce the highly prized effects inherent to conventional topological materials, which could potentially revolutionise electronics.
    Led by Singapore’s Nanyang Technological University (NTU) and the Australian National University (ANU), the study represents the first experimental observation of a non-Hermitian topological invariant in a semiconductor in the strong light-matter coupling regime supporting formation of exciton-polaritons.
    Losing Is Not Always Losing
    Losses, such as friction or electrical resistance, are ubiquitous in nature, but are seen as detrimental to devices.
    In electronics, for example, resistance leads to heating and limits computing efficiency.