More stories

  • Towards quantum simulation of false vacuum decay

    Phase transitions are everywhere, ranging from water boiling to snowflakes melting, and from magnetic transitions in solids to cosmological phase transitions in the early universe. Particularly intriguing are quantum phase transitions that occur at temperatures close to absolute zero and are driven by quantum rather than thermal fluctuations.
    Researchers at the University of Cambridge studied the properties of quantum phases and their transitions using ultracold atoms in an optical lattice potential (formed by a set of standing-wave lasers). Typically, the transition from a Mott insulator (MI) to a superfluid (SF), which is governed by the interplay between the atom-atom interactions and the hopping of atoms, is a continuous transition, in which the system changes smoothly as it crosses the phase transition point.
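    To see the basic competition at play, the tiny Python sketch below diagonalises a two-site, two-atom Bose-Hubbard toy model (a standard textbook caricature, not the experimental system): when the interaction dominates the hopping the atoms lock to one per site, as in a Mott insulator, while when hopping dominates the atom number on each site fluctuates strongly, as in a superfluid.
      import numpy as np

      def two_site_bose_hubbard(J: float, U: float) -> float:
          """Ground state of 2 bosons on 2 sites in the basis {|2,0>, |1,1>, |0,2>}.
          Returns the on-site number variance: small in the Mott-like regime (U >> J),
          large in the superfluid-like regime (J >> U). Toy model for illustration only."""
          s2 = np.sqrt(2.0)
          H = np.array([[U,       -s2 * J,  0.0],
                        [-s2 * J,  0.0,    -s2 * J],
                        [0.0,     -s2 * J,  U]])
          gs = np.linalg.eigh(H)[1][:, 0]       # lowest-energy eigenvector
          n1 = np.array([2.0, 1.0, 0.0])        # atoms on site 1 in each basis state
          probs = gs ** 2
          return float(probs @ n1 ** 2 - (probs @ n1) ** 2)

      for J in (0.05, 0.5, 5.0):                # hopping in units of the interaction U = 1
          print(J, round(two_site_bose_hubbard(J, U=1.0), 3))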
    However, many phase transitions are discontinuous, such as water freezing to ice, or the transition thought to have triggered the inflation period in the early universe. These are called ‘first-order transitions’ and for instance allow both phases to coexist — just like ice blocks in a glass of water — and can lead to hysteresis and metastability, where a system remains stuck in its original phase (the false vacuum) even though the ground state has changed.
    By resonantly shaking the position of the lattice potential, the researchers could couple or “mix” the first two bands of the lattice. For the right parameters, this can excite the atoms from the lowest band into the first excited band, where they would form a new superfluid in which the atoms appear at the edge of the band. Crucially, the transition from the original Mott insulator in the lowest band to the resulting staggered superfluid in the excited band can be first-order (discontinuous), because the non-staggered order in the Mott insulator is incompatible with the staggered order of this superfluid — so the system has to choose one. The researchers could directly observe the metastability and hysteresis associated with this first-order transition by monitoring how fast one phase changes into another, or not. The findings are published in the journal Nature Physics.
    “We realised a very flexible platform where phase transitions could be tuned from continuous to discontinuous by changing the shaking strength. This demonstration opens up new opportunities for exploring the role of quantum fluctuations in first-order phase transitions, for instance, the false vacuum decay in the early universe,” said first author Dr Bo Song from Cambridge’s Cavendish Laboratory. “It is really fascinating that we are on the road to cracking the mystery of the hot and dense early universe using such a cold and tiny atomic ensemble.”
    “We are excited to enhance the scope of quantum simulators from condensed matter settings towards potential simulations of the early universe. While there clearly is a long way still to go, this work is an important first step,” added Professor Ulrich Schneider, who led the research at the Cavendish Laboratory. “This work also provides a testbed for exploring the spontaneous formation of spatial structures when a strongly interacting quantum system undergoes a discontinuous transition.”
    “The underlying physics involves ideas that have a long history at the Cavendish, from Nevill Mott (on correlations) to Pyotr Kapitsa (on superfluidity), and even using shaking to effect dynamical control in a manner explained by Kapitsa but put to use in a way he would never have envisaged,” explained Professor Nigel Cooper, also from the Cavendish.
    The research is funded in part by the European Research Council (ERC), the UK Engineering and Physical Sciences Research Council (EPSRC), and the Simons Foundation.
    Story Source:
    Materials provided by University of Cambridge. Note: Content may be edited for style and length.

  • Towards compact quantum computers thanks to topology

    Researchers at PSI have compared the electron distribution below the oxide layer of two semiconductors. The investigation is part of an effort to develop particularly stable quantum bits and, in turn, particularly efficient quantum computers. They have now published their latest research, which is supported in part by Microsoft, in the scientific journal Advanced Quantum Technologies.
    By now, the future of computing is inconceivable without quantum computers. For the most part, these are still in the research phase. They hold the promise of speeding up certain calculations and simulations by orders of magnitude compared to classical computers.
    Quantum bits, or qubits for short, form the basis of quantum computers. So-called topological quantum bits are a novel type that might prove to be superior. To find out how these could be created, an international team of researchers has carried out measurements at the Swiss Light Source SLS at PSI.
    More stable quantum bits
    “Computer bits that follow the laws of quantum mechanics can be achieved in different ways,” explains Niels Schröter, one of the study’s authors. He was a researcher at PSI until April 2021, when he moved to the Max Planck Institute of Microstructure Physics in Halle, Germany. “Most types of qubits unfortunately lose their information quickly; you could say they are forgetful qubits.” There is a technical solution to this: Each qubit is backed up with a system of additional qubits that correct any errors that occur. But this means that the total number of qubits needed for an operational quantum computer quickly rises into the millions.
    “Microsoft’s approach, which we are now collaborating on, is quite different,” Schröter continues. “We want to help create a new kind of qubit that is immune to leakage of information. This would allow us to use just a few qubits to achieve a slim, functioning quantum computer.”
    The researchers hope to obtain such immunity with so-called topological quantum bits. These would be something completely new that no research group has yet been able to create.

  • A new approach to a $1 million mathematical enigma

    Numbers like π, e and φ often turn up in unexpected places in science and mathematics. Pascal’s triangle and the Fibonacci sequence also seem inexplicably widespread in nature. Then there’s the Riemann zeta function, a deceptively straightforward function that has perplexed mathematicians since the 19th century. The most famous quandary, the Riemann hypothesis, is perhaps the greatest unsolved question in mathematics, with the Clay Mathematics Institute offering a $1 million prize for a correct proof.
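    For readers meeting it for the first time: for real s > 1 the zeta function is simply the infinite sum ζ(s) = 1 + 1/2^s + 1/3^s + ..., and its mystery lies in how its continuation beyond that region encodes the prime numbers. The short Python sketch below truncates that sum and checks it against the classical value ζ(2) = π²/6; it is an illustrative aside, not material from Remmen's paper.
      import math

      def zeta_partial(s: float, terms: int = 200_000) -> float:
          """Truncated Dirichlet series: zeta(s) ~ sum_{n=1..N} n**(-s), valid for s > 1.
          The truncation error is roughly 1/N for s near 2."""
          return sum(n ** (-s) for n in range(1, terms + 1))

      # Sanity check against the classical value zeta(2) = pi**2 / 6 = 1.6449...
      print(zeta_partial(2.0), math.pi ** 2 / 6)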
    UC Santa Barbara physicist Grant Remmen believes he has a new approach for exploring the quirks of the zeta function. He has found an analogue that translates many of the function’s important properties into quantum field theory. This means that researchers can now leverage the tools from this field of physics to investigate the enigmatic and oddly ubiquitous zeta function. His work could even lead to a proof of the Riemann hypothesis. Remmen lays out his approach in the journal Physical Review Letters.
    “The Riemann zeta function is this famous and mysterious mathematical function that comes up in number theory all over the place,” said Remmen, a postdoctoral scholar at UCSB’s Kavli Institute for Theoretical Physics. “It’s been studied for over 150 years.”
    An outside perspective
    Remmen generally doesn’t work on cracking the biggest questions in mathematics. He’s usually preoccupied with chipping away at the biggest questions in physics. As the fundamental physics fellow at UC Santa Barbara, he normally devotes his attention to topics like particle physics, quantum gravity, string theory and black holes. “In modern high-energy theory, the physics of the largest scales and smallest scales both hold the deepest mysteries,” he remarked.
    One of his specialties is quantum field theory, which he describes as a “triumph of 20th century physics.” Most people have heard of quantum mechanics (subatomic particles, uncertainty, etc.) and special relativity (time dilation, E=mc², and so forth). “But with quantum field theory, physicists figured out how to combine special relativity and quantum mechanics into a description of how particles moving at or near the speed of light behave,” he explained.

  • New simulations can improve avalanche forecasting

    Computer simulations of snow cover can accurately forecast avalanche hazard, according to a new international study involving researchers from Simon Fraser University.
    Currently, avalanche forecasts in Canada are made by experienced professionals who rely on data from local weather stations and on-the-ground observations from ski and backcountry ski operators, avalanche control workers for transportation and industry, and volunteers who manually test the snowpack.
    But simulated snow cover models developed by a team of researchers are able to detect and track weak layers of snow and identify avalanche hazard in a completely different way — and can provide forecasters with another reliable tool when local data is insufficient or unavailable, according to a new study published in the journal Cold Regions Science and Technology.
    “As far as natural hazards go, avalanches are still one of the leading causes of fatalities in Canada,” says Simon Horton, a post-doctoral fellow with the SFU Centre for Natural Hazards Research and a forecaster with Avalanche Canada. “We’ve had these complex models that simulate the layers in the snowpack for a few decades now and they’re getting more and more accurate, but it’s been difficult to find out how to apply that to actual decision-making and improving safety.”
    Researchers took 16 years’ worth of daily meteorological, snow cover and avalanche data from two sites in Canada (Whistler and Rogers Pass, both in British Columbia) and from Weissfluhjoch in Davos, Switzerland, and ran computer simulations that could classify different avalanche situations.
    The simulations could determine avalanche risk, for either natural or artificial release, for problem types such as new snow, wind slab, persistent weak layers and wet snow conditions.
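    As a purely hypothetical illustration of what classifying avalanche situations from simulated snowpack output could look like, the toy Python labeller below maps a few invented snowpack features to the problem types listed above; the feature names and thresholds are made up for illustration and are not the study's actual model.
      def label_avalanche_problems(new_snow_cm: float, wind_speed_kmh: float,
                                   has_buried_weak_layer: bool, liquid_water_pct: float) -> list:
          """Toy rule-based labelling of avalanche problem types from simulated
          snowpack features; all thresholds are invented for illustration only."""
          problems = []
          if new_snow_cm >= 30:
              problems.append("new snow")
          if new_snow_cm >= 10 and wind_speed_kmh >= 25:
              problems.append("wind slab")
          if has_buried_weak_layer:
              problems.append("persistent weak layer")
          if liquid_water_pct >= 3:
              problems.append("wet snow")
          return problems

      print(label_avalanche_problems(35, 40, True, 0.5))
      # ['new snow', 'wind slab', 'persistent weak layer']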
    “In the avalanche forecasting world, describing avalanche problems — the common scenarios that you might expect to find — is a good way for forecasters to describe avalanche hazard and communicate it to the public, so they know what kind of conditions to expect when they head out,” says Horton. “So that information is already available, except those are all done through expert assessment based on what they know from available field observations. In a lot of situations, there’s a fair bit of uncertainty about the human assessment of what these types of avalanche problems will be.
    “That’s where having more automated tools that can help predict potential hazards can help forecasters better prepare an accurate, precise forecast.”
    The results of the study showed the modelling was consistent with the real observed frequencies of avalanches over those 16 years and that the approach has potential to support avalanche forecasting in the future.
    Researchers also believe the modelling might be useful to study the future impacts of climate change on snow instability.
    Story Source:
    Materials provided by Simon Fraser University. Note: Content may be edited for style and length.

  • Scientists achieve key elements for fault-tolerant quantum computation in silicon spin qubits

    Researchers from RIKEN and QuTech — a collaboration between TU Delft and the Netherlands Organisation for Applied Scientific Research (TNO) — have achieved a key milestone toward the development of a fault-tolerant quantum computer. They were able to demonstrate a two-qubit gate fidelity of 99.5 percent — higher than the 99 percent considered to be the threshold for building fault-tolerant computers — using electron spin qubits in silicon, which are promising for large-scale quantum computers as the nanofabrication technology for building them already exists. This study was published in Nature.
    The world is currently in a race to develop large-scale quantum computers that could vastly outperform classical computers in certain areas. However, these efforts have been hindered by a number of factors, in particular the problem of decoherence, or noise generated in the qubits. The problem becomes more serious as the number of qubits grows, which hampers scaling up. To achieve a large-scale computer useful for practical applications, it is believed that a two-qubit gate fidelity of at least 99 percent is required in order to implement the surface code for error correction. This threshold has been reached in certain types of computers using qubits based on superconducting circuits, trapped ions, and nitrogen-vacancy centers in diamond, but these are hard to scale up to the millions of qubits required for practical quantum computation with error correction.
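    As a rough illustration of why 99 percent is treated as the break-even point, a commonly quoted rule of thumb for the surface code puts the logical error rate at roughly A·(p/p_th)^((d+1)/2), where p is the physical error rate (one minus the gate fidelity), p_th is the threshold of about one percent, and d is the code distance. The Python sketch below uses illustrative values for the prefactor and threshold (assumptions, not figures from the paper) to show that below threshold the logical error rate shrinks as the code grows, while above threshold it only gets worse.
      def logical_error_rate(p_physical: float, distance: int,
                             p_threshold: float = 0.01, prefactor: float = 0.1) -> float:
          """Rule-of-thumb surface-code scaling: p_L ~ A * (p / p_th) ** ((d + 1) // 2).
          Prefactor and threshold are illustrative assumptions, not measured values."""
          return prefactor * (p_physical / p_threshold) ** ((distance + 1) // 2)

      # Physical error rates just above and well below the ~1 percent threshold
      for p in (0.012, 0.005, 0.001):
          print(p, [f"{logical_error_rate(p, d):.1e}" for d in (3, 5, 7)])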
    To address these problems, the group decided to experiment with a quantum dot structure nanofabricated on a strained silicon/silicon germanium quantum well substrate, using a controlled-NOT (CNOT) gate. In previous experiments, the gate fidelity was limited by slow gate speed. To improve the gate speed, they carefully designed the device and tuned it by applying different voltages to the gate electrodes, combining an established fast single-spin rotation technique using micromagnets with large two-qubit coupling. The result was a gate speed 10 times faster than in previous attempts. Interestingly, although it had been thought that increasing gate speed would always lead to better fidelity, they found that there was a limit beyond which increasing the speed actually made the fidelity worse.
    In the course of the experiments, they discovered that a property called the Rabi frequency — a marker of how the qubits change states in response to an oscillating field — is key to the performance of the system, and they found a range of frequencies for which the single-qubit gate fidelity was 99.8 percent and the two-qubit gate fidelity was 99.5 percent, clearing the required threshold.
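    In the simplest textbook two-level picture, a resonant drive flips the qubit population as P_excited(t) = sin²(π f_Rabi t), so the Rabi frequency sets how quickly a spin rotation completes and how precisely it must be timed. The snippet below simply evaluates that formula with an assumed 5 MHz drive, an illustrative number rather than the device's actual operating point.
      import numpy as np

      def rabi_population(t_ns: np.ndarray, rabi_freq_mhz: float) -> np.ndarray:
          """On-resonance Rabi flopping: P_excited(t) = sin^2(pi * f_Rabi * t)."""
          return np.sin(np.pi * rabi_freq_mhz * 1e6 * t_ns * 1e-9) ** 2

      t = np.linspace(0, 200, 5)                   # times in nanoseconds
      print(rabi_population(t, rabi_freq_mhz=5))   # a 5 MHz drive flips the spin in 100 ns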
    Through this, they demonstrated that they could achieve universal operations, meaning that all the basic operations that make up a quantum computation, namely single-qubit and two-qubit gates, could be performed at gate fidelities above the error correction threshold.
    To test the capability of the new system, the researchers implemented a two-qubit Deutsch-Jozsa algorithm and the Grover search algorithm. Both algorithms gave correct results with high fidelities of 96 to 97 percent, demonstrating that silicon quantum computers can perform quantum calculations with high accuracy.
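    To give a flavour of what such a two-qubit algorithm does, the sketch below simulates one Grover iteration over a four-item search space with plain NumPy; with two qubits, a single iteration already identifies the marked item with certainty. It is a generic state-vector illustration, not the gate sequence executed on the silicon device.
      import numpy as np

      def grover_2qubit(marked: int) -> np.ndarray:
          """One Grover iteration on 2 qubits (4 basis states): an oracle phase-flip
          followed by inversion about the mean singles out the marked index."""
          n = 4
          state = np.full(n, 0.5)              # H (x) H on |00>: uniform superposition
          oracle = np.eye(n)
          oracle[marked, marked] = -1          # phase-flip the marked item
          s = np.full((n, 1), 0.5)
          diffusion = 2 * s @ s.T - np.eye(n)  # inversion about the mean
          state = diffusion @ (oracle @ state)
          return np.abs(state) ** 2            # measurement probabilities

      print(grover_2qubit(marked=2))           # ~[0, 0, 1, 0]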
    Akito Noiri, the first author of the study, says, “We are very happy to have achieved a high-fidelity universal quantum gate set, one of the key challenges for silicon quantum computers.”
    Seigo Tarucha, leader of the research groups, said, “The presented result makes spin qubits, for the first time, competitive against superconducting circuits and ion traps in terms of universal quantum control performance. This study demonstrates that silicon quantum computers are promising candidates, along with superconductivity and ion traps, for research and development toward the realization of large-scale quantum computers.”
    Story Source:
    Materials provided by RIKEN. Note: Content may be edited for style and length.

  • Quantum computing in silicon hits 99% accuracy

    UNSW Sydney-led research paves the way for large silicon-based quantum processors for real-world manufacturing and application.
    Australian researchers have proven that near error-free quantum computing is possible, paving the way to build silicon-based quantum devices compatible with current semiconductor manufacturing technology.
    “Today’s publication in Nature shows our operations were 99 per cent error-free,” says Professor Andrea Morello of UNSW, who led the work.
    “When the errors are so rare, it becomes possible to detect them and correct them when they occur. This shows that it is possible to build quantum computers that have enough scale, and enough power, to handle meaningful computation.”
    “This piece of research is an important milestone on the journey that will get us there,” Prof. Morello says.
    Quantum computing in silicon hits the 99% threshold
    Morello’s paper is one of three published today in Nature that independently confirm that robust, reliable quantum computing in silicon is now a reality. This breakthrough features on the front cover of the journal. Morello et al. achieved 1-qubit operation fidelities up to 99.95 per cent, and 2-qubit fidelity of 99.37 per cent with a three-qubit system comprising an electron and two phosphorus atoms, introduced in silicon via ion implantation. A Delft team in the Netherlands led by Lieven Vandersypen achieved 99.87 per cent 1-qubit and 99.65 per cent 2-qubit fidelities using electron spins in quantum dots formed in a stack of silicon and silicon-germanium alloy (Si/SiGe). A RIKEN team in Japan led by Seigo Tarucha similarly achieved 99.84 per cent 1-qubit and 99.51 per cent 2-qubit fidelities in a two-electron system using Si/SiGe quantum dots.

  • Inner workings of quantum computers

    A precision diagnostic developed at the Department of Energy’s Sandia National Laboratories is emerging as a gold standard for detecting and describing problems inside quantum computing hardware.
    Two papers published today in the scientific journal Nature describe how separate research teams — one including Sandia researchers — used a Sandia technique called gate set tomography (GST) to develop and validate highly reliable quantum processors. Sandia has been developing gate set tomography since 2012, with funding from the DOE Office of Science through the Advanced Scientific Computing Research program.
    Sandia scientists collaborated with Australian researchers at the University of New South Wales in Sydney, led by Professor Andrea Morello, to publish one of today’s papers. Together, they used GST to show that a sophisticated, three-qubit system comprising two atomic nuclei and one electron in a silicon chip could be manipulated reliably with 99%-plus accuracy.
    In another Nature article appearing today, a group led by Professor Lieven Vandersypen at Delft University of Technology in the Netherlands used gate set tomography, implemented using Sandia software, to demonstrate the important milestone of 99%-plus accuracy but with a different approach, controlling electrons trapped within quantum dots instead of isolated atomic nuclei.
    “We want researchers everywhere to know they have access to a powerful, cutting-edge tool that will help them make their breakthroughs,” said Sandia scientist Robin Blume-Kohout.
    Future quantum processors with many more qubits, or quantum bits, could enable users working in national security, science and industry to perform some tasks faster than they ever could with a conventional computer. But flaws in current system controls cause computational errors. A quantum computer can correct some errors, but the more errors it must correct, the larger and more expensive that computer becomes to build.

  • Solving a crystal's structure when you've only got powder

    Crystals reveal the hidden geometry of molecules to the naked eye. Scientists use crystals to figure out the atomic structure of new materials, but many can’t be grown large enough. Now, a team of researchers report a new technique in the January 19 issue of Nature that can discover the crystalline structure of any material.
    To truly understand a chemical, a scientist needs to know how its atoms are arranged. Sometimes that’s easy: for example, both diamond and gold are made of a single kind of atom (carbon or gold, respectively) arranged in a cubic grid. But often it’s harder to figure out more complicated ones.
    “Every single one of these is a special snowflake — growing them is really difficult,” says UConn chemical physicist Nate Hohman. Hohman studies metal-organic chalcogenolates. They’re made of a metal combined with an organic polymer and an element from column 16 of the periodic table (sulfur, selenium, tellurium, or polonium). Some are brightly colored pigments; others become more electrically conductive when light is shone on them; others make good solid lubricants that don’t burn up in the high temperatures of oil refineries or mines.
    It’s a large, useful family of chemicals. But the ones Hohman studies — hybrid chalcogenolates — are really difficult to crystallize. Hohman’s lab couldn’t solve the atomic structures, because they couldn’t grow large perfect crystals. Even the tiny powdered crystals they could get were imperfect and messy.
    X-ray crystallography is the standard way to figure out the atomic arrangements of more complicated materials. A famous early example was how Rosalind Franklin used it to figure out the structure of DNA. She isolated large, perfect pieces of DNA in crystalline form, and then illuminated them with X-rays. X-rays have wavelengths so short that they diffract through the spaces between atoms, the same way visible light diffracts through slots in metal. By doing the math on the diffraction pattern, you can figure out the spacing of the slots — or atoms — that made it.
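    That 'math' starts from Bragg's law, n·λ = 2d·sin θ, which ties the X-ray wavelength and the diffraction angle to the spacing between planes of atoms. The short Python example below plugs in a common laboratory wavelength and a made-up peak position; it is a generic illustration, not data from Hohman's measurements.
      import math

      def bragg_spacing(wavelength_angstrom: float, two_theta_deg: float, order: int = 1) -> float:
          """Plane spacing d from Bragg's law: n * wavelength = 2 * d * sin(theta)."""
          theta = math.radians(two_theta_deg / 2)  # detectors report the angle as 2-theta
          return order * wavelength_angstrom / (2 * math.sin(theta))

      # Cu K-alpha X-rays (1.5406 angstrom) and an illustrative peak at 2-theta = 28.4 degrees
      print(f"d = {bragg_spacing(1.5406, 28.4):.3f} angstrom")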
    Once you know the atomic structure of a material, a whole new world opens up. Materials scientists use that information to design specific materials to do special things. For example, maybe you have a material that bends light in cool ways, so that it becomes invisible under ultraviolet light. If you understand the atomic structure, you might be able to tweak it — substitute a similar element of a different size in a specific spot, say — and make it do the same thing in visible light. Voila, an invisibility cloak!