More stories

  • Foundational step shows quantum computers can be better than the sum of their parts

    Pobody’s nerfect — not even the indifferent, calculating bits that are the foundation of computers. But JQI Fellow Christopher Monroe’s group, together with colleagues from Duke University, has made progress toward ensuring we can trust the results of quantum computers even when they are built from pieces that sometimes fail. The team has shown in an experiment, for the first time, that an assembly of quantum computing pieces can be better than the worst parts used to make it. In a paper published in the journal Nature on Oct. 4, 2021, the team shared how they took this landmark step toward reliable, practical quantum computers.
    In their experiment, the researchers combined several qubits — the quantum version of bits — so that they functioned together as a single unit called a logical qubit. They created the logical qubit based on a quantum error correction code so that, unlike the individual physical qubits, it allows errors to be easily detected and corrected, and they made it fault-tolerant — able to contain errors so as to minimize their negative effects.
    “Qubits composed of identical atomic ions are natively very clean by themselves,” says Monroe, who is also a Fellow of the Joint Center for Quantum Information and Computer Science and a College Park Professor in the Department of Physics at the University of Maryland. “However, at some point, when many qubits and operations are required, errors must be reduced further, and it is simpler to add more qubits and encode information differently. The beauty of error correction codes for atomic ions is they can be very efficient and can be flexibly switched on through software controls.”
    This is the first time that a logical qubit has been shown to be more reliable than the most error-prone step required to make it. The team was able to successfully put the logical qubit into its starting state and measure it 99.4% of the time, despite relying on six quantum operations that are individually expected to work only about 98.9% of the time.
    That might not sound like a big difference, but it’s a crucial step in the quest to build much larger quantum computers. If the six quantum operations were assembly line workers, each focused on one task, the assembly line would only produce the correct initial state 93.6% of the time (98.9% multiplied by itself six times) — an error rate roughly ten times worse than the one measured in the experiment. The improvement comes about because the imperfect pieces work together to minimize the chance of quantum errors compounding and ruining the result, much as watchful workers catch each other’s mistakes.
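    As a quick, purely illustrative check of the arithmetic above (this is a sketch, not code from the study), a few lines of Python reproduce the compounded failure rate and the roughly tenfold improvement:
        # Back-of-the-envelope check of the numbers quoted in the article.
        p_op = 0.989                # each of the six operations succeeds ~98.9% of the time
        p_chain = p_op ** 6         # success rate if the six errors simply compounded
        print(round(p_chain, 3))    # 0.936, i.e. a 6.4% combined error rate

        err_chain = 1 - p_chain     # ~0.064 if the pieces failed independently
        err_logical = 1 - 0.994     # 0.006 measured for the fault-tolerant logical qubit
        print(round(err_chain / err_logical, 1))  # ~10.7 — roughly ten times better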
    The results were achieved using Monroe’s ion-trap system at UMD, which uses up to 32 individual charged atoms — ions — that are cooled with lasers and suspended over electrodes on a chip. The researchers then use each ion as a qubit by manipulating it with lasers.

  • Scientists are one step closer to error-correcting quantum computers

    Mistakes happen — especially in quantum computers. The fragile quantum bits, or qubits, that make up the machines are notoriously error-prone, but now scientists have shown that they can fix the flubs.

    Computers that harness the rules of quantum mechanics show promise for making calculations far out of reach for standard computers (SN: 6/29/17). But without a mechanism for fixing the computers’ mistakes, the answers that a quantum computer spits out could be gobbledygook (SN: 6/22/20).

    Combining the power of multiple qubits into one can solve the error woes, researchers report October 4 in Nature. Scientists used nine qubits to make a single, improved qubit called a logical qubit, which, unlike the individual qubits from which it was made, can be probed to check for mistakes.

    “This is a key demonstration on the path to build a large-scale quantum computer,” says quantum physicist Winfried Hensinger of the University of Sussex in Brighton, England, who was not involved in the new study.

    Still, that path remains a long one, Hensinger says. To do complex calculations, scientists will have to dramatically scale up the number of qubits in the machines. But now that scientists have shown that they can keep errors under control, he says, “there’s nothing fundamentally stopping us to build a useful quantum computer.”

    In a logical qubit, information is stored redundantly. That allows researchers to check and fix mistakes in the data. “If a piece of it goes missing, you can reconstruct it from the other pieces, like Voldemort,” says quantum physicist David Schuster of the University of Chicago, who was not involved with the new research. (The Harry Potter villain kept his soul safe by concealing it in multiple objects called Horcruxes.)

    In the new study, four additional auxiliary qubits interfaced with the logical qubit to identify errors in its data. Future quantum computers could make calculations using logical qubits in place of the original, faulty qubits, repeatedly checking and fixing any errors that crop up.
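
    The study itself used a nine-qubit quantum code (described below), but the core idea — store information redundantly, then use auxiliary checks to locate an error without reading the data directly — is easiest to see in the classical three-bit repetition code. The Python sketch below is a toy illustration of that idea, not the team’s code; quantum codes face the extra constraint that data can be neither copied nor directly measured.

        # Toy illustration: classical 3-bit repetition code.
        # (Not the 9-qubit Bacon-Shor code used in the study.)

        def encode(bit):
            # store one logical bit redundantly in three physical bits
            return [bit, bit, bit]

        def syndrome(block):
            # parity checks — the role the auxiliary qubits play:
            # they reveal where an error sits without revealing the data
            return (block[0] ^ block[1], block[1] ^ block[2])

        def correct(block):
            # each syndrome implicates exactly one bit; flip it back
            flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(block))
            if flip is not None:
                block[flip] ^= 1
            return block

        block = encode(1)                    # logical "1" becomes [1, 1, 1]
        block[2] ^= 1                        # a single physical bit flips
        assert correct(block) == [1, 1, 1]   # the error is found and fixed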

    To make their logical qubit, the researchers used a technique called a Bacon-Shor code, applying it to qubits made of ytterbium ions that hover above an ion-trapping chip inside a vacuum chamber and are manipulated with lasers. The researchers also designed the sequences of operations so that errors don’t multiply uncontrollably, a property known as “fault tolerance.”

    Thanks to those efforts, the new logical qubit had a lower error rate than that of the most flawed components that made it up, says quantum physicist Christopher Monroe of the University of Maryland in College Park and Duke University.

    However, the team didn’t quite complete the full process envisioned for error correction. While the computer detected the errors that arose, the researchers didn’t correct the mistakes and continue on with computation. Instead, they fixed errors after the computer was finished. In a full-fledged example, scientists would detect and correct errors multiple times on the fly.

    Demonstrating quantum error correction is a necessity for building useful quantum computers. “It’s like achieving criticality with [nuclear] fission,” Schuster says. Once that nuclear science barrier was passed in 1942, it led to technologies like nuclear power and atomic bombs (SN: 11/29/17).

    As quantum computers gradually draw closer to practical usefulness, companies are investing in the devices. Technology companies such as IBM, Google and Intel host major quantum computing endeavors. On October 1, a quantum computing company cofounded by Monroe, called IonQ, went public; Monroe spoke to Science News while on a road trip to ring the opening bell at the New York Stock Exchange.

    The new result suggests that full-fledged quantum error correction is almost here, says coauthor Kenneth Brown, a quantum engineer also at Duke University. “It really shows that we can get all the pieces together and do all the steps.”

  • Unprecedented view of a single catalyst nanoparticle at work

    A DESY-led research team has been using high-intensity X-rays to observe a single catalyst nanoparticle at work. The experiment has revealed for the first time how the chemical composition of the surface of an individual nanoparticle changes under reaction conditions, making it more active. The team led by DESY’s Andreas Stierle is presenting its findings in the journal Science Advances. This study marks an important step towards a better understanding of real, industrial catalytic materials.
    Catalysts are materials that promote chemical reactions without being consumed themselves. Today, catalysts are used in numerous industrial processes, from fertiliser production to manufacturing plastics. Because of this, catalysts are of huge economic importance. A very well-known example is the catalytic converter installed in the exhaust systems of cars. These contain precious metals such as platinum, rhodium and palladium, which allow highly toxic carbon monoxide (CO) to be converted into carbon dioxide (CO2) and reduce the amount of harmful nitrogen oxides (NOx).
    “In spite of their widespread use and great importance, we are still ignorant of many important details of just how the various catalysts work,” explains Stierle, head of the DESY NanoLab. “That’s why we have long wanted to study real catalysts while in operation.” This is not easy, because in order to make the active surface as large as possible, catalysts are typically used in the form of tiny nanoparticles, and the changes that affect their activity occur on their surface.
    Surface strain relates to chemical composition
    In the framework of the EU project Nanoscience Foundries and Fine Analysis (NFFA), the team from DESY NanoLab has developed a technique for labelling individual nanoparticles and thereby identifying them in a sample. “For the study, we grew nanoparticles of a platinum-rhodium alloy on a substrate in the lab and labelled one specific particle,” says co-author Thomas Keller of DESY NanoLab, who is in charge of the project at DESY. “The diameter of the labelled particle is around 100 nanometres, and it is similar to the particles used in a car’s catalytic converter.” A nanometre is a millionth of a millimetre.
    Using X-rays from the European Synchrotron Radiation Facility ESRF in Grenoble, France, the team was not only able to create a detailed image of the nanoparticle; it also measured the mechanical strain within its surface. “The surface strain is related to the surface composition, in particular the ratio of platinum to rhodium atoms,” explains co-author Philipp Pleßow from the Karlsruhe Institute of Technology (KIT), whose group computed strain as a function of surface composition. By comparing the observed and computed facet-dependent strain, conclusions can be drawn concerning the chemical composition at the particle surface. The different surfaces of a nanoparticle are called facets, just like the facets of a cut gemstone.
    When the nanoparticle is grown, its surface consists mainly of platinum atoms, as this configuration is energetically favoured. However, the scientists studied the shape of the particle and its surface strain under different conditions, including the operating conditions of an automotive catalytic converter. To do this, they heated the particle to around 430 degrees Celsius and allowed carbon monoxide and oxygen molecules to pass over it. “Under these reaction conditions, the rhodium inside the particle becomes mobile and migrates to the surface because it interacts more strongly with oxygen than the platinum,” explains Pleßow. This is also predicted by theory.
    “As a result, the surface strain and the shape of the particle change,” reports co-author Ivan Vartaniants, from DESY, whose team converted the X-ray diffraction data into three-dimensional spatial images. “A facet-dependent rhodium enrichment takes place, whereby additional corners and edges are formed.” The chemical composition of the surface, along with the shape and size of the particles, has a significant effect on their function and efficiency. However, scientists are only just beginning to understand exactly how these are connected and how to control the structure and composition of the nanoparticles. The X-rays allow researchers to detect strain changes as small as 0.1 per thousand, which in this experiment corresponds to a precision of about 0.0003 nanometres (0.3 picometres).
    Crucial step towards analysing industrial catalyst materials
    “We can now, for the first time, observe the details of the structural changes in such catalyst nanoparticles while in operation,” says Stierle, Lead Scientist at DESY and professor for nanoscience at the University of Hamburg. “This is a major step forward and is helping us to understand an entire class of reactions that make use of alloy nanoparticles.” Scientists at KIT and DESY now want to explore this systematically at the new Collaborative Research Centre 1441, funded by the German Research Foundation (DFG) and entitled “Tracking the Active Sites in Heterogeneous Catalysis for Emission Control” (TrackAct).
    “Our investigation is an important step towards analysing industrial catalytic materials,” Stierle points out. Until now, scientists have had to grow model systems in the laboratory in order to conduct such investigations. “In this study, we have gone to the limit of what can be done. With DESY’s planned X-ray microscope PETRA IV, we will be able to look at ten times smaller individual particles in real catalysts, and under reaction conditions.”

  • Low-cost, portable device could diagnose heart attacks in minutes

    Researchers from the University of Notre Dame and the University of Florida have developed a sensor that could diagnose a heart attack in less than 30 minutes, according to a study published in Lab on a Chip.
    Currently, it takes health care professionals hours to diagnose a heart attack. Initial results from an echocardiogram can quickly show indications of heart disease, but confirming that a patient is having a heart attack requires a blood sample and analysis. Those results can take up to eight hours.
    “The current methods used to diagnose a heart attack are not only time intensive, but they also have to be applied within a certain window of time to get accurate results,” said Pinar Zorlutuna, the Sheehan Family Collegiate Professor of Engineering at Notre Dame and lead author of the paper. “Because our sensor targets a combination of miRNA, it can quickly diagnose more than just heart attacks without the timeline limitation.”
    By targeting three distinct types of microRNA, or miRNA, the newly developed sensor can distinguish between an acute heart attack and reperfusion injury — injury caused by the restoration of blood flow — and it requires less blood than traditional diagnostic methods to do so. The ability to differentiate between someone with inadequate blood supply to an organ and someone with a reperfusion injury is an unmet clinical need that this sensor addresses.
    “The technology developed for this sensor showcases the advantage of using miRNA compared to protein-based biomarkers, the traditional diagnostic target,” said Hsueh-Chia Chang, the Bayer Professor of Chemical and Biomolecular Engineering at Notre Dame and co-author of the paper. “Additionally, the portability and cost efficiency of this device demonstrates the potential for it to improve how heart attacks and related issues are diagnosed in clinical settings and in developing countries.”
    A patent application has been filed for the sensor and the researchers are working with Notre Dame’s IDEA Center to potentially establish a startup company that would manufacture the device.
    Bioengineers Chang and Zorlutuna are both affiliated with Notre Dame’s Institute for Precision Health. Additional co-authors from Notre Dame are Stuart Ryan Blood, Cameron DeShetler, Bradley Ellis, Xiang Ren, George Ronan and Satyajyoti Senapati. Co-authors from the University of Florida are David Anderson, Eileen Handberg, Keith March and Carl Pepine. The study was funded by the National Institutes of Health National Heart, Lung, and Blood Institute.
    Story Source:
    Materials provided by the University of Notre Dame. Original written by Brandi Wampler.

  • How flawed diamonds 'lead' to flawless quantum networks

    Lead-based vacancy centers in diamonds that form after high-pressure and high-temperature treatment are ideal for quantum networks, find scientists. The modified crystal system could also find applications in spintronics and quantum sensors.
    The color in a diamond comes from a defect, or “vacancy,” where there is a missing carbon atom in the crystal lattice. Vacancies have long been of interest to electronics researchers because they can be used as ‘quantum nodes’ or points that make up a quantum network for the transfer of data. One of the ways of introducing a defect into a diamond is by implanting it with other elements, like nitrogen, silicon, or tin.
    In a recent study published in ACS Photonics, scientists from Japan demonstrate that lead-vacancy centers in diamond have the right properties to function as quantum nodes. “The use of a heavy group IV atom like lead is a simple strategy to realize superior spin properties at increased temperatures, but previous studies have not been consistent in determining the optical properties of lead-vacancy centers accurately,” says Associate Professor Takayuki Iwasaki of Tokyo Institute of Technology (Tokyo Tech), who led the study.
    The three critical properties researchers look for in a potential quantum node are symmetry, spin coherence time, and zero phonon lines (ZPLs) — electronic transition lines that do not involve “phonons,” the quanta of crystal lattice vibrations. Symmetry provides insight into how to control spin (the intrinsic angular momentum of subatomic particles like electrons), coherence refers to how closely the wave natures of two particles match, and ZPLs describe the optical quality of the crystal.
    The researchers fabricated the lead-vacancies in diamond and then subjected the crystal to high pressure and high temperature. They then studied the lead vacancies using photoluminescence spectroscopy, a technique that reads out the optical properties and allows the spin properties to be estimated. They found that the lead-vacancies had a type of dihedral symmetry, which is appropriate for the construction of quantum networks. They also found that the system showed a large “ground state splitting,” a property that contributes to the coherence of the system. Finally, they saw that the high-pressure, high-temperature treatment applied to the crystals suppressed the inhomogeneous distribution of ZPLs by repairing the damage done to the crystal lattice during the implantation process. A simple calculation showed that the lead-vacancies had a long spin coherence time at a higher temperature (9 K) than previous systems with silicon and tin vacancies.
    “The simulation we presented in our study seems to suggest that the lead-vacancy center will likely be an essential system for creating a quantum light-matter interface — one of the key elements in the application of quantum networks,” concludes an optimistic Dr. Iwasaki.
    This study paves the way for the future development of large (defective) diamond wafers and thin (defective) diamond films with reliable properties for quantum network applications.
    Story Source:
    Materials provided by Tokyo Institute of Technology.

  • 2020 babies may suffer up to seven times as many extreme heat waves as 1960s kids

    The kids are not all right. Children born in 2020 could live through seven times as many extreme heat waves as people born in 1960.

    That’s the projected generational disparity if global greenhouse gas emissions are curbed by the amount currently promised by the world’s nations, climate scientist Wim Thiery of Vrije Universiteit Brussel in Belgium and colleagues report September 26 in Science. Under current pledges, Earth’s average temperature is expected to increase by about 2.4 degrees Celsius relative to preindustrial times by 2100. While the older generation will experience an average of about four extreme heat waves during their lifetime, the younger generation could experience an average of about 30 such heat waves, the researchers say.

    More stringent reductions that would limit warming to just 1.5 degrees C would shrink — but not erase — the disparity: Children born in 2020 could still experience four times as many extreme heat waves as people born in 1960.

    Scientists have previously outlined how climate change has already amped up extreme weather events around the globe, and how those climate impacts are projected to increase as the world continues to warm (SN: 8/9/21). The new study is the first to specifically quantify how much more exposed younger generations will be to those events.

    An average child born in 2020 also will experience two times as many wildfires, 2.8 times as many river floods, 2.6 times as many droughts and about three times as many crop failures as a child born 60 years earlier, under climate scenarios based on current pledges. That exposure to extreme events becomes even higher in certain parts of the world: In the Middle East, for example, 2020 children will see up to 10 times as many heat waves as the older cohort, the team found.

    With this possible grim future in mind, student climate activists in the #FridaysforFuture movement have been among the most powerful voices of protest in recent years (SN: 12/16/19). Thiery and colleagues note that these findings come at a crucial time, as world leaders prepare to gather in Glasgow, Scotland, in late October for the 2021 United Nations Climate Change Conference to negotiate new pledges to reduce greenhouse gas emissions.

  • Connecting the dots between material properties and qubit performance

    Engineers and materials scientists studying superconducting quantum information bits (qubits) — a leading quantum computing material platform based on the frictionless flow of paired electrons — have collected clues hinting at the microscopic sources of qubit information loss. This loss is one of the major obstacles in realizing quantum computers capable of stringing together millions of qubits to run demanding computations. Such large-scale, fault-tolerant systems could simulate complicated molecules for drug development, accelerate the discovery of new materials for clean energy, and perform other tasks that would be impossible or take an impractical amount of time (millions of years) for today’s most powerful supercomputers.
    An understanding of the nature of atomic-scale defects that contribute to qubit information loss is still largely lacking. The team helped bridge this gap between material properties and qubit performance by using state-of-the-art characterization capabilities at the Center for Functional Nanomaterials (CFN) and National Synchrotron Light Source II (NSLS-II), both U.S. Department of Energy (DOE) Office of Science User Facilities at Brookhaven National Laboratory. Their results pinpointed structural and surface chemistry defects in superconducting niobium qubits that may be causing loss.
    “Superconducting qubits are a promising quantum computing platform because we can engineer their properties and make them using the same tools used to make regular computers,” said Anjali Premkumar, a fourth-year graduate student in the Houck Lab at Princeton University and first author on the Communications Materials paper describing the research. “However, they have shorter coherence times than other platforms.”
    In other words, they can’t hold onto information very long before they lose it. Though coherence times have recently improved from microseconds to milliseconds for single qubits, these times significantly decrease when multiple qubits are strung together.
    “Qubit coherence is limited by the quality of the superconductors and the oxides that will inevitably grow on them as the metal comes into contact with oxygen in the air,” continued Premkumar. “But, as qubit engineers, we haven’t characterized our materials in great depth. Here, for the first time, we collaborated with materials experts who can carefully look at the structure and chemistry of our materials with sophisticated tools.”
    This collaboration was a “prequel” to the Co-design Center for Quantum Advantage (C2QA), one of five National Quantum Information Science Centers established in 2020 in support of the National Quantum Initiative. Led by Brookhaven Lab, C2QA brings together hardware and software engineers, physicists, materials scientists, theorists, and other experts across national labs, universities, and industry to resolve performance issues with quantum hardware and software. Through materials, devices, and software co-design efforts, the C2QA team seeks to understand and ultimately control material properties to extend coherence times, design devices to generate more robust qubits, optimize algorithms to target specific scientific applications, and develop error-correction solutions.

  • Scientists create material that can both move and block heat

    Moving heat around where you want it to go — adding it to houses and hairdryers, removing it from car engines and refrigerators — is one of the great challenges of engineering.
    All activity generates heat, because energy escapes from everything we do. But too much heat can wear out batteries and electronic components — like the parts in an aging laptop that runs too hot to actually sit on your lap. If you can’t get rid of heat, you’ve got a problem.
    Scientists at the University of Chicago have invented a new way to funnel heat around at the microscopic level: a thermal insulator made using an innovative technique. They stack ultra-thin layers of crystalline sheets on top of each other, but rotate each layer slightly, creating a material with atoms that are aligned in one direction but not in the other.
    “Think of a partly-finished Rubik’s cube, with layers all rotated in random directions,” said Shi En Kim, a graduate student with the Pritzker School of Molecular Engineering who is the first author of the study. “What that means is that within each layer of the crystal, we still have an ordered lattice of atoms, but if you move to the neighboring layer, you have no idea where the next atoms will be relative to the previous layer — the atoms are completely messy along this direction.”
    The result is a material that is extremely good at both containing heat and moving it, albeit in different directions — an unusual ability at the microscale, and one that could have very useful applications in electronics and other technology.
    “The combination of excellent heat conductivity in one direction and excellent insulation in the other direction does not exist at all in nature,” said study lead author Jiwoong Park, professor of chemistry and molecular engineering at the University of Chicago. “We hope this could open up an entirely new direction for making novel materials.”