More stories

  • Natural radiation can interfere with quantum computers

    A multidisciplinary research team has shown that radiation from natural sources in the environment can limit the performance of superconducting quantum bits, known as qubits. The discovery, reported today in the journal Nature, has implications for the construction and operation of quantum computers, an advanced form of computing that has attracted billions of dollars in public and private investment globally.
    The collaboration between teams at the U.S. Department of Energy’s Pacific Northwest National Laboratory (PNNL) and the Massachusetts Institute of Technology (MIT) helps explain a mysterious source of interference that limits qubit performance.
    “Our study is the first to show clearly that low-level ionizing radiation in the environment degrades the performance of superconducting qubits,” said John Orrell, a PNNL research physicist, senior author of the study and expert in low-level radiation measurement. “These findings suggest that radiation shielding will be necessary to attain long-sought performance in quantum computers of this design.”
    Natural radiation wreaks havoc with computers
    Computer engineers have known for at least a decade that natural radiation emanating from materials like concrete and pulsing through our atmosphere in the form of cosmic rays can cause digital computers to malfunction. But digital computers aren’t nearly as sensitive as a quantum computer.
    “We found that practical quantum computing with these devices will not be possible unless we address the radiation issue,” said PNNL physicist Brent VanDevender, a co-investigator on the study.
    The researchers teamed up to solve a puzzle that has been vexing efforts to keep superconducting quantum computers working for long enough to make them reliable and practical. A working quantum computer would be thousands of times faster than even the fastest supercomputer operating today. And it would be able to tackle computing challenges that today’s digital computers are ill-equipped to take on. But the immediate challenge is to have the qubits maintain their state, a feat called “coherence,” said Orrell. This desirable quantum state is what gives quantum computers their power.
    MIT physicist Will Oliver was working with superconducting qubits and was perplexed by a source of interference that pushed the qubits out of their prepared state, causing “decoherence” and rendering them non-functional. After ruling out a number of possibilities, he considered the idea that natural radiation from sources like metals in the soil and cosmic radiation from space might be pushing the qubits into decoherence.
    A chance conversation between Oliver, VanDevender, and VanDevender’s long-time collaborator, MIT physicist Joe Formaggio, led to the current project.
    It’s only natural
    To test the idea, the research team measured the performance of prototype superconducting qubits in two different experiments:
    They exposed the qubits to elevated radiation from copper metal activated in a reactor.
    They built a shield around the qubits that lowered the amount of natural radiation in their environment.
    The pair of experiments clearly demonstrated the inverse relationship between radiation levels and length of time qubits remain in a coherent state.
    “The radiation breaks apart matched pairs of electrons that typically carry electric current without resistance in a superconductor,” said VanDevender. “The resistance of those unpaired electrons destroys the delicately prepared state of a qubit.”
    The findings have immediate implications for qubit design and construction, the researchers concluded. For example, quantum computers should be built from materials that emit little or no radiation. In addition, it may be necessary to shield experimental quantum computers from radiation in the atmosphere.
    At PNNL, interest has turned to whether the Shallow Underground Laboratory, which reduces surface radiation exposure by 99%, could serve future quantum computer development. Indeed, a recent study by a European research team corroborates the improvement in qubit coherence when experiments are conducted underground.
    “Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing,” said VanDevender.
    The researchers emphasize that, for the moment, factors other than radiation exposure are bigger impediments to qubit stability. Microscopic defects or impurities in the materials used to construct qubits are thought to be primarily responsible for the current performance limit of about one-tenth of a millisecond. But once those limitations are overcome, radiation becomes the next limit and will eventually be a problem without adequate shielding strategies, the researchers said.
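    Because independent decoherence channels add as rates (1/T_total = 1/T_materials + 1/T_radiation), the interplay of the two limits is easy to estimate. A minimal sketch using the rough figures quoted in this article; taking “a few milliseconds” as 4 ms is our assumption, not the study’s exact number:

    ```python
    # Decoherence rates from independent mechanisms add:
    #   1/T_total = 1/T_materials + 1/T_radiation
    # The numbers below are rough figures from the article, not the paper's data.

    def combined_coherence_time(t_materials_s: float, t_radiation_s: float) -> float:
        """Coherence time when two independent loss channels act together."""
        return 1.0 / (1.0 / t_materials_s + 1.0 / t_radiation_s)

    t_materials = 1e-4   # ~0.1 ms: current limit from material defects and impurities
    t_radiation = 4e-3   # ~"a few ms": radiation-imposed ceiling (4 ms is our stand-in)

    print(f"Today: {combined_coherence_time(t_materials, t_radiation) * 1e3:.3f} ms")
    # ~0.098 ms -- radiation barely registers while materials dominate.

    print(f"Better materials: {combined_coherence_time(1e-2, t_radiation) * 1e3:.3f} ms")
    # With 100x better materials (10 ms), the result is capped near 2.9 ms by radiation.
    ```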
    Findings affect global search for dark matter
    In addition to helping explain a source of qubit instability, the research findings may also have implications for the global search for dark matter, which is thought to make up just under 85% of the matter in the universe but has so far escaped detection with existing instruments. One detection approach relies on superconducting sensors of similar design to qubits. Dark matter detectors also need to be shielded from external sources of radiation, because radiation can trigger false signals that obscure the dark matter signals researchers seek.
    “Improving our understanding of this process may lead to improved designs for these superconducting sensors and lead to more sensitive dark matter searches,” said Ben Loer, a PNNL research physicist who is working in both dark matter detection and radiation effects on superconducting qubits. “We may also be able to use our experience with these particle physics sensors to improve future superconducting qubit designs.”

  • New study warns: We have underestimated the pace at which the Arctic is melting

    Temperatures in the Arctic Ocean between Canada, Russia and Europe are rising faster than researchers’ climate models have been able to predict.
    Over the past 40 years, temperatures have risen by one degree every decade, and even more so over the Barents Sea and around Norway’s Svalbard archipelago, where they have increased by 1.5 degrees per decade throughout the period.
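    Decadal rates like these are linear trends fitted to the temperature record. A minimal sketch of that computation, with synthetic anomalies standing in for the real observations:

    ```python
    import numpy as np

    # Synthetic stand-in for 40 years of annual-mean temperature anomalies (deg C);
    # a real analysis would use the observational record, e.g. 1979-2019.
    rng = np.random.default_rng(0)
    years = np.arange(1979, 2019)
    anomalies = 0.10 * (years - 1979) + rng.normal(0.0, 0.3, years.size)

    # Least-squares linear fit; the slope comes out in degrees per year.
    slope, intercept = np.polyfit(years, anomalies, 1)
    print(f"Warming trend: {slope * 10:.2f} degrees per decade")
    ```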
    This is the conclusion of a new study published in Nature Climate Change.
    “Our analyses of Arctic Ocean conditions demonstrate that we have been clearly underestimating the rate of temperature increases in the atmosphere nearest to the sea level, which has ultimately caused sea ice to disappear faster than we had anticipated,” explains Jens Hesselbjerg Christensen, a professor at the University of Copenhagen’s Niels Bohr Institute (NBI) and one of the study’s researchers.
    Together with his NBI colleagues and researchers from the Universities of Bergen and Oslo, the Danish Meteorological Institute and the Australian National University, he compared current temperature changes in the Arctic with climate fluctuations known from, for example, Greenland during the ice age between 120,000 and 11,000 years ago.
    “The abrupt rise in temperature now being experienced in the Arctic has only been observed during the last ice age. During that time, analyses of ice cores revealed that temperatures over the Greenland Ice Sheet increased several times, by 10 to 12 degrees, over 40-to-100-year periods,” explains Jens Hesselbjerg Christensen.
    He emphasizes that the significance of this steep temperature rise is yet to be fully appreciated, and that an increased focus on the Arctic, and on reducing global warming more generally, is a must.
    Climate models ought to take abrupt changes into account
    Until now, climate models have predicted that Arctic temperatures would increase slowly and steadily. However, the researchers’ analysis demonstrates that these changes are occurring at a much faster pace than expected.
    “We have looked at the climate models analysed and assessed by the UN Climate Panel. Only those models based on the worst-case scenario, with the highest carbon dioxide emissions, come close to what our temperature measurements show over the past 40 years, from 1979 to today,” says Jens Hesselbjerg Christensen.
    In the future, research should focus more on simulating the impact of abrupt climate change on the Arctic. Doing so will allow us to create better models that can accurately predict temperature increases:
    “Changes are occurring so rapidly during the summer months that sea ice is likely to disappear faster than most climate models have ever predicted. We must continue to closely monitor temperature changes and incorporate the right climate processes into these models,” says Jens Hesselbjerg Christensen. He concludes:
    “Thus, successfully implementing the necessary reductions in greenhouse gas emissions to meet the Paris Agreement is essential in order to ensure a sea-ice packed Arctic year-round.”

    Story Source:
    Materials provided by University of Copenhagen. Note: Content may be edited for style and length.

  • NBA playoff format is optimizing competitive balance by eliminating travel

    In addition to helping protect players from COVID-19, the NBA “bubble” in Orlando may be a competitive equalizer by eliminating team travel. Researchers analyzing the results of nearly 500 NBA playoff games over six seasons found that a team’s direction of travel and the number of time zones crossed were associated with its predicted win probability and actual game performance.
    Preliminary results of the study suggest that the 2020 NBA playoffs, which begin Aug. 17, will eliminate any advantages or disadvantages related to long-distance travel. In this year’s unique playoff format, implemented due to the COVID-19 pandemic, all 16 teams will stay in Orlando, Florida, and compete at the ESPN Wide World of Sports Complex in Walt Disney World.
    The study found that scoring was significantly higher following eastward travel. Although there were no differences in actual game outcomes based on overall direction of travel, there were differences when considering both the direction and magnitude of travel. Teams that traveled east with three-hour time zone changes had higher predicted probabilities of winning than teams that traveled west or played in the same time zone. In contrast, teams that traveled west across three time zones had lower predicted win probabilities than teams that traveled east or played in the same time zone.
    “During this initial study, it was interesting to find that team scoring improved during general eastward travel compared to westward travel and travel in the same zone, but game outcomes were unaffected by direction of travel during the playoffs,” said lead author Sean Pradhan, assistant professor of sports management and business analytics in the School of Business Administration at Menlo College in Atherton, California. “However, when considering the magnitude of travel across different time zones, we found that teams had predicted probabilities of winning that were lower after traveling three time zones westward, and tended to actually lose more games when traveling two time zones westward compared to most other types of travel.”
    Circadian rhythms are endogenous, near-24-hour biological rhythms that exist in all living organisms, and these daily rhythms have peaks and troughs in both alertness and sleepiness that can impact individuals in high-performance professions. Therefore, an athlete has a greater opportunity for optimal performance when the timing of an activity is synchronized with the body’s circadian clock.
    Researchers from Menlo College and other collaborators reviewed data from 499 NBA playoff games from the 2013-2014 through 2018-2019 seasons. They looked at the impact of direction of travel and time zones traveled on actual game outcomes, team quality, predicted win probability, and team scoring for visiting teams.
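    As a rough illustration of this kind of analysis (not the authors’ actual pipeline; the column names and rows below are invented), visiting-team win rates can be tabulated by travel direction and time zones crossed:

    ```python
    import pandas as pd

    # Hypothetical game log for visiting teams; schema and values are invented.
    games = pd.DataFrame({
        "home_tz":     [-5, -8, -6, -5, -8, -5],  # UTC offset of the game site
        "visitor_tz":  [-8, -5, -6, -6, -5, -8],  # UTC offset of visitor's home city
        "visitor_won": [ 0,  1,  0,  1,  0,  1],
    })

    # Positive delta = the visitor traveled east; negative = west.
    games["tz_delta"] = games["home_tz"] - games["visitor_tz"]
    games["direction"] = games["tz_delta"].apply(
        lambda d: "east" if d > 0 else ("west" if d < 0 else "same"))
    games["zones_crossed"] = games["tz_delta"].abs()

    # Visiting-team win rate by direction and magnitude of travel.
    print(games.groupby(["direction", "zones_crossed"])["visitor_won"].mean())
    ```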
    “A great deal of prior work has examined the effects of travel and circadian advantages on team performance during the regular season of various professional sports leagues,” said Pradhan. “The current study extends such findings of previous research by examining team performance in the NBA playoffs, which is obviously an extremely crucial time for teams competing.”

    Story Source:
    Materials provided by American Academy of Sleep Medicine. Note: Content may be edited for style and length.

  • Revised code could help improve efficiency of fusion experiments

    An international team of researchers led by the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) has upgraded a key computer code for calculating forces acting on magnetically confined plasma in fusion energy experiments. The upgrade will be part of a suite of computational tools that will allow scientists to further improve the design of breakfast-cruller-shaped facilities known as stellarators. Together, the three codes in the suite could help scientists bring efficient fusion reactors closer to reality.
    The revised software lets researchers more easily determine the boundary of the plasma in stellarators. Used in concert with two complementary codes, it could help find stellarator configurations with improved performance: one companion code determines the optimal location of the plasma within the stellarator’s vacuum chamber to maximize the efficiency of the fusion reactions, and the other determines the shape the external electromagnets must have to hold the plasma in the proper position.
    The revised software, called the “free-boundary stepped-pressure equilibrium code (SPEC),” is one of a set of tools scientists can use to tweak the performance of plasma to more easily create fusion energy. “We want to optimize both the plasma position and the magnetic coils to balance the force that makes the plasma expand while holding it in place,” said Stuart Hudson, physicist, deputy head of the Theory Department at PPPL and lead author of the paper reporting the results in Plasma Physics and Controlled Fusion. “That way we can create a stable plasma whose particles are more likely to fuse. The updated SPEC code enables us to know where the plasma will be for a given set of magnetic coils.”
    Fusion combines light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — and in the process generates massive amounts of energy in the sun and stars. Scientists are seeking to replicate fusion in devices on Earth for a virtually inexhaustible supply of safe and clean power to generate electricity.
    Plasma stability is crucial for fusion. If plasma bounces around inside a stellarator, it can escape, cool, and tamp down the fusion reactions, in effect quenching the fusion fire. An earlier version of the code, also developed by Hudson, could only calculate how forces were affecting a plasma if the researchers already knew the plasma’s location. Researchers, however, typically don’t have that information. “That’s one of the problems with plasmas,” Hudson said. “They move all over the place.”
    The new version of the SPEC code helps solve the problem by allowing researchers to calculate the plasma’s boundary without knowing its position beforehand. Used in coordination with a coil-design code called FOCUS and an optimization code called STELLOPT — both of which were also developed at PPPL — SPEC lets physicists simultaneously ensure that the plasma will have the best fusion performance and the magnets will not be too complicated to build. “There’s no point optimizing the shape of the plasma and then later finding out that the magnets would be incredibly difficult to construct,” Hudson said.
    One challenge that Hudson and colleagues faced was verifying that each step of the code upgrade was done correctly. Their slow-and-steady approach was crucial to making sure that the code makes accurate calculations. “Let’s say you are designing a component that will go on a rocket to the moon,” Hudson said. “It’s very important that that part works. So you test and test and test.”
    Updating any computer code calls for a number of interlocking steps:
    First, scientists must translate a set of mathematical equations describing the plasma into a programming language that a computer can understand;
    Next, scientists must determine the mathematical steps needed to solve the equations;
    Finally, the scientists must verify that the code produces correct results, either by comparing the results with those produced by a code that has already been verified or using the code to solve simple equations whose answers are easy to check.
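    The final step is standard verification practice in scientific computing, and it can be illustrated generically (this sketch is not part of SPEC): solve an equation whose analytic answer is known and confirm that the numerical result converges to it.

    ```python
    import numpy as np

    def solve_decay_euler(y0: float, k: float, t_end: float, n_steps: int) -> float:
        """Forward-Euler solution of dy/dt = -k*y; exact answer is y0 * exp(-k*t_end)."""
        dt = t_end / n_steps
        y = y0
        for _ in range(n_steps):
            y += dt * (-k * y)
        return y

    exact = 1.0 * np.exp(-2.0 * 1.0)
    for n in (100, 1000, 10000):
        error = abs(solve_decay_euler(1.0, 2.0, 1.0, n) - exact)
        print(f"n={n:6d}  error={error:.2e}")  # error shrinks as n grows: code verified
    ```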
    Hudson and colleagues performed the calculations with widely different methods. They used pencil and paper to determine the equations and solution steps, and powerful PPPL computers to verify the results. “We demonstrated that the code works,” Hudson said. “Now it can be used to study current experiments and design new ones.”
    Collaborators on the paper include researchers at the Max Planck Institute for Plasma Physics, the Australian National University, and the Swiss École Polytechnique Fédérale de Lausanne. The research was supported by the DOE’s Office of Science (Fusion Energy Sciences), the Euratom research and training program, the Australian Research Council, and the Simons Foundation.

    Story Source:
    Materials provided by DOE/Princeton Plasma Physics Laboratory. Original written by Raphael Rosen. Note: Content may be edited for style and length.

  • Virtual imaging trials optimize CT, radiography for COVID-19

    An open-access article in ARRS’ American Journal of Roentgenology (AJR) established a foundation for the use of virtual imaging trials in effective assessment and optimization of CT and radiography acquisitions and analysis tools to help manage the coronavirus disease (COVID-19) pandemic.
    Virtual imaging trials have two main components: representative models of targeted subjects and realistic models of imaging scanners. The authors of this AJR article developed the first computational models of patients with COVID-19 and showed, as proof of principle, how the models can be combined with imaging simulators for COVID-19 imaging studies.
    “For the body habitus of the models,” lead author Ehsan Abadi explained, “we used the 4D extended cardiac-torso (XCAT) model that was developed at Duke University.”
    Abadi and his Duke colleagues then segmented the morphologic features of COVID-19 abnormalities from 20 CT images of patients with multidiagnostic confirmation of SARS-CoV-2 infection and incorporated them into XCAT models.
    “Within a given disease area, the texture and material of the lung parenchyma in the XCAT were modified to match the properties observed in the clinical images,” Abadi et al. continued.
    Using a specific CT scanner (Definition Flash, Siemens Healthineers) and a validated radiography simulator (DukeSim) to help illustrate utility, the team virtually imaged the three COVID-19 computational phantoms they had developed.
    “Subjectively,” the authors concluded, “the simulated abnormalities were realistic in terms of shape and texture,” adding their preliminary results showed that the contrast-to-noise ratios in the abnormal regions were 1.6, 3.0, and 3.6 for 5-, 25-, and 50-mAs images, respectively.
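    Contrast-to-noise ratio is conventionally computed from an abnormality region of interest (ROI) and a background ROI as CNR = |mean_lesion - mean_background| / sigma_background. A minimal sketch on synthetic pixel values (this is the standard definition; the article does not spell out the authors’ exact variant):

    ```python
    import numpy as np

    def cnr(lesion_roi: np.ndarray, background_roi: np.ndarray) -> float:
        """Contrast-to-noise ratio: mean signal difference over background noise."""
        return abs(lesion_roi.mean() - background_roi.mean()) / background_roi.std()

    rng = np.random.default_rng(1)
    background = rng.normal(100.0, 10.0, size=(64, 64))  # healthy-region pixel values
    lesion = rng.normal(130.0, 10.0, size=(16, 16))      # simulated abnormality

    print(f"CNR = {cnr(lesion, background):.1f}")  # ~3, in the range reported above
    ```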

    Story Source:
    Materials provided by American Roentgen Ray Society. Note: Content may be edited for style and length.

  • Building mechanical memory boards using origami

    The ancient Japanese art of paper folding, known as origami, can be used to create mechanical, binary switches.
    In Applied Physics Letters, by AIP Publishing, researchers report the fabrication of such a paper device using a particular origami pattern known as the Kresling pattern. This device can act as a mechanical switch.
    By putting several of these together on a single platform, the investigators built a functioning mechanical memory board.
    Origami structures can be either rigid or nonrigid. For the first type, only the creases between panels of paper can deform, but the panels stay fixed. In nonrigid origami, however, the panels themselves can deform.
    The Kresling pattern is an example of nonrigid origami. Folding a piece of paper using this pattern generates a bellowslike structure that can flip between one orientation and another. The bellows act as a type of spring and can be controlled by vibrating a platform that holds the bellows. This creates a switch, which the investigators refer to as a Kresling-inspired mechanical switch, or KIMS.
    The researchers found that oscillating a platform holding the KIMS up and down at a certain speed will cause it to flip, or switch, between its two stable states. They used an electrodynamic shaker to provide controlled movements of the base and monitored the upper surface of the KIMS using a laser. In this way, they were able to map out and analyze the basic physics that underlies the switching behavior.
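    The switching physics can be caricatured as a damped particle in a double-well potential under harmonic base excitation: each well plays the role of one stable state, and whether the drive kicks the state over the barrier depends on its amplitude and frequency. The toy model below is our illustration, not the authors’ published equations of motion.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Bistable toy model: damped particle in U(x) = -x^2/2 + x^4/4 (wells at x = -1, +1),
    # driven by a harmonic base excitation. Illustrative only, not the study's model.
    def rhs(t, state, damping, amplitude, omega):
        x, v = state
        restoring = x - x**3                    # -dU/dx for the double well
        drive = amplitude * np.cos(omega * t)   # harmonic shaking of the base
        return [v, restoring - damping * v + drive]

    # Start in the x = -1 well and shake; hopping between wells depends on the
    # drive amplitude and frequency, the behavior mapped out in the study.
    sol = solve_ivp(rhs, (0.0, 200.0), [-1.0, 0.0], args=(0.2, 0.3, 1.0), max_step=0.05)
    print("Ended in well:", "+1" if sol.y[0, -1] > 0 else "-1")
    ```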
    “We used the Kresling origami pattern to also develop a cluster of mechanical binary switches,” author Ravindra Masana said. “These can be forced to transition between two different static states using a single controlled input in the form of a harmonic excitation applied at the base of the switch.”
    The group first considered a 2-bit memory board created by placing two KIMS units on a single platform. Because each KIMS bit has two stable states, four distinct states identified as S00, S01, S10 and S11 can be obtained. Oscillations of the platform will cause switching between these four stable states. This proof of concept with just two bits could be extended to multiple KIMS units, creating a type of mechanical memory.
    “Such switches can be miniaturized,” said Mohammed Daqaq, one of the authors and the director of the Laboratory of Applied Nonlinear Dynamics at NYU Abu Dhabi. “Instead of using a bulky electrodynamic shaker for actuation, the memory board can then be actuated using scalable piezoelectric and graphene actuators.”
    Miniaturized origami memory boards should have wide applicability and hold great promise for future device development.

    Story Source:
    Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

  • Computer modeling used to predict reef health

    A UBC Okanagan researcher has developed a way to predict the future health of the planet’s coral reefs.
    Working with scientists from Australia’s Flinders University and the privately owned research firm Nova Blue Environment, biology doctoral student Bruno Carturan has been studying the ecosystems of the world’s endangered reefs.
    “Coral reefs are among the most diverse ecosystems on Earth and they support the livelihoods of more than 500 million people,” says Carturan. “But coral reefs are also in peril. About 75 per cent of the world’s coral reefs are threatened by habitat loss, climate change and other human-caused disturbances.”
    Carturan, who studies resilience, biodiversity and complex systems under UBCO Professors Lael Parrott and Jason Pither, says nearly all the world’s reefs will be dangerously affected by 2050 if no effective measures are taken.
    There is hope, however, as he has determined a way to examine the reefs and explore why some reef ecosystems appear to be more resilient than others. Uncovering why, he says, could help stem the losses.
    “In other ecosystems, including forests and wetlands, experiments have shown that diversity is key to resilience,” says Carturan. “With more species, comes a greater variety of form and function — what ecologists call traits. And with this, there is a greater likelihood that some particular traits, or combination of traits, help the ecosystem better withstand and bounce back from disturbances.”
    The importance of diversity for the health and stability of ecosystems has been extensively investigated by ecologists, he explains. While the consensus is that ecosystems with more diversity are more resilient and function better, the hypothesis has rarely been tested experimentally with corals.
    Using an experiment to recreate the conditions found in real coral reefs is challenging for several reasons — one being that the required size, timeframe and number of different samples and replicates are just unmanageable.
    That’s where computer simulation modelling comes in.
    “Technically called an ‘agent-based model’, it can be thought of as a virtual experimental arena that enables us to manipulate species and different types of disturbances, and then examine their different influences on resilience in ways that are just not feasible in real reefs,” explains Carturan.
    In his simulation arena, individual coral colonies and algae grow, compete with one another, reproduce and die. And they do all this in realistic ways. By using agent-based models — with data collected by many researchers over decades — scientists can manipulate the initial diversity of corals, including their number and identity, and see how the virtual reef communities respond to threats.
    “This is crucial because these traits are the building blocks that give rise to ecosystem structure and function. For instance, corals come in a variety of forms — from simple spheres to complex branching — and this influences the variety of fish species these reefs host, and their susceptibility to disturbances such as cyclones and coral bleaching.”
    By running simulations over and over again, the model can identify combinations that can provide the greatest resilience. This will help ecologists design reef management and restoration strategies using predictions from the model, says collaborating Flinders researcher Professor Corey Bradshaw.
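    For flavor, here is a drastically simplified grid-based sketch of this kind of agent-based model (a toy, not the published eLife model): coral and algae occupy cells, spread into empty neighbors at type-specific rates, and coral suffers random disturbance mortality.

    ```python
    import numpy as np

    # 0 = empty, 1 = coral, 2 = algae. All rates are invented for illustration.
    rng = np.random.default_rng(42)
    SIZE, STEPS = 50, 100
    grid = rng.choice([0, 1, 2], size=(SIZE, SIZE), p=[0.6, 0.3, 0.1])

    def step(grid, coral_growth=0.30, algae_growth=0.25, disturbance=0.02):
        new = grid.copy()
        for i in range(SIZE):
            for j in range(SIZE):
                if grid[i, j] == 0:
                    continue
                rate = coral_growth if grid[i, j] == 1 else algae_growth
                if rng.random() < rate:  # attempt to spread into a random neighbor
                    ni = (i + rng.integers(-1, 2)) % SIZE
                    nj = (j + rng.integers(-1, 2)) % SIZE
                    if new[ni, nj] == 0:
                        new[ni, nj] = grid[i, j]
        # Disturbance (e.g. bleaching) kills a random fraction of coral cells.
        coral = new == 1
        new[coral & (rng.random(new.shape) < disturbance)] = 0
        return new

    for _ in range(STEPS):
        grid = step(grid)
    print(f"Coral cover: {(grid == 1).mean():.1%}, algae cover: {(grid == 2).mean():.1%}")
    ```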
    “Sophisticated models like ours will be useful for coral-reef management around the world,” Bradshaw adds. “For example, Australia’s iconic Great Barrier Reef is in deep trouble from invasive species, climate change-driven mass bleaching and overfishing.”
    “This high-resolution coral ‘video game’ allows us to peek into the future to make the best possible decisions and avoid catastrophes.”
    The research, supported by grants from the Natural Sciences and Engineering Research Council of Canada and the Canada Foundation for Innovation, was published recently in eLife.

  • Faster, more efficient energy storage could stem from holistic study of layered materials

    A team led by the Department of Energy’s Oak Ridge National Laboratory has developed a novel, integrated approach to track energy-transporting ions within an ultra-thin material, which could unlock the material’s energy storage potential and lead toward faster-charging, longer-lasting devices.
    Scientists have for a decade studied the energy-storing possibilities of an emerging class of two-dimensional materials — those constructed in layers that are only a few atoms thick — called MXenes, pronounced “max-eens.”
    The ORNL-led team integrated theoretical data from computational modeling with experimental data to pinpoint potential locations of a variety of charged ions in titanium carbide, the most studied MXene phase. Through this holistic approach, they could track and analyze the ions’ motion and behavior from the single-atom to the device scale.
    “By comparing all the methods we employed, we were able to form links between theory and different types of materials characterization, ranging from very simple to very complex over a wide range of length and time scales,” said Nina Balke, ORNL co-author of the published study that was conducted within the Fluid Interface Reactions, Structures and Transport, or FIRST, Center. FIRST is a DOE-funded Energy Frontier Research Center located at ORNL.
    “We pulled all those links together to understand how ion storage works in layered MXene electrodes,” she added. The study’s results allowed the team to predict the material’s capacitance, or its ability to store energy. “And, in the end, after much discussion, we were able to unify all these techniques into one cohesive picture, which was really cool.”
    Layered materials can enhance energy stored and power delivered because the gaps between the layers allow charged particles, or ions, to move freely and quickly. However, ions can be difficult to detect and characterize, especially in a confined environment with multiple processes at play. A better understanding of these processes can advance the energy storage potential of lithium-ion batteries and supercapacitors.
    As a FIRST center project, the team focused on the development of supercapacitors — devices that charge quickly for short-term, high-power energy needs. In contrast, lithium-ion batteries have a higher energy capacity and provide electrical power longer, but the rates of discharge, and therefore their power levels, are lower.
    MXenes have the potential to bridge the benefits of these two concepts, Balke said, which is the overarching goal of fast-charging devices with greater, more efficient energy storage capacity. This would benefit a range of applications from electronics to electric vehicle batteries.
    Using computational modeling, the team simulated the conditions of five different charged ions within the layers confined in an aqueous solution, or “water shell.” The theoretical model is simple, but combined with experimental data, it created a baseline that provided evidence of where the ions within the MXene layers went and how they behaved in a complex environment.
    “One surprising outcome was we could see, within the simulation limits, different behavior for the different ions,” said ORNL theorist and co-author Paul Kent.
    The team hopes their integrated approach can guide scientists toward future MXene studies. “What we developed is a joint model. If we have a little bit of data from an experiment using a certain MXene, and if we knew the capacitance for one ion, we can predict it for the other ones, which is something that we weren’t able to do before,” Kent said.
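    Kent’s remark can be read as a simple transfer model: calibrate on one measured ion, then scale by simulation-derived, ion-specific factors. A hypothetical sketch of that idea; the numbers and the linear scaling itself are invented for illustration:

    ```python
    # Invented relative ion-storage factors (vs. Li+) standing in for simulation output.
    simulated_relative_storage = {"Li+": 1.00, "Na+": 0.85, "K+": 0.70, "Mg2+": 1.20}

    # One measured reference point (capacitance in F/g; value invented).
    ref_ion, ref_capacitance = "Li+", 240.0

    scale = ref_capacitance / simulated_relative_storage[ref_ion]
    for ion, factor in simulated_relative_storage.items():
        print(f"{ion:>4}: predicted capacitance ~ {scale * factor:.0f} F/g")
    ```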
    “Eventually, we’ll be able to trace those behaviors to more real-world, observable changes in the material’s properties,” he added.