More stories

  • Placing cosmological constraints on quantum gravity phenomenology

    A description of gravity compatible with the principles of quantum mechanics has long been a widely pursued goal in physics. Existing theories of this ‘quantum gravity’ often involve mathematical corrections to Heisenberg’s Uncertainty Principle (HUP), which quantifies the inherent limit on how precisely pairs of quantities, such as position and momentum, can be known simultaneously. These corrections arise when gravitational interactions are considered, leading to a ‘Generalized Uncertainty Principle’ (GUP). Two specific GUP models are often used: the first modifies the HUP with a correction linear in the momentum uncertainty, while the second introduces a quadratic one. Through new research published in EPJ C, Serena Giardino and Vincenzo Salzano at the University of Szczecin in Poland have used well-established cosmological observations to place tighter constraints on the quadratic model, while discrediting the linear model.
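    In a common parametrization from the GUP literature (shown here for orientation; the paper's exact conventions may differ), the two corrections modify the uncertainty relation as

        \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}\left(1 - 2\alpha \frac{\ell_P}{\hbar}\,\Delta p\right) \quad \text{(linear)}, \qquad
        \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}\left(1 + \beta \frac{\ell_P^2}{\hbar^2}\,\Delta p^2\right) \quad \text{(quadratic)},

    where \ell_P = \sqrt{\hbar G / c^3} \approx 1.6 \times 10^{-35} \,\mathrm{m} is the Planck length and \alpha, \beta are the dimensionless parameters that observations constrain; setting \alpha = \beta = 0 recovers the ordinary HUP.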
    The GUP can influence the black hole evaporation process first described by Stephen Hawking, and may also lead to a better understanding of the relationship between thermodynamics and gravity. Intriguingly, the GUP also places a lower limit on the length scales that can be probed: below the so-called ‘Planck length,’ any concentration of energy would collapse under gravity to form a black hole. Previously, both the linear and quadratic GUP models were rigorously tested by comparing their predictions with data gathered in quantum experiments, placing stringent limits on their parameters.
    In their study, Giardino and Salzano instead compared the predictions of GUP-influenced models of the universe with observations of cosmological phenomena, including supernovae and cosmic microwave background radiation. These comparisons were not widely made in the past, since the constraints they imposed on the GUP parameters were believed to be far weaker than those possible in quantum experiments. However, the researchers’ analysis revealed that stricter bounds could be imposed on the quadratic model, comparable to those placed by some quantum experiments. In addition, they showed that the linear correction to the HUP generally could not account for the observed data. Ultimately, these results highlight the promising role of cosmological observations in constraining the phenomenology of quantum gravity.

    Story Source:
    Materials provided by Springer. Note: Content may be edited for style and length.

  • Quantum effects help minimize communication flaws

    Among the most active fields of research in modern physics, both at an academic level and beyond, are quantum computation and communication, which apply quantum phenomena such as superposition and entanglement to perform calculations or to exchange information. A number of research groups around the world have built quantum devices that can perform calculations faster than any classical computer. Yet there is still a long way to go before these devices can be turned into marketable quantum computers. One reason for this is that both quantum computation and quantum communication are severely hampered by the ease with which a quantum superposition state can be destroyed, or entanglement between two or more quantum particles lost.
    The primary approach to overcoming these limitations is the application of so-called quantum error-correcting codes. These codes, however, require more physical resources than can currently be controlled reliably. While error correction is likely to become an integral part of future quantum devices in the long run, a complementary approach is to mitigate the noise (that is, the cumulative effect of uncorrected errors) without relying on so many additional resources. Such approaches are referred to as noise reduction schemes.
    Noise mitigation without additional resources through simple quantum schemes
    A new approach along this research line was recently proposed to reduce noise in a communication scheme between two parties. Imagine two parties who want to communicate by exchanging a quantum particle, yet the particle has to be sent over some faulty transmission lines.
    Recently, a team of researchers at the University of Hong Kong proposed that an overall reduction in noise could be achieved by directing the particle along a quantum superposition of paths that traverse the noisy regions in opposite orders. While a classical particle can travel along only one path, in quantum mechanics a particle can move along multiple paths at once. Using this property to send the particle along two paths simultaneously allows it, for instance, to cross the noisy regions in both orders at the same time. This effect has since been demonstrated in two independent experiments.
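    The idea can be illustrated with a short simulation. The sketch below (plain NumPy; it implements the textbook ‘quantum switch’ construction, not the exact configuration of any one experiment mentioned here) sends a qubit through two copies of a completely depolarizing channel whose order is controlled by a second qubit prepared in superposition. Either channel alone erases the input entirely, yet the superposed ordering lets some information through:

        import numpy as np

        # Pauli matrices
        I2 = np.eye(2, dtype=complex)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)

        # Completely depolarizing channel: Kraus operators are the Paulis / 2
        kraus = [P / 2 for P in (I2, X, Y, Z)]

        def quantum_switch(rho, kraus):
            """Apply two copies of the channel in an order controlled by a
            qubit prepared in |+> (the 'quantum switch' construction)."""
            plus = np.array([[1], [1]], dtype=complex) / np.sqrt(2)
            ctrl = plus @ plus.conj().T            # control state |+><+|
            P0 = np.diag([1, 0]).astype(complex)   # control projector |0><0|
            P1 = np.diag([0, 1]).astype(complex)   # control projector |1><1|
            rho_in = np.kron(ctrl, rho)            # control (x) target
            out = np.zeros((4, 4), dtype=complex)
            for Ki in kraus:
                for Kj in kraus:
                    # the two channel orders, swapped between control branches
                    W = np.kron(P0, Ki @ Kj) + np.kron(P1, Kj @ Ki)
                    out += W @ rho_in @ W.conj().T
            return out

        rho = np.diag([1, 0]).astype(complex)      # input qubit |0><0|
        out = quantum_switch(rho, kraus)

        # Post-select the control on |+>, then trace it out of the 4x4 state.
        plus = np.array([[1], [1]], dtype=complex) / np.sqrt(2)
        proj = np.kron(plus @ plus.conj().T, np.eye(2))
        branch = proj @ out @ proj.conj().T
        target = branch[:2, :2] + branch[2:, 2:]   # partial trace over control
        print(np.round((target / np.trace(target)).real, 3))

    For the input state |0><0|, this prints a state biased toward the input (diagonal entries 0.6 and 0.4), whereas sending the qubit through the two channels in any fixed order would leave the fully mixed 0.5/0.5 state and no information at all.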
    Those experimental demonstrations suggested that, to achieve this noise reduction, it is necessary to place the noisy transmission lines in a quantum superposition of opposite orders. Shortly afterwards, however, research groups in Vienna and in Grenoble realised that the same effect can also be achieved with simpler configurations, which can even completely eliminate the noise between the two parties.
    All of these schemes have now been implemented experimentally and compared with each other by a research team led by Philip Walther at the University of Vienna. In this work, different ways of passing through two noisy regions in quantum superposition are compared for a variety of noise types. The experimental results are also supported with numerical simulations to extend the study to more generic types of noise. Surprisingly, it is found that the simplest schemes for quantum superposition of noisy channels also offer the best reduction of the noise affecting communication.
    “Error correction in modern quantum technologies is among the most pressing needs of current quantum computation and communication schemes. Our work shows that, at least in the case of quantum communication, already with the technologies currently in use it may be possible to mitigate this issue with no need for additional resources,” says Giulia Rubino, first author of the publication in Physical Review Research. The ease of the demonstrated technique allows immediate use in current long-distance communications, and promises potential further applications in quantum computation and quantum thermodynamics.

    Story Source:
    Materials provided by University of Vienna. Note: Content may be edited for style and length.

  • Virtual reality helping to treat fear of heights

    Researchers from the University of Basel have developed a virtual reality app for smartphones to reduce fear of heights. Now, they have conducted a clinical trial to study its efficacy. Trial participants who spent a total of four hours training with the app at home showed an improvement in their ability to handle real height situations.
    Fear of heights is a widespread phenomenon. Approximately 5% of the general population experiences a debilitating level of discomfort in height situations. However, the people affected rarely take advantage of the available treatment options, such as exposure therapy, which involves putting the person in the anxiety-causing situation under the guidance of a professional. On the one hand, people are reluctant to confront their fear of heights. On the other hand, it can be difficult to reproduce the right kinds of height situations in a therapy setting.
    This motivated the interdisciplinary research team led by Professor Dominique de Quervain to develop a smartphone-based virtual reality exposure therapy app called Easyheights. The app uses 360° images of real locations, which the researchers captured using a drone. People can use the app on their own smartphones together with a special virtual reality headset.
    Gradually increasing the height
    During the virtual experience, the user stands on a platform that is initially one meter above the ground. After allowing acclimatization to the situation for a certain interval, the platform automatically rises. In this way, the perceived distance above the ground increases slowly but steadily without an increase in the person’s level of fear.
    The research team studied the efficacy of this approach in a randomized controlled trial and published the results in the journal npj Digital Medicine. Fifty trial participants with a fear of heights either completed a four-hour height training program (one 60-minute session and six 30-minute sessions over the course of two weeks) using virtual reality, or were assigned to a control group that did not complete the training.
    Before and after the training phase, or the same period of time without training, the trial participants ascended the Uetliberg lookout tower near Zurich as far as their fear of heights allowed. The researchers recorded how high each participant climbed, along with their subjective fear at each stage of the ascent. At the end of the trial, the researchers evaluated the results from 22 participants who completed the Easyheights training and 25 from the control group.
    The group that completed the training with the app exhibited less fear on the tower and climbed higher than before the training, while the control group showed no positive changes. The efficacy of the Easyheights training proved comparable to that of conventional exposure therapy.
    Therapy in your own living room
    Researchers have already been studying the use of virtual reality for treating fear of heights for more than two decades. “What is new, however, is that smartphones can be used to produce the virtual scenarios that previously required a technically complicated type of treatment, and this makes it much more accessible,” explains Dr. Dorothée Bentz, lead author of the study.
    The results of the study suggest that repeated use of smartphone-based virtual reality exposure therapy can greatly improve both behavior and subjective well-being in height situations. People with a mild fear of heights will soon be able to download the free app from major app stores and complete training sessions on their own. However, the researchers recommend that people with a severe fear of heights use the app only under the supervision of a professional.
    The current study is one of several projects in progress at the Transfaculty Research Platform for Molecular and Cognitive Neurosciences, led by Professor Andreas Papassotiropoulos and Professor Dominique de Quervain. Their goal is to improve the treatment of mental disorders through the use of new technologies and to make these treatments widely available.

    Story Source:
    Materials provided by University of Basel. Note: Content may be edited for style and length.

  • A drop in CFC emissions puts the hole in the ozone layer back on track to closing

    Good news for the ozone layer: After a recent spike in CFC-11 pollution, emissions of this ozone-destroying chemical are on the decline.
    Emissions of trichlorofluoromethane, or CFC-11, were supposed to taper off after the Montreal Protocol banned CFC-11 production in 2010 (SN: 7/7/90). But 2014 to 2017 saw an unexpected bump. About half of that illegal pollution was pegged to eastern China (SN: 5/22/19). Now, atmospheric data show that global CFC-11 emissions in 2019 were back down to the average levels seen from 2008 to 2012, and about 60 percent of that decline was due to reduced emissions in eastern China, two teams report online February 10 in Nature. 
    These findings suggest that the hole in Earth’s ozone layer is still on track to close up within the next 50 years — rather than being delayed, as it would have been if CFC-11 emissions had remained at the levels seen from 2014 to 2017 (SN: 12/14/16).
    One group analyzed the concentration of CFC-11, used to make insulating foams for buildings and household appliances, in the air above atmospheric monitoring stations around the globe. The team found that the world emitted about 52,000 metric tons of CFC-11 in 2019 — a major drop from the annual average of 69,000 metric tons from 2014 to 2018. The 2019 emissions were comparable to the average annual emissions from 2008 to 2012, Stephen Montzka, an atmospheric chemist at the U.S. National Oceanic and Atmospheric Administration in Boulder, Colo., and colleagues report.

    The new measurements imply that there has been a significant decrease in illicit CFC-11 production within the last couple of years, the researchers say, probably thanks to more rigorous regulation enforcement in China and elsewhere.
    Another group confirmed that emissions from eastern China have diminished since 2018 by analyzing air samples from Hateruma, Japan, and Gosan, South Korea. The region emitted about 5,000 metric tons of CFC-11 in 2019, roughly 10,000 metric tons less than its average annual emissions from 2014 to 2017 and similar to its 2008 to 2012 average. That analysis was led by Sunyoung Park, a geochemist at Kyungpook National University in Daegu, South Korea. The two studies’ figures are consistent: eastern China’s roughly 10,000-metric-ton drop accounts for about 60 percent of the roughly 17,000-metric-ton global decline implied by the fall from 69,000 to 52,000 metric tons.

    The recent downturn in CFC-11 pollution shows that “the Montreal Protocol is working,” says A.R. “Ravi” Ravishankara, an atmospheric scientist at Colorado State University in Fort Collins not involved in either study. When someone violates the treaty, “atmospheric sleuthing” can uncover the culprits and spur countries to take action, he says. “China clearly took action, because you can see the result of that action in the atmosphere.” 
    Montzka cautions that it might not always be so easy to point the finger at rogue emitters. “I think we got lucky this time,” he says, because atmospheric monitoring sites in Asia were able to trace the bulk of illegal emissions to eastern China and monitor the situation over several years. Many places around the world, such as in Africa and South America, lack atmospheric monitoring stations, so it’s still a mystery which countries besides China were responsible for the recent rise and fall of CFC-11 emissions.

  • Emerging robotics technology may lead to better buildings in less time

    Emerging robotics technology may soon help construction companies and contractors create buildings in less time, at higher quality, and at lower cost.
    Purdue University innovators developed and are testing a novel construction robotic system that uses an innovative mechanical design with advances in computer vision sensing technology to work in a construction setting.
    The technology was developed with support from the National Science Foundation.
    “Our work helps to address workforce shortages in the construction industry by automating key construction operations,” said Jiansong Zhang, an assistant professor of construction management technology in the Purdue Polytechnic Institute. “On a construction site, there are many unknown factors that a construction robot must be able to account for effectively. This requires much more advanced sensing and reasoning technologies than those commonly used in a manufacturing environment.”
    The Purdue team’s custom end effector design allows for material to be both placed and fastened in the same operation using the same arm, limiting the amount of equipment that is required to complete a given task.
    Computer vision algorithms developed for the project allow the robotic system to sense building elements and match them to building information modeling (BIM) data in a variety of environments, and keep track of obstacles or safety hazards in the system’s operational context.
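    As a rough illustration of the matching step (a hypothetical sketch, not Purdue's actual pipeline; the element IDs, coordinates, and tolerance below are invented), a vision-detected element's estimated position can be associated with the nearest catalogued BIM element:

        import numpy as np

        # Hypothetical BIM catalog: element ID -> expected position (meters,
        # in site coordinates). IDs and coordinates are illustrative only.
        bim_elements = {
            "stud_A12": np.array([1.20, 0.00, 1.50]),
            "stud_A13": np.array([1.60, 0.00, 1.50]),
            "panel_B07": np.array([3.05, 0.40, 1.20]),
        }

        def match_detection_to_bim(detected_pos, bim, tol=0.15):
            """Match a detected element position to the nearest BIM element
            within `tol` meters; return None if nothing is close enough."""
            best_id, best_dist = None, float("inf")
            for elem_id, expected in bim.items():
                dist = np.linalg.norm(detected_pos - expected)
                if dist < best_dist:
                    best_id, best_dist = elem_id, dist
            return best_id if best_dist <= tol else None

        # A position estimate from the (hypothetical) vision pipeline
        detection = np.array([1.22, 0.03, 1.48])
        print(match_detection_to_bim(detection, bim_elements))  # stud_A12

    A real system would also reconcile element type, orientation, and detection confidence, but nearest-neighbor association against the model is the heart of the match.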
    “By basing the sensing for our robotic arm around computer vision technology, rather than more limited-scope and expensive sensing systems, we have the capability to complete many sensing tasks with a single affordable sensor,” Zhang said. “This allows us to implement a more robust and versatile system at a lower cost.”
    Undergraduate researchers in Zhang’s Automation and Intelligent Construction (AutoIC) Lab helped create this robotic technology.
    The innovators worked with the Purdue Research Foundation Office of Technology Commercialization to patent the technology.
    This work will be featured at OTC’s 2021 Technology Showcase: The State of Innovation. The annual showcase, being held virtually this year Feb. 10-11, will feature novel innovations from inventors at Purdue and across the state of Indiana.

    Story Source:
    Materials provided by Purdue University. Original written by Chris Adam. Note: Content may be edited for style and length.

  • ‘Designer molecules’ could create tailor-made quantum devices

    Quantum bits made from “designer molecules” are coming into fashion. By carefully tailoring the composition of molecules, researchers are creating chemical systems suited to a variety of quantum tasks.
    “The ability to control molecules … makes them just a beautiful and wonderful system to work with,” said Danna Freedman, a chemist at Northwestern University in Evanston, Ill. “Molecules are the best.” Freedman described her research February 8 at the annual meeting of the American Association for the Advancement of Science, held online.
    Quantum bits, or qubits, are analogous to the bits found in conventional computers. But rather than existing in a state of either 0 or 1, as standard bits do, qubits can possess both values simultaneously, enabling new types of calculations impossible for conventional computers.
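    In standard notation (generic to any qubit, not specific to molecular ones), such a state is written

        |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,

    where |\alpha|^2 and |\beta|^2 are the probabilities of obtaining 0 or 1 when the qubit is read out.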
    Besides their potential use in quantum computers, molecules can also serve as quantum sensors, devices that can make extremely sensitive measurements, such as sussing out minuscule electromagnetic forces (SN: 3/23/18).

    In Freedman and colleagues’ qubits, a single chromium ion, an electrically charged atom, sits at the center of the molecule. The qubit’s value is represented by that chromium ion’s electronic spin, a measure of the angular momentum of its electrons. Additional groups of atoms are attached to the chromium; by swapping out some of the atoms in those groups, the researchers can change the qubit’s properties to alter how it functions.
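    For orientation, the behavior of such a spin can be captured by a textbook effective Hamiltonian (written here as an illustration and assuming a spin-1 ground state, which is common for chromium(IV) complexes but not a detail given in the story):

        H = D S_z^2 + E\,(S_x^2 - S_y^2) + g \mu_B \mathbf{B} \cdot \mathbf{S},

    where the zero-field-splitting parameters D and E are set by the ion’s ligand environment, which is exactly what swapping the attached atom groups retunes.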
    Recently, Freedman and colleagues crafted molecules to fit one particular need: molecular qubits that respond to light. Lasers can set the values of the qubits and help read out the results of calculations, the researchers reported in the Dec. 11 Science. Another possibility might be to create molecules that are biocompatible, Freedman says, so they can be used for sensing conditions inside living tissue.
    Molecules have another special appeal: All of a given type are exactly the same. Many types of qubits are made from bits of metal or other material deposited on a surface, resulting in slight differences between qubits on an atomic level. But using chemical techniques to build up molecules atom by atom means the qubits are identical, making for better-performing devices. “That’s something really powerful about the bottom-up approach that chemistry affords,” said Freedman.
    Scientists are already using individual atoms and ions in quantum devices (SN: 6/29/17), but molecules are more complicated to work with, thanks to their multiple constituents. As a result, molecules are a relatively new quantum resource, Caltech physicist Nick Hutzler said at the meeting. “People don’t even really know what you can do with [molecules] yet.… But people are discovering new things every day.”

  • Three things to know about the disastrous flood in India

    A flash flood surged down a river in India’s Himalayan Uttarakhand state on February 7, killing at least 30 people and washing away two hydroelectric power stations.
    As rescue workers search for more than 100 people who are still missing, officials and scientists are trying to unravel the causes of the sudden flood. Did a glacier high up in the mountains collapse, releasing a huge plug of frigid meltwater that spilled into the river? Or was the culprit a landslide that then triggered an avalanche? And what, if any, link might these events have to a changing climate?
    Here are three things to know about what might have caused the disaster in Uttarakhand.
    1. One possible culprit was the sudden break of a glacier high in the mountains.
    News reports in the immediate wake of the disaster suggested that the floodwaters came from the sudden overflow of a glacial lake high in the mountains, an event called a glacial lake outburst flood.
    “It’s likely too early to know what exactly happened,” says Anjal Prakash, the research director of the Bharti Institute of Public Policy at the Indian School of Business in Hyderabad. Satellite images show that a section of a glacier broke off, but how that break relates to the subsequent floods is still unknown. One possibility is that the glacier was holding back a lake of meltwater, and that heavy snowfall in the region two days earlier added enough volume to the lake that the water forced its way out, breaking the glacier and surging into nearby rivers.
    This scenario is certainly in line with known hazards for the region. “These mountains are very fragile,” says Prakash, who was also a lead author on the Intergovernmental Panel on Climate Change’s 2019 special report on oceans and the cryosphere, Earth’s icy places. But, he notes, there isn’t yet much on-the-ground data to help clarify events. “The efforts are still focused on relief at the moment.”
    2. A landslide may be to blame instead.
    Other researchers contend that the disaster wasn’t caused by a glacial lake outburst flood at all. Instead, says Daniel Shugar, a geomorphologist at the University of Calgary in Canada, satellite images snapped during the disaster show the telltale marks of a landslide: a dark scar snaking through the white snow and clouds of dust clogging the air above. “You could see this train of dust in the valley, and that’s common for a very large landslide,” Shugar says.
    “WOW,” he wrote on Twitter the morning of February 7, posting side-by-side satellite shots of a dark area of possible “massive dust deposition,” contrasted against the same snowy, pristine region just the day before.

    Landslides — the sudden failure of a slope, sending a rush of rocks and sediment downhill — can be triggered by anything from an earthquake to an intense deluge of rain. In high, snowy mountains, cycles of freezing and thawing and refreezing again can also begin to break the ground apart; the ice-filled cracks can slowly widen over time, setting the stage for sudden failure, and then, disaster.
    The satellite images seem to point clearly to such a landslide, rather than a typical glacial lake overflow, Shugar says. The force of the landslide may have actually broken off that piece of hanging glacier, he says. Another line of evidence against a sudden lake burst is that “there were no lakes of any size visible” in the satellite images taken over the region.
    However, an open question for this hypothesis is where the floodwaters came from. It might be that one of the rivers draining the mountain was briefly dammed by the rockfall; a sudden release of that dam could send a large plug of water swiftly and disastrously downhill. “But that’s a pure guess at the moment,” Shugar says.
    3. It’s not yet clear whether climate change played a role in the disaster.
    The risk of both glacial lake outburst floods and freeze-thaw-related landslides in Asia’s high mountains has increased due to climate change. At first glance, “it was a climate event,” Prakash says. “But the data are still coming.”
    The region, which includes the Hindu Kush Himalayan mountains and the Tibetan Plateau, “has been a climate change hot spot for a pretty long time,” Prakash says. The region is often called Earth’s third pole, because the stores of ice and snow in the Himalayan watershed amount to the largest reserves of freshwater outside of the polar regions. The region is the source of 10 major river systems that provide water to almost 2 billion people.
    Climate change reports have warned that warming is not only threatening this water supply, but also increasing the likelihood of natural hazards (SN: 5/29/19). In the Intergovernmental Panel on Climate Change’s 2019 special report on oceans and the cryosphere, scientists noted that glacier retreat, melting snow and thawing permafrost are making mountain slopes more unstable and also increasing the number of glacial lakes, upping the likelihood of a sudden, catastrophic failure (SN: 9/25/19).
    A 2019 comprehensive assessment focusing on climate change’s impacts in Asia’s high mountains found that the glaciers in the region have retreated much more quickly in the last decade than was anticipated, Prakash says, “and that is alarming for us.” Here’s another way to look at it: Glaciers are retreating twice as fast as they were at the end of the 20th century (SN: 6/19/19).
    Glacier-related landslides in the region have also become increasingly common in the last decade, as the region warms and destabilizing freeze-thaw cycles within the ground occur higher and higher up on the slopes.
    But in the case of this particular disaster, Shugar says, it’s just hard to say conclusively at this point what role climate change might have played, or even what specific event might have triggered a landslide. “Sometimes there is no trigger; sometimes it’s just time,” he says. “Or it’s that we just don’t understand the trigger.”

  • AI can predict early death risk

    Researchers at Geisinger have found that a computer algorithm developed using echocardiogram videos of the heart can predict patients’ risk of dying within a year.
    The algorithm, an example of machine learning (a form of artificial intelligence, or AI), outperformed other clinically used predictors, including the pooled cohort equations and the Seattle Heart Failure score. The results of the study were published in Nature Biomedical Engineering.
    “We were excited to find that machine learning can leverage unstructured datasets such as medical images and videos to improve on a wide range of clinical prediction models,” said Chris Haggerty, Ph.D., co-senior author and assistant professor in the Department of Translational Data Science and Informatics at Geisinger.
    Imaging is critical to treatment decisions in most medical specialties and has become one of the most data-rich components of the electronic health record (EHR). For example, a single ultrasound of the heart yields approximately 3,000 images, and cardiologists have limited time to interpret these images within the context of numerous other diagnostic data. This creates a substantial opportunity to leverage technology, such as machine learning, to manage and analyze this data and ultimately provide intelligent computer assistance to physicians.
    For their study, the research team used specialized computational hardware to train the machine learning model on 812,278 echocardiogram videos collected from 34,362 Geisinger patients over the last ten years. The team then compared the model’s predictions with cardiologists’ assessments gathered through multiple surveys; a follow-up survey showed that, when assisted by the model, cardiologists’ prediction accuracy improved by 13 percent. Comprising nearly 50 million images, the underlying dataset is one of the largest medical image collections ever used in a published study.
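    For a sense of what such a model can look like, here is a minimal sketch (the architecture, clip size, and layer choices are illustrative assumptions, not Geisinger's published network) of a 3D convolutional network in PyTorch that maps an echocardiogram clip to a one-year mortality probability:

        import torch
        from torch import nn

        class EchoMortalityNet(nn.Module):
            """Minimal 3D CNN mapping an echo clip of shape
            (batch, 1 channel, frames, height, width) to a one-year
            mortality probability. Illustrative only."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv3d(1, 16, kernel_size=3, padding=1),
                    nn.ReLU(),
                    nn.MaxPool3d(2),
                    nn.Conv3d(16, 32, kernel_size=3, padding=1),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool3d(1),  # global average pooling
                )
                self.classifier = nn.Linear(32, 1)

            def forward(self, x):
                h = self.features(x).flatten(1)
                return torch.sigmoid(self.classifier(h))

        # One fake 32-frame, 112x112-pixel grayscale clip
        clip = torch.randn(1, 1, 32, 112, 112)
        model = EchoMortalityNet()
        print(model(clip))  # a probability between 0 and 1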
    “Our goal is to develop computer algorithms to improve patient care,” said Alvaro Ulloa Cerna, Ph.D., author and senior data scientist in the Department of Translational Data Science and Informatics at Geisinger. “In this case, we’re excited that our algorithm was able to help cardiologists improve their predictions about patients, since decisions about treatment and interventions are based on these types of clinical predictions.”

    Story Source:
    Materials provided by Geisinger Health System. Note: Content may be edited for style and length.