More stories

  • Southern Ocean storms cause outgassing of carbon dioxide

    Storms over the waters around Antarctica drive an outgassing of carbon dioxide into the atmosphere, according to a new international study with researchers from the University of Gothenburg. The research group used advanced ocean robots for the study, which provides a better understanding of climate change and can lead to better global climate models.
    The world’s southernmost ocean, the Southern Ocean that surrounds Antarctica, plays an important role in the global climate because its waters contain large amounts of carbon dioxide. A new international study, in which researchers from the University of Gothenburg participated, has examined the complex processes driving air-sea fluxes of gases such as carbon dioxide.
    Storms bring carbon dioxide-rich waters to the surface
    The research group is now delivering new findings that shed light on the area’s important role in climate change.
    “We show how the intense storms that often occur in the region increase ocean mixing and bring carbon dioxide-rich waters from the deep to the surface. This drives an outgassing of carbon dioxide from the ocean to the atmosphere. There has been a lack of knowledge about these complex processes, so the study is an important key to understanding the Southern Ocean’s significance for the climate and the global carbon budget,” says Sebastiaan Swart, professor of oceanography at the University of Gothenburg and co-author of the study.
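    To see roughly why wind matters so much here, note that in the standard bulk formula for air-sea CO2 flux, F = k * K0 * (pCO2_ocean - pCO2_air), the gas transfer velocity k scales with the square of the wind speed, so a storm that doubles the wind roughly quadruples the potential outgassing. The sketch below uses the common Wanninkhof (2014) wind-speed parameterisation with illustrative values; it is textbook air-sea gas exchange, not necessarily the exact formulation used in this study.

        def co2_flux(u10_ms, dpco2_uatm, K0=0.06, Sc=660.0):
            """Air-sea CO2 flux in mol m^-2 yr^-1 (positive = outgassing).

            u10_ms:     10-metre wind speed (m/s)
            dpco2_uatm: ocean-minus-air pCO2 difference (micro-atmospheres)
            K0:         CO2 solubility in mol L^-1 atm^-1 (~0.06 near 0 deg C,
                        an illustrative value for cold Southern Ocean water)
            Sc:         Schmidt number of CO2 (660 is the reference value)
            """
            # Wanninkhof (2014): k = 0.251 * U^2 * (Sc/660)^-0.5, in cm/hr
            k_cm_per_hr = 0.251 * u10_ms ** 2 * (Sc / 660.0) ** -0.5
            k_m_per_yr = k_cm_per_hr * 0.01 * 24 * 365
            return k_m_per_yr * (K0 * 1000.0) * (dpco2_uatm * 1e-6)

        # Storm-driven mixing raises surface pCO2; on top of that, doubling
        # the wind speed quadruples the transfer velocity k.
        print(co2_flux(8.0, 10.0))   # moderate winds: ~0.8 mol m^-2 yr^-1
        print(co2_flux(16.0, 10.0))  # storm winds:    ~3.4 mol m^-2 yr^-1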
    Facilitates better climate models
    Half of all carbon dioxide bound in the world’s oceans is found in the Southern Ocean. At the same time, climate change is expected to result in more intense storms in the future. Therefore, it is vital to understand the storms’ impact on the outgassing of carbon dioxide into the atmosphere, the researchers point out.
    “This knowledge is necessary to be able to make more accurate predictions about future climate change. Currently, these environmental processes are not captured by global climate models,” says Marcel du Plessis at the University of Gothenburg, who also participated in the study.
    Pioneering ocean robotics
    Measuring the inaccessible and stormy waters around Antarctica over a long period is a real challenge, which the researchers tackled with the help of unique robot technology. For several months, autonomous ocean robots (drones and ocean gliders) collected data from the surface down to depths of one kilometer.
    “This pioneering technology gave us the opportunity to collect data with long endurance, which would not have been possible from a research vessel. Thanks to these ocean robots we can now fill important knowledge gaps and gain a better understanding of the importance of the ocean for the climate,” says Sebastiaan Swart.
    The contributions to the study from the University of Gothenburg were supported by the Knut and Alice Wallenberg Foundation through the Wallenberg Academy Fellows Program and by the Swedish Research Council.
    Story Source:
    Materials provided by University of Gothenburg. Original written by Ulrika Ernström. Note: Content may be edited for style and length.

  • Studying the Big Bang with artificial intelligence

    It could hardly be more complicated: tiny particles whir around wildly with extremely high energy, countless interactions occur in the tangled mess of quantum particles, and this results in a state of matter known as “quark-gluon plasma.” Immediately after the Big Bang, the entire universe was in this state; today it is produced by high-energy atomic nucleus collisions, for example at CERN.
    Such processes can only be studied using high-performance computers and highly complex computer simulations whose results are difficult to evaluate. Therefore, using artificial intelligence or machine learning for this purpose seems like an obvious idea. Ordinary machine-learning algorithms, however, are not suitable for this task. The mathematical properties of particle physics require a very special structure of neural networks. At TU Wien (Vienna), it has now been shown how neural networks can be successfully used for these challenging tasks in particle physics.
    Neural networks
    “Simulating a quark-gluon plasma as realistically as possible requires an extremely large amount of computing time,” says Dr. Andreas Ipp from the Institute for Theoretical Physics at TU Wien. “Even the largest supercomputers in the world are overwhelmed by this.” It would therefore be desirable not to calculate every detail precisely, but to recognise and predict certain properties of the plasma with the help of artificial intelligence.
    Neural networks similar to those used for image recognition are therefore employed: artificial “neurons” are linked together on the computer much as neurons are in the brain, creating a network that can recognise, for example, whether or not a cat is visible in a certain picture.
    When applying this technique to the quark-gluon plasma, however, there is a serious problem: the quantum fields used to mathematically describe the particles and the forces between them can be represented in various different ways. “This is referred to as gauge symmetries,” says Ipp. “The basic principle behind this is something we are familiar with: if I calibrate a measuring device differently, for example if I use the Kelvin scale instead of the Celsius scale for my thermometer, I get completely different numbers, even though I am describing the same physical state. It’s similar with quantum theories — except that there the permitted changes are mathematically much more complicated.” Mathematical objects that look completely different at first glance may in fact describe the same physical state.
    Gauge symmetries built into the structure of the network
    “If you don’t take these gauge symmetries into account, you can’t meaningfully interpret the results of the computer simulations,” says Dr. David I. Müller. “Teaching a neural network to figure out these gauge symmetries on its own would be extremely difficult. It is much better to start out by designing the structure of the neural network in such a way that the gauge symmetry is automatically taken into account — so that different representations of the same physical state also produce the same signals in the neural network,” says Müller. “That is exactly what we have now succeeded in doing: We have developed completely new network layers that automatically take gauge invariance into account.” In some test applications, it was shown that these networks can actually learn much better how to deal with the simulation data of the quark-gluon plasma.
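    A toy numerical example helps make gauge invariance concrete. In lattice gauge theory, traces of closed loops of link variables (“plaquettes”) are unchanged when an arbitrary gauge transformation is applied at every lattice site, so a network layer built only from such quantities is automatically gauge invariant. The Python sketch below, which uses SU(2) matrices and is purely illustrative rather than the authors’ actual network architecture, verifies this numerically.

        import numpy as np

        def random_su2():
            """Random SU(2) matrix built from a unit quaternion (toy gauge group)."""
            q = np.random.randn(4)
            q /= np.linalg.norm(q)
            return np.array([[q[0] + 1j * q[1],  q[2] + 1j * q[3]],
                             [-q[2] + 1j * q[3], q[0] - 1j * q[1]]])

        def plaquette_trace(Ux_n, Uy_nx, Ux_ny, Uy_n):
            """Re tr of the plaquette U_x(n) U_y(n+x) U_x(n+y)^dag U_y(n)^dag,
            the basic gauge-invariant feature of a lattice gauge field."""
            P = Ux_n @ Uy_nx @ Ux_ny.conj().T @ Uy_n.conj().T
            return np.real(np.trace(P))

        # Four link variables around one plaquette.
        Ux_n, Uy_nx, Ux_ny, Uy_n = (random_su2() for _ in range(4))
        before = plaquette_trace(Ux_n, Uy_nx, Ux_ny, Uy_n)

        # Gauge transformation: pick a random G at every site; links transform
        # as U_mu(n) -> G(n) U_mu(n) G(n+mu)^dag.
        G_n, G_nx, G_ny, G_nxy = (random_su2() for _ in range(4))
        Ux_n  = G_n  @ Ux_n  @ G_nx.conj().T
        Uy_nx = G_nx @ Uy_nx @ G_nxy.conj().T
        Ux_ny = G_ny @ Ux_ny @ G_nxy.conj().T
        Uy_n  = G_n  @ Uy_n  @ G_ny.conj().T

        after = plaquette_trace(Ux_n, Uy_nx, Ux_ny, Uy_n)
        print(np.isclose(before, after))  # True: the feature is unchanged

    A network whose layers consume only such invariant quantities will, by construction, produce the same output for every gauge-equivalent representation of the same physical state.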
    “With such neural networks, it becomes possible to make predictions about the system — for example, to estimate what the quark-gluon plasma will look like at a later point in time without really having to calculate every single intermediate step in time in detail,” says Andreas Ipp. “And at the same time, it is ensured that the system only produces results that do not contradict gauge symmetry — in other words, results which make sense at least in principle.”
    It will be some time before collisions of atomic nuclei at CERN can be fully simulated with such methods, but the new type of neural network provides a completely new and promising tool for describing physical phenomena that other computational methods may never be powerful enough to capture.
    Story Source:
    Materials provided by Vienna University of Technology. Note: Content may be edited for style and length.

  • A soft, stretchable thermometer

    The next generation of soft robotics, smart clothing and biocompatible medical devices will need integrated soft sensors that can stretch and twist with the device or wearer. The challenge: most of the components used in traditional sensing are rigid.
    Now, researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a soft, stretchable, self-powered thermometer that can be integrated into stretchable electronics and soft robots.
    “We have developed soft temperature sensors with high sensitivity and quick response time, opening new possibilities to create new human-machine interfaces and soft robots in healthcare, engineering and entertainment,” said Zhigang Suo, the Allen E. and Marilyn M. Puckett Professor of Mechanics and Materials at SEAS and senior author of the paper.
    The research is published in the Proceedings of the National Academy of Sciences.
    The thermometer consists of three simple parts: an electrolyte, an electrode, and a dielectric material to separate the two. The electrolyte/dielectric interface accumulates ions while the dielectric/electrode interface accumulates electrons. The charge imbalance between the two sets up an ionic cloud in the electrolyte. When the temperature changes, the ionic cloud changes thickness and a voltage is generated. The voltage is sensitive to temperature, but insensitive to stretch.
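    In practice, turning the temperature-sensitive voltage into a temperature reading requires a calibration curve. A minimal sketch of that step, with made-up numbers (the actual sensitivity depends on the materials chosen):

        import numpy as np

        # Hypothetical calibration data: open-circuit voltage (mV) of the
        # electrolyte/dielectric/electrode stack at known temperatures (deg C).
        # These values are invented purely for illustration.
        temps_c = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
        volts_mv = np.array([1.02, 1.55, 2.08, 2.61, 3.14])

        # Fit a linear voltage-temperature calibration, V = a*T + b.
        a, b = np.polyfit(temps_c, volts_mv, 1)

        def voltage_to_temperature(v_mv):
            """Invert the calibration to recover temperature from a reading."""
            return (v_mv - b) / a

        print(voltage_to_temperature(2.3))  # ~48 deg C for this toy calibration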
    “Because the design is so simple, there are so many different ways to customize the sensor, depending on the application,” said Yecheng Wang, a postdoctoral fellow at SEAS and first author of the paper. “You can choose different materials, arranged in different ways and optimized for different tasks.”
    By arranging the electrolyte, dielectric, and electrode in different configurations, the researchers developed four designs for the temperature sensor. In one test, they integrated the sensor into a soft gripper and measured the temperature of a hot hard-boiled egg. The sensors are more sensitive than traditional thermoelectric thermometers and can respond to changes in temperature within about 10 milliseconds.
    “We demonstrated that these sensors can be made small, stable, and even transparent,” said Wang.
    Depending on the materials used, the thermometer can measure temperatures upwards of 200 degrees Celsius or as cold as -100 degrees Celsius.
    “This highly customizable platform could usher in new developments to enable and improve the internet of everything and everyone,” said Suo.
    The research was co-authored by Kun Jia, Shuwen Zhang, Hyeong Jun Kim, Yang Bai and Ryan C. Hayward. The research was supported in part by the National Science Foundation through the Harvard University Materials Research Science and Engineering Center under grant DMR2011754.
    Video of stretchable thermometer: https://youtu.be/AJN6OTZAe14

  • Artificial intelligence identifies individuals at risk for heart disease complications

    For the first time, University of Utah Health scientists have shown that artificial intelligence could lead to better ways to predict the onset and course of cardiovascular disease. The researchers, working in conjunction with physicians from Intermountain Primary Children’s Hospital, developed unique computational tools to precisely measure the synergistic effects of existing medical conditions on the heart and blood vessels.
    The researchers say this comprehensive approach could help physicians foresee, prevent, or treat serious heart problems, perhaps even before a patient is aware of the underlying condition.
    Although the study only focused on cardiovascular disease, the researchers believe it could have far broader implications. In fact, they suggest that these findings could eventually lead to a new era of personalized, preventive medicine. Doctors would proactively contact patients to alert them to potential ailments and what can be done to alleviate the problem.
    “We can turn to AI to help refine the risk for virtually every medical diagnosis,” says Martin Tristani-Firouzi, M.D., the study’s corresponding author, a pediatric cardiologist at U of U Health and Intermountain Primary Children’s Hospital, and a scientist at the Nora Eccles Harrison Cardiovascular Research and Training Institute. “The risk of cancer, the risk of thyroid surgery, the risk of diabetes — any medical term you can imagine.”
    The study appears in the online journal PLOS Digital Health.
    Current methods for calculating the combined effects of various risk factors — such as demographics and medical history — on cardiovascular disease are often imprecise and subjective, according to Mark Yandell, Ph.D., senior author of the study, a professor of human genetics, H.A. and Edna Benning Presidential Endowed Chair at U of U Health, and co-founder of Backdrop Health. As a result, these methods fail to identify certain interactions that could have profound effects on the health of the heart and blood vessels.
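    The paper itself describes the team’s specific computational tools; as generic background, one standard way to quantify a synergy between two conditions is to add an interaction term to a risk model and inspect its weight. The sketch below does this on synthetic data and is purely illustrative, not the authors’ method.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 5000
        diabetes = rng.integers(0, 2, n)
        hypertension = rng.integers(0, 2, n)

        # Synthetic outcome in which the two conditions act synergistically:
        # the interaction term contributes risk beyond the two main effects.
        logit = -3 + 0.8 * diabetes + 0.6 * hypertension \
                + 1.2 * diabetes * hypertension
        disease = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([diabetes, hypertension, diabetes * hypertension])
        model = LogisticRegression().fit(X, disease)
        print(model.coef_)  # the third weight captures the synergy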

  • New way of gaining quantum control from loss

    Researchers at the Hong Kong University of Science and Technology (HKUST) have demonstrated a new way to control a quantum state through the loss of particles, a process that is usually avoided in quantum devices, offering a new route towards the realization of unprecedented quantum states.
    Manipulating a quantum system requires precise control of its quantum state; any imperfect operation scrambles the useful information encoded in the quantum states. One of the most common detrimental processes is the loss of the particles that constitute the system. This issue has long been seen as an enemy of quantum control and has traditionally been avoided by isolating the system. Now, however, researchers at HKUST have discovered a way to gain quantum control from loss in an atomic quantum system.
    The finding was published recently in Nature Physics.
    Prof. Gyu-Boong JO, lead researcher of the study and Hari Harilela Associate Professor of Physics at HKUST, said the result demonstrates loss as a potential knob for quantum control.
    “The textbook taught us that in quantum mechanics, the system of interest will not suffer from a loss of particles as it is well isolated from the environment,” said Prof. Jo. “However, open systems, ranging from classical to quantum ones, are ubiquitous. Such open systems, effectively described by non-Hermitian physics, exhibit various counter-intuitive phenomena that cannot be observed in Hermitian systems.”
    The idea of non-Hermitian physics with loss has been actively examined in classical systems, but such counter-intuitive phenomena have only recently been realised and observed in genuine quantum systems. In the study, HKUST researchers adjusted the system’s parameters so that they swept out a closed loop around a special point of the non-Hermitian system known as an exceptional point. They discovered that the direction of the loop (i.e. whether it runs clockwise or anti-clockwise) determines the final quantum state.
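    A minimal numerical sketch of such an encircling experiment, using a generic two-level non-Hermitian Hamiltonian rather than the actual spin-orbit-coupled atomic system: the toy model below has an exceptional point at (delta, gamma) = (0, 2g), and integrating the Schrödinger equation while the parameters loop once around that point in opposite directions ends in direction-dependent final states.

        import numpy as np

        g = 1.0

        def H(delta, gamma):
            # Two-level Hamiltonian with loss rate gamma on the second mode;
            # its exceptional point sits at (delta, gamma) = (0, 2g).
            return np.array([[delta, g], [g, -delta - 1j * gamma]])

        def encircle(direction, T=50.0, n=50_000, r=0.5):
            """Integrate i dpsi/dt = H(t) psi with fixed-step RK4 while
            (delta, gamma) loop once around the exceptional point."""
            psi = np.array([1.0 + 0j, 0.0 + 0j])   # start in the lossless mode
            dt = T / n

            def rhs(t, psi):
                theta = direction * 2 * np.pi * t / T  # +1 or -1: loop direction
                return -1j * (H(r * np.cos(theta),
                                2 * g + r * np.sin(theta)) @ psi)

            for k in range(n):
                t = k * dt
                k1 = rhs(t, psi)
                k2 = rhs(t + dt / 2, psi + dt / 2 * k1)
                k3 = rhs(t + dt / 2, psi + dt / 2 * k2)
                k4 = rhs(t + dt, psi + dt * k3)
                psi = psi + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            return psi / np.linalg.norm(psi)       # renormalise the lossy state

        # Chiral state transfer: the two loop directions end in different states.
        print(np.abs(encircle(+1)) ** 2)
        print(np.abs(encircle(-1)) ** 2)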
    Jensen LI, Professor of Physics at HKUST and the other leader of the team, said, “This chiral behavior of a directional quantum state transferring around an exceptional point can be an important ingredient in quantum control. We are at the starting point in controlling non-Hermitian quantum systems.”
    Another implication of the findings concerns the interplay of two seemingly unrelated mechanisms: non-Hermitian physics (induced by loss) and spin-orbit coupling. Spin-orbit coupling (SOC) is an essential mechanism behind intriguing quantum phenomena such as topological insulators, which behave as insulators in their interior while electrons flow along their surface as in a conductor.
    Despite major advances in non-Hermitian physics, the SOC mechanism has been widely studied only in Hermitian systems; much less is known experimentally about the role played by loss in spin-orbit-coupled quantum systems. A better understanding of such non-Hermitian SOC is of paramount importance for the development of novel materials, but it remains elusive in condensed matter physics.
    In this work, however, the researchers realized for the first time a dissipative spin-orbit-coupled system of ultracold atoms, fully characterizing its quantum state and demonstrating chiral quantum control in the context of non-Hermitian physics. The finding sets the stage for future exploration of spin-orbit-coupling physics in the non-Hermitian regime, and it highlights the capability of non-Hermitian quantum systems to realize, characterize, and harness two fundamental mechanisms, loss and SOC, providing a new approach for precisely simulating such competing mechanisms in a highly controllable quantum simulator built from ultracold atoms.
    The research was funded by the Research Grants Council of Hong Kong, the Croucher Foundation, and the Harilela Foundation.

  • New software may help neurology patients capture clinical data with their own smartphones

    New pose estimation software has the potential to help neurologists and their patients capture important clinical data using simple tools such as smartphones and tablets, according to a study by Johns Hopkins Medicine, the Kennedy Krieger Institute and the University of Maryland. Human pose estimation is a form of artificial intelligence that automatically detects and labels specific landmarks on the human body, such as elbows and fingers, from simple images or videos.
    To measure the speed, rhythm and range of a patient’s motor function, neurologists will often have the patient perform certain repetitive movements, such as tapping fingers or opening and closing hands. An objective assessment of these tests provides the most accurate insight into the severity of a patient’s condition, thus better informing treatment decisions. However, objective motion capture devices are often expensive or only have the ability to measure one type of movement. Therefore, most neurologists must make subjective assessments of their patients’ motor function, usually by simply watching patients as they carry out different tasks.
    The new Hopkins-led study sought to determine whether pose estimation software developed by the research team could track human motion as accurately as manual, frame-by-frame visual inspection of video recordings of patients performing movements.
    “Our goal was to develop a fast, inexpensive and easily accessible method to objectively measure a patient’s movements across multiple extremities,” says study lead author Ryan Roemmich, Ph.D., an assistant professor in the Department of Physical Medicine and Rehabilitation at the Johns Hopkins University School of Medicine and a human movement scientist at the Kennedy Krieger Institute.
    The research team had 10 healthy subjects between the ages of 24 and 33 record smartphone video of themselves performing five tasks often assigned to neurology patients during motor function assessments: finger taps, hand closures, toe taps, heel taps and hand rotations. The subjects performed each task at four different speeds. Their movements were tracked using a freely available human pose estimation algorithm, then fed into the team’s software for evaluation.
    The results showed that across all five tasks, the software accurately detected more than 96% of the movements detected by the manual inspection method. These results held up across several variables, including location, type of smartphone used and method of recording: Some subjects placed their smartphone on a stable surface and hit “record,” while others had a family member or friend hold the device.
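    To give a flavor of how such an analysis can work (a generic sketch, not the team’s actual software): once a pose estimator has produced a per-frame fingertip position, counting taps and measuring their speed and rhythm reduces to peak detection on a one-dimensional signal.

        import numpy as np
        from scipy.signal import find_peaks

        # Hypothetical input: per-frame vertical position of the index
        # fingertip, as produced by any off-the-shelf pose estimator run on a
        # smartphone video (30 frames per second assumed here).
        fps = 30
        t = np.arange(0, 10, 1 / fps)                        # ten seconds
        fingertip_y = 0.5 + 0.2 * np.sin(2 * np.pi * 2 * t)  # fake 2 Hz taps
        fingertip_y += 0.01 * np.random.randn(t.size)        # camera jitter

        # Each local maximum of the trajectory is one tap; `distance`
        # suppresses double counts within a single tap.
        peaks, _ = find_peaks(fingertip_y, prominence=0.1, distance=fps // 8)

        n_taps = len(peaks)
        rate_hz = n_taps / t[-1]                 # tapping speed
        rhythm = np.std(np.diff(peaks) / fps)    # inter-tap variability (s)
        amplitude = fingertip_y[peaks].mean() - fingertip_y.min()  # rough range
        print(f"{n_taps} taps, {rate_hz:.2f} Hz, rhythm sd {rhythm*1000:.0f} ms")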
    With encouraging results from their sample of young, healthy people, the research team’s next step is to test the software on people who require neurological care. Currently, the team is collecting a large sample of videos of people with Parkinson’s disease doing the same five motor function tasks that the healthy subjects performed.
    “We want anyone with a smartphone or tablet to be able to record video that can be successfully analyzed by their physician,” says Roemmich. “With further development of this pose estimation software, motor assessments could eventually be performed and analyzed without the patient having to leave their home.”
    Story Source:
    Materials provided by Johns Hopkins Medicine. Note: Content may be edited for style and length.

  • Social media use tied to poor physical health

    Social media use has been linked to biological and psychological indicators associated with poor physical health among college students, according to the results of a new study by a University at Buffalo researcher.
    Research participants who used social media excessively were found to have higher levels of C-reactive protein (CRP), a biological marker of chronic inflammation that predicts serious illnesses, such as diabetes, certain cancers and cardiovascular disease. In addition to elevated CRP levels, results suggest higher social media use was also related to somatic symptoms, like headaches, chest and back pains, and more frequent visits to doctors and health centers for the treatment of illness.
    “Social media use has become an integral part of many young adults’ daily lives,” said David Lee, PhD, the paper’s first author and assistant professor of communication in the UB College of Arts and Sciences. “It’s critical that we understand how engagement across these platforms contributes to physical health.”
    The findings appear in the journal Cyberpsychology, Behavior, and Social Networking.
    For decades, researchers have devoted attention to how social media engagement relates to users’ mental health, but its effects on physical health have not been thoroughly investigated. Recent surveys indicate social media usage is particularly high among people in their late teens and early 20s, a population that spends about six hours a day texting, online or using social media. And though a few studies have found links between social media usage and physical health, that research relied largely on self-reports or examined the effects of usage on only a single platform.
    “Our goal was to extend prior work by examining how social media use across several platforms is associated with physical health outcomes measured with biological, behavioral and self-report measures,” said Lee, an expert on health outcomes related to social interactions.

  • Harnessing noise in optical computing for AI

    Artificial intelligence and machine learning are currently affecting our lives in many small but impactful ways. For example, AI and machine learning applications recommend entertainment we might enjoy through streaming services such as Netflix and Spotify.
    In the near future, it’s predicted that these technologies will have an even larger impact on society through activities such as driving fully autonomous vehicles, enabling complex scientific research and facilitating medical discoveries.
    But the computers used for AI and machine learning demand a lot of energy. Currently, the need for computing power related to these technologies is doubling roughly every three to four months. And cloud computing data centers used by AI and machine learning applications worldwide are already devouring more electrical power per year than some small countries. It’s easy to see that this level of energy consumption is unsustainable.
    A research team led by the University of Washington has developed new optical computing hardware for AI and machine learning that is faster and much more energy efficient than conventional electronics. The research also addresses another challenge — the ‘noise’ inherent to optical computing that can interfere with computing precision.
    In a new paper, published Jan. 21 in Science Advances, the team demonstrates an optical computing system for AI and machine learning that not only mitigates this noise but actually uses some of it as input to help enhance the creative output of the artificial neural network within the system.
    “We’ve built an optical computer that is faster than a conventional digital computer,” said lead author Changming Wu, a UW doctoral student in electrical and computer engineering. “And also, this optical computer can create new things based on random inputs generated from the optical noise that most researchers tried to evade.”
    Optical computing noise essentially comes from stray light particles, or photons, that originate from the operation of lasers within the device and from background thermal radiation. To target this noise, the researchers connected their optical computing core to a special type of machine learning network called a Generative Adversarial Network.
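    For context, the generator half of a GAN already takes a vector of random numbers as its only input, which is what makes measured optical noise a natural drop-in source of randomness. A minimal PyTorch sketch of that input path (generic GAN boilerplate, not the authors’ optical implementation):

        import torch
        import torch.nn as nn

        # The generator maps a random latent vector to a synthetic sample.
        # In the paper's setting the randomness would come from measured
        # optical noise; here torch.randn stands in for it.
        latent_dim = 64

        generator = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 28 * 28), nn.Tanh(),  # e.g. a 28x28 grayscale image
        )

        z = torch.randn(16, latent_dim)   # stand-in for optical-noise samples
        fake_images = generator(z).view(16, 1, 28, 28)
        print(fake_images.shape)          # torch.Size([16, 1, 28, 28])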