More stories

  • Virtual learning may help NICU nurses recognize baby pain

    Babies younger than four weeks old, called neonates, were once thought not to perceive pain due to not-yet-fully-developed sensory systems, but modern research says otherwise, according to researchers from Hiroshima University in Japan.
    Not only do babies experience pain, but the various levels can be standardized to help nurses recognize and respond to the babies’ cues — if the nurses have the opportunity to learn the scoring tools and skills needed to react appropriately. With tight schedules and limited in-person courses available, the researchers theorized, virtual e-learning may be able to provide a path forward for nurses to independently pursue training in this area.
    To test this hypothesis, researchers conducted a pilot study of 115 nurses with varying levels of formal training and years of experience in seven hospitals across Japan. They published their results on May 27 in Advances in Neonatal Care.
    “Despite a growing body of knowledge and guidelines being published in many countries about the prevention and management of pain in neonates hospitalized in the NICU, neonatal pain remains unrecognized, undertreated, and generally challenging,” said paper author Mio Ozawa, associate professor in the Graduate School of Biomedical and Health Sciences at Hiroshima University.
    The researchers developed a comprehensive multimedia virtual program on neonatal pain management, based on selected standardized pain scales, for nursing staff to independently learn how to employ measurement tools. The program, called e-Pain Management of Neonates, is the first of its kind in Japan.
    “The aim of the study was to verify the feasibility of the program and whether e-learning actually improves nurses’ knowledge and scoring skills,” Ozawa said. “The results of this study suggest that nurses could obtain knowledge and skills about the measurement of neonatal pain through e-learning.”
    The full cohort took a pre-test at the start of the study, before embarking on a self-paced, four-week e-learning program dedicated to learning standardized pain scales to measure discomfort in babies. However, only 52 nurses completed the post-test after four weeks. For those 52, scores increased across a range of years of experience and formal education.
    Ozawa noted that the sample size is small but also said that the improved test scores indicated the potential for e-learning.
    “Future research will need to go beyond the individual level to determine which benefits are produced in the management of neonatal pain in hospitals where nurses learn neonatal pain management through e-learning,” Ozawa said. “This study demonstrates that a virtually delivered neonatal pain management program can be useful for nurses’ attainment of knowledge and skills for managing neonatal pain, including the appropriate use of selected scoring tools.”
    Story Source:
    Materials provided by Hiroshima University. Note: Content may be edited for style and length.

  • Seeing with radio waves

    Scientists from the Division of Physics at the University of Tsukuba used the quantum effect called “spin-locking” to significantly enhance the resolution when performing radio-frequency imaging of nitrogen-vacancy defects in diamond. This work may lead to faster and more accurate material analysis, as well as a path towards practical quantum computers.
    Nitrogen-vacancy (NV) centers have long been studied for their potential use in quantum computers. An NV center is a type of defect in the diamond lattice in which two adjacent carbon atoms have been replaced with a nitrogen atom and a vacancy. This leaves an unpaired electron, which can be detected using radio-frequency waves, because its probability of emitting a photon depends on its spin state. However, the spatial resolution of conventional radio-frequency detection techniques has remained less than optimal.
    Now, researchers at the University of Tsukuba have pushed the resolution to its limit by employing a technique called “spin-locking.” Microwave pulses are used to put the electron’s spin in a quantum superposition of up and down simultaneously. Then, a driving electromagnetic field causes the direction of the spin to precess around, like a wobbling top. The end result is an electron spin that is shielded from random noise but strongly coupled to the detection equipment. “Spin-locking ensures high accuracy and sensitivity of the electromagnetic field imaging,” first author Professor Shintaro Nomura explains. Due to the high density of NV centers in the diamond samples used, the collective signal they produced could be easily picked up with this method. This permitted the sensing of collections of NV centers at the micrometer scale. “The spatial resolution we obtained with RF imaging was much better than with similar existing methods,” Professor Nomura continues, “and it was limited only by the resolution of the optical microscope we used.”
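    For intuition, here is a minimal rotating-frame sketch of spin-locking; the notation and the simple noise model are our illustrative assumptions, not taken from the paper:
    ```latex
    % Spin-locking, schematically (rotating frame; notation is ours, not the paper's).
    % A resonant microwave pulse first prepares the superposition state
    \[
      |{+}\rangle = \tfrac{1}{\sqrt{2}}\left(|0\rangle + |1\rangle\right),
    \]
    % and a continuous drive of Rabi frequency $\Omega$ then locks the spin
    % along the drive axis, while magnetic noise $\delta(t)$ acts along $z$:
    \[
      H = \frac{\hbar\Omega}{2}\,\sigma_x + \frac{\hbar\,\delta(t)}{2}\,\sigma_z .
    \]
    % Because the spin precesses about the $x$ axis at rate $\Omega$, slow noise
    % with $|\delta| \ll \Omega$ averages out, while a signal oscillating near
    % $\Omega$ stays strongly coupled: shielded from noise, sensitive to signal.
    ```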
    The approach demonstrated in this project may be applied in a broad variety of application areas — for example, the characterizations of polar molecules, polymers, and proteins, as well as the characterization of materials. It might also be used in medical applications — for example, as a new way to perform magnetocardiography.
    This work was partly supported by a Grant-in-Aid for Scientific Research (Nos. JP18H04283, JP18H01243, JP18K18726, and JP21H01009) from the Japan Society for the Promotion of Science.
    Story Source:
    Materials provided by University of Tsukuba. Note: Content may be edited for style and length.

  • Computer-assisted biology: Decoding noisy data to predict cell growth

    Scientists from The University of Tokyo Institute of Industrial Science have designed a machine learning algorithm to predict the size of an individual cell as it grows and divides. By using an artificial neural network that does not impose the assumptions commonly employed in biology, the computer was able to make more complex and accurate forecasts than previously possible. This work may help advance the field of quantitative biology as well as improve the industrial production of medications or fermented products.
    As in all of the natural sciences, biology has developed mathematical models to help fit data and make predictions about the future. However, because of the inherent complexities of living systems, many of these equations rely on simplifying assumptions that do not always reflect the actual underlying biological processes. Now, researchers at The University of Tokyo Institute of Industrial Science have implemented a machine learning algorithm that can use the measured size of single cells over time to predict their future size. Because the computer automatically recognizes patterns in the data, it is not constrained by the simplifying assumptions of conventional models.
    “In biology, simple models are often used based on their capacity to reproduce the measured data,” first author Atsushi Kamimura says. “However, the models may fail to capture what is really going on because of human preconceptions.”
    The data for this latest study were collected from either an Escherichia coli bacterium or a Schizosaccharomyces pombe yeast cell held in a microfluidic channel at various temperatures. The plot of size over time looked like a “sawtooth” as exponential growth was interrupted by division events. Biologists traditionally use a “sizer” model, based on the absolute size of the cell, or an “adder” model, based on the increase in size since birth, to predict when divisions will occur. The computer algorithm found support for the “adder” principle, but as part of a complex web of biochemical reactions and signaling.
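    As a concrete illustration of the division rules being compared, here is a minimal simulation sketch; the growth rate, added-size threshold, and symmetric division are our illustrative assumptions, not parameters from the study:
    ```python
    # Minimal sketch of the "adder" division rule: a cell grows exponentially
    # and divides once it has *added* a fixed volume since birth. The numbers
    # below are illustrative assumptions, not fitted values from the study.
    GROWTH_RATE = 1.0   # exponential growth rate (per unit time)
    ADDED_SIZE = 1.0    # volume a cell must add before it divides
    DT = 0.01           # time step

    def simulate_adder(birth_size=1.0, steps=2000):
        """Trace (time, size) points of the sawtooth growth curve."""
        size, birth, t, trace = birth_size, birth_size, 0.0, []
        for _ in range(steps):
            size *= 1.0 + GROWTH_RATE * DT      # exponential growth
            if size - birth >= ADDED_SIZE:      # "adder" criterion met
                size /= 2.0                     # symmetric division
                birth = size                    # reset birth size
            t += DT
            trace.append((t, size))
        return trace

    # A "sizer" rule would instead divide whenever size >= some fixed
    # threshold, independent of the size at birth.
    ```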
    “Our deep-learning neural network can effectively separate the history-dependent deterministic factors from the noise in given data,” senior author Tetsuya Kobayashi says.
    This method can be extended to many other aspects of biology besides predicting cell size. In the future, life science may be driven more by objective artificial intelligence than human models. This may lead to more efficient control of microorganisms we use to ferment products and produce drugs.
    Story Source:
    Materials provided by Institute of Industrial Science, The University of Tokyo. Note: Content may be edited for style and length.

  • Physicists take big step in race to quantum computing

    A team of physicists from the Harvard-MIT Center for Ultracold Atoms and other universities has developed a special type of quantum computer known as a programmable quantum simulator capable of operating with 256 quantum bits, or “qubits.”
    The system marks a major step toward building large-scale quantum machines that could be used to shed light on a host of complex quantum processes and eventually help bring about real-world breakthroughs in material science, communication technologies, finance, and many other fields, overcoming research hurdles that are beyond the capabilities of even the fastest supercomputers today. Qubits are the fundamental building blocks on which quantum computers run and the source of their massive processing power.
    “This moves the field into a new domain where no one has ever been to thus far,” said Mikhail Lukin, the George Vasmer Leverett Professor of Physics, co-director of the Harvard Quantum Initiative, and one of the senior authors of the study published today in the journal Nature. “We are entering a completely new part of the quantum world.”
    According to Sepehr Ebadi, a physics student in the Graduate School of Arts and Sciences and the study’s lead author, it is the combination of the system’s unprecedented size and programmability that puts it at the cutting edge of the race for a quantum computer, which harnesses the mysterious properties of matter at extremely small scales to greatly advance processing power. Under the right circumstances, the increase in qubits means the system can store and process exponentially more information than the classical bits on which standard computers run.
    “The number of quantum states that are possible with only 256 qubits exceeds the number of atoms in the solar system,” Ebadi said, explaining the system’s vast size.
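    The claim is easy to check arithmetically; the ~10^57 figure for atoms in the solar system is a common order-of-magnitude estimate we assume here, not a number from the article:
    ```python
    # A register of n qubits spans a Hilbert space with 2**n basis states.
    n_qubits = 256
    n_states = 2 ** n_qubits                 # exact integer in Python
    atoms_in_solar_system = 10 ** 57         # rough order-of-magnitude estimate

    print(f"2^256 ~= {n_states:.3e}")        # ~= 1.158e+77
    print(n_states > atoms_in_solar_system)  # True, by ~20 orders of magnitude
    ```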
    Already, the simulator has allowed researchers to observe several exotic quantum states of matter that had never before been realized experimentally, and to perform a quantum phase transition study so precise that it serves as the textbook example of how magnetism works at the quantum level.

  • First study of nickelate's magnetism finds a strong kinship with cuprate superconductors

    Ever since the 1986 discovery that copper oxide materials, or cuprates, could carry electrical current with no loss at unexpectedly high temperatures, scientists have been looking for other unconventional superconductors that could operate even closer to room temperature. This would allow for a host of everyday applications that could transform society by making energy transmission more efficient, for instance.
    Nickel oxides, or nickelates, seemed like a promising candidate. They’re based on nickel, which sits next to copper on the periodic table, and the two elements have some common characteristics. It was not unreasonable to think that superconductivity would be one of them.
    But it took years of trying before scientists at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University finally created the first nickelate that showed clear signs of superconductivity.
    Now SLAC, Stanford and Diamond Light Source researchers have made the first measurements of magnetic excitations that spread through the new material like ripples in a pond. The results reveal both important similarities and subtle differences between nickelates and cuprates. The scientists published their results in Science today.
    “This is exciting, because it gives us a new angle for exploring how unconventional superconductors work, which is still an open question after 30-plus years of research,” said Haiyu Lu, a Stanford graduate student who did the bulk of the research with Stanford postdoctoral researcher Matteo Rossi and SLAC staff scientist Wei-Sheng Lee.

  • The pressure is off and high temperature superconductivity remains

    In a critical next step toward room-temperature superconductivity at ambient pressure, Paul Chu, Founding Director and Chief Scientist at the Texas Center for Superconductivity at the University of Houston (TcSUH), Liangzi Deng, research assistant professor of physics at TcSUH, and their colleagues at TcSUH conceived and developed a pressure-quench (PQ) technique that retains the pressure-enhanced and/or pressure-induced high-transition-temperature (Tc) phase even after the applied pressure that generates the phase is removed.
    Pengcheng Dai, professor of physics and astronomy at Rice University and his group, and Yanming Ma, Dean of the College of Physics at Jilin University, and his group contributed toward successfully demonstrating the possibility of the pressure-quench technique in a model high temperature superconductor, iron selenide (FeSe). The results were published in the journal Proceedings of the National Academy of Sciences USA.
    “We derived the pressure-quench method from the formation of human-made diamond from graphite by Francis Bundy in 1955, and from other metastable compounds,” said Chu. “Graphite turns into a diamond when subjected to high pressure at high temperatures. Subsequent rapid pressure quench, or removal of pressure, leaves the diamond phase intact without pressure.”
    Chu and his team applied this same concept to a superconducting material with promising results.
    “Iron selenide is considered a simple high-temperature superconductor, with a transition temperature (Tc) to the superconducting state of 9 Kelvin (K) at ambient pressure,” said Chu.
    “When we applied pressure, the Tc increased to ~40 K, more than quadrupling its value at ambient pressure, enabling us to unambiguously distinguish the superconducting PQ phase from the original un-PQ phase. We then tried to retain the pressure-enhanced superconducting phase after removing the pressure using the PQ method, and it turns out we can.”
    Dr. Chu and colleagues’ achievement brings scientists a step closer to realizing the dream of room-temperature superconductivity at ambient pressure, recently reported in hydrides only under extremely high pressure.

  • Handwriting beats typing and watching videos for learning to read

    Though writing by hand is increasingly being eclipsed by the ease of computers, a new study finds we shouldn’t be so quick to throw away the pencils and paper: handwriting helps people learn certain skills surprisingly faster and significantly better than learning the same material through typing or watching videos.
    “The question out there for parents and educators is why should our kids spend any time doing handwriting,” says senior author Brenda Rapp, a Johns Hopkins University professor of cognitive science. “Obviously, you’re going to be a better hand-writer if you practice it. But since people are handwriting less, maybe who cares? The real question is: Are there other benefits to handwriting that have to do with reading and spelling and understanding? We find there most definitely are.”
    The work appears in the journal Psychological Science.
    Rapp and lead author Robert Wiley, a former Johns Hopkins University Ph.D. student who is now a professor at the University of North Carolina, Greensboro, conducted an experiment in which 42 people were taught the Arabic alphabet, split into three groups of learners: writers, typers and video watchers.
    Everyone learned the letters one at a time by watching videos of them being written along with hearing names and sounds. After being introduced to each letter, the three groups would attempt to learn what they just saw and heard in different ways. The video group got an on-screen flash of a letter and had to say if it was the same letter they’d just seen. The typers would have to find the letter on the keyboard. The writers had to copy the letter with pen and paper.
    At the end, after as many as six sessions, everyone could recognize the letters and made few mistakes when tested. But the writing group reached this level of proficiency faster than the other groups — a few of them in just two sessions.

  • Simulations of turbulence's smallest structures

    When you pour cream into a cup of coffee, the viscous liquid seems to lazily disperse throughout the cup. Take a mixing spoon or straw to the cup, though, and the cream and coffee seem to quickly and seamlessly combine into a lighter color and, at least for some, a more enjoyable beverage.
    The science behind this relatively simple anecdote actually speaks to a larger truth about complex fluid dynamics and underpins many of the advancements made in transportation, power generation, and other technologies since the industrial era — the seemingly random chaotic motions known as turbulence play a vital role in chemical and industrial processes that rely on effective mixing of different fluids.
    While scientists have long studied turbulent fluid flows, their inherently chaotic nature has prevented researchers from developing an exhaustive list of reliable “rules,” or universal models, for accurately describing and predicting turbulence. This tall order has left turbulence as one of the last major unsolved “grand challenges” in physics.
    In recent years, high-performance computing (HPC) resources have played an increasingly important role in gaining insight into how turbulence influences fluids under a variety of circumstances. Recently, researchers from RWTH Aachen University and the CORIA (CNRS UMR 6614) research facility in France have been using HPC resources at the Jülich Supercomputing Centre (JSC), one of the three HPC centres comprising the Gauss Centre for Supercomputing (GCS), to run high-resolution direct numerical simulations (DNS) of turbulent setups, including jet flames. While extremely computationally expensive, DNS of turbulence allows researchers to develop better models that run on more modest computing resources and help academic or industrial researchers assess turbulence’s effects on a given fluid flow.
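    To give a flavor of what “direct numerical simulation” means in miniature, here is a toy one-dimensional analogue (the viscous Burgers equation) with illustrative parameters of our choosing; the team’s actual DNS solves full three-dimensional reacting flows and is vastly more expensive:
    ```python
    import numpy as np

    # Toy 1D "DNS": explicit finite differences for the viscous Burgers equation
    #   u_t + u*u_x = nu*u_xx
    # Every scale is resolved directly on the grid (no turbulence model), which
    # is what makes real 3D DNS so computationally expensive.
    N, NU = 512, 0.05                       # grid points, viscosity (illustrative)
    x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
    dx = x[1] - x[0]
    u = np.sin(x)                           # smooth initial velocity field
    dt = 0.2 * min(dx, dx * dx / NU)        # conservative explicit stability limit

    for _ in range(2000):                   # integrates to t ~ 1.2
        ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)        # du/dx
        uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2  # d2u/dx2
        u = u + dt * (-u * ux + NU * uxx)

    # The steepest gradient marks the shock-like structure that forms,
    # the 1D stand-in for the small scales a 3D DNS must resolve.
    print("steepest gradient:", np.abs((np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)).max())
    ```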
    “The goal of our research is to ultimately improve these models, specifically in the context of combustion and mixing applications,” said Dr. Michael Gauding, CORIA scientist and researcher on the project. The team’s recent work was just named the distinguished paper from the “Turbulent Flames” colloquium, which happened as part of the 38th International Symposium on Combustion.
    Starts and stops
    Despite its seemingly random, chaotic characteristics, researchers have identified some important properties that are universal, or at least very common, for turbulence under specific conditions. Researchers studying how fuel and air mix in a combustion reaction, for instance, rely on turbulence to ensure a high mixing efficiency. Much of that important turbulent motion may stem from what happens in a thin area near the edge of the flame, where its chaotic motions collide with the smoother-flowing fluids around it. This area, the turbulent-non-turbulent interface (TNTI), has big implications for understanding turbulent mixing.