More stories

  • Transformation in the particle zoo

    An international study led by the University of Bonn has found evidence of a long-sought effect in accelerator data. The so-called “triangle singularity” describes how particles can change their identities by exchanging quarks, thereby mimicking a new particle. The mechanism also provides new insights into a mystery that has long puzzled particle physicists: Protons, neutrons and many other particles are much heavier than one would expect. This is due to peculiarities of the strong interaction that holds the quarks together. The triangle singularity could help to better understand these properties. The publication is now available in Physical Review Letters.
    In their study, the researchers analyzed data from the COMPASS experiment at the European Organization for Nuclear Research CERN in Geneva. There, certain particles called pions are brought to extremely high velocities and shot at hydrogen atoms.
    Pions consist of two building blocks, a quark and an anti-quark. These are held together by the strong interaction, much like two magnets whose poles attract each other. When magnets are moved away from each other, the attraction between them decreases successively. With the strong interaction it is different: It increases in line with the distance, similar to the tensile force of a stretching rubber band.
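    This behaviour is often summarised by phenomenological quark potentials. As a rough illustration (a generic textbook form with approximate values, not something taken from the COMPASS analysis itself), a Cornell-type potential adds a linearly rising confinement term to a Coulomb-like short-range term:
    \[ V(r) \approx -\frac{4}{3}\,\frac{\alpha_s}{r} + \kappa\, r, \qquad \kappa \approx 1\ \mathrm{GeV/fm} \approx 1.6\times 10^{5}\ \mathrm{N}. \]
    The energy stored in the stretched “rubber band” therefore grows roughly linearly with the separation, which is why pulling a quark pair apart eventually supplies enough energy to create new particles rather than isolating a single quark.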
    However, the impact of the pion on the hydrogen nucleus is so strong that this rubber band breaks. The “stretching energy” stored in it is released all at once. “This is converted into matter, which creates new particles,” explains Prof. Dr. Bernhard Ketzer of the Helmholtz Institute for Radiation and Nuclear Physics at the University of Bonn. “Experiments like these therefore provide us with important information about the strong interaction.”
    Unusual signal
    In 2015, COMPASS detectors registered an unusual signal after such a crash test. It seemed to indicate that the collision had created an exotic new particle for a few fractions of a second. “Particles normally consist either of three quarks — this includes the protons and neutrons, for example — or, like the pions, of one quark and one antiquark,” says Ketzer. “This new short-lived intermediate state, however, appeared to consist of four quarks.”
    Together with his research group and colleagues at the Technical University of Munich, the physicist has now put the data through a new analysis. “We were able to show that the signal can also be explained in a different way, that is, by the aforementioned triangle singularity,” he stresses. This mechanism was postulated as early as the 1950s by the Russian physicist Lev Davidovich Landau, but has not yet been proven directly.
    According to this analysis, the particle collision did not produce a tetraquark at all, but a completely normal quark-antiquark intermediate state. This state, however, disintegrated straight away in an unusual manner: “The particles involved exchanged quarks and changed their identities in the process,” says Ketzer, who is also a member of the Transdisciplinary Research Area “Building Blocks of Matter and Fundamental Interactions” (TRA Matter). “The resulting signal then looks exactly like that from a tetraquark with a different mass.” This is the first time such a triangle singularity has been detected directly mimicking a new particle in this mass range. The result is also interesting because it allows new insights into the nature of the strong interaction.
    Only a small fraction of the proton mass can be explained by Higgs mechanism
    Protons, neutrons, pions and other particles (called hadrons) have mass. They get this from the so-called Higgs mechanism, but obviously not exclusively: A proton has about 20 times more mass than can be explained by the Higgs mechanism alone. “The much bigger part of the mass of hadrons is due to the strong interaction,” Ketzer explains. “Exactly how the masses of hadrons come about, however, is not yet clear. Our data help us to better understand the properties of the strong interaction, and perhaps the ways in which it contributes to the mass of particles.”
    Story Source:
    Materials provided by University of Bonn. Note: Content may be edited for style and length.

  • From mathematics to medicine: Applying complex mathematics to analyze fMRI data

    Research led by a Wayne State University Department of Mathematics professor is aiding researchers in Wayne State’s Department of Psychiatry and Behavioral Neurosciences in analyzing fMRI data. fMRI is the preeminent class of signals collected from the brain in vivo and is irreplaceable in the study of brain dysfunction in many medical fields, including psychiatry, neurology and pediatrics.
    Andrew Salch, Ph.D., associate professor of mathematics in Wayne State’s College of Liberal Arts and Sciences, is leading the multidisciplinary team that is investigating how concepts of topological data analysis, a subfield of mathematics, can be applied to recovering “hidden” structure in fMRI data.
    “We hypothesized that aspects of the fMRI signal are not easily discoverable using many of the standard tools used for fMRI data analysis, which strategically reduce the number of dimensions in the data to be considered. Consequently, these aspects might be uncovered using concepts from the mathematical field of topological data analysis, also called TDA, which is intended for use on high-dimensional data sets,” said Salch. “The high dimensionality that characterizes fMRI data includes the three dimensions of space — that is, where in the brain the signal is being acquired — time — or how the signal varies as brain states change in time — and signal intensity — or how the strength of the fMRI signal changes in response to the task. When related to task-induced changes, the results reflect biologically meaningful aspects of brain function and dysfunction. This is a unique collaborative work: we focus on the complexities of both TDA and fMRI, show how TDA can be applied to real fMRI data, and provide open-access computational software we have developed for implementing the analyses.”
    The research article, “From mathematics to medicine: A practical primer on topological data analysis and the development of related analytic tools for the functional discovery of latent structure in fMRI data,” appears in the Aug. 12 issue of PLOS ONE.
    In it, the team used TDA to discover data structures in the anterior cingulate cortex, a critical control region in the brain. These structures — called non-contractible loops in TDA — appeared in specific conditions of the experiment, and were not identified using conventional techniques for fMRI analyses.
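    As a purely illustrative sketch of the general approach (not the team’s own pipeline; their open-access software is described in the paper), persistent homology can flag such non-contractible loops in a point cloud. The snippet below assumes the open-source Python packages numpy and ripser, and uses a synthetic noisy loop as a stand-in for a high-dimensional embedding of fMRI time series.

```python
# Illustrative only: detect loops (1-dimensional holes) with persistent homology.
# Assumes the open-source packages numpy and ripser (pip install ripser); all data are synthetic.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(0)

# Synthetic stand-in for an embedded fMRI trajectory: a noisy circle in 2-D,
# padded with extra noisy coordinates to mimic a high-dimensional point cloud.
angles = rng.uniform(0.0, 2.0 * np.pi, 200)
circle = np.column_stack([np.cos(angles), np.sin(angles)])
cloud = np.hstack([
    circle + 0.05 * rng.standard_normal(circle.shape),
    0.05 * rng.standard_normal((200, 8)),  # 10-dimensional points overall
])

# Persistence diagrams up to dimension 1; dgms[1] lists birth/death of H1 features (loops).
dgms = ripser(cloud, maxdim=1)["dgms"]
lifetimes = dgms[1][:, 1] - dgms[1][:, 0]

# A non-contractible loop shows up as an H1 feature that persists far longer than noise.
print("longest-lived loop persists for:", round(float(lifetimes.max()), 3))
print("prominent loops:", int((lifetimes > 0.5 * lifetimes.max()).sum()))
```

    Long-lived features in the H1 persistence diagram correspond to loops that cannot be contracted to a point, which is the kind of structure the study reports in the anterior cingulate cortex data.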
    “We expect this work to become a citation classic,” said Vaibhav Diwadkar, Ph.D., professor of psychiatry and behavioral neurosciences and research collaborator. “Instead of merely applying TDA to fMRI, we provide a lucid argument for why medical researchers who use fMRI should consider using TDA, and why topologists should turn their attention to the study of complex fMRI data. Moreover, this important work provides readers with empirical demonstrations of such applications, and we provide potential users with the tools we used so they can in turn apply it to their own data.”
    “Our ongoing research utilizing TDA with fMRI will provide a unique and complementary method for assessing brain function, and will give medical researchers greater flexibility in tackling complex properties in their data,” said Salch. “In particular, our work will help fMRI researchers become aware of the significant power of TDA that is designed to address complexity in data, and will enhance the value of using fMRI in neuroscience and medicine.”
    In addition to Salch and Diwadkar, co-authors on the paper include Adam Regalski, Wayne State mathematics graduate student; Hassan Abdallah, Wayne State mathematics department alumnus and current graduate student at the University of Michigan; and Michael Catanzaro, assistant professor of mathematics at Iowa State University and Wayne State mathematics department alumnus.
    This work is supported by the National Institutes of Health (MH111177 and MH059299), the Jack Dorsey Endowment, the Cohen Neuroscience Endowment, and the Lycaki-Young Funds from the State of Michigan.

  • Tailoring wearable technology and telehealth in treating Parkinson's disease

    Wearable health technologies are vastly popular with people wanting to improve their physical and mental health. Everything from exercise, sleep patterns, calories consumed and heart rhythms can be tracked by a wearable device.
    But timely and accurate data is also especially valuable for doctors treating patients with complicated health conditions using virtual care.
    A new study from the Southern Medical Program (SMP), based at UBC Okanagan, has examined the use of wearable health technology and telehealth to treat patients with Parkinson’s disease.
    Dr. Daryl Wile, a movement disorder specialist and SMP clinical assistant professor, routinely uses telehealth to connect with Parkinson’s patients across the vast and rugged landscape of BC’s Interior.
    “Even prior to the pandemic, telehealth helped deliver specialized care to patients living in remote and rural settings,” says Wile, a clinical investigator with the Centre for Chronic Disease Prevention and Management. “But with the complex nature of Parkinson’s, we wanted to enhance these appointments to better understand how movements vary throughout a patient’s entire day.”
    To add a new layer of health information, Wile and the research team added wearable technology to the equation.
    “We recruited Parkinson’s patients with either tremors or involuntary movements,” says Joshua Yoneda, SMP student and co-author of the study. “We then divided them into two groups — some using telehealth and device-based health tracking and others attending traditional face-to-face appointments.”
    The telehealth group wore devices that tracked their movements, involuntary or not, throughout waking hours. The recorded data were then reviewed during telehealth appointments to identify the times of day when patients’ Parkinson’s symptoms peaked.
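    As a purely hypothetical sketch of this kind of review (the numbers and aggregation below are invented for illustration and are not the study’s actual analysis), wearable-recorded symptom scores can be averaged by hour of day to highlight when symptoms peak:

```python
# Hypothetical illustration: average wearable symptom scores by hour of day to find peak times.
# The samples below are invented; the study's devices and analysis may differ.
from collections import defaultdict

# (hour_of_day, tremor_score) pairs as a wearable might log them over several days.
samples = [(8, 0.2), (9, 0.6), (10, 0.7), (11, 0.3), (13, 0.8),
           (14, 0.9), (15, 0.4), (8, 0.3), (13, 0.7), (14, 0.8)]

hourly = defaultdict(list)
for hour, score in samples:
    hourly[hour].append(score)

# Mean score per hour, then the hours with the strongest symptoms.
mean_by_hour = {hour: sum(scores) / len(scores) for hour, scores in hourly.items()}
peak_hours = sorted(mean_by_hour, key=mean_by_hour.get, reverse=True)[:2]
print("peak symptom hours:", peak_hours)  # e.g. [14, 13]
```

    A summary like this, viewed alongside medication timing, is the kind of information that supports the tailoring step described next.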
    “With the integration of accurate and reliable data from wearable devices, we were able to tailor a patient’s medication to better manage their symptoms throughout the day,” adds Wile.
    As part of the study, patients were asked a series of questions from the standardized Parkinson Disease Quality of Life Index. Both study groups were assessed at intervals of six weeks, three months and six months.
    Overall, the patients using the wearable devices reported positive experiences and health outcomes in combination with telehealth appointments to access specialized care.
    “There’s definitely a strong case to leverage multiple technologies to improve a patient’s quality of life and limit the added stress and cost associated with travel,” says Yoneda.
    Story Source:
    Materials provided by University of British Columbia Okanagan campus. Note: Content may be edited for style and length.

  • Water-driven soft actuator developed

    Sea cucumbers have a bumpy and oblong shape. They are soft but stiffen up quickly when touched. They can shrink or stretch by up to several meters, and even after they die and shrivel up, their original shape can be recovered through the regulation of water uptake. Recently, a POSTECH research team has developed a soft actuator inspired by this unique behavior of sea cucumbers.
    A research team led by Professor Dong Sung Kim, Dr. Andrew Choi (currently the director of R&D at EDmicBio, Inc.), and Hyeonseok Han of POSTECH’s Department of Mechanical Engineering was inspired by the mutable collagenous tissue (MCT) of sea cucumbers to develop a water-driven self-operating soft actuator that exceeds the strength and speed of conventional soft actuators. This research was published as the inside front cover paper in the latest issue of Journal of Materials Chemistry A.
    The body of a sea cucumber is made of MCT and can thus be hardened or softened according to the surrounding environment. In a matter of a few seconds, the elastic modulus of a sea cucumber can change by up to a factor of 10, letting it quickly squeeze through crevices or inflate to threaten predators. This change is induced by the formation or destruction of hydrogen bonds in the collagenous tissue, controlled by chemical regulators.
    A conventional actuator is a rigid device that converts a signal, typically an electrical one, into a change of physical state, as in a motor or a switch. A soft actuator that responds to water, using it as an energy source, can instead be applied in soft robotics, which requires freedom of movement. However, existing soft actuators have been limited in their applications by their fragility and slow speed.
    Inspired by the MCT of sea cucumbers, which changes shape freely by reacting with water, the research team designed a programmable actuator. Based on a bulk PNIPAAm hydrogel that deforms very flexibly, it showed an actuation force 200 times greater (2 newtons) and an actuation speed 300 times faster (1/3 second) than conventional soft actuators that use water as an energy source, even underwater at a temperature of 80°C. In addition, several tests demonstrated that the actuator was robust enough to recover its original shape even after being subjected to 300% tensile strain.
    This actuator could be applied in many industrial and biomedical settings, for example in industrial robots such as grippers that grab and lift materials like a human arm, in wound closures, and in artificial fingers.
    “The soft robot activates when it comes in contact with moisture and is flexible and deformable to easily adapt to various environments,” explained Professor Dong Sung Kim. “This newly developed hydrogel actuator is very powerful and actuates quickly to enable operation even in places without electricity by using chemical energy.”
    This research was conducted with the support of the Mid-career Researcher Program and the Core Technology Biomedical Development Program funded by the Ministry of Science and ICT and the National Research Foundation of Korea, and the Alchemist Project funded by the Ministry of Trade, Industry and Energy.
    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  • Fast changes between the solar seasons resolved by new sun clock

    Violent activity on our Sun leads to some of the most extreme space weather events on Earth, impacting systems such as satellites, communications systems, power distribution and aviation. The roughly 11 year cycle of solar activity has three ‘seasons’, each of which affects the space weather felt at Earth differently: (i) solar maximum, when the sun is active and disordered, space weather is stormy and events are irregular; (ii) the declining phase, when the sun and solar wind become ordered and space weather is more moderate; and (iii) solar minimum, when activity is quiet.
    In a new study led by the University of Warwick and published in The Astrophysical Journal, scientists found that the change from solar maximum to the declining phase is fast, happening within a few (27 day) solar rotations. They also showed that the declining phase is twice as long in even-numbered solar cycles as it is in odd-numbered cycles.
    No two solar cycles are the same in amplitude or duration. To study the solar seasons, the scientists built a sun clock from the daily sunspot number record available since 1818. This maps the irregular solar cycles onto a regular clock. The magnetic polarity of the sun reverses after each roughly 11 year solar cycle, giving a roughly 22 year magnetic cycle (named after George Ellery Hale), and to explore this, a 22 year clock was also constructed. The effect on space weather at Earth can be tracked back using the longest continuous records of geomagnetic activity, spanning the past 150 years; once the clock is constructed, it can be used to study multiple observations of seasonal solar activity that affect the Earth.
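    A minimal sketch of the clock idea (with approximate placeholder minimum dates rather than the values used in the study) maps each calendar date to a normalised phase between one solar minimum and the next, so that every cycle, long or short, spans one revolution of the clock:

```python
# Illustrative sketch of the "sun clock" idea: map irregular solar-cycle dates onto a
# regular 0..1 phase so that every cycle, long or short, spans one clock revolution.
# The minimum dates below are approximate placeholders, not the values used in the study.
from datetime import date

cycle_minima = [date(1996, 8, 1), date(2008, 12, 1), date(2019, 12, 1)]  # approximate

def cycle_phase(day: date) -> float:
    """Return the normalised phase (0 = one minimum, 1 = the next) for a given date."""
    for start, end in zip(cycle_minima, cycle_minima[1:]):
        if start <= day < end:
            return (day - start).days / (end - start).days
    raise ValueError("date outside the tabulated cycles")

# Two dates that are years apart in calendar time can sit at a similar clock phase.
print(round(cycle_phase(date(2001, 11, 1)), 2))  # near the maximum of cycle 23
print(round(cycle_phase(date(2014, 4, 1)), 2))   # near the maximum of cycle 24
```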
    With the greater detail afforded by the sun clock, the scientists could see that the switch from solar maximum to the declining phase is fast, occurring within a few (27 day) solar rotations. There was also a clear difference in the duration of the declining phase when the sun’s magnetic polarity is ‘up’ compared to ‘down’: in even-numbered cycles it is around twice as long as it is in odd-numbered cycles. As we are about to enter cycle 25, the scientists anticipate that the next declining phase will be short.
    Lead author Professor Sandra Chapman of the University of Warwick Department of Physics said: “By combining well known methods in a new way, our clock resolves changes in the Sun’s climate to within a few solar rotations. Then you find the changes between some phases can be really sharp.
    “If you know you’ve had a long cycle, you know the next one’s going to be short, so we can estimate how long it’s going to last. Knowing the timing of the climate seasons helps to plan for space weather. Operationally it is useful to know when conditions will be active or quiet, for satellites, power grids, communications.”
    The results also provide a clue to understanding how the Sun reverses polarity after every cycle.
    Professor Chapman adds: “I also think it is remarkable that something the size of the sun can flip its magnetic field every 11 years, and going down-up is different to going up-down. Somehow the sun ‘knows which way up it is’, and this is an intriguing problem, at the heart of how the sun generates its magnetic field.”
    Story Source:
    Materials provided by University of Warwick. Note: Content may be edited for style and length.

  • On the road to faster and more efficient data storage

    A research team has discovered magnetic phenomena in antiferromagnets that could pave the way to developing faster and more efficient data storage.
    How do magnetic waves behave in antiferromagnets and how do they spread? What role do “domain walls” play in the process? And what could this mean for the future of data storage? These questions are the focus of a recent publication in the journal Physical Review Letters from an international research team led by Konstanz physicist Dr Davide Bossini. The team reports on magnetic phenomena in antiferromagnets that can be induced by ultrafast (femtosecond) laser pulses and that could endow the materials with new functionalities for energy-efficient and ultrafast data storage applications.
    Demand for storage capacity is growing faster than the available infrastructure
    The rapidly increasing use of big data technologies and cloud-based data services means that the global demand for data storage is constantly expanding — along with the need for ever-faster data processing. At the same time, the currently available technologies will not be able to keep up forever. “The estimates say that the growing demand can only be met for a limited period of about 10 years, if no novel, more efficient technologies for data storage and processing can be developed in the meantime,” says physicist Dr Davide Bossini from the University of Konstanz, lead author of the study.
    To prevent a data crisis, it will not be enough to simply keep building more and more data centres operating at the current state of the art. The technologies of the future must also be faster and more energy-efficient than traditional mass data storage based on magnetic hard disks. One class of materials, antiferromagnets, is a promising candidate for developing the next generation of information technology.
    The structure of antiferromagnets
    We are all familiar with household magnets made from iron or other ferromagnetic materials. In these materials, the magnetic moments of the atoms are all oriented in the same direction — like small needles of a compass — so that a magnetic polarization (magnetization) arises that affects the surrounding environment. Antiferromagnets, by contrast, have atoms with alternating magnetic moments that cancel each other out. Antiferromagnets thus have no net magnetization and therefore no magnetic effect on their surroundings.
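    In a schematic two-sublattice picture (a generic textbook illustration, not a description of the specific materials studied here), the difference can be written as
    \[ \mathbf{M}_{\mathrm{ferro}} = \sum_{i=1}^{N}\mathbf{m}_i = N\,\mathbf{m}, \qquad \mathbf{M}_{\mathrm{antiferro}} = \tfrac{N}{2}\,\mathbf{m} + \tfrac{N}{2}\,(-\mathbf{m}) = \mathbf{0}, \]
    so an antiferromagnet can carry magnetic order internally while remaining magnetically “invisible” from the outside.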

  • Heavily enriched: An energy-efficient way of enriching hydrogen isotopes in silicon

    The discovery of isotopes in the early 20th century marked a key moment in the history of physics and led to a much more refined understanding of the atomic nucleus. Isotopes are ‘versions’ of a given element of the periodic table that bear the same number of protons but a different number of neutrons, and therefore vary in mass. These differences in mass can radically alter certain physical properties of the atoms, such as their radioactive decay rates, their possible reaction pathways in nuclear fission reactors, and much more.
    While most isotopes of an element share similar chemical properties, there is one notable exception: hydrogen isotopes. Most hydrogen atoms on Earth contain only one proton and one electron, but there exist hydrogen isotopes which also have one neutron (deuterium) or two neutrons (tritium). Deuterium, which essentially weighs twice as much as ‘normal’ hydrogen, has found many practical and scientific uses. For example, it can be used to label and track molecules such as proteins to investigate biochemical processes. It can also be strategically used in drugs to reduce their metabolic rate and increase their half-life in the body.
    Another important application of deuterium exists in the field of semiconductor electronics. The surface of silicon-based semiconductors has to be ‘passivated’ with hydrogen to ensure silicon atoms don’t come off (desorb) easily, thereby increasing the durability of microchips, batteries, and solar cells. However, through mechanisms that are still not completely understood, passivation with deuterium instead of hydrogen results in desorption probabilities about one hundred times lower, implying that deuterium may soon become an indispensable ingredient in electronic devices. Unfortunately, both the procurement of deuterium and available techniques to enrich silicon surfaces with it are very energy inefficient or require very expensive deuterium gas.
    Fortunately, at Nagoya City University (NCU), Japan, a team of scientists led by Professor Takahiro Matsumoto has found an energy-efficient strategy to enrich silicon surfaces using a dilute deuterium solution. This study, which was published in Physical Review Materials, was carried out in collaboration with Dr. Takashi Ohhara of the Japan Atomic Energy Agency and Dr. Yoshihiko Kanemitsu from Kyoto University.
    The researchers found that a peculiar exchange reaction from hydrogen to deuterium can occur on the surface of nanocrystalline silicon (n-Si). They demonstrated this reaction in thin n-Si films submerged in a deuterium-containing solution using inelastic neutron scattering. This spectroscopy technique involves irradiating neutrons onto a sample and analyzing the resulting atomic motions or crystal vibrations. These experiments, coupled with other spectroscopy methods and energy calculations based on quantum mechanics, revealed the underlying mechanisms that favor the replacement of hydrogen terminations on the surface of n-Si with deuterium: The exchange process is closely related to differences in the surface vibrational modes between hydrogen- and deuterium-terminated n-Si. “We achieved a fourfold increase in the concentration of surface deuterium atoms on n-Si in our experiments performed in the liquid phase,” highlights Dr. Matsumoto, “We also proposed a gas-phase enrichment protocol for n-Si that, according to our theoretical calculations, could enhance the rate of deuterium enrichment 15-fold.”
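    As a rough back-of-the-envelope illustration (a simple diatomic, harmonic approximation, not the paper’s full quantum-mechanical treatment), swapping hydrogen for deuterium roughly doubles the reduced mass of the surface bond and therefore lowers its vibrational frequency:
    \[ \omega = \sqrt{\frac{k}{\mu}}, \qquad \mu_{\mathrm{Si\text{-}H}} = \frac{28\times 1}{28+1} \approx 0.97\ \mathrm{u}, \qquad \mu_{\mathrm{Si\text{-}D}} = \frac{28\times 2}{28+2} \approx 1.87\ \mathrm{u}, \qquad \frac{\omega_{\mathrm{H}}}{\omega_{\mathrm{D}}} \approx \sqrt{\frac{1.87}{0.97}} \approx 1.4 . \]
    The lower frequency and zero-point energy of the deuterium-terminated bond is one simple way to see why the two terminations behave differently in vibration-mediated processes such as desorption and the exchange reaction described here.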
    This innovative strategy of exploiting quantum effects on the surface of n-Si could pave the way to new methods to procure and utilize deuterium. “The efficient hydrogen-to-deuterium exchange reaction we reported may lead to sustainable, economically feasible, and environment-friendly deuterium enrichment protocols, leading to more durable semiconductor technology,” concludes Dr. Matsumoto.
    The NCU team also stated, “It has been theoretically predicted that the heavier the hydrogen is, the higher the efficiency of the exchange reaction is. Thus, we can expect more efficient enrichment of tritium atoms on n-Si, which leads to the possibility of purifying tritium-contaminated water. We believe that this is an issue that must be urgently solved.”
    Let us hope the findings of this work allow us to benefit more from the heavier isotopes of hydrogen without taking a toll on our planet.
    Story Source:
    Materials provided by Nagoya City University. Note: Content may be edited for style and length.

  • Table-top electron camera catches ultrafast dynamics of matter

    Scientists at DESY have built a compact electron camera that can capture the inner, ultrafast dynamics of matter. The system shoots short bunches of electrons at a sample to take snapshots of its current inner structure and is the first such electron diffractometer to use Terahertz radiation for pulse compression. The development team, led by DESY scientists Dongfang Zhang and Franz Kärtner from the Center for Free-Electron Laser Science CFEL, validated its Terahertz-enhanced ultrafast electron diffractometer by investigating a silicon sample and presents the work in the first issue of the journal Ultrafast Science, a new title in the Science group of scientific journals.
    Electron diffraction is one way to investigate the inner structure of matter. However, it does not image the structure directly. Instead, when the electrons hit or traverse a solid sample, they are deflected in a systematic way by the electrons in the solid’s inner lattice. From the pattern of this diffraction, recorded on a detector, the internal lattice structure of the solid can be calculated. To detect dynamic changes in this inner structure, short bunches of sufficiently bright electrons have to be used. “The shorter the bunch, the faster the exposure time,” says Zhang, who is now a professor at Shanghai Jiao Tong University. “Typically, ultrafast electron diffraction (UED) uses bunch lengths, or exposure times, of some 100 femtoseconds, which is 0.1 trillionths of a second.”
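    As a rough numerical illustration (the values are chosen for illustration only, not taken from the study, and relativistic corrections are ignored), the electrons’ de Broglie wavelength and the Bragg condition that links the diffraction pattern to the lattice spacing are
    \[ \lambda = \frac{h}{p} \approx \frac{h}{\sqrt{2\, m_e\, e\, U}}, \qquad 2\, d \sin\theta = n\,\lambda . \]
    For an illustrative acceleration voltage of U = 50 kV this gives λ ≈ 5.5 pm, much smaller than typical lattice spacings of a few hundred picometres, so the Bragg angles are only a fraction of a degree and many diffraction spots can be recorded on a detector at once.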
    Such short electron bunches can be routinely produced with high quality by state-of-the-art particle accelerators. However, these machines are often large and bulky, partly due to the radio frequency radiation used to power them, which operates in the Gigahertz band. The wavelength of the radiation sets the size of the whole device. The DESY team is now using Terahertz radiation instead, with roughly a hundred times shorter wavelengths. “This basically means the accelerator components, here a bunch compressor, can be a hundred times smaller, too,” explains Kärtner, who is also a professor and a member of the cluster of excellence “CUI: Advanced Imaging of Matter” at the University of Hamburg.
    For their proof-of-principle study, the scientists fired bunches with roughly 10,000 electrons each at a silicon crystal that was heated by a short laser pulse. The bunches were about 180 femtoseconds long, and the measurements clearly show how the crystal lattice of the silicon sample expands within a picosecond (a trillionth of a second) after the laser hits the crystal. “The behaviour of silicon under these circumstances is very well known, and our measurements fit the expectation perfectly, validating our Terahertz device,” says Zhang. He estimates that in an optimised set-up, the electron bunches can be compressed to significantly less than 100 femtoseconds, allowing even faster snapshots.
    On top of its reduced size, the Terahertz electron diffractometer has another advantage that might be even more important to researchers: “Our system is perfectly synchronised, since we are using just one laser for all steps: generating, manipulating, measuring and compressing the electron bunches, producing the Terahertz radiation and even heating the sample,” Kärtner explains. Synchronisation is key in this kind of ultrafast experiment. To monitor the swift structural changes within a sample of matter like silicon, researchers usually repeat the experiment many times while delaying the measuring pulse a little more each time. The more accurately this delay can be adjusted, the better the result. Usually, some kind of synchronisation is needed between the exciting laser pulse that starts the experiment and the measuring pulse, in this case the electron bunch. If the start of the experiment and the generation and manipulation of the electron bunch are all triggered by the same laser, the synchronisation is intrinsically given.
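    The following schematic sketch (hypothetical numbers, not the DESY measurement) shows why this matters: in a delay scan the experiment is repeated at stepped pump-probe delays, and random timing jitter between pump and probe smears out fast features in the reconstructed response.

```python
# Schematic illustration (hypothetical numbers, not the DESY data): in a pump-probe
# delay scan the experiment is repeated at stepped delays; random pump-probe timing
# jitter washes out fast features, which is why deriving everything from one laser helps.
import numpy as np

rng = np.random.default_rng(1)

def sample_response(t_fs):
    """Toy transient: appears at t = 0 and relaxes within roughly 200 femtoseconds."""
    return np.where(t_fs > 0, np.exp(-t_fs / 200.0), 0.0)

delays = np.arange(-500.0, 1500.0, 50.0)   # programmed pump-probe delays in femtoseconds

for jitter_fs in (0.0, 300.0):             # perfectly synchronised vs. 300 fs of jitter
    measured = np.array([
        sample_response(d + jitter_fs * rng.standard_normal(200)).mean()
        for d in delays
    ])
    error = np.abs(measured - sample_response(delays)).max()
    print(f"timing jitter {jitter_fs:3.0f} fs -> worst-case smearing of the transient: {error:.2f}")
```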
    In a next step, the scientists plan to increase the energy of the electrons. Higher energy means the electrons can penetrate thicker samples. The prototype set-up used rather low-energy electrons and the silicon sample had to be sliced down to a thickness of just 35 nanometres (millionths of a millimetre). Adding another acceleration stage could give the electrons enough energy to penetrate 30 times thicker samples with a thickness of up to 1 micrometre (thousandth of a millimetre), as the researchers explain. For even thicker samples, X-rays are normally used. While X-ray diffraction is a well established and hugely successful technique, electrons usually do not damage the sample as quickly as X-rays do. “The energy deposited is much lower when using electrons,” explains Zhang. This could prove useful when investigating delicate materials.
    This work has been supported by the European Research Council under the European Union’s Seventh Framework Program (FP7/2007-2013) through the Synergy Grant AXSIS (609920), Project KA908-12/1 of the Deutsche Forschungsgemeinschaft, and the accelerator on a chip program (ACHIP) funded by the Gordon and Betty Moore foundation (GBMF4744).
    DESY is one of the world’s leading particle accelerator centres and investigates the structure and function of matter — from the interaction of tiny elementary particles and the behaviour of novel nanomaterials and vital biomolecules to the great mysteries of the universe. The particle accelerators and detectors that DESY develops and builds at its locations in Hamburg and Zeuthen are unique research tools. They generate the most intense X-ray radiation in the world, accelerate particles to record energies and open up new windows onto the universe. DESY is a member of the Helmholtz Association, Germany’s largest scientific association, and receives its funding from the German Federal Ministry of Education and Research (BMBF) (90 per cent) and the German federal states of Hamburg and Brandenburg (10 per cent).