More stories

  • Ecologists confirm Alan Turing's theory for Australian fairy circles

    Fairy circles are one of nature’s greatest enigmas and most visually stunning phenomena. An international research team led by the University of Göttingen has now, for the first time, collected detailed data to show that Alan Turing’s model explains the striking vegetation patterns of the Australian fairy circles. In addition, the researchers showed that the grasses that make up these patterns act as “eco-engineers” to modify their own hostile and arid environment, thus keeping the ecosystem functioning. The results were published in the Journal of Ecology.
    Researchers from Germany, Australia and Israel undertook an in-depth fieldwork study in the remote Outback of Western Australia. They used drone technology, spatial statistics, quadrat-based field mapping, and continuous data-recording from a field-weather station. With the drone and a multispectral camera, the researchers mapped the “vitality status” of the Triodia grasses (how strong and how well they grew) in five one-hectare plots and classified them into high- and low-vitality.
    The systematic and detailed fieldwork enabled, for the first time in such an ecosystem, a comprehensive test of the “Turing pattern” theory. Turing’s concept was that, in certain systems, random disturbances and a “reaction-diffusion” mechanism mean the interaction of just two diffusible substances is enough for strongly patterned structures to emerge spontaneously. Physicists have used this model to explain the striking skin patterns of zebrafish or leopards, for instance. Earlier modelling had suggested this theory might apply to these intriguing vegetation patterns; now there is robust data from multiple scales confirming that Alan Turing’s model applies to Australian fairy circles.
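As a rough illustration of the reaction-diffusion mechanism, the toy one-dimensional activator-inhibitor model below (a Gierer-Meinhardt-style system with illustrative parameters, not the study's ecohydrological model) lets small random disturbances of a uniform state grow into a stationary spatial pattern:

```python
import math
import random

def laplacian(field, i):
    """Discrete 1D Laplacian with periodic boundaries (grid spacing 1)."""
    n = len(field)
    return field[(i - 1) % n] + field[(i + 1) % n] - 2.0 * field[i]

def turing_1d(n=64, steps=10000, dt=0.01, d_act=0.02, d_inh=1.0, seed=0):
    """Euler-integrate a Gierer-Meinhardt-style activator-inhibitor pair.

    The activator a diffuses slowly and promotes itself (a^2/h); the
    inhibitor h diffuses fast and suppresses the activator. Small random
    perturbations of the uniform state a = h = 1 can then self-organize
    into a stationary spatial pattern: Turing's mechanism.
    """
    rng = random.Random(seed)
    a = [1.0 + 0.01 * rng.uniform(-1.0, 1.0) for _ in range(n)]
    h = [1.0] * n
    for _ in range(steps):
        a_new = [a[i] + dt * (d_act * laplacian(a, i) + a[i] * a[i] / h[i] - a[i])
                 for i in range(n)]
        h_new = [h[i] + dt * (d_inh * laplacian(h, i) + a[i] * a[i] - h[i])
                 for i in range(n)]
        a, h = a_new, h_new
    return a
```

The key ingredient is the large ratio between the two diffusion constants; with equal diffusion the perturbations simply decay back to the uniform state.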
    The data show that the unique gap pattern of the Australian fairy circles, which occur only in a small area east of the town of Newman, emerges from ecohydrological biomass-water feedbacks from the grasses. In fact, the fairy circles — with their large diameters of 4m, clay crusts from weathering and resultant water run-off — are a critical extra source of water for the dryland vegetation. Clumps of grasses increased shading and water infiltration around the nearby roots. In the years after fire, the grasses merged more and more at the periphery of the vegetation gaps, forming a barrier that maximized their uptake of the runoff water from the fairy circles. The protective plant cover of grasses could reduce soil-surface temperatures by about 25°C at the hottest time of the day, which facilitates the germination and growth of new grasses. In summary, the scientists found evidence both at the scale of the landscape and at much smaller scales that the grasses, with their cooperative growth dynamics, redistribute the water resources, modulate the physical environment, and thus function as “ecosystem engineers” to modify their own environment and better cope with the arid conditions.
    Dr Stephan Getzin, Department of Ecosystem Modelling at the University of Göttingen, explains, “The intriguing thing is that the grasses are actively engineering their own environment by forming symmetrically spaced gap patterns. The vegetation benefits from the additional runoff water provided by the large fairy circles, and so keeps the arid ecosystem functional even in very harsh, dry conditions.” This contrasts with the uniform vegetation cover seen in less water-stressed environments. “Without the self-organization of the grasses, this area would likely become desert, dominated by bare soil,” he adds. The emergence of Turing-like patterned vegetation seems to be nature’s way of coping with an ancient and permanent water shortage.
    In 1952, when the British mathematician Alan Turing published his ground-breaking theoretical paper on pattern formation, he had most likely never heard of fairy circles. But with his theory he laid the foundation for generations of physicists to explain highly symmetrical patterns such as sand ripples in dunes, cloud stripes in the sky or spots on an animal’s coat with the reaction-diffusion mechanism. Now, ecologists have provided an empirical study to extend this principle from physics to dryland ecosystems with fairy circles.

    Story Source:
    Materials provided by University of Göttingen. Note: Content may be edited for style and length.

  • New freshwater database tells water quality story for 12K lakes globally

    Although less than one per cent of all water in the world is freshwater, it is what we drink and use for agriculture. In other words, it’s vital to human survival. York University researchers have just created a publicly available water quality database for close to 12,000 freshwater lakes globally — almost half of the world’s freshwater supply — that will help scientists monitor and manage the health of these lakes.
    The study, led by Faculty of Science Postdoctoral Fellow Alessandro Filazzola and Master’s student Octavia Mahdiyan, collected data for lakes in 72 countries, from Antarctica to the United States and Canada. Hundreds of the lakes are in Ontario.
    “The database can be used by scientists to answer questions about what lakes or regions may be faring worse than others, how water quality has changed over the years and which environmental stressors are most important in driving changes in water quality,” says Filazzola.
    The team included a host of graduate and undergraduate students working in the laboratory of Associate Professor Sapna Sharma in addition to a collaboration with Assistant Professor Derek Gray of Wilfrid Laurier University, Associate Professor Catherine O’Reilly of Illinois State University and York University Associate Professor Roberto Quinlan.
    The researchers reviewed 3,322 studies from as far back as the 1950s along with online data repositories to collect data on chlorophyll levels, a commonly used marker to determine lake and ecosystem health. Chlorophyll is a predictor of the amount of vegetation and algae in lakes, known as primary production, including invasive species such as milfoil.
    “Human activity, climate warming, agricultural and urban runoff, and phosphorus from land use can all increase the level of chlorophyll in lakes. Primary production is best represented by the amount of chlorophyll in the lake, which has a cascading impact on the zooplankton that eat the algae, the fish that eat the zooplankton and the fish that eat those fish,” says Filazzola. “If the chlorophyll is too low, it can have cascading negative effects on the entire ecosystem, while too much can cause an abundance of algae growth, which is not always good.”
    Warming summer temperatures and increased solar radiation from decreased cloud cover in the northern hemisphere also contribute to an increase in chlorophyll, while more storm events caused by climate change contribute to degraded water quality, says Sharma. “Agricultural areas and urban watersheds are more associated with degraded water quality conditions because of the amount of nutrients input into these lakes.”
    The researchers also gathered data on phosphorus and nitrogen levels — often a predictor of chlorophyll — as well as lake characteristics, land use variables, and climate data for each lake. Freshwater lakes are particularly vulnerable to changes in nutrient levels, climate, land use and pollution.
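As a toy illustration of the kind of query such a database enables (the lake records, variable names and threshold below are invented for the example, not values from the actual database):

```python
# Invented records mimicking the database's variables: chlorophyll,
# a nutrient (total phosphorus), and basic lake metadata.
LAKES = [
    {"name": "Lake A", "country": "Canada", "chlorophyll_ug_l": 2.1, "total_p_ug_l": 8.0},
    {"name": "Lake B", "country": "Canada", "chlorophyll_ug_l": 14.7, "total_p_ug_l": 35.0},
    {"name": "Lake C", "country": "USA", "chlorophyll_ug_l": 6.3, "total_p_ug_l": 20.0},
]

def lakes_above(records, variable, threshold):
    """Return the names of lakes whose value for `variable` exceeds `threshold`."""
    return [r["name"] for r in records if r[variable] > threshold]
```

For instance, `lakes_above(LAKES, "chlorophyll_ug_l", 10.0)` picks out only "Lake B", the sort of "which lakes are faring worse" question Filazzola describes.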
    “In addition to drinking water, freshwater is important for transportation, agriculture, and recreation, and provides habitats for more than 100,000 species of invertebrates, insects, animals and plants,” says Sharma. “The database can be used to improve our understanding of how chlorophyll levels respond to global environmental change and it provides baseline comparisons for environmental managers responsible for maintaining water quality in lakes.”
    The researchers initially looked only at Ontario lakes but quickly expanded the database globally: although there are thousands of lakes in Ontario, much of the data is not as readily available as it is in other regions of the world.
    “The creation of this database is a feat typically only accomplished by very large teams with millions of dollars, not by a single lab with a few small grants, which is why I am especially proud of this research,” says Sharma.

    Story Source:
    Materials provided by York University. Note: Content may be edited for style and length.

  • Thin and ultra-fast photodetector sees the full spectrum

    Researchers have developed the world’s first photodetector that can see all shades of light, in a prototype device that radically shrinks one of the most fundamental elements of modern technology.
    Photodetectors work by converting information carried by light into an electrical signal and are used in a wide range of technologies, from gaming consoles to fibre optic communication, medical imaging and motion detectors. Currently, photodetectors are unable to sense more than one colour in a single device.
    This means they have remained bigger and slower than other technologies, like the silicon chip, that they integrate with.
    The new hyper-efficient broadband photodetector developed by researchers at RMIT University is at least 1,000 times thinner than the smallest commercially available photodetector device.
    In a significant leap for the technology, the prototype device can also see all shades of light between ultraviolet and near infrared, opening new opportunities to integrate electrical and optical components on the same chip.
    *New possibilities*
    The breakthrough technology opens the door for improved biomedical imaging, advancing early detection of health issues like cancer.


    Study lead author, PhD researcher Vaishnavi Krishnamurthi, said in photodetection technologies, making a material thinner usually came at the expense of performance.
    “But we managed to engineer a device that packs a powerful punch, despite being thinner than a nanometre, which is roughly a million times smaller than the width of a pinhead,” she said.
    As well as shrinking medical imaging equipment, the ultra-thin prototype opens possibilities for more effective motion detectors, low-light imaging and potentially faster fibre optical communication.
    “Smaller photodetectors in biomedical imaging equipment could lead to more accurate targeting of cancer cells during radiation therapy,” Krishnamurthi said.
    “Shrinking the technology could also help deliver smaller, portable medical imaging systems that could be brought into remote areas with ease, compared to the bulky equipment we have today.”
    *Lighting up the spectrum*


    How versatile and useful photodetectors are depends largely on three factors: their operating speed, their sensitivity to lower levels of light and how much of the spectrum they can sense.
    Typically, when engineers have tried improving a photodetector’s capabilities in one of those areas, at least one of the other capabilities has been diminished.
    Current photodetector technology relies on a stacked structure of three to four layers.
    Imagine a sandwich, where you have bread, butter, cheese and another layer of bread — regardless of how good you are at squashing that sandwich, it will always be four layers thick, and if you remove a layer, you’d compromise the quality.
    The researchers from RMIT’s School of Engineering scrapped the stacked model and worked out how to use a nanothin layer — just a single atom thick — on a chip.
    Importantly, they did this without diminishing the photodetector’s speed, low-light sensitivity or visibility of the spectrum.
    The prototype device can interpret light ranging from deep ultraviolet to near infrared wavelengths, making it sensitive to a broader spectrum than a human eye.
    And it does this over 10,000 times faster than the blink of an eye.
    *Nano-thin technology*
    A major challenge for the team was ensuring electronic and optical properties didn’t deteriorate when the photodetector was shrunk, a technological bottleneck that had previously prevented miniaturisation of light detection technologies.
    Chief investigator Associate Professor Sumeet Walia said the material used, tin monosulfide, is low-cost and naturally abundant, making it attractive for electronics and optoelectronics.
    “The material allows the device to be extremely sensitive in low-lighting conditions, making it suitable for low-light photography across a wide light spectrum,” he said.
    Walia said his team is now looking at industry applications for their photodetector, which can be integrated with existing technologies such as CMOS chips.
    “With further development, we could be looking at applications including more effective motion detection in security cameras at night and faster, more efficient data storage,” he said.

  • Web resources bring new insight into COVID-19

    Researchers around the world are a step closer to a better understanding of the intricacies of COVID-19 thanks to two new web resources developed by investigators at Baylor College of Medicine and the University of California San Diego. The resources are freely available through the Signaling Pathways Project (Baylor) and the Network Data Exchange (UCSD). They put at researchers’ fingertips information about cellular genes whose expression is affected by coronavirus infection and place these data points in the context of the complex network of host molecular signaling pathways. Using this resource has the potential to accelerate the development of novel therapeutic strategies.
    The study appears in the journal Scientific Data.
    “Our motivation for developing this resource is to contribute to making research about COVID-19 more accessible to the scientific community. When researchers have open access to each other’s work, discoveries move forward more efficiently,” said leading author Dr. Neil McKenna, associate professor of molecular and cellular biology and member of the Dan L Duncan Comprehensive Cancer Center at Baylor.
    The Signaling Pathway Project
    For years, the scientific community has been generating and archiving molecular datasets documenting how genes are expressed as cells conduct their normal functions, or in association with disease. However, usually this information is not easily accessible.
    In 2019, McKenna and his colleagues developed the Signaling Pathways Project, a web-based platform that integrates molecular datasets published in the scientific literature into consensus regulatory signatures, or what they are calling consensomes, that rank genes according to their rates of differential expression.
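The consensome idea of ranking genes by how consistently they respond across independent datasets can be sketched as follows (the gene names and per-dataset calls are placeholders for illustration, not findings from the project):

```python
from collections import Counter

# Hypothetical per-dataset lists of genes called differentially expressed.
datasets = [
    ["IFNB1", "ISG15", "IL6"],
    ["ISG15", "IL6", "CXCL10"],
    ["ISG15", "CXCL10", "IFNB1"],
]

def consensus_ranking(gene_lists):
    """Rank genes by the fraction of datasets in which they were called
    differentially expressed -- the spirit of a 'consensome'.

    Returns (gene, fraction) pairs, most consistent first; ties are
    broken alphabetically for a stable ordering.
    """
    counts = Counter(g for genes in gene_lists for g in genes)
    n = len(gene_lists)
    ranked = sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))
    return [(gene, count / n) for gene, count in ranked]
```

Here the top-ranked gene is the one seen in every dataset; the real platform additionally weights by measured differential-expression rates rather than simple presence/absence.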


    In the current study, the researchers generated consensomes for genes affected by infection with three major coronaviruses: Middle East respiratory syndrome coronavirus (MERS) and severe acute respiratory syndrome coronaviruses 1 (SARS1) and 2 (SARS2, which causes COVID-19).
    McKenna and his colleagues provide a resource that assists researchers in making the most out of coronavirus datasets. The resource identifies the genes whose expression is most consistently affected by the infection and integrates those responses with data about the cells’ molecular signaling pathways, offering a better picture of what happens inside a cell infected by coronavirus and how the cell responds.
    “The collaboration with UCSD makes our analyses available as intuitive Cytoscape-style networks,” says McKenna. “Because using these resources does not require training in meta-analysis, they greatly lower the barriers to usability by bench researchers.”
    Providing new insights into COVID-19
    The consensus strategy, the researchers explain, can bring to light previously unrecognized links or provide further support for suspected connections between coronavirus infection and human signaling pathways, ultimately simplifying the generation of hypotheses to be tested in the laboratory.


    For example, the connection between pregnancy and susceptibility to COVID-19 has been difficult to evaluate due to lack of clinical data, but McKenna and colleagues’ approach has provided new insights into this puzzle.
    “We found evidence that progesterone receptor signaling antagonizes SARS2-induced inflammatory signaling mediated by interferon in the airway epithelium. This finding suggests the hypothesis that the suppression of the interferon response to SARS2 infection by elevated circulating progesterone during pregnancy may contribute to the asymptomatic clinical course,” McKenna said.
    Consistent with their hypothesis, while this paper was being reviewed, a clinical trial was launched to evaluate progesterone as a treatment for COVID-19 in men.
    Scott A. Ochsner at Baylor College of Medicine and Rudolf T. Pillich at the University of California San Diego were also authors of this work.
    This study was supported by the National Institute of Diabetes and Digestive and Kidney Diseases NIDDK Information Network (DK097748), the National Cancer Institute (CA125123, CA184427) and by the Brockman Medical Research Foundation. The Signaling Pathways Project website is hosted by the Dan L Duncan Comprehensive Cancer Center.

  • Cities beat suburbs at inspiring cutting-edge innovations

    The disruptive inventions that make people go “Wow!” tend to come from research in the heart of cities and not in the suburbs, a new study suggests.
    Researchers found that, within metro areas, the majority of patents come from innovations created in suburbs — often in the office parks of big tech companies like Microsoft and IBM.
    But the unconventional, disruptive innovations — the ones that combine research from different technological fields — are more likely to be produced in cities, said Enrico Berkes, co-author of the study and postdoctoral researcher in economics at The Ohio State University.
    These unconventional patents are ones that, for example, may blend research on acoustics with research on information storage — the basis for digital music players like the iPod. Or patents that cite previous work on “vacuum cleaning” and “computing” to produce the Roomba.
    “Densely populated cities do not generate more patents than the suburbs, but they tend to generate more unconventional patents,” said Berkes, who did the work as a doctoral student at Northwestern University.
    “Our findings suggest that cities provide more opportunities for creative people in different fields to interact informally and exchange ideas, which can lead to more disruptive innovation.”
    Berkes conducted the study with Ruben Gaetani, assistant professor of strategic management at the University of Toronto. Their research was published online recently in The Economic Journal.


    Previous research had shown that large metropolitan areas are where patenting activity tends to concentrate, Berkes said, suggesting that population density is an important factor for innovation.
    But once Berkes and Gaetani started looking more closely at metro areas, they found that a sizable share of these patents was developed in the suburbs — the least densely populated part. Nearly three-quarters of patents came from places that had density below 3,650 people per square mile in 2000, about the density of Palo Alto, California.
    “If new technology is spurred by population density, we wanted to know why so much is happening in the least dense parts of the metro areas,” Berkes said.
    So Berkes and Gaetani analyzed more than 1 million U.S. patents granted between January 2002 and August 2014. They used finely geolocated data from the U.S. Patent and Trademark Office that allowed them to see exactly where in metro areas — including city centers and specific suburbs — patented discoveries were made.
    But they were also interested in determining the type of innovations produced — whether they would be considered conventional or unconventional. They did this by analyzing the previous work on which each patent was based.


    The researchers tagged new patents as unconventional if the inventors cited previous work in widely different areas.
    For example, a patent from 2000 developed in Pittsburgh is one of the first recorded inventions in wearable technologies and one of the precursors to products such as Fitbit. It was recognized as unconventional because it cites previous patents in both apparel and electrical equipment — two very distant fields.
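A minimal sketch of such a tagging rule (the class names and the list of rarely combined pairs are invented for illustration; the actual study infers how distant two fields are from observed citation patterns):

```python
# Pairs of technology classes treated as "rarely combined" for this toy
# example -- e.g. the apparel + electrical-equipment combination behind
# early wearables, or acoustics + information-storage behind the iPod.
RARELY_COMBINED = {
    frozenset({"apparel", "electrical-equipment"}),
    frozenset({"acoustics", "information-storage"}),
}

def is_unconventional(cited_classes):
    """True if any pair of technology classes cited by a patent is a
    rarely combined (i.e. very distant) pair."""
    classes = sorted(set(cited_classes))
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            if frozenset({classes[i], classes[j]}) in RARELY_COMBINED:
                return True
    return False
```

Under this rule the Pittsburgh wearables patent, citing both apparel and electrical equipment, is tagged unconventional, while a patent citing only nearby fields is not.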
    After analyzing the data, the researchers found that both urban and suburban areas played a prominent role in the innovation process, but in different ways, Berkes said.
    Large innovative companies, such as IBM or Microsoft, tend to perform their research in large office parks located outside the main city centers.
    “These companies are very successful in taking advantage of formal channels of knowledge diffusion, such as meetings or conferences, where they can capitalize on the expertise of their scientists and have them work together on specialized projects for the company,” Berkes said.
    “But it is more difficult for them to tap ideas from other scientific fields because this demands interactions with inventors they’re not communicating with every day or running into in the cafeteria or in the hallway.”
    That’s where the urban cores excelled. In cities like San Francisco and Boston, researchers may meet people in entirely different fields at bars, restaurants, museums and cultural events. Any chance encounter could lead to productive partnerships, he said.
    “If you want to create something truly new and disruptive, it helps if you have opportunities to casually bump into people from other scientific fields and exchange ideas and experiences and knowledge. That’s what happens in cities,” he said.
    “Density plays an important role in the type, rather than the amount, of innovation.”
    These findings show the potential value of tech parks that gather technology startup companies in a variety of fields in one place, Berkes said. But they have to be set up properly.
    “Our research suggests that informal interactions are important. Tech parks should be structured in a way that people from different startups can easily interact with each other on a regular basis and share ideas,” he said.

  • AI could expand healing with bioscaffolds

    A dose of artificial intelligence can speed the development of 3D-printed bioscaffolds that help injuries heal, according to researchers at Rice University.
    A team led by computer scientist Lydia Kavraki of Rice’s Brown School of Engineering used a machine learning approach to predict the quality of scaffold materials, given the printing parameters. The work also found that controlling print speed is critical in making high-quality implants.
    Bioscaffolds developed by co-author and Rice bioengineer Antonios Mikos are bonelike structures that serve as placeholders for injured tissue. They are porous to support the growth of cells and blood vessels that turn into new tissue and ultimately replace the implant.
    Mikos has been developing bioscaffolds, largely in concert with the Center for Engineering Complex Tissues, to improve techniques to heal craniofacial and musculoskeletal wounds. That work has progressed to include sophisticated 3D printing that can make a biocompatible implant custom-fit to the site of a wound.
    That doesn’t mean there isn’t room for improvement. With the help of machine learning techniques, designing materials and developing processes to create implants can be faster and eliminate much trial and error.
    “We were able to give feedback on which parameters are most likely to affect the quality of printing, so when they continue their experimentation, they can focus on some parameters and ignore the others,” said Kavraki, an authority on robotics, artificial intelligence and biomedicine and director of Rice’s Ken Kennedy Institute.


    The team reported its results in Tissue Engineering Part A.
    The study identified print speed as the most important of five metrics the team measured; the others, in descending order of importance, were material composition, pressure, layering and spacing.
    Mikos and his students had already considered bringing machine learning into the mix. The COVID-19 pandemic created a unique opportunity to pursue the project.
    “This was a way to make great progress while many students and faculty were unable to get to the lab,” Mikos said.
    Kavraki said the researchers — graduate students Anja Conev and Eleni Litsa in her lab and graduate student Marissa Perez and postdoctoral fellow Mani Diba in the Mikos lab, all co-authors of the paper — took time at the start to establish an approach to a mass of data from a 2016 study on printing scaffolds with biodegradable poly(propylene fumarate), and then to figure out what more was needed to train the computer models.


    “The students had to figure out how to talk to each other, and once they did, it was amazing how quickly they progressed,” Kavraki said.
    From start to finish, the COVID-19 window let them assemble data, develop models and get the results published within seven months, record time for a process that can often take years.
    The team explored two modeling approaches. One was a classification method that predicted whether a given set of parameters would produce a “low” or “high” quality scaffold. The other was a regression-based approach that estimated the values of the print-quality metrics directly. Kavraki said both relied upon a “classical supervised learning technique” called random forest, which builds multiple “decision trees” and “merges” them together to get a more accurate and stable prediction.
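The random-forest idea described above can be sketched in a few lines, here using depth-1 trees (stumps) for brevity and entirely synthetic print parameters; this is a conceptual illustration, not the team's actual model or data:

```python
import random

def majority(labels):
    """Most common label in a non-empty list."""
    return max(set(labels), key=labels.count)

def train_stump(X, y):
    """Depth-1 decision tree: pick the (feature, threshold) split with the
    fewest misclassifications, labelling each side by its majority class."""
    best = None  # (error, feature, threshold, left_label, right_label)
    for f in range(len(X[0])):
        values = sorted(set(row[f] for row in X))
        for lo, hi in zip(values, values[1:]):
            t = (lo + hi) / 2.0
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            ll, rl = majority(left), majority(right)
            err = sum(lab != ll for lab in left) + sum(lab != rl for lab in right)
            if best is None or err < best[0]:
                best = (err, f, t, ll, rl)
    return best

def predict_stump(stump, row):
    _, f, t, ll, rl = stump
    return ll if row[f] <= t else rl

def random_forest(X, y, n_trees=15, seed=0):
    """Bootstrap the training rows and fit one tree per resample."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def predict(forest, row):
    """'Merge' the trees: majority vote over the ensemble."""
    return majority([predict_stump(s, row) for s in forest])

# Synthetic "printing runs": [print_speed, pressure], with quality driven
# by speed alone (1 = "high" quality when speed < 5). Entirely made up.
rng = random.Random(1)
X = [[rng.uniform(1.0, 9.0), rng.uniform(1.0, 9.0)] for _ in range(40)]
y = [1 if row[0] < 5.0 else 0 for row in X]
forest = random_forest(X, y)
```

Because quality depends only on speed in this toy data, the ensemble's votes converge on the speed threshold, mirroring the study's finding that print speed dominated the other parameters.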
    Ultimately, the collaboration could lead to better ways to quickly print a customized jawbone, kneecap or bit of cartilage on demand.
    “A hugely important aspect is the potential to discover new things,” Mikos said. “This line of research gives us not only the ability to optimize a system for which we have a number of variables — which is very important — but also the possibility to discover something totally new and unexpected. In my opinion, that’s the real beauty of this work.
    “It’s a great example of convergence,” he said. “We have a lot to learn from advances in computer science and artificial intelligence, and this study is a perfect example of how they will help us become more efficient.”
    “In the long run, labs should be able to understand which of their materials can give them different kinds of printed scaffolds, and in the very long run, even predict results for materials they have not tried,” Kavraki said. “We don’t have enough data to do that right now, but at some point we think we should be able to generate such models.”
    Kavraki noted The Welch Institute, recently established at Rice to enhance the university’s already stellar reputation for advanced materials science, has great potential to expand such collaborations.
    “Artificial intelligence has a role to play in new materials, so what the institute offers should be of interest to people on this campus,” she said. “There are so many problems at the intersection of materials science and computing, and the more people we can get to work on them, the better.”

  • New composite material revs up pursuit of advanced electric vehicles

    Scientists at Oak Ridge National Laboratory used new techniques to create a composite that increases the electrical current capacity of copper wires, providing a new material that can be scaled for use in ultra-efficient, power-dense electric vehicle traction motors.
    The research is aimed at reducing barriers to wider electric vehicle adoption, including cutting the cost of ownership and improving the performance and life of components such as electric motors and power electronics. The material can be deployed in any component that uses copper, including more efficient bus bars and smaller connectors for electric vehicle traction inverters, as well as for applications such as wireless and wired charging systems.
    To produce a lighter weight conductive material with improved performance, ORNL researchers deposited and aligned carbon nanotubes on flat copper substrates, resulting in a metal-matrix composite material with better current handling capacity and mechanical properties than copper alone.
    Incorporating carbon nanotubes, or CNTs, into a copper matrix to improve conductivity and mechanical performance is not a new idea. CNTs are an excellent choice due to their lighter weight, extraordinary strength and conductive properties. But past attempts at composites by other researchers have resulted in very short material lengths, only micrometers or millimeters, along with limited scalability, or in longer lengths that performed poorly.
    The ORNL team decided to experiment with depositing single-wall CNTs using electrospinning, a commercially viable method that creates fibers as a jet of liquid speeds through an electric field. The technique provides control over the structure and orientation of deposited materials, explained Kai Li, a postdoctoral researcher in ORNL’s Chemical Sciences Division. In this case, the process allowed scientists to successfully orient the CNTs in one general direction to facilitate enhanced flow of electricity.
    The team then used magnetron sputtering, a vacuum coating technique, to add thin layers of copper film on top of the CNT-coated copper tapes. The coated samples were then annealed in a vacuum furnace to produce a highly conductive Cu-CNT network by forming a dense, uniform copper layer and to allow diffusion of copper into the CNT matrix.


    Using this method, ORNL scientists created a copper-carbon nanotube composite 10 centimeters long and 4 centimeters wide, with exceptional properties. The microstructural properties of the material were analyzed using instruments at the Center for Nanophase Materials Sciences at ORNL, a U.S. Department of Energy Office of Science user facility. Researchers found the composite reached 14% greater current capacity, with up to 20% improved mechanical properties compared with pure copper, as detailed in ACS Applied Nano Materials.
    Tolga Aytug, lead investigator for the project, said that “by embedding all the great properties of carbon nanotubes into a copper matrix, we are aiming for better mechanical strength, lighter weight and higher current capacity. Then you get a better conductor with less power loss, which in turn increases the efficiency and performance of the device. Improved performance, for instance, means we can reduce volume and increase the power density in advanced motor systems.”
    The work builds on a rich history of superconductivity research at ORNL, which has produced superior materials to conduct electricity with low resistance. The lab’s superconductive wire technology was licensed to several industry suppliers, enabling such uses as high-capacity electric transmission with minimal power losses.
    While the new composite breakthrough has direct implications for electric motors, it also could improve electrification in applications where efficiency, mass and size are a key metric, Aytug said. The improved performance characteristics, accomplished with commercially viable techniques, means new possibilities for designing advanced conductors for a broad range of electrical systems and industrial applications, he said.
    The ORNL team also is exploring the use of double-wall CNTs and other deposition techniques such as ultrasonic spray coating coupled with a roll-to-roll system to produce samples of some 1 meter in length.
    “Electric motors are basically a combination of metals — steel laminations and copper windings,” noted Burak Ozpineci, manager of the ORNL Electric Drive Technologies Program and leader of the Power Electronics and Electric Machinery group. “To meet DOE’s Vehicle Technologies Office’s 2025 electric vehicle targets and goals, we need to increase power density of the electric drive and reduce the volume of motors by 8 times, and that means improving material properties.”
    Other ORNL scientists on the project were Michael McGuire, Andrew Lupini, Lydia Skolrood, Fred List and Soydan Ozcan. The work was funded by DOE’s Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office.

  • in

    Extra stability for magnetic knots

    Tiny magnetic whirls that can occur in materials, so-called skyrmions, hold great promise for novel electronic devices or magnetic memories in which they would serve as bits to store information. A fundamental prerequisite for any application is the stability of these magnetic whirls. A research team at the Institute of Theoretical Physics and Astrophysics of Kiel University has now demonstrated that previously neglected magnetic interactions can play a key role in skyrmion stability and can drastically enhance skyrmion lifetimes. Their work, published today (September 21, 2020) in Nature Communications, also opens the prospect of stabilizing skyrmions in new material systems in which the previously considered mechanisms are not sufficient.
    Intensive research on stability at room temperature
    Their unique magnetic structure, more precisely their topology, lends stability to skyrmions and protects them from collapse; skyrmions are therefore often described as knots in the magnetization. On the atomic lattice of a solid, however, this protection is imperfect, and only a finite energy barrier separates a skyrmion from collapse. “The situation is comparable to a marble lying in a trough, which thus needs a certain impetus, energy, to escape from it. The larger the energy barrier, the higher the temperature at which the skyrmion is stable,” explains Professor Stefan Heinze from Kiel University. Skyrmions with diameters below 10 nanometers, which are needed for future spintronic devices, have so far only been detected at very low temperatures. Since applications typically run at room temperature, enhancing the energy barrier is a key objective in today’s research on skyrmions.
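The barrier picture described above corresponds to a thermally activated (Arrhenius-type) escape: the lifetime grows exponentially with the ratio of barrier height to thermal energy. The sketch below illustrates this scaling; the barrier heights and the attempt time are illustrative placeholders, not values from the Kiel study.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def skyrmion_lifetime(barrier_ev, temperature_k, attempt_time_s=1e-10):
    """Arrhenius estimate: tau = tau_0 * exp(dE / (k_B * T)).

    barrier_ev and attempt_time_s are hypothetical values chosen
    only to illustrate the exponential scaling.
    """
    return attempt_time_s * math.exp(barrier_ev / (K_B_EV * temperature_k))

# A modestly larger barrier extends the lifetime exponentially at fixed T:
t_small = skyrmion_lifetime(0.1, 300)  # ~0.1 eV barrier, room temperature
t_large = skyrmion_lifetime(0.2, 300)  # doubled barrier, same temperature
```

Because the barrier enters through an exponential, even a moderate increase in barrier height (for instance from additional interaction terms) can raise the lifetime by many orders of magnitude, which is why the mechanism reported here matters for room-temperature operation.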
    Previously, a standard model of the relevant magnetic interactions contributing to the barrier had been established. A team of theoretical physicists from the research group of Professor Stefan Heinze has now demonstrated that one type of magnetic interaction has so far been overlooked. In the 1920s, Werner Heisenberg explained the occurrence of ferromagnetism by the quantum mechanical exchange interaction, which results from the spin-dependent “hopping” of electrons between two atoms. “If one considers the electron hopping between more atoms, higher-order exchange interactions occur,” says Dr. Souvik Paul, first author of the study. However, these interactions are much weaker than the pair-wise exchange proposed by Heisenberg and were therefore neglected in research on skyrmions.
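As an illustrative sketch (not the authors' exact Hamiltonian), the pair-wise Heisenberg exchange and a four-site ring-exchange term of the kind described here are commonly written as:

```latex
% Pair-wise Heisenberg exchange between spins on sites i and j
H_{\text{pair}} = -\sum_{\langle ij \rangle} J_{ij}\,
    \mathbf{S}_i \cdot \mathbf{S}_j

% Four-spin (ring) exchange from cyclic electron hopping over sites i,j,k,l
H_{\text{4-spin}} = -\sum_{\langle ijkl \rangle} K_{ijkl}
  \left[ (\mathbf{S}_i \cdot \mathbf{S}_j)(\mathbf{S}_k \cdot \mathbf{S}_l)
       + (\mathbf{S}_i \cdot \mathbf{S}_l)(\mathbf{S}_j \cdot \mathbf{S}_k)
       - (\mathbf{S}_i \cdot \mathbf{S}_k)(\mathbf{S}_j \cdot \mathbf{S}_l) \right]
```

The higher-order couplings $K_{ijkl}$ are typically much smaller than the pair-wise constants $J_{ij}$, which is why such terms were long neglected even though, as the study shows, they can contribute substantially to the barrier.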
    Weak higher-order exchange interactions stabilize skyrmions
    Based on atomistic simulations and quantum mechanical calculations performed on the supercomputers of the North-German Supercomputing Alliance (HLRN), the scientists from Kiel have now shown that these weak interactions can nevertheless provide a surprisingly large contribution to skyrmion stability. In particular, the cyclic hopping of electrons over four atomic sites very strongly influences the energy of the transition state, in which only a few atomic bar magnets are tilted against each other. The simulations even revealed stable antiskyrmions, which are advantageous for some future data-storage concepts but typically decay too quickly.
    Higher-order exchange interactions appear in many magnetic materials used for potential skyrmion applications, such as cobalt or iron. They can also stabilize skyrmions in magnetic structures in which the previously considered magnetic interactions are absent or too weak. The present study therefore opens promising new routes for research on these fascinating magnetic knots.

    Story Source:
    Materials provided by Kiel University. Note: Content may be edited for style and length.