More stories

  • Researchers gain deeper understanding of mechanism behind superconductors

    Physicists at Leipzig University have once again gained a deeper understanding of the mechanism behind superconductors. This brings the research group led by Professor Jürgen Haase one step closer to its goal of laying the foundations for a theory of superconductors, materials that allow current to flow without resistance and without energy loss. The researchers found that in superconducting copper-oxide compounds, known as cuprates, there must be a very specific charge distribution between the copper and the oxygen, even under pressure.
    This confirmed the team’s own findings from 2016, when Haase and his colleagues developed an experimental method based on magnetic resonance that can measure changes in the structure of materials that are relevant to superconductivity. They were the first team in the world to identify a measurable material parameter that predicts the maximum possible transition temperature, a condition required to achieve superconductivity at room temperature. Now they have discovered that cuprates whose superconductivity is enhanced under pressure follow the charge distribution predicted in 2016. The researchers have published their new findings in the journal PNAS.
    “The fact that the transition temperature of cuprates can be enhanced under pressure has puzzled researchers for 30 years. But until now we didn’t know which mechanism was responsible for this,” Haase said. He and his colleagues at the Felix Bloch Institute for Solid State Physics have now come a great deal closer to understanding the actual mechanism in these materials. “At Leipzig University — with support from the Graduate School Building with Molecules and Nano-objects (BuildMoNa) — we have established the basic conditions needed to research cuprates using nuclear resonance, and Michael Jurkutat was the first doctoral researcher to join us. Together, we established the Leipzig Relation, which says that you have to take electrons away from the oxygen in these materials and give them to the copper in order to increase the transition temperature. You can do this with chemistry, but also with pressure. But hardly anyone would have thought that we could measure all of this with nuclear resonance,” Haase said.
    Their current finding could be exactly what is needed to produce a superconductor at room temperature, which has been the dream of many physicists for decades and which, according to Haase, is now expected to take only a few more years. To date, superconductivity has only been possible at very low temperatures of around minus 150 degrees Celsius and below, which do not occur naturally anywhere on Earth. About a year ago, a Canadian research group verified the 2016 findings of Professor Haase’s team using newly developed, computer-aided calculations, substantiating them theoretically.
    Superconductivity is already used today in a variety of ways, for example, in magnets for MRI machines and in nuclear fusion reactors. But these applications would be much easier and less expensive if superconductors operated at room temperature. The phenomenon of superconductivity was discovered in metals as early as 1911, but even Albert Einstein did not attempt to come up with an explanation at the time. Nearly half a century passed before BCS theory, named for physicists Bardeen, Cooper and Schrieffer, provided an understanding of superconductivity in metals in 1957. In 1986, the discovery of superconductivity in ceramic materials (cuprate superconductors) at much higher temperatures by physicists Georg Bednorz and Karl Alexander Müller raised new questions, but also raised hopes that superconductivity could be achieved at room temperature.

  • Researchers use AI to triage patients with chest pain

    Artificial intelligence (AI) may help improve care for patients who show up at the hospital with acute chest pain, according to a study published in Radiology, a journal of the Radiological Society of North America (RSNA).
    “To the best of our knowledge, our deep learning AI model is the first to utilize chest X-rays to identify individuals among acute chest pain patients who need immediate medical attention,” said the study’s lead author, Márton Kolossváry, M.D., Ph.D., radiology research fellow at Massachusetts General Hospital (MGH) in Boston.
    Acute chest pain syndrome may consist of tightness, burning or other discomfort in the chest, or severe pain that spreads to the back, neck, shoulders, arms or jaw. It may be accompanied by shortness of breath.
    Acute chest pain syndrome accounts for over 7 million emergency department visits annually in the United States, making it one of the most common complaints.
    Fewer than 8% of these patients are diagnosed with one of the three major cardiovascular causes of acute chest pain syndrome: acute coronary syndrome, pulmonary embolism or aortic dissection. However, the life-threatening nature of these conditions and the low specificity of clinical tests, such as electrocardiograms and blood tests, lead to substantial use of cardiovascular and pulmonary diagnostic imaging, often with negative results. As emergency departments struggle with high patient volumes and a shortage of hospital beds, effectively triaging patients at very low risk of these serious conditions is important.
    Deep learning is an advanced type of artificial intelligence (AI) that can be trained to search X-ray images to find patterns associated with disease.

    For the study, Dr. Kolossváry and colleagues developed an open-source deep learning model to identify patients with acute chest pain syndrome who were at risk of acute coronary syndrome, pulmonary embolism, aortic dissection or all-cause mortality within 30 days, based on a chest X-ray.
    The study used electronic health records of patients presenting with acute chest pain syndrome who had a chest X-ray and additional cardiovascular or pulmonary imaging and/or stress tests at MGH or Brigham and Women’s Hospital in Boston between January 2005 and December 2015. For the study, 5,750 patients (mean age 59, including 3,329 men) were evaluated.
    The deep-learning model was trained on 23,005 patients from MGH to predict a 30-day composite endpoint of acute coronary syndrome, pulmonary embolism or aortic dissection and all-cause mortality based on chest X-ray images.
    The deep-learning tool significantly improved prediction of these adverse outcomes beyond age, sex and conventional clinical markers, such as d-dimer blood tests. The model maintained its diagnostic accuracy across age, sex, ethnicity and race. Using a 99% sensitivity threshold, the model was able to defer additional testing in 14% of patients as compared to 2% when using a model only incorporating age, sex, and biomarker data.
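    As a rough sketch of how a fixed-sensitivity cutoff translates into deferrals (synthetic data and a hypothetical helper function, not the study’s code):

        import numpy as np

        def threshold_at_sensitivity(y_true, y_score, target=0.99):
            # Highest risk-score cutoff that still flags `target` of true positives.
            pos = np.sort(y_score[y_true == 1])
            k = int(np.floor((1 - target) * len(pos)))
            return pos[k]

        # Synthetic stand-ins for validation labels and model risk scores.
        rng = np.random.default_rng(0)
        y_true = rng.integers(0, 2, size=5000)
        y_score = np.clip(0.35 * y_true + rng.normal(0.4, 0.15, size=5000), 0, 1)

        cutoff = threshold_at_sensitivity(y_true, y_score)
        # Patients scoring below the cutoff are the candidates for deferring
        # further cardiovascular or pulmonary imaging.
        print(f"cutoff={cutoff:.3f}, share deferred={np.mean(y_score < cutoff):.1%}")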
    “Analyzing the initial chest X-ray of these patients using our automated deep learning model, we were able to provide more accurate predictions regarding patient outcomes as compared to a model that uses age, sex, troponin or d-dimer information,” Dr. Kolossváry said. “Our results show that chest X-rays could be used to help triage chest pain patients in the emergency department.”
    According to Dr. Kolossváry, such an automated model could one day analyze chest X-rays in the background, helping to select the patients who would benefit most from immediate medical attention and to identify those who can be discharged safely from the emergency department.
    The paper is titled “Deep Learning Analysis of Chest Radiographs to Triage Patients with Acute Chest Pain Syndrome.” Collaborating with Dr. Kolossváry were Vineet K. Raghu, Ph.D., John T. Nagurney, M.D., Udo Hoffmann, M.D., M.P.H., and Michael T. Lu, M.D., M.P.H.

  • Novel framework provides 'measuring stick' for assessing patient matching tools

    Accurate linking of an individual’s medical records from disparate sources within and between health systems, known as patient matching, plays a critical role in patient safety and quality of care, but has proven difficult to accomplish in the United States, the last developed country without a unique patient identifier. In the U.S., linking patient data is dependent on algorithms designed by researchers, vendors and others. Research scientists led by Regenstrief Institute Vice President for Data and Analytics Shaun Grannis, M.D., M.S., have developed an eight-point framework for evaluating the validity and performance of algorithms to match medical records to the correct patient.
    “The value of data standardization is well recognized. There are national healthcare provider IDs. There are facility IDs and object identifiers. There are billing codes. There are standard vocabularies for healthcare lab test results and medical observations — such as LOINC® here at Regenstrief. Patient identity is the last gaping hole in our health infrastructure,” said Dr. Grannis. “We are providing a framework to evaluate patient matching algorithms for accuracy.
    “We recognize that the need for patient matching is not going away and that we need standardized methods to uniquely identify patients,” said Dr. Grannis. “Current patient matching algorithms come in many different flavors, shapes and sizes. To be able to compare how one performs against the other, or even to understand how they might interact together, we have to have a standard way of assessment. We have produced a novel, robust framework for consistent and reproducible evaluation. Simply put, the framework we’ve developed at Regenstrief provides a ‘measuring stick’ for the effectiveness of patient matching tools.”
    Individuals increasingly receive care from multiple sources. While patient matching is complex, it is crucial to health information exchange. Is the William Jones seen at one healthcare system the same person as the William, Will or Willy Jones or perhaps Bill or Billy Jones receiving care at other facilities? Does Elizabeth Smith’s name appear at different medical offices or perhaps at a physical therapy or a dialysis facility as Liz or Beth? To which Juan J. Gomez do various lab test results belong? Typos, missing information and other data errors as well as typical variations add to the complexity.
    The framework’s eight-point approach to the creation of gold standard matching data sets necessary for record linkage encompasses technical areas including data preprocessing, blocking, record adjudication, linkage evaluation and reviewer characteristics. The authors note that the framework “can help record linkage method developers provide necessary transparency when creating and validating gold standard reference matching data sets. In turn, this transparency will support both the internal and external validity of record linkage studies and improve the robustness of new record linkage strategies.”
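    Two of those technical areas, blocking and record adjudication, can be pictured with a toy example (made-up records and a tiny nickname table; a hedged sketch, not Regenstrief’s algorithm):

        from difflib import SequenceMatcher

        records_a = [{"id": 1, "first": "William", "last": "Jones", "dob": "1970-02-01"}]
        records_b = [{"id": 9, "first": "Bill", "last": "Jones", "dob": "1970-02-01"}]

        NICKNAMES = {"bill": "william", "billy": "william", "will": "william",
                     "liz": "elizabeth", "beth": "elizabeth"}  # illustrative sample

        def normalize(name):
            n = name.strip().lower()
            return NICKNAMES.get(n, n)

        def block_key(rec):
            # Blocking: only compare records that share a cheap key (last name
            # plus date of birth), avoiding a full pairwise comparison.
            return (rec["last"].lower(), rec["dob"])

        for ra in records_a:
            for rb in records_b:
                if block_key(ra) != block_key(rb):
                    continue
                score = SequenceMatcher(None, normalize(ra["first"]),
                                        normalize(rb["first"])).ratio()
                # Adjudication: high scores are declared matches; borderline
                # pairs would go to human review in a real pipeline.
                verdict = "match" if score > 0.8 else "review"
                print(ra["id"], rb["id"], f"score={score:.2f}", verdict)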
    Measures and standards are ubiquitous. “When you go to a gas station pump, the measure of how much gas goes through is standardized so that we know exactly how much is flowing. Similarly, we need to have a common way of measuring and understanding how algorithms for patient matching work,” said Dr. Grannis. “Our eight-pronged approach helps to cover the waterfront of what needs to be evaluated. Laying out the framework and specifying the tasks and activities that need to be completed goes a long way toward standardizing patient matching.”
    In addition to playing a critical role in patient safety and quality of care, improved patient matching accuracy supports more cost-effective healthcare delivery in a variety of ways, including reducing the number of duplicate medical tests.

  • These chemists cracked the code to long-lasting Roman concrete

    MIT chemist Admir Masic really hoped his experiment wouldn’t explode.

    Masic and his colleagues were trying to re-create an ancient Roman technique for making concrete, a mix of cement, gravel, sand and water. The researchers suspected that the key was a process called “hot mixing,” in which dry granules of calcium oxide, also called quicklime, are mixed with volcanic ash to make the cement. Then water is added.

    Hot mixing, they thought, would ultimately produce a cement that wasn’t completely smooth and mixed, but instead contained small calcium-rich rocks. Those little rocks, ubiquitous in the walls of the Romans’ concrete buildings, might be the key to why those structures have withstood the ravages of time.

    That’s not how modern cement is made. The reaction of quicklime with water is highly exothermic, meaning that it can produce a lot of heat — and possibly an explosion.
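    Back-of-envelope chemistry shows why (textbook values, not figures from the study):

        # Hydration of quicklime: CaO + H2O -> Ca(OH)2 releases roughly 65 kJ/mol.
        dh_kj_per_mol = 65.0     # approximate heat of hydration (assumed value)
        molar_mass_cao = 56.08   # g/mol

        heat_mj_per_kg = dh_kj_per_mol / molar_mass_cao  # kJ/g equals MJ/kg
        print(f"~{heat_mj_per_kg:.1f} MJ of heat per kg of quicklime")
        # About 1.2 MJ/kg: enough to flash nearby water to steam, hence the
        # worry about an explosion.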

    “Everyone would say, ‘You are crazy,’” Masic says.

    But no big bang happened. Instead, the reaction produced only heat, a damp sigh of water vapor — and a Roman-like cement mixture bearing small white calcium-rich rocks.

    Researchers have been trying for decades to re-create the Roman recipe for concrete longevity — but with little success. The idea that hot mixing was the key was an educated guess.

    Masic and colleagues had pored over texts by Roman architect Vitruvius and historian Pliny, which offered some clues as to how to proceed. These texts cited, for example, strict specifications for the raw materials, such as that the limestone that is the source of the quicklime must be very pure, and that mixing quicklime with hot ash and then adding water could produce a lot of heat.

    The rocks were not mentioned, but the team had a feeling they were important.

    “In every sample we have seen of ancient Roman concrete, you can find these white inclusions,” Masic says, referring to bits of rock embedded in the walls. For many years, he says, the origin of those inclusions was unclear — researchers suspected incomplete mixing of the cement, perhaps. But these are the highly organized Romans we’re talking about. How likely is it that “every operator [was] not mixing properly and every single [building] has a flaw?”

    What if, the team suggested, these inclusions in the cement were actually a feature, not a bug? The researchers’ chemical analyses of such rocks embedded in the walls at the archaeological site of Privernum in Italy indicated that the inclusions were very calcium-rich.

    That suggested the tantalizing possibility that these rocks might be helping the buildings heal themselves from cracks due to weathering or even an earthquake. A ready supply of calcium was already on hand: It would dissolve, seep into the cracks and re-crystallize. Voila! Scar healed.

    But could the team observe this in action? Step one was to re-create the rocks via hot mixing and hope nothing exploded. Step two: Test the Roman-inspired cement. The team created concrete with and without the hot mixing process and tested them side by side. Each block of concrete was broken in half, the pieces placed a small distance apart. Then water was trickled through the crack to see how long it took before the seepage stopped.

    “The results were stunning,” Masic says. The blocks incorporating hot-mixed cement healed within two to three weeks. The concrete produced without hot-mixed cement never healed at all, the team reports January 6 in Science Advances.

    Cracking the recipe could be a boon to the planet. The Pantheon and its soaring, detailed concrete dome have stood nearly 2,000 years, for instance, while modern concrete structures have a lifespan of perhaps 150 years, and that’s a best case scenario (SN: 2/10/12). And the Romans didn’t have steel reinforcement bars shoring up their structures.

    More frequent replacement of concrete structures means more greenhouse gas emissions. Concrete manufacturing is a huge source of atmospheric carbon dioxide, so longer-lasting versions could reduce that carbon footprint. “We make 4 gigatons per year of this material,” Masic says. That manufacture produces as much as 1 metric ton of CO2 per metric ton of concrete produced, currently amounting to about 8 percent of annual global CO2 emissions.
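    Taken at face value, those figures hang together, as a quick back-of-envelope check shows (the global total and the mid-range emission factor below are assumptions, not numbers from the article):

        concrete_gt_per_year = 4.0   # "4 gigatons per year"
        co2_per_ton = 0.75           # "as much as 1" t CO2 per t; mid-range guess
        global_co2_gt = 37.0         # assumed annual global CO2 emissions, in Gt

        share = concrete_gt_per_year * co2_per_ton / global_co2_gt
        print(f"{share:.0%}")        # ~8%, in line with the share quoted above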

    Still, Masic says, the concrete industry is resistant to change. For one thing, there are concerns about introducing new chemistry into a tried-and-true mixture with well-known mechanical properties. But “the key bottleneck in the industry is the cost,” he says. Concrete is cheap, and companies don’t want to price themselves out of competition.

    The researchers hope that reintroducing this technique that has stood the test of time, and that could involve little added cost to manufacture, could answer both these concerns. In fact, they’re banking on it: Masic and several of his colleagues have created a startup they call DMAT that is currently seeking seed money to begin to commercially produce the Roman-inspired hot-mixed concrete. “It’s very appealing simply because it’s a thousands-of-years-old material.”

  • Cyclones in the Arctic are becoming more intense and frequent

    CHICAGO – In January 2022, a cyclone blitzed a large expanse of ice-covered ocean between Greenland and Russia. Frenzied gusts whipped up 8-meter-tall waves that pounded the region’s hapless flotillas of sea ice, while a bombardment of warm rain and a surge of southerly heat laid siege from the air.

    Six days after the assault began, about a quarter, or roughly 400,000 square kilometers, of the vast area’s sea ice had disappeared, leading to a record weekly loss for the region.

    The storm is the strongest Arctic cyclone ever documented. But it may not hold that title for long. Cyclones in the Arctic have become more frequent and intense in recent decades, posing risks to both sea ice and people, researchers reported December 13 at the American Geophysical Union’s fall meeting. “This trend is expected to persist as the region continues to warm rapidly in the future,” says climate scientist Stephen Vavrus of the University of Wisconsin–Madison.

    Rapid Arctic warming and more destructive storms

    The Arctic Circle is warming about four times as fast as the rest of Earth (SN: 8/11/22). A major driver is the loss of sea ice due to human-caused climate change. The floating ice reflects far more solar radiation back into space than open water does, influencing the global climate (SN: 10/14/21). During August, the heart of the sea ice melting season, cyclones have been observed to amplify sea ice losses on average, exacerbating warming.

    There’s more: Just as hurricanes can ravage regions farther south, boreal vortices can threaten people living and traveling in the Arctic (SN: 12/11/19). As the storms intensify, “stronger winds pose a risk for marine navigation by generating higher waves,” Vavrus says, “and for coastal erosion, which has already become a serious problem throughout much of the Arctic and forced some communities to consider relocating inland.”

    Climate change is intensifying storms farther south (SN: 11/11/20). But it’s unclear how Arctic cyclones might be changing as the world warms. Some previous research suggested that pressures, on average, in Arctic cyclones’ cores have dropped in recent decades. That would be problematic, as lower pressures generally mean more intense storms, with “stronger winds, larger temperature variations and heavier rainfall [and] snowfall,” says atmospheric scientist Xiangdong Zhang of the University of Alaska Fairbanks.

    But inconsistencies between analyses had prevented a clear trend from emerging, Zhang said at the meeting. So he and his colleagues aggregated a comprehensive record, spanning 1950 to 2021, of Arctic cyclone timing, intensity and duration.

    Arctic cyclone activity has intensified in strength and frequency over recent decades, Zhang reported. Pressures in the hearts of today’s boreal vortices are on average about 9 millibars lower than in the 1950s. For context, such a pressure shift would be roughly equivalent to bumping a strong category 1 hurricane well into category 2 territory. And vortices became more frequent during winters in the North Atlantic Arctic and during summers in the Arctic north of Eurasia.

    What’s more, August cyclones appear to be damaging sea ice more than in the past, said meteorologist Peter Finocchio of the U.S. Naval Research Laboratory in Monterey, Calif. He and his colleagues compared the response of northern sea ice to summer cyclones during the 1990s and the 2010s.

    August vortices in the latter decade were followed by a 10 percent loss of sea ice area on average, up from the earlier decade’s 3 percent loss on average. This may be due partly to warmer water upwelling from below, which can melt the ice pack’s underbelly, and partly to winds pushing the thinner, easier-to-move ice around, Finocchio said.

    Stronger spring storms spell trouble too

    With climate change, cyclones may continue intensifying in the spring too, climate scientist Chelsea Parker said at the meeting. That’s a problem because spring vortices can prime sea ice for later summer melting.

    Parker, of NASA’s Goddard Space Flight Center in Greenbelt, Md., and her colleagues ran computer simulations of spring cyclone behavior in the Arctic under past, present and projected climate conditions. By the end of the century, the maximum near-surface wind speeds of spring cyclones — around 11 kilometers per hour today — could reach 60 km/h, the researchers found. And future spring cyclones may keep swirling at peak intensity for up to a quarter of their life spans, up from around 1 percent today. The storms will probably travel farther too, the team says.

    “The diminishing sea ice cover will enable the warmer Arctic seas to fuel these storms and probably allow them to penetrate farther into the Arctic,” says Vavrus, who was not involved in the research.

    Parker and her team plan to investigate the future evolution of Arctic cyclones in other seasons, to capture a broader picture of how climate change is affecting the storms.

    For now, it seems certain that Arctic cyclones aren’t going anywhere. What’s less clear is how humankind will contend with the storms’ growing fury.

  • New small laser device can help detect signs of life on other planets

    As space missions delve deeper into the outer solar system, the need for more compact, resource-conserving and accurate analytical tools has become increasingly critical — especially as the hunt for extraterrestrial life and habitable planets or moons continues.
    A University of Maryland-led team developed a new instrument specifically tailored to the needs of NASA space missions. Their mini laser-sourced analyzer is significantly smaller and more resource efficient than its predecessors — all without compromising the quality of its ability to analyze planetary material samples and potential biological activity onsite. The team’s paper on this new device was published in the journal Nature Astronomy on January 16, 2023.
    Weighing only about 17 pounds, the instrument is a physically scaled-down combination of two important tools for detecting signs of life and identifying compositions of materials: a pulsed ultraviolet laser that removes small amounts of material from a planetary sample and an Orbitrap™ analyzer that delivers high-resolution data about the chemistry of the examined materials.
    “The Orbitrap was originally built for commercial use,” explained Ricardo Arevalo, lead author of the paper and an associate professor of geology at UMD. “You can find them in the labs of pharmaceutical, medical and proteomic industries. The one in my own lab is just under 400 pounds, so they’re quite large, and it took us eight years to make a prototype that could be used efficiently in space — significantly smaller and less resource-intensive, but still capable of cutting-edge science.”
    The team’s new gadget shrinks down the original Orbitrap while pairing it with laser desorption mass spectrometry (LDMS) — techniques that have yet to be applied in an extraterrestrial planetary environment. The new device boasts the same benefits as its larger predecessors but is streamlined for space exploration and onsite planetary material analysis, according to Arevalo.
    Thanks to its diminutive mass and minimal power requirements, the mini Orbitrap LDMS instrument can be easily stowed away and maintained on space mission payloads. The instrument’s analyses of a planetary surface or substance are also far less intrusive and thus much less likely to contaminate or damage a sample than many current methods that attempt to identify unknown compounds.

    “The good thing about a laser source is that anything that can be ionized can be analyzed. If we shoot our laser beam at an ice sample, we should be able to characterize the composition of the ice and see biosignatures in it,” Arevalo said. “This tool has such a high mass resolution and accuracy that any molecular or chemical structures in a sample become much more identifiable.”
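    To get a feel for what such resolution means, here is a back-of-envelope using a resolving power typical of commercial Orbitrap analyzers (an assumed value, not a specification of the UMD instrument):

        resolving_power = 100_000  # R = m / delta_m (assumed typical value)
        mz = 500.0                 # an ion at mass-to-charge 500

        delta_m = mz / resolving_power
        print(f"peaks {delta_m:.3f} Da apart are separable at m/z {mz:.0f}")
        # ~0.005 Da: fine enough to distinguish ions that share the same
        # nominal mass but differ in elemental composition.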
    The laser component of the mini LDMS Orbitrap also allows researchers access to larger, more complex compounds that are more likely to be associated with biology. Smaller organic compounds like amino acids, for example, are more ambiguous signatures of life forms.
    “Amino acids can be produced abiotically, meaning that they’re not necessarily proof of life. Meteorites, many of which are chock full of amino acids, can crash onto a planet’s surface and deliver abiotic organics to the surface,” Arevalo said. “We know now that larger and more complex molecules, like proteins, are more likely to have been created by or associated with living systems. The laser lets us study larger and more complex organics that can reflect higher fidelity biosignatures than smaller, simpler compounds.”
    For Arevalo and his team, the mini LDMS Orbitrap will offer much-needed insight and flexibility for future ventures into the outer solar system, such as missions focused on life detection objectives (e.g., Enceladus Orbilander) and exploration of the lunar surface (e.g., the NASA Artemis Program). They hope to send their device into space and deploy it on a planetary target of interest within the next few years.
    “I view this prototype as a pathfinder for other future LDMS and Orbitrap-based instruments,” Arevalo said. “Our mini Orbitrap LDMS instrument has the potential to significantly enhance the way we currently study the geochemistry or astrobiology of a planetary surface.”
    Other UMD-affiliated researchers on the team include geology graduate students Lori Willhite and Ziqin “Grace” Ni, geology postdoctoral associates Anais Bardyn and Soumya Ray, and astronomy visiting associate research engineer Adrian Southard.
    This study was supported by NASA (Award Nos. 80NSSC19K0610, 80NSSC19K0768, 80GSFC21M0002), NASA Goddard Space Flight Center Internal Research and Development (IRAD) funding, and the University of Maryland Faculty Incentive Program.

  • Blocking radio waves and electromagnetic interference with the flip of a switch

    Researchers in Drexel University’s College of Engineering have developed a thin film device, fabricated by spray coating, that can block electromagnetic radiation with the flip of a switch. The breakthrough, enabled by versatile two-dimensional materials called MXenes, could adjust the performance of electronic devices, strengthen wireless connections and secure mobile communications against intrusion.
    The team, led by Yury Gogotsi, PhD, Distinguished University and Bach Professor in Drexel’s College of Engineering, previously demonstrated that the two-dimensional layered MXene materials, discovered just over a decade ago, can be turned into a potent active shield against electromagnetic waves when combined with an electrolyte solution. This latest MXene discovery, reported in Nature Nanotechnology, shows how this shielding can be tuned when a small voltage — less than that produced by an alkaline battery — is applied.
    “Dynamic control of electromagnetic wave jamming has been a significant technological challenge for protecting electronic devices working at gigahertz frequencies and a variety of other communications technologies,” Gogotsi said. “As the number of wireless devices being used in industrial and private sectors has increased by orders of magnitude over the past decade, the urgency of this challenge has grown accordingly. This is why our discovery — which would dynamically mitigate the effect of electromagnetic interference on these devices — could have a broad impact.”
    MXene is a unique material in that it is highly conductive — making it perfectly suited to reflecting microwave radiation that could cause static or feedback, or diminish the performance of communications devices — but its internal chemical structure can also be temporarily altered to allow these electromagnetic waves to pass through.
    This means that a thin coating on a device or its electrical components prevents them both from emitting electromagnetic waves and from being penetrated by waves emitted by other electronics. Eliminating the possibility of interference from both internal and external sources can ensure the performance of the device, but some waves must still be allowed to exit and enter when the device is being used for communication.
    “Without being able to control the ebb and flow of electromagnetic waves within and around a device, it’s a bit like a leaky faucet — you’re not really turning off the water and that constant dripping is no good,” Gogotsi said. “Our shielding ensures the plumbing is tight — so-to-speak — no electromagnetic radiation is leaking out or getting in until we want to use the device.”
    The key to eliciting bidirectional tunability of MXene’s shielding property is using the flow and expulsion of ions to alternately expand and compress the space between the material’s layers, like an accordion, and to change the surface chemistry of the MXenes.

    With a small voltage applied to the film, ions enter — or intercalate — between the MXene layers, altering the charge of their surfaces and inducing electrostatic attraction, which changes the layer spacing, the conductivity and the shielding efficiency of the material. When the ions are deintercalated, as the current is switched off, the MXene layers return to their original state.
    The team tested 10 different MXene-electrolyte combinations, applying each via paint sprayer in a layer about 30 to 100 times thinner than a human hair. The materials consistently demonstrated the dynamic tunability of shielding efficiency in blocking microwave radiation, which is impossible for traditional metals like copper and steel. And the device sustained the performance through more than 500 charge-discharge cycles.
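    Shielding efficiency here can be read as the standard shielding-effectiveness measure, which compares incident to transmitted power on a decibel scale (a generic sketch, not data from the paper):

        import math

        def shielding_effectiveness_db(p_incident, p_transmitted):
            # SE = 10 * log10(P_in / P_out); higher means less gets through.
            return 10 * math.log10(p_incident / p_transmitted)

        # A film passing 1/1000 of incident power shields 30 dB; switching its
        # state so that 1/10 passes drops it to 10 dB, i.e. near-transmission.
        print(shielding_effectiveness_db(1.0, 0.001))  # 30.0
        print(shielding_effectiveness_db(1.0, 0.1))    # 10.0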
    “These results indicate that the MXene films can convert from electromagnetic interference shielding to quasi-electromagnetic wave transmission by electrochemical oxidation of MXenes,” Gogotsi and his co-authors wrote. “The MXene film can potentially serve as a dynamic EMI shielding switch.”
    For security applications, Gogotsi suggests that the MXene shielding could hide devices from detection by radar or other tracing systems. The team also tested the potential of a one-way shielding switch. This would allow a device to remain undetectable and protected from unauthorized access until it is deployed for use.
    “A one-way switch could open the protection and allow a signal to be sent or communication to be opened in an emergency or at the required moment,” Gogotsi said. “This means it could protect communications equipment from being influenced or tampered with until it is in use. For example, it could encase the device during transportation or storage and then activate only when it is ready to be used.”
    The next step for Gogotsi’s team is to explore additional MXene-electrolyte combinations and mechanisms to fine-tune the shielding to achieve a stronger modulation of electromagnetic wave transmission and dynamic adjustment to block radiation at a variety of bandwidths.

  • COVID calculations spur solution to old problem in computer science

    During the COVID-19 pandemic, many of us became amateur mathematicians. How quickly would the number of hospitalized patients rise, and when would herd immunity be achieved? Professional mathematicians were challenged as well, and a researcher at the University of Copenhagen was inspired to solve a 30-year-old problem in computer science. The breakthrough has just been published in the Journal of the ACM (Association for Computing Machinery).
    “Like many others, I was out to calculate how the epidemic would develop. I wanted to investigate certain ideas from theoretical computer science in this context. However, I realized that the lack of a solution to the old problem was a showstopper,” says Joachim Kock, associate professor at the Department of Mathematics, University of Copenhagen.
    His solution to the problem can be of use in epidemiology and computer science, and potentially in other fields as well. A common feature for these fields is the presence of systems where the various components exhibit mutual influence. For instance, when a healthy person meets a person infected with COVID, the result can be two people infected.
    Smart method invented by German teenager
    To understand the breakthrough, one needs to know that such complex systems can be described mathematically through so-called Petri nets. The method was invented in 1939 by the German Carl Adam Petri (then just 13 years old) for chemistry applications. Just as a healthy person meeting a person infected with COVID can trigger a change, the same may happen when two chemical substances mix and react.
    In a Petri net the various components are drawn as circles while events such as a chemical reaction or an infection are drawn as squares. Next, circles and squares are connected by arrows which show the interdependencies in the system.

    A simple version of a Petri net for COVID infection. The starting point is a non-infected person. “S” denotes “susceptible.” Contact with an infected person (“I”) is an event which leads to two persons being infected. Later another event will happen, removing a person from the group of infected. Here, “R” denotes “recovered” which in this context could be either cured or dead. Either outcome would remove the person from the infected group.
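    The pictured net can be written down directly in code; below is a minimal token-based sketch of the general formalism (an illustration, not Kock’s construction):

        # Places hold individuals (tokens); the "infect" transition consumes one
        # susceptible and one infected token and produces two infected tokens.
        marking = {"S": 99, "I": 1, "R": 0}

        TRANSITIONS = {
            "infect":  {"in": {"S": 1, "I": 1}, "out": {"I": 2}},
            "recover": {"in": {"I": 1},         "out": {"R": 1}},
        }

        def fire(marking, name):
            t = TRANSITIONS[name]
            if any(marking[p] < n for p, n in t["in"].items()):
                raise ValueError(f"{name} is not enabled")
            for p, n in t["in"].items():
                marking[p] -= n
            for p, n in t["out"].items():
                marking[p] = marking.get(p, 0) + n

        fire(marking, "infect")   # S: 98, I: 2, R: 0
        fire(marking, "recover")  # S: 98, I: 1, R: 1
        print(marking)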
    Computer scientists regarded the problem as unsolvable
    In chemistry, Petri nets are applied to calculate how the concentrations of various chemical substances in a mixture will evolve. This manner of thinking has influenced the use of Petri nets in other fields such as epidemiology: we start out with a high “concentration” of uninfected people, after which the “concentration” of infected people begins to rise. In computer science, the use of Petri nets is somewhat different: the focus is on individuals rather than concentrations, and the development happens in discrete steps rather than continuously.
    What Joachim Kock had in mind was to apply the more individual-oriented Petri nets from computer science for COVID calculations. This was when he encountered the old problem:
    “Basically, the processes in a Petri net can be described through two separate approaches. The first approach regards a process as a series of events, while the second approach sees the net as a graphical expression of the interdependencies between components and events,” says Joachim Kock, adding:
    “The serial approach is well suited for performing calculations. However, it has a downside since it describes causalities less accurately than the graphical approach. Further, the serial approach tends to fall short when dealing with events that take place simultaneously.”

    “The problem was that nobody had been able to unify the two approaches. The computer scientists had more or less given up, regarding the problem as unsolvable. This was because no one had realized that you need to go all the way back and revise the very definition of a Petri net,” says Joachim Kock.
    Small modification with large impact
    The Danish mathematician realized that a minor modification to the definition of a Petri net would enable a solution to the problem:
    “By allowing parallel arrows rather than just counting them and writing a number, additional information is made available. Things work out and the two approaches can be unified.”
    The exact mathematical reason why this additional information matters is complex, but can be illustrated by an analogy:
    “Assigning numbers to objects has helped humanity greatly. For instance, it is highly practical that I can arrange the right number of chairs in advance for a dinner party instead of having to experiment with different combinations of chairs and guests after they have arrived. However, the number of chairs and guests does not reveal who will be sitting where. Some information is lost when we consider numbers instead of the real objects.”
    Similarly, information is lost when the individual arrows of the Petri net are replaced by a number.
    “It takes a bit more effort to treat the parallel arrows individually, but one is amply rewarded as it becomes possible to combine the two approaches so that the advantages of both can be obtained simultaneously.”
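    The representational difference can be made concrete with a toy encoding (hypothetical data structures, chosen only to illustrate the point):

        from collections import Counter

        # Conventional encoding: parallel arrows collapsed to multiplicity counts.
        counted = {("S", "infect"): 1, ("I", "infect"): 1, ("infect", "I"): 2}

        # Revised view: each arrow is an individual with its own identity, so
        # information the counts discard (which arrow is which) is retained.
        individual = [
            {"id": "a1", "src": "S",      "dst": "infect"},
            {"id": "a2", "src": "I",      "dst": "infect"},
            {"id": "a3", "src": "infect", "dst": "I"},
            {"id": "a4", "src": "infect", "dst": "I"},  # parallel to a3, yet distinct
        ]

        # Collapsing individuals to counts is easy; the reverse is impossible,
        # because the identities have been thrown away.
        assert Counter((a["src"], a["dst"]) for a in individual) == Counter(counted)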
    The circle back to COVID has been closed
    The solution helps our mathematical understanding of how to describe complex systems with many interdependencies, but will not have much practical effect on the daily work of computer scientists using Petri nets, according to Joachim Kock:
    “This is because the necessary modifications are mostly backward-compatible and can be applied without revising the entire Petri net theory.”
    “Somewhat surprisingly, some epidemiologists have started using the revised Petri nets. So, one might say the circle has been closed!”
    Joachim Kock does see a further point to the story:
    “I wasn’t out to find a solution to the old problem in computer science at all. I just wanted to do COVID calculations. This was a bit like looking for your pen but realizing that you must find your glasses first. So, I would like to take the opportunity to advocate the importance of research which does not have a predefined goal. Sometimes research driven by curiosity will lead to breakthroughs.”