More stories

  • Reducing data-transfer error in radiation therapy

    Just as helicopter traffic reporters use their “bird’s eye view” to route drivers around roadblocks safely, radiation oncologists treating a variety of cancers can use new guidelines developed by a West Virginia University researcher to reduce mistakes in data transfer and more safely treat their patients.
    Ramon Alfredo Siochi — the director of medical physics at WVU — led a task group to help ensure the accuracy of data that dictates a cancer patient’s radiation therapy. The measures he and his colleagues recommended in their new report safeguard against medical errors in a treatment that more than half of all cancer patients receive.
    “The most common mistake that happens in radiation oncology is the transfer of information from one system to another,” Siochi, the associate chair for the School of Medicine’s Department of Radiation Oncology, said. “This report gives you a good, bird’s-eye view of the way data is moving around in your department.”
    “How frequently do these accidents occur? I think one estimate I saw was that three out of every 100 patients might have an error, but it doesn’t necessarily harm them. Now, I don’t know what the incidence rate is of errors that are quote-unquote ‘near misses’ — when an error happens before it hits the patient — but I would imagine it is much higher.”
    Siochi recently chaired the Task Group of Quality Assurance on External Beam Treatment Data Transfer, a division of the American Association of Physicists in Medicine.
    The group was formed in response to news coverage of radiation overdoses caused by faulty data transfer.
    “In 2010, it was reported in the New York Times that a patient [in a New York City hospital] was overdosed with radiation because the data somehow didn’t transfer properly from one system to another,” Siochi said. “Long story short, the patient received a lethal dose of radiation to his head that went on for three days undetected. Now, that falls into the general class of many things happening that were not standard practice. But it could have been avoided.”
    Radiation therapy is used to treat a variety of cancers, including cancers of the lung, pancreas, prostate, breast, brain and bladder. Depending on a cancer’s type or stage, radiation may cure it, shrink it or stop it from coming back.
    But as the complexity of radiation therapy has grown — making it possible to target cancers that would once have been too difficult to treat — so too has the amount of data that goes into treatment machines. With more data comes more opportunity for errors.
    When Siochi started practicing radiation oncology physics — in the 1990s — this data evoked a tree-lined residential street more than the six-lane highway it brings to mind today.
    “It was very analog,” he said. “We’re talking maybe 20 parameters that you would need to check on a plan, and you would put it all on a paper chart. But I once did a calculation — to do an order of magnitude — and now we’re talking about 100,000 parameters. It’s just impossible for a human to check.”
    The group’s report — which earned the approval of AAPM and the Science Council — makes that volume of parameters less overwhelming. It explains how data is transferred among various systems used in radiation therapy, and it suggests ways that medical physicists can test the data’s integrity throughout the process, contributing to safer treatments.
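    To make the idea of a data-integrity check concrete, here is a minimal sketch of one way software could compare what a planning system exported with what a treatment machine loaded. It is a hypothetical illustration, not the task group's recommended procedure; the field names and values are invented.

        # Hypothetical cross-system consistency check; the parameter names and
        # values are invented for illustration, not taken from the AAPM report.
        plan_exported = {"beam_1_gantry_deg": 180.0, "beam_1_mu": 120.5, "energy_mv": 6}
        machine_loaded = {"beam_1_gantry_deg": 180.0, "beam_1_mu": 125.0, "energy_mv": 6}

        mismatches = {
            key: (plan_exported[key], machine_loaded.get(key))
            for key in plan_exported
            if machine_loaded.get(key) != plan_exported[key]
        }

        if mismatches:
            for key, (expected, found) in mismatches.items():
                print(f"CHECK FAILED: {key}: planned {expected}, loaded {found}")
        else:
            print("All transferred parameters match the plan.")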
    Story Source:
    Materials provided by West Virginia University. Note: Content may be edited for style and length.

  • Hurricanes may not be becoming more frequent, but they’re still more dangerous

    Climate change is helping Atlantic hurricanes pack more of a punch, making them rainier, intensifying them faster and helping the storms linger longer even after landfall. But a new statistical analysis of historical records and satellite data suggests that there aren’t actually more Atlantic hurricanes now than there were roughly 150 years ago, researchers report July 13 in Nature Communications.

    The record-breaking number of Atlantic hurricanes in 2020, a whopping 30 named storms, led to intense speculation over whether and how climate change was involved (SN: 12/21/20). It’s a question that scientists continue to grapple with, says Gabriel Vecchi, a climate scientist at Princeton University. “What is the impact of global warming — past impact and also our future impact — on the number and intensity of hurricanes and tropical storms?”

    Satellite records over the last 30 years allow us to say “with little ambiguity how many hurricanes, and how many major hurricanes [Category 3 and above] there were each year,” Vecchi says. Those data clearly show that the number, intensity and speed of intensification of hurricanes have increased over that time span.

    But “there are a lot of things that have happened over the last 30 years” that can influence that trend, he adds. “Global warming is one of them.” Decreasing aerosol pollution is another (SN: 11/21/19). The amount of soot and sulfate particles and dust over the Atlantic Ocean was much higher in the mid-20th century than now; by blocking and scattering sunlight, those particles temporarily cooled the planet enough to counteract greenhouse gas warming. That cooling is also thought to have helped temporarily suppress hurricane activity in the Atlantic.  

    To get a longer-term perspective on trends in Atlantic storms, Vecchi and colleagues examined a dataset of hurricane observations from the U.S. National Oceanic and Atmospheric Administration that stretches from 1851 to 2019. It includes old-school observations by unlucky souls who directly observed the tempests as well as remote sensing data from the modern satellite era.

    How to directly compare those different types of observations to get an accurate trend was a challenge. Satellites, for example, can see every storm, but earlier observations will count only the storms that people directly experienced. So the researchers took a probabilistic approach to fill in likely gaps in the older record, assuming, for example, that modern storm tracks are representative of pre-satellite storm tracks to account for storms that would have stayed out at sea and unseen. The team found no clear increase in the number of storms in the Atlantic over that 168-year time frame. One possible reason for this, the researchers say, is a rebound from the aerosol pollution–induced lull in storms that may be obscuring some of the greenhouse gas signal in the data.  
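    The flavor of that correction can be shown in a few lines. The sketch below estimates, from modern tracks, how likely a storm was to be spotted before satellites and then scales up the older counts accordingly; every number in it is invented for illustration and it is not the team's actual procedure or data.

        # Sketch: scale up pre-satellite hurricane counts for storms likely missed
        # at sea. All numbers are invented for illustration, not the study's data.
        import random

        random.seed(1)

        # Pretend 70 percent of modern, satellite-tracked storms follow paths that a
        # pre-satellite observer (ship or coast) could have seen, and estimate that
        # detection probability by resampling modern tracks.
        modern_tracks_observable = [random.random() < 0.7 for _ in range(500)]
        p_detect = sum(modern_tracks_observable) / len(modern_tracks_observable)

        # Raw storm counts from a hypothetical pre-satellite decade.
        observed_counts = [6, 8, 5, 9, 7, 6, 10, 8, 7, 9]

        # Correct each year's count for storms that likely stayed out at sea, unseen.
        adjusted_counts = [c / p_detect for c in observed_counts]

        print(f"estimated detection probability: {p_detect:.2f}")
        print("adjusted annual counts:", [round(c, 1) for c in adjusted_counts])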

    More surprisingly — even to Vecchi, he says — the data also seem to show no significant increase in hurricane intensity over that time. That’s despite “scientific consistency between theories and models indicating that the typical intensity of hurricanes is more likely to increase as the planet warms,” Vecchi says. But this conclusion is heavily caveated — and the study also doesn’t provide evidence against the hypothesis that global warming “has acted and will act to intensify hurricane activity,” he adds.

    Climate scientists were already familiar with the possibility that storm frequency might not have increased much in the last 150 or so years — or over much longer timescales. The link between number of storms and warming has long been uncertain, as the changing climate also produces complex shifts in atmospheric patterns that could take the hurricane trend in either direction. The Intergovernmental Panel on Climate Change noted in a 2012 report that there is “low confidence” that tropical cyclone activity has increased in the long term.

    Geologic evidence of Atlantic storm frequency, which can go back over 1,000 years, also suggests that hurricane frequency does tend to wax and wane every few decades, says Elizabeth Wallace, a paleotempestologist at Rice University in Houston (SN: 10/22/17).

    Wallace hunts for hurricane records in deep underwater caverns called blue holes: As a storm passes over an island beach or the barely submerged shallows, winds and waves pick up sand that then can get dumped into these caverns, forming telltale sediment deposits. Her data, she says, also suggest that “the past 150 years hasn’t been exceptional [in storm frequency], compared to the past.”

    But, Wallace notes, these deposits don’t reveal anything about whether climate change is producing more intense hurricanes. And modern observational data on changes in hurricane intensity is muddled by its own uncertainties, particularly the fact that the satellite record just isn’t that long. Still, “I liked that the study says it doesn’t necessarily provide evidence against the hypothesis” that higher sea-surface temperatures would increase hurricane intensity by adding more energy to the storm, she says.

    Kerry Emanuel, an atmospheric scientist at MIT, says the idea that storm numbers haven’t increased isn’t surprising, given the longstanding uncertainty over how global warming might alter that. But “one reservation I have about the new paper is the implication that no significant trends in Atlantic hurricane metrics [going back to 1851] implies no effect of global warming on these storms,” he says. Looking for such a long-term trend isn’t actually that meaningful, he says, as scientists wouldn’t expect to see any global warming-related hurricane trends become apparent until about the 1970s anyway, as warming has ramped up.

    Regardless of whether there are more of these storms, there’s no question that modern hurricanes have become more deadly in many ways, Vecchi says. There’s evidence that global warming has already been increasing the amount of rain from some storms, such as Hurricane Harvey in 2017, which led to widespread, devastating flooding (SN: 9/28/18). And, Vecchi says, “sea level will rise over the coming century … so [increasing] storm surge is one big hazard from hurricanes.”

  • Hydrogel-based flexible brain-machine interface

    A KAIST research team and collaborators revealed a newly developed hydrogel-based flexible brain-machine interface. To study the structure of the brain or to identify and treat neurological diseases, it is crucial to develop an interface that can stimulate the brain and detect its signals in real time. However, existing neural interfaces are mechanically and chemically different from real brain tissue. This causes foreign body response and forms an insulating layer (glial scar) around the interface, which shortens its lifespan.
    To solve this problem, the research team of Professor Seongjun Park developed a ‘brain-mimicking interface’ by inserting a custom-made multifunctional fiber bundle into the hydrogel body. The device is composed not only of an optical fiber that controls specific nerve cells with light for optogenetic procedures, but also of an electrode bundle to read brain signals and a microfluidic channel to deliver drugs to the brain.
    The interface is easy to insert into the body when dry, as the hydrogel is then solid. Once inside the body, the hydrogel quickly absorbs body fluids and takes on the properties of the surrounding tissue, thereby minimizing foreign body response.
    The research team applied the device on animal models, and showed that it was possible to detect neural signals for up to six months, which is far beyond what had been previously recorded. It was also possible to conduct long-term optogenetic and behavioral experiments on freely moving mice with a significant reduction in foreign body responses such as glial and immunological activation compared to existing devices.
    “This research is significant in that it was the first to utilize a hydrogel as part of a multifunctional neural interface probe, which increased its lifespan dramatically,” said Professor Park. “With our discovery, we look forward to advancements in research on neurological disorders like Alzheimer’s or Parkinson’s disease that require long-term observation.”
    Story Source:
    Materials provided by The Korea Advanced Institute of Science and Technology (KAIST). Note: Content may be edited for style and length.

  • The first step in using trees to slow climate change: Protect the trees we have

    Between a death and a burial was hardly the best time to show up in a remote village in Madagascar to make a pitch for forest protection. Bad timing, however, turned out to be the easy problem.

    This forest was the first one that botanist Armand Randrianasolo had tried to protect. He’s the first native of Madagascar to become a Ph.D. taxonomist at Missouri Botanical Garden, or MBG, in St. Louis. So he was picked to join a 2002 scouting trip to choose a conservation site.

    Other groups had already come into the country and protected swaths of green, focusing on “big forests; big, big, big!” Randrianasolo says. Preferably forests with lots of big-eyed, fluffy lemurs to tug heartstrings elsewhere in the world.

    The Missouri group, however, planned to go small and to focus on the island’s plants, legendary among botanists but less likely to be loved as a stuffed cuddly. The team zeroed in on fragments of humid forest that thrive on sand along the eastern coast. “Nobody was working on it,” he says.

    As the people of the Agnalazaha forest were mourning a member of their close-knit community, Randrianasolo decided to pay his respects: “I wanted to show that I’m still Malagasy,” he says. He had grown up in a seaside community to the north.

    The village was filling up with visiting relatives and acquaintances, a great chance to talk with many people in the region. The deputy mayor conceded that after a morning visit to the bereaved, Randrianasolo and MBG’s Chris Birkinshaw could speak in the afternoon with anyone wishing to gather at the roofed marketplace.

    Conserving natural forests is a double win for trapping carbon and saving rich biodiversity. Forests matter to humans (with a Treculia fruit), Phromnia planthoppers and mouse lemurs. Courtesy of the staff of the Missouri Botanical Garden, St. Louis and Madagascar

    The two scientists didn’t get the reception they’d hoped for. Their pitch to help the villagers conserve their forest while still serving people’s needs met protests from the crowd: “You’re lying!”

    The community was still upset about a different forest that outside conservationists had protected. The villagers had assumed they would still be able to take trees for lumber, harvest their medicinal plants or sell other bits from the forest during cash emergencies. They were wrong. That place was now off-limits. People caught doing any of the normal things a forest community does would be considered poachers. When MBG proposed conserving yet more land, residents weren’t about to get tricked again. “This is the only forest we have left,” they told the scientists.

    Finding some way out of such clashes to save existing forests has become crucial for fighting climate change. Between 2001 and 2019, the planet’s forests trapped an estimated 7.6 billion metric tons of carbon dioxide a year, an international team reported in Nature Climate Change in March. That rough accounting suggests trees may capture about one and a half times the annual emissions of the United States, one of the largest global emitters.

    Planting trees by the millions and trillions is basking in round-the-world enthusiasm right now. Yet saving the forests we already have ranks higher in priority and in payoff, say a variety of scientists.

    How to preserve forests may be a harder question than why. Success takes strong legal protections with full government support. It also takes a village, literally. A forest’s most intimate neighbors must wholeheartedly want it saved, one generation after another. That theme repeats in places as different as rural Madagascar and suburban New Jersey.

    Overlooked and underprotected

    First a word about trees themselves. Of course, trees capture carbon and fight climate change. But trees are much more than useful wooden objects that happen to be leafy, self-manufacturing and great shade for picnics.

    “Plant blindness,” as it has been called, reduces trees and other photosynthetic organisms to background, lamented botanist Sandra Knapp in a 2019 article in the journal Plants, People, Planet. For instance, show people a picture with a squirrel in a forest. They’ll likely say something like “cute squirrel.” Not “nice-size beech tree, and is that a young black oak with a cute squirrel on it?”

    This tunnel vision also excludes invertebrates, argues Knapp, of the Natural History Museum in London, complicating efforts to save nature. These half-seen forests, natural plus human-planted, now cover close to a third of the planet’s land, according to the 2020 version of The State of the World’s Forests report from the United Nations’ Food and Agriculture Organization. Yet a calculation based on the report’s numbers says that over the last 10 years, net tree cover vanished at an average rate of about 12,990 hectares — a bit more than the area of San Francisco — every day.

    This is an improvement over the previous decades, the report notes. In the 1990s, deforestation, on average, destroyed about 1.75 San Francisco equivalents of forest every day.
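    The San Francisco comparison is simple unit arithmetic. A quick sketch, assuming FAO net-loss figures of roughly 4.7 million hectares per year over the last decade and 7.8 million per year in the 1990s, and a San Francisco land area of about 12,100 hectares (assumptions for illustration, not figures quoted in the article):

        # Back-of-envelope check of the "San Francisco equivalents" figures.
        # Assumed inputs: FAO net forest loss of ~4.7 million ha/year (last decade)
        # and ~7.8 million ha/year (1990s); San Francisco land area ~12,100 ha.
        SF_HECTARES = 12_100

        for label, loss_ha_per_year in [("last decade", 4_700_000), ("1990s", 7_800_000)]:
            per_day = loss_ha_per_year / 365
            print(f"{label}: ~{per_day:,.0f} ha/day, "
                  f"about {per_day / SF_HECTARES:.2f} San Franciscos per day")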

    Branches of a Dracaena cinnabari dragon’s blood tree from Yemen ooze red sap and repeatedly bifurcate in even Y-splits. BORIS KHVOSTICHENKO/WIKIMEDIA COMMONS (CC BY-SA 4.0)

    Trees were the planet’s skyscrapers, many rising to great heights, hundreds of millions of years before humans began piling stone upon stone to build their own. Trees reach their stature by growing and then killing their innermost core of tissue. The still-living outer rim of the tree uses its ever-increasing inner ghost architecture as plumbing pipes that can function as long as several human lifetimes. And tree sex lives, oh my. Plants invented “steamy but not touchy” long before the Victorian novel — much flowering, perfuming and maybe green yearning, all without direct contact of reproductive organs. Just a dusting of pollen wafted on a breeze or delivered by a bee.

    To achieve the all-important goal of cutting global emissions, saving the natural forests already in the ground must be a priority, 14 scientists from around the world wrote in the April Global Change Biology. “Protect existing forests first,” coauthor Kate Hardwick of Kew Gardens in London said during a virtual conference on reforestation in February. That priority also gives the planet’s magnificent biodiversity a better chance at surviving. Trees can store a lot of carbon in racing to the sky. And size and age matter because trees add carbon over so much of their architecture, says ecologist David Mildrexler with Eastern Oregon Legacy Lands at the Wallowology Natural History Discovery Center in Joseph. Trees don’t just start new growth at twigs tipped with unfurling baby leaves. Inside the branches, the trunk and big roots, an actively growing sheath surrounds the inner ghost plumbing. Each season, this whole sheath adds a layer of carbon-capturing tissue from root to crown.

    “Imagine you’re standing in front of a really big tree — one that’s so big you can’t even wrap your arms all the way around, and you look up the trunk,” Mildrexler says. Compare that sky-touching vision to the area covered in a year’s growth of some sapling, maybe three fingers thick and human height. “The difference is, of course, just huge,” he says.
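    A rough back-of-the-envelope comparison makes the point, treating each trunk as a bare cylinder and ignoring the branches and roots that add even more growing surface. The dimensions below are assumptions chosen to match the description, not measurements from the study.

        # Rough comparison of the living, carbon-adding sheath on a big tree versus
        # a sapling, modeling each trunk as a cylinder. Dimensions are assumptions.
        import math

        def lateral_area_m2(diameter_m: float, height_m: float) -> float:
            """Lateral surface area of a cylinder: pi * diameter * height."""
            return math.pi * diameter_m * height_m

        big_tree = lateral_area_m2(diameter_m=0.9, height_m=30.0)  # arm-hug-sized trunk
        sapling = lateral_area_m2(diameter_m=0.05, height_m=1.7)   # three fingers thick

        print(f"big tree growing surface: ~{big_tree:.0f} square meters")
        print(f"sapling growing surface:  ~{sapling:.2f} square meters")
        print(f"ratio: roughly {big_tree / sapling:.0f} to 1")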

    Big trees may not be common, but they make an outsize difference in trapping carbon, Mildrexler and colleagues have found. In six Pacific Northwest national forests, only about 3 percent of all the trees in the study, including ponderosa pines, western larches and three other major species, reached full-arm-hug size (at least 53.3 centimeters in diameter). Yet this 3 percent of trees stored 42 percent of the aboveground carbon there, the team reported in 2020 in Frontiers in Forests and Global Change. An earlier study, with 48 sites worldwide and more than 5 million tree trunks, found that the largest 1 percent of trees store about 50 percent of the aboveground carbon-filled biomass.

    Plant paradise

    The island nation of Madagascar was an irresistible place for the Missouri Botanical Garden to start trying to conserve forests. Off the east coast of Africa, the island stretches more than the distance from Savannah, Ga., to Toronto, and holds more than 12,000 named species of trees, other flowering plants and ferns. Madagascar “is absolute nirvana,” says MBG botanist James S. Miller, who has spent decades exploring the island’s flora.

    The Ravenala traveler’s tree is widely grown, but native only to Madagascar. CEPHOTO, UWE ARANAS/WIKIMEDIA COMMONS (CC BY-SA 3.0)

    Just consider the rarities. Of the eight known species of baobab trees, which raise a fat trunk to a cartoonishly spindly tuft of little branches on top, six are native to Madagascar. Miller estimates that some 90 percent of the island’s plants are natives unique to the country. “It wrecks you” for botanizing elsewhere, Miller says.

    He was rooting for his MBG colleagues Randrianasolo and Birkinshaw in their foray to Madagascar’s Agnalazaha forest. Several months after getting roasted as liars by residents, the two got word that the skeptics had decided to give protection a chance after all.

    The Agnalazaha residents wanted to make sure, however, that the Missouri group realized the solemnity of their promise. Randrianasolo had to return to the island for a ceremony of calling the ancestors as witnesses to the new partnership and marking the occasion with the sacrifice of a cow. A pact with generations of deceased residents may be an unusual form of legal involvement, but it carried weight. Randrianasolo bought the cow.

    Randrianasolo looked for ways to be helpful. MBG worked on improving the village’s rice yields, and supplied starter batches of vegetable seeds for expanding home gardens. The MBG staff helped the forest residents apply for conservation funds from the Malagasy government. A new tree nursery gave villagers an alternative to cutting timber in the forest. The nursery also meant some jobs for local people, which further improved relationships.

    Trying to build trust with people living near southeastern Madagascar’s coast was the first task the Missouri Botanical Garden faced in efforts to conserve the Agnalazaha forest. Courtesy of the staff of the Missouri Botanical Garden, St. Louis and Madagascar

    The MBG staff now works with Malagasy communities to preserve forests at 11 sites dotted in various ecosystems in Madagascar. Says Randrianasolo: “You have to be patient.”

    Today, 19 years after his first visit among the mourners, Agnalazaha still stands.

    Saving forests is not a simple matter of just meeting basic needs of people living nearby, says political scientist Nadia Rabesahala Horning of Middlebury College in Vermont, who published The Politics of Deforestation in Africa in 2018. Her Ph.D. work, starting in the late 1990s, took her to four remote forests in her native Madagascar. The villagers around each forest followed different rules for harvesting timber, finding places to graze livestock and collecting medicinal plants.

    Three of the forests shrank, two of them rapidly, over the decade. One, called Analavelona, however, barely showed any change in the aerial views Horning used to look for fraying forests.

    Near Madagascar’s Analavelona sacred forest, taxonomist Armand Randrianasolo (blue cap) joins (from left) Miandry Fagnarena, Rehary, and Tefy Andriamihajarivo to collect a surprising new species in the mango family (green leaves at front of image). The Spondias tefyi, named for Tefy and his efforts to protect the island’s biodiversity, is the first wild relative of the popular hog plum found outside of South America or Asia. Courtesy of the staff of the Missouri Botanical Garden, St. Louis and Madagascar

    The people living around Analavelona revered it as a sacred place where their ancestors dwelled. Living villagers made offerings before entering, and cut only one kind of tree, which they used for coffins.

    Since then, Horning’s research in Tanzania and Uganda has convinced her that forest conservation can happen only under very specific conditions, she says. The local community must be able to trust that the government won’t let some commercial interest or a political heavyweight slip through loopholes to exploit a forest that its everyday neighbors can’t touch. And local people must be able to meet their own needs too, including the spiritual ones.

    A different kind of essential

    Tied with yarn to nearly 3,000 trees in a Maryland forest, tags displayed the names of the people lost on 9/11. The memorial, organized by ecologist Joan Maloof, who runs the Old-Growth Forest Network, helped protect a patch of woods where people could go for solace and meditation. Friends of the Forest, Salisbury

    Another constellation of old forests, on the other side of the world, sports some less-than-obvious similarities. Ecologist Joan Maloof launched the Old-Growth Forest Network in 2011 to encourage people to save the remaining scraps of U.S. old-growth forests. Her bold idea: to permanently protect one patch of old forest in each of the more than 2,000 counties in the United States where forests can grow.

    She calls for strong legal measures, such as conservation easements that prevent logging, but also recognizes the need to convey the emotional power of communing with nature. One of the early green spots she and colleagues campaigned for was not old growth, but it had become one of the few left unlogged where she lived on Maryland’s Eastern Shore.

    She heard about Buddhist monks in Thailand who had ordained trees as monks; because loggers revered the monks, the trees were protected. A month after the 9/11 terrorist attacks, she was inspired to turn the Maryland forest into a place to remember the victims. By putting each victim’s name on a metal tag and tying it to a tree, she and other volunteers created a memorial with close to 3,000 trees. The local planning commission, she suspected, would feel awkward about approving timber cutting from that particular stand. She wasn’t party to their private deliberations, but the forest still stands.

    In 1973, high school freshman Doug Hefty wrote more than 80 pages about the value of Saddler’s Woods in Haddon Township, N.J. His typed report, with its handmade cover, played a dramatic role in saving the forest. Saddler’s Woods Conservation Association

    As of Earth Day 2021, the network had about 125 forests around the country that should stay forests in perpetuity. Their stories vary widely, but are full of local history and political maneuvering.

     In southern New Jersey, Joshua Saddler, an escaped enslaved man from Maryland, acquired part of a small forest in the mid-1880s and bequeathed it to his wife with the stipulation that it not be logged. His section was logged anyway, and the rest of the original old forest was about to meet the same fate. In 1973, high school student Doug Hefty wrote more than 80 pages on the forest’s value — and delivered it to the developer. In this case, life delivered a genuine Hollywood ending. The developer relented, and scaled back the project, stopping across the street from the woods.

    In 1999, however, developers once again eyed the forest, says Janet Goehner-Jacobs, who heads the Saddler’s Woods Conservation Association. It took four years, but now she and the forest’s other fans have a conservation easement forbidding commercial development or logging, giving the next generation better tools to protect the forest.

    Goehner-Jacobs had just moved to the area and fallen in love with that 10-hectare patch of green in the midst of apartment buildings and strip malls. When she first happened upon the forest and found the old-growth section, “I just instinctively knew I was seeing something very different.”

    Saddler’s Woods, with a scrap of old-growth forest, has survived the rush of development in suburban New Jersey thanks to generations of dedicated forest lovers. Saddler’s Woods Conservation Association

  • Discovery of 10 faces of plasma leads to new insights in fusion and plasma science

    Scientists have discovered a novel way to classify magnetized plasmas that could possibly lead to advances in harvesting on Earth the fusion energy that powers the sun and stars. The discovery by theorists at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) found that a magnetized plasma has 10 unique phases and the transitions between them might hold rich implications for practical development.
    The spatial boundaries, or transitions, between different phases will support localized wave excitations, the researchers found. “These findings could lead to possible applications of these exotic excitations in space and laboratory plasmas,” said Yichen Fu, a graduate student at PPPL and lead author of a paper in Nature Communications that outlines the research. “The next step is to explore what these excitations could do and how they might be utilized.”
    Possible applications
    Possible applications include using the excitations to create current in magnetic fusion plasmas or facilitating plasma rotation in fusion experiments. However, “Our paper doesn’t consider any practical applications,” said physicist Hong Qin, co-author of the paper and Fu’s advisor. “The paper is the basic theory and the technology will follow the theoretical understanding.”
    In fact, “the discovery of the 10 phases in plasma marks a primary development in plasma physics,” Qin said. “The first and foremost step in any scientific endeavor is to classify the objects under investigation. Any new classification scheme will lead to improvement in our theoretical understanding and subsequent advances in technology,” he said.
    Qin cites discovery of the major types of diabetes as an example of the role classification plays in scientific progress. “When developing treatments for diabetes, scientists found that there were three major types,” he said. “Now medical practitioners can effectively treat diabetic patients.”
    Fusion, which scientists around the world are seeking to produce on Earth, combines light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei that makes up 99 percent of the visible universe — to release massive amounts of energy. Such energy could serve as a safe and clean source of power for generating electricity.
    The plasma phases that PPPL has uncovered are technically known as “topological phases,” indicating the shapes of the waves supported by plasma. This unique property of matter was first discovered in the discipline of condensed matter physics during the 1970s; physicist Duncan Haldane of Princeton University shared the 2016 Nobel Prize in Physics for his pioneering work on topological phases.
    Robust and intrinsic
    The localized plasma waves produced by phase transitions are robust and intrinsic because they are “topologically protected,” Qin said. “The discovery that this topologically protected excitation exists in magnetized plasmas is a big step forward that can be explored for practical applications,” he said.
    For first author Fu, “The most important progress in the paper is looking at plasma based on its topological properties and identifying its topological phases. Based on these phases we identify the necessary and sufficient condition for the excitations of these localized waves. As for how this progress can be applied to facilitate fusion energy research, we have to find out.”
    Story Source:
    Materials provided by DOE/Princeton Plasma Physics Laboratory. Original written by John Greenwald. Note: Content may be edited for style and length.

  • Artificial intelligence could be new blueprint for precision drug discovery

    Writing in the July 12, 2021 online issue of Nature Communications, researchers at University of California San Diego School of Medicine describe a new approach that uses machine learning to hunt for disease targets and then predicts whether a drug is likely to receive FDA approval.
    The study findings could measurably change how researchers sift through big data to find meaningful information with significant benefit to patients, the pharmaceutical industry and the nation’s health care systems.
    “Academic labs and pharmaceutical and biotech companies have access to unlimited amounts of ‘big data’ and better tools than ever to analyze such data. However, despite these incredible advances in technology, the success rates in drug discovery are lower today than in the 1970s,” said Pradipta Ghosh, MD, senior author of the study and professor in the departments of Medicine and Cellular and Molecular Medicine at UC San Diego School of Medicine.
    “This is mostly because drugs that work perfectly in preclinical inbred models, such as laboratory mice, that are genetically or otherwise identical to each other, don’t translate to patients in the clinic, where each individual and their disease is unique. It is this variability in the clinic that is believed to be the Achilles heel for any drug discovery program.”
    In the new study, Ghosh and colleagues replaced the first and last steps in preclinical drug discovery with two novel approaches developed within the UC San Diego Institute for Network Medicine (iNetMed), which unites several research disciplines to develop new solutions to advance life sciences and technology and enhance human health.
    The researchers used the disease model for inflammatory bowel disease (IBD), which is a complex, multifaceted, relapsing autoimmune disorder characterized by inflammation of the gut lining. Because it impacts all ages and reduces the quality of life in patients, IBD is a priority disease area for drug discovery and is a challenging condition to treat because no two patients behave similarly.

  • MaxDIA: Taking proteomics to the next level

    Proteomics produces enormous amounts of data, which can be very complex to analyze and interpret. The free software platform MaxQuant has proven to be invaluable for data analysis of shotgun proteomics over the past decade. Now, Jürgen Cox, group leader at the Max Planck Institute of Biochemistry, and his team present the new version 2.0. It provides an improved computational workflow for data-independent acquisition (DIA) proteomics, called MaxDIA. MaxDIA includes library-based and library-free DIA proteomics and permits highly sensitive and accurate data analysis. Uniting data-dependent and data-independent acquisition into one world, MaxQuant 2.0 is a big step towards improving applications for personalized medicine.
    Proteins are essential for our cells to function, yet many questions about their synthesis, abundance, functions, and defects still remain unanswered. High-throughput techniques can help improve our understanding of these molecules. For analysis by liquid chromatography followed by mass spectrometry (MS), proteins are broken down into smaller peptides, in a process referred to as “shotgun proteomics.” The mass-to-charge ratio of these peptides is subsequently determined with a mass spectrometer, resulting in MS spectra. From these spectra, information about the identity of the analyzed proteins can be reconstructed. However, the enormous amount and complexity of data make data analysis and interpretation challenging.
    Two ways to analyze proteins with mass spectrometry
    Two main methods are used in shotgun proteomics: data-dependent acquisition (DDA) and data-independent acquisition (DIA). In DDA, the most abundant peptides of a sample are preselected for fragmentation and measurement. This makes it possible to reconstruct the sequences of these few preselected peptides, keeping analysis simpler and faster. However, this method introduces a bias towards highly abundant peptides. DIA, in contrast, is more robust and sensitive: all peptides from a certain mass range are fragmented and measured at once, without preselection by abundance.
    As a result, this method generates large amounts of data, and the complexity of the obtained information increases considerably. Until now, identifying the original proteins was only possible by matching the newly measured spectra against libraries of previously measured spectra.
    Combining DDA and DIA into one world
    Jürgen Cox and his team have now developed software that provides a complete computational workflow for DIA data. For the first time, it allows the same algorithms to be applied to DDA and DIA data, so studies based on either method will become more easily comparable. MaxDIA analyzes proteomics data with and without spectral libraries. Using machine learning, the software predicts peptide fragmentation and spectral intensities, creating precise MS spectral libraries in silico. In this way, MaxDIA includes a library-free discovery mode with reliable control of false-positive protein identifications.
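    To picture how matching a measured spectrum against a predicted library can work in principle, here is a toy sketch. It is not MaxDIA's algorithm: the "predictor" below is a stand-in function rather than a trained model, the peptides are made up, and the scoring is a plain cosine similarity.

        # Toy illustration of scoring an observed spectrum against a predicted
        # ("in silico") library -- not MaxDIA's actual algorithm.
        import numpy as np

        rng = np.random.default_rng(0)

        def predict_spectrum(peptide: str, n_bins: int = 20) -> np.ndarray:
            """Stand-in for a learned model: map a peptide string to a fake
            fragment-intensity vector (consistent within a single run)."""
            seed = abs(hash(peptide)) % (2**32)
            spec = np.random.default_rng(seed).random(n_bins)
            return spec / np.linalg.norm(spec)

        def cosine(a: np.ndarray, b: np.ndarray) -> float:
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        # Build a small predicted library for made-up candidate peptides.
        candidates = ["PEPTIDEK", "ACDEFGHIK", "LMNPQRSTK"]
        library = {p: predict_spectrum(p) for p in candidates}

        # Pretend this noisy observation really comes from PEPTIDEK.
        observed = library["PEPTIDEK"] + 0.05 * rng.standard_normal(20)

        best = max(library, key=lambda p: cosine(observed, library[p]))
        print("best library match:", best)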
    Furthermore, the software supports new technologies such as bootstrap DIA, BoxCar DIA and trapped ion mobility spectrometry DIA. What are the next steps? The team is already working on further improving the software. Several extensions are being developed, for instance for improving the analysis of posttranslational modifications and identification of cross-linked peptides.
    Enabling researchers to conduct complex proteomics data analysis
    MaxDIA is free software available to scientists all over the world. It is embedded in the established software environment MaxQuant. “We would like to make proteomics data analysis accessible to all researchers,” says Pavel Sinitcyn, first author of the paper that introduces MaxDIA. At the MaxQuant summer school, Cox and his team offer hands-on training in the software for all interested researchers, helping to bridge the gap between wet-lab work and complex data analysis.
    Sinitcyn states that the aim is to “bring mass spectrometry from the Max Planck Institute of Biochemistry to the clinics.” Instead of measuring only a few proteins, thousands of proteins can now be measured and analyzed. This opens up new possibilities for medical applications, especially in the field of personalized medicine.
    Story Source:
    Materials provided by Max-Planck-Gesellschaft. Note: Content may be edited for style and length.

  • Mathematicians develop ground-breaking modeling toolkit to predict local COVID-19 impact

    A Sussex team — including university mathematicians — have created a new modelling toolkit which predicts the impact of COVID-19 at a local level with unprecedented accuracy. The details are published in the International Journal of Epidemiology, and are available for other local authorities to use online, just as the UK looks as though it may head into another wave of infections.
    The study used the local Sussex hospital and healthcare daily COVID-19 situation reports, including admissions, discharges, bed occupancy and deaths.
    Through the pandemic, the newly-published modelling has been used by local NHS and public health services to predict infection levels so that public services can plan when and how to allocate health resources — and it has been conclusively shown to be accurate. The team are now making their modelling available to other local authorities to use via the Halogen toolkit.
    Anotida Madzvamuse, professor of mathematical and computational biology within the School of Mathematical and Physical Sciences at the University of Sussex, who led the study, said:
    “We undertook this study as a rapid response to the COVID-19 pandemic. Our objective was to provide support and enhance the capability of local NHS and Public Health teams to accurately predict and forecast the impact of local outbreaks to guide healthcare demand and capacity, policy making, and public health decisions.”
    “Working with outstanding mathematicians, Dr James Van Yperen and Dr Eduard Campillo-Funollet, we formulated an epidemiological model and inferred model parameters by fitting the model to local datasets to allow for short- and medium-term predictions and forecasts of the impact of COVID-19 outbreaks.”
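    As a general illustration of what fitting an epidemiological model to local data involves, here is a minimal sketch using a textbook SIR model and invented numbers. It is not the Sussex team's published model, which is more detailed and is fitted to real hospital and public health data.

        # Minimal sketch: fit a textbook SIR model to made-up local case counts by
        # least squares. Generic illustration, not the Sussex team's model or data.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import minimize

        N = 500_000                      # hypothetical local population
        days = np.arange(60)
        observed_infected = 50 * np.exp(0.08 * days) / (1 + 0.001 * np.exp(0.08 * days))

        def sir(y, t, beta, gamma):
            s, i, r = y
            return [-beta * s * i / N, beta * s * i / N - gamma * i, gamma * i]

        def loss(params):
            beta, gamma = params
            trajectory = odeint(sir, [N - 50, 50, 0], days, args=(beta, gamma))
            return np.mean((trajectory[:, 1] - observed_infected) ** 2)

        fit = minimize(loss, x0=[0.3, 0.1], bounds=[(0.01, 2), (0.01, 1)])
        beta_hat, gamma_hat = fit.x
        print(f"beta={beta_hat:.3f}, gamma={gamma_hat:.3f}, R0~{beta_hat/gamma_hat:.2f}")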