More stories

  • Microchips of the future: Suitable insulators are still missing

    For decades, there has been a trend in microelectronics towards ever smaller and more compact transistors. 2D materials such as graphene are seen as a beacon of hope here: they are the thinnest material layers that can possibly exist, consisting of only one or a few atomic layers. Nevertheless, they can conduct electrical currents — conventional silicon technology, on the other hand, no longer works properly if the layers become too thin.
    However, such materials are never used on their own: they have to be combined with suitable insulators, both to seal them off from unwanted environmental influences and to control the flow of current via the so-called field effect. Until now, hexagonal boron nitride (hBN) has frequently been used for this purpose, as it forms an excellent environment for 2D materials. However, studies conducted by TU Wien in cooperation with ETH Zurich, the Russian Ioffe Institute and researchers from Saudi Arabia and Japan now show that, contrary to previous assumptions, thin hBN layers are not suitable as insulators for future miniaturised field-effect transistors, because excessive leakage currents occur. So if 2D materials are really to revolutionise the semiconductor industry, the search for alternative insulator materials must begin. The study has now been published in the scientific journal “Nature Electronics.”
    The supposedly perfect insulator material
    “At first glance, hexagonal boron nitride fits graphene and two-dimensional materials better than any other insulator,” says Theresia Knobloch, first author of the study, who is currently working on her dissertation in Tibor Grasser’s team at the Institute of Microelectronics at TU Wien. “Just like the 2D semiconducting materials, hBN consists of individual atomic layers that are only weakly bonded to each other.”
    As a result, hBN can easily be used to make atomically smooth surfaces that do not interfere with the transport of electrons through 2D materials. “You might therefore think that hBN is the perfect material — both as a substrate on which to place thin-film semiconductors and also as a gate insulator needed to build field-effect transistors,” says Tibor Grasser.
    Small leakage currents with big effects
    A transistor can be compared to a water tap — only instead of a stream of water, electric current is switched on and off. As with a water tap, it is very important for a transistor that nothing leaks out of the valve itself.
    This is exactly what the gate insulator is responsible for in the transistor: It isolates the controlling electrode, via which the current flow is switched on and off, from the semiconducting channel itself, through which the current then flows. A modern microprocessor contains about 50 billion transistors — so even a small loss of current at the gates can play an enormous role, because it significantly increases the total energy consumption.
    In this study, the research team investigated the leakage currents that flow through thin hBN layers, both experimentally and using theoretical calculations. They found that some of the properties that make hBN such a suitable substrate also significantly increase the leakage currents through hBN. Boron nitride has a small dielectric constant, which means that the material interacts only weakly with electric fields. In consequence, the hBN layers used in miniaturised transistors must only be a few atomic layers thick so that the gate’s electric field can sufficiently control the channel. At the same time, however, the leakage currents become too large in this case, as they increase exponentially when reducing the layer thickness.
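    The exponential thickness dependence the team describes matches what a simple quantum-tunneling picture predicts. Below is a minimal sketch using a textbook rectangular-barrier model; the 3 eV barrier height and free-electron mass are placeholder assumptions for illustration, not values from the study:

    ```python
    import math

    # Illustrative rectangular-barrier tunneling model (textbook approximation,
    # not the study's calculation). Barrier height and effective mass are
    # placeholder values chosen for illustration only.
    HBAR = 1.054571817e-34    # reduced Planck constant, J*s
    M_E  = 9.1093837015e-31   # electron mass, kg
    EV   = 1.602176634e-19    # joules per electronvolt

    def tunneling_factor(thickness_m, barrier_ev=3.0, mass=M_E):
        """Relative leakage ~ exp(-2*kappa*d) through a rectangular barrier."""
        kappa = math.sqrt(2 * mass * barrier_ev * EV) / HBAR  # decay constant, 1/m
        return math.exp(-2 * kappa * thickness_m)

    # Thinning the insulator from 2 nm to 1 nm multiplies the relative leakage
    ratio = tunneling_factor(1e-9) / tunneling_factor(2e-9)
    print(f"leakage increase when halving thickness: {ratio:.1e}x")
    ```

    In this toy model, halving the thickness raises the relative leakage by roughly seven orders of magnitude, which is why a low dielectric constant, by forcing ultrathin layers, is so damaging.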
    The search for insulators
    “Our results show that hBN is not suitable as a gate insulator for miniaturised transistors based on 2D materials,” says Tibor Grasser. “This finding is an important guide for future studies, but it is only the beginning of the search for suitable insulators for the smallest transistors. Currently, no known material system can meet all the requirements, but it is only a matter of time and resources until a suitable material system is found.”
    “The problem is complex, but that makes it all the more important that many scientists devote themselves to the search for a solution, because our society will need small, fast and, above all, energy-efficient computer chips in the future,” says Theresia Knobloch.

  • Making the role of AI in medicine explainable

    Researchers at Charité – Universitätsmedizin Berlin and TU Berlin as well as the University of Oslo have developed a new tissue-section analysis system for diagnosing breast cancer based on artificial intelligence (AI). Two features make this system unique: first, it integrates morphological, molecular and histological data in a single analysis; second, it explains the AI decision process in the form of heatmaps. Pixel by pixel, these heatmaps show which visual information influenced the AI decision process and to what extent, thus enabling doctors to understand and assess the plausibility of the results of the AI analysis. This represents a decisive and essential step forward for the future regular use of AI systems in hospitals. The results of this research have now been published in Nature Machine Intelligence.
    Cancer treatment is increasingly concerned with the molecular characterization of tumor tissue samples. Studies are conducted to determine whether and/or how the DNA has changed in the tumor tissue as well as the gene and protein expression in the tissue sample. At the same time, researchers are becoming increasingly aware that cancer progression is closely related to intercellular cross-talk and the interaction of neoplastic cells with the surrounding tissue — including the immune system.
    Although microscopic techniques enable biological processes to be studied with high spatial detail, they only permit a limited measurement of molecular markers. Instead, these markers are usually determined from proteins or DNA extracted from the tissue, which sacrifices spatial detail and typically leaves the relationship between the markers and the microscopic structures unclear. “We know that in the case of breast cancer, the number of infiltrating immune cells, known as lymphocytes, in tumor tissue has an influence on the patient’s prognosis. There are also discussions as to whether this number has a predictive value — in other words, whether it enables us to say how effective a particular therapy is,” says Prof. Dr. Frederick Klauschen of Charité’s Institute of Pathology.
    “The problem we have is the following: We have good and reliable molecular data and we have good histological data with high spatial detail. What we don’t have as yet is the decisive link between imaging data and high-dimensional molecular data,” adds Prof. Dr. Klaus-Robert Müller, professor of machine learning at TU Berlin. Both researchers have been working together for a number of years now at the national AI center of excellence the Berlin Institute for the Foundations of Learning and Data (BIFOLD) located at TU Berlin.
    It is precisely this symbiosis which the newly published approach makes possible. “Our system facilitates the detection of pathological alterations in microscopic images. In parallel, we are able to provide precise heatmap visualizations showing which pixels in the microscopic image contributed to the diagnostic algorithm’s decision and to what extent,” explains Prof. Müller. The research team has also succeeded in significantly extending this process: “Our analysis system has been trained using machine learning processes so that it can also predict various molecular characteristics, including the condition of the DNA, the gene expression as well as the protein expression in specific areas of the tissue, on the basis of the histological images.”
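    The final display step of such heatmaps can be sketched generically. The study's actual attribution method is not described here; the snippet below only shows min-max scaling of hypothetical per-pixel relevance scores into a unit range for rendering:

    ```python
    def to_heatmap(relevance):
        """Min-max scale a 2D grid of per-pixel relevance scores to [0, 1]."""
        lo = min(v for row in relevance for v in row)
        hi = max(v for row in relevance for v in row)
        span = (hi - lo) or 1.0  # avoid division by zero on a flat map
        return [[(v - lo) / span for v in row] for v in relevance]

    # Toy 2x2 relevance grid: the bottom-left pixel mattered most
    print(to_heatmap([[0.0, 5.0], [10.0, 5.0]]))
    ```

    A display library would then map the [0, 1] values to a color scale and overlay them on the microscopic image.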
    Next on the agenda are certification and further clinical validations, including tests in routine tumor diagnostics. However, Prof. Klauschen is already convinced of the value of the research: “The methods we have developed will make it possible in the future to make histopathological tumor diagnostics more precise, more standardized and qualitatively better.”

    Story Source:
    Materials provided by Charité – Universitätsmedizin Berlin.

  • The world wasted nearly 1 billion metric tons of food in 2019

    The world wasted about 931 million metric tons of food in 2019 — an average of 121 kilograms per person. That’s about 17 percent of all food that was available to consumers that year, a new United Nations report estimates.
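    As a quick consistency check (our arithmetic, not a figure from the report), dividing the total tonnage by the per-person figure recovers roughly the 2019 world population:

    ```python
    # Both inputs are headline figures from the Food Waste Index Report 2021
    total_waste_tonnes = 931e6   # metric tons of food wasted in 2019
    per_capita_kg = 121          # kilograms wasted per person

    # 1 metric ton = 1000 kg, so total kg divided by kg/person gives people
    implied_population = total_waste_tonnes * 1000 / per_capita_kg
    print(f"implied population: {implied_population / 1e9:.2f} billion")
    ```

    The result, about 7.7 billion people, matches the world population in 2019, confirming the two figures describe the same total.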
    “Throwing away food de facto means throwing away the resources that went into its production,” said Martina Otto, who leads the U.N. Environment Program’s work on cities, during a news conference. “If food waste ends up in landfills, it does not feed people, but it does feed climate change.”
    Some 690 million people are impacted by hunger each year, and over 3 billion people can’t afford a healthy diet. Meanwhile, lost and wasted food accounts for 8 to 10 percent of global greenhouse gas emissions. Reducing food waste could ease both of those problems, according to the Food Waste Index Report 2021 published March 4 by the U.N. Environment Program and WRAP, an environmental charity based in the United Kingdom.
    Researchers analyzed food waste data from 54 countries. Most waste — 61 percent — came from homes. Food services such as restaurants accounted for 26 percent of global food waste while retail outlets such as supermarkets contributed just 13 percent. Surprisingly, food waste was a substantial problem for nearly all countries regardless of their income level, the team found. “We thought waste was predominantly a problem in rich countries,” Otto said.
    While the report is the most comprehensive analysis of global food waste to date, several knowledge gaps remain. The 54 countries account for just 75 percent of the world’s population, and only 23 countries provided waste estimates for their food service or retail sectors. The researchers accounted for these gaps by extrapolating values for the rest of the world from countries that did track these data. The report does not differentiate between potentially edible wasted food and inedible waste such as eggshells or bones.
    Otto recommends that countries begin addressing food waste by integrating reduction into their climate strategies and COVID-19 recovery plans. “Food waste has been largely overlooked in national climate strategies,” Otto said. “We know what to do, and we can take action quickly — collectively and individually.”

  • First AI system for contactless monitoring of heart rhythm using smart speakers

    Smart speakers, such as Amazon Echo and Google Home, have proven adept at monitoring certain health care issues at home. For example, researchers at the University of Washington have shown that these devices can detect cardiac arrests or monitor babies’ breathing.
    But what about tracking something even smaller: the minute motion of individual heartbeats in a person sitting in front of a smart speaker?
    UW researchers have developed a new skill for a smart speaker that for the first time monitors both regular and irregular heartbeats without physical contact. The system sends inaudible sounds from the speaker out into a room and, based on the way the sounds are reflected back to the speaker, it can identify and monitor individual heartbeats. Because the heartbeat is such a tiny motion on the chest surface, the team’s system uses machine learning to help the smart speaker locate signals from both regular and irregular heartbeats.
    When the researchers tested this system on healthy participants and hospitalized cardiac patients, the smart speaker detected heartbeats that closely matched the beats detected by standard heartbeat monitors. The team published these findings March 9 in Communications Biology.
    “Regular heartbeats are easy enough to detect even if the signal is small, because you can look for a periodic pattern in the data,” said co-senior author Shyam Gollakota, a UW associate professor in the Paul G. Allen School of Computer Science & Engineering. “But irregular heartbeats are really challenging because there is no such pattern. I wasn’t sure that it would be possible to detect them, so I was pleasantly surprised that our algorithms could identify irregular heartbeats during tests with cardiac patients.”
    While many people are familiar with the concept of a heart rate, doctors are more interested in the assessment of heart rhythm. Heart rate is the average of heartbeats over time, whereas a heart rhythm describes the pattern of heartbeats.

    For example, if a person has a heart rate of 60 beats per minute, they could have a regular heart rhythm — one beat every second — or an irregular heart rhythm — beats are randomly scattered across that minute but they still average out to 60 beats per minute.
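    The distinction can be made concrete: two beat sequences with the same average rate are separated by the variability of their inter-beat intervals. The sketch below uses toy beat timestamps, not the study's algorithm or data:

    ```python
    import statistics

    def rhythm_stats(beat_times):
        """Return (rate in bpm, coefficient of variation of inter-beat intervals)."""
        ibis = [b - a for a, b in zip(beat_times, beat_times[1:])]
        mean_ibi = statistics.mean(ibis)
        return 60.0 / mean_ibi, statistics.stdev(ibis) / mean_ibi

    # 61 beats over one minute: exactly one beat per second
    regular = [float(i) for i in range(61)]
    # Same first and last beat, but the in-between beats unevenly spaced
    offsets = [-0.3 if i % 2 == 0 else 0.3 for i in range(59)]
    irregular = [0.0] + [t + o for t, o in zip(regular[1:-1], offsets)] + [60.0]

    rate_r, cv_r = rhythm_stats(regular)
    rate_i, cv_i = rhythm_stats(irregular)
    print(f"regular:   {rate_r:.0f} bpm, interval variability {cv_r:.2f}")
    print(f"irregular: {rate_i:.0f} bpm, interval variability {cv_i:.2f}")
    ```

    Both sequences come out at 60 beats per minute, but only the irregular one shows large interval variability, which is why rhythm assessment needs individual beats rather than an average rate.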
    “Heart rhythm disorders are actually more common than some other well-known heart conditions. Cardiac arrhythmias can cause major morbidities such as strokes, but can be highly unpredictable in occurrence, and thus difficult to diagnose,” said co-senior author Dr. Arun Sridhar, assistant professor of cardiology at the UW School of Medicine. “Availability of a low-cost test that can be performed frequently and at the convenience of home can be a game-changer for certain patients in terms of early diagnosis and management.”
    The key to assessing heart rhythm lies in identifying the individual heartbeats. For this system, the search for heartbeats begins when a person sits within 1 to 2 feet in front of the smart speaker. Then the system plays an inaudible continuous sound, which bounces off the person and then returns to the speaker. Based on how the returned sound has changed, the system can isolate movements on the person — including the rise and fall of their chest as they breathe.
    “The motion from someone’s breathing is orders of magnitude larger on the chest wall than the motion from heartbeats, so that poses a pretty big challenge,” said lead author Anran Wang, a doctoral student in the Allen School. “And the breathing signal is not regular so it’s hard to simply filter it out. Using the fact that smart speakers have multiple microphones, we designed a new beam-forming algorithm to help the speakers find heartbeats.”
    The team designed what’s called a self-supervised machine learning algorithm, which learns on the fly instead of from a training set. This algorithm combines signals from all of the smart speaker’s multiple microphones to identify the elusive heartbeat signal.

    “This is similar to how Alexa can always find my voice even if I’m playing a video or if there are multiple people talking in the room,” Gollakota said. “When I say, ‘Hey, Alexa,’ the microphones are working together to find me in the room and listen to what I say next. That’s basically what’s happening here but with the heartbeat.”
    The heartbeat signals that the smart speaker detects don’t look like the typical peaks that are commonly associated with traditional heartbeat monitors. The researchers used a second algorithm to segment the signal into individual heartbeats so that the system could extract what is known as the inter-beat interval, or the amount of time between two heartbeats.
    “With this method, we are not getting the electric signal of the heart contracting. Instead we’re seeing the vibrations on the skin when the heart beats,” Wang said.
    The researchers tested a prototype smart speaker running this system on two groups: 26 healthy participants and 24 hospitalized patients with a diversity of cardiac conditions, including atrial fibrillation and heart failure. The team compared the smart speaker’s inter-beat interval with one from a standard heartbeat monitor. Of the nearly 12,300 heartbeats measured for the healthy participants, the smart speaker’s median inter-beat interval was within 28 milliseconds of the standard monitor. The smart speaker performed almost as well with cardiac patients: of the more than 5,600 heartbeats measured, the median inter-beat interval was within 30 milliseconds of the standard.
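    The reported agreement is a median error over paired inter-beat intervals. With hypothetical numbers (toy values, not the study's data), the metric looks like this:

    ```python
    import statistics

    def median_ibi_error_ms(detected, reference):
        """Median absolute difference between paired inter-beat intervals, in ms."""
        return statistics.median(abs(d - r) * 1000 for d, r in zip(detected, reference))

    # Toy inter-beat intervals in seconds; real comparisons span thousands of beats
    reference = [1.00, 0.98, 1.02, 1.01, 0.99, 1.03]  # from a standard monitor
    detected  = [1.02, 0.97, 1.04, 1.00, 0.99, 1.01]  # from the smart speaker
    err = median_ibi_error_ms(detected, reference)
    print(f"median inter-beat-interval error: {err:.0f} ms")
    ```

    Using the median rather than the mean keeps the metric robust to the occasional badly mis-detected beat.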
    Currently this system is set up for spot checks: If a person is concerned about their heart rhythm, they can sit in front of a smart speaker to get a reading. But the research team hopes that future versions could continuously monitor heartbeats while people are asleep, something that could help doctors diagnose conditions such as sleep apnea.
    “If you have a device like this, you can monitor a patient on an extended basis and define patterns that are individualized for the patient. For example, we can figure out when arrhythmias are happening for each specific patient and then develop corresponding care plans that are tailored for when the patients actually need them,” Sridhar said. “This is the future of cardiology. And the beauty of using these kinds of devices is that they are already in people’s homes.”
    This research was funded by the National Science Foundation.

  • A remote, computerized training program eases anxiety in children

    Anxiety levels in the United States are rising sharply and have especially intensified in younger populations. According to the Anxiety and Depression Association of America, anxiety disorders affect 31.9 percent of children ages 13 to 18 years old. Because of the COVID-19 pandemic, children and adolescents have experienced unprecedented interruptions to their daily lives, and these disruptions are expected to precipitate mental illness, including anxiety, depression, and/or stress-related symptoms.
    Traditional anxiety and depression treatments include cognitive behavioral therapy and psychiatric medications, which are somewhat successful in alleviating symptoms in adults but have yielded mixed results in children. Discovering means of reducing childhood anxiety and depression that are both affordable and accessible is therefore paramount.
    Using a computerized and completely remote training program, researchers from Florida Atlantic University’s Charles E. Schmidt College of Science have found a way to alleviate negative emotions in preadolescent children. They examined the relationship between anxiety, inhibitory control, and resting-state electroencephalography (EEG) in a critical age-range for social and emotional development (ages 8 to 12 years old). Inhibitory control is the ability to willfully withhold or suppress a thought, action or feeling. Without it, people would act purely on impulses or on old habits of action and thought.
    Results of the study, published in the journal Applied Neuropsychology: Child, reveal that computerized inhibitory training helps to mitigate negative emotions in preadolescent children. EEG results also provide evidence of frontal alpha asymmetry shifting to the left after children completed an emotional version of the training. At the baseline time point, there was further evidence supporting the link between inhibitory control dysfunction and anxiety/depression: decreased inhibitory control performance predicted higher levels of anxiety and depression, signifying that inhibitory impairments could be a risk factor for the development of these conditions in children.
    Prior research has focused on adults and has relied solely on self-report measures to operationalize anxiety and depressive symptoms. This novel study expands upon research investigating cognitive and neurological mechanisms involved in childhood anxiety and depression. In addition, it includes an objective outcome measure (resting-state EEG) that enables firmer conclusions about training efficacy.
    “In the current social climate of the world, internalizing conditions like anxiety and depression are becoming increasingly common in children and adolescents. Meanwhile, the availability and accessibility of computer and tablet technology also has rapidly increased,” said Nathaniel Shanok, lead author and a recent Ph.D. graduate of FAU’s Department of Psychology, who received an award from the American Psychological Society in 2020 for this research. “Providing computerized cognitive training programs to children can be a highly beneficial use of this technology for improving not only academic performance, but as seen in our study, psychological and emotional functioning during a challenging time period of development.”
    Participants in the study were assigned to four weeks of either an emotional inhibitory control training program, a neutral inhibitory control training program, or a waitlisted control, and were tested using cognitive, emotional and EEG measures. Researchers evaluated the effects of the four-week, 16-session computerized inhibitory control training program using three tasks (go/no-go, flanker, and Stroop). The training program utilized for the study is gFocus from IQ Mindware and was created by Mark Ashton Smith, Ph.D.
    Researchers found that inhibitory control accuracy was significantly and negatively related to anxiety as well as depression. Emotional and neutral training conditions led to significant reductions in anxiety, depression, and negative affect relative to the waitlist group, with the emotional training condition showing the largest reductions in anxiety and negative affect. These two conditions showed comparable improvements in inhibitory control accuracy relative to the waitlist, with greater increases observed in the neutral training condition.
    “Given the predominantly adverse influence of anxiety on social, psychological and cognitive functioning, early prevention, management and quality treatment plans are critical research areas to explore,” said Nancy Aaron Jones, Ph.D., co-author, an associate professor, and director of the FAU WAVES Emotion Laboratory in the Department of Psychology, Charles E. Schmidt College of Science, and a member of the FAU Brain Institute. “Advancements in technology have made it possible to train certain cognitive abilities using child-friendly applications or games that can be easily accessed from a home computer. Devising computerized training programs that target underlying cognitive characteristics related to anxiety is a promising method for attenuating symptoms and risk in children.”
    Anxiety involves strong cognitive and emotional influences, which are both explicit (obsessive thought processes and ruminations) and implicit (negative processing bias and reduced cognitive control). Inhibitory control impairment is believed to be a cognitive mediator of anxiety through both explicit and implicit mechanisms. The explicit theory of generalized anxiety disorder holds that anxiety arises in part from individuals’ inability to recognize unrealistic or overly critical thinking patterns and suppress them.

  • A year after Australia’s wildfires, extinction threatens hundreds of species

    When Isabel Hyman heads out in coming weeks to the wilds of northern New South Wales, she’s worried about what she won’t find. Fifteen years ago, the malacologist — or mollusk scientist — with the Australian Museum made an incredible discovery among the limestone outcrops there: a tiny, 3-millimeter-long snail, with a ribbed, dark golden-brown shell, that was new to science.
    Subsequently named after her husband, Hugh Palethorpe, Palethorpe’s pinwheel snail (Rophodon palethorpei) “is only known from a single location, at the Kunderang Brook limestone outcrops in Werrikimbe National Park,” she says. Now it may become known for a different, more devastating distinction: It is one of hundreds of species that experts fear have been pushed close to, or right over, the precipice of extinction by the wildfires that blazed across more than 10 million hectares of southeastern Australia in the summer of 2019–2020.
    “This location was completely burnt,” says Hyman, who is based in Sydney. “We expect the mortality at this site could be very high and … there is a possibility this species is extinct.”
    A year after the last of the fires were doused, their toll on species is becoming increasingly clear. Flames devoured more than 20 percent of Australia’s entire forest cover, according to a February 2020 analysis in Nature Climate Change. Even if plants and animals survived the flames, their habitats may have been so changed that their survival is at risk (SN: 2/11/20). As a result of the scale of the disaster, experts say that more than 500 species of plants and animals may now be endangered — or even completely gone.
    A wallaby licks its burnt paws after escaping a bushfire near Nana Glen in New South Wales on November 12, 2019. Wolter Peeters/The Sydney Morning Herald via Getty Images
    Australia’s iconic koala became the poster child of the crisis as images of rescuers carrying these singed marsupials out of the flames went global: As many as 60,000 of the nation’s estimated population of 330,000 koalas perished in the fires, ecologists concluded in December in a report for World Wildlife Fund Australia. While there’s no doubt that such charismatic megafauna suffered enormously, the greatest toll is likely to have been in other groups of species, such as invertebrates and plants, which often escape the public’s attention.
    As Kingsley Dixon, an ecologist at Curtin University in Perth told the Associated Press last year: “I don’t think we’ve seen a single event in Australia that has destroyed so much habitat and pushed so many creatures to the very brink of extinction.”

    Koala charisma
    Even before the fires, many vertebrate species were already on downward trends, says John Woinarski, an ecologist at Charles Darwin University in Darwin. The blazes have “exacerbated the threats that were driving the declines,” he says.
    For example, fluffy arboreal marsupials called greater gliders (Petauroides volans) had already experienced a 50 percent population decline in recent decades. The fires then burned a third of their remaining habitat along Australia’s eastern coastline. An ongoing assessment may lead to the gliders being recategorized from vulnerable to endangered.
    Overall, 49 vertebrates that previously were not endangered now qualify for being listed as threatened under Australia’s guidelines for that designation, researchers reported in July in Nature Ecology and Evolution. That shift alone would increase the tally of nationally protected nonmarine vertebrate species by about 15 percent, from 324 to 373.
    Another 21 already threatened vertebrates had more than 30 percent of their ranges burned, and some may now qualify for reassessment to higher categories of threat, the authors found. One species that may need to be recategorized is the koala (Phascolarctos cinereus), with the hardest-hit state populations under consideration for upgrading from vulnerable to endangered.
    A koala named Paul recovers from his burns at an ICU in November 2019 after being rescued by volunteers following weeks of bushfires across New South Wales and Queensland. Edwards/Getty Images
    Besides the impact on koalas, the WWF Australia report suggests that as many as 3 billion individual mammals, birds, reptiles and frogs died or were displaced during the crisis. Though those figures are astounding, the impacts on lesser-studied groups such as invertebrates and plants may have been even greater.
    “Many of those have much smaller ranges [than vertebrates], which means they are going to be even more impacted when a big fire goes through,” says James Watson, a conservation scientist at the University of Queensland in Brisbane and an author of the Nature Ecology and Evolution paper on vertebrates. “I am willing to bet that there’s many species … that may disappear forever.”
    Invertebrate impact
    In February, more than 100 biologists convened the first of several online workshops to assess whether 234 Australian invertebrates now need to be added to the International Union for Conservation of Nature’s Red List — a global who’s who of threatened species.
    Snails, similar to many invertebrates, are particularly susceptible to wildfires, as they are unable to outrun flames and can’t survive intense heat, Hyman notes. Many also have small ranges that were completely incinerated, leaving no survivors that can recolonize the burned area.
    “A snail can’t do much to escape,” she says. “You could expect more than 90 percent mortality in a high-intensity bushfire.” In October, Hyman’s team published one of the first papers quantifying the impacts on invertebrates in New South Wales in the Technical Reports of the Australian Museum Online.
    Palethorpe’s pinwheel snail (Rophodon palethorpei) has not yet turned up in searches following the wildfires, but other snail species did survive. Vince Railton, Queensland Museum
    Their surveys showed that 29 species in the state — including dung beetles, freshwater crayfish, flies, snails and spiders — had their entire ranges burned. Another 46 species had at least half their known habitat within the fire zones. These 75 species were among the 234 under consideration for adding to the IUCN Red List during the biologists’ first online workshop.
    “We’ve gathered together 230-odd species that are believed to now be of concern. These include a range of different taxa from land snails to millipedes to arachnids to insects, and this 230 is growing rapidly,” says Jess Marsh, an arachnologist at Charles Darwin University who was one of the conveners of the workshop. “I expect it will massively increase.”
    Some of the spiders she studies were the first to be added to that list. She’s already spent several months on South Australia’s Kangaroo Island hunting without luck for the Kangaroo Island assassin spider (Zephyrarchaea austini). Dependent on leaf litter suspended in the understory, and restricted to just a few locations that were razed in early 2020, she suspects that the species may be extinct.
    Spiders on Kangaroo Island such as this assassin spider (Zephyrarchaea austini) may now be extinct after most of their habitat was razed in early 2020. M.G. Rix and M.S. Harvey/ZooKeys 2012
    “There’s no understory vegetation left, let alone any leaf litter suspended in it, so that species is really hanging in the balance,” says Marsh.
    Generally, the species being considered for recognition as endangered had more than 50 percent of their ranges burned, lived in flammable parts of the habitat and have little ability to disperse to other areas. More than 150 of the 234 species being urgently assessed had their entire range burned. And it’s not just the flames themselves that are problematic; so is the reshaped environment following fires. Millipedes, for example, are very vulnerable not only to fire but also to drying out in the reduced shade and shelter of the post-fire environment.
    “A lot of invertebrates are very susceptible to desiccation, and need cover and humidity to survive a hot summer, which are obviously lacking following the fire,” Marsh says. “Taking into account all of the threats … we could be looking at significant numbers going extinct.”
    Rooted in place
    Lost vegetation hasn’t just put animals in danger. Many plants themselves may also be at risk, though experts have yet to compile an official list.
    Rachael Gallagher, a plant ecologist at Macquarie University in Sydney, has been prioritizing endemic plant species — those found nowhere else on Earth — that are in most urgent need of conservation for the Australian government. Perhaps surprisingly, she’s particularly worried about some trees that actually depend on fire to survive. Eucalypts known as alpine ash (Eucalyptus delegatensis) and mountain ash (E. regnans), for instance, are typically killed by fire and then regenerate from surviving seeds in the aftermath. Australia has many trees that must complete their entire life cycle from germination through to reproductively mature adult before the next major bushfire passes through (SN: 2/11/20). For some species, this may take 15 to 20 years.
    Some trees in Australia, such as this mountain ash (Eucalyptus regnans), depend on fire for their lifecycle, but recent wildfires may have been too much too soon. station96/iStock/Getty Images Plus
    The problem now is that climate change has increased the frequency of fires to the degree that many of these plants are unable to reach adulthood and set seed before the next fire passes through, meaning they may be lost from these ecosystems (SN: 3/4/20).
    The fires burned 25–100 percent of the ranges of 257 plant species for which “the historical intervals between fire events across their range are likely to be too short to allow them to effectively regenerate,” Gallagher says. Although these species have some degree of fire tolerance, they are at “increased risk of extinction.” They include shrubs and trees such as the granite boronia (Boronia granitica), Forrester’s bottlebrush (Callistemon forresterae), dwarf cypress pine (Callitris oblonga) and the Wolgan snow gum (Eucalyptus gregsoniana).
    Found, not lost
    Nevertheless, as researchers head out into the field to assess what’s lost, what they are sometimes finding are glimmers of hope. “Australian plants are remarkably resilient and there’s been regeneration in places where nobody thought there would be,” Gallagher says.
    One species that survived against all the odds is the Gibraltar Range waratah (Telopea aspera), a drought-resistant shrub with leathery leaves and bright red flowers. “This species has a very small range, being specialized to granite outcrops in one mountain range, which was burnt during the fires,” she says. “However, it has been noted as resprouting after the fires by park rangers and, in the absence of another fire in coming years, is likely to be able to recover.”
    Several animal species that were thought to be in grave peril following the fires that burned nearly half of the 4,400-square-kilometer Kangaroo Island have survived better than expected too (SN: 1/13/20). In the particularly badly burned reserves of the western end of the island, tiny marsupial carnivores called Kangaroo Island dunnarts (Sminthopsis aitkeni) are frequently appearing on camera traps. Swiftly erected predator-exclusion fences now protect survivors from feral cats.
    Tiny marsupials known as Kangaroo Island dunnarts (Sminthopsis aitkeni) have fared much better than other animals, appearing frequently on camera traps. Australian Wildlife Conservancy
    Similarly, large flocks of the glossy black-cockatoo (Calyptorhynchus lathami) have adapted by moving to unburned areas with food trees, says Karleah Berris of Natural Resources Kangaroo Island, who heads the crew that manages the endangered birds. Better news yet, a surprising number of birds bred and fledged young in mid-2020. “The important thing now is to protect what is left from fire until the burnt areas regenerate,” she says. “But I think, at present, all signs are that they are coping.”
    Hyman says that, hearteningly, her team found handfuls of survivors of some snail species during several surveys in New South Wales in late 2020. The snails turned up in small patches of unburned habitat, sometimes at the bottom of gullies or in deep leaf litter around the bases of large trees. And that gives her hope that other snail species may have held on in other, larger unburned patches with greater numbers of survivors.
    “But the question then becomes, what sort of recovery can they make from that?” she says. “Whether they can recover and breed up and start to move back into surviving areas again perhaps depends on how dry the weather is in coming years and if there are more fires.”
    She’s still hoping that a handful of Palethorpe’s pinwheel snails may have clung on against all the odds. “My husband is on tenterhooks wondering if his snail is still there or not,” she says.

  • in

    Assessing regulatory fairness through machine learning

    The perils of machine learning — using computers to identify and analyze data patterns, such as in facial recognition software — have made headlines lately. Yet the technology also holds promise to help enforce federal regulations, including those related to the environment, in a fair, transparent way, according to a new study by Stanford researchers.
    The analysis, published this week in the proceedings of the Association for Computing Machinery Conference on Fairness, Accountability and Transparency, evaluates machine learning techniques designed to support a U.S. Environmental Protection Agency (EPA) initiative to reduce severe violations of the Clean Water Act. It reveals how two key elements of so-called algorithmic design influence which communities are targeted for compliance efforts and, consequently, who bears the burden of pollution violations. The analysis — funded through the Stanford Woods Institute for the Environment’s Realizing Environmental Innovation Program — is timely given recent executive actions calling for renewed focus on environmental justice.
    “Machine learning is being used to help manage an overwhelming number of things that federal agencies are tasked to do — as a way to help increase efficiency,” said study co-principal investigator Daniel Ho, the William Benjamin Scott and Luna M. Scott Professor of Law at Stanford Law School. “Yet what we also show is that simply designing a machine learning-based system can have an additional benefit.”
    Pervasive noncompliance
    The Clean Water Act aims to limit pollution from entities that discharge directly into waterways, but in any given year, nearly 30 percent of such facilities self-report persistent or severe violations of their permits. In an effort to halve this type of noncompliance by 2022, EPA has been exploring the use of machine learning to target compliance resources.
    To test this approach, EPA reached out to the academic community. Among its chosen partners: Stanford’s Regulation, Evaluation and Governance Lab (RegLab), an interdisciplinary team of legal experts, data scientists, social scientists and engineers that Ho heads. The group has done ongoing work with federal and state agencies to aid environmental compliance.

    In the new study, RegLab researchers examined how permits with similar functions, such as wastewater treatment plants, were classified by each state in ways that would affect their inclusion in the EPA national compliance initiative. Using machine learning models, they also sifted through hundreds of millions of observations — an impossible task with conventional approaches — from EPA databases on historical discharge volumes, compliance history and permit-level variables to predict the likelihood of future severe violations and the amount of pollution each facility would likely generate. They then evaluated demographic data, such as household income and minority population, for the areas where each model indicated the riskiest facilities were located.
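    The pipeline the RegLab researchers describe — scoring each facility for violation risk, ranking facilities, then examining the demographics of the areas where the riskiest ones sit — can be sketched in miniature. The facility records, feature weights, and scoring function below are invented for illustration; the team's actual models were trained on hundreds of millions of EPA records, not a toy linear score.

    ```python
    # Illustrative sketch of a risk-scoring pipeline: score each permitted
    # facility for violation risk, rank them, and summarize the demographics
    # of the areas where the highest-risk facilities sit. All records and
    # weights here are invented for illustration.

    # Hypothetical facility records: (permit_id, past_violations,
    # avg_discharge_volume, median_household_income_of_area)
    FACILITIES = [
        ("WWTP-001", 4, 9.1, 38_000),
        ("WWTP-002", 0, 1.2, 71_000),
        ("WWTP-003", 2, 5.5, 44_000),
        ("WWTP-004", 1, 2.0, 65_000),
    ]

    def risk_score(past_violations, discharge_volume):
        """Toy linear score: more past violations and higher discharge
        volume imply higher predicted risk of a future severe violation."""
        return 0.6 * past_violations + 0.4 * discharge_volume

    def riskiest(facilities, k):
        """Return the k facilities with the highest predicted risk."""
        ranked = sorted(
            facilities,
            key=lambda f: risk_score(f[1], f[2]),
            reverse=True,
        )
        return ranked[:k]

    top = riskiest(FACILITIES, 2)
    # Demographic summary of where enforcement attention would land:
    mean_income = sum(f[3] for f in top) / len(top)
    ```

    The last step mirrors the study's demographic evaluation: once a model has picked the "riskiest" facilities, one can ask what kinds of communities those facilities are located in.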
    Devil in the details
    The team’s algorithmic process helped surface two key ways that the design of the EPA compliance initiative could influence who receives resources. These differences centered on which types of permits were included or excluded, as well as how the goal itself was articulated.
    In the process of figuring out how to achieve the compliance goal, the researchers first had to translate the overall objective into a series of concrete instructions — an algorithm — needed to fulfill it. As they were assessing which facilities to run predictions on, they noticed an important embedded decision. While the EPA initiative expands covered permits by at least sevenfold relative to prior efforts, it limits its scope to “individual permits,” which cover a specific discharging entity, such as a single wastewater treatment plant. Left out are “general permits,” intended to cover multiple dischargers engaged in similar activities and with similar types of effluent. A related complication: Most permitting and monitoring authority is vested in state environmental agencies. As a result, functionally similar facilities may be included or excluded from the federal initiative based on how states implement their pollution permitting process.
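    The scoping decision described above can be made concrete with a small sketch. Filtering a permit list down to "individual" permits drops any facility a state chose to cover under a "general" permit, even when its function is identical; the records below are hypothetical.

    ```python
    # Illustrative sketch of the scope restriction: keeping only
    # individually permitted facilities excludes functionally similar
    # facilities that a state happened to cover under a general permit.
    # Permit records are invented for illustration.

    PERMITS = [
        {"id": "ST-0101", "type": "individual", "function": "wastewater"},
        {"id": "ST-0102", "type": "general",    "function": "wastewater"},
        {"id": "ST-0103", "type": "individual", "function": "stormwater"},
    ]

    def in_scope(permits):
        """Keep only individually permitted facilities, mirroring the
        initiative's scope restriction."""
        return [p for p in permits if p["type"] == "individual"]

    covered = in_scope(PERMITS)
    # Two wastewater facilities with the same function land on opposite
    # sides of the scope line purely because of permit type.
    ```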
    “The impact of this environmental federalism makes partnership with states critical to achieving these larger goals in an equitable way,” said co-author Reid Whitaker, a RegLab affiliate and 2020 graduate of Stanford Law School now pursuing a PhD in the Jurisprudence and Social Policy Program at the University of California, Berkeley.
    Second, the current EPA initiative focuses on reducing rates of noncompliance. While there are good reasons for this policy goal, the researchers’ algorithmic design process made clear that favoring this over pollution discharges that exceed the permitted limit would have a powerful unintended effect. Namely, it would shift enforcement resources away from the most severe violators, which are more likely to be in densely populated minority communities, and toward smaller facilities in more rural, predominantly white communities, according to the researchers.
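    The effect of the choice of objective can be seen in a two-facility toy example: ranking by probability of any noncompliance and ranking by expected pollution above the permitted limit can point enforcement at different facilities. The probabilities and volumes below are invented for illustration.

    ```python
    # Illustrative comparison of two targeting objectives on the same toy
    # facilities: ranking by probability of noncompliance vs. ranking by
    # expected discharge in excess of the permitted limit. All numbers
    # are invented for illustration.

    # (facility, probability_of_violation, expected_excess_discharge)
    FACILITIES = [
        ("small rural plant", 0.9, 10.0),   # violates often, small volumes
        ("large urban plant", 0.4, 500.0),  # violates rarely, huge volumes
    ]

    # Objective 1: minimize the noncompliance rate -> chase likely violators.
    by_noncompliance = max(FACILITIES, key=lambda f: f[1])

    # Objective 2: minimize pollution -> chase expected excess discharge,
    # weighting volume by the probability it occurs.
    by_excess_pollution = max(FACILITIES, key=lambda f: f[1] * f[2])
    ```

    Under the first objective the small, frequently violating plant tops the list; under the second, the rarely violating but high-volume plant does — the shift in enforcement burden the researchers describe.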
    “Breaking down the big idea of the compliance initiative into smaller chunks that a computer could understand forced a conversation about making implicit decisions explicit,” said study lead author Elinor Benami, a faculty affiliate at the RegLab and assistant professor of agricultural and applied economics at Virginia Tech. “Careful algorithmic design can help regulators transparently identify how objectives translate to implementation while using these techniques to address persistent capacity constraints.” More

  • in

    Someone to watch over AI and keep it honest – and it's not the public!

    The public doesn’t need to know how artificial intelligence works to trust it. People just need to know that someone with the necessary skill set is examining AI and has the authority to mete out sanctions if it causes, or is likely to cause, harm.
    Dr Bran Knowles, a senior lecturer in data science at Lancaster University, says: “I’m certain that the public are incapable of determining the trustworthiness of individual AIs… but we don’t need them to do this. It’s not their responsibility to keep AI honest.”
    On March 8, Dr Knowles presents a research paper, ‘The Sanction of Authority: Promoting Public Trust in AI’, at the ACM Conference on Fairness, Accountability and Transparency (ACM FAccT).
    The paper is co-authored by John T. Richards, of IBM’s T.J. Watson Research Center, Yorktown Heights, New York.
    The general public are, the paper notes, often distrustful of AI, which stems both from the way AI has been portrayed over the years and from a growing awareness that there is little meaningful oversight of it.
    The authors argue that greater transparency and more accessible explanations of how AI systems work, perceived to be a means of increasing trust, do not address the public’s concerns.

    A ‘regulatory ecosystem’, they say, is the only way that AI will be meaningfully accountable to the public, earning their trust.
    “The public do not routinely concern themselves with the trustworthiness of food, aviation, and pharmaceuticals because they trust there is a system which regulates these things and punishes any breach of safety protocols,” says Dr Richards.
    And, adds Dr Knowles: “Rather than asking that the public gain skills to make informed decisions about which AIs are worthy of their trust, the public needs the same guarantees that any AI they might encounter is not going to cause them harm.”
    She stresses the critical role of AI documentation in enabling this trustworthy regulatory ecosystem. As an example, the paper discusses work by IBM on AI Factsheets, documentation designed to capture key facts regarding an AI’s development and testing.
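    The paper presents factsheets as structured records of key facts that specialists can audit. The sketch below shows the general idea of such documentation as machine-readable data; the field names and values are hypothetical illustrations, not IBM's actual FactSheets schema.

    ```python
    # Hypothetical sketch of the kind of structured facts AI documentation
    # might record for auditors and regulators. Field names and values are
    # invented for illustration and are not IBM's actual FactSheets schema.
    import json

    factsheet = {
        "model_name": "loan-approval-classifier",
        "intended_use": "Pre-screening consumer loan applications",
        "training_data": "Internal applications, 2015-2019, anonymized",
        "evaluation": {
            "accuracy": 0.91,
            "false_positive_rate_by_group": {"group_a": 0.08, "group_b": 0.12},
        },
        "known_limitations": ["Not validated for business loans"],
    }

    def audit_ready(sheet):
        """A reviewer-side check: documentation only helps a regulator
        if the key sections are actually filled in."""
        required = {"model_name", "intended_use", "training_data", "evaluation"}
        return required <= set(sheet)

    # Serialized form that could be filed with an auditor or regulator.
    serialized = json.dumps(factsheet, indent=2)
    ```

    The point of the `audit_ready` check is the paper's: such records empower skilled specialists to assess trustworthiness, rather than asking the public to read them.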
    But, while such documentation can provide information needed by internal auditors and external regulators to assess compliance with emerging frameworks for trustworthy AI, Dr Knowles cautions against relying on it to directly foster public trust.
    “If we fail to recognise that the burden to oversee trustworthiness of AI must lie with highly skilled regulators, then there’s a good chance that the future of AI documentation is yet another terms and conditions-style consent mechanism — something no one really reads or understands,” she says.
    The paper calls for AI documentation to be properly understood as a means to empower specialists to assess trustworthiness.
    “AI has material consequences in our world which affect real people; and we need genuine accountability to ensure that the AI that pervades our world is helping to make that world better,” says Dr Knowles.
    ACM FAccT is a computer science conference that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.

    Story Source:
    Materials provided by Lancaster University. Note: Content may be edited for style and length.