More stories

  •

    As a population gets older, automation accelerates

    You might think robots and other forms of workplace automation gain traction due to intrinsic advances in technology — that innovations naturally find their way into the economy. But a study co-authored by an MIT professor tells a different story: Robots are more widely adopted where populations become notably older, filling the gaps in an aging industrial work force.
    “Demographic change — aging — is one of the most important factors leading to the adoption of robotics and other automation technologies,” says Daron Acemoglu, an MIT economist and co-author of a new paper detailing the results of the study.
    The study finds that when it comes to the adoption of robots, aging alone accounts for 35 percent of the variation among countries. Within the U.S., the research shows the same pattern: Metro areas where the population is getting older at a faster rate are the places where industry invests more in robots.
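    To make that statistic concrete: in regression terms, "accounts for 35 percent of the variation" means an R-squared of 0.35 for a model with aging as the sole regressor. The minimal numpy sketch below uses synthetic data (the variables and coefficients are invented, not the study's dataset) to show how that share is computed:

    ```python
    # Hypothetical illustration, not the authors' data or model: what it means
    # for a single factor to "account for 35 percent of the variation".
    import numpy as np

    rng = np.random.default_rng(0)
    n = 60                                   # pretend: 60 countries
    aging = rng.normal(0.0, 1.0, n)          # synthetic aging measure
    noise = rng.normal(0.0, 1.0, n)
    adoption = 0.7 * aging + noise           # synthetic robot-adoption measure

    # Ordinary least squares with an intercept
    X = np.column_stack([np.ones(n), aging])
    beta, *_ = np.linalg.lstsq(X, adoption, rcond=None)
    resid = adoption - X @ beta
    r_squared = 1 - resid.var() / adoption.var()
    print(f"share of variation explained by aging alone: {r_squared:.2f}")
    ```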
    “We provide a lot of evidence to bolster the case that this is a causal relationship, and it is driven by precisely the industries that are most affected by aging and have opportunities for automating work,” Acemoglu adds.
    The paper, “Demographics and Automation,” has been published online by The Review of Economic Studies, and will be appearing in a forthcoming print edition of the journal. The authors are Acemoglu, an Institute Professor at MIT, and Pascual Restrepo PhD ’16, an assistant professor of economics at Boston University.
    An “amazing frontier,” but driven by labor shortages
    The current study is the latest in a series of papers Acemoglu and Restrepo have published about automation, robots, and the workforce. They have previously quantified job displacement in the U.S. due to robots, looked at the firm-level effects of robot use, and identified the late 1980s as a key moment when automation started replacing more jobs than it was creating.

  •

    AI system identifies buildings damaged by wildfire

    People around the globe have suffered the nerve-wracking anxiety of waiting weeks or months to find out whether their homes have been damaged by wildfires that burn with increasing intensity. Now, once the smoke has cleared enough for aerial photography, researchers have found a way to identify building damage within minutes.
    Through a system they call DamageMap, a team at Stanford University and the California Polytechnic State University (Cal Poly) has brought an artificial intelligence approach to building assessment: Instead of comparing before-and-after photos, they’ve trained a program using machine learning to rely solely on post-fire images. The findings appear in the International Journal of Disaster Risk Reduction.
    “We wanted to automate the process and make it much faster for first responders or even for citizens that might want to know what happened to their house after a wildfire,” said lead study author Marios Galanis, a graduate student in the Civil and Environmental Engineering Department at Stanford’s School of Engineering. “Our model results are on par with human accuracy.”
    The current method of assessing damage involves people going door-to-door to check every building. While DamageMap is not intended to replace in-person damage classification, it could serve as a scalable supplementary tool, offering immediate results and the exact locations of the buildings identified. The researchers tested it on a variety of satellite, aerial and drone photography, achieving at least 92 percent accuracy.
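    For a rough sense of what a post-fire-only classifier looks like in practice, here is a minimal PyTorch sketch, not the published DamageMap code: a standard CNN backbone (ResNet-18 is our assumption) set up as a binary damaged/intact classifier on post-fire building crops alone, with the batch stood in by random tensors.

    ```python
    # Sketch only: post-fire-only damage classification, no pre-fire reference.
    import torch
    import torch.nn as nn
    from torchvision import models

    model = models.resnet18(weights=None)          # backbone choice is an assumption
    model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: damaged, intact

    # Stand-in batch: 8 post-fire RGB crops, 224x224 (real inputs would be
    # building footprints cut from aerial, drone, or satellite imagery).
    images = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, 2, (8,))

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    logits = model(images)          # one forward/backward training step
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    print(f"loss: {loss.item():.3f}")
    ```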
    “With this application, you could probably scan the whole town of Paradise in a few hours,” said senior author G. Andrew Fricker, an assistant professor at Cal Poly, referencing the Northern California town destroyed by the 2018 Camp Fire. “I hope this can bring more information to the decision-making process for firefighters and emergency responders, and also assist fire victims by getting information to help them file insurance claims and get their lives back on track.”
    A different approach
    Most computational systems cannot efficiently classify building damage because they compare post-disaster photos with pre-disaster images, which must come from the same satellite under the same camera angle and lighting conditions and can be expensive or impossible to obtain. Current hardware is not advanced enough to record high-resolution imagery daily, so such systems cannot rely on consistent before-and-after photos, according to the researchers.

  •

    A statistical fix for archaeology's dating problem

    Archaeologists have long had a dating problem. The radiocarbon analysis typically used to reconstruct past human demographic changes relies on a method easily skewed by radiocarbon calibration curves and measurement uncertainty. And there’s never been a statistical fix that works — until now.
    “Nobody has systematically explored the problem, or shown how you can statistically deal with it,” says Santa Fe Institute archaeologist Michael Price, lead author on a paper in the Journal of Archaeological Science about a new method he developed for summarizing sets of radiocarbon dates. “It’s really exciting how this work came together. We identified a fundamental problem and fixed it.”
    In recent decades, archaeologists have increasingly relied on sets of radiocarbon dates to reconstruct past population size through an approach called “dates as data.” The core assumption is that the number of radiocarbon samples from a given period is proportional to the region’s population size at that time. Archaeologists have traditionally used “summed probability densities,” or SPDs, to summarize these sets of radiocarbon dates. “But there are a lot of inherent issues with SPDs,” says Julie Hoggarth, Baylor University archaeologist and a co-author on the paper.
    Radiocarbon dating measures the decay of carbon-14 in organic matter. But the amount of carbon-14 in the atmosphere fluctuates through time; it’s not a constant baseline. So researchers create radiocarbon calibration curves that map the carbon-14 values to dates. Yet a single carbon-14 value can correspond to different dates — a problem known as “equifinality,” which can naturally bias the SPD curves. “That’s been a major issue,” and a hurdle for demographic analyses, says Hoggarth. “How do you know that the change you’re looking at is an actual change in population size, and it isn’t a change in the shape of the calibration curve?”
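    A toy example may help make equifinality and SPDs concrete. The sketch below uses an invented, wiggly calibration curve (not the real IntCal curve): a single radiocarbon age can map to several calendar dates, and summing the calibrated densities of many samples yields the SPD that the “dates as data” approach relies on.

    ```python
    # Toy illustration of calibration, equifinality, and SPDs; all numbers invented.
    import numpy as np

    years = np.arange(1, 2001)                            # calendar years on a grid
    # Invented non-monotonic "calibration curve": c14 age vs calendar year.
    curve = 2100 - 0.9 * years + 40 * np.sin(years / 30)

    def calibrate(c14_age, sigma=30.0):
        """Normalized probability over calendar years for one measurement."""
        like = np.exp(-0.5 * ((c14_age - curve) / sigma) ** 2)
        return like / like.sum()

    # Equifinality: one radiocarbon age crosses the wiggly curve at several
    # calendar years, so its calibrated density is multimodal.
    density = calibrate(1200.0)
    print("high-density years:", years[density > 0.9 * density.max()])

    # Summed probability density over a set of measurements ("dates as data").
    samples = [1500.0, 1350.0, 1200.0, 1180.0, 900.0]
    spd = sum(calibrate(m) for m in samples) / len(samples)
    print("SPD peaks at year:", years[np.argmax(spd)])
    ```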
    When she discussed the problem with Price several years ago, he told her he wasn’t a fan of SPDs, either. She asked what archaeologists should do instead. “Essentially, he said, ‘Well, there is no alternative.'”
    That realization led to a years-long quest. Price has developed an approach to estimating prehistoric populations that uses Bayesian reasoning and a flexible probability model that allows researchers to overcome the problem of equifinality. The approach also allows them to combine additional archaeological information with radiocarbon analyses to get a more accurate population estimate. He and his team applied the approach to existing radiocarbon dates from the Maya city of Tikal, which has extensive prior archaeological research. “It serves as a really good test case,” says Hoggarth, a Maya scholar. For a long time, archaeologists debated two demographic reconstructions: Tikal’s population spiked in the early Classic period and then plateaued, or it spiked in the late Classic period. When the team applied the new Bayesian algorithm, “it showed a really steep population increase associated with the late Classic,” she says, “so that was really wonderful confirmation for us.”
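    Purely as a schematic of the Bayesian logic just described, and not the authors' actual model, the sketch below (with made-up radiocarbon ages and the same invented calibration curve as above) treats each candidate population curve as a prior over calendar years and scores it by the marginal likelihood of the measurements:

    ```python
    # Schematic Bayesian comparison of two toy demographic reconstructions.
    import numpy as np

    years = np.arange(1, 2001)
    curve = 2100 - 0.9 * years + 40 * np.sin(years / 30)   # invented calibration curve

    def meas_like(c14_age, sigma=30.0):
        """Likelihood of a measured c14 age at each calendar year."""
        return np.exp(-0.5 * ((c14_age - curve) / sigma) ** 2)

    def log_marginal(samples, pop_curve):
        prior = pop_curve / pop_curve.sum()          # population curve as prior
        return sum(np.log((prior * meas_like(m)).sum()) for m in samples)

    early_spike = np.exp(-0.5 * ((years - 600) / 150) ** 2)   # toy models
    late_spike = np.exp(-0.5 * ((years - 1400) / 150) ** 2)

    samples = [950.0, 900.0, 880.0, 860.0, 800.0]   # made-up radiocarbon ages
    print("log evidence, early spike:", round(log_marginal(samples, early_spike), 1))
    print("log evidence, late spike: ", round(log_marginal(samples, late_spike), 1))
    ```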
    The authors produced an open-source package that implements the new approach, and website links and code are included in their paper. “The reason I’m excited for this,” Price says, “is that it’s pointing out a mistake that matters, fixing it, and laying the groundwork for future work.”
    This paper is just the first step. Next, through “data fusion,” the team will add ancient DNA and other data to radiocarbon dates for even more reliable demographic reconstructions. “That’s the long-term plan,” Price says. And it could help resolve a second issue with the dates as data approach: a “bias problem” if and when radiocarbon dates are skewed toward a particular time period, leading to inaccurate analyses.
    Story Source:
    Materials provided by Santa Fe Institute.

  •

    Physicists make square droplets and liquid lattices

    When two substances are brought together, they will eventually settle into a steady state called thermodynamic equilibrium; in everyday life, we see examples of this when oil floats on top of water and when milk mixes uniformly into coffee. Researchers at Aalto University in Finland wanted to disrupt this sort of state to see what happens — and whether they can control the outcome.
    ‘Things in equilibrium tend to be quite boring,’ says Professor Jaakko Timonen, whose research group carried out new work published in Science Advances on 15 September. ‘It’s fascinating to drive systems out of equilibrium and see if the non-equilibrium structures can be controlled or be useful. Biological life itself is a good example of truly complex behavior in a bunch of molecules that are out of thermodynamic equilibrium.’
    In their work, the team used combinations of oils with different dielectric constants and conductivities. They then subjected the liquids to an electric field.
    ‘When we turn on an electric field over the mixture, electrical charge accumulates at the interface between the oils. This charge density shears the interface out of thermodynamic equilibrium and into interesting formations,’ explains Dr Nikos Kyriakopoulos, one of the authors of the paper. As well as being disrupted by the electric field, the liquids were confined into a thin, nearly two-dimensional sheet. This combination led to the oils reshaping into various completely unexpected droplets and patterns.
    The droplets in the experiment could be made into squares and hexagons with straight sides, which is almost impossible in nature, where small bubbles and droplets tend to form spheres. The two liquids could also be made to form interconnected lattices: grid patterns that occur regularly in solid materials but are unheard of in liquid mixtures. The liquids could even be coaxed into forming a torus, or donut shape, which remained stable and held its shape while the field was applied — unlike in nature, where liquids have a strong tendency to collapse inward and fill the hole at the centre. The liquids could also form filaments that roll and rotate around an axis.
    ‘All these strange shapes are caused and sustained by the fact that they are prevented from collapsing back into equilibrium by the motion of the electrical charges building up at the interface,’ says Geet Raju, the first author of the paper.
    One exciting result of this work is the ability to create temporary structures of controlled, well-defined size that can be switched on and off with voltage, which the researchers are interested in exploring further for voltage-controlled optical devices. Another potential outcome is the ability to create interacting populations of rolling microfilaments and microdroplets that, at some elementary level, mimic the dynamics and collective behaviour of microorganisms such as bacteria and microalgae, which propel themselves using completely different mechanisms.
    Story Source:
    Materials provided by Aalto University.

  •

    Using artificial intelligence to predict COVID patients' oxygen needs

    Addenbrooke’s Hospital in Cambridge, along with 20 other hospitals from across the world and the healthcare technology company NVIDIA, has used artificial intelligence (AI) to predict Covid patients’ oxygen needs on a global scale.
    The research was sparked by the pandemic and set out to build an AI tool to predict how much extra oxygen a Covid-19 patient may need in the first days of hospital care, using data from across four continents.
    The technique, known as federated learning, used an algorithm to analyse chest x-rays and electronic health data from hospital patients with Covid symptoms.
    To maintain strict patient confidentiality, the patient data was fully anonymised and an algorithm was sent to each hospital so no data was shared or left its location.
    Once the algorithm had ‘learned’ from the data, the analysis was brought together to build an AI tool which could predict the oxygen needs of hospital Covid patients anywhere in the world.
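    The following is a minimal sketch of federated averaging, the generic technique behind this kind of setup (EXAM's actual architecture and training recipe are not reproduced here): each simulated 'hospital' runs a few local training steps on data that never leaves it, and a central server averages only the resulting model weights.

    ```python
    # Minimal federated-averaging sketch with synthetic "hospital" data.
    import numpy as np

    rng = np.random.default_rng(1)

    def local_update(weights, X, y, lr=0.1, steps=20):
        """A few steps of logistic-regression SGD on one site's private data."""
        w = weights.copy()
        for _ in range(steps):
            p = 1 / (1 + np.exp(-X @ w))       # predicted probability
            w -= lr * X.T @ (p - y) / len(y)   # gradient step
        return w

    # Three synthetic sites, each with private (X, y) that never leaves it.
    sites = []
    for _ in range(3):
        X = rng.normal(size=(100, 5))
        y = (X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5]) + rng.normal(size=100) > 0).astype(float)
        sites.append((X, y))

    global_w = np.zeros(5)
    for round_ in range(10):                       # federated rounds
        local_ws = [local_update(global_w, X, y) for X, y in sites]
        global_w = np.mean(local_ws, axis=0)       # server averages weights only

    print("aggregated model weights:", np.round(global_w, 2))
    ```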
    Published today in Nature Medicine, the study, dubbed EXAM (for EMR CXR AI Model), is one of the largest and most diverse clinical federated learning studies to date.

  •

    Scientists develop 'optimal strategies' computer model that could significantly reduce future COVID-19 infections and deaths

    A team of scientists from Nanyang Technological University, Singapore (NTU Singapore) has developed a predictive computer model that, when tested on real pandemic data, proposed strategies that would have reduced the rate of both COVID-19 infections and deaths by an average of 72 per cent, based on a sample from four countries.
    The model, called NSGA-II, could be used to alert local governments in advance to possible surges in COVID-19 infections and deaths, giving them time to put relevant countermeasures in place more rapidly.
    Through the testing of NSGA-II in four Asian countries using data available from 1 January 2020 to 31 December 2020, the team demonstrated that it could have helped reduce the number of COVID-19 infections and deaths by up to 76 per cent in Japan, 65 per cent in South Korea, 59 per cent in Pakistan, and 89 per cent in Nepal.
    The computer model achieved this result by recommending timely, country-specific advice on the optimal application and duration of COVID-19 interventions, such as home quarantines, social distancing measures, and personal protective measures, that would help mitigate the pandemic's negative impact. A sketch of the algorithm's core idea follows.
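    NSGA-II is a widely used multi-objective genetic algorithm, and its core idea is ranking candidate solutions by Pareto dominance. The sketch below shows only that ranking step on an invented toy trade-off between infections and intervention cost; the full algorithm adds crowding distance, selection, crossover, and mutation, and the team's real epidemic objectives are far richer.

    ```python
    # The non-dominated-sorting step at the heart of NSGA-II, on fake data.
    import numpy as np

    rng = np.random.default_rng(2)
    stringency = rng.uniform(0, 1, 50)    # candidate plans: 0 = none, 1 = full lockdown

    # Invented objectives: harsher plans cost more but (noisily) cut infections.
    infections = np.exp(-3 * stringency) + rng.normal(0, 0.02, 50)
    cost = stringency + rng.normal(0, 0.02, 50)
    objs = np.column_stack([infections, cost])     # minimize both

    def pareto_front(objs):
        """Indices of solutions not dominated by any other (both objectives minimized)."""
        front = []
        for i, a in enumerate(objs):
            dominated = any(np.all(b <= a) and np.any(b < a)
                            for j, b in enumerate(objs) if j != i)
            if not dominated:
                front.append(i)
        return front

    print("Pareto-optimal plans (stringency, infections, cost):")
    for i in sorted(pareto_front(objs), key=lambda i: stringency[i]):
        print(f"  {stringency[i]:.2f}  {infections[i]:.3f}  {cost[i]:.3f}")
    ```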
    The team also showed that NSGA-II could predict daily increases in confirmed COVID-19 cases and deaths with high accuracy, at a 95 per cent confidence level, when compared with the actual figures recorded in the four countries over the past year.
    Harnessing the power of machine learning, the research team developed NSGA-II by feeding it large amounts of data on COVID-19 mortalities and infections worldwide that were available for the whole of 2020, helping it learn the dynamics of the pandemic. The research was reported in the peer-reviewed scientific journal Sustainable Cities and Society in August.

  •

    Australian fires in 2019–2020 had even more global reach than previously thought

    The devastating wildfires that raged across southeastern Australia in late 2019 and early 2020 packed a powerful punch that extended far beyond the country, two new studies find.

    The blazes injected at least twice as much carbon dioxide into the atmosphere as was previously thought, one team’s satellite-derived estimates revealed. The fires also sent up vast clouds of smoke and ash that wafted far to the east over the Southern Ocean, fertilizing the waters with nutrients and triggering widespread blooms of microscopic marine algae called phytoplankton, another team found. Both studies were published online September 15 in Nature.

    Meteorologist Ivar van der Velde of the SRON Netherlands Institute for Space Research in Leiden and colleagues first examined carbon monoxide data collected over southeastern Australia by the satellite-based instrument TROPOMI from November 2019 to January 2020, during the worst of the fires. Then, to get new estimates of the carbon dioxide emissions attributable to the fires, the team used previously determined ratios of carbon monoxide to carbon dioxide emitted by the region’s eucalyptus forests — the predominant type of forest that was scorched in the blazes — during earlier wildfires and prescribed burns.
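    In outline, the emission-ratio method reduces to simple scaling. The numbers below are hypothetical stand-ins, not the study's retrievals, but they show the arithmetic and land inside the reported range:

    ```python
    # Back-of-envelope version of the emission-ratio method (made-up inputs):
    # scale a satellite-derived CO total by a previously measured
    # CO-to-CO2 emission ratio for eucalyptus-forest fires.
    co_emitted_tg = 40.0     # hypothetical CO total from TROPOMI, teragrams
    co_per_co2_g = 0.06      # hypothetical grams of CO emitted per gram of CO2

    co2_emitted_tg = co_emitted_tg / co_per_co2_g
    print(f"implied CO2 emissions: {co2_emitted_tg:.0f} Tg")
    # ~667 Tg, inside the study's 517-867 Tg (trillion gram) range
    ```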

    Van der Velde’s team estimates that the fires released from 517 trillion to 867 trillion grams of carbon dioxide to the atmosphere. “The sheer magnitude of CO2 that was emitted to the atmosphere … was much larger than what we initially thought it would be,” van der Velde says. The emissions “from this single event were significantly higher than what all Australians normally emit with the combustion of fossil fuels in an entire year.”

    Previous assessments of CO2 emissions from the fires, based on estimations of burned area and biomass consumed by the blazes, calculated an average of about 275 trillion grams. Using the satellite-derived carbon monoxide data, the researchers say, dramatically improves the ability to distinguish actual emissions from the fires from other background sources of the gases, giving a more accurate assessment.

    That finding has worrisome implications. The fires swiftly cut a swath through southeastern Australia’s eucalyptus forests, devastating the forests to a degree that made their rapid recovery more difficult — which in turn affects how much carbon the trees can sequester, van der Velde says (SN: 3/9/21). Fires in northern and central Australia’s dry, grassy savannas are seen as more climate neutral because the grasses can regrow more quickly, he says.

    And severe fire seasons are likely to become more common in southeastern Australia with ongoing climate change. Climate change has already increased the likelihood of severe fire events such as the 2019–2020 fire season by at least 30 percent (SN: 3/4/20).

    The smoke and ash from the fires also packed a powerful punch. Scientists watched in awe as the fires created a “super outbreak” of towering thunderclouds from December 29 to December 31 in 2019 (SN: 12/15/20). These clouds spewed tiny aerosol particles of ash and smoke high into the stratosphere.

    Aerosols from the fires also traveled eastward through the lower atmosphere, ultimately reaching the Southern Ocean where they triggered blooms of phytoplankton in its iron-starved waters. Geochemist Weiyi Tang, now at Princeton University, and colleagues analyzed aerosols from the fires and found the particles to be rich in iron, an important nutrient for the algae. By tracing the atmospheric paths of the cloud of ash and smoke across the ocean, the team was able to link the observed blooms — huge patches of chlorophyll detected by satellite — to the fires.

    A satellite image snapped on January 6, 2020, shows smoke from southeastern Australia’s wildfires wafting eastward over the Southern Ocean. Credit: Japan’s National Institute of Information and Communications Technology

    Researchers have long thought that fires can trigger ocean blooms, particularly in the Southern Ocean, under the right conditions, says marine biogeochemist Joan Llort, now at the Barcelona Supercomputing Center and a coauthor on the study. But this research marks the most direct observation ever made of such an event — in part because it was such a massive one, Llort says.

    Large ocean blooms are “yet another process which is potentially being modified by climate change,” says biogeochemist Nicolas Cassar of Duke University, also a coauthor on the study.

    One of the big questions to emerge from the study, Cassar adds, is just how much carbon these phytoplankton may have ultimately removed from the atmosphere as they bloomed. Some of the carbon that the algae draw out of the air through photosynthesis sinks with them to the seafloor as they die. But some of it is quickly respired back to the atmosphere, muting any mitigating effect that the blooms might have on the wildfire emissions. To really assess what role the algae play, he says, would require a rapid-response team aboard an ocean vessel that could measure these chemical processes as they are happening.

    The sheer size of this wildfire-triggered bloom — “larger than Australia itself” — shows that “wildfires have the potential to increase marine productivity by very large amounts,” says Douglas Hamilton, a climate scientist at Cornell University who was not connected with the study.

    “The impact of fires on society is not straightforward,” Hamilton adds. The same smoke that can cause severe health impacts when inhaled “is also supplying nutrients to ecosystems and helping support marine food webs.” What this study demonstrates, he adds, is that to understand how future increases in fire activity might help shape the future of marine productivity “it is crucial that we monitor the impacts closely now.”

  •

    New DNA-based chip can be programmed to solve complex math problems

    The field of DNA computing has evolved by leaps and bounds since it was first proposed nearly 30 years ago. But most DNA computing processes are still performed manually, with reactants added step by step to the reaction by hand. Now, scientists at Incheon National University, Korea, have found a way to automate DNA calculations by developing a unique chip that can be controlled by a personal computer.
    The term ‘DNA’ immediately calls to mind the double-stranded helix that contains all our genetic information. But the individual units of its two strands are pairs of molecules bonded with each other in a selective, complementary fashion. As it turns out, one can take advantage of this pairing property to perform complex mathematical calculations, and this forms the basis of DNA computing.
    Since DNA has only two strands, performing even a simple calculation requires multiple chemical reactions using different sets of DNA. In most existing research, the DNA for each reaction is added manually, one by one, into a single reaction tube, which makes the process very cumbersome. Microfluidic chips, which consist of narrow channels etched onto a material like plastic, offer a way to automate the process. But despite their promise, the use of microfluidic chips for DNA computing remains underexplored.
    In a recent article — made available online in ACS Nano on 7 July 2021 and published in Volume 15 Issue 7 of the journal on 27 July 2021 — a team of scientists from Incheon National University (INU), Korea, present a programmable DNA-based microfluidic chip that can be controlled by a personal computer to perform DNA calculations. “Our hope is that DNA-based CPUs will replace electronic CPUs in the future because they consume less power, which will help with global warming. DNA-based CPUs also provide a platform for complex calculations like deep learning solutions and mathematical modelling,” says Dr. Youngjun Song from INU, who led the study.
    Dr. Song and team used 3D printing to fabricate their microfluidic chip, which can execute Boolean logic, one of the fundamental logics of computer programming. Boolean logic is a type of true-or-false logic that compares inputs and returns a value of ‘true’ or ‘false’ depending on the type of operation, or ‘logic gate,’ used. The logic gate in this experiment consisted of a single-stranded DNA template. Different single-stranded DNA were then used as inputs. If part of an input DNA had a complementary Watson-Crick sequence to the template DNA, it paired to form double-stranded DNA. The output was considered true or false based on the size of the final DNA.
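    As a toy illustration of that hybridization logic (string matching only; real gates depend on reaction thermodynamics and kinetics, and this is not the INU team's design), the sketch below implements an AND gate whose output is read from whether both inputs bind complementary regions of a made-up template:

    ```python
    # Toy DNA AND gate: "true" when both inputs hybridize to the template,
    # detected here by the total length of bound (double-stranded) DNA.
    COMP = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def complement(strand: str) -> str:
        return "".join(COMP[b] for b in strand)

    def and_gate(template: str, input_a: str, input_b: str) -> bool:
        # An input binds if its Watson-Crick complement appears in the template.
        a_bound = complement(input_a) in template
        b_bound = complement(input_b) in template
        bound_bases = a_bound * len(input_a) + b_bound * len(input_b)
        # Output read out by size: the complex is long enough only if
        # both inputs hybridized to the template.
        return bound_bases >= len(input_a) + len(input_b)

    template = "TTGGCCAATTCCGGAA"                   # made-up gate template
    print(and_gate(template, "AACCGG", "AAGGCC"))   # True: both inputs bind
    print(and_gate(template, "AACCGG", "GGGGGG"))   # False: second input mismatches
    ```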
    What makes the designed chip extraordinary is a motor-operated valve system that can be operated using a PC or smartphone. The chip and software set-up together form a microfluidic processing unit (MPU). Thanks to the valve system, the MPU could perform a series of reactions to execute a combination of logic operations in a rapid and convenient manner.
    This unique valve system of the programmable DNA-based MPU paves the way for more complex cascades of reactions that can code for extended functions. “Future research will focus on a total DNA computing solution with DNA algorithms and DNA storage systems,” says Dr. Song.
    Story Source:
    Materials provided by Incheon National University.