More stories

  •

    A robotic revolution for urban nature

    Drones, robots and autonomous systems can transform the natural world in and around cities for people and wildlife.
    International research, involving over 170 experts and led by the University of Leeds, assessed the opportunities and challenges that this cutting-edge technology could present for urban nature and green spaces.
    The researchers highlighted opportunities to improve how we monitor nature, such as identifying emerging pests and ensuring plants are cared for, and helping people engage with and appreciate the natural world around them.
    As robotics, autonomous vehicles and drones become more widely used across cities, pollution and traffic congestion may fall, making towns and cities more pleasant places to spend time outside.
    But the researchers also warned that advances in robotics and automation could be damaging to the environment.
    For instance, robots and drones might generate new sources of waste and pollution themselves, with potentially substantial negative implications for urban nature. Cities might have to be re-planned to provide enough room for robots and drones to operate, potentially leading to a loss of green space. And they could also increase existing social inequalities, such as unequal access to green space.


    Lead author Dr Martin Dallimer, from the School of Earth and Environment at the University of Leeds, said: “Technology, such as robotics, has the potential to change almost every aspect of our lives. As a society, it is vital that we proactively try to understand any possible side effects and risks of our growing use of robots and automated systems.
    “Although the future impacts on urban green spaces and nature are hard to predict, we need to make sure that the public, policy makers and robotics developers are aware of the potential pros and cons, so we can avoid detrimental consequences and fully realise the benefits.”
    The research, published today in Nature Ecology & Evolution, is authored by a team of 77 academics and practitioners.
    The researchers conducted an online survey of 170 experts from 35 countries, which they say provides a current best guess of what the future could hold.
    Participants gave their views on the potential opportunities and challenges for urban biodiversity and ecosystems, from the growing use of robotics and autonomous systems. These are defined as technologies that can sense, analyse, interact with and manipulate their physical environment. This includes unmanned aerial vehicles (drones), self-driving cars, robots able to repair infrastructure, and wireless sensor networks used for monitoring.


    These technologies have a large range of potential applications, such as autonomous transport, waste collection, infrastructure maintenance and repair, policing and precision agriculture.
    The research was conducted as part of Leeds’ Self Repairing Cities project, which aims to enable robots and autonomous systems to maintain urban infrastructure without causing disruption to citizens.
    First author Dr Mark Goddard conducted the work whilst at the University of Leeds and is now based at Northumbria University. He said: “Spending time in urban green spaces and interacting with nature brings a range of human health and well-being benefits, and robots are likely to transform many of the ways in which we experience and gain benefits from urban nature.
    “Understanding how robotics and autonomous systems will affect our interaction with nature is vital for ensuring that our future cities support wildlife that is accessible to all.”
    This work was funded by the Engineering and Physical Sciences Research Council (EPSRC).

  •

    A high order for a low dimension

    Spintronics refers to a suite of physical systems which may one day replace many electronic systems. To realize this generational leap, material components that confine electrons in one dimension are highly sought after. For the first time, researchers created such a material in the form of a special bismuth-based crystal known as a high-order topological insulator.
    To create spintronic devices, new materials need to be designed that take advantage of quantum behaviors not seen in everyday life. You are probably familiar with conductors and insulators, which permit and restrict the flow of electrons, respectively. Semiconductors are common but less familiar to some; these usually insulate, but conduct under certain circumstances, making them ideal miniature switches.
    For spintronic applications, a new kind of electronic material is required: the topological insulator. It differs from the other three materials by insulating throughout its bulk while conducting only along its surface. And what it conducts is not the flow of electrons themselves, but a property of them known as spin, or angular momentum. This spin current, as it’s known, could open up a world of ultrahigh-speed and low-power devices.
    However, not all topological insulators are equal: Two kinds, so-called strong and weak, have already been created, but have some drawbacks. As they conduct spin along their entire surface, the electrons present tend to scatter, which weakens their ability to convey a spin current. But since 2017, a third kind of topological insulator called a higher-order topological insulator has been theorized. Now, for the first time, one has been created by a team at the Institute for Solid State Physics at the University of Tokyo.
    “We created a higher-order topological insulator using the element bismuth,” said Associate Professor Takeshi Kondo. “It has the novel ability of being able to conduct a spin current along only its corner edges, essentially one-dimensional lines. As the spin current is bound to one dimension instead of two, the electrons do not scatter so the spin current remains stable.”
    To create this three-dimensional crystal, Kondo and his team stacked two-dimensional slices of crystal one atom thick in a particular way. For strong or weak topological insulators, crystal slices in the stack are all oriented the same way, like playing cards face down in a deck. But to create the higher-order topological insulator, the orientation of the slices was alternated: the metaphorical playing cards were faced up then down repeatedly throughout the stack. This subtle change in arrangement makes a huge difference in the behavior of the resultant three-dimensional crystal.
    The crystal layers in the stack are held together by a quantum mechanical force called the van der Waals force. This is one of the rare kinds of quantum phenomena that you actually do see in daily life, as it is partly responsible for the way that powdered materials clump together and flow the way they do. In the crystal, it adheres the layers together.
    “It was exciting to see that the topological properties appear and disappear depending only on the way the two-dimensional atomic sheets were stacked,” said Kondo. “Such a degree of freedom in material design will bring new ideas, leading toward applications including fast and efficient spintronic devices, and things we have yet to envisage.”

    Story Source:
    Materials provided by University of Tokyo. Note: Content may be edited for style and length.

  •

    Using artificial intelligence to find new uses for existing medications

    Scientists have developed a machine-learning method that crunches massive amounts of data to help determine which existing medications could improve outcomes in diseases for which they are not prescribed.
    The intent of this work is to speed up drug repurposing, which is not a new concept — think Botox injections, first approved to treat crossed eyes and now a migraine treatment and top cosmetic strategy to reduce the appearance of wrinkles.
    But getting to those new uses typically involves a mix of serendipity and time-consuming and expensive randomized clinical trials to ensure that a drug deemed effective for one disorder will be useful as a treatment for something else.
    The Ohio State University researchers created a framework that combines enormous patient care-related datasets with high-powered computation to arrive at repurposed drug candidates and the estimated effects of those existing medications on a defined set of outcomes.
    Though this study focused on proposed repurposing of drugs to prevent heart failure and stroke in patients with coronary artery disease, the framework is flexible — and could be applied to most diseases.
    “This work shows how artificial intelligence can be used to ‘test’ a drug on a patient, and speed up hypothesis generation and potentially speed up a clinical trial,” said senior author Ping Zhang, assistant professor of computer science and engineering and biomedical informatics at Ohio State. “But we will never replace the physician — drug decisions will always be made by clinicians.”
    The research is published today (Jan. 4, 2021) in Nature Machine Intelligence.


    Drug repurposing is an attractive pursuit because it could lower the risk associated with safety testing of new medications and dramatically reduce the time it takes to get a drug into the marketplace for clinical use.
    Randomized clinical trials are the gold standard for determining a drug’s effectiveness against a disease, but Zhang noted that machine learning can account for hundreds — or thousands — of human differences within a large population that could influence how medicine works in the body. These factors, or confounders, ranging from age, sex and race to disease severity and the presence of other illnesses, function as parameters in the deep learning computer algorithm on which the framework is based.
    That information comes from “real-world evidence,” which is longitudinal observational data about millions of patients captured by electronic medical records or insurance claims and prescription data.
    “Real-world data has so many confounders. This is the reason we have to introduce the deep learning algorithm, which can handle multiple parameters,” said Zhang, who leads the Artificial Intelligence in Medicine Lab and is a core faculty member in the Translational Data Analytics Institute at Ohio State. “If we have hundreds or thousands of confounders, no human being can work with that. So we have to use artificial intelligence to solve the problem.
    “We are the first team to introduce use of the deep learning algorithm to handle the real-world data, control for multiple confounders, and emulate clinical trials,” Zhang said.


    The research team used insurance claims data on nearly 1.2 million heart-disease patients, which provided information on their assigned treatment, disease outcomes and various values for potential confounders. The deep learning algorithm also has the power to take into account the passage of time in each patient’s experience — for every visit, prescription and diagnostic test. The model input for drugs is based on their active ingredients.
    Applying what is called causal inference theory, the researchers categorized, for the purposes of this analysis, the active drug and placebo patient groups that would be found in a clinical trial. The model tracked patients for two years — and compared their disease status at that end point to whether or not they took medications, which drugs they took and when they started the regimen.
    “With causal inference, we can address the problem of having multiple treatments. We don’t answer whether drug A or drug B works for this disease or not, but figure out which treatment will have the better performance,” Zhang said.
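The published framework is a deep learning model over longitudinal claims data, but the core idea it borrows from causal inference — reweighting an observational cohort so that treated and untreated patients are comparable on their confounders, emulating a randomized trial — can be sketched in a few lines. The cohort, the single binary confounder, and all the effect sizes below are synthetic and purely illustrative, not from the study:

```python
import random

random.seed(0)

# Toy cohort: one binary confounder z (e.g., a comorbidity) influences both
# who receives the drug (t) and the outcome (y). By construction, the true
# causal risk difference of treatment is -0.10 (the drug is protective).
n = 40_000
cohort = []
for _ in range(n):
    z = random.random() < 0.5                  # confounder
    t = random.random() < (0.8 if z else 0.2)  # confounded treatment assignment
    y = random.random() < 0.3 + 0.2 * z - 0.1 * t  # outcome risk
    cohort.append((z, t, y))

def risk(rows):
    return sum(y for *_, y in rows) / len(rows)

# Naive comparison of treated vs. untreated (biased, because z drives both
# treatment and outcome):
naive = risk([r for r in cohort if r[1]]) - risk([r for r in cohort if not r[1]])

# Inverse-probability-of-treatment weighting: estimate the propensity
# P(T=1 | Z) from the data, then reweight so the two arms look like a
# randomized trial with respect to z.
def propensity(z):
    stratum = [r for r in cohort if r[0] == z]
    return sum(t for _, t, _ in stratum) / len(stratum)

e = {z: propensity(z) for z in (False, True)}
w = [1 / e[z] if t else 1 / (1 - e[z]) for z, t, _ in cohort]

def weighted_risk(treated):
    num = sum(wi * y for (z, t, y), wi in zip(cohort, w) if t == treated)
    den = sum(wi for (z, t, _), wi in zip(cohort, w) if t == treated)
    return num / den

adjusted = weighted_risk(True) - weighted_risk(False)
print(f"naive risk difference:    {naive:+.3f}")
print(f"adjusted risk difference: {adjusted:+.3f}  (true effect: -0.100)")
```

With the confounder balanced by the weights, the adjusted estimate recovers the protective effect that the naive comparison masks; the study tackles the same problem at scale, using a deep learning model to handle hundreds or thousands of confounders and their evolution over time.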
    Their hypothesis: that the model would identify drugs that could lower the risk for heart failure and stroke in coronary artery disease patients.
    The model yielded nine drugs considered likely to provide those therapeutic benefits, three of which are currently in use — meaning the analysis identified six candidates for drug repurposing. Among other findings, the analysis suggested that a diabetes medication, metformin, and escitalopram, used to treat depression and anxiety, could lower risk for heart failure and stroke in the model patient population. As it turns out, both of those drugs are currently being tested for their effectiveness against heart disease.
    Zhang stressed that what the team found in this case study is less important than how they got there.
    “My motivation is applying this, along with other experts, to find drugs for diseases without any current treatment. This is very flexible, and we can adjust case-by-case,” he said. “The general model could be applied to any disease if you can define the disease outcome.”
    The research was supported by the National Center for Advancing Translational Sciences, which funds the Center for Clinical and Translational Science at Ohio State.

  •

    What the pandemic can teach us about ways to reduce air pollution

    The COVID-19 pandemic wasn’t just a shock to the human immune system. It was also a shock to the Earth system, dramatically changing the air quality in cities around the globe.
    As countries around the globe struggled to contain the disease, they imposed temporary shutdowns. Scientists are now sifting through data collected by satellite and on the ground to understand what this hiatus in human activities can tell us about the atmospheric cocktail that generates city pollution. Much of this preliminary data was shared at the American Geophysical Union annual meeting in December.
    It was already known that people’s activities were curtailed enough to result in a dramatic drop in emissions of greenhouse gases in April, as well as a dip in the seismic noise produced by humans (SN: 5/19/20; SN: 7/23/20). That quiet period didn’t last, though, and carbon dioxide emissions began to climb back upward by the summer. April 2020 saw a drop of about 17 percent in global monthly CO2 emissions from fossil fuels, but by year’s end, annual CO2 emissions for the globe were only 7 percent lower than they were in 2019. That reduction was too brief, compared with the hundreds of years that the gas can linger in Earth’s atmosphere, to put a dent in the planet’s atmospheric CO2 level (SN: 8/7/20).


    But in addition to briefly reducing emissions of climate-warming gases, this abrupt halt in many human activities — particularly commuter traffic — also created an unprecedented experiment for scientists to examine the complicated chemistry of atmospheric pollutants in cities. By altering the usual mix of pollutants hovering over cities, the shutdowns may help scientists better understand another longstanding misery for human health: poor air quality in many cities.
    That’s not to say that the pandemic has a silver lining, says Jessica Gilman, a tropospheric chemist at the National Oceanic and Atmospheric Administration in Boulder, Colo. “Misery is no solution to our global environmental challenges.”
    But there’s now a wealth of data from cities around the globe on how the pandemic altered regional or local concentrations of the precursors of ozone, a primary component of smog. Those precursors include nitrogen oxides and volatile organic compounds — both produced by traffic — as well as methane, produced by the oil and gas industry. With satellites, scientists are also able to assess how levels of these pollutants changed around the globe.
    Building a global picture of altered city pollution is no easy task, though. Researchers are finding that the pandemic’s impact on levels of various pollutants was highly regional, affected by differences in wind and rain as well as by photochemical interactions with sunlight — the intensity of which also changes with the season.  
    That stark variety of regional effects was evident in, for example, the differing ozone levels in Denver and New York City after the shutdowns. Nitrogen oxide gases produced by traffic are a powerful precursor to cities’ elevated ozone levels, which can damage the lungs and trigger respiratory ailments. The United States has made strides in reducing these gases over the last few decades — but there hasn’t been a corresponding drop in ozone levels, Dan Jaffe, an environmental chemist at the University of Washington Bothell, reported at the meeting on December 9.
    The shutdowns gave researchers some insight into why, Jaffe says. From March 15 through July 23, New York City had a 21 percent decrease in nitrogen dioxide, one of several nitrogen oxide gases, in comparison with 2019 levels. Although the shutdowns were more stringent during the spring months, it turned out that summertime reductions in nitrogen dioxide were most strongly linked to the city’s change in ozone levels, the researchers found. “We see very strong reduction in summertime ozone this year,” Jaffe said at the meeting, citing unpublished data.
    That’s because in the summer months, heat and sunlight react with the precursor gases in the atmosphere, like nitrogen dioxide, creating a toxic cocktail. This kind of insight can be a boon to policy makers in a non-pandemic year, suggesting that nitrogen oxide regulations should focus most strongly on the summer, Jaffe says. “It’s really good evidence that NOx reductions extending into July in 2020 had an important impact.”
    In Denver, however, ozone didn’t drop so consistently — possibly because wildfires were beginning to rage across the U.S. West by the end of the summer (SN: 12/21/20). The fires produce nitrogen oxides, carbon monoxide and fine particles that can also help to increase ground-level ozone.
    “There are different patterns in different cities,” Jaffe says. “There are a lot of factors to sort out, and a lot of work to be done.” Armed with a wealth of new data from 2020, scientists hope to be able to make some headway.

  •

    Stretching diamond for next-generation microelectronics

    Diamond is the hardest material in nature. It also has great potential as an excellent electronic material. A research team has demonstrated for the first time the large, uniform tensile elastic straining of microfabricated diamond arrays through the nanomechanical approach. Their findings have shown the potential of strained diamonds as prime candidates for advanced functional devices in microelectronics, photonics, and quantum information technologies.

  •

    Spontaneous robot dances highlight a new kind of order in active matter

    Predicting when and how collections of particles, robots, or animals become orderly remains a challenge across science and engineering.
    In the 19th century, scientists and engineers developed the discipline of statistical mechanics, which predicts how groups of simple particles transition between order and disorder, as when a collection of randomly colliding atoms freezes to form a uniform crystal lattice.
    More challenging to predict are the collective behaviors that can be achieved when the particles become more complicated, such that they can move under their own power. This type of system — observed in bird flocks, bacterial colonies and robot swarms — goes by the name “active matter.”
    As reported in the January 1, 2021 issue of the journal Science, a team of physicists and engineers has proposed a new principle by which active matter systems can spontaneously order, without the need for higher-level instructions or even programmed interaction among the agents. And they have demonstrated this principle in a variety of systems, including groups of periodically shape-changing robots called “smarticles” — smart, active particles.
    The theory, developed by Dr. Pavel Chvykov at the Massachusetts Institute of Technology while a student of Prof. Jeremy England, who is now a researcher in the School of Physics at Georgia Institute of Technology, posits that certain types of active matter with sufficiently messy dynamics will spontaneously find what the researchers refer to as “low rattling” states.
    “Rattling is when matter takes energy flowing into it and turns it into random motion,” England said. “Rattling can be greater either when the motion is more violent, or more random. Conversely, low rattling is either very slight or highly organized — or both. So, the idea is that if your matter and energy source allow for the possibility of a low rattling state, the system will randomly rearrange until it finds that state and then gets stuck there. If you supply energy through forces with a particular pattern, this means the selected state will discover a way for the matter to move that finely matches that pattern.”
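This selection effect can be illustrated with a toy Markov model (not the authors’ smarticle dynamics): a system hops between configurations, and each configuration has a “rattling” value governing how likely the drive is to kick the system out of it. No rule ever prefers any state explicitly, yet the system accumulates in the low-rattling one. All the numbers below are hypothetical:

```python
import random

random.seed(1)

# Each configuration i has a rattling value r[i]: the probability per step
# that incoming energy kicks the system out of that configuration and into
# a random other one. High rattling = easy to leave; low rattling = sticky.
r = [0.9, 0.5, 0.1, 0.02]   # hypothetical rattling of four configurations
state = 0
visits = [0] * len(r)

for _ in range(200_000):
    visits[state] += 1
    if random.random() < r[state]:  # kicked out with probability r[state]
        state = random.choice([s for s in range(len(r)) if s != state])

occupancy = [v / sum(visits) for v in visits]
print("occupancy:", [round(o, 3) for o in occupancy])
# The chain spends most of its time in the lowest-rattling configuration,
# even though the dynamics are otherwise completely random.
```

In this toy chain the stationary occupancy of each state scales as the inverse of its rattling, so the quietest configuration dominates — a bare-bones version of the “randomly rearrange until it finds that state and then gets stuck there” behavior described above.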
    To develop their theory, England and Chvykov took inspiration from a phenomenon known as thermophoresis, discovered by the Swiss physicist Charles Soret in the late 19th century. In Soret’s experiments, he discovered that subjecting an initially uniform salt solution in a tube to a difference in temperature would spontaneously lead to an increase in salt concentration in the colder region — which corresponds to an increase in order of the solution.
    Chvykov and England developed numerous mathematical models to demonstrate the low rattling principle, but it wasn’t until they connected with Daniel Goldman, Dunn Family Professor of Physics at the Georgia Institute of Technology, that they were able to test their predictions.
    Said Goldman, “A few years back, I saw England give a seminar and thought that some of our smarticle robots might prove valuable to test this theory.” Working with Chvykov, who visited Goldman’s lab, Ph.D. students William Savoie and Akash Vardhan used three flapping smarticles enclosed in a ring to compare experiments to theory. The students observed that instead of displaying complicated dynamics and exploring the container completely, the robots would spontaneously self-organize into a few dances — for example, one dance consists of three robots slapping each other’s arms in sequence. These dances could persist for hundreds of flaps, but suddenly lose stability and be replaced by a dance of a different pattern.
    After first demonstrating that these simple dances were indeed low rattling states, Chvykov worked with engineers at Northwestern University, Prof. Todd Murphey and Ph.D. student Thomas Berrueta, who developed more refined and better controlled smarticles. The improved smarticles allowed the researchers to test the limits of the theory, including how the types and number of dances varied for different arm flapping patterns, as well as how these dances could be controlled. “By controlling sequences of low rattling states, we were able to make the system reach configurations that do useful work,” Berrueta said. The Northwestern University researchers say that these findings may have broad practical implications for microrobotic swarms, active matter, and metamaterials.
    As England noted: “For robot swarms, it’s about getting many adaptive and smart group behaviors that you can design to be realized in a single swarm, even though the individual robots are relatively cheap and computationally simple. For living cells and novel materials, it might be about understanding what the ‘swarm’ of atoms or proteins can get you, as far as new material or computational properties.”