More stories

  •

    How robots can rule roads

    An ethical framework developed by government, road users and other stakeholders must steer the introduction of new road rules for connected and automated vehicles (CAVs), international experts say.
    They warn that, contrary to popular intuition, strictly forbidding CAVs from ever breaking existing traffic rules may actually hamper road safety. Such rule-bending, however, requires close scrutiny if these high-tech vehicles are to meet their potential to reduce road casualties.
    “While they promise to minimise road safety risk, CAVs like hybrid AI systems can still create collision risk due to technological and human-system interaction issues, the complexity of traffic, interaction with other road users and vulnerable road users,” says UK transport consultant Professor Nick Reed, from Reed Mobility, in a new paper in Ethics and Information Technology.
    “Ethical goal functions for CAVs would enable developers to optimise driving behaviours for safety under conditions of uncertainty while allowing for differentiation of products according to brand values.”
    This point matters, the researchers say, because it does not require every brand of vehicle to drive in exactly the same manner, leaving room for brand differentiation.
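The idea of an ethical goal function can be sketched as a weighted score that a vehicle minimises when choosing a maneuver. The scenario, maneuver names, risk numbers and brand weights below are hypothetical illustrations, not values from the paper:

```python
# Hypothetical sketch of an "ethical goal function": the vehicle scores each
# candidate maneuver by expected collision risk plus a brand-weighted penalty
# for violating a traffic rule, then picks the lowest score. All numbers here
# are illustrative assumptions, not values from the paper.

MANEUVERS = [
    # a cyclist swerves into the lane; candidate responses with assumed scores
    {"name": "brake_in_lane",      "collision_risk": 0.30, "rule_violation": 0.0},
    {"name": "cross_solid_line",   "collision_risk": 0.05, "rule_violation": 1.0},
    {"name": "use_empty_shoulder", "collision_risk": 0.12, "rule_violation": 0.5},
]

def goal_function(maneuver, rule_weight):
    """Lower is better; rule_weight is a brand-specific reluctance to bend rules."""
    return maneuver["collision_risk"] + rule_weight * maneuver["rule_violation"]

def choose(rule_weight):
    return min(MANEUVERS, key=lambda m: goal_function(m, rule_weight))["name"]

print(choose(rule_weight=0.1))   # safety-first tuning bends the rule
print(choose(rule_weight=1.0))   # rule-abiding tuning brakes in lane
```

Different brands choosing different `rule_weight` values is one way the same safety optimisation could still differentiate products.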
    Around the world, transport services are already putting CAVs, including driverless cars, on the road to deliver new services and freight options to improve road safety, alleviate congestion and increase driving comfort and transport system productivity.

  •

    A novel solution to a combinatorial optimization problem in bicycle sharing systems

    Traffic congestion has been worsening in large cities since the 1950s, owing to the enormous number of cars sold each year. Unfortunately, the price tag attached to excessive traffic includes higher carbon dioxide emissions, more collectively wasted time, and exacerbated health problems. Many municipalities have tackled the problem by implementing bicycle sharing systems, in which people can borrow bikes from strategically placed ports and ride wherever they want, as long as they eventually return the bikes to a port, though not necessarily the one where the bike was originally obtained.
    As one may notice, this last allowance creates a problem of its own. Whenever someone borrows a bike and does not make a round trip with it, the destination port gains a bike while the origin port loses one. As time passes, the distribution of bikes across ports becomes unbalanced, causing an excessive accumulation of bikes at some ports and a dearth at others. This issue is generally addressed by periodically sending out a fleet of vehicles that can carry multiple bikes and restore each port to its ‘ideal’ number of bikes.
    Much research has been dedicated to the bicycle rebalancing problem using a fleet of vehicles. Finding the optimal routing paths for the vehicles is in and of itself a highly complex mathematical problem in the field of combinatorial optimization. One must make sure that the optimization algorithms used can reach a good-enough solution in a reasonable time for a realistically large number of ports and vehicles. Many methods, however, fail to find feasible solutions when multiple constraints are considered simultaneously, such as time, capacity, and loading/unloading constraints for the vehicles.
    But what if we allowed the optimization strategy to bend the rules a little to make the best of difficult situations? In a recent study published in MDPI’s Applied Sciences, a team of scientists suggested an innovative twist to the routing problem of bicycle sharing systems using this concept. Led by Professor Tohru Ikeguchi of Tokyo University of Science, the team, comprising PhD student Honami Tsushima from Tokyo University of Science and Associate Professor Takafumi Matsuura from Nippon Institute of Technology, Japan, proposed a new formulation of the routing problem in which the constraints imposed on the routings can be violated. This enables the optimization algorithm to explore what is known as the space of “infeasible solutions.” Prof. Ikeguchi explains their reasoning: “In real life, if work can be completed within a few minutes of overtime, we would work beyond the time limit. Similarly, if we are only carrying four bikes and need to supply five, we would still supply the four we have.”
    Following this line of thought, the researchers formulated a “soft constraints” variant of the routing problem in bicycle rebalancing. In this approach, instead of outright excluding solutions that violate constraints, such solutions are treated as valid routings that incur dynamically adjusted penalties and are taken into account when assessing possible routings. This enabled the team to devise an algorithm that uses the space of infeasible solutions to speed up the search for optimal or near-optimal solutions.
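The soft-constraint idea can be sketched in a few lines: rather than rejecting routes that violate the vehicle's capacity, the search assigns them a penalty and keeps them in play. The port layout, demands, capacity and fixed penalty weight below are toy assumptions; the study adjusts penalties dynamically and handles far larger instances:

```python
import itertools

# Minimal sketch of a soft-constraint search for one rebalancing vehicle.
# Port coordinates, demands, capacity and the fixed penalty weight are toy
# assumptions, not the study's formulation.

ports = {0: (0, 0), 1: (1, 2), 2: (3, 1), 3: (2, 3)}   # port -> (x, y)
demand = {0: 0, 1: +2, 2: -3, 3: +2}                   # + surplus, - shortage
CAPACITY = 3                                           # bikes the vehicle can hold

def dist(a, b):
    (x1, y1), (x2, y2) = ports[a], ports[b]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

def cost(route, penalty):
    """Travel distance plus a penalty per capacity violation, instead of
    rejecting infeasible routes outright."""
    travel = sum(dist(a, b) for a, b in zip(route, route[1:]))
    load, violations = 0, 0
    for p in route:
        load += demand[p]            # pick up surplus, drop off at shortages
        if load < 0 or load > CAPACITY:
            violations += 1
            load = max(0, min(load, CAPACITY))   # do what we can and move on
    return travel + penalty * violations

# Exhaustive search over routes starting at depot port 0; with a soft
# penalty, "infeasible" routes still rank and can guide the search.
routes = (tuple([0] + list(p)) for p in itertools.permutations([1, 2, 3]))
best = min(routes, key=lambda r: cost(r, penalty=10.0))
print("best route:", best, "cost:", round(cost(best, penalty=10.0), 2))
```

Note that in this toy instance every route violates some constraint (the truck cannot fully serve the shortage), so a hard-constraint search would find nothing at all, while the penalised search still returns the least-bad routing.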
    The researchers evaluated the performance of their method through numerical experiments with benchmark problems including up to 50 ports and three vehicles. The results show that their strategy could find optimal or near-optimal solutions in all cases, and that the algorithm could search both the feasible and infeasible solution spaces efficiently. This paints a brighter future for people in cities with congested traffic in which bicycle sharing systems could become an attractive solution. As Prof. Ikeguchi remarks, “It is likely that bike sharing systems will spread worldwide in the future, and we believe that the routing problem in bicycle rebalancing is an important issue to be solved in modern societies.”
    Hopefully, further efforts to improve bicycle sharing systems will alleviate traffic congestion and make people’s lives in big cities healthier and more enjoyable.
    Story Source:
    Materials provided by Tokyo University of Science. Note: Content may be edited for style and length.

  •

    New computer modeling could boost drug discovery

    Scientists from Queen’s University Belfast have developed a computer-aided data tool that could improve treatment for a range of illnesses.
    The computer modelling tool predicts novel binding sites for potential drugs that are more selective, leading to more effective drug targeting, greater therapeutic efficacy and fewer side effects.
    The data tool, or protocol, will help uncover a novel class of compounds: allosteric drugs acting on G protein-coupled receptors (GPCRs).
    GPCRs are the largest membrane protein family that transduce a signal inside cells from hormones, neurotransmitters, and other endogenous molecules. As a result of their broad influence on human physiology, GPCRs are drug targets in many therapeutic areas such as inflammation, infertility, metabolic and neurological disorders, viral infections and cancer. Currently over a third of drugs act via GPCRs. Despite the substantial therapeutic success, the discovery of GPCR drugs is challenging due to promiscuous binding and subsequent side effects.
    Recent studies point to the existence of other binding sites, called allosteric sites, that drugs can bind to with several therapeutic benefits. However, the discovery of allosteric sites and drugs has been mostly serendipitous. Recent X-ray crystallography (which determines atomic and molecular structure) and cryo-electron microscopy (which offers 3D models) of several GPCRs provide opportunities to develop computer-aided methodologies to search for allosteric sites.
    The researchers developed a computer-aided protocol to map allosteric sites in GPCRs with a view to starting a rational search for allosteric drugs, presenting the opportunity for new solutions and therapies for a range of diseases.
    Dr Irina Tikhonova from the School of Pharmacy at Queen’s University and senior author, explains: “We have developed a novel, cost-effective and rapid pipeline for the discovery of GPCRs allosteric sites, which overcomes the limitations of current computational protocols such as membrane distortion and non-specific binding.
    “Our pipeline can identify allosteric sites in a short time, which makes it suitable for industry settings. As such, our pipeline is a feasible solution to initiate structure-based search of allosteric drugs for any membrane-bound drug targets that have an impact on cancer, inflammation, and CNS diseases.”
    This research, published in ACS Central Science, is a collaboration between Queen’s University Belfast and Queen Mary University of London. It is supported by the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement and by the Biotechnology and Biological Sciences Research Council.
    Story Source:
    Materials provided by Queen’s University Belfast.

  •

    A new 3D printing frontier: Self-powered wearable devices

    When most people think of wearable devices, they think of smart watches, smart glasses, fitness trackers, even smart clothing. These devices, part of a fast-growing market, have two things in common: They all need an external power source, and they all require exacting manufacturing processes. Until now.
    Yanliang Zhang, associate professor of aerospace and mechanical engineering at the University of Notre Dame, and doctoral student Yipu Du have created an innovative hybrid printing method — combining multi-material aerosol jet printing and extrusion printing — that integrates both functional and structural materials into a single streamlined printing platform. Their work was recently published in Nano Energy.
    Zhang and Du, in collaboration with a team at Purdue University led by professor Wenzhuo Wu, also have developed an all-printed piezoelectric (self-powered) wearable device.
    Using their new hybrid printing process, the team demonstrated stretchable piezoelectric sensors, conformable to human skin, with integrated tellurium nanowire piezoelectric materials, silver nanowire electrodes and silicone films. The devices printed by the team were then attached to a human wrist, accurately detecting hand gestures, and to an individual’s neck, detecting the individual’s heartbeat. Neither device used an external power source.
    Piezoelectric materials are some of the most promising materials in the manufacture of wearable electronics and sensors because they generate their own electrical charge from applied mechanical stress instead of from a power source.
    Yet printing piezoelectric devices is challenging because it often requires high electric fields for poling and high sintering temperatures. This adds to the time and cost of the printing process and can be detrimental to surrounding materials during sensor integration.
    “The biggest advantage of our new hybrid printing method is the ability to integrate a wide range of functional and structural materials in one platform,” said Zhang.
    “This streamlines the processes, reducing the time and energy needed to fabricate a device, while ensuring the performance of printed devices.”
    Vital to the design, said Zhang, are nanostructured materials with piezoelectric properties, which eliminate the need for poling or sintering, and the highly stretchable silver nanowire electrodes, which are important for wearable devices attached to bodies in motion.
    “We’re excited to see the wide range of opportunities that will open up for printed electronics and wearable devices because of this very versatile printing process,” said Zhang.
    Story Source:
    Materials provided by University of Notre Dame. Original written by Nina Welding.

  •

    Earth will warm 2.7 degrees Celsius based on current pledges to cut emissions

    This year was supposed to be a turning point in addressing climate change. But the world’s nations are failing to meet the moment, states a new report by the United Nations Environment Programme.

    The Emissions Gap Report 2021: The Heat Is On, released October 26, reveals that current pledges to reduce greenhouse gas emissions and rein in global warming still put the world on track to warm by 2.7 degrees Celsius above preindustrial levels by the end of the century.

    Aiming for “net-zero emissions” by midcentury — a goal recently announced by China, the United States and other countries, but without clear plans on how to do so — could reduce that warming to 2.2 degrees C. But that still falls short of the mark, U.N. officials stated at a news event for the report’s release.

    At a landmark meeting in Paris in 2015, 195 nations pledged to eventually reduce their emissions enough to hold global warming to well below 2 degrees C by 2100 (SN: 12/12/15). Restricting global warming further, to just 1.5 degrees C, would forestall many more devastating consequences of climate change, as the Intergovernmental Panel on Climate Change, or IPCC, reported in 2018 (SN: 12/17/18). In its latest report, released in August, the IPCC noted that extreme weather events, exacerbated by human-caused climate change, now occur in every part of the planet — and warned that the window to reverse some of these effects is closing (SN: 8/9/21).

    Despite these dire warnings, “the parties to the Paris Agreement are utterly failing to keep [its] target in reach,” said U.N. Secretary-General António Guterres. “The era of half measures and hollow promises must end.”

    The new U.N. report comes at a crucial time, just days before world leaders meet for the 2021 U.N. Climate Change Conference, or COP26, in Glasgow, Scotland. The COP26 meeting — postponed from 2020 to 2021 due to the COVID-19 pandemic — holds particular significance because it is the first COP meeting since the 2015 agreement in which signatories are expected to significantly ramp up their emissions reductions pledges.

    The U.N. Environment Programme has kept annual tabs on the still-yawning gap between existing national pledges to reduce emissions and the Paris Agreement target (SN: 11/26/19). Ahead of the COP26 meeting, 120 countries, responsible for just over half of the world’s greenhouse gas emissions, announced new commitments to address climate change by 2030.

    The 2021 report finds that new commitments bring the world only slightly closer to where emissions need to be by 2030 to reach warming targets. With the new pledges, total annual emissions in 2030 would be 7.5 percent lower (about 55 gigatons of carbon dioxide equivalent) than they would have been with pledges as of last year (about 59 gigatons). But to stay on track for 2 degrees C of warming, emissions would have to be about 30 percent lower than the new pledges, or about 39 gigatons each year. To hold warming to 1.5 degrees C requires a roughly 55 percent drop in emissions compared with the latest pledges, to about 25 gigatons a year.
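The headline percentages can be checked against the approximate gigatonne figures quoted above (rounded values, not the report's exact data):

```python
# Rough arithmetic check of the figures quoted above. All values are the
# approximate gigatonnes of CO2-equivalent emitted per year in 2030.
old_pledges = 59    # with pledges as of last year
new_pledges = 55    # with the new 2021 pledges
path_2C     = 39    # roughly on track for 2 degrees C of warming
path_1p5C   = 25    # roughly on track for 1.5 degrees C

def cut_below_new_pledges(target):
    """Fractional emissions reduction needed relative to the new pledges."""
    return (new_pledges - target) / new_pledges

print(f"2 C path:   about {cut_below_new_pledges(path_2C):.0%} below the new pledges")
print(f"1.5 C path: about {cut_below_new_pledges(path_1p5C):.0%} below the new pledges")
```

The results land near the "about 30 percent" and "roughly 55 percent" cuts cited in the report, within rounding of the gigatonne figures.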

    “I’m hoping that the collision of the science and the statistics in the gap analysis, and the voices of the people will promote a greater sense of urgency,” says Gabriel Filippelli, a geochemist at Indiana University–Purdue University Indianapolis.

    On October 26, Filippelli, the editor of the American Geophysical Union journal GeoHealth, and editors in chief of other journals published by the organization coauthored a statement in Geophysical Research Letters. They urged world leaders at COP26 to keep the “devastating impacts” of climate change in check by immediately reducing global carbon emissions and shifting to a green economy. “We are scientists, but we also have families and loved ones alongside our fellow citizens on this planet,” the letter states. “The time to bridge the divide between scientist and citizen, head and heart, is now.”

    Publishing that plea was a departure for some of the scientists, Filippelli says. “We have been publishing papers for the last 20 to 30 years, documenting the train wreck of climate change,” he says. “As you can imagine, behind the scenes there were some people who were a little uncomfortable because it veered away from the true science. But ultimately, we felt it was more powerful to write a true statement that showed our hearts.”

  •

    Dynamical scaling of entanglement entropy and surface roughness in random quantum systems

    In physics, “universality” refers to properties of systems that are independent of their details. Establishing the universality of quantum dynamics is one of the key interests of theoretical physicists. Now, researchers from Japan have identified such a universality in disordered quantum systems, characterized by a one-parameter scaling for surface roughness and entanglement entropy (a measure of quantum entanglement).
    Many-particle systems in the real world are often imbued with “disorder” or “randomness.” This, in turn, leads to the occurrence of phenomena unique to such systems. For instance, electrons in strongly disordered systems can become localized due to destructive interference, a phenomenon known as “Anderson localization.”
    Anderson localization has been studied extensively in terms of one-parameter scaling, where system properties are scaled based on one specific parameter. But while most studies have focused on static properties, disorder can also significantly influence quantum dynamics such as entanglement dynamics and transport phenomena.
    In a recent study published in Physical Review Letters, a team of physicists led by Prof. Kazuya Fujimoto from Nagoya University has now demonstrated numerically a dynamical one-parameter scaling called “Family-Vicsek (FV) scaling” for disordered quantum systems.
    “While the FV scaling is originally known from classical surface growth, we found the scaling in random quantum systems by introducing a ‘quantum surface height operator’,” explains Prof. Fujimoto.
    In their study, the physicists considered a system of non-interacting spinless fermions in a disordered one-dimensional potential for three common models. They found that the surface roughness followed FV scaling characterized with three exponents. Further numerical analysis showed that the surface roughness could be related to the entanglement entropy (EE), thus indicating an FV-type scaling for EE. In addition, they observed anomalous scaling exponents for one of the models and attributed it to the presence of localized states in a delocalized phase, a classic signature of quantum disordered systems.
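Family-Vicsek scaling says the roughness W(L, t) of a growing surface follows a one-parameter form, growing as t**beta before saturating at roughly L**alpha. A minimal classical illustration uses the textbook random-deposition model, which has beta = 1/2 (and, lacking lateral correlations, never saturates, so only the growth exponent is checked here); this is the classical-surface-growth setting the quantum result generalises, not the study's quantum model:

```python
import math
import random

# Classical illustration of the Family-Vicsek growth regime: in the textbook
# random-deposition model, surface roughness grows as t**(1/2).

def roughness(heights):
    """Standard deviation of the surface height profile."""
    mean = sum(heights) / len(heights)
    return math.sqrt(sum((h - mean) ** 2 for h in heights) / len(heights))

def random_deposition(L, steps, rng):
    """Drop L particles per time step onto random columns; record (t, W)."""
    heights = [0] * L
    samples = []
    for t in range(1, steps + 1):
        for _ in range(L):
            heights[rng.randrange(L)] += 1
        samples.append((t, roughness(heights)))
    return samples

rng = random.Random(1)
data = random_deposition(L=2000, steps=200, rng=rng)
(t1, w1), (t2, w2) = data[9], data[-1]       # compare t = 10 and t = 200
beta = math.log(w2 / w1) / math.log(t2 / t1)
print(f"measured growth exponent beta ~ {beta:.2f} (random deposition: 0.50)")
```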
    Importantly, surface roughness can be measured experimentally for cold-atomic systems using microscopy techniques, which makes the experimental estimation of EE viable in non-interacting fermions.
    “These findings will deepen our understanding of nonequilibrium physics and provide a novel viewpoint to classify the universal non-equilibrium phenomena emerging in random quantum systems,” says Prof. Fujimoto.
    While the findings of the study do not have a direct influence on our daily lives, they certainly pave the way for a better understanding of real-world quantum systems.
    Story Source:
    Materials provided by Nagoya University.

  •

    Enhanced touch screens could help you 'feel' objects

    The next time you buy a new couch, you may not ever have to leave your old one to get a feel for the texture of the new material.
    Dr. Cynthia Hipwell, Oscar S. Wyatt Jr. ’45 Chair II Professor in the J. Mike Walker ’66 Department of Mechanical Engineering at Texas A&M University, is leading a team working to better define how the finger interacts with a device with the hope of aiding in the further development of technology that goes beyond sensing and reacting to your touch.
    The team’s research was recently published and featured on the cover of the journal Advanced Materials.
    The ultimate goal of furthering this human-machine interface is to give touch devices the ability to provide users with a richer touch-based experience by equipping the technology with the ability to mimic the feeling of physical objects. Hipwell shared examples of potential implementations ranging from a more immersive virtual reality platform to tactile display interfaces like those in a motor vehicle dashboard and a virtual shopping experience that would let the user feel the texture of materials before purchasing them.
    “This could allow you to actually feel textures, buttons, slides and knobs on the screen,” Hipwell said. “It can be used for interactive touch screen-based displays, but one holy grail would certainly be being able to bring touch into shopping so that you could feel the texture of fabrics and other products while you’re shopping online.”
    Hipwell explained that at its essence, the “touch” in current touch screen technology is more for the screen’s benefit than the user. With the emergence and refinement of increasingly sophisticated haptic technology, that relationship between user and device can grow to be more reciprocal.
    She added that the addition of touch as a sensory input would ultimately enrich virtual environments and lighten the burden of communication currently carried by audio and visuals.
    “When we look at virtual experiences, they’re primarily audio and visual right now and we can get audio and visual overload,” Hipwell said. “Being able to bring touch into the human-machine interface can bring a lot more capability, much more realism, and it can reduce that overload. Haptic effects can be used to draw your attention to make something easier to find or easier to do using a lower cognitive load.”
    Hipwell and her team are approaching the research by looking at the multiphysics — the coupled processes or systems involving multiple physical fields occurring at the same time — of the interface between the user’s finger and the device. This interface is incredibly complex and changes with different users and environmental conditions.
    “We’re looking at electro-wetting effects (the forces that result from an applied electric field), electrostatic effects, changes in properties of the finger, the material properties and surface geometry of the device, the contact mechanics, the fluid motion, charge transport — really, everything that’s going on in the interface to understand how the device can be designed to be more reliable and higher performing,” Hipwell said. “Ultimately, our goal is to create predictive models that enable a designer to create devices with maximum haptic effect and minimum sensitivity to user and environmental variation.”
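One of the listed effects, the electrostatic attraction between finger and screen, can be estimated with the standard parallel-plate formula; the contact area, effective gap and drive voltages below are illustrative assumptions, not values from the study:

```python
# Back-of-the-envelope sketch of the electrostatic contribution, using the
# standard parallel-plate estimate F = eps0 * A * V**2 / (2 * d**2).
# Contact area, gap and voltages are illustrative assumptions only.

EPS0 = 8.854e-12              # vacuum permittivity, F/m

def electrostatic_force(area_m2, voltage_v, gap_m):
    """Attractive normal force (N) in the parallel-plate approximation."""
    return EPS0 * area_m2 * voltage_v ** 2 / (2 * gap_m ** 2)

area = 1e-4                   # ~1 cm^2 fingertip contact patch (assumed)
gap = 1e-5                    # ~10 um effective dielectric gap (assumed)
for volts in (50, 100, 200):
    force = electrostatic_force(area, volts, gap)
    print(f"{volts:>3} V -> {force * 1e3:.1f} mN of extra normal force")
```

The force grows with the square of the voltage, which is one reason surface-haptics displays can modulate perceived friction by modulating voltage.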
    As research into and development of the technology continues to progress, Hipwell said she predicts consumers will begin to see early elements implemented into common devices over the next few years, with some early products already in development.
    “I think early elements of it will definitely be within the next five years,” Hipwell said. “Then, it will just be a matter of maturing the technology and how advanced, how realistic and how widespread it becomes.”
    Story Source:
    Materials provided by Texas A&M University. Original written by Steve Kuhlmann.

  •

    Modeling improvements promise increased accuracy for epidemic forecasting

    Accurate forecasting of epidemic scenarios is critical to implementing effective public health intervention policies. While much progress has been made in predicting the general magnitude and timing of epidemics, there’s still room for improvement in forecasting peak times, as unfortunately evidenced with H1N1 and COVID-19, when peak times occurred later than predicted.
    In Chaos, published by AIP Publishing, researchers from France and Italy use dynamical stochastic modeling techniques to reveal that fluctuations in infection and recovery rates play a critical role in determining peak times for epidemics.
    “Some averaged quantities, like infection and recovery rates, are highly sensitive to parameter fluctuations, which means that the latter must be understood, even when average behavior is the only focus of interest,” said co-author Maxence Arutkin. “Our work shows that epidemic peak timing depends on these fluctuations, and neglecting them in epidemiological models can lead to inaccurate epidemic scenarios and unsuitable mitigation policies, not to mention enable viruses to evolve into new variants.”
    Using a susceptible-infected-recovered epidemic model that incorporates daily fluctuations on control parameters, the study applies probability theory calculations to infection counts at the beginning of an epidemic wave and at peak times for populations in Italy. While previous works using standard epidemiological models have suggested there is a delay between the epidemic peak date and its prediction (without fluctuations), the researchers suggest the epidemic peak time depends not only on the mean value of the infection and recovery rates but also on their fluctuations.
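A toy version of the setup can illustrate the effect: a discrete-time SIR model whose infection rate is multiplied by a mean-one lognormal factor each day, so that only the fluctuations change between runs. The parameters below are assumptions for illustration, not the paper's:

```python
import random

# Toy discrete-time SIR model with daily multiplicative fluctuations on the
# infection rate. Parameters are illustrative assumptions, not the paper's.

def sir_peak_day(beta0, gamma, sigma, days, rng):
    """Day on which the infected fraction peaks; sigma sets the size of the
    daily fluctuations of the infection rate."""
    s, i = 0.999, 0.001
    peak_day, peak_i = 0, i
    for day in range(days):
        # mu = -sigma^2 / 2 keeps the mean of the lognormal multiplier at 1
        beta = beta0 * rng.lognormvariate(-0.5 * sigma * sigma, sigma) if sigma else beta0
        s, i = s - beta * s * i, i + beta * s * i - gamma * i
        if i > peak_i:
            peak_day, peak_i = day, i
    return peak_day

rng = random.Random(0)
no_fluct = sir_peak_day(0.3, 0.1, 0.0, 400, rng)          # R0 = beta/gamma = 3
noisy = [sir_peak_day(0.3, 0.1, 0.4, 400, rng) for _ in range(200)]
print("peak day without fluctuations:", no_fluct)
print("mean peak day with fluctuations:", sum(noisy) / len(noisy))
print("spread of peak days across runs:", max(noisy) - min(noisy))
```

Even with the mean rates held fixed, the peak day varies from run to run; that run-to-run dispersion is the kind of peak-time uncertainty the authors quantify.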
    To predict epidemic trajectory, an important parameter is the basic reproduction number, R0, which describes the average number of infections transmitted from an individual. Infection and recovery rate fluctuations lead to lognormal probability distribution of the number of infected people, similar in its analytical form to price distributions for financial assets.
    “In the short term, even when average infections transmitted from a single individual are less than one, we can observe epidemic resurgence due to parameter fluctuations,” said Arutkin. “Also, a dispersion of the epidemic peak time can be quantified showing that, without taking these fluctuations into account, the peak time estimates are biased.”
    The study reveals that improved prediction depends on both R0 levels and fluctuations in infection and recovery rates and may provide policymakers with a tool to assess the consequences of parameter fluctuations based on different R0 levels.
    “Our findings suggest we must introduce parameter fluctuations in epidemiological models going forward,” said Arutkin.
    Story Source:
    Materials provided by American Institute of Physics.