More stories

  •

    Tear-resistant rubbery materials could pave the way for tougher tires

    A new material design could reduce pollution where the rubber meets the road.

    Strategically adding weak points along microscopic chains called polymers actually makes them harder to tear, researchers report in the June 23 Science. Because polymers are used in car tires, the findings could help reduce plastic pollution as tires wear down over time.

    When tires scrape against the road, they shed tiny particles of rubber and plastic polymers, which pollute waterways and contaminate the air (SN: 11/12/18). Every year, tires release an estimated 6 million metric tons of these microplastics into the environment. Stronger polymers that break apart less easily could limit the number of particles shed annually.

  •

    New Zealand kids spending one-third of after-school time on screens

    Regulations are urgently needed to protect children from harm in the unregulated online world, researchers at the University of Otago, New Zealand, say.
    The call comes as the researchers publish the results of their study into the after-school habits of 12-year-olds. Their research, published today in the New Zealand Medical Journal, finds children are spending a third of their after-school time on screens, including more than half their time after 8pm.
    Senior researcher Dr Moira Smith from the University’s Department of Public Health says this is considerably more than the current guidelines, which recommend less than two hours of screen time per day (outside school time) for school-aged children and adolescents.
    The results are from the innovative Kids’Cam project, with the 108 children involved wearing cameras that captured images every seven seconds, offering a unique insight into their everyday lives in 2014 and 2015.
    Children were mostly playing games and watching programmes. For ten per cent of the time, the children were using more than one screen.
    Screen use harms children’s health and wellbeing.

    “It is associated with obesity, poor mental wellbeing, poor sleep and mental functioning and lack of physical activity,” Dr Smith says. “It also affects children’s ability to concentrate and regulate their behaviour and emotions.”
    Screen use is now a regular part of children’s everyday lives and is likely to have increased since the Kids’Cam data was collected.
    “Screen use rose rapidly during the COVID-19 pandemic, and children in 2023 are frequently spending time online, particularly on smartphones. According to the latest media use survey, YouTube and Netflix are the most popular websites for watching programmes, with one in three children under 14 using social media, most commonly TikTok, which is rated R13.”
    She says children are being exposed to ads for vaping, alcohol, gambling and junk food, and experiencing sexism, racism and bullying while online.
    “Cyberbullying is particularly high among children in Aotearoa, with one in four parents reporting their child has been subjected to bullying while online.”
    Dr Smith says current New Zealand legislation is outdated and fails to adequately deal with the online world children are being exposed to.
    “While screen use has many benefits, children need to be protected from harm in this largely unregulated space.”
    She says the Government is to be applauded for proposing more regulation of social media in its recent consultation document from the Department of Internal Affairs (DIA), which notes concern about children accessing inappropriate content while online.
    The Otago researchers are currently studying the online worlds of children in Aotearoa using screen capture technology, with the results expected to be published soon.

  •

    AI finds a way to people’s hearts (literally!)

    AI (artificial intelligence) may sound like a cold robotic system, but Osaka Metropolitan University scientists have shown that it can deliver heartwarming — or, more to the point, “heart-warning” — support. They unveiled an innovative use of AI that classifies cardiac functions and pinpoints valvular heart disease with unprecedented accuracy, demonstrating continued progress in merging the fields of medicine and technology to advance patient care. The results will be published in The Lancet Digital Health.
    Valvular heart disease, one cause of heart failure, is often diagnosed using echocardiography. This technique, however, requires specialized skills, so there is a corresponding shortage of qualified technicians. Meanwhile, chest radiography is one of the most common tests for identifying diseases, primarily of the lungs. Although the heart is also visible in chest radiographs, until now little was known about their ability to reveal cardiac function or disease. Chest radiographs, or chest X-rays, are performed in many hospitals and take very little time to conduct, making them highly accessible and reproducible. Accordingly, the research team led by Dr. Daiju Ueda, from the Department of Diagnostic and Interventional Radiology at the Graduate School of Medicine of Osaka Metropolitan University, reasoned that if cardiac function and disease could be determined from chest radiographs, the test could serve as a supplement to echocardiography.
    Dr. Ueda’s team successfully developed a model that utilizes AI to accurately classify cardiac functions and valvular heart diseases from chest radiographs. Since AI trained on a single dataset faces potential bias, leading to low accuracy, the team aimed for multi-institutional data. Accordingly, a total of 22,551 chest radiographs associated with 22,551 echocardiograms were collected from 16,946 patients at four facilities between 2013 and 2021. With the chest radiographs set as input data and the echocardiograms set as output data, the AI model was trained to learn features connecting both datasets.
    The AI model precisely categorized six selected types of valvular heart disease, with the Area Under the Curve, or AUC, ranging from 0.83 to 0.92. (AUC is a rating index that indicates the capability of an AI model, with values ranging from 0 to 1; the closer to 1, the better.) For detecting left ventricular ejection fraction below a 40% cut-off, an important measure for monitoring cardiac function, the AUC was 0.92.
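    The AUC cited here has a simple probabilistic reading: it is the chance that a randomly chosen diseased case receives a higher model score than a randomly chosen healthy one. A minimal sketch of that computation, with invented labels and scores rather than the study's data:

```python
def auc(labels, scores):
    """Probability that a random positive case outscores a random negative one.

    This rank-based formula (the Mann-Whitney U statistic, rescaled) is
    equivalent to the area under the ROC curve; ties count as half a win.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented example: 1 = disease present, 0 = absent; scores from some model
labels = [0, 0, 0, 1, 1, 1]
scores = [0.1, 0.75, 0.35, 0.8, 0.7, 0.9]
print(round(auc(labels, scores), 3))  # → 0.889: one negative outscores one positive
```

    An AUC of 0.5 would mean the scores are no better than chance; the 0.83 to 0.92 range reported here means the model ranks diseased cases above healthy ones most of the time.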
    “It took us a very long time to get to these results, but I believe this is significant research,” stated Dr. Ueda. “In addition to improving the efficiency of doctors’ diagnoses, the system might also be used in areas where there are no specialists, in night-time emergencies, and for patients who have difficulty undergoing echocardiography.”

  •

    Physicists generate the first snapshots of fermion pairs

    When your laptop or smartphone heats up, it’s due to energy that’s lost in translation. The same goes for power lines that transmit electricity between cities. In fact, around 10 percent of the generated energy is lost in the transmission of electricity. That’s because the electrons that carry electric charge do so as free agents, bumping and grazing against other electrons as they move collectively through power cords and transmission lines. All this jostling generates friction, and, ultimately, heat.
    But when electrons pair up, they can rise above the fray and glide through a material without friction. This “superconducting” behavior occurs in a range of materials, though at ultracold temperatures. If these materials can be made to superconduct closer to room temperature, they could pave the way for zero-loss devices, such as heat-free laptops and phones, and ultraefficient power lines. But first, scientists will have to understand how electrons pair up in the first place.
    Now, new snapshots of particles pairing up in a cloud of atoms can provide clues to how electrons pair up in a superconducting material. The snapshots were taken by MIT physicists and are the first images that directly capture the pairing of fermions — a major class of particles that includes electrons, as well as protons, neutrons, and certain types of atoms.
    In this case, the MIT team worked with fermions in the form of potassium-40 atoms, under conditions that simulate the behavior of electrons in certain superconducting materials. They developed a technique to image a supercooled cloud of potassium-40 atoms, which allowed them to observe the particles pairing up, even when separated by a small distance. They could also pick out interesting patterns and behaviors, such as the way pairs formed checkerboards, which were disturbed by lonely singles passing by.
    The observations, reported today in Science, can serve as a visual blueprint for how electrons may pair up in superconducting materials. The results may also help to describe how neutrons pair up to form an intensely dense and churning superfluid within neutron stars.
    “Fermion pairing is at the basis of superconductivity and many phenomena in nuclear physics,” says study author Martin Zwierlein, the Thomas A. Frank Professor of Physics at MIT. “But no one had seen this pairing in situ. So it was just breathtaking to then finally see these images onscreen, faithfully.”
    The study’s co-authors include Thomas Hartke, Botond Oreg, Carter Turnbaugh, and Ningyuan Jia, all members of MIT’s Department of Physics, the MIT-Harvard Center for Ultracold Atoms, and the Research Laboratory of Electronics.

    A decent view
    To directly observe electrons pair up is an impossible task. They are simply too small and too fast to capture with existing imaging techniques. To understand their behavior, physicists like Zwierlein have looked to analogous systems of atoms. Both electrons and certain atoms, despite their difference in size, are similar in that they are fermions — particles that exhibit a property known as “half-integer spin.” When fermions of opposite spin interact, they can pair up, as electrons do in superconductors, and as certain atoms do in a cloud of gas.
    Zwierlein’s group has been studying the behavior of potassium-40 atoms, which are fermions that can be prepared in one of two spin states. When a potassium atom of one spin interacts with an atom of the other spin, they can form a pair, similar to superconducting electrons. But under normal, room-temperature conditions, the atoms interact in a blur that is difficult to capture.
    To get a decent view of their behavior, Zwierlein and his colleagues study the particles as a very dilute gas of about 1,000 atoms, which they place under ultracold, nanokelvin conditions that slow the atoms to a crawl. The researchers also contain the gas within an optical lattice, or a grid of laser light that the atoms can hop within, and that the researchers can use as a map to pinpoint the atoms’ precise locations.
    In their new study, the team made enhancements to their existing technique for imaging fermions that enabled them to momentarily freeze the atoms in place, then take snapshots separately of potassium-40 atoms with one particular spin or the other. The researchers could then overlay an image of one atom type over the other, and look to see where the two types paired up, and how.

    “It was bloody difficult to get to a point where we could actually take these images,” Zwierlein says. “You can imagine at first getting big fat holes in your imaging, your atoms running away, nothing is working. We’ve had terribly complicated problems to solve in the lab through the years, and the students had great stamina, and finally, to be able to see these images was absolutely elating.”
    Pair dance
    What the team saw was pairing behavior among the atoms that was predicted by the Hubbard model — a widely held theory believed to hold the key to the behavior of electrons in high-temperature superconductors, materials that exhibit superconductivity at relatively high (though still very cold) temperatures. Predictions of how electrons pair up in these materials have been tested through this model, but never directly observed until now.
    The team created and imaged different clouds of atoms thousands of times and translated each image into a digitized version resembling a grid. Each grid showed the location of atoms of both types (depicted as red versus blue in their paper). From these maps, they could see squares in the grid with either a lone red or blue atom, squares where a red and a blue atom paired up locally (depicted as white), and empty squares that contained neither a red nor a blue atom (black).
    Individual images already show many local pairs, with red and blue atoms in close proximity. By analyzing sets of hundreds of images, the team could show that atoms indeed show up in pairs, at times linking up in a tight pair within one square, and at other times forming looser pairs separated by one or several grid spacings. This physical separation, or “nonlocal pairing,” was predicted by the Hubbard model but never directly observed.
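    The grid analysis described above amounts to counting red-blue pairs by their separation in grid spacings. A toy sketch of that idea (the coordinates and distance cut-off are invented for illustration; this is not the team's actual analysis code):

```python
from itertools import product

def pair_histogram(red, blue, max_d=3):
    """Count red-blue atom pairs by separation on a square lattice.

    red, blue: sets of (row, col) lattice sites occupied by atoms of each
    spin state. Distance 0 is a local pair sharing one square; larger
    distances correspond to the looser, "nonlocal" pairs.
    """
    hist = {d: 0 for d in range(max_d + 1)}
    for (r1, c1), (r2, c2) in product(red, blue):
        d = abs(r1 - r2) + abs(c1 - c2)  # separation in grid spacings
        if d <= max_d:
            hist[d] += 1
    return hist

# Toy snapshot: one tight local pair and one pair a single spacing apart
red = {(0, 0), (2, 2)}
blue = {(0, 0), (2, 3)}
print(pair_histogram(red, blue))  # → {0: 1, 1: 1, 2: 0, 3: 0}
```

    Averaged over thousands of snapshots, a histogram like this reveals whether atoms of opposite spin sit near each other more often than chance would predict.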
    The researchers also observed that collections of pairs seemed to form a broader, checkerboard pattern, and that this pattern wobbled in and out of formation as one partner of a pair ventured outside its square and momentarily distorted the checkerboard of other pairings. This phenomenon, known as a “polaron,” was also predicted but never seen directly.
    “In this dynamic soup, the particles are constantly hopping on top of each other, moving away, but never dancing too far from each other,” Zwierlein notes.
    The pairing behavior between these atoms must also occur in superconducting electrons, and Zwierlein says the team’s new snapshots will help to inform scientists’ understanding of high-temperature superconductors, and perhaps provide insight into how these materials might be tuned to higher, more practical temperatures.
    “If you normalize our gas of atoms to the density of electrons in a metal, we think this pairing behavior should occur far above room temperature,” Zwierlein offers. “That gives a lot of hope and confidence that such pairing phenomena can in principle occur at elevated temperatures, and there’s no a priori limit to why there shouldn’t be a room-temperature superconductor one day.”
    This research was supported, in part, by the U.S. National Science Foundation, the U.S. Air Force Office of Scientific Research, and the Vannevar Bush Faculty Fellowship.

  •

    New design rule for high-entropy superionic solid-state conductors

    Solid electrolytes with high lithium-ion conductivity can be designed for millimeter-thick battery electrodes by increasing the complexity of their composite superionic crystals, report researchers from Tokyo Tech. This new design rule enables the synthesis of high-entropy active materials while preserving their superionic conduction.
    As the world transitions towards a greener and more sustainable energy economy, reliance on lithium (Li)-ion batteries is expected to rise. Scientists across the globe are working to design smaller yet more efficient batteries that can keep up with the ever-increasing demand for energy storage. In recent years, all-solid-state lithium batteries (ASSLBs) have captured research interest due to their use of solid electrolytes instead of conventional liquid ones. Solid electrolytes not only make the battery safer from leakage and fire-related hazards, but also provide superior energy and power characteristics. However, their stiffness results in poor wetting of the cathode surface and an inhomogeneous supply of Li ions to the cathode. This, in turn, leads to a loss of capacity in the solid-state battery. The issue becomes more pronounced in thick battery cathodes, such as millimeter-thick ones, which are a more advantageous electrode configuration for realizing inexpensive, high-energy-density battery packs than conventional electrodes with a typical thickness of less than 0.1 mm.
    Fortunately, a recent study published in Science found a way to overcome this problem. The paper -- authored by a team of researchers led by Prof. Ryoji Kanno from Tokyo Institute of Technology (Tokyo Tech) -- describes a new strategy for producing solid electrolytes with enhanced Li-ion conductivity. Their work establishes a design rule for synthesizing high-entropy crystals of lithium superionic conductors via a multi-substitution approach. "Many studies have shown that inorganic ionic conductors tend to show better ion conductivity after multi-element substitution, probably because of the flattened potential barrier of Li-ion migration, which is essential for better ion conductivity," points out Prof. Kanno. This was where they started their research.
    For the design of their new material, the team took inspiration from the chemical compositions of two well-known Li-based solid electrolytes: argyrodite-type (Li6PS5Cl) and LGPS-type (Li10GeP2S12) superionic crystals. They modified the LGPS-type Li9.54Si1.74P1.44S11.7Cl0.3 via multi-substitution and synthesized a series of crystals with the composition Li9.54[Si1−δMδ]1.74P1.44S11.1Br0.3O0.6 (M = Ge, Sn; 0 ≤ δ ≤ 1). The researchers used the crystal with M = Ge and δ = 0.4 as a catholyte in ASSLBs with 1- and 0.8-millimeter-thick cathodes. These cells exhibited discharge capacities of 26.4 mAh cm−2 at 25 °C (1 mm) and 17.3 mAh cm−2 at −10 °C (0.8 mm), area-specific capacities 1.8 and 5.3 times larger, respectively, than those reported for previous state-of-the-art ASSLBs. Theoretical calculations suggested that the enhanced conductivity of the solid electrolyte could result from the flattening of the energy barrier for ion migration, caused by a small degree of chemical substitution in the above-mentioned crystal.
    This study provides a new way of preparing high-entropy solid electrolytes for millimeter-thick electrodes while preserving their superionic conduction pathways. "In effect, the proposed design rule lays a solid groundwork for exploring new superionic conductors with superior charge-discharge performance, even at room temperature," concludes Prof. Kanno.

  •

    Number cruncher calculates whether whales are acting weirdly

    We humans can be a scary acquaintance for whales in the wild. This includes marine biologists tagging them with measuring devices to understand them better. These experiences can make whales behave erratically for a while. Such behaviour can affect research quality and highlights an animal ethics dilemma. Now, University of Copenhagen researchers have figured out how to solve the problems with math.
    Maybe you have tried taking a howling pooch or a cranky cat to the vet. Regardless of your noblest intentions, your pet’s experience may have been equally unpleasant. Animals react to the unknown in their own way. The case is no different for cetaceans like narwhals and bowhead whales when they encounter human-generated noise such as ship noise or mining blasts in the North Atlantic, or when they are caught by well-meaning marine biologists who just want to get to know them better.
    When biologists ‘tag’ whales with measuring devices, the animals react by behaving unusually — abnormally. For example, for a while after being tagged, they may perform many atypical shallow dives and quick jerks. Such behaviour is misleading when the goal is to study the animal’s normal and natural behaviour.
    Now help is arriving from an unusual corner.
    “Biologists seek to understand animals as natural beings, but their reactions turn into unnatural behaviour that creates noise in the dataset. Because of this, a lot of data from just after whales are tagged ends up getting discarded. In this study, we have proposed a mathematical approach using statistical methods that can determine exactly how much data to keep,” says PhD student Lars Reiter from the Department of Mathematics.
    Valuable for humans and animals alike
    With two statistical calculations, the researcher has found a way to estimate when whales like narwhals and bowhead whales will return to their natural behaviour after being tagged. It is a method that can also be used to study how animals respond to other types of disturbances.

    “This research is extremely valuable to us as marine biologists who are interested in the behaviour and well-being of whales. It provides us with a standardised approach by which to distinguish between natural behaviour and affected behaviour in whales. Thus far, we’ve made individual estimates that are more or less spot on,” says marine biologist Outi Tervo from the Greenland Institute of Natural Resources, who collaborated with the mathematicians on the study.
    The statistical method allows researchers to avoid discarding too much or too little data. If too much of the affected data is kept, it can skew the research results; if too much is discarded, it comes at a cost to both the animals and the researchers.
    “It really matters in terms of research, but also financially. And not least, it means something for animal welfare. If we throw away data unnecessarily, more whales will eventually have to go through the experience for us to conduct this research, which is ultimately meant to benefit the animals,” says Outi Tervo.
    Idea came from a parliamentary election
    Whale behaviour does not go from abnormal to normal with a flick of the tail. It normalizes gradually, typically over a day, and in a few cases over a longer period of time. During this transition, a whale’s behaviour falls on both sides of the range designated as normal. So how do scientists figure out where to make the cut?

    “The idea came to me while I was standing in the voting booth during parliamentary elections. Borrowing from the logic of the electoral system, you can consider it as if the whales — or these data points which show the whale’s behaviour — vote on whether they are in or out of their normal range,” explains Lars Reiter.
    By recording 1 positive “vote” when the behaviour is within the normal range, and 1 negative “vote” when outside, the scientists can add up all the votes and find the moment at which the number of votes goes from predominantly negative to positive.
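    The vote-counting rule can be sketched in a few lines. The series below is a toy sequence, not real tag data, and the scoring rule is one plausible reading of the description rather than the study's exact estimator:

```python
def voting_cutoff(in_normal_range):
    """Find the time index after which behaviour is predominantly normal.

    in_normal_range: sequence of booleans, one per measurement, True when
    the behaviour falls inside the normal band. Each measurement "votes"
    +1 (normal) or -1 (abnormal); the cut is placed where the total vote
    after the cut, minus the total vote before it, is maximised, i.e.
    mostly negative votes before the cut and mostly positive after it.
    """
    votes = [1 if ok else -1 for ok in in_normal_range]
    total = sum(votes)
    best_cut, best_score, prefix = 0, total, 0  # cut at 0 keeps all the data
    for t, v in enumerate(votes, start=1):
        prefix += v
        score = (total - prefix) - prefix  # votes after cut minus votes before
        if score > best_score:
            best_cut, best_score = t, score
    return best_cut

# Toy series: jittery post-tagging start, then settled normal behaviour
series = [False, False, True, False, False, True, True, True, True]
print(voting_cutoff(series))  # → 5: the first five measurements count as affected
```

    Everything before the returned index would be discarded as tagging-affected behaviour; everything after it is kept as natural.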
    The researchers use two approaches to determine normal whale behaviour. In part, they look at the whale’s diving pattern, as well as its acceleration and fine motor skills.
    How to calculate the behaviour of animals statistically
    Sometimes a whale hunts in the deep, while at other times it cruises quietly at the surface. The activity that a whale is engaged in is crucial for understanding its normal energy level. Lars Reiter’s method is novel in taking this into account:
    “Where previous research focused on the mean behavior, we instead situate a whale in an activity based on its movements — where it is assessed based on a normal value for acceleration that matches the specific activity being engaged in. We do this by using what are known as quantiles, instead of averages, because they allow us to focus on behavioural extremes. For example, hunting and resting are opposing extremes in terms of energy levels,” explains Lars Reiter.
    When the focus is on the whale’s diving profile, on the other hand, you look at the pattern formed by the whale’s overall activities. By combining depth and time, one can assess whether the distribution of different dive types is natural.
    Wiser about the animals’ hardships and better at avoiding them
    According to the marine biologist, the data-based approach represented by the statistical method also means that researchers can now develop better, more gentle ways of tagging.
    “Based on this study, we already know that the amount of time we spend putting the equipment on is an important factor for how much the animals are affected afterwards. Therefore, we can set up some time limits — where we stop and set the whale free if it takes more than X number of minutes allowed,” says Outi Tervo.
    A shift away from individual estimates to a mathematical standard could also mean better assessments from the veterinary oversight that tag-using research projects are required to go through.
    “The method will make it so that ethical approval from a veterinary inspection is more data-based and precise. So, there is no doubt that this research is a step forward for animal welfare,” says the marine biologist.
    Extra info: An important instrument for a future with less ice and more people
    The natural Arctic habitat of narwhals and bowhead whales is changing due to climate change. Annual ice shrinkage and increasing human activity are occurring in areas that whales once had all to themselves. The researchers’ method can become an important instrument and contribute to a greater understanding of the consequences.
    “It allows us to study how whales are impacted by various human activities. They can be external sources of noise that we can situate in time and location, such as a blast or a ship passing by. Or sounds and activities that we emit ourselves. Lars’ algorithm lets us get a clear picture of how it all affects the animals,” says Outi Tervo.
    Increased activity will lead to more ocean noise, which is of concern to marine biologists with regards to how it will affect large marine animals like narwhal, which are incredibly sensitive to sound. Co-author and supervisor Professor Susanne Ditlevsen believes that the studies and new method will become more important in the years ahead.
    “Climate change is leading to increased anthropogenic activity in Arctic whale habitats. Melting ice means that areas which were once impassable can now be reached by humans. We would like to assess whether it scares and disturbs the animals, but it is not clear how. The new methods can be used to assess at what distance from the animal habitat should various activities take place,” says Susanne Ditlevsen.
    Facts: Statistical method with two mathematical calculations and one intersection.
    The statistical method can generally be understood as calculations with two types of tagging data — acceleration and depth, and a way of adding it up that finds the optimal intersection.
    1. Acceleration tells about the energy level and whale movements (“jerks”). The indicators for natural behaviour are divided according to whale activity, so that, for example, a high energy level is recorded as natural in connection with hunting, but not in connection with rest.
    2. The whale’s diving profile is measured as depth and time spent on a dive. Dives recorded over a 40-hour period show a pattern of different dive types — e.g., U-dives, where the whale stays at depth for some time, and V-dives, where it resurfaces quickly. The pattern is compared with normal values measured after the 40 hours.
    3. The cut-off point for when the whale is back in normal behaviour is found by counting the individual measurements as “voting for or against” normal behaviour. As such, the researchers find the optimal place to divide the research data into natural and influenced behaviour.
    About the study
    The study is part of a larger research collaboration between the Greenland Institute of Natural Resources and the University of Copenhagen’s Department of Mathematics, which focuses on the Arctic’s large marine mammals.
    The researchers include Lars Reiter Nielsen and Susanne Ditlevsen from the University of Copenhagen, Outi M. Tervo and Mads Peter Heide-Jørgensen from the Greenland Institute of Natural Resources, and Susanna B. Blackwell from Greeneridge Sciences, Inc., Santa Barbara, USA.

  •

    Researchers calculate economic value of temporary carbon reduction with ‘Social Value of Offsets’ formula

    A new study identifies how to calculate the economic value of temporarily reducing carbon emissions through carbon offsetting.
    The Social Value of Offsets (SVO) is an economic framework that will help policymakers calculate how much carbon should be stored in temporary offsets to make it equivalent to a permanent CO2 emission.
    Using the SVO metric, the researchers estimate that an offset sequestering one ton of carbon for 50 years is equivalent to between 0.3 and 0.5 tons permanently locked away, taking into account a range of factors for different risks, permanence and climate scenarios.
    Offsets are a key part of Paris-compliant net zero strategies, but many offsetting projects fail and there is never a guarantee on how long an offset will sequester carbon for — making it difficult to measure the economic damage avoided.
    The study, published in Nature, sets out the risks and uncertainties of offsetting, which occur due to the unregulated nature of the global offsets market.
    Risk factors to projects in tropical forests, for example, can include the lack of strong institutions on the ground to monitor, enforce and account for emissions sequestered, as well as the possibility of fires and disease.

    There are also risks in how emissions reductions are reported, as well as the risk of ‘non-additionality’: emissions reductions that would have happened irrespective of the offsetting.
    Other frameworks count the physical units of carbon, but the SVO is unique in that it is an economic framework: the value of a temporary emissions reduction is measured as the economic damage avoided over the length of the offsetting project.
    The researchers say this will potentially make it easier to compare offsetting schemes, allowing anyone offsetting their carbon emissions to be able to weigh up the risks involved and decide how much carbon they would need to offset in temporary schemes to make up for a permanent carbon emission.
    Professor Ben Groom, Dragon Capital Chair in Environmental Economics at the University of Exeter Business School, said: “Our analysis shows that a carbon emission today which is offset by a temporary project can be thought of as a postponed emission with the same warming effect when the project ends, but with less warming during the project.
    “The Social Value of Offsets (SVO) stems from the value of delaying emissions and damages, and this depends on how impermanent, risky or additional they are. Valuing offsets using the SVO then provides a means of comparing offsets with different qualities in terms of the economic damages avoided.”
    Professor Groom explains why delaying emissions is important, both in an economic and physical sense. “With a project that stores carbon and releases it 50 years later, the net carbon reduction is always going to be zero, so some may say it’s as if it never happened.”

    “But what that ignores is the flow of damages that you’ve avoided in the meantime, which could be important, because certain responses to climate change, like the melting of the ice caps, are responsive, depending on how long temperatures have been at a particular level.
    “Delaying emissions is also important because economic processes could be happening in the background that make carbon removal cheaper in the future so offsetting could act as a temporary solution allowing the action point to be delayed until a time when it is cheaper to act.
    “The question we’re answering with SVO is how valuable this temporary period in which you avoid damages is.”
    The IPCC has previously noted that meeting the objectives of the Paris Agreement will require some offsetting, though some organisations suggest that offsetting should be largely avoided due to the unregulated, impermanent and risky nature of the offset market.
    However, this study illustrates that in principle delaying emissions, even when offsetting projects are temporary and risky, is valuable in economic terms.
    The economists believe the SVO metric can play an important role in appraising net-zero climate policy and harmonising the offset market, and has policy applications beyond the valuation of offsets.
    These include calculating the benefit-to-cost ratio of an offset or any temporary carbon storage solution, allowing comparison with alternative technologies for mitigating climate change.
    The SVO formula can also be applied to life-cycle analysis of biofuels, and to pricing carbon debt: as a rule of thumb, a company that emits a ton of carbon today and commits to a permanent removal in 50 years’ time would pay 33% of the carbon price today to cover the damages of temporary atmospheric storage.
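    The 33% rule of thumb can be illustrated with a simple discounting sketch. This is only an illustration of the underlying logic, not the exact SVO formula from the paper, and the discount and price-growth rates below are assumed for the example:

    ```python
    # Pricing "carbon debt": a ton emitted today, permanently removed T years
    # later, paying only for the damages of temporary atmospheric storage.
    # Simplified discounting sketch, NOT the paper's exact SVO formula;
    # the rates r and g are illustrative assumptions.

    def carbon_debt_fraction(T, r=0.028, g=0.02):
        """Fraction of today's carbon price owed for a removal delayed T years.

        r: assumed discount rate; g: assumed growth rate of the carbon price.
        Paying the full price now and crediting back the discounted future
        price leaves a net payment of 1 - ((1+g)/(1+r))**T of today's price.
        """
        return 1.0 - ((1.0 + g) / (1.0 + r)) ** T

    # With these illustrative rates, a 50-year delay costs roughly a third
    # of today's carbon price, in line with the article's rule of thumb.
    print(f"{carbon_debt_fraction(50):.0%}")
    ```

    The fraction rises with the delay and with the gap between the discount rate and carbon-price growth: an immediate removal (T = 0) costs nothing, while a removal postponed indefinitely converges to the full carbon price.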
    The Social Value of Offsets, by Professor Ben Groom, Dragon Capital Chair in Environmental Economics at the University of Exeter Business School and Professor Frank Venmans from the Grantham Research Institute on Climate Change and the Environment at LSE, is published in Nature. More

  • in

    Testing real driverless cars in a virtual environment

    Researchers at The Ohio State University have developed new software to aid in the development, evaluation and demonstration of safer autonomous, or driverless, vehicles.
    Called the Vehicle-in-Virtual-Environment (VVE) method, it allows the testing of driverless cars in a perfectly safe environment, said Bilin Aksun-Guvenc, co-author of the study and a professor of mechanical and aerospace engineering at Ohio State.
    Imagine a driverless car placed in the middle of an empty parking lot. It is driving, but it isn’t reacting to the real world; it is reacting to input from the software, which tells the car what the road looks like and what cars, pedestrians and hazards it is meeting along the way.
    “With our software, we’re able to make the vehicle think that it’s driving on actual roads while actually operating on a large, open, safe test area,” said Aksun-Guvenc. “This saves time and money, and there is no risk of fatal traffic accidents.”
    The study, published recently in the journal Sensors, found that immersing self-driving vehicles in a virtual environment can help them learn to avoid collisions, improve pedestrian safety, and react to rare or extreme traffic events.
    Although autonomous driving technologies have become a much more common sight on the road in the last few years, the number of accidents these systems have caused means the way they are tested deserves closer scrutiny, Aksun-Guvenc said.

    “Our future depends on being able to trust any and all road vehicles with our safety, so all of our research concepts pertain to working towards that goal,” said Aksun-Guvenc, who is also co-director of Ohio State’s Automated Driving Lab, a research group originally formed in 2014 to advance autonomous vehicle technologies.
    Current approaches for demonstrating autonomous vehicle functions involve testing software and technology first in simulations and then on public roads. Yet this method essentially turns other road users into involuntary participants in these driving experiments, said Aksun-Guvenc, and such risks can make the entire development process costly, inefficient, and potentially unsafe for drivers and pedestrians alike.
    To overcome the limitations of these faulty assessments, researchers in this study replaced the output of high-resolution sensors in a real vehicle with simulated data, connecting its controls to a highly realistic 3D environment, much like giving the machine a virtual reality headset. After feeding the data to the autonomous driving system’s computers and syncing the car’s real motions with the simulation’s, the researchers were able to show that the car behaves as if the virtual environment were its true surroundings in real time.
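    In outline, this works like a hardware-in-the-loop simulation: perception comes from the simulator, the unchanged driving software plans against it, the commands actuate the real car on the empty test area, and the car’s measured motion is mirrored back into the virtual world. A minimal sketch of that loop, using simplified stand-in classes rather than the Ohio State software:

    ```python
    # Minimal sketch of a Vehicle-in-Virtual-Environment-style loop.
    # All classes are simplified stand-ins, not the actual VVE software:
    # real perception sensors are bypassed and fed from a simulator, while
    # the car's actual motion is mirrored back into the virtual world.

    class Simulator:
        """Holds the virtual road, traffic and the ego vehicle's pose."""
        def __init__(self):
            self.ego_pose = 0.0          # 1-D position, for illustration
            self.obstacle_at = 50.0      # a virtual pedestrian ahead

        def render_sensor_frame(self):
            # Stands in for rendered lidar/camera data: distance to obstacle.
            return self.obstacle_at - self.ego_pose

        def set_ego_pose(self, pose):
            self.ego_pose = pose         # sync the real motion into the sim

    class RealVehicle:
        """The physical car driving on an empty, safe test area."""
        def __init__(self):
            self.pose = 0.0

        def apply(self, speed, dt=0.1):
            self.pose += speed * dt      # simplified vehicle dynamics

    def autonomy_stack(distance_to_obstacle):
        # Unchanged driving software: stop when the virtual obstacle nears.
        return 10.0 if distance_to_obstacle > 20.0 else 0.0

    sim, car = Simulator(), RealVehicle()
    for _ in range(100):                  # 10 s of the VVE loop at 10 Hz
        frame = sim.render_sensor_frame() # virtual data replaces real sensors
        cmd = autonomy_stack(frame)       # plan against the virtual world
        car.apply(cmd)                    # actuate the real car
        sim.set_ego_pose(car.pose)        # keep sim and car in lockstep

    print(f"stopped {sim.obstacle_at - car.pose:.1f} m short of the virtual pedestrian")
    ```

    The key design point is the last line of the loop: because the real car’s pose is written back into the simulator every cycle, the virtual sensors always describe the world from where the car actually is.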
    But what makes their software especially powerful, said Levent Guvenc, co-author of the study and also co-director of the Automated Driving Lab, is how flexible the virtual environment can be. “When actual senses are replaced by virtual senses, the model can be easily changed to fit any kind of scenario,” said Guvenc.
    Because the VVE method can be calibrated to maintain the properties of the real world while modeling rare events in the virtual environment, it could easily simulate scenarios ranging from extreme ones, like someone jumping in front of a vehicle, to mundane ones, like pedestrians waiting at a crosswalk, he said.

    Additionally, with the help of a vehicle-to-pedestrian communication app, the software can use Bluetooth to link a pedestrian’s mobile phone with a phone in the test vehicle. The researchers had a pedestrian dart quickly across a simulated road a safe distance from the test vehicle, but the Bluetooth signal told the car that the person was darting right in front of it.
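    The pedestrian experiment amounts to a coordinate mapping: the phone broadcasts the pedestrian’s real position, and the car shifts it into the virtual scene so a person crossing safely far away appears directly in the vehicle’s path. A sketch of that idea, with a hypothetical message format rather than the team’s actual app:

    ```python
    # Hypothetical vehicle-to-pedestrian message sketch (not the team's app):
    # the pedestrian's phone transmits its real position, and the car maps
    # that motion into its virtual scene at an offset, so someone crossing
    # safely 60 m away appears to dart right in front of the vehicle.

    import json

    def encode_ped_position(x, y, t):
        """What the pedestrian's phone would transmit over Bluetooth."""
        return json.dumps({"x": x, "y": y, "t": t}).encode()

    def map_into_virtual_scene(payload, offset=(-60.0, 0.0)):
        """Shift the pedestrian's real coordinates into the ego car's path."""
        msg = json.loads(payload)
        return (msg["x"] + offset[0], msg["y"] + offset[1])

    # Pedestrian crosses a road 60 m ahead of the test vehicle...
    payload = encode_ped_position(x=65.0, y=2.0, t=0.5)
    # ...but in the virtual environment they appear only 5 m ahead of the car.
    print(map_into_virtual_scene(payload))   # (5.0, 2.0)
    ```

    Because only positions are shared, the two road users can occupy the same virtual scene while standing in entirely different physical locations, which is exactly the property Guvenc describes below.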
    “The beauty of the method is that road users can share the same environment at the same time without being in the same location at all,” said Guvenc. And although generating these super-realistic environments can take time, he said the technological challenge of syncing different environments to use in real-time simulations is one challenge their team has solved.
    The team has also filed a patent for the technology. In the future, Guvenc said he’d also like to see it integrated into traffic guidelines issued by groups such as the National Highway Traffic Safety Administration.
    “We could see this technology becoming a staple in the industry in the next five or 10 years,” said Guvenc. “That’s why we’re focusing on building more applications for it.”
    Other Ohio State co-authors were Xincheng Cao, Haochong Chen and Sukru Yaren Gelbal. More