More stories

  • Making self-driving cars human-friendly

    Automated vehicles could be made more pedestrian-friendly thanks to new research which could help them predict when people will cross the road.
    University of Leeds-led scientists investigating how to better understand human behaviour in traffic say that neuroscientific theories of how the brain makes decisions can be used in automated vehicle technology to improve safety and make the vehicles more human-friendly.
    The researchers set out to determine whether a decision-making model called drift diffusion could predict when pedestrians would cross a road in front of approaching cars, and whether it could be used in scenarios where the car gives way to the pedestrian, either with or without explicit signals. This prediction capability would allow an automated vehicle to communicate more effectively with pedestrians, through its movements in traffic and external signals such as flashing lights, to maximise traffic flow and decrease uncertainty.
    Drift diffusion models assume that people reach decisions after accumulation of sensory evidence up to a threshold at which the decision is made.
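    As a rough illustration of that mechanism (a minimal sketch with made-up parameters, not the model fitted in the study), the accumulation-to-threshold process can be simulated in a few lines of Python:
    ```python
    import numpy as np

    def drift_diffusion(drift=0.8, noise=1.0, threshold=1.5,
                        dt=0.001, max_t=5.0, rng=None):
        """Accumulate noisy evidence until it crosses a decision threshold.

        Returns the decision time in seconds, or None if no decision is
        reached within max_t. All parameter values are illustrative.
        """
        rng = rng or np.random.default_rng()
        evidence, t = 0.0, 0.0
        while t < max_t:
            # Evidence grows by a mean drift plus Gaussian noise each step.
            evidence += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
            if evidence >= threshold:  # threshold crossed: decide to cross
                return t
        return None

    # Stronger evidence (e.g. a clearly decelerating car) yields faster decisions.
    print([drift_diffusion(drift=d) for d in (0.5, 1.0, 2.0)])
    ```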
    Professor Gustav Markkula, from the University of Leeds’ Institute for Transport Studies and the senior author of the study, said: “When making the decision to cross, pedestrians seem to be adding up lots of different sources of evidence, not only relating to the vehicle’s distance and speed, but also using communicative cues from the vehicle in terms of deceleration and headlight flashes.
    “When a vehicle is giving way, pedestrians will often feel quite uncertain about whether the car is actually yielding, and will often end up waiting until the car has almost come to a full stop before starting to cross. Our model clearly shows this state of uncertainty borne out, meaning it can be used to help design how automated vehicles behave around pedestrians in order to limit uncertainty, which in turn can improve both traffic safety and traffic flow.”

  • Calculating the path of cancer

    Biologists at Cold Spring Harbor Laboratory (CSHL) are using a mathematical approach developed in CSHL Assistant Professor David McCandlish’s lab to find solutions to a diverse set of biological problems. Originally created as a way to understand interactions between different mutations in proteins, the tool is now being used by McCandlish and his collaborators to learn about the complexities of gene expression and the chromosomal mutations associated with cancer. McCandlish says:
    “This is one of the things that’s really fascinating about mathematical research: sometimes you can see connections between topics which on the surface seem so different, but at a mathematical level they might be using some of the same technical ideas.”
    All of these questions involve mapping the likelihood of different variations on a biological theme: which combinations of mutations are most likely to arise in a particular protein, for example, or which chromosome mutations are most often found together in the same cancer cell. McCandlish explains that these are problems of density estimation — a statistical tool that predicts how often an event happens. Density estimation can be relatively straightforward, such as charting different heights within a group of people. But when dealing with complex biological sequences, such as the hundreds, or thousands of amino acids that are strung together to build a protein, predicting the probability of each potential sequence becomes astonishingly complex.
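    At the simple end of that spectrum, density estimation fits in a few lines; here is a hedged sketch using SciPy's kernel density estimator on made-up height data (the sample values and query points are purely illustrative):
    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Illustrative sample of heights in centimetres (made-up data).
    heights = np.array([155, 160, 162, 165, 168, 170, 171, 174, 178, 183])

    # Fit a kernel density estimate: a smooth estimate of how often
    # each height occurs in the underlying population.
    kde = gaussian_kde(heights)

    # Estimated probability density at a few query heights.
    print(kde([150, 170, 190]))
    ```
    The hard version of the problem swaps this one-dimensional axis for the astronomically large space of possible sequences, which is where the lab's specialised methods come in.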
    McCandlish explains the fundamental problem his team is using math to address: “Sometimes if you make, say one mutation to a protein sequence, it doesn’t do anything. The protein works fine. And if you make a second mutation, it still works fine, but then if you put the two of them together, now you’ve got a broken protein. We’ve been trying to come up with methods to model not just interactions between pairs of mutations, but between three or four or any number of mutations.”
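    A toy example makes the problem concrete (schematic numbers only, not the lab's data or method): each single mutation leaves the protein functional, yet a model with no interaction terms cannot predict that the pair is broken.
    ```python
    # Toy fitness values: 1.0 = functional protein, 0.0 = broken.
    fitness = {
        (): 1.0,          # wild type
        ("A",): 1.0,      # mutation A alone: still works
        ("B",): 1.0,      # mutation B alone: still works
        ("A", "B"): 0.0,  # A and B together: broken
    }

    # An additive (no-interaction) model predicts the double mutant
    # from the two single-mutant effects relative to wild type.
    wt = fitness[()]
    additive = wt + (fitness[("A",)] - wt) + (fitness[("B",)] - wt)

    print(additive)             # 1.0 -- the additive model predicts a working protein
    print(fitness[("A", "B")])  # 0.0 -- reality disagrees; the gap is the
                                # interaction (epistasis) term the new methods
                                # generalise to any number of mutations
    ```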
    The methods they have developed can be used to interpret data from experiments that measure how hundreds of thousands of different combinations of mutations impact the function of a protein.
    This study, reported in the Proceedings of the National Academy of Sciences, began with conversations with two other CSHL colleagues: CSHL Fellow Jason Sheltzer and Associate Professor Justin Kinney. They worked with McCandlish to apply his methods to gene expression and the evolution of cancer mutations. Software released by McCandlish’s team will enable other researchers to use these same approaches in their own work. He says he hopes it will be applied to a variety of biological problems.
    Story Source:
    Materials provided by Cold Spring Harbor Laboratory. Original written by Jennifer Michalowski.

  • Precious metals from electronic waste in seconds

    In what should be a win-win-win for the environment, a process developed at Rice University to extract valuable metals from electronic waste would also use up to 500 times less energy than current lab methods and produce a byproduct clean enough for agricultural land.
    The flash Joule heating method introduced last year to produce graphene from carbon sources like waste food and plastic has been adapted to recover rhodium, palladium, gold and silver for reuse.
    A report in Nature Communications by the Rice lab of chemist James Tour also shows highly toxic heavy metals including chromium, arsenic, cadmium, mercury and lead are removed from the flashed materials, leaving a byproduct with minimal metal content.
    Instantly heating the waste to 3,400 Kelvin (5,660 degrees Fahrenheit) with a jolt of electricity vaporizes the precious metals, and the gases are vented away for separation, storage or disposal. Tour said that with more than 40 million tons of e-waste produced globally every year, there is plenty of potential for “urban mining.”
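    The quoted figures are a straightforward unit conversion, as a quick check shows:
    ```python
    # Kelvin to Fahrenheit: F = (K - 273.15) * 9/5 + 32
    kelvin = 3400
    fahrenheit = (kelvin - 273.15) * 9 / 5 + 32
    print(round(fahrenheit))  # 5660
    ```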
    “Here, the largest growing source of waste becomes a treasure,” Tour said. “This will curtail the need to go all over the world to mine from ores in remote and dangerous places, stripping the Earth’s surface and using gobs of water resources. The treasure is in our dumpsters.”
    He noted an increasingly rapid turnover of personal devices like cell phones has driven the worldwide rise of electronic waste, with only about 20% of it currently being recycled.

  • Induced flaws in quantum materials could enhance superconducting properties

    In a surprising discovery, an international team of researchers, led by scientists in the University of Minnesota Center for Quantum Materials, found that deformations in quantum materials that cause imperfections in the crystal structure can actually improve the material’s superconducting and electrical properties.
    The groundbreaking findings could provide new insight for developing the next generation of quantum-based computing and electronic devices.
    The research just appeared in Nature Materials, a peer-reviewed scientific journal published by Nature Publishing Group.
    “Quantum materials have unusual magnetic and electrical properties that, if understood and controlled, could revolutionize virtually every aspect of society and enable highly energy-efficient electrical systems and faster, more accurate electronic devices,” said study co-author Martin Greven, a Distinguished McKnight Professor in the University of Minnesota’s School of Physics and Astronomy and the Director of the Center for Quantum Materials. “The ability to tune and modify the properties of quantum materials is pivotal to advances in both fundamental research and modern technology.”
    Elastic deformation of materials occurs when the material is subjected to stress but returns to its original shape once the stress is removed. In contrast, plastic deformation is the non-reversible change of a material’s shape in response to an applied stress — or, more simply, the act of squeezing or stretching it until it loses its shape. Plastic deformation has been used by blacksmiths and engineers for thousands of years. An example of a material with a large plastic deformation range is wet chewing gum, which can be stretched to dozens of times its original length.
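    A minimal one-dimensional sketch of the distinction, with made-up numbers rather than measured material properties: below a yield strain the material springs back; beyond it, the excess remains as a permanent set.
    ```python
    def residual_strain(applied_strain, yield_strain=0.02):
        """Strain remaining after the stress is removed (toy 1-D model).

        Below the yield point the deformation is elastic and fully
        recovers; beyond it, the excess persists as plastic deformation.
        The yield value is illustrative only.
        """
        if applied_strain <= yield_strain:
            return 0.0  # elastic: the shape recovers completely
        return applied_strain - yield_strain  # plastic: permanent change

    print(residual_strain(0.01))  # 0.0  -> elastic, returns to original shape
    print(residual_strain(0.05))  # 0.03 -> plastic, permanently deformed
    ```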
    While elastic deformation has been extensively used to study and manipulate quantum materials, the effects of plastic deformation have not yet been explored. In fact, conventional wisdom would lead scientists to believe that “squeezing” or “stretching” quantum materials may remove their most intriguing properties.

  • How apples get their shapes

    Apples are among the oldest and most recognizable fruits in the world. But have you ever really considered an apple’s shape? Apples are relatively spherical except for that characteristic dimple at the top where the stem grows.
    How do apples grow that distinctive shape?
    Now, a team of mathematicians and physicists has used observations, lab experiments, theory and computation to understand the growth and form of the cusp of an apple.
    The paper is published in Nature Physics.
    “Biological shapes are often organized by the presence of structures that serve as focal points,” said L Mahadevan, the Lola England de Valpine Professor of Applied Mathematics, of Organismic and Evolutionary Biology, and of Physics at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and senior author of the study. “These focal points can sometimes take the form of singularities where deformations are localized. A ubiquitous example is seen in the cusp of an apple, the inward dimple where the stalk meets the fruit.”
    Mahadevan had already developed a simple theory to explain the form and growth of apples, but the project began to bear fruit when the researchers were able to connect observations of real apples at different growth stages with gel experiments that mimicked the growth, along with theory and computations.

  • Foundational step shows quantum computers can be better than the sum of their parts

    Pobody’s nerfect — not even the indifferent, calculating bits that are the foundation of computers. But JQI Fellow Christopher Monroe’s group, together with colleagues from Duke University, has made progress toward ensuring we can trust the results of quantum computers even when they are built from pieces that sometimes fail. They have shown in an experiment, for the first time, that an assembly of quantum computing pieces can be better than the worst parts used to make it. In a paper published in the journal Nature on Oct. 4, 2021, the team shared how they took this landmark step toward reliable, practical quantum computers.
    In their experiment, the researchers combined several qubits — the quantum version of bits — so that they functioned together as a single unit called a logical qubit. They created the logical qubit based on a quantum error correction code so that, unlike for the individual physical qubits, errors can be easily detected and corrected, and they made it fault-tolerant — capable of containing errors to minimize their negative effects.
    “Qubits composed of identical atomic ions are natively very clean by themselves,” says Monroe, who is also a Fellow of the Joint Center for Quantum Information and Computer Science and a College Park Professor in the Department of Physics at the University of Maryland. “However, at some point, when many qubits and operations are required, errors must be reduced further, and it is simpler to add more qubits and encode information differently. The beauty of error correction codes for atomic ions is they can be very efficient and can be flexibly switched on through software controls.”
    This is the first time that a logical qubit has been shown to be more reliable than the most error-prone step required to make it. The team was able to successfully put the logical qubit into its starting state and measure it 99.4% of the time, despite relying on six quantum operations that are individually expected to work only about 98.9% of the time.
    That might not sound like a big difference, but it’s a crucial step in the quest to build much larger quantum computers. If the six quantum operations were assembly line workers, each focused on one task, the assembly line would only produce the correct initial state 93.6% of the time (98.9% multiplied by itself six times) — roughly ten times worse than the error measured in the experiment. That improvement is because in the experiment the imperfect pieces work together to minimize the chance of quantum errors compounding and ruining the result, similar to watchful workers catching each other’s mistakes.
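    The arithmetic behind that comparison is easy to reproduce:
    ```python
    p_op = 0.989             # success probability of each quantum operation
    p_chain = p_op ** 6      # six operations in a row, errors compounding
    print(f"{p_chain:.3f}")  # 0.936 -> a 6.4% failure rate

    p_logical = 0.994        # measured logical-qubit success rate
    print((1 - p_chain) / (1 - p_logical))  # ~10.7, i.e. roughly ten times worse
    ```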
    The results were achieved using Monroe’s ion-trap system at UMD, which uses up to 32 individual charged atoms — ions — that are cooled with lasers and suspended over electrodes on a chip. The researchers then use each ion as a qubit by manipulating it with lasers.

  • Scientists are one step closer to error-correcting quantum computers

    Mistakes happen — especially in quantum computers. The fragile quantum bits, or qubits, that make up the machines are notoriously error-prone, but now scientists have shown that they can fix the flubs.

    Computers that harness the rules of quantum mechanics show promise for making calculations far out of reach for standard computers (SN: 6/29/17). But without a mechanism for fixing the computers’ mistakes, the answers that a quantum computer spits out could be gobbledygook (SN: 6/22/20).

    Combining the power of multiple qubits into one can solve the error woes, researchers report October 4 in Nature. Scientists used nine qubits to make a single, improved qubit called a logical qubit, which, unlike the individual qubits from which it was made, can be probed to check for mistakes.

    “This is a key demonstration on the path to build a large-scale quantum computer,” says quantum physicist Winfried Hensinger of the University of Sussex in Brighton, England, who was not involved in the new study.

    Still, that path remains a long one, Hensinger says. To do complex calculations, scientists will have to dramatically scale up the number of qubits in the machines. But now that scientists have shown that they can keep errors under control, he says, “there’s nothing fundamentally stopping us from building a useful quantum computer.”

    In a logical qubit, information is stored redundantly. That allows researchers to check and fix mistakes in the data. “If a piece of it goes missing, you can reconstruct it from the other pieces, like Voldemort,” says quantum physicist David Schuster of the University of Chicago, who was not involved with the new research. (The Harry Potter villain kept his soul safe by concealing it in multiple objects called Horcruxes.)
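
    The classical analogue of that redundancy is a repetition code. Here is a hedged sketch of the idea (far simpler than the quantum code used in the study, which must detect errors without directly reading out the protected data):

    ```python
    def encode(bit):
        """Store one logical bit redundantly as three physical bits."""
        return [bit, bit, bit]

    def decode(bits):
        """Majority vote recovers the logical bit if at most one copy flipped."""
        return int(sum(bits) >= 2)

    word = encode(1)
    word[0] ^= 1         # one physical bit gets corrupted
    print(decode(word))  # 1 -- the logical bit survives the error
    ```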

    In the new study, four additional, auxiliary qubits interfaced with the logical qubit, in order to identify errors in its data. Future quantum computers could make calculations using logical qubits in place of the original, faulty qubits, repeatedly checking and fixing any errors that crop up.

    To make their logical qubit, the researchers used a technique called a Bacon-Shor code, applying it to qubits made of ytterbium ions that hover above an ion-trapping chip inside a vacuum chamber and are manipulated with lasers. The researchers also designed sequences of operations so that errors don’t multiply uncontrollably, a property known as “fault tolerance.”

    Thanks to those efforts, the new logical qubit had a lower error rate than that of the most flawed components that made it up, says quantum physicist Christopher Monroe of the University of Maryland in College Park and Duke University.

    However, the team didn’t quite complete the full process envisioned for error correction. While the computer detected the errors that arose, the researchers didn’t correct the mistakes and continue on with computation. Instead, they fixed errors after the computer was finished. In a full-fledged example, scientists would detect and correct errors multiple times on the fly.

    Demonstrating quantum error correction is a necessity for building useful quantum computers. “It’s like achieving criticality with [nuclear] fission,” Schuster says. Once that nuclear science barrier was passed in 1942, it led to technologies like nuclear power and atomic bombs (SN: 11/29/17).

    As quantum computers gradually draw closer to practical usefulness, companies are investing in the devices. Technology companies such as IBM, Google and Intel host major quantum computing endeavors. On October 1, a quantum computing company cofounded by Monroe, called IonQ, went public; Monroe spoke to Science News while on a road trip to ring the opening bell at the New York Stock Exchange.

    The new result suggests that full-fledged quantum error correction is almost here, says coauthor Kenneth Brown, a quantum engineer also at Duke University. “It really shows that we can get all the pieces together and do all the steps.”

  • Unprecedented view of a single catalyst nanoparticle at work

    A DESY-led research team has been using high-intensity X-rays to observe a single catalyst nanoparticle at work. The experiment has revealed for the first time how the chemical composition of the surface of an individual nanoparticle changes under reaction conditions, making it more active. The team led by DESY’s Andreas Stierle is presenting its findings in the journal Science Advances. This study marks an important step towards a better understanding of real, industrial catalytic materials.
    Catalysts are materials that promote chemical reactions without being consumed themselves. Today, catalysts are used in numerous industrial processes, from fertiliser production to manufacturing plastics. Because of this, catalysts are of huge economic importance. A very well-known example is the catalytic converter installed in the exhaust systems of cars. These contain precious metals such as platinum, rhodium and palladium, which allow highly toxic carbon monoxide (CO) to be converted into carbon dioxide (CO2) and reduce the amount of harmful nitrogen oxides (NOx).
    “In spite of their widespread use and great importance, we are still ignorant of many important details of just how the various catalysts work,” explains Stierle, head of the DESY NanoLab. “That’s why we have long wanted to study real catalysts while in operation.” This is not easy, because in order to make the active surface as large as possible, catalysts are typically used in the form of tiny nanoparticles, and the changes that affect their activity occur on their surface.
    Surface strain relates to chemical composition
    In the framework of the EU project Nanoscience Foundries and Fine Analysis (NFFA), the team from DESY NanoLab has developed a technique for labelling individual nanoparticles and thereby identifying them in a sample. “For the study, we grew nanoparticles of a platinum-rhodium alloy on a substrate in the lab and labelled one specific particle,” says co-author Thomas Keller from DESY NanoLab, who is in charge of the project at DESY. “The diameter of the labelled particle is around 100 nanometres, and it is similar to the particles used in a car’s catalytic converter.” A nanometre is a millionth of a millimetre.
    Using X-rays from the European Synchrotron Radiation Facility ESRF in Grenoble, France, the team was not only able to create a detailed image of the nanoparticle; it also measured the mechanical strain within its surface. “The surface strain is related to the surface composition, in particular the ratio of platinum to rhodium atoms,” explains co-author Philipp Pleßow from the Karlsruhe Institute of Technology (KIT), whose group computed strain as a function of surface composition. By comparing the observed and computed facet-dependent strain, conclusions can be drawn concerning the chemical composition at the particle surface. The different surfaces of a nanoparticle are called facets, just like the facets of a cut gemstone.
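    As a back-of-the-envelope illustration of how strain and composition can be linked (a hedged sketch using Vegard's law, a standard linear interpolation between pure-metal lattice constants, not the calculations performed at KIT):
    ```python
    A_PT = 0.39242  # lattice constant of platinum, in nanometres
    A_RH = 0.38034  # lattice constant of rhodium, in nanometres

    def alloy_lattice(x_rh):
        """Vegard's law: linear mix of the pure-metal lattice constants."""
        return (1 - x_rh) * A_PT + x_rh * A_RH

    def strain_vs_platinum(x_rh):
        """Strain of the alloy surface relative to pure platinum."""
        return (alloy_lattice(x_rh) - A_PT) / A_PT

    # A surface with 20% rhodium would appear compressed by about 0.6%.
    print(f"{strain_vs_platinum(0.2):.4%}")
    ```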
    When the nanoparticle is grown, its surface consists mainly of platinum atoms, as this configuration is energetically favoured. However, the scientists studied the shape of the particle and its surface strain under different conditions, including the operating conditions of an automotive catalytic converter. To do this, they heated the particle to around 430 degrees Celsius and allowed carbon monoxide and oxygen molecules to pass over it. “Under these reaction conditions, the rhodium inside the particle becomes mobile and migrates to the surface because it interacts more strongly with oxygen than the platinum,” explains Pleßow. This is also predicted by theory.
    “As a result, the surface strain and the shape of the particle change,” reports co-author Ivan Vartaniants, from DESY, whose team converted the X-ray diffraction data into three-dimensional spatial images. “A facet-dependent rhodium enrichment takes place, whereby additional corners and edges are formed.” The chemical composition of the surface, and the shape and size of the particles, have a significant effect on their function and efficiency. However, scientists are only just beginning to understand exactly how these are connected and how to control the structure and composition of the nanoparticles. The X-rays allow researchers to detect changes of as little as 0.1 parts per thousand in the strain, which in this experiment corresponds to a precision of about 0.0003 nanometres (0.3 picometres).
    Crucial step towards analysing industrial catalyst materials
    “We can now, for the first time, observe the details of the structural changes in such catalyst nanoparticles while in operation,” says Stierle, Lead Scientist at DESY and professor for nanoscience at the University of Hamburg. “This is a major step forward and is helping us to understand an entire class of reactions that make use of alloy nanoparticles.” Scientists at KIT and DESY now want to explore this systematically at the new Collaborative Research Centre 1441, funded by the German Research Foundation (DFG) and entitled “Tracking the Active Sites in Heterogeneous Catalysis for Emission Control (TrackAct).”
    “Our investigation is an important step towards analysing industrial catalytic materials,” Stierle points out. Until now, scientists have had to grow model systems in the laboratory in order to conduct such investigations. “In this study, we have gone to the limit of what can be done. With DESY’s planned X-ray microscope PETRA IV, we will be able to look at ten times smaller individual particles in real catalysts, and under reaction conditions.”