More stories

    Chemical reactions can scramble quantum information as well as black holes

    If you were to throw a message in a bottle into a black hole, all of the information in it, down to the quantum level, would become completely scrambled. In black holes, this scrambling happens as quickly and thoroughly as quantum mechanics allows, which is why they are generally considered nature’s ultimate information scramblers.
    New research from Rice University theorist Peter Wolynes and collaborators at the University of Illinois Urbana-Champaign, however, has shown that molecules can be as formidable at scrambling quantum information as black holes. Combining mathematical tools from black hole physics and chemical physics, they have shown that quantum information scrambling takes place in chemical reactions and can nearly reach the same quantum mechanical limit as it does in black holes. The work is published online in the Proceedings of the National Academy of Sciences.
    “This study addresses a long-standing problem in chemical physics, which has to do with the question of how fast quantum information gets scrambled in molecules,” Wolynes said. “When people think about a reaction where two molecules come together, they think the atoms only perform a single motion where a bond is made or a bond is broken.
    “But from the quantum mechanical point of view, even a very small molecule is a very complicated system. Much like the orbits in the solar system, a molecule has a huge number of possible styles of motion — things we call quantum states. When a chemical reaction takes place, quantum information about the quantum states of the reactants becomes scrambled, and we want to know how information scrambling affects the reaction rate.”
    To better understand how quantum information is scrambled in chemical reactions, the scientists borrowed a mathematical tool typically used in black hole physics known as out-of-time-order correlators, or OTOCs.
    “OTOCs were actually invented in a very different context about 55 years ago, when they were used to look at how electrons in superconductors are affected by disturbances from an impurity,” Wolynes said. “They’re a very specialized object that is used in the theory of superconductivity. They were next used by physicists in the 1990s studying black holes and string theory.”
    OTOCs measure how much tweaking one part of a quantum system at some instant in time will affect the motions of the other parts — providing insight into how quickly and effectively information can spread throughout the molecule. They are the quantum analog of Lyapunov exponents, which measure unpredictability in classical chaotic systems.
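    For small systems an OTOC can be computed directly by exact diagonalization. The sketch below does this for a four-qubit transverse-field Ising chain (a standard toy model chosen purely for illustration, not the molecular systems of the study): the correlator vanishes at t = 0 and grows as information about the initial perturbation spreads along the chain.

```python
# Toy infinite-temperature OTOC for a small transverse-field Ising
# chain -- a minimal sketch of the diagnostic described above.
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def site_op(op, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit chain."""
    mats = [op if i == site else I2 for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def ising_hamiltonian(n, J=1.0, h=1.05):
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        H += -J * site_op(Z, i, n) @ site_op(Z, i + 1, n)
    for i in range(n):
        H += -h * site_op(X, i, n)
    return H

def otoc(H, W, V, t):
    """Infinite-temperature OTOC C(t) = Tr([W(t),V]^† [W(t),V]) / dim."""
    evals, U = np.linalg.eigh(H)
    Ut = U @ np.diag(np.exp(-1j * evals * t)) @ U.conj().T
    Wt = Ut.conj().T @ W @ Ut          # Heisenberg-picture W(t)
    comm = Wt @ V - V @ Wt
    return np.real(np.trace(comm.conj().T @ comm)) / H.shape[0]

n = 4
H = ising_hamiltonian(n)
W = site_op(Z, 0, n)       # perturbation at one end of the chain
V = site_op(Z, n - 1, n)   # probe at the other end
print(round(otoc(H, W, V, 0.0), 8))  # 0.0 -- operators commute at t=0
print(otoc(H, W, V, 3.0) > 0)        # True -- information has spread
```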

    “How quickly an OTOC increases with time tells you how quickly information is being scrambled in the quantum system, meaning how many more random looking states are getting accessed,” said Martin Gruebele, a chemist at Illinois Urbana-Champaign and co-author on the study who is a part of the joint Rice-Illinois Center for Adapting Flaws as Features funded by the National Science Foundation. “Chemists are very conflicted about scrambling in chemical reactions, because scrambling is necessary to get to the reaction goal, but it also messes up your control over the reaction.
    “Understanding under what circumstances molecules scramble information and under what circumstances they don’t potentially gives us a handle on actually being able to control the reactions better. Knowing OTOCs basically allows us to set limits on when this information is really disappearing out of our control and conversely when we could still harness it to have controlled outcomes.”
    In classical mechanics, a particle must have enough energy to overcome an energy barrier for a reaction to occur. However, in quantum mechanics, there’s the possibility that particles can “tunnel” through this barrier even if they don’t possess sufficient energy. The calculation of OTOCs showed that chemical reactions with a low activation energy at low temperatures where tunneling dominates can scramble information at nearly the quantum limit, like a black hole.
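    The classical-versus-quantum contrast can be made concrete with the textbook formula for transmission through a rectangular barrier; the example below is illustrative and not part of the study’s calculations.

```python
# Exact transmission probability through a rectangular barrier for a
# particle with energy E below the barrier height V0 (natural units).
# Classically the particle is always reflected; quantum mechanically
# it tunnels through with probability T > 0.
import numpy as np

def transmission(E, V0, width, m=1.0, hbar=1.0):
    """T = [1 + V0^2 sinh^2(kappa*L) / (4 E (V0 - E))]^-1, for E < V0."""
    kappa = np.sqrt(2 * m * (V0 - E)) / hbar
    denom = 1 + (V0**2 * np.sinh(kappa * width)**2) / (4 * E * (V0 - E))
    return 1.0 / denom

# Classically forbidden (E < V0), yet the particle still gets through:
print(transmission(E=0.5, V0=1.0, width=1.0))  # ~0.42, nonzero
```

Widening the barrier suppresses the effect exponentially, which is why tunneling dominates only in particular regimes, such as low temperatures.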
    Nancy Makri, also a chemist at Illinois Urbana-Champaign, used path integral methods she has developed to study what happens when the simple chemical reaction model is embedded in a larger system, which could be a large molecule’s own vibrations or a solvent, and tends to suppress chaotic motion.
    “In a separate study, we found that large environments tend to make things more regular and suppress the effects that we’re talking about,” Makri said. “So we calculated the OTOC for a tunneling system interacting with a large environment, and what we saw was that the scrambling was quenched — a big change in the behavior.”
    One area of practical application for the research findings is to place limits on how tunneling systems can be used to build qubits for quantum computers. One needs to minimize information scrambling between interacting tunneling systems to improve the reliability of quantum computers. The research could also be relevant for light-driven reactions and advanced materials design.
    “There’s potential for extending these ideas to processes where you wouldn’t just be tunneling in one particular reaction, but where you’d have multiple tunneling steps, because that’s what’s involved in, for example, electron conduction in a lot of the new soft quantum materials like perovskites that are being used to make solar cells and things like that,” Gruebele said.
    Wolynes is Rice’s D.R. Bullard-Welch Foundation Professor of Science, a professor of chemistry, biochemistry and cell biology, physics and astronomy, and materials science and nanoengineering and co-director of its Center for Theoretical Biological Physics, which is funded by the National Science Foundation. Co-author Gruebele is the James R. Eiszner Endowed Chair in Chemistry; Makri is the Edward William and Jane Marr Gutgsell Professor and professor of chemistry and physics; Chenghao Zhang was a graduate student in physics at Illinois Urbana-Champaign and is now a postdoc at Pacific Northwest National Lab; and Sohang Kundu recently received his Ph.D. in chemistry from the University of Illinois and is currently a postdoc at Columbia University.
    The research was supported by the National Science Foundation (1548562, 2019745, 1955302) and the Bullard-Welch Chair at Rice (C-0016).

    Progress in quantum physics: Researchers tame superconductors

    Superconductors are materials that can conduct electricity without electrical resistance — making them the ideal base material for electronic components in MRI machines, magnetic levitation trains and even particle accelerators. However, conventional superconductors are easily disturbed by magnetism. An international group of researchers has now succeeded in building a hybrid device consisting of a stable proximitized superconductor that is enhanced by magnetism and whose function can be specifically controlled.
    They combined the superconductor with a special semiconductor material known as a topological insulator. “Topological insulators are materials that conduct electricity on their surface but not inside. This is due to their unique topological structure, i.e. the special arrangement of the electrons,” explains Professor Charles Gould, a physicist at the Institute for Topological Insulators at the University of Würzburg (JMU). “The exciting thing is that we can equip topological insulators with magnetic atoms so that they can be controlled by a magnet.”
    The superconductors and topological insulators were coupled to form a so-called Josephson junction, a connection between two superconductors separated by a thin layer of non-superconducting material. “This allowed us to combine the properties of superconductivity and semiconductors,” says Gould. “So we combine the advantages of a superconductor with the controllability of the topological insulator. Using an external magnetic field, we can now precisely control the superconducting properties. This is a true breakthrough in quantum physics!”
    Superconductivity Meets Magnetism
    The special combination creates an exotic state in which superconductivity and magnetism are combined — normally these are opposite phenomena that rarely coexist. This is known as the proximity-induced Fulde-Ferrell-Larkin-Ovchinnikov (p-FFLO) state. The new “superconductor with a control function” could be important for practical applications, such as the development of quantum computers. Unlike conventional computers, quantum computers are based not on bits but on quantum bits (qubits), which can assume not just two but several states simultaneously.
    “The problem is that quantum bits are currently very unstable because they are extremely sensitive to external influences, such as electric or magnetic fields,” says physicist Gould. “Our discovery could help stabilise quantum bits so that they can be used in quantum computers in the future.”

    New privacy-preserving robotic cameras obscure images beyond human recognition

    From robotic vacuum cleaners and smart fridges to baby monitors and delivery drones, the smart devices being increasingly welcomed into our homes and workplaces use vision to take in their surroundings, taking videos and images of our lives in the process.
    In a bid to restore privacy, researchers at the Australian Centre for Robotics at the University of Sydney and the Centre for Robotics (QCR) at Queensland University of Technology have created a new approach to designing cameras that process and scramble visual information before it is digitised so that it becomes obscured to the point of anonymity.
    Known as sighted systems, devices like smart vacuum cleaners form part of the “internet-of-things” — smart systems that connect to the internet. They are at risk of being hacked by bad actors or lost through human error, and their images and videos risk being stolen by third parties, sometimes with malicious intent.
    Acting as a “fingerprint,” the distorted images can still be used by robots to complete their tasks but do not provide a comprehensive visual representation that compromises privacy.
    “Smart devices are changing the way we work and live our lives, but they shouldn’t compromise our privacy and become surveillance tools,” said Adam Taras, who completed the research as part of his Honours thesis.
    “When we think of ‘vision’ we think of it like a photograph, whereas many of these devices don’t require the same type of visual access to a scene as humans do. They have a very narrow scope in terms of what they need to measure to complete a task, using other visual signals, such as colour and pattern recognition,” he said.
    The researchers have been able to segment the processing that normally happens inside a computer within the optics and analogue electronics of the camera, which exists beyond the reach of attackers.
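    As a software-only analogue of the idea, the toy sketch below scrambles an image with a lossy random projection before any “digitised” frame exists, so the stored signal cannot be turned back into a recognisable scene. The projection scheme here is purely illustrative and is not the optical and analogue-electronic design the researchers built.

```python
# Toy privacy-by-scrambling sketch: project 1024 pixels down to 16
# measurements. The projection discards most of the information, so
# even the best linear reconstruction recovers almost nothing.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((32, 32))      # stand-in for a captured frame
pixels = image.ravel()

# Lossy "analogue" projection: far fewer measurements than pixels.
A = rng.standard_normal((16, pixels.size)) / np.sqrt(pixels.size)
scrambled = A @ pixels

# Least-squares "best guess" at the image from the scrambled signal.
recovered, *_ = np.linalg.lstsq(A, scrambled, rcond=None)
print(abs(np.corrcoef(pixels, recovered)[0, 1]))  # small: not recovered
```

A real task-specific camera would additionally preserve the handful of signals the robot actually needs, which is the design problem the researchers address in hardware.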

    “This is the key distinguishing point from prior work which obfuscated the images inside the camera’s computer — leaving the images open to attack,” said Dr Don Dansereau, Taras’ supervisor at the Australian Centre for Robotics. “We go one level beyond to the electronics themselves, enabling a greater level of protection.”
    The researchers tried to hack their approach but were unable to reconstruct the images in any recognisable format. They have opened this task to the research community at large, challenging others to hack their method.
    “If these images were to be accessed by a third party, they would not be able to make much of them, and privacy would be preserved,” said Taras.
    Dr Dansereau said privacy was increasingly becoming a concern as more devices today come with built-in cameras, and with the expected rise of new technologies in the near future, like parcel drones that travel into residential areas to make deliveries.
    “You wouldn’t want images taken inside your home by your robot vacuum cleaner leaked on the dark web, nor would you want a delivery drone to map out your backyard. It is too risky to allow services linked to the web to capture and hold onto this information,” said Dr Dansereau.
    The approach could also be used to make devices that work in places where privacy and security are a concern, such as warehouses, hospitals, factories, schools and airports.
    The researchers hope to next build physical camera prototypes to demonstrate the approach in practice.
    “Current robotic vision technology tends to ignore the legitimate privacy concerns of end-users. This is a short-sighted strategy that slows down or even prevents the adoption of robotics in many applications of societal and economic importance. Our new sensor design takes privacy very seriously, and I hope to see it taken up by industry and used in many applications,” said Professor Niko Suenderhauf, Deputy Director of the QCR, who advised on the project.
    Professor Peter Corke, Distinguished Professor Emeritus and Adjunct Professor at the QCR who also advised on the project said: “Cameras are the robot equivalent of a person’s eyes, invaluable for understanding the world, knowing what is what and where it is. What we don’t want is the pictures from those cameras to leave the robot’s body, to inadvertently reveal private or intimate details about people or things in the robot’s environment.”

    The largest 3-D map of the universe reveals hints of dark energy’s secrets

    A massive survey of the cosmos is revealing new details of one of the most mysterious facets of the universe, dark energy. Intriguingly, when combined with other observations, the data hint that dark energy, commonly thought to maintain a constant density over time, might evolve along with the cosmos.

    The result is “an adrenaline shot to the cosmology community,” says physicist Daniel Scolnic of Duke University, who was not involved with the new study.

    Dark energy, an invisible enigma that causes the universe’s expansion to speed up over time, is poorly understood, despite making up the bulk of the universe’s contents. To explore that puzzle, the Dark Energy Spectroscopic Instrument, DESI, has produced the largest 3-D map of the universe to date, researchers report April 4 in 10 papers posted on the DESI website, and in talks at a meeting of the American Physical Society held in Sacramento, Calif. By analyzing patterns in the distributions of galaxies and other objects on that map, scientists can determine the history of how the universe expanded over time.

    Large language models respond differently based on user’s motivation

    A new study recently published in the Journal of the American Medical Informatics Association (JAMIA) reveals how large language models (LLMs) respond to different motivational states. Evaluating three LLM-based generative conversational agents (GAs) — ChatGPT, Google Bard, and Llama 2 — PhD student Michelle Bak and Assistant Professor Jessie Chin of the School of Information Sciences at the University of Illinois Urbana-Champaign found that while GAs are able to identify users’ motivational states and provide relevant information when individuals have established goals, they are less likely to provide guidance when users are hesitant or ambivalent about changing their behavior.
    Bak provides the example of an individual with diabetes who is resistant to changing their sedentary lifestyle.
    “If they were advised by a doctor that exercising would be necessary to manage their diabetes, it would be important to provide information through GAs that helps them increase an awareness about healthy behaviors, become emotionally engaged with the changes, and realize how their unhealthy habits might affect people around them. This kind of information can help them take the next steps toward making positive changes,” said Bak.
    Current GAs lack specific information about these processes, which puts the individual at a health disadvantage. Conversely, for individuals who are committed to changing their physical activity levels (e.g., have joined personal fitness training to manage chronic depression), GAs are able to provide relevant information and support.
    “This major gap of LLMs in responding to certain states of motivation suggests future directions of LLMs research for health promotion,” said Chin.
    Bak’s research goal is to develop a digital health solution based on using natural language processing and psychological theories to promote preventive health behaviors. She earned her bachelor’s degree in sociology from the University of California, Los Angeles.
    Chin’s research aims to translate social and behavioral sciences theories to design technologies and interactive experiences to promote health communication and behavior across the lifespan. She leads the Adaptive Cognition and Interaction Design (ACTION) Lab at the University of Illinois. Chin holds a BS in psychology from National Taiwan University, an MS in human factors, and a PhD in educational psychology with a focus on cognitive science in teaching and learning from the University of Illinois.

    ‘Smart swarms’ of tiny robots inspired by natural herd mentality

    In natural ecosystems, the herd mentality plays a major role — from schools of fish to beehives to ant colonies. This collective behavior allows the whole to exceed the sum of its parts and better respond to threats and challenges.
    This behavior inspired researchers from The University of Texas at Austin, and for more than a year they’ve been working on creating “smart swarms” of microscopic robots. The researchers engineered social interactions among these tiny machines so that they can act as one coordinated group, performing tasks better than they would if they were moving as individuals or at random.
    “All these groups, flocks of birds, schools of fish and others, each member of the group has this natural inclination to work in concert with its neighbor, and together they are smarter, stronger and more efficient than they would be on their own,” said Yuebing Zheng, associate professor in the Walker Department of Mechanical Engineering and Texas Materials Institute. “We wanted to learn more about the mechanisms that make this happen and see if we can reproduce it.”
    Zheng and his team first showcased these innovations in a paper published in Advanced Materials last year. But they’ve taken things a step further in a new paper published recently in Science Advances.
    In the new paper, Zheng and his team have given these swarms a new trait called adaptive time delay. This concept allows each microrobot within the swarm to adapt its motion to changes in local surroundings. By doing this, the swarm showed a significant increase in responsivity without decreasing its robustness — the ability to quickly respond to any environment change while maintaining the integrity of the swarm.
    This finding builds on a novel optical feedback system — the ability to direct these microrobots in a collective way using controllable light patterns. This system was first unveiled in the researchers’ 2023 paper — recently chosen as an “editor’s choice” by Advanced Materials — and it facilitated the development of adaptive time delay for microrobots.
    The adaptive time delay strategy offers potential for scalability and integration into larger machinery. This approach could significantly enhance the operational efficiency of autonomous drone fleets. Similarly, it could enable convoys of trucks and cars to autonomously navigate extensive highway journeys in unison, with improved responsiveness and increased robustness. Just as schools of fish communicate with and follow each other, so will these machines. As a result, there’s no need for any kind of central control, which takes more data and energy to operate.
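    A loose illustration of time-delayed collective response can be simulated in a few lines. In the sketch below, agents steer toward where a moving light spot was several steps ago and shorten that delay when the spot moves quickly; all parameter values and update rules are invented for illustration and are not taken from the published model.

```python
# Minimal "adaptive time delay" swarm sketch: each agent chases a
# remembered (delayed) target position, and the delay shrinks when
# the target moves fast, making the swarm more responsive.
import numpy as np

rng = np.random.default_rng(1)
n_agents, steps = 20, 200
pos = rng.standard_normal((n_agents, 2))
history = []                      # remembered target positions

def target(t):
    """A light spot tracing a slow circle."""
    return np.array([np.cos(0.05 * t), np.sin(0.05 * t)])

delay = 10
for t in range(steps):
    history.append(target(t))
    speed = np.linalg.norm(history[-1] - history[-2]) if t else 0.0
    # Adaptive time delay: respond faster when the target moves faster.
    delay = max(1, int(10 / (1 + 50 * speed)))
    goal = history[max(0, len(history) - 1 - delay)]
    pos += 0.1 * (goal - pos) + 0.01 * rng.standard_normal(pos.shape)

# The swarm stays tightly clustered and tracks the moving target.
spread = np.linalg.norm(pos - pos.mean(axis=0), axis=1).mean()
dist = np.linalg.norm(pos.mean(axis=0) - target(steps - 1))
print(spread, dist)
```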
    “Nanorobots, on an individual basis, are vulnerable to complex environments; they struggle to navigate effectively in challenging conditions such as bloodstreams or polluted waters,” said Zhihan Chen, a Ph.D. student in Zheng’s lab and co-author on the new paper. “This collective motion can help them better navigate a complicated environment and reach the target efficiently and avoid obstacles or threats.”
    Having proven this swarm mentality in the lab setting, the next step is to introduce more obstacles. These experiments were conducted in a static liquid solution. Up next, they’ll try to repeat the behavior in flowing liquid. And then they’ll move to replicate it inside an organism.
    Once fully developed, these smart swarms could serve as advanced drug delivery forces, able to navigate the human body and elude its defenses to bring medicine to its target. Or, they could operate like iRobot robotic vacuums, but for contaminated water, collectively cleaning every bit of an area together.

    Computer scientists show the way: AI models need not be SO power hungry

    The development of AI models is an overlooked climate culprit. Computer scientists at the University of Copenhagen have created a recipe book for designing AI models that use much less energy without compromising performance. They argue that a model’s energy consumption and carbon footprint should be a fixed criterion when designing and training AI models.
    The fact that colossal amounts of energy are needed to run a Google search, talk to Siri, ask ChatGPT to get something done, or use AI in any sense has gradually become common knowledge. One study estimates that by 2027, AI servers will consume as much energy as Argentina or Sweden. Indeed, a single ChatGPT prompt is estimated to consume, on average, as much energy as forty mobile phone charges. But the research community and the industry have yet to make the development of AI models that are energy efficient and thus more climate friendly the focus, computer science researchers at the University of Copenhagen point out.
    “Today, developers are narrowly focused on building AI models that are effective in terms of the accuracy of their results. It’s like saying that a car is effective because it gets you to your destination quickly, without considering the amount of fuel it uses. As a result, AI models are often inefficient in terms of energy consumption,” says Assistant Professor Raghavendra Selvan from the Department of Computer Science, whose research looks into possibilities for reducing AI’s carbon footprint.
    But the new study, of which he and computer science student Pedram Bakhtiarifard are two of the authors, demonstrates that it is easy to curb a great deal of CO2e without compromising the precision of an AI model. Doing so demands keeping climate costs in mind from the design and training phases of AI models.
    “If you put together a model that is energy efficient from the get-go, you reduce the carbon footprint in each phase of the model’s ‘life cycle’. This applies both to the model’s training, which is a particularly energy-intensive process that often takes weeks or months, as well as to its application,” says Selvan.
    Recipe book for the AI industry
    In their study, the researchers calculated how much energy it takes to train more than 400,000 AI models of the convolutional neural network type — without actually training them all. Among other things, convolutional neural networks are used to analyse medical imagery, for language translation and for object and face recognition — a function you might know from the camera app on your smartphone.

    Based on the calculations, the researchers present a benchmark collection of AI models that use less energy to solve a given task, but which perform at approximately the same level. The study shows that by opting for other types of models or by adjusting models, 70-80% energy savings can be achieved during the training and deployment phase, with only a 1% or less decrease in performance. And according to the researchers, this is a conservative estimate.
    “Consider our results as a recipe book for the AI professionals. The recipes don’t just describe the performance of different algorithms, but how energy efficient they are. And that by swapping one ingredient with another in the design of a model, one can often achieve the same result. So now, the practitioners can choose the model they want based on both performance and energy consumption, and without needing to train each model first,” says Pedram Bakhtiarifard, who continues:
    “Oftentimes, many models are trained before finding the one that is suspected of being the most suitable for solving a particular task. This makes the development of AI extremely energy-intensive. Therefore, it would be more climate-friendly to choose the right model from the outset, while choosing one that does not consume too much power during the training phase.”
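    One way to act on such a benchmark is to keep only the Pareto-optimal models, i.e. those for which no alternative is both more accurate and cheaper to train. The sketch below implements that selection; the model names and numbers are invented for illustration, while the real accuracy/energy pairs come from the published dataset.

```python
# Pareto selection over (accuracy, training-energy) pairs: a model is
# kept if no other model matches or beats it on both criteria.
def pareto_front(models):
    """models: list of (name, accuracy, energy_kwh).
    Higher accuracy and lower energy are better."""
    front = []
    for name, acc, kwh in models:
        dominated = any(a >= acc and e <= kwh and (a > acc or e < kwh)
                        for _, a, e in models)
        if not dominated:
            front.append(name)
    return front

candidates = [                     # hypothetical entries
    ("big-cnn",   0.91, 120.0),
    ("mid-cnn",   0.90,  35.0),
    ("small-cnn", 0.88,  12.0),
    ("tiny-cnn",  0.80,  13.0),    # dominated by small-cnn
]
print(pareto_front(candidates))    # ['big-cnn', 'mid-cnn', 'small-cnn']
```

A practitioner would then pick from the front according to how much accuracy is worth per kilowatt-hour in their application.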
    The researchers stress that in some fields, like self-driving cars or certain areas of medicine, model precision can be critical for safety. Here, it is important not to compromise on performance. However, this shouldn’t be a deterrent to striving for high energy efficiency in other domains.
    “AI has amazing potential. But if we are to ensure sustainable and responsible AI development, we need a more holistic approach that not only has model performance in mind, but also climate impact. Here, we show that it is possible to find a better trade-off. When AI models are developed for different tasks, energy efficiency ought to be a fixed criterion — just as it is standard in many other industries,” concludes Raghavendra Selvan.
    The “recipe book” put together in this work is available as an open-source dataset for other researchers to experiment with. The information about all these 423,000 architectures is published on GitHub, which AI practitioners can access using simple Python scripts.
    The UCPH researchers estimated how much energy it takes to train 429,000 of the convolutional neural network models in this dataset. Among other things, these are used for object detection, language translation and medical image analysis.
    It is estimated that the training alone of the 429,000 neural networks the study looked at would require 263,000 kWh. This equals the amount of energy that an average Danish citizen consumes over 46 years, and it would take one computer about 100 years to do the training. The authors did not actually train these models themselves but estimated their energy use with another AI model, thereby saving 99% of the energy it would otherwise have taken.
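    The per-citizen figure quoted above is easy to sanity-check: 263,000 kWh spread over 46 years implies an average annual consumption of roughly 5,700 kWh.

```python
# Quick arithmetic check of the quoted energy figures.
total_kwh = 263_000   # estimated training energy for all 429,000 models
years = 46            # quoted equivalent in average Danish consumption
per_year = total_kwh / years
print(round(per_year))  # 5717 kWh per person per year
```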
    Training AI models consumes a lot of energy, and thereby emits a lot of CO2e. This is due to the intensive computations performed while training a model, typically run on powerful computers. This is especially true for large models, like the language model behind ChatGPT. AI tasks are often processed in data centers, which demand significant amounts of power to keep computers running and cool. The energy source for these centers, which may rely on fossil fuels, influences their carbon footprint.

    Drawing inspiration from plants: A metal-air paper battery for wearable devices

    Drawing inspiration from the way plants breathe, a group of researchers at Tohoku University has created a paper-based magnesium-air battery that can be used in GPS sensors or pulse oximeter sensors. Taking advantage of paper’s recyclability and lightweight nature, the engineered battery holds promise for a more environmentally friendly source of energy.
    For over two millennia, paper has been a staple of human civilization. But these days, the usage of paper is not limited to writing. It is also playing a pivotal role in ushering in a greener future.
    Lightweight and thin paper-based devices help reduce dependence on metal or plastic materials, whilst at the same time being easier to dispose of. From paper-based diagnostic devices that deliver economical and rapid detection of infectious diseases to batteries and energy devices that offer an environmentally friendly alternative for power generation, scientists are finding ingenious ways to put this versatile material to use.
    Now, a team of researchers at Tohoku University has reported on a high-performance magnesium-air (Mg-air) battery that is paper-based and activated by water.
    “We drew inspiration for this device from the respiration mechanism of plants,” points out Hiroshi Yabu, corresponding author of the study. “Photosynthesis is analogous to the charge and discharge process in batteries. Just as plants harness solar energy to synthesize sugar from water in the ground and carbon dioxide from the air, our battery utilizes magnesium as a substrate to generate power from oxygen and water.”
    To fabricate the battery, Yabu and his colleagues bonded magnesium foil onto paper and added the cathode catalyst and gas diffusion layer directly to the other side of the paper. The paper battery achieved an open-circuit voltage of 1.8 V, a current density of 100 mA/cm² at 1.0 V, and a maximum output of 103 mW/cm².
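    The reported figures are self-consistent, since power density is simply the product of current density and voltage:

```python
# P = I * V: 100 mA/cm² at 1.0 V gives 100 mW/cm², in line with the
# reported maximum output of about 103 mW/cm².
current_ma_per_cm2 = 100
voltage_v = 1.0
power_mw_per_cm2 = current_ma_per_cm2 * voltage_v
print(power_mw_per_cm2)  # 100.0
```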
    “Not only did the battery demonstrate impressive performance results, it operates without using toxic materials — instead using carbon cathodes and a pigment electrocatalyst that have passed stringent assessments,” adds Yabu.
    The researchers put the battery to the test in a pulse oximeter sensor and a GPS sensor, illustrating its versatility for wearable devices.