More stories

  • Using AI to create better, more potent medicines

    While it can take years for the pharmaceutical industry to create medicines capable of treating or curing human disease, a new study suggests that using generative artificial intelligence could vastly accelerate the drug-development process.
    Today, most drug discovery is carried out by human chemists who rely on their knowledge and experience to select and synthesize the molecules that can become the safe and effective medicines we depend on. To identify synthesis paths, scientists often employ a technique called retrosynthesis — a method for creating potential drugs by working backward from the desired molecules and searching for chemical reactions that could make them.
    Yet because sifting through millions of potential chemical reactions can be an extremely challenging and time-consuming endeavor, researchers at The Ohio State University have created an AI framework called G2Retro to automatically generate reactions for any given molecule. The new study showed that compared to current manual-planning methods, the framework was able to cover an enormous range of possible chemical reactions as well as accurately and quickly discern which reactions might work best to create a given drug molecule.
    “Using AI for things critical to saving human lives, such as medicine, is what we really want to focus on,” said Xia Ning, lead author of the study and an associate professor of computer science and engineering at Ohio State. “Our aim was to use AI to accelerate the drug design process, and we found that it not only saves researchers time and money but provides drug candidates that may have much better properties than any molecules that exist in nature.”
    This study builds on previous research of Ning’s where her team developed a method named Modof that was able to generate molecule structures that exhibited desired properties better than any existing molecules. “Now the question becomes how to make such generated molecules, and that is where this new study shines,” said Ning, also an associate professor of biomedical informatics in the College of Medicine.
    The study was published today in the journal Communications Chemistry.
    Ning’s team trained G2Retro on a dataset that contains 40,000 chemical reactions collected between 1976 and 2016. The framework “learns” from graph-based representations of given molecules, and uses deep neural networks to generate possible reactant structures that could be used to synthesize them. Its generative power is so impressive that, according to Ning, once given a molecule, G2Retro could come up with hundreds of new reaction predictions in only a few minutes.
    “Our generative AI method G2Retro is able to supply multiple different synthesis routes and options, as well as a way to rank different options for each molecule,” said Ning. “This is not going to replace current lab-based experiments, but it will offer more and better drug options so experiments can be prioritized and focused much faster.”
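    As a rough illustration of the ranking step Ning describes, the sketch below scores and sorts candidate synthesis routes. The reaction names, plausibility numbers, and `score_route` function are invented stand-ins for what a trained model like G2Retro would output, not its actual method:

```python
# Toy ranking of candidate retrosynthesis routes (illustrative only).
# Each route is a list of reaction steps with a model-assigned plausibility.

def score_route(route):
    """Toy scoring: average plausibility of the route's reaction steps."""
    return sum(step["plausibility"] for step in route) / len(route)

candidate_routes = [
    [{"reaction": "amide coupling", "plausibility": 0.91},
     {"reaction": "nitro reduction", "plausibility": 0.84}],
    [{"reaction": "Suzuki coupling", "plausibility": 0.77}],
    [{"reaction": "esterification", "plausibility": 0.60},
     {"reaction": "Grignard addition", "plausibility": 0.55}],
]

# Highest-scoring route first, so lab experiments can be prioritized.
ranked = sorted(candidate_routes, key=score_route, reverse=True)
```

    A real system would generate the candidate reactant structures itself; here the point is only that a scored list lets experimentalists focus on the most promising options first.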
    To further test the AI’s effectiveness, Ning’s team conducted a case study to see if G2Retro could accurately predict four newly released drugs already in circulation: Mitapivat, a medication used to treat hemolytic anemia; Tapinarof, which is used to treat various skin diseases; Mavacamten, a drug to treat systemic heart failure; and Oteseconazole, used to treat fungal infections in females. G2Retro was able to correctly generate exactly the same patented synthesis routes for these medicines, and provided alternative synthesis routes that are also feasible and synthetically useful, Ning said.
    Having such a dynamic and effective tool at scientists’ disposal could enable the industry to develop stronger drugs at a quicker pace. But despite the edge AI might give scientists in the lab, Ning emphasizes that any medicine G2Retro or other generative AI creates still needs to be validated, a process in which the generated molecules are tested in animal models and later in human trials.
    “We are very excited about generative AI for medicine, and we are dedicated to using AI responsibly to improve human health,” said Ning.
    This research was supported by Ohio State’s President’s Research Excellence Program and the National Science Foundation. Other Ohio State co-authors were Ziqi Chen, Oluwatosin Ayinde, James Fuchs and Huan Sun.

  • Fastest industry-standard optical fibre

    An optical fibre about the thickness of a human hair can now carry the equivalent of more than 10 million fast home internet connections running at full capacity.
    A team of Japanese, Australian, Dutch, and Italian researchers has set a new speed record for an industry-standard optical fibre, achieving 1.7 petabits per second over a 67 km length of fibre. The fibre, which contains 19 cores that can each carry a signal, meets the global standards for fibre size, ensuring that it can be adopted without massive infrastructure change. And it uses less digital processing, greatly reducing the power required per bit transmitted.
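    A quick back-of-the-envelope check of these figures, using only the numbers quoted above (the per-core and per-home rates are our own arithmetic, not claims from the researchers):

```python
# Sanity-check the headline numbers: 1.7 petabits per second shared across
# 19 cores, or across 10 million home connections running at full capacity.
total_bps = 1.7e15        # 1.7 Pbit/s
cores = 19
homes = 10_000_000

per_core_bps = total_bps / cores   # ~8.9e13, i.e. roughly 89 Tbit/s per core
per_home_bps = total_bps / homes   # ~1.7e8, i.e. roughly 170 Mbit/s per home
```

    The implied ~170 Mbit/s per connection is consistent with the article’s notion of a “fast home internet connection.”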
    Macquarie University researchers supported the invention by developing a 3D laser-printed glass chip that allows low loss access to the 19 streams of light carried by the fibre and ensures compatibility with existing transmission equipment.
    The fibre was developed by the Japanese National Institute of Information and Communications Technology (NICT, Japan) and Sumitomo Electric Industries, Ltd. (SEI, Japan) and the work was performed in collaboration with the Eindhoven University of Technology, University of L’Aquila, and Macquarie University.
    All the world’s internet traffic is carried through optical fibres which are each 125 microns thick (comparable to the thickness of a human hair). These industry standard fibres link continents, data centres, mobile phone towers, satellite ground stations and our homes and businesses.
    Back in 1988, the first subsea fibre-optic cable across the Atlantic had a capacity of 20 megabits per second, or 40,000 telephone calls, in two pairs of fibres. Known as TAT 8, it came just in time to support the development of the World Wide Web. But it was soon at capacity.

    The latest generation of subsea cables such as the Grace Hopper cable, which went into service in 2022, carries 22 terabits per second in each of 16 fibre pairs. That’s a million times more capacity than TAT 8, but it’s still not enough to meet the demand for streaming TV, video conferencing and all our other global communication.
    “Decades of optics research around the world has allowed the industry to push more and more data through single fibres,” says Dr Simon Gross from Macquarie University’s School of Engineering. “They’ve used different colours, different polarisations, light coherence and many other tricks to manipulate light.”
    Most current fibres have a single core that carries multiple light signals. But this technology is practically limited to a few terabits per second due to interference between the signals.
    “We could increase capacity by using thicker fibres. But thicker fibres would be less flexible, more fragile, less suitable for long-haul cables, and would require massive reengineering of optical fibre infrastructure,” says Dr Gross.
    “We could just add more fibres. But each fibre adds equipment overhead and cost and we’d need a lot more fibres.”
    To meet the exponentially growing demand for movement of data, telecommunication companies need technologies that offer greater data flow for reduced cost.

    “Here at Macquarie University, we’ve created a compact glass chip with a wave guide pattern etched into it by a 3D laser printing technology. It allows feeding of signals into the 19 individual cores of the fibre simultaneously with uniform low losses. Other approaches are lossy and limited in the number of cores,” says Dr Gross.
    “It’s been exciting to work with the Japanese leaders in optical fibre technology. I hope we’ll see this technology in subsea cables within five to 10 years.”
    Another researcher involved in the experiment, Professor Michael Withford from Macquarie University’s School of Mathematical and Physical Sciences, believes this breakthrough in optical fibre technology has far-reaching implications.
    “The optical chip builds on decades of research into optics at Macquarie University,” says Professor Withford. “The underlying patented technology has many applications including finding planets orbiting distant stars, disease detection, even identifying damage in sewage pipes.”

  • Symmetry breaking by ultrashort light pulses opens new quantum pathways for coherent phonons

    Atoms in a crystal form a regular lattice, in which they can move over small distances from their equilibrium positions. Such phonon excitations are represented by quantum states. A superposition of phonon states defines a so-called phonon wavepacket, which is connected with collective coherent oscillations of the atoms in the crystal. Coherent phonons can be generated by exciting the crystal with a femtosecond light pulse, and their motions in space and time can be followed by scattering an ultrashort x-ray pulse off the excited material. The pattern of scattered x-rays gives direct insight into the momentary positions of, and distances between, the atoms. A sequence of such patterns provides a ‘movie’ of the atomic motions.
    The physical properties of coherent phonons are determined by the symmetry of the crystal, which represents a periodic arrangement of identical unit cells. Weak optical excitation does not change the symmetry properties of the crystal. In this case, coherent phonons with identical atomic motions in all unit cells are excited. In contrast, strong optical excitation can break the symmetry of the crystal and make atoms in adjacent unit cells oscillate differently. While this mechanism holds potential for accessing other phonons, it has not been explored so far.
    In the journal Physical Review B, researchers from the Max-Born-Institute in Berlin, in collaboration with researchers from the University of Duisburg-Essen, have demonstrated a novel concept for exciting and probing coherent phonons in crystals with transiently broken symmetry. The key to this concept lies in reducing the symmetry of a crystal by appropriate optical excitation, as has been shown with the prototypical crystalline semimetal bismuth (Bi).
    Ultrafast mid-infrared excitation of electrons in Bi modifies the spatial charge distribution and, thus, transiently reduces the crystal symmetry. In the reduced symmetry, new quantum pathways for the excitation of coherent phonons open up. The symmetry reduction doubles the size of the unit cell, from two Bi atoms to four. In addition to unidirectional atomic motion, the four-atom unit cell allows for coherent phonon wave packets with bidirectional atomic motions.
    Probing the transient crystal structure directly by femtosecond x-ray diffraction reveals oscillations of diffracted intensity, which persist on a picosecond time scale. The oscillations arise from coherent wave packet motions along phonon coordinates in the crystal of reduced symmetry. Their frequency of 2.6 THz is different from that of phonon oscillations at low excitation level. Interestingly, this behavior occurs only above a threshold of the optical pump fluence and reflects the highly nonlinear, so-called non-perturbative character of the optical excitation process.
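    To make the numbers concrete, the toy model below generates a damped oscillation in diffracted intensity at the observed 2.6 THz frequency. The 10% modulation depth, 1 ps decay time, and 20 fs sampling step are assumptions for illustration, not values from the paper:

```python
import math

f = 2.6e12        # observed phonon frequency, Hz
tau = 1.0e-12     # assumed decay time: oscillations persist ~1 ps
dt = 20e-15       # assumed 20 fs sampling step for the x-ray probe
times = [i * dt for i in range(200)]  # 0 to 4 ps

# Diffracted intensity: a baseline plus a decaying cosine modulation,
# mimicking the oscillations seen in femtosecond x-ray diffraction.
intensity = [1.0 + 0.1 * math.exp(-t / tau) * math.cos(2 * math.pi * f * t)
             for t in times]

period_fs = 1e15 / f  # one oscillation period, ~385 fs
```

    A 2.6 THz mode completes a cycle roughly every 385 femtoseconds, which is why femtosecond x-ray pulses are needed to resolve the motion.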
    In summary, optically induced symmetry breaking allows for modifying the excitation spectrum of a crystal on ultrashort time scales. These results may pave the way for steering material properties transiently and, thus, implementing new functions in optoacoustics and optical switching.

  • Self-driving cars lack social intelligence in traffic

    Should I go or give way? It is one of the most basic questions in traffic, whether merging in on a motorway or at the door of the metro. The decision is one that humans typically make quickly and intuitively, because doing so relies on social interactions trained from the time we begin to walk.
    Self-driving cars on the other hand, which are already on the road in several parts of the world, still struggle when navigating these social interactions in traffic. This has been demonstrated in new research conducted at the University of Copenhagen’s Department of Computer Science. Researchers analyzed an array of videos uploaded by YouTube users of self-driving cars in various traffic situations. The results show that self-driving cars have a particularly tough time understanding when to ‘yield’ — when to give way and when to drive on.
    “The ability to navigate in traffic is based on much more than traffic rules. Social interactions, including body language, play a major role when we signal each other in traffic. This is where the programming of self-driving cars still falls short. That is why it is difficult for them to consistently understand when to stop and when someone is stopping for them, which can be both annoying and dangerous,” says Professor Barry Brown, who has studied the evolution of self-driving car road behavior for the past five years.
    Sorry, it’s a self-driving car!
    Companies like Waymo and Cruise have launched taxi services with self-driving cars in parts of the United States. Tesla has rolled out its FSD model (full self-driving) to about 100,000 volunteer drivers in the US and Canada. And the media is brimming with stories about how well self-driving cars perform. But according to Professor Brown and his team, their actual road performance is a well-kept trade secret that very few have insight into. Therefore, the researchers performed in-depth analyses using 18 hours of YouTube footage filmed by enthusiasts testing cars from the back seat.
    One of their video examples shows a family of four standing by the curb of a residential street in the United States. There is no pedestrian crossing, but the family would like to cross the road. As the driverless car approaches, it slows, causing the two adults in the family to wave their hands as a sign for the car to drive on. Instead, the car stops right next to them for 11 seconds. Then, as the family begins walking across the road, the car starts moving again, causing them to jump back onto the sidewalk, whereupon the person in the back seat rolls down the window and yells, “Sorry, self-driving car!”
    “The situation is similar to the main problem we found in our analysis and demonstrates the inability of self-driving cars to understand social interactions in traffic. The driverless vehicle stops so as to not hit pedestrians, but ends up driving into them anyway because it doesn’t understand the signals. Besides creating confusion and wasted time in traffic, it can also be downright dangerous,” says Professor Brown.

    A drive in foggy Frisco
    In tech-centric San Francisco, the performance of self-driving cars can be judged up close. Here, driverless cars have been unleashed in several parts of the city as buses and taxis, navigating the hilly streets among people and other natural phenomena. And according to the researcher, this has created plenty of resistance among the city’s residents:
    “Self-driving cars are causing traffic jams and problems in San Francisco because they react inappropriately to other road users. Recently, the city’s media wrote of a chaotic traffic event caused by self-driving cars due to fog. Fog caused the self-driving cars to overreact, stop and block traffic, even though fog is extremely common in the city,” says Professor Brown.
    Robotic cars have been in the works for 10 years, and the industry behind them has spent over DKK 40 billion to push their development. Yet the outcome has been cars that still make many mistakes, blocking other drivers and disrupting the smooth flow of traffic.
    Why do you think it’s so difficult to program self-driving cars to understand social interactions in traffic?
    “I think that part of the answer is that we take the social element for granted. We don’t think about it when we get into a car and drive — we just do it automatically. But when it comes to designing systems, you need to describe everything we take for granted and incorporate it into the design. The car industry could learn from having a more sociological approach. Understanding social interactions that are part of traffic should be used to design self-driving cars’ interactions with other road users, similar to how research has helped improve the usability of mobile phones and technology more broadly.”
    About the study: The researchers analyzed 18 hours of video footage of self-driving cars from 70 different YouTube videos. Using different video analysis techniques, the researchers studied the video sequences in depth rather than making a broader, superficial analysis. The study, “The Halting Problem: Video analysis of self-driving cars in traffic,” was presented at the 2023 CHI Conference on Human Factors in Computing Systems, where it won the conference’s best paper award. The study was conducted by Barry Brown of the University of Copenhagen and Stockholm University, Mathias Broth of Linköping University, and Erik Vinkhuyzen of King’s College London.

  • New tool may help spot ‘invisible’ brain damage in college athletes

    An artificial intelligence computer program that processes magnetic resonance imaging (MRI) can accurately identify changes in brain structure that result from repeated head injury, a new study in student athletes shows. These variations have not been captured by other traditional medical images such as computerized tomography (CT) scans. The new technology, researchers say, may help design new diagnostic tools to better understand subtle brain injuries that accumulate over time.
    Experts have long known about the potential risks of concussion among young athletes, particularly for those who play high-contact sports such as football, hockey, and soccer. Evidence is now mounting that repeated head impacts, even if they at first appear mild, may add up over many years and lead to cognitive loss. While advanced MRI identifies microscopic changes in brain structure that result from head trauma, researchers say the scans produce vast amounts of data that are difficult to navigate.
    Led by researchers in the Department of Radiology at NYU Grossman School of Medicine, the new study showed for the first time that the new tool, using an AI technique called machine learning, could accurately distinguish between the brains of male athletes who played contact sports like football versus noncontact sports like track and field. The results linked repeated head impacts with tiny, structural changes in the brains of contact-sport athletes who had not been diagnosed with a concussion.
    “Our findings uncover meaningful differences between the brains of athletes who play contact sports compared to those who compete in noncontact sports,” said study senior author and neuroradiologist Yvonne Lui, MD. “Since we expect these groups to have similar brain structure, these results suggest that there may be a risk in choosing one sport over another,” adds Lui, a professor and vice chair for research in the Department of Radiology at NYU Langone Health.
    Lui adds that beyond spotting potential damage, the machine-learning technique used in their investigation may also help experts to better understand the underlying mechanisms behind brain injury.
    The new study, which was published online May 22 in The Neuroradiology Journal, involved hundreds of brain images from 36 contact-sport college athletes (mostly football players) and 45 noncontact-sport college athletes (mostly runners and baseball players). The work was meant to clearly link changes detected by the AI tool in the brain scans of football players to head impacts. It builds on a previous study that had identified brain-structure differences in football players, comparing those with and without concussions to athletes who competed in noncontact sports.

    For the investigation, the researchers analyzed MRI scans from 81 male athletes taken between 2016 and 2018, none of whom had a known diagnosis of concussion within that time period. Contact-sport athletes played football, lacrosse, and soccer, while noncontact-sport athletes participated in baseball, basketball, track and field, and cross-country.
    As part of their analysis, the research team designed statistical techniques that gave their computer program the ability to “learn” how to predict exposure to repeated head impacts using mathematical models. These were based on data examples fed into them, with the program getting “smarter” as the amount of training data grew.
    The study team trained the program to identify unusual features in brain tissue and distinguish between athletes with and without repeated exposure to head injuries based on these factors. They also ranked how useful each feature was for detecting damage to help uncover which of the many MRI metrics might contribute most to diagnoses.
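    The feature-ranking idea can be caricatured in a few lines. The sketch below ranks made-up MRI-derived features by a simple separation score between two groups; the feature names, numbers, and scoring rule are invented for illustration and are not the authors’ pipeline:

```python
# Toy feature ranking: which MRI metric best separates contact-sport
# athletes (first list) from noncontact-sport athletes (second list)?
# All values are fabricated for illustration.

def separation(a, b):
    """Absolute difference of group means, scaled by the overall spread."""
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    spread = (max(a + b) - min(a + b)) or 1.0
    return abs(mean_a - mean_b) / spread

features = {
    "mean_diffusivity": ([0.82, 0.85, 0.88], [0.70, 0.72, 0.74]),
    "mean_kurtosis":    ([0.95, 0.97, 0.99], [0.90, 0.91, 0.92]),
    "volume":           ([1.01, 0.99, 1.00], [1.00, 1.02, 0.98]),
}

# Most discriminative feature first.
ranking = sorted(features, key=lambda k: separation(*features[k]), reverse=True)
```

    A real classifier would learn such weightings from training data rather than use a fixed formula, but the output is the same in spirit: an ordered list of which metrics contribute most.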
    Two metrics most accurately flagged structural changes that resulted from head injury, say the authors. The first, mean diffusivity, measures how easily water can move through brain tissue and is often used to spot strokes on MRI scans. The second, mean kurtosis, examines the complexity of brain-tissue structure and can indicate changes in the parts of the brain involved in learning, memory, and emotions.
    “Our results highlight the power of artificial intelligence to help us see things that we could not see before, particularly ‘invisible injuries’ that do not show up on conventional MRI scans,” said study lead author Junbo Chen, MS, a doctoral candidate at NYU Tandon School of Engineering. “This method may provide an important diagnostic tool not only for concussion, but also for detecting the damage that stems from subtler and more frequent head impacts.”
    Chen adds that the study team next plans to explore the use of their machine-learning technique for examining head injury in female athletes.
    Funding for the study was provided by National Institutes of Health grants P41EB017183 and C63000NYUPG118117. Further funding was provided by Department of Defense grant W81XWH2010699.
    In addition to Lui and Chen, other NYU researchers involved in the study were Sohae Chung, PhD; Tianhao Li, MS; Els Fieremans, PhD; Dmitry Novikov, PhD; and Yao Wang, PhD.

  • Source-shifting metastructures composed of only one resin for location camouflaging

    The field of transformation optics has flourished over the past decade, allowing scientists to design metamaterial-based structures that shape and guide the flow of light. One of the most dazzling inventions potentially unlocked by transformation optics is the invisibility cloak — a theoretical fabric that bends incoming light away from the wearer, rendering them invisible. Interestingly, such illusions are not restricted to the manipulations of light alone.
    Many of the techniques used in transformation optics have been applied to sound waves, giving rise to the parallel field of transformation acoustics. In fact, researchers have already made substantial progress by developing the “acoustic cloak,” the analog of the invisibility cloak for sounds. While research on acoustic illusion has focused on the concept of masking the presence of an object, not much progress has been made on the problem of location camouflaging.
    The concept of an acoustic source-shifter utilizes a structure that makes the location of the sound source appear different from its actual location. Such devices capable of “acoustic location camouflaging” could find applications in advanced holography and virtual reality. Unfortunately, the nature of location camouflaging has been scarcely studied, and the development of accessible materials and surfaces that would provide a decent performance has proven challenging.
    Against this backdrop, Professor Garuda Fujii, affiliated with the Institute of Engineering and Energy Landscape Architectonics Brain Bank (ELab2) at Shinshu University, Japan, has now made progress in developing high-performance source-shifters. In a recent study published in the Journal of Sound and Vibration online on May 5, 2023, Prof. Fujii presented an innovative approach to designing source-shifter structures out of acrylonitrile butadiene styrene (ABS), an elastic polymer commonly used in 3D printing.
    Prof. Fujii’s approach is centered around a core concept: inverse design based on topology optimization. The numerical approach builds on reproducing the pressure fields (sound) emitted by a virtual source, i.e., the source that nearby listeners would mistakenly perceive as real. Next, the pressure fields emitted by the actual source are manipulated to camouflage its location and make the sound appear to come from a different point in space. This can be achieved with the optimal design of a metastructure that, by virtue of its geometry and elastic properties, minimizes the difference between the pressure fields emitted from the actual and virtual sources.
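    The inverse-design idea can be caricatured in one dimension: tune a structural parameter so that the field of the real source, sampled at a few listener positions, matches the field of the virtual source. Everything here (the simplified field model, the single phase parameter, and the brute-force search standing in for topology optimization) is an invented simplification, not Prof. Fujii’s solver:

```python
import math

listeners = [1.0, 1.5, 2.0]   # positions where the pressure field is sampled
k = 2 * math.pi               # wavenumber for a wavelength of 1

def field(src, x, phase_shift=0.0):
    """Simplified monopole-like field amplitude at listener position x."""
    r = abs(x - src)
    return math.cos(k * r + phase_shift) / r

# Target: the field a virtual source at x = 0.3 would produce.
target = [field(0.3, x) for x in listeners]

def mismatch(phase):
    """Squared error between the real source's field (at x = 0) and the target."""
    return sum((field(0.0, x, phase) - t) ** 2
               for x, t in zip(listeners, target))

# Brute-force search over the phase parameter stands in for the
# iterative topology optimization of the real design problem.
best = min((mismatch(p * 0.01), p * 0.01) for p in range(629))
```

    The optimizer’s job in the paper is the same in spirit but vastly harder: the “parameter” is the full geometry of an ABS structure, and the physics includes acoustic-elastic coupling between air and the solid.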
    Utilizing this approach, Prof. Fujii implemented an iterative algorithm to numerically determine the optimal design of ABS resin source-shifters according to various design criteria. His models and simulations had to account for the acoustic-elastic interactions between fluids (air) and solid elastic structures, as well as the actual limitations of modern manufacturing technology.
    The simulation results revealed that the optimized structures could reduce the difference between the emitted pressure fields of the masked source and those of a bare source at the virtual location to as low as 0.6%. “The optimal structure configurations obtained via topology optimization exhibited good performances at camouflaging the actual source location despite the simple composition of ABS that did not comprise complex acoustic metamaterials”, remarks Prof. Fujii.
    To shed more light on the underlying camouflaging mechanisms, Prof. Fujii analyzed the importance of the distance between the virtual and actual sources. He found that a greater distance did not necessarily degrade the source-shifter’s performance. He also investigated the effect of changing the frequency of the emitted sound on the performance as the source-shifters had been optimized for only one target frequency. Finally, he explored whether a source-shifter could be topologically optimized to operate at multiple sound frequencies.
    While his approach requires further fine-tuning, the findings of this study will surely help advance illusion acoustics. He concludes, “The proposed optimization method for designing high-performance source-shifters will help in the development of acoustic location camouflage and the advancement of holography technology.”

  • Quantum computers braided ‘anyons,’ long-sought quasiparticles with memory

    Anyons, anyone?

    Scientists have created strange new particle-like objects called non-abelian anyons. These long-sought quasiparticles can be “braided,” meaning that they can be moved around one another and retain a memory of that swapping, similar to how a braided ponytail keeps a record of the order in which strands cross over each other.
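    The “memory” comes from the fact that non-abelian braid operations do not commute: applying them in a different order leaves the system in a different state. A toy matrix example makes the point; these 2×2 matrices are chosen only for illustration and are not the braid representations used by either team:

```python
# Non-abelian means order matters: two operations B1 and B2 whose
# products B1·B2 and B2·B1 differ, unlike simple (abelian) phase factors.

def matmul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B1 = [[0, 1], [1, 0]]    # a swap-like operation
B2 = [[1, 0], [0, -1]]   # a phase-flip-like operation

ab = matmul(B1, B2)
ba = matmul(B2, B1)
# ab != ba: braiding in one order vs. the other leaves different states,
# the retained "memory" that non-abelian anyons exhibit.
```

    For abelian anyons the analogous operations would be plain numbers, which always commute; the matrix-valued, order-sensitive behavior is exactly what makes non-abelian anyons attractive for encoding quantum information.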

    Two independent teams — one led by researchers at Google, the other by researchers at the quantum computing company Quantinuum — have reported creating and braiding versions of these anyons using quantum computers. The Google and Quantinuum results, respectively reported May 11 in Nature and May 9 at arXiv.org, could help scientists construct quantum computers that are resistant to the errors that currently bedevil the machines.

var(–zephr-color-background-tinted)}.zephr-registration-form-input-select.unselected.svelte-19a73pq{color:var(–zephr-color-background-tinted)}.zephr-registration-form-input-select.error.svelte-19a73pq{border-color:var(–zephr-color-warning-main)}.zephr-registration-form-input-textarea.svelte-19a73pq{background-color:#fff;border:1px solid #ddd;color:#222;font-size:14px;font-weight:300;padding:16px;width:100%}.zephr-registration-form-input-slider-output.svelte-19a73pq{margin:13px 0 0 10px}.spin.svelte-1cj2gr0{animation:svelte-1cj2gr0-spin 2s 0s infinite linear}.pulse.svelte-1cj2gr0{animation:svelte-1cj2gr0-spin 1s infinite steps(8)}@keyframes svelte-1cj2gr0-spin{0%{transform:rotate(0deg)}100%{transform:rotate(360deg)}}.zephr-registration-form-input-radio.svelte-1qn5n0t{position:absolute;opacity:0;cursor:pointer;height:0;width:0}.zephr-registration-form-radio-label.svelte-1qn5n0t{display:flex;align-items:center;font-family:var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont)}.zephr-registration-form-radio-dot.svelte-1qn5n0t{position:relative;box-sizing:border-box;height:23px;width:23px;background-color:#fff;border:1px solid #ebebeb;border-radius:50%;margin-right:12px}.checked.svelte-1qn5n0t{border-color:#009fe3}.checked.svelte-1qn5n0t:after{content:””;position:absolute;width:17px;height:17px;background:#009fe3;background:linear-gradient(#009fe3, #006cb5);border-radius:50%;top:2px;left:2px}.disabled.checked.svelte-1qn5n0t:after{background:var(–zephr-color-background-tinted)}.error.svelte-1qn5n0t{border:1px solid var(–zephr-color-warning-main)}.zephr-registration-form-checkbox.svelte-1gzpw2y{position:absolute;opacity:0;cursor:pointer;height:0;width:0}.zephr-registration-form-checkbox-label.svelte-1gzpw2y{display:flex;align-items:center;font-family:var(–zephr-typography-body-font), 
var(–zephr-typography-body-fallbackFont)}.zephr-registration-form-checkmark.svelte-1gzpw2y{position:relative;box-sizing:border-box;height:23px;width:23px;background-color:#fff;border:1px solid var(–zephr-color-text-main);border-radius:6px;margin-right:12px;cursor:pointer}.zephr-registration-form-checkmark.checked.svelte-1gzpw2y{border-color:#009fe3}.zephr-registration-form-checkmark.checked.svelte-1gzpw2y:after{content:””;position:absolute;width:6px;height:13px;border:solid #009fe3;border-width:0 2px 2px 0;transform:rotate(45deg);top:3px;left:8px;box-sizing:border-box}.zephr-registration-form-checkmark.disabled.svelte-1gzpw2y{border:1px solid var(–zephr-color-background-tinted)}.zephr-registration-form-checkmark.disabled.checked.svelte-1gzpw2y:after{border:solid var(–zephr-color-background-tinted);border-width:0 2px 2px 0}.zephr-registration-form-checkmark.error.svelte-1gzpw2y{border:1px solid var(–zephr-color-warning-main)}.zephr-registration-form-google-icon.svelte-1jnblvg{width:20px}.zephr-form-link.svelte-64wplc{margin:10px 0;color:#6ba5e9;text-decoration:underline;cursor:pointer;font-family:var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont)}.zephr-form-link-disabled.svelte-64wplc{color:var(–zephr-color-text-main);cursor:none;text-decoration:none}.zephr-registration-form-password-progress.svelte-d1zv9r{display:flex;margin-top:10px}.zephr-registration-form-password-bar.svelte-d1zv9r{width:100%;height:4px;border-radius:2px}.zephr-registration-form-password-bar.svelte-d1zv9r:not(:first-child){margin-left:8px}.zephr-registration-form-password-requirements.svelte-d1zv9r{margin:20px 0;justify-content:center}.zephr-registration-form-password-requirement.svelte-d1zv9r{display:flex;align-items:center;color:var(–zephr-color-text-tinted);font-size:12px;height:20px;font-family:var(–zephr-typography-body-font), 
var(–zephr-typography-body-fallbackFont)}.zephr-registration-form-password-requirement-icon.svelte-d1zv9r{margin-right:10px;font-size:15px}.zephr-registration-form-password-progress.svelte-d1zv9r{display:flex;margin-top:10px}.zephr-registration-form-password-bar.svelte-d1zv9r{width:100%;height:4px;border-radius:2px}.zephr-registration-form-password-bar.svelte-d1zv9r:not(:first-child){margin-left:8px}.zephr-registration-form-password-requirements.svelte-d1zv9r{margin:20px 0;justify-content:center}.zephr-registration-form-password-requirement.svelte-d1zv9r{display:flex;align-items:center;color:var(–zephr-color-text-tinted);font-size:12px;height:20px;font-family:var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont)}.zephr-registration-form-password-requirement-icon.svelte-d1zv9r{margin-right:10px;font-size:15px}
    .zephr-registration-form {
    max-width: 100%;
    background-image: url(/wp-content/themes/sciencenews/client/src/images/cta-module@2x.jpg);
    font-family: var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont);
    margin: 0px auto;
    margin-bottom: 4rem;
    padding: 20px;
    }

    .zephr-registration-form-text h6 {
    font-size: 0.8rem;
    }

    .zephr-registration-form h4 {
    font-size: 3rem;
    }

    .zephr-registration-form h4 {
    font-size: 1.5rem;
    }

    .zephr-registration-form-button.svelte-17g75t9:hover {
    background-color: #fc6a65;
    border-color: #fc6a65;
    width: 150px;
    margin-left: auto;
    margin-right: auto;
    }
    .zephr-registration-form-button.svelte-17g75t9:disabled {
    background-color: #e04821;
    border-color: #e04821;
    width: 150px;
    margin-left: auto;
    margin-right: auto;
    }
    .zephr-registration-form-button.svelte-17g75t9 {
    background-color: #e04821;
    border-color: #e04821;
    width: 150px;
    margin-left: auto;
    margin-right: auto;
    }
    .zephr-registration-form-text > * {
    color: #FFFFFF;
    font-weight: bold
    font: 25px;
    }
    .zephr-registration-form-progress-bar.svelte-8qyhcl {
    width: 100%;
    border: 0;
    border-radius: 20px;
    margin-top: 10px;
    display: none;
    }
    .zephr-registration-form-response-message-title.svelte-179421u {
    font-weight: bold;
    margin-bottom: 10px;
    display: none;
    }
    .zephr-registration-form-response-message-success.svelte-179421u {
    background-color: #8db869;
    border: 1px solid #8db869;
    color: white;
    margin-top: -0.2rem;
    }
    .zephr-registration-form-text.svelte-i1fi5:nth-child(1){
    font-size: 18px;
    text-align: center;
    margin: 20px auto;
    font-family: var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont);
    color: white;
    }
    .zephr-registration-form-text.svelte-i1fi5:nth-child(5){
    font-size: 18px;
    text-align: left;
    margin: 20px auto;
    font-family: var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont);
    color: white;
    }
    .zephr-registration-form-text.svelte-i1fi5:nth-child(7){
    font-size: 18px;
    text-align: left;
    margin: 20px auto;
    font-family: var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont);
    color: white;
    }
    .zephr-registration-form-text.svelte-i1fi5:nth-child(9){
    font-size: 18px;
    text-align: left;
    margin: 20px auto;
    font-family: var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont);
    color: white;
    }
    .zephr-registration-form-input-label.svelte-1ok5fdj span.svelte-1ok5fdj {
    display: none;
    color: white;
    }
    .zephr-registration-form-input.disabled.svelte-blfh8x, .zephr-registration-form-input.disabled.svelte-blfh8x:hover {
    border: calc(var(–zephr-input-borderWidth) * 1px) solid var(–zephr-input-borderColor);
    background-color: white;
    }
    .zephr-registration-form-checkbox-label.svelte-1gzpw2y {
    display: flex;
    align-items: center;
    font-family: var(–zephr-typography-body-font), var(–zephr-typography-body-fallbackFont);
    color: white;
    font-size: 20px;
    margin-bottom: -20px;
    }

    Non-abelian anyons defy common intuition about what happens to objects that swap locations. Picture the street game with cups and balls, where a performer swaps identical cups back and forth. If you weren’t watching closely, you’d never know if two cups had been moved around one another and back to their original positions. In the quantum world, that’s not always the case.

    “It’s predicted that there is this crazy particle where, if you swap them around each other while you have your eyes closed, you can actually tell after the fact,” says physicist Trond Andersen of Google Quantum AI in Santa Barbara, Calif. “This goes against our common sense, and it seems crazy.”

    Particles in our regular 3-D world can’t do this magic trick. But when particles are confined to just two dimensions, the rules change. While scientists don’t have a 2-D universe in which to explore particles, they can manipulate materials or quantum computers to exhibit behavior like that of particles that live in two dimensions, creating objects known as quasiparticles.

    All fundamental subatomic particles fall into two classes, based on how identical particles of each type behave when swapped. They are either fermions, a class that includes electrons and other particles that make up matter, or bosons, which include particles of light known as photons.

    But in two dimensions, there’s another option: anyons. For bosons or fermions, swapping identical particles back and forth or moving them around one another can’t have a directly measurable effect. For anyons, it can.
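    In symbols, the swap rules can be summarized in the standard textbook form (this is general background for orientation, not drawn from either study). Exchanging two identical particles acts on the two-particle wave function like so:

```latex
\psi(x_2, x_1) =
\begin{cases}
  +\,\psi(x_1, x_2) & \text{bosons} \\
  -\,\psi(x_1, x_2) & \text{fermions} \\
  e^{i\theta}\,\psi(x_1, x_2) & \text{abelian anyons (2-D only)} \\
  U\,\psi(x_1, x_2),\ U \text{ unitary} & \text{non-abelian anyons (2-D only)}
\end{cases}
```

    For non-abelian anyons the matrices $U$ for different swaps generally do not commute, so the order in which the swaps happen leaves a measurable imprint on the state.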

    In the 1990s, scientists realized that a specific version of an anyon, called a non-abelian anyon, could be used to build quantum computers that might safeguard fragile quantum information, which is easily knocked out of whack by minute disturbances.
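    That order-dependence can be sketched in a few lines of Python. The matrices below are the two elementary braid generators for Ising anyons with overall phases dropped, a standard illustrative example rather than the operations implemented in either experiment:

```python
import numpy as np

# Two elementary "braid" moves, each a unitary matrix acting on a two-state
# space. These are the Ising-anyon braid generators up to a global phase
# (textbook example, not the gates used in the Google or Quantinuum studies).
B1 = np.array([[1, 0],
               [0, 1j]])
B2 = np.array([[1, -1j],
               [-1j, 1]]) / np.sqrt(2)

state = np.array([1, 0], dtype=complex)  # start in a definite state

# Braid 1 then 2, versus braid 2 then 1.
out_12 = B2 @ (B1 @ state)
out_21 = B1 @ (B2 @ state)

print(np.allclose(out_12, out_21))  # False: the order of swaps is detectable
```

    For bosons or fermions the analogous operators are just overall signs, so the two orderings would agree; here they produce genuinely different states.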


    “For fundamental reasons these anyons have been very exciting, and for practical reasons people hope they might be useful,” says theoretical physicist Maissam Barkeshli of the University of Maryland in College Park, who was not involved with either study.

    Google’s team created the anyons using a superconducting quantum computer, where the quantum bits, or qubits, are made of material that conducts electricity without resistance. Quantinuum’s study, which has yet to be peer-reviewed, is based on a quantum computer whose qubits are composed of trapped, electrically charged atoms of ytterbium and barium. In both cases, scientists manipulated the qubits to create the anyons and move them around, demonstrating a measurable change after the anyons were braided.

    Scientists have previously created and braided a less exotic type of anyon, called an abelian anyon, within a 2-D layer of a solid material (SN: 7/9/20). And many physicists are similarly questing after a solid material that might host the non-abelian type.

    But the new studies create non-abelian states within qubits inside a quantum computer, which is fundamentally different, Barkeshli says. “You’re kind of synthetically creating the state for a fleeting moment.” That means it doesn’t have all the properties that anyons within a solid material would have, he says.

    In both cases, much more work must be done before the anyons could create powerful, error-resistant quantum computers. Google’s study, in particular, produces an anyon that’s akin to a fish out of water: a non-abelian anyon living within a more commonplace abelian framework. That means those anyons may not be as powerful for quantum computing, Barkeshli says.

    It’s not all about practical usefulness. Demonstrating that non-abelian anyons really exist is fundamentally important, says Quantinuum’s Henrik Dreyer, a physicist in Munich. It “confirms that the rules of quantum mechanics apply in the way that we thought they would apply.”

  • in

    Robot centipedes go for a walk

    Researchers from the Department of Mechanical Science and Bioengineering at Osaka University have invented a new kind of walking robot that takes advantage of dynamic instability to navigate. By changing the flexibility of the couplings, the robot can be made to turn without the need for complex computational control systems. This work may assist the creation of rescue robots that are able to traverse uneven terrain.
    Most animals on Earth have evolved a robust locomotion system using legs that provides them with a high degree of mobility over a wide range of environments. Somewhat disappointingly, engineers who have attempted to replicate this approach have often found that legged robots are surprisingly fragile. The failure of even one leg under repeated stress can severely limit the ability of these robots to function. In addition, controlling a large number of joints so the robot can traverse complex environments requires a lot of computer power. Improvements in this design would be extremely useful for building autonomous or semi-autonomous robots that could act as exploration or rescue vehicles and enter dangerous areas.
    Now, investigators from Osaka University have developed a biomimetic “myriapod” robot that takes advantage of a natural instability to convert straight walking into curved motion. In a study published recently in Soft Robotics, the researchers describe their robot, which consists of six segments (with two legs connected to each segment) and flexible joints. The flexibility of the couplings can be modified with motors, via an adjustable screw, during the walking motion.
    The researchers showed that increasing the flexibility of the joints leads to a situation called a “pitchfork bifurcation,” in which straight walking becomes unstable. Instead, the robot transitions to walking in a curved pattern, either to the right or to the left. Normally, engineers try to avoid creating instabilities, but making controlled use of them can enable efficient maneuverability.
    “We were inspired by the ability of certain extremely agile insects that allows them to control the dynamic instability in their own motion to induce quick movement changes,” says Shinya Aoi, an author of the study. Because this approach controls the flexibility rather than directly steering the movement of the body axis, it can greatly reduce both the computational complexity and the energy requirements.
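    The mechanism can be illustrated with the textbook normal form of a pitchfork bifurcation, x' = μx − x³. This is a generic sketch, not the robot's actual equations of motion: here μ stands in for joint flexibility and x for the curvature of the walking path.

```python
# Normal form of a pitchfork bifurcation, x' = mu*x - x**3, integrated with
# forward Euler. Below the bifurcation (mu < 0) the straight-walking state
# x = 0 is stable; above it (mu > 0), x = 0 becomes unstable and the system
# settles at x = +sqrt(mu) or -sqrt(mu), i.e. a rightward or leftward curve.
def simulate(mu, x0=1e-3, dt=0.01, steps=20_000):
    x = x0
    for _ in range(steps):
        x += dt * (mu * x - x**3)
    return x

# Stiff joints: a tiny curvature perturbation decays back to straight walking.
print(round(simulate(mu=-0.5), 3))   # ~0.0

# Flexible joints: the same perturbation grows until the path curves steadily.
print(round(simulate(mu=0.25), 3))   # ~0.5, i.e. +sqrt(0.25)
```

    Which branch the system picks (left or right) depends on the sign of the initial perturbation, which is why a small controlled nudge suffices to steer.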
    The team tested the robot’s ability to reach specific locations and found that it could navigate by taking curved paths toward targets. “We can foresee applications in a wide variety of scenarios, such as search and rescue, working in hazardous environments or exploration on other planets,” says Mau Adachi, another study author. Future versions may include additional segments and control mechanisms.