More stories

  • Adding or subtracting single quanta of sound

    Researchers perform experiments that can add or subtract a single quantum of sound — with surprising results when applied to noisy sound fields.
    Quantum mechanics tells us that physical objects can have both wave and particle properties. For instance, a single particle — or quantum — of light is known as a photon, and, in a similar fashion, a single quantum of sound is known as a phonon, which can be thought of as the smallest unit of sound energy.
    A team of researchers spanning Imperial College London, the University of Oxford, the Niels Bohr Institute, the University of Bath, and the Australian National University has performed an experiment that can add or subtract a single phonon to a high-frequency sound field using interactions with laser light.
    The team’s findings aid the development of future quantum technologies, such as hardware components in a future ‘quantum internet’, and help pave the way for tests of quantum mechanics on a more macroscopic scale. The details of their research are published today in the journal Physical Review Letters.
    To add or subtract a single quantum of sound, the team experimentally implement a technique proposed in 2013 that exploits correlations between photons and phonons created inside a resonator. More specifically, laser light is injected into a crystalline microresonator that supports both the light and the high-frequency sound waves.
    The two types of waves then couple to one another via an electromagnetic interaction that creates light at a new frequency. Then, to subtract a single phonon, the team detect a single photon that has been up-shifted in frequency. “Detecting a single photon gives us an event-ready signal that we have subtracted a single phonon,” says lead author of the project Georg Enzian.
    When the experiment is performed at a finite temperature, the sound field has random fluctuations from thermal noise. Thus, at any one time, the exact number of sound quanta present is unknown but on average there will be n phonons initially.
    What happens now when you add or subtract a single phonon? At first thought, you may expect this would simply change the average to n + 1 or n - 1, respectively. The actual outcome, however, defies this intuition: when you subtract a single phonon, the average number of phonons actually goes up to 2n.
    This surprising result where the mean number of quanta doubles has been observed for all-optical photon-subtraction experiments and is observed for the first time outside of optics here. “One way to think of the experiment is to imagine a claw machine that you often see in video arcades, except that you can’t see how many toys there are inside the machine. Before you agree to play, you’ve been told that on average there are n toys inside but the exact number changes randomly each time you play. Then, immediately after a successful grab with the claw, the average number of toys actually goes up to 2n,” describes Michael Vanner, Principal Investigator of the Quantum Measurement Lab at Imperial College London.
    It’s important to note that this result certainly does not violate energy conservation and comes about due to the statistics of thermal phonons.
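    The doubling can be reproduced with a short simulation: a heralded subtraction succeeds with probability proportional to the phonon number, so noisier-than-average moments of the sound field are over-represented in the post-selected statistics. A minimal Monte Carlo sketch, assuming an illustrative mean occupation of n = 4:

```python
import numpy as np

rng = np.random.default_rng(0)
nbar = 4.0                    # assumed mean thermal occupation (illustrative)
q = nbar / (1 + nbar)         # Bose-Einstein distribution parameter

# Draw phonon numbers from a thermal (geometric) distribution with mean nbar
n = rng.geometric(p=1 - q, size=1_000_000) - 1   # support {0, 1, 2, ...}

# A subtraction event is heralded with probability proportional to n,
# so the post-selected mean is E[n(n-1)] / E[n]
mean_after = np.sum(n * (n - 1)) / np.sum(n)
print(mean_after)   # close to 2 * nbar = 8, not nbar - 1 = 3
```

    Exactly the same weighting explains the claw-machine analogy below: a successful grab is more likely when the machine happens to hold more toys.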
    The team’s results, combined with their recent experiment that reported strong coupling between light and sound in a microresonator, open a new path for quantum science and technology with sound waves.

    Story Source:
    Materials provided by Imperial College London. Note: Content may be edited for style and length.

  • Sport may fast-track numeracy skills for Indigenous children

    Greater sports participation among Aboriginal and Torres Strait Islander children is linked with better academic performance, according to new research from the University of South Australia.
    Conducted in partnership with the University of Sydney and the University of Technology Sydney, the world-first study found that Aboriginal and Torres Strait Islander children who played organised sports every year over four years had numeracy skills advanced by seven months, compared to children who did less sport.
    The study used data from four successive waves of Australia’s Longitudinal Study of Indigenous Children, following 303 students (with a baseline age of five to six years old) to assess cumulative sports participation against academic performance in standardised NAPLAN and PAT outcomes.
    Sports participation has been linked with better cognitive function and memory in many child populations, but this is the first study to confirm the beneficial association between ongoing involvement in sport and academic performance among Aboriginal and Torres Strait Islander children.
    Lead researcher, UniSA’s Dr Dot Dumuid, says the study highlights the importance of sports as a strategy to help close the gap for Australia’s First Nations peoples.
    “Playing sport has always had strong cultural importance to Aboriginal and Torres Strait Islanders, so understanding how sports can boost numeracy among Indigenous children is a valuable step towards improving health and reducing disadvantage,” Dr Dumuid says.

    “When children play sport, they’re learning the social structures of a team, how to work within rules, how to focus their attention, and key strategies for success.
    “Interestingly, when children play sport, they’re not only activating parts of the brain that are involved in learning, but they’re also inadvertently practising mathematical computations such as ‘how much time is left in the game?’ and ‘how many points do we need to win?’, and it’s this that may well be contributing to improved numeracy.”
    Aboriginal and Torres Strait Islanders comprise a relatively large proportion of athletes in Australia’s leading sports teams. While representing only about three per cent of the population, they make up nine per cent of AFL players and 22 per cent of State of Origin players.
    Encouraging sports in Aboriginal and Torres Strait Islander communities could have many other benefits for health and wellbeing, says co-researcher and Professor of Indigenous Health Education at UTS, John Evans.
    “Playing sport creates a sense of belonging, and builds self-esteem, coherence and purpose,” Professor Evans says.
    “This is especially important for people living in rural and remote areas where opportunities for social interaction and structured activities can be limited.
    “If we can find ways to encourage greater participation among Aboriginal and Torres Strait Islander communities, while removing key barriers — such as financial costs and lack of transport — we could promote healthier living and more cohesive communities while also boosting academic performance among Indigenous children.”

    Story Source:
    Materials provided by University of South Australia.

  • New blueprint for more stable quantum computers

    Researchers at the Paul Scherrer Institute PSI have put forward a detailed plan of how faster and better defined quantum bits — qubits — can be created. The central elements are magnetic atoms from the class of so-called rare-earth metals, which would be selectively implanted into the crystal lattice of a material. Each of these atoms represents one qubit. The researchers have demonstrated how these qubits can be activated, entangled, used as memory bits, and read out. They have now published their design concept and supporting calculations in the journal PRX Quantum.
    On the way to quantum computers, an initial requirement is to create so-called quantum bits or “qubits”: memory bits that can, unlike classical bits, take on not only the binary values of zero and one, but also any arbitrary combination of these states. “With this, an entirely new kind of computation and data processing becomes possible, which for specific applications means an enormous acceleration of computing power,” explains PSI researcher Manuel Grimm, first author of a new paper on the topic of qubits.
    The authors describe how logical bits and basic computer operations on them can be realised in a magnetic solid: qubits would reside on individual atoms from the class of rare-earth elements, built into the crystal lattice of a host material. On the basis of quantum physics, the authors calculate that the nuclear spin of the rare-earth atoms would be suitable for use as an information carrier, that is, a qubit. They further propose that targeted laser pulses could momentarily transfer the information to the atom’s electrons and thus activate the qubits, whereby their information becomes visible to surrounding atoms. Two such activated qubits communicate with each other and thus can be “entangled.” Entanglement is a special property of quantum systems of multiple particles or qubits that is essential for quantum computers: The result of measuring one qubit directly depends on the measurement results of other qubits, and vice versa.
    Faster means less error-prone
    The researchers demonstrate how these qubits can be used to produce logic gates, most notably the “controlled NOT gate” (CNOT gate). Logic gates are the basic building blocks that also classical computers use to perform calculations. If sufficiently many such CNOT gates as well as single-qubit gates are combined, every conceivable computational operation becomes possible. They thus form the basis for quantum computers.
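    As a general illustration of why the CNOT gate is so central (this sketch is not specific to the PSI proposal), a few lines of linear algebra show a Hadamard gate followed by a CNOT taking two independent qubits into an entangled Bell state:

```python
import numpy as np

# Single-qubit gates and the two-qubit CNOT, written as matrices
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Both qubits start in |0>; apply H to the first qubit, then CNOT
psi = np.array([1.0, 0.0, 0.0, 0.0])           # the state |00>
psi = CNOT @ (np.kron(H, I) @ psi)

# Result: (|00> + |11>)/sqrt(2), the entangled Bell state --
# measuring one qubit now fixes the outcome of measuring the other
print(psi)
```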
    This paper is not the first to propose quantum-based logic gates. “Our method of activating and entangling the qubits, however, has a decisive advantage over previous comparable proposals: It is at least ten times faster,” says Grimm. The advantage, though, is not only the speed with which a quantum computer based on this concept could calculate; above all, it addresses the system’s susceptibility to errors. “Qubits are not very stable. If the entanglement processes are too slow, there is a greater probability that some of the qubits will lose their information in the meantime,” Grimm explains. Ultimately, what the PSI researchers have discovered is a way of making this type of quantum computer not only at least ten times as fast as comparable systems, but also less error-prone by the same factor.

    Story Source:
    Materials provided by Paul Scherrer Institute. Original written by Laura Hennemann.

  • AI trained to read electric vehicle charging station reviews to find infrastructure gaps

    Although electric vehicles that reduce greenhouse gas emissions attract many drivers, the lack of confidence in charging services deters others. Building a reliable network of charging stations is difficult in part because it’s challenging to aggregate data from independent station operators. But now, researchers reporting January 22 in the journal Patterns have developed an AI that can analyze user reviews of these stations, allowing it to accurately identify places where there are insufficient or out-of-service stations.
    “We’re spending billions of both public and private dollars on electric vehicle infrastructure,” says Omar Asensio (@AsensioResearch), principal investigator and assistant professor in the School of Public Policy at the Georgia Institute of Technology. “But we really don’t have a good understanding of how well these investments are serving the public and public interest.”
    Electric vehicle drivers have started to solve the problem of uncertain charging infrastructure by forming communities on charge station locator apps, leaving reviews. The researchers sought to analyze these reviews to better understand the problems facing users.
    With the aid of their AI, Asensio and colleagues were able to predict whether a specific station was functional on a particular day. They also found that micropolitan areas, where the population is between 10,000 and 50,000 people, may be underserved, with more frequent reports of station availability issues. These communities are mostly located in states in the West and Midwest, such as Oregon, Utah, South Dakota, and Nebraska, along with Hawaii.
    “When users are engaging and sharing information about charging experiences, they are often engaging in prosocial or pro-environmental behavior, which gives us rich behavioral information for machine learning,” says Asensio. But compared to analyzing data tables, texts can be challenging for computers to process. “A review could be as short as three words. It could also be as long as 25 or 30 words with misspellings and multiple topics,” says co-author Sameer Dharur of Georgia Institute of Technology. Users sometimes even throw smiley faces or emojis into the texts.
    To address the problem, Asensio and his team tailored their algorithm to electric vehicle transportation lingo. They trained it with reviews from 12,720 US charging stations to classify reviews into eight different categories: functionality, availability, cost, location, dealership, user interaction, service time, and range anxiety. The AI achieved 91% accuracy and high learning efficiency, parsing the reviews in minutes. “That’s a milestone in the transition for us to deploy these AI tools because it’s no longer ‘can the AI do as well as a human?'” says Asensio. “In some cases, the AI exceeded the performance of human experts.”
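    For a sense of the task, here is a deliberately crude keyword baseline, with invented keywords, that routes reviews into three of the eight categories. The team's trained model handles misspellings, emojis, and multi-topic reviews far better than anything this simple:

```python
# Toy keyword-matching baseline (keywords are hypothetical, for illustration)
CATEGORIES = {
    "functionality": ["broken", "error", "out of service", "not working"],
    "availability": ["busy", "occupied", "blocked", "waiting"],
    "cost": ["free", "fee", "expensive", "price"],
}

def classify(review):
    """Return every category whose keywords appear in the review."""
    text = review.lower()
    labels = [cat for cat, keywords in CATEGORIES.items()
              if any(kw in text for kw in keywords)]
    return labels or ["other"]

print(classify("Station was out of service and the fee was high"))
# ['functionality', 'cost']
```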
    As opposed to previous charging infrastructure performance evaluation studies that rely on costly and infrequent self-reported surveys, AI can reduce research costs while providing real-time standardized data. The electric vehicle charging market is expected to grow to $27.6 billion by 2027. The new method can give insight into consumers’ behavior, enabling rapid policy analysis and making infrastructure management easier for the government and companies. For instance, the team’s findings suggest that it may be more effective to subsidize infrastructure development as opposed to the sale of an electric car.
    While the technology still faces some limitations — like the need to reduce requirements for computer processing power — before rolling out large-scale implementation to the electric vehicle charging market, Asensio and his team hope that as the science progresses, their research can open doors to more in-depth studies about social equity on top of meeting consumer needs.
    “This is a wake-up call for us because, given the massive investment in electric vehicle infrastructure, we’re doing it in a way that is not necessarily attentive to the social equity and distributional issues of access to this enabling infrastructure,” says Asensio. “That is a topic of discussion that’s not going away and we’re only beginning to understand.”

    Story Source:
    Materials provided by Cell Press.

  • Defects may help scientists understand the exotic physics of topology

    Real-world materials are usually messier than the idealized scenarios found in textbooks. Imperfections can add complications and even limit a material’s usefulness. To get around this, scientists routinely strive to remove defects and dirt entirely, pushing materials closer to perfection. Now, researchers at the University of Illinois at Urbana-Champaign have turned this problem around and shown that for some materials defects could act as a probe for interesting physics, rather than a nuisance.
    The team, led by professors Gaurav Bahl and Taylor Hughes, studied artificial materials, or metamaterials, which they engineered to include defects. The team used these customizable circuits as a proxy for studying exotic topological crystals, which are often imperfect, difficult to synthesize, and notoriously tricky to probe directly. In a new study, published in the January 20th issue of Nature, the researchers showed that defects and structural deformations can provide insights into a real material’s hidden topological features.
    “Most studies in this field have focused on materials with perfect internal structure. Our team wanted to see what happens when we account for imperfections. We were surprised to discover that we could actually use defects to our advantage,” said Bahl, an associate professor in the Department of Mechanical Science and Engineering. With that unexpected assist, the team has created a practical and systematic approach for exploring the topology of unconventional materials.
    Topology is a way of mathematically classifying objects according to their overall shape, rather than every small detail of their structure. One common illustration of this is a coffee mug and a bagel, which have the same topology because both objects have only one hole that you can wrap your fingers through.
    Materials can also have topological features related to the classification of their atomic structure and energy levels. These features lead to unusual, yet possibly useful, electron behaviors. But verifying and harnessing topological effects can be tricky, especially if a material is new or unknown. In recent years, scientists have used metamaterials to study topology with a level of control that is nearly impossible to achieve with real materials.
    “Our group developed a toolkit for being able to probe and confirm topology without having any preconceived notions about a material,” says Hughes, who is a professor in the Department of Physics. “This has given us a new window into understanding the topology of materials, and how we should measure it and confirm it experimentally.”
    In an earlier study published in Science, the team established a novel technique for identifying insulators with topological features. Their findings were based on translating experimental measurements made on metamaterials into the language of electronic charge. In this new work, the team went a step further — they used an imperfection in the material’s structure to trap a feature that is equivalent to fractional charges in real materials.

    A single electron by itself cannot carry half a charge or some other fractional amount. But, fragmented charges can show up within crystals, where many electrons dance together in a ballroom of atoms. This choreography of interactions induces odd electronic behaviors that are otherwise disallowed. Fractional charges have not been measured in either naturally occurring or custom-grown crystals, but this team showed that analogous quantities can be measured in a metamaterial.
    The team assembled arrays of centimeter-scale microwave resonators onto a chip. “Each of these resonators plays the role of an atom in a crystal and, similar to an atom’s energy levels, has a specific frequency where it easily absorbs energy — in this case, a frequency similar to that of a conventional microwave oven,” said lead author Kitt Peterson, a former graduate student in Bahl’s group.
    The resonators are arranged into squares, repeating across the metamaterial. The team included defects by disrupting this square pattern — either by removing one resonator to make a triangle or adding one to create a pentagon. Since all the resonators are connected together, these singular disclination defects ripple out, warping the overall shape of the material and its topology.
    The team injected microwaves into each resonator of the array and recorded the amount of absorption. Then, they mathematically translated their measurements to predict how electrons act in an equivalent material. From this, they concluded that fractional charges would be trapped on disclination defects in such a crystal. With further analysis, the team also demonstrated that trapped fractional charge signals the presence of certain kinds of topology.
    “In these crystals, fractional charge turns out to be the most fundamental observable signature of interesting underlying topological features,” said Tianhe Li, a theoretical physics graduate student in Hughes’ research group and a co-author on the study.
    Observing fractional charges directly remains a challenge, but metamaterials offer an alternative way to test theories and learn about manipulating topological forms of matter. According to the researchers, reliable probes for topology are also critical for developing future applications for topological quantum materials.
    The connection between the topology of a material and its imperfect geometry is also broadly interesting for theoretical physics. “Engineering a perfect material does not necessarily reveal much about real materials,” says Hughes. “Thus, studying the connection between defects, like the ones in this study, and topological matter may increase our understanding of realistic materials, with all of their inherent complexities.”

  • Record-breaking laser link could help us test whether Einstein was right

    Scientists from the International Centre for Radio Astronomy Research (ICRAR) and The University of Western Australia (UWA) have set a world record for the most stable transmission of a laser signal through the atmosphere.
    In a study published today in the journal Nature Communications, Australian researchers teamed up with researchers from the French National Centre for Space Studies (CNES) and the French metrology lab Systèmes de Référence Temps-Espace (SYRTE) at Paris Observatory.
    The team set the world record for the most stable laser transmission by combining the Aussies’ ‘phase stabilisation’ technology with advanced self-guiding optical terminals.
    Together, these technologies allowed laser signals to be sent from one point to another without interference from the atmosphere.
    Lead author Benjamin Dix-Matthews, a PhD student at ICRAR and UWA, said the technique effectively eliminates atmospheric turbulence.
    “We can correct for atmospheric turbulence in 3D, that is, left-right, up-down and, critically, along the line of flight,” he said.

    “It’s as if the moving atmosphere has been removed and doesn’t exist.
    “It allows us to send highly stable laser signals through the atmosphere while retaining the quality of the original signal.”
    The result is the world’s most precise method for comparing the flow of time between two separate locations using a laser system transmitted through the atmosphere.
    ICRAR-UWA senior researcher Dr Sascha Schediwy said the research has exciting applications.
    “If you have one of these optical terminals on the ground and another on a satellite in space, then you can start to explore fundamental physics,” he said.

    “Everything from testing Einstein’s theory of general relativity more precisely than ever before, to discovering if fundamental physical constants change over time.”
    The technology’s precise measurements also have practical uses in earth science and geophysics.
    “For instance, this technology could improve satellite-based studies of how the water table changes over time, or to look for ore deposits underground,” Dr Schediwy said.
    There are further potential benefits for optical communications, an emerging field that uses light to carry information.
    Optical communications can securely transmit data between satellites and Earth with much higher data rates than current radio communications.
    “Our technology could help us increase the data rate from satellites to ground by orders of magnitude,” Dr Schediwy said.
    “The next generation of big data-gathering satellites would be able to get critical information to the ground faster.”
    The phase stabilisation technology behind the record-breaking link was originally developed to synchronise incoming signals for the Square Kilometre Array telescope.
    The multi-billion-dollar telescope is set to be built in Western Australia and South Africa from 2021.

  • Bringing atoms to a standstill: Miniaturizing laser cooling

    It’s cool to be small. Scientists at the National Institute of Standards and Technology (NIST) have miniaturized the optical components required to cool atoms down to a few thousandths of a degree above absolute zero, the first step in employing them on microchips to drive a new generation of super-accurate atomic clocks, enable navigation without GPS, and simulate quantum systems.
    Cooling atoms is equivalent to slowing them down, which makes them a lot easier to study. At room temperature, atoms whiz through the air at nearly the speed of sound, some 343 meters per second. The rapid, randomly moving atoms have only fleeting interactions with other particles, and their motion can make it difficult to measure transitions between atomic energy levels. When atoms slow to a crawl — about 0.1 meters per second — researchers can measure the particles’ energy transitions and other quantum properties accurately enough to use as reference standards in a myriad of navigation and other devices.
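    These figures can be sanity-checked with kinetic theory, which gives a root-mean-square speed of sqrt(3 kT/m). A short sketch, assuming rubidium-87 atoms (a workhorse of laser cooling; the article does not name a species):

```python
import math

k_B = 1.380649e-23           # Boltzmann constant, J/K
m = 86.909 * 1.66054e-27     # mass of a rubidium-87 atom, kg

def v_rms(T):
    """Root-mean-square thermal speed at temperature T (kelvin)."""
    return math.sqrt(3 * k_B * T / m)

print(v_rms(300))      # roughly 293 m/s at room temperature
print(v_rms(100e-6))   # roughly 0.17 m/s at 100 microkelvin
```

    Since the speed scales as the square root of temperature, reaching crawling speeds of about 0.1 m/s requires cooling by roughly six orders of magnitude.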
    For more than two decades, scientists have cooled atoms by bombarding them with laser light, a feat for which NIST physicist Bill Phillips shared the 1997 Nobel Prize in physics. Although laser light would ordinarily energize atoms, causing them to move faster, if the frequency and other properties of the light are chosen carefully, the opposite happens. Upon striking the atoms, the laser photons reduce the atoms’ momentum until they are moving slowly enough to be trapped by a magnetic field.
    But to prepare the laser light so that it has the properties to cool atoms typically requires an optical assembly as big as a dining-room table. That’s a problem because it limits the use of these ultracold atoms outside the laboratory, where they could become a key element of highly accurate navigation sensors, magnetometers and quantum simulations.
    Now NIST researcher William McGehee and his colleagues have devised a compact optical platform, only about 15 centimeters (5.9 inches) long, that cools and traps gaseous atoms in a 1-centimeter-wide region. Although other miniature cooling systems have been built, this is the first one that relies solely on flat, or planar, optics, which are easy to mass produce.
    “This is important as it demonstrates a pathway for making real devices and not just small versions of laboratory experiments,” said McGehee. The new optical system, while still about 10 times too big to fit on a microchip, is a key step toward employing ultracold atoms in a host of compact, chip-based navigation and quantum devices outside a laboratory setting. Researchers from the Joint Quantum Institute, a collaboration between NIST and the University of Maryland in College Park, along with scientists from the University of Maryland’s Institute for Research in Electronics and Applied Physics, also contributed to the study.

    The apparatus, described online in the New Journal of Physics, consists of three optical elements. First, light is launched from an optical integrated circuit using a device called an extreme mode converter. The converter enlarges the narrow laser beam, initially about 500 nanometers (nm) in diameter (about five thousandths the thickness of a human hair), to 280 times that width. The enlarged beam then strikes a carefully engineered, ultrathin film known as a “metasurface” that’s studded with tiny pillars, about 600 nm in length and 100 nm wide.
    The nanopillars act to further widen the laser beam by another factor of 100. The dramatic widening is necessary for the beam to efficiently interact with and cool a large collection of atoms. Moreover, by accomplishing that feat within a small region of space, the metasurface miniaturizes the cooling process.
    The metasurface reshapes the light in two other important ways, simultaneously altering the intensity and polarization (direction of vibration) of the light waves. Ordinarily, the intensity follows a bell-shaped curve, in which the light is brightest at the center of the beam, with a gradual falloff on either side. The NIST researchers designed the nanopillars so that the tiny structures modify the intensity, creating a beam that has a uniform brightness across its entire width. The uniform brightness allows more efficient use of the available light. Polarization of the light is also critical for laser cooling.
    The expanding, reshaped beam then strikes a diffraction grating that splits the single beam into three pairs of equal and oppositely directed beams. Combined with an applied magnetic field, the four beams, pushing on the atoms in opposing directions, serve to trap the cooled atoms.
    Each component of the optical system — the converter, the metasurface and the grating — had been developed at NIST but was in operation at separate laboratories on the two NIST campuses, in Gaithersburg, Maryland, and Boulder, Colorado. McGehee and his team brought the disparate components together to build the new system.
    “That’s the fun part of this story,” he said. “I knew all the NIST scientists who had independently worked on these different components, and I realized the elements could be put together to create a miniaturized laser cooling system.”
    Although the optical system will have to be 10 times smaller to laser-cool atoms on a chip, the experiment “is proof of principle that it can be done,” McGehee added.
    “Ultimately, making the light preparation smaller and less complicated will enable laser-cooling based technologies to exist outside of laboratories,” he said.

  • Designing customized 'brains' for robots

    Contemporary robots can move quickly. “The motors are fast, and they’re powerful,” says Sabrina Neuman.
    Yet in complex situations, like interactions with people, robots often don’t move quickly. “The hang-up is what’s going on in the robot’s head,” she adds.
    Perceiving stimuli and calculating a response takes a “boatload of computation,” which limits reaction time, says Neuman, who recently graduated with a PhD from the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Neuman has found a way to fight this mismatch between a robot’s “mind” and body. The method, called robomorphic computing, uses a robot’s physical layout and intended applications to generate a customized computer chip that minimizes the robot’s response time.
    The advance could fuel a variety of robotics applications, including, potentially, frontline medical care of contagious patients. “It would be fantastic if we could have robots that could help reduce risk for patients and hospital workers,” says Neuman.
    Neuman will present the research at this April’s International Conference on Architectural Support for Programming Languages and Operating Systems. MIT co-authors include graduate student Thomas Bourgeat and Srini Devadas, the Edwin Sibley Webster Professor of Electrical Engineering and Neuman’s PhD advisor. Other co-authors include Brian Plancher, Thierry Tambe, and Vijay Janapa Reddi, all of Harvard University. Neuman is now a postdoctoral NSF Computing Innovation Fellow at Harvard’s School of Engineering and Applied Sciences.
    There are three main steps in a robot’s operation, according to Neuman. The first is perception, which includes gathering data using sensors or cameras. The second is mapping and localization: “Based on what they’ve seen, they have to construct a map of the world around them and then localize themselves within that map,” says Neuman. The third step is motion planning and control — in other words, plotting a course of action.

    These steps can take time and an awful lot of computing power. “For robots to be deployed into the field and safely operate in dynamic environments around humans, they need to be able to think and react very quickly,” says Plancher. “Current algorithms cannot be run on current CPU hardware fast enough.”
    Neuman adds that researchers have been investigating better algorithms, but she thinks software improvements alone aren’t the answer. “What’s relatively new is the idea that you might also explore better hardware.” That means moving beyond a standard-issue CPU processing chip that comprises a robot’s brain — with the help of hardware acceleration.
    Hardware acceleration refers to the use of a specialized hardware unit to perform certain computing tasks more efficiently. A commonly used hardware accelerator is the graphics processing unit (GPU), a chip specialized for parallel processing. These devices are handy for graphics because their parallel structure allows them to simultaneously process thousands of pixels. “A GPU is not the best at everything, but it’s the best at what it’s built for,” says Neuman. “You get higher performance for a particular application.” Most robots are designed with an intended set of applications and could therefore benefit from hardware acceleration. That’s why Neuman’s team developed robomorphic computing.
    The system creates a customized hardware design to best serve a particular robot’s computing needs. The user inputs the parameters of a robot, like its limb layout and how its various joints can move. Neuman’s system translates these physical properties into mathematical matrices. These matrices are “sparse,” meaning they contain many zero values that roughly correspond to movements that are impossible given a robot’s particular anatomy. (Similarly, your arm’s movements are limited because it can only bend at certain joints — it’s not an infinitely pliable spaghetti noodle.)
    The system then designs a hardware architecture specialized to run calculations only on the non-zero values in the matrices. The resulting chip design is therefore tailored to maximize efficiency for the robot’s computing needs. And that customization paid off in testing.
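    The sparsity-exploiting idea can be sketched in a few lines: fix the matrix's non-zero pattern once (the analogue of generating the custom hardware), then run multiply-accumulates only over those entries. The matrix below is invented for illustration:

```python
import numpy as np

# Hypothetical robot-specific matrix: zero entries mark couplings that the
# robot's anatomy rules out, so they never need to be computed
M = np.array([[1.5, 0.0, 0.0],
              [0.0, 2.0, 0.5],
              [0.0, 0.0, 3.0]])

# "Hardware generation" step: the non-zero pattern is fixed once, the way
# a custom chip would hard-wire only the multipliers it actually needs
pattern = [(i, j) for i in range(3) for j in range(3) if M[i, j] != 0.0]

def sparse_matvec(vec):
    out = np.zeros(3)
    for i, j in pattern:           # 4 multiply-accumulates instead of 9
        out[i] += M[i, j] * vec[j]
    return out

print(sparse_matvec(np.array([1.0, 1.0, 1.0])))   # [1.5 2.5 3. ]
```

    A real robot's kinematics and dynamics matrices are much larger, so the fraction of work skipped grows accordingly.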

    Hardware architecture designed using this method for a particular application outperformed off-the-shelf CPU and GPU units. While Neuman’s team didn’t fabricate a specialized chip from scratch, they programmed a customizable field-programmable gate array (FPGA) chip according to their system’s suggestions. Despite operating at a slower clock rate, that chip performed eight times faster than the CPU and 86 times faster than the GPU.
    “I was thrilled with those results,” says Neuman. “Even though we were hamstrung by the lower clock speed, we made up for it by just being more efficient.”
    Plancher sees widespread potential for robomorphic computing. “Ideally we can eventually fabricate a custom motion-planning chip for every robot, allowing them to quickly compute safe and efficient motions,” he says. “I wouldn’t be surprised if 20 years from now every robot had a handful of custom computer chips powering it, and this could be one of them.” Neuman adds that robomorphic computing might allow robots to relieve humans of risk in a range of settings, such as caring for COVID-19 patients or manipulating heavy objects.
    Neuman next plans to automate the entire system of robomorphic computing. Users will simply drag and drop their robot’s parameters, and “out the other end comes the hardware description. I think that’s the thing that’ll push it over the edge and make it really useful.”
    This research was funded by the National Science Foundation, the Computing Research Agency, the CIFellows Project, and the Defense Advanced Research Projects Agency.