More stories

  •

    Targeted prevention helps stop homelessness before it starts

    Homelessness has become an increasingly worrisome crisis in our nation over the past several years, but a new study from the University of Notre Dame shows that efforts to prevent homelessness work.
    The issue has reached such proportions in California, for example, that mayors of several major cities have declared a state of emergency on homelessness. In response, leaders in California have invested billions in homelessness programs, including some that target prevention.
    Prevention efforts, however, have led to questions — even from organizations committed to addressing homelessness — as to whether such programs are effective, due to the difficulty of targeting assistance to those with the greatest risk of becoming homeless. To test the impact of providing financial assistance to those susceptible to losing their housing, researchers at Notre Dame conducted a randomized controlled trial to evaluate the effect of emergency financial assistance (EFA) on families receiving support through the Santa Clara County Homelessness Prevention System, which is co-led by Destination: Home, a nonprofit organization dedicated to ending homelessness in Silicon Valley.
    David Phillips, a research professor in the Wilson Sheehan Lab for Economic Opportunities (LEO) within Notre Dame’s Department of Economics, and James Sullivan, a professor of economics and co-founder of LEO, found that people offered EFA were 81 percent less likely to become homeless within six months of enrollment and 73 percent less likely within 12 months, as reported in their study recently published in The Review of Economics and Statistics.
    The study evaluated individuals and families at imminent risk of being evicted or becoming homeless who were allocated EFA between July 2019 and December 2020, with the average household receiving nearly $2,000. Recipients were chosen from among a larger group of people eligible for the program based on their vulnerability to homelessness and on a randomized system set up by LEO and Destination: Home. The temporary assistance paid for rent, utilities or other housing-related expenses on recipients’ behalf.
    A common approach to fighting homelessness is to provide shelter to those who are already homeless, but the researchers argued that once a family or individual becomes homeless, they face even more difficulties — such as finding permanent housing, basic necessities and health care. They are also more likely to become involved in the criminal justice system and experience frequent hospital visits. LEO’s study found that a preventive approach focusing directly on helping those who are on the brink of homelessness can also be effective.
    “Our estimates suggest that the benefits to homelessness prevention exceed the costs,” the researchers said. They estimated that communities get $2.47 back in benefits per net dollar spent on emergency financial assistance.
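The benefit-cost claim can be made concrete with a back-of-the-envelope calculation. Only the $2.47-per-dollar ratio and the roughly $2,000 average grant come from the study; the program size below is a hypothetical placeholder.

```python
# Back-of-the-envelope benefit-cost sketch for emergency financial assistance (EFA).
# The $2.47 ratio and ~$2,000 average grant are from the study; the household
# count is a hypothetical illustration.
benefit_per_dollar = 2.47   # estimated community benefit per net dollar spent
avg_assistance = 2000       # approximate average EFA per household (USD)
households = 500            # hypothetical program size

total_cost = avg_assistance * households
total_benefit = benefit_per_dollar * total_cost
net_benefit = total_benefit - total_cost

print(f"Spend ${total_cost:,} -> est. ${total_benefit:,.0f} in benefits "
      f"(net ${net_benefit:,.0f})")
```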
    “Policymakers at all levels are struggling to make really hard decisions about how to allocate scarce resources to address this pervasive problem,” Sullivan said. “But this study shows that you can actually target the intervention to those at risk, which moves the needle on homelessness enough to justify making the investment.”
    Phillips added that while homelessness prevention programs are not a panacea for other problems often associated with the most visible forms of homelessness — such as health and substance abuse issues — they are still an effective way to help people.
    “Every person who ends up homeless is a little different from the next, and the reasons they’re there are different, but it’s the kind of help they need at the moment they need it, before everything falls apart,” Phillips said.
    One of LEO’s main tenets is to take a rigorous approach to fighting poverty by helping service providers apply scientific evaluation methods to better understand and share effective poverty interventions. Said Sullivan, “A big part of LEO’s mission is to create evidence that helps improve the lives of those most vulnerable. Because we have far greater needs than we have resources to address them, we have a real incentive to allocate those resources to the programs that are most effective. This evidence helps shape the decisions of those on the front lines fighting homelessness and poverty.”
    Jennifer Loving, chief executive officer of Destination: Home, said the LEO study has implications both locally and nationally. “This could inspire other jurisdictions to stand up their own homelessness prevention systems, using this research as a model or starting point for how to do that on their own — as well as justification to policymakers for funding,” Loving said.

  •

    Surgical and engineering innovations enable unprecedented control over every finger of a bionic hand

    Prosthetic limbs are the most common solution to replace a lost extremity. However, they are hard to control, often unreliable, and typically offer only a couple of movements. Remnant muscles in the residual limb are the preferred source of control for bionic hands. This is because patients can contract these muscles at will, and the electrical activity generated by the contractions can be used to tell the prosthetic hand what to do, for instance, open or close. A major problem at higher amputation levels, such as above the elbow, is that few muscles remain to command the many robotic joints needed to truly restore the function of an arm and hand.
    A multidisciplinary team of surgeons and engineers has circumvented this problem by reconfiguring the residual limb and integrating sensors and a skeletal implant to connect with a prosthesis electrically and mechanically. By dissecting the peripheral nerves and redistributing them to new muscle targets used as biological amplifiers, the bionic prosthesis can now access much more information so the user can command many robotic joints at will (video: https://youtu.be/h1N-vKku0hg).
    The research was led by Professor Max Ortiz Catalan, Founding Director of the Center for Bionics and Pain Research (CBPR) in Sweden, Head of Neural Prosthetics Research at the Bionics Institute in Australia, and Professor of Bionics at Chalmers University of Technology in Sweden.
    “In this article, we show that rewiring nerves to different muscle targets in a distributed and concurrent manner is not only possible but also conducive to improved prosthetic control. A key feature of our work is that we have the possibility to clinically implement more refined surgical procedures and embed sensors in the neuromuscular constructs at the time of the surgery, which we then connect to the electronic system of the prosthesis via an osseointegrated interface. A.I. algorithms take care of the rest.”
    Prosthetic limbs are commonly attached to the body by a socket that compresses the residual limb, which causes discomfort and is mechanically unstable. An alternative to socket attachment is to use a titanium implant placed within the residual bone which becomes strongly anchored — this is known as osseointegration. Such skeletal attachment allows for a comfortable and more efficient mechanical connection of the prosthesis to the body.
    “It is rewarding to see that our cutting-edge surgical and engineering innovation can provide such a high level of functionality for an individual with an arm amputation. This achievement is based on over 30 years of gradual development of the concept, to which I am proud to have contributed,” comments Dr. Rickard Brånemark, research affiliate at MIT, associate professor at Gothenburg University, CEO of Integrum and a leading expert on osseointegration for limb prostheses, who conducted the implantation of the interface.
    The surgery took place at the Sahlgrenska University Hospital, Sweden, where CBPR is located. The neuromuscular reconstruction procedure was conducted by Dr. Paolo Sassu, who also led the first hand transplantation performed in Scandinavia.
    “The incredible journey we have undertaken together with the bionic engineers at CBPR has allowed us to combine new microsurgical techniques with sophisticated implanted electrodes that provide single-finger control of a prosthetic arm as well as sensory feedback. Patients who have suffered from an arm amputation might now see a brighter future,” says Dr. Sassu, who is presently working at the Istituto Ortopedico Rizzoli in Italy.
    The Science Translational Medicine article illustrates how the transferred nerves progressively connected to their new host muscles. Once the reinnervation process had advanced enough, the researchers connected them to the prosthesis so the patient could control every finger of a prosthetic hand as if it were his own (video: https://youtu.be/FdDdZQg58kc). The researchers also demonstrated how the system responds in activities of daily life (video: https://youtu.be/yC24WRoGIe8) and are currently working to further improve the controllability of the bionic hand.

  •

    Robot team on lunar exploration tour

    On the Moon, there are raw materials that humanity could one day mine and use. Various space agencies, such as the European Space Agency (ESA), are already planning missions to better explore Earth’s satellite and find minerals. This calls for appropriate exploration vehicles. Swiss researchers led by ETH Zurich are now pursuing the idea of sending not just one solitary rover on an exploration tour, but rather an entire team of vehicles and flying devices that complement each other.
    The researchers equipped three ANYmal — a type of legged robot developed at ETH — with a range of measuring and analysis instruments that would potentially make them suitable exploration devices in the future. They tested these robots on various terrains in Switzerland and at the European Space Resources Innovation Centre (ESRIC) in Luxembourg, where, a few months ago, the Swiss team won a European competition for lunar exploration robots together with colleagues from Germany. The competition involved finding and identifying minerals on a test site modelled after the surface of the Moon. In the latest issue of the journal Science Robotics, the scientists describe how they go about exploring an unknown terrain using a team of robots.
    Insurance against failure
    “Using multiple robots has two advantages,” explains Philip Arm, a doctoral student in the group led by ETH Professor Marco Hutter. “The individual robots can take on specialised tasks and perform them simultaneously. Moreover, thanks to its redundancy, a robot team is able to compensate for a teammate’s failure.” Redundancy in this case means that important measuring equipment is installed on several robots. At the same time, redundancy and specialisation are opposing goals. “Getting the benefits of both is a matter of finding the right balance,” Arm says.
    The researchers at ETH Zurich and the Universities of Basel, Bern and Zurich solved this problem by equipping two of the legged robots as specialists. One robot was programmed to be particularly good at mapping the terrain and classifying the geology. It used a laser scanner and several cameras — some of them capable of spectral analysis — to gather initial clues about the mineral composition of the rock. The other specialist robot was taught to precisely identify rocks using a Raman spectrometer and a microscopy camera.
    The third robot was a generalist: it was able to both map the terrain and identify rocks, which meant that it had a broader range of tasks than the specialists. However, its equipment meant that it could perform these tasks with less precision. “This makes it possible to complete the mission should any one of the robots malfunction,” Arm says.
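The failure tolerance described above can be sketched as a simple coverage check: the mission requires both capabilities, and it should survive the loss of any single robot. The robot names and capability labels below are illustrative, loosely following the article's mapping/identification split.

```python
# Illustrative capability map: two specialists plus one generalist.
robots = {
    "specialist_mapper":     {"map_terrain"},
    "specialist_identifier": {"identify_rocks"},
    "generalist":            {"map_terrain", "identify_rocks"},
}
mission = {"map_terrain", "identify_rocks"}

def mission_possible(active):
    """Mission succeeds if the active robots jointly cover every required task."""
    covered = set().union(*(robots[r] for r in active)) if active else set()
    return mission <= covered

# Redundancy: any single-robot failure still leaves the mission completable.
for lost in robots:
    survivors = set(robots) - {lost}
    print(f"lose {lost}: mission possible = {mission_possible(survivors)}")
```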
    Combination is key
    At the ESRIC and ESA Space Resources Challenge, the jury was particularly impressed that the researchers had built redundancy into their exploration system to make it resilient to potential failures. As a prize, the Swiss scientists and their colleagues from the FZI Research Center for Information Technology in Karlsruhe were awarded a one-year research contract to further develop this technology. In addition to legged robots, this work will also involve robots with wheels, building on the FZI researchers’ experience with such robots.
    “Legged robots like our ANYmal cope well in rocky and steep terrain, for example when it comes to climbing down into a crater,” explains Hendrik Kolvenbach, a senior scientist in Professor Hutter’s group. Robots with wheels are at a disadvantage in these kinds of conditions, but they can move faster on less challenging terrain. For a future mission, it would therefore make sense to combine robots that differ in terms of their mode of locomotion. Flying robots could also be added to the team.
    The researchers also plan to make the robots more autonomous. Presently, all data from the robots flows into a control centre, where an operator assigns tasks to the individual robots. In the future, semi-autonomous robots could directly assign certain tasks to each other, with control and intervention options for the operator.
    Video: https://youtu.be/bqwbQzVrzkQ

  •

    Generative AI ‘fools’ scientists with artificial data, bringing automated data analysis closer

    The same AI technology used to mimic human art can now synthesize artificial scientific data, advancing efforts toward fully automated data analysis.
    Researchers at the University of Illinois Urbana-Champaign have developed an AI that generates artificial data from microscopy experiments commonly used to characterize atomic-level material structures. Drawing from the technology underlying art generators, the AI allows the researchers to incorporate background noise and experimental imperfections into the generated data, enabling material features to be detected much faster and more efficiently than before.
    “Generative AIs take information and generate new things that haven’t existed before in the world, and now we’ve leveraged that for the goal of automated data analysis,” said Pinshane Huang, a U. of I. professor of materials science and engineering and a project co-lead. “What is used to make paintings of llamas in the style of Monet on the internet can now make scientific data so good it fools me and my colleagues.”
    Other forms of AI and machine learning are routinely used in materials science to assist with data analysis, but they require frequent, time-consuming human intervention. Making these analysis routines more efficient requires a large set of labeled data to show the program what to look for. Moreover, the data set needs to account for a wide range of background noise and experimental imperfections to be effective, effects that are difficult to model.
    Since collecting and labeling such a vast data set using a real microscope is infeasible, Huang worked with U. of I. physics professor Bryan Clark to develop a generative AI that could create a large set of artificial training data from a comparatively small set of real, labeled data. To achieve this, the researchers used a cycle generative adversarial network, or CycleGAN.
    “You can think of a CycleGAN as a competition between two entities,” Clark said. “There’s a ‘generator’ whose job is to imitate a provided data set, and there’s a ‘discriminator’ whose job is to spot the differences between the generator and the real data. They take turns trying to foil each other, improving themselves based on what the other was able to do. Ultimately, the generator can produce artificial data that is virtually indistinguishable from the real data.”
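Clark's description of the two entities taking turns can be sketched with a deliberately tiny GAN (not the CycleGAN used in the study, and with hand-coded gradients rather than a deep-learning framework). Here the "data" is one-dimensional, the generator is a linear map, and the discriminator is a logistic classifier; all hyperparameters are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
REAL_MEAN = 3.0  # the "real data" distribution the generator must imitate

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator g(z) = a*z + b; discriminator D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0          # generator starts far from REAL_MEAN
w, c = 1.0, 0.0
lr = 0.05

for _ in range(2000):
    real = rng.normal(REAL_MEAN, 1.0, 64)
    z = rng.normal(0.0, 1.0, 64)
    fake = a * z + b

    # Discriminator turn: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator turn: move fake samples toward where D currently says "real".
    d_fake = sigmoid(w * fake + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

print(f"generator mean moved from 0.0 toward {REAL_MEAN}: b = {b:.2f}")
```

After training, the generator's output distribution is centered near the real data's mean, the 1-D analogue of "artificial data that is virtually indistinguishable from the real data."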
    By providing the CycleGAN with a small sample of real microscopy images, the AI learned to generate images that were used to train the analysis routine. It is now capable of recognizing a wide range of structural features despite the background noise and systematic imperfections.
    “The remarkable part of this is that we never had to tell the AI what things like background noise and imperfections like aberration in the microscope are,” Clark said. “That means even if there’s something that we hadn’t thought about, the CycleGAN can learn it and run with it.”
    Huang’s research group has incorporated the CycleGAN into their experiments to detect defects in two-dimensional semiconductors, a class of materials that is promising for applications in electronics and optics but is difficult to characterize without the aid of AI. However, she observed that the method has a much broader reach.
    “The dream is to one day have a ‘self-driving’ microscope, and the biggest barrier was understanding how to process the data,” she said. “Our work fills in this gap. We show how you can teach a microscope how to find interesting things without having to know what you’re looking for.”

  •

    Physicists work to prevent information loss in quantum computing

    Nothing exists in a vacuum, but physicists often wish this weren’t the case. If the systems that scientists study could be completely isolated from the outside world, things would be a lot easier.
    Take quantum computing. It’s a field that’s already drawing billions of dollars in support from tech investors and industry heavyweights including IBM, Google and Microsoft. But if the tiniest vibrations creep in from the outside world, they can cause a quantum system to lose information.
    For instance, even light can cause information leaks if it has enough energy to jiggle the atoms within a quantum processor chip.
    “Everyone is really excited about building quantum computers to answer really hard and important questions,” said Joe Kitzman, a doctoral student at Michigan State University. “But vibrational excitations can really mess up a quantum processor.”
    But, with new research published in the journal Nature Communications, Kitzman and his colleagues are showing that these vibrations need not be a hindrance. In fact, they could benefit quantum technology.
    “If we can understand how the vibrations couple with our system, we can use that as a resource and a tool for creating and stabilizing some types of quantum states,” Kitzman said.

    What that means is that researchers can use these results to help mitigate information lost by quantum bits, or qubits (pronounced “q bits”).
    Conventional computers rely on a clear-cut binary logic. Bits encode information by taking on one of two distinct possible states, often denoted as zero or one. Qubits, however, are more flexible and can exist in states that are simultaneously both zero and one.
    Although that may sound like cheating, it’s well within the rules of quantum mechanics. Still, this feature should give quantum computers valuable advantages over conventional computers for certain problems in a variety of areas, including science, finance and cybersecurity.
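The "simultaneously zero and one" feature has a compact mathematical form: a qubit is a normalized two-component complex vector, and measuring it yields 0 or 1 with probabilities given by the squared amplitudes. A minimal state-vector sketch (standard textbook material, not code from the MSU experiment):

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
zero = np.array([1.0, 0.0], dtype=complex)                    # definite |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

psi = H @ zero               # equal superposition of |0> and |1>
probs = np.abs(psi) ** 2     # measurement probabilities from amplitudes
print("P(0), P(1) =", probs)

# Measurement collapses the superposition: sample outcomes from probs.
rng = np.random.default_rng(1)
outcomes = rng.choice([0, 1], size=1000, p=probs)
print("fraction of 1s over 1000 measurements:", outcomes.mean())
```

Environmental noise of the kind the article describes would perturb `psi` between preparation and measurement, which is exactly the information loss the researchers aim to understand and mitigate.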
    Beyond its implications for quantum technology, the MSU-led team’s report also helps set the stage for future experiments to better explore quantum systems in general.
    “Ideally, you want to separate your system from the environment, but the environment is always there,” said Johannes Pollanen, the Jerry Cowen Endowed Chair of Physics in the MSU Department of Physics and Astronomy. “It’s almost like junk you don’t want to deal with, but you can learn all kinds of cool stuff about the quantum world when you do.”
    Pollanen also leads the Laboratory for Hybrid Quantum Systems, of which Kitzman is a member, in the College of Natural Science. For the experiments led by Pollanen and Kitzman, the team built a system consisting of a superconducting qubit and what are known as surface acoustic wave resonators.

    These qubits are one of the most popular varieties among companies developing quantum computers. Mechanical resonators are used in many modern communications devices, including cellphones and garage door openers, and now, groups like Pollanen’s are putting them to work in emerging quantum technology.
    The team’s resonators allowed the researchers to tune the vibrations experienced by qubits and understand how the mechanical interaction between the two influenced the fidelity of quantum information.
    “We’re creating a paradigm system to understand how this information is scrambled,” said Pollanen. “We have control over the environment, in this case, the mechanical vibrations in the resonator, as well as the qubit.”
    “If you can understand how these environmental losses affect the system, you can use that to your advantage,” Kitzman said. “The first step in solving a problem is understanding it.”
    MSU is one of only a few places equipped and staffed to perform experiments on these coupled qubit-mechanical resonator devices, Pollanen said, and the researchers are excited to use their system for further exploration. The team also included scientists from the Massachusetts Institute of Technology and Washington University in St. Louis.

  •

    A foundation that fits just right gives superconducting nickelates a boost

    Researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University say they’ve found a way to make thin films of an exciting new nickel oxide superconductor that are free of extended defects.
    Not only does this improve the material’s ability to conduct electricity with no loss, they said, but it also allows them to discover its true nature and properties, both in and out of the superconducting state, for the first time.
    Their first look at a superconducting nickel oxide, or nickelate, that does not have defects revealed that it is more like the cuprates, which hold the world’s high-temperature record for unconventional superconductivity at normal pressures, than previously thought. For instance, when the nickelate is tweaked to optimize its superconductivity and then heated above its superconducting temperature, its resistance to the flow of electric current increases in a linear fashion, just as in cuprates.
    Those striking similarities, they said, may mean these two very different materials achieve superconductivity in much the same way.
    It’s the latest step in a 35-year quest to develop superconductors that can operate at close to room temperature, which would revolutionize electronics, transportation, power transmission and other technologies by allowing them to operate without energy-wasting electrical resistance.
    The research team, led by Harold Hwang, director of the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC, described their work today in the journal Nature.

    “Nickelate films are really unstable, and until now our efforts to stabilize them on top of other materials have produced defects that are like speed bumps for electrons,” said Kyuho Lee, a SIMES postdoctoral researcher who contributed to the discovery of superconductivity in nickelates four years ago and has been working on them ever since.
    “These quality issues have led to many debates and open questions about nickelate properties, with research groups reporting widely varying results,” Lee said. “So eliminating the defects is a significant breakthrough. It means we can finally address the underlying physics behind these materials and behind unconventional superconductivity in general.”
    Jenga chemistry and a just-right fit
    The defects, which are a bit like misaligned zipper teeth, arise from the same innovative process that allowed Hwang’s team to create and stabilize a nickelate film in the first place.
    They started by making a common material known as perovskite. They “doped” it to change its electrical conductivity, then exposed it to a chemical that deftly removed layers of oxygen atoms from its molecular structure, much like removing a stick from a tower of Jenga blocks. With the oxygen layers gone, the film settled into a new structure — known as an infinite-layer nickelate — that can host superconductivity.

    The atomic latticework of this new structure occupied a slightly bigger surface area than the original. With this in mind, they had built the film on a foundation, or substrate, that would be a good fit for the finished, spread-out product, Lee said.
    But it didn’t match the atomic lattice of the starting material, which developed defects as it tried to fit comfortably onto the substrate — and those imperfections carried through to the finished nickelate.
    Hwang said it’s as if two friends of different sizes had to share a coat. If the coat fit the smaller friend perfectly, the larger one would have a hard time zipping it up. If it fit the larger friend perfectly, it would hang like a tent on the smaller one and let the cold in. An in-between size might not be the best fit for either of them, but it’s close enough to keep them both warm and happy.
    That’s the solution Lee and his colleagues pursued.
    In a series of meticulous experiments, they used a substrate that was like the in-between coat. The atomic structure of its surface was a close enough fit for both the starting and ending materials that the finished nickelate came out defect-free. Lee said the team is already starting to see some interesting physics in the nickelate now that the system is much cleaner.
    “What this means,” Hwang said, “is that we are getting closer and closer to measuring the intrinsic properties of these materials. And by sharing the details of how to make defect-free nickelates, we hope to benefit the field as a whole.”
    Researchers from Cornell University contributed to this work, which was funded by the DOE Office of Science and the Gordon and Betty Moore Foundation’s Emergent Phenomena in Quantum Systems Initiative.

  •

    Supercomputer used to simulate winds that cause clear air turbulence

    A research group from Nagoya University has accurately simulated air turbulence occurring on clear days around Tokyo using Japan’s fastest supercomputer. They then compared their findings with flight data to create a more accurate predictive model. The research was reported in the journal Geophysical Research Letters.
    Although air turbulence is usually associated with bad weather, an airplane cabin can shake violently even on a sunny and cloudless day. Known as clear air turbulence (CAT), these turbulent air movements can occur in the absence of any visible clouds or other atmospheric disturbances. Although the exact mechanisms that cause CAT are not fully understood, it is believed to be primarily driven by wind shear and atmospheric instability.
    CAT poses a high risk to aviation safety. The sudden turbulence on an otherwise calm day can lead to passenger and crew member injuries, aircraft damage, and disruptions to flight operations. Pilots rely on reports from other aircraft, weather radar, and atmospheric models to anticipate and avoid areas of potential turbulence. However, since CAT shows no visible indicators, such as clouds or storms, it is particularly challenging to detect and forecast.
    As winds swirl and circulate creating sudden changes in airflow, eddies are created that can shake an aircraft. Therefore, to better understand CAT, scientists model it using large-eddy simulation (LES), a computational fluid dynamics technique used to simulate these turbulent flows. However, despite its importance to research on air turbulence, one of the greatest challenges of LES is the computational cost. Simulating the complex interactions involved in LES requires high levels of computing power.
    To simulate the process of turbulence generation in detail using high-resolution LES, the research group from Nagoya University turned to Fugaku, an exascale high-performance computing system currently ranked as the world’s second-fastest supercomputer.
    Using Fugaku’s immense computational power, Dr. Ryoichi Yoshimura of Nagoya University in collaboration with Dr. Junshi Ito and others at Tohoku University, performed an ultra-high-resolution simulation of the CAT above Tokyo’s Haneda airport in winter caused by low pressure and a nearby mountain range.
    They found that the wind speed disturbance was caused by the collapse of a Kelvin-Helmholtz instability wave, a type of instability that occurs at the interface between two layers of air moving at different velocities. The faster layer drags on the slower one, creating a wave-like pattern at the interface. As the atmospheric waves grew from the west and collapsed in the east, they broke up into many fine vortices, generating turbulence.
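Kelvin-Helmholtz instability of this kind is conventionally diagnosed with the gradient Richardson number, the ratio of stabilizing stratification to destabilizing wind shear; a shear layer with Ri below roughly 0.25 can break down into vortices. A minimal sketch with hypothetical wind and stability values (not the study's simulation data):

```python
G = 9.81  # gravitational acceleration, m/s^2

def richardson(d_u, d_z, theta, d_theta):
    """Gradient Richardson number Ri = N^2 / (dU/dz)^2, where the
    Brunt-Vaisala frequency satisfies N^2 = (g / theta) * (dtheta/dz)."""
    n_sq = (G / theta) * (d_theta / d_z)   # stabilizing stratification
    shear = d_u / d_z                      # destabilizing velocity shear
    return n_sq / shear**2

# Hypothetical jet-stream-like layer: wind speed changing by 20 m/s over 500 m,
# potential temperature ~300 K increasing by 1 K over the same depth.
ri = richardson(d_u=20.0, d_z=500.0, theta=300.0, d_theta=1.0)
print(f"Ri = {ri:.3f} -> {'KH-unstable' if ri < 0.25 else 'stable'}")
```

With weaker shear (say, 2 m/s over the same depth) the same layer comes out well above the 0.25 threshold and stays stable.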
    After making their computations, the group needed to confirm whether their simulated vortices were consistent with real-world data. “Around Tokyo, there is a lot of observational data available to validate our results,” said Yoshimura. “There are many airplanes flying over the airports, which results in many reports of turbulence and the intensity of shaking. Atmospheric observations by a balloon near Tokyo were also used. The shaking data recorded at that time was used to show that the calculations were valid.”
    “The results of this research should lead to a deeper understanding of the principle and mechanism of turbulence generation by high-resolution simulation and allow us to investigate the effects of turbulence on airplanes in more detail,” said Yoshimura. “Since significant turbulence has been shown to occur in the limited 3D region, routing without flying in the region is possible by adjusting flight levels if the presence of active turbulence is known in advance. LES would provide a smart way of flying by providing more accurate turbulence forecasts and real-time prediction.”

  •

    Pump powers soft robots, makes cocktails

    The hottest drink of the summer may be the SEAS-colada. Here’s what you need to make it: gin, pineapple juice, coconut milk and a dielectric elastomer actuator-based soft peristaltic pump. Unfortunately, the last component can only be found in the lab of Robert Wood, the Harry Lewis and Marlyn McGrath Professor of Engineering and Applied Sciences at the Harvard John A. Paulson School of Engineering and Applied Sciences.
    At least, for now.
    Wood and his team designed the pump to solve a major challenge in soft robotics — how to replace traditionally bulky and rigid power components with soft alternatives.
    Over the past several years, Wood’s Microrobotics Lab at SEAS has been developing soft analogues of traditionally rigid robotic components, including valves and sensors. In fluid-driven robotic systems, pumps control the pressure or flow of the liquid that powers the robot’s movement. Most pumps available today for soft robotics are either too large and rigid to fit onboard, not powerful enough for actuation or only work with specific fluids.
    Wood’s team developed a compact, soft pump with adjustable pressure and flow rate, versatile enough to pump a variety of fluids of varying viscosity, including gin, juice and coconut milk, and powerful enough to drive soft haptic devices and a soft robotic finger.
    The pump’s size, power and versatility open up a range of possibilities for soft robots in a variety of applications, including food handling, manufacturing, and biomedical therapeutics.

    The research was published recently in Science Robotics.
    Peristaltic pumps are widely used in industry. These simple machines use motors to compress a flexible tube, creating a pressure differential that forces liquid through the tube. These types of pumps are especially useful in biomedical applications because the fluid doesn’t touch any component of the pump itself.
    “Peristaltic pumps can deliver liquids with a wide range of viscosities, particle-liquid suspensions, or fluids such as blood, which are challenging for other types of pumps,” said first author Siyi Xu, a former graduate student at SEAS and current postdoctoral fellow in Wood’s lab.
    Building off previous research, Xu and the team designed electrically powered dielectric elastomer actuators (DEAs) to act as the pump’s motor and rollers. These soft actuators have ultra-high power density, are lightweight, and can run for hundreds of thousands of cycles.
    The team designed an array of DEAs that coordinate with each other, compressing a millimeter-sized channel in a programmed sequence to produce pressure waves.
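The programmed sequence is essentially a traveling compression wave: the actuators fire one after another along the channel, and fluid is pushed in the direction the wave travels. A schematic sketch (the actuator count and timing are invented for illustration, not the paper's parameters):

```python
# Schematic peristaltic sequence: N actuators along a channel fire in turn,
# so the point of compression travels down the channel each timestep.
N_ACTUATORS = 4

def compression_pattern(t):
    """Return which actuator is squeezing the channel at timestep t."""
    return t % N_ACTUATORS

def render(t):
    """ASCII view of the channel: 'X' = compressed segment, '-' = open."""
    return "".join("X" if i == compression_pattern(t) else "-"
                   for i in range(N_ACTUATORS))

for t in range(6):
    print(f"t={t}: {render(t)}")

# The 'X' marches from inlet to outlet and wraps around; fluid ahead of the
# moving compression is displaced toward the outlet, giving net directional
# flow. Reversing the firing order would reverse the flow direction.
```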

    The result is a centimeter-sized pump small enough to fit on board a small soft robot and powerful enough to actuate movement, with controllable pressure, flow rate, and flow direction.
    “We also demonstrated that we could actively tune the output from continuous flow to droplets by varying the input voltages and the outlet resistance, in our case the diameter of the blunt needle,” said Xu. “This capability may allow the pump to be useful not only for robotics but also for microfluidic applications.”
    “The majority of soft robots contain rigid components somewhere along their drivetrain,” said Wood. “This topic started as an effort to swap out one of those key pieces, the pump, with a soft alternative. But along the way we realized that compact soft pumps may have far greater utility, for example in biomedical settings for drug delivery or implantable therapeutic devices.”
    The research was co-authored by Cara M. Nunez and Mohammad Souri. It was supported by the National Science Foundation under grant CMMI-1830291.
    Video: https://youtu.be/knC9HJ6K-sU