More stories

  • New small laser device can help detect signs of life on other planets

    As space missions delve deeper into the outer solar system, the need for more compact, resource-conserving and accurate analytical tools has become increasingly critical — especially as the hunt for extraterrestrial life and habitable planets or moons continues.
    A University of Maryland-led team developed a new instrument specifically tailored to the needs of NASA space missions. Their mini laser-sourced analyzer is significantly smaller and more resource efficient than its predecessors — all without compromising the quality of its ability to analyze planetary material samples and potential biological activity onsite. The team’s paper on this new device was published in the journal Nature Astronomy on January 16, 2023.
    Weighing only about 17 pounds, the instrument is a physically scaled-down combination of two important tools for detecting signs of life and identifying compositions of materials: a pulsed ultraviolet laser that removes small amounts of material from a planetary sample and an Orbitrap™ analyzer that delivers high-resolution data about the chemistry of the examined materials.
    “The Orbitrap was originally built for commercial use,” explained Ricardo Arevalo, lead author of the paper and an associate professor of geology at UMD. “You can find them in the labs of pharmaceutical, medical and proteomic industries. The one in my own lab is just under 400 pounds, so they’re quite large, and it took us eight years to make a prototype that could be used efficiently in space — significantly smaller and less resource-intensive, but still capable of cutting-edge science.”
    The team’s new gadget shrinks down the original Orbitrap while pairing it with laser desorption mass spectrometry (LDMS) — techniques that have yet to be applied in an extraterrestrial planetary environment. The new device boasts the same benefits as its larger predecessors but is streamlined for space exploration and onsite planetary material analysis, according to Arevalo.
    Thanks to its diminutive mass and minimal power requirements, the mini Orbitrap LDMS instrument can be easily stowed away and maintained on space mission payloads. The instrument’s analyses of a planetary surface or substance are also far less intrusive and thus much less likely to contaminate or damage a sample than many current methods that attempt to identify unknown compounds.

    “The good thing about a laser source is that anything that can be ionized can be analyzed. If we shoot our laser beam at an ice sample, we should be able to characterize the composition of the ice and see biosignatures in it,” Arevalo said. “This tool has such a high mass resolution and accuracy that any molecular or chemical structures in a sample become much more identifiable.”
    The laser component of the mini LDMS Orbitrap also allows researchers access to larger, more complex compounds that are more likely to be associated with biology. Smaller organic compounds like amino acids, for example, are more ambiguous signatures of life forms.
    “Amino acids can be produced abiotically, meaning that they’re not necessarily proof of life. Meteorites, many of which are chock full of amino acids, can crash onto a planet’s surface and deliver abiotic organics to the surface,” Arevalo said. “We know now that larger and more complex molecules, like proteins, are more likely to have been created by or associated with living systems. The laser lets us study larger and more complex organics that can reflect higher fidelity biosignatures than smaller, simpler compounds.”
    For Arevalo and his team, the mini LDMS Orbitrap will offer much-needed insight and flexibility for future ventures into the outer solar system, such as missions focused on life detection objectives (e.g., Enceladus Orbilander) and exploration of the lunar surface (e.g., the NASA Artemis Program). They hope to send their device into space and deploy it on a planetary target of interest within the next few years.
    “I view this prototype as a pathfinder for other future LDMS and Orbitrap-based instruments,” Arevalo said. “Our mini Orbitrap LDMS instrument has the potential to significantly enhance the way we currently study the geochemistry or astrobiology of a planetary surface.”
    Other UMD-affiliated researchers on the team include geology graduate students Lori Willhite and Ziqin “Grace” Ni, geology postdoctoral associates Anais Bardyn and Soumya Ray, and astronomy visiting associate research engineer Adrian Southard.
    This study was supported by NASA (Award Nos. 80NSSC19K0610, 80NSSC19K0768, 80GSFC21M0002), NASA Goddard Space Flight Center Internal Research and Development (IRAD), and the University of Maryland Faculty Incentive Program.

  • Blocking radio waves and electromagnetic interference with the flip of a switch

    Researchers in Drexel University’s College of Engineering have developed a thin film device, fabricated by spray coating, that can block electromagnetic radiation with the flip of a switch. The breakthrough, enabled by versatile two-dimensional materials called MXenes, could adjust the performance of electronic devices, strengthen wireless connections and secure mobile communications against intrusion.
    The team, led by Yury Gogotsi, PhD, Distinguished University and Bach Professor in Drexel’s College of Engineering, previously demonstrated that the two-dimensional layered MXene materials, discovered just over a decade ago, when combined with an electrolyte solution, can be turned into a potent active shield against electromagnetic waves. This latest MXene discovery, reported in Nature Nanotechnology, shows how this shielding can be tuned when a small voltage — less than that produced by an alkaline battery — is applied.
    “Dynamic control of electromagnetic wave jamming has been a significant technological challenge for protecting electronic devices working at gigahertz frequencies and a variety of other communications technologies,” Gogotsi said. “As the number of wireless devices being used in industrial and private sectors has increased by orders of magnitude over the past decade, the urgency of this challenge has grown accordingly. This is why our discovery — which would dynamically mitigate the effect of electromagnetic interference on these devices — could have a broad impact.”
    MXene is a unique material in that it is highly conductive — making it perfectly suited for reflecting microwave radiation that could cause static, feedback or diminish the performance of communications devices — but its internal chemical structure can also be temporarily altered to allow these electromagnetic waves to pass through.
    This means that a thin coating on a device or its electrical components prevents them both from emitting electromagnetic waves and from being penetrated by those emitted by other electronics. Eliminating the possibility of interference from both internal and external sources can ensure the performance of the device, but some waves must be allowed to exit and enter when it is being used for communication.
    “Without being able to control the ebb and flow of electromagnetic waves within and around a device, it’s a bit like a leaky faucet — you’re not really turning off the water and that constant dripping is no good,” Gogotsi said. “Our shielding ensures the plumbing is tight — so-to-speak — no electromagnetic radiation is leaking out or getting in until we want to use the device.”
    The key to eliciting bidirectional tunability of MXene’s shielding property is using the flow and expulsion of ions to alternately expand and compress the space between the material’s layers, like an accordion, as well as to change the surface chemistry of MXenes.

    With a small voltage applied to the film, ions enter — or intercalate — between the MXene layers, altering the charge of their surfaces and inducing electrostatic attraction, which serves to change the layer spacing, conductivity and shielding efficiency of the material. When the ions are deintercalated, as the current is switched off, the MXene layers return to their original state.
    The team tested 10 different MXene-electrolyte combinations, applying each via paint sprayer in a layer about 30 to 100 times thinner than a human hair. The materials consistently demonstrated the dynamic tunability of shielding efficiency in blocking microwave radiation, which is impossible for traditional metals like copper and steel. And the device sustained the performance through more than 500 charge-discharge cycles.
    “These results indicate that the MXene films can convert from electromagnetic interference shielding to quasi-electromagnetic wave transmission by electrochemical oxidation of MXenes,” Gogotsi and his co-authors wrote. “The MXene film can potentially serve as a dynamic EMI shielding switch.”
    For security applications, Gogotsi suggests that the MXene shielding could hide devices from detection by radar or other tracing systems. The team also tested the potential of a one-way shielding switch. This would allow a device to remain undetectable and protected from unauthorized access until it is deployed for use.
    “A one-way switch could open the protection and allow a signal to be sent or communication to be opened in an emergency or at the required moment,” Gogotsi said. “This means it could protect communications equipment from being influenced or tampered with until it is in use. For example, it could encase the device during transportation or storage and then activate only when it is ready to be used.”
    The next step for Gogotsi’s team is to explore additional MXene-electrolyte combinations and mechanisms to fine-tune the shielding to achieve a stronger modulation of electromagnetic wave transmission and dynamic adjustment to block radiation at a variety of bandwidths.

  • COVID calculations spur solution to old problem in computer science

    During the coronavirus pandemic, many of us became amateur mathematicians. How quickly would the number of hospitalized patients rise, and when would herd immunity be achieved? Professional mathematicians were challenged as well, and a researcher at the University of Copenhagen was inspired to solve a 30-year-old problem in computer science. The breakthrough has just been published in the Journal of the ACM (Association for Computing Machinery).
    “Like many others, I was out to calculate how the epidemic would develop. I wanted to investigate certain ideas from theoretical computer science in this context. However, I realized that the lack of a solution to the old problem was a showstopper,” says Joachim Kock, Associate Professor at the Department of Mathematics, University of Copenhagen.
    His solution to the problem can be of use in epidemiology and computer science, and potentially in other fields as well. A common feature for these fields is the presence of systems where the various components exhibit mutual influence. For instance, when a healthy person meets a person infected with COVID, the result can be two people infected.
    Smart method invented by German teenager
    To understand the breakthrough, one needs to know that such complex systems can be described mathematically through so-called Petri nets. The method was invented in 1939 by the German Carl Adam Petri (then only 13 years old) for chemistry applications. Just like a healthy person meeting a person infected with COVID can trigger a change, the same may happen when two chemical substances mix and react.
    In a Petri net the various components are drawn as circles while events such as a chemical reaction or an infection are drawn as squares. Next, circles and squares are connected by arrows which show the interdependencies in the system.

    A simple version of a Petri net for COVID infection. The starting point is a non-infected person. “S” denotes “susceptible.” Contact with an infected person (“I”) is an event which leads to two persons being infected. Later another event will happen, removing a person from the group of infected. Here, “R” denotes “recovered” which in this context could be either cured or dead. Either outcome would remove the person from the infected group.
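    The net in the figure can be replayed as a small "token game" in a few lines of Python. This is a minimal sketch with illustrative names and counts (not the paper's formalism): tokens sit on the places S, I and R, and a transition fires by consuming its input tokens and producing its outputs, exactly the two events described above (S + I → 2I, then I → R).

```python
from collections import Counter

# Each transition maps input-place token requirements to output tokens.
# Illustrative sketch of the SIR net in the figure above.
TRANSITIONS = {
    "infection": ({"S": 1, "I": 1}, {"I": 2}),  # S + I -> 2I
    "recovery":  ({"I": 1},         {"R": 1}),  # I -> R
}

def enabled(marking, inputs):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking[place] >= n for place, n in inputs.items())

def fire(marking, name):
    """Consume the input tokens and produce the outputs (one discrete step)."""
    inputs, outputs = TRANSITIONS[name]
    if not enabled(marking, inputs):
        raise ValueError(f"{name} is not enabled")
    m = Counter(marking)
    m.subtract(inputs)
    m.update(outputs)
    return m

# Start with 3 susceptible individuals and 1 infected.
m = Counter({"S": 3, "I": 1, "R": 0})
m = fire(m, "infection")   # one contact: now 2 susceptible, 2 infected
m = fire(m, "recovery")    # one recovery: 2 susceptible, 1 infected, 1 recovered
print(dict(m))
```

    The discrete, individual-by-individual stepping is the computer-science style of using Petri nets discussed below, as opposed to the chemistry-style evolution of concentrations.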
    Computer scientists regarded the problem as unsolvable
    In chemistry, Petri nets are applied to calculate how the concentrations of various chemical substances in a mixture will evolve. This manner of thinking has influenced the use of Petri nets in other fields such as epidemiology: one starts out with a high “concentration” of uninfected people, after which the “concentration” of infected people starts to rise. In computer science, the use of Petri nets is somewhat different: the focus is on individuals rather than concentrations, and the development happens in discrete steps rather than continuously.
    What Joachim Kock had in mind was to apply the more individual-oriented Petri nets from computer science for COVID calculations. This was when he encountered the old problem:
    “Basically, the processes in a Petri net can be described through two separate approaches. The first approach regards a process as a series of events, while the second approach sees the net as a graphical expression of the interdependencies between components and events,” says Joachim Kock, adding:
    “The serial approach is well suited for performing calculations. However, it has a downside since it describes causalities less accurately than the graphical approach. Further, the serial approach tends to fall short when dealing with events that take place simultaneously.”

    “The problem was that nobody had been able to unify the two approaches. The computer scientists had more or less resigned, regarding the problem as unsolvable. This was because no-one had realized that you need to go all the way back and revise the very definition of a Petri net,” says Joachim Kock.
    Small modification with large impact
    The Danish mathematician realized that a minor modification to the definition of a Petri net would enable a solution to the problem:
    “By allowing parallel arrows rather than just counting them and writing a number, additional information is made available. Things work out and the two approaches can be unified.”
    The exact mathematical reason why this additional information matters is complex, but can be illustrated by an analogy:
    “Assigning numbers to objects has helped humanity greatly. For instance, it is highly practical that I can arrange the right number of chairs in advance for a dinner party instead of having to experiment with different combinations of chairs and guests after they have arrived. However, the number of chairs and guests does not reveal who will be sitting where. Some information is lost when we consider numbers instead of the real objects.”
    Similarly, information is lost when the individual arrows of the Petri net are replaced by a number.
    “It takes a bit more effort to treat the parallel arrows individually, but one is amply rewarded as it becomes possible to combine the two approaches so that the advantages of both can be obtained simultaneously.”
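    The chairs-and-guests point can be made concrete in code. In this illustrative sketch (not the paper's mathematical formalism), the standard definition records two parallel arcs between a place and a transition as the number 2, while the revised definition keeps each arrow as its own object; collapsing the individual arrows recovers the count, but the count alone cannot recover the individual arrows.

```python
from collections import Counter

# Counted form (standard definition): "there are 2 arcs from place S
# to transition t". Names S, t, a1, a2 are illustrative.
counted = {("S", "t"): 2}

# Individual form (revised definition): each arc is a distinct object
# with its own identity, so arrows can be told apart and tracked
# through a process.
individual = [
    {"id": "a1", "src": "S", "tgt": "t"},
    {"id": "a2", "src": "S", "tgt": "t"},
]

# Collapsing the individual arrows recovers the counted form...
collapsed = Counter((a["src"], a["tgt"]) for a in individual)
assert collapsed == Counter(counted)

# ...but not the other way round: from the number 2 alone there is no
# way to tell arrow a1 from arrow a2 — the "which guest sits on which
# chair" information is gone.
print(collapsed)
```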
    The circle back to COVID has been closed
    The solution helps our mathematical understanding of how to describe complex systems with many interdependencies, but will not have much practical effect on the daily work of computer scientists using Petri nets, according to Joachim Kock:
    “This is because the necessary modifications are mostly backwards compatible and can be applied without revising the entire theory of Petri nets.”
    “Somewhat surprisingly, some epidemiologists have started using the revised Petri nets. So, one might say the circle has been closed!”
    Joachim Kock does see a further point to the story:
    “I wasn’t out to find a solution to the old problem in computer science at all. I just wanted to do COVID calculations. This was a bit like looking for your pen but realizing that you must find your glasses first. So, I would like to take the opportunity to advocate the importance of research which does not have a predefined goal. Sometimes research driven by curiosity will lead to breakthroughs.”

  • Clinical trial results indicate low rate of adverse events associated with implanted brain computer interface

    For people with paralysis caused by neurologic injury or disease — such as ALS (also known as Lou Gehrig’s disease), stroke, or spinal cord injury — brain-computer interfaces (BCIs) have the potential to restore communication, mobility, and independence by transmitting information directly from the brain to a computer or other assistive technology.
    Implanted brain sensors, the core component of many brain-computer interfaces, have been used in neuroscientific studies with animals for decades and have been approved for short-term use.

  • AI discovers new nanostructures

    Scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have successfully demonstrated that autonomous methods can discover new materials. The artificial intelligence (AI)-driven technique led to the discovery of three new nanostructures, including a first-of-its-kind nanoscale “ladder.” The research was published today in Science Advances.
    The newly discovered structures were formed by a process called self-assembly, in which a material’s molecules organize themselves into unique patterns. Scientists at Brookhaven’s Center for Functional Nanomaterials (CFN) are experts at directing the self-assembly process, creating templates for materials to form desirable arrangements for applications in microelectronics, catalysis, and more. Their discovery of the nanoscale ladder and other new structures further widens the scope of self-assembly’s applications.
    “Self-assembly can be used as a technique for nanopatterning, which is a driver for advances in microelectronics and computer hardware,” said CFN scientist and co-author Gregory Doerk. “These technologies are always pushing for higher resolution using smaller nanopatterns. You can get really small and tightly controlled features from self-assembling materials, but they do not necessarily obey the kind of rules that we lay out for circuits, for example. By directing self-assembly using a template, we can form patterns that are more useful.”
    Staff scientists at CFN, which is a DOE Office of Science User Facility, aim to build a library of self-assembled nanopattern types to broaden their applications. In previous studies, they demonstrated that new types of patterns are made possible by blending two self-assembling materials together.
    “The fact that we can now create a ladder structure, which no one has ever dreamed of before, is amazing,” said CFN group leader and co-author Kevin Yager. “Traditional self-assembly can only form relatively simple structures like cylinders, sheets, and spheres. But by blending two materials together and using just the right chemical grating, we’ve found that entirely new structures are possible.”
    Blending self-assembling materials together has enabled CFN scientists to uncover unique structures, but it has also created new challenges. With many more parameters to control in the self-assembly process, finding the right combination of parameters to create new and useful structures is a battle against time. To accelerate their research, CFN scientists leveraged a new AI capability: autonomous experimentation.

    In collaboration with the Center for Advanced Mathematics for Energy Research Applications (CAMERA) at DOE’s Lawrence Berkeley National Laboratory, Brookhaven scientists at CFN and the National Synchrotron Light Source II (NSLS-II), another DOE Office of Science User Facility at Brookhaven Lab, have been developing an AI framework that can autonomously define and perform all the steps of an experiment. CAMERA’s gpCAM algorithm drives the framework’s autonomous decision-making. The latest research is the team’s first successful demonstration of the algorithm’s ability to discover new materials.
    “gpCAM is a flexible algorithm and software for autonomous experimentation,” said Berkeley Lab scientist and co-author Marcus Noack. “It was used particularly ingeniously in this study to autonomously explore different features of the model.”
    “With help from our colleagues at Berkeley Lab, we had this software and methodology ready to go, and now we’ve successfully used it to discover new materials,” Yager said. “We’ve now learned enough about autonomous science that we can take a materials problem and convert it into an autonomous problem pretty easily.”
    To accelerate materials discovery using their new algorithm, the team first developed a complex sample with a spectrum of properties for analysis. Researchers fabricated the sample using the CFN nanofabrication facility and carried out the self-assembly in the CFN material synthesis facility.
    “An old school way of doing material science is to synthesize a sample, measure it, learn from it, and then go back and make a different sample and keep iterating that process,” Yager said. “Instead, we made a sample that has a gradient of every parameter we’re interested in. That single sample is thus a vast collection of many distinct material structures.”
    Then, the team brought the sample to NSLS-II, which generates ultrabright x-rays for studying the structure of materials. CFN operates three experimental stations in partnership with NSLS-II, one of which was used in this study, the Soft Matter Interfaces (SMI) beamline.

    “One of the SMI beamline’s strengths is its ability to focus the x-ray beam on the sample down to microns,” said NSLS-II scientist and co-author Masa Fukuto. “By analyzing how these microbeam x-rays get scattered by the material, we learn about the material’s local structure at the illuminated spot. Measurements at many different spots can then reveal how the local structure varies across the gradient sample. In this work, we let the AI algorithm pick, on the fly, which spot to measure next to maximize the value of each measurement.”
    As the sample was measured at the SMI beamline, the algorithm, without human intervention, created a model of the material’s numerous and diverse structures. The model updated itself with each subsequent x-ray measurement, making every measurement more insightful and accurate.
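    The measure-update-choose loop can be sketched as follows. This is a hedged illustration only: it uses a plain Gaussian-process surrogate over a made-up 1-D gradient and picks the next spot where the model is most uncertain, whereas the actual experiment used CAMERA's gpCAM software, whose API and acquisition logic differ.

```python
import numpy as np

def rbf_kernel(a, b, length=0.1):
    """Squared-exponential covariance between 1-D position arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def posterior_variance(x_meas, x_grid, noise=1e-6):
    """GP posterior variance at candidate spots, given measured spots."""
    K = rbf_kernel(x_meas, x_meas) + noise * np.eye(len(x_meas))
    Ks = rbf_kernel(x_meas, x_grid)
    Kss = rbf_kernel(x_grid, x_grid)
    var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks))
    return np.clip(var, 0.0, None)

def measure(x):
    # Stand-in for the real measurement (x-ray scattering at position x).
    return np.sin(12 * x) + 0.5 * np.sin(31 * x)

grid = np.linspace(0.0, 1.0, 200)   # candidate spots along the gradient sample
measured_x = [0.5]                   # start somewhere on the sample
measured_y = [measure(0.5)]

for _ in range(15):                  # 15 autonomous measurements
    var = posterior_variance(np.array(measured_x), grid)
    nxt = grid[int(np.argmax(var))]  # most informative next spot
    measured_x.append(nxt)
    measured_y.append(measure(nxt))

print(f"visited {len(measured_x)} spots")
```

    Because the surrogate's uncertainty collapses near every measured spot, the loop automatically spreads its measurements over the sample, concentrating effort where the model still knows least.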
    In a matter of hours, the algorithm had identified three key areas in the complex sample for the CFN researchers to study more closely. They used the CFN electron microscopy facility to image those key areas in exquisite detail, uncovering the rails and rungs of a nanoscale ladder, among other novel features.
    From start to finish, the experiment ran about six hours. The researchers estimate they would have needed about a month to make this discovery using traditional methods.
    “Autonomous methods can tremendously accelerate discovery,” Yager said. “It’s essentially ‘tightening’ the usual discovery loop of science, so that we cycle between hypotheses and measurements more quickly. Beyond just speed, however, autonomous methods increase the scope of what we can study, meaning we can tackle more challenging science problems.”
    “Moving forward, we want to investigate the complex interplay among multiple parameters. We conducted simulations using the CFN computer cluster that verified our experimental results, but they also suggested how other parameters, such as film thickness, can also play an important role,” Doerk said.
    The team is actively applying their autonomous research method to even more challenging material discovery problems in self-assembly, as well as other classes of materials. Autonomous discovery methods are adaptable and can be applied to nearly any research problem.
    “We are now deploying these methods to the broad community of users who come to CFN and NSLS-II to conduct experiments,” Yager said. “Anyone can work with us to accelerate the exploration of their materials research. We foresee this empowering a host of new discoveries in the coming years, including in national priority areas like clean energy and microelectronics.”
    This research was supported by the DOE Office of Science.

  • AI improves detail, estimate of urban air pollution

    Using artificial intelligence, Cornell University engineers have simplified and reinforced models that accurately calculate the fine particulate matter (PM2.5) — the soot, dust and exhaust emitted by trucks and cars that get into human lungs — contained in urban air pollution.
    Now, city planners and government health officials can obtain a more precise accounting of the well-being of urban dwellers and the air they breathe, from new research published in December 2022 in the journal Transportation Research Part D.
    “Infrastructure determines our living environment, our exposure,” said senior author Oliver Gao, the Howard Simpson Professor of Civil and Environmental Engineering in the College of Engineering at Cornell University. “Air pollution impact due to transportation — put out as exhaust from the cars and trucks that drive on our streets — is very complicated. Our infrastructure, transportation and energy policies are going to impact air pollution and hence public health.”
    Previous methods to gauge air pollution were cumbersome and reliant on extraordinary amounts of data points. “Older models to calculate particulate matter were computationally and mechanically consuming and complex,” said Gao, a faculty fellow at the Cornell Atkinson Center for Sustainability. “But if you develop an easily accessible data model, with the help of artificial intelligence filling in some of the blanks, you can have an accurate model at a local scale.”
    Lead author Salil Desai and visiting scientist Mohammad Tayarani, together with Gao, published “Developing Machine Learning Models for Hyperlocal Traffic Related Particulate Matter Concentration Mapping,” to offer a leaner, less data-intensive method for making accurate models.
    Ambient air pollution is a leading cause of premature death around the world. Globally, more than 4.2 million annual fatalities — in the form of cardiovascular disease, ischemic heart disease, stroke and lung cancer — were attributed to air pollution in 2015, according to a Lancet study cited in the Cornell research.
    In this work, the group developed four machine learning models of traffic-related particulate matter concentrations using data gathered in New York City’s five boroughs, which have a combined population of 8.2 million people and 55 million daily vehicle miles traveled.
    The equations use just a few inputs, such as traffic data, topology and meteorology, in an AI algorithm to learn simulations for a wide range of traffic-related air-pollution concentration scenarios.
    Their best performing model was the Convolutional Long Short-term Memory, or ConvLSTM, which trained the algorithm to predict many spatially correlated observations.
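    A ConvLSTM differs from an ordinary LSTM in that its gates are computed by convolutions, so the hidden state stays a spatial grid (such as a pollution map) rather than a flat vector, which is what lets it predict spatially correlated observations. The single-channel NumPy sketch below is a simplified illustration with random, untrained weights; the paper's model is a trained, multi-channel deep-learning implementation.

```python
import numpy as np

def conv2d_same(x, k):
    """Naive 2-D sliding-window filter with zero padding ('same' size),
    as used in deep-learning 'convolution' layers."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.empty_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def convlstm_step(x, h, c, kernels):
    """One ConvLSTM step: the usual LSTM gates (input i, forget f,
    output o, candidate g), but computed by convolving the input frame
    x and the spatial hidden state h."""
    gates = {}
    for name in ("i", "f", "o", "g"):
        kx, kh = kernels[name]
        gates[name] = conv2d_same(x, kx) + conv2d_same(h, kh)
    i, f, o = sigmoid(gates["i"]), sigmoid(gates["f"]), sigmoid(gates["o"])
    g = np.tanh(gates["g"])
    c_new = f * c + i * g          # per-grid-cell memory update
    h_new = o * np.tanh(c_new)     # new spatial hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
H = W = 8                           # an 8x8 spatial grid (illustrative)
kernels = {n: (rng.normal(size=(3, 3)) * 0.1,
               rng.normal(size=(3, 3)) * 0.1) for n in "ifog"}
h = c = np.zeros((H, W))
for t in range(5):                  # five input frames (e.g. hourly traffic)
    x = rng.normal(size=(H, W))     # stand-in traffic/weather input frame
    h, c = convlstm_step(x, h, c, kernels)
print(h.shape)
```

    Because every gate is a convolution, each grid cell's prediction depends on its spatial neighbors in the previous frames, which is the property that makes the architecture suited to mapping pollution over city streets.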
    “Our data-driven approach — mainly based on vehicle emission data — requires considerably fewer modeling steps,” Desai said. Instead of focusing on stationary locations, the method provides a high-resolution estimation of the city street pollution surface. Higher resolution can help transportation and epidemiology studies assess health, environmental justice and air quality impacts.
    Funding for this research came from the U.S. Department of Transportation’s University Transportation Centers Program and Cornell Atkinson.

  • A precision arm for miniature robots

    We are all familiar with robots equipped with moving arms. They stand in factory halls, perform mechanical work and can be programmed. A single robot can be used to carry out a variety of tasks.
    Until now, miniature systems that transport minuscule amounts of liquid through fine capillaries have had little association with such robots. Developed by researchers as an aid for laboratory analysis, such systems are known as microfluidics or lab-on-a-chip and generally make use of external pumps to move the liquid through the chips. To date, such systems have been difficult to automate, and the chips have had to be custom-designed and manufactured for each specific application.
    Ultrasound needle oscillations
    Scientists led by ETH Professor Daniel Ahmed are now combining conventional robotics and microfluidics. They have developed a device that uses ultrasound and can be attached to a robotic arm. It is suitable for performing a wide range of tasks in microrobotic and microfluidic applications and can also be used to automate such applications. The scientists have reported on this development in Nature Communications.
    The device comprises a thin, pointed glass needle and a piezoelectric transducer that causes the needle to oscillate. Similar transducers are used in loudspeakers, ultrasound imaging and professional dental cleaning equipment. The ETH researchers can vary the oscillation frequency of their glass needle. By dipping the needle into a liquid they create a three-dimensional pattern composed of multiple vortices. Since this pattern depends on the oscillation frequency, it can be controlled accordingly.
    The researchers were able to use this to demonstrate several applications. First, they were able to mix tiny droplets of highly viscous liquids. “The more viscous liquids are, the more difficult it is to mix them,” Professor Ahmed explains. “However, our method succeeds in doing this because it allows us to not only create a single vortex, but to also efficiently mix the liquids using a complex three-dimensional pattern composed of multiple strong vortices.”
    Second, the scientists were able to pump fluids through a mini-channel system by creating a specific pattern of vortices and placing the oscillating glass needle close to the channel wall.
    Third, they succeeded in using their robot-assisted acoustic device to trap fine particles present in the fluid. This works because a particle’s size determines its reaction to the sound waves. Relatively large particles move towards the oscillating glass needle, where they accumulate. The researchers demonstrated how this method can capture not only inanimate particles but also fish embryos. They believe it should also be capable of capturing biological cells in the fluid. “In the past, manipulating microscopic particles in three dimensions was always challenging. Our microrobotic arm makes it easy,” Ahmed says.
    “Until now, advancements in large, conventional robotics and microfluidic applications have been made separately,” Ahmed says. “Our work helps to bring the two approaches together.” As a result, future microfluidic systems could be designed similarly to today’s robotic systems. An appropriately programmed single device would be able to handle a variety of tasks. “Mixing and pumping liquids and trapping particles — we can do it all with one device,” Ahmed says. This means tomorrow’s microfluidic chips will no longer have to be custom-developed for each specific application. The researchers would next like to combine several glass needles to create even more complex vortex patterns in liquids.
    In addition to laboratory analysis, Ahmed can envisage other applications for microrobotic arms, such as sorting tiny objects. The arms could conceivably also be used in biotechnology as a way of introducing DNA into individual cells. It should ultimately be possible to employ them in additive manufacturing and 3D printing.

  • Feathered robotic wing paves way for flapping drones

    Birds fly more efficiently by folding their wings during the upstroke, according to a recent study led by Lund University in Sweden. The results could mean that wing-folding is the next step in increasing the propulsive and aerodynamic efficiency of flapping drones.
    Even the precursors to birds — extinct bird-like dinosaurs — benefited from folding their wings during the upstroke, as they developed active flight. Among flying animals alive today, birds are the largest and most efficient. This makes them particularly interesting as inspiration for the development of drones. However, determining which flapping strategy is best requires aerodynamic studies of various ways of flapping the wings. Therefore, a Swedish-Swiss research team has constructed a robotic wing that can achieve just that — flapping like a bird, and beyond.
    “We have built a robot wing that can flap more like a bird than previous robots, but also flap in ways that birds cannot. By measuring the performance of the wing in our wind tunnel, we have studied how different ways of achieving the wing upstroke affect force and energy in flight,” says Christoffer Johansson, biology researcher at Lund University.
    Previous studies have shown that birds flap their wings more horizontally when flying slowly. The new study shows that the birds probably do so, even though it requires more energy, because it makes it easier to create sufficiently large forces to stay aloft and propel themselves. This is something drones can emulate to increase the range of speeds at which they can fly.
    “The new robotic wing can be used to answer questions about bird flight that would be impossible simply by observing flying birds. Research into the flight ability of living birds is limited to the flapping movement that the bird actually uses,” explains Christoffer Johansson.
    The research explains why birds flap the way they do, by finding out which movement patterns create the most force and are the most efficient. The results can also be used in other research areas, such as better understanding how the migration of birds is affected by climate change and access to food. There are also many potential uses for drones where these insights can be put to good use. One area might be using drones to deliver goods.
    “Flapping drones could be used for deliveries, but they would need to be efficient enough and able to lift the extra weight this entails. How the wings move is of great importance for performance, so this is where our research could come in handy,” concludes Christoffer Johansson.