More stories

  • Liquid metal sensors and AI could help prosthetic hands to 'feel'

    Each fingertip has more than 3,000 touch receptors, which largely respond to pressure. Humans rely heavily on sensation in their fingertips when manipulating an object, so the lack of this sensation presents a unique challenge for individuals with upper limb amputations. While several high-tech, dexterous prosthetics are available today, they all lack the sensation of “touch.” The absence of this sensory feedback results in objects inadvertently being dropped or crushed by a prosthetic hand.
    To enable a more natural-feeling prosthetic hand interface, researchers from Florida Atlantic University’s College of Engineering and Computer Science and collaborators are the first to incorporate stretchable tactile sensors using liquid metal on the fingertips of a prosthetic hand. Encapsulated within silicone-based elastomers, this technology provides key advantages over traditional sensors, including high conductivity, compliance, flexibility and stretchability. This hierarchical multi-finger tactile sensation integration could provide a higher level of intelligence for artificial hands.
    For the study, published in the journal Sensors, researchers used individual fingertips on the prosthesis to distinguish between different speeds of a sliding motion along differently textured surfaces. The four textures had one variable parameter: the distance between the ridges. To detect the textures and speeds, researchers trained four machine learning algorithms. For each of the ten surfaces, 20 trials were collected to test the ability of the machine learning algorithms to distinguish between the ten complex surfaces, each composed of a randomly generated permutation of the four textures.
    Results showed that the integration of tactile information from liquid metal sensors on four prosthetic hand fingertips simultaneously distinguished between complex, multi-textured surfaces — demonstrating a new form of hierarchical intelligence. The machine learning algorithms were able to distinguish between all the speeds with each finger with high accuracy. This new technology could improve the control of prosthetic hands and provide haptic feedback, more commonly known as the experience of touch, for amputees to reconnect a previously severed sense of touch.
    “Significant research has been done on tactile sensors for artificial hands, but there is still a need for advances in lightweight, low-cost, robust multimodal tactile sensors,” said Erik Engeberg, Ph.D., senior author, an associate professor in the Department of Ocean and Mechanical Engineering and a member of the FAU Stiles-Nicholson Brain Institute and the FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE), who conducted the study with first author and Ph.D. student Moaed A. Abd. “The tactile information from all the individual fingertips in our study provided the foundation for a higher hand-level of perception enabling the distinction between ten complex, multi-textured surfaces that would not have been possible using purely local information from an individual fingertip. We believe that these tactile details could be useful in the future to afford a more realistic experience for prosthetic hand users through an advanced haptic display, which could enrich the amputee-prosthesis interface and prevent amputees from abandoning their prosthetic hand.”
    Researchers compared four different machine learning algorithms for their classification performance: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). The time-frequency features of the liquid metal sensors were extracted to train and test the machine learning algorithms. The NN generally performed best at speed and texture detection with a single finger, and achieved 99.2 percent accuracy in distinguishing between ten different multi-textured surfaces when using the four liquid metal sensors from four fingers simultaneously.
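    A comparison of this kind can be illustrated with a short sketch. This is not the authors' code: the feature files, hyperparameters and 5-fold cross-validation below are assumptions, and only the four classifier families named in the study come from the source.

    ```python
    # Minimal sketch: comparing the four classifier families named in the study
    # (KNN, SVM, random forest, neural network) on pre-extracted time-frequency
    # features. File names and hyperparameters are illustrative assumptions,
    # not the authors' implementation.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neural_network import MLPClassifier

    # X: one row of time-frequency features per sliding trial (four fingertip
    # sensors concatenated); y: integer label of the multi-textured surface (0-9).
    X = np.load("liquid_metal_features.npy")   # shape: (n_trials, n_features)
    y = np.load("surface_labels.npy")          # shape: (n_trials,)

    models = {
        "KNN": KNeighborsClassifier(n_neighbors=5),
        "SVM": SVC(kernel="rbf", C=10.0),
        "RF": RandomForestClassifier(n_estimators=200, random_state=0),
        "NN": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0),
    }

    for name, model in models.items():
        pipeline = make_pipeline(StandardScaler(), model)   # scale features, then classify
        scores = cross_val_score(pipeline, X, y, cv=5)      # 5-fold cross-validation
        print(f"{name}: mean accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
    ```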
    “The loss of an upper limb can be a daunting challenge for an individual who is trying to seamlessly engage in regular activities,” said Stella Batalama, Ph.D., dean, College of Engineering and Computer Science. “Although advances in prosthetic limbs have been beneficial and allow amputees to better perform their daily duties, they do not provide them with sensory information such as touch. They also don’t enable them to control the prosthetic limb naturally with their minds. With this latest technology from our research team, we are one step closer to providing people all over the world with a more natural prosthetic device that can ‘feel’ and respond to its environment.”
    Story Source:
    Materials provided by Florida Atlantic University. Original written by Gisele Galoustian. Note: Content may be edited for style and length.

  • This device harvests power from your sweaty fingertips while you sleep

    Feeling extra sweaty from a summer heat wave? Don’t worry — not all your perspiration has to go to waste. In a paper publishing July 13 in the journal Joule, researchers have developed a new device that harvests energy from the sweat on — of all places — your fingertips. To date, the device is the most efficient on-body energy harvester ever invented, producing 300 millijoules (mJ) of energy per square centimeter without any mechanical energy input during a 10-hour sleep and an additional 30 mJ of energy with a single press of a finger. The authors say the device represents a significant step forward for self-sustainable wearable electronics.
    “Normally, you want maximum return on investment in energy. You don’t want to expend a lot of energy through exercise to get only a little energy back,” says senior author Joseph Wang (@JWangnano), a nanoengineering professor at the University of California San Diego. “But here, we wanted to create a device adapted to daily activity that requires almost no energy investment — you can completely forget about the device and go to sleep or do desk work like typing, yet still continue to generate energy. You can call it ‘power from doing nothing.’”
    Previous sweat-based energy devices required intense exercise, such as a great deal of running or biking, before the user sweated enough to activate power generation. But the large amount of energy consumed during exercise can easily cancel out the energy produced, often resulting in energy return on investment of less than 1%.
    In contrast, this device falls into what the authors call the “holy grail” category of energy harvesters. Instead of relying on external, irregular sources like sunlight or movement, all it needs is finger contact to collect more than 300 mJ of energy during sleep — which the authors say is enough to power some small wearable electronics. Since no movement is needed, the ratio between harvested energy and invested energy is essentially infinite.
    It may seem odd to choose fingertips as the source of this sweat over, say, the underarms, but in fact, fingertips have the highest concentration of sweat glands of anywhere on the body.
    “Generating more sweat at the fingers probably evolved to help us better grip things,” says first co-author Lu Yin (@YinLu_CLT), a nanoengineering PhD student working in Wang’s lab. “Sweat rates on the finger can reach as high as a few microliters per square centimeter per minute. This is significant compared to other locations on the body, where sweat rates are maybe two or three orders of magnitude smaller.”
    The device the researchers developed in this study is a type of energy harvester called a biofuel cell (BFC) and is powered by lactate, a compound dissolved in sweat. From the outside, it looks like a simple piece of foam connected to a circuit with electrodes, all of which is attached to the pad of a finger. The foam is made out of carbon nanotube material, and the device also contains a hydrogel that helps maximize sweat absorption.
    “The size of the device is about 1 centimeter squared. Its material is flexible as well, so you don’t need to worry about it being too rigid or feeling weird. You can comfortably wear it for an extended period of time,” says Yin.
    Within the device, a series of electrochemical reactions occur. The cells are equipped with a bioenzyme on the anode that oxidizes, or removes electrons from, the lactate; the cathode is deposited with a small amount of platinum to catalyze a reduction reaction that takes the electron to turn oxygen into water. Once this happens, electrons flow from the lactate through the circuit, creating a current of electricity. This process occurs spontaneously: as long as there is lactate, no additional energy is needed to kickstart the process.
    Separate from but complementary to the BFC, piezoelectric generators — which convert mechanical energy into electricity — are also attached to the device to harvest up to 20% additional energy. Relying on the natural pinching motion of fingers or everyday motions like typing, these generators helped produce additional energy from barely any work: a single press of a finger once per hour required only 0.5 mJ of energy but produced over 30 mJ of energy, a 6,000% return on investment.
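    As a rough illustration (simple arithmetic on the figures quoted above, not data from the paper), the passive harvest works out to a few microwatts per square centimeter, and a single press returns about 60 times the energy it costs:

    ```python
    # Back-of-the-envelope check of the figures quoted above (illustrative only).
    SLEEP_HARVEST_MJ = 300.0       # mJ per cm^2 collected passively over a 10-hour sleep
    SLEEP_DURATION_S = 10 * 3600   # 10 hours in seconds
    PRESS_OUTPUT_MJ = 30.0         # mJ produced by one finger press
    PRESS_INPUT_MJ = 0.5           # mJ of mechanical work invested per press

    avg_power_uW = SLEEP_HARVEST_MJ * 1e-3 / SLEEP_DURATION_S * 1e6   # microwatts
    roi_percent = PRESS_OUTPUT_MJ / PRESS_INPUT_MJ * 100

    print(f"Average passive power: {avg_power_uW:.1f} microwatts per cm^2")  # ~8.3 uW
    print(f"Energy return on a single press: {roi_percent:.0f} %")           # 6,000 %
    ```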
    The researchers were able to use the device to power effective vitamin C- and sodium-sensing systems, and they are optimistic about improving the device to have even greater abilities in the future, which might make it suitable for health and wellness applications such as glucose meters for people with diabetes. “We want to make this device more tightly integrated in wearable forms, like gloves. We’re also exploring the possibility of enabling wireless connection to mobile devices for extended continuous sensing,” Yin says.
    “There’s a lot of exciting potential,” says Wang. “We have ten fingers to play with.”
    Story Source:
    Materials provided by Cell Press. Note: Content may be edited for style and length.

  • Supercomputer predicts cell-membrane permeability of cyclic peptides

    Scientists at Tokyo Institute of Technology have developed a computational method based on large-scale molecular dynamics simulations to predict the cell-membrane permeability of cyclic peptides using a supercomputer. Their protocol has exhibited promising accuracy and may become a useful tool for the design and discovery of cyclic peptide drugs, which could help us reach new therapeutic targets inside cells beyond the capabilities of conventional small-molecule drugs or antibody-based drugs.
    Cyclic peptide drugs have attracted the attention of major pharmaceutical companies around the world as promising alternatives to conventional small molecule-based drugs. Through proper design, cyclic peptides can be tailored to reach specific targets inside cells, such as protein-protein interactions, which are beyond the scope of small molecules. Unfortunately, it has proven notoriously difficult to design cyclic peptides with high cell-membrane permeability — that is, cyclic peptides that can easily diffuse through the lipid bilayer that delimits the inside and outside of a cell.
    In an effort to resolve this bottleneck, scientists at the Middle Molecule IT-based Drug Discovery Laboratory (MIDL) have been working on a computational method for predicting cell-membrane permeability. Established in September 2017, MIDL is one of the “Research Initiatives” at Tokyo Institute of Technology (Tokyo Tech) that goes beyond the boundaries of departments. Under the support of the Program for Building Regional Innovation Ecosystems of the Ministry of Education, Culture, Sports, Science and Technology (MEXT), MIDL has been working with the city of Kawasaki to industrialize a framework for discovering middle molecule-based drugs — cyclic peptide drugs and nucleic acid drugs larger than conventional small-molecule drugs but smaller than antibody-based drugs — by combining computational drug design and chemical synthesis technology.
    In a recent study published in the Journal of Chemical Information and Modeling, Professor Yutaka Akiyama and colleagues from MIDL and Tokyo Tech developed a protocol for predicting the cell-membrane permeability of cyclic peptides using molecular dynamics simulations. Such simulations constitute a widely accepted computational approach for predicting and reproducing the dynamics of atoms and molecules by sequentially solving Newton’s laws of motion at short time intervals. However, even a single simulation for predicting the permeability of a cyclic peptide with only eight amino acids takes a tremendous amount of time and resources. “Our study marks the first time comprehensive simulations were performed for as many as 156 different cyclic peptides,” highlights Prof. Akiyama. “The simulation of each cyclic peptide using the protocol we developed took about 70 hours per peptide using 28 GPUs on the TSUBAME 3.0 supercomputer at Tokyo Tech.”
    The researchers verified the predicted permeability values against experimentally derived ones and confirmed an acceptable correlation coefficient of R = 0.63 under the best conditions, showcasing the potential of their protocol. Moreover, after a detailed analysis of the peptide conformation and energy values obtained from the trajectory data, Prof. Akiyama’s team found that the strength of the electrostatic interactions between the atoms constituting the cyclic peptide and the surrounding media, namely the lipid membrane and water molecules, is strongly related to the membrane permeability value. The simulations also revealed the way in which peptides permeate through the membrane by changing their orientation and conformation according to their surroundings. “Our results shed some light on the mechanisms of cell-membrane permeability and provide a guideline for designing molecules that can get inside cells more efficiently. This will greatly contribute to the development of next-generation peptide drugs,” remarks Prof. Masatake Sugita, the first author of the study.
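    The headline R = 0.63 is a correlation between predicted and measured permeability values. A minimal sketch of that kind of comparison is shown below; the input files are hypothetical placeholders, not the study's data.

    ```python
    # Minimal sketch: comparing predicted membrane-permeability values against
    # experimental measurements with a Pearson correlation coefficient, the
    # metric quoted above (R = 0.63). Input arrays are hypothetical placeholders.
    import numpy as np
    from scipy.stats import pearsonr

    log_perm_predicted = np.load("predicted_log_permeability.npy")    # from the MD protocol
    log_perm_measured = np.load("experimental_log_permeability.npy")  # from permeability assays

    r, p_value = pearsonr(log_perm_predicted, log_perm_measured)
    print(f"Pearson R = {r:.2f} (p = {p_value:.2g}) over {len(log_perm_measured)} peptides")
    ```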
    The researchers are already working on a more advanced simulation protocol that will enable more accurate predictions. They are also trying to incorporate artificial intelligence into the picture by adopting deep learning techniques, which could increase both accuracy and speed. Considering that cyclic peptides could unlock many therapeutic targets for diseases that are difficult to treat, let us hope that scientists at MIDL and Tokyo Tech succeed in their endeavors!
    Story Source:
    Materials provided by Tokyo Institute of Technology. Note: Content may be edited for style and length.

  • Electrons in quantum liquid gain energy from laser pulses

    The absorption of energy from laser light by free electrons in a liquid has been demonstrated for the first time. Until now, this process had been observed only in the gas phase. The findings of the study, led by Graz University of Technology, open new doors for ultra-fast electron microscopy.
    The investigation and development of materials crucially depend on the ability to observe the smallest objects at the fastest time scales. The spatial resolution necessary for investigations in the (sub-)atomic range can be achieved with electron microscopy. For the most rapid processes, however, which proceed within a few femtoseconds (quadrillionths of a second), the time resolution of conventional electron microscopes is insufficient. To shorten the duration of electron pulses, electrons would have to be selected within a shorter time window, in analogy to a camera shutter, which controls the exposure time in photography.
    In principle, this temporal selection is possible with extremely short laser pulses through a process called laser-assisted electron scattering (LAES). In this process, electrons can absorb energy from the light field during collisions with atoms of the sample under investigation. “Structural information is provided by all electrons, but those that have a higher energy level can be assigned to the time window in which the light pulse was present. With this method, it is possible to select a short time window from the long electron pulse and thus improve the time resolution,” explains Markus Koch, professor at the Institute of Experimental Physics at Graz University of Technology. So far, however, LAES processes have only been observed in the gas phase, even though they have been investigated for about 50 years.
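    The "camera shutter" idea in the quote above can be caricatured as a simple energy filter: only electrons that scattered while the laser field was present can have gained photon-sized amounts of energy, so selecting electrons above an energy threshold selects the short time window. The sketch below is illustrative only; all numbers are assumptions, not values from the experiment.

    ```python
    # Toy illustration of energy-based time gating in LAES (all numbers assumed).
    import numpy as np

    PHOTON_ENERGY_EV = 1.2    # assumed laser photon energy
    E0_EV = 100.0             # assumed nominal electron energy after field-free scattering

    rng = np.random.default_rng(0)
    energies = rng.normal(E0_EV, 0.3, size=100_000)                 # field-free electrons
    energies[:500] += PHOTON_ENERGY_EV * rng.integers(1, 4, 500)    # electrons with LAES energy gain

    # Keep only electrons that gained at least part of a photon's energy:
    gained = energies > E0_EV + 0.5 * PHOTON_ENERGY_EV
    print(f"{gained.sum()} of {energies.size} electrons fall in the laser time window")
    ```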
    Markus Koch and his team, in collaboration with researchers from the Photonics Institute at Vienna University of Technology and the Institute of Chemistry at Tokyo Metropolitan University, have now demonstrated for the first time that laser-assisted electron scattering can also be observed in condensed matter, specifically in superfluid helium.
    Superfluid helium leading to success
    The TU Graz researchers performed the experiment in a superfluid helium droplet a few nanometers in diameter (3-30 nm), into which they loaded single atoms (indium or xenon) or molecules (acetone) that served as an electron source — a field of expertise at the institute. “The free electrons can move almost without friction within the droplet and absorb more energy in the light field than they lose in collisions with the helium atoms,” says Leonhard Treiber, the PhD student in charge of the experiment. The resulting acceleration allows for the observation of much faster electrons.
    The experiments were interpreted in cooperation with Markus Kitzler-Zeiler, an expert on strong-field processes at TU Wien, and the LAES process was confirmed through simulations by Reika Kanya from Tokyo Metropolitan University. The results were published in Nature Communications.
    In the future, the LAES process will be studied within thin films of various materials, also produced inside helium droplets, in order to determine important parameters such as the optimal film thickness or the favourable intensity of the laser pulses for application in an electron microscope.
    Story Source:
    Materials provided by Graz University of Technology. Original written by Christoph Pelzl. Note: Content may be edited for style and length.

  • Simulating microswimmers in nematic fluids

    Artificial microswimmers have received much attention in recent years. By mimicking microbes which convert their surrounding energy into swimming motions, these particles could soon be exploited for many important applications. Yet before this can happen, researchers must develop methods to better control the trajectories of individual microswimmers in complex environments. In a new study published in EPJ E, Shubhadeep Mandal at the Indian Institute of Technology Guwahati (India), and Marco Mazza at the Max Planck Institute for Dynamics and Self-Organisation in Göttingen (Germany) and Loughborough University (UK), show how this control could be achieved using exotic materials named ‘nematic liquid crystals’ (LCs) — whose viscosity and elasticity can vary depending on the direction of an applied force.

  • Mathematical model predicts the movement of microplastics in the ocean

    A new model tracking the vertical movement of algae-covered microplastic particles offers hope in the fight against plastic waste in our oceans.
    Research led by Newcastle University’s Dr Hannah Kreczak is the first to identify the processes that underpin the trajectories of microplastics below the ocean surface. Publishing their findings in the journal Limnology and Oceanography, the authors analysed how biofouling, the accumulation of algae on the surface of microplastics, affects the vertical movement of buoyant particles.
    The researchers found that particle properties are the biggest factor in determining the period and characteristics of the repetitive vertical movement below the surface, while the algal population dynamics determine the maximum depth reached. Their findings also show that the smallest particles are extremely sensitive to algal cell attachment and growth, suggesting they are always submerged at depths around the base of the euphotic zone, the near-surface layer that receives enough light to support photosynthesis, or could become trapped in large algal colonies.
    In general, the results suggest that a higher concentration of biofouled microplastic is expected to be found subsurface, close to the euphotic zone depth rather than at the ocean’s surface.
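    The oscillation described above can be caricatured with a toy buoyancy model. This is an illustrative sketch, not the authors' model, and every parameter value below is invented: algae accumulate while the particle is in the sunlit layer and die off below it, so the particle's effective density rises and falls and it cycles around the base of the euphotic zone.

    ```python
    # Toy sketch (not the authors' model) of the biofouling feedback described above.
    DT = 60.0                          # time step, s
    EUPHOTIC_DEPTH = 50.0              # base of the sunlit layer, m (assumed)
    RHO_WATER, RHO_PLASTIC, RHO_ALGAE = 1025.0, 950.0, 1100.0   # densities, kg/m^3
    GROWTH, DIEOFF = 3e-6, 5e-6        # algal coverage gain/loss rates, 1/s (assumed)
    SPEED_SCALE = 0.5                  # m/s per unit relative density excess (assumed)

    depth, coverage = 0.0, 0.0         # start at the surface with a clean particle
    track = []
    for _ in range(int(30 * 24 * 3600 / DT)):   # simulate 30 days
        in_light = depth < EUPHOTIC_DEPTH
        # Algae grow toward full coverage in the light, decay in the dark:
        coverage += DT * (GROWTH * (1 - coverage) if in_light else -DIEOFF * coverage)
        rho_eff = (1 - coverage) * RHO_PLASTIC + coverage * RHO_ALGAE
        # Sink when denser than seawater, rise when lighter (clamped at the surface):
        depth = max(0.0, depth + DT * SPEED_SCALE * (rho_eff - RHO_WATER) / RHO_WATER)
        track.append(depth)

    print(f"max depth reached: {max(track):.1f} m, final depth: {track[-1]:.1f} m")
    ```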
    Microplastics (fragments with a diameter smaller than 5mm) make up 90% of the plastic debris found at the ocean surface, yet the amount of plastic entering the ocean is significantly larger than estimates of the plastic floating at its surface. It is not known exactly what happens to these particles once they enter the ocean, and 99% of the microplastics in the ocean are considered missing.
    This new model has the potential to understand the distribution of fouled plastics in the ocean and therefore the ecological impact, particularly in areas of high concentration.
    Dr Hannah Kreczak, EPSRC Doctoral Prize Fellow at Newcastle University’s School of Mathematics, Statistics and Physics, said: “Mathematical modelling has been extremely beneficial in identifying hot-spots for marine plastic pollution on the ocean surface. I hope this research can be a constructive step in understanding the impact plastic pollution has below the surface and aid in the effort towards a more sustainable ocean.”
    Co-Author Dr Andrew Baggaley, Lecturer in Applied Mathematics at the School of Mathematics, Statistics and Physics, added: “This is an exciting first step in our project to develop a comprehensive modelling framework to understand the transport of microplastic particles and their distribution in the oceans.”
    Future research by the team will focus on the fluid motion in the ocean mixed layer, to allow for even more complete assessment of microplastic vertical distributions in the ocean.
    Story Source:
    Materials provided by Newcastle University. Note: Content may be edited for style and length.

  • Reducing data-transfer error in radiation therapy

    Just as helicopter traffic reporters use their “bird’s eye view” to route drivers around roadblocks safely, radiation oncologists treating a variety of cancers can use new guidelines developed by a West Virginia University researcher to reduce mistakes in data transfer and more safely treat their patients.
    Ramon Alfredo Siochi — the director of medical physics at WVU — led a task group to help ensure the accuracy of data that dictates a cancer patient’s radiation therapy. The measures he and his colleagues recommended in their new report safeguard against medical errors in a treatment that more than half of all cancer patients receive.
    “The most common mistake that happens in radiation oncology is the transfer of information from one system to another,” Siochi, the associate chair for the School of Medicine’s Department of Radiation Oncology, said. “This report gives you a good, bird’s-eye view of the way data is moving around in your department.”
    “How frequently do these accidents occur? I think one estimate I saw was that three out of every 100 patients might have an error, but it doesn’t necessarily harm them. Now, I don’t know what the incidence rate is of errors that are quote-unquote ‘near misses’ — when an error happens before it hits the patient — but I would imagine it is much higher.”
    Siochi recently chaired the Task Group of Quality Assurance on External Beam Treatment Data Transfer, a division of the American Association of Physicists in Medicine.
    The group was formed in response to news coverage of radiation overdoses caused by faulty data transfer.
    “In 2010, it was reported in the New York Times that a patient [in a New York City hospital] was overdosed with radiation because the data somehow didn’t transfer properly from one system to another,” Siochi said. “Long story short, the patient received a lethal dose of radiation to his head that went on for three days undetected. Now, that falls into the general class of many things happening that were not standard practice. But it could have been avoided.”
    Radiation therapy is used to treat a variety of cancers, including cancers of the lung, pancreas, prostate, breast, brain and bladder. Depending on a cancer’s type or stage, radiation may cure it, shrink it or stop it from coming back.
    But as the complexity of radiation therapy has grown — making it possible to target cancers that would once have been too difficult to treat — so too has the amount of data that goes into treatment machines. With more data comes more opportunity for errors.
    When Siochi started practicing radiation oncology physics in the 1990s, this data evoked a tree-lined residential street more than the six-lane highway it brings to mind today.
    “It was very analog,” he said. “We’re talking maybe 20 parameters that you would need to check on a plan, and you would put it all on a paper chart. But I once did a calculation — to do an order of magnitude — and now we’re talking about 100,000 parameters. It’s just impossible for a human to check.”
    The group’s report — which earned the approval of AAPM and the Science Council — makes that volume of parameters less overwhelming. It explains how data is transferred among various systems used in radiation therapy, and it suggests ways that medical physicists can test the data’s integrity throughout the process, contributing to safer treatments.
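    The report is about clinical process rather than code, but the basic idea of verifying that treatment parameters survive each hand-off can be illustrated with a simple end-to-end integrity check. The sketch below is hypothetical: the field names and the hashing approach are illustrative, not the task group's recommended procedure.

    ```python
    # Illustrative sketch: confirm that a treatment-plan record arrives unchanged
    # by comparing a digest computed independently on each side of a transfer.
    import hashlib
    import json

    def plan_digest(plan: dict) -> str:
        """Canonicalize a treatment-plan record and return its SHA-256 digest."""
        canonical = json.dumps(plan, sort_keys=True, separators=(",", ":"))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

    # Hypothetical plan parameters as sent and as received by the next system:
    plan_sent = {"patient_id": "12345", "beam_energy_MV": 6,
                 "gantry_angle_deg": 180.0, "monitor_units": 220}
    plan_received = {"patient_id": "12345", "beam_energy_MV": 6,
                     "gantry_angle_deg": 180.0, "monitor_units": 200}   # corrupted in transfer

    if plan_digest(plan_sent) != plan_digest(plan_received):
        mismatched = [k for k in plan_sent if plan_sent[k] != plan_received.get(k)]
        print("Transfer check failed; mismatched fields:", mismatched)
    ```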
    Story Source:
    Materials provided by West Virginia University. Note: Content may be edited for style and length.

  • Hurricanes may not be becoming more frequent, but they’re still more dangerous

    Climate change is helping Atlantic hurricanes pack more of a punch, making them rainier, intensifying them faster and helping the storms linger longer even after landfall. But a new statistical analysis of historical records and satellite data suggests that there aren’t actually more Atlantic hurricanes now than there were roughly 150 years ago, researchers report July 13 in Nature Communications.

    The record-breaking number of Atlantic hurricanes in 2020, a whopping 30 named storms, led to intense speculation over whether and how climate change was involved (SN: 12/21/20). It’s a question that scientists continue to grapple with, says Gabriel Vecchi, a climate scientist at Princeton University. “What is the impact of global warming — past impact and also our future impact — on the number and intensity of hurricanes and tropical storms?”

    Satellite records over the last 30 years allow us to say “with little ambiguity how many hurricanes, and how many major hurricanes [Category 3 and above] there were each year,” Vecchi says. Those data clearly show that the number, intensity and speed of intensification of hurricanes have increased over that time span.

    But “there are a lot of things that have happened over the last 30 years” that can influence that trend, he adds. “Global warming is one of them.” Decreasing aerosol pollution is another (SN: 11/21/19). The amount of soot and sulfate particles and dust over the Atlantic Ocean was much higher in the mid-20th century than now; by blocking and scattering sunlight, those particles temporarily cooled the planet enough to counteract greenhouse gas warming. That cooling is also thought to have helped temporarily suppress hurricane activity in the Atlantic.  

    To get a longer-term perspective on trends in Atlantic storms, Vecchi and colleagues examined a dataset of hurricane observations from the U.S. National Oceanic and Atmospheric Administration that stretches from 1851 to 2019. It includes old-school observations by unlucky souls who directly observed the tempests as well as remote sensing data from the modern satellite era.

    How to directly compare those different types of observations to get an accurate trend was a challenge. Satellites, for example, can see every storm, but earlier observations will count only the storms that people directly experienced. So the researchers took a probabilistic approach to fill in likely gaps in the older record, assuming, for example, that modern storm tracks are representative of pre-satellite storm tracks to account for storms that would have stayed out at sea and unseen. The team found no clear increase in the number of storms in the Atlantic over that 168-year time frame. One possible reason for this, the researchers say, is a rebound from the aerosol pollution–induced lull in storms that may be obscuring some of the greenhouse gas signal in the data.  
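    The gap-filling idea can be sketched schematically: if modern track data imply that only a fraction of storms in a pre-satellite year would have crossed shipping lanes or land and been recorded, the raw count can be scaled up by that detection probability. The numbers below are invented for illustration and are not the study's estimates.

    ```python
    # Schematic of detection-probability adjustment for pre-satellite storm counts
    # (invented numbers, not the study's data or code).
    import numpy as np

    raw_counts = np.array([7, 5, 9, 6, 8])                   # hypothetical yearly counts, 1880s
    p_observed = np.array([0.72, 0.70, 0.74, 0.71, 0.73])    # assumed chance a storm was seen

    adjusted = raw_counts / p_observed                        # expected true counts
    print("raw mean:", raw_counts.mean(), "adjusted mean:", round(adjusted.mean(), 1))
    ```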

    More surprisingly — even to Vecchi, he says — the data also seem to show no significant increase in hurricane intensity over that time. That’s despite “scientific consistency between theories and models indicating that the typical intensity of hurricanes is more likely to increase as the planet warms,” Vecchi says. But this conclusion is heavily caveated — and the study also doesn’t provide evidence against the hypothesis that global warming “has acted and will act to intensify hurricane activity,” he adds.

    Climate scientists were already familiar with the possibility that storm frequency might not have increased much in the last 150 or so years — or over much longer timescales. The link between number of storms and warming has long been uncertain, as the changing climate also produces complex shifts in atmospheric patterns that could take the hurricane trend in either direction. The Intergovernmental Panel on Climate Change noted in a 2012 report that there is “low confidence” that tropical cyclone activity has increased in the long term.

    Geologic evidence of Atlantic storm frequency, which can go back over 1,000 years, also suggests that hurricane frequency does tend to wax and wane every few decades, says Elizabeth Wallace, a paleotempestologist at Rice University in Houston (SN: 10/22/17).

    Wallace hunts for hurricane records in deep underwater caverns called blue holes: As a storm passes over an island beach or the barely submerged shallows, winds and waves pick up sand that then can get dumped into these caverns, forming telltale sediment deposits. Her data, she says, also suggest that “the past 150 years hasn’t been exceptional [in storm frequency], compared to the past.”

    But, Wallace notes, these deposits don’t reveal anything about whether climate change is producing more intense hurricanes. And modern observational data on changes in hurricane intensity is muddled by its own uncertainties, particularly the fact that the satellite record just isn’t that long. Still, “I liked that the study says it doesn’t necessarily provide evidence against the hypothesis” that higher sea-surface temperatures would increase hurricane intensity by adding more energy to the storm, she says.

    Kerry Emanuel, an atmospheric scientist at MIT, says the idea that storm numbers haven’t increased isn’t surprising, given the longstanding uncertainty over how global warming might alter that. But “one reservation I have about the new paper is the implication that no significant trends in Atlantic hurricane metrics [going back to 1851] implies no effect of global warming on these storms,” he says. Looking for such a long-term trend isn’t actually that meaningful, he says, as scientists wouldn’t expect to see any global warming-related hurricane trends become apparent until about the 1970s anyway, as warming has ramped up.

    Regardless of whether there are more of these storms, there’s no question that modern hurricanes have become more deadly in many ways, Vecchi says. There’s evidence that global warming has already been increasing the amount of rain from some storms, such as Hurricane Harvey in 2017, which led to widespread, devastating flooding (SN: 9/28/18). And, Vecchi says, “sea level will rise over the coming century … so [increasing] storm surge is one big hazard from hurricanes.”