More stories

  • Simple robots, smart algorithms

    Anyone with children knows that while controlling one child can be hard, controlling many at once can be nearly impossible. Getting swarms of robots to work collectively can be equally challenging, unless researchers carefully choreograph their interactions — like planes in formation — using increasingly sophisticated components and algorithms. But what can be reliably accomplished when the robots on hand are simple, inconsistent, and lack sophisticated programming for coordinated behavior?
    A team of researchers led by Dana Randall, ADVANCE Professor of Computing, and Daniel Goldman, Dunn Family Professor of Physics, both at Georgia Institute of Technology, sought to show that even the simplest of robots can still accomplish tasks well beyond the capabilities of one, or even a few, of them. Accomplishing these tasks with what the team dubbed “dumb robots” (essentially mobile granular particles) exceeded their expectations: the researchers report removing all sensors, communication, memory and computation, and instead accomplishing a set of tasks by leveraging the robots’ physical characteristics, a trait the team terms “task embodiment.”
    The team’s BOBbots, or “behaving, organizing, buzzing bots,” named for granular physics pioneer Bob Behringer, are “about as dumb as they get,” explains Randall. “Their cylindrical chassis have vibrating brushes underneath and loose magnets on their periphery, causing them to spend more time at locations with more neighbors.” The experimental platform was supplemented by precise computer simulations, led by Georgia Tech physics student Shengkai Li, as a way to probe aspects of the system that are inconvenient to study in the lab.
    Despite the simplicity of the BOBbots, the researchers discovered that, as the robots move and bump into each other, “compact aggregates form that are capable of collectively clearing debris that is too heavy for one alone to move,” according to Goldman. “While most people build increasingly complex and expensive robots to guarantee coordination, we wanted to see what complex tasks could be accomplished with very simple robots.”
    Their work, reported April 23, 2021, in the journal Science Advances, was inspired by a theoretical model of particles moving around on a chessboard. A theoretical abstraction known as a self-organizing particle system was developed to rigorously study a mathematical model of the BOBbots. Using ideas from probability theory, statistical physics and stochastic algorithms, the researchers were able to prove that the theoretical model undergoes a phase change as the magnetic interactions increase — abruptly changing from dispersed to aggregated in large, compact clusters, similar to phase changes we see in common everyday systems, such as water and ice.
    “The rigorous analysis not only showed us how to build the BOBbots, but also revealed an inherent robustness of our algorithm that allowed some of the robots to be faulty or unpredictable,” notes Randall, who also serves as a professor of computer science and adjunct professor of mathematics at Georgia Tech.
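    The article does not give the model's details, so the following is a minimal sketch, under assumptions, of the kind of lattice-based aggregation model described: particles take random steps on a grid, and steps that lose neighbor contacts are accepted with a probability that shrinks as an interaction-strength parameter grows, producing dispersion for weak interactions and compact clusters for strong ones. The square grid, the parameter LAM and the acceptance rule are illustrative choices, not the authors' specification.

      # Minimal illustrative sketch (not the authors' code) of a self-organizing
      # particle system: random particle moves, biased toward keeping neighbors.
      import random

      SIZE, N_PARTICLES, STEPS, LAM = 30, 120, 200_000, 4.0   # illustrative values

      occupied = set()
      while len(occupied) < N_PARTICLES:
          occupied.add((random.randrange(SIZE), random.randrange(SIZE)))

      def neighbors(cell):
          x, y = cell
          return [((x + dx) % SIZE, (y + dy) % SIZE)
                  for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

      def contacts(cell, config):
          return sum(n in config for n in neighbors(cell))

      for _ in range(STEPS):
          cell = random.choice(tuple(occupied))
          target = random.choice(neighbors(cell))
          if target in occupied:
              continue
          others = occupied - {cell}
          gained = contacts(target, others) - contacts(cell, others)
          # Moves that gain contacts are always accepted; contact-losing moves are
          # accepted with probability LAM**gained < 1, so larger LAM favors clustering.
          if gained >= 0 or random.random() < LAM ** gained:
              occupied.remove(cell)
              occupied.add(target)

      avg = sum(contacts(c, occupied - {c}) for c in occupied) / N_PARTICLES
      print(f"average neighbor contacts after the run: {avg:.2f}")
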
    Story Source:
    Materials provided by Georgia Institute of Technology. Note: Content may be edited for style and length.

  • Toward new solar cells with active learning

    Scientists from the Theory Department of the Fritz Haber Institute in Berlin and the Technical University of Munich use machine learning to discover suitable molecular materials. To deal with the myriad of possibilities for candidate molecules, the machine decides for itself which data it needs.
    How can I prepare myself for something I do not yet know? Scientists from the Fritz Haber Institute in Berlin and from the Technical University of Munich have addressed this almost philosophical question in the context of machine learning. Learning is no more than drawing on prior experience. In order to deal with a new situation, one needs to have dealt with roughly similar situations before. In machine learning, this correspondingly means that a learning algorithm needs to have been exposed to roughly similar data. But what can we do if there is a nearly infinite amount of possibilities so that it is simply impossible to generate data that covers all situations?
    This problem comes up frequently when dealing with an endless number of possible candidate molecules. Organic semiconductors enable important future technologies such as portable solar cells or rollable displays. For such applications, improved organic molecules — which make up these materials — need to be discovered. Tasks of this nature are increasingly tackled with machine learning methods trained on data from computer simulations or experiments. The number of potentially possible small organic molecules is, however, estimated to be on the order of 10³³. This overwhelming number of possibilities makes it practically impossible to generate enough data to reflect such a large material diversity. In addition, many of those molecules are not even suitable for organic semiconductors. One is essentially looking for the proverbial needle in a haystack.
    In their work, published recently in Nature Communications, the team led by Prof. Karsten Reuter, Director of the Theory Department at the Fritz Haber Institute, addressed this problem using so-called active learning. Instead of learning from existing data, the machine learning algorithm iteratively decides for itself which data it actually needs in order to learn about the problem. The scientists first carry out simulations on a few smaller molecules and obtain data related to the molecules’ electrical conductivity — a measure of their usefulness when looking at possible solar cell materials. Based on this data, the algorithm decides whether small modifications to these molecules could already lead to useful properties or whether it is uncertain due to a lack of similar data. In both cases, it automatically requests new simulations, improves itself through the newly generated data, considers new molecules and repeats this procedure. In their work, the scientists show how new and promising molecules can be identified efficiently this way, and the algorithm continues its exploration of the vast molecular space even now, at this very moment: every week, new molecules are proposed that could usher in the next generation of solar cells, and the algorithm keeps getting better.
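    The loop itself is easy to state in code. The following is a minimal sketch, under assumptions, of an uncertainty-driven active-learning cycle of the kind described: a Gaussian-process surrogate is trained on a handful of simulated molecules, the candidate the model is least certain about is sent back to the simulator, and the cycle repeats. The run_simulation function and the two-number descriptor vectors are placeholders standing in for the expensive simulations and real molecular descriptors; none of this is the published workflow.

      # Illustrative active-learning loop with placeholder data (not the published code).
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(0)

      def run_simulation(x):
          """Stand-in for an expensive simulation of a molecule's target property."""
          return float(np.sin(3 * x[0]) + 0.5 * x[1])           # toy response surface

      candidates = rng.uniform(0, 1, size=(500, 2))             # toy descriptor vectors
      labeled_X = list(candidates[:5])                          # small seed set
      labeled_y = [run_simulation(x) for x in labeled_X]

      for step in range(20):
          model = GaussianProcessRegressor().fit(labeled_X, labeled_y)
          _, std = model.predict(candidates, return_std=True)
          pick = int(np.argmax(std))          # query where the surrogate is least certain
          labeled_X.append(candidates[pick])
          labeled_y.append(run_simulation(candidates[pick]))

      best = candidates[int(np.argmax(model.predict(candidates)))]
      print("most promising candidate found so far:", best)
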
    Story Source:
    Materials provided by Fritz Haber Institute of the Max Planck Society. Note: Content may be edited for style and length.

  • Ankle exoskeleton enables faster walking

    Being unable to walk quickly can be frustrating and problematic, but it is a common issue, especially as people age. Noting the pervasiveness of slower-than-desired walking, engineers at Stanford University have tested how well a prototype exoskeleton system they have developed — which attaches around the shin and into a running shoe — increased the self-selected walking speed of people in an experimental setting.
    The exoskeleton is externally powered by motors and controlled by an algorithm. When the researchers optimized it for speed, participants walked, on average, 42 percent faster than when they were wearing normal shoes and no exoskeleton. The results of this study were published April 20 in IEEE Transactions on Neural Systems and Rehabilitation Engineering.
    “We were hoping that we could increase walking speed with exoskeleton assistance, but we were really surprised to find such a large improvement,” said Steve Collins, associate professor of mechanical engineering at Stanford and senior author of the paper. “Forty percent is huge.”
    For this initial set of experiments, the participants were young, healthy adults. Given their impressive results, the researchers plan to run future tests with older adults and to look at other ways the exoskeleton design can be improved. They also hope to eventually create an exoskeleton that can work outside the lab, though that goal is still a ways off.
    “My research mission is to understand the science of biomechanics and motor control behind human locomotion and apply that to enhance the physical performance of humans in daily life,” said Seungmoon Song, a postdoctoral fellow in mechanical engineering and lead author of the paper. “I think exoskeletons are very promising tools that could achieve that enhancement in physical quality of life.”
    Walking in the loop
    The ankle exoskeleton system tested in this research is an experimental emulator that serves as a testbed for trying out different designs. It has a frame that fastens around the upper shin and into an integrated running shoe that the participant wears. It is attached to large motors that sit beside the walking surface and pull a tether that runs up the length of the back of the exoskeleton. Controlled by an algorithm, the tether tugs the wearer’s heel upward, helping them point their toe down as they push off the ground.
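    The article does not spell out the control law, so the fragment below is only an illustrative sketch, under assumptions, of the general idea: the controller tracks where the wearer is in the gait cycle and commands tether force that is zero through early stance, ramps up late in stance and peaks around push-off. The piecewise shape, the parameter names and every numerical value are invented for illustration; in practice such parameters would be tuned per person by an outer optimization.

      # Hypothetical phase-based assistance profile (not the Stanford controller).
      def tether_force(phase, peak_force=120.0, onset=0.55, peak_time=0.75, offset=0.9):
          """Commanded tether force (N) as a function of gait phase in [0, 1).

          Zero through early stance, a rising ramp after `onset`, a peak near
          push-off at `peak_time`, back to zero by `offset`. All numbers are
          illustrative assumptions, not measured or published values.
          """
          if phase < onset or phase >= offset:
              return 0.0
          if phase < peak_time:                                  # ramp toward push-off
              return peak_force * (phase - onset) / (peak_time - onset)
          return peak_force * (offset - phase) / (offset - peak_time)   # release

      # Example: sample the profile once per 2 percent of the stride.
      profile = [round(tether_force(i / 50), 1) for i in range(50)]
      print(profile)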

  • Quantum steering for more precise measurements

    Quantum systems consisting of several particles can be used to measure magnetic or electric fields more precisely. A young physicist at the University of Basel has now proposed a new scheme for such measurements that uses a particular kind of correlation between quantum particles.
    In quantum information, the fictitious agents Alice and Bob are often used to illustrate complex communication tasks. In one such process, Alice can use entangled quantum particles such as photons to transmit or “teleport” a quantum state — unknown even to herself — to Bob, something that is not feasible using traditional communications.
    However, it has been unclear whether Alice and Bob can use similar quantum states for tasks other than communication. A young physicist at the University of Basel has now shown how particular types of quantum states can be used to perform measurements with higher precision than quantum physics would ordinarily allow. The results have been published in the scientific journal Nature Communications.
    Quantum steering at a distance
    Together with researchers in Great Britain and France, Dr. Matteo Fadel, who works at the Physics Department of the University of Basel, has thought about how high-precision measurement tasks can be tackled with the help of so-called quantum steering.
    Quantum steering describes the fact that, in certain quantum states of two-particle systems, a measurement on the first particle allows one to make more precise predictions about possible measurement results on the second particle than quantum mechanics would permit if only the second particle had been measured. It is just as if the measurement on the first particle had “steered” the state of the second one.
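    The article stops at this qualitative picture. One standard, textbook way to make it quantitative (not a formula quoted from the paper) is the Reid criterion, which compares Alice’s inferred uncertainties about two complementary observables of Bob’s particle with the Heisenberg bound:

      \Delta_{\mathrm{inf}} X_B \,\Delta_{\mathrm{inf}} P_B \;\geq\; \frac{\hbar}{2} \qquad \text{(satisfied by every non-steerable state)}

    Here \Delta_{\mathrm{inf}} X_B is the uncertainty in Bob’s result for X_B given Alice’s outcome. Measuring a product below \hbar/2 certifies that Alice’s measurements steer Bob’s state, and such strongly correlated states are the kind of resource a steering-based measurement scheme can exploit.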

  • Machine learning model generates realistic seismic waveforms

    A new machine-learning model that generates realistic seismic waveforms will reduce manual labor and improve earthquake detection, according to a study published recently in JGR Solid Earth.
    “To verify the efficacy of our generative model, we applied it to seismic field data collected in Oklahoma,” said Youzuo Lin, a computational scientist in Los Alamos National Laboratory’s Geophysics group and principal investigator of the project. “Through a sequence of qualitative and quantitative tests and benchmarks, we saw that our model can generate high-quality synthetic waveforms and improve machine learning-based earthquake detection algorithms.”
    Quickly and accurately detecting earthquakes can be a challenging task. Visual detection done by people has long been considered the gold standard, but requires intensive manual labor that scales poorly to large data sets. In recent years, automatic detection methods based on machine learning have improved the accuracy and efficiency of data collection; however, the accuracy of those methods relies on access to a large amount of high-quality, labeled training data, often tens of thousands of records or more.
    To resolve this data dilemma, the research team developed SeismoGen based on a generative adversarial network (GAN), a type of deep generative model that can generate high-quality synthetic samples in multiple domains. In other words, deep generative models train machines to create new data that could pass as real.
    Once trained, the SeismoGen model is capable of producing realistic seismic waveforms of multiple labels. When applied to real Earth seismic datasets in Oklahoma, the team saw that data augmentation from SeismoGen-generated synthetic waveforms could be used to improve earthquake detection algorithms in instances when only small amounts of labeled training data are available.
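    SeismoGen’s architecture is not described here, so the block below is only a generic sketch of the conditional-GAN idea applied to 1-D waveforms: a generator maps random noise plus a class label to a synthetic trace, a discriminator scores traces as real or fake, and the two are trained against each other. The layer sizes, the 1,000-sample trace length and the training details are placeholder assumptions, not the published model.

      # Generic conditional-GAN skeleton for 1-D waveforms (illustrative, not SeismoGen).
      import torch
      import torch.nn as nn

      TRACE_LEN, NOISE_DIM, N_CLASSES = 1000, 64, 2            # placeholder sizes

      class Generator(nn.Module):
          def __init__(self):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Linear(NOISE_DIM + N_CLASSES, 256), nn.ReLU(),
                  nn.Linear(256, 512), nn.ReLU(),
                  nn.Linear(512, TRACE_LEN), nn.Tanh(),          # waveform scaled to [-1, 1]
              )

          def forward(self, noise, label_onehot):
              return self.net(torch.cat([noise, label_onehot], dim=1))

      class Discriminator(nn.Module):
          def __init__(self):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Linear(TRACE_LEN + N_CLASSES, 512), nn.ReLU(),
                  nn.Linear(512, 1),                             # real-vs-fake logit
              )

          def forward(self, trace, label_onehot):
              return self.net(torch.cat([trace, label_onehot], dim=1))

      gen, disc = Generator(), Discriminator()
      opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
      opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
      bce = nn.BCEWithLogitsLoss()

      def train_step(real_traces, labels_onehot):
          """One adversarial update with a batch of real, labeled traces."""
          batch = real_traces.size(0)
          fake = gen(torch.randn(batch, NOISE_DIM), labels_onehot)

          # Discriminator learns to score real traces high and generated ones low.
          d_loss = bce(disc(real_traces, labels_onehot), torch.ones(batch, 1)) + \
                   bce(disc(fake.detach(), labels_onehot), torch.zeros(batch, 1))
          opt_d.zero_grad(); d_loss.backward(); opt_d.step()

          # Generator learns to make the discriminator call its output real.
          g_loss = bce(disc(fake, labels_onehot), torch.ones(batch, 1))
          opt_g.zero_grad(); g_loss.backward(); opt_g.step()
          return d_loss.item(), g_loss.item()
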
    Story Source:
    Materials provided by DOE/Los Alamos National Laboratory. Note: Content may be edited for style and length.

  • Artificial intelligence model predicts which key of the immune system opens the locks of coronavirus

    With an artificial intelligence (AI) method developed by researchers at Aalto University and the University of Helsinki, researchers can now link immune cells to their targets and, for example, identify which white blood cells recognize SARS-CoV-2. The developed tool has broad applications in understanding the function of the immune system in infections, autoimmune disorders, and cancer.
    The human immune defense is based on the ability of white blood cells to accurately identify disease-causing pathogens and to initiate a defense reaction against them. The immune defense is able to recall the pathogens it has encountered previously, on which, for example, the effectiveness of vaccines is based. Thus, the immune defense is the most accurate patient record system, carrying a history of all pathogens an individual has faced. This information, however, has previously been difficult to obtain from patient samples.
    The learning immune system can be roughly divided into two parts, of which B cells are responsible for producing antibodies against pathogens, while T cells are responsible for destroying their targets. The measurement of antibodies by traditional laboratory methods is relatively simple, which is why antibodies already have several uses in healthcare.
    “Although it is known that the role of T cells in the defense response against for example viruses and cancer is essential, identifying the targets of T cells has been difficult despite extensive research,” says Satu Mustjoki, Professor of Translational Hematology.
    AI helps to identify new key-lock pairs
    T cells identify their targets on a key-and-lock principle, where the key is the T cell receptor on the surface of the T cell and the lock is the protein presented on the surface of an infected cell. An individual is estimated to carry more distinct T cell keys than there are stars in the Milky Way, making the mapping of T cell targets with laboratory techniques cumbersome.
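    The press release does not describe the model itself, so the sketch below is only a generic illustration, under assumptions, of the underlying prediction task: pair a T cell receptor “key” sequence with a candidate target peptide, turn both into simple amino-acid 3-mer counts, and train an off-the-shelf classifier on labeled pairs. The example sequences, labels and features are invented for illustration and are not the Aalto/Helsinki method or data.

      # Toy sketch of predicting whether a T cell receptor recognizes a given peptide.
      # Invented example pairs and labels; not the model described in the article.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      pairs = [
          ("CASSLGQAYEQYF", "YLQPRTFLL", 1),    # label 1: recognized (invented label)
          ("CASSLGQAYEQYF", "GILGFVFTL", 0),    # label 0: not recognized (invented label)
          ("CASSIRSSYEQYF", "GILGFVFTL", 1),
          ("CASSIRSSYEQYF", "YLQPRTFLL", 0),
          ("CASSPGTGELFF",  "NLVPMVATV", 1),
          ("CASSPGTGELFF",  "YLQPRTFLL", 0),
      ]
      texts = [f"{tcr} {peptide}" for tcr, peptide, _ in pairs]
      labels = [y for _, _, y in pairs]

      # Character 3-mer counts as a crude representation of receptor and peptide sequences.
      model = make_pipeline(
          CountVectorizer(analyzer="char", ngram_range=(3, 3)),
          LogisticRegression(max_iter=1000),
      )
      model.fit(texts, labels)

      # Probability that this receptor-peptide pair is a matching key and lock.
      print(model.predict_proba(["CASSLGQAYEQYF YLQPRTFLL"])[0][1])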

  • Scientists glimpse signs of a puzzling state of matter in a superconductor

    Unconventional superconductors contain a number of exotic phases of matter that are thought to play a role, for better or worse, in their ability to conduct electricity with 100% efficiency at much higher temperatures than scientists had thought possible — although still far short of the temperatures that would allow their wide deployment in perfectly efficient power lines, maglev trains and so on.
    Now scientists at the Department of Energy’s SLAC National Accelerator Laboratory have glimpsed the signature of one of those phases, known as pair-density waves or PDW, and confirmed that it’s intertwined with another phase known as charge density wave (CDW) stripes — wavelike patterns of higher and lower electron density in the material.
    Observing and understanding PDW and its correlations with other phases may be essential for understanding how superconductivity emerges in these materials, allowing electrons to pair up and travel with no resistance, said Jun-Sik Lee, a SLAC staff scientist who led the research at the lab’s Stanford Synchrotron Radiation Lightsource (SSRL).
    Even indirect evidence of the PDW phase intertwined with charge stripes, he said, is an important step on the long road toward understanding the mechanism behind unconventional superconductivity, which has eluded scientists over more than 30 years of research.
    Lee added that the method his team used to make this observation, which involved dramatically increasing the sensitivity of a standard X-ray technique known as resonant soft X-ray scattering (RSXS) so it could see the extremely faint signals given off by these phenomena, has potential for directly sighting both the PDW signature and its correlations with other phases in future experiments. That’s what they plan to work on next.
    The scientists described their findings today in Physical Review Letters.

  • Capturing the sense of touch could upgrade prosthetics and our digital lives

    On most mornings, Jeremy D. Brown eats an avocado. But first, he gives it a little squeeze. A ripe avocado will yield to that pressure, but not too much. Brown also gauges the fruit’s weight in his hand and feels the waxy skin, with its bumps and ridges.

    “I can’t imagine not having the sense of touch to be able to do something as simple as judging the ripeness of that avocado,” says Brown, a mechanical engineer who studies haptic feedback — how information is gained or transmitted through touch — at Johns Hopkins University.

    Many of us have thought about touch more than usual during the COVID-19 pandemic. Hugs and high fives rarely happen outside of the immediate household these days. A surge in online shopping has meant fewer chances to touch things before buying. And many people have skipped travel, such as visits to the beach where they might sift sand through their fingers. A lot goes into each of those actions.

    “Anytime we touch anything, our perceptual experience is the product of the activity of thousands of nerve fibers and millions of neurons in the brain,” says neuroscientist Sliman Bensmaia of the University of Chicago. The body’s natural sense of touch is remarkably complex. Nerve receptors detect cues about pressure, shape, motion, texture, temperature and more. Those cues cause patterns of neural activity, which the central nervous system interprets so we can tell if something is smooth or rough, wet or dry, moving or still.

    Scientists at the University of Chicago attached strips of different materials to a rotating drum to measure vibrations produced in the skin as a variety of textures move across a person’s fingertips.
    Matt Wood/Univ. of Chicago

    Neuroscience is at the heart of research on touch. Yet mechanical engineers like Brown and others, along with experts in math and materials science, are studying touch with an eye toward translating the science into helpful applications. Researchers hope their work will lead to new and improved technologies that mimic tactile sensations.

    As scientists and engineers learn more about how our nervous system responds to touch stimuli, they’re also studying how our skin interacts with different materials. And they’ll need ways for people to send and receive simulated touch sensations. All these efforts present challenges, but progress is happening. In the near term, people who have lost limbs might recover some sense of touch through their artificial limbs. Longer term, haptics research might add touch to online shopping, enable new forms of remote medicine and expand the world of virtual reality.

    Good vibrations

    Virtual reality programs already give users a sense of what it’s like to wander through the International Space Station or trek around a natural gas well. For touch to be part of such experiences, researchers will need to reproduce the signals that trigger haptic sensations.

    Our bodies are covered in nerve endings that respond to touch, and our hands are really loaded up, especially our fingertips. Some receptors tell where parts of us are in relation to the rest of the body. Others sense pain and temperature. One goal for haptics researchers is to mimic sensations resulting from force and movement, such as pressure, sliding or rubbing.

    “Anytime you’re interacting with an object, your skin deforms,” or squishes a bit, Bensmaia explains. Press on the raised dots of a braille letter, and the dots will poke your skin. A soapy glass slipping through your fingers produces a shearing force — and possibly a crash. Rub fabric between your fingers, and the action produces vibrations.

    Four main categories of touch receptors respond to those and other mechanical stimuli. There’s some overlap among the types. And a single contact with an object can affect multiple types of receptors, Bensmaia notes.

    One type, called Pacinian corpuscles, sits deep in the skin. They are especially good at detecting vibrations created when we interact with different textures. When stimulated, the receptors produce sequences of signals that travel to the brain over a period of time. Our brains interpret the signals as a particular texture. Bensmaia compares it to the way we hear a series of notes and recognize a tune.

    “Corduroy will produce one set of vibrations. Organza will produce another set,” Bensmaia says. Each texture produces “a different set of vibrations in your skin that we can measure.” Such measurements are a first step toward trying to reproduce the feel of different textures.

    Additionally, any stimulus meant to mimic a texture sensation must be strong enough to trigger responses in the nervous system’s touch receptors. That’s where work by researchers at the University of Birmingham in England comes in. The vibrations from contact with various textures create different kinds of wave energy. Rolling-type waves called Rayleigh waves go deep enough to reach the Pacinian receptors, the team reported last October in Science Advances. Much larger versions of the same types of waves cause much of the damage from earthquakes.

    Not all touches are forceful enough to trigger a response from the Pacinian receptors. To gain more insight into which interactions will stimulate those receptors, the team looked at studies that have collected data on touches to the limbs, head or neck of dogs, dolphins, rhinos, elephants and other mammals. A pattern emerged. The group calls it a “universal scaling law” of touch for mammals.

    For the most part, a touch at the surface will trigger a response in a Pacinian receptor deep in the skin if the ratio is 5-to-2 between the length of the Rayleigh waves resulting from the touch and the depth of the receptor. At that ratio or higher, a person and most other mammals will feel the sensation, says mathematician James Andrews, lead author of the study.

    Also, the amount of skin displacement needed to cause wavelengths long enough to trigger a sensation by the Pacinian receptors will be the same across most mammal species, the group found. Different species will need more or less force to cause that displacement, however, which may depend on skin composition or other factors. Rodents did not fit the 5–2 ratio, perhaps because their paws and limbs are so small compared with the wavelengths created when they touch things, Andrews notes.
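
    As a rough worked example of that 5-to-2 rule, the snippet below estimates the wavelength of a Rayleigh-type wave from its frequency and an assumed wave speed in skin, then checks whether the wavelength is at least 2.5 times the receptor depth. The 5 m/s wave speed and 2 mm receptor depth are round, illustrative numbers, not values taken from the paper.

      # Back-of-the-envelope check of the 5:2 wavelength-to-depth rule (illustrative numbers).
      def rayleigh_wavelength(frequency_hz, wave_speed_m_s):
          """Wavelength of a surface (Rayleigh-type) wave: speed divided by frequency."""
          return wave_speed_m_s / frequency_hz

      def triggers_pacinian(frequency_hz, wave_speed_m_s=5.0, receptor_depth_m=0.002):
          """True if wavelength/depth >= 5/2, the threshold reported in the study.

          The wave speed and receptor depth are assumed round numbers for illustration.
          """
          ratio = rayleigh_wavelength(frequency_hz, wave_speed_m_s) / receptor_depth_m
          return ratio >= 2.5, ratio

      for freq in (50, 250, 1000):
          felt, ratio = triggers_pacinian(freq)
          print(f"{freq:>5} Hz vibration: wavelength/depth = {ratio:6.1f} -> felt: {felt}")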

    Beyond that, the work sheds light on “what types of information you’d need to realistically capture the haptic experience — the touch experience — and send that digitally anywhere,” Andrews says. People could then feel sensations with a device or perhaps with ultrasonic waves. Someday the research might help provide a wide range of virtual reality experiences, including virtual hugs.

    Online tactile shopping

    Mechanical engineer Cynthia Hipwell of Texas A&M University in College Station moved into a new house before the pandemic. She looked at some couches online but couldn’t bring herself to buy one from a website. “I didn’t want to choose couch fabric without feeling it,” Hipwell says.

    “Ideally, in the long run, if you’re shopping on Amazon, you could feel fabric,” she says. Web pages’ computer codes would make certain areas on a screen mimic different textures, perhaps with shifts in electrical charge, vibration signals, ultrasound or other methods. Touching the screen would clue you in to whether a sweater is soft or scratchy, or if a couch’s fabric feels bumpy or smooth. Before that can happen, researchers need to understand conditions that affect our perception of how a computer screen feels.

    Surface features at the nanometer scale (billionths of a meter) can affect how we perceive the texture of a piece of glass, Hipwell says. Likewise, we may not consciously feel any wetness as humidity in the air mixes with our skin’s oil and sweat. But tiny changes in that moisture can alter the friction our fingers encounter as they move on a screen, she says. And that friction can influence how we perceive the screen’s texture.

    Shifts in electric charge also can change the attraction between a finger and a touch screen. That attraction is called electroadhesion, and it affects our tactile experience as we touch a screen. Hipwell’s group recently developed a computer model that accounts for the effects of electroadhesion, moisture and the deformation of skin pressing against glass. The team reported on the work in March 2020 in IEEE Transactions on Haptics.
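
    The group’s model is far more detailed than the article can convey, so the sketch below only shows how electroadhesion is commonly folded into a friction estimate: the applied voltage adds an electrostatic attraction (a parallel-plate approximation) to the finger’s pressing force, and sliding friction scales with the total. The permittivity, gap, contact area and friction coefficient are illustrative assumptions, not values from the published model.

      # Illustrative electroadhesion-plus-friction estimate (parallel-plate approximation).
      EPS0 = 8.854e-12                     # vacuum permittivity, farads per meter

      def electroadhesion_force(voltage, area_m2=1e-4, gap_m=10e-6, eps_r=3.0):
          """Electrostatic pull between fingertip and screen, modeled as a parallel-plate
          capacitor. Contact area, effective gap and permittivity are assumed values."""
          field = voltage / gap_m
          return 0.5 * EPS0 * eps_r * area_m2 * field ** 2

      def finger_friction(pressing_force_n, voltage, mu=0.8):
          """Sliding friction on the finger: mu times (pressing force + electrostatic pull)."""
          return mu * (pressing_force_n + electroadhesion_force(voltage))

      print(finger_friction(0.5, 0))       # screen off: friction from finger pressure alone
      print(finger_friction(0.5, 100))     # 100 V applied adds roughly 0.13 N of attraction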

    Hipwell hopes the model can help product designers develop haptic touch screens that go beyond online shopping. A car’s computerized dashboard might have sections that change texture for each menu, she suggests. A driver could change temperature or radio settings by touch while keeping eyes on the road.

    Wireless touch patches

    Telemedicine visits rose dramatically during the early days of the COVID-19 pandemic. But video doesn’t let doctors feel for swollen glands or press an abdomen to check for lumps. Remote medicine with a sense of touch might help during pandemics like this one — and long after for people in remote areas with few doctors.

    People in those places might eventually have remote sensing equipment in their own homes or at a pharmacy or workplace. If that becomes feasible, a robot, glove or other equipment with sensors could touch parts of a patient’s body. The information would be relayed to a device somewhere else. A doctor at that other location could then experience the sensations of touching the patient.

    Researchers are already working on materials that can translate digital information about touch into sensations people — in this case, doctors — can feel. The same materials could communicate information for virtual reality applications. One possibility is a skin patch developed by physical chemist John Rogers of Northwestern University in Evanston, Ill., and others.

    One layer of the flexible patch sticks to a person’s skin. Other layers include a stretchable circuit board and tiny actuators that create vibrations as current flows around them. Wireless signals tell the actuators to turn on or off. Energy to run the patch also comes in wirelessly. The team described the patch in Nature in 2019.

    Retired U.S. Army Sgt. Garrett Anderson shakes hands with researcher Aadeel Akhtar, CEO of Psyonic, a prosthesis developer. A wireless skin patch on Anderson’s upper arm gives him sensory feedback when grasping an object. Northwestern Univ.

    Inside the patch are circular actuators that vibrate in response to signals. The prototype device might give the sensation of touch pressure in artificial limbs, in virtual reality and telemedicine.

    Since then, Rogers’ group has reduced the patch’s thickness and weight. The patch now also provides more detailed information to a wearer. “We have scaled the systems into a modular form to allow custom sizes [and] shapes in a kind of plug-and-play scheme,” Rogers notes. So far, up to six separate patches can work at the same time on different parts of the body.

    The group also wants to make its technology work with electronics that many consumers have, such as smartphones. Toward that end, Rogers and colleagues have developed a pressure-sensitive touch screen interface for sending information to the device. The interface lets someone provide haptic sensations by moving their fingers on a smartphone or touch screen–based computer screen. A person wearing the patch then feels stroking, tapping or other touch sensations.

    Pressure points

    Additionally, Rogers’ team has developed a way to use the patch system to pick up signals from pressure on a prosthetic arm’s fingertips. Those signals can then be relayed to a patch worn by the person with the artificial limb. Other researchers also are testing ways to add tactile feedback to prostheses. European researchers reported in 2019 that adding feedback for pressure and motion helped people with an artificial leg walk with more confidence (SN: 10/12/19, p. 8). The device reduced phantom limb pain as well.

    Brown, the mechanical engineer at Johns Hopkins, hopes to help people control the force of their artificial limbs. Nondisabled people adjust their hands’ force instinctively, he notes. He often takes his young daughter’s hand when they’re in a parking lot. If she starts to pull away, he gently squeezes. But he might easily hurt her if he couldn’t sense the stiffness of her flesh and bones.

    Two types of prosthetic limbs can let people who lost an arm do certain movements again. Hands on “body-controlled” limbs open or close when the user moves other muscle groups. The movement works a cable on a harness that connects to the hand. Force on those other muscles tells the person if the hand is open or closed. Myoelectric prosthetic limbs, in contrast, are directly controlled by the muscles on the residual limb. Those muscle-controlled electronic limbs generally don’t give any feedback about touch. Compared with the body-controlled options, however, they allow a greater range of motion and can offer other advantages.

    In one study, Brown’s group tested two ways to add feedback about the force that a muscle-controlled electronic limb exerts on an object. One method used an exoskeleton that applied force around a person’s elbow. The other technique used a device strapped near the wrist. The stiffer an object is, the stronger the vibrations on someone’s wrist. Volunteers without limb loss tried using each setup to judge the stiffness of blocks.

    In a study of two different haptic feedback methods, one system applied force near the elbow. N. Thomas et al/J. NeuroEng. Rehab. 2019

    The other system tested in the study provided vibrations near the wrist. N. Thomas et al/J. NeuroEng. Rehab. 2019

    Both methods worked better than no feedback. And compared with each other, the two types of feedback “worked equally well,” Brown says. “We think that is because, in the end, what the human user is doing is creating a map.” Basically, people match up how much force corresponds to the intensity of each type of feedback. The work suggests ways to improve muscle-controlled electronic limbs, Brown and colleagues reported in 2019 in the Journal of NeuroEngineering and Rehabilitation.
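
    The study’s exact mapping is not given in the article; the fragment below just illustrates the kind of map described, with measured grip force scaled linearly into a normalized feedback command (a vibration amplitude or an elbow squeeze) between a perceptual floor and ceiling. The force range and the output limits are assumptions made for illustration.

      # Hypothetical linear map from prosthesis grip force to feedback intensity.
      def feedback_intensity(grip_force_n, max_force_n=20.0, floor=0.1, ceiling=1.0):
          """Scale measured grip force (N) to a feedback command in [floor, ceiling].

          A stiffer object resists the closing hand with more force, so it produces a
          stronger vibration or squeeze. The limits are illustrative, not from the study.
          """
          fraction = min(max(grip_force_n / max_force_n, 0.0), 1.0)
          return floor + (ceiling - floor) * fraction

      for force in (0.0, 5.0, 12.0, 25.0):
          print(f"grip {force:4.1f} N -> feedback command {feedback_intensity(force):.2f}")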

    Still, people’s brains may not be able to match up all types of feedback for touch sensations. Bensmaia’s group at the University of Chicago has worked with colleagues in Sweden who built tactile sensors into bionic hands: Signals from a sensor on the thumb went to an electrode implanted around the ulnar nerve on people’s arms. Three people who had lost a hand tested the bionic hands and felt a touch when the thumb was prodded, but the touch felt as if it came from somewhere else on the hand.

    Doctors can choose which nerve an electrode will stimulate. But they don’t know in advance which bundle of fibers it will affect within the nerve, Bensmaia explains. And different bundles receive and supply sensations to different parts of the hand. Even after the people had used the prosthesis for more than a year, the mismatch didn’t improve. The brain didn’t adapt to correct the sensation. The team shared its findings last December in Cell Reports.

    Despite that, in previous studies, those same people using the bionic hands had better precision and more control over their force when grasping objects, compared with those using versions without direct stimulation of the nerve. People getting the direct nerve stimulation also reported feeling as if the hand was more a part of them.

    As with the bionic hands, advances in haptic technology probably won’t start out working perfectly. Indeed, virtual hugs and other simulated touch experiences may never be as good as the real thing. Yet haptics may help us get a feel for the future, with new ways to explore our world and stay in touch with those we love.