More stories

  • Capturing the sense of touch could upgrade prosthetics and our digital lives

    On most mornings, Jeremy D. Brown eats an avocado. But first, he gives it a little squeeze. A ripe avocado will yield to that pressure, but not too much. Brown also gauges the fruit’s weight in his hand and feels the waxy skin, with its bumps and ridges.

    “I can’t imagine not having the sense of touch to be able to do something as simple as judging the ripeness of that avocado,” says Brown, a mechanical engineer who studies haptic feedback — how information is gained or transmitted through touch — at Johns Hopkins University.

    Many of us have thought about touch more than usual during the COVID-19 pandemic. Hugs and high fives rarely happen outside of the immediate household these days. A surge in online shopping has meant fewer chances to touch things before buying. And many people have skipped travel, such as visits to the beach where they might sift sand through their fingers. A lot goes into each of those actions.

    “Anytime we touch anything, our perceptual experience is the product of the activity of thousands of nerve fibers and millions of neurons in the brain,” says neuroscientist Sliman Bensmaia of the University of Chicago. The body’s natural sense of touch is remarkably complex. Nerve receptors detect cues about pressure, shape, motion, texture, temperature and more. Those cues cause patterns of neural activity, which the central nervous system interprets so we can tell if something is smooth or rough, wet or dry, moving or still.

    Scientists at the University of Chicago attached strips of different materials to a rotating drum to measure vibrations produced in the skin as a variety of textures move across a person’s fingertips.
    Matt Wood/Univ. of Chicago

    Neuroscience is at the heart of research on touch. Yet mechanical engineers like Brown and others, along with experts in math and materials science, are studying touch with an eye toward translating the science into helpful applications. Researchers hope their work will lead to new and improved technologies that mimic tactile sensations.

    As scientists and engineers learn more about how our nervous system responds to touch stimuli, they’re also studying how our skin interacts with different materials. And they’ll need ways for people to send and receive simulated touch sensations. All these efforts present challenges, but progress is happening. In the near term, people who have lost limbs might recover some sense of touch through their artificial limbs. Longer term, haptics research might add touch to online shopping, enable new forms of remote medicine and expand the world of virtual reality.

    Good vibrations

    Virtual reality programs already give users a sense of what it’s like to wander through the International Space Station or trek around a natural gas well. For touch to be part of such experiences, researchers will need to reproduce the signals that trigger haptic sensations.

    Our bodies are covered in nerve endings that respond to touch, and our hands are really loaded up, especially our fingertips. Some receptors tell where parts of us are in relation to the rest of the body. Others sense pain and temperature. One goal for haptics researchers is to mimic sensations resulting from force and movement, such as pressure, sliding or rubbing.

    “Anytime you’re interacting with an object, your skin deforms,” or squishes a bit, Bensmaia explains. Press on the raised dots of a braille letter, and the dots will poke your skin. A soapy glass slipping through your fingers produces a shearing force — and possibly a crash. Rub fabric between your fingers, and the action produces vibrations.

    Four main categories of touch receptors respond to those and other mechanical stimuli. There’s some overlap among the types. And a single contact with an object can affect multiple types of receptors, Bensmaia notes.

    One type of receptor, the Pacinian corpuscle, sits deep in the skin. These corpuscles are especially good at detecting vibrations created when we interact with different textures. When stimulated, the receptors produce sequences of signals that travel to the brain over a period of time. Our brains interpret the signals as a particular texture. Bensmaia compares it to the way we hear a series of notes and recognize a tune.

    “Corduroy will produce one set of vibrations. Organza will produce another set,” Bensmaia says. Each texture produces “a different set of vibrations in your skin that we can measure.” Such measurements are a first step toward trying to reproduce the feel of different textures.
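
    A rough sense of how such measurements could be used: a recorded fingertip-vibration trace can be reduced to a frequency-domain “signature” that distinguishes one texture from another and could later guide playback. The Python sketch below is purely illustrative; the synthetic signals, sampling rate and frequency bands are assumptions, not data from the Chicago drum experiments.

```python
# Illustrative only: turn a recorded fingertip-vibration trace into a frequency
# "signature" for a texture. The signals here are synthetic stand-ins, not
# measurements from the University of Chicago drum experiments.
import numpy as np

fs = 10_000                      # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)    # one second of simulated sliding contact

# Fake vibration traces: a coarse texture dominated by low frequencies, a fine
# one by higher frequencies (values chosen only for illustration).
coarse_fabric = np.sin(2 * np.pi * 40 * t) + 0.3 * np.random.randn(t.size)
fine_fabric = 0.4 * np.sin(2 * np.pi * 300 * t) + 0.3 * np.random.randn(t.size)

def texture_signature(signal, fs, n_bands=20):
    """Average spectral power in a few frequency bands, as a compact signature."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    edges = np.linspace(0, 500, n_bands + 1)   # 0-500 Hz covers the range where
                                               # Pacinian receptors are most sensitive
    return np.array([power[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])

print(texture_signature(coarse_fabric, fs).round(1))
print(texture_signature(fine_fabric, fs).round(1))
```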

    Additionally, any stimulus meant to mimic a texture sensation must be strong enough to trigger responses in the nervous system’s touch receptors. That’s where work by researchers at the University of Birmingham in England comes in. The vibrations from contact with various textures create different kinds of wave energy. Rolling-type waves called Rayleigh waves go deep enough to reach the Pacinian receptors, the team reported last October in Science Advances. Much larger versions of the same types of waves cause much of the damage from earthquakes.

    Not all touches are forceful enough to trigger a response from the Pacinian receptors. To gain more insight into which interactions will stimulate those receptors, the team looked at studies that have collected data on touches to the limbs, head or neck of dogs, dolphins, rhinos, elephants and other mammals. A pattern emerged. The group calls it a “universal scaling law” of touch for mammals.

    For the most part, a touch at the surface will trigger a response in a Pacinian receptor deep in the skin if the ratio of the length of the Rayleigh waves produced by the touch to the depth of the receptor is at least 5-to-2. At that ratio or higher, a person and most other mammals will feel the sensation, says mathematician James Andrews, lead author of the study.

    Also, the amount of skin displacement needed to cause wavelengths long enough to trigger a sensation by the Pacinian receptors will be the same across most mammal species, the group found. Different species will need more or less force to cause that displacement, however, which may depend on skin composition or other factors. Rodents did not fit the 5-to-2 ratio, perhaps because their paws and limbs are so small compared with the wavelengths created when they touch things, Andrews notes.
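
    In code, the rule of thumb amounts to a single comparison between the Rayleigh wavelength a touch produces and the depth of the receptor. The sketch below is a minimal illustration of that threshold; the example wavelengths and depths are invented, not values from the Science Advances paper.

```python
# Minimal sketch of the reported rule of thumb: a Pacinian corpuscle should
# respond when the Rayleigh wavelength is at least 5/2 times the receptor's
# depth below the skin surface. The numbers are illustrative, not data from
# the Birmingham study.
THRESHOLD = 5 / 2   # wavelength-to-depth ratio

def pacinian_likely_to_fire(rayleigh_wavelength_mm, receptor_depth_mm):
    return rayleigh_wavelength_mm / receptor_depth_mm >= THRESHOLD

# A receptor 2 mm below the surface, with touches producing 6 mm and 4 mm waves:
print(pacinian_likely_to_fire(6.0, 2.0))   # True  (ratio 3.0 is above 2.5)
print(pacinian_likely_to_fire(4.0, 2.0))   # False (ratio 2.0 is below 2.5)
```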

    Beyond that, the work sheds light on “what types of information you’d need to realistically capture the haptic experience — the touch experience — and send that digitally anywhere,” Andrews says. People could then feel sensations with a device or perhaps with ultrasonic waves. Someday the research might help provide a wide range of virtual reality experiences, including virtual hugs.

    Online tactile shopping

    Mechanical engineer Cynthia Hipwell of Texas A&M University in College Station moved into a new house before the pandemic. She looked at some couches online but couldn’t bring herself to buy one from a website. “I didn’t want to choose couch fabric without feeling it,” Hipwell says.

    “Ideally, in the long run, if you’re shopping on Amazon, you could feel fabric,” she says. A web page’s code would make certain areas of the screen mimic different textures, perhaps through shifts in electrical charge, vibration, ultrasound or other methods. Touching the screen would clue you in to whether a sweater is soft or scratchy, or if a couch’s fabric feels bumpy or smooth. Before that can happen, researchers need to understand conditions that affect our perception of how a computer screen feels.

    Surface features at the nanometer scale (billionths of a meter) can affect how we perceive the texture of a piece of glass, Hipwell says. Likewise, we may not consciously feel any wetness as humidity in the air mixes with our skin’s oil and sweat. But tiny changes in that moisture can alter the friction our fingers encounter as they move on a screen, she says. And that friction can influence how we perceive the screen’s texture.

    Shifts in electric charge also can change the attraction between a finger and a touch screen. That attraction is called electroadhesion, and it affects our tactile experience as we touch a screen. Hipwell’s group recently developed a computer model that accounts for the effects of electroadhesion, moisture and the deformation of skin pressing against glass. The team reported on the work in March 2020 in IEEE Transactions on Haptics.
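
    Hipwell’s published model is not reproduced here, but the basic mechanism can be sketched with a textbook first-order approximation: treat the finger and the screen electrode as a parallel-plate capacitor, compute the electrostatic attraction from the applied voltage, and add that attraction to the finger’s pressing force before applying Coulomb friction. All parameter values below are placeholders chosen only for illustration.

```python
# First-order sketch of electroadhesion on a touch screen (not Hipwell's model):
# finger and electrode as a parallel-plate capacitor separated by a thin
# insulator. The applied voltage pulls the finger toward the glass, and that
# extra normal force raises sliding friction. Parameter values are placeholders.
EPS0 = 8.854e-12   # vacuum permittivity, F/m

def electroadhesion_force(voltage, contact_area_m2, gap_m, rel_permittivity=3.0):
    """Electrostatic attraction in the parallel-plate approximation, in newtons."""
    return 0.5 * EPS0 * rel_permittivity * contact_area_m2 * (voltage / gap_m) ** 2

def sliding_friction(pressing_force_n, voltage, mu=0.8,
                     contact_area_m2=1e-4, gap_m=5e-6):
    """Coulomb friction with the electroadhesive pull added to the normal load."""
    extra = electroadhesion_force(voltage, contact_area_m2, gap_m)
    return mu * (pressing_force_n + extra)

# Friction on a lightly pressing finger (0.5 N) with the voltage off and on:
print(sliding_friction(0.5, voltage=0.0))     # baseline friction
print(sliding_friction(0.5, voltage=100.0))   # noticeably higher with 100 V applied
```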

    Hipwell hopes the model can help product designers develop haptic touch screens that go beyond online shopping. A car’s computerized dashboard might have sections that change texture for each menu, she suggests. A driver could change temperature or radio settings by touch while keeping eyes on the road.

    Wireless touch patches

    Telemedicine visits rose dramatically during the early days of the COVID-19 pandemic. But video doesn’t let doctors feel for swollen glands or press an abdomen to check for lumps. Remote medicine with a sense of touch might help during pandemics like this one — and long after for people in remote areas with few doctors.

    People in those places might eventually have remote sensing equipment in their own homes or at a pharmacy or workplace. If that becomes feasible, a robot, glove or other equipment with sensors could touch parts of a patient’s body. The information would be relayed to a device somewhere else. A doctor at that other location could then experience the sensations of touching the patient.

    Researchers are already working on materials that can translate digital information about touch into sensations people — in this case, doctors — can feel. The same materials could communicate information for virtual reality applications. One possibility is a skin patch developed by physical chemist John Rogers of Northwestern University in Evanston, Ill., and others.

    One layer of the flexible patch sticks to a person’s skin. Other layers include a stretchable circuit board and tiny actuators that create vibrations as current flows around them. Wireless signals tell the actuators to turn on or off. Energy to run the patch also comes in wirelessly. The team described the patch in Nature in 2019.

    Retired U.S. Army Sgt. Garrett Anderson shakes hands with researcher Aadeel Akhtar, CEO of Psyonic, a prosthesis developer. A wireless skin patch on Anderson’s upper arm gives him sensory feedback when grasping an object. Northwestern Univ.

    Inside the patch are circular actuators that vibrate in response to signals. The prototype could convey the sensation of pressure for artificial limbs, virtual reality and telemedicine.

    Since then, Rogers’ group has reduced the patch’s thickness and weight. The patch now also provides more detailed information to a wearer. “We have scaled the systems into a modular form to allow custom sizes [and] shapes in a kind of plug-and-play scheme,” Rogers notes. So far, up to six separate patches can work at the same time on different parts of the body.

    The group also wants to make its technology work with electronics that many consumers have, such as smartphones. Toward that end, Rogers and colleagues have developed a pressure-sensitive touch screen interface for sending information to the device. The interface lets someone provide haptic sensations by moving their fingers on a smartphone or touch screen–based computer screen. A person wearing the patch then feels stroking, tapping or other touch sensations.

    Pressure points

    Additionally, Rogers’ team has developed a way to use the patch system to pick up signals from pressure on a prosthetic arm’s fingertips. Those signals can then be relayed to a patch worn by the person with the artificial limb. Other researchers also are testing ways to add tactile feedback to prostheses. European researchers reported in 2019 that adding feedback for pressure and motion helped people with an artificial leg walk with more confidence (SN: 10/12/19, p. 8). The device reduced phantom limb pain as well.

    Brown, the mechanical engineer at Johns Hopkins, hopes to help people control the force of their artificial limbs. Nondisabled people adjust their hands’ force instinctively, he notes. He often takes his young daughter’s hand when they’re in a parking lot. If she starts to pull away, he gently squeezes. But he might easily hurt her if he couldn’t sense the stiffness of her flesh and bones.

    Two types of prosthetic limbs can let people who lost an arm do certain movements again. Hands on “body-controlled” limbs open or close when the user moves other muscle groups. The movement works a cable on a harness that connects to the hand. Force on those other muscles tells the person if the hand is open or closed. Myoelectric prosthetic limbs, in contrast, are directly controlled by the muscles on the residual limb. Those muscle-controlled electronic limbs generally don’t give any feedback about touch. Compared with the body-controlled options, however, they allow a greater range of motion and can offer other advantages.

    In one study, Brown’s group tested two ways to add feedback about the force that a muscle-controlled electronic limb exerts on an object. One method used an exoskeleton that applied force around a person’s elbow. The other used a vibrating device strapped near the wrist: the stiffer the object, the stronger the vibrations. Volunteers without limb loss tried each setup to judge the stiffness of blocks.

    In a study of two different haptic feedback methods, one system applied force near the elbow. N. Thomas et al/J. NeuroEng. Rehab. 2019

    The other system tested in the study provided vibrations near the wrist. N. Thomas et al/J. NeuroEng. Rehab. 2019

    Both methods worked better than no feedback. And compared with each other, the two types of feedback “worked equally well,” Brown says. “We think that is because, in the end, what the human user is doing is creating a map.” Basically, people match up how much force corresponds to the intensity of each type of feedback. The work suggests ways to improve muscle-controlled electronic limbs, Brown and colleagues reported in 2019 in the Journal of NeuroEngineering and Rehabilitation.
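
    The “map” Brown describes can be pictured as a simple monotonic scaling from measured grip force to feedback intensity. The sketch below is a generic illustration of that idea with invented calibration values; it is not the feedback law used in the Johns Hopkins study.

```python
# Generic illustration of mapping a prosthetic hand's measured grip force to the
# drive level of a haptic cue (a squeeze at the elbow or a vibration at the
# wrist). Calibration values are invented, not the study's actual feedback laws.
def feedback_level(grip_force_n, max_force_n=20.0):
    """Scale grip force linearly to a 0-1 actuator drive level, with clipping."""
    return min(max(grip_force_n / max_force_n, 0.0), 1.0)

for force in (0.0, 5.0, 12.0, 30.0):
    print(f"{force:5.1f} N -> drive level {feedback_level(force):.2f}")
```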

    Still, people’s brains may not be able to match up all types of feedback for touch sensations. Bensmaia’s group at the University of Chicago has worked with colleagues in Sweden who built tactile sensors into bionic hands: Signals from a sensor on the thumb went to an electrode implanted around the ulnar nerve on people’s arms. Three people who had lost a hand tested the bionic hands and felt a touch when the thumb was prodded, but the touch felt as if it came from somewhere else on the hand.

    Doctors can choose which nerve an electrode will stimulate. But they don’t know in advance which bundle of fibers it will affect within the nerve, Bensmaia explains. And different bundles receive and supply sensations to different parts of the hand. Even after the people had used the prosthesis for more than a year, the mismatch didn’t improve. The brain didn’t adapt to correct the sensation. The team shared its findings last December in Cell Reports.

    Despite that, in previous studies, those same people had better precision and more control over their grip force with the bionic hands than with versions that did not directly stimulate the nerve. People getting the direct nerve stimulation also reported feeling as if the hand was more a part of them.

    As with the bionic hands, advances in haptic technology probably won’t start out working perfectly. Indeed, virtual hugs and other simulated touch experiences may never be as good as the real thing. Yet haptics may help us get a feel for the future, with new ways to explore our world and stay in touch with those we love.

  • Mechanical engineers develop new high-performance artificial muscle technology

    In the field of robotics, researchers are continually looking for the fastest, strongest, most efficient and lowest-cost ways to actuate, or enable, robots to make the movements needed to carry out their intended functions.
    The quest for new and better actuation technologies and ‘soft’ robotics is often based on principles of biomimetics, in which machine components are designed to mimic the movement of human muscles — and ideally, to outperform them. Despite the performance of actuators like electric motors and hydraulic pistons, their rigid form limits how they can be deployed. As robots transition to more biological forms and as people ask for more biomimetic prostheses, actuators need to evolve.
    Associate professor (and alum) Michael Shafer and professor Heidi Feigenbaum of Northern Arizona University’s Department of Mechanical Engineering, along with graduate student researcher Diego Higueras-Ruiz, published a paper in Science Robotics presenting a new, high-performance artificial muscle technology they developed in NAU’s Dynamic Active Systems Laboratory. The paper, titled “Cavatappi artificial muscles from drawing, twisting, and coiling polymer tubes,” details how the new technology enables more human-like motion due to its flexibility and adaptability, but outperforms human skeletal muscle in several metrics.
    “We call these new linear actuators cavatappi artificial muscles based on their resemblance to the Italian pasta,” Shafer said.
    Because of their coiled, or helical, structure, the actuators can generate more power, making them an ideal technology for bioengineering and robotics applications. In the team’s initial work, they demonstrated that cavatappi artificial muscles exhibit specific work and power metrics ten and five times higher than human skeletal muscles, respectively, and as they continue development, they expect to produce even higher levels of performance.
    “The cavatappi artificial muscles are based on twisted polymer actuators (TPAs), which were pretty revolutionary when they first came out because they were powerful, lightweight and cheap. But they were very inefficient and slow to actuate because you had to heat and cool them. Additionally, their efficiency is only about two percent,” Shafer said. “For the cavatappi, we get around this by using pressurized fluid to actuate, so we think these devices are far more likely to be adopted. These devices respond about as fast as we can pump the fluid. The big advantage is their efficiency. We have demonstrated contractile efficiency of up to about 45 percent, which is a very high number in the field of soft actuation.”
    The engineers think this technology could be used in soft robotics applications, conventional robotic actuators (for example, for walking robots), or even potentially in assistive technologies like exoskeletons or prostheses.
    “We expect that future work will include the use of cavatappi artificial muscles in many applications due to their simplicity, low-cost, lightweight, flexibility, efficiency and strain energy recovery properties, among other benefits,” Shafer said.
    Working with the NAU Innovations team, the inventors have taken steps to protect their intellectual property. The technology has entered the protection and early commercialization stage and is available for licensing and partnering opportunities. For more information, please contact NAU Innovations.
    Shafer joined NAU in 2013. His other research interests are related to energy harvesting, wildlife telemetry systems and unmanned aerial systems. Feigenbaum joined NAU in 2007, and her other research interests include ratcheting in metals and smart materials. The graduate student on this project, Diego Higueras-Ruiz, received his MS in Mechanical Engineering from NAU in 2018 and will be completing his PhD in Bioengineering in Fall 2021. This work has been supported through a grant from NAU’s Research and Development Preliminary Studies program.

  • AI algorithms can influence people's voting and dating decisions in experiments

    In a new series of experiments, artificial intelligence (A.I.) algorithms were able to influence people’s preferences for fictitious political candidates or potential romantic partners, depending on whether recommendations were explicit or covert. Ujué Agudo and Helena Matute of Universidad de Deusto in Bilbao, Spain, present these findings in the open-access journal PLOS ONE on April 21, 2021.
    From Facebook to Google search results, many people encounter A.I. algorithms every day. Private companies are conducting extensive research on the data of their users, generating insights into human behavior that are not publicly available. Academic social science research lags behind private research, and public knowledge on how A.I. algorithms might shape people’s decisions is lacking.
    To shed new light, Agudo and Matute conducted a series of experiments that tested the influence of A.I. algorithms in different contexts. They recruited participants to interact with algorithms that presented photos of fictitious political candidates or online dating candidates, and asked the participants to indicate whom they would vote for or message. The algorithms promoted some candidates over others, either explicitly (e.g., “90% compatibility”) or covertly, such as by showing their photos more often than others’.
    Overall, the experiments showed that the algorithms had a significant influence on participants’ decisions of whom to vote for or message. For political decisions, explicit manipulation significantly influenced decisions, while covert manipulation was not effective. The opposite effect was seen for dating decisions.
    The researchers speculate these results might reflect people’s preference for human explicit advice when it comes to subjective matters such as dating, while people might prefer algorithmic advice on rational political decisions.
    In light of their findings, the authors express support for initiatives that seek to boost the trustworthiness of A.I., such as the European Commission’s Ethics Guidelines for Trustworthy AI and DARPA’s explainable AI (XAI) program. Still, they caution that more publicly available research is needed to understand human vulnerability to algorithms.
    Meanwhile, the researchers call for efforts to educate the public on the risks of blind trust in recommendations from algorithms. They also highlight the need for discussions around ownership of the data that drives these algorithms.
    The authors add: “If a fictitious and simplistic algorithm like ours can achieve such a level of persuasion without establishing actually customized profiles of the participants (and using the same photographs in all cases), a more sophisticated algorithm such as those with which people interact in their daily lives should certainly be able to exert a much stronger influence.”
    Story Source:
    Materials provided by PLOS.

  • Pepper the robot talks to itself to improve its interactions with people

    Ever wondered why your virtual home assistant doesn’t understand your questions? Or why your navigation app took you on the side street instead of the highway? In a study published April 21st in the journal iScience, Italian researchers designed a robot that “thinks out loud” so that users can hear its thought process and better understand the robot’s motivations and decisions.
    “If you were able to hear what the robots are thinking, then the robot might be more trustworthy,” says co-author Antonio Chella, describing first author Arianna Pipitone’s idea that launched the study at the University of Palermo. “The robots will be easier to understand for laypeople, and you don’t need to be a technician or engineer. In a sense, we can communicate and collaborate with the robot better.”
    Inner speech is common in people and can be used to gain clarity, seek moral guidance, and evaluate situations in order to make better decisions. To explore how inner speech might impact a robot’s actions, the researchers built a robot called Pepper that speaks to itself. They then asked people to set the dinner table with Pepper according to etiquette rules to study how Pepper’s self-dialogue skills influence human-robot interactions.
    The scientists found that, with the help of inner speech, Pepper is better at solving dilemmas. In one experiment, the user asked Pepper to place the napkin at the wrong spot, contradicting the etiquette rule. Pepper started asking itself a series of self-directed questions and concluded that the user might be confused. To be sure, Pepper confirmed the user’s request, which led to further inner speech.
    “Ehm, this situation upsets me. I would never break the rules, but I can’t upset him, so I’m doing what he wants,” Pepper said to itself, placing the napkin at the requested spot. Through Pepper’s inner voice, the user can trace its thoughts to learn that Pepper was facing a dilemma and solved it by prioritizing the human’s request. The researchers suggest that the transparency could help establish human-robot trust.
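    The press release does not spell out Pipitone and Chella’s implementation, but the behavior it describes can be caricatured as a short loop: check a request against an etiquette rule, voice the conflict, confirm with the user, then decide which constraint to relax. The sketch below is a hypothetical illustration loosely patterned on the napkin example, not the architecture from the iScience paper.
```python
# Hypothetical sketch of an inner-speech conflict check, loosely patterned on
# the napkin example above. It is not the architecture from the iScience paper.
ETIQUETTE = {"napkin": "left of the plate"}

def handle_request(item, requested_spot, confirm_with_user):
    thoughts = []
    rule_spot = ETIQUETTE.get(item)
    if rule_spot and requested_spot != rule_spot:
        thoughts.append(f"The rule says the {item} goes {rule_spot}, "
                        f"but I was asked to put it {requested_spot}.")
        thoughts.append("Maybe the user is confused; I should check.")
        if confirm_with_user(item, requested_spot):
            thoughts.append("I would never break the rules, but I can't upset "
                            "them, so I'm doing what they want.")
            placement = requested_spot
        else:
            placement = rule_spot
    else:
        placement = requested_spot or rule_spot
    for line in thoughts:                    # "thinking out loud"
        print("Pepper (inner speech):", line)
    return placement

# Simulate a user who insists on the non-standard spot:
print("Placed:", handle_request("napkin", "right of the plate", lambda i, s: True))
```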
    Comparing Pepper’s performance with and without inner speech, Pipitone and Chella discovered that the robot had a higher task-completion rate when engaging in self-dialogue. Thanks to inner speech, Pepper outperformed the international standard functional and moral requirements for collaborative robots — guidelines that machines, from humanoid AI to mechanical arms on the manufacturing line, follow.
    “People were very surprised by the robot’s ability,” says Pipitone. “The approach makes the robot different from typical machines because it has the ability to reason, to think. Inner speech enables alternative solutions for the robots and humans to collaborate and get out of stalemate situations.”
    Although hearing the inner voice of robots enriches the human-robot interaction, some people might find it inefficient because the robot spends more time completing tasks when it talks to itself. The robot’s inner speech is also limited to the knowledge that researchers gave it. Still, Pipitone and Chella say their work provides a framework to further explore how self-dialogue can help robots focus, plan, and learn.
    “In some sense, we are creating a generational robot that likes to chat,” says Chella. The authors say that, from navigation apps and the camera on your phone to medical robots in the operation rooms, machines and computers alike can benefit from this chatty feature. “Inner speech could be useful in all the cases where we trust the computer or a robot for the evaluation of a situation,” Chella says.
    Story Source:
    Materials provided by Cell Press.

  • Fast radio bursts could help solve the mystery of the universe’s expansion

    Astronomers have been arguing about the rate of the universe’s expansion for nearly a century. A new independent method to measure that rate could help cast the deciding vote.

    For the first time, astronomers calculated the Hubble constant — the rate at which the universe is expanding — from observations of cosmic flashes called fast radio bursts, or FRBs. While the results are preliminary and the uncertainties are large, the technique could mature into a powerful tool for nailing down the elusive Hubble constant, researchers report April 12 at arXiv.org.

    Ultimately, if the uncertainties in the new method can be reduced, it could help settle the longstanding debate that holds our understanding of the universe’s physics in the balance (SN: 7/30/19).

    “I see great promises in this measurement in the future, especially with the growing number of detected repeated FRBs,” says Stanford University astronomer Simon Birrer, who was not involved with the new work.

    Astronomers typically measure the Hubble constant in two ways. One uses the cosmic microwave background, the light released shortly after the Big Bang, in the distant universe. The other uses supernovas and other stars in the nearby universe. These approaches currently disagree by a few percent. The new value from FRBs comes in at an expansion rate of about 62.3 kilometers per second for every megaparsec (about 3.3 million light-years). While lower than the other methods, it’s tentatively closer to the value from the cosmic microwave background, or CMB.

    “Our data agrees a little bit more with the CMB side of things compared to the supernova side, but the error bar is really big, so you can’t really say anything,” says Steffen Hagstotz, an astronomer at Stockholm University. Nonetheless, he says, “I think fast radio bursts have the potential to be as accurate as the other methods.”

    No one knows exactly what causes FRBs, though eruptions from highly magnetic neutron stars are one possible explanation (SN: 6/4/20). During the few milliseconds when FRBs blast out radio waves, their extreme brightness makes them visible across large cosmic distances, giving astronomers a way to probe the space between galaxies (SN: 5/27/20).

    As an FRB signal travels through the dust and gas separating galaxies, it becomes dispersed in a predictable way that causes some frequencies to arrive slightly later than others. The farther away the FRB, the more dispersed the signal. Comparing this delay with distance estimates to nine known FRBs, Hagstotz and colleagues measured the Hubble constant.
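
    In outline, the method comes down to comparing each burst’s extragalactic dispersion measure with the dispersion expected at its redshift for a given expansion rate. The sketch below is a back-of-the-envelope version of that idea, using a mean “Macquart-style” relation with assumed baryon parameters and no modeling of host-galaxy contributions or scatter; it is not the likelihood analysis in the arXiv paper, and the two example bursts are invented.

```python
# Back-of-the-envelope sketch: infer H0 by comparing an FRB's extragalactic
# dispersion measure (DM) with the mean DM expected at its redshift.
# Assumed cosmological inputs and the example bursts are illustrative only;
# this is not the analysis pipeline of Hagstotz and colleagues.
import numpy as np
from scipy.integrate import quad

C = 2.998e8; G = 6.674e-11; M_P = 1.673e-27        # SI constants
MPC = 3.086e22                                     # metres per megaparsec
PC_PER_CM3 = 3.086e22                              # 1 pc/cm^3 expressed in m^-2

omega_b_h2 = 0.0224        # physical baryon density (CMB/BBN value)
f_igm, chi_e = 0.84, 0.88  # diffuse-baryon fraction, electrons per baryon
omega_m = 0.31             # matter density for the expansion history

# Mean electron density today (independent of H0, since it scales with omega_b*h^2)
h100 = 100e3 / MPC                                           # 100 km/s/Mpc in 1/s
n_e0 = chi_e * f_igm * omega_b_h2 * 3 * h100**2 / (8 * np.pi * G * M_P)

def dm_shape(z):
    """Redshift-dependent part of the mean cosmic DM integral."""
    E = lambda zp: np.sqrt(omega_m * (1 + zp) ** 3 + (1 - omega_m))
    return quad(lambda zp: (1 + zp) / E(zp), 0, z)[0]

def h0_from_frb(z, dm_extragalactic_pc_cm3):
    """Solve <DM(z)> = (c * n_e0 / H0) * dm_shape(z) for H0, in km/s/Mpc."""
    dm_si = dm_extragalactic_pc_cm3 * PC_PER_CM3
    h0_si = C * n_e0 * dm_shape(z) / dm_si
    return h0_si * MPC / 1e3

# Two invented bursts with known redshifts and estimated extragalactic DMs:
for z, dm in [(0.12, 110.0), (0.30, 270.0)]:
    print(f"z = {z:.2f}, DM = {dm:6.1f} pc/cm^3  ->  H0 ~ {h0_from_frb(z, dm):5.1f} km/s/Mpc")
```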

    The largest error in the new method comes from not knowing precisely how the FRB signal disperses as it exits its home galaxy before entering intergalactic space, where the gas and dust content is better understood. With a few hundred FRBs, the team estimates that it could reduce the uncertainties and match the accuracy of other methods such as supernovas.

    “It’s a first measurement, so not too surprising that the current results are not as constraining as other more matured probes,” says Birrer.

    New FRB data might be coming soon. Many new radio observatories are coming online, and larger surveys, such as ones proposed for the Square Kilometer Array, could discover tens to thousands of FRBs every night. Hagstotz expects there will be sufficient FRBs with distance estimates in the next year or two to accurately determine the Hubble constant. Such FRB data could also help astronomers understand what’s causing the bright outbursts.

    “I am very excited about the new possibilities that we will have soon,” Hagstotz says. “It’s really just beginning.”

  • A new technique could make some plastic trash compostable at home

    A pinch of polymer-munching enzymes could make biodegradable plastic packaging and forks truly compostable.

    With moderate heat, enzyme-laced films of the plastic disintegrated in standard compost or plain tap water within days to weeks, Ting Xu and her colleagues report April 21 in Nature.

    “Biodegradability does not equal compostability,” says Xu, a polymer scientist at the University of California, Berkeley and Lawrence Berkeley National Laboratory. She often finds bits of biodegradable plastic in the compost she picks up for her parents’ garden. Most biodegradable plastics go to landfills, where the conditions aren’t right for them to break down, so they degrade no faster than normal plastics.

    Embedding polymer-chomping enzymes in biodegradable plastic should accelerate decomposition. But that process often inadvertently forms potentially harmful microplastics, which are showing up in ecosystems across the globe (SN: 11/20/20). The enzymes clump together and randomly snip plastics’ molecular chains, leading to an incomplete breakdown. “It’s worse than if you don’t degrade them in the first place,” Xu says.

    Her team added individual enzymes into two biodegradable plastics, including polylactic acid, commonly used in food packaging. They inserted the enzymes along with another ingredient, a degradable additive Xu previously developed, which ensured the enzymes didn’t clump together and didn’t fall apart. The solitary enzymes grabbed the ends of the plastics’ molecular chains and ate as though they were slurping spaghetti, severing every chain link and preventing microplastic formation.

    Filaments of a new plastic material degrade completely (right) when submerged in tap water for several days. Adam Lau/Berkeley Engineering

    Adding enzymes usually makes plastic expensive and compromises its properties. However, Xu’s enzymes make up as little as 0.02 percent of the plastic’s weight, and her plastics are as strong and flexible as one typically used in grocery bags.

    The technology doesn’t work on all plastics because their molecular structures vary, a limitation Xu’s team is working to overcome. She’s filed a patent application for the technology, and a coauthor founded a startup to commercialize it. “We want this to be in every grocery store,” she says.

  • Augmented reality in retail and its impact on sales

    Augmented reality (AR) is a technology that superimposes virtual objects onto a live view of physical environments, helping users visualize how these objects fit into their physical world. Researchers from City University of Hong Kong and Singapore Management University published a new paper in the Journal of Marketing that identifies four broad uses of AR in retail settings and examines the impact of AR on retail sales.
    The study, forthcoming in the Journal of Marketing, is titled “Augmented Reality in Retail and Its Impact on Sales” and is authored by Yong-Chin Tan, Sandeep Chandukala, and Srinivas Reddy. The researchers discuss the following uses of AR in retail settings:
    * To entertain customers. AR transforms static objects into interactive, animated three-dimensional objects, helping marketers create fresh experiences that captivate and entertain customers. Marketers can use AR-enabled experiences to drive traffic to their physical locations. For example, Walmart collaborated with DC Comics and Marvel to place special thematic displays with exclusive superhero-themed AR experiences in its stores. In addition to creating novel and engaging experiences for customers, the displays also encouraged customers to explore different areas in the stores.
    * To educate customers. Due to its interactive and immersive format, AR is also an effective medium to deliver content and information to customers. To help customers better appreciate their new car models, Toyota and Hyundai have utilized AR to demonstrate key features and innovative technologies in a vivid and visually appealing manner. AR can also be used to provide in-store wayfinding and product support. Walgreens and Lowe’s have developed in-store navigation apps that overlay directional signals onto a live view of the path in front of users to guide them to product locations and notify them if there are special promotions along the way.
    * To facilitate product evaluation. By retaining the physical environment as a backdrop for virtual elements, AR also helps users visualize how products would appear in their actual consumption contexts to assess product fit more accurately prior to purchase. For example, Ikea’s Place app uses AR to overlay true-to-scale, three-dimensional models of furniture onto a live view of customers’ rooms. Customers can easily determine if the products fit in a space without taking any measurements. Uniqlo and Topshop have also deployed the same technology in their physical stores, offering customers greater convenience by reducing the need to change in and out of different outfits. An added advantage of AR is its ability to accommodate a wide assortment of products. This capability is particularly useful for made-to-order or bulky products. BMW and Audi have used AR to provide customers with true-to-scale, three-dimensional visual representations of car models based on customized features such as paint color, wheel design, and interior aesthetics.
    * To enhance the post-purchase consumption experience. Lastly, AR can be used to enhance and redefine the way products are experienced or consumed after they have been purchased. For example, Lego recently launched several specially designed brick sets that combine physical and virtual gameplay. Through the companion AR app, animated Lego characters spring to life and interact with the physical Lego sets, creating a whole new playing experience. In a bid to address skepticism about the quality of its food ingredients, McDonald’s has also used AR to let customers discover the origins of ingredients in the food they purchased via story-telling and three-dimensional animations.
    The research also focuses on the promising application of AR to facilitate product evaluation prior to purchase and examines how it impacts sales in online retail. For example:
    * The availability and usage of AR has a positive impact on sales. The overall impact appears to be small, but certain products are more likely to benefit from the technology than others.
    * The impact of AR is stronger for products and brands that are less popular. Thus, retailers carrying wide product assortments can use AR to stimulate demand for niche products at the long tail of the sales distribution. AR may also help to level the playing field for less-popular brands. With the launch of AR-enabled display ads on advertising platforms such as Facebook and YouTube, less-established brands could consider investing in this new ad format because they stand to benefit most from this technology.
    * The impact of AR is also greater for products that are more expensive, indicating that AR could increase overall revenues for retailers. Retailers selling premium products may also leverage AR to improve decision comfort and reduce customers’ hesitation in the purchase process.
    * Customers who are new to the online channel or product category are more likely to purchase after using AR, suggesting that AR has the potential to promote online channel adoption and category expansion. As prior research has shown that multichannel customers are more profitable, omni-channel retailers can use AR to encourage their offline customers to adopt the online channel.
    Taken together, these findings provide converging evidence that AR is most effective when product-related uncertainty is high. Managers can thus use AR to reduce customer uncertainty and improve sales.
    Story Source:
    Materials provided by American Marketing Association. Original written by Matt Weingarden.

  • New conductive polymer ink opens the door to next-generation printed electronics

    Researchers at Linköping University, Sweden, have developed a stable high-conductivity polymer ink. The advance paves the way for innovative printed electronics with high energy efficiency. The results have been published in Nature Communications.
    Electrically conducting polymers have made possible the development of flexible and lightweight electronic components such as organic biosensors, solar cells, light-emitting diodes, transistors, and batteries.
    The electrical properties of the conducting polymers can be tuned using a method known as “doping.” In this method, various dopant molecules are added to the polymer to change its properties. Depending on the dopant, the doped polymer can conduct electricity by the motion of either negatively charged electrons (an “n-type” conductor) or positively charged holes (a “p-type” conductor). Today, the most commonly used conducting polymer is the p-type conductor PEDOT:PSS. PEDOT:PSS has several compelling features such as high electrical conductivity, excellent ambient stability, and most importantly, commercial availability as an aqueous dispersion. However, many electronic devices require a combination of p-type and n-type conductors to function. At the moment, there is no n-type equivalent to PEDOT:PSS.
    Researchers at Linköping University, together with colleagues in the US and South Korea, have now developed a conductive n-type polymer ink, stable in air and at high temperatures. This new polymer formulation is known as BBL:PEI.
    “This is a major advance that makes the next generation of printed electronic devices possible. The lack of a suitable n-type polymer has been like walking on one leg when designing functional electronic devices. We can now provide the second leg,” says Simone Fabiano, senior lecturer in the Department of Science and Technology at Linköping University.
    Chi-Yuan Yang, a postdoc at Linköping University and one of the principal authors of the article published in Nature Communications, adds: “Everything possible with PEDOT:PSS is also possible with our new polymer. The combination of PEDOT:PSS and BBL:PEI opens new possibilities for the development of stable and efficient electronic circuits.”
    The new n-type material comes in the form of ink with ethanol as the solvent. The ink can be deposited by simply spraying the solution onto a surface, making organic electronic devices easier and cheaper to manufacture. Also, the ink is more eco-friendly than many other n-type organic conductors currently under development, which instead contain harmful solvents. Simone Fabiano believes that the technology is ready for routine use.
    “Large-scale production is already feasible, and we are thrilled to have come so far in a relatively short time. We expect BBL:PEI to have the same impact as PEDOT:PSS. At the same time, much remains to be done to adapt the ink to various technologies, and we need to learn more about the material,” says Simone Fabiano.
    Story Source:
    Materials provided by Linköping University.