More stories

  • New technique improves AI ability to map 3D space with 2D cameras

    Researchers have developed a technique that allows artificial intelligence (AI) programs to better map three-dimensional spaces using two-dimensional images captured by multiple cameras. Because the technique works effectively with limited computational resources, it holds promise for improving the navigation of autonomous vehicles.
    “Most autonomous vehicles use powerful AI programs called vision transformers to take 2D images from multiple cameras and create a representation of the 3D space around the vehicle,” says Tianfu Wu, corresponding author of a paper on the work and an associate professor of electrical and computer engineering at North Carolina State University. “However, while each of these AI programs takes a different approach, there is still substantial room for improvement.
    “Our technique, called Multi-View Attentive Contextualization (MvACon), is a plug-and-play supplement that can be used in conjunction with these existing vision transformer AIs to improve their ability to map 3D spaces,” Wu says. “The vision transformers aren’t getting any additional data from their cameras, they’re just able to make better use of the data.”
    MvACon works by adapting an approach called Patch-to-Cluster attention (PaCa), which Wu and his collaborators released last year. PaCa allows transformer AIs to identify objects in an image more efficiently and effectively.
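    For readers curious what patch-to-cluster attention looks like in practice, the following is a minimal, hypothetical PyTorch sketch of the idea: patch tokens are softly grouped into a small set of cluster tokens, and each patch then attends to those clusters rather than to every other patch, so the attention cost grows linearly with the number of patches. The module name, shapes, and cluster count are illustrative assumptions, not the authors' released implementation.

        import torch
        import torch.nn as nn

        class PatchToClusterAttention(nn.Module):
            """Toy PaCa-style block: patches attend to a few cluster tokens, not to each other."""
            def __init__(self, dim=256, num_clusters=49, num_heads=8):
                super().__init__()
                self.heads, self.scale = num_heads, (dim // num_heads) ** -0.5
                self.to_assign = nn.Linear(dim, num_clusters)   # soft patch-to-cluster assignment
                self.q, self.kv = nn.Linear(dim, dim), nn.Linear(dim, dim * 2)
                self.proj = nn.Linear(dim, dim)

            def forward(self, x):                               # x: (batch, num_patches, dim)
                b, n, d = x.shape
                assign = self.to_assign(x).softmax(dim=1)       # each cluster = convex mix of patches
                clusters = assign.transpose(1, 2) @ x           # (batch, num_clusters, dim)
                q = self.q(x).view(b, n, self.heads, -1).transpose(1, 2)
                k, v = self.kv(clusters).chunk(2, dim=-1)
                k = k.view(b, -1, self.heads, d // self.heads).transpose(1, 2)
                v = v.view(b, -1, self.heads, d // self.heads).transpose(1, 2)
                attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
                return self.proj((attn @ v).transpose(1, 2).reshape(b, n, d))

        tokens = torch.randn(2, 1024, 256)                      # e.g. patch tokens pooled from several camera views
        print(PatchToClusterAttention()(tokens).shape)          # torch.Size([2, 1024, 256])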
    “The key advance here is applying what we demonstrated with PaCa to the challenge of mapping 3D space using multiple cameras,” Wu says.
    To test the performance of MvACon, the researchers used it in conjunction with three leading vision transformers — BEVFormer, the BEVFormer DFA3D variant, and PETR. In each case, the vision transformers were collecting 2D images from six different cameras. In all three instances, MvACon significantly improved the performance of each vision transformer.
    “Performance was particularly improved when it came to locating objects, as well as the speed and orientation of those objects,” says Wu. “And the increase in computational demand of adding MvACon to the vision transformers was almost negligible.
    “Our next steps include testing MvACon against additional benchmark datasets, as well as testing it against actual video input from autonomous vehicles. If MvACon continues to outperform the existing vision transformers, we’re optimistic that it will be adopted for widespread use.”
    The paper, “Multi-View Attentive Contextualization for Multi-View 3D Object Detection,” will be presented June 20 at the IEEE/CVF Conference on Computer Vision and Pattern Recognition, being held in Seattle, Wash. First author of the paper is Xianpeng Liu, a recent Ph.D. graduate of NC State. The paper was co-authored by Ce Zheng and Chen Chen of the University of Central Florida; Ming Qian and Nan Xue of the Ant Group; and Zhebin Zhang and Chen Li of the OPPO U.S. Research Center.
    The work was done with support from the National Science Foundation, under grants 1909644, 2024688 and 2013451; the U.S. Army Research Office, under grants W911NF1810295 and W911NF2210010; and a research gift fund from Innopeak Technology, Inc.

  • Quantum data assimilation: A quantum leap in weather prediction

    Data assimilation is a mathematical discipline that integrates observed data and numerical models to improve the interpretation and prediction of dynamical systems. It is a crucial component of earth sciences, particularly in numerical weather prediction (NWP). Data assimilation techniques have been widely investigated in NWP in the last two decades to refine the initial conditions of weather models by combining model forecasts and observational data. Most NWP centers around the world employ variational and ensemble-variational data assimilation methods, which iteratively reduce cost functions via gradient-based optimization. However, these methods require significant computational resources.
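    For reference, the cost function these variational schemes minimize has the standard strong-constraint 4DVAR form below, written in generic textbook notation rather than any one center's formulation:

        J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{\top} B^{-1} (x_0 - x_b)
                 + \tfrac{1}{2} \sum_{t=0}^{T} \big( H_t(x_t) - y_t \big)^{\top} R_t^{-1} \big( H_t(x_t) - y_t \big),
        \qquad x_t = M_{0 \to t}(x_0)

    Here $x_b$ is the background forecast, $B$ and $R_t$ are the background and observation error covariances, $y_t$ are the observations, $H_t$ maps model states to observation space, and $M_{0 \to t}$ is the forecast model; gradient-based optimization iterates toward the initial state $x_0$ that minimizes $J$.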
    Recently, quantum computing has emerged as a new avenue of computational technology, offering a promising solution for overcoming the computational challenges of classical computers. Quantum computers can take advantage of quantum effects such as tunneling, superposition, and entanglement to significantly reduce computational demands. Quantum annealing machines, in particular, are powerful for solving optimization problems.
    In a recent study, Professor Shunji Kotsuki from the Institute for Advanced Academic Research/Center for Environmental Remote Sensing/Research Institute of Disaster Medicine, Chiba University, along with his colleagues Fumitoshi Kawasaki from the Graduate School of Science and Engineering and Masanao Ohashi from the Center for Environmental Remote Sensing, developed a novel data assimilation technique designed for quantum annealing machines. “Our study introduces a novel quantum annealing approach to accelerate data assimilation, which is the main computational bottleneck for numerical weather predictions. With this algorithm, we successfully solved data assimilation on quantum annealers for the first time,” explains Prof. Kotsuki. Their study has been published in the journal Nonlinear Processes in Geophysics on June 07, 2024.
    In the study, the researchers focused on the four-dimensional variational data assimilation (4DVAR) method, one of the most widely used data assimilation methods in NWP systems. However, since 4DVAR is designed for classical computers, it cannot be directly used on quantum hardware. Prof. Kotsuki clarifies, “Unlike the conventional 4DVAR, which requires a cost function and its gradient, quantum annealers require only the cost function. However, the cost function must be represented by binary variables (0 or 1). Therefore, we reformulated the 4DVAR cost function, a quadratic unconstrained optimization (QUO) problem, into a quadratic unconstrained binary optimization (QUBO) problem, which quantum annealers can solve.”
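    As a rough illustration of that reformulation (a generic fixed-point binary encoding, not necessarily the exact one used in the paper), a quadratic cost in continuous increments can be rewritten over binary variables as follows:

        J(\delta x) = \tfrac{1}{2}\,\delta x^{\top} A\,\delta x + b^{\top}\delta x,
        \qquad
        \delta x_i \;\approx\; s \sum_{k=0}^{K-1} 2^{k} q_{ik} - o, \quad q_{ik} \in \{0, 1\},

        \text{which, after substitution, gives the QUBO} \qquad
        J(q) = \sum_{(i,k),\,(j,l)} Q_{(i,k),(j,l)}\, q_{ik}\, q_{jl} + \text{const}.

    The scale $s$ and offset $o$ are chosen so the binary expansion spans the expected range of each increment; the resulting matrix $Q$ defines the problem the annealer minimizes.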
    The researchers applied this QUBO approach to a series of 4DVAR experiments using a 40-variable Lorenz-96 model, a dynamical system commonly used to test data assimilation methods. They conducted the experiments on the D-Wave Advantage physical quantum annealer (Phy-QA) and the Fixstars Amplify simulated quantum annealer (Sim-QA). They also ran conventional quasi-Newton iterative approaches, based on the Broyden-Fletcher-Goldfarb-Shanno formula, on linear and nonlinear QUO problems and compared their performance with that of the quantum annealers.
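    For context, the Lorenz-96 system used as such a testbed can be written down in a few lines; the sketch below assumes the common choices of forcing F = 8 and a fourth-order Runge-Kutta integrator, which are conventions of the benchmark rather than details reported here.

        import numpy as np

        def lorenz96_tendency(x, forcing=8.0):
            """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indexing."""
            return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

        def step_rk4(x, dt=0.05, forcing=8.0):
            """Advance the state by one step with classical fourth-order Runge-Kutta."""
            k1 = lorenz96_tendency(x, forcing)
            k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
            k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
            k4 = lorenz96_tendency(x + dt * k3, forcing)
            return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        state = 8.0 * np.ones(40)          # 40-variable state on a cyclic lattice
        state[0] += 0.01                   # small perturbation to trigger chaotic behavior
        for _ in range(1000):              # spin up before using it as a testbed
            state = step_rk4(state)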
    The results revealed that the quantum annealers produced analyses with accuracy comparable to that of conventional quasi-Newton-based approaches, but in a fraction of the time. The D-Wave Phy-QA required less than 0.05 seconds for computation, much faster than the conventional approaches, though it also exhibited slightly larger root mean square errors, which the researchers attributed to inherent stochastic quantum effects. To address this, they found that reading out multiple solutions from the quantum annealer improved stability and accuracy. They also noted that the scaling factor for quantum data assimilation, which is important for regulating the analysis accuracy, differed between the D-Wave Phy-QA and the Sim-QA, owing to the stochastic quantum effects associated with the former.
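    The benefit of multiple reads can be pictured with a toy numpy example (purely illustrative; the small QUBO matrix and noise model below are made up, and this is not the authors' workflow): each annealer read is treated as a noisy sample of the energy landscape, and keeping the lowest-energy read damps the stochastic scatter.

        import numpy as np
        from itertools import product

        rng = np.random.default_rng(0)
        Q = np.array([[-1.0,  0.5,  0.0],
                      [ 0.5, -1.5,  0.5],
                      [ 0.0,  0.5, -2.0]])          # toy symmetric QUBO matrix

        def energy(q):
            return float(q @ Q @ q)                 # QUBO objective, q in {0, 1}^n

        # Ground truth by brute force (only feasible at toy sizes).
        optimum = min((np.array(bits) for bits in product([0, 1], repeat=len(Q))), key=energy)

        def noisy_read(flip_prob=0.2):
            """Stand-in for one annealer read: the optimum corrupted by random bit flips."""
            flips = rng.random(len(Q)) < flip_prob
            return np.where(flips, 1 - optimum, optimum)

        reads = [noisy_read() for _ in range(50)]   # many reads of the same problem
        best = min(reads, key=energy)               # keep the lowest-energy read
        print(best, energy(best))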
    These findings highlight the potential of quantum computers to reduce the computational cost of data assimilation. “Our approach could revolutionize future NWP systems, enabling a deeper understanding and improved predictions with much less computational time. In addition, it has the potential to advance the practical applications of quantum annealers in solving complex optimization problems in earth science,” remarks Prof. Kotsuki.
    Overall, the proposed innovative method holds great promise for inspiring future applications of quantum computers in advancing data assimilation, potentially leading to more accurate weather predictions.

  • Swimming microrobots deliver cancer-fighting drugs to metastatic lung tumors in mice

    Engineers at the University of California San Diego have developed microscopic robots, known as microrobots, capable of swimming through the lungs to deliver cancer-fighting medication directly to metastatic tumors. This approach has shown promise in mice, where it inhibited the growth and spread of tumors that had metastasized to the lungs, thereby boosting survival rates compared to control treatments.
    The findings are detailed in a paper published on June 12 in Science Advances.
    The microrobots are an ingenious combination of biology and nanotechnology. They are a joint effort between the labs of Joseph Wang and Liangfang Zhang, both professors in the Aiiso Yufeng Li Family Department of Chemical and Nano Engineering at the UC San Diego Jacobs School of Engineering.
    To create the microrobots, researchers chemically attached drug-filled nanoparticles to the surface of green algae cells. The algae, which provide the microrobots with their movement, enable the nanoparticles to efficiently swim around in the lungs and deliver their therapeutic payload to tumors.
    The nanoparticles are made of tiny biodegradable polymer spheres, which are loaded with the chemotherapeutic drug doxorubicin and coated with red blood cell membranes. This coating serves a critical function: it protects the nanoparticles from the immune system, allowing them to stay in the lungs long enough to exert their anti-tumor effects. “It acts as a camouflage,” said study co-first author Zhengxing Li, who is a nanoengineering Ph.D. student in both Wang and Zhang’s research groups. “This coating makes the nanoparticle look like a red blood cell from the body, so it will not trigger an immune response.”
    This formulation of nanoparticle-carrying algae is safe, the researchers noted. The materials used to make the nanoparticles are biocompatible, while the green algae employed, Chlamydomonas reinhardtii, are recognized as safe for use by the U.S. Food and Drug Administration.
    This study builds on prior work by Wang and Zhang’s teams using similar microrobots to treat deadly pneumonia in mice. “Those were the first microrobots to be safely tested in the lungs of live animals,” said Wang.

    In previous work, the microrobots fought the spread of pneumonia-causing bacteria using a different drug and cell membrane combination for the nanoparticles. By tweaking these components, the team has now tailored the microrobots to fight the spread of cancer cells in the lungs. “We demonstrate that this is a platform technology that can actively and efficiently deliver therapeutics throughout the entire lung tissue to combat different types of deadly diseases in the lungs,” said Zhang.
    In the current study, mice with melanoma that had metastasized to the lungs were treated with the microrobots, which were administered to the lungs through a small tube inserted into the windpipe. Treated mice experienced a median survival time of 37 days, an improvement over the 27-day median survival time observed in untreated mice, as well as mice that received either the drug alone or drug-filled nanoparticles without algae.
    “The active swimming motion of the microrobots significantly improved distribution of the drug to the deep lung tissue, while prolonging retention time,” said Li. “This enhanced distribution and prolonged retention time allowed us to reduce the required drug dosage, potentially reducing side effects while maintaining high survival efficacy.”
    Moving forward, the team is working on advancing this microrobot treatment to trials in larger animals, with the ultimate goal of human clinical trials.
    Paper: “Biohybrid microrobots locally and actively deliver drug-loaded nanoparticles to inhibit the progression of lung metastasis.” Co-authors of the study include Fangyu Zhang*, Zhongyuan Guo*, Zhengxing Li*, Hao Luan, Yiyan Yu, Audrey T. Zhu, Shichao Ding, Weiwei Gao and Ronnie H. Fang.
    *These authors contributed equally to this work.
    This work was supported by the Defense Threat Reduction Agency Joint Science and Technology Office for Chemical and Biological Defense (HDTRA1-21-1-0010) and the National Institutes of Health (R21AI175904).

  • Incorporating ‘touch’ into social media interactions can increase feelings of support and approval

    Incorporating “tactile emoticons” into social media messages can enhance communication, according to a study published June 12, 2024 in the open-access journal PLOS ONE by Alkistis Saramandi and Yee Ki Au from University College London, United Kingdom, and colleagues.
    Digital communications rely exclusively on visual and auditory cues (text, emoticons, videos, and music) to convey tone and emotion. Currently lacking from these platforms is touch, which can convey feelings of love and support, impact emotions, and influence behaviors. Technology companies are developing devices to incorporate touch into digital interactions, such as interactive kiss and hug transmission devices. These social touch devices can elicit feelings of closeness and positive emotions, but the effect of touch in social media communication is not well studied.
    In this study, researchers incorporated tactile emoticons into social media interactions. Using a mock social media platform, participants were given posts to send that expressed either positive or negative emotions. They then received feedback via visual emoticons (e.g., a heart or thumbs up), tactile emoticons (a stroke on the forearm by either another person or a robotic device), or both.
    Participants reported greater feelings of support and approval when they received the tactile emoticons than when they received visual-only feedback. This suggests that social touch, even by a stranger, can convey meaning without any other accompanying communication. Feedback consisting of both visual and tactile emoticons was preferred over either type of emoticon alone. The researchers noted that touch could offer additional context to visual emoticons, which can be misinterpreted. They also found that the type of touch matters. Touch delivered at a speed that activates the C-tactile (CT) afferent system, which produces the positive emotions associated with touching or embracing, was experienced more favorably than other types of touch.
    According to the authors, this is the first study to explore the role of touch in communicating emotions via social media. They hope that the results can inform the development of devices to deliver touch during digital communications.
    The authors add: “Touch has long been essential to human bonding and survival, yet in our digitized world we are more isolated and touch-deprived than ever. What if we could use ‘digitalized touch’ to bring us closer in communicating emotions in today’s world?”

  • To heal skin, scientists invent living bioelectronics

    For much of his childhood, Simiao Niu was troubled by psoriasis, a chronic, often painful skin condition, mostly on his arms.
    Sometimes, the prescribed ointment worked and treated the inflamed, red areas produced by the disease. But he was never sure if he was using enough, whether the skin irritation was healing and when he should stop treatment.
    Years later, as an engineer on the team at Apple Inc. that devised the electronics in Apple watches monitoring heart rhythm, Niu had a revelation: Could a similar wearable electronic device be developed to treat skin ailments such as psoriasis and provide patients with continuous feedback?
    Now, Niu, an assistant professor of biomedical engineering in the School of Engineering at Rutgers-New Brunswick, has played a crucial role in the development of the kind of device that he dreamed of: a unique prototype of what he and his research collaborators are calling a “living bioelectronic” designed to treat psoriasis.
    Describing the advance in Science magazine, Niu and collaborators, including scientists at the University of Chicago led by Bozhi Tian and Columbia University, reported developing a patch — a combination of advanced electronics, living cells and hydrogel — that is showing efficacy in experiments in mice.
    The patch represents not only a potential treatment for psoriasis, but a new technology platform to deliver treatments for medical needs as diverse as wounds and, potentially, various skin cancers, Niu said.
    “We were looking for a new type of device that combines sensing and treatment for managing skin inflammation diseases like psoriasis,” Niu said. “We found that by combining living bacteria, flexible electronics and adhesive skin interface materials, we were able to create a new type of device.”
    The circular patch is about 1 inch in diameter and wafer thin. The patch contains electronic chips, bacterial cells and a gel made from starch and gelatin. Tests in mice conducted by the research team showed that the device could continuously monitor and improve psoriasis-like symptoms without irritating skin.

    The device, Niu said, is a leap forward from conventional bioelectronics, which are generally composed of electronic components that are encased in a soft synthetic layer that reduces irritation when in contact with the body. The patches placed on a patient’s chest for an electrocardiogram are examples of conventional devices.
    Niu’s invention could be seen as a “living drug,” he said, in that it incorporates living cells as part of its therapy. The bacterium Staphylococcus epidermidis, which lives on human skin and has been shown to reduce inflammation, is incorporated into the device’s gel casing. A thin, flexible printed circuit forms the skeleton of the device.
    When the device is placed on skin, the bacteria secrete compounds that reduce inflammation, while sensors in the flexible circuits monitor the skin for signals indicating healing, such as skin impedance, temperature and humidity. The data collected by the circuits is beamed wirelessly to a computer or a cell phone, a process that would allow patients to monitor their healing process.
    During his years at Apple, before he joined the Rutgers faculty in 2022, Niu and other engineers were forwarded hundreds of thank-you notes that had been sent to the chief executive’s office. The customers wrote to credit their Apple watches with saving their lives, Niu said. The watches’ built-in heart rate monitors pointed to a condition — an arrhythmia known as atrial fibrillation — the customers said they didn’t know they had. Atrial fibrillation can lead to strokes if left untreated.
    “When you produce the kinds of things that positively affect people’s lives, you feel very proud,” Niu said. “That is something that inspires me a lot and motivates me to do my current research.”
    Clinical trials to test the device on human patients must come next, Niu said, as the first step toward commercialization. Once there is evidence of positive results with minimum side effects, the inventors would apply for FDA approval in order to bring the device to market, Niu said.
    Other authors of the study from Rutgers included Fuying Dong and Chi Han, two graduate students at the Department of Biomedical Engineering in the School of Engineering.

  • Robot radiotherapy could improve treatments for eye disease

    Researchers from King’s, with doctors at King’s College Hospital NHS Foundation Trust, have successfully used a new robot system to improve treatment for debilitating eye disease.
    The custom-built robot was used to treat wet neovascular age-related macular degeneration (AMD), administering a one-off, minimally invasive dose of radiation, followed by patients’ routine treatment with injections into their eye.
    The landmark trial, published today in The Lancet, found that patients then needed fewer injections to control the disease effectively, potentially saving around 1.8 million injections per year worldwide.
    Wet AMD is a debilitating eye disease in which abnormal new blood vessels grow into the macula, the light-sensing layer of cells at the back of the eyeball. The vessels then start to leak blood and fluid, typically causing a rapid, permanent and severe loss of sight.
    Globally, around 196 million people have AMD, and the Royal College of Ophthalmologists estimates that the disease affects more than 700,000 people in the UK. The number of people with AMD is expected to increase by 60% by 2035 due to the country’s ageing population.
    Wet AMD is currently treated with regular injections into the eye. Initially, treatment substantially improves a patient’s vision. But, because the injections don’t cure the disease, fluid will eventually start to build up again in the macula, and patients will require long-term, repeated injections. Most people require an injection around every 1-3 months, and eye injections, costing between £500 and £800 per injection, have become one of the most common NHS procedures.
    The new treatment can be targeted far better than existing methods, aiming three beams of highly focused radiation into the diseased eye. Scientists found that patients having robotic radiotherapy required fewer injections to control their disease compared to standard treatment.

    The study found that the robotically controlled device saves the NHS £565 for each patient treated over the first two years, as it results in fewer injections.
    The study lead and first author on the paper, Professor Timothy Jackson, King’s College London and Consultant Ophthalmic Surgeon at King’s College Hospital said: “Research has previously tried to find a better way to target radiotherapy to the macula, such as by repurposing devices used to treat brain tumours. But so far nothing has been sufficiently precise to target macular disease that may be less than 1 mm across.
    “With this purpose-built robotic system, we can be incredibly precise, using overlapping beams of radiation to treat a very small lesion in the back of the eye.
    “Patients generally accept that they need to have eye injections to help preserve their vision, but frequent hospital attendance and repeated eye injections isn’t something they enjoy. By better stabilising the disease and reducing its activity, the new treatment could reduce the number of injections people need by about a quarter. Hopefully, this discovery will reduce the burden of treatment that patients have to endure.”
    Dr Helen Dakin, University Research Lecturer at the University of Oxford said: “We found that the savings from giving fewer injections are larger than the cost of robot-controlled radiotherapy. This new treatment can therefore save the NHS money that can be used to treat other patients, while controlling patients’ AMD just as well as standard care.”
    The research was jointly funded by the National Institute for Health and Care Research (NIHR) and the Medical Research Council (MRC) and recruited 411 participants across 30 NHS hospitals. A Lancet-commissioned commentary that accompanied the article described it as a “landmark trial.”
    This study was led by researchers from King’s College London and doctors at King’s College Hospital NHS Foundation Trust, in collaboration with the University of Oxford, the University of Bristol and Queen’s University Belfast.

  • Quantum dots and metasurfaces: Deep connections in the nano world

    In relationships, sharing closer spaces naturally deepens the connection as bonds form and strengthen through shared memories. This principle applies not only to human interactions but also to engineering. Recently, an intriguing study demonstrated the use of quantum dots to create metasurfaces with the dots embedded directly inside them, in effect allowing the two components to occupy the same space.
    Professor Junsuk Rho from the Department of Mechanical Engineering, the Department of Chemical Engineering, and the Department of Electrical Engineering, PhD candidates Minsu Jeong, Byoungsu Ko, and Jaekyung Kim from the Department of Mechanical Engineering, and Chunghwan Jung, a PhD candidate, from the Department of Chemical Engineering at Pohang University of Science and Technology (POSTECH) employed Nanoimprint Lithography (NIL) to fabricate metasurfaces embedded with quantum dots, enhancing their luminescence efficiency. Their research was recently published in the online edition of Nano Letters.
    NIL, a process for creating optical metasurfaces, uses patterned stamps to quickly transfer intricate patterns at the nanometer (nm) scale. The method is cheaper than electron-beam lithography and similar processes, and it can create metasurfaces from materials that are not available to conventional processes.
    Metasurfaces have recently been the focus of extensive research for their ability to control the polarization and emission direction of light from quantum dots. Quantum dots, which are nanoscale semiconductor particles, are highly efficient light emitters capable of emitting light at precise wavelengths. This makes them widely used in applications such as QLEDs and quantum computing. However, conventional processes cannot embed quantum dots within metasurfaces. As a result, research has often involved fabricating metasurfaces and quantum dots separately and then combining them, which imposes limitations on controlling the luminescence of the quantum dots.
    In this study, the researchers integrated quantum dots with titanium dioxide (TiO2), a material used in the NIL process, to create a metasurface. Unlike conventional methods, which involve separately fabricating the metasurface and quantum dots before combining them, this approach embeds the quantum dots directly within the metasurface during its creation.
    The resulting metasurface enhances the proportion of photons emitted from the quantum dots that couple with the resonance mode of the metasurface. This advancement allows for more effective control over the specific direction of light emitted from the quantum dots compared to previous methods.
    Experiments demonstrated that the more photons emitted from the quantum dots that were coupled to the resonant modes of the metasurface, the higher the luminescence efficiency. The team’s metasurface achieved up to 25 times greater luminescence efficiency compared to a simple coating of quantum dots.
    Professor Junsuk Rho of POSTECH who led the research stated, “The use of luminescence-controlled metasurfaces will enable sharper, brighter displays and more precise, sensitive biosensing.” He added, “Further research will allow us to control luminescence more effectively, leading to advances in areas such as nano-optical sensors, optoelectronic devices, and quantum dot displays.”
    The research was conducted with support from POSCO N.EX.T IMPACT, the Samsung Future Technology Incubation Program, and the Mid-Career Researcher Program of the Ministry of Science and ICT and the National Research Foundation of Korea.

  • Towards a new era in flexible piezoelectric sensors for both humans and robots

    Flexible piezoelectric sensors are essential for monitoring the motions of both humans and humanoid robots. However, existing designs are either costly or have limited sensitivity. In a recent study, researchers from Japan tackled these issues by developing a novel piezoelectric composite material made from electrospun polyvinylidene fluoride nanofibers combined with dopamine. Sensors made from this material showed significant performance and stability improvements at a low cost, promising advancements in medicine, healthcare, and robotics.
    The world is accelerating rapidly towards the intelligent era — a stage in history marked by increased automation and interconnectivity by leveraging technologies such as artificial intelligence and robotics. As a sometimes-overlooked foundational requirement in this transformation, sensors represent an essential interface between humans, machines, and their environment.
    However, now that robots are becoming more agile and wearable electronics are no longer confined to science fiction, traditional silicon-based sensors won’t make the cut in many applications. Thus, flexible sensors, which provide better comfort and higher versatility, have become a very active area of study. Piezoelectric sensors are particularly important in this regard, as they can convert mechanical stress and stretching into an electrical signal. Despite numerous promising approaches, there remains a lack of environmentally sustainable methods for mass-producing flexible, high-performance piezoelectric sensors at a low cost.
    Against this backdrop, a research team from Shinshu University, Japan, decided to step up to the challenge and improve flexible piezoelectric sensor design using a well-established manufacturing technique: electrospinning. Their latest study, which was led by Distinguished Professor Ick Soo Kim in association with Junpeng Xiong, Ling Wang, Mayakrishnan Gopiraman, and Jian Shi, was published on May 2, 2024, in the journal Nature Communications.
    The proposed flexible sensor design involves the stepwise electrospinning of a composite 2D nanofiber membrane. First, polyvinylidene fluoride (PVDF) nanofibers with diameters on the order of 200 nm are spun, forming a strong, uniform network that acts as the base for the piezoelectric sensor. Then, ultrafine PVDF nanofibers with diameters smaller than 35 nm are spun onto the preexisting base. These fibers automatically become interwoven within the gaps of the base network, creating a distinctive 2D topology.
    After characterization via experiments, simulations, and theoretical analyses, the researchers found that the resulting composite PVDF network had enhanced beta crystal orientation. By enhancing this polar phase, which is responsible for the piezoelectric effect observed in PVDF materials, the piezoelectric performance of the sensors was significantly improved. To increase the stability of the material further, the researchers introduced dopamine (DA) during the electrospinning process, which created a protective core-shell structure.
    “Sensors fabricated from PVDF/DA composite membranes exhibited superb performance, including a wide response range of 1.5-40 N, high sensitivity of 7.29 V/N to weak forces in the range of 0-4 N, and excellent operational durability,” remarks Kim. These exceptional qualities were demonstrated practically using wearable sensors to measure a wide variety of human movements and actions. More specifically, the proposed sensors, when worn by a human, produced an easily distinguishable voltage response to natural motions and physiological signals, including finger tapping, knee and elbow bending, foot stamping, and even speaking and wrist pulses.
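    As a back-of-the-envelope illustration of what that sensitivity figure means in practice, the snippet below converts a measured peak voltage into an estimated force using the reported 7.29 V/N value for the 0-4 N range; the linear model and function name are illustrative assumptions, not part of the study.

        SENSITIVITY_V_PER_N = 7.29        # reported sensitivity for weak forces (0-4 N)

        def estimate_force_newtons(peak_voltage_v: float) -> float:
            """Estimate the applied force from the sensor's peak output voltage."""
            force_n = peak_voltage_v / SENSITIVITY_V_PER_N
            if force_n > 4.0:
                raise ValueError("outside the 0-4 N range where this sensitivity was reported")
            return force_n

        print(f"{estimate_force_newtons(14.6):.2f} N")   # about 2.0 N for a 14.6 V peak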
    Given the potential low-cost mass production of these piezoelectric sensors, combined with their use of environmentally friendly organic materials instead of harmful inorganics, this study could have important technological implications not only for health monitoring and diagnostics, but also robotics. “Despite the current challenges, humanoid robots are poised to play an increasingly integral role in the very near future. For instance, the well-known Tesla robot ‘Optimus’ can already mimic human motions and walk like a human,” muses Kim, “Considering high-tech sensors are currently being used to monitor robot motions, our proposed nanofiber-based superior piezoelectric sensors hold much potential not only for monitoring human movements, but also in the field of humanoid robotics.”
    To make the adoption of these sensors easier, the research team will be focusing on improving the material’s electrical output properties so that flexible electronic components can be driven without the need for an external power source. Hopefully, further progress in this area will accelerate our stride towards the intelligent era, leading to more comfortable and sustainable lives.