More stories

    Incorporating ‘touch’ into social media interactions can increase feelings of support and approval

    Incorporating “tactile emoticons” into social media communications can enhance communication, according to a study published June 12, 2024 in the open-access journal PLOS ONE by Alkistis Saramandi and Yee Ki Au from University College London, United Kingdom, and colleagues.
    Digital communications rely exclusively on visual and auditory cues (text, emoticons, videos, and music) to convey tone and emotion. Currently lacking from these platforms is touch, which can convey feelings of love and support, impact emotions, and influence behaviors. Technology companies are developing devices to incorporate touch into digital interactions, such as interactive kiss and hug transmission devices. These social touch devices can elicit feelings of closeness and positive emotions, but the effect of touch in social media communication is not well studied.
    In this study, researchers incorporated tactile emoticons into social media interactions. Using a mock social media platform, participants were given posts to send that expressed either positive or negative emotions. They then received feedback via visual emoticons (e.g., a heart or thumbs up), tactile emoticons (a stroke on the forearm by either another person or a robotic device), or both.
    Participants reported greater feelings of support and approval when they received the tactile emoticons compared to the visual-only feedback. This suggests that social touch, even by a stranger, can convey meaning without any other accompanying communication. Feedback consisting of both visual and tactile emoticons was preferred over either type of emoticon alone. The researchers noted that touch could offer additional context to visual emoticons, which can be misinterpreted. They also found that the type of touch matters. Touch delivered at a speed that activates the C-tactile (CT) afferent system, which produces the positive emotions associated with touching or embracing, was experienced more favorably than other types of touch.
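    The stroking speed mentioned here matters because C-tactile afferents are reported in the affective-touch literature to respond most strongly to gentle stroking at roughly 1-10 cm/s. As a minimal illustration (the speed window comes from that literature, not from this study, and is an assumption here), a stroke could be screened for CT-optimality like this:

```python
# Screen a stroking speed for the C-tactile (CT) "optimal" range.
# The 1-10 cm/s window is a reported value from the affective-touch
# literature, used here only for illustration.
CT_OPTIMAL_RANGE_CM_S = (1.0, 10.0)

def is_ct_optimal(speed_cm_s: float) -> bool:
    """Return True if a stroke speed falls in the CT-optimal window."""
    low, high = CT_OPTIMAL_RANGE_CM_S
    return low <= speed_cm_s <= high

print(is_ct_optimal(3.0))   # gentle stroke, within the window
print(is_ct_optimal(30.0))  # brisk rub, outside the window
```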
    According to the authors, this is the first study to explore the role of touch in communicating emotions via social media. They hope that the results can inform the development of devices to deliver touch during digital communications.
    The authors add: “Touch has long been essential to human bonding and survival, yet in our digitized world we are more isolated and touch-deprived than ever. What if we could use ‘digitalized touch’ to bring us closer in communicating emotions in today’s world?”

    To heal skin, scientists invent living bioelectronics

    For much of his childhood, Simiao Niu was troubled by psoriasis, a chronic, often painful skin condition, mostly on his arms.
    Sometimes, the prescribed ointment worked and treated the inflamed, red areas produced by the disease. But he was never sure if he was using enough, whether the skin irritation was healing and when he should stop treatment.
    Years later, as an engineer on the Apple Inc. team that devised the electronics behind the heart-rhythm monitoring in Apple Watches, Niu had a revelation: Could a similar wearable electronic device be developed to treat skin ailments such as psoriasis and provide patients with continuous feedback?
    Now, Niu, an assistant professor of biomedical engineering in the School of Engineering at Rutgers-New Brunswick, has played a crucial role in the development of the kind of device that he dreamed of: a unique prototype of what he and his research collaborators are calling a “living bioelectronic” designed to treat psoriasis.
    Describing the advance in Science magazine, Niu and collaborators, including scientists at the University of Chicago, led by Bozhi Tian, and at Columbia University, reported developing a patch — a combination of advanced electronics, living cells and hydrogel — that is showing efficacy in experiments in mice.
    The patch represents not only a potential treatment for psoriasis, but a new technology platform to deliver treatments for medical needs as diverse as wounds and, potentially, various skin cancers, Niu said.
    “We were looking for a new type of device that combines sensing and treatment for managing skin inflammation diseases like psoriasis,” Niu said. “We found that by combining living bacteria, flexible electronics and adhesive skin interface materials, we were able to create a new type of device.”
    The circular patch is about 1 inch in diameter and wafer thin. The patch contains electronic chips, bacterial cells and a gel made from starch and gelatin. Tests in mice conducted by the research team showed that the device could continuously monitor and improve psoriasis-like symptoms without irritating skin.

    The device, Niu said, is a leap forward from conventional bioelectronics, which are generally composed of electronic components that are encased in a soft synthetic layer that reduces irritation when in contact with the body. The patches placed on a patient’s chest for an electrocardiogram are examples of conventional devices.
    Niu’s invention could be seen as a “living drug,” he said, in that it incorporates living cells as part of its therapy. The bacterium Staphylococcus epidermidis, which lives on human skin and has been shown to reduce inflammation, is embedded in the device’s gel casing. A thin, flexible printed circuit forms the skeleton of the device.
    When the device is placed on skin, the bacteria secrete compounds that reduce inflammation, while sensors in the flexible circuits monitor the skin for signals indicating healing, such as skin impedance, temperature and humidity. The data collected by the circuits is beamed wirelessly to a computer or a cell phone, a process that would allow patients to monitor their healing process.
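    The monitor-and-report loop described above can be sketched in a few lines of code. Everything below is hypothetical: the sensor channels mirror those named in the article, but the thresholds and the combined “healing score” are invented for illustration and are not from the paper:

```python
# Hypothetical sketch of the patch's monitor-and-report loop: read skin
# impedance, temperature, and humidity, then summarize them as a single
# 0-1 "healing" score for the patient's phone. All thresholds and the
# scoring formula are illustrative assumptions, not the paper's model.
def healing_score(impedance_kohm, temp_c, humidity_pct):
    """Combine three sensor channels into a 0-1 score (illustrative)."""
    # Lower impedance and near-normal skin temperature are treated here
    # as signs of healing; the real device's analysis may differ.
    imp_term = max(0.0, min(1.0, (150.0 - impedance_kohm) / 100.0))
    temp_term = max(0.0, 1.0 - abs(temp_c - 33.0) / 5.0)
    hum_term = max(0.0, min(1.0, humidity_pct / 100.0))
    return round((imp_term + temp_term + hum_term) / 3.0, 3)

# One simulated reading, as might be beamed to a phone.
reading = {"impedance_kohm": 90.0, "temp_c": 33.5, "humidity_pct": 60.0}
print(healing_score(**reading))
```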
    During his years at Apple, before he joined the Rutgers faculty in 2022, Niu and other engineers were forwarded hundreds of thank-you notes that had been sent to the chief executive’s office. The customers wrote to credit their Apple Watches with saving their lives, Niu said. The watches’ built-in heart rate monitors pointed to a condition — an arrhythmia known as atrial fibrillation — that the customers said they didn’t know they had. Atrial fibrillation can lead to strokes if left untreated.
    “When you produce the kinds of things that positively affect people’s lives, you feel very proud,” Niu said. “That is something that inspires me a lot and motivates me to do my current research.”
    Clinical trials to test the device on human patients must come next, Niu said, as the first step toward commercialization. Once there is evidence of positive results with minimum side effects, the inventors would apply for FDA approval in order to bring the device to market, Niu said.
    Other authors of the study from Rutgers included Fuying Dong and Chi Han, two graduate students in the Department of Biomedical Engineering in the School of Engineering.

    Robot radiotherapy could improve treatments for eye disease

    Researchers from King’s, with doctors at King’s College Hospital NHS Foundation Trust, have successfully used a new robot system to improve treatment for debilitating eye disease.
    The custom-built robot was used to treat wet neovascular age-related macular degeneration (AMD), administering a one-off, minimally invasive dose of radiation, followed by patients’ routine treatment with injections into their eye.
    In the landmark trial, published today in The Lancet, patients then needed fewer injections to effectively control the disease, potentially saving around 1.8 million injections per year worldwide.
    Wet AMD is a debilitating eye disease, where abnormal new blood vessels grow into the macula, the light-sensing layer of cells inside the back of the eyeball. The vessels then start to leak blood and fluid, typically causing a rapid, permanent and severe loss of sight.
    Globally, around 196 million people have AMD and the Royal College of Ophthalmologists estimates that the disease affects more than 700,000 people in the UK. The number of people with AMD is expected to increase by 60% by 2035, due to the country’s ageing population.
    Wet AMD is currently treated with regular injections into the eye. Initially, treatment substantially improves a patient’s vision. But, because the injections don’t cure the disease, fluid will eventually start to build up again in the macula, and patients will require long-term, repeated injections. Most people require an injection around every 1-3 months, and eye injections, costing between £500 and £800 per injection, have become one of the most common NHS procedures.
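    A quick back-of-envelope check, using only the figures quoted above (an injection every 1-3 months at £500-£800 each), shows how wide the annual per-patient cost range is:

```python
# Back-of-envelope annual cost of wet-AMD eye injections, using only
# the figures quoted in the article: one injection every 1-3 months,
# at £500-£800 per injection.
injections_per_year_low = 12 // 3   # every 3 months -> 4 per year
injections_per_year_high = 12 // 1  # every month   -> 12 per year

cost_low = injections_per_year_low * 500    # fewest, cheapest
cost_high = injections_per_year_high * 800  # most frequent, priciest

print(f"£{cost_low:,}-£{cost_high:,} per patient per year")
```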
    The new treatment can be targeted far better than existing methods, aiming three beams of highly focused radiation into the diseased eye. Scientists found that patients having robotic radiotherapy required fewer injections to control their disease compared to standard treatment.

    The study found that the robotically controlled device saves the NHS £565 for each patient treated over the first two years, as it results in fewer injections.
    The study lead and first author on the paper, Professor Timothy Jackson, King’s College London and Consultant Ophthalmic Surgeon at King’s College Hospital said: “Research has previously tried to find a better way to target radiotherapy to the macula, such as by repurposing devices used to treat brain tumours. But so far nothing has been sufficiently precise to target macular disease that may be less than 1 mm across.
    “With this purpose-built robotic system, we can be incredibly precise, using overlapping beams of radiation to treat a very small lesion in the back of the eye.
    “Patients generally accept that they need to have eye injections to help preserve their vision, but frequent hospital attendance and repeated eye injections isn’t something they enjoy. By better stabilising the disease and reducing its activity, the new treatment could reduce the number of injections people need by about a quarter. Hopefully, this discovery will reduce the burden of treatment that patients have to endure.”
    Dr Helen Dakin, University Research Lecturer at the University of Oxford said: “We found that the savings from giving fewer injections are larger than the cost of robot-controlled radiotherapy. This new treatment can therefore save the NHS money that can be used to treat other patients, while controlling patients’ AMD just as well as standard care.”
    The research was jointly funded by the National Institute for Health and Care Research (NIHR) and the Medical Research Council (MRC) and recruited 411 participants across 30 NHS hospitals. A Lancet-commissioned commentary that accompanied the article described it as a “landmark trial.”
    This study was led by researchers from King’s College London and doctors at King’s College Hospital NHS Foundation Trust, in collaboration with the University of Oxford, the University of Bristol and Queen’s University in Belfast.

    Quantum dots and metasurfaces: Deep connections in the nano world

    In relationships, sharing closer spaces naturally deepens the connection as bonds form and strengthen through accumulating shared memories. This principle applies not only to human interactions but also to engineering. Recently, an intriguing study demonstrated metasurfaces with quantum dots embedded directly inside them, allowing two components to share the same space.
    Professor Junsuk Rho from the Department of Mechanical Engineering, the Department of Chemical Engineering, and the Department of Electrical Engineering, PhD candidates Minsu Jeong, Byoungsu Ko, and Jaekyung Kim from the Department of Mechanical Engineering, and Chunghwan Jung, a PhD candidate, from the Department of Chemical Engineering at Pohang University of Science and Technology (POSTECH) employed Nanoimprint Lithography (NIL) to fabricate metasurfaces embedded with quantum dots, enhancing their luminescence efficiency. Their research was recently published in the online edition of Nano Letters.
    NIL, a process for creating optical metasurfaces, utilizes patterned stamps to quickly transfer intricate patterns at the nanometer (nm) scale. This method offers cost advantages over electron beam lithography and other processes and has the advantage of enabling the creation of metasurfaces using materials that are not available in conventional processes.
    Metasurfaces have recently been the focus of extensive research for their ability to control the polarization and emission direction of light from quantum dots. Quantum dots, which are nanoscale semiconductor particles, are highly efficient light emitters capable of emitting light at precise wavelengths. This makes them widely used in applications such as QLEDs and quantum computing. However, conventional processes cannot embed quantum dots within metasurfaces. As a result, research has often involved fabricating metasurfaces and quantum dots separately and then combining them, which imposes limitations on controlling the luminescence of the quantum dots.
    In this study, the researchers integrated quantum dots with titanium dioxide (TiO2), a material used in the NIL process, to create a metasurface. Unlike conventional methods, which involve separately fabricating the metasurface and quantum dots before combining them, this approach embeds the quantum dots directly within the metasurface during its creation.
    The resulting metasurface enhances the proportion of photons emitted from the quantum dots that couple with the resonance mode of the metasurface. This advancement allows for more effective control over the specific direction of light emitted from the quantum dots compared to previous methods.
    Experiments demonstrated that the more photons emitted from the quantum dots that were coupled to the resonant modes of the metasurface, the higher the luminescence efficiency. The team’s metasurface achieved up to 25 times greater luminescence efficiency compared to a simple coating of quantum dots.
    Professor Junsuk Rho of POSTECH who led the research stated, “The use of luminescence-controlled metasurfaces will enable sharper, brighter displays and more precise, sensitive biosensing.” He added, “Further research will allow us to control luminescence more effectively, leading to advances in areas such as nano-optical sensors, optoelectronic devices, and quantum dot displays.”
    The research was conducted with support from POSCO N.EX.T IMPACT, the Samsung Future Technology Incubation Program, and the Mid-Career Researcher Program of the Ministry of Science and ICT and the National Research Foundation of Korea.

    Towards a new era in flexible piezoelectric sensors for both humans and robots

    Flexible piezoelectric sensors are essential to monitor the motions of both humans and humanoid robots. However, existing designs are either costly or have limited sensitivity. In a recent study, researchers from Japan tackled these issues by developing a novel piezoelectric composite material made from electrospun polyvinylidene fluoride nanofibers combined with dopamine. Sensors made from this material showed significant performance and stability improvements at a low cost, promising advancements in medicine, healthcare, and robotics.
    The world is accelerating rapidly towards the intelligent era — a stage in history marked by increased automation and interconnectivity by leveraging technologies such as artificial intelligence and robotics. As a sometimes-overlooked foundational requirement in this transformation, sensors represent an essential interface between humans, machines, and their environment.
    However, now that robots are becoming more agile and wearable electronics are no longer confined to science fiction, traditional silicon-based sensors won’t make the cut in many applications. Thus, flexible sensors, which provide better comfort and higher versatility, have become a very active area of study. Piezoelectric sensors are particularly important in this regard, as they can convert mechanical stress and stretching into an electrical signal. Despite numerous promising approaches, there remains a lack of environmentally sustainable methods for mass-producing flexible, high-performance piezoelectric sensors at a low cost.
    Against this backdrop, a research team from Shinshu University, Japan, decided to step up to the challenge and improve flexible piezoelectric sensor design using a well-established manufacturing technique: electrospinning. Their latest study, which was led by Distinguished Professor Ick Soo Kim in association with Junpeng Xiong, Ling Wang, Mayakrishnan Gopiraman, and Jian Shi, was published on May 2, 2024, in the journal Nature Communications.
    The proposed flexible sensor design involves the stepwise electrospinning of a composite 2D nanofiber membrane. First, polyvinylidene fluoride (PVDF) nanofibers with diameters in the order of 200 nm are spun, forming a strong uniform network that acts as the base for the piezoelectric sensor. Then, ultrafine PVDF nanofibers with diameters smaller than 35 nm are spun onto the preexisting base. These fibers become automatically interwoven within the gaps of the base network, creating a distinctive 2D topology.
    After characterization via experiments, simulations, and theoretical analyses, the researchers found that the resulting composite PVDF network had enhanced beta crystal orientation. By enhancing this polar phase, which is responsible for the piezoelectric effect observed in PVDF materials, the piezoelectric performance of the sensors was significantly improved. To increase the stability of the material further, the researchers introduced dopamine (DA) during the electrospinning process, which created a protective core-shell structure.
    “Sensors fabricated using PVDF/DA composite membranes exhibited superb performance, including a wide response range of 1.5-40 N, high sensitivity of 7.29 V/N to weak forces in the range of 0-4 N, and excellent operational durability,” remarks Kim. These exceptional qualities were demonstrated practically using wearable sensors to measure a wide variety of human movements and actions. More specifically, the proposed sensors, when worn by a human, could produce an easily distinguishable voltage response to natural motions and physiological signals. This included finger tapping, knee and elbow bending, foot stamping, and even speaking and wrist pulses.
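    The quoted sensitivity makes the weak-force response easy to sanity-check. Assuming a perfectly linear response over the 0-4 N range (an idealization; real sensors deviate from linearity), the output voltage for a given force would be:

```python
# Sanity check of the quoted weak-force sensitivity (7.29 V/N over
# 0-4 N). A linear response in that range is assumed for illustration.
SENSITIVITY_V_PER_N = 7.29
WEAK_FORCE_RANGE_N = (0.0, 4.0)

def weak_force_voltage(force_n: float) -> float:
    """Idealized output voltage for a force in the weak-force range."""
    low, high = WEAK_FORCE_RANGE_N
    if not (low <= force_n <= high):
        raise ValueError("outside the quoted weak-force range")
    return round(SENSITIVITY_V_PER_N * force_n, 2)

print(weak_force_voltage(2.0))  # a 2 N tap -> 14.58 V
```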
    Given the potential low-cost mass production of these piezoelectric sensors, combined with their use of environmentally friendly organic materials instead of harmful inorganics, this study could have important technological implications not only for health monitoring and diagnostics, but also robotics. “Despite the current challenges, humanoid robots are poised to play an increasingly integral role in the very near future. For instance, the well-known Tesla robot ‘Optimus’ can already mimic human motions and walk like a human,” muses Kim, “Considering high-tech sensors are currently being used to monitor robot motions, our proposed nanofiber-based superior piezoelectric sensors hold much potential not only for monitoring human movements, but also in the field of humanoid robotics.”
    To make the adoption of these sensors easier, the research team will be focusing on improving the material’s electrical output properties so that flexible electronic components can be driven without the need for an external power source. Hopefully, further progress in this area will accelerate our stride towards the intelligent era, leading to more comfortable and sustainable lives.

    AI better detects prostate cancer on MRI than radiologists

    AI detects prostate cancer more often than radiologists. Additionally, AI triggers false alarms half as often. This is shown by an international study coordinated by Radboud university medical center and published in The Lancet Oncology. This is the first large-scale study where an international team transparently evaluates and compares AI with radiologist assessments and clinical outcomes.
    Radiologists face an increasing workload as men with a higher risk of prostate cancer now routinely receive a prostate MRI. Diagnosing prostate cancer with MRI requires significant expertise, and there is a shortage of experienced radiologists. AI can assist with these challenges.
    AI expert Henkjan Huisman and radiologist Maarten de Rooij, project leaders of the PI-CAI study, organized, together with an international team, a major competition between AI teams and radiologists. Along with other centers in the Netherlands and Norway, they provided over 10,000 MRI scans. They transparently determined for each patient whether prostate cancer was present. They allowed various groups worldwide to develop AI for analyzing these images. The top five submissions were combined into a super-algorithm for analyzing MRI scans for prostate cancer. Finally, AI assessments were compared to those of a group of radiologists on four hundred prostate MRI scans.
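    Combining the top five submissions into a single “super-algorithm” is, in spirit, an ensemble. The article does not describe the exact combination method, so simple score averaging is used below purely as an illustration:

```python
# Illustrative ensemble of several detectors' suspicion scores by
# simple averaging; the study's actual combination method may differ.
def ensemble_scores(per_model_scores):
    """Average aligned suspicion scores from several models.

    per_model_scores: list of equal-length score lists (one per model),
    each score in [0, 1] for the same candidate regions of one scan.
    """
    n_models = len(per_model_scores)
    n_regions = len(per_model_scores[0])
    return [
        sum(model[i] for model in per_model_scores) / n_models
        for i in range(n_regions)
    ]

# Three toy models scoring the same two candidate regions of one scan.
scores = ensemble_scores([[0.9, 0.2], [0.8, 0.1], [0.7, 0.3]])
print(scores)  # averaged suspicion per region
```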
    Accurate Diagnosis
    The PI-CAI community brought together over two hundred AI teams and 62 radiologists from twenty countries. They compared the findings of AI and radiologists not only with each other but also with a gold standard, as they monitored the outcomes of the men from whom the scans originated. On average, the men were followed for five years.
    This first international study on AI in prostate diagnostics shows that AI detects nearly seven percent more significant prostate cancers than the group of radiologists. Additionally, AI identifies suspicious areas, later found not to be cancer, fifty percent less often. This means the number of biopsies could be halved with the use of AI. If these results are replicated in follow-up studies, it could greatly assist radiologists and patients in the future. It could reduce radiologists’ workload, provide more accurate diagnoses, and minimize unnecessary prostate biopsies. The developed AI still needs to be validated and is currently not yet available for patients in clinical settings.
    Quality System
    Huisman observes that society has little trust in AI. ‘This is because manufacturers sometimes build AI that isn’t good enough’, he explains. He is working on two things. The first is a public and transparent test to fairly evaluate AI. The second is a quality management system, similar to what exists in the aviation industry. ‘If planes almost collide, a safety committee will look at how to improve the system so that it doesn’t happen in the future. I want the same for AI. I want to research and develop a system that learns from every mistake so that AI is monitored and can continue to improve. That way, we can build trust in AI for healthcare. Optimal, governed AI can help make healthcare better and more efficient.’

    Breakthrough in next-generation memory technology!

    A research team led by Professor Jang-Sik Lee from the Department of Materials Science and Engineering and the Department of Semiconductor Engineering at Pohang University of Science and Technology (POSTECH) has significantly enhanced the data storage capacity of ferroelectric memory devices by utilizing hafnia-based ferroelectric materials and an innovative device structure. Their findings, published on June 7 in the international journal Science Advances, mark a substantial advancement in memory technology.
    With the exponential growth in data production and processing due to advancements in electronics and artificial intelligence (AI), the importance of data storage technologies has surged. NAND flash memory, one of the most prevalent technologies for mass data storage, can store more data in the same area by stacking cells in a three-dimensional structure rather than a planar one. However, this approach relies on charge traps to store data, which results in higher operating voltages and slower speeds.
    Recently, hafnia-based ferroelectric memory has emerged as a promising next-generation memory technology. Hafnia (Hafnium oxide) enables ferroelectric memories to operate at low voltages and high speeds. However, a significant challenge has been the limited memory window for multilevel data storage.
    Professor Jang-Sik Lee’s team at POSTECH has addressed this issue by introducing new materials and a novel device structure. They enhanced the performance of hafnia-based memory devices by doping the ferroelectric materials with aluminum, creating high-performance ferroelectric thin films. Additionally, they replaced the conventional metal-ferroelectric-semiconductor (MFS) structure, in which the device’s metal and ferroelectric layers are simply stacked, with an innovative metal-ferroelectric-metal-ferroelectric-semiconductor (MFMFS) structure.
    The team controlled the voltage across each layer by adjusting the capacitance of the ferroelectric layers, fine-tuning factors such as the thickness and area ratio of the metal-to-metal and metal-to-channel ferroelectric layers. This efficient use of the applied voltage to switch the ferroelectric layers improved the device’s performance and reduced energy consumption.
    Conventional hafnia-based ferroelectric devices typically have a memory window of around 2 volts (V). In contrast, the research team’s device achieved a memory window exceeding 10 V, enabling Quad-Level Cell (QLC) technology, which stores 16 levels of data (4 bits) per unit transistor. It also demonstrated high stability after more than one million cycles and operated at voltages of 10 V or less, significantly lower than the 18 V required for NAND flash memory. Furthermore, the team’s memory device exhibited stable characteristics in terms of data retention.
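    The link between memory window and multilevel storage is simple arithmetic: 16 levels is log2(16) = 4 bits per cell, and a wider window leaves more voltage spacing between adjacent levels. A rough comparison using the figures above (uniform level spacing is an assumption; real devices allocate margins more carefully):

```python
import math

# Bits per cell from the number of distinguishable memory levels, and
# the rough voltage spacing between adjacent levels if a memory window
# is divided uniformly (a simplifying assumption for illustration).
def bits_per_cell(levels: int) -> int:
    return int(math.log2(levels))

def level_spacing_v(window_v: float, levels: int) -> float:
    return round(window_v / (levels - 1), 3)

print(bits_per_cell(16))          # QLC: 4 bits per cell
print(level_spacing_v(10.0, 16))  # the team's >10 V window
print(level_spacing_v(2.0, 16))   # conventional ~2 V window
```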
    NAND flash memory programs its memory states using Incremental Step Pulse Programming (ISPP), which leads to long programming times and complex circuitry. In contrast, the team’s device achieves rapid programming through one-shot programming by controlling ferroelectric polarization switching.
    Professor Jang-Sik Lee of POSTECH commented, “We have laid the technological foundation for overcoming the limitations of existing memory devices and provided a new research direction for hafnia-based ferroelectric memory.” He added, “Through follow-up research, we aim to develop low-power, high-speed, and high-density memory devices, contributing to solving power issues in data centers and artificial intelligence applications.”
    The research was conducted with support from the Project for Next-generation Intelligent Semiconductor Technology Development of the Ministry of Science and ICT (National Research Foundation of Korea) and Samsung Electronics.

    An AI-powered wearable system tracks the 3D movement of smart pills in the gut

    Scientists at the University of Southern California have developed an artificial intelligence (AI)-powered system to track tiny devices that monitor markers of disease in the gut. Devices using the novel system may help at-risk individuals monitor their gastrointestinal (GI) tract health at home, without the need for invasive tests in hospital settings. This work appears June 12 in the journal Cell Reports Physical Science.
    “Ingestibles are like Fitbits for the gut,” says author Yasser Khan, assistant professor of electrical and computer engineering at the University of Southern California. “But tracking them once swallowed has been a significant challenge.”
    Gas that is formed in the intestines when bacteria break down food can offer insights into a person’s health. Currently, to measure GI tract gases, physicians either use direct methods such as flatus collection and intestinal tube collection, or indirect methods such as breath testing and stool analysis. Ingestible capsules — devices that a user swallows — offer a promising alternative, but no such technologies have been developed for precise gas sensing.
    To tackle this problem, Khan and colleagues developed a system that includes a wearable coil, which the user can conceal under a t-shirt or other clothing. The coil creates a magnetic field, which interacts with sensors embedded in an ingestible pill after it has been swallowed. AI analyzes the signals the pill receives, pinpointing the device’s location in the gut to within a few millimeters. In addition, the system monitors real-time 3D concentrations of ammonia, a proxy for a bacterium linked with ulcers and gastric cancer, via the device’s optical gas-sensing membranes.
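    At a sketch level, localization can be framed as learning a map from measured signal features to a 3D position. The stand-in below uses a nearest-neighbor lookup over an invented calibration table; the study’s AI model is far more sophisticated, and all numbers here are hypothetical:

```python
# Minimal stand-in for the AI localization stage: match a measured
# signal vector to the closest entry in a calibration table and return
# that entry's 3D position. All values are invented for the demo.
def locate(signal, calibration):
    """calibration: list of (signal_vector, (x, y, z) position in mm)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    _, position = min(calibration, key=lambda entry: dist2(entry[0], signal))
    return position

calibration = [
    ((0.90, 0.10), (10.0, 5.0, 3.0)),   # signal observed near position A
    ((0.40, 0.70), (42.0, 18.0, 9.0)),  # signal observed near position B
]
print(locate((0.88, 0.12), calibration))  # closest to position A
```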
    While previous attempts to track ingestibles as they journey through the gut have relied on bulky desktop coils, the wearable coil can be used anywhere, says Khan. The technology may also have other applications beyond measuring GI tract gases, such as identifying inflammation in the gut caused by Crohn’s disease and delivering drugs to precisely these regions.
    The researchers tested the system’s performance in a variety of media that mimic the GI tract, including a simulated cow intestine and liquids designed to replicate stomach and intestinal fluids.
    “During these tests, the device demonstrated its ability to pinpoint its location and measure levels of oxygen and ammonia gases,” says Khan. “Any ingestible device can utilize the technology we’ve developed.”
    However, there are still improvements to be made to the device, says Khan, such as designing it to be smaller and to use less power. Next, as they continue to hone the device, Khan and colleagues plan to test it in pigs in order to study its safety and effectiveness in an organism with human-like biology.
    “Successful outcomes from these trials will bring the device nearer to readiness for human clinical trials,” says Khan. “We are optimistic about the practicality of the system and believe it will soon be applicable for use in humans.”