More stories

  •

    New graphene-based sensor technology for wearable medical devices

    Researchers at AMBER, the SFI Centre for Advanced Materials and BioEngineering Research, and from Trinity’s School of Physics, have developed next-generation, graphene-based sensing technology using their innovative G-Putty material.
    The team’s printed sensors are 50 times more sensitive than the industry standard and outperform other comparable nano-enabled sensors in an important metric seen as a game-changer in the industry: flexibility.
    Maximising sensitivity and flexibility without reducing performance makes the team’s technology an ideal candidate for the emerging areas of wearable electronics and medical diagnostic devices.
    The team — led by Professor Jonathan Coleman from Trinity’s School of Physics, one of the world’s leading nanoscientists — demonstrated that they can produce a low-cost, printed, graphene nanocomposite strain sensor.
    They developed a method to formulate G-Putty based inks that can be printed as a thin-film onto elastic substrates, including band-aids, and attached easily to the skin.
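The sensitivity claim above is usually quantified by the gauge factor, the relative resistance change per unit strain. A minimal sketch of that calculation — the numbers below are illustrative stand-ins, not data from the paper:

```python
def gauge_factor(r_unstrained, r_strained, strain):
    """Gauge factor GF = (delta_R / R0) / strain, the standard
    sensitivity metric for resistive strain sensors."""
    delta_r = r_strained - r_unstrained
    return (delta_r / r_unstrained) / strain

# Illustrative comparison: a conventional metal-foil gauge (GF ~ 2)
# vs. a hypothetical high-sensitivity nanocomposite sensor.
foil = gauge_factor(120.0, 120.24, 0.001)  # 0.2% resistance change at 0.1% strain
nano = gauge_factor(120.0, 132.0, 0.001)   # 10% resistance change at 0.1% strain
print(foil, nano)  # 2.0 and 100.0
```

A higher gauge factor means a larger, easier-to-read electrical signal for the same deformation, which is what makes printed nanocomposite sensors attractive for on-skin use.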

  •

    Little to no increase in association between adolescents' mental health problems and digital tech

    With the explosion in digital entertainment options over the past several decades and the more recent restrictions on outdoor and in-person social activities, parents may worry that excessive engagement with digital technology could have long-term effects on their children’s mental health.
    A new study published in the journal Clinical Psychological Science, however, found little evidence for an increased association between adolescents’ technology engagement and mental health problems over the past 30 years. The data did not consistently support the suggestion that the technologies we worry about most (e.g., smartphones) are becoming more harmful.
    The new study, which included 430,000 U.K. and U.S. adolescents, investigated the links between social media use and depression, emotional problems, and conduct problems. It also examined the associations between television viewing and suicidality, depression, emotional problems, and conduct problems. Finally, the study explored the association between digital device use and suicidality.
    Of the eight associations examined in this research, only three showed some change over time. Social media use and television viewing became less strongly associated with depression. In contrast, social media’s association with emotional problems did increase, although only slightly. The study found no consistent changes in technology engagement’s associations with conduct problems or suicidality.
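The study's core question is whether the strength of an association (e.g., a correlation between technology use and symptom scores) changes across cohort years. A toy sketch of that analysis on synthetic data — the data, effect size, and cohort years here are invented for illustration, not the study's:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)

# Synthetic cohorts: each "year", simulate hours of technology use and a
# symptom score with a fixed, small underlying association (no trend built in).
yearly_r = []
for year in range(1991, 2021, 5):
    use = [random.gauss(3, 1) for _ in range(500)]
    symptoms = [0.1 * u + random.gauss(0, 1) for u in use]
    yearly_r.append((year, pearson(use, symptoms)))

for year, r in yearly_r:
    print(year, round(r, 3))  # a flat sequence of small correlations
```

Plotting or regressing these per-cohort correlations against year is, in spirit, how one checks whether an association is growing, shrinking, or stable over decades.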
    “If we want to understand the relationship between tech and well-being today, we need to first go back and look at historic data — as far back as when parents were concerned too much TV would give their kids square eyes — in order to bring the contemporary concerns we have about newer technologies into focus,” said Matti Vuorre, a postdoctoral researcher at the Oxford Internet Institute and lead author on the paper.
    The study also highlighted key factors preventing scientists from conclusively determining how technology use relates to mental health.
    “As more data accumulates on adolescents’ use of emerging technologies, our knowledge of them and their effects on mental health will become more precise,” said Andy Przybylski, director of research at Oxford Internet Institute and senior author on the study. “So, it’s too soon to draw firm conclusions about the increasing, or declining, associations between social media and adolescent mental health, and it is certainly way too soon to be making policy or regulation on this basis.
    “We need more transparent and credible collaborations between scientists and technology companies to unlock the answers. The data exists within the tech industry; scientists just need to be able to access it for neutral and independent investigation,” Przybylski said.
    Story Source:
    Materials provided by Association for Psychological Science. Note: Content may be edited for style and length.

  •

    New synapse-like phototransistor

    Researchers at the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) have achieved a breakthrough in energy-efficient phototransistors. Such devices could eventually help computers process visual information more like the human brain and serve as sensors in applications such as self-driving vehicles.
    The structures rely on a new type of semiconductor — metal-halide perovskites — which have proven to be highly efficient at converting sunlight into electrical energy and shown tremendous promise in a range of other technologies.
    “In general, these perovskite semiconductors are a really unique functional system with potential benefits for a number of different technologies,” said Jeffrey Blackburn, a senior scientist at NREL and co-author of a new paper outlining the research. “NREL became interested in this material system for photovoltaics, but they have many properties that could be applied to whole different areas of science.”
    In this case, the researchers combined perovskite nanocrystals with a network of single-walled carbon nanotubes to create a material combination they thought might have interesting properties for photovoltaics or detectors. When they shined a laser at it, they found a surprising electrical response.
    “What normally would happen is that, after absorbing the light, an electrical current would briefly flow for a short period of time,” said Joseph Luther, a senior scientist and co-author. “But in this case, the current continued to flow and did not stop for several minutes even when the light was switched off.”
    Such behavior is referred to as “persistent photoconductivity,” a form of “optical memory” in which the light energy hitting a device is stored as an electrical current. The phenomenon can also mimic the synapses in the brain that are used to store memories. Often, however, persistent photoconductivity requires low temperatures and/or high operating voltages, and the current spike lasts only a small fraction of a second. In this new discovery, the persistent photoconductivity produces an electrical current at room temperature, and the current continues to flow for more than an hour after the light is switched off. In addition, only low voltages and low light intensities were needed, highlighting the low energy required to store memory.
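The contrast between an ordinary photoresponse and a persistent one can be pictured with a simple exponential-decay model. The lifetimes below are assumed round numbers for illustration, not values reported by NREL:

```python
import math

def current_fraction(t_seconds, tau_seconds):
    """Fraction of photocurrent remaining t seconds after the light is
    switched off, assuming a single-exponential decay with lifetime tau."""
    return math.exp(-t_seconds / tau_seconds)

one_minute = 60.0
# Assumed lifetimes: a conventional photoconductor decaying in milliseconds
# vs. a persistent one with an effective lifetime on the order of an hour.
print(current_fraction(one_minute, 1e-3))    # effectively 0: ordinary response is gone
print(current_fraction(one_minute, 3600.0))  # ~0.98: persistent current still flowing
```

The point of the comparison: when the decay time is hours rather than milliseconds, the current itself acts as a readable memory of the light that produced it.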

  •

    Algorithms improve how we protect our data

    Daegu Gyeongbuk Institute of Science and Technology (DGIST) scientists in Korea have developed algorithms that more efficiently measure how difficult it would be for an attacker to guess secret keys for cryptographic systems. The approach they used was described in the journal IEEE Transactions on Information Forensics and Security and could reduce the computational complexity needed to validate encryption security.
    “Random numbers are essential for generating cryptographic information,” explains DGIST computer scientist Yongjune Kim, who co-authored the study with Cyril Guyot and Young-Sik Kim. “This randomness is crucial for the security of cryptographic systems.”
    Cryptography is used in cybersecurity for protecting information. Scientists often use a metric, called ‘min-entropy’, to estimate and validate how good a source is at generating the random numbers used to encrypt data. Data with low entropy is easier to decipher, whereas data with high entropy is much more difficult to decode. But it is difficult to accurately estimate the min-entropy for some types of sources, leading to underestimations.
    Kim and his colleagues developed an offline algorithm that estimates min-entropy from a whole data set, and an online estimator that needs only limited data samples. The accuracy of the online estimator improves as the number of data samples increases. Also, the online estimator does not need to store entire datasets, so it can be used in applications with stringent memory, storage and hardware constraints, such as Internet-of-Things devices.
    “Our evaluations showed that our algorithms can estimate min-entropy 500 times faster than the current standard algorithm while maintaining estimation accuracy,” says Kim.
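Min-entropy is defined as H_min = -log2(p_max), where p_max is the probability of the source's most likely output. The sketch below shows the naive plug-in estimator in both offline and streaming (online) form — this is the textbook baseline, not the authors' faster algorithm, whose details are in the paper:

```python
import math
from collections import Counter

def min_entropy_offline(samples):
    """Plug-in estimate over a full dataset: H_min = -log2(p_max)."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

class OnlineMinEntropy:
    """Streaming plug-in estimator: keeps only per-symbol counts, so
    memory grows with the alphabet size, not the dataset size."""
    def __init__(self):
        self.counts = Counter()
        self.n = 0

    def update(self, symbol):
        self.counts[symbol] += 1
        self.n += 1

    def estimate(self):
        p_max = max(self.counts.values()) / self.n
        return -math.log2(p_max)

# A uniform 8-symbol source carries 3 bits of min-entropy per sample.
data = list(range(8)) * 1000
print(min_entropy_offline(data))  # 3.0

est = OnlineMinEntropy()
for s in data:
    est.update(s)
print(est.estimate())  # 3.0
```

The online form illustrates why such estimators suit IoT devices: each sample updates a small counter table and is then discarded, so the full dataset never needs to be stored.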
    Kim and his colleagues are working on improving the accuracy of this and other algorithms for estimating entropy in cryptography. They are also investigating how to improve privacy in machine learning applications.
    Story Source:
    Materials provided by DGIST (Daegu Gyeongbuk Institute of Science and Technology).

  •

    Complex shapes of photons to boost future quantum technologies

    With the digital revolution now mainstream, quantum computing and quantum communication are rising to prominence. The enhanced measurement technologies enabled by quantum phenomena, and the possibility of scientific progress using new methods, are of particular interest to researchers around the world.
    Recently two researchers at Tampere University, Assistant Professor Robert Fickler and Doctoral Researcher Markus Hiekkamäki, demonstrated that two-photon interference can be controlled in a near-perfect way using the spatial shape of the photon. Their findings were recently published in the journal Physical Review Letters.
    “Our report shows how a complex light-shaping method can be used to make two quanta of light interfere with each other in a novel and easily tuneable way,” explains Markus Hiekkamäki.
    Single photons (units of light) can have highly complex shapes that are known to be beneficial for quantum technologies such as quantum cryptography, super-sensitive measurements, or quantum-enhanced computational tasks. To make use of these so-called structured photons, it is crucial to make them interfere with other photons.
    “One crucial task in essentially all quantum technological applications is improving the ability to manipulate quantum states in a more complex and reliable way. In photonic quantum technologies, this task involves changing the properties of a single photon as well as interfering multiple photons with each other,” says Robert Fickler, who leads the Experimental Quantum Optics group at the university.
    Linear optics bring promising solutions to quantum communications
    The demonstrated development is especially interesting from the point of view of high-dimensional quantum information science, where more than a single bit of quantum information is used per carrier. These more complex quantum states not only allow the encoding of more information onto a single photon but are also known to be more noise-resistant in various settings.
    The method presented by the research duo holds promise for building new types of linear optical networks. This paves the way for novel schemes of photonic quantum-enhanced computing.
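The simplest instance of two-photon interference in a linear optical network is the Hong–Ou–Mandel effect: two indistinguishable photons entering a balanced beam splitter never exit in separate ports. The sketch below computes that textbook case — it is a standard illustration, not the spatially structured interference demonstrated in the paper (numpy assumed available):

```python
import numpy as np

# 50:50 beam splitter acting on two optical modes (a real 2x2 unitary).
U = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

def coincidence_amplitude(U):
    """Amplitude for one photon in each output mode, given one photon in
    each input mode: the permanent of U, i.e. U00*U11 + U01*U10."""
    return U[0, 0] * U[1, 1] + U[0, 1] * U[1, 0]

p_coincidence = abs(coincidence_amplitude(U)) ** 2
print(p_coincidence)  # 0.0 — the photons always bunch (the HOM dip)
```

The two amplitudes (both photons transmitted vs. both reflected) cancel exactly, so the photons always leave together. Shaping the photons' spatial structure, as in the paper, gives a tunable handle on exactly this kind of multi-mode interference.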
    “Our experimental demonstration of bunching two photons into multiple complex spatial shapes is a crucial next step for applying structured photons to various quantum metrological and informational tasks,” continues Markus Hiekkamäki.
    The researchers now aim at utilizing the method for developing new quantum-enhanced sensing techniques, while exploring more complex spatial structures of photons and developing new approaches for computational systems using quantum states.
    “We hope that these results inspire more research into the fundamental limits of photon shaping. Our findings might also trigger the development of new quantum technologies, e.g. improved noise-tolerant quantum communication or innovative quantum computation schemes, that benefit from such high-dimensional photonic quantum states,” adds Robert Fickler.
    Story Source:
    Materials provided by Tampere University.

  •

    Artificial intelligence to monitor water quality more effectively

    Artificial intelligence that enhances remote monitoring of water bodies — highlighting quality shifts due to climate change or pollution — has been developed by researchers at the University of Stirling.
    A new algorithm — known as the ‘meta-learning’ method — analyses data directly from satellite sensors, making it easier for coastal zone, environmental and industry managers to monitor issues such as harmful algal blooms (HABs) and possible toxicity in shellfish and finfish.
    Environmental protection agencies and industry bodies currently monitor the ‘trophic state’ of water — its biological productivity — as an indicator of ecosystem health. The accumulation of large clusters of microscopic algae, or phytoplankton, is known as eutrophication; these blooms can develop into HABs, which indicate pollution and pose risks to human and animal health.
    HABs are estimated to cost the Scottish shellfish industry £1.4 million per year, and a single HAB event in Norway killed eight million salmon in 2019, with a direct value of over £74 million.
    Lead author Mortimer Werther, a PhD Researcher in Biological and Environmental Sciences at Stirling’s Faculty of Natural Sciences, said: “Currently, satellite-mounted sensors, such as the Ocean and Land Colour Instrument (OLCI), measure phytoplankton concentrations using an optical pigment called chlorophyll-a. However, retrieving chlorophyll-a across the diverse nature of global waters is methodologically challenging.
    “We have developed a method that bypasses the chlorophyll-a retrieval and enables us to estimate water health status directly from the signal measured at the remote sensor.”
    Eutrophication and hypereutrophication are often caused by excessive nutrient input, for example from agricultural practices, waste discharge, or food and energy production. In impacted waters, HABs are common, and cyanobacteria may produce cyanotoxins which affect human and animal health. In many locations, these blooms are of concern to the finfish and shellfish aquaculture industries.
    Mr Werther said: “To understand the impact of climate change on freshwater aquatic environments such as lakes, many of which serve as drinking water resources, it is essential that we monitor and assess key environmental indicators, such as trophic status, on a global scale with high spatial and temporal frequency.
    “This research, funded by the European Union’s Horizon 2020 programme, is the first demonstration that trophic status of complex inland and nearshore waters can be learnt directly by machine learning algorithms from OLCI reflectance measurements. Our algorithm can produce estimates for all trophic states on imagery acquired by OLCI over global water bodies.
    “Our method outperforms a comparable state-of-the-art approach by 5-12% on average across the entire spectrum of trophic states, as it also eliminates the need to choose the right algorithm for water observation. It estimates trophic status with over 90% accuracy for highly affected eutrophic and hypereutrophic waters.”
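The general idea — classifying trophic state directly from a reflectance spectrum rather than first retrieving chlorophyll-a — can be sketched with a deliberately simple nearest-centroid classifier on synthetic spectra. Everything below (the 21-band spectra, the noise model, the classifier) is an illustrative stand-in; the study's actual method is a meta-learning model trained on real OLCI data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for multi-band reflectance spectra: each trophic
# class gets a characteristic mean spectrum plus measurement noise.
classes = ["oligotrophic", "mesotrophic", "eutrophic", "hypereutrophic"]
means = {c: rng.uniform(0.0, 1.0, 21) for c in classes}

def make_samples(c, n=50):
    """Draw n noisy spectra for class c."""
    return means[c] + rng.normal(0.0, 0.05, (n, 21))

train = {c: make_samples(c) for c in classes}
centroids = {c: x.mean(axis=0) for c, x in train.items()}

def predict(spectrum):
    """Assign the class whose mean training spectrum is closest."""
    return min(classes, key=lambda c: np.linalg.norm(spectrum - centroids[c]))

print(predict(make_samples("eutrophic", 1)[0]))
```

Working directly on the measured signal sidesteps the step the quote identifies as fragile: choosing the right chlorophyll-a retrieval algorithm for each water type.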
    The collaborative study was carried out with five external partners from research and industry: Dr. Stefan G.H. Simis from Plymouth Marine Laboratory; Harald Krawczyk from the German Aerospace Center; Dr. Daniel Odermatt from the Swiss Federal Institute of Aquatic Science and Technology; Kerstin Stelzer from Brockmann Consult and Oberon Berlage from Appjection (Amsterdam).
    Story Source:
    Materials provided by University of Stirling.

  •

    Speeding new treatments

    A year into the COVID-19 pandemic, mass vaccinations have begun to raise the tantalizing prospect of herd immunity that eventually curtails or halts the spread of SARS-CoV-2. But what if herd immunity is never fully achieved — or if the mutating virus gives rise to hyper-virulent variants that diminish the benefits of vaccination?
    Those questions underscore the need for effective treatments for people who continue to fall ill with the coronavirus. While a few existing drugs show some benefit, there’s a pressing need to find new therapeutics.
    Led by The University of New Mexico’s Tudor Oprea, MD, PhD, scientists have created a unique tool to help drug researchers quickly identify molecules capable of disarming the virus before it invades human cells or disabling it in the early stages of the infection.
    In a paper published this week in Nature Machine Intelligence, the researchers introduced REDIAL-2020, an open source online suite of computational models that will help scientists rapidly screen small molecules for their potential COVID-fighting properties.
    “To some extent this replaces (laboratory) experiments,” says Oprea, chief of the Translational Informatics Division in the UNM School of Medicine. “It narrows the field of what people need to focus on. That’s why we placed it online for everyone to use.”
    Oprea’s team at UNM and another group at the University of Texas at El Paso led by Suman Sirimulla, PhD, started work on the REDIAL-2020 tool last spring after scientists at the National Center for Advancing Translational Sciences (NCATS) released data from their own COVID drug repurposing studies.

  •

    When will your elevator arrive? Two physicists do the math

    The human world is, increasingly, an urban one — and that means elevators. Hong Kong, the hometown of physicist Zhijie Feng (Boston University), adds new elevators at the rate of roughly 1,500 every year, making vertical transport an alluring topic for quantitative research.
    “Just in the main building of my undergraduate university, Hong Kong University of Science and Technology,” Feng reflects, “there are 37 elevators, all numbered so we can use them to indicate the location of hundreds of classrooms. There is always a line outside each elevator lobby, and if they are shut down, we have to hike for 30 minutes.”
    Feng and Santa Fe Institute Professor Sidney Redner saw this as an opportunity to explore the factors that determine elevator transport capabilities. In their new paper in the Journal of Statistical Mechanics, they begin by making a deliberately simple “toy” model.
    “Engineers have already developed computational models for simulating elevators as realistically as possible,” says Feng. “Instead, we wanted insight into basic mechanisms, using just enough parameters to describe what we see in a way we can fully understand.”
    Their minimum-variable simulation makes six key assumptions: the building is initially unoccupied; transport is first-come-first-served; the elevators are identical; destination floors are uniformly distributed; entering or exiting an elevator takes 2.5 seconds; and traveling from one floor to the next takes one second.
    For a 100-story building with one idealized infinite-capacity elevator, Feng and Redner find that waiting times typically fall between five and seven minutes. With elevators that can carry 20 people each, and buildings that hold 100 workers per floor, this cycle requires 500 trips over 2 hours — or 21 elevators — to get everyone to work on time.
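The article's parameters are enough to sketch one round trip of the toy model in code. The parameter values (100 floors, 20-passenger capacity, 2.5 s boarding, 1 s per floor) come from the article; the simulation structure itself is an assumed minimal reading of the model, not the authors' code:

```python
import random

random.seed(42)

FLOORS = 100
CAPACITY = 20
ENTER_EXIT = 2.5   # seconds per passenger to board or exit
PER_FLOOR = 1.0    # seconds to travel between adjacent floors

def trip_time(n_passengers=CAPACITY):
    """One round trip: board at the lobby, drop each passenger at a
    uniformly random floor, return empty to the lobby."""
    dests = [random.randint(2, FLOORS) for _ in range(n_passengers)]
    boarding = n_passengers * ENTER_EXIT
    exiting = n_passengers * ENTER_EXIT
    travel = 2 * (max(dests) - 1) * PER_FLOOR  # up to the highest stop and back
    return boarding + exiting + travel

# The article's arithmetic: 100 workers on each of 100 floors,
# 20 passengers per trip -> 500 trips to move everyone.
workers = 100 * FLOORS
print(workers // CAPACITY)   # 500
print(trip_time() / 60)      # one sampled cycle time, in minutes
```

With 20 passengers the highest requested floor is almost always near the top, so a cycle runs close to its worst case — which is why serving 10,000 workers in a two-hour window takes on the order of 21 elevators.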
    “If the elevators are uncorrelated,” the authors write, wait time “should equal the single elevator cycle time divided by the number of elevators, which is roughly 15 seconds.” However, this efficient spacing of elevators doesn’t last: as passenger demand increases, elevators start to move in lockstep, creating traffic jams in the lobby below until multiple elevators arrive back on the ground floor at the same time.
    These nonlinear dynamics stymie any easy answer to the question of how long a person has to wait. But to Feng and Redner this is just the entry-level to a bigger inquiry. “I hope our work could be a ‘pocket version’ model to extend from,” Feng remarks. She credits Redner’s textbook, which she read in her early undergraduate days, for inspiring her love of breaking down complex problems into simple models.
    Some of the further questions they identify include, “If a building tapers with height, is there a taper angle that minimizes waiting time but optimizes office space?”; and, “What if some elevators only service certain floors, and others service different floors?”
    Food for thought next time you’re waiting in the lobby…
    Story Source:
    Materials provided by Santa Fe Institute.