More stories

  • New study shows mathematical models helped reduce the spread of COVID-19

    Colorado researchers have published new findings in Emerging Infectious Diseases that take a first look at the use of SARS-CoV-2 mathematical modeling to inform early statewide policies enacted to reduce the spread of the coronavirus pandemic in Colorado. Among other findings, the authors estimate that 97 percent of potential hospitalizations across the state in the early months of the pandemic were avoided as a result of social distancing and other transmission-reducing activities such as mask wearing and social isolation of symptomatic individuals.
    The modeling team was led by faculty and researchers in the Colorado School of Public Health and involved experts from the University of Colorado Anschutz Medical Campus, University of Colorado Denver, University of Colorado Boulder, and Colorado State University.
    “One of the defining characteristics of the COVID-19 pandemic was the need for rapid response in the face of imperfect and incomplete information,” said the authors. “Mathematical models of infectious disease transmission can be used in real-time to estimate parameters, such as the effective reproductive number (Re) and the efficacy of current and future intervention measures, and to provide time-sensitive data to policymakers.”
    The new paper describes the development of such a model, in close collaboration with the Colorado Department of Public Health and Environment and the Colorado Governor’s office, to gauge the impact of early policies to decrease social contacts and, later, the impact of gradual relaxation of Stay-at-Home orders. The authors note that preparing for hospital intensive care unit (ICU) loads or capacity limits was a critical decision-making issue.
    The Colorado COVID-19 Modeling team developed a susceptible-exposed-infected-recovered (SEIR) model calibrated to Colorado COVID-19 case and hospitalization data to estimate changes in the contact rate and the Re after emergence of SARS-CoV-2 and the implementation of statewide COVID-19 control policies in Colorado. The modeling team supplemented model estimates with an analysis of mobility by using mobile device location data. Estimates were generated in near real time, at multiple time-points, with a rapidly evolving understanding of SARS-CoV-2. At each time point, the authors generated projections of the possible course of the outbreak under an array of intervention scenarios. Findings were regularly provided to key Colorado decision-makers.
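    For readers unfamiliar with SEIR models, the sketch below shows the general structure of such a model in Python, with a time-varying contact-reduction term standing in for social distancing. It is an illustration only: the parameter values, the simple step change in contacts, and the population figure are assumptions, not the calibrated values used by the Colorado team.
    ```python
    # Minimal SEIR sketch (illustrative only; parameters are hypothetical,
    # not the calibrated values used by the Colorado COVID-19 Modeling team).
    import numpy as np
    from scipy.integrate import solve_ivp

    def seir(t, y, beta, sigma, gamma, contact_reduction):
        S, E, I, R = y
        N = S + E + I + R
        # Social distancing scales the effective transmission rate.
        lam = beta * (1.0 - contact_reduction(t)) * I / N
        return [-lam * S,
                lam * S - sigma * E,
                sigma * E - gamma * I,
                gamma * I]

    # Assumed parameters: R0 ~ 3, 4-day latent period, 8-day infectious period.
    beta, sigma, gamma = 3.0 / 8.0, 1.0 / 4.0, 1.0 / 8.0
    distancing = lambda t: 0.0 if t < 30 else 0.6   # 60% contact reduction after day 30

    N0 = 5.7e6                                      # approximate Colorado population
    y0 = [N0 - 10, 0, 10, 0]
    sol = solve_ivp(seir, (0, 180), y0, args=(beta, sigma, gamma, distancing),
                    dense_output=True)

    # Effective reproductive number under the intervention: Re = R0 * (1 - reduction) * S/N.
    Re = (beta / gamma) * (1 - distancing(60)) * sol.sol(60)[0] / N0
    print(f"Estimated Re at day 60: {Re:.2f}")
    ```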
    “Real-time estimation of contact reduction enabled us to respond to urgent requests to actively inform rapidly changing public health policy amidst a pandemic. In early stages, the urgent need was to flatten the curve,” note the authors. “Once infections began to decrease, there was interest in the degree of increased social contact that could be tolerated as the economy reopened without leading to overwhelmed hospitals.”
    “Although our analysis is specific to Colorado, our experience highlights the need for locally calibrated transmission models to inform public health preparedness and policymaking, along with ongoing analyses of the impact of policies to slow the spread of SARS-CoV-2,” said Andrea Buchwald, PhD, lead author from the Colorado School of Public Health at CU Anschutz. “We present this material not as a final estimate of the impact of social distancing policies, but to illustrate how models can be constructed and adapted in real-time to inform critical policy questions.”
    Story Source:
    Materials provided by University of Colorado Anschutz Medical Campus. Original written by Tonya Ewers. Note: Content may be edited for style and length.

  • AI predicts diabetes risk by measuring fat around the heart

    A team led by researchers from Queen Mary University of London has developed a new artificial intelligence (AI) tool that is able to automatically measure the amount of fat around the heart from MRI scan images.
    Using the new tool, the team was able to show that a larger amount of fat around the heart is associated with significantly greater odds of diabetes, independent of a person’s age, sex, and body mass index.
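    An association of this kind is typically quantified with a logistic regression in which the fat measure is the exposure and age, sex, and BMI are covariates. The sketch below illustrates that general analysis pattern on synthetic data; the variable names, effect sizes, and coding are assumptions, and this is not the study's code.
    ```python
    # Illustrative association analysis: diabetes odds vs. fat around the heart,
    # adjusted for age, sex and BMI (synthetic data, not the study's dataset).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000
    age = rng.normal(60, 8, n)
    sex = rng.integers(0, 2, n)                    # 0 = female, 1 = male (assumed coding)
    bmi = rng.normal(27, 4, n)
    pericardial_fat = rng.normal(70, 20, n)        # hypothetical fat area/volume measure

    # Simulate an outcome in which the fat measure carries an independent effect.
    logit = -8 + 0.03 * age + 0.3 * sex + 0.08 * bmi + 0.02 * pericardial_fat
    diabetes = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([pericardial_fat, age, sex, bmi]))
    model = sm.Logit(diabetes, X).fit(disp=0)

    # Odds ratio per unit of pericardial fat, adjusted for age, sex and BMI.
    print("adjusted OR per unit fat:", np.exp(model.params[1]))
    ```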
    The research is published in the journal Frontiers in Cardiovascular Medicine and is the result of funding from the CAP-AI programme, which is led by Barts Life Sciences, a research and innovation partnership between Queen Mary University of London and Barts Health NHS Trust.
    The distribution of fat in the body can influence a person’s risk of developing various diseases. The commonly used measure of body mass index (BMI) mostly reflects fat accumulation under the skin, rather than around the internal organs. In particular, fat accumulation around the heart has been suggested as a predictor of heart disease and has been linked to a range of conditions, including atrial fibrillation, diabetes, and coronary artery disease.
    Lead researcher Dr Zahra Raisi-Estabragh from Queen Mary University of London said: “Unfortunately, manual measurement of the amount of fat around the heart is challenging and time-consuming. For this reason, to date, no-one has been able to investigate this thoroughly in studies of large groups of people.
    “To address this problem, we’ve invented an AI tool that can be applied to standard heart MRI scans to obtain a measure of the fat around the heart automatically and quickly, in under three seconds. This tool can be used by future researchers to discover more about the links between the fat around the heart and disease risk, but also potentially in the future, as part of a patient’s standard care in hospital.”
    The research team tested the AI algorithm’s ability to interpret images from heart MRI scans of more than 45,000 people, including participants in the UK Biobank, a database of health information from over half a million participants from across the UK. The team found that the AI tool was accurately able to determine the amount of fat around the heart in those images, and it was also able to calculate a patient’s risk of diabetes.
    Dr Andrew Bard from Queen Mary University of London, who led the technical development, added: “The AI tool also includes an in-built method for calculating uncertainty of its own results, so you could say it has an impressive ability to mark its own homework.”
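    The article does not say how the tool computes uncertainty for its own results. One common way to give a deep learning model this ability is Monte Carlo dropout, sketched below on a toy regressor; the architecture, dropout rate, and inputs are placeholders, not the published model.
    ```python
    # Monte Carlo dropout: keep dropout active at inference and use the spread of
    # repeated predictions as an uncertainty estimate (toy example, not the paper's model).
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(128, 64), nn.ReLU(), nn.Dropout(p=0.2),
        nn.Linear(64, 1),                      # e.g. a fat measurement predicted from image features
    )

    def predict_with_uncertainty(x, n_samples=50):
        model.train()                          # keep dropout layers active
        with torch.no_grad():
            samples = torch.stack([model(x) for _ in range(n_samples)])
        return samples.mean(dim=0), samples.std(dim=0)

    features = torch.randn(1, 128)             # placeholder for image-derived features
    mean, std = predict_with_uncertainty(features)
    print(f"prediction {mean.item():.2f} ± {std.item():.2f}")
    ```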
    Professor Steffen Petersen from Queen Mary University of London, who supervised the project, said: “This novel tool has high utility for future research and, if clinical utility is demonstrated, may be applied in clinical practice to improve patient care. This work highlights the value of cross-disciplinary collaborations in medical research, particularly within cardiovascular imaging.”
    Story Source:
    Materials provided by Queen Mary University of London. Note: Content may be edited for style and length.

  • Scientists closing in on map of the mammalian immune system

    Using artificial intelligence, UT Southwestern scientists have identified thousands of genetic mutations likely to affect the immune system in mice. The work is part of one Nobel laureate’s quest to find virtually all such variations in mammals.
    “This study identifies 101 novel gene candidates with greater than 95% chance of being required for immunity,” says Bruce Beutler, M.D., director of the Center for the Genetics of Host Defense (CGHD) and corresponding author of the study published this week in the Proceedings of the National Academy of Sciences. “Many of these candidates we have already verified by re-creating the mutations or knocking out the genes.” Lead author Darui Xu, a computational biologist at CGHD, wrote the software.
    “We’ve developed software called Candidate Explorer (CE) that uses a machine-learning algorithm to identify chemically induced mutations likely to cause traits related to immunity. The software determines the probability that any mutation we’ve induced will be verified as causative after further testing,” Beutler says. His discovery of an important family of receptors that allow mammals to quickly sense infection and trigger an inflammatory response led to the 2011 Nobel Prize in Physiology or Medicine.
    “The purpose of CE is to help researchers predict whether a mutation associated with a phenotype (trait or function) is a truly causative mutation. CE has already helped us to identify hundreds of genes with novel functions in immunity. This will improve our understanding of the immune system so that we can find new ways to keep it robust, and also know the reason it sometimes falters,” says Beutler, Regental Professor, and professor of immunology and internal medicine at UT Southwestern.
    “CE provides a score that tells us the likelihood that a particular mutation-phenotype association will be verified for cause and effect if we re-create the mutation or knock out the gene,” he says.
    CE examines 67 features of the primary genetic mapping data to arrive at an estimate of the likelihood of causation. For some mutations, causation is very clear; for others, less so. Over time, the program “learns” from experiments in which researchers re-create the mutation in a fresh pedigree and verify or exclude the hypothesis of causation. All mutations are made available to the scientific community through a public repository, and the data supporting causation are viewable within the Candidate Explorer program on the CGHD website, Mutagenetix (https://mutagenetix.utsouthwestern.edu/).
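    As a rough sketch of how such a score can be produced (not the actual Candidate Explorer implementation), a classifier can be trained on the mapping features of previously retested mutation-phenotype associations, labelled by whether causation was verified, and then asked for a verification probability for new candidates. The features and labels below are synthetic placeholders; only the count of 67 features comes from the article.
    ```python
    # Sketch of a Candidate Explorer-style score: train on previously tested
    # mutation-phenotype associations, then output a probability of verification.
    # Features and labels here are synthetic placeholders.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(42)
    n_assoc, n_features = 2000, 67           # 67 mapping features, as in the article
    X = rng.normal(size=(n_assoc, n_features))
    y = rng.binomial(1, 0.3, n_assoc)        # 1 = causation verified on retesting

    clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

    # Probability that a new candidate association would be verified if retested.
    new_candidate = rng.normal(size=(1, n_features))
    print("P(verified):", clf.predict_proba(new_candidate)[0, 1])
    ```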

  • A new look at color displays

    Researchers at Linköping University have developed a method that may lead to new types of displays based on structural colours. The discovery opens the way to cheap and energy-efficient colour displays and electronic labels. The study has been published in the scientific journal Advanced Materials.
    We usually think of colours as created by pigments, which absorb light at certain wavelengths such that we perceive colour from other wavelengths that are scattered and reach our eyes. That’s why leaves, for example, are green and tomatoes red. But colours can be created in other ways, and some materials appear coloured due to their structure. Structural colours can arise when light is reflected within the material on a scale of nanometres, an effect usually referred to as interference. An example found in nature is the peacock feather, which is fundamentally brown but acquires its characteristic blue-green sheen from small structural features.
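    In the simplest textbook picture, constructive interference in a thin film occurs when twice the optical path through the film equals a whole number of wavelengths, so the reflected colour shifts with film thickness. The short calculation below illustrates that idealized relationship; the refractive index is assumed and phase shifts at the interfaces are ignored, so this is not the optical model used in the study.
    ```python
    # Idealized thin-film interference: reflected wavelengths satisfying
    # 2 * n * d * cos(theta) = m * lambda (interface phase shifts ignored).
    import numpy as np

    def reflected_peaks(thickness_nm, n_film=1.5, theta_deg=0.0):
        """Visible wavelengths (380-740 nm) constructively reflected by a film."""
        path = 2 * n_film * thickness_nm * np.cos(np.radians(theta_deg))
        peaks = []
        m = 1
        while path / m >= 380:
            lam = path / m
            if lam <= 740:
                peaks.append(round(lam))
            m += 1
        return peaks

    for d in (130, 160, 200):   # nanometre-scale thicknesses, as set by the UV dose
        print(f"{d} nm film -> reflected peaks at {reflected_peaks(d)} nm")
    ```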
    Researchers at Linköping University have developed a new and simple method to create structural colours for use with reflective colour displays. The new method may enable manufacturing of thin and lightweight displays with high energy-efficiency for a broad range of applications.
    Reflective colour displays differ from the colour displays we see in everyday life on devices such as mobile phones and computers. The latter consist of small light-emitting diodes of red, green and blue positioned close to each other such that they together create white light. The colour of each light-emitting diode depends on the molecules from which it is built, or in other words, its pigment. However, it is relatively expensive to manufacture light-emitting diodes, and the global use of emissive displays consumes a lot of energy. Another type of display, reflective displays, is therefore being explored for purposes such as tablet computers used as e-readers, and electronic labels. Reflective displays form images by controlling how incident light from the surroundings is reflected, which means that they do not need their own source of illumination. However, most reflective displays are intrinsically monochrome, and attempts to create colour versions have been rather complicated and have sometimes given poor results.
    Shangzhi Chen, a newly graduated PhD at the Laboratory of Organic Electronics at Linköping University, is the principal author of an article that describes a new type of dynamic structural colour image, published in the scientific journal Advanced Materials.
    “We have developed a simple method to produce structural colour images with electrically conducting plastics, or conducting polymers. The polymer is applied at nanoscale thicknesses onto a mirror by a technique known as vapour phase polymerisation, after the substrate has been illuminated with UV light. The stronger the UV illumination, the thicker the polymer film, and this allows us to control the structural colours that appear at different locations on the substrate,” says Shangzhi Chen.
    The method can produce all colours in the visible spectrum. Furthermore, the colours can be subsequently adjusted using electrochemical variation of the redox state of the polymer. This functionality has previously been used in monochrome reflective displays, and the new study shows that the same materials can provide dynamic images in colour by combining optical interference effects with spatial control of nanoscale thicknesses. Magnus Jonsson, associate professor at the Laboratory of Organic Electronics at Linköping University, believes that the method has great potential for applications such as electronic labels in colour. Further research may also allow more advanced displays to be manufactured.
    “We receive increasing amounts of information via digital displays, and if we can contribute to more people gaining access to information through cheap and energy-efficient displays, that would be a major benefit. But much research remains to be done, and new projects are already under way,” says Magnus Jonsson.
    Story Source:
    Materials provided by Linköping University. Original written by Anders Ryttarson Törneholm. Note: Content may be edited for style and length.

  • Synthetic biology circuits can respond within seconds

    Synthetic biology offers a way to engineer cells to perform novel functions, such as glowing with fluorescent light when they detect a certain chemical. Usually, this is done by altering cells so they express genes that can be triggered by a certain input.
    However, there is often a long lag time between an event such as detecting a molecule and the resulting output, because of the time required for cells to transcribe and translate the necessary genes. MIT synthetic biologists have now developed an alternative approach to designing such circuits, which relies exclusively on fast, reversible protein-protein interactions. This means that there’s no waiting for genes to be transcribed or translated into proteins, so circuits can be turned on much faster — within seconds.
    “We now have a methodology for designing protein interactions that occur at a very fast timescale, which no one has been able to develop systematically. We’re getting to the point of being able to engineer any function at timescales of a few seconds or less,” says Deepak Mishra, a research associate in MIT’s Department of Biological Engineering and the lead author of the new study.
    This kind of circuit could be useful for creating environmental sensors or diagnostics that could reveal disease states or imminent events such as a heart attack, the researchers say.
    Ron Weiss, a professor of biological engineering and of electrical engineering and computer science, is the senior author of the study, which appears today in Science. Other authors include Tristan Bepler, a former MIT postdoc; Bonnie Berger, the Simons Professor of Mathematics and head of the Computation and Biology group in MIT’s Computer Science and Artificial Intelligence Laboratory; Brian Teague, an assistant professor at the University of Wisconsin; and Jim Broach, chair of the Department of Biochemistry and Molecular Biology at Penn State Hershey Medical Center.
    Protein interactions
    Inside living cells, protein-protein interactions are essential steps in many signaling pathways, including those involved in immune cell activation and responses to hormones or other signals. Many of these interactions involve one protein activating or deactivating another by adding or removing chemical groups called phosphates.
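    To see why protein-level circuits respond in seconds while gene-expression circuits take much longer, it helps to consider a toy phosphorylation cycle: with kinase and phosphatase rates on the order of one event per second, the phosphorylated fraction settles within a few seconds, whereas transcription and translation typically take tens of minutes. The rate constants in the sketch below are purely illustrative.
    ```python
    # Toy phosphorylation cycle: dXp/dt = k_kin*(X_total - Xp) - k_phos*Xp.
    # With per-second rate constants the response settles in seconds (illustrative values).
    k_kin, k_phos = 1.0, 1.0        # s^-1, hypothetical kinase/phosphatase rates
    x_total, xp = 1.0, 0.0          # total protein (normalized) and phosphorylated fraction
    dt = 0.01

    for step in range(1, int(10 / dt) + 1):      # simulate 10 seconds with forward Euler
        xp += dt * (k_kin * (x_total - xp) - k_phos * xp)
        t = round(step * dt, 6)
        if t in (1.0, 2.0, 5.0, 10.0):
            print(f"t = {t:4.1f} s, phosphorylated fraction = {xp:.3f}")

    # Steady state is k_kin/(k_kin + k_phos) = 0.5, approached with a time constant of
    # 1/(k_kin + k_phos) = 0.5 s, far faster than transcription and translation.
    ```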

  • Digital pens provide new insight into cognitive testing results

    During neuropsychological assessments, participants complete tasks designed to study memory and thinking. Based on their performance, the participants receive a score that researchers use to evaluate how well specific domains of their cognition are functioning.
    Consider, though, two participants who achieve the same score on one of these paper-and-pencil neuropsychological tests. One took 60 seconds to complete the task and was writing the entire time; the other spent three minutes, and alternated between writing answers and staring off into space. If researchers analyzed only the overall score of these two participants, would they be missing something important?
    “By looking only at the outcome, meaning what score someone gets, we lose a lot of important information about how the person performed the task that may help us to better understand the underlying problem,” explains lead author Stacy Andersen, PhD, assistant professor of medicine at Boston University School of Medicine (BUSM).
    Researchers with the Long Life Family Study (LLFS) used digital pens and digital voice recorders to capture differences in study participants’ performance while completing a cognitive test and found that differences in ‘thinking’ versus ‘writing’ time on a symbol coding test might act as clinically relevant, early biomarkers for cognitive/motor decline.
    Participants in the LLFS were chosen for having multiple siblings living to very old ages. Longevity has long been associated with an increased health span and thus these families are studied to better understand contributors to healthy aging. The participants were assessed on a number of physical and cognitive measures, including a symbol coding test called the Digit Symbol Substitution Test.
    This timed test requires participants to fill in numbered boxes with corresponding symbols from a given key and assesses both cognitive (attention and processing speed) and non-cognitive factors (motor speed and visual scanning). To allow researchers to collect data about how a participant went about completing the task, the participants used a digital pen while completing the test. On the tip of this pen was a small camera that tracked what and when a participant wrote. The LLFS researchers divided the output from this digital pen into ‘writing time’ (the time the participant spent writing) and ‘thinking time’ (the time not spent writing) and looked at how these changed over the course of the 90-second test.
    The researchers then identified groups of participants that had similar patterns of writing time and thinking time across the course of the test. They found that although most participants had consistent writing and thinking times, there were groups of participants who got faster or slowed down. “This method of clustering allowed us to look at other similarities among the participants in each group in terms of their health and function that may be related to differences in writing and thinking time patterns,” said coauthor and lead biostatistician Benjamin Sweigart, a biostatistics doctoral student at Boston University School of Public Health. The researchers found that those who got slower in writing the symbols during the test had poorer physical function on tests of grip strength and walking speed. In contrast, those who changed speed in thinking time had poorer scores on memory and executive function tests, suggesting that writing time and thinking time capture different contributors to overall performance on the test.
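    Conceptually, the digital-pen analysis reduces to splitting the 90-second test into stroke (writing) intervals and the gaps between them (thinking), summarizing each participant as a time course, and clustering the time courses. The sketch below illustrates that pipeline on made-up stroke timestamps; it is not the LLFS analysis code, and the bin width and cluster count are arbitrary choices.
    ```python
    # Illustrative pipeline: pen strokes -> per-interval writing/thinking time -> clustering.
    # Stroke timestamps are made up; this is not the LLFS analysis code.
    import numpy as np
    from sklearn.cluster import KMeans

    TEST_LEN, N_BINS = 90.0, 6                      # 90-second test split into 15-second bins
    bins = np.linspace(0.0, TEST_LEN, N_BINS + 1)

    def writing_profile(strokes):
        """strokes: list of (start, end) seconds when the pen was writing."""
        writing = np.zeros(N_BINS)
        for start, end in strokes:
            for i in range(N_BINS):
                lo, hi = bins[i], bins[i + 1]
                writing[i] += max(0.0, min(end, hi) - max(start, lo))
        return writing                               # thinking time = bin width - writing time

    # Two made-up participants: one steady writer, one who slows down late in the test.
    participants = [
        [(t, t + 1.0) for t in np.arange(0, 90, 2.0)],
        [(t, t + 1.0) for t in np.arange(0, 60, 2.0)] + [(t, t + 0.5) for t in np.arange(60, 90, 5.0)],
    ]
    profiles = np.array([writing_profile(p) for p in participants])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
    print("per-bin writing time (s):\n", profiles.round(1))
    print("cluster labels:", labels)
    ```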
    According to the researchers, these findings show the importance of capturing additional facets of test performance beyond test scores. “Identifying whether poor test performance is related to impaired cognitive function as opposed to impaired motor function is important for choosing the correct treatment for an individual patient,” adds Andersen. “The incorporation of digital technologies amplifies our ability to detect subtle differences in test behavior and functional abilities, even on brief tests of cognitive function. Moreover, these metrics have the potential to be very early markers of dysfunction.”
    Story Source:
    Materials provided by Boston University School of Medicine. Note: Content may be edited for style and length.

  • Making computer servers worldwide more climate friendly

    An elegant new algorithm developed by Danish researchers can significantly reduce the resource consumption of the world’s computer servers. Computer servers are as taxing on the climate as global air traffic combined, thereby making the green transition in IT an urgent matter. The researchers, from the University of Copenhagen, expect major IT companies to deploy the algorithm immediately.
    One of the flipsides of our runaway internet usage is its impact on climate due to the massive amount of electricity consumed by computer servers. Current CO2 emissions from data centres are as high as from global air traffic combined — with emissions expected to double within just a few years.
    Only a handful of years have passed since Professor Mikkel Thorup was among a group of researchers behind an algorithm that addressed part of this problem by producing a groundbreaking recipe to streamline computer server workflows. Their work saved energy and resources. Tech giants including Vimeo and Google enthusiastically implemented the algorithm in their systems, with online video platform Vimeo reporting that the algorithm had reduced their bandwidth usage by a factor of eight.
    Now, Thorup and two fellow UCPH researchers have perfected the already clever algorithm, making it possible to address a fundamental problem in computer systems — the fact that some servers become overloaded while other servers have capacity left — many times faster than today.
    “We have found an algorithm that removes one of the major causes of overloaded servers once and for all. Our initial algorithm was a huge improvement over the way industry had been doing things, but this version is many times better and reduces resource usage to the greatest extent possible. Furthermore, it is free to use for all,” says Professor Thorup of the University of Copenhagen’s Department of Computer Science, who developed the algorithm alongside department colleagues Anders Aamand and Jakob Bæk Tejs Knudsen.
    Soaring internet traffic
    The algorithm addresses the problem of servers becoming overloaded as they receive more requests from clients than they have the capacity to handle. This happens as users pile in to watch a certain Vimeo video or Netflix film. As a result, systems often need to shift clients around many times to achieve a balanced distribution among servers.
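    The earlier algorithm adopted by Vimeo and Google is known as consistent hashing with bounded loads: servers and clients are hashed onto a ring, each server is capped at a little above the average load, and a client walks clockwise to the first server with spare capacity. The sketch below shows that basic idea with assumed parameters; it is a simplified illustration, not the Copenhagen team's new, faster algorithm.
    ```python
    # Simplified consistent hashing with bounded loads: each server's load is capped
    # near the average, and a client walks clockwise to the first server with room.
    # This sketches the earlier scheme, not the improved Copenhagen algorithm.
    import hashlib
    import math
    from bisect import bisect_right

    def h(key):
        """Hash a key onto a 32-bit ring."""
        return int(hashlib.sha256(key.encode()).hexdigest(), 16) % (2 ** 32)

    def assign(clients, servers, c=1.25):
        """Assign each client to a server so no server exceeds c times the average load."""
        ring = sorted((h(s), s) for s in servers)
        points = [p for p, _ in ring]
        capacity = math.ceil(c * len(clients) / len(servers))
        load = {s: 0 for s in servers}
        placement = {}
        for client in clients:
            i = bisect_right(points, h(client)) % len(ring)
            while load[ring[i][1]] >= capacity:      # walk clockwise past full servers
                i = (i + 1) % len(ring)
            placement[client] = ring[i][1]
            load[ring[i][1]] += 1
        return placement, load

    clients = [f"client-{i}" for i in range(1000)]
    servers = [f"server-{i}" for i in range(10)]
    placement, load = assign(clients, servers)
    print("max load:", max(load.values()), "cap:", math.ceil(1.25 * len(clients) / len(servers)))
    ```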

  • New report aims to improve VR use in healthcare education

    A new report that could help improve how immersive technologies such as Virtual Reality (VR) and Augmented Reality (AR) are used in healthcare education and training has been published with significant input from the University of Huddersfield.
    Professor David Peebles, Director of the University’s Centre for Cognition and Neuroscience, and Huddersfield PhD graduate Matthew Pears contributed to the report — ‘Immersive technologies in healthcare training and education: Three principles for progress’ — recently published by the University of Leeds with input from a range of academics, technologists and health professionals.
    The principles have also been expanded upon in a letter to the prestigious journal BMJ Simulation and Technology Enhanced Learning.
    The Huddersfield contribution to the report stems from research conducted over several years, which involved another former Huddersfield PhD researcher, Yeshwanth Pulijala, and Professor Eunice Ma, now with Falmouth University.
    “Yeshwanth had an interest in technology and education, and in using VR for dentistry training. Matthew was looking at soft skills and situation awareness, which could be applied to investigating how dentists were able to keep track of what was going on around them. They were similar subjects, although with different emphases, and so it seemed a natural area for collaboration,” says Professor Peebles.
    With only a relatively small number of dental schools in the UK, the quartet visited seven dental schools in India in early 2017, with support from travel grants from Santander Bank, to test their VR-based training materials on students. The experience gained from that visit contributed to both researchers’ PhDs, and ultimately led to the involvement of Professor Peebles and Matthew Pears in the new report.
    The report argues for greater standardisation of how to use immersive technologies in healthcare training and education. As Professor Peebles explains, “It’s about developing a set of principles and guidelines for the use of immersive technology in medical treatment. Immersive technology is becoming increasingly popular and, as the technology is advancing, it’s becoming clear that there is great potential to make training more accessible and effective.
    “It is important, however, that research is driven by the needs of the user and existing evidence rather than by the technology. Rather than thinking ‘we have a new bit of VR or AR kit, what can we do with it?’, we should be looking at the problem that needs solving: what are the learning needs, and how can we use technology to meet them?
    “Developing immersive training materials can be very time-consuming and difficult to evaluate properly. Getting surgeons and medical students to take time out to test your VR training is challenging. In our case we were lucky to have a surgeon, Professor Ashraf Ayoub, a Professor of Oral and Maxillofacial Surgery at the University of Glasgow, who granted us permission to film a surgical procedure that was then transformed into a 3D environment to train students about situation awareness while in the operating theatre.”
    Professor Peebles hopes the work so far will provide a basis for more investigations that could help get the most from the potential that VR and immersive technology have to offer.
    “Conducting these kinds of studies is difficult to do well, in particular getting sufficient quantitative data that allows you to rigorously evaluate them. As the report recommends, more collaboration is required to pool technological and intellectual resources, to try to develop a set of standards and a community that works together to boost and improve research in this area.”
    Story Source:
    Materials provided by University of Huddersfield. Note: Content may be edited for style and length.