More stories

  • Scientists use artificial intelligence to detect gravitational waves

    When gravitational waves were first detected in 2015 by the advanced Laser Interferometer Gravitational-Wave Observatory (LIGO), they sent a ripple through the scientific community, as they confirmed another of Einstein’s theories and marked the birth of gravitational wave astronomy. Five years later, numerous gravitational wave sources have been detected, including the first observation of two colliding neutron stars in gravitational and electromagnetic waves.
    As LIGO and its international partners continue to upgrade their detectors’ sensitivity to gravitational waves, they will be able to probe a larger volume of the universe, making the detection of gravitational wave sources a daily occurrence. This deluge of discoveries will launch the era of precision multi-messenger astronomy, which draws on extrasolar messengers including electromagnetic radiation, gravitational waves, neutrinos and cosmic rays. Realizing this goal, however, will require a radical rethinking of the existing methods used to search for and find gravitational waves.
    Recently, Eliu Huerta, a computational scientist and lead for translational artificial intelligence (AI) at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, working with collaborators from Argonne, the University of Chicago, the University of Illinois at Urbana-Champaign, NVIDIA and IBM, developed a new production-scale AI framework for accelerated, scalable and reproducible detection of gravitational waves.
    This new framework indicates that AI models could be as sensitive as traditional template matching algorithms, but orders of magnitude faster. Furthermore, these AI algorithms would only require an inexpensive graphics processing unit (GPU), like those found in video gaming systems, to process advanced LIGO data faster than real time.
    The AI ensemble used for this study processed an entire month — August 2017 — of advanced LIGO data in less than seven minutes, distributing the dataset over 64 NVIDIA V100 GPUs. The AI ensemble used by the team for this analysis identified all four binary black hole mergers previously identified in that dataset, and reported no misclassifications.
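    The paper is the authoritative description of the models, but the general pattern is straightforward to sketch: a small convolutional network classifies short windows of whitened strain data, and inference is batched onto a GPU (and, at production scale, sharded across many GPUs). The architecture, window length and sampling rate below are illustrative assumptions, not the ensemble published in Nature Astronomy.

    ```python
    # Minimal sketch, not the published ensemble: a toy 1D CNN that scores
    # one-second windows of whitened, LIGO-like strain for merger-like signals.
    import torch
    import torch.nn as nn

    SAMPLE_RATE = 4096            # Hz, a typical advanced LIGO rate (assumption)
    WINDOW = SAMPLE_RATE          # one-second analysis windows

    class StrainClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=16, stride=4), nn.ReLU(),
                nn.Conv1d(16, 32, kernel_size=8, stride=4), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                nn.Linear(32, 1),  # logit: signal vs. noise
            )

        def forward(self, x):
            return self.net(x)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = StrainClassifier().to(device).eval()

    # Stand-in data: a batch of windows sliced from a longer strain time series.
    windows = torch.randn(256, 1, WINDOW, device=device)
    with torch.no_grad():
        scores = torch.sigmoid(model(windows)).squeeze(1)
    print("windows flagged as candidates:", int((scores > 0.5).sum()))
    ```

    Processing a month of strain data in minutes, as the team reports, is then largely a matter of slicing the data stream into such batches and distributing them across the available GPUs.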
    “As a computer scientist, what’s exciting to me about this project,” said Ian Foster, director of Argonne’s Data Science and Learning (DSL) division, “is that it shows how, with the right tools, AI methods can be integrated naturally into the workflows of scientists — allowing them to do their work faster and better — augmenting, not replacing, human intelligence.”
    Bringing disparate resources to bear, this interdisciplinary and multi-institutional team of collaborators has published a paper in Nature Astronomy showcasing a data-driven approach that combines the team’s collective supercomputing resources to enable reproducible, accelerated, AI-driven gravitational wave detection.
    “In this study, we’ve used the combined power of AI and supercomputing to help solve timely and relevant big-data experiments. We are now making AI studies fully reproducible, not merely ascertaining whether AI may provide a novel solution to grand challenges,” Huerta said.
    Building upon the interdisciplinary nature of this project, the team looks forward to new applications of this data-driven framework beyond big-data challenges in physics.
    “This work highlights the significant value of data infrastructure to the scientific community,” said Ben Blaiszik, a research scientist at Argonne and the University of Chicago. “The long-term investments that have been made by DOE, the National Science Foundation (NSF), the National Institutes of Standards and Technology and others have created a set of building blocks. It is possible for us to bring these building blocks together in new and exciting ways to scale this analysis and to help deliver these capabilities to others in the future.”
    Huerta and his research team developed their new framework through the support of the NSF, Argonne’s Laboratory Directed Research and Development (LDRD) program and DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.
    “These NSF investments contain original, innovative ideas that hold significant promise of transforming the way scientific data arriving in fast streams are processed. The planned activities are bringing accelerated and heterogeneous computing technology to many scientific communities of practice,” said Manish Parashar, director of the Office of Advanced Cyberinfrastructure at NSF.

  • Study gauges hospital preparedness for the next national medical crisis

    As the COVID-19 pandemic wanes in the U.S., a new study from the University of Maryland School of Medicine (UMSOM) and University of Maryland Medical Center (UMMC) finds that hospitals nationwide may not be adequately prepared for the next pandemic. A 10-year analysis of hospitals’ preparedness for pandemics and other mass casualty events found only marginal improvements in a measurement to assess preparedness during the years leading up to the COVID-19 pandemic. The study was published last month in the Journal of Healthcare Management.
    “Our work links objective healthcare data to a hospital score that assesses the ability to save lives in a disaster,” said study lead author David Marcozzi, MD, Professor of Emergency Medicine at UMSOM and Chief Clinical Officer/Senior Vice President at UMMC. “It attempts to fill a glaring gap in the national conversation on the need for improved assessments of and the opportunity for better hospital planning to assure readiness.”
    To conduct the research, Dr. Marcozzi, who is also the COVID-19 Incident Commander for the University of Maryland Medical System, and his colleagues first developed and published a surge index tool that linked standard reported hospital information to healthcare preparedness elements. The tool, called the Hospital Medical Surge Preparedness Index (HMSPI), used data from 2005 to 2014 to produce a score designed to predict how well a hospital can handle a sudden influx of patients due to a mass shooting or infectious disease outbreak. Such data included the size of the medical staff, the number of hospital beds, and the amount of equipment and supplies.
    Medical surge capacity is an important measure to assess a hospital’s ability to expand quickly beyond normal services to meet an increased demand for healthcare. The Las Vegas mass shooting in 2017, for example, sent more than 500 concertgoers to local hospitals. During the early weeks of the COVID-19 pandemic, New York City hospitals were under siege with 4,000 patients hospitalized. To calculate the HMSPI, researchers input data from four important metrics:
    • Staff: doctors, nurses, pharmacists, respiratory technicians and others
    • Supplies: personal protective equipment, cardiac monitors, sterile bandages, and ventilators
    • Space: total beds and the number of beds that current staff can handle
    • Systems: a framework for enabling electronic sharing of files and information between departments and multiple hospitals
    In the new study, Dr. Marcozzi and his colleagues used data from the American Hospital Association’s annual surveys of more than 6,200 hospitals nationwide that were collected from 2005 to 2014. They also employed data from the U.S. Census Bureau to determine population estimates in cities and the Dartmouth Atlas Project to establish the geographic service area of each hospital. They combined the hospital metrics gleaned from the AHA’s annual surveys with the geographic data to calculate HMSPI composite scores for hospitals in each state.
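    The release does not give the index’s exact formula, but the general pattern, normalizing each of the four inputs and combining them into a single score, can be sketched as follows. The scaling bounds and equal weights are illustrative assumptions, not the validated HMSPI.

    ```python
    # Toy surge-preparedness score in the spirit of the HMSPI. The real index's
    # normalization and weights are not given in the article; the bounds and
    # equal weights here are assumptions for illustration only.

    def minmax(value, lo, hi):
        """Scale a raw metric to the 0..1 range against assumed bounds."""
        return max(0.0, min(1.0, (value - lo) / (hi - lo)))

    def toy_surge_index(staff, supplies, space, systems):
        parts = [
            minmax(staff, 50, 5000),       # clinical staff headcount
            minmax(supplies, 100, 20000),  # ventilators, monitors, PPE units, etc.
            minmax(space, 20, 2000),       # staffed beds
            minmax(systems, 0, 10),        # data-sharing capabilities in place
        ]
        return 100 * sum(parts) / len(parts)  # 0 (unprepared) to 100 (ideal readiness)

    print(round(toy_surge_index(staff=1200, supplies=6000, space=450, systems=6), 1))
    ```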
    Their evaluation found varying levels of increases in HMSPI scores from 2005 to 2014 in every state, which could indicate that states are becoming better prepared to handle a medical surge. The scores also indicated that ideal readiness had not yet been achieved in any state before the COVID-19 pandemic.
    “This is just the starting point. We need to better understand the ability of our nation’s hospitals to save lives in times of crisis,” said Dr. Marcozzi. This information, and follow-up studies building on this work, will be key to better matching states’ healthcare resources to their populations to assure optimal care is delivered. One impactful follow-up study, Dr. Marcozzi said, would use data from the COVID-19 pandemic to test whether the index predicted, based on patient outcomes, which hospitals were best prepared for the pandemic surge.
    “This pioneering work is a needed advancement that could allow for a transparent assessment of a hospital’s ability to save lives in a large-scale emergency,” Dr. Marcozzi said. “The COVID-19 pandemic demonstrated that there is still plenty of room for improvement in the ability of our nation’s healthcare system to triage and manage multiple patients in a crisis and that translates into lives lost, unnecessarily. Our research is dedicated to those who lost their lives in this tragedy and other mass casualty events. We can do better.”
    National health leadership organizations, such as the U.S. Centers for Medicare and Medicaid Services, the Assistant Secretary for Preparedness and Response, the Joint Commission, and the American Medical Association, as well as state and local emergency planners, could all potentially benefit from the use of HMSPI scores, according to Dr. Marcozzi. The tool could be used to support data-driven policy development and resource allocation to close gaps and assure that individuals get the care they need, when they need it, during a crisis.
    Ricardo Pietrobon, MD, PhD, MBA, Adjunct Associate Professor of Emergency Medicine at UMSOM, Nicole Baehr, Manager of Operations at UMMC, and Brian J. Browne, MD, Professor and Chair of the Department of Emergency Medicine, were co-authors on this study. Researchers from the University of Nebraska Medical Center, University of Miami, and the U.S. Department of Veterans Affairs also participated in this research. The study was funded by the Bipartisan Commission on Biodefense.
    “The COVID-19 pandemic taught us that we need to be better prepared for the unexpected crisis,” said E. Albert Reece, MD, PhD, MBA, Executive Vice President for Medical Affairs, UM Baltimore, and the John Z. and Akiko K. Bowers Distinguished Professor and Dean, University of Maryland School of Medicine. “Having an important metric like the HMSPI could be a game changer that ultimately saves lives during a surge by helping hospitals identify and fix their vulnerabilities.”

  • For many students, double-dose algebra leads to college attainment

    In the United States, low-income and minority students complete college at low rates compared to their higher-income and majority peers, a gap that works against efforts to reduce economic inequality. Double-dose algebra could be a solution, according to a new study published in Proceedings of the National Academy of Sciences of the United States of America (PNAS).
    The paper, “Effects of Double-Dose Algebra on College Persistence and Degree Attainment,” is the culmination of a series of studies that followed two cohorts of ninth-grade students over a period of 12 years in the Chicago Public Schools (CPS) where double-dose algebra was introduced in 2003.
    The new policy required incoming ninth graders with eighth-grade math scores below the national median to complete two periods of math: one period of algebra, plus an additional period of instruction designed to build foundational prealgebra skills. Research findings showed that, for median-skill students scoring just below the 50th-percentile cutoff in the 2003 cohort, double-dose algebra significantly increased semesters of college attended and college degree attainment.
    “This provides unique insight for districts that provide extra instruction but are unable to rigorously study the impact of those programs,” said Takako Nomi, Ph.D., associate professor of educational studies at Saint Louis University. Her work focuses on educational policy and equity.
    Nomi, who also serves as research affiliate at the University of Chicago’s Consortium on Chicago School Research, led the study. Other authors include Stephen W. Raudenbush, Ed.D., of the department of sociology at the University of Chicago; and Jake J. Smith, of Harris School of Public Policy at the University of Chicago.
    A key takeaway from the study is that how schools chose to implement the policy matters, Nomi said. Fewer schools adopted the cut-score-based double-dose algebra program in 2004 than in 2003. Most schools that did strongly comply in 2004 did so by placing their median-skill double-dose students in low-skill algebra classrooms, according to the study.
    In terms of classroom peer composition, “the impact was largest when schools didn’t group double-dose students with low-skilled students,” Nomi said. Research findings demonstrate that when students were placed in double-dose classes with much lower-skilled peers, the program had no effect. Subsequent research should address the design of optimal policies for lower-skill students, Nomi said. A math intervention far more intensive than double-dose algebra is essential to improve their high school and postsecondary outcomes. The study also notes that ninth-grade students who fail math also tend to fail other core classes.
    “It’s not just a math issue,” Nomi said. “The policy of giving extra math is not enough to change the trajectory for the students who struggle the most. It’s important to support struggling students in general.”
    This study was supported by grant R305A170602 from the Institute of Education Sciences entitled, “Doubling Up? Understanding the Long-Term Effects of Ninth-Grade Algebra Reform on College Persistence and Graduation.”
    Nomi’s research interests include urban education, education policy, inequality in education, school reforms, and college readiness. Nomi is associate director of the Sinquefield Center for Applied Economic Research where she collaborates with top SLU researchers. In a separate study, she’s exploring why low-income and minority students — particularly Black males — are less likely to complete college. She is also a part of a faculty advisory board at SLU’s Geospatial Institute.
    Story Source:
    Materials provided by Saint Louis University. Original written by Bridjes O’Neil. Note: Content may be edited for style and length.

  • New study shows mathematical models helped reduce the spread of COVID-19

    Colorado researchers have published new findings in Emerging Infectious Diseases that take a first look at the use of SARS-CoV-2 mathematical modeling to inform early statewide policies enacted to reduce the spread of the coronavirus in Colorado. Among other findings, the authors estimate that 97 percent of potential hospitalizations across the state in the early months of the pandemic were avoided as a result of social distancing and other transmission-reducing measures such as mask wearing and social isolation of symptomatic individuals.
    The modeling team was led by faculty and researchers in the Colorado School of Public Health and involved experts from the University of Colorado Anschutz Medical Campus, University of Colorado Denver, University of Colorado Boulder, and Colorado State University.
    “One of the defining characteristics of the COVID-19 pandemic was the need for rapid response in the face of imperfect and incomplete information,” said the authors. “Mathematical models of infectious disease transmission can be used in real-time to estimate parameters, such as the effective reproductive number (Re) and the efficacy of current and future intervention measures, and to provide time-sensitive data to policymakers.”
    The new paper describes the development of such a model, in close collaboration with the Colorado Department of Public Health and Environment and the Colorado Governor’s office, to gauge the impact of early policies to decrease social contacts and, later, the impact of gradually relaxing Stay-at-Home orders. The authors note that preparing for hospital intensive care unit (ICU) loads or capacity limits was a critical decision-making issue.
    The Colorado COVID-19 Modeling team developed a susceptible-exposed-infected-recovered (SEIR) model calibrated to Colorado COVID-19 case and hospitalization data to estimate changes in the contact rate and the Re after emergence of SARS-CoV-2 and the implementation of statewide COVID-19 control policies in Colorado. The modeling team supplemented model estimates with an analysis of mobility by using mobile device location data. Estimates were generated in near real time, at multiple time-points, with a rapidly evolving understanding of SARS-CoV-2. At each time point, the authors generated projections of the possible course of the outbreak under an array of intervention scenarios. Findings were regularly provided to key Colorado decision-makers.
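    The team’s actual model is considerably richer, with age structure, hospitalization compartments and parameters fitted to Colorado case data, but a minimal SEIR skeleton with a contact-reduction term shows the basic machinery behind estimates of Re and projected epidemic curves. All parameter values below are placeholders, not Colorado estimates.

    ```python
    # Minimal SEIR sketch with a social-distancing factor; a simplified stand-in
    # for the calibrated Colorado model. All parameters are illustrative.
    import numpy as np
    from scipy.integrate import solve_ivp

    N = 5.8e6                                # approximate Colorado population
    beta0, sigma, gamma = 0.5, 1 / 4, 1 / 7  # transmission, 1/latent, 1/infectious (per day)
    distancing = 0.6                         # assumed fractional reduction in contacts

    def seir(t, y):
        S, E, I, R = y
        beta = beta0 * (1 - distancing) if t > 20 else beta0  # policy begins on day 20
        new_infections = beta * S * I / N
        return [-new_infections,
                new_infections - sigma * E,
                sigma * E - gamma * I,
                gamma * I]

    sol = solve_ivp(seir, (0, 180), [N - 10, 0, 10, 0], dense_output=True)
    t = np.linspace(0, 180, 181)
    S, E, I, R = sol.sol(t)
    print("peak concurrent infections: %.0f on day %d" % (I.max(), t[I.argmax()]))
    print("Re under distancing: %.2f" % (beta0 * (1 - distancing) / gamma * S[-1] / N))
    ```

    Varying the distancing parameter reproduces, in miniature, the kind of scenario projections the team generated for decision-makers at each time point.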
    “Real-time estimation of contact reduction enabled us to respond to urgent requests to actively inform rapidly changing public health policy amidst a pandemic. In early stages, the urgent need was to flatten the curve,” note the authors. “Once infections began to decrease, there was interest in the degree of increased social contact that could be tolerated as the economy reopened without leading to overwhelmed hospitals.”
    “Although our analysis is specific to Colorado, our experience highlights the need for locally calibrated transmission models to inform public health preparedness and policymaking, along with ongoing analyses of the impact of policies to slow the spread of SARS-CoV-2,” said Andrea Buchwald, PhD, lead author from the Colorado School of Public Health at CU Anschutz. “We present this material not as a final estimate of the impact of social distancing policies, but to illustrate how models can be constructed and adapted in real-time to inform critical policy questions.”
    Story Source:
    Materials provided by University of Colorado Anschutz Medical Campus. Original written by Tonya Ewers. Note: Content may be edited for style and length.

  • AI predicts diabetes risk by measuring fat around the heart

    A team led by researchers from Queen Mary University of London has developed a new artificial intelligence (AI) tool that is able to automatically measure the amount of fat around the heart from MRI scan images.
    Using the new tool, the team was able to show that a larger amount of fat around the heart is associated with significantly greater odds of diabetes, independent of a person’s age, sex, and body mass index.
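    A result of the form “greater odds of diabetes, independent of age, sex and body mass index” is what a covariate-adjusted logistic regression produces. The sketch below shows that analysis pattern on synthetic data; it is not the study’s dataset, pipeline or published effect size.

    ```python
    # Covariate-adjusted association sketch: odds of diabetes versus a measure of
    # fat around the heart, adjusting for age, sex and BMI. Synthetic data only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000
    age = rng.normal(60, 8, n)
    sex = rng.integers(0, 2, n)             # 0 = female, 1 = male
    bmi = rng.normal(27, 4, n)
    fat = rng.normal(10, 3, n) + 0.1 * bmi  # fat volume loosely tied to BMI
    logit = -8 + 0.04 * age + 0.3 * sex + 0.08 * bmi + 0.2 * fat
    diabetes = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

    X = sm.add_constant(np.column_stack([fat, age, sex, bmi]))
    result = sm.Logit(diabetes, X).fit(disp=0)
    print("adjusted odds ratio per unit of fat: %.2f" % np.exp(result.params[1]))
    ```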
    The research is published in the journal Frontiers in Cardiovascular Medicine and is the result of funding from the CAP-AI programme, which is led by Barts Life Sciences, a research and innovation partnership between Queen Mary University of London and Barts Health NHS Trust.
    The distribution of fat in the body can influence a person’s risk of developing various diseases. The commonly used measure of body mass index (BMI) mostly reflects fat accumulation under the skin, rather than around the internal organs. In particular, there are suggestions that fat accumulation around the heart may be a predictor of heart disease, and has been linked to a range of conditions, including atrial fibrillation, diabetes, and coronary artery disease.
    Lead researcher Dr Zahra Raisi-Estabragh from Queen Mary University of London said: “Unfortunately, manual measurement of the amount of fat around the heart is challenging and time-consuming. For this reason, to date, no-one has been able to investigate this thoroughly in studies of large groups of people.
    “To address this problem, we’ve invented an AI tool that can be applied to standard heart MRI scans to obtain a measure of the fat around the heart automatically and quickly, in under three seconds. This tool can be used by future researchers to discover more about the links between the fat around the heart and disease risk, but also potentially in the future, as part of a patient’s standard care in hospital.”
    The research team tested the AI algorithm’s ability to interpret images from heart MRI scans of more than 45,000 people, including participants in the UK Biobank, a database of health information from over half a million participants across the UK. The team found that the AI tool could accurately determine the amount of fat around the heart in those images and could also use those measurements to calculate a patient’s risk of diabetes.
    Dr Andrew Bard from Queen Mary University of London, who led the technical development, added: “The AI tool also includes an in-built method for calculating uncertainty of its own results, so you could say it has an impressive ability to mark its own homework.”
    Professor Steffen Petersen from Queen Mary University of London, who supervised the project, said: “This novel tool has high utility for future research and, if clinical utility is demonstrated, may be applied in clinical practice to improve patient care. This work highlights the value of cross-disciplinary collaborations in medical research, particularly within cardiovascular imaging.”
    Story Source:
    Materials provided by Queen Mary University of London. Note: Content may be edited for style and length.

  • A tweaked yeast can make ethanol from cornstalks and a harvest’s other leftovers

    When corn farmers harvest their crop, they often leave the stalks, leaves and spent cobs to rot in the fields. Now, engineers have fashioned a new strain of yeast that can convert this inedible debris into ethanol, a biofuel. If the process can be scaled up, this largely untapped renewable energy source could help reduce reliance on fossil fuels.

    Previous efforts to convert this fibrous material, called corn stover, into fuel met with limited success. Before yeasts can do their job, corn stover must be broken down, but this process often generates by-products that kill yeasts. But by tweaking a gene in common baker’s yeast, researchers have engineered a strain that can defuse those deadly by-products and get on with the job of turning sugar into ethanol.

    The new yeast was able to produce over 100 grams of ethanol for every liter of treated corn stover, an efficiency comparable to the standard process using corn kernels to make the biofuel, the researchers report June 25 in Science Advances.

    “They’ve produced a more resilient yeast,” says Venkatesh Balan, a chemical engineer at the University of Houston not involved in the research. The new strain may benefit biofuel producers trying to harness materials like corn stover, he says.

    In the United States, most ethanol is made from corn, the country’s largest crop, and is mixed into most of the gasoline sold at gas stations. Corn ethanol is a renewable energy source, but it has limitations. Diverting corn to make ethanol can detract from the food supply, and expanding cropland just to plant corn for biofuel clears natural habitats (SN: 12/21/20). Converting inedible corn stover into ethanol could increase the biofuel supply without having to plant more crops.

    “Corn can’t really displace petroleum as a raw material for fuels,” says metabolic engineer Felix Lam of MIT. “But we have an alternative.”

    Lam and colleagues started with Saccharomyces cerevisiae, or common baker’s yeast. Like sourdough bakers and brewers, biofuel producers already use yeast: It can convert sugars in corn kernels into ethanol (SN: 9/19/17).

    But unlike corn kernels with easy-access sugars, corn stover contains sugars bound in lignocellulose, a plant compound that yeast can’t break down. Applying harsh acids can free these sugars, but the process generates toxic by-products called aldehydes that can kill yeasts.

    But Lam’s team had an idea — convert the aldehydes into something tolerable to yeast. The researchers already knew that by adjusting the chemistry of the yeast’s growing environment, they could improve its tolerance to alcohol, which is also harmful at high concentrations. With that in mind, Lam and colleagues homed in on a yeast gene called GRE2, which helps convert aldehydes into alcohol. The team randomly generated about 20,000 yeast variants, each with a different, genetically modified version of GRE2. Then, the researchers placed the horde of variants inside a flask that also contained toxic aldehydes to see which yeasts would survive.

    Multiple variants survived the gauntlet, but one dominated. With this battle-tested version of GRE2, the researchers found that the modified baker’s yeast could produce ethanol from treated corn stover almost as efficiently as from corn kernels. What’s more, the yeast could generate ethanol from other woody materials, including wheat straw and switchgrass (SN: 1/14/14). “We have a single strain that can accomplish all this,” Lam says.

    This strain resolves a key challenge in fermenting ethanol from fibrous materials like corn stover, Balan says. But “there are many more improvements that will have to happen to make this technology commercially viable,” he adds, such as logistical challenges in harvesting, transporting and storing large volumes of corn stover.

    “There are so many moving parts to this problem,” Lam acknowledges. But he thinks his team’s findings could help kick-start a “renewable pipeline” that harnesses underused, sustainable fuel sources. The vision, he says, is to challenge the reign of fossil fuels.

  • Scientists closing in on map of the mammalian immune system

    Using artificial intelligence, UT Southwestern scientists have identified thousands of genetic mutations likely to affect the immune system in mice. The work is part of one Nobel laureate’s quest to find virtually all such variations in mammals.
    “This study identifies 101 novel gene candidates with greater than 95% chance of being required for immunity,” says Bruce Beutler, M.D., director of the Center for the Genetics of Host Defense (CGHD) and corresponding author of the study published this week in the Proceedings of the National Academy of Sciences. “Many of these candidates we have already verified by re-creating the mutations or knocking out the genes.” Lead author Darui Xu, a computational biologist at CGHD, wrote the software.
    “We’ve developed software called Candidate Explorer (CE) that uses a machine-learning algorithm to identify chemically induced mutations likely to cause traits related to immunity. The software determines the probability that any mutation we’ve induced will be verified as causative after further testing,” Beutler says. His discovery of an important family of receptors that allow mammals to quickly sense infection and trigger an inflammatory response led to the 2011 Nobel Prize in Physiology or Medicine.
    “The purpose of CE is to help researchers predict whether a mutation associated with a phenotype (trait or function) is a truly causative mutation. CE has already helped us to identify hundreds of genes with novel functions in immunity. This will improve our understanding of the immune system so that we can find new ways to keep it robust, and also know the reason it sometimes falters,” says Beutler, Regental Professor, and professor of immunology and internal medicine at UT Southwestern.
    “CE provides a score that tells us the likelihood that a particular mutation-phenotype association will be verified for cause and effect if we re-create the mutation or knock out the gene,” he says.
    CE examines 67 features of the primary genetic mapping data to arrive at an estimate of the likelihood of causation. For some mutations, causation is very clear; for others, less so. Over time, the program “learns” from experiments in which researchers re-create the mutation in a fresh pedigree and verify or exclude the hypothesis of causation. All mutations are made available to the scientific community through a public repository, and the data supporting causation are viewable within the Candidate Explorer program on the CGHD website, Mutagenetix (https://mutagenetix.utsouthwestern.edu/).
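    The release describes CE as a machine-learning model that turns 67 features of the mapping data into a probability that a mutation-phenotype association will be verified. That general pattern can be sketched with a gradient-boosted classifier on synthetic features, as below; the real feature definitions, training labels and model belong to the CGHD software, not this toy.

    ```python
    # Sketch of a CE-like scorer: learn from past verification outcomes, then
    # report a probability of causation for new mutation-phenotype candidates.
    # Features, labels and model choice are illustrative assumptions only.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n_candidates, n_features = 3000, 67  # 67 mapping features, as in the article
    X = rng.normal(size=(n_candidates, n_features))
    # Pretend ground truth: verification depends on a few informative features.
    y = X[:, 0] + 0.5 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(size=n_candidates) > 0.5

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = GradientBoostingClassifier().fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]  # estimated chance a candidate verifies
    print("held-out candidates scoring above 0.95:", int((proba > 0.95).sum()))
    ```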

  • A new look at color displays

    Researchers at Linköping University have developed a method that may lead to new types of displays based on structural colours. The discovery opens the way to cheap and energy-efficient colour displays and electronic labels. The study has been published in the scientific journal Advanced Materials.
    We usually think of colours as created by pigments, which absorb light at certain wavelengths such that we perceive colour from the other wavelengths that are scattered and reach our eyes. That’s why leaves, for example, are green and tomatoes red. But colours can be created in other ways, and some materials appear coloured due to their structure. Structural colours can arise when light is reflected repeatedly inside a material on the scale of nanometres, an effect usually referred to as interference. An example found in nature is the peacock feather, which is fundamentally brown but acquires its characteristic blue-green sheen from small structural features.
    Researchers at Linköping University have developed a new and simple method to create structural colours for use with reflective colour displays. The new method may enable manufacturing of thin and lightweight displays with high energy-efficiency for a broad range of applications.
    Reflective colour displays differ from the colour displays we see in everyday life on devices such as mobile phones and computers. The latter consist of small light-emitting diodes of red, green and blue positioned close to each other such that they together create white light. The colour of each light-emitting diode depends on the molecules from which it is built, or in other words, its pigment. However, it is relatively expensive to manufacture light-emitting diodes, and the global use of emissive displays consumes a lot of energy. Another type of display, reflective displays, is therefore being explored for purposes such as tablet computers used as e-readers, and electronic labels. Reflective displays form images by controlling how incident light from the surroundings is reflected, which means that they do not need their own source of illumination. However, most reflective displays are intrinsically monochrome, and attempts to create colour versions have been rather complicated and have sometimes given poor results.
    Shangzhi Chen, a newly graduated doctor at the Laboratory of Organic Electronics at Linköping University, is the principal author of an article, published in the scientific journal Advanced Materials, that describes a new type of dynamic structural colour image.
    “We have developed a simple method to produce structural colour images with electrically conducting plastics, or conducting polymers. The polymer is applied at nanoscale thicknesses onto a mirror by a technique known as vapour phase polymerisation, after the substrate has been illuminated with UV light. The stronger the UV illumination, the thicker the polymer film, and this allows us to control the structural colours that appear at different locations on the substrate,” says Shangzhi Chen.
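    The physics behind this is thin-film interference: light reflected from the top of the polymer film and light reflected from the mirror beneath it interfere, and the film’s optical thickness selects which visible wavelengths come back strongly. The sketch below evaluates the textbook constructive-interference condition for a few thicknesses; the refractive index, the thickness values and the neglect of interface phase shifts are simplifying assumptions, not measurements of the Linköping devices.

    ```python
    # Thin-film interference sketch: which visible wavelengths a polymer film of
    # a given thickness reflects most strongly, using the textbook condition
    # 2 * n * d = m * wavelength. Interface phase shifts and dispersion are
    # ignored, so this is a simplification, not a model of the actual displays.
    N_POLYMER = 1.5  # assumed refractive index

    def reflection_peaks(thickness_nm, n=N_POLYMER, band=(380, 740)):
        """Visible wavelengths (nm) satisfying the constructive condition."""
        peaks, m = [], 1
        while True:
            wavelength = 2 * n * thickness_nm / m
            if wavelength < band[0]:
                break
            if wavelength <= band[1]:
                peaks.append(round(wavelength))
            m += 1
        return peaks

    for d in (150, 180, 220):  # illustrative film thicknesses in nm
        print(f"{d} nm film -> reflection peaks near {reflection_peaks(d)} nm")
    ```

    Thicker films push the strongest reflection toward longer wavelengths, which is the handle that UV-controlled film thickness gives over the colour displayed at each point on the substrate.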
    The method can produce all colours in the visible spectrum. Furthermore, the colours can be subsequently adjusted using electrochemical variation of the redox state of the polymer. This function has been popular for monochrome reflective displays, and the new study shows that the same materials can provide dynamic images in colour using optical interference effects combined with spatial control of nanoscale thicknesses. Magnus Jonsson, associate professor at the Laboratory of Organic Electronics at Linköping University, believes that the method has great potential, for example, for applications such as electronic labels in colour. Further research may also allow more advanced displays to be manufactured.
    “We receive increasing amounts of information via digital displays, and if we can contribute to more people gaining access to information through cheap and energy-efficient displays, that would be a major benefit. But much research remains to be done, and new projects are already under way,” says Magnus Jonsson.
    Story Source:
    Materials provided by Linköping University. Original written by Anders Ryttarson Törneholm. Note: Content may be edited for style and length.