More stories

  • Machine learning can help slow down future pandemics

    Artificial intelligence could be one of the keys for limiting the spread of infection in future pandemics. In a new study, researchers at the University of Gothenburg have investigated how machine learning can be used to find effective testing methods during epidemic outbreaks, thereby helping to better control the outbreaks.
    In the study, the researchers developed a method to improve testing strategies during epidemic outbreaks, using relatively limited information to predict which individuals offer the best potential for testing.
    “This can be a first step towards society gaining better control of future major outbreaks and reducing the need to shut down society,” says Laura Natali, a doctoral student in physics at the University of Gothenburg and the lead author of the published study.
    Machine learning is a type of artificial intelligence that can be described as a mathematical model in which computers are trained on data sets to recognise connections and solve problems. The researchers used machine learning in a simulation of an epidemic outbreak, where information about the first confirmed cases was used to estimate infections in the rest of the population. The model drew on data about each infected individual’s network of contacts: who they had been in close contact with, where, and for how long.
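    As a rough illustration of this kind of targeted-testing loop, the sketch below builds a toy contact network, trains a simple classifier on the outcomes of tests already taken, and compares how many infected individuals a fixed test budget finds when spent on the top-ranked candidates versus at random. It is a minimal sketch under invented assumptions, not the authors’ model or code; the graph, the features and every parameter are placeholders.

```python
# Minimal sketch (not the published method): rank whom to test next on a
# simulated contact network, using only simple contact features, and compare
# with random testing. Graph, features and parameters are illustrative.
import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
G = nx.watts_strogatz_graph(n=2000, k=10, p=0.1, seed=0)  # toy contact network

def spread_once(status, beta=0.1):
    """One step of a toy epidemic: infected nodes infect susceptible neighbours."""
    newly_infected = set()
    for u in (n for n, s in status.items() if s == "I"):
        for v in G.neighbors(u):
            if status[v] == "S" and rng.random() < beta:
                newly_infected.add(v)
    for v in newly_infected:
        status[v] = "I"

def features(node, known_positive):
    """Illustrative features: contact count and number of known-positive contacts."""
    nbrs = list(G.neighbors(node))
    return [len(nbrs), sum(v in known_positive for v in nbrs)]

# Seed and run the outbreak for a few steps, then reveal a handful of test results.
status = {n: "S" for n in G.nodes}
for n in rng.choice(G.number_of_nodes(), size=10, replace=False):
    status[n] = "I"
for _ in range(5):
    spread_once(status)

infected_now = [n for n, s in status.items() if s == "I"]
known_positive = set(rng.choice(infected_now, size=min(20, len(infected_now)), replace=False))
known_negative = set(rng.choice([n for n, s in status.items() if s == "S"], size=50, replace=False))

# Train a classifier only on individuals whose test results are already known.
tested = sorted(known_positive | known_negative)
X = [features(n, known_positive) for n in tested]
y = [1 if n in known_positive else 0 for n in tested]
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Spend the same test budget two ways: top-ranked individuals vs. a random sample.
budget = 100
candidates = [n for n in G.nodes if n not in known_positive and n not in known_negative]
scores = clf.predict_proba([features(n, known_positive) for n in candidates])[:, 1]
targeted = [candidates[i] for i in np.argsort(scores)[::-1][:budget]]
random_pick = rng.choice(candidates, size=budget, replace=False)

def count_infected(picks):
    return sum(status[n] == "I" for n in picks)

print(f"infected found -- targeted: {count_infected(targeted)}, random: {count_infected(random_pick)}")
```

    In a sketch like this, the targeted budget typically finds more infected individuals than the random one simply because contacts of known cases are more likely to be infected, which is the intuition behind the strategy described above.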
    “In the study, the outbreak can quickly be brought under control when the method is used, while random testing leads to uncontrolled spread of the outbreak with many more infected individuals. Under real-world conditions, information can be added, such as demographic data, age and health-related conditions, which can improve the method’s effectiveness even more. The same method can also be used to prevent reinfections in the population if immunity after the disease is only temporary.”
    She emphasises that the study is a simulation and that testing with real data is needed to improve the method even more. Therefore, it is too early to use it in the ongoing coronavirus pandemic. At the same time, she sees the research as a first step in being able to implement more targeted initiatives to reduce the spread of infections, since the machine learning-based testing strategy automatically adapts to the specific characteristics of diseases. As an example, she mentions the potential to easily predict if a specific age group should be tested or if a limited geographic area is a risk zone, such as a school, a community or a specific neighbourhood.
    “When a large outbreak has begun, it is important to quickly and effectively identify infectious individuals. In random testing, there is a significant risk of failing to achieve this, but with a more goal-oriented testing strategy we can find more infected individuals and thereby also gain the necessary information to decrease the spread of infection. We show that machine learning can be used to develop this type of testing strategy,” she says.
    Few previous studies have examined how machine learning can be used during pandemics, particularly with a clear focus on finding the best testing strategies.
    “We show that it is possible to use relatively simple and limited information to make predictions of who would be most beneficial to test. This allows better use of available testing resources.”
    Story Source:
    Materials provided by University of Gothenburg. Note: Content may be edited for style and length.

  • New approach to centuries-old 'three-body problem'

    The “three-body problem,” the name given to the task of predicting the motion of three gravitating bodies in space, is essential for understanding a variety of astrophysical processes as well as a large class of mechanical problems, and it has occupied some of the world’s best physicists, astronomers and mathematicians for over three centuries. Their attempts have led to the discovery of several important fields of science; yet its solution remained a mystery.
    At the end of the 17th century, Sir Isaac Newton succeeded in explaining the motion of the planets around the sun through a law of universal gravitation. He also sought to explain the motion of the moon. Since both the earth and the sun determine the motion of the moon, Newton became interested in the problem of predicting the motion of three bodies moving in space under the influence of their mutual gravitational attraction, a problem that later became known as “the three-body problem.”
    However, unlike the two-body problem, Newton was unable to obtain a general mathematical solution for it. Indeed, the three-body problem proved easy to define, yet difficult to solve.
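    In modern notation, the problem Newton posed amounts to a compact set of coupled differential equations (a standard textbook formulation, not taken from the new study): each body is accelerated by the gravitational pull of the other two,

```latex
\ddot{\mathbf{r}}_i = \sum_{j \neq i} G\, m_j\,
  \frac{\mathbf{r}_j - \mathbf{r}_i}{\lVert \mathbf{r}_j - \mathbf{r}_i \rVert^{3}},
\qquad i = 1, 2, 3,
```

    where r_i and m_i are the position and mass of body i and G is the gravitational constant. The equations are easy to write down, yet, as described below, they admit no general closed-form solution.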
    New research, led by Professor Barak Kol at Hebrew University of Jerusalem’s Racah Institute of Physics, adds a step to this scientific journey that began with Newton, touching on the limits of scientific prediction and the role of chaos in it.
    The theoretical study presents a novel and exact reduction of the problem, enabled by a re-examination of the basic concepts that underlie previous theories. It allows for a precise prediction of the probability for each of the three bodies to escape the system.
    Following Newton, and after two centuries of fruitful research in the field by Euler, Lagrange, Jacobi and others, the mathematician Henri Poincaré discovered in the late 19th century that the problem exhibits extreme sensitivity to the bodies’ initial positions and velocities. This sensitivity, which later became known as chaos, has far-reaching implications — it indicates that there is no deterministic closed-form solution to the three-body problem.

  • People may trust computers more than humans

    Despite increasing concern over the intrusion of algorithms in daily life, people may be more willing to trust a computer program than their fellow humans, especially if a task becomes too challenging, according to new research from data scientists at the University of Georgia.
    From choosing the next song on your playlist to choosing the right size pants, people are relying more on the advice of algorithms to help make everyday decisions and streamline their lives.
    “Algorithms are able to do a huge number of tasks, and the number of tasks that they are able to do is expanding practically every day,” said Eric Bogert, a Ph.D. student in the Terry College of Business Department of Management Information Systems. “It seems like there’s a bias towards leaning more heavily on algorithms as a task gets harder and that effect is stronger than the bias towards relying on advice from other people.”
    Bogert worked with management information systems professor Rick Watson and assistant professor Aaron Schecter on the paper, “Humans rely more on algorithms than social influence as a task becomes more difficult,” which was published April 13 in Nature’s Scientific Reports journal.
    Their study, which involved 1,500 individuals evaluating photographs, is part of a larger body of work analyzing how and when people work with algorithms to process information and make decisions.
    For this study, the team asked volunteers to count the number of people in a photograph of a crowd and supplied suggestions generated either by a group of other people or by an algorithm.

  • Combining mask wearing, social distancing suppresses COVID-19 virus spread

    Studies show wearing masks and social distancing can contain the spread of the COVID-19 virus, but their combined effectiveness is not precisely known.
    In the journal Chaos, published by AIP Publishing, researchers at New York University and Politecnico di Torino in Italy developed a network model to study the effects of these two measures on the spread of airborne diseases like COVID-19. The model shows viral outbreaks can be prevented if at least 60% of a population complies with both measures.
    “Neither social distancing nor mask wearing alone are likely sufficient to halt the spread of COVID-19, unless almost the entire population adheres to the single measure,” author Maurizio Porfiri said. “But if a significant fraction of the population adheres to both measures, viral spreading can be prevented without mass vaccination.”
    A network model encompasses nodes, or data points, and edges, or links between nodes. Such models are used in applications ranging from marketing to tracking bird migration. The researchers’ model is based on a susceptible-exposed-infected-removed (SEIR) framework, where “removed” means recovered or deceased; each node represents a person’s health status, and the edges represent potential contacts between pairs of individuals.
    The model accounts for activity variability, meaning a few highly active nodes are responsible for much of the network’s contacts. This mirrors the validated assumption that most people have few interactions and only a few interact with many others. Scenarios involving social distancing without mask wearing and vice versa were also tested by setting up the measures as separate variables.
    The model drew on cellphone mobility data and Facebook surveys obtained from the Institute for Health Metrics and Evaluation at the University of Washington. The data showed people who wear masks are also those who tend to reduce their mobility. Based on this premise, nodes were split into individuals who regularly wear masks and socially distance and those whose behavior remains largely unchanged by an epidemic or pandemic.
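    A toy version of such a model illustrates the structure (a minimal sketch with invented parameters, not the published Chaos model): a compliance fraction of nodes adopts both measures, which lowers their contact rate and per-contact transmission probability, and the epidemic is run forward to see how much of the population is ever infected.

```python
# Minimal sketch (not the published model): an SEIR epidemic on an
# activity-driven toy network in which a `compliance` fraction of nodes both
# wears masks and keeps distance, reducing their contact rate and per-contact
# transmission probability. All parameter values here are invented.
import numpy as np

rng = np.random.default_rng(1)
N = 5000
compliance = 0.6                           # fraction adopting both measures
activity = rng.pareto(2.5, N) + 0.1        # a few nodes are far more active
compliant = rng.random(N) < compliance

beta = np.where(compliant, 0.01, 0.05)     # per-contact transmission probability
contacts = np.where(compliant, 1, 6)       # contacts drawn per infectious node per day

# Inverse-CDF sampler: a contact lands on a node with probability proportional
# to its activity, mirroring the "few highly active nodes" assumption.
cum = np.cumsum(activity / activity.sum())
cum[-1] = 1.0                              # guard against rounding

S, E, I, R = 0, 1, 2, 3
state = np.full(N, S)
state[rng.choice(N, 20, replace=False)] = I

for _ in range(150):
    for u in np.flatnonzero(state == I):
        partners = np.searchsorted(cum, rng.random(contacts[u]))
        for v in partners:
            # Transmission depends on the behaviour of both parties.
            if state[v] == S and rng.random() < np.sqrt(beta[u] * beta[v]):
                state[v] = E
    # Simple geometric progression through the exposed and infectious stages.
    state[(state == I) & (rng.random(N) < 1 / 7)] = R
    state[(state == E) & (rng.random(N) < 1 / 3)] = I

print(f"share of the population ever infected: {np.mean(state != S):.1%}")
```

    Varying compliance, or the reductions applied to beta and contacts, shows qualitatively how combining the two measures suppresses the outbreak; the published model places the actual threshold near 60% using real mobility and survey data.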
    Using data collected by The New York Times to gauge the model’s effectiveness, the researchers analyzed the cumulative cases per capita in all 50 states and the District of Columbia from July 14, 2020, when the Centers for Disease Control and Prevention officially recommended mask wearing, through Dec. 10.
    In addition to showing the effects of combining mask wearing and social distancing, the model shows the critical need for widespread adherence to public health measures.
    “U.S. states suffering the most from the largest number of infections last fall were also those where people complied less with public health guidelines, thereby falling well above the epidemic threshold predicted by our model,” Porfiri said.
    Story Source:
    Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

  • A molecule that responds to light

    Light can be used to operate quantum information processing systems, e.g. quantum computers, quickly and efficiently. Researchers at Karlsruhe Institute of Technology (KIT) and Chimie ParisTech/CNRS have now significantly advanced the development of molecule-based materials suitable for use as light-addressable fundamental quantum units. As they report in the journal Nature Communications, they have demonstrated for the first time the possibility of addressing nuclear spin levels of a molecular complex of europium(III) rare-earth ions with light.
    Whether in drug development, communication, or climate forecasting: processing information quickly and efficiently is crucial in many areas. It is currently done using digital computers, which work with so-called bits. The state of a bit is either 0 or 1, with nothing in between. This limits the performance of digital computers, and it is becoming increasingly difficult and time-consuming for them to handle complex problems related to real-world tasks. Quantum computers, on the other hand, use quantum bits to process information. A quantum bit (qubit) can be in many different states between 0 and 1 simultaneously due to a special quantum mechanical property referred to as quantum superposition. This makes it possible to process data in parallel, which for certain problems can increase the computing power of quantum computers dramatically compared to digital computers.
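    As a minimal numerical illustration of the superposition idea (plain linear algebra, not a model of the europium system described below): a qubit state is a complex two-component vector, and measurement probabilities are the squared magnitudes of its amplitudes.

```python
# Minimal numpy illustration of qubit superposition (not the KIT/Chimie
# ParisTech system): a qubit state is a complex 2-vector, and measuring it
# gives 0 or 1 with probabilities equal to the squared amplitudes.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

psi = (ket0 + ket1) / np.sqrt(2)      # an equal superposition of 0 and 1

probs = np.abs(psi) ** 2              # Born rule: |amplitude|**2
print("P(0), P(1):", probs)           # -> [0.5 0.5]

# Two qubits live in a 4-dimensional space (tensor product); n qubits carry
# 2**n amplitudes at once, which is the source of the parallelism above.
psi2 = np.kron(psi, psi)
print("two-qubit amplitudes:", np.round(psi2, 3))
```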
    Qubit Superposition States Must Persist Long Enough
    “In order to develop practically applicable quantum computers, the superposition states of a qubit should persist for a sufficiently long time. Researchers speak of ‘coherence lifetime’,” explains Professor Mario Ruben, head of the Molecular Materials research group at KIT’s Institute of Nanotechnology (INT). “However, the superposition states of a qubit are fragile and are disturbed by fluctuations in the environment, which leads to decoherence, i.e. shortening of the coherence lifetime.” One conceivable way to preserve the superposition state long enough for computational operations is to isolate the qubit from its noisy environment. Nuclear spin levels in molecules can be used to create superposition states with long coherence lifetimes because nuclear spins are weakly coupled to the environment, protecting the superposition states of a qubit from disturbing external influences.
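    The coherence lifetime can be made quantitative with a standard textbook relation (a generic model, not a measurement from this study): the superposition, encoded in the off-diagonal element of the qubit’s density matrix, typically decays as

```latex
\rho_{01}(t) = \rho_{01}(0)\, e^{-t/T_2},
```

    where T_2 is the coherence time; the longer T_2, the more qubit operations fit inside the usable window, which is why weakly coupled nuclear spin levels are attractive.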
    Molecules Are Ideally Suited As Qubit Systems
    A single qubit, however, is not enough to build a quantum computer; many qubits that can be organized and addressed are required. Molecules represent ideal qubit systems as they can be arranged in sufficiently large numbers as identical scalable units and can be addressed with light to perform qubit operations. In addition, the physical properties of molecules, such as their emission and/or magnetic properties, can be tailored by changing their structures using chemical design principles. In their paper now published in the journal Nature Communications, researchers led by Professor Mario Ruben at KIT’s IQMT and Strasbourg’s European Center for Quantum Sciences (CESQ), together with Dr. Philippe Goldner at École nationale supérieure de chimie de Paris (Chimie ParisTech/CNRS), present a nuclear-spin-containing dimeric europium(III) molecule as a light-addressable qubit.
    The europium(III) ions at the core of the molecule belong to the rare-earth metals. The molecule is designed to exhibit luminescence, i.e., a europium(III)-centered sensitized emission, when excited by ultraviolet light-absorbing ligands surrounding the center: after light absorption, the ligands transfer the light energy to the europium(III) center, thereby exciting it, and relaxation of the excited center to the ground state leads to light emission. The whole process is referred to as sensitized luminescence. Spectral hole burning experiments, which use lasers, detect the polarization of the nuclear spin levels, indicating the generation of an efficient light-nuclear spin interface. The latter enables the generation of light-addressable hyperfine qubits based on nuclear spin levels. “By demonstrating for the first time light-induced spin polarization in the europium(III) molecule, we have succeeded in taking a promising step towards the development of quantum computing architectures based on rare-earth ion-containing molecules,” explains Dr. Philippe Goldner.
    Story Source:
    Materials provided by Karlsruher Institut für Technologie (KIT). Note: Content may be edited for style and length.

  • Basketball Mathematics scores big at inspiring kids to learn

    A new study with 756 first- through fifth-graders demonstrates that a six-week mashup of hoops and math has a positive effect on their desire to learn more, provides them with an experience of increased self-determination and grows math confidence among youth. The Basketball Mathematics study was conducted at five Danish primary and elementary schools by researchers from the University of Copenhagen’s Department of Nutrition, Exercise and Sports.
    Over the past decades, considerable attention has been paid to exploring different approaches to stimulate children’s learning. In particular, the focus has been on how physical activity, separated from the learning activities, can improve children’s cognitive performance and learning. Less attention has been aimed at the potential of integrating physical activity into the learning activities themselves. The main purpose of this study was therefore to develop a learning activity that integrates basketball and mathematics and to examine how it might affect children’s motivation for mathematics.
    Increased motivation, self-determination and mastery
    756 children from 40 different classes at Copenhagen-area schools participated in the project, where about half of them — once a week for six weeks — had Basketball Mathematics during gym class, while the other half played basketball without mathematics.
    “During classes with Basketball Mathematics, the children had to collect numbers and perform calculations associated with various basketball exercises. An example could be counting how many times they could sink a basket from three meters away vs. at a one-meter distance, and subsequently adding up the numbers. Both the math and basketball elements could be adjusted to suit the children’s levels, as well as adjusting for whether it was addition, multiplication or some other function that needed to be practiced,” explains Linn Damsgaard, who is writing her PhD thesis on the connection between learning and physical activity at the University of Copenhagen’s Department of Nutrition, Exercise and Sports.
    The results demonstrate that children’s motivation for math integrated with basketball is 16% higher compared to classroom math learning. Children also experienced a 14% increase in self-determination compared with classroom teaching, while Basketball Mathematics increased mastery by 6% compared with classroom-based mathematics instruction. Furthermore, the study shows that Basketball Mathematics can maintain children’s motivation for mathematics over a six-week period, while the motivation of the control group decreased significantly.
    “It is widely acknowledged that youth motivation for schoolwork decreases as the school year progresses. Therefore, it is quite interesting that we don’t see any decrease in motivation when kids take part in Basketball Mathematics. While we can’t explain our results with certainty, it could be that Basketball Mathematics endows children with a sense of ownership of their calculations and helps them clarify and concretize abstract concepts, which in turn increases their motivation to learn mathematics through Basketball Mathematics,” says PhD student Linn Damsgaard.
    Active math on the school schedule
    Associate Professor Jacob Wienecke of UCPH’s Department of Nutrition, Exercise and Sports, who supervised the study, says that other studies have demonstrated the benefits of movement and physical activity for children’s academic learning. He expects the results of Basketball Mathematics on children’s learning and academic performance to be published soon:
    “We are currently investigating whether the Basketball Mathematics model can strengthen youth performance in mathematics. Once we have the final results, we hope that they will inspire school teachers and principals to prioritize more physical activity and movement in these subjects,” says Jacob Wienecke, who concludes:
    “Eventually, we hope to succeed in having these tools built into the school system and teacher education. The aim is that schools in the future will include ‘Active English’ and ‘Active Mathematics’ in the weekly schedule as subjects where physical education and subject-learning instructors collaborate to integrate this type of instruction with the normally more sedentary classwork.”

  • 165 new cancer genes identified with the help of machine learning

    A new algorithm can predict which genes cause cancer, even if their DNA sequence is not changed. A team of researchers in Berlin combined a wide variety of data, analyzed it with “Artificial Intelligence” and identified numerous cancer genes. This opens up new perspectives for targeted cancer therapy in personalized medicine and for the development of biomarkers.
    In cancer, cells get out of control. They proliferate and push their way into tissues, destroying organs and thereby impairing essential vital functions. This unrestricted growth is usually induced by an accumulation of DNA changes in cancer genes — i.e. mutations in these genes that govern the development of the cell. But some cancers have only very few mutated genes, which means that other causes lead to the disease in these cases.
    A team of researchers at the Max Planck Institute for Molecular Genetics (MPIMG) in Berlin and at the Institute of Computational Biology of Helmholtz Zentrum München developed a new machine learning algorithm that identified 165 previously unknown cancer genes. The sequences of these genes are not necessarily altered — apparently, dysregulation of these genes alone can lead to cancer. All of the newly identified genes interact closely with well-known cancer genes and have been shown to be essential for the survival of tumor cells in cell culture experiments.
    Additional targets for personalized medicine
    The algorithm, dubbed “EMOGI” for Explainable Multi-Omics Graph Integration, can also explain the relationships in the cell’s machinery that make a gene a cancer gene. As the team of researchers headed by Annalisa Marsico describes in the journal Nature Machine Intelligence, the software integrates tens of thousands of data sets generated from patient samples. In addition to sequence data with mutations, these contain information about DNA methylation, the activity of individual genes and the interactions of proteins within cellular pathways. In these data, a deep-learning algorithm detects the patterns and molecular principles that lead to the development of cancer.
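    To make the idea of multi-omics graph integration concrete, here is a deliberately simplified sketch: genes sit on a protein-protein interaction graph, each carries a few omics-derived features, and a single hand-written graph-convolution step feeds a plain classifier. This is not the EMOGI architecture or its data; every number and name below is a placeholder.

```python
# Deliberately simplified sketch of multi-omics graph integration (not the
# EMOGI architecture or its data): genes are nodes of a protein-protein
# interaction graph, each carries placeholder omics features, and one
# hand-written graph-convolution step feeds a plain classifier.
import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_genes = 300
ppi = nx.barabasi_albert_graph(n_genes, 3, seed=0)   # toy interaction network
A = nx.to_numpy_array(ppi) + np.eye(n_genes)         # adjacency with self-loops
A = A / A.sum(axis=1, keepdims=True)                 # row-normalise

# Placeholder per-gene features standing in for mutation rate, methylation
# and expression change (in the real study these come from patient cohorts).
X = rng.normal(size=(n_genes, 3))

# Placeholder labels standing in for curated known cancer genes.
y = np.zeros(n_genes, dtype=int)
y[rng.choice(n_genes, 30, replace=False)] = 1

H = A @ X                        # one graph-convolution step: average neighbours
features = np.hstack([X, H])     # a gene's own omics plus its network context

clf = LogisticRegression(max_iter=1000).fit(features, y)
scores = clf.predict_proba(features)[:, 1]
print("top-scoring candidate genes (toy indices):", np.argsort(scores)[::-1][:10])
```

    EMOGI itself uses a deep graph-based learning model and real patient-derived data; the point of the sketch is only how network structure and several data layers can be combined into one prediction per gene.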
    “Ideally, we obtain a complete picture of all cancer genes at some point, which can have a different impact on cancer progression for different patients,” says Marsico, head of a research group at the MPIMG until recently and now at Helmholtz Zentrum München. “This is the foundation for personalized cancer therapy.”
    Unlike conventional cancer treatments such as chemotherapy, personalized therapy approaches tailor medication precisely to the type of tumor. “The goal is to select the best therapy for each patient — that is, the most effective treatment with the fewest side effects. Additionally, we would be able to identify cancers already at early stages, based on their molecular characteristics.”

  • Discovery could help lengthen lifespan of electronic devices

    Ferroelectric materials are used in many devices, including memories, capacitors, actuators and sensors. These devices are commonly used in both consumer and industrial instruments, such as computers, medical ultrasound equipment and underwater sonars.
    Over time, ferroelectric materials are subjected to repeated mechanical and electrical loading, leading to a progressive decrease in their functionality, ultimately resulting in failure. This process is referred to as ‘ferroelectric fatigue’.
    Ferroelectric fatigue is a major cause of failure in a range of electronic devices, and discarded electronics are a leading contributor to e-waste. Globally, tens of millions of tonnes of failed electronic devices go to landfill every year.
    Using advanced in-situ electron microscopy, researchers from the University of Sydney’s School of Aerospace, Mechanical and Mechatronic Engineering were able to observe ferroelectric fatigue as it occurred. The technique uses an advanced microscope to ‘see’, in real time, down to the nanoscale and atomic levels.
    The researchers hope this new observation, described in a paper published in Nature Communications, will help better inform the future design of ferroelectric nanodevices.
    “Our discovery is a significant scientific breakthrough as it shows a clear picture of how the ferroelectric degradation process is present at the nanoscale,” said co-author Professor Xiaozhou Liao, also from the University of Sydney Nano Institute.
    Dr Qianwei Huang, the study’s lead researcher, said: “Although it has long been known that ferroelectric fatigue can shorten the lifespan of electronic devices, how it occurs has previously not been well understood, due to a lack of suitable technology to observe it.”
    Co-author Dr Zibin Chen said: “With this, we hope to better inform the engineering of devices with longer lifespans.”
    Observational findings spark new debate
    Nobel laureate Herbert Kroemer once famously asserted “The interface is the device.” The observations by the Sydney researchers could therefore spark a new debate on whether interfaces — which are physical boundaries separating different regions in materials — are a viable solution to the unreliability of next-generation devices.
    “Our discovery has indicated that interfaces could actually speed up ferroelectric degradation. Therefore, better understanding of these processes is needed to achieve the best performance of devices,” Dr Chen said.
    Story Source:
    Materials provided by University of Sydney. Note: Content may be edited for style and length.