More stories

  • Spontaneous superconducting currents in Sr2RuO4

    Superconductivity is a complete loss of electrical resistance. Superconductors are not merely very good metals: superconductivity is a fundamentally different electronic state. In normal metals, electrons move individually and collide with defects and vibrations in the lattice. In superconductors, electrons are bound together by an attractive force, which allows them to move in a correlated way and avoid defects.
    In a very small number of known superconductors, the onset of superconductivity causes spontaneous electrical currents to flow. These currents are very different from those in a normal metal wire: they are built into the ground state of the superconductor, and so they cannot be switched off. For example, in a sheet of a superconducting material, currents might appear that flow around the edge, as shown in the figure.
    This is a very rare form of superconductivity, and it always indicates that the attractive interaction is something unusual. Sr2RuO4 is one famous material where this type of superconductivity is thought to occur. Although the transition temperature is low — Sr2RuO4 superconducts only below 1.5 Kelvin — why it superconducts at all is completely unknown. Explaining the superconductivity of this material has become a major test of physicists’ understanding of superconductivity in general. Theoretically, it is very difficult to obtain spontaneous currents in Sr2RuO4 from standard models of superconductivity, so if the currents are confirmed, a new model of superconductivity — an attractive force not seen in other materials — might be required.
    The way that these electrical currents are detected is subtle. Subatomic particles known as muons are implanted into the sample, and the spin of each muon then precesses in whatever magnetic field exists at its stopping site. In effect, the muons act as sensitive magnetic-field detectors that can be placed inside the sample. Such muon implantation experiments have shown that spontaneous magnetic fields appear when Sr2RuO4 becomes superconducting, which demonstrates that there are spontaneous electrical currents.
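    In schematic terms, each muon spin precesses at the Larmor frequency set by the local field at its stopping site,
    $$\omega = \gamma_\mu B, \qquad \frac{\gamma_\mu}{2\pi} \approx 135.5\ \mathrm{MHz\,T^{-1}},$$
    so measuring how fast the implanted muons precess (or how quickly their polarization relaxes) reveals even very small internal magnetic fields.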
    However, because the signal is subtle, researchers have questioned whether it is in fact real. The onset of superconductivity is a major change in the electronic properties of a material, and it has been suggested that this subtle additional signal might appear only because the measurement apparatus was not perfectly tuned.
    In this work, researchers at the Max Planck Institute for Chemical Physics of Solids, the Technical University of Dresden, and the Paul Scherrer Institute (Switzerland) have shown that when uniaxial pressure is applied to Sr2RuO4, the spontaneous currents set in at a lower temperature than the superconductivity. In other words, the transition splits into two: first superconductivity, then spontaneous currents. This splitting has not been clearly demonstrated in any other material, and it is important because it shows definitively that the second transition is real. The spontaneous currents must be explained scientifically, not dismissed as a consequence of imperfect measurement. This may require a major rewrite of our understanding of superconductivity.
    Story Source:
    Materials provided by Max Planck Institute for Chemical Physics of Solids. Note: Content may be edited for style and length.

  • Smart quantum technologies for secure communication

    Researchers from Louisiana State University have introduced a smart quantum technology for the spatial mode correction of single photons. In a paper featured on the cover of the March 2021 issue of Advanced Quantum Technologies, the authors exploit the self-learning and self-evolving features of artificial neural networks to correct the distorted spatial profile of single photons.
    The authors, PhD candidate Narayan Bhusal, postdoctoral researcher Chenglong You, graduate student Mingyuan Hong, undergraduate student Joshua Fabre, and Assistant Professor Omar S. Magaña-Loaiza of LSU — together with collaborators Sanjaya Lohani, Erin M. Knutson, and Ryan T. Glasser of Tulane University and Pengcheng Zhao of Qingdao University of Science and Technology — report on the potential of artificial intelligence to correct spatial modes at the single-photon level.
    “The random phase distortion is one of the biggest challenges in using spatial modes of light in a wide variety of quantum technologies, such as quantum communication, quantum cryptography, and quantum sensing,” said Bhusal. “In this paper, we use artificial neurons to correct distorted spatial modes of light at the single-photon level. Our method is remarkably effective and time-efficient compared to conventional techniques. This is an exciting development for the future of free-space quantum technologies.”
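    The paper’s actual network and training data are not described here, so the following is only a minimal sketch of the general idea, with hypothetical mode counts, image sizes, and layer choices: a small convolutional classifier (PyTorch) that maps a turbulence-distorted intensity image back to the spatial-mode label it encodes.
```python
# Minimal sketch (not the authors' code): classify a distorted single-photon
# intensity image into one of several candidate spatial modes.
import torch
import torch.nn as nn

NUM_MODES = 8          # hypothetical number of spatial modes used for encoding
IMG = 32               # hypothetical detector resolution (32x32 pixels)

classifier = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * (IMG // 4) ** 2, NUM_MODES),   # logits over candidate modes
)

# Toy training step on random tensors standing in for turbulence-distorted mode images.
images = torch.rand(64, 1, IMG, IMG)              # batch of distorted intensity images
labels = torch.randint(0, NUM_MODES, (64,))       # true mode index for each image
loss = nn.CrossEntropyLoss()(classifier(images), labels)
loss.backward()                                   # gradients for an optimizer step
```
    In practice such a network would be trained on large sets of simulated or measured distorted modes rather than the random placeholders above.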
    The newly developed technique boosts the channel capacity of optical communication protocols that rely on structured photons.
    “One important goal of the Quantum Photonics Group at LSU is to develop robust quantum technologies that work under realistic conditions,” said Magaña-Loaiza. “This smart quantum technology demonstrates the possibility of encoding multiple bits of information in a single photon in realistic communication protocols affected by atmospheric turbulence. Our technique has enormous implications for optical communication and quantum cryptography. We are now exploring paths to implement our machine learning scheme in the Louisiana Optical Network Initiative (LONI) to make it smart, secure, and quantum.”
    “We are still in the fairly early stages of understanding the potential for machine learning techniques to play a role in quantum information science,” said Dr. Sara Gamble, program manager at the Army Research Office, an element of DEVCOM ARL. “The team’s result is an exciting step forward in developing this understanding, and it has the potential to ultimately enhance the Army’s sensing and communication capabilities on the battlefield.”
    Story Source:
    Materials provided by Louisiana State University. Note: Content may be edited for style and length.

  • Modeling a safe new normal

    Just one year after the World Health Organization declared the novel coronavirus a global pandemic, three COVID-19 vaccines are available in the United States, and more than 2 million Americans are receiving shots each day. Americans are eager to get back to business as usual, but experts caution that opening the economy prematurely could allow the virus to resurge. How foot traffic patterns in restaurants and bars, schools and universities, nail salons and barbershops affect the risk of transmission has been largely unknown.
    In an article published in npj Digital Medicine, researcher-physicians from Beth Israel Deaconess Medical Center (BIDMC) used anonymized cell-phone data to build a Business Risk Index, which quantifies the potential risk of COVID-19 transmission in these establishments. The team’s index accounts for both the density of visits and the length of time individuals linger inside, providing a more precise description of the human interactions — and thus risk of viral transmission — going on inside.
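    The published index’s exact functional form is not given here, so the sketch below is only a toy stand-in illustrating the two ingredients the article names, visit density and dwell time; the column names and weighting are hypothetical.
```python
# Toy illustration (not the published Business Risk Index): combine visit density
# and dwell time into a per-business risk score, then average to a county index.
import pandas as pd

visits = pd.DataFrame({
    "business_id": ["a", "a", "b", "b"],
    "county": ["Suffolk", "Suffolk", "Suffolk", "Suffolk"],
    "visits_per_sqft": [0.8, 1.1, 0.2, 0.3],     # hypothetical density of visits
    "median_dwell_min": [45, 50, 10, 12],        # hypothetical dwell time (minutes)
})

# Risk grows with both crowding and how long people linger; a product is one simple
# way to capture that interaction (the paper's exact weighting may differ).
visits["risk_score"] = visits["visits_per_sqft"] * visits["median_dwell_min"]
county_index = visits.groupby("county")["risk_score"].mean()
print(county_index)
```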
    “While business traffic pre-pandemic and during statewide shutdowns has been studied, business foot traffic and its relationship to COVID-19 transmission in the so-called ‘new normal’ of re-opening has not been well understood,” said corresponding author Ashley O’Donoghue, PhD, an economist in the Center for Healthcare Delivery Science at BIDMC. “Many forecasting models use anonymized cell-phone mobility data as a broad measure of the movement of residents. But two regions with the same levels of mobility will likely see very different levels of COVID-19 transmission if people in one region are diligently practicing social distancing and people in the other are not.”
    O’Donoghue and colleagues built their risk index by analyzing trends in foot traffic patterns in more than 1.25 million businesses across eight states from January to June 2020. In the six New England states, New York and California, the team saw a 30 percent drop in high-density foot traffic and long visit lengths to businesses — two factors that can increase the risk of COVID-19 transmission — from the pre-pandemic baseline to April 2020. They saw similar declines when they looked specifically at risky foot traffic patterns in restaurants, bars, universities and personal care establishments (which include hair and nail salons and barbershops). In both analyses, the risk index rose steadily starting in mid-June as states eased restrictions.
    Next, using county-level COVID-19 data for the same time period, the team demonstrated that their index could accurately forecast future COVID-19 cases with a one-week lag. The team found that an increase in a county’s average Business Risk Index was associated with an increase in COVID-19 cases per 10,000 people in one week.
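    A minimal sketch of such a lagged check, assuming a weekly county-level table with hypothetical column names, shifts the index by one week and regresses case rates on it:
```python
# Hedged sketch: regress this week's cases per 10,000 on last week's risk index.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "county": ["X"] * 6,
    "week": range(6),
    "risk_index": [1.0, 1.2, 1.5, 1.4, 1.8, 2.0],      # hypothetical weekly index
    "cases_per_10k": [2.0, 2.1, 2.6, 3.0, 2.9, 3.5],   # hypothetical case rates
})
df["risk_index_lag1"] = df.groupby("county")["risk_index"].shift(1)

model = smf.ols("cases_per_10k ~ risk_index_lag1", data=df.dropna()).fit()
print(model.params)   # slope estimates the one-week-ahead association
```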
    “Not all types of mobility contribute equally to increased risk of transmission, so it is important to directly measure human interaction when weighing the costs and benefits of reopening and lifting restrictions on businesses,” said senior author Jennifer P. Stevens, MD, MS, Director of the Center for Healthcare Delivery Science at BIDMC. “Tracking how individuals use different businesses may provide the kind of information policymakers need to re-open different businesses in the safest way possible.”
    O’Donoghue, Stevens and team are now building an online decision-support tool that will help policymakers and hospital decision-makers monitor weekly risk in their areas. They have also deployed a prototype of their tool for Massachusetts that is being used by a large tertiary academic medical center in Boston to monitor potential surges in their service area, and their index has been integrated as a feature in a forecasting model for a large health system in Massachusetts.
    “Our index can better quantify close human interactions, which are important predictors of transmission and help identify potential disease surges,” said Stevens.
    Study co-authors also include Tenzin Dechen, MPH, of BIDMC; Whitney Pavlova, BA, of Pennsylvania State University; Michael Boals, MS, of Requisite Analytics; Manvi Madan, MInfoTech, of Ports of Auckland; Garba Moussa, PhD, of Open-Classroom; Aalok Thakkar, BS, of University of Pennsylvania; and Frank J. DeFalco, BS, of Janssen Research & Development.
    Dr. Stevens is supported by grant number K08HS024288 from the Agency for Healthcare Research and Quality. The authors declare no competing interests.

  • Of mice and men and their different tolerance to pathogens

    Trillions of commensal microbes live on the mucosal and epidermal surfaces of the body, and it is firmly established that this microbiome affects its host’s tolerance and sensitivity to a variety of pathogens. However, host tolerance to infection with pathogens is not equally developed in all organisms. For example, it is known that the gut microbiome of mice protects more effectively against infection with certain pathogens, such as the bacterium Salmonella typhimurium, than the human gut microbiome does.
    This raises the interesting possibility that analyzing differences between host-microbiome interactions in humans and other species, such as mice, and pinpointing individual types of bacteria that either protect against or sensitize to certain pathogens, could lead to entirely new types of therapeutic approaches. However, while the intestinal microbiome composition and its effect on host immune responses have been well investigated in mice, it has not been possible to study how the microbiome interacts directly with the epithelial cells lining the intestine under highly defined conditions, and thereby to uncover specific bacterial strains that can induce host tolerance to infectious pathogens.
    Now, a collaborative team led by Wyss Founding Director Donald Ingber, M.D., Ph.D. at Harvard’s Wyss Institute for Biologically Inspired Engineering and Dennis Kasper, M.D. at Harvard Medical School (HMS) has harnessed the Wyss’s microfluidic Organs-on-Chip (Organ Chip) technology to model the different anatomical sections of the mouse intestine and their symbiosis with a complex living microbiome in vitro. The researchers recapitulated the destructive effects of S. typhimurium on the intestinal epithelial surface in an engineered mouse Colon Chip, and in a comparative analysis of mouse and human microbiomes were able to confirm the commensal bacterium Enterococcus faecium contributes to host tolerance to S. typhimurium infection. The study is published in Frontiers in Cellular and Infection Microbiology.
    The project was started under a DARPA-supported “Technologies for Host Resilience” (THoR) Project at the Wyss Institute, whose goal was to uncover key contributors to infection tolerance by studying differences observed between certain animal species and humans. Using a human Colon Chip, Ingber’s group had shown in a previous study how metabolites produced by microbes derived from mouse and human feces differ in their potential to affect susceptibility to infection with an enterohemorrhagic E. coli pathogen.
    “Biomedical research strongly depends on animal models such as mice, which undoubtedly have tremendous benefits, but do not provide an opportunity to study normal and pathological processes within a particular organ, such as the intestine, close-up and in real-time. This important proof-of-concept study with Dennis Kasper’s group highlights that our engineered mouse Intestine Chip platform offers exactly this capability and provides the possibility to study host-microbiome interactions with microbiomes from different species under highly controllable conditions in vitro,” said Ingber. “Given the deep level of characterization of mouse immunology, this capability could greatly help advance the work of researchers who currently use these animals to do research on microbiome and host responses. It enables them to compare the results they obtain directly with human Intestine Chips in the future so that the focus can be on identifying features of host response that are most relevant for humans.” Ingber also is the Judah Folkman Professor of Vascular Biology at HMS and Boston Children’s Hospital, and Professor of Bioengineering at the Harvard John A. Paulson School of Engineering and Applied Sciences.
    Engineering a mouse Intestine-on-Chip platform
    In their new study, the team focused on the mouse intestinal tract. “It has traditionally been extremely difficult to model host-microbiome interactions outside any organism, as many bacteria are strictly anaerobic and die in normal atmospheric oxygen conditions. Organ Chip technology can recreate these conditions, and it is much easier to obtain primary intestinal and immune cells from mice than having to rely on human biopsies,” said first author Francesca Gazzaniga, Ph.D., a Postdoctoral Fellow who works between Ingber’s and Kasper’s groups and spearheaded the project.


    Gazzaniga and her colleagues isolated intestinal crypts from different regions of the mouse intestinal tract, including the duodenum, jejunum, ileum, and colon, and took the cells through an intermediate “organoid” culture step in which small tissue fragments form and grow. These organoids were then seeded into one of two parallel, microfluidically perfused channels of the Wyss’ Organ Chips to create region-specific Intestine Chips. The second, independently perfused channel mimics the blood vasculature and is separated from the first by a porous membrane that allows the exchange of nutrients, metabolites, and secreted molecules that intestinal epithelial cells use to communicate with vascular and immune cells.
    Homing in on the pathogen
    The team then homed in on S. typhimurium as a pathogen. First, they introduced the pathogen into the epithelial lumen of the engineered mouse Colon Chip and recapitulated the key features associated with the break-down of intestinal tissue integrity known from mouse studies, including the disruption of normally tight adhesions between neighboring epithelial cells, decreased production of mucus, a spike in secretion of a key inflammatory chemokine (the mouse homolog of human IL-8), and changes in epithelial gene expression. In parallel, they showed that the mouse Colon Chip supported the growth and viability of complex bacterial consortia normally present in mouse and human gut microbiomes.
    Putting these capabilities together, the researchers compared the effects of specific mouse and human microbial consortia that had previously been maintained stably in the intestines of ‘gnotobiotic’ mice that were housed in germ-free conditions by the Kasper team. By collecting complex microbiomes from the stool of those mice, and then inoculating them into the Colon Chips, the researchers observed chip-to-chip variability in consortium composition, which enabled them to relate microbe composition to functional effects on the host epithelium. “Using 16S sequencing gave us a good sense of the microbial compositions of the two consortia, and the high abundance of one individual species, Enterococcus faecium, generated by only one of them in the Colon Chip, allowed the intestinal tissue to better tolerate the infection,” said Gazzaniga. “This nicely confirmed past findings and validated our approach as a new discovery platform that we can now use to investigate the mechanisms that underlie these effects, the contribution of immune cells to host tolerance, and infectious processes involving other pathogens.”
    “The mouse intestine on a chip technology provides a unique approach to understand the relationship between the gut microbiota, host immunity, and a microbial pathogen. This important interrelationship is challenging to study in the living animal because there are so many uncontrollable factors. The beauty of this system is that essentially all parameters you wish to study are controllable and can easily be monitored. This system is a very useful step forward,” said Kasper, who is the William Ellery Channing Professor of Medicine and Professor of Immunology at HMS.
    The researchers believe that their comparative in vitro approach could uncover specific cross-talk of pathogens and commensal bacteria with intestinal epithelial and immune cells, and that the tolerance-enhancing bacteria it identifies could be used in future therapies, which may circumvent the problem of increasing antimicrobial resistance among pathogenic bacterial strains.

  • Internet-access spending improves academic outcomes, according to study of Texas public schools

    Increased internet-access spending by Texas public schools improved academic performance but also led to more disciplinary problems among students, a study of 9,000 schools conducted by a research team from Rice University, Texas A&M University and the University of Notre Dame shows.
    Whether students benefit from increased internet access in public schools has been an open question, according to the researchers. For example, some parents and policy advocates contend it increases children’s access to obscene or harmful content and disciplinary problems. Others believe it promotes personalized learning and higher student engagement.
    To address these policy questions, the research team created a multiyear dataset (2000-14) of 1,243 school districts representing more than 9,000 Texas public schools. The team measured internet-access spending, 11 academic performance indicators and 47 types of school disciplinary problems. It used econometric techniques to develop causal estimates linking internet-access spending to academic performance and disciplinary problems. Using student earnings, the researchers calculated the economic impact of increased annual internet spending.
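    The study’s actual econometric specification is not reproduced here; as a rough illustration only, a causal-style estimate of this kind is often sketched as a panel regression with district and year fixed effects (all variable names and numbers below are hypothetical):
```python
# Hedged sketch of a two-way fixed-effects panel regression (not the study's model).
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.DataFrame({
    "district": ["d1", "d1", "d2", "d2", "d3", "d3"],
    "year": [2012, 2013, 2012, 2013, 2012, 2013],
    "internet_spend": [100, 140, 90, 95, 120, 180],    # hypothetical $ thousands
    "grad_rate": [85.0, 86.5, 80.0, 80.2, 88.0, 90.1], # hypothetical outcome
})

fit = smf.ols("grad_rate ~ internet_spend + C(district) + C(year)", data=panel).fit()
print(fit.params["internet_spend"])   # within-district association with spending
```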
    To date, this is the largest and most comprehensive study linking school internet-access spending to academic and disciplinary outcomes, the researchers said.
    The team found that increased school district internet spending is associated not only with improved graduation rates, but also with higher numbers of students meeting SAT/ACT criteria and completing advanced courses. It also led to an improvement in commended performance in math, reading, writing and social studies. Interestingly, the researchers noted these improvements were stronger for students who lived in counties with greater internet access (as measured by the number of broadband providers).
    On the flip side, increased school district internet spending also led to higher rates of disciplinary problems at schools, they said.
    The team also calculated how much economic benefit a school district’s internet access will bring students during their lifetimes. It found that a $600,000 increase in annual internet-access spending produces a financial gain of approximately $820,000 to $1.8 million per school district, together with losses from disciplinary problems totaling $25,800 to $53,440.
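    Taking those figures at face value (and pairing the extremes of the two ranges, which gives only a rough bound), the implied net benefit per district is roughly
    $$\$820{,}000 - \$53{,}440 \approx \$0.77\ \text{million} \quad\text{to}\quad \$1{,}800{,}000 - \$25{,}800 \approx \$1.77\ \text{million}.$$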
    In other words, investments in internet access are well worth the costs.
    “We are proud that Texas public schools can serve as a live learning case for understanding education policy,” said study co-author Vikas Mittal, a professor of marketing at Rice’s Jones Graduate School of Business. “Investments in internet access provide clear and meaningful academic benefits. Yet, schools need to implement policies to address increased disciplinary issues such as cyberbullying.
    “K-12 education has transformed into virtual learning due to COVID-19,” he continued. “Our research conclusions apply to a setting where physical learning is supplemented by internet access.”
    However, Mittal cautioned that these benefits cannot be expected to hold if physical learning is completely supplanted by internet-based learning.

    Story Source:
    Materials provided by Rice University. Note: Content may be edited for style and length.

  • New AI tool can revolutionize microscopy

    An AI tool developed at the University of Gothenburg offers new opportunities for analysing images taken with microscopes. A study shows that the tool, which has already received international recognition, can fundamentally change microscopy and pave the way for new discoveries and areas of use within both research and industry.
    The focus of the study is deep learning, a type of artificial intelligence (AI) and machine learning that we all interact with daily, often without thinking about it: for example, when a new song pops up on Spotify that is similar to songs we have previously listened to, or when our mobile phone camera automatically finds the best settings and corrects the colours in a photo.
    “Deep learning has taken the world by storm and has had a huge impact on many industries, sectors and scientific fields. We have now developed a tool that makes it possible to utilise the incredible potential of deep learning, with focus on images taken with microscopes,” says Benjamin Midtvedt, a doctoral student in physics and the main author of the study.
    Deep learning can be described as a mathematical model used to solve problems that are difficult to tackle using traditional algorithmic methods. In microscopy, the great challenge is to retrieve as much information as possible from the data-packed images, and this is where deep learning has proven to be very effective.
    The tool that Midtvedt and his research colleagues have developed involves neural networks learning to retrieve exactly the information that a researcher wants from an image by looking through a huge number of images, known as training data. The tool simplifies the process of producing training data compared with having to do so manually, so that tens of thousands of images can be generated in an hour instead of a hundred in a month.
    “This makes it possible to quickly extract more details from microscope images without needing to create a complicated analysis with traditional methods. In addition, the results are reproducible, and customised, specific information can be retrieved for a specific purpose.”
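    As a toy illustration of that simulation-first approach to producing training data (not the group’s actual software; the optics model below is a deliberately crude stand-in), labelled images can be generated by drawing particle positions at random, rendering them as blurred spots, and adding noise:
```python
# Hedged sketch: synthesize labelled "microscope" images for training a network,
# instead of annotating real images by hand. The spot model is a toy Gaussian blur.
import numpy as np

def synthetic_image(size=64, n_particles=3, spot_sigma=2.0, noise=0.05, rng=None):
    rng = rng or np.random.default_rng()
    yy, xx = np.mgrid[0:size, 0:size]
    img = np.zeros((size, size))
    positions = rng.uniform(5, size - 5, size=(n_particles, 2))   # ground-truth labels
    for cy, cx in positions:
        img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * spot_sigma ** 2))
    img += rng.normal(0, noise, img.shape)                        # camera noise
    return img, positions

# Thousands of (image, label) pairs can be produced in seconds this way.
images, labels = zip(*(synthetic_image() for _ in range(1000)))
```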
    For example, the tool allows the user to determine the size and material characteristics of very small particles and to easily count and classify cells. The researchers have already demonstrated that the tool can be used by industries that need to purify their emissions, since they can see in real time whether all unwanted particles have been filtered out.
    The researchers are hopeful that in the future the tool can be used to follow infections in a cell and map cellular defence mechanisms, which would open up huge possibilities for new medicines and treatments.
    “We have already seen major international interest in the tool. Regardless of the microscopic challenges, researchers can now more easily conduct analyses, make new discoveries, implement ideas and break new ground within their fields.”

    Story Source:
    Materials provided by University of Gothenburg. Note: Content may be edited for style and length.

  • Faster drug discovery through machine learning

    Drugs can only work if they stick to their target proteins in the body. Assessing that stickiness is a key hurdle in the drug discovery and screening process. New research combining chemistry and machine learning could lower that hurdle.
    The new technique, dubbed DeepBAR, quickly calculates the binding affinities between drug candidates and their targets. The approach yields precise calculations in a fraction of the time compared to previous state-of-the-art methods. The researchers say DeepBAR could one day quicken the pace of drug discovery and protein engineering.
    “Our method is orders of magnitude faster than before, meaning we can have drug discovery that is both efficient and reliable,” says Bin Zhang, the Pfizer-Laubach Career Development Professor in Chemistry at MIT, an associate member of the Broad Institute of MIT and Harvard, and a co-author of a new paper describing the technique.
    The research appears today in the Journal of Physical Chemistry Letters. The study’s lead author is Xinqiang Ding, a postdoc in MIT’s Department of Chemistry.
    The affinity between a drug molecule and a target protein is measured by a quantity called the binding free energy — the smaller the number, the stickier the bind. “A lower binding free energy means the drug can better compete against other molecules,” says Zhang, “meaning it can more effectively disrupt the protein’s normal function.” Calculating the binding free energy of a drug candidate provides an indicator of a drug’s potential effectiveness. But it’s a difficult quantity to nail down.
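    In conventional thermodynamic notation (a general relation, not one specific to this paper), the binding free energy sets the dissociation constant:
    $$\Delta G_{\mathrm{bind}} = RT\,\ln\!\frac{K_d}{c^\circ},$$
    so a more negative binding free energy corresponds to a smaller dissociation constant, i.e. a drug that stays bound at lower concentrations.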
    Methods for computing binding free energy fall into two broad categories, each with its own drawbacks. One category calculates the quantity exactly, eating up significant time and computer resources. The second category is less computationally expensive, but it yields only an approximation of the binding free energy. Zhang and Ding devised an approach to get the best of both worlds.


    Exact and efficient
    DeepBAR computes binding free energy exactly, but it requires just a fraction of the calculations demanded by previous methods. The new technique combines traditional chemistry calculations with recent advances in machine learning.
    The “BAR” in DeepBAR stands for “Bennett acceptance ratio,” a decades-old algorithm used in exact calculations of binding free energy. Using the Bennett acceptance ratio typically requires knowledge of two “endpoint” states (e.g., a drug molecule bound to a protein and a drug molecule completely dissociated from a protein), plus knowledge of many intermediate states (e.g., varying levels of partial binding), all of which bog down calculation speed.
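    Schematically, estimators in this family rest on an exact identity between the two endpoint ensembles; Bennett’s contribution was the choice of weighting function that minimizes the statistical error:
    $$\Delta F = -k_B T \ln\frac{Q_1}{Q_0}, \qquad \frac{Q_1}{Q_0} = \frac{\langle \alpha\, e^{-\beta U_1} \rangle_0}{\langle \alpha\, e^{-\beta U_0} \rangle_1},$$
    where $Q_{0,1}$ and $U_{0,1}$ are the partition functions and potential energies of the two endpoint states, $\beta = 1/k_B T$, and $\langle\cdot\rangle_{0,1}$ denotes an average over configurations sampled from the corresponding state. The identity holds for any weighting function $\alpha(x)$, but the estimate is only statistically useful when the sampled ensembles overlap, which is why chains of intermediate states are normally needed.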
    DeepBAR slashes those in-between states by deploying the Bennett acceptance ratio in machine-learning frameworks called deep generative models. “These models create a reference state for each endpoint, the bound state and the unbound state,” says Zhang. These two reference states are similar enough that the Bennett acceptance ratio can be used directly, without all the costly intermediate steps.
    In using deep generative models, the researchers were borrowing from the field of computer vision. “It’s basically the same model that people use to do computer image synthesis,” says Zhang. “We’re sort of treating each molecular structure as an image, which the model can learn. So, this project is building on the effort of the machine learning community.”
    While adapting a computer vision approach to chemistry was DeepBAR’s key innovation, the crossover also raised some challenges. “These models were originally developed for 2D images,” says Ding. “But here we have proteins and molecules — it’s really a 3D structure. So, adapting those methods in our case was the biggest technical challenge we had to overcome.”


    A faster future for drug screening
    In tests using small protein-like molecules, DeepBAR calculated binding free energy nearly 50 times faster than previous methods. Zhang says that efficiency means “we can really start to think about using this to do drug screening, in particular in the context of Covid. DeepBAR has the exact same accuracy as the gold standard, but it’s much faster.” The researchers add that, in addition to drug screening, DeepBAR could aid protein design and engineering, since the method could be used to model interactions between multiple proteins.
    DeepBAR is “a really nice computational work” with a few hurdles to clear before it can be used in real-world drug discovery, says Michael Gilson, a professor of pharmaceutical sciences at the University of California at San Diego, who was not involved in the research. He says DeepBAR would need to be validated against complex experimental data. “That will certainly pose added challenges, and it may require adding in further approximations.”
    In the future, the researchers plan to improve DeepBAR’s ability to run calculations for large proteins, a task made feasible by recent advances in computer science. “This research is an example of combining traditional computational chemistry methods, developed over decades, with the latest developments in machine learning,” says Ding. “So, we achieved something that would have been impossible before now.”

  • Researchers enhance Alzheimer's disease classification through artificial intelligence

    Warning signs for Alzheimer’s disease (AD) can begin in the brain years before the first symptoms appear. Spotting these clues may allow for lifestyle changes that could possibly delay the disease’s destruction of the brain.
    “Improving the diagnostic accuracy of Alzheimer’s disease is an important clinical goal. If we are able to increase the diagnostic accuracy of the models in ways that can leverage existing data such as MRI scans, then that can be hugely beneficial,” explained corresponding author Vijaya B. Kolachalama, PhD, assistant professor of medicine at Boston University School of Medicine (BUSM).
    Using an advanced artificial intelligence (AI) framework based on game theory, known as a generative adversarial network (GAN), Kolachalama and his team processed brain images (some of low and some of high quality) to generate a model that was able to classify Alzheimer’s disease with improved accuracy.
    The quality of an MRI scan depends on the scanner used: for example, a 1.5 Tesla scanner produces a slightly lower-quality image than a 3 Tesla scanner, the magnetic field strength being a key parameter of a given scanner. The researchers obtained brain MR images of the same subjects, taken at the same time, on both 1.5 Tesla and 3 Tesla scanners, and developed a GAN model that learned from both sets of images.
    As the model “learned” from the 1.5 Tesla and 3 Tesla images, it generated images of higher quality than those from the 1.5 Tesla scanner, and these generated images also predicted the Alzheimer’s disease status of these individuals better than models based on 1.5 Tesla images alone. “Our model essentially can take 1.5 Tesla scanner-derived images and generate images that are of better quality, and we can also use the derived images to better predict Alzheimer’s disease than what we could possibly do using just 1.5 Tesla-based images alone,” he added.
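    As a bare-bones sketch of the adversarial setup (not the authors’ architecture; real MRI-enhancement GANs use far more elaborate image-to-image networks and loss terms, and all sizes below are hypothetical), a generator maps a lower-field-like slice to an enhanced one while a discriminator tries to tell enhanced slices from true higher-field slices:
```python
# Hedged sketch of a GAN for image quality transfer (toy 2D slices, hypothetical sizes).
import torch
import torch.nn as nn

generator = nn.Sequential(            # 1.5T-like slice -> enhanced slice
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)
discriminator = nn.Sequential(        # slice -> logit that it is a real 3T-like slice
    nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.LazyLinear(1),
)

bce = nn.BCEWithLogitsLoss()
low_field = torch.rand(8, 1, 64, 64)      # stand-in for paired 1.5T slices
high_field = torch.rand(8, 1, 64, 64)     # stand-in for paired 3T slices

fake = generator(low_field)
# The discriminator learns to separate real 3T slices from generated ones...
d_loss = bce(discriminator(high_field), torch.ones(8, 1)) + \
         bce(discriminator(fake.detach()), torch.zeros(8, 1))
# ...while the generator is rewarded for fooling it (a paired L1 term is also common).
g_loss = bce(discriminator(fake), torch.ones(8, 1)) + \
         nn.functional.l1_loss(fake, high_field)
```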
    Globally, the population aged 65 and over is growing faster than all other age groups. By 2050, one in six people in the world will be over age 65. The total healthcare cost of treating AD in 2020 was estimated at $305 billion and is expected to increase to more than $1 trillion as the population ages. The disease also places a severe burden on patients and their caregivers; family caregivers of AD patients in particular face extreme hardship and distress that represents a major but often hidden burden.
    According to the researchers, it may be possible to generate images of enhanced quality for disease cohorts that were previously scanned on 1.5T machines, and for centers that continue to rely on 1.5T scanners. “This would allow us to reconstruct the earliest phases of AD, and build a more accurate model of predicting Alzheimer’s disease status than would otherwise be possible using data from 1.5T scanners alone,” said Kolachalama.
    He hopes that such advanced AI methods can be put to good use so that the medical imaging community can get the best out of the advances in AI. Such frameworks, he believes, can be used to harmonize imaging data across multiple studies so that models can be developed and compared across different populations. This could lead to the development of better approaches to diagnosing AD.
    These findings appear online in the journal Alzheimer’s Research & Therapy.
    Story Source:
    Materials provided by Boston University School of Medicine. Note: Content may be edited for style and length.