More stories

  • Simulating microswimmers in nematic fluids

    Artificial microswimmers have received much attention in recent years. By mimicking microbes, which convert energy from their surroundings into swimming motion, these particles could soon be exploited for many important applications. Yet before this can happen, researchers must develop methods to better control the trajectories of individual microswimmers in complex environments. In a new study published in EPJ E, Shubhadeep Mandal at the Indian Institute of Technology Guwahati (India), and Marco Mazza at the Max Planck Institute for Dynamics and Self-Organisation in Göttingen (Germany) and Loughborough University (UK), show how this control could be achieved using exotic materials named ‘nematic liquid crystals’ (LCs), whose viscosity and elasticity vary depending on the direction of an applied force.
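    The kind of directional response described here can be pictured with a toy calculation. The sketch below is not the authors' model; it simply assumes a self-propelled particle whose mobility is higher along a fixed nematic director than across it, plus a weak aligning torque, and integrates the trajectory with Euler steps. All parameter values are illustrative.
    ```python
    import numpy as np

    # Toy sketch (not the study's model): self-propelled particle with
    # direction-dependent mobility set by a fixed nematic director.
    director = np.array([1.0, 0.0])   # nematic alignment axis
    mu_par, mu_perp = 1.0, 0.4        # mobilities along / across the director (assumed)
    v0, dt, steps = 1.0, 0.01, 2000   # swim speed, time step, number of steps

    # Anisotropic mobility tensor: motion is easier along the director.
    P = np.outer(director, director)
    M = mu_par * P + mu_perp * (np.eye(2) - P)

    pos, theta = np.zeros(2), 0.6     # position and swimming angle
    for _ in range(steps):
        e = np.array([np.cos(theta), np.sin(theta)])  # swimming direction
        pos = pos + dt * v0 * (M @ e)                 # anisotropic drift
        theta -= 0.5 * dt * np.sin(2 * theta)         # weak elastic torque toward the director
    print("final position:", pos)
    ```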


  • Mathematical model predicts the movement of microplastics in the ocean

    A new model tracking the vertical movement of algae-covered microplastic particles offers hope in the fight against plastic waste in our oceans.
    Research led by Newcastle University’s Dr Hannah Kreczak is the first to identify the processes that underpin the trajectories of microplastics below the ocean surface. Publishing their findings in the journal Limnology and Oceanography, the authors analysed how biofouling (the accumulation of algae on the surface of microplastics) affects the vertical movement of buoyant particles.
    The researchers found that particle properties are the biggest factor in determining the period and characteristics of the repetitive vertical movement below the surface, while the algal population dynamics determine the maximum depth reached. Their findings also show that the smallest particles are extremely sensitive to algal cell attachment and growth, suggesting they either remain submerged at depths around the base of the euphotic zone, the near-surface layer that receives enough light to support photosynthesis, or become trapped in large algal colonies.
    In general, the results suggest that a higher concentration of biofouled microplastic is expected to be found subsurface, close to the euphotic zone depth rather than at the ocean’s surface.
    Microplastics (fragments smaller than 5 mm in diameter) make up 90% of the plastic debris found at the ocean surface, yet the amount of plastic entering the ocean is significantly larger than estimates of the plastic floating at its surface. It is not known exactly what happens to these particles once they enter the ocean, and 99% of the microplastics in our oceans are considered missing.
    This new model has the potential to improve understanding of how fouled plastics are distributed in the ocean, and therefore of their ecological impact, particularly in areas of high concentration.
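    To make the mechanism concrete, the following is a minimal sketch, not the published model: a single particle whose density rises as algae grow on it in the well-lit upper ocean and falls again as the algae die off at depth, so the particle ends up oscillating around the base of the euphotic zone. All functional forms and parameter values are assumptions chosen only for illustration.
    ```python
    import numpy as np

    # Toy sketch (illustrative assumptions only, not the published model):
    # a buoyant particle sinks once enough algae attach, and rises again
    # when the algae die off below the euphotic zone.
    rho_w, rho_p = 1025.0, 950.0   # seawater and bare-plastic densities (kg/m^3)
    z_eu = 50.0                    # assumed euphotic-zone depth scale (m)
    growth, death = 0.5, 0.3       # assumed algal growth/death rates (1/day)
    sink_coeff = 50.0              # assumed settling speed per relative density excess (m/day)

    dt, days = 0.01, 120.0
    z, algae, depths = 0.0, 0.0, []
    for _ in range(int(days / dt)):
        light = np.exp(-z / z_eu)                                 # light decays with depth
        algae += dt * ((growth * light - death) * algae + 0.001)  # growth plus slow attachment
        rho_eff = rho_p + 100.0 * algae                           # fouling raises density
        w = sink_coeff * (rho_eff - rho_w) / rho_w                # sink (+) or rise (-)
        z = max(0.0, z + dt * w)                                  # particle cannot leave the water
        depths.append(z)
    print(f"max depth reached: {max(depths):.1f} m")
    ```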
    Dr Hannah Kreczak, EPSRC Doctoral Prize Fellow at Newcastle University’s School of Mathematics, Statistics and Physics, said: “Mathematical modelling has been extremely beneficial in identifying hot-spots for marine plastic pollution on the ocean surface. I hope this research can be a constructive step in understanding the impact plastic pollution has below the surface and aid in the effort towards a more sustainable ocean.”
    Co-Author Dr Andrew Baggaley, Lecturer in Applied Mathematics at the School of Mathematics, Statistics and Physics, added: “This is an exciting first step in our project to develop a comprehensive modelling framework to understand the transport of microplastic particles and their distribution in the oceans.”
    Future research by the team will focus on the fluid motion in the ocean mixed layer, to allow for an even more complete assessment of microplastic vertical distributions in the ocean.
    Story Source:
    Materials provided by Newcastle University.

  • Reducing data-transfer error in radiation therapy

    Just as helicopter traffic reporters use their “bird’s eye view” to route drivers around roadblocks safely, radiation oncologists treating a variety of cancers can use new guidelines developed by a West Virginia University researcher to reduce mistakes in data transfer and more safely treat their patients.
    Ramon Alfredo Siochi — the director of medical physics at WVU — led a task group to help ensure the accuracy of data that dictates a cancer patient’s radiation therapy. The measures he and his colleagues recommended in their new report safeguard against medical errors in a treatment that more than half of all cancer patients receive.
    “The most common mistake that happens in radiation oncology is the transfer of information from one system to another,” Siochi, the associate chair for the School of Medicine’s Department of Radiation Oncology, said. “This report gives you a good, bird’s-eye view of the way data is moving around in your department.”
    “How frequently do these accidents occur? I think one estimate I saw was that three out of every 100 patients might have an error, but it doesn’t necessarily harm them. Now, I don’t know what the incidence rate is of errors that are quote-unquote ‘near misses’ — when an error happens before it hits the patient — but I would imagine it is much higher.”
    Siochi recently chaired the Task Group of Quality Assurance on External Beam Treatment Data Transfer, a division of the American Association of Physicists in Medicine.
    The group was formed in response to news coverage of radiation overdoses caused by faulty data transfer.
    “In 2010, it was reported in the New York Times that a patient [in a New York City hospital] was overdosed with radiation because the data somehow didn’t transfer properly from one system to another,” Siochi said. “Long story short, the patient received a lethal dose of radiation to his head that went on for three days undetected. Now, that falls into the general class of many things happening that were not standard practice. But it could have been avoided.”
    Radiation therapy is used to treat a variety of cancers, including cancers of the lung, pancreas, prostate, breast, brain and bladder. Depending on a cancer’s type or stage, radiation may cure it, shrink it or stop it from coming back.
    But as the complexity of radiation therapy has grown — making it possible to target cancers that would once have been too difficult to treat — so too has the amount of data that goes into treatment machines. With more data comes more opportunity for errors.
    When Siochi started practicing radiation oncology physics — in the 1990s — this data evoked a tree-lined residential street more than the six-lane highway it brings to mind today.
    “It was very analog,” he said. “We’re talking maybe 20 parameters that you would need to check on a plan, and you would put it all on a paper chart. But I once did a calculation — to do an order of magnitude — and now we’re talking about 100,000 parameters. It’s just impossible for a human to check.”
    The group’s report — which earned the approval of AAPM and the Science Council — makes that volume of parameters less overwhelming. It explains how data is transferred among various systems used in radiation therapy, and it suggests ways that medical physicists can test the data’s integrity throughout the process, contributing to safer treatments.
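    The report itself prescribes specific tests, which are not reproduced here. As a generic illustration of what checking data integrity across a transfer can look like, the sketch below compares a canonical fingerprint of a treatment plan before and after it moves between systems; the field names are hypothetical placeholders, not items from the task group's report.
    ```python
    import hashlib
    import json

    # Generic illustration (field names are hypothetical, not from the report):
    # fingerprint the plan on export and re-check it after import.
    def plan_fingerprint(plan: dict) -> str:
        """Serialize the plan deterministically, then hash it."""
        canonical = json.dumps(plan, sort_keys=True, separators=(",", ":"))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

    exported = {"beam_energy_MV": 6, "gantry_angle_deg": 180.0, "monitor_units": 120}
    imported = {"beam_energy_MV": 6, "gantry_angle_deg": 180.0, "monitor_units": 120}

    if plan_fingerprint(exported) != plan_fingerprint(imported):
        changed = sorted(k for k in exported if exported.get(k) != imported.get(k))
        raise ValueError(f"plan parameters changed in transfer: {changed}")
    print("transfer check passed: parameters match")
    ```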
    Story Source:
    Materials provided by West Virginia University.

  • 'Hydrogel-based flexible brain-machine interface'

    A KAIST research team and collaborators revealed a newly developed hydrogel-based flexible brain-machine interface. To study the structure of the brain or to identify and treat neurological diseases, it is crucial to develop an interface that can stimulate the brain and detect its signals in real time. However, existing neural interfaces are mechanically and chemically different from real brain tissue. This causes a foreign-body response and the formation of an insulating layer (a glial scar) around the interface, which shortens its lifespan.
    To solve this problem, the research team of Professor Seongjun Park developed a ‘brain-mimicking interface’ by inserting a custom-made multifunctional fiber bundle into the hydrogel body. The device is composed not only of an optical fiber that controls specific nerve cells with light in order to perform optogenetic procedures, but it also has an electrode bundle to read brain signals and a microfluidic channel to deliver drugs to the brain.
    The interface is easy to insert into the body when dry, as the hydrogel is solid in its dry state. Once in the body, however, the hydrogel quickly absorbs body fluids and takes on properties resembling those of the surrounding tissue, thereby minimizing the foreign-body response.
    The research team applied the device on animal models, and showed that it was possible to detect neural signals for up to six months, which is far beyond what had been previously recorded. It was also possible to conduct long-term optogenetic and behavioral experiments on freely moving mice with a significant reduction in foreign body responses such as glial and immunological activation compared to existing devices.
    “This research is significant in that it was the first to utilize a hydrogel as part of a multifunctional neural interface probe, which increased its lifespan dramatically,” said Professor Park. “With our discovery, we look forward to advancements in research on neurological disorders like Alzheimer’s or Parkinson’s disease that require long-term observation.”
    Story Source:
    Materials provided by The Korea Advanced Institute of Science and Technology (KAIST).

  • Discovery of 10 faces of plasma leads to new insights in fusion and plasma science

    Scientists have discovered a novel way to classify magnetized plasmas that could possibly lead to advances in harvesting on Earth the fusion energy that powers the sun and stars. The discovery by theorists at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) found that a magnetized plasma has 10 unique phases and the transitions between them might hold rich implications for practical development.
    The spatial boundaries, or transitions, between different phases will support localized wave excitations, the researchers found. “These findings could lead to possible applications of these exotic excitations in space and laboratory plasmas,” said Yichen Fu, a graduate student at PPPL and lead author of a paper in Nature Communications that outlines the research. “The next step is to explore what these excitations could do and how they might be utilized.”
    Possible applications
    Possible applications include using the excitations to create current in magnetic fusion plasmas or facilitating plasma rotation in fusion experiments. However, “Our paper doesn’t consider any practical applications,” said physicist Hong Qin, co-author of the paper and Fu’s advisor. “The paper is the basic theory and the technology will follow the theoretical understanding.”
    In fact, “the discovery of the 10 phases in plasma marks a primary development in plasma physics,” Qin said. “The first and foremost step in any scientific endeavor is to classify the objects under investigation. Any new classification scheme will lead to improvement in our theoretical understanding and subsequent advances in technology,” he said.
    Qin cites discovery of the major types of diabetes as an example of the role classification plays in scientific progress. “When developing treatments for diabetes, scientists found that there were three major types,” he said. “Now medical practitioners can effectively treat diabetic patients.”
    Fusion, which scientists around the world are seeking to produce on Earth, combines light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei that makes up 99 percent of the visible universe — to release massive amounts of energy. Such energy could serve as a safe and clean source of power for generating electricity.
    The plasma phases that PPPL has uncovered are technically known as “topological phases,” indicating the shapes of the waves supported by plasma. This unique property of matter was first discovered in the discipline of condensed matter physics during the 1970s; physicist Duncan Haldane of Princeton University shared the 2016 Nobel Prize in Physics for his pioneering work on such phases.
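    The plasma classification itself is not spelled out in this summary. As a loose analogy from the condensed-matter setting mentioned above, the sketch below computes the winding number of a standard textbook two-band model, the kind of integer invariant that distinguishes topological phases; the model and parameters are assumptions for illustration, not anything from the PPPL work.
    ```python
    import numpy as np

    # Condensed-matter analogy (textbook two-band model, not the plasma theory):
    # the winding number of h(k) = (v + w cos k, w sin k) around the origin is 1
    # in the topological phase (|w| > |v|) and 0 in the trivial phase.
    def winding_number(v, w, n_k=2001):
        k = np.linspace(-np.pi, np.pi, n_k)
        hx, hy = v + w * np.cos(k), w * np.sin(k)
        phase = np.unwrap(np.arctan2(hy, hx))   # continuous phase of h(k)
        return int(round((phase[-1] - phase[0]) / (2 * np.pi)))

    print(winding_number(v=0.5, w=1.0))   # 1: topological phase
    print(winding_number(v=1.5, w=1.0))   # 0: trivial phase
    ```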
    Robust and intrinsic
    The localized plasma waves produced by phase transitions are robust and intrinsic because they are “topologically protected,” Qin said. “The discovery that this topologically protected excitation exists in magnetized plasmas is a big step forward that can be explored for practical applications,” he said.
    For first author Fu, “The most important progress in the paper is looking at plasma based on its topological properties and identifying its topological phases. Based on these phases we identify the necessary and sufficient condition for the excitations of these localized waves. As for how this progress can be applied to facilitate fusion energy research, we have to find out.”
    Story Source:
    Materials provided by DOE/Princeton Plasma Physics Laboratory. Original written by John Greenwald.

  • Artificial intelligence could be new blueprint for precision drug discovery

    Writing in the July 12, 2021 online issue of Nature Communications, researchers at the University of California San Diego School of Medicine describe a new approach that uses machine learning to hunt for disease targets and then predicts whether a drug is likely to receive FDA approval.
    The study findings could measurably change how researchers sift through big data to find meaningful information with significant benefit to patients, the pharmaceutical industry and the nation’s health care systems.
    “Academic labs and pharmaceutical and biotech companies have access to unlimited amounts of ‘big data’ and better tools than ever to analyze such data. However, despite these incredible advances in technology, the success rates in drug discovery are lower today than in the 1970s,” said Pradipta Ghosh, MD, senior author of the study and professor in the departments of Medicine and Cellular and Molecular Medicine at UC San Diego School of Medicine.
    “This is mostly because drugs that work perfectly in preclinical inbred models, such as laboratory mice, that are genetically or otherwise identical to each other, don’t translate to patients in the clinic, where each individual and their disease is unique. It is this variability in the clinic that is believed to be the Achilles heel for any drug discovery program.”
    In the new study, Ghosh and colleagues replaced the first and last steps in preclinical drug discovery with two novel approaches developed within the UC San Diego Institute for Network Medicine (iNetMed), which unites several research disciplines to develop new solutions to advance life sciences and technology and enhance human health.
    The researchers used the disease model for inflammatory bowel disease (IBD), which is a complex, multifaceted, relapsing autoimmune disorder characterized by inflammation of the gut lining. Because it impacts all ages and reduces the quality of life in patients, IBD is a priority disease area for drug discovery and is a challenging condition to treat because no two patients behave similarly.
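    The paper's actual pipeline is not described in this summary, so the sketch below is only a generic illustration of the final step mentioned above: training a classifier on numerical features to predict a binary outcome such as likely FDA approval. The data are synthetic and the model choice is an assumption, not the authors' method.
    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Generic illustration with synthetic data (not the paper's pipeline):
    # learn to predict a binary "approval" label from numerical features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20))                                  # synthetic features
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 0.5, 500) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("held-out AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
    ```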

  • MaxDIA: Taking proteomics to the next level

    Proteomics produces enormous amounts of data, which can be very complex to analyze and interpret. The free software platform MaxQuant has proven to be invaluable for data analysis of shotgun proteomics over the past decade. Now, Jürgen Cox, group leader at the Max Planck Institute of Biochemistry, and his team present the new version 2.0. It provides an improved computational workflow for data-independent acquisition (DIA) proteomics, called MaxDIA. MaxDIA includes library-based and library-free DIA proteomics and permits highly sensitive and accurate data analysis. Uniting data-dependent and data-independent acquisition into one world, MaxQuant 2.0 is a big step towards improving applications for personalized medicine.
    Proteins are essential for our cells to function, yet many questions about their synthesis, abundance, functions, and defects still remain unanswered. High-throughput techniques can help improve our understanding of these molecules. For analysis by liquid chromatography followed by mass spectrometry (MS), proteins are broken down into smaller peptides, in a process referred to as “shotgun proteomics.” The mass-to-charge ratio of these peptides is subsequently determined with a mass spectrometer, resulting in MS spectra. From these spectra, information about the identity of the analyzed proteins can be reconstructed. However, the enormous amount and complexity of data make data analysis and interpretation challenging.
    Two ways to analyze proteins with mass spectrometry
    Two main methods are used in shotgun proteomics: data-dependent acquisition (DDA) and data-independent acquisition (DIA). In DDA, the most abundant peptides of a sample are preselected for fragmentation and measurement. This allows the sequences of these few preselected peptides to be reconstructed, making analysis simpler and faster. However, this method introduces a bias towards highly abundant peptides. DIA, in contrast, is more robust and sensitive. All peptides from a certain mass range are fragmented and measured at once, without preselection by abundance.
    As a result, this method generates large amounts of data, and the complexity of the obtained information increases considerably. Until now, identifying the original proteins was only possible by matching the newly measured spectra against libraries of previously measured spectra.
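    As a rough illustration of what library matching means in practice, and not MaxDIA's actual scoring, the sketch below compares a measured fragment-intensity vector against two hypothetical library spectra binned onto the same m/z grid and picks the best cosine-similarity match.
    ```python
    import numpy as np

    # Rough illustration of library matching (not MaxDIA's scoring): cosine
    # similarity between binned fragment-intensity vectors on a shared m/z grid.
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    measured = np.array([0.0, 5.0, 1.0, 0.0, 8.0, 2.0])        # hypothetical measured spectrum
    library = {                                                # hypothetical library entries
        "PEPTIDER": np.array([0.0, 4.5, 1.2, 0.0, 7.5, 1.8]),
        "SAMPLEPK": np.array([3.0, 0.0, 0.0, 6.0, 0.0, 1.0]),
    }
    scores = {name: round(cosine(measured, spec), 3) for name, spec in library.items()}
    best = max(scores, key=scores.get)
    print(scores, "-> best match:", best)
    ```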
    Combining DDA and DIA into one world
    Jürgen Cox and his team have now developed software that provides a complete computational workflow for DIA data. For the first time, it allows algorithms to be applied to DDA and DIA data in the same way. Consequently, studies based on either DDA or DIA will now become more easily comparable. MaxDIA analyzes proteomics data with and without spectral libraries. Using machine learning, the software predicts peptide fragmentation and spectral intensities. Hence, it creates precise MS spectral libraries in silico. In this way, MaxDIA includes a library-free discovery mode with reliable control of false positive protein identifications.
    Furthermore, the software supports new technologies such as bootstrap DIA, BoxCar DIA and trapped ion mobility spectrometry DIA. What are the next steps? The team is already working on further improving the software. Several extensions are being developed, for instance for improving the analysis of posttranslational modifications and identification of cross-linked peptides.
    Enabling researchers to conduct complex proteomics data analysis
    MaxDIA is free software available to scientists all over the world. It is embedded in the established software environment MaxQuant. “We would like to make proteomics data analysis accessible to all researchers,” says Pavel Sinitcyn, first author of the paper that introduces MaxDIA. Thus, at the MaxQuant summer school, Cox and his team offer hands-on training in this software for all interested researchers, thereby helping to bridge the gap between wet-lab work and complex data analysis.
    Sinitcyn states that the aim is to “bring mass spectrometry from the Max Planck Institute of Biochemistry to the clinics.” Instead of measuring only a few proteins, thousands of proteins can now be measured and analyzed. This opens up new possibilities for medical applications, especially in the field of personalized medicine.
    Story Source:
    Materials provided by Max-Planck-Gesellschaft.

  • Mathematicians develop ground-breaking modeling toolkit to predict local COVID-19 impact

    A Sussex team, including university mathematicians, have created a new modelling toolkit which predicts the impact of COVID-19 at a local level with unprecedented accuracy. The details are published in the International Journal of Epidemiology, and the toolkit is available online for other local authorities to use, just as the UK looks as though it may be heading into another wave of infections.
    The study used the local Sussex hospital and healthcare daily COVID-19 situation reports, including admissions, discharges, bed occupancy and deaths.
    Through the pandemic, the newly published modelling has been used by local NHS and public health services to predict infection levels so that public services can plan when and how to allocate health resources, and its predictions have proved accurate. The team are now making their modelling available to other local authorities via the Halogen toolkit.
    Anotida Madzvamuse, professor of mathematical and computational biology within the School of Mathematical and Physical Sciences at the University of Sussex, who led the study, said:
    “We undertook this study as a rapid response to the COVID-19 pandemic. Our objective was to provide support and enhance the capability of local NHS and Public Health teams to accurately predict and forecast the impact of local outbreaks to guide healthcare demand and capacity, policy making, and public health decisions.”
    “Working with outstanding mathematicians, Dr James Van Yperen and Dr Eduard Campillo-Funollet, we formulated an epidemiological model and inferred model parameters by fitting the model to local datasets to allow for short- and medium-term predictions and forecasts of the impact of COVID-19 outbreaks.”
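    The Sussex model itself is described in the journal paper. As a generic illustration of the kind of compartmental model that gets fitted to local admissions data, here is a minimal SIR sketch with assumed parameter values, integrated with simple Euler steps; none of the numbers come from the study.
    ```python
    # Minimal SIR sketch with assumed parameters (not the Sussex model).
    N = 500_000                     # assumed local population
    beta, gamma = 0.25, 0.1         # assumed transmission and recovery rates (1/day)
    S, I, R = N - 100.0, 100.0, 0.0
    dt, days, infectious = 0.1, 180, []

    for _ in range(int(days / dt)):
        new_inf = beta * S * I / N                   # new infections per day
        dS, dI, dR = -new_inf, new_inf - gamma * I, gamma * I
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
        infectious.append(I)

    peak = max(infectious)
    print(f"peak infectious: {peak:.0f} on day {infectious.index(peak) * dt:.0f}")
    ```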