More stories

  • New organ-on-a-chip finds crucial interaction between blood, ovarian cancer tumors

    In the evolving field of cancer biology and treatment, innovations in organ-on-a-chip microdevices allow researchers to learn more about the disease outside the human body. These organs-on-chips model the state of an actual cancer patient, offering an opportunity to identify the correct treatment before administering it to the patient. At Texas A&M University, researchers are pushing these devices to new levels that could change the way clinicians approach cancer treatment, particularly for ovarian cancer.
    The team has recently submitted a patent disclosure with the Texas A&M Engineering Experiment Station.
    “We claim several novelties in technological design as well as biological capabilities that didn’t exist in prior organs-on-chips,” said Dr. Abhishek Jain, lead researcher and assistant professor in the Department of Biomedical Engineering.
    Jain also has a joint appointment in the College of Medicine at Texas A&M.
    Jain’s device — the ovarian tumor microenvironment-chip (OTME-Chip) — focuses on platelets, tiny blood cells that help the body form clots to stop bleeding. The microdevice, about the size of a USB drive, models the properties of a tumor in the lab. Researchers can then recreate the events that occur as platelets circulating in the blood approach the tumor and make it more potent and metastatic.
    “We are creating a platform technology using the organ-on-a-chip approach where tumor biology can be advanced, and new drugs can be identified by recreating the platelet-tumor and platelet-tumor-drug interactions under the influence of flow, supporting blood vessels and the extracellular matrix,” Jain said. More

  • Topology in biology

    When can we say that a certain property of a system is robust? Intuitively, robustness implies that, even under the effect of external perturbations on the system, no matter how strong or random, said property remains unchanged. In mathematics, properties of an object that are robust against deformations are called topological. For example, the letters s, S, and L can be transformed into each other by stretching or bending their shape. The same holds true for letters o, O, and D. However, it is impossible to turn an S into an O without a discontinuous operation, such as cutting the O apart or sticking the two ends of the S together. Therefore, we say that the letters s, S and L have the same topology — as do the letters o, O and D — whereas the two groups of letters have different topologies. But how does topology relate to biology?
    “During the last decades, physicists have discovered that certain properties of quantum systems depend only on the topology of some underlying feature of the system, such as the phase of its wave function or its energy spectrum,” explains Evelyn Tang, co-first author of the study. “We wanted to know if this model can also be applied to biochemical systems to better describe and understand processes out of equilibrium.” As topology is insensitive to continuous perturbations — like the stretching or bending of letters in the example above — properties linked to topology are extremely robust. They will remain unchanged unless a qualitative change to the system occurs, such as cutting apart or sticking together the letters above. The scientists Evelyn Tang, Jaime Agudo-Canalejo and Ramin Golestanian have now demonstrated that the same concept of topological protection may be found in biochemical systems, ensuring the robustness of the corresponding biochemical processes.
    Flowing along the edges
    One of the most famous observations regarding topology in quantum systems is the quantum Hall effect: This phenomenon occurs when a two-dimensional conducting material is subjected to a perpendicular magnetic field. In such a setting, the electrons in the material begin to move in tiny circles known as cyclotron orbits, which overall do not lead to any net current in the bulk of the material. However, at the material’s edges, the electrons will bounce off before completing an orbit, and effectively move in the opposite direction, resulting in a net flow of electrons along these edges. Importantly, this edge flow will occur independently of the shape of the edges, and will persist even if the edges are strongly deformed, highlighting the topological and thus robust nature of the effect.
    The researchers noticed a parallel between such cyclotron orbits in the quantum Hall effect and a phenomenon in biochemical systems termed “futile cycles”: directed reaction cycles that consume energy but appear useless, at least at first sight. For example, a chemical A may be converted to B, which is converted to C, which is subsequently converted back to A. This raised the question: just as cyclotron orbits do in the quantum Hall effect, could futile cycles cause edge currents resulting in a net flow in a two-dimensional biochemical reaction network? (A minimal numerical sketch of such a driven cycle appears at the end of this story.)
    The authors thus modelled biochemical processes that occur in a two-dimensional space. One simple example is the assembly dynamics of a biopolymer composed of two different subunits X and Y: A clockwise futile cycle would then correspond to adding a Y subunit, adding an X subunit, removing a Y subunit, and removing an X subunit, which would bring the system back to the initial state. Now, such a two-dimensional space will also have “edges,” representing constraints on the availability of subunits. As anticipated, the researchers found that counterclockwise currents along these edges would indeed arise spontaneously. Jaime Agudo-Canalejo, co-first author of the study, explains: “In this biochemical context, edge currents correspond to large-scale cyclic oscillations in the system. In the example of a biopolymer, they would result in a cycle in which first all X subunits in the system are added to the polymer, followed by all Y subunits, then first all X and finally all Y subunits are again removed, completing the cycle.”
    The power of topology
    As in the quantum Hall system, these biochemical edge currents appear robust to changes in the shape of the system’s boundaries or to disorder in the bulk of the system. The researchers therefore investigated whether topology indeed sits at the heart of this robustness. However, the tools used for quantum systems are not directly applicable to biochemical systems, which obey classical, stochastic laws. To bridge this gap, the researchers devised a mapping between their biochemical system and an exotic class of systems known as non-Hermitian quantum systems. Evelyn Tang, who has a background in topological quantum matter, recalls: “Once this mapping was established, the whole toolbox of topological quantum systems became available to us. We could then show that, indeed, edge currents are robust thanks to topological protection. Moreover, we found that the emergence of edge currents is inextricably linked to the out-of-equilibrium nature of the futile cycles, which are driven by energy consumption.”
    A new realm of possibilities
    The robustness arising from topological protection, coupled with the versatility inherent in biochemical networks, results in a multitude of phenomena that can be observed in these systems. Examples include an emergent molecular clock that can reproduce some features of circadian systems, the dynamic growth and shrinkage of microtubules (proteins of the cell skeleton), and spontaneous synchronization between two or more systems that are coupled through a shared pool of resources. Ramin Golestanian, co-author of the study and Director of the Department of Living Matter Physics at MPI-DS, is optimistic for the future: “Our study proposes, for the first time, minimal biochemical systems in which topologically protected edge currents can arise. Given the wealth of biochemical networks that exists in biology, we believe it is only a matter of time until examples are found in which topological protection sensitively controls the operations in such systems.” More
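    To make the idea of a driven futile cycle concrete, here is a minimal numerical sketch (our illustration, not the study's model): a three-state reaction loop A -> B -> C -> A whose forward and backward rates differ, so that energy is consumed and a net probability current circulates in steady state. The rate values are arbitrary.

```python
import numpy as np

# Minimal sketch of a driven "futile cycle" (illustrative, not the paper's model):
# three species A -> B -> C -> A with forward rate kf and backward rate kb.
# When kf != kb the cycle is driven out of equilibrium and a net directed
# probability current circulates around the loop in steady state.
kf, kb = 2.0, 0.5                     # hypothetical forward/backward rates
R = np.zeros((3, 3))                  # R[i, j]: rate of jumping from state i to j
for i in range(3):
    R[i, (i + 1) % 3] = kf            # forward step (A->B, B->C, C->A)
    R[i, (i - 1) % 3] = kb            # backward step

# Master-equation generator; its zero eigenvector is the steady-state distribution.
G = R.T - np.diag(R.sum(axis=1))
eigvals, eigvecs = np.linalg.eig(G)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
p /= p.sum()

# Net current on any edge of the cycle, e.g. A->B: J = p_A * kf - p_B * kb.
J = p[0] * kf - p[1] * kb
print("steady state:", np.round(p, 3))       # uniform: [0.333 0.333 0.333]
print("net cycle current J =", round(J, 3))  # zero only at equilibrium (kf == kb)
```

    In the two-dimensional networks studied in the paper, many such locally driven cycles combine, and it is the constraints of the reaction space that turn this microscopic circulation into the large-scale edge currents described above.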

  • Researchers develop tool to drastically speed up the study of enzymes

    For much of human history, animals and plants were perceived to follow a different set of rules than the rest of the universe. In the 18th and 19th centuries, this culminated in a belief that living organisms were infused by a non-physical energy or “life force” that allowed them to perform remarkable transformations that couldn’t be explained by conventional chemistry or physics alone.
    Scientists now understand that these transformations are powered by enzymes — protein molecules comprised of chains of amino acids that act to speed up, or catalyze, the conversion of one kind of molecule (substrates) into another (products). In so doing, they enable reactions such as digestion and fermentation — and all of the chemical events that happen in every one of our cells — that, left alone, would happen extraordinarily slowly.
    “A chemical reaction that would take longer than the lifetime of the universe to happen on its own can occur in seconds with the aid of enzymes,” said Polly Fordyce, an assistant professor of bioengineering and of genetics at Stanford University.
    While much is now known about enzymes, including their structures and the chemical groups they use to facilitate reactions, the details of how their forms connect to their functions, and of how they pull off their biochemical wizardry with such extraordinary speed and specificity, are still not well understood.
    A new technique, developed by Fordyce and her colleagues at Stanford and detailed this week in the journal Science, could help change that. Dubbed HT-MEK — short for High-Throughput Microfluidic Enzyme Kinetics — the technique can compress years of work into just a few weeks by enabling thousands of enzyme experiments to be performed simultaneously. “Limits in our ability to do enough experiments have prevented us from truly dissecting and understanding enzymes,” said study co-leader Dan Herschlag, a professor of biochemistry at Stanford’s School of Medicine.
    By allowing scientists to deeply probe beyond the small “active site” of an enzyme where substrate binding occurs, HT-MEK could reveal clues about how even the most distant parts of enzymes work together to achieve their remarkable reactivity. More
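    To illustrate what a single one of the thousands of parallel measurements yields, here is a generic sketch of fitting Michaelis-Menten kinetics to initial-rate data from one hypothetical chamber. This is textbook enzyme kinetics with invented numbers, not the HT-MEK analysis pipeline.

```python
# Generic enzyme-kinetics fit for one hypothetical chamber (not HT-MEK's code):
# v = Vmax * [S] / (Km + [S]), fitted to simulated initial-rate measurements.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(S, Vmax, Km):
    return Vmax * S / (Km + S)

# Hypothetical substrate concentrations (uM) and noisy measured initial rates (uM/s)
S = np.array([1, 2, 5, 10, 20, 50, 100, 200], dtype=float)
rng = np.random.default_rng(1)
v = michaelis_menten(S, 4.2, 18.0) + rng.normal(0, 0.05, S.size)

(Vmax_fit, Km_fit), _ = curve_fit(michaelis_menten, S, v, p0=[v.max(), np.median(S)])
print(f"Vmax ~ {Vmax_fit:.2f} uM/s, Km ~ {Km_fit:.1f} uM")
# HT-MEK's advantage is running thousands of such measurements at once;
# the per-chamber fit itself then becomes a simple batch job like this.
```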

  • Smartphone screens effective sensors for soil or water contamination

    The touchscreen technology used in billions of smartphones and tablets could also be used as a powerful sensor, without the need for any modifications.
    Researchers from the University of Cambridge have demonstrated how a typical touchscreen could be used to identify common ionic contaminants in soil or drinking water by dropping liquid samples on the screen, the first time this has been achieved. The sensitivity of the touchscreen sensor is comparable to that of typical lab-based equipment, which would make it useful in low-resource settings.
    The researchers say their proof of concept could one day be expanded for a wide range of sensing applications, including for biosensing or medical diagnostics, right from the phone in your pocket. The results are reported in the journal Sensors and Actuators B.
    Touchscreen technology is ubiquitous in our everyday lives: the screen on a typical smartphone is covered in a grid of electrodes, and when a finger disrupts the local electric field of these electrodes, the phone interprets the signal.
    Other teams have used the computational power of a smartphone for sensing applications, but these have relied on the camera or peripheral devices, or have required significant changes to be made to the screen.
    “We wanted to know if we could interact with the technology in a different way, without having to fundamentally change the screen,” said Dr Ronan Daly from Cambridge’s Institute of Manufacturing, who co-led the research. “Instead of interpreting a signal from your finger, what if we could get a touchscreen to read electrolytes, since these ions also interact with the electric fields?”
    The researchers started with computer simulations, and then validated them using a stripped-down, standalone touchscreen similar to those used in phones and tablets, provided by two UK manufacturers. More
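    As a purely hypothetical illustration of how such measurements might be turned into concentrations, the sketch below fits a linear calibration curve to invented capacitance readings; the data format and the assumed linear response are illustrative assumptions, not the Cambridge team's method.

```python
# Hypothetical post-processing sketch: converting a touchscreen capacitance change
# into an estimated ion concentration via a calibration curve. All values invented.
import numpy as np

# Calibration: known electrolyte concentrations (mM) vs measured signal change (a.u.)
conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
signal = np.array([0.00, 0.11, 0.21, 0.40, 0.93, 1.78])

# Fit a simple linear calibration: signal ~ a * conc + b
a, b = np.polyfit(conc, signal, 1)

def estimate_concentration(measured_signal):
    """Invert the linear calibration to estimate concentration from a new reading."""
    return (measured_signal - b) / a

print(f"estimated concentration for signal 0.6 ~ {estimate_concentration(0.6):.2f} mM")
```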

  • Gaming graphics card allows faster, more precise control of fusion energy experiments

    Nuclear fusion offers the potential for a safe, clean and abundant energy source.
    This process, which also occurs in the sun, involves heating plasmas, fluids composed of charged particles, to extremely high temperatures so that atomic nuclei fuse together, releasing abundant energy.
    One challenge to performing this reaction on Earth is the dynamic nature of plasmas, which must be controlled to reach the required temperatures that allow fusion to happen. Now researchers at the University of Washington have developed a method that harnesses advances in the computer gaming industry: It uses a gaming graphics card, or GPU, to run the control system for their prototype fusion reactor.
    The team published these results May 11 in Review of Scientific Instruments.
    “You need this level of speed and precision with plasmas because they have such complex dynamics that evolve at very high speeds. If you cannot keep up with them, or if you mispredict how plasmas will react, they have a nasty habit of going in the totally wrong direction very quickly,” said co-author Chris Hansen, a UW senior research scientist in the aeronautics and astronautics department.
    “Most applications try to operate in an area where the system is pretty static. At most all you have to do is ‘nudge’ things back in place,” Hansen said. “In our lab, we are working to develop methods to actively keep the plasma where we want it in more dynamic systems.”
    The UW team’s experimental reactor self-generates magnetic fields entirely within the plasma, making it potentially smaller and cheaper than other reactors that use external magnetic fields. More
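    As a loose illustration of the kind of high-rate feedback step a graphics card can evaluate, the sketch below maps many magnetic-sensor readings to actuator commands with a single matrix multiply, optionally on the GPU via CuPy. The gain matrix, array sizes and control law are invented for illustration and are not the UW team's control system.

```python
# Illustrative only: a proportional feedback step of the kind a GPU could run at
# high rate, turning many sensor readings into actuator commands in one matmul.
import numpy as np

try:
    import cupy as xp          # run on the graphics card if CuPy is available
except ImportError:
    xp = np                    # fall back to the CPU for this sketch

n_sensors, n_actuators = 512, 64
rng = np.random.default_rng(0)
gain = xp.asarray(rng.normal(size=(n_actuators, n_sensors)) * 1e-3)  # hypothetical gains
target = xp.zeros(n_sensors)                                         # desired sensor profile

def control_step(sensor_readings):
    """One proportional feedback step: sensor error -> actuator command vector."""
    error = xp.asarray(sensor_readings) - target
    return gain @ error

commands = control_step(rng.normal(size=n_sensors))
print("first few actuator commands:", commands[:5])
```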

  • Personalized immunotherapy: Rapid screening of therapeutic combinations

    An innovative testing platform that more closely mimics what cancer encounters in the body may allow for more precise, personalized therapies by enabling the rapid study of multiple therapeutic combinations against tumor cells. The platform, which uses a three-dimensional environment to more closely mirror a tumor microenvironment, is demonstrated in research published in Communications Biology.
    “This whole platform really gives us a way to optimize personalized immunotherapy on a rapid, high throughput scale,” said Jonathan Dordick, Institute Professor of chemical and biological engineering and member of the Center for Biotechnology and Interdisciplinary Studies (CBIS) at Rensselaer Polytechnic Institute, who led this research. “You can imagine somebody having cancer, and you quickly biopsy the tumor and then you use this biochip platform to identify very quickly — within a day or two — what specific treatment modality might be ideally suited against a particular cancer.”
    Of particular interest to researchers is the behavior of a specific type of immune cell known as natural killer (NK) cells, which seek out cancer cells or viruses within the body, bind to their receptors, and secrete an enzyme meant to kill the unwanted cells. The platform studied in this paper allows researchers to compare what happens when NK cells are left to fight tumor cells on their own versus how they behave when an antibody or cancer drug, or a combination of the two, is added.
    The platform is a small two-piece plastic chip that’s about the size of a microscope slide. One side of the sandwich chip contains 330 tiny pillars upon which researchers can place an external matrix, made of a gel-like substance, which mimics the mechanical environment of a tumor cell. When cancer cells are placed inside this gel-like structure, they’re encouraged to grow into a spheroid shape, much as they would inside the body. The second piece contains 330 microwells within which NK cells can be added in suspension — much as they would flow, untethered inside the body.
    At Rensselaer, Dordick collaborated with Seok-Joon Kwon, senior research scientist in CBIS, and Sneha Gopal, who recently received her Ph.D. based, in part, on this study. The Rensselaer team collaborated with researchers from Konyang University and Medical & Bio Decision Company Ltd. To test this platform, researchers studied two types of breast cancer cells, as well as pancreatic cancer cells, with various combinations of NK cells, two monoclonal antibodies, and an anti-cancer chemotherapy drug.
    “You can screen very quickly to determine what combinations of NK cells, antibodies, and chemotherapeutic drugs target the cancer cells within the spheroid geometry,” Dordick said. “What really is amazing is we see very significant differences between what happens in that spheroid, within the slots of the chip, versus what would happen in a more traditional two-dimensional cell culture that’s often used in the screening.”
    In the spheroid design, for instance, the chemotherapy drug paclitaxel had little effect on the three types of cancer cells on its own, whereas in a traditional two-dimensional system, Dordick said, the drug may appear to do well. It performed dramatically better when it was combined with both NK cells and an antibody.
    “This platform moves researchers closer to personalized medicine,” said Deepak Vashishth, director of CBIS. “This work conducted by Professor Dordick and his research group is an excellent example of how we, at Rensselaer, are providing a new angle to human health by developing new approaches at the intersection of engineering and life sciences to enhance cures for diseases such as cancer.”
    To further the potential use of this tool, Dordick said that it must be tested on a wide range of cancer types, including a tumor microenvironment that consists of multiple different types of cells. In the future, he envisions that the platform has the potential to identify combination therapies that work best against a patient’s specific cancer, enabling the identification and delivery of personalized immunotherapy. More
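    As a hypothetical illustration of how the readout of such a combination screen might be summarised, the sketch below averages replicate wells per treatment condition and ranks conditions by remaining tumor-cell viability. The condition names and numbers are invented, not data from the study.

```python
# Illustrative analysis sketch (not the Rensselaer pipeline): summarise a
# combination screen by averaging replicate wells per condition and ranking
# conditions by remaining tumor-cell viability.
import pandas as pd

wells = pd.DataFrame({
    "condition": ["untreated", "paclitaxel", "NK", "NK+antibody",
                  "NK+antibody+paclitaxel", "paclitaxel", "NK+antibody+paclitaxel"],
    "viability": [1.00, 0.92, 0.71, 0.55, 0.28, 0.90, 0.31],   # fraction of untreated
})

summary = (wells.groupby("condition")["viability"]
                .agg(["mean", "count"])
                .sort_values("mean"))
print(summary)   # lowest mean viability = most effective combination in this sketch
```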

  • Antimatter from laser pincers

    In the depths of space, there are celestial bodies where extreme conditions prevail: Rapidly rotating neutron stars generate super-strong magnetic fields. And black holes, with their enormous gravitational pull, can cause huge, energetic jets of matter to shoot out into space. An international physics team with the participation of the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) has now proposed a new concept that could allow some of these extreme processes to be studied in the laboratory in the future: A special setup of two high-intensity laser beams could create conditions similar to those found near neutron stars. In the discovered process, an antimatter jet is generated and accelerated very efficiently. The experts present their concept in the journal Communications Physics.
    The basis of the new concept is a tiny block of plastic, crisscrossed by micrometer-fine channels. It acts as a target for two lasers, which simultaneously fire ultra-strong pulses at the block, one from the right and the other from the left — the block is effectively caught in a pair of laser pincers. “When the laser pulses penetrate the sample, each of them accelerates a cloud of extremely fast electrons,” explains HZDR physicist Toma Toncian. “These two electron clouds then race toward each other with full force, interacting with the laser propagating in the opposite direction.” The ensuing collision is so violent that it produces an extremely large number of gamma quanta — light particles with an energy even higher than that of X-rays.
    The swarm of gamma quanta is so dense that the light particles inevitably collide with each other. And then something remarkable happens: According to Einstein’s famous formula E=mc², light energy can transform into matter. In this case, mainly electron-positron pairs should be created. Positrons are the antiparticles of electrons. What makes this process special is that “very strong magnetic fields accompany it,” explains project leader Alexey Arefiev, a physicist at the University of California at San Diego. “These magnetic fields can focus the positrons into a beam and accelerate them strongly.” In numbers: Over a distance of just 50 micrometers, the particles should reach an energy of one gigaelectronvolt (GeV) — an energy that usually requires a full-grown particle accelerator. (A quick back-of-the-envelope check of this figure appears at the end of this story.)
    Successful computer simulation
    To see whether the unusual idea could work, the team tested it in an elaborate computer simulation. The results are encouraging; in principle, the concept should be feasible. “I was surprised that the positrons that were created in the end were formed into a high-energy and bundled beam in the simulation,” Arefiev says happily. What’s more, the new method should be much more efficient than previous ideas, in which only a single laser pulse is fired at an individual target: According to the simulation, the “laser double strike” should be able to generate up to 100,000 times more positrons than the single-treatment concept.
    “Also, in our case, the lasers would not have to be quite as powerful as in other concepts,” Toncian explains. “This would probably make the idea easier to put into practice.” However, there are only a few places in the world where the method could be implemented. The most suitable would be ELI-NP (Extreme Light Infrastructure Nuclear Physics), a unique laser facility in Romania, largely funded by the European Union. It has two ultra-powerful lasers that can fire simultaneously at a target — the basic requirement for the new method.
    First tests in Hamburg
    Essential preliminary tests, however, could take place in Hamburg beforehand: The European XFEL, the most powerful X-ray laser in the world, is located there. The HZDR plays a major role in this large-scale facility: It leads a user consortium called HIBEF, which has been investigating matter in extreme states for some time. “At HIBEF, colleagues from HZDR, together with the Helmholtz Institute in Jena, are developing a platform that can be used to experimentally test whether the magnetic fields actually form as our simulations predict,” explains Toma Toncian. “This should be easy to analyze with the powerful X-ray flashes of the European XFEL.”
    For astrophysics as well as nuclear physics, the new technique could be exceedingly useful. After all, some extreme processes in space are also likely to produce vast quantities of gamma quanta, which then quickly materialize again into high-energy pairs. “Such processes are likely to take place, among others, in the magnetosphere of pulsars, i.e. of rapidly rotating neutron stars,” says Alexey Arefiev. “With our new concept, such phenomena could be simulated in the laboratory, at least to some extent, which would then allow us to understand them better.” More
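    A quick back-of-the-envelope check of the acceleration figure quoted above (our arithmetic, not the authors'): an energy gain of 1 GeV over 50 micrometers corresponds to an average gradient of roughly 20 TeV per metre.

```python
# Back-of-the-envelope check of the quoted acceleration (our arithmetic only).
energy_gain_eV = 1e9        # 1 GeV
distance_m = 50e-6          # 50 micrometers
gradient_eV_per_m = energy_gain_eV / distance_m
print(f"average gradient ~ {gradient_eV_per_m:.1e} eV/m")   # 2.0e13 eV/m = 20 TeV/m
# Conventional radio-frequency cavities typically reach tens of MV/m, roughly a
# million times lower, which is why 1 GeV normally requires a large accelerator.
```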

  • Artificial intelligence models to analyze cancer images take shortcuts that introduce bias

    Artificial intelligence and deep learning models are powerful tools in cancer treatment. They can be used to analyze digital images of tumor biopsy samples, helping physicians quickly classify the type of cancer, predict prognosis and guide a course of treatment for the patient. However, unless these algorithms are properly calibrated, they can sometimes make inaccurate or biased predictions.
    A new study led by researchers from the University of Chicago shows that deep learning models trained on large sets of cancer genetic and tissue histology data can easily identify the institution that submitted the images. The models, which use machine learning methods to “teach” themselves how to recognize certain cancer signatures, end up using the submitting site as a shortcut to predicting outcomes for the patient, lumping them together with other patients from the same location instead of relying on the biology of individual patients. This in turn may lead to bias and missed opportunities for treatment in patients from racial or ethnic minority groups who may be more likely to be represented in certain medical centers and already struggle with access to care.
    “We identified a glaring hole in the current methodology for deep learning model development which makes certain regions and patient populations more susceptible to be included in inaccurate algorithmic predictions,” said Alexander Pearson, MD, PhD, Assistant Professor of Medicine at UChicago Medicine and co-senior author. The study was published July 20 in Nature Communications.
    One of the first steps in treatment for a cancer patient is taking a biopsy, or small tissue sample of a tumor. A very thin slice of the tumor is affixed to a glass slide, which is stained with multicolored dyes for review by a pathologist to make a diagnosis. Digital images can then be created for storage and remote analysis by using a scanning microscope. While these steps are mostly standard across pathology labs, minor variations in the color or amount of stain, in tissue processing techniques and in the imaging equipment can create unique signatures, like tags, on each image. These location-specific signatures aren’t visible to the naked eye, but are easily detected by powerful deep learning algorithms.
    These algorithms have the potential to be a valuable tool for allowing physicians to quickly analyze a tumor and guide treatment options, but the introduction of this kind of bias means that the models aren’t always basing their analysis on the biological signatures they see in the images, but rather on the image artifacts generated by differences between submitting sites.
    Pearson and his colleagues studied the performance of deep learning models trained on data from the Cancer Genome Atlas, one of the largest repositories of cancer genetic and tissue image data. These models can predict survival rates, gene expression patterns, mutations, and more from the tissue histology, but the frequency of these patient characteristics varies widely depending on which institutions submitted the images, and the model often defaults to the “easiest” way to distinguish between samples — in this case, the submitting site.
    For example, if Hospital A serves mostly affluent patients with more resources and better access to care, the images submitted from that hospital will generally indicate better patient outcomes and survival rates. If Hospital B serves a more disadvantaged population that struggles with access to quality care, the images that site submitted will generally predict worse outcomes.
    The research team found that once the models identified which institution submitted the images, they tended to use that as a stand-in for other characteristics of the image, including ancestry. In other words, if the staining or imaging techniques for a slide looked like it was submitted by Hospital A, the models would predict better outcomes, whereas they would predict worse outcomes if it looked like an image from Hospital B. Conversely, if all patients in Hospital B had genetic characteristics that indicated a worse prognosis, the algorithm would link the worse outcomes to Hospital B’s staining patterns rather than to anything it saw in the tissue.
    “Algorithms are designed to find a signal to differentiate between images, and it does so lazily by identifying the site,” Pearson said. “We actually want to understand what biology within a tumor is more likely to predispose resistance to treatment or early metastatic disease, so we have to disentangle that site-specific digital histology signature from the true biological signal.”
    The key to avoiding this kind of bias is to carefully consider the data used to train the models. Developers can make sure that different disease outcomes are distributed evenly across all sites used in the training data, or, when the distribution of outcomes is unequal, isolate a given site while training or testing the model (a small code sketch illustrating such site-held-out evaluation appears at the end of this story). The result will be more accurate tools that can give physicians the information they need to quickly diagnose and plan treatments for cancer patients.
    “The promise of artificial intelligence is the ability to bring accurate and rapid precision health to more people,” Pearson said. “In order to meet the needs of the disenfranchised members of our society, however, we have to be able to develop algorithms which are competent and make relevant predictions for everyone.” More
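    The sketch below illustrates both ideas in miniature on synthetic data rather than the study's images: first, a simple classifier can recover the submitting site once a site-dependent shift is present in the features; second, holding out entire sites during cross-validation (here with scikit-learn's GroupKFold) ensures a model is always evaluated on institutions it never saw in training, so a memorised site signature cannot be reused. All names, sizes and numbers are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_slides = 600
X = rng.normal(size=(n_slides, 32))            # placeholder slide-level features
y = rng.integers(0, 2, size=n_slides)          # placeholder outcome labels
site = rng.integers(0, 6, size=n_slides)       # submitting institution per slide

# Mimic stain/scanner differences: a site-dependent shift in a few feature dimensions.
X[:, :3] += 0.8 * site[:, None]

# 1) Audit: if a simple model can tell which institution a slide came from,
#    a site signature is present in the features and could act as a shortcut.
site_acc = cross_val_score(RandomForestClassifier(n_estimators=100),
                           X, site, cv=5).mean()
print(f"site-prediction accuracy: {site_acc:.2f} (chance would be about 0.17)")

# 2) Evaluation: GroupKFold holds out entire institutions, so the outcome model
#    is always tested on sites it never saw during training and cannot simply
#    reuse a memorised site signature.
clf = LogisticRegression(max_iter=1000)
naive = cross_val_score(clf, X, y, cv=5).mean()
held_out = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=site).mean()
print(f"naive CV accuracy: {naive:.2f}, site-held-out CV accuracy: {held_out:.2f}")
```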