More stories

    Mathematical modeling used to analyze dynamics of CAR T-cell therapy

    Chimeric antigen receptor T-cell therapy, or CAR T, is a relatively new type of therapy approved to treat several types of aggressive B cell leukemias and lymphomas. Many patients have strong responses to CAR T; however, some have only a short response and develop disease progression quickly. Unfortunately, it is not completely understood why these patients have progression. In an article published in Proceedings of the Royal Society B, Moffitt Cancer Center researchers use mathematical modeling to help explain why CAR T cells work in some patients and not in others.
    CAR T is a type of personalized immunotherapy that uses a patient’s own T cells to target cancer cells. T cells are harvested from a patient and genetically modified in a laboratory to add a specific receptor that targets cancer cells. The patient then undergoes lymphodepletion with chemotherapy to lower some of their existing normal immune cells to help with expansion of the CAR T cells that are infused back into the patient, where they can get to work and attack the tumor.
    Mathematical modeling has been used to help predict how CAR T cells will behave after being infused back into patients; however, no studies have yet considered how interactions between the normal T cells and CAR T cells impact the dynamics of the therapy, in particular how the nonlinear T cell kinetics factor into the chances of therapy success. Moffitt researchers integrated clinical data with mathematical and statistical modeling to address these unknown factors.
    The researchers demonstrate that CAR T cells are effective because they rapidly expand after being infused back into the patient; however, the modified T cells are shown to compete with existing normal T cells, which can limit their ability to expand.
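The competition dynamics just described can be sketched with a minimal logistic-competition toy model (an illustration only, not the authors' actual model; all parameter values here are assumptions):

```python
def simulate(car_t0, normal_t0, days=60, dt=0.1,
             r_car=0.5, r_norm=0.05, capacity=1000.0):
    """Toy model: CAR T and normal T cells share one carrying
    capacity, so abundant normal T cells crowd out CAR T expansion
    (illustrative parameters only)."""
    c, n = car_t0, normal_t0
    for _ in range(int(days / dt)):
        crowding = 1.0 - (c + n) / capacity  # shared resource limit
        c += dt * r_car * c * crowding       # CAR T cells expand fast
        n += dt * r_norm * n * crowding      # normal T cells recover slowly
    return c, n

# With lymphodepletion (few normal T cells), CAR T expands strongly:
car_depleted, _ = simulate(car_t0=1.0, normal_t0=10.0)
# Without lymphodepletion, competition limits CAR T expansion:
car_full, _ = simulate(car_t0=1.0, normal_t0=900.0)
```

In this sketch the same infused dose ends up orders of magnitude larger when the normal T-cell compartment has been depleted first, mirroring the competition effect the study reports.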
    “Treatment success critically depends on the ability of the CAR T cells to multiply in the patient, and this is directly dependent upon the effectiveness of lymphodepletion that reduces the normal T cells before CAR T infusion,” said Frederick Locke, M.D., co-lead study author and vice chair of the Blood and Marrow Transplant and Cellular Immunotherapy Department at Moffitt.
In their model, the researchers found that tumor eradication is a random, yet potentially highly probable, event. Despite this randomness of cure, the authors demonstrated that differences in the timing and probability of cures are determined largely by variability in patient and disease factors. The model confirmed that cures tend to happen early, within 20 to 80 days, before CAR T cells decline in number, while disease progression tends to happen over a wider window of 200 to 500 days after treatment.
    The researchers’ model could also be used to test new treatments or propose refined clinical trial designs. For example, the researchers used their model to demonstrate that another round of CAR T-cell therapy would require a second chemotherapy lymphodepletion to improve patient outcomes.
    “Our model confirms the hypothesis that sufficient lymphodepletion is an important factor in determining durable response. Improving the adaptation of CAR T cells to expand more and survive longer in vivo could result in increased likelihood and duration of response,” explained Philipp Altrock, Ph.D., lead study author and assistant member of the Integrated Mathematical Oncology Department at Moffitt.
    Story Source:
Materials provided by H. Lee Moffitt Cancer Center & Research Institute. Note: Content may be edited for style and length.

    Deciphering the secrets of printed electronics

Next-generation electronics is envisioned to be non-rigid, component-free, flexible, bendable, and easily integrable with different objects.
Direct-write printing techniques provide a unique opportunity to enable this vision through the use of nanomaterial-based "functional inks" that can be tailored to add desired functionalities to various flexible substrates, such as textiles or plastics.
    The technology, known as Printed Electronics (PE), has been known for decades, but has recently gained considerable attention due to innovation in material inks, process technology and design revolution.
To keep the research community abreast of the latest technological advancements in the area of droplet-based PE techniques for next-gen devices, researchers from Aarhus University have now published a comprehensive review of the technology in the scientific journal Advanced Materials.
    “Through this paper, we have tried to fill the existing void in literature by discussing techniques, material inks, ink properties, post processing, substrates and application to provide a complete guide. PE is an industry relevant technology and the gateway to future portable electronics, where advanced printers can print complex circuits on any material,” says Assistant Professor Shweta Agarwala, an expert in PE at the Department of Electrical and Computer Engineering at Aarhus University.
    PE is already being used for many different applications today. It is an attractive method to impart electrical functionality on any surface and the major advantage of PE is that it is inexpensive and readily scalable.
    “PE offers a wide range of advantages over conventional lithography-based technologies. It provides much more production flexibility, it is cheaper and far simpler. More importantly, it opens up a plethora of new possibilities to print flexible electrical circuits directly onto a wide range of substrates such as plastics, papers, clothes, and quite literally any other planar and non-planar surfaces. The research area is moving forwards fast, and this publication provides an overview of how far we have progressed today,” says Hamed Abdolmaleki, a PhD student and first author of the paper.
    Even though PE is being used in more and more industries, and is considered very important in the electronics of the future, the technology is still in its infancy.
    For Shweta Agarwala, the sustainability aspect is very important for the future perspectives of electronics and PE technology:
    “PE is the way towards biodegradable electronics, and with this technology, we can address the huge societal problem that electronics already present, and which will only get more pressing in the future. The world is not only suffering from a huge amount of plastic pollution; it is also burdened by enormous pollution from electronics in all the devices we discard rapidly. In the review article, we have also discussed the emerging field of biodegradable substrates that will have huge environmental impact,” she adds.
    Story Source:
Materials provided by Aarhus University. Original written by Jesper Bruun. Note: Content may be edited for style and length.

    First steps towards revolutionary ULTRARAM™ memory chips

    A new type of universal computer memory — ULTRARAM™ — has taken a step closer towards development with a successful experiment by Lancaster physicists.
Professor Manus Hayne, who is leading the research, commented: “These new results confirm the astonishing properties of ULTRARAM™, allowing us to demonstrate its potential as a fast and efficient non-volatile memory with high endurance.”
Currently, the two main types of memory, dynamic RAM (DRAM) and flash, have complementary characteristics and roles. DRAM is fast, so it is used for active (working) memory, but it is volatile, meaning that information is lost when power is removed. Indeed, DRAM continually ‘forgets’ and needs to be constantly refreshed. Flash is non-volatile, allowing you to carry data in your pocket, but it is very slow and wears out, so it is well-suited for data storage but cannot be used for active memory. “Universal memory” is a memory in which data is very robustly stored yet can also easily be changed — something that was widely considered unachievable until now.
    The Lancaster team has solved the paradox of universal memory by exploiting a quantum mechanical effect called resonant tunnelling that allows a barrier to switch from opaque to transparent by applying a small voltage.
    Their new non-volatile RAM, called ULTRARAM™, is a working implementation of so-called ‘universal memory’, promising all the advantages of DRAM and flash, with none of the drawbacks.
    In their latest work, published in IEEE Transactions on Electron Devices, the researchers have integrated ULTRARAM™ devices into small (4-bit) arrays for the first time. This has allowed them to experimentally verify a novel, patent-pending, memory architecture that would form the basis of future ULTRARAM™ memory chips.
    They have also modified the device design to take full advantage of the physics of resonant tunnelling, resulting in devices that are 2,000 times faster than the first prototypes, and with program/erase cycling endurance that is at least ten times better than flash, without any compromise in data retention.
    Story Source:
Materials provided by Lancaster University. Note: Content may be edited for style and length.

    Privacy-preserving 'encounter metrics' could slow down future pandemics

    When you bump into someone in the workplace or at your local coffee shop, you might call that an “encounter.” That’s the scientific term for it, too. As part of urgent efforts to fight COVID-19, a science is rapidly developing for measuring the number of encounters and the different levels of interaction in a group.
    At the National Institute of Standards and Technology (NIST), researchers are applying that science to a concept they have created called “encounter metrics.” They have developed an encrypted method that can be applied to a device such as your phone to help with the ultimate goal of slowing down or preventing future pandemics. The method is also applicable to the COVID-19 pandemic.
    Their research is explained in a pilot study published in the Journal of Research of NIST.
    Encounter metrics measure the levels of interactions between members of a population. A level of interaction could be the number of people in a bathroom who are talking to each other or a group of people walking down a hallway. There are numerous levels of interactions because there are so many different ways people can interact with one another in different environments.
Mitigating the spread of an infectious disease rests on the assumption that reducing communication and interaction within a community is essential: fewer interactions among people mean less chance of the disease spreading from one person to another. “We need to measure that. It’s important to develop technology to measure that and then see how we can use that technology to shape our working environment to slow future pandemics,” said NIST researcher René Peralta, an author of the NIST study.
Picture two people walking from opposite ends of a hallway who meet in the middle. To record this encounter, each person could carry a phone or a Bluetooth device that broadcasts a signal as soon as the encounter occurs. One way of labeling the encounter is through the exchange of device IDs, or pseudonyms: each device broadcasts its own pseudonym. The pseudonyms could be changed every 10 minutes as a way to protect the privacy of the person’s identity.
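The rotating-pseudonym idea can be sketched in a few lines (a toy illustration under assumptions from the article, not NIST's actual protocol; all names here are hypothetical):

```python
import secrets

ROTATION_SECONDS = 600  # new pseudonym every 10 minutes, per the article

class Beacon:
    """Toy encounter beacon: broadcasts a short-lived random pseudonym
    that is unlinkable to the device's long-term identity."""

    def __init__(self):
        self._epoch = None
        self._pseudonym = None

    def pseudonym(self, now):
        epoch = int(now // ROTATION_SECONDS)
        if epoch != self._epoch:          # rotate at each window boundary
            self._epoch = epoch
            self._pseudonym = secrets.token_hex(8)
        return self._pseudonym

def record_encounter(dev_a, dev_b, now):
    """Each device logs the pair of current pseudonyms, not identities."""
    return tuple(sorted((dev_a.pseudonym(now), dev_b.pseudonym(now))))

a, b = Beacon(), Beacon()
enc1 = record_encounter(a, b, now=0)
enc2 = record_encounter(a, b, now=700)  # next 10-minute window
```

Because both devices have rotated by the second encounter, the two log entries share no identifier, so encounters can be counted without tracking who met whom.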

Detecting carpal tunnel syndrome with a smartphone game

A Japanese research group has combined motion analysis using a smartphone application with machine learning based on an anomaly detection method, developing a technique to easily screen for carpal tunnel syndrome. Carpal tunnel syndrome is common among middle-aged women. The disease compresses nerves in the wrist, causing numbness and difficulty with finger movements. While an accurate diagnosis can be reached with a nerve conduction study, this is not widely used because it requires expensive devices and specialized skills. Thus, a simple screening tool that does not require any specialized knowledge or techniques is desirable.
The research group of Dr. Koji Fujita of Tokyo Medical and Dental University and associate professor Yuta Sugiura of Keio University focused on the increasingly poor movement of the thumb as the disease advances and analyzed its characteristics. They developed a smartphone game application played with the thumbs, along with a program that captures the trajectory of the thumb during play and estimates the likelihood of the disease with machine learning. The application can screen for possible carpal tunnel syndrome using a simple game that can be played in 30 seconds to 1 minute. Even without gathering patient data, they were able to construct an effective estimation model from the data of 12 asymptomatic participants using the anomaly detection method. When this program was applied to 15 new asymptomatic subjects and 36 patients with carpal tunnel syndrome to verify its accuracy, the results were promising: 93% sensitivity, 69% specificity, and an Area Under the Curve (AUC)(1) of 0.86. This is equivalent to or better than the results of physical examinations by expert orthopedic surgeons.
The developed tool can be used to screen for possible carpal tunnel syndrome at sites where no expert is present, such as at home or at a health center. In the future, the research group aims to develop a system that encourages examination by an expert when the disease is suspected, in order to prevent exacerbation. This would reduce the inconvenience and social cost associated with worsening of a disease that is especially common among women, and contribute to creating a society where women can play an active role.
    The research was conducted as part of JST’s Strategic Basic Research Program, Precursory Research for Embryonic Science and Technology (PRESTO).
    (1) Area Under the Curve (AUC)
    This assessment item is used for each test method, and a higher value indicates a better test. Sensitivity is the ratio of correct positive results for subjects with a disease. Specificity is the ratio of correct negative results for subjects without a disease. AUC is the comprehensive assessment indicator of accuracy that combines sensitivity and specificity and takes a value between 0 and 1.
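The three reported metrics can be computed from test results as follows (a generic illustration with made-up toy data, not the study's data):

```python
def sensitivity(labels, preds):
    """Fraction of diseased subjects (label 1) correctly flagged."""
    pos = [p for l, p in zip(labels, preds) if l == 1]
    return sum(pos) / len(pos)

def specificity(labels, preds):
    """Fraction of healthy subjects (label 0) correctly cleared."""
    neg = [1 - p for l, p in zip(labels, preds) if l == 0]
    return sum(neg) / len(neg)

def auc(labels, scores):
    """AUC via the rank (Mann-Whitney) formulation: the probability
    that a random diseased subject scores higher than a random
    healthy one (ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 4 diseased and 4 healthy subjects with anomaly scores.
labels = [1, 1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.7, 0.3, 0.6, 0.4, 0.2, 0.1]
preds = [int(s >= 0.5) for s in scores]  # screening threshold at 0.5
```

Sensitivity and specificity depend on the chosen threshold, while AUC summarizes performance across all thresholds, which is why studies report all three.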
    Story Source:
Materials provided by Japan Science and Technology Agency. Note: Content may be edited for style and length.

    Artificial intelligence as a co-driver

    The use of artificial intelligence (AI) is becoming more common in many branches of industry and online retailing. Traditional lines of work, such as transport logistics and driving, are developing in a similar direction although mainly out of public view. Scientists at the University of Göttingen have now investigated how efficient the use of AI can be in the commercial management of trucks. Their answer: the best option is an intelligent combination of human decision-making and AI applications. The study was published in the International Journal of Logistics Management.
    “As has happened in the private sector, digital applications — as well as machine learning, a kind of AI — are increasingly permeating operations and processes in the transport and logistics sector,” explains Professor Matthias Klumpp from the Faculty of Economics. “The question in the commercial sector, however, is whether or not this contributes to achieving goals and efficiency in companies.”
To answer this question, the researchers compared the work efficiency of truck drivers in relation to their use of AI applications such as dynamic real-time navigation systems, cruise control, and automated gear-shifting based on speed and topography. Looking at retail trade delivery by truck, they studied three comparison groups: the first drove exclusively following human decision-making patterns; the second used a combination of human and machine; and the third relied exclusively on fully automated decisions.
    The researchers from the Production and Logistics Research Group concluded that an intelligent combination of human work and decision-making capabilities with AI applications promises the highest transport and driving efficiency: “On average, the second group achieved the most efficient transport trips, with the fewest interventions and deviations from the optimal path,” the authors said. “Clearly, neither a purely human decision-making structure nor a fully automated driving system can promise to meet current logistics requirements.”
The scientists therefore deduce that despite the progress of AI in truck-based transportation, human experience and decision-making capabilities will still be necessary in the longer term. “However, working with AI applications will create extensive training and qualification needs, especially for simple logistics activities,” the authors conclude. “Technology and AI innovations are therefore not a question for management alone. In particular, efficiency and competitive advantages can be achieved through their application in operational transport.”
    Story Source:
Materials provided by University of Göttingen. Note: Content may be edited for style and length.

    How kelp forests off California are responding to an urchin takeover

    Joshua Smith has been diving in kelp forests in Monterey Bay along the central coast of California since 2012. Back then, he says, things looked very different. Being underwater was like being in a redwood forest, where the kelp was like “towering tall cathedrals,” says Smith, an ecologist at the University of California, Santa Cruz. Their tops were so lush that it was hard to maneuver a boat across them.

    No longer. The once expansive kelp forests are now a mosaic of thinner thickets interspersed with barrens colonized by sea urchins. And those sea urchins have so little to eat, they aren’t even worth the effort of hungry sea otters — which usually keep urchins in check and help keep kelp forests healthy, Smith and his colleagues report March 8 in the Proceedings of the National Academy of Sciences.

    A similar scene is playing out farther north. A thick kelp forest once stretched 350 kilometers along the northern California coast. More than 95 percent of it has vanished since 2014, satellite imagery shows. Once covering about 210 hectares on average, those forests have been reduced to a mere 10 hectares scattered among a few small patches, Meredith McPherson, a marine biologist also at UC Santa Cruz, and her colleagues report March 5 in Communications Biology. Like the barrens farther south, the remaining forests are now covered by purple sea urchins.

Satellite images in 2008 (left) and 2019 (right) of a section of the northern California coastline reveal a 95 percent reduction in the area covered by underwater kelp forests (yellow). Meredith McPherson

    Together, the two studies reveal the devastation of these once resilient ecosystems. But a deeper dive into the cascading effects of this loss may also provide clues to how at least some of these forests can bounce back.

    California’s kelp forests, which provide a rich habitat for marine organisms, got hit by a double whammy of ecological disasters in the past decade, says UC Santa Cruz ecologist Mark Carr. He is a coauthor on the Communications Biology paper who has mentored both McPherson and Smith.

    First, sea star wasting syndrome wiped out local populations of sunflower sea stars (Pycnopodia helianthoides), which typically feed on urchins (SN: 1/20/21). Without sea stars, purple sea urchins (Strongylocentrotus purpuratus) proliferated.

    The second wallop was a marine heat wave so big and persistent it was nicknamed “The Blob” (SN: 12/14/17). While kelp forests have been resilient to warming events before, this one was so extreme it spiked temperatures in many parts of the Pacific to 2 to 3 degrees Celsius above normal (SN: 1/15/20).

    Kelp thrives in cold and nutrient rich water. As its growth slowed in the warmer water, less kelp drifted into the crevices of the reefs where sea urchins typically lurk. With a key predator gone and a newfound need to forage for food rather than waiting for it to come to them, urchins emerged and turned the remaining kelp into a giant buffet.

For the northern California kelp forests, the shift could spell doom for two reasons. The dominant species growing there is bull kelp (Nereocystis luetkeana). It dies back each winter to return in the spring, and the changes are making it harder to bounce back year after year. In comparison, one of the main kelp species in Monterey Bay is giant kelp (Macrocystis pyrifera), which lives for many years, making it a bit more resilient.

Bull kelp (Nereocystis luetkeana), seen here growing at Pescadero Point near Carmel-by-the-Sea, Calif., is the dominant species of kelp along the northern California coast. A marine heat wave and the loss of a sea urchin predator have led to a massive loss of bull kelp in that region. Steve Lonhart/NOAA, MBNMS

    The kelp forests in the north also lack an urchin predator common farther south: sea otters. Those sea otters are what’s providing a glimmer of hope in Monterey Bay. Smith and his colleagues wondered how the bonanza of sea urchins was affecting the otters. They found that sea otters were eating three times as many sea urchins as they were before 2014, but they were being picky. They avoided the more populous urchin barrens, instead feasting only on urchins in the remaining patches of kelp. That’s because the barrens offer only a poor diet of scraps, leaving the urchins there essentially hollow on the inside. “Zombies,” Smith calls them.

    The nutrient-rich urchins in the healthy kelp make a far better sea otter snack. And by zeroing in on those urchins, the otters keep the population in check, preventing urchins from scarfing up the remaining kelp.

    Simply transplanting sea otters to new locations may create new challenges. That’s what happened off the Pacific Coast of Canada. Kelp forests there rebounded, but the otters competed with humans, especially Indigenous communities, that rely on the same food sources (SN: 6/11/20).

    “The community on the North Coast is a very natural resource–dependent community, and this will impact them,” says Marissa Baskett, an ecologist at the University of California, Davis.

    And there’s a lot of work to do to figure out how to bring back sunflower sea stars, now a critically endangered species. Nailing down the cause of the wasting syndrome, which is still unknown, will be crucial to recovery efforts.

Even so, understanding these interactions can provide clues to how to help restore the lost kelp forests, Baskett says. “These findings can inform restoration efforts aimed at recovering kelp forests and anticipating the effects of future marine heat waves.”

    AI used in battle against asbestos-linked cancer

    International genomics research led by the University of Leicester has used artificial intelligence (AI) to study an aggressive form of cancer, which could improve patient outcomes.
    Mesothelioma is caused by breathing asbestos particles and most commonly occurs in the linings of the lungs or abdomen. Currently, only seven per cent of people survive five years after diagnosis, with a prognosis averaging 12 to 18 months.
    New research undertaken by the Leicester Mesothelioma Research Programme has now revealed, using AI analysis of DNA-sequenced mesotheliomas, that they evolve along similar or repeated paths between individuals. These paths predict the aggressiveness and possible therapy of this otherwise incurable cancer.
    Professor Dean Fennell, Chair of Thoracic Medical Oncology at the University of Leicester and Director of the Leicester Mesothelioma Research Programme, said:
“It has long been appreciated that asbestos causes mesothelioma; however, how this occurs remains a mystery.
    “Using AI to interrogate genomic ‘big data’, this initial work shows us that mesotheliomas follow ordered paths of mutations during development, and that these so-called trajectories predict not only how long a patient may survive, but also how to better treat the cancer — something Leicester aims to lead on internationally through clinical trial initiatives.”
    While use of asbestos is now outlawed — and stringent regulations in place on its removal — each year around 25 people are diagnosed with mesothelioma in Leicestershire and 190 are diagnosed in the East Midlands. Cases of mesothelioma in the UK have increased by 61% since the early 1990s.
    Until very recently, chemotherapy was the only licenced choice for patients with mesothelioma. However, treatment options start to become limited once people stop responding to their treatment.
    Professor Fennell in collaboration with the University of Southampton recently made a major breakthrough in treating the disease by demonstrating that use of an immunotherapy drug called nivolumab increased survival and stabilised the disease for patients. This was the first-ever trial to demonstrate improved survival in patients with relapsed mesothelioma.
    Story Source:
Materials provided by University of Leicester. Note: Content may be edited for style and length.