More stories

  • Outlook for the blue economy

    A handful of hyper-productive fisheries provide sustenance to a billion people and employ tens of millions. These fisheries occur on the eastern edges of the world’s oceans: off the U.S. West Coast, the Canary Islands, Peru, Chile, and the Benguela region off southwestern Africa. There, a process called upwelling brings cold, nutrient-rich water to the surface, which in turn supports the abundant marine life that these fisheries depend on.
    A new project led by researchers at Texas A&M University is seeking to understand how changes to the climate and oceans will impact fisheries in the U.S. and around the world.
    “We’re interested in how climate change is going to alter upwelling and how the sustainability of the future fisheries will be impacted,” said Ping Chang, Louis & Elizabeth Scherck Chair in Oceanography at Texas A&M University (TAMU). “It turns out that when we increase the resolution of our climate models, we find that the upwelling simulation becomes much closer to reality.”
    Funded by the National Science Foundation (NSF), the project aims to develop medium- to long-term fishery forecasts driven by some of the highest-resolution coupled climate forecasts ever run. It is one of the 16 Convergence Accelerator Phase 1 projects that address the ‘Blue Economy’ — the sustainable use of ocean resources for economic growth. Convergence projects integrate researchers from different scientific disciplines.
    The TAMU team, led by oceanographer Piers Chapman, includes computational climate modelers, marine biogeochemical modelers, fishery modelers, decision support system experts, and risk communications scholars from academia, federal agencies, and industry.
    Chang and Gokhan Danabasoglu at the National Center for Atmospheric Research (NCAR) lead the climate modeling component of the research. They use the Frontera supercomputer at the Texas Advanced Computing Center (TACC) — the fastest academic supercomputer in the U.S. — to power their research.

    In the 1990s, marine biologist Andrew Bakun proposed that a warming climate would increase upwelling in the eastern boundary regions. He reasoned that since land is warming faster than the oceans, the temperature gradient between land and ocean would drive a stronger wind, which makes upwelling stronger. However, recent historical data suggests the opposite might in fact be the norm.
    “A lot of papers written in the past use coarse-resolution models that don’t resolve upwelling very well,” Chang said. “High-resolution models so far predict upwelling decreasing in most areas, not increasing. The models are predicting warmer, not colder, temperatures in these waters. In Chile and Peru, the warming is quite significant: 2-3°C of warming in the worst-case scenario, which is business as usual. That can be bad news for upwelling.”
    The areas where upwelling occurs are quite narrow and localized, but their impact on the marine ecosystem is very large. The eastern Pacific upwelling, for instance, is only about 100 kilometers wide. The climate models used by the Intergovernmental Panel on Climate Change (IPCC) have a resolution of 100 kilometers — and would therefore produce only one data point for the upwelling region, not nearly enough to predict future changes accurately.
    The model used by Chang and his colleagues, by contrast, has a resolution of 10 kilometers in each direction. That is ten times finer in each horizontal direction than the IPCC models, or roughly 100 times as many grid points, and requires roughly 100 times the compute power.
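    As a rough back-of-the-envelope illustration of those numbers (for intuition only; real model grids are curvilinear, and true cost also depends on time step and vertical levels), a minimal sketch:

    ```python
    # Back-of-the-envelope comparison of horizontal grid resolutions.
    # Illustrative only; not the project's actual grid configuration.

    coarse_km = 100            # typical IPCC-class model spacing
    fine_km = 10               # spacing used in the high-resolution simulations
    upwelling_width_km = 100   # approximate width of the eastern Pacific upwelling

    cells_ratio = (coarse_km / fine_km) ** 2        # factor more horizontal cells
    points_across = upwelling_width_km // fine_km   # grid points spanning the upwelling

    print(f"Horizontal cells increase by a factor of {cells_ratio:.0f}")
    print(f"Grid points across the upwelling zone: {points_across} (vs. 1 at 100 km)")
    ```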
    Chang’s study relies on two separate but related sets of simulations. The first is an ensemble of high-resolution coupled Earth system models (the same model run multiple times from slightly different starting points to produce statistically robust results). The second incorporates observed atmospheric data to generate realistic ocean states, which are then used to initialize model predictions. Starting from 1982, the team will perform five-year retrospective forecasts to determine the model’s skill in forecasting upwelling effects.
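    A minimal sketch of how such a two-part experimental design might be organized follows; the ensemble size, perturbation scheme, toy fields, and end year are placeholders, not the project’s actual configuration:

    ```python
    # Illustrative layout of the two simulation sets described above.
    # Ensemble size, perturbation magnitude, fields, and dates are placeholders.

    import random

    ENSEMBLE_SIZE = 5                 # hypothetical number of ensemble members
    FORECAST_YEARS = 5                # length of each retrospective forecast
    START_YEARS = range(1982, 2018)   # hindcast start dates (placeholder end year)

    def perturb_initial_state(state, scale=1e-4):
        """Nudge each field slightly so ensemble members diverge but stay physical."""
        return {name: value + random.gauss(0.0, scale) for name, value in state.items()}

    # Set 1: free-running high-resolution ensemble from perturbed initial states.
    base_state = {"sst": 18.0, "wind_stress": 0.07}   # toy initial conditions
    ensemble = [perturb_initial_state(base_state) for _ in range(ENSEMBLE_SIZE)]

    # Set 2: retrospective (hindcast) forecasts initialized from observation-based
    # ocean states, each run forward FORECAST_YEARS to score forecast skill.
    hindcasts = [(start, start + FORECAST_YEARS) for start in START_YEARS]

    print(f"{len(ensemble)} ensemble members, {len(hindcasts)} hindcast windows")
    ```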

    “There’s a limit to how far out you can make a forecast,” Chang said. “Beyond a certain time limit, the model no longer has skill. At five years, our model still shows useful skill.”
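    The article does not say how that skill is quantified; a common choice in forecast verification is the anomaly correlation coefficient, sketched below as an assumed example with made-up numbers:

    ```python
    # Anomaly correlation coefficient (ACC), a standard forecast-skill metric.
    # Whether this is the metric used in the study is an assumption; the
    # temperatures below are toy values, not project data.

    import numpy as np

    def anomaly_correlation(forecast, observed, climatology):
        """Correlate forecast and observed anomalies relative to a climatology."""
        f_anom = np.asarray(forecast) - np.asarray(climatology)
        o_anom = np.asarray(observed) - np.asarray(climatology)
        return float(np.sum(f_anom * o_anom) /
                     np.sqrt(np.sum(f_anom**2) * np.sum(o_anom**2)))

    # Toy example: upwelling-region SST (degrees C) over five forecast years.
    climatology = [17.0, 17.0, 17.0, 17.0, 17.0]
    forecast    = [17.2, 17.5, 17.1, 17.8, 17.6]
    observed    = [17.3, 17.4, 17.0, 17.9, 17.5]

    print(f"ACC = {anomaly_correlation(forecast, observed, climatology):.2f}")
    ```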
    The team reported their results in Communications Earth & Environment, a Nature Portfolio journal, in January 2023.
    The Blue Economy project continues the TAMU-NCAR team’s multi-decade effort to upgrade global climate models so they are higher resolution and more physically accurate. The model used by the team was one of a handful of high-resolution Earth system models that were included in the most recent IPCC report and are being explored by an IPCC subcommittee. They represent the future of global climate modeling.
    At 10-kilometer resolution, researchers believe it is possible for models to realistically generate extreme weather events like tropical cyclones or atmospheric rivers, as well as more refined predictions of how climate in a specific region will change. However, models at this resolution still cannot resolve clouds, which requires a resolution of a few kilometers; such cloud-resolving models can currently be integrated only over short periods, not climate timescales.
    Efforts to capture the full Earth system in models nonetheless continue to improve.
    The TAMU-NCAR project will be one of the first to incorporate biogeochemical models of the ocean and fisheries models into Earth system models at 10 km resolution.
    “TACC is unique in providing resources for researchers like us to tackle the fundamental questions of science,” Chang said. “Our goal is not routine forecasts. What we want is a better understanding of the Earth system dynamics that are missing in current climate models to make our model and our methods better. Without Frontera, I don’t know if we could make simulations like we do. It’s critical.”

  • Quantum physicists make major nanoscopic advance

    In a new breakthrough, researchers at the University of Copenhagen, in collaboration with Ruhr University Bochum, have solved a problem that has caused quantum researchers headaches for years. The researchers can now control two quantum light sources rather than one. Trivial as it may seem to those uninitiated in quantum physics, this colossal breakthrough allows researchers to create a phenomenon known as quantum mechanical entanglement. This, in turn, opens new doors for companies and others to exploit the technology commercially.
    Going from one to two is a minor feat in most contexts. But in the world of quantum physics, doing so is crucial. For years, researchers around the world have strived to develop stable quantum light sources and achieve the phenomenon known as quantum mechanical entanglement — a phenomenon, with nearly sci-fi-like properties, where two light sources can affect each other instantly and potentially across large geographic distances. Entanglement is the very basis of quantum networks and central to the development of an efficient quantum computer.
    Today, researchers from the Niels Bohr Institute published a new result in the journal Science, in which they succeeded in doing just that. According to Professor Peter Lodahl, one of the researchers behind the result, it is a crucial step in the effort to take the development of quantum technology to the next level and to “quantize” society’s computers, encryption and the internet.
    “We can now control two quantum light sources and connect them to each other. It might not sound like much, but it’s a major advancement and builds upon the past 20 years of work. By doing so, we’ve revealed the key to scaling up the technology, which is crucial for the most ground-breaking of quantum hardware applications,” says Professor Peter Lodahl, who has conducted research in the area since 2001.
    The magic all happens in a so-called nanochip — which is not much larger than the diameter of a human hair — that the researchers also developed in recent years.
    Quantum sources overtake the world’s most powerful computer
    Peter Lodahl’s group is working with a type of quantum technology that uses light particles, called photons, as micro transporters to move quantum information about.

    While Lodahl’s group is a leader in this discipline of quantum physics, they have only been able to control one light source at a time until now. This is because light sources are extraordinarily sensitive to outside “noise,” making them very difficult to copy. In their new result, the research group succeeded in creating two identical quantum light sources rather than just one.
    “Entanglement means that by controlling one light source, you immediately affect the other. This makes it possible to create a whole network of entangled quantum light sources, all of which interact with one another, and which you can get to perform quantum bit operations in the same way as bits in a regular computer, only much more powerfully,” explains postdoc Alexey Tiranov, the article’s lead author.
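    A textbook example of such an entangled two-photon state is a Bell state; this is the canonical illustration, not necessarily the exact state prepared in this experiment:

    ```latex
    % Canonical two-qubit Bell state: measuring one photon immediately
    % fixes the outcome for the other, however far apart they are.
    \[
      |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}
      \left( |0\rangle_{A}\,|0\rangle_{B} + |1\rangle_{A}\,|1\rangle_{B} \right)
    \]
    ```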
    The advantage arises because a quantum bit can be both a 1 and a 0 at the same time, which yields processing power that is unattainable using today’s computer technology. According to Professor Lodahl, just 100 photons emitted from a single quantum light source will contain more information than the world’s largest supercomputer can process.
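    To get a sense of the scale behind that 100-photon claim: the state of n qubits is described by 2^n complex amplitudes. The sketch below compares the storage such a state would need with classical memory; the 16 bytes per amplitude and the roughly 10-petabyte supercomputer figure are assumptions for illustration, not quoted specifications:

    ```python
    # Rough scale of the state space of 100 photonic qubits versus classical memory.
    # Bytes-per-amplitude and the ~10 PB supercomputer memory are assumptions.

    n_photons = 100
    amplitudes = 2 ** n_photons          # complex amplitudes needed to describe the state
    bytes_per_amplitude = 16             # e.g., two 8-byte floating-point numbers
    state_bytes = amplitudes * bytes_per_amplitude

    supercomputer_bytes = 10e15          # ~10 petabytes of memory (assumed order of magnitude)

    print(f"Amplitudes to track: {amplitudes:.3e}")
    print(f"Memory to store them: {state_bytes:.3e} bytes")
    print(f"Ratio to a ~10 PB machine: {state_bytes / supercomputer_bytes:.3e}x")
    ```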
    With 20-30 entangled quantum light sources, it may become possible to build a universal error-corrected quantum computer — the ultimate “holy grail” of quantum technology, into which large IT companies are now pumping billions.
    Other actors will build upon the research
    According to Lodahl, the biggest challenge has been to go from controlling one to two quantum light sources. Among other things, this has made it necessary for researchers to develop extremely quiet nanochips and have precise control over each light source.

    With the new research breakthrough, the fundamental quantum physics research is now in place. Now it is time for other actors to take the researchers’ work and use it in their quests to deploy quantum physics in a range of technologies including computers, the internet and encryption.
    “It is too expensive for a university to build a setup where we control 15-20 quantum light sources. So, now that we have contributed to understanding the fundamental quantum physics and taken the first step along the way, scaling up further is very much a technological task,” says Professor Lodahl.
    The research was conducted at the Danish National Research Foundation’s “Center of Excellence for Hybrid Quantum Networks (Hy-Q)” and is a collaboration between Ruhr University Bochum in Germany and the University of Copenhagen’s Niels Bohr Institute.

  • New AI tool makes speedy gene-editing possible

    An artificial intelligence program may enable the first simple production of customizable proteins called zinc fingers to treat diseases by turning genes on and off.
    The researchers at NYU Grossman School of Medicine and the University of Toronto who designed the tool say it promises to accelerate the development of gene therapies on a large scale.
    Illnesses including cystic fibrosis, Tay-Sachs disease, and sickle cell anemia are caused by errors in the order of DNA letters that encode the operating instructions for every human cell. Scientists can in some cases correct these mistakes with gene editing methods that rearrange these letters.
    Other conditions are caused, not by a mistake in the code itself, but by problems in how the cellular machinery reads DNA (epigenetics). A gene, which provides the recipe for a particular protein, often partners with molecules called transcription factors that tell the cell how much of that protein to make. When this process goes awry, over- or underactive genes contribute to diabetes, cancer, and neurological disorders. As a result, researchers have been exploring ways to restore normal epigenetic activity.
    One such technique is zinc-finger editing, which can both change and control genes. Among the most abundant protein structures in the human body, zinc fingers can guide DNA repair by grabbing onto scissor-like enzymes and directing them to cut faulty segments out of the code.
    Similarly, zinc fingers can also hook onto transcription factors and pull them toward a gene segment in need of regulation. By customizing these instructions, genetic engineers can tailor any gene’s activity. A drawback, however, is that artificial zinc fingers are challenging to design for a specific task. Since these proteins attach to DNA in complex groups, researchers would need to be able to tell — out of countless possible combinations — how every zinc finger interacts with its neighbor for each desired genetic change.

    The study authors’ new technology, called ZFDesign, overcomes this obstacle by using artificial intelligence (AI) to model and design these interactions. The model is based on data generated by a screen of nearly 50 billion possible zinc finger-DNA interactions in the researchers’ labs. A report on the tool was published online Jan. 26 in the journal Nature Biotechnology.
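    The general idea, choosing fingers that fit a DNA target while accounting for how each finger interacts with its neighbor, can be caricatured as a scored search over finger combinations. Everything below (names, scoring, toy data) is hypothetical and is not ZFDesign’s real interface:

    ```python
    # Hypothetical caricature of context-aware zinc-finger selection.
    # None of these names or scores come from ZFDesign; they only illustrate
    # ranking finger combinations against a DNA target while accounting for
    # each finger's left-hand neighbor.

    from itertools import product

    CANDIDATE_FINGERS = ["ZF_A", "ZF_B", "ZF_C"]   # toy candidate pool
    TARGET_TRIPLETS = ["GCA", "TGG", "ACC"]        # each finger reads ~3 base pairs

    def learned_score(finger, triplet, left_neighbor):
        """Stand-in for a trained model's compatibility score (toy, hash-based)."""
        base = (hash((finger, triplet)) % 100) / 100.0
        context = (hash((left_neighbor, finger)) % 100) / 200.0
        return base + context

    def best_combination(triplets, candidates):
        """Exhaustively score every finger assignment and keep the best one."""
        best, best_score = None, float("-inf")
        for combo in product(candidates, repeat=len(triplets)):
            score = sum(
                learned_score(f, t, combo[i - 1] if i else None)
                for i, (f, t) in enumerate(zip(combo, triplets))
            )
            if score > best_score:
                best, best_score = combo, score
        return best, best_score

    print(best_combination(TARGET_TRIPLETS, CANDIDATE_FINGERS))
    ```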
    “Our program can identify the right grouping of zinc fingers for any modification, making this type of gene editing faster than ever before,” says study lead author David Ichikawa, PhD, a former graduate student at NYU Langone Health.
    Ichikawa notes that zinc-finger editing offers a potentially safer alternative to CRISPR, a key gene-editing technology with applications that range from finding new ways to kill cancer cells to designing more nourishing crops. Unlike the entirely human-derived zinc fingers, CRISPR, which stands for clustered regularly interspaced short palindromic repeats, relies on bacterial proteins to interact with genetic code. These “foreign” proteins could trigger patients’ immune defense systems, which may attack them like any other infection and lead to dangerous inflammation.
    The study authors add that besides posing a lower immune risk, the small size of zinc-finger tools may also provide more flexible gene therapy techniques compared with CRISPR by enabling more ways to deliver the tools to the right cells in patients.
    “By speeding up zinc-finger design coupled with their smaller size, our system paves the way for using these proteins to control multiple genes at the same time,” says study senior author Marcus Noyes, PhD. “In the future, this approach may help correct diseases that have multiple genetic causes, such as heart disease, obesity, and many cases of autism.”
    To test the computer’s AI design code, Noyes and his team used a customized zinc finger to disrupt the coding sequence of a gene in human cells. In addition, they built several zinc fingers that successfully reprogrammed transcription factors to bind near a target gene sequence and turn up or down its expression, demonstrating that their technology can be used for epigenetic changes.

    Noyes, an assistant professor in the Department of Biochemistry and Molecular Pharmacology at NYU Langone, cautions that, while promising, zinc fingers can be difficult to control. Since they are not always specific to a single gene, some combinations can affect DNA sequences beyond a particular target, leading to unintended changes in genetic code.
    As a result, Noyes says the team next plans to refine their AI program so it can build more precise zinc-finger groupings that only prompt the desired edit. Noyes is also a member of NYU Langone’s Institute for System Genetics.
    Funding for the study was provided by National Institutes of Health grants R01GM118851 and R01GM133936. Further funding was provided by Canadian Institutes of Health Research Project grant PJT-159750, the Compute Canada Resource Allocation, the Frederick Banting and Charles Best Canada Graduate Scholarship, and the Ontario Graduate Scholarship.
    Noyes is a co-founder of TBG Therapeutics, a company that develops methods to design zinc fingers and apply them to treatments for diseases with genetic components. NYU Langone has patents pending (PCT/US21/30267, 63145929) for these tools and approaches, from which both Noyes and NYU Langone may benefit financially. The terms and conditions of these relationships are being managed in accordance with the policies of NYU Langone.
    In addition to Noyes, other NYU investigators involved in the study were Manjunatha Kogenaru, PhD; April Mueller, BS; David Giganti, PhD; Gregory Goldberg, PhD; Samantha Adams, PhD; Jeffrey Spencer, PhD; Courtney Gianco; Finnegan Clark, BS; and Timothee Lionnet, PhD. Other study investigators included Osama Abdin, BS; Nader Alerasool, PhD; Han Wen, MS; Rozita Razavi, PhD, MPH; Satra Nim, PhD; Hong Zheng, PhD; Mikko Taipale, PhD; and Philip Kim, PhD, at the University of Toronto. Study lead author David Ichikawa is at the Pandemic Response Lab in Long Island City, N.Y.