More stories

  • Outlook for the blue economy

    A handful of hyper-productive fisheries provide sustenance to a billion people and employ tens of millions. These fisheries occur on the eastern edges of the world’s oceans — off the West Coast of the U.S., the Canary Islands, Peru, Chile, and the Benguela region off southwestern Africa. There, a process called upwelling brings cold, nutrient-rich water to the surface, fueling the abundant marine life that humans depend on for food.
    A new project led by researchers at Texas A&M University is seeking to understand how changes to the climate and oceans will impact fisheries in the U.S. and around the world.
    “We’re interested in how climate change is going to alter upwelling and how the sustainability of the future fisheries will be impacted,” said Ping Chang, Louis & Elizabeth Scherck Chair in Oceanography at Texas A&M University (TAMU). “It turns out that when we increase the resolution of our climate models, we find that the upwelling simulation becomes much closer to reality.”
    Funded by the National Science Foundation (NSF), the project aims to develop medium- to long-term fishery forecasts, driven by some of the highest-resolution coupled climate forecasts ever run. It is one of 16 Convergence Accelerator Phase 1 projects that address the ‘Blue Economy’ — the sustainable use of ocean resources for economic growth. Convergence projects integrate scholars from different science disciplines.
    The TAMU team, led by oceanographer Piers Chapman, includes computational climate modelers, marine biogeochemical modelers, fishery modelers, decision support system experts, and risk communications scholars from academia, federal agencies, and industry.
    Chang and Gokhan Danabasoglu at the National Center for Atmospheric Research (NCAR) lead the climate modeling component of the research. They use the Frontera supercomputer at the Texas Advanced Computing Center (TACC) — the fastest academic supercomputer in the U.S. — to power their research.

    In the 1990s, marine biologist Andrew Bakun proposed that a warming climate would increase upwelling in the eastern boundary regions. He reasoned that since land is warming faster than the oceans, the temperature gradient between land and ocean would drive a stronger wind, which makes upwelling stronger. However, recent historical data suggests the opposite might in fact be the norm.
    “A lot of papers written in the past use coarse resolution models that don’t resolve upwelling very well,” Chang said. “High resolution models so far predict upwelling decreasing in most areas, not increasing. The models are predicting warmer, not colder temperatures in these waters. In Chile and Peru, the warming is quite significant — 2-3ºC warming in the worst case scenario, which is business as usual. That can be bad news for upwelling.”
    The areas where upwelling occurs are quite narrow and localized, but their impact on the marine ecosystem is very large. The eastern Pacific upwelling, for instance, is only about 100 kilometers wide. The climate models used by the Intergovernmental Panel on Climate Change (IPCC) have a resolution of 100 kilometers — and would therefore produce only one data point for the upwelling region, not nearly enough to predict future changes accurately.
    The model used by Chang and his colleagues, on the other hand, has a resolution of 10 kilometers in each horizontal direction. That is 10 times finer in each direction — 100 times more grid cells overall — than the IPCC-class models, and it requires roughly 100 times the compute power.
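    To see where the factor of 100 comes from, here is a back-of-the-envelope sketch in Python (illustrative arithmetic only; real models also differ in vertical levels and time step, which the article folds into its rough factor):

    ```python
    # Refining the grid spacing multiplies the number of horizontal cells by
    # the square of the refinement factor; 100 km -> 10 km is 10 x 10 = 100.

    def refinement_factor(coarse_km: float, fine_km: float) -> float:
        """How many fine cells replace one coarse cell in a 2-D grid."""
        per_direction = coarse_km / fine_km
        return per_direction ** 2

    print(refinement_factor(100, 10))  # 100.0 -> ~100x more cells to compute
    ```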
    Chang’s study relies on two separate, but related, sets of simulations. The first set involves an ensemble — the same model run multiple times from slightly different starting points, so that the results are statistically robust — of high-resolution coupled Earth system models. The second incorporates observed data in the atmosphere to generate realistic ocean states that are then used to initialize the model prediction. Starting from 1982, it will perform five-year retrospective forecasts to determine the model’s skill in forecasting upwelling effects.

    “There’s a limit to how far out you can make a forecast,” Chang said. “Beyond a certain time limit, the model no longer has skill. At five years, our model still shows useful skill.”
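    The article does not spell out how skill is scored, but a common metric for retrospective (hindcast) forecasts is the anomaly correlation between the ensemble-mean prediction and observations. A minimal sketch with synthetic data — the numbers and setup here are illustrative stand-ins, not the team’s actual pipeline:

    ```python
    import numpy as np

    def anomaly_correlation(forecast, observed):
        """Anomaly correlation coefficient (ACC), a standard skill score.
        Inputs are 1-D arrays of anomalies (departures from climatology)."""
        f = forecast - forecast.mean()
        o = observed - observed.mean()
        return (f @ o) / (np.linalg.norm(f) * np.linalg.norm(o))

    rng = np.random.default_rng(0)
    n_years, n_members = 40, 10           # hindcast years, ensemble size (made up)
    truth = rng.standard_normal(n_years)  # stand-in for observed upwelling anomalies

    # Each ensemble member sees the same signal plus its own noise; averaging
    # the members suppresses the noise, which is why ensembles improve skill.
    members = truth + rng.standard_normal((n_members, n_years))
    ensemble_mean = members.mean(axis=0)

    print(f"single member skill: {anomaly_correlation(members[0], truth):.2f}")
    print(f"ensemble mean skill: {anomaly_correlation(ensemble_mean, truth):.2f}")
    ```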
    The team reported their results in the Nature Portfolio journal Communications Earth & Environment in January 2023.
    The Blue Economy project continues the TAMU-NCAR team’s multi-decade effort to upgrade global climate models so they are higher resolution and more physically accurate. The model used by the team was one of a handful of high-resolution Earth system models that were included in the most recent IPCC report and are being explored by an IPCC subcommittee. They represent the future of global climate modeling.
    At 10-kilometer resolution, researchers believe it is possible for models to realistically generate extreme weather events like tropical cyclones or atmospheric rivers, as well as more refined predictions of how the climate in a specific region will change. However, models at this resolution still cannot resolve clouds, which requires grids of a few kilometers or finer — and such models can currently be run only over short periods, not climate timescales.
    The effort to capture the full Earth system in models continues to improve.
    The TAMU-NCAR project will be one of the first to incorporate biogeochemical models of the ocean and fisheries models into Earth system models at 10 km resolution.
    “TACC is unique in providing resources for researchers like us to tackle the fundamental questions of science,” Chang said. “Our goal is not routine forecasts. What we want is a better understanding of the Earth system dynamics that are missing in current climate models to make our model and our methods better. Without Frontera, I don’t know if we could make simulations like we do. It’s critical.”

  • It’s possible to reach net-zero carbon emissions. Here’s how

    Patricia Hidalgo-Gonzalez saw the future of energy on a broiling-hot day last September.

    An email alert hit her inbox from the San Diego Gas & Electric Company. “Extreme heat straining the grid,” read the message, which was also pinged as a text to 27 million people. “Save energy to help avoid power interruptions.”

    It worked. People cut their energy use. Demand plunged, blackouts were avoided and California successfully weathered a crisis exacerbated by climate change. “It was very exciting to see,” says Hidalgo-Gonzalez, an electrical engineer at the University of California, San Diego who studies renewable energy and the power grid.

    This kind of collective societal response, in which we reshape how we interact with the systems that provide us energy, will be crucial as we figure out how to live on a changing planet.

    Earth has warmed at least 1.1 degrees Celsius since the 19th century, when the burning of coal, oil and other fossil fuels began belching heat-trapping gases such as carbon dioxide into the atmosphere. Scientists agree that only drastic action to cut emissions can keep the planet from blasting past 1.5 degrees of warming — a threshold beyond which the consequences become even more catastrophic than the rising sea levels, extreme weather and other impacts the world is already experiencing.

    The goal is to achieve what’s known as net-zero emissions, where any greenhouse gases still entering the atmosphere are balanced by those being removed — and to do it as soon as we can.

    Scientists say it is possible to swiftly transform the ways we produce and consume energy. To show the way forward, researchers have set out paths toward a world where human activities generate little to no carbon dioxide and other greenhouse gases — a decarbonized economy.

    The key to a decarbonized future lies in producing vast amounts of new electricity from sources that emit little to none of the gases, such as wind, solar and hydropower, and then transforming as much of our lives and our industries as possible to run off those sources. Clean electricity needs to power not only the planet’s current energy use but also the increased demands of a growing global population.

    Once humankind has switched nearly entirely to clean electricity, we will also have to counterbalance the carbon dioxide we still emit — yes, we will still emit some — by pulling an equivalent amount of carbon dioxide out of the atmosphere and storing it somewhere permanently.

    Achieving net-zero emissions won’t be easy. Getting to effective and meaningful action on climate change requires overcoming decades of inertia and denial about the scope and magnitude of the problem. Nations are falling well short of existing pledges to reduce emissions, and global warming remains on track to charge past 1.5 degrees perhaps even by the end of this decade.

    Yet there is hope. The rate of growth in CO2 emissions is slowing globally — down from 3 percent annual growth in the 2000s to half a percent annual growth in the last decade, according to the Global Carbon Project, which quantifies greenhouse gas emissions.

    There are signs annual emissions could start shrinking. And over the last two years, the United States, by far the biggest cumulative contributor to global warming, has passed several pieces of federal legislation that include financial incentives to accelerate the transition to clean energy. “We’ve never seen anything at this scale,” says Erin Mayfield, an energy researcher at Dartmouth College.

    Though the energy transition will require many new technologies, such as innovative ways to permanently remove carbon from the atmosphere, many of the solutions, such as wind and solar power, are in hand — “stuff we already have,” Mayfield says.

    The current state of carbon dioxide emissions

    Of all the emissions that need to be slashed, the most important is carbon dioxide, which comes from many sources such as cars and trucks and coal-burning power plants. The gas accounted for 79 percent of U.S. greenhouse gas emissions in 2020. The next most significant greenhouse gas, at 11 percent of emissions in the United States, is methane, which comes from oil and gas operations as well as livestock, landfills and other land uses.

    The amount of methane may seem small, but it is mighty — over the short term, methane is more than 80 times as efficient at trapping heat as carbon dioxide is, and methane’s atmospheric levels have nearly tripled in the last two centuries. Other greenhouse gases include nitrous oxide, which comes from sources such as applying fertilizer to crops or burning fuels and accounts for 7 percent of U.S. emissions, and human-made fluorinated gases such as hydrofluorocarbons, which account for 3 percent.
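    The “more than 80 times” comparison is the basis of the standard bookkeeping that converts other gases into CO2-equivalents. A small sketch (the global warming potential values are round numbers in line with recent IPCC assessments; exact figures vary by report and time horizon):

    ```python
    # Converting methane emissions to CO2-equivalents (CO2e). Methane is
    # potent but short-lived, so its multiplier depends on the time horizon.
    GWP = {
        "CO2": {"20yr": 1, "100yr": 1},
        "CH4": {"20yr": 83, "100yr": 30},  # approximate IPCC-style values
    }

    def co2_equivalent(tons, gas, horizon="100yr"):
        """Mass of CO2 with the same warming effect over the chosen horizon."""
        return tons * GWP[gas][horizon]

    print(co2_equivalent(1, "CH4", "20yr"))   # 83 tons CO2e (the ">80x" claim)
    print(co2_equivalent(1, "CH4", "100yr"))  # 30 tons CO2e
    ```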

    Globally, emissions are dominated by large nations that produce lots of energy. The United States alone emits around 5 billion metric tons of carbon dioxide each year. It has emitted more greenhouse gases over history than any other country and ceded the spot as top annual emitter to China only in the mid-2000s. India ranks third.

    Because of the United States’ outsized role in producing the carbon pollution emitted to date, many researchers and advocates argue that it has a moral responsibility to take the global lead on cutting emissions. And the United States has the most ambitious goals of the major emitters, at least on paper. President Joe Biden has said the country is aiming to reach net-zero emissions by 2050. Leaders in China and India have set net-zero goals of 2060 and 2070, respectively.

    Under the auspices of a 2015 international climate change treaty known as the Paris agreement, 193 nations plus the European Union have pledged to reduce their emissions. The agreement aims to keep global warming well below 2 degrees, and ideally to 1.5 degrees, above preindustrial levels. But it is insufficient. Even if all countries cut their emissions as much as they have promised under the Paris agreement, the world would likely blow past 2 degrees of warming before the end of this century. 

    Every nation continues to find its own path forward. “At the end of the day, all the solutions are going to be country-specific,” says Sha Yu, an earth scientist at the Pacific Northwest National Laboratory and University of Maryland’s Joint Global Change Research Institute in College Park, Md. “There’s not a universal fix.”

    But there are some common themes for how to accomplish this energy transition — ways to focus our efforts on the things that will matter most. These are efforts that go beyond individual consumer choices such as whether to fly less or eat less meat. They instead penetrate every aspect of how society produces and consumes energy.

    Such massive changes will need to overcome a lot of resistance, including from companies that make money off old forms of energy as well as politicians and lobbyists. But if society can make these changes, it will rank as one of humanity’s greatest accomplishments. We will have tackled a problem of our own making and conquered it.

    Here’s a look at what we’ll need to do.

    Make as much clean electricity as possible

    To meet the need for energy without putting carbon dioxide into the atmosphere, countries would need to dramatically scale up the amount of clean energy they produce. Fortunately, most of that energy would be generated by technologies we already have — renewable sources of energy including wind and solar power.

    “Renewables, far and wide, are the key pillar in any net-zero scenario,” says Mayfield, who worked on an influential 2021 report from Princeton University’s Net-Zero America project, which focused on the U.S. economy.

    The Princeton report envisions wind and solar power production roughly quadrupling by 2030 to get the United States to net-zero emissions by 2050. That would mean building many new solar and wind farms, so many that in the most ambitious scenario, wind turbines would cover an area the size of Arkansas, Iowa, Kansas, Missouri, Nebraska and Oklahoma combined.

    Such a scale-up is only possible because prices to produce renewable energy have plunged. The cost of wind power has dropped nearly 70 percent, and solar power nearly 90 percent, over the last decade in the United States. “That was a game changer that I don’t know if some people were expecting,” Hidalgo-Gonzalez says.

    Globally the price drop in renewables has allowed growth to surge; China, for instance, installed a record 55 gigawatts of solar power capacity in 2021, for a total of 306 gigawatts or nearly 13 percent of the nation’s installed capacity to generate electricity. China is almost certain to have had another record year for solar power installations in 2022.

    Challenges include figuring out ways to store and transmit all that extra electricity, and finding locations to build wind and solar power installations that are acceptable to local communities. Other types of low-carbon power, such as hydropower and nuclear power, which comes with its own public resistance, will also likely play a role going forward.

    Get efficient and go electric

    The drive toward net-zero emissions also requires boosting energy efficiency across industries and electrifying as many aspects of modern life as possible, such as transportation and home heating.

    Some industries are already shifting to more efficient methods of production, such as steelmaking in China that incorporates hydrogen-based furnaces that are much cleaner than coal-fired ones, Yu says. In India, simply closing down the most inefficient coal-burning power plants provides the most bang for the buck, says Shayak Sengupta, an energy and policy expert at the Observer Research Foundation America think tank in Washington, D.C. “The list has been made up,” he says, of the plants that should close first, “and that’s been happening.”

    To achieve net-zero, the United States would need to increase its share of electric heat pumps, which heat houses much more cleanly than gas- or oil-fired appliances, from around 10 percent in 2020 to as much as 80 percent by 2050, according to the Princeton report. Federal subsidies for these sorts of appliances are rolling out in 2023 as part of the new Inflation Reduction Act, legislation that contains a number of climate-related provisions.

    Shifting cars and other vehicles away from burning gasoline to running off of electricity would also lead to significant emissions cuts. In a major 2021 report, the National Academies of Sciences, Engineering and Medicine said that one of the most important moves in decarbonizing the U.S. economy would be having electric vehicles account for half of all new vehicle sales by 2030. That’s not impossible; electric car sales accounted for nearly 6 percent of new sales in the United States in 2022, which is still a low number but nearly double the previous year.

    Make clean fuels

    Some industries, such as manufacturing and transportation, can’t be fully electrified using current technologies — battery-powered airplanes, for instance, will probably never be feasible for long-duration flights. Technologies that still require liquid fuels will need to switch from gas, oil and other fossil fuels to low-carbon or zero-carbon fuels.

    One major player will be fuels extracted from plants and other biomass, which take up carbon dioxide as they grow and release it when they die, making them essentially carbon neutral over their lifetime. To create biofuels, farmers grow crops whose harvest is then processed in conversion facilities into fuels such as hydrogen. Hydrogen, in turn, can be substituted for more carbon-intensive substances in various industrial processes such as making plastics and fertilizers — and maybe even as fuel for airplanes someday.

    In one of the Princeton team’s scenarios, the U.S. Midwest and Southeast would become peppered with biomass conversion plants by 2050, so that fuels can be processed close to where crops are grown. Many of the biomass feedstocks could potentially grow alongside food crops or replace other, nonfood crops.

    Cut methane and other non-CO2 emissions

    Greenhouse gas emissions other than carbon dioxide will also need to be slashed. In the United States, the majority of methane emissions come from livestock, landfills and other agricultural sources, as well as scattered sources such as forest fires and wetlands. But about one-third of U.S. methane emissions come from oil, gas and coal operations. These may be some of the first places that regulators can target for cleanup, especially “super emitters” that can be pinpointed using satellites and other types of remote sensing.

    In 2021, the United States and the European Union unveiled what became a global methane pledge endorsed by 150 countries to reduce emissions. There is, however, no enforcement of it yet. And China, the world’s largest methane emitter, has not signed on.

    Nitrous oxide emissions could be reduced by improving soil management techniques, and fluorinated gases by finding alternatives and improving production and recycling efforts.

    Sop up as much CO2 as possible

    Once emissions have been cut as much as possible, reaching net-zero will mean removing and storing an equivalent amount of carbon to what society still emits.

    One solution already in use is to capture carbon dioxide produced at power plants and other industrial facilities and store it permanently somewhere, such as deep underground. Globally there are around 35 such operations, which collectively draw down around 45 million tons of carbon dioxide annually. About 200 new plants are on the drawing board to be operating by the end of this decade, according to the International Energy Agency.
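    Those figures imply today’s capture fleet is tiny relative to the problem. A rough scale check (the 45 million tons comes from the text above; the global emissions figure of roughly 37 billion tons of fossil CO2 per year is an outside approximation from the Global Carbon Project, used here only for scale):

    ```python
    captured_per_year = 45e6  # tons CO2/yr across ~35 facilities (from the article)
    global_emissions = 37e9   # tons fossil CO2/yr, rough Global Carbon Project figure

    share = captured_per_year / global_emissions
    print(f"current capture covers ~{share:.2%} of annual emissions")  # ~0.12%
    ```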

    The Princeton report envisions carbon capture being added to almost every kind of U.S. industrial plant, from cement production to biomass conversion. Much of the carbon dioxide would be liquefied and piped along more than 100,000 kilometers of new pipelines to deep geologic storage, primarily along the Texas Gulf Coast, where underground reservoirs can trap it permanently. This would be a massive infrastructure effort: building the pipeline network could cost up to $230 billion, including $13 billion just for early buy-in from local communities and permitting.

    Another way to sop up carbon is to get forests and soils to take up more. That could be accomplished by converting crops that are relatively carbon-intensive, such as corn to be used in ethanol, to energy-rich grasses that can be used for more efficient biofuels, or by turning some cropland or pastures back into forest. It’s even possible to sprinkle crushed rock onto croplands, which accelerates natural weathering processes that suck carbon dioxide out of the atmosphere.

    Another way to increase the amount of carbon stored in the land is to reduce the amount of the Amazon rainforest that is cut down each year. “For a few countries like Brazil, preventing deforestation will be the first thing you can do,” Yu says.

    When it comes to climate change, there’s no time to waste

    The Princeton team estimates that the United States would need to invest at least an additional $2.5 trillion over the next 10 years for the country to have a shot at achieving net-zero emissions by 2050. Congress has begun ramping up funding with two large pieces of federal legislation it passed in 2021 and 2022. Those steer more than $1 trillion toward modernizing major parts of the nation’s economy over a decade — including investing in the energy transition to help fight climate change.

    Between now and 2030, solar and wind power, plus increasing energy efficiency, can deliver about half of the emissions reductions needed for this decade, the International Energy Agency estimates. After that, the primary drivers would need to be increasing electrification, carbon capture and storage, and clean fuels such as hydrogen.

    A lot of the technology needed for a future with fewer carbon dioxide emissions is already available. The Ivanpah Solar Electric Generating System in the Mojave Desert, for example, focuses sunlight to generate steam; that steam spins turbines to make electricity.

    The trick is to do all of this without making people’s lives worse. Developing nations need to be able to supply energy for their economies to develop. Communities whose jobs relied on fossil fuels need to have new economic opportunities.

    Julia Haggerty, a geographer at Montana State University in Bozeman who studies communities that are dependent on natural resources, says that those who have money and other resources to support the transition will weather the change better than those who are under-resourced now. “At the landscape of states and regions, it just remains incredibly uneven,” she says.

    The ongoing energy transition also faces unanticipated shocks such as Russia’s invasion of Ukraine, which sent energy prices soaring in Europe, and the COVID-19 pandemic, which initially slashed global emissions but later saw them rebound.

    But the technologies exist for us to wean our lives off fossil fuels. And we have the inventiveness to develop more as needed. Transforming how we produce and use energy, as rapidly as possible, is a tremendous challenge — but one that we can meet head-on. For Mayfield, getting to net-zero by 2050 is a realistic goal for the United States. “I think it’s possible,” she says. “But it doesn’t mean there’s not a lot more work to be done.”

  • Quantum physicists make major nanoscopic advance

    In a new breakthrough, researchers at the University of Copenhagen, in collaboration with Ruhr University Bochum, have solved a problem that has caused quantum researchers headaches for years. The researchers can now control two quantum light sources rather than one. Trivial as it may seem to those uninitiated in quantum physics, this colossal breakthrough allows researchers to create the phenomenon known as quantum mechanical entanglement. This, in turn, opens new doors for companies and others to exploit the technology commercially.
    Going from one to two is a minor feat in most contexts. But in the world of quantum physics, doing so is crucial. For years, researchers around the world have strived to develop stable quantum light sources and achieve the phenomenon known as quantum mechanical entanglement — a phenomenon with nearly sci-fi-like properties, in which two light sources can affect each other instantly, potentially across large geographic distances. Entanglement is the very basis of quantum networks and central to the development of an efficient quantum computer.
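    Entanglement’s defining signature — perfectly correlated measurement outcomes — can be illustrated in a few lines of Python (textbook quantum mechanics on the simplest entangled state, not a model of the experiment described below):

    ```python
    import numpy as np

    # Two qubits in the Bell state (|00> + |11>)/sqrt(2). Measuring one qubit
    # immediately fixes what the other will show, however far apart they are.
    bell = np.zeros(4)
    bell[0] = bell[3] = 1 / np.sqrt(2)  # amplitudes for |00> and |11>

    rng = np.random.default_rng(1)
    probs = bell ** 2                   # measurement probabilities per outcome
    outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)

    print(outcomes)  # only "00" and "11" ever occur: perfectly correlated
                     # results, the signature of an entangled pair
    ```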
    Today, researchers from the Niels Bohr Institute published a new result in the journal Science, in which they succeeded in doing just that. According to Professor Peter Lodahl, one of the researchers behind the result, it is a crucial step in the effort to take the development of quantum technology to the next level and to “quantize” society’s computers, encryption and the internet.
    “We can now control two quantum light sources and connect them to each other. It might not sound like much, but it’s a major advancement and builds upon the past 20 years of work. By doing so, we’ve revealed the key to scaling up the technology, which is crucial for the most ground-breaking of quantum hardware applications,” says Professor Peter Lodahl, who has conducted research in the area since 2001.
    The magic all happens in a so-called nanochip — which is not much larger than the diameter of a human hair — that the researchers also developed in recent years.
    Quantum sources overtake the world’s most powerful computer
    Peter Lodahl’s group is working with a type of quantum technology that uses light particles, called photons, as micro transporters to move quantum information about.

    While Lodahl’s group is a leader in this discipline of quantum physics, they have only been able to control one light source at a time until now. This is because light sources are extraordinarily sensitive to outside “noise,” making them very difficult to copy. In their new result, the research group succeeded in creating two identical quantum light sources rather than just one.
    “Entanglement means that by controlling one light source, you immediately affect the other. This makes it possible to create a whole network of entangled quantum light sources, all of which interact with one another, and which you can get to perform quantum bit operations in the same way as bits in a regular computer, only much more powerfully,” explains postdoc Alexey Tiranov, the article’s lead author.
    This is because a quantum bit can be both a 1 and 0 at the same time, which results in processing power that is unattainable using today’s computer technology. According to Professor Lodahl, just 100 photons emitted from a single quantum light source will contain more information than the world’s largest supercomputer can process.
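    Lodahl’s claim about 100 photons can be sanity-checked with simple arithmetic: an n-qubit state is described by 2^n complex amplitudes, and merely storing them classically quickly outgrows any supercomputer (the 16 bytes per amplitude below is an assumption — two 8-byte floats):

    ```python
    def state_bytes(n_qubits, bytes_per_amplitude=16):
        """Memory needed to store a full n-qubit state vector classically."""
        return 2 ** n_qubits * bytes_per_amplitude

    for n in (30, 50, 100):
        print(f"{n:3d} qubits -> {state_bytes(n):.2e} bytes")

    #  30 qubits -> ~1.7e+10 bytes (17 GB: a laptop already struggles)
    #  50 qubits -> ~1.8e+16 bytes (18 petabytes: supercomputer territory)
    # 100 qubits -> ~2.0e+31 bytes (far beyond anything buildable)
    ```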
    By using 20-30 entangled quantum light sources, there is the potential to build a universal error-corrected quantum computer — the ultimate “holy grail” of quantum technology, which large IT companies are now pumping many billions into.
    Other actors will build upon the research
    According to Lodahl, the biggest challenge has been to go from controlling one to two quantum light sources. Among other things, this has made it necessary for researchers to develop extremely quiet nanochips and have precise control over each light source.

    With the new research breakthrough, the fundamental quantum physics research is now in place. Now it is time for other actors to take the researchers’ work and use it in their quests to deploy quantum physics in a range of technologies including computers, the internet and encryption.
    “It is too expensive for a university to build a setup where we control 15-20 quantum light sources. So, now that we have contributed to understanding the fundamental quantum physics and taken the first step along the way, scaling up further is very much a technological task,” says Professor Lodahl.
    The research was conducted at the Danish National Research Foundation’s “Center of Excellence for Hybrid Quantum Networks (Hy-Q)” and is a collaboration between Ruhr University Bochum in Germany and the University of Copenhagen’s Niels Bohr Institute.

  • New AI tool makes speedy gene-editing possible

    An artificial intelligence program may enable the first simple production of customizable proteins called zinc fingers to treat diseases by turning genes on and off.
    The researchers at NYU Grossman School of Medicine and the University of Toronto who designed the tool say it promises to accelerate the development of gene therapies on a large scale.
    Illnesses including cystic fibrosis, Tay-Sachs disease, and sickle cell anemia are caused by errors in the order of DNA letters that encode the operating instructions for every human cell. Scientists can in some cases correct these mistakes with gene editing methods that rearrange these letters.
    Other conditions are caused, not by a mistake in the code itself, but by problems in how the cellular machinery reads DNA (epigenetics). A gene, which provides the recipe for a particular protein, often partners with molecules called transcription factors that tell the cell how much of that protein to make. When this process goes awry, over- or underactive genes contribute to diabetes, cancer, and neurological disorders. As a result, researchers have been exploring ways to restore normal epigenetic activity.
    One such technique is zinc-finger editing, which can both change and control genes. Among the most abundant protein structures in the human body, zinc fingers can guide DNA repair by grabbing onto scissor-like enzymes and directing them to cut faulty segments out of the code.
    Similarly, zinc fingers can also hook onto transcription factors and pull them toward a gene segment in need of regulation. By customizing these instructions, genetic engineers can tailor any gene’s activity. A drawback, however, is that artificial zinc fingers are challenging to design for a specific task. Since these proteins attach to DNA in complex groups, researchers would need to be able to tell — out of countless possible combinations — how every zinc finger interacts with its neighbor for each desired genetic change.
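    To make the combinatorial difficulty concrete, here is a toy sketch of the design problem in Python — the candidate pool, the scoring function and the exhaustive search are all hypothetical stand-ins for illustration, not ZFDesign’s actual model or data:

    ```python
    from itertools import product

    candidates = ["ZF_a", "ZF_b", "ZF_c"]    # hypothetical pool of finger designs
    target_triplets = ["GCT", "GGA", "TGC"]  # target site, ~3 DNA bases per finger

    def score(finger, prev_finger, triplet):
        """Stand-in for a learned compatibility score (higher = better). A real
        model must capture how each finger's fit depends on its neighbor."""
        key = f"{finger}|{prev_finger}|{triplet}"
        return sum(ord(c) for c in key) % 100 / 100

    # Exhaustive search works for 3 fingers and 3 candidates (27 chains), but
    # the space explodes for realistic pools -- hence a model-guided design tool.
    best_chain, best_score = None, float("-inf")
    for chain in product(candidates, repeat=len(target_triplets)):
        total = sum(
            score(finger, chain[i - 1] if i > 0 else None, triplet)
            for i, (finger, triplet) in enumerate(zip(chain, target_triplets))
        )
        if total > best_score:
            best_chain, best_score = chain, total

    print(best_chain, round(best_score, 2))
    ```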

    The study authors’ new technology, called ZFDesign, overcomes this obstacle by using artificial intelligence (AI) to model and design these interactions. The model is based on data generated by a screen of nearly 50 billion possible zinc finger–DNA interactions in the researchers’ labs. A report on the tool was published online Jan. 26 in the journal Nature Biotechnology.
    “Our program can identify the right grouping of zinc fingers for any modification, making this type of gene editing faster than ever before,” says study lead author David Ichikawa, PhD, a former graduate student at NYU Langone Health.
    Ichikawa notes that zinc-finger editing offers a potentially safer alternative to CRISPR, a key gene-editing technology with applications that range from finding new ways to kill cancer cells to designing more nourishing crops. Unlike the entirely human-derived zinc fingers, CRISPR, which stands for clustered regularly interspaced short palindromic repeats, relies on bacterial proteins to interact with genetic code. These “foreign” proteins could trigger patients’ immune defense systems, which may attack them like any other infection and lead to dangerous inflammation.
    The study authors add that besides posing a lower immune risk, the small size of zinc-finger tools may also provide more flexible gene therapy techniques compared with CRISPR by enabling more ways to deliver the tools to the right cells in patients.
    “By speeding up zinc-finger design coupled with their smaller size, our system paves the way for using these proteins to control multiple genes at the same time,” says study senior author Marcus Noyes, PhD. “In the future, this approach may help correct diseases that have multiple genetic causes, such as heart disease, obesity, and many cases of autism.”
    To test the computer’s AI design code, Noyes and his team used a customized zinc finger to disrupt the coding sequence of a gene in human cells. In addition, they built several zinc fingers that successfully reprogrammed transcription factors to bind near a target gene sequence and turn up or down its expression, demonstrating that their technology can be used for epigenetic changes.

    Noyes, an assistant professor in the Department of Biochemistry and Molecular Pharmacology at NYU Langone, cautions that, while promising, zinc fingers can be difficult to control. Since they are not always specific to a single gene, some combinations can affect DNA sequences beyond a particular target, leading to unintended changes in genetic code.
    As a result, Noyes says the team next plans to refine their AI program so it can build more precise zinc-finger groupings that only prompt the desired edit. Noyes is also a member of NYU Langone’s Institute for System Genetics.
    Funding for the study was provided by National Institutes of Health grants R01GM118851 and R01GM133936. Further funding was provided by Canadian Institutes of Health Research Project grant PJT-159750, the Compute Canada Resource Allocation, the Frederick Banting and Charles Best Canada Graduate Scholarship, and the Ontario Graduate Scholarship.
    Noyes is a co-founder of TBG Therapeutics, a company that develops methods to design zinc fingers and apply them to treatments for diseases with genetic components. NYU Langone has patents pending (PCT/US21/30267, 63145929) for these tools and approaches, from which both Noyes and NYU Langone may benefit financially. The terms and conditions of these relationships are being managed in accordance with the policies of NYU Langone.
    In addition to Noyes, other NYU investigators involved in the study were Manjunatha Kogenaru, PhD; April Mueller, BS; David Giganti, PhD; Gregory Goldberg, PhD; Samantha Adams, PhD; Jeffrey Spencer, PhD; Courtney Gianco; Finnegan Clark, BS; and Timothee Lionnet, PhD. Other study investigators included Osama Abdin, BS; Nader Alerasool, PhD; Han Wen, MS; Rozita Razavi, PhD, MPH; Satra Nim, PhD; Hong Zheng, PhD; Mikko Taipale, PhD; and Philip Kim, PhD, at the University of Toronto. Study lead author David Ichikawa is at the Pandemic Response Lab in Long Island City, N.Y.