More stories

  • New study warns: We have underestimated the pace at which the Arctic is melting

    The Arctic Ocean between Canada, Russia and Europe is warming faster than researchers’ climate models have been able to predict.
    Over the past 40 years, temperatures have risen by one degree every decade, and even more so over the Barents Sea and around Norway’s Svalbard archipelago, where they have increased by 1.5 degrees per decade throughout the period.
    This is the conclusion of a new study published in Nature Climate Change.
    “Our analyses of Arctic Ocean conditions demonstrate that we have been clearly underestimating the rate of temperature increases in the atmosphere nearest to sea level, which has ultimately caused sea ice to disappear faster than we had anticipated,” explains Jens Hesselbjerg Christensen, a professor at the University of Copenhagen’s Niels Bohr Institute (NBI) and one of the study’s researchers.
    Together with his NBI colleagues and researchers from the Universities of Bergen and Oslo, the Danish Meteorological Institute and the Australian National University, he compared current temperature changes in the Arctic with the climate fluctuations known from, for example, Greenland during the last ice age, 120,000 to 11,000 years ago.
    “The abrupt rise in temperature now being experienced in the Arctic has only been observed during the last ice age. During that time, analyses of ice cores revealed that temperatures over the Greenland Ice Sheet increased several times, by 10 to 12 degrees, over a 40-to-100-year period,” explains Jens Hesselbjerg Christensen.
    He emphasizes that the significance of this steep rise in temperature has yet to be fully appreciated, and that an increased focus on the Arctic, along with reduced global warming more generally, is a must.
    Climate models ought to take abrupt changes into account
    Until now, climate models predicted that Arctic temperatures would increase slowly and in a stable manner. However, the researchers’ analysis demonstrates that these changes are moving along at a much faster pace than expected.
    “We have looked at the climate models analysed and assessed by the UN Climate Panel. Only those models based on the worst-case scenario, with the highest carbon dioxide emissions, come close to what our temperature measurements show over the past 40 years, from 1979 to today,” says Jens Hesselbjerg Christensen.
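    The per-decade figures quoted here are, in essence, linear trends fitted to roughly 40 years of near-surface temperature data. A minimal sketch of how such a rate is estimated, using synthetic numbers rather than the study’s observations:

    # Illustrative only: estimate a per-decade warming trend from an annual
    # temperature series. The series below is synthetic (an assumed 1.5 C/decade
    # trend plus noise), not real observations.
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1979, 2020)                      # 1979-2019, roughly 40 years
    assumed_trend_per_year = 0.15                      # 1.5 C per decade (assumed)
    anomalies = assumed_trend_per_year * (years - years[0]) + rng.normal(0, 0.5, years.size)

    # Ordinary least-squares fit: slope in C/year, converted to C/decade
    slope, intercept = np.polyfit(years, anomalies, 1)
    print(f"Estimated trend: {10 * slope:.2f} C per decade")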
    In the future, more focus should be placed on simulating the impact of abrupt climate change on the Arctic. Doing so will allow researchers to create better models that can accurately predict temperature increases:
    “Changes are occurring so rapidly during the summer months that sea ice is likely to disappear faster than most climate models have ever predicted. We must continue to closely monitor temperature changes and incorporate the right climate processes into these models,” says Jens Hesselbjerg Christensen. He concludes:
    “Thus, successfully implementing the necessary reductions in greenhouse gas emissions to meet the Paris Agreement is essential in order to ensure a sea-ice packed Arctic year-round.”

    Story Source:
    Materials provided by University of Copenhagen. Note: Content may be edited for style and length.

  • NBA playoff format is optimizing competitive balance by eliminating travel

    In addition to helping protect players from COVID-19, the NBA “bubble” in Orlando may be a competitive equalizer by eliminating team travel. Researchers analyzing the results of nearly 500 NBA playoff games over six seasons found that a team’s direction of travel and the number of time zones crossed were associated with its predicted win probability and actual game performance.
    Preliminary results of the study suggest that the 2020 NBA playoffs, which begin Aug. 17, will eliminate any advantages or disadvantages related to long-distance travel. In this year’s unique playoff format, implemented due to the COVID-19 pandemic, all 16 teams will stay in Orlando, Florida, and compete at the ESPN Wide World of Sports Complex in Walt Disney World.
    The study found that scoring was significantly higher following eastward travel. Although there were no differences in actual game outcomes based on overall direction of travel, there were differences when considering both the direction and magnitude of travel. Teams that traveled east with three-hour time zone changes had higher predicted probabilities of winning than teams that traveled west or played in the same time zone. In contrast, teams that traveled west across three time zones had lower predicted win probabilities than teams that traveled east or played in the same time zone.
    “During this initial study, it was interesting to find that team scoring improved during general eastward travel compared to westward travel and travel in the same zone, but game outcomes were unaffected by direction of travel during the playoffs,” said lead author Sean Pradhan, assistant professor of sports management and business analytics in the School of Business Administration at Menlo College in Atherton, California. “However, when considering the magnitude of travel across different time zones, we found that teams had predicted probabilities of winning that were lower after traveling three time zones westward, and tended to actually lose more games when traveling two time zones westward compared to most other types of travel.”
    Circadian rhythms are endogenous, near-24-hour biological rhythms that exist in all living organisms, and these daily rhythms have peaks and troughs in both alertness and sleepiness that can impact individuals in high-performance professions. Therefore, an athlete has a greater opportunity for optimal performance when the timing of an activity is synchronized with the body’s circadian clock.
    Researchers from Menlo College and other collaborators reviewed data from 499 NBA playoff games from the 2013-2014 through 2018-2019 seasons. They looked at the impact of direction of travel and time zones traveled on actual game outcomes, team quality, predicted win probability, and team scoring for visiting teams.
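    One simple way to sketch this kind of analysis is a logistic regression of game outcome on travel variables. The example below is an illustration on synthetic data; the variables, effect sizes and model form are assumptions, not the study’s actual methodology:

    # Sketch of relating travel to game outcome with logistic regression.
    # Synthetic data; the real study used 499 playoff games and richer covariates.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 499
    # time zones crossed by the visiting team: negative = westward, positive = eastward
    tz_shift = rng.integers(-3, 4, size=n)
    quality_diff = rng.normal(0, 1, size=n)            # visitor minus home strength (synthetic)

    # Assumed data-generating process: quality dominates, eastward travel helps slightly
    logit = 0.9 * quality_diff + 0.15 * tz_shift - 0.3
    visitor_won = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = np.column_stack([quality_diff, tz_shift])
    model = LogisticRegression().fit(X, visitor_won)
    print("coefficients (quality, time-zone shift):", model.coef_[0])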
    “A great deal of prior work has examined the effects of travel and circadian advantages on team performance during the regular season of various professional sports leagues,” said Pradhan. “The current study extends such findings of previous research by examining team performance in the NBA playoffs, which is obviously an extremely crucial time for teams competing.”

    Story Source:
    Materials provided by American Academy of Sleep Medicine. Note: Content may be edited for style and length.

  • Revised code could help improve efficiency of fusion experiments

    An international team of researchers led by the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) has upgraded a key computer code for calculating forces acting on magnetically confined plasma in fusion energy experiments. The upgrade will be part of a suite of computational tools that will allow scientists to further improve the design of breakfast-cruller-shaped facilities known as stellarators. Together, the three codes in the suite could help scientists bring efficient fusion reactors closer to reality.
    The revised software lets researchers more easily determine the boundary of plasma in stellarators. When used in concert with two other codes, the code could help find a stellarator configuration that improves the performance of the design. The two complementary codes determine the optimal location for the plasma in a stellarator vacuum chamber to maximize the efficiency of the fusion reactions, and determine the shape that the external electromagnets must have to hold the plasma in the proper position.
    The revised software, called the “free-boundary stepped-pressure equilibrium code (SPEC),” is one of a set of tools scientists can use to tweak the performance of plasma to more easily create fusion energy. “We want to optimize both the plasma position and the magnetic coils to balance the force that makes the plasma expand while holding it in place,” said Stuart Hudson, physicist, deputy head of the Theory Department at PPPL and lead author of the paper reporting the results in Plasma Physics and Controlled Fusion.
    “That way we can create a stable plasma whose particles are more likely to fuse. The updated SPEC code enables us to know where the plasma will be for a given set of magnetic coils.”
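    In rough terms, the balance Hudson describes is between the plasma’s thermal pressure pushing outward and the magnetic pressure confining it: at an idealized plasma-vacuum interface the total pressure, p + B²/(2μ0), must match on both sides. A small numerical illustration with assumed values:

    # Back-of-the-envelope force balance at a plasma-vacuum interface:
    #   p_in + B_in**2 / (2*mu0)  =  B_out**2 / (2*mu0)
    # Numbers below are assumed for illustration only.
    import math

    mu0 = 4e-7 * math.pi          # vacuum permeability (T*m/A)
    p_in = 1.0e5                  # plasma pressure inside, Pa (assumed)
    B_in = 1.0                    # magnetic field inside, T (assumed)

    B_out = math.sqrt(B_in**2 + 2 * mu0 * p_in)
    print(f"External field needed for balance: {B_out:.3f} T")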
    Fusion combines light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — and in the process generates massive amounts of energy in the sun and stars. Scientists are seeking to replicate fusion in devices on Earth for a virtually inexhaustible supply of safe and clean power to generate electricity.
    Plasma stability is crucial for fusion. If plasma bounces around inside a stellarator, it can escape, cool, and tamp down the fusion reactions, in effect quenching the fusion fire. An earlier version of the code, also developed by Hudson, could only calculate how forces were affecting a plasma if the researchers already knew the plasma’s location. Researchers, however, typically don’t have that information. “That’s one of the problems with plasmas,” Hudson said. “They move all over the place.”
    The new version of the SPEC code helps solve the problem by allowing researchers to calculate the plasma’s boundary without knowing its position beforehand. Used in coordination with a coil-design code called FOCUS and an optimization code called STELLOPT — both of which were also developed at PPPL — SPEC lets physicists simultaneously ensure that the plasma will have the best fusion performance and the magnets will not be too complicated to build. “There’s no point optimizing the shape of the plasma and then later finding out that the magnets would be incredibly difficult to construct,” Hudson said.
    One challenge that Hudson and colleagues faced was verifying that each step of the code upgrade was done correctly. Their slow-and-steady approach was crucial to making sure that the code makes accurate calculations. “Let’s say you are designing a component that will go on a rocket to the moon,” Hudson said. “It’s very important that that part works. So you test and test and test.”
    Updating any computer code calls for a number of interlocking steps:
    First, scientists must translate a set of mathematical equations describing the plasma into a programming language that a computer can understand;
    Next, scientists must determine the mathematical steps needed to solve the equations;
    Finally, the scientists must verify that the code produces correct results, either by comparing the results with those produced by a code that has already been verified or using the code to solve simple equations whose answers are easy to check.
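    As a toy illustration of that final verification step (and not of SPEC itself), one can integrate a simple equation with a standard solver and compare the result against its known analytic solution:

    # Toy verification: solve a simple equation numerically and compare against
    # the known analytic answer, mirroring the last step above. The harmonic
    # oscillator stands in for the real equilibrium equations.
    import numpy as np
    from scipy.integrate import solve_ivp

    def oscillator(t, y):
        x, v = y
        return [v, -x]                      # x'' = -x, analytic solution x = cos(t)

    sol = solve_ivp(oscillator, (0.0, 10.0), [1.0, 0.0],
                    rtol=1e-9, atol=1e-12, dense_output=True)
    t = np.linspace(0.0, 10.0, 200)
    max_error = np.max(np.abs(sol.sol(t)[0] - np.cos(t)))
    print(f"max deviation from analytic solution: {max_error:.2e}")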
    Hudson and colleagues performed the calculations with widely different methods. They used pencil and paper to determine the equations and solution steps, and powerful PPPL computers to verify the results. “We demonstrated that the code works,” Hudson said. “Now it can be used to study current experiments and design new ones.”
    Collaborators on the paper include researchers at the Max Planck Institute for Plasma Physics, the Australian National University, and the Swiss École Polytechnique Fédérale de Lausanne. The research was supported by the DOE’s Office of Science (Fusion Energy Sciences), the Euratom research and training program, the Australian Research Council, and the Simons Foundation.

    Story Source:
    Materials provided by DOE/Princeton Plasma Physics Laboratory. Original written by Raphael Rosen. Note: Content may be edited for style and length.

  • Virtual imaging trials optimize CT, radiography for COVID-19

    An open-access article in ARRS’ American Journal of Roentgenology (AJR) established a foundation for the use of virtual imaging trials in effective assessment and optimization of CT and radiography acquisitions and analysis tools to help manage the coronavirus disease (COVID-19) pandemic.
    Virtual imaging trials have two main components: representative models of targeted subjects and realistic models of imaging scanners. The authors of this AJR article developed the first computational models of patients with COVID-19 and showed, as proof of principle, how they can be combined with imaging simulators for COVID-19 imaging studies.
    “For the body habitus of the models,” lead author Ehsan Abadi explained, “we used the 4D extended cardiac-torso (XCAT) model that was developed at Duke University.”
    Abadi and his Duke colleagues then segmented the morphologic features of COVID-19 abnormalities from 20 CT images of patients with multidiagnostic confirmation of SARS-CoV-2 infection and incorporated them into XCAT models.
    “Within a given disease area, the texture and material of the lung parenchyma in the XCAT were modified to match the properties observed in the clinical images,” Abadi et al. continued.
    Using a specific CT scanner (Definition Flash, Siemens Healthineers) and a validated radiography simulator (DukeSim) to help illustrate utility, the team virtually imaged three of the COVID-19 computational phantoms they had developed.
    “Subjectively,” the authors concluded, “the simulated abnormalities were realistic in terms of shape and texture,” adding their preliminary results showed that the contrast-to-noise ratios in the abnormal regions were 1.6, 3.0, and 3.6 for 5-, 25-, and 50-mAs images, respectively.
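    The contrast-to-noise ratio here presumably follows its usual definition: the difference between the mean signal in the abnormal region and in the surrounding background, divided by the background noise. A minimal sketch with synthetic pixel values rather than the study’s images:

    # Contrast-to-noise ratio (CNR) in its usual form:
    #   CNR = |mean(lesion) - mean(background)| / std(background)
    # Synthetic pixel values in Hounsfield units; not the study's data.
    import numpy as np

    rng = np.random.default_rng(2)
    background = rng.normal(-800, 40, size=10_000)   # aerated lung (assumed values)
    lesion = rng.normal(-650, 40, size=2_000)        # ground-glass-like region (assumed)

    cnr = abs(lesion.mean() - background.mean()) / background.std()
    print(f"CNR ~ {cnr:.1f}")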

    Story Source:
    Materials provided by American Roentgen Ray Society. Note: Content may be edited for style and length.

  • Building mechanical memory boards using origami

    The ancient Japanese art of paper folding, known as origami, can be used to create mechanical, binary switches.
    In Applied Physics Letters, by AIP Publishing, researchers report the fabrication of such a paper device using a particular origami pattern known as the Kresling pattern. This device can act as a mechanical switch.
    By putting several of these together on a single platform, the investigators built a functioning mechanical memory board.
    Origami structures can be either rigid or nonrigid. For the first type, only the creases between panels of paper can deform, but the panels stay fixed. In nonrigid origami, however, the panels themselves can deform.
    The Kresling pattern is an example of nonrigid origami. Folding a piece of paper using this pattern generates a bellows-like structure that can flip between one orientation and another. The bellows act as a type of spring and can be controlled by vibrating the platform that holds them. This creates a switch, which the investigators refer to as a Kresling-inspired mechanical switch, or KIMS.
    The researchers found that oscillating a platform holding the KIMS up and down at a certain speed will cause it to flip, or switch, between its two stable states. They used an electrodynamic shaker to provide controlled movements of the base and monitored the upper surface of the KIMS using a laser. In this way, they were able to map out and analyze the basic physics that underlies the switching behavior.
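    Very roughly, this behavior can be sketched as a driven double-well (bistable) oscillator: below a certain drive amplitude the structure stays in its current state, and above it the state can flip. The potential and parameters below are arbitrary assumptions for illustration, not measurements of an actual KIMS:

    # Minimal bistable ("double-well") oscillator under harmonic base excitation,
    # a rough stand-in for the Kresling switch: x near -1 and x near +1 are the
    # two stable states, and a strong enough drive lets the state flip.
    import numpy as np
    from scipy.integrate import solve_ivp

    def kims_like(t, y, damping, amp, freq):
        x, v = y
        force = -damping * v + x - x**3 + amp * np.cos(freq * t)   # double-well + drive
        return [v, force]

    for amp in (0.1, 0.4):                               # weak vs strong excitation (assumed)
        sol = solve_ivp(kims_like, (0, 200), [-1.0, 0.0],
                        args=(0.25, amp, 1.0), max_step=0.05)
        switched = np.any(sol.y[0] > 0.5)                # did it ever reach the other well?
        print(f"drive amplitude {amp}: switched = {switched}")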
    “We used the Kresling origami pattern to also develop a cluster of mechanical binary switches,” author Ravindra Masana said. “These can be forced to transition between two different static states using a single controlled input in the form of a harmonic excitation applied at the base of the switch.”
    The group first considered a 2-bit memory board created by placing two KIMS units on a single platform. Because each KIMS bit has two stable states, four distinct states identified as S00, S01, S10 and S11 can be obtained. Oscillations of the platform will cause switching between these four stable states. This proof of concept with just two bits could be extended to multiple KIMS units, creating a type of mechanical memory.
    “Such switches can be miniaturized,” said Mohammed Daqaq, one of the authors and the director of the Laboratory of Applied Nonlinear Dynamics at NYU Abu Dhabi. “Instead of using a bulky electrodynamic shaker for actuation, the memory board can then be actuated using scalable piezoelectric and graphene actuators.”
    Miniaturized origami memory boards should have wide applicability and hold great promise for future device development.

    Story Source:
    Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

  • Computer modeling used to predict reef health

    A UBC Okanagan researcher has developed a way to predict the future health of the planet’s coral reefs.
    Working with scientists from Australia’s Flinders University and privately owned research firm Nova Blue Environment, biology doctoral student Bruno Carturan has been studying the ecosystems of the world’s endangered reefs.
    “Coral reefs are among the most diverse ecosystems on Earth and they support the livelihoods of more than 500 million people,” says Carturan. “But coral reefs are also in peril. About 75 per cent of the world’s coral reefs are threatened by habitat loss, climate change and other human-caused disturbances.”
    Carturan, who studies resilience, biodiversity and complex systems under UBCO Professors Lael Parrott and Jason Pither, says nearly all the world’s reefs will be dangerously affected by 2050 if no effective measures are taken.
    There is hope, however, as he has determined a way to examine the reefs and explore why some reef ecosystems appear to be more resilient than others. Uncovering why, he says, could help stem the losses.
    “In other ecosystems, including forests and wetlands, experiments have shown that diversity is key to resilience,” says Carturan. “With more species, comes a greater variety of form and function — what ecologists call traits. And with this, there is a greater likelihood that some particular traits, or combination of traits, help the ecosystem better withstand and bounce back from disturbances.”
    The importance of diversity for the health and stability of ecosystems has been extensively investigated by ecologists, he explains. While the consensus is that ecosystems with more diversity are more resilient and function better, the hypothesis has rarely been tested experimentally with corals.
    Using an experiment to recreate the conditions found in real coral reefs is challenging for several reasons — one being that the required size, timeframe and number of different samples and replicates are just unmanageable.
    That’s where computer simulation modelling comes in.
    “Technically called an ‘agent-based model’, it can be thought of as a virtual experimental arena that enables us to manipulate species and different types of disturbances, and then examine their different influences on resilience in ways that are just not feasible in real reefs,” explains Carturan.
    In his simulation arena, individual coral colonies and algae grow, compete with one another, reproduce and die. And they do all this in realistic ways. By using agent-based models — with data collected by many researchers over decades — scientists can manipulate the initial diversity of corals, including their number and identity, and see how the virtual reef communities respond to threats.
    “This is crucial because these traits are the building blocks that give rise to ecosystem structure and function. For instance, corals come in a variety of forms — from simple spheres to complex branching — and this influences the variety of fish species these reefs host, and their susceptibility to disturbances such as cyclones and coral bleaching.”
    By running simulations over and over again, the model can identify combinations that can provide the greatest resilience. This will help ecologists design reef management and restoration strategies using predictions from the model, says collaborating Flinders researcher Professor Corey Bradshaw.
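    A heavily simplified sketch of such an agent-based loop is shown below: colonies on a grid grow into empty cells, die during disturbances, and recover at rates set by their traits. The trait values and rules are invented for illustration; the published model is far richer:

    # Toy agent-based reef: two coral "species" with assumed traits compete for
    # space on a grid, with one bleaching-like disturbance partway through.
    import numpy as np

    rng = np.random.default_rng(3)
    TRAITS = {1: {"growth": 0.30, "mortality": 0.10},   # e.g. fast-growing branching (assumed)
              2: {"growth": 0.10, "mortality": 0.03}}   # e.g. slow, robust massive (assumed)

    grid = rng.choice([0, 1, 2], size=(40, 40), p=[0.5, 0.25, 0.25])   # 0 = bare substrate

    def step(grid, disturbance=0.0):
        new = grid.copy()
        for (i, j), sp in np.ndenumerate(grid):
            if sp == 0:
                # empty cell: a random neighbour may colonize it, depending on its growth trait
                ni, nj = (i + rng.integers(-1, 2)) % 40, (j + rng.integers(-1, 2)) % 40
                neighbour = grid[ni, nj]
                if neighbour and rng.random() < TRAITS[neighbour]["growth"]:
                    new[i, j] = neighbour
            elif rng.random() < TRAITS[sp]["mortality"] + disturbance:
                new[i, j] = 0                            # colony dies
        return new

    for year in range(30):
        grid = step(grid, disturbance=0.5 if year == 10 else 0.0)
    print("cover by species after 30 steps:", np.bincount(grid.ravel(), minlength=3)[1:])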
    “Sophisticated models like ours will be useful for coral-reef management around the world,” Bradshaw adds. “For example, Australia’s iconic Great Barrier Reef is in deep trouble from invasive species, climate change-driven mass bleaching and overfishing.”
    “This high-resolution coral ‘video game’ allows us to peek into the future to make the best possible decisions and avoid catastrophes.”
    The research, supported by grants from the Natural Sciences and Engineering Research Council of Canada and the Canada Foundation for Innovation, was published recently in eLife.

  • Faster, more efficient energy storage could stem from holistic study of layered materials

    A team led by the Department of Energy’s Oak Ridge National Laboratory developed a novel, integrated approach to track energy-transporting ions within an ultra-thin material, which could unlock its energy storage potential and lead toward faster-charging, longer-lasting devices.
    Scientists have for a decade studied the energy-storing possibilities of an emerging class of two-dimensional materials — those constructed in layers that are only a few atoms thick — called MXenes, pronounced “max-eens.”
    The ORNL-led team integrated theoretical data from computational modeling with experimental data to pinpoint potential locations of a variety of charged ions in titanium carbide, the most studied MXene phase. Through this holistic approach, they could track and analyze the ions’ motion and behavior from the single-atom to the device scale.
    “By comparing all the methods we employed, we were able to form links between theory and different types of materials characterization, ranging from very simple to very complex over a wide range of length and time scales,” said Nina Balke, ORNL co-author of the published study that was conducted within the Fluid Interface Reactions, Structures and Transport, or FIRST, Center. FIRST is a DOE-funded Energy Frontier Research Center located at ORNL.
    “We pulled all those links together to understand how ion storage works in layered MXene electrodes,” she added. The study’s results allowed the team to predict the material’s capacitance, or its ability to store energy. “And, in the end, after much discussion, we were able to unify all these techniques into one cohesive picture, which was really cool.”
    Layered materials can enhance energy stored and power delivered because the gaps between the layers allow charged particles, or ions, to move freely and quickly. However, ions can be difficult to detect and characterize, especially in a confined environment with multiple processes at play. A better understanding of these processes can advance the energy storage potential of lithium-ion batteries and supercapacitors.
    As a FIRST center project, the team focused on the development of supercapacitors — devices that charge quickly for short-term, high-power energy needs. In contrast, lithium-ion batteries have a higher energy capacity and provide electrical power longer, but the rates of discharge, and therefore their power levels, are lower.
    MXenes have the potential to bridge the benefits of these two concepts, Balke said, toward the overarching goal of fast-charging devices with greater, more efficient energy storage capacity. This would benefit a range of applications from electronics to electric vehicle batteries.
    Using computational modeling, the team simulated the conditions of five different charged ions within the layers confined in an aqueous solution, or “water shell.” The theoretical model is simple, but combined with experimental data, it created a baseline that provided evidence of where the ions within the MXene layers went and how they behaved in a complex environment.
    “One surprising outcome was we could see, within the simulation limits, different behavior for the different ions,” said ORNL theorist and co-author Paul Kent.
    The team hopes their integrated approach can guide scientists toward future MXene studies. “What we developed is a joint model. If we have a little bit of data from an experiment using a certain MXene, and if we knew the capacitance for one ion, we can predict it for the other ones, which is something that we weren’t able to do before,” Kent said.
    “Eventually, we’ll be able to trace those behaviors to more real-world, observable changes in the material’s properties,” he added.
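    The joint-model idea Kent describes, knowing the capacitance for one ion and predicting it for others, can be sketched as fitting a shared relation between a simple ion descriptor and measured capacitance. The descriptor, numbers and linear form below are assumptions for illustration, not the study’s actual model:

    # Illustrative "joint model": fit a shared linear relation between a simple
    # ion descriptor and capacitance for measured ions, then predict another ion.
    import numpy as np

    # hypothetical (descriptor, capacitance in F/g) pairs for three measured ions
    descriptor = np.array([0.8, 1.0, 1.3])      # e.g. inverse hydrated radius (assumed units)
    capacitance = np.array([90.0, 110.0, 140.0])

    slope, intercept = np.polyfit(descriptor, capacitance, 1)
    new_ion_descriptor = 1.1                     # ion not yet measured (hypothetical)
    print(f"predicted capacitance: {slope * new_ion_descriptor + intercept:.0f} F/g")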

  • Bold proposal to tackle one of the biggest barriers to more renewable energy

    The phrase “too much of a good thing” may sound like a contradiction, but it encapsulates one of the key hurdles preventing the expansion of renewable energy generation. Too much of a service or commodity makes it harder for companies to sell it, so they curtail production.
    Usually that works out fine: The market reaches equilibrium and economists are happy. But external factors are bottlenecking renewable electricity despite the widespread desire to increase its capacity.
    UC Santa Barbara’s Sangwon Suh is all too familiar with this issue. The professor of industrial ecology has focused on it and related challenges for at least the past two years at the Bren School of Environmental Science & Management. “Curtailment is the biggest problem of renewable energy we are facing,” said Suh, who noted it will only escalate as renewable energy capacity increases.
    Now Suh, Bren doctoral student Jiajia Zheng, and Andrew Chien at the University of Chicago have presented an innovative proposal to address this issue by routing workloads between data centers in different regions. The concept, published in the journal Joule, is cheap, efficient and requires minimal new infrastructure. Yet it could reduce thousands of tons of greenhouse gas emissions per year, all while saving companies money and encouraging the expansion of renewable energy.
    The main roadblock
    Curtailment comes into play when renewable energy sources generate more electricity than is required to meet demand. Modern power grids balance energy supply and demand in real-time, every minute of every day. Extra electricity would overwhelm them, so it needs to be either stored, sold off or curtailed.
    This occurs because reliable energy sources — like fossil fuel and nuclear power plants — are critical to grid stability, as well as meeting nighttime demand. These facilities have to operate above a minimum capacity, since shutting down and restarting them is both costly and inefficient. This sets a minimum for electricity from conventional power sources, and if renewables continue to generate more power, then the extra energy is effectively useless.
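    In simplified terms, the curtailed energy in any interval is whatever output exceeds demand once must-run conventional generation, storage headroom and export limits are accounted for. A toy calculation with assumed numbers:

    # Toy hourly curtailment calculation; all values are assumed (MWh per hour).
    def curtailed(renewables, demand, must_run_conventional, storage_room, export_limit):
        surplus = renewables + must_run_conventional - demand
        # absorb any surplus with storage first, then exports; the rest is curtailed
        return max(0.0, surplus - storage_room - export_limit)

    # Midday example: strong solar output against modest demand
    print(curtailed(renewables=18_000, demand=22_000, must_run_conventional=8_000,
                    storage_room=1_500, export_limit=2_000))   # -> 500.0 MWh curtailed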
    California is a case study in the challenges of variable renewable electricity and the problem of curtailment. Presumably the state could sell its surplus electricity to neighbors. Unfortunately, many power grids are encountering the same problem, and the transmission network has limited capacity. As a result, the state has resorted to selling excess electricity at a negative price, essentially paying other states to take the energy.
    There are two other solutions for dealing with excess electricity aside from simply curtailing energy generation, Suh explained. Energy can be stored in batteries and even hydroelectric reservoirs. That said, batteries are incredibly expensive, and hydropower storage is only suitable for certain locations.
    The other option is to use the extra electricity to generate things of value that can be used later. “Whatever we produce will have to be stored and transported to where it’s needed,” Suh pointed out. “And this can be very expensive.
    “But,” he added, “transporting data and information is very cheap because we can use fiber optics to transmit the data literally at the speed of light.” As the authors wrote in the study, the idea behind data load migration is “moving bits, not watts.”
    An innovative idea
    The task ahead of the authors was clear. “The question we were trying to answer was can we process data using excess electricity?” Suh said. “If we can, then it’s probably the cheapest solution for transporting the product or service made using excess electricity.”
    Currently, Northern Virginia hosts most of the nation’s data centers. Unlike California’s grid, CAISO, the grid serving Northern Virginia, known as PJM, relies heavily on coal-fired power plants, “the dirtiest electricity that we can ever imagine,” in Suh’s words.
    Suh, Zheng and Chien propose that workloads from the PJM region could be sent to centers out west whenever California has excess electricity. The jobs can be accomplished using electricity that otherwise would have been curtailed or sold for a loss, and then the processed data can be sent wherever the service is needed. Data centers usually have average server usage rates below 50%, Zheng explained, meaning there is plenty of idle capacity ready to be tapped.
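    Reduced to its essentials, “moving bits, not watts” is a dispatch rule: when CAISO reports curtailment and California data centers have idle capacity, shift deferrable jobs west. The function and numbers below are hypothetical illustrations, not the authors’ implementation:

    # Hypothetical dispatch rule for load migration; real schedulers are far more involved.
    def route_batch_jobs(jobs_mwh, caiso_curtailment_mwh, ca_idle_capacity_mwh):
        """Return (MWh of compute shifted to California, MWh left on PJM)."""
        shiftable = min(jobs_mwh, caiso_curtailment_mwh, ca_idle_capacity_mwh)
        return shiftable, jobs_mwh - shiftable

    shifted, stays_east = route_batch_jobs(jobs_mwh=1_200,
                                           caiso_curtailment_mwh=900,
                                           ca_idle_capacity_mwh=700)
    print(f"shift {shifted} MWh west, run {stays_east} MWh on PJM")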
    This plan is not only environmentally sound; it represents significant savings for the companies using these services. “This approach could potentially save the data center operators tens of millions of dollars,” said lead author Zheng. Since the electricity would otherwise have been useless, its cost to the company is essentially zero.
    The authors analyzed historical curtailment data of CAISO from 2015 through 2019. They found that load migration could have absorbed up to 62% of CAISO’s curtailed electricity capacity in 2019. That’s nearly 600,000 megawatt-hours of previously wasted energy — roughly as much electricity as 100,000 Californian households consume in a year.
    At the same time, the strategy could have reduced the equivalent of up to 240,000 metric tons of CO2 emissions in 2019 using only existing data center capacity in California. “That is equivalent to the greenhouse gas emissions from 600 million miles of driving using average passenger vehicles,” Suh said. And, rather than costing money, each ton of CO2 emissions averted by switching power grids would actually provide around $240 in savings due to decreased spending on electricity.
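    The headline figures are easy to sanity-check with rough per-unit factors; the household consumption and per-mile emission values below are common approximations, not numbers taken from the paper:

    # Rough consistency check of the figures quoted above.
    absorbed_mwh = 600_000                       # curtailed energy absorbed (from the article)
    household_mwh_per_year = 6.0                 # ~6 MWh/yr per California household (assumed)
    print("households supplied:", absorbed_mwh / household_mwh_per_year)        # ~100,000

    avoided_t_co2 = 240_000                      # avoided emissions (from the article)
    grams_co2_per_mile = 400                     # typical passenger vehicle (assumed)
    print("equivalent miles:", avoided_t_co2 * 1e6 / grams_co2_per_mile)        # ~600 million

    savings_per_ton = 240                        # USD per t CO2 (from the article)
    print("total savings, USD:", avoided_t_co2 * savings_per_ton)               # tens of millions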
    Untapped potential
    These findings were within what the authors expected to see. It was the ramifications that amazed them. “What surprised me was why we were not doing this before,” Suh said. “This seems very straightforward: There’s excess electricity, and electricity is a valuable thing, and information is very cheap to transmit from one location to another. So why are we not doing this?”
    Suh suspects it may be because data center operators are less inclined to cooperate with each other under current market pressures. Despite the environmental and financial benefits, these companies may be reluctant to outsource data processing to a facility run by a different firm.
    In fact, the data center industry is somewhat of a black box. “It was very challenging for us to get detailed information on the power usage characteristics and energy consumption data from the industry,” said Zheng.
    Harnessing the potential of curtailed renewable energy will require fluid coordination between the data center operators. Shifting the system may require changing the incentives currently at work. This could take the form of new regulations, a price on carbon emissions or collaborations between rival companies.
    “Two different things need to happen in parallel,” Suh said. “One is from the private sector: They need to cooperate and come up with the technological and managerial solutions to enable this. And from the government side, they need to think about the policy changes and incentives that can enable this type of change more quickly.”
    A widespread price on carbon emissions could provide the necessary nudge. California already has a carbon price, and Suh believes that, as additional states follow suit, it will become more economically attractive for companies to start using the strategies laid out in this study.
    And these strategies have huge growth potential. Data processing and renewable electricity capacity are both growing rapidly. Researchers predict that the datasphere will expand more than fivefold from 2018 to 2025. As a result, there is a lot of room for data centers to absorb additional processing needs using excess renewable energy in the future.
    This paper offers only a conservative estimate of the financial and environmental benefits of data load migration, Suh acknowledged. “As we increase the data center capacity, I think that the ability for a data center to be used as a de facto battery is actually increasing as well,” he said.
    “If we can think ahead and be prepared, I think that a substantial portion of the curtailment problem can be addressed in a very cost-effective way by piggybacking on the growth of data centers.”