More stories


    Shining a light on the hidden properties of quantum materials

    Certain materials have desirable properties that are hidden, and just as you would use a flashlight to see in the dark, scientists can use light to uncover these properties.
    Researchers at the University of California San Diego have used an advanced optical technique to learn more about a quantum material called Ta2NiSe5 (TNS). Their work appears in Nature Materials.
    Materials can be perturbed through different external stimuli, most often changes in temperature or pressure. However, because nothing travels faster than light, materials respond to optical stimuli extremely quickly, revealing properties that would otherwise remain hidden.
    “In essence, we shine a laser on a material and it’s like stop-action photography where we can incrementally follow a certain property of that material,” said Professor of Physics Richard Averitt, who led the research and is one of the paper’s authors. “By looking at how constituent particles move around in that system, we can tease out these properties that are really tricky to find otherwise.”
    The experiment was conducted by lead author Sheikh Rubaiat Ul Haque, who graduated from UC San Diego in 2023 and is now a postdoctoral scholar at Stanford University. He, along with Yuan Zhang, another graduate student in Averitt’s lab, improved upon a technique called terahertz time-domain spectroscopy. This technique allows scientists to measure a material’s properties over a range of frequencies, and Haque’s improvements allowed them access to a broader range of frequencies.
    The work was based on a theory created by another of the paper’s authors, Eugene Demler, a professor at ETH Zürich. Demler and his graduate student Marios Michael developed the idea that when certain quantum materials are excited by light, they may turn into a medium that amplifies terahertz frequency light. This led Haque and colleagues to look closely into the optical properties of TNS.
    When an electron is excited to a higher level by a photon, it leaves behind a hole. If the electron and hole are bound, an exciton is created. Excitons may also form a condensate — a state that occurs when particles come together and behave as a single entity.

    Using Haque’s technique, backed by Demler’s theory and density functional calculations by Angel Rubio’s group at the Max Planck Institute for the Structure and Dynamics of Matter, the team was able to observe anomalous terahertz light amplification, which uncovered some of the hidden properties of the TNS exciton condensate.
    Condensates are well-defined quantum states, and this spectroscopic technique could allow some of their quantum properties to be imprinted onto light. This may have implications for the emerging field of entangled light sources (where multiple light sources have interconnected properties) based on quantum materials.
    “I think it’s a wide-open area,” stated Haque. “Demler’s theory can be applied to a suite of other materials with nonlinear optical properties. With this technique, we can discover new light-induced phenomena that haven’t been explored before.”
    Funding provided by the DARPA DRINQS Program (D18AC00014), the Swiss National Science Foundation (200021_212899), Army Research Office (W911NF-21-1-0184), the European Research Council (ERC-2015-AdG694097), the Cluster of Excellence ‘Advanced Imaging of Matter’ (AIM), Grupos Consolidados (IT1249-19), Deutsche Forschungsgemeinschaft (170620586), and the Flatiron Institute.


    Computer scientists invent simple method to speed cache sifting

    Computer scientists have invented a highly effective, yet incredibly simple, algorithm to decide which items to toss from a web cache to make room for new ones. Known as SIEVE, the new open-source algorithm holds the potential to transform the management of web traffic on a large scale.
    SIEVE is a joint project of computer scientists at Emory University, Carnegie Mellon University and the Pelikan Foundation. The team’s paper on SIEVE will be presented at the 21st USENIX Symposium on Networked Systems Design and Implementation (NSDI) in Santa Clara, California, in April.
    A preprint of the paper is already making waves.
    “SIEVE is bigger and greater than just us,” says Yazhuo Zhang, an Emory PhD student and co-first author of the paper. “It is already performing well but we are getting a lot of good suggestions to make it even better. That’s the beauty of the open-source world.”
    Zhang shares first authorship of the paper with Juncheng (Jason) Yang, who received his master’s degree in computer science at Emory and is now a PhD candidate at Carnegie Mellon.
    “SIEVE is an easy improvement of a tried-and-true cache-eviction algorithm that’s been in use for decades — which is literally like centuries in the world of computing,” says Ymir Vigfusson, associate professor in Emory’s Department of Computer Science.
    Vigfusson is co-senior author of the paper, along with Rashmi Vinayak, an associate professor in Carnegie Mellon’s computer science department. Yao Yue, a computer engineer at the Pelikan Foundation, is also a co-author.

    In addition to its speed and effectiveness, a key factor sparking interest in SIEVE is its simplicity, lending it scalability.
    “Simplicity is the ultimate sophistication,” Vigfusson says. “The simpler the pieces are within a system designed to serve billions of people within a fraction of a second, the easier it is to efficiently implement and maintain that system.”
    Keeping ‘hot objects’ handy
    Many people understand the value of regularly reorganizing their clothing closet. Items that are never used can be tossed and those that are rarely used can be moved to the attic or some other remote location. That leaves the items most commonly worn within easy reach so they can be found quickly, without rummaging around.
    A cache is like a well-organized closet for computer data. The cache is filled with copies of the most popular objects requested by users, or “hot objects” in IT terminology. The cache maintains this small collection of hot objects separately from a computer network’s main database, which is like a vast warehouse filled with all the information that could be served by the system.
    Caching hot objects allows a networked system to run more efficiently, rapidly responding to requests from users. A web application can effectively handle more traffic by popping into a handy closet to grab most of the objects users want rather than traveling down to the warehouse and searching through a massive database for each request.

    “Caching is everywhere,” Zhang says. “It’s important to every company, big or small, that is using web applications. Every website needs a cache system.”
    And yet, caching is relatively understudied in the computer science field.
    How caching works
    While caching can be thought of as a well-organized closet for a computer, it is difficult to know what should go into that closet when millions of people, with constantly changing needs, are using it.
    The fast memory of a cache is expensive to run yet critical to a good experience for web users. The goal is to keep the most useful information, the objects most likely to be requested soon, within the cache. Other objects must be continuously winnowed out, or “evicted” in tech terminology, to make room for the changing array of hot objects.
    Cache-eviction algorithms determine what objects to toss and when to do so.
    FIFO, or “first-in, first-out,” is a classic eviction algorithm developed in the 1960s. Imagine objects lined up on a conveyor belt. Newly requested objects enter on the left and the oldest objects get evicted when they reach the end of the line on the right.
    In the LRU, or “least recently used,” algorithm, objects also move along the line toward eviction at the end. However, if an object is requested again while it moves down the conveyor belt, it gets moved back to the head of the line.
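    The conveyor-belt picture maps directly onto code. Here is a minimal Python sketch of both classic policies (an illustration of the concepts above, not the paper’s implementation):

    ```python
    from collections import OrderedDict

    class FIFOCache:
        """First-in, first-out: evict the oldest object, ignoring later hits."""

        def __init__(self, capacity):
            self.capacity = capacity
            self.store = OrderedDict()  # insertion order = position on the belt

        def get(self, key):
            return self.store.get(key)

        def put(self, key, value):
            if key not in self.store and len(self.store) >= self.capacity:
                self.store.popitem(last=False)  # evict from the end of the line
            self.store[key] = value

    class LRUCache(FIFOCache):
        """Like FIFO, but a repeat request moves the object back to the head."""

        def get(self, key):
            if key in self.store:
                self.store.move_to_end(key)  # requested again: back to the head
                return self.store[key]
            return None
    ```

    With a capacity of two, inserting a, b, then c evicts a under both policies; but if a is requested again before c arrives, LRU keeps a and evicts b instead.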
    Hundreds of variations of eviction algorithms exist, but they have tended to take on greater complexity to gain efficiency. That generally means they are hard to reason about and require high maintenance, especially when dealing with massive workloads.
    “If an algorithm is very complicated, it tends to have more bugs, and all of those bugs need to be fixed,” Zhang explains.
    A simple idea
    Like LRU and some other algorithms, SIEVE makes a simple tweak on the basic FIFO scheme.
    SIEVE initially labels a newly requested object as a “zero.” If the object is requested again while it sits on the belt, its label changes to “one.”
    A pointer, or “moving hand,” scans the objects in the line. The pointer starts at the oldest end of the line and moves toward the head; once it reaches the head it jumps back to the oldest end, circling continuously. When the pointer reaches an object labeled “one,” the label is reset to “zero” and the object stays in the cache. When the pointer reaches an object labeled “zero,” that object is evicted.
    “It’s important to evict unpopular objects as quickly as possible, and SIEVE is very fast at this task,” Zhang says.
    In addition to this quick demotion of objects, SIEVE manages to maintain popular objects in the cache with minimal computational effort, known as “lazy promotion” in computer terminology. The researchers believe that SIEVE is the simplest cache-eviction algorithm to effectively achieve both quick demotion and lazy promotion.
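    Putting the labels and the moving hand together, a compact Python sketch of the SIEVE idea as described above (a toy model for illustration; the team’s open-source implementation differs in details such as using a linked list for constant-time removal):

    ```python
    class SieveCache:
        """A FIFO queue plus a one-bit label per object and a moving eviction hand."""

        def __init__(self, capacity):
            self.capacity = capacity
            self.store = {}       # key -> value
            self.visited = {}     # key -> label: False is "zero", True is "one"
            self.order = []       # order[0] is the oldest object on the belt
            self.hand = 0         # eviction pointer, scans oldest -> newest

        def get(self, key):
            if key in self.store:
                self.visited[key] = True  # lazy promotion: just flip the label
                return self.store[key]
            return None

        def put(self, key, value):
            if key in self.store:
                self.store[key] = value
                self.visited[key] = True
                return
            if len(self.store) >= self.capacity:
                self._evict()
            self.order.append(key)
            self.store[key] = value
            self.visited[key] = False     # new arrivals start as "zero"

        def _evict(self):
            # Spare "one" objects (resetting them to "zero") until the hand
            # meets a "zero" object, which is evicted on the spot.
            while True:
                if self.hand >= len(self.order):
                    self.hand = 0         # wrap around to the oldest object
                key = self.order[self.hand]
                if self.visited[key]:
                    self.visited[key] = False
                    self.hand += 1
                else:
                    self.order.pop(self.hand)
                    del self.store[key]
                    del self.visited[key]
                    return
    ```

    Note that a hit never moves an object, which is what makes promotion “lazy”: popular objects survive the hand’s pass simply because their label was flipped, while unpopular ones are demoted quickly.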
    A lower miss ratio
    The purpose of caching is to achieve a low miss ratio — the fraction of requested objects that must be fetched from “the warehouse.”
    To evaluate SIEVE, the researchers conducted experiments on open-source web-cache traces from Meta, Wikimedia, X and four other large datasets. The results showed that SIEVE achieves a lower miss ratio than nine state-of-the-art algorithms on more than 45% of the traces. The next best algorithm has a lower miss ratio on only 15%.
    The ease and simplicity of SIEVE raise the question of why no one came up with the method before. The SIEVE team’s focus on how patterns of web traffic have changed in recent years may have made the difference, Zhang theorizes.
    “For example,” she says, “new items now become ‘hot’ quickly but also disappear quickly. People continuously lose interest in things because new things keep coming up.”
    Web-cache workloads tend to follow what are known as generalized Zipfian distributions, in which a small subset of objects accounts for a large proportion of requests. SIEVE may have hit a Zipfian sweet spot for current workloads.
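    That skew is easy to see numerically. A toy simulation of a Zipfian workload follows (the parameters are illustrative, not taken from the paper’s traces):

    ```python
    import random

    def zipf_requests(n_objects, s, n_requests, seed=42):
        """Sample request ranks where P(rank k) is proportional to 1 / k**s."""
        rng = random.Random(seed)
        ranks = list(range(1, n_objects + 1))
        weights = [1.0 / (k ** s) for k in ranks]
        return rng.choices(ranks, weights=weights, k=n_requests)

    requests = zipf_requests(n_objects=1000, s=1.1, n_requests=10_000)
    # Fraction of all requests served by the 100 most popular objects (top 10%)
    share = sum(1 for r in requests if r <= 100) / len(requests)
    print(f"Top 10% of objects receive {share:.0%} of requests")
    ```

    Under such a distribution, a cache holding just the popular head of the catalog serves the bulk of the traffic, which is why the quality of the eviction decision matters so much.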
    “It is clearly a transformative moment for our understanding of web-cache eviction,” Vigfusson says. “It changes a construct that’s been used blindly for so long.”
    Even a tiny improvement in a web-caching system, he adds, can save millions of dollars at a major data center.
    Zhang and Yang are on track to receive their PhDs in May.
    “They are doing incredible work,” Vigfusson says. “It’s safe to say that both of them are now among the world experts on web-cache eviction.”


    Researchers add a ‘twist’ to classical material design

    Researchers with the Department of Energy’s SLAC National Accelerator Laboratory, Stanford University and the DOE’s Lawrence Berkeley National Laboratory (LBNL) grew a twisted multilayer crystal structure for the first time and measured the structure’s key properties. The twisted structure could help researchers develop next-generation materials for solar cells, quantum computers, lasers and other devices.
    “This structure is something that we have not seen before — it was a huge surprise to me,” said Yi Cui, a professor at Stanford and SLAC and paper co-author. “A new quantum electronic property could appear within this three-layer twisted structure in future experiments.”
    Adding layers, with a twist
    The crystals the team designed extended the concept of epitaxy, a phenomenon that occurs when one type of crystal material grows on top of another material in an ordered way — kind of like growing a neat lawn on top of soil, but at the atomic level. Understanding epitaxial growth has been critical to the development of many industries for more than 50 years, particularly the semiconductor industry. Indeed, epitaxy is part of many of the electronic devices that we use today, from cell phones to computers to solar panels, allowing electricity to flow, and not flow, through them.
    To date, epitaxy research has focused on growing one layer of material onto another, and the two materials have the same crystal orientation at the interface. This approach has been successful for decades in many applications, such as transistors, light-emitting diodes, lasers and quantum devices. But to find new materials that perform even better for more demanding needs, like quantum computing, researchers are searching for other epitaxial designs — ones that might be more complex, yet better performing, hence the “twisted epitaxy” concept demonstrated in this study.
    In their experiment, detailed this month in Science, researchers added a layer of gold between two sheets of a traditional semiconducting material, molybdenum disulfide (MoS2). Because the top and bottom sheets were oriented differently, the gold atoms could not align with both simultaneously, which allowed the Au structure to twist, said Yi Cui, Professor Cui’s graduate student in materials science and engineering at Stanford and co-author of the paper.
    “With only a bottom MoS2 layer, the gold is happy to align with it, so no twist happens,” said Cui, the graduate student. “But with two twisted MoS2 sheets, the gold isn’t sure whether to align with the top or the bottom layer. We managed to help the gold solve its confusion and discovered a relationship between the orientation of the Au and the twist angle of the bilayer MoS2.”
    Zapping gold nanodiscs

    To study the gold layer in detail, the research team from the Stanford Institute for Materials and Energy Sciences (SIMES) and LBNL heated a sample of the whole structure to 500 degrees Celsius. They then sent a stream of electrons through the sample using a technique called transmission electron microscopy (TEM), which revealed the morphology, orientation and strain of the gold nanodiscs after annealing. Measuring these properties of the gold nanodiscs was a necessary first step toward understanding how the new structure could be designed for real-world applications in the future.
    “Without this study, we would not know if twisting an epitaxial layer of metal on top of a semiconductor was even possible,” said Cui, the graduate student. “Measuring the complete three-layer structure with electron microscopy confirmed that it was not only possible, but also that the new structure could be controlled in exciting ways.”
    Next, researchers want to further study the optical properties of the gold nanodiscs using TEM and learn if their design alters physical properties like band structure of Au. They also want to extend this concept to try to build three-layer structures with other semiconductor materials and other metals.
    “We’re beginning to explore whether only this combination of materials allows this or if it happens more broadly,” said Bob Sinclair, the Charles M. Pigott Professor in Stanford’s School of Materials Science and Engineering and paper co-author. “This discovery is opening a whole new series of experiments that we can try. We could be on our way to finding brand new material properties that we could exploit.”


    The complexity of forests cannot be explained by simple mathematical rules, study finds

    The way trees grow together does not resemble how branches grow on a single tree, scientists have discovered.
    Nature is full of surprising repetitions. In trees, the large branches often look like entire trees, while smaller branches and twigs look like the larger branches they grow from. If seen in isolation, each part of the tree could be mistaken for a miniature version of itself.
    It has long been assumed that this property, called fractality, also applies to entire forests, but researchers from the University of Bristol have found that this is not the case.
    The study, published in December in the Journal of Ecology, refutes claims that the self-similarity observed within individual trees can be extended to whole forest canopies and landscapes.
    Lead author Dr Fabian Fischer explained: “Fractality can be found in many natural systems. Transport networks such as arteries or rivers often show self-similarity in the way they branch, and many organic structures, such as trees, ferns or broccoli, are composed of parts that look like the whole.
    “Fractality provides a way of categorising and quantifying these self-similar patterns we so often observe in nature, and has been hypothesized to be an emergent property that is shared by many natural systems.
    “Intuitively, if you look at a picture of something and you can’t quite determine how big it is, then this is a good indicator of fractality. For instance, is this a large mountain in front of me or just a small rock that looks like a mountain? Is it a branch or a whole tree?

    “Scientifically, this self-similarity has the attractive property that it allows you to describe an apparently complex object using some very simple rules and numbers.”
    If self-similarity extended from the small twigs of a single tree to entire forest ecosystems, it would help ecologists describe complex landscapes in much simpler ways, and potentially directly compare the complexity of very different ecosystems, such as coral reefs and forest canopies.
    To test this idea that forest canopies behave like fractals, the team used airborne laser scanning data from nine sites spread across Australia’s Terrestrial Ecosystem Research Network (TERN). These sites span a large rainfall gradient and vary enormously in their structure: from sparse and short arid woodlands in Western Australia to towering, 90-m tall mountain ash forests in Tasmania. From each laser scan, they derived high-resolution forest height maps and compared these to what forest heights would look like if the forests were fractal in nature.
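    One standard way to quantify fractality of this kind is box counting: cover the structure with boxes of shrinking side length eps and check whether the number of occupied boxes follows a power law N(eps) ~ eps^(-D). The sketch below illustrates the recipe on a synthetic fractal (the Sierpinski triangle, dimension about 1.585) rather than on the study’s airborne-lidar height maps:

    ```python
    import math
    import random

    def box_count_dimension(points, scales):
        """Estimate fractal dimension as the slope of log N(eps) vs log(1/eps)."""
        xs, ys = [], []
        for eps in scales:
            # Count the grid boxes of side eps that contain at least one point
            boxes = {(math.floor(x / eps), math.floor(y / eps)) for x, y in points}
            xs.append(math.log(1 / eps))
            ys.append(math.log(len(boxes)))
        # Least-squares slope of log N against log(1/eps)
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
            (x - mx) ** 2 for x in xs)

    def sierpinski_points(n, seed=1):
        """Generate points on the Sierpinski triangle via the chaos game."""
        rng = random.Random(seed)
        verts = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
        x, y = 0.0, 0.0
        pts = []
        for _ in range(n):
            vx, vy = rng.choice(verts)
            x, y = (x + vx) / 2, (y + vy) / 2  # jump halfway to a random vertex
            pts.append((x, y))
        return pts

    d = box_count_dimension(sierpinski_points(50_000), [1/4, 1/8, 1/16, 1/32])
    print(f"Estimated dimension: {d:.2f}")  # close to log(3)/log(2), about 1.585
    ```

    For a true fractal the log-log relationship is a straight line across scales; the study’s finding is that forest height maps bend away from that line in a consistent way.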
    Dr Fischer said: “We found that forest canopies are not fractal, but they are very similar in how they deviate from fractality, irrespective of what ecosystem they are in.
    “That they are not fractal makes a lot of sense and was our hypothesis from the start. While it might be possible to confuse a branch for an entire tree, it’s usually easy to differentiate trees from a grove of trees or from an entire forest.
    “But it was surprising how similar all forest canopies were in the way they deviated from true fractals, and how deviations were linked to the size of the trees and how dry their environment was.

    “The consistency of deviations also gave us an idea of how we could compare complexity across ecosystems. Most ecosystems, like forests, will hit an upper limit — most likely determined by the maximum size of its organisms — beyond which their structure cannot vary freely anymore.
    “If we could determine these upper limits, this could open up routes to understanding how very different organisms and systems (coral reefs, forests, etc.) work and to test whether they might share the same basic organising principles.”
    Now the team plan to compare an even wider range of forest ecosystems across the globe, find out whether there are similar organizing principles in forests and beyond, and discover what drives these patterns by looking at multiple scans in time.
    Dr Fischer concluded: “A key question in science is whether there are generalizable patterns in nature, and an excellent candidate for this is fractality.
    “The forests we studied were not fractal, but there were clear similarities across all sites in how they deviated from fractality. From a theoretical point of view, this points the way to a framework for finding general organizing principles in biology.
    “But this also has practical implications: if we cannot understand the forest from its trees, and vice versa, then we must monitor forests both at small and large scales to understand how they respond to climatic changes and growing human pressure.”


    Misinformation and irresponsible AI — experts forecast how technology may shape our near future

    From misinformation and invisible cyber attacks, to irresponsible AI that could cause events involving multiple deaths, expert futurists have forecast how rapid technology changes may shape our world by 2040.
    As the pace of computer technology surges ahead and systems become increasingly interlinked, it is vital to understand how these rapid advances could affect the world, so that steps can be taken to prevent the worst outcomes.
    Using a Delphi study, a well-known forecasting technique, a team of cyber security researchers led by academics from Lancaster University interviewed 12 experts in the future of technologies.
    The experts ranged from chief technology officers in businesses and consultant futurists to a technology journalist and academic researchers. They were asked how particular technologies may develop and change our world over the next 15 years, out to 2040; what risks they might pose; and how to address the challenges that may arise.
    Most of the experts forecasted exponential growth in Artificial Intelligence (AI) over the next 15 years, and many also expressed concern that corners could be cut in the development of safe AI. They felt that this corner cutting could be driven by nation states seeking competitive advantage. Several of the experts even considered it possible that poorly implemented AI could lead to incidents involving many deaths, although other experts disagreed with this view.
    Dr Charles Weir, Lecturer at Lancaster University’s School of Computing and Communications and lead researcher of the study, said: “Technology advances have brought, and will continue to bring, great benefits. We also know there are risks around some of these technologies, including AI, and where their development may go — everyone’s been discussing them — but the possible magnitude of some of the risks forecast by some of the experts was staggering.
    “But by forecasting what potential risks lie just beyond the horizon we can take steps to avoid major problems.”
    Another significant concern held by most of the experts involved in the study was that technology advances will make it easier for misinformation to spread. This has the potential to make it harder for people to tell the difference between truth and fiction — with ramifications for democracies.

    Dr Weir said: “We are already seeing misinformation on social media networks, including its use by some nation states. The experts are forecasting that advances in technologies will make it much easier for people and bad actors to continue spreading misleading material by 2040.”
    Other technologies were forecast not to have as big an impact by 2040. These include quantum computing, which the experts see as having impacts over a much longer timeframe, and blockchain, which most of the experts dismissed as a likely source of major change.
    The experts forecast that:
    · By 2040, competition between nation states and big tech companies will lead to corners being cut in the development of safe AI
    · Quantum computing will have limited impact by 2040
    · By 2040 there will be ownership of public web assets. These will be identified and traded through digital tokens
    · By 2040 it will be harder to distinguish truth from fiction, because widely accessible AI can mass-generate content of doubtful veracity

    · By 2040 there will be less ability to distinguish accidents from criminal incidents due to the decentralised nature and complexity of systems
    The forecasters also offered some suggested solutions to help mitigate some of the concerns raised. Their suggestions included governments introducing safety principles for AI procurement and new laws to regulate AI safety. In addition, universities could play a vital role by introducing courses combining technical skills and legislation.
    These forecasts will help policy makers and technology professionals make strategic decisions around developing and deploying novel computing technologies. They are outlined in the paper ‘Interlinked Computing in 2040: Safety, Truth, Ownership and Accountability’ which has been published by the peer-reviewed journal IEEE Computer.
    The paper’s authors are: Charles Weir and Anna Dyson of Lancaster University; Olamide Jogunola and Katie Paxton-Fear of Manchester Metropolitan University; and Louise Dennis of Manchester University.


    What coffee with cream can teach us about quantum physics

    Add a dash of creamer to your morning coffee, and clouds of white liquid will swirl around your cup. But give it a few seconds, and those swirls will disappear, leaving you with an ordinary mug of brown liquid.
    Something similar happens in quantum computer chips — devices that tap into the strange properties of the universe at its smallest scales — where information can quickly jumble up, limiting the memory capabilities of these tools.
    That doesn’t have to be the case, said Rahul Nandkishore, associate professor of physics at the University of Colorado Boulder.
    In a new coup for theoretical physics, he and his colleagues have used math to show that scientists could create, essentially, a scenario where the milk and coffee never mix — no matter how hard you stir them.
    The group’s findings may lead to new advances in quantum computer chips, potentially providing engineers with new ways to store information in incredibly tiny objects.
    “Think of the initial swirling patterns that appear when you add cream to your morning coffee,” said Nandkishore, senior author of the new study. “Imagine if these patterns continued to swirl and dance no matter how long you watched.”
    Researchers still need to run experiments in the lab to make sure that these never-ending swirls really are possible. But the group’s results are a major step forward for physicists seeking to create materials that remain out of balance, or equilibrium, for long periods of time — a pursuit known as “ergodicity breaking.”
    The team’s findings appeared this week in the latest issue of Physical Review Letters.

    Quantum memory
    The study, which includes co-authors David Stephen and Oliver Hart, postdoctoral researchers in physics at CU Boulder, hinges on a common problem in quantum computing.
    Normal computers run on “bits,” which take the form of zeros or ones. Nandkishore explained that quantum computers, in contrast, employ “qubits,” which can exist as zero, one or, through the strangeness of quantum physics, zero and one at the same time. Engineers have made qubits out of a wide range of things, including individual atoms trapped by lasers or tiny devices called superconductors.
    But just like that cup of coffee, qubits can become easily mixed up. If you flip, for example, all of your qubits to one, they’ll eventually flip back and forth until the entire chip becomes a disorganized mess.
    In the new research, Nandkishore and his colleagues may have figured a way around that tendency toward mixing. The group calculated that if scientists arrange qubits into particular patterns, these assemblages will retain their information — even if you disturb them using a magnetic field or a similar disruption. That could, the physicist said, allow engineers to build devices with a kind of quantum memory.
    “This could be a way of storing information,” he said. “You would write information into these patterns, and the information couldn’t be degraded.”
    Tapping into geometry

    In the study, the researchers used mathematical modeling tools to envision an array of hundreds to thousands of qubits arranged in a checkerboard-like pattern.
    The trick, they discovered, was to stuff the qubits into a tight spot. If qubits get close enough together, Nandkishore explained, they can influence the behavior of their neighbors, almost like a crowd of people trying to squeeze themselves into a telephone booth. Some of those people might be standing upright or on their heads, but they can’t flip the other way without pushing on everyone else.
    The researchers calculated that if they arranged these patterns in just the right way, those patterns might flow around a quantum computer chip and never degrade — much like those clouds of cream swirling forever in your coffee.
    “The wonderful thing about this study is that we discovered that we could understand this fundamental phenomenon through what is almost simple geometry,” Nandkishore said.
    The team’s findings could influence a lot more than just quantum computers.
    Nandkishore explained that almost everything in the universe, from cups of coffee to vast oceans, tends to move toward what scientists call “thermal equilibrium.” If you drop an ice cube into your mug, for example, heat from your coffee will melt the ice, eventually forming a liquid with a uniform temperature.
    His new findings, however, join a growing body of research suggesting that some small arrangements of matter can resist that equilibrium — seemingly breaking some of the most immutable laws of the universe.
    “We’re not going to have to redo our math for ice and water,” Nandkishore said. “The field of mathematics that we call statistical physics is incredibly successful for describing things we encounter in everyday life. But there are settings where maybe it doesn’t apply.”


    Offshore wind farms are vulnerable to cyberattacks

    The quickening pace of societal electrification is encouraging from a climate perspective. But the transition away from fossil fuels toward renewable sources like wind presents new risks that are not yet fully understood.
    Researchers from Concordia and Hydro-Québec presented a new study on the topic at the 2023 IEEE International Conference on Communications, Control, and Computing Technologies for Smart Grids (SmartGridComm) in Glasgow, United Kingdom. Their study explores the risks of cyberattacks faced by offshore wind farms. Specifically, the researchers considered wind farms that use voltage-source-converter high-voltage direct-current (VSC-HVDC) connections, which are rapidly becoming the most cost-effective way to harvest offshore wind energy around the world.
    “As we advance the integration of renewable energies, it is imperative to recognize that we are venturing into uncharted territory, with unknown vulnerabilities and cyber threats,” says Juanwei Chen, a PhD student at the Concordia Institute for Information Systems Engineering (CIISE) at the Gina Cody School of Engineering and Computer Science.
    “Offshore wind farms are connected to the main power grid using HVDC technologies. These farms may face new operational challenges,” Chen explains.
    “Our focus is to investigate how these challenges could be intensified by cyber threats and to assess the broader impact these threats might have on our power grid.”
    Concordia PhD student Hang Du, CIISE associate professor Jun Yan and Gina Cody School dean Mourad Debbabi, along with Rawad Zgheib from the Hydro-Quebec Research Institute (IREQ), also contributed to the study. This work is part of a broad research collaboration project involving the group of Prof. Debbabi and the IREQ cybersecurity research group led by Dr. Marthe Kassouf and involving a team of researchers including Dr. Zgheib.
    Complex and vulnerable systems
    Offshore wind farms require more cyber infrastructure than onshore wind farms, given that offshore farms are often dozens of kilometres from land and operated remotely. Offshore wind farms need to communicate with onshore systems via a wide area network. Meanwhile, the turbines also communicate with maintenance vessels and inspection drones, as well as with each other.

    This complex, hybrid-communication architecture presents multiple access points for cyberattacks. If malicious actors were able to penetrate the local area network of the converter station on the wind farm side, these actors could tamper with the system’s sensors. This tampering could lead to the replacement of actual data with false information. As a result, electrical disturbances would affect the offshore wind farm at the points of common coupling.
    In turn, these disturbances could trigger poorly dampened power oscillations from the offshore wind farms when all the offshore wind farms are generating their maximum output. If these cyber-induced electrical disturbances are repetitive and match the frequency of the poorly dampened power oscillations, the oscillations could be amplified. These amplified oscillations might then be transmitted through the HVDC system, potentially reaching and affecting the stability of the main power grid. While existing systems usually have redundancies built in to protect them against physical contingencies, such protection is rare against cyber security breaches.
    “The system networks can handle events like router failures or signal decays. If there is an attacker in the middle who is trying to hijack the signals, then that becomes more concerning,” says Yan, the Concordia University Research Chair (Tier 2) in Artificial Intelligence in Cyber Security and Resilience.
    Yan adds that considerable gaps exist in the industry, both among manufacturers and utilities. While many organizations are focusing on corporate issues such as data security and access controls, much remains to be done to strengthen the security of operational technologies.
    He notes that Concordia is leading the push for international standardization efforts but acknowledges the work is just beginning.
    “There are regulatory standards for the US and Canada, but they often only state what is required without specifying how it should be done,” he says. “Researchers and operators are aware of the need to protect our energy security, but there remain many directions to pursue and open questions to answer.”
    This research is supported by the Concordia/Hydro-Québec/Hitachi Partnership Research Chair, with additional support from NSERC and PROMPT.


    Many but not all of the world’s aquifers are losing water

    The world’s precious stash of subterranean freshwater is shrinking — and in nearly a third of aquifers, that loss has been speeding up in the last couple of decades, researchers report in the Jan. 25 Nature.

    A one-two punch of unsustainable groundwater withdrawals and a changing climate has caused global water levels to fall on average, leading to water shortages, slumping land surfaces and seawater intrusion into aquifers. The new study finds that groundwater decline has accelerated in many places since 2000, but also suggests that these losses can be reversed with better water management.