More stories

  • Your paper notebook could become your next tablet

    Innovators from Purdue University hope their new technology can help transform paper sheets from a notebook into a music player interface and make food packaging interactive.
    Purdue engineers developed a simple printing process that turns any paper or cardboard packaging into a keyboard, keypad or other easy-to-use human-machine interface. The technology is published in the Aug. 23 edition of Nano Energy.
    “This is the first time a self-powered paper-based electronic device has been demonstrated,” said Ramses Martinez, an assistant professor in Purdue’s School of Industrial Engineering and in the Weldon School of Biomedical Engineering in Purdue’s College of Engineering. “We developed a method to render paper repellent to water, oil and dust by coating it with highly fluorinated molecules. This omniphobic coating allows us to print multiple layers of circuits onto paper without the ink smearing from one layer to the next.”
    Martinez said this innovation facilitates the fabrication of vertical pressure sensors that do not require any external battery, since they harvest the energy from their contact with the user.
    This technology is compatible with conventional large-scale printing processes and could easily be implemented to rapidly convert conventional cardboard packaging or paper into smart packaging or a smart human-machine interface.
    “I envision this technology facilitating user interaction with food packaging, to verify whether the food is safe to consume, or enabling users to sign for a package that arrives at home by dragging a finger over the box to properly identify themselves as the owner of the package,” Martinez said. “Additionally, our group demonstrated that simple paper sheets from a notebook can be transformed into music player interfaces for users to choose songs, play them and change their volume.”
    Videos showing this technology are available at https://youtu.be/TfA0d8IpjWU, https://youtu.be/J0iCxjicJIQ and https://youtu.be/c9E6vXYtIw0.

    Story Source:
    Materials provided by Purdue University. Original written by Chris Adam. Note: Content may be edited for style and length.

  • New evidence for quantum fluctuations near a quantum critical point in a superconductor

    Among all the curious states of matter that can coexist in a quantum material, jostling for preeminence as temperature, electron density and other factors change, some scientists think a particularly weird juxtaposition exists at a single intersection of factors, called the quantum critical point or QCP.
    “Quantum critical points are a very hot issue and interesting for many problems,” says Wei-Sheng Lee, a staff scientist at the Department of Energy’s SLAC National Accelerator Laboratory and investigator with the Stanford Institute for Materials and Energy Sciences (SIMES). “Some suggest that they’re even analogous to black holes in the sense that they are singularities — point-like intersections between different states of matter in a quantum material — where you can get all sorts of very strange electron behavior as you approach them.”
    Lee and his collaborators reported in Nature Physics today that they have found strong evidence that QCPs and their associated fluctuations exist. They used a technique called resonant inelastic X-ray scattering (RIXS) to probe the electronic behavior of a copper oxide material, or cuprate, that conducts electricity with perfect efficiency at relatively high temperatures.
    These so-called high-temperature superconductors are a bustling field of research because they could give rise to zero-waste transmission of energy, energy-efficient transportation systems and other futuristic technologies, although no one knows the underlying microscopic mechanism behind high-temperature superconductivity yet. Whether QCPs exist in cuprates is also a hotly debated issue.
    In experiments at the UK’s Diamond Light Source, the team chilled the cuprate to temperatures below 90 kelvins (minus 183 degrees Celsius), where it became superconducting. They focused their attention on what’s known as charge order — alternating stripes in the material where electrons and their negative charges are denser or more sparse.
    The scientists excited the cuprate with X-rays and measured the X-ray light that scattered into the RIXS detector. This allowed them to map out how the excitations propagated through the material in the form of subtle vibrations, or phonons, in the material’s atomic lattice, which are hard to measure and require very high-resolution tools.

    At the same time, the X-rays and the phonons can excite electrons in the charge order stripes, causing the stripes to fluctuate. Since the data obtained by RIXS reflects the coupling between the behavior of the charge stripes and the behavior of the phonons, observing the phonons allowed the researchers to measure the behavior of the charge order stripes, too.
    What the scientists expected to see was that as the charge order stripes grew weaker, their excitations would also fade away. “But what we observed was very strange,” Lee said. “We saw that when charge order became weaker in the superconducting state, the charge order excitations became stronger. This is a paradox because they should go hand in hand, and that’s what people find in other charge order systems.”
    He added, “To my knowledge this is the first experiment about charge order that has shown this behavior. Some have suggested that this is what happens when a system is near a quantum critical point, where quantum fluctuations become so strong that they melt the charge order, much like heating ice increases thermal vibrations in its rigid atomic lattice and melts it into water. The difference is that quantum melting, in principle, occurs at zero temperature.” In this case, Lee said, the unexpectedly strong charge order excitations seen with RIXS were manifestations of those quantum fluctuations.
    Lee said the team is now studying these phenomena at a wider range of temperatures and at different levels of doping — where compounds are added to change the density of freely moving electrons in the material — to see if they can nail down exactly where the quantum critical point could be in this material.
    Thomas Devereaux, a theorist at SIMES and senior author of the report, noted that many phases of matter can be intertwined in cuprates and other quantum materials.
    “Superconducting and magnetic states, charge order stripes and so on are so entangled that you can be in all of them at the same time,” he said. “But we’re stuck in our classical way of thinking that they have to be either one way or another.”
    Here, he said, “We have an effect, and Wei-Sheng is trying to measure it in detail, trying to see what’s going on.”

  • New theory hints at more efficient way to develop quantum algorithms

    In 2019, Google claimed it was the first to demonstrate a quantum computer performing a calculation beyond the abilities of today’s most powerful supercomputers.
    But most of the time, creating a quantum algorithm that stands a chance at beating a classical computer is an accidental process, Purdue University scientists say. To bring more guidance to this process and make it less arbitrary, these scientists developed a new theory that may eventually lead to more systematic design of quantum algorithms.
    The new theory, described in a paper published in the journal Advanced Quantum Technologies, is the first known attempt to determine which quantum states can be created and processed with an acceptable number of quantum gates to outperform a classical algorithm.
    Physicists refer to this concept of having the right number of gates to control each state as “complexity.” Since the complexity of a quantum algorithm is closely related to the complexity of the quantum states involved in the algorithm, the theory could bring order to the search for quantum algorithms by characterizing which quantum states meet that complexity criterion.
    An algorithm is a sequence of steps to perform a calculation. The algorithm is usually implemented on a circuit.
    In classical computers, circuits have gates that switch bits to either a 0 or 1 state. A quantum computer instead relies on computational units called “qubits” that store 0 and 1 states simultaneously in superposition, allowing more information to be processed.

    What would make a quantum computer faster than a classical computer is simpler information processing, characterized by the enormous reduction in the number of quantum gates in a quantum circuit compared with a classical circuit.
    In classical computers, the number of gates in a circuit increases exponentially with the size of the problem of interest. This exponential growth is so astonishingly fast that handling even a moderately sized problem becomes physically impossible.
    “For example, even a small protein molecule may contain hundreds of electrons. If each electron can only take two forms, then to simulate 300 electrons would require 2³⁰⁰ classical states, which is more than the number of all the atoms in the universe,” said Sabre Kais, a professor in Purdue’s Department of Chemistry and member of the Purdue Quantum Science and Engineering Institute.
    For quantum computers, there is a way for quantum gates to scale up “polynomially” — rather than just exponentially like a classical computer — with the size of the problem (like the number of electrons in the last example). “Polynomial” means that there would be drastically fewer steps (gates) needed to process the same amount of information, making a quantum algorithm superior to a classical algorithm.
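    To make that scaling contrast concrete, here is a minimal back-of-the-envelope sketch in Python (ours, not from the paper); the 10^80 estimate for atoms in the observable universe and the n³ gate count are illustrative assumptions.

```python
# Illustrative comparison (not from the study): exponential growth of
# classical states vs. a hypothetical polynomial gate count.
n_electrons = 300

# Classical basis states needed to describe 300 two-level systems.
classical_states = 2 ** n_electrons
print(f"2^{n_electrons} is about {classical_states:.2e}")   # ~2.04e+90

# Rough, commonly cited estimate of atoms in the observable universe.
atoms_in_universe = 10 ** 80
print(classical_states > atoms_in_universe)                 # True

# A hypothetical polynomially scaling circuit, e.g. on the order of n^3 gates,
# stays tractable for the same problem size.
print(n_electrons ** 3)                                     # 27,000,000 gates
```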
    Researchers so far haven’t had a good way to identify which quantum states could satisfy this condition of polynomial complexity.
    “There is a very large search space for finding the states and sequence of gates that match up in complexity to create a useful quantum algorithm capable of performing calculations faster than a classical algorithm,” said Kais, whose research group is developing quantum algorithms and quantum machine learning methods.
    Kais and Zixuan Hu, a Purdue postdoctoral associate, used the new theory to identify a large group of quantum states with polynomial complexity. They also showed that these states may share a coefficient feature that could be used to better identify them when designing a quantum algorithm.
    “Given any quantum state, we are now able to design an efficient coefficient sampling procedure to determine if it belongs to the class or not,” Hu said.
    This work is supported by the U.S. Department of Energy (Office of Basic Energy Sciences) under Award No. DE-SC0019215. The Purdue Quantum Science and Engineering Institute is part of Purdue’s Discovery Park.

    Story Source:
    Materials provided by Purdue University. Original written by Kayla Wiles. Note: Content may be edited for style and length.

  • Team's flexible micro LEDs may reshape future of wearable technology

    University of Texas at Dallas researchers and their international colleagues have developed a method to create micro LEDs that can be folded, twisted, cut and stuck to different surfaces.
    The research, published online in June in the journal Science Advances, helps pave the way for the next generation of flexible, wearable technology.
    Used in products ranging from brake lights to billboards, LEDs are ideal components for backlighting and displays in electronic devices because they are lightweight, thin, energy efficient and visible in different types of lighting. Micro LEDs, which can be as small as 2 micrometers and bundled to be any size, provide higher resolution than other LEDs. Their size makes them a good fit for small devices such as smart watches, but they can be bundled to work in flat-screen TVs and other larger displays. LEDs of all sizes, however, are brittle and typically can only be used on flat surfaces.
    The researchers’ new micro LEDs aim to fill a demand for bendable, wearable electronics.
    “The biggest benefit of this research is that we have created a detachable LED that can be attached to almost anything,” said Dr. Moon Kim, Louis Beecherl Jr. Distinguished Professor of materials science and engineering at UT Dallas and a corresponding author of the study. “You can transfer it onto your clothing or even rubber — that was the main idea. It can survive even if you wrinkle it. If you cut it, you can use half of the LED.”
    Researchers in the Erik Jonsson School of Engineering and Computer Science and the School of Natural Sciences and Mathematics helped develop the flexible LED through a technique called remote epitaxy, which involves growing a thin layer of LED crystals on the surface of a sapphire crystal wafer, or substrate.

    Typically, the LED would remain on the wafer. To make it detachable, researchers added a nonstick layer to the substrate, which acts similarly to the way parchment paper protects a baking sheet and allows for the easy removal of cookies, for instance. The added layer, made of a one-atom-thick sheet of carbon called graphene, prevents the new layer of LED crystals from sticking to the wafer.
    “The graphene does not form chemical bonds with the LED material, so it adds a layer that allows us to peel the LEDs from the wafer and stick them to any surface,” said Kim, who oversaw the physical analysis of the LEDs using an atomic resolution scanning/transmission electron microscope at UT Dallas’ Nano Characterization Facility.
    Colleagues in South Korea carried out laboratory tests of LEDs by adhering them to curved surfaces, as well as to materials that were subsequently twisted, bent and crumpled. In another demonstration, they adhered an LED to the legs of a Lego minifigure with different leg positions.
    Bending and cutting do not affect the quality or electronic properties of the LED, Kim said.
    The bendy LEDs have a variety of possible uses, including flexible lighting, clothing and wearable biomedical devices. From a manufacturing perspective, the fabrication technique offers another advantage: Because the LED can be removed without breaking the underlying wafer substrate, the wafer can be used repeatedly.
    “You can use one substrate many times, and it will have the same functionality,” Kim said.
    In ongoing studies, the researchers also are applying the fabrication technique to other types of materials.
    “It’s very exciting; this method is not limited to one type of material,” Kim said. “It’s open to all kinds of materials.”

    Story Source:
    Materials provided by University of Texas at Dallas. Original written by Kim Horner. Note: Content may be edited for style and length.

  • Intelligent software tackles plant cell jigsaw puzzle

    Imagine working on a jigsaw puzzle with so many pieces that even the edges seem indistinguishable from others at the puzzle’s centre. The solution seems nearly impossible. And, to make matters worse, this puzzle is in a futuristic setting where the pieces are not only numerous, but ever-changing. In fact, you not only must solve the puzzle, but “un-solve” it to parse out how each piece brings the picture wholly into focus.
    That’s the challenge molecular and cellular biologists face in sorting through cells to study an organism’s structural origin and the way it develops, known as morphogenesis. If only there were a tool that could help. An eLife paper out this week shows there now is.
    An EMBL research group led by Anna Kreshuk, a computer scientist and expert in machine learning, joined the DFG-funded FOR2581 consortium of plant biologists and computer scientists to develop a tool that could solve this cellular jigsaw puzzle. Starting with computer code and moving on to a more user-friendly graphical interface called PlantSeg, the team built a simple open-access method to provide the most accurate and versatile analysis of plant tissue development to date. The group included expertise from EMBL, Heidelberg University, the Technical University of Munich, and the Max Planck Institute for Plant Breeding Research in Cologne.
    “Building something like PlantSeg that can take a 3D perspective of cells and actually separate them all is surprisingly hard to do, considering how easy it is for humans,” Kreshuk says. “Computers aren’t as good as humans when it comes to most vision-related tasks, as a rule. With all the recent development in deep learning and artificial intelligence at large, we are closer to solving this now, but it’s still not solved — not for all conditions. This paper is the presentation of our current approach, which took some years to build.”
    If researchers want to look at morphogenesis of tissues at the cellular level, they need to image individual cells. Lots of cells means they also have to separate or “segment” them to see each cell individually and analyse the changes over time.
    “In plants, you have cells that look extremely regular, which in a cross-section look like rectangles or cylinders,” Kreshuk says. “But you also have cells with so-called ‘high lobeness’ that have protrusions, making them look more like puzzle pieces. These are more difficult to segment because of their irregularity.”
    Kreshuk’s team trained PlantSeg on 3D microscope images of reproductive organs and developing lateral roots of a common plant model, Arabidopsis thaliana, also known as thale cress. The algorithm needed to factor in the inconsistencies in cell size and shape. Sometimes cells were more regular, sometimes less. As Kreshuk points out, this is the nature of tissue.

    A beautiful side of this research came from the microscopy and images it provided to the algorithm. The results manifested themselves in colourful renderings that delineated the cellular structures, making it easier to truly “see” segmentation.
    “We have giant puzzle boards with thousands of cells and then we’re essentially colouring each one of these puzzle pieces with a different colour,” Kreshuk says.
    Plant biologists have long needed this kind of tool, as morphogenesis is at the crux of many developmental biology questions. This kind of algorithm allows for all kinds of shape-related analysis, for example, analysis of shape changes through development or under a change in environmental conditions, or between species. The paper gives some examples, such as characterising developmental changes in ovules, studying the first asymmetric cell division which initiates the formation of the lateral root, and comparing and contrasting the shape of leaf cells between two different plant species.
    While this tool currently targets plants specifically, Kreshuk points out that it could be tweaked to be used for other living organisms as well.
    Machine learning-based algorithms, like the ones used at the core of PlantSeg, are trained from correct segmentation examples. The group has trained PlantSeg on many plant tissue volumes, so that now it generalises quite well to unseen plant data. The underlying method is, however, applicable to any tissue with cell boundary staining and one could easily retrain it for animal tissue.
    “If you have tissue where you have a boundary staining, like cell walls in plants or cell membranes in animals, this tool can be used,” Kreshuk says. “With this staining and at high enough resolution, plant cells look very similar to our cells, but they are not quite the same. The tool right now is really optimised for plants. For animals, we would probably have to retrain parts of it, but it would work.”
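    As a rough illustration of the boundary-based segmentation idea described above, here is a minimal sketch in Python using scikit-image (our own toy example, not PlantSeg’s actual code): a predicted cell-wall probability map is thresholded to find cell interiors, which then seed a 3D watershed. PlantSeg itself pairs a neural-network boundary predictor with more sophisticated graph partitioning.

```python
# Toy sketch of boundary-map-based cell segmentation (not PlantSeg's pipeline).
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def segment_cells(boundary_prob, threshold=0.5):
    """boundary_prob: 3D array in [0, 1], high where cell walls are predicted."""
    interior = boundary_prob < threshold              # voxels far from predicted walls
    seeds, n_seeds = ndi.label(interior)              # one seed per connected interior region
    labels = watershed(boundary_prob, markers=seeds)  # flood outward; walls act as ridges
    return labels, n_seeds

# Stand-in for a network's boundary prediction: a smoothed random 3D volume.
rng = np.random.default_rng(0)
fake_boundaries = ndi.gaussian_filter(rng.random((64, 64, 64)), sigma=3)
fake_boundaries = (fake_boundaries - fake_boundaries.min()) / np.ptp(fake_boundaries)

labels, n_cells = segment_cells(fake_boundaries)
print(n_cells, labels.shape)
```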
    Currently, PlantSeg is an independent tool, but Kreshuk’s team will eventually merge it into another tool her lab is working on, the ilastik Multicut workflow.

  • Algorithm aims to alert consumers before they use illicit online pharmacies

    Consumers are expected to spend more than $100 billion at online pharmacies in the next few years, but not all of these businesses are legitimate. Without proper quality control, illicit online pharmacies are more than just a commercial threat; they can pose serious health risks.
    In a study, a team of Penn State researchers report that an algorithm they developed may be able to spot illicit online pharmacies that could be providing customers with substandard medications without their knowledge, among other potential problems.
    “There are several problems with illicit online pharmacies,” said Soundar Kumara, the Allen E. Pearce and Allen M. Pearce Professor of Industrial Engineering. “One is they might put bad content into a pill, and the other problem is they might reduce the content of a medicine, so, for example, instead of taking 200 milligrams of a medication, the customers are only taking 100 milligrams — and they probably never realize it.”
    Besides often selling substandard and counterfeit drugs, illicit pharmacies may provide potentially dangerous and addictive drugs, such as opioids, without a prescription, according to the researchers, who report their findings in the Journal of Medical Internet Research, a top-tier peer-reviewed open-access journal in health/medical informatics. The paper is titled “Managing Illicit Online Pharmacies: Web Analytics and Predictive Models Study.”
    The researchers designed the computer model to approach the problem of weeding out good online pharmacies from bad in much the same way that people make comparisons, said Kumara, who is also an associate of Penn State’s Institute for Computational and Data Sciences.
    “The essential question in this study is, how do you know what is good or bad — you create a baseline of what is good and then you compare that baseline with anything else you encounter, which normally tells you whether something is not good,” said Kumara. “This is how we recognize things that might be out of the norm. The same thing applies here. You look at a good online pharmacy and find out what the features are of that site and then you collect the features of other online pharmacies and do a comparison.”
    Hui Zhao, associate professor of supply chain and information systems and the Charles and Lilian Binder Faculty Fellow in the Smeal College of Business, said that sorting legitimate online pharmacies from illicit ones can be a daunting task.

    “It’s very challenging to develop these tools for two reasons,” said Zhao. “First is just the huge scale of the problem. There are at least 32,000 to 35,000 online pharmacies. Second, the nature of online channels because these online pharmacies are so dynamic. They come and go quickly — around 20 a day.”
    According to Sowmyasri Muthupandi, a former research assistant in industrial engineering and currently a data engineer at Facebook, the team looked at several attributes of online pharmacies but identified the relationships between the pharmacies and other sites as a critical attribute in determining whether the business was legitimate or not.
    “One novelty of the algorithm is that we focused mostly on websites that link to these particular pharmacies,” said Muthupandi. “And among all the attributes we found that it’s these referral websites that paint a clearer picture when it comes to classifying online pharmacies.”
    She added that if a pharmacy is mainly reached from referral websites that mostly link to or refer illicit pharmacies, then this pharmacy is more likely to be illicit.
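    A minimal sketch in Python of that referral-based idea (our own illustration with synthetic data, not the researchers’ actual model or features): a classifier trained on one hypothetical attribute, the fraction of a pharmacy’s referring websites that also link to known illicit pharmacies.

```python
# Illustrative only: synthetic data and a single hypothetical feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Feature per site: share of its referral websites that point to known-bad pharmacies.
licit_share = rng.beta(2, 8, size=200)     # legitimate sites: mostly low shares
illicit_share = rng.beta(8, 2, size=200)   # illicit sites: mostly high shares
X = np.concatenate([licit_share, illicit_share]).reshape(-1, 1)
y = np.concatenate([np.zeros(200), np.ones(200)])   # 1 = illicit

model = LogisticRegression().fit(X, y)

# A new pharmacy reached mostly through referrers that link to illicit sites.
print(model.predict_proba([[0.85]])[0, 1])  # high predicted probability of "illicit"
```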
    Zhao said that the algorithm the team developed could help consumers identify illicit online pharmacies, which are estimated to represent up to 75% of all online drug merchants. As an added danger, most consumers are unaware of how prevalent and dangerous these illicit pharmacies are, and consequently use such sites without knowing the potential risks, she said.

    The researchers said a warning system could be developed that alerts the consumer before a purchase that the site may be an illicit pharmacy. Search engines, social media, online markets, such as Amazon, and payment or credit card companies could also use the algorithm to filter out illicit online pharmacies, or take the status of the online pharmacies into consideration when ranking search results, deciding advertising allocations, making payments, or disqualifying vendors.
    Policy makers, government agencies, patient advocacy groups and drug manufacturers could use such a system to identify, monitor and curb illicit online pharmacies, and to educate consumers.
    According to Muthupandi, for future work, researchers may want to consider expanding the number of websites and attributes for analysis to further improve the algorithm’s ability to detect illicit online pharmacies.
    This work was funded through the Smeal Commercialization of Research (SCOR) Grant, established for “Research with Impact.” This particular project was funded collaboratively by the Farrell Center for Corporate Innovation and Entrepreneurship, the College of Engineering’s ENGINE Program and the Penn State Fund for Innovation. The team has also received a patent — U.S. Patent No. 10,672,048 — for this work.

  • What’s behind August 2020’s extreme weather? Climate change and bad luck

    August 2020 has been a devastating month across large swaths of the United States: As powerful Hurricane Laura barreled into the U.S. Gulf Coast on August 27, fires continued to blaze in California. Meanwhile, farmers are still assessing widespread damage to crops in the Midwest following an Aug. 10 “derecho,” a sudden, hurricane-force windstorm.
    Each of these extreme weather events was the result of a particular set of atmospheric — and in the case of Laura, oceanic — conditions. In part, it’s just bad luck that the United States is being slammed with these events back-to-back-to-back. But for some of these events, such as intense hurricanes and more frequent wildfires, scientists have long warned that climate change has been setting the stage for disaster.
    Science News takes a closer look at what causes these kinds of extreme weather events, and the extent to which human-caused climate change may be playing a role in each of them.
    On August 25, NASA’s GOES-West satellite watched as hazy gray smoke emanating from hundreds of wildfires in California drifted eastward, while Hurricane Laura barreled toward Louisiana and Texas. Farther south and east are the wispy remnants of Tropical Storm Marco. Laura made landfall on August 27 as a Category 4 hurricane. Credit: NOAA
    California wildfires
    A “dry lightning” storm, which produced nearly 11,000 bursts of lightning between August 15 and August 19, set off devastating wildfires across California. To date, these fires have burned more than 520,000 hectares.
    That is “an unbelievable number to say out loud, even in the last few years,” says climate scientist Daniel Swain, of the Institute of the Environment and Sustainability at UCLA.
    Lightning crackles over Mitchell’s Cove in Santa Cruz, Calif., on August 16, part of a rare and severe storm system that triggered wildfires across the state. Credit: Shmuel Thaler/The Santa Cruz Sentinel via AP
    The storm itself was the result of a particular, unusual set of circumstances. But the region was already primed for fires, the stage set by a prolonged and record-breaking heat wave in the western United States — including one of the hottest temperatures ever measured on Earth, at Death Valley, Calif. — as well as extreme dryness in the region (SN: 8/17/20). And those conditions bear the fingerprints of climate change, Swain says.
    The extreme dryness is particularly key, he adds. “It’s not just incremental; it absolutely matters how dry it is. You don’t just flip a switch from dry enough to burn to not dry enough to burn. There’s a wide gradient up to dry enough to burn explosively.”
    Both California’s average heat and dryness have become more severe due to climate change, dramatically increasing the likelihood of extreme wildfires. In an Aug. 20 study in Environmental Research Letters, Swain and colleagues noted that over the last 40 years, average autumn temperatures increased across the state by about 1 degree Celsius, and statewide precipitation dropped by about 30 percent. That, in turn, has more than doubled the number of autumn days with extreme fire weather conditions since the early 1980s, they found.
    An unusual dry lightning storm combined with very dry vegetation and a record-breaking heat wave to spark hundreds of wildfires across California between August 15 and August 19. One group of these fires, collectively referred to as the LNU Lightning Complex, blazed through Napa, Sonoma, Solano, Yolo and Lake counties. Firefighters continued to battle the LNU complex fires on August 23, including in unincorporated Lake County, Calif. (shown). Credit: AP Photo/Noah Berger
    Although fall fires in California tend to be more wind-driven, and summertime fires more heat-driven, studies show that the fingerprint of climate change is present in both, Swain says. “A lot of it is very consistent with the long-term picture that scientists were suggesting would evolve.”
    Though the stage had been set by the climate, the particular trigger for the latest fires was a “dry lightning” storm that resulted from a strange confluence of two key conditions, each in itself rare for the region and time of year. “‘Freak storm’ would not be too far off,” Swain says.
    Smoke still engulfed California on August 24, as more than 650 wildfires continued to blaze across the state (red dots indicate likely fire areas). The two largest fires, both in Northern California, were named for the lightning storm that sparked them: the LNU Lightning Complex and the SCU Lightning Complex. They are now second and third on the list of California’s largest wildfires. Credit: NASA Worldview, Earth Observing System Data and Information System (EOSDIS)
    The first was a plume of moisture from Tropical Storm Fausto, far to the south, which managed to travel north to California on the wind and provide just enough moisture to form clouds. The second was a small atmospheric ripple, the remnants of an old thunderstorm complex in the Sonoran Desert. That ripple, Swain says, was just enough to kick-start mixing in the atmosphere; such vertical motion is the key to thunderstorms. The resulting clouds were stormy but very high, their bases at least 3,000 meters aboveground. They produced plenty of lightning, but most rain would have evaporated during the long dry journey down.
    Possible links between climate change and the conditions that led to such a dry lightning storm would be “very hard to disentangle,” Swain says. “The conditions are rare to begin with, and not well modeled from a weather perspective.”
    But, he adds, “we know there’s a climate signal in the background conditions that allowed that rare event to have the outcome it did.”
    Midwest derecho
    On August 10, a powerful windstorm with the ferocity of a hurricane traveled over 1,200 kilometers in just 14 hours, leaving a path of destruction from eastern South Dakota to western Ohio.
    The storm was what’s known as a derecho, roughly translating to “straight ahead.” These storms have winds rivaling the strength of a hurricane or tornado, but push forward in one direction instead of rotating. By definition, a derecho produces sustained winds of at least 93 kilometers per hour (similar to the fury of tropical storm-force winds), nearly continuously, for at least 400 kilometers. Their power is equally devastating: The August derecho flattened millions of hectares of crops, uprooted trees, damaged homes, flipped trucks and left hundreds of thousands of people without power.
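    Applied literally, that definition amounts to a simple check on a storm’s wind swath. Here is a minimal sketch (ours, with hypothetical numbers; it ignores the “nearly continuously” requirement for brevity):

```python
# Hypothetical track segments of (length_km, sustained_wind_kmh).
def is_derecho_candidate(track, min_wind_kmh=93, min_length_km=400):
    """True if segments with winds >= 93 km/h add up to at least 400 km."""
    swath = sum(length for length, wind in track if wind >= min_wind_kmh)
    return swath >= min_length_km

track = [(150, 110), (200, 105), (180, 98), (120, 88)]  # made-up storm track
print(is_derecho_candidate(track))  # True: 530 km of qualifying winds
```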
    A powerful derecho on August 10 twisted these corn and soybean grain bins in Luther, Iowa. The storm-force winds swept 1,200 kilometers across the U.S. Midwest, from South Dakota to Ohio, damaging homes and croplands and leaving hundreds of thousands of people without power. Credit: Daniel Acker/Getty Images
    The Midwest has had many derechos before, says Alan Czarnetzki, a meteorologist at the University of Northern Iowa in Cedar Falls. What made this one significant and unusual was its intensity and scale — and, Czarnetzki notes, the fact that it took even researchers by surprise.
    Derechos originate within a mesoscale convective system — a vast, organized system of thunderclouds that are the basic building block for many different kinds of storms, including hurricanes and tornadoes. Unlike the better-known rotating supercells, however, derechos form from long bands of swiftly moving thunderstorms, sometimes called squall lines. In hindsight, derechos are easy to recognize. In addition to the length and strength conditions, derechos acquire a distinctive bowlike shape on radar images; this one appeared as though the storm was aiming its arrow eastward.
    But the storms are much more difficult to forecast, because the conditions that can lead them to form can be very subtle. And there’s overall less research on these storms than on their more dramatic cousins, tornadoes. “We have to rely on situational awareness,” Czarnetzki says. “Like people, sometimes you can have an exceptional storm arise from very humble origins.”

    The Aug. 10 derecho was particularly long and strong, with sustained winds in some places of up to 160 kilometers per hour (100 miles an hour). Still, such a strong derecho is not unheard of, Czarnetzki says. “It’s probably every 10 years you’d see something this strong.”
    Whether such strong derechos might become more, or less, common due to climate change is difficult to say, however. Some anticipated effects of climate change, such as warming at the planet’s surface, could increase the likelihood of more and stronger derechos by increasing atmospheric instability. But warming higher in the atmosphere, also a possible result of climate change, could similarly increase atmospheric stability, Czarnetzki says. “It’s a straightforward question with an uncertain answer.”
    Atlantic hurricanes
    Hurricane Laura roared ashore in Louisiana in the early morning hours of August 27 as a Category 4 hurricane, with sustained winds of about 240 kilometers per hour (150 miles per hour). Just two days earlier, the storm had been a Category 1. But in the mere 24 hours from August 25 to August 26, the storm rapidly intensified, supercharged by warm waters in the Gulf of Mexico.
    Hurricane Laura intensified rapidly due to the warm waters of the Gulf of Mexico, strengthening from a Category 1 hurricane on August 25 to a Category 4 on August 26 (shown). The U.S. National Hurricane Center warned coastal residents of Louisiana and Texas to expect a storm surge — ocean waters elevated by the storm above the normal tide level — of as much as five meters. Credit: NOAA
    The Atlantic hurricane season is already setting several new records, with the National Oceanic and Atmospheric Administration predicting as many as 25 named storms, the most the agency has ever anticipated (SN: 8/7/20).
    At present, 2005 still holds the record for the most named storms to actually form in the Atlantic in a given season, at 28 (SN: 8/22/18). But 2020 may yet surpass that record. By August 26, 13 named storms had already formed in the Atlantic, the most ever before September.
    The previous week, researchers pondered whether another highly unusual set of circumstances might be in the offing. As Laura’s track shifted southward, away from Florida, tropical storm Marco appeared to be on track to enter the Gulf of Mexico right behind it. That might have induced a type of physical interaction known as a Fujiwhara effect, in which a strong storm might strengthen further as it absorbs the energy of a lesser storm. In perhaps a stroke of good luck in the midst of this string of weather extremes, Marco dissipated instead.
    As Hurricane Laura approached landfall, the U.S. National Hurricane Center warned that “unsurvivable” storm surges of up to five meters could inundate the Gulf Coast in parts of Texas and Louisiana. Storm surge is the height to which the seawater level rises as a result of a storm, on top of the normal tidal level.
    Debris litters Lake Charles, La., in the aftermath of Hurricane Laura’s landfall August 27. Credit: AP Photo/Gerald Herbert
    It’s impossible to attribute the fury of any one storm to climate change, but scientists have observed a statistically significant link between warmer waters and hurricane intensity. Warm waters in the Atlantic Ocean, the result of climate change, juiced up 2017’s hurricanes, including Irma and Maria, researchers have found (SN: 9/28/18).
    And the Gulf of Mexico’s bathlike waters have notably supercharged several hurricanes in recent years. In 2018, for example, Hurricane Michael intensified rapidly before slamming into the Florida panhandle (SN: 10/10/18). And in 2005, hurricanes Katrina and Rita did the same before making landfall (SN: 9/13/05).
    As for Laura, one contributing factor to its rapid intensification was a drop in wind shear as it spun through the Gulf. Wind shear, a change in the speed and/or direction of winds with height, can disrupt a storm’s structure, robbing it of some of its power. But the Gulf’s warmer-than-average waters, which in some locations approached 32.2° C (90° Fahrenheit), were also key to the storm’s sudden strength. And, by warming the oceans, climate change is also setting the stage for supercharged storms, scientists say.

  • Brain-inspired electronic system could vastly reduce AI's carbon footprint

    Extremely energy-efficient artificial intelligence is now closer to reality after a study by UCL researchers found a way to improve the accuracy of a brain-inspired computing system.
    The system, which uses memristors to create artificial neural networks, is at least 1,000 times more energy efficient than conventional transistor-based AI hardware, but has until now been more prone to error.
    Existing AI is extremely energy-intensive — training one AI model can generate 284 tonnes of carbon dioxide, equivalent to the lifetime emissions of five cars. Replacing the transistors that make up all digital devices with memristors, a novel electronic device first built in 2008, could reduce this to a fraction of a tonne of carbon dioxide — equivalent to emissions generated in an afternoon’s drive.
    Since memristors are so much more energy-efficient than existing computing systems, they can potentially pack huge amounts of computing power into hand-held devices, removing the need to be connected to the Internet.
    This is especially important as over-reliance on the Internet is expected to become problematic in future due to ever-increasing data demands and the difficulties of increasing data transmission capacity past a certain point.
    In the new study, published in Nature Communications, engineers at UCL found that accuracy could be greatly improved by getting memristors to work together in several sub-groups of neural networks and averaging their calculations, meaning that flaws in each of the networks could be cancelled out.

    Memristors, described as “resistors with memory” because they remember the amount of electric charge that flowed through them even after being turned off, were considered revolutionary when they were first built over a decade ago, a “missing link” in electronics to supplement the resistor, capacitor and inductor. They have since been manufactured commercially in memory devices, but the research team say they could be used to develop AI systems within the next three years.
    Memristors offer vastly improved efficiency because they operate not just in a binary code of ones and zeros, but at multiple levels between zero and one at the same time, meaning more information can be packed into each bit.
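    As a rough worked example of that claim (ours, not from the study), the information a single device can hold grows with the base-2 logarithm of the number of conductance levels it can reliably distinguish:

```python
import math

for levels in (2, 4, 8, 16):   # 2 levels = an ordinary binary cell
    print(f"{levels} levels -> {math.log2(levels):.0f} bits per device")
# 2 -> 1, 4 -> 2, 8 -> 3, 16 -> 4 bits per device
```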
    Moreover, memristors are often described as a neuromorphic (brain-inspired) form of computing because, like in the brain, processing and memory are implemented in the same adaptive building blocks, in contrast to current computer systems that waste a lot of energy in data movement.
    In the study, Dr Adnan Mehonic, PhD student Dovydas Joksas (both UCL Electronic & Electrical Engineering), and colleagues from the UK and the US tested the new approach in several different types of memristors and found that it improved the accuracy of all of them, regardless of material or particular memristor technology. It also worked for a number of different problems that may affect memristors’ accuracy.
    Researchers found that their approach increased the accuracy of the neural networks for typical AI tasks to a comparable level to software tools run on conventional digital hardware.
    Dr Mehonic, director of the study, said: “We hoped that there might be more generic approaches that improve not the device-level, but the system-level behaviour, and we believe we found one. Our approach shows that, when it comes to memristors, several heads are better than one. Arranging the neural network into several smaller networks rather than one big network led to greater accuracy overall.”
    Dovydas Joksas further explained: “We borrowed a popular technique from computer science and applied it in the context of memristors. And it worked! Using preliminary simulations, we found that even simple averaging could significantly increase the accuracy of memristive neural networks.”
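    A toy simulation of that averaging idea (our own sketch in NumPy, not the UCL team’s hardware or code): several “noisy” linear layers stand in for memristor crossbars with device variability, and averaging their outputs reduces the error relative to any single one.

```python
import numpy as np

rng = np.random.default_rng(1)
n_inputs, n_outputs = 64, 10

W_ideal = rng.normal(size=(n_inputs, n_outputs))   # ideal trained weights
x = rng.normal(size=n_inputs)                      # one input vector
y_ideal = x @ W_ideal                              # noise-free reference output

def noisy_output(noise_std=0.2):
    """Output of one crossbar whose weights deviate from the ideal values."""
    W_device = W_ideal + rng.normal(scale=noise_std, size=W_ideal.shape)
    return x @ W_device

single = noisy_output()
committee = np.mean([noisy_output() for _ in range(10)], axis=0)

print("error, single network :", np.linalg.norm(single - y_ideal))
print("error, 10-network mean:", np.linalg.norm(committee - y_ideal))
# The averaged committee's error is typically about sqrt(10) times smaller.
```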
    Professor Tony Kenyon (UCL Electronic & Electrical Engineering), a co-author on the study, added: “We believe now is the time for memristors, on which we have been working for several years, to take a leading role in a more energy-sustainable era of IoT devices and edge computing.”