More stories

  • Mathematicians derive the formulas for boundary layer turbulence 100 years after the phenomenon was first formulated

    Turbulence makes many people uneasy or downright queasy. And it’s given researchers a headache, too. Mathematicians have been trying for a century or more to understand the turbulence that arises when a flow interacts with a boundary, but a formulation has proven elusive.
    Now an international team of mathematicians, led by UC Santa Barbara professor Björn Birnir and the University of Oslo professor Luiza Angheluta, has published a complete description of boundary layer turbulence. The paper appears in Physical Review Research, and synthesizes decades of work on the topic. The theory unites empirical observations with the Navier-Stokes equation — the mathematical foundation of fluid dynamics — into a mathematical formula.
    This phenomenon was first described around 1920 by Hungarian physicist Theodore von Kármán and German physicist Ludwig Prandtl, two luminaries in fluid dynamics. “They were honing in on what’s called boundary layer turbulence,” said Birnir, director of the Center for Complex and Nonlinear Science. This is turbulence caused when a flow interacts with a boundary, such as the fluid’s surface, a pipe wall, the surface of the Earth and so forth.
    Prandtl figured out experimentally that he could divide the boundary layer into four distinct regions based on proximity to the boundary. The viscous layer forms right next to the boundary, where turbulence is damped by the viscosity of the fluid. Next comes a transitional buffer region, followed by the inertial region, where turbulence is most fully developed. Finally, there is the wake, where the boundary layer flow is least affected by the boundary and is described by a formula due to von Kármán.
    The fluid flows faster farther from the boundary, but its velocity changes in a very specific manner. Its average velocity increases in the viscous and buffer layers, then transitions to a logarithmic function of distance in the inertial layer. This “log law,” found by Prandtl and von Kármán, has perplexed researchers, who have worked to understand where it comes from and how to describe it.
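    For reference, the log law has a standard textbook form, often called the “law of the wall.” The constants below are the commonly cited empirical values from the fluid dynamics literature, not figures taken from the new paper:

    \[
    u^{+} = \frac{1}{\kappa}\,\ln y^{+} + B, \qquad \kappa \approx 0.41, \quad B \approx 5.0,
    \]

    where \(u^{+}\) is the mean velocity and \(y^{+}\) the distance from the boundary, both expressed in viscous “wall units,” and \(\kappa\) is the von Kármán constant.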
    The flow’s variation — or deviation from the mean velocity — also displayed peculiar behavior across the boundary layer. Researchers sought to understand these two variables and derive formulas that could describe them.

  • Researchers develop rapid computer software to track pandemics as they happen

    Researchers at Georgia State University have created lightning-fast computer software that can help nations track and analyze pandemics, like the one caused by COVID-19, before they spread like wildfire around the globe.
    The group of computer science and mathematics researchers says its new software is several orders of magnitude faster than existing computer programs and can process more than 200,000 novel virus genomes in less than two hours. The software then builds a clear visual tree of the strains and where they are spreading. This provides information that can be invaluable for countries making early decisions about lockdowns, quarantines, social distancing and testing during infectious disease outbreaks.
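    The article doesn’t include the team’s code, but the kind of task it describes, turning thousands of genomes into a tree of related strains, can be illustrated with a toy sketch. Everything below (the strain names, mutation profiles, Hamming distance and average-linkage clustering) is a hypothetical stand-in for illustration, not the published algorithm:

```python
# Toy sketch of the task described above: grouping virus genomes into a
# tree of related strains. This is NOT the Georgia State team's algorithm;
# all names, profiles and method choices are illustrative stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, to_tree
from scipy.spatial.distance import pdist

# Each genome as a 0/1 vector: does it carry mutation i? Real pipelines
# derive such profiles from aligned genome sequences.
labels = ["strain_A", "strain_B", "strain_C", "strain_D"]
profiles = np.array([
    [1, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 1, 1, 1],
])

# Hamming distance = fraction of differing mutations; average linkage
# then merges the closest strains first, yielding a strain tree.
tree = to_tree(linkage(pdist(profiles, metric="hamming"), method="average"))

def show(node, depth=0):
    """Print the tree with simple indentation."""
    if node.is_leaf():
        print("  " * depth + labels[node.id])
    else:
        print("  " * depth + f"clade (height {node.dist:.2f})")
        show(node.left, depth + 1)
        show(node.right, depth + 1)

show(tree)
```

    A production tool must also cope with recurrent mutations, the same mutation arising independently on different branches, which is exactly the complication the paper’s title highlights and part of what makes doing this for 200,000-plus genomes hard.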
    “The future of infectious outbreaks will no doubt be heavily data driven,” said Alexander Zelikovsky, a Georgia State computer science professor who worked on the project.
    The new software was co-created with Pavel Skums, assistant professor of computer science, Mark Grinshpon, principal senior lecturer of mathematics and statistics, Daniel Novikov, a computer science Ph.D. student, and two former Georgia State Ph.D. students — Sergey Knyazev (now a postdoctoral scholar at the University of California at Los Angeles) and Pelin Icer (now a postdoctoral scholar at Swiss Federal Institute of Technology, ETH Zürich).
    Their paper describing the new approach, “Scalable Reconstruction of SARS-CoV-2 Phylogeny with Recurrent Mutations,” was published in the Journal of Computational Biology.
    “The COVID-19 pandemic has been an unprecedented challenge and opportunity for scientists,” said Skums, who noted that never before have researchers around the world sequenced so many complete genomes of any virus. The strains of SARS-CoV-2 are uploaded onto the free global GISAID database (https://www.gisaid.org/hcov19-variants/), where they can be data-mined and studied by any scientist. Zelikovsky, Skums and their colleagues analyzed more than 300,000 different GISAID strains for their new work.

  • A nanoantenna for long-distance, ultra-secure communication

    Information storage and transfer in the manner of simple ones and zeros — as in today’s classical computer technologies — is insufficient for quantum technologies under development. Now, researchers from Japan have fabricated a nanoantenna that will help bring quantum information networks closer to practical use.
    In a study recently published in Applied Physics Express, researchers from Osaka University and collaborating partners have substantially enhanced photon-to-electron conversion through a metal nanostructure, which is an important step forward in the development of advanced technologies for sharing and processing data.
    Classical computer information is based on simple on/off readouts. It’s straightforward to use a technology known as a repeater to amplify and retransmit this information over long distances. Quantum information is based on comparatively more complex and secure readouts, such as photon polarization and electron spin. Semiconductor nanoboxes known as quantum dots are materials that researchers have proposed for storing and transferring quantum information. However, quantum repeater technologies have some limitations — for example, current ways to convert photon-based information to electron-based information are highly inefficient. Overcoming this information conversion and transfer challenge is what the researchers at Osaka University aimed to address.
    “The efficiency of converting single photons into single electrons in gallium arsenide quantum dots — common materials in quantum communication research — is currently too low,” explains lead author Rio Fukai. “Accordingly, we designed a nanoantenna — consisting of ultra-small concentric rings of gold — to focus light onto a single quantum dot, resulting in a voltage readout from our device.”
    The researchers enhanced photon absorption by a factor of up to 9, compared with not using the nanoantenna. After illuminating a single quantum dot, most of the photogenerated electrons weren’t trapped there, and instead accumulated in impurities or other locations in the device. Nevertheless, these excess electrons gave a minimal voltage readout that was readily distinguished from that generated by the quantum dot electrons, and thus didn’t disrupt the device’s intended readout.
    “Theoretical simulations indicate that we can improve the photon absorption by up to a factor of 25,” says senior author Akira Oiwa. “Improving the alignment of the light source and more precisely fabricating the nanoantenna are ongoing research directions in our group.”
    These results have important applications. Researchers now have a means of using well-established nano-photonics to advance the prospects of upcoming quantum communication and information networks. By exploiting quantum properties such as entanglement and superposition, quantum technology could provide unprecedented information security and data processing in the coming decades.
    Story Source:
    Materials provided by Osaka University. Note: Content may be edited for style and length.

  • Research team makes considerable advance in brain-inspired computing

    While the public often associates AI with software, researchers in Han Wang’s Emerging Nanoscale Materials and Device Lab, part of the USC Ming Hsieh Department of Electrical and Computer Engineering and the Mork Family Department of Chemical Engineering and Materials Science, focus on improving AI and machine learning performance through hardware. The lab, whose work concentrates on neuromorphic, or brain-inspired, computing, has new research that introduces hardware improvements by harnessing a quality known as randomness, or “stochasticity.” Their research, now published in Nature Communications, contradicts the perception of randomness as a quality that degrades computation results, and demonstrates how finely controlled stochastic features in semiconductor devices can improve optimization.
    In the brain, randomness plays an important role in thought and computation. It arises from the billions of neurons that spike in response to input stimuli, generating many signals that may or may not be relevant. Decision-making is perhaps the best-studied example of how the brain makes use of randomness: it allows the brain to take a detour from past experience and explore a new solution, especially in a challenging and unpredictable situation.
    “Neurons exhibit stochastic behavior, which can help certain computational functions,” said USC Ph.D. student Jiahui Ma and lead author Xiaodong Yan, who contributed equally as first authors. The team wanted to emulate neurons as closely as possible and designed a circuit to solve combinatorial optimization problems, which are among the most important tasks for computers to complete.
    The thinking is that for computers to do this efficiently, they need to behave more like the human brain (on super steroids) in terms of how they process stimuli and information, as well as make decisions.
    In much simpler terms, we need computers to converge on the best solution among all possibilities. The researchers say, “The randomness introduced in the new device demonstrated in this work can prevent it from getting stuck at a not-so-viable solution, and instead continue to search until it finds a close-to-optimal result.” This is particularly important for optimization problems, says corresponding author Professor Wang: “If one can dynamically tune the randomness features, the machine for performing optimization can work more efficiently as we desire.”
    The researchers achieve this dynamic “tuning” by creating a specialized device, a hetero-memristor. Unlike transistors, which are the logic switches inside a regular computer chip, the hetero-memristor combines memory and computation. Memristors have been developed before, normally with a two-terminal structure. The Viterbi team’s innovation is adding a third electrical terminal and modulating its voltage to activate the neuron-like device and dynamically tune the stochastic features of its output, much as one heats a pot of water and adjusts the temperature to control the activity of the water molecules, enabling so-called simulated “cooling.” This provides a level of control that earlier memristors do not have.
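    The water analogy maps onto a classic software technique, simulated annealing, in which a “temperature” knob plays exactly the role of the device’s tunable randomness and is gradually lowered to settle on a good solution. The sketch below is a software analogy of that concept, not a model of the hetero-memristor; the small max-cut problem, graph and cooling schedule are all hypothetical.

```python
# Simulated annealing on a tiny max-cut instance: a software stand-in for
# hardware with tunable stochasticity. High "temperature" = more randomness,
# which lets the search escape poor local solutions; cooling reduces it.
import math
import random

random.seed(0)

# Hypothetical 4-node weighted graph, as (i, j, weight) edges.
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (0, 3, 1.0), (0, 2, 1.0)]
n = 4

def cut_value(assign):
    # Total weight of edges whose endpoints fall on opposite sides.
    return sum(w for i, j, w in edges if assign[i] != assign[j])

assign = [random.randint(0, 1) for _ in range(n)]  # random initial partition
temperature = 2.0
while temperature > 0.01:
    candidate = assign[:]
    candidate[random.randrange(n)] ^= 1  # flip one node to the other side
    delta = cut_value(candidate) - cut_value(assign)
    # Always accept improvements; accept worse moves with probability
    # exp(delta / T), so the randomness shrinks as the system "cools."
    if delta >= 0 or random.random() < math.exp(delta / temperature):
        assign = candidate
    temperature *= 0.99  # cooling schedule

print("partition:", assign, "cut value:", cut_value(assign))
```

    In the hardware version, that temperature knob is the third terminal’s voltage, which tunes the device’s intrinsic stochasticity rather than a pseudo-random number generator.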
    The researchers say, “This method emulates the stochastic properties of neuron activity.” In fact, neuron activity is perceived to be random, but may follow a certain probability pattern. The hetero-memristors they developed introduce such probability-governed randomness into a neuromorphic computing circuit by the reconfigurable tuning of the device’s intrinsic stochastic property.
    This is thus a more sophisticated building block for creating computers that can tackle challenging optimization problems, potentially more efficiently. What’s more, they can consume less power.
    The full research team includes Xiaodong Yan, Jiahui Ma, Tong Wu, Aoyang Zhang, Jiangbin Wu, Matthew Chin, Zhihan Zhang, Madan Dubey, Wei Wu, Mike Shuo-Wei Chen, Jing Guo and Han Wang.
    Research was done in collaboration with the Army Research Laboratory, the University of Florida and Georgia Tech.
    Story Source:
    Materials provided by University of Southern California. Original written by Amy Blumenthal. Note: Content may be edited for style and length.

  • Big data privacy for machine learning just got 100 times cheaper

    Rice University computer scientists have discovered an inexpensive way for tech companies to implement a rigorous form of personal data privacy when using or sharing large databases for machine learning.
    “There are many cases where machine learning could benefit society if data privacy could be ensured,” said Anshumali Shrivastava, an associate professor of computer science at Rice. “There’s huge potential for improving medical treatments or finding patterns of discrimination, for example, if we could train machine learning systems to search for patterns in large databases of medical or financial records. Today, that’s essentially impossible because data privacy methods do not scale.”
    Shrivastava and Rice graduate student Ben Coleman hope to change that with a new method they’ll present this week at CCS 2021, the Association for Computing Machinery’s annual flagship conference on computer and communications security. Using a technique called locality sensitive hashing, Shrivastava and Coleman found they could create a small summary of an enormous database of sensitive records. Dubbed RACE, their method draws its name from these summaries, or “repeated array of count estimators” sketches.
    Coleman said RACE sketches are both safe to make publicly available and useful for algorithms that use kernel sums, one of the basic building blocks of machine learning, and for machine-learning programs that perform common tasks like classification, ranking and regression analysis. He said RACE could allow companies to both reap the benefits of large-scale, distributed machine learning and uphold a rigorous form of data privacy called differential privacy.
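    For intuition, here is a minimal sketch of the idea behind a RACE-style summary: repeated arrays of counters indexed by locality sensitive hashes, so that similar records collide and the averaged counts approximate a kernel sum (a smoothed count of how much data lies near a query). The hash family (signed random projections), dimensions and parameters below are illustrative assumptions, not the authors’ released implementation, and a differentially private release would additionally add calibrated noise to the counters.

```python
# RACE-style sketch (illustrative, not the published code): a (reps x width)
# array of counters, each row indexed by an independent LSH function.
import numpy as np

rng = np.random.default_rng(0)
d, reps, width = 8, 50, 16        # data dimension, arrays, buckets per array
bits = int(np.log2(width))        # hyperplane bits concatenated per array

# Signed random projections: one set of random hyperplanes per array.
planes = rng.normal(size=(reps, bits, d))

def lsh(x):
    # Which side of each hyperplane x falls on -> one bucket index per array.
    signs = (planes @ x > 0).astype(int)            # shape (reps, bits)
    return (signs * (2 ** np.arange(bits))).sum(1)  # shape (reps,)

counts = np.zeros((reps, width))
data = rng.normal(size=(1000, d))  # stand-in for sensitive records
for x in data:
    counts[np.arange(reps), lsh(x)] += 1  # one counter increment per array

def kernel_sum_estimate(q):
    # Averaging q's counters across arrays estimates how much of the
    # dataset is "near" q, without exposing any individual record.
    return counts[np.arange(reps), lsh(q)].mean()

print(kernel_sum_estimate(data[0]))                 # near the data: large
print(kernel_sum_estimate(rng.normal(size=d) * 5))  # far from it: small
```

    The summary itself is just the small array of counts, which is why it is cheap to publish and easy to merge across machines.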
    Differential privacy, which is used by more than one tech giant, is based on the idea of adding random noise to obscure individual information.
    “There are elegant and powerful techniques to meet differential privacy standards today, but none of them scale,” Coleman said. “The computational overhead and the memory requirements grow exponentially as data becomes more dimensional.”
    Data is increasingly high-dimensional, meaning it contains both many observations and many individual features about each observation.
    RACE sketching scales for high-dimensional data, he said. The sketches are small and the computational and memory requirements for constructing them are also easy to distribute.
    “Engineers today must either sacrifice their budget or the privacy of their users if they wish to use kernel sums,” Shrivastava said. “RACE changes the economics of releasing high-dimensional information with differential privacy. It’s simple, fast and 100 times less expensive to run than existing methods.”
    This is the latest innovation from Shrivastava and his students, who have developed numerous algorithmic strategies to make machine learning and data science faster and more scalable. They and their collaborators have: found a more efficient way for social media companies to keep misinformation from spreading online, discovered how to train large-scale deep learning systems up to 10 times faster for “extreme classification” problems, found a way to more accurately and efficiently estimate the number of identified victims killed in the Syrian civil war, showed it’s possible to train deep neural networks as much as 15 times faster on general purpose CPUs (central processing units) than GPUs (graphics processing units), and slashed the amount of time required for searching large metagenomic databases.
    The research was supported by the Office of Naval Research’s Basic Research Challenge program, the National Science Foundation, the Air Force Office of Scientific Research and Adobe Inc.
    Story Source:
    Materials provided by Rice University. Original written by Jade Boyd. Note: Content may be edited for style and length.

  • Researchers train computers to predict the next designer drugs

    UBC researchers have trained computers to predict the next designer drugs before they are even on the market, technology that could save lives.
    Law enforcement agencies are in a race to identify and regulate new versions of dangerous psychoactive drugs such as bath salts and synthetic opioids, even as clandestine chemists work to synthesize and distribute new molecules with the same psychoactive effects as classical drugs of abuse.
    Identifying these so-called “legal highs” within seized pills or powders can take months, during which time thousands of people may have already used a new designer drug.
    But new research is already helping law enforcement agencies around the world to cut identification time down from months to days, crucial in the race to identify and regulate new versions of dangerous psychoactive drugs.
    “The vast majority of these designer drugs have never been tested in humans and are completely unregulated. They are a major public health concern to emergency departments across the world,” says UBC medical student Dr. Michael Skinnider, who completed the research as a doctoral student at UBC’s Michael Smith Laboratories.
    A Minority Report for new designer drugs
    Dr. Skinnider and his colleagues used a database of known psychoactive substances contributed by forensic laboratories around the world to train an artificial intelligence algorithm on the structures of these drugs. The algorithm they used, known as a deep neural network, is inspired by the structure and function of the human brain.
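    The article stops short of implementation detail, but the core idea, learning the structural statistics of known drugs and then proposing plausible new structures, can be shown with a toy stand-in. The sketch below uses a simple character-level Markov chain over SMILES strings purely for illustration; the actual study trained a deep neural network, and the training strings here are ordinary small molecules used as hypothetical stand-ins, not psychoactive substances.

```python
# Toy generative model over SMILES strings: learn which character tends to
# follow which in known structures, then sample new candidate strings.
# (A stand-in for the deep model described in the article, not the real one.)
import random
from collections import defaultdict

random.seed(1)
training_smiles = ["CCO", "CCN", "CC(C)N", "CCCN", "CC(N)C"]  # stand-ins

transitions = defaultdict(list)   # char -> list of observed next chars
for s in training_smiles:
    padded = "^" + s + "$"        # "^" marks start, "$" marks end
    for a, b in zip(padded, padded[1:]):
        transitions[a].append(b)

def sample(max_len=10):
    out, ch = [], "^"
    while len(out) < max_len:
        ch = random.choice(transitions[ch])
        if ch == "$":             # reached an end marker: string complete
            break
        out.append(ch)
    return "".join(out)

candidates = {sample() for _ in range(20)}
print(candidates)  # a real pipeline would filter for chemical validity
```

    If the study’s network works generatively, it plays the role of this toy chain while learning far richer structural patterns from the forensic database.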

  • Ultra-large single-crystal WS2 monolayer

    As silicon-based semiconductor technology approaches the limit of its performance, new materials that may replace, or partially replace, silicon are highly desired. Recently, the emergence of graphene and other two-dimensional (2D) materials has offered a new platform for building next-generation semiconductor technology. Among them, transition metal dichalcogenides (TMDs) such as MoS2, WS2, MoSe2 and WSe2 stand out as the most appealing 2D semiconductors.
    A prerequisite for building ultra-large-scale, high-performance semiconductor circuits is that the base material must be a wafer-scale single crystal, just like the silicon wafers used today. Although great efforts have been dedicated to growing wafer-scale single crystals of TMDs, success had been very limited until now.
    Distinguished Professor Feng Ding and his research team at the Center for Multidimensional Carbon Materials (CMCM) within the Institute for Basic Science (IBS) at UNIST, in cooperation with researchers at Peking University (PKU), the Beijing Institute of Technology and Fudan University, recently reported the direct growth of 2-inch single-crystal WS2 monolayer films. Besides WS2, the research team also demonstrated the wafer-scale growth of single-crystal MoS2, WSe2 and MoSe2.
    The key to epitaxially growing a large single crystal is ensuring that all the small single crystals grown on the substrate are uniformly aligned. Because TMDs have a non-centrosymmetric structure (the mirror image of a TMD with respect to one of its edges has the opposite orientation), this symmetry must be broken by carefully designing the substrate. Based on theoretical calculations, the authors proposed a mechanism of “dual-coupling-guided epitaxy growth” to inform the experimental design. The WS2-sapphire plane interaction acts as the first driving force, leading to two preferred antiparallel orientations of the WS2 islands. The coupling between WS2 and the sapphire step edges acts as the second driving force, breaking the degeneracy of the two antiparallel orientations. All the TMD single crystals grown on a substrate with step edges are then aligned in a single direction, and the coalescence of these small single crystals finally yields a large single crystal the size of the substrate.
    “This dual-coupling epitaxy growth mechanism is new for controllable materials growth. In principle, it allows us to grow all 2D materials into large-area single crystals if a proper substrate can be found,” says Dr. Ting Cheng, the co-first author of the study. “We have considered how to choose proper substrates theoretically. First, the substrate should have low symmetry; secondly, more step edges are preferred,” emphasizes Professor Feng Ding, the corresponding author of the study.
    “This is a major step forward in the area of 2D-materials-based devices. With the successful growth of wafer-scale single-crystal 2D TMDs on insulators, beyond graphene and hBN on transition metal substrates, our study provides the keystone that 2D semiconductors need for high-end applications in electronic and optical devices,” explains Professor Feng Ding.
    Story Source:
    Materials provided by Institute for Basic Science. Note: Content may be edited for style and length.

  • Scientists are racing to save the Last Ice Area, an Arctic Noah’s Ark

    It started with polar bears.

    In 2012, polar bear DNA revealed that the iconic species had faced extinction before, likely during a warm period 130,000 years ago, but had rebounded. For researchers, the discovery led to one burning question: Could polar bears make a comeback again?

    Studies like this one have emboldened an ambitious plan to create a refuge where Arctic, ice-dependent species, from polar bears down to microbes, could hunker down and wait out climate change. For this, conservationists are pinning their hopes on a region in the Arctic dubbed the Last Ice Area — where ice that persists all summer long will survive the longest in a warming world.

    Here, the Arctic will take its last stand. But how long the Last Ice Area will hold on to its summer sea ice remains unclear. A computer simulation released in September predicts that the Last Ice Area could retain its summer sea ice indefinitely if emissions from fossil fuels don’t warm the planet more than 2 degrees Celsius above preindustrial levels, which is the goal set by the 2015 Paris Climate Agreement (SN: 12/12/15). But a recent report by the United Nations found that the climate is set to warm 2.7 degrees Celsius by 2100 under current pledges to reduce emissions, spelling the end of the Arctic’s summer sea ice (SN: 10/26/21).

    Nevertheless, some scientists are hoping that humankind will rally to curb emissions and implement technology to capture carbon and other greenhouse gases, which could reduce, or even reverse, the effects of climate change on sea ice. In the meantime, the Last Ice Area could buy ice-dependent species time in the race against extinction, acting as a sanctuary where they can survive climate change, and maybe one day, make their comeback.

    Ecosystem of the frozen sea

    The Last Ice Area is a vast floating landscape of solid ice extending from the northern coast of Greenland to Canada’s Banks Island in the west. This region, roughly the length of the West Coast of the United States, is home to the oldest and thickest ice in the Arctic, thanks to an archipelago of islands in Canada’s far north that prevents sea ice from drifting south and melting in the Atlantic.

    As sea ice from other parts of the Arctic rams into this natural barrier, it piles up, forming long towering ice ridges that run for kilometers across the frozen landscape. From above, the area appears desolate. “It’s a pretty quiet place,” says Robert Newton, an oceanographer at Columbia University and coauthor of the recent sea ice model, published September 2 in Science. “A lot of the life is on the bottom of the ice.”

    The muddy underbelly of icebergs is home to plankton and single-celled algae that evolved to grow directly on ice. These species form the backbone of an ecosystem that feeds everything from tiny crustaceans all the way up to beluga whales, ringed seals and polar bears.

    These plankton and algae species can’t survive without ice. So as summer sea ice disappears across the Arctic, the foundation of this ecosystem is literally melting away. “Much of the habitat Arctic species depend on will become uninhabitable,” says Brandon Laforest, an Arctic expert at World Wildlife Fund Canada in Montreal. “There is nowhere else for these species to go. They’re literally being squeezed into the Last Ice Area.”

    The Last Ice Area extends across national borders, making it especially challenging to protect the last summer sea ice in the Arctic. The extent of the ice is predicted to shrink considerably by 2039. (Image: WWF Canada)

    The last stronghold of summer ice provides an opportunity to create a floating sanctuary, an Arctic ark if you will, for the polar bears and many other species that depend on summer ice to survive. For over a decade, WWF Canada and a coalition of researchers and Indigenous communities have lobbied for the area to be protected from another threat: development by industries that may be interested in the region’s oil and mineral resources.

    “The tragedy would be if we had an area where these animals could survive this bottleneck, but they don’t because it’s been developed commercially,” Newton says.

    But for Laforest, protecting the Last Ice Area is not only a question of safeguarding arctic creatures. Sea ice is also an important tool in climate regulation, as the white surface reflects sunlight back into space, helping to cool the planet. In a vicious cycle, losing sea ice helps speed up warming, which in turn melts more ice.

    And for the people who call the Arctic home, sea ice is crucial for food security, transportation and cultural survival, wrote Inuit Circumpolar Council Chair Okalik Eegeesiak in a 2017 article for the United Nations. “Our entire cultures and identity are based on free movement on land, sea ice and the Arctic Ocean,” Eegeesiak wrote. “Our highway is sea ice.” 

    The efforts of these groups have borne some fruit. In 2019, the Canadian government moved to set aside nearly a third of the Last Ice Area as protected spaces called marine preserves. Until 2024, all commercial activity within the boundaries of the preserves is forbidden, with provisions for Indigenous peoples. Conservationists are now asking for these marine preserves to be placed under permanent protection.

    Rifts in the ice

    However, there are some troubling signs that the sea ice in the region is already precarious. Most worrisome was the appearance in May 2020 of a Rhode Island-sized rift in the ice at the heart of the Last Ice Area. Kent Moore, a geophysicist at the University of Toronto, says that these unusual events may become more frequent as the ice thins. This suggests that the Last Ice Area may not be as resilient as we thought, he says.

    This is something that worries Laforest. He and others are skeptical that reversing climate change and repopulating the Arctic with ice-dependent species will be possible. “I would love to live in a world where we eventually reverse warming and promote sea ice regeneration,” he says. “But stabilization seems like a daunting task on its own.”

    Still, hope remains. “All the models show that if you were to bring temperatures back down, sea ice will revert to its historical pattern within several years,” says Newton.

    To save the last sea ice — and the creatures that depend on it — removing greenhouse gases from the atmosphere will be essential, says oceanographer Stephanie Pfirman of Arizona State University in Tempe, who coauthored the study on sea ice with Newton. Technology to capture carbon, and prevent more carbon from entering the atmosphere, already exists. The largest carbon capture plant is in Iceland, but projects like that one have yet to be implemented on a major scale.

    Without such intervention, the Arctic is set to lose the last of its summer ice before the end of the century. It would mean the end of life on the ice. But Pfirman, who suggested making the Last Ice Area a World Heritage Site in 2008, says that humankind has undergone big economic and social changes — like the kind needed to reduce emissions and prevent warming — in the past. “I was in Germany when the [Berlin] wall came down, and people hadn’t expected that to happen,” she says.

    Protecting the Last Ice Area is about buying time to protect sea ice and species, says Pfirman. The longer we can hold on to summer sea ice, she says, the better chance we have at bringing Arctic species, from plankton to polar bears, back from the brink.