More stories

    Mammoths, meet the metaverse

    Fearsome dire wolves and saber-toothed cats no longer prowl around La Brea Tar Pits, but thanks to new research, anyone can bring these extinct animals back to life through augmented reality (AR). Dr. Matt Davis and colleagues at the Natural History Museum of Los Angeles County and La Brea Tar Pits collaborated with researchers and designers at the University of Southern California (USC) to create more than a dozen new, scientifically accurate virtual models of Ice Age animals, published recently in Palaeontologia Electronica.
    The team is investigating how AR impacts learning in museums, but soon realized that no scientifically accurate Ice Age animals existed in the metaverse yet for them to use. So, they took all the latest paleontological research and made their own. The models were built in a blocky, low-poly style so that they could be scientifically accurate yet still simple enough to run on ordinary cell phones with limited processing power.
    According to study co-author Dr. William Swartout, Chief Technology Officer at the USC Institute for Creative Technologies, “The innovation of this approach is that it allows us to create scientifically accurate artwork for the metaverse without overcommitting to details where we still lack good fossil evidence.”
    The researchers hope this article will also bring more respect to paleoart, the kind of art that recreates what extinct animals might have looked like. “Paleoart can be very influential in how the public, and even scientists, understand fossil life,” said Dr. Emily Lindsey, Assistant Curator at La Brea Tar Pits and senior author of the study. A lot of paleoart is treated as an afterthought, though, and not subjected to the same rigorous scrutiny as other scientific research. This can lead to particularly bad reconstructions of extinct animals being propagated for generations in both the popular media and academic publications.
    “We think paleoart is a crucial part of paleontological research,” said Dr. Davis, the study’s lead author. “That’s why we decided to publish all the scientific research and artistic decisions that went into creating these models. This will make it easier for other scientists and paleoartists to critique and build off our team’s work.”
    Dr. Davis notes that it is just as important to acknowledge what we don’t know about these animals’ appearances as it is to record what we do know. For example, we can accurately depict the shaggy fur of Shasta ground sloths because paleontologists have found a whole skeleton of this species with hair and skin still preserved. But for mastodons, paleontologists have only found a few strands of hair. Their thick fur pelt was an artistic decision. Dr. Davis and colleagues hope that other paleoartists and scientists will follow their example by publishing all the research that goes into their reconstructions of extinct species. It will lead to better and more accurate paleoart for everyone.
    This research was funded by an NSF AISL collaborative grant (1811014; 1810984) led by Dr. Benjamin Nye of the USC Institute for Creative Technologies, Dr. Gale Sinatra of the USC Rossier School of Education, Dr. William Swartout of the USC Institute for Creative Technologies, and Dr. Emily Lindsey of La Brea Tar Pits.
    Story Source:
    Materials provided by Natural History Museum of Los Angeles County. Note: Content may be edited for style and length.

    Deciphering algorithms used by ants and the Internet

    Scientists found that ants and other natural systems use optimization algorithms similar to those used by engineered systems, including the Internet. These algorithms invest incrementally more resources as long as signs are encouraging but pull back quickly at the first sign of trouble. The systems are designed to be robust, allowing for portions to fail without harming the entire system. Understanding how these algorithms work in the real world may help solve engineering problems, whereas engineered systems may offer clues to understanding the behavior of ants, cells, and other natural systems.
    Engineers sometimes turn to nature for inspiration. Cold Spring Harbor Laboratory Associate Professor Saket Navlakha and research scientist Jonathan Suen found that adjustment algorithms — the same feedback control process by which the Internet optimizes data traffic — are used by several natural systems to sense and stabilize behavior, including ant colonies, cells, and neurons.
    Internet engineers route data around the world in small packets, which are analogous to ants. As Navlakha explains:
    “The goal of this work was to bring together ideas from machine learning and Internet design and relate them to the way ant colonies forage.”
    The same algorithm used by Internet engineers is used by ants when they forage for food. At first, the colony may send out a single ant. When the ant returns, it provides information about how much food it found and how long it took to retrieve it. The colony then sends out two ants. If they return with food, the colony may send out three, then four, five, and so on. But if ten ants are sent out and most do not return, the colony does not merely decrease the number it sends to nine. Instead, it cuts the number by a fraction (say, half) of what it sent before: only five ants. In other words, the number of ants ramps up slowly while the signals are positive but is cut back sharply when the feedback turns negative. Navlakha and Suen note that the system works even if individual ants get lost, and that it parallels a particular type of “additive-increase/multiplicative-decrease algorithm” used on the Internet.
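    The release stops at this verbal description, but the rule itself is compact; a minimal sketch of an additive-increase/multiplicative-decrease update (the function name and parameters here are illustrative, not from the study):

    ```python
    def aimd_update(n_foragers, success, step=1, cut=0.5):
        """One additive-increase/multiplicative-decrease (AIMD) update.

        On a positive signal (foragers returning with food), grow the
        commitment by a small additive step; on a negative signal, cut it
        multiplicatively, much as TCP shrinks its congestion window when
        packets are lost.
        """
        if success:
            return n_foragers + step                  # additive increase
        return max(1, int(n_foragers * cut))          # multiplicative decrease

    # Encouraging trips ramp the count up one ant at a time;
    # one bad trip cuts it roughly in half.
    n = 1
    for outcome in (True, True, True, True, False):
        n = aimd_update(n, outcome)
        print(n)  # 2, 3, 4, 5, then back down to 2
    ```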
    Suen thinks ants might inspire new ways to protect computer systems against hackers or cyberattacks. Engineers could emulate how nature withstands a range of threats to health and viability. Suen explains:
    “Nature has been shown to be incredibly robust in a lot of aspects responding to changing environments. In cybersecurity [however] we find that a lot of our systems can be tampered with, can be easily broken, and are simply not robust. We wanted to look at nature, which survives across all sorts of natural disasters, evolutionary changes, human changes, and learn a lot from how nature changes its systems dynamically to survive.”
    While Suen plans to apply nature’s algorithms to engineering programs, Navlakha would like to see if engineering solutions might offer alternative approaches to understanding gene regulation and immune feedback control. Navlakha hopes that “successful strategies in one realm could lead to improvements in the other.”
    Story Source:
    Materials provided by Cold Spring Harbor Laboratory. Original written by Eliene Augenbraun. Note: Content may be edited for style and length.

    Three critical factors in the end-Permian mass extinction

    The end of the Permian was characterized by the greatest mass extinction event in Earth’s history. Roughly 252 million years ago, a series of volcanic eruptions in Siberia led to a massive release of greenhouse gases. Over the course of the next several millennia, the climate ultimately warmed by ten degrees. As a consequence, roughly 75 percent of all organisms on land went extinct; in the oceans, the figure was roughly 90 percent.
    By analyzing how the now-extinct marine organisms once lived, Dr. Foster and his team were able to directly link their extinction to the following climate changes: declining oxygen levels in the water, rising water temperatures, and most likely also ocean acidification.
    These changes are similar to current trends. “Needless to say, our findings on the Permian can’t be applied to modern climate change one-to-one. The two climate systems are far too different,” says Foster, a geoscientist. “Yet they do show which traits were critical for an organism’s survival or extinction under similar conditions. This can offer us valuable indicators for who or what will be at the greatest risk in the future.”
    Specifically, the team analyzed more than 25,000 records on 1,283 genera of fossil marine organisms like bivalves, snails, sponges, algae and crustaceans from the region of South China — all of which had mineral skeletons or shells. Their fossilized remains can be dated using a special method, offering insights into marine ecosystems dating back millions of years. The team also drew on an enormous database that offers additional information on various ecological aspects of how these organisms lived.
    For each genus, twelve of these criteria were analyzed: did certain traits make a given organism more likely to survive under the conditions prevalent at the end of the Permian, or not? With the aid of machine learning, a method from the field of artificial intelligence, all of these factors were analyzed jointly and simultaneously; in effect, the model decided on its own which combinations of traits mattered. Once this was done, the team compared the results: which organisms were present before, during and after the mass extinction?
    Their findings reveal the four factors that were most essential to whether or not organisms survived the end of the Permian: where in the water they lived, the mineralization of their shells, species diversity within their genus, and their sensitivity to acidification.
    “But with previous machine learning applications, we couldn’t say how the machine made its decisions,” says Foster. Using a newly implemented method from game theory, Dr. Foster has now succeeded in unraveling this aspect: “Some animals lived in deeper water. Here, the machine shows that the worsening lack of oxygen posed a risk. In contrast, those animals that lived nearer the surface had to contend with the rising water temperatures. Plus, when you have only a limited habitat, you have nowhere to go when that specific habitat becomes uninhabitable.” As such, the results show which of the organisms’ traits proved potentially fatal. The team was ultimately able to confirm that the mass extinction can be directly attributed to deoxygenation, rising water temperatures and acidification, which indicates that, in a future climate crisis, these could also be the three main causes of extinction in the long term.
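    The release doesn’t name the game-theory method, but Shapley-value attribution is the standard such technique, implemented for tree models by the shap package. A minimal sketch under that assumption, with toy data standing in for the real trait table (all names and numbers here are illustrative):

    ```python
    import numpy as np
    import shap  # game-theoretic (Shapley value) feature attribution
    from sklearn.ensemble import RandomForestClassifier

    # Toy stand-in for the real dataset: one row per genus, twelve
    # ecological traits (habitat depth, shell mineralogy, ...), and a
    # 0/1 label for whether the genus survived the extinction.
    rng = np.random.default_rng(0)
    X = rng.random((1283, 12))
    y = (X[:, 0] > 0.5).astype(int)  # pretend habitat depth drives survival

    model = RandomForestClassifier(n_estimators=100).fit(X, y)

    # Shapley values split each prediction among the twelve traits,
    # showing which trait pushed a given genus toward extinction.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X)
    shap.summary_plot(shap_values, X)
    ```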
    Story Source:
    Materials provided by University of Hamburg. Original written by Stephanie Janssen. Note: Content may be edited for style and length.

    Green chemistry: Scientists develop new process for more eco-friendly liquid crystals

    Liquid crystals could soon be produced more efficiently and in a more environmentally friendly way. A new process has been developed by researchers at Martin Luther University Halle-Wittenberg (MLU) in Germany, Bangalore University in India and Cairo University in Egypt. Compared to conventional methods, it is faster, more energy-efficient and promises a high yield as the team reports in the Journal of Molecular Liquids. Liquid crystals are used in most smartphone, tablet and computer displays.
    The production of liquid crystals is a complex process with many intermediate steps. “Often it requires various solvents and expensive catalysts,” says Dr Mohamed Alaasar, a chemist at MLU. The team from Germany, India and Egypt was looking for a way to simplify the process and make it more environmentally friendly. The idea: instead of the chemical reactions taking place one after the other, certain steps could be combined in a so-called multicomponent reaction in which several substances react directly with one another.
    The team developed an approach for producing liquid crystals which does not require environmentally harmful solvents and relies on cheaper catalysts. “We were able to achieve a yield of about 90 per cent. This means that most of the chemicals are used in the process and relatively few residues are produced,” explains Alaasar. This saves energy and ultimately also money. At room temperature the newly created liquid crystals are in a nematic phase — a special arrangement of molecules used in most liquid crystal displays or LCDs.
    So far, the researchers have only tested their new process in the laboratory. Alaasar is nevertheless confident that it could also be implemented on an industrial scale. “However, manufacturers would have to rebuild parts of their manufacturing. This has not happened in the past with other promising materials,” says the scientist. In recent years, though, consumers have come to place more value on sustainability and environmentally friendly products. That could be an additional argument in favour of the new approach.
    Story Source:
    Materials provided by Martin-Luther-Universität Halle-Wittenberg. Note: Content may be edited for style and length.

    For new insights into aerodynamics, scientists turn to paper airplanes

    A series of experiments using paper airplanes has revealed new aerodynamic effects, a team of scientists reports. The findings enhance our understanding of flight stability and could inspire new types of flying robots and small drones.
    “The study started with simple curiosity about what makes a good paper airplane and specifically what is needed for smooth gliding,” explains Leif Ristroph, an associate professor at New York University’s Courant Institute of Mathematical Sciences and an author of the study, which appears in the Journal of Fluid Mechanics. “Answering such basic questions ended up being far from child’s play. We discovered that the aerodynamics of how paper airplanes keep level flight is really very different from the stability of conventional airplanes.”
    “Birds glide and soar in an effortless way, and paper airplanes, when tuned properly, can also glide for long distances,” adds author Jane Wang, a professor of engineering and physics at Cornell University. “Surprisingly, there has been no good mathematical model for predicting this seemingly simple but subtle gliding flight.”
    Since we can make complicated modern airplanes fly, the researchers say, one might think we know all there is to know about the simplest flying machines.
    “But paper airplanes, while simple to make, involve surprisingly complex aerodynamics,” notes Ristroph.
    The paper’s authors began their study by considering what is needed for a plane to glide smoothly. Since paper airplanes have no engine and rely on gravity and proper design for their movement, they are good candidates for exploring factors behind flight stability.

    Machine learning improves human speech recognition

    Hearing loss is a rapidly growing area of scientific research, as the number of aging baby boomers dealing with hearing loss continues to increase.
    To understand how hearing loss impacts people, researchers study people’s ability to recognize speech. It is more difficult for people to recognize human speech if there is reverberation, some hearing impairment, or significant background noise, such as traffic noise or multiple speakers.
    As a result, hearing aid algorithms are often used to improve human speech recognition. To evaluate such algorithms, researchers perform experiments that aim to determine the signal-to-noise ratio at which a specific proportion of words (commonly 50%) is recognized. These tests, however, are time- and cost-intensive.
    In The Journal of the Acoustical Society of America, published by the Acoustical Society of America through AIP Publishing, researchers from Germany explore a human speech recognition model based on machine learning and deep neural networks.
    “The novelty of our model is that it provides good predictions for hearing-impaired listeners for noise types with very different complexity and shows both low errors and high correlations with the measured data,” said author Jana Roßbach, from Carl von Ossietzky University Oldenburg.
    The researchers calculated how many words per sentence a listener understands using automatic speech recognition (ASR). Most people are familiar with ASR through speech recognition tools like Alexa and Siri.
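    The release doesn’t spell out the scoring, but the core measurement, counting how many of the spoken words the recognizer recovers, comes down to comparing the ASR output against the reference transcript. A minimal sketch, simplified to positional matching (real evaluations align the word sequences first, e.g. by edit distance):

    ```python
    def word_recognition_rate(reference: str, hypothesis: str) -> float:
        """Fraction of reference words the recognizer got right.

        Simplified to position-by-position comparison; real evaluations
        first align the two word sequences (e.g. via edit distance).
        """
        ref = reference.lower().split()
        hyp = hypothesis.lower().split()
        hits = sum(1 for i, word in enumerate(ref)
                   if i < len(hyp) and hyp[i] == word)
        return hits / len(ref)

    # The ASR stand-in for the listener caught 4 of 6 words
    # at this signal-to-noise ratio.
    print(word_recognition_rate("the cat sat on the mat",
                                "the cat sat in a mat"))  # ~0.67
    ```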
    The study consisted of eight normal-hearing and 20 hearing-impaired listeners who were exposed to a variety of complex noises that mask the speech. The hearing-impaired listeners were categorized into three groups with different levels of age-related hearing loss.
    The model allowed the researchers to predict the speech recognition performance of hearing-impaired listeners with different degrees of hearing loss, for a variety of noise maskers with increasing complexity in temporal modulation and similarity to real speech. Each listener’s individual hearing loss could also be taken into account.
    “We were most surprised that the predictions worked well for all noise types. We expected the model to have problems when using a single competing talker. However, that was not the case,” said Roßbach.
    The model created predictions for single-ear hearing. Going forward, the researchers will develop a binaural model since understanding speech is impacted by two-ear hearing.
    In addition to predicting speech intelligibility, the model could also potentially be used to predict listening effort or speech quality, as these problems are closely related.
    Story Source:
    Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

    Double locked: Polymer hydrogels secure confidential information

    The development of highly secure but simple and inexpensive encryption technology for the prevention of data leaks and forgeries is decidedly challenging. In the journal Angewandte Chemie, a research team has introduced a “double lock” based on thermoresponsive polymer hydrogels that encrypts information so that it can only be read within a specific temperature and time window.
    In addition to digital encryption methods, physical methods play an important role. Their decoding is typically based on external stimuli such as light or heat. Multiple stimuli offer more security, but make reading the data cumbersome and complex. “Addition of the time domain greatly raises the chance to achieve the unity of security and simpleness,” according to the team led by Zhikun Zheng, Xudong Chen, and Wei Liu, at Sun Yat-sen University (Guangzhou, China). “We were inspired by the baking of bread: delicious bread can only be produced if the baking temperature is not too low or too high and the baking time is not too short or too long.”
    For their novel “double encryption system” they use thermoresponsive polymer hydrogels — cross-linked chain molecules with water incorporated into their “gaps.” Above or below a specific temperature, the clear gels become opaque due to partial unmixing. There are LCST and UCST gels, which have lower or upper critical solution temperatures, respectively. The phase retention and critical temperature can be controlled via the content of -CO-NH2 groups in the main chain of the polymer hydrogels. The density of cross-linking determines the rate of the phase transition.
    As an example of a locked label, the team used transparent acrylic plates with grooves in the pattern of a QR code. Three different gels were put in defined areas of the pattern: a UCST gel with a phase change around 40 °C, and two LCST gels with a phase change at 33 °C (one with a fast phase transition, one with a slow one). Below 20 °C, the UCST gel is opaque but strongly shrunken, so the pattern is deformed and unreadable. Between 20 and 33 °C, it swells and the part of the code formed by this gel becomes readable. The second part of the code, formed by the “fast” LCST gel, still cannot be read. Only heating above 33 °C makes both LCST gels opaque. Now the timing comes into play: only the pattern of the “fast” LCST gel carries the correct second part of the information. At 37 °C, it becomes readable after about half a minute, and the complete code can be read. About three minutes later, however, the “slow” LCST gel also becomes opaque and adds false information that makes the code unreadable. Above 40 °C, both LCST gels become opaque simultaneously; in addition, the UCST gel turns transparent, so the code is again unreadable.
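    To make the double lock concrete, the readable window described above can be expressed as a single condition on temperature and heating time; a rough sketch using the article’s approximate thresholds (the function and exact cutoffs are illustrative):

    ```python
    def code_readable(temp_c: float, seconds: float) -> bool:
        """True only inside the temperature *and* time window.

        Mirrors the label described above: the UCST-gel half of the QR
        code is readable between its swelling point (~20 degC) and its
        clearing point (~40 degC); the fast LCST gel needs > 33 degC and
        about half a minute to turn opaque; and roughly three minutes
        after that, the slow LCST gel clouds over and scrambles the code.
        """
        ucst_readable = 20 < temp_c < 40                  # swollen and opaque
        fast_lcst_opaque = temp_c > 33 and seconds > 30
        slow_lcst_opaque = temp_c > 33 and seconds > 210  # ruins the code
        return ucst_readable and fast_lcst_opaque and not slow_lcst_opaque

    print(code_readable(37, 60))   # True:  inside both windows
    print(code_readable(37, 300))  # False: slow gel has clouded the code
    print(code_readable(25, 60))   # False: LCST gels still transparent
    ```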
    This encryption can only be decoded if the specific temperature and time windows are known. The source of heat for decoding in this example could be an infrared lamp, a water bath, a hair dryer, or even the human body. If sealed to prevent the evaporation of water, these inexpensive labels are theoretically suitable for long-term use.
    Story Source:
    Materials provided by Wiley. Note: Content may be edited for style and length.

    Surprising semiconductor properties revealed with innovative new method

    A research team probing the properties of a semiconductor combined with a novel thin oxide film has observed a surprising new source of conductivity from oxygen atoms trapped inside.
    Scott Chambers, a materials scientist at the Department of Energy’s Pacific Northwest National Laboratory, reported the team’s discovery at the Spring 2022 meeting of the American Physical Society. The research finding is described in detail in the journal Physical Review Materials.
    The discovery has broad implications for understanding the role of thin oxide films in future semiconductor design and manufacture. Specifically, semiconductors used in modern electronics come in two basic flavors — n-type and p-type — depending on the electronic impurity added during crystal growth. Modern electronic devices use both n- and p-type silicon-based materials. But there is ongoing interest in developing other types of semiconductors. Chambers and his team were testing germanium in combination with a specialized thin crystalline film of lanthanum-strontium-zirconium-titanium-oxide (LSZTO).
    “We are reporting on a powerful tool for probing semiconductor structure and function,” said Chambers. “Hard X-ray photoelectron spectroscopy revealed in this case that atoms of oxygen, an impurity in the germanium, dominate the properties of the material system when germanium is joined to a particular oxide material. This was a big surprise.”
    Using the Diamond Light Source on the Harwell Science and Innovation Campus in Oxfordshire, England, the research team discovered they could learn a great deal more about the electronic properties of the germanium/LSZTO system than was possible using the typical methods.
    “When we tried to probe the material with conventional techniques, the much higher conductivity of germanium essentially caused a short circuit,” Chambers said. “As a result, we could learn something about the electronic properties of the Ge, which we already know a lot about, but nothing about the properties of the LSZTO film or the interface between the LSZTO film and the germanium — which we suspected might be very interesting and possibly useful for technology.”
    A new role for hard X-rays
    The so-called “hard” X-rays produced by the Diamond Light Source could penetrate the material and generate information about what was going on at the atomic level.
    “Our results were best interpreted in terms of oxygen impurities in the germanium being responsible for a very interesting effect,” Chambers said. “The oxygen atoms near the interface donate electrons to the LSZTO film, creating holes, or the absence of electrons, in the germanium within a few atomic layers of the interface. These specialized holes resulted in behavior that totally eclipsed the semiconducting properties of both n- and p-type germanium in the different samples we prepared. This, too, was a big surprise.”
    The interface, where the thin-film oxide and the base semiconductor come together, is where interesting semiconducting properties often emerge. The challenge, according to Chambers, is to learn how to control the fascinating and potentially useful electric fields that form at these interfaces by modifying the electric field at the surface. Ongoing experiments at PNNL are probing this possibility.
    While the samples used in this research are unlikely to have immediate commercial applications, the techniques and scientific discoveries made are expected to pay dividends in the longer term, Chambers said. The new scientific knowledge will help materials scientists and physicists better understand how to design new semiconductor material systems with useful properties.
    PNNL researchers Bethany Matthews, Steven Spurgeon, Mark Bowden, Zihua Zhu and Peter Sushko contributed to the research. The study was supported by the Department of Energy Office of Science. Some experiments and sample preparation were performed at the Environmental Molecular Sciences Laboratory, a Department of Energy Office of Science user facility located at PNNL. Electron microscopy was performed in the PNNL Radiochemical Processing Laboratory. Collaborators Tien-Lin Lee and Judith Gabel performed experiments at the Diamond Light Source. Additional collaborators included the University of Texas at Arlington’s Matt Chrysler and Joe Ngai, who prepared the samples.
    Story Source:
    Materials provided by DOE/Pacific Northwest National Laboratory. Original written by Karyn Hede. Note: Content may be edited for style and length.