More stories

  • The nanophotonics orchestra presents: Twisting to the light of nanoparticles

    Physics researchers at the University of Bath in the UK discover a new physical effect relating to the interactions between light and twisted materials — an effect that is likely to have implications for emerging new nanotechnologies in communications, nanorobotics and ultra-thin optical components.
    In the 17th and 18th centuries, the Italian master craftsman Antonio Stradivari produced musical instruments of legendary quality, the most famous being his Stradivarius violins. What makes the sound of these instruments both beautiful and unique is their particular timbre, also known as tone colour or tone quality. All instruments have a timbre — when a musical note (sound with frequency fs) is played, the instrument creates harmonics (frequencies that are an integer multiple of the initial frequency, i.e. 2fs, 3fs, 4fs, 5fs, 6fs, etc.).
    Similarly, when light of a certain colour (with frequency fc) shines on materials, these materials can produce harmonics (light frequencies 2fc, 3fc, 4fc, 5fc, 6fc, etc.). The harmonics of light reveal intricate material properties that find applications in medical imaging, communications and laser technology.
    For instance, virtually every green laser pointer is in fact an infrared laser pointer whose light is invisible to human eyes. The green light that we see is actually the second harmonic (2fc) of the infrared laser pointer and it is produced by a special crystal inside the pointer.
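    The integer-multiple relationship is simple enough to spell out in a few lines of code. The sketch below is a minimal illustration, not anything from the study: it lists the harmonics of a 440 Hz musical note and shows how doubling the frequency of a typical 1064 nm (roughly 282 THz) infrared laser gives green light at roughly 564 THz.

```python
# Minimal sketch of the harmonic series: integer multiples of a base frequency.
# The example values are typical illustrations, not figures from the study.

def harmonics(base_frequency, count=5):
    """Return the first `count` harmonics above the fundamental: 2f, 3f, 4f, ..."""
    return [n * base_frequency for n in range(2, count + 2)]

print(harmonics(440))            # concert A: [880, 1320, 1760, 2200, 2640] Hz

infrared_thz = 282               # ~1064 nm infrared, as used in many green laser pointers
second_harmonic = harmonics(infrared_thz)[0]
print(second_harmonic, "THz")    # ~564 THz, which the eye perceives as green
```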
    In both musical instruments and shiny materials, some frequencies are ‘forbidden’ — that is, they cannot be heard or seen because the instrument or material actively cancels them. Because the clarinet has a straight, cylindrical shape, it suppresses all of the even harmonics (2fs, 4fs, 6fs, etc.) and produces only odd harmonics (3fs, 5fs, 7fs, etc.). By contrast, a saxophone has a conical and curved shape, which allows all harmonics and results in a richer, smoother sound. Somewhat similarly, when a specific type of light (circularly polarised) shines on metal nanoparticles dispersed in a liquid, the odd harmonics of light cannot propagate along the direction of light travel and the corresponding colours are forbidden.
    Now, an international team of scientists led by researchers from the Department of Physics at the University of Bath have found a way to reveal the forbidden colours, amounting to the discovery of a new physical effect. To achieve this result, they ‘curved’ their experimental equipment.
    Professor Ventsislav Valev, who led the research, said: “The idea that the twist of nanoparticles or molecules could be revealed through even harmonics of light was first formulated over 42 years ago, by a young PhD student — David Andrews. David thought his theory was too elusive to ever be validated experimentally but, two years ago, we demonstrated this phenomenon. Now, we discovered that the twist of nanoparticles can be observed in the odd harmonics of light as well. It’s especially gratifying that the relevant theory was provided by none other than our co-author and nowadays well-established professor — David Andrews!
    “To take a musical analogy, until now, scientists who study twisted molecules (DNA, amino acids, proteins, sugars, etc) and nanoparticles in water — the element of life — have illuminated them at a given frequency and have either observed that same frequency or its noise (inharmonic partial overtones). Our study opens up the study of the harmonic signatures of these twisted molecules. So, we can appreciate their ‘timbre’ for the first time.
    “From a practical point of view, our results offer a straightforward, user-friendly experimental method to achieve an unprecedented understanding of the interactions between light and twisted materials. Such interactions are at the heart of emerging new nanotechnologies in communications, nanorobotics and ultra-thin optical components. For instance, the ‘twist’ of nanoparticles can determine the value of information bits (for left-handed or right-handed twist). It is also present in the propellers for nanorobots and can affect the direction of propagation for a laser beam. Moreover, our method is applicable in tiny volumes of illumination, suitable for the analysis of natural chemical products that are promising for new pharmaceuticals but where the available material is often scarce.”
    PhD student Lukas Ohnoutek, also involved in the research, said: “We came very close to missing this discovery. Our initial equipment was not ‘tuned’ well and so we kept seeing nothing at the third-harmonic. I was starting to lose hope but we had a meeting, identified potential issues and investigated them systematically until we discovered the problem. It is wonderful to experience the scientific method at work, especially when it leads to a scientific discovery!”
    Professor Andrews added: “Professor Valev has led an international team to a real first in applied photonics. When he invited my participation, it led me back to theory work from my doctoral studies. It has been amazing to see it come to fruition so many years later.”
    The research was funded by The Royal Society, the Science and Technology Facilities Council (STFC) and the Engineering and Physical Sciences Research Council (EPSRC).

  • Researchers infuse bacteria with silver to improve power efficiency in fuel cells

    A UCLA-led team of engineers and chemists has taken a major step forward in the development of microbial fuel cells — a technology that utilizes natural bacteria to extract electrons from organic matter in wastewater to generate electrical currents. A study detailing the breakthrough was recently published in Science. 
    “Living energy-recovery systems utilizing bacteria found in wastewater offer a one-two punch for environmental sustainability efforts,” said co-corresponding author Yu Huang, a professor and chair of the Materials Science and Engineering Department at the UCLA Samueli School of Engineering. “The natural populations of bacteria can help decontaminate groundwater by breaking down harmful chemical compounds. Now, our research also shows a practical way to harness renewable energy from this process.” 
    The team focused on bacteria from the genus Shewanella, which have been widely studied for their energy-generation capabilities. They can grow and thrive in all types of environments — including soil, wastewater and seawater — regardless of oxygen levels.
    Shewanella species naturally break down organic waste matter into smaller molecules, with electrons being a byproduct of the metabolic process. When the bacteria grow as films on electrodes, some of the electrons can be captured, forming a microbial fuel cell that produces electricity. 
    However, microbial fuel cells powered by Shewanella oneidensis have previously not captured enough current from the bacteria to make the technology practical for industrial use. Too few electrons could move quickly enough to escape the bacteria’s membranes and enter the electrodes to provide sufficient electrical current and power.
    To address this issue, the researchers added nanoparticles of silver to electrodes that are composed of a type of graphene oxide. The nanoparticles release silver ions, which bacteria reduce to silver nanoparticles using electrons generated from their metabolic process and then incorporate into their cells. Once inside the bacteria, the silver particles act as microscopic transmission wires, capturing more electrons produced by the bacteria.
    “Adding the silver nanoparticles into the bacteria is like creating a dedicated express lane for electrons, which enabled us to extract more electrons and at faster speeds,” said Xiangfeng Duan, the study’s other corresponding author and a professor of chemistry and biochemistry at UCLA. 
    With greatly improved electron transport efficiency, the resulting silver-infused Shewanella film delivers more than 80% of its metabolic electrons to an external circuit, generating a power density of 0.66 milliwatts per square centimeter — more than double the previous best for microbial fuel cells.
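    To put the reported power density in perspective, the short sketch below scales the 0.66 mW/cm² figure to a few hypothetical electrode areas. The areas and the assumption of simple linear scaling are illustrative only and do not come from the study.

```python
# Back-of-the-envelope scaling of the reported power density to hypothetical
# electrode areas. Linear scaling is an assumption made purely for illustration.

POWER_DENSITY_MW_PER_CM2 = 0.66   # reported output of the silver-infused Shewanella film

def output_power_mw(area_cm2, density=POWER_DENSITY_MW_PER_CM2):
    """Output power in milliwatts for a given electrode area, assuming linear scaling."""
    return density * area_cm2

for area_cm2 in (1, 100, 1000):   # 1 cm², 100 cm², 0.1 m² (hypothetical areas)
    print(f"{area_cm2:>5} cm²  ->  {output_power_mw(area_cm2):7.1f} mW")
```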
    The study, which was supported by the Office of Naval Research, showed that with the increased current and improved efficiency, fuel cells powered by the silver-Shewanella hybrid could pave the way for sufficient power output in practical settings.
    Bocheng Cao, a UCLA doctoral student advised by both Huang and Duan, is the first author of the paper. Other UCLA senior authors are Gerard Wong, a professor of bioengineering; Paul Weiss, a UC Presidential Chair and distinguished professor of chemistry and biochemistry, bioengineering, and materials science and engineering; and Chong Liu, an assistant professor of chemistry and biochemistry. Kenneth Nealson, a professor emeritus of earth sciences at USC, is also a senior author.

  • Finding new alloys just became simpler

    In metal alloys, behaviour at the atomic scale affects the material’s properties. However, the number of possible alloys is astronomical. Together with an international team of colleagues, Francesco Maresca, an engineer at the University of Groningen, developed a theoretical model that allows him to rapidly determine the strength of millions of different alloys at high temperatures. Experiments confirmed the model predictions. The findings were published in Nature Communications on 16 September.
    The discovery that iron becomes much stronger with the addition of a little bit of carbon was one of the advances that heralded the Industrial Revolution. ‘Tweaking the composition of a base metal by adding different elements, thus creating an alloy, has been important in human history,’ says Francesco Maresca, assistant professor at the Engineering and Technology institute Groningen (ENTEG), at the University of Groningen. As a civil engineer, he likes large structures such as bridges. But he is now studying metals at an atomic scale to find the best alloys for specific applications.
    Dislocation
    Maresca is particularly interested in high-entropy alloys (HEAs), which were first proposed some twenty years ago. These are complex alloys with five or more elements that can have all kinds of useful properties. But how to find the best one? ‘There are around forty metallic elements that are not radioactive or toxic and are therefore suitable for use in alloys. This gives us roughly 10⁷⁸ different compositions,’ he explains. It is impossible to test a large fraction of these by simply making them.
    This is why Maresca wanted a good theory to describe important properties of HEAs. One of those properties is high-temperature strength, essential in various applications ranging from turbine engines to nuclear power plants. The strength of an alloy depends largely on defects in the crystal structure. ‘Perfect crystals are the strongest, but these do not exist in real-life materials.’ A major determinant of strength at high temperatures in body-centred cubic alloys is thought to be the screw dislocation, a dislocation in the lattice structure of a crystal in which the atoms are rearranged into a helical pattern. ‘These dislocations are very hard to model at the atomic scale,’ explains Maresca.
    Composition
    Another type of defect is the edge dislocation, where an extra atomic plane is inserted into part of the crystal structure. Maresca: ‘It was believed that these dislocations have no effect on strength at high temperatures, because that was shown experimentally in pure metals. However, we found that they can determine strength in complex alloys.’ Edge dislocations are much easier to model, and Maresca created an atomic-scale model for this dislocation in HEAs, which he then translated into a MATLAB script that could predict the engineering-scale strength of millions of different alloys at high temperatures in a matter of minutes.
    The result is a strength versus temperature relationship for these different alloys. ‘Using our results, you can find which compositions will give you a specific strength at, for example, 1300 Kelvin. This allows you to tweak the properties of such a high-temperature-resistant material.’ The theoretical results can be used to create alloys with new properties, or to find alternative compositions when one element in an alloy becomes scarce. The model was validated by creating two different alloys and testing their predicted ‘yield strength’, the amount of stress they can withstand at high temperatures without irreversible deformation. The importance of edge dislocation in this process was confirmed using different experimental techniques.
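    The published model itself is implemented as a MATLAB script built on the edge-dislocation theory; the Python sketch below is only a schematic of the high-throughput screening idea: evaluate a fast (here, entirely made-up) strength model over many candidate compositions and keep those that meet a target strength at 1300 K. The element list, the stand-in strength parameters and the thermal-softening form are illustrative assumptions, not the published physics.

```python
import itertools

# Schematic high-throughput screen over candidate 5-element alloys.
# The "strength model" here is a deliberately crude stand-in, NOT the published
# edge-dislocation theory; it only illustrates the screening workflow.

ELEMENTS = ["Nb", "Mo", "Ta", "W", "V", "Hf", "Zr", "Ti"]   # example refractory metals

# Arbitrary per-element strengthening parameters (MPa), for illustration only.
FAKE_STRENGTH = {"Nb": 240, "Mo": 320, "Ta": 280, "W": 380,
                 "V": 210, "Hf": 260, "Zr": 230, "Ti": 190}

def yield_strength_mpa(composition, temperature_k):
    """Composition-weighted base strength with a simple linear thermal softening."""
    base = 4 * sum(FAKE_STRENGTH[el] * x for el, x in composition.items())
    return base * max(0.0, 1.0 - temperature_k / 2500.0)

def candidate_compositions(n_elements=5):
    """Equiatomic candidates drawn from every 5-element subset of ELEMENTS."""
    for combo in itertools.combinations(ELEMENTS, n_elements):
        yield {el: 1.0 / n_elements for el in combo}

target_mpa, temperature_k = 450, 1300
hits = [c for c in candidate_compositions()
        if yield_strength_mpa(c, temperature_k) >= target_mpa]
print(f"{len(hits)} of {sum(1 for _ in candidate_compositions())} "
      f"candidates meet {target_mpa} MPa at {temperature_k} K")
```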
    Surprise
    ‘We also made an atomic model for screw dislocations, which was too complicated for the high-throughput analysis used for the edge dislocation,’ says Maresca. This confirmed that screw dislocation was not the most important determinant of yield strength in these alloys. The finding that edge dislocation actually determines a large part of the yield strength of complex HEAs was a major surprise and one that has made a simple, theory-driven discovery of new complex alloys possible.
    Story Source:
    Materials provided by the University of Groningen.

  • As a population gets older, automation accelerates

    You might think robots and other forms of workplace automation gain traction due to intrinsic advances in technology — that innovations naturally find their way into the economy. But a study co-authored by an MIT professor tells a different story: Robots are more widely adopted where populations become notably older, filling the gaps in an aging industrial work force.
    “Demographic change — aging — is one of the most important factors leading to the adoption of robotics and other automation technologies,” says Daron Acemoglu, an MIT economist and co-author of a new paper detailing the results of the study.
    The study finds that when it comes to the adoption of robots, aging alone accounts for 35 percent of the variation among countries. Within the U.S., the research shows the same pattern: Metro areas where the population is getting older at a faster rate are the places where industry invests more in robots.
    “We provide a lot of evidence to bolster the case that this is a causal relationship, and it is driven by precisely the industries that are most affected by aging and have opportunities for automating work,” Acemoglu adds.
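    As a point of reference for the phrase “accounts for 35 percent of the variation,” a figure of that kind is typically the R-squared of a regression of robot adoption on a measure of population aging. The sketch below shows how such a share-of-variation number is computed in general; the data are synthetic toy values, not the study’s dataset, and the paper’s actual econometric specification is considerably richer.

```python
import numpy as np

# Toy illustration of "share of variation explained": the R-squared of a one-variable
# regression. The numbers are synthetic, NOT the country-level data used in the study.
rng = np.random.default_rng(0)
aging = rng.uniform(0.1, 0.5, size=60)                  # stand-in "aging" measure, 60 countries
robots = 2.0 * aging + rng.normal(0.0, 0.15, size=60)   # stand-in robot-adoption measure

slope, intercept = np.polyfit(aging, robots, 1)         # ordinary least squares fit
residuals = robots - (slope * aging + intercept)
r_squared = 1.0 - residuals.var() / robots.var()
print(f"share of variation explained by aging alone: {r_squared:.2f}")
```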
    The paper, “Demographics and Automation,” has been published online by The Review of Economic Studies and will appear in a forthcoming print edition of the journal. The authors are Acemoglu, an Institute Professor at MIT, and Pascual Restrepo PhD ’16, an assistant professor of economics at Boston University.
    An “amazing frontier,” but driven by labor shortages
    The current study is the latest in a series of papers Acemoglu and Restrepo have published about automation, robots, and the workforce. They have previously quantified job displacement in the U.S. due to robots, looked at the firm-level effects of robot use, and identified the late 1980s as a key moment when automation started replacing more jobs than it was creating.

  • AI system identifies buildings damaged by wildfire

    People around the globe have suffered the nerve-wracking anxiety of waiting weeks or months to find out whether their homes have been damaged by wildfires, which are burning with increasing intensity. Now, once the smoke has cleared for aerial photography, researchers have found a way to identify building damage within minutes.
    Through a system they call DamageMap, a team at Stanford University and the California Polytechnic State University (Cal Poly) has brought an artificial intelligence approach to building assessment: Instead of comparing before-and-after photos, they’ve trained a program using machine learning to rely solely on post-fire images. The findings appear in the International Journal of Disaster Risk Reduction.
    “We wanted to automate the process and make it much faster for first responders or even for citizens that might want to know what happened to their house after a wildfire,” said lead study author Marios Galanis, a graduate student in the Civil and Environmental Engineering Department at Stanford’s School of Engineering. “Our model results are on par with human accuracy.”
    The current method of assessing damage involves people going door-to-door to check every building. While DamageMap is not intended to replace in-person damage classification, it could be used as a scalable supplementary tool by offering immediate results and providing the exact locations of the buildings identified. The researchers tested it using a variety of satellite, aerial and drone photography with at least 92 percent accuracy.
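    DamageMap itself is described in the paper; the code below is only a generic, minimal sketch of how a binary ‘damaged / undamaged’ building classifier of this kind can be set up with a standard convolutional backbone and run for a single training step on placeholder data. The backbone choice, tensor shapes and labels are assumptions for illustration, not the authors’ implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

# Generic sketch of a binary post-fire building classifier (NOT the DamageMap model):
# a standard ResNet backbone adapted to two output classes, damaged / undamaged.
model = models.resnet18()                        # randomly initialised backbone
model.fc = nn.Linear(model.fc.in_features, 2)    # replace the head with a 2-class output

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Placeholder batch: 8 RGB crops of buildings (224x224) with dummy 0/1 damage labels.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"one illustrative training step, loss = {loss.item():.3f}")
```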
    “With this application, you could probably scan the whole town of Paradise in a few hours,” said senior author G. Andrew Fricker, an assistant professor at Cal Poly, referencing the Northern California town destroyed by the 2018 Camp Fire. “I hope this can bring more information to the decision-making process for firefighters and emergency responders, and also assist fire victims by getting information to help them file insurance claims and get their lives back on track.”
    A different approach
    Most computational systems cannot efficiently classify building damage because they compare post-disaster photos with pre-disaster images that must be captured by the same satellite under the same camera angle and lighting conditions, imagery that can be expensive to obtain or simply unavailable. Current hardware is not advanced enough to record high-resolution surveillance daily, so such systems cannot rely on consistent photos, according to the researchers.

  • A statistical fix for archaeology's dating problem

    Archaeologists have long had a dating problem. The radiocarbon analysis typically used to reconstruct past human demographic changes relies on a method easily skewed by radiocarbon calibration curves and measurement uncertainty. And there’s never been a statistical fix that works — until now.
    “Nobody has systematically explored the problem, or shown how you can statistically deal with it,” says Santa Fe Institute archaeologist Michael Price, lead author on a paper in the Journal of Archaeological Science about a new method he developed for summarizing sets of radiocarbon dates. “It’s really exciting how this work came together. We identified a fundamental problem and fixed it.”
    In recent decades, archaeologists have increasingly relied on sets of radiocarbon dates to reconstruct past population size through an approach called “dates as data.” The core assumption is that the number of radiocarbon samples from a given period is proportional to the region’s population size at that time. Archaeologists have traditionally used “summed probability densities,” or SPDs, to summarize these sets of radiocarbon dates. “But there are a lot of inherent issues with SPDs,” says Julie Hoggarth, Baylor University archaeologist and a co-author on the paper.
    Radiocarbon dating measures the decay of carbon-14 in organic matter. But the amount of carbon-14 in the atmosphere fluctuates through time; it’s not a constant baseline. So researchers create radiocarbon calibration curves that map the carbon-14 values to dates. Yet a single carbon-14 value can correspond to different dates — a problem known as “equifinality,” which can naturally bias the SPD curves. “That’s been a major issue,” and a hurdle for demographic analyses, says Hoggarth. “How do you know that the change you’re looking at is an actual change in population size, and it isn’t a change in the shape of the calibration curve?”
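    As context for the critique above, a summed probability density is literally the sum of the calibrated probability distributions of the individual radiocarbon dates. The sketch below shows that summation in its simplest form, crudely approximating each calibrated date by a normal distribution; real SPDs pass each measurement through the radiocarbon calibration curve (e.g. IntCal), which is exactly where the equifinality problem enters. The dates are made-up examples, not data from the paper.

```python
import numpy as np

# Minimal illustration of a summed probability density (SPD): normalise the probability
# distribution of each date, then sum them. Each date is crudely approximated here by a
# Gaussian; real SPDs use the radiocarbon calibration curve. The dates are invented.
calendar_years_bp = np.arange(3000, 1000, -1)             # calendar axis, years BP

dates = [(2450, 30), (2300, 40), (2290, 35), (1800, 25)]  # (mean BP, 1-sigma) examples

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

spd = np.zeros_like(calendar_years_bp, dtype=float)
for mean_bp, sigma in dates:
    density = gaussian(calendar_years_bp, mean_bp, sigma)
    spd += density / density.sum()                        # normalise each date, then sum

print("SPD peaks near", calendar_years_bp[spd.argmax()], "cal BP")
```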
    When she discussed the problem with Price several years ago, he told her he wasn’t a fan of SPDs, either. She asked what archaeologists should do instead. “Essentially, he said, ‘Well, there is no alternative.'”
    That realization led to a years-long quest. Price has developed an approach to estimating prehistoric populations that uses Bayesian reasoning and a flexible probability model that allows researchers to overcome the problem of equifinality. The approach also allows them to combine additional archaeological information with radiocarbon analyses to get a more accurate population estimate. He and his team applied the approach to existing radiocarbon dates from the Maya city of Tikal, which has extensive prior archaeological research. “It serves as a really good test case,” says Hoggarth, a Maya scholar. For a long time, archaeologists debated two demographic reconstructions: Tikal’s population spiked in the early Classic period and then plateaued, or it spiked in the late Classic period. When the team applied the new Bayesian algorithm, “it showed a really steep population increase associated with the late Classic,” she says, “so that was really wonderful confirmation for us.”
    The authors produced an open-source package that implements the new approach, and website links and code are included in their paper. “The reason I’m excited for this,” Price says, “is that it’s pointing out a mistake that matters, fixing it, and laying the groundwork for future work.”
    This paper is just the first step. Next, through “data fusion,” the team will add ancient DNA and other data to radiocarbon dates for even more reliable demographic reconstructions. “That’s the long-term plan,” Price says. And it could help resolve a second issue with the dates as data approach: a “bias problem” if and when radiocarbon dates are skewed toward a particular time period, leading to inaccurate analyses.
    Story Source:
    Materials provided by the Santa Fe Institute.

  • Physicists make square droplets and liquid lattices

    When two substances are brought together, they will eventually settle into a steady state called thermodynamic equilibrium; in everyday life, we see examples of this when oil floats on top of water and when milk mixes uniformly into coffee. Researchers at Aalto University in Finland wanted to disrupt this sort of state to see what happens — and whether they can control the outcome.
    ‘Things in equilibrium tend to be quite boring,’ says Professor Jaakko Timonen, whose research group carried out new work published in Science Advances on 15 September. ‘It’s fascinating to drive systems out of equilibrium and see if the non-equilibrium structures can be controlled or be useful. Biological life itself is a good example of truly complex behavior in a bunch of molecules that are out of thermodynamic equilibrium.’
    In their work, the team used combinations of oils with different dielectric constants and conductivities. They then subjected the liquids to an electric field.
    ‘When we turn on an electric field over the mixture, electrical charge accumulates at the interface between the oils. This charge density shears the interface out of thermodynamic equilibrium and into interesting formations,’ explains Dr Nikos Kyriakopoulos, one of the authors of the paper. As well as being disrupted by the electric field, the liquids were confined into a thin, nearly two-dimensional sheet. This combination led to the oils reshaping into various completely unexpected droplets and patterns.
    The droplets in the experiment could be made into squares and hexagons with straight sides, which is almost impossible in nature, where small bubbles and droplets tend to form spheres. The two liquids could also be made to form interconnected lattices: grid patterns that occur regularly in solid materials but are unheard of in liquid mixtures. The liquids could even be coaxed into forming a torus, a donut shape, which was stable and held its shape while the field was applied — unlike in nature, where liquids have a strong tendency to collapse inwards and fill the hole at the centre. The liquids can also form filaments that roll and rotate around an axis.
    ‘All these strange shapes are caused and sustained by the fact that they are prevented from collapsing back into equilibrium by the motion of the electrical charges building up at the interface,’ says Geet Raju, the first author of the paper.
    One of the exciting results of this work is the ability to create temporary structures with a controlled and well-defined size which can be turned on and off with voltage, an area that the researchers are interested in exploring further for creating voltage-controlled optical devices. Another potential outcome is the ability to create interacting populations of rolling microfilaments and microdroplets that, at some elementary level, mimic the dynamics and collective behaviour of microorganisms like bacteria and microalgae that propel themselves using completely different mechanisms.
    Story Source:
    Materials provided by Aalto University.

  • Using artificial intelligence to predict COVID patients' oxygen needs

    Addenbrooke’s Hospital in Cambridge, along with 20 other hospitals from across the world and healthcare technology leader NVIDIA, has used artificial intelligence (AI) to predict Covid patients’ oxygen needs on a global scale.
    The research was sparked by the pandemic and set out to build an AI tool to predict how much extra oxygen a Covid-19 patient may need in the first days of hospital care, using data from across four continents.
    The technique, known as federated learning, used an algorithm to analyse chest x-rays and electronic health data from hospital patients with Covid symptoms.
    To maintain strict patient confidentiality, the patient data was fully anonymised and an algorithm was sent to each hospital so no data was shared or left its location.
    Once the algorithm had ‘learned’ from the data, the analysis was brought together to build an AI tool which could predict the oxygen needs of hospital Covid patients anywhere in the world.
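    Federated learning, as described above, means each hospital trains on its own data and only model updates travel; a central server then averages them into a shared model. The sketch below shows the basic federated-averaging pattern with toy local models and simulated site data; it is a generic illustration of the technique, not the EXAM study’s actual algorithm, data or architecture.

```python
import numpy as np

# Generic federated-averaging sketch (NOT the EXAM model): each site trains locally on
# its own data, only the resulting parameters are shared, and the server averages them.

def local_update(global_weights, local_features, local_targets, lr=0.1, steps=50):
    """One site's training: a few gradient steps of linear regression on local data."""
    w = global_weights.copy()
    for _ in range(steps):
        grad = local_features.T @ (local_features @ w - local_targets) / len(local_targets)
        w -= lr * grad
    return w                                   # only weights leave the site, never data

rng = np.random.default_rng(42)
true_w = np.array([0.5, -1.0, 2.0])
sites = []                                     # simulate three hospitals' private data
for _ in range(3):
    X = rng.normal(size=(100, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    sites.append((X, y))

global_w = np.zeros(3)
for round_ in range(5):                        # a few federated rounds
    local_ws = [local_update(global_w, X, y) for X, y in sites]
    global_w = np.mean(local_ws, axis=0)       # server aggregates by simple averaging

print("federated estimate:", np.round(global_w, 2), "vs true:", true_w)
```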
    Published today in Nature Medicine, the study, dubbed EXAM (for EMR CXR AI Model), is one of the largest and most diverse clinical federated learning studies to date.