More stories

  • Why there is no speed limit in the superfluid universe

    Physicists from Lancaster University, building on earlier Lancaster research, have established why objects moving through superfluid helium-3 lack a speed limit.
    Helium-3 is a rare isotope of helium, in which one neutron is missing. It becomes superfluid at extremely low temperatures, enabling unusual properties such as a lack of friction for moving objects.
    It was thought that the speed of objects moving through superfluid helium-3 was fundamentally limited to the critical Landau velocity, and that exceeding this speed limit would destroy the superfluid. Prior experiments in Lancaster had found that this is not a strict rule, and that objects can move at much greater speeds without destroying the fragile superfluid state.
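    For readers who want the textbook background, the “speed limit” referred to here is Landau's criterion, a standard result that is not specific to the Lancaster work; the sketch below states it in its usual form, and the numerical estimate for helium-3 is only a rough order of magnitude.
    ```latex
    % Landau's criterion (textbook form, not taken from the Lancaster paper):
    % an object moving at velocity v can shed energy by creating an excitation
    % of energy \epsilon(p) and momentum p only if
    \[
      v \;>\; v_{\mathrm{L}} \;=\; \min_{p}\,\frac{\epsilon(p)}{p}.
    \]
    % In superfluid helium-3 the cheapest excitation is breaking a Cooper pair,
    % which costs roughly the gap energy \Delta at the Fermi momentum p_F, so
    \[
      v_{\mathrm{L}} \;\approx\; \frac{\Delta}{p_{\mathrm{F}}},
    \]
    % a few centimetres per second, which is the limit the experiments
    % described here show can be exceeded without destroying the superfluid.
    ```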
    Now scientists from Lancaster University have found the reason for the absence of the speed limit: exotic particles that stick to all surfaces in the superfluid.
    The discovery may guide applications in quantum technology, even quantum computing, where multiple research groups already aim to make use of these unusual particles.
    To shake the bound particles into sight, the researchers cooled superfluid helium-3 to within one ten-thousandth of a degree of absolute zero (0.0001 K, or about -273.15°C). They then moved a wire through the superfluid at high speed and measured how much force was needed to move the wire. Apart from an extremely small force related to moving the bound particles around when the wire starts to move, the measured force was zero.
    Lead author Dr Samuli Autti said: “Superfluid helium-3 feels like vacuum to a rod moving through it, although it is a relatively dense liquid. There is no resistance, none at all. I find this very intriguing.”
    PhD student Ash Jennings added: “By making the rod change its direction of motion we were able to conclude that the rod will be hidden from the superfluid by the bound particles covering it, even when its speed is very high.” “The bound particles initially need to move around to achieve this, and that exerts a tiny force on the rod, but once this is done, the force just completely disappears,” said Dr Dmitry Zmeev, who supervised the project.
    The Lancaster researchers included Samuli Autti, Sean Ahlstrom, Richard Haley, Ash Jennings, George Pickett, Malcolm Poole, Roch Schanen, Viktor Tsepelin, Jakub Vonka, Tom Wilcox, Andrew Woods and Dmitry Zmeev. The results are published in Nature Communications.

    Story Source:
    Materials provided by Lancaster University. Note: Content may be edited for style and length.

  • What we know and don’t know about wildfire smoke’s health risks

    Acrid smoke continues to pollute skies in the western United States. On some recent days, the air quality in Portland, Seattle, San Francisco and Los Angeles has been so hazardous, it’s ranked among the worst in the world. 
    It’s hard to predict when the smoke will fully clear. And with some parts of the West having faced a week or more of extremely polluted air, the unusual, sustained nature of the assault is increasing worries about people’s health.
    There’s plenty of evidence that air pollution — a broad category that includes soot, smog, and other pollutants from sources such as traffic, industry and fires — can harm health. The list of medical ailments associated with exposure to dirty air includes respiratory diseases, cardiovascular disease and diabetes (SN: 9/19/17).
    Most of what’s known about the hazards of wildfire smoke has to do with particulate matter, the tiny bits of solids and liquids in polluted air. Wildfires are especially good at producing particles in a size range that can be dangerous to health. It isn’t clear yet if what fuels wildfire smoke — be it vegetation, a mix of trees and structures, or other human-made sources — affects the toxicity of particulate matter.
    A growing body of evidence points to a range of risks to health during or soon after wildfires, such as increased trips to the emergency room for chronic lung conditions. But there are many more questions than answers about the long-term risks for people struggling to cope with day upon day of polluted air, and facing longer and fiercer fire seasons each year due to climate change (SN: 8/27/20).
    Science News spoke with scientists about what’s in the air, the health risks and what more we need to learn.
    What’s in wildfire smoke?
    Wildfire smoke is a complex mixture of gases and particles that is similar to cigarette smoke but without the nicotine, says physician John Balmes of the University of California, San Francisco, who studies the effects of air pollution on health. “It has the same kind of mixture of nasty small particles and irritant gases.”
    The precise chemical makeup of the smoke varies by fire. It depends on “the type of fuel burned — including structures, intensity of the fire, atmospheric mixing, and distance or age of smoke,” says Tania Busch Isaksen, who studies public health effects of wildfire smoke at the University of Washington in Seattle.
    “Generally speaking, it’s a mixture of carbon dioxide, carbon monoxide, nitrogen oxides, particle matter — fine to coarse — hydrocarbons and other organic compounds,” she says. “Fine particulate matter, PM2.5, is what we are primarily concerned about when we consider impacts on health” (SN: 7/30/20).
    Those particles are 2.5 micrometers across or smaller, or about one-thirtieth the width of a human hair (SN: 8/22/18). Common in air pollution produced not only by wildfires, but also by power plants and cars, these particles are so tiny that they can be inhaled deeply into the lungs. There, they can trigger inflammation and possibly seep into the bloodstream.
    Can you see how much PM2.5 is in the air?
    No. These particles are so tiny and difficult to see that “even if the air seems clear, PM2.5 could be at levels that are dangerous,” says Perry Hystad, an environmental epidemiologist at Oregon State University in Corvallis. In the United States, the most reliable gauge of PM2.5 is the Air Quality Index, or AQI, which is based on data from air quality monitoring stations that measure the concentrations of pollutants in the air.
    The U.S. Environmental Protection Agency developed the index to grade levels of common air pollutants, such as ozone, PM2.5  and carbon monoxide. On a scale from 0 to 500, higher numbers indicate dirtier air. The EPA assigns AQI scores to different types of pollution based on studies of each contaminant’s health effects.
    The EPA considers scores up to 100 — indicating an average of 35.4 micrograms of particulate matter per cubic meter of air over 24 hours — generally safe. Scores from 101 to 200 may pose particular risk to people in sensitive groups, such as children and those with heart or lung diseases. Those people are advised to limit or avoid prolonged or vigorous outdoor activity. Above 200, everyone should cut down on physical activity outside. At scores of 300 or above, with at least 250.4 micrograms of PM2.5 per cubic meter of air, everyone should avoid going outside.
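    As a concrete illustration of how those thresholds are used, the AQI for PM2.5 is computed by linear interpolation between published concentration breakpoints. The sketch below is a minimal Python version of that calculation; the breakpoint table is reproduced from the commonly published EPA values for 24-hour PM2.5 and should be checked against current EPA documentation rather than treated as authoritative.
    ```python
    # Approximate AQI calculation for 24-hour average PM2.5, using the EPA's
    # piecewise-linear formula: AQI = (I_hi - I_lo) / (C_hi - C_lo) * (C - C_lo) + I_lo.
    # Breakpoints below follow the commonly published EPA table for PM2.5 (ug/m^3);
    # treat them as illustrative and confirm against current EPA documentation.

    PM25_BREAKPOINTS = [
        # (C_lo,  C_hi,  I_lo, I_hi, category)
        (0.0,    12.0,    0,   50,  "Good"),
        (12.1,   35.4,   51,  100,  "Moderate"),
        (35.5,   55.4,  101,  150,  "Unhealthy for Sensitive Groups"),
        (55.5,  150.4,  151,  200,  "Unhealthy"),
        (150.5, 250.4,  201,  300,  "Very Unhealthy"),
        (250.5, 350.4,  301,  400,  "Hazardous"),
        (350.5, 500.4,  401,  500,  "Hazardous"),
    ]

    def pm25_to_aqi(concentration_ug_m3: float) -> tuple[int, str]:
        """Map a 24-hour average PM2.5 concentration to an (AQI, category) pair."""
        c = round(concentration_ug_m3, 1)  # concentrations are reported to 0.1 ug/m^3
        for c_lo, c_hi, i_lo, i_hi, category in PM25_BREAKPOINTS:
            if c_lo <= c <= c_hi:
                aqi = (i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo
                return round(aqi), category
        return 500, "Beyond the AQI scale"  # readings above 500.4 are off the chart

    if __name__ == "__main__":
        for conc in (10.0, 35.4, 250.4, 300.0):
            print(conc, "ug/m^3 ->", pm25_to_aqi(conc))
    ```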
    Smoke blanketing the western United States has created hazardous, and at times off-the-chart, levels of pollution in many places. For instance, on the morning of September 17, areas of Oregon near Portland showed PM2.5 AQI levels up to around a hazardous 380. In regions of central California northeast of Fresno, AQI levels reached a staggering 780.
    “Especially under conditions that we’re experiencing here in the western United States, it would be wise to check the AQI on a daily basis,” says Kent Pinkerton, a biologist at the University of California, Davis.

    What happens when people breathe in wildfire smoke?
    “Wildfires, through the combustion process, create lots and lots of particles” in the size range of PM2.5, says Colleen Reid, an environmental epidemiologist and health geographer at the University of Colorado Boulder. A breath of these microscopic particles can send them all the way to the alveoli, the tiny sacs where the lungs and the blood swap oxygen and carbon dioxide.
    Research in lab dishes has found that the particles can lead to inflammation and oxidative stress, in which reactive molecules that contain oxygen build up and can damage cells. The smallest pollution particles may make their way into the bloodstream, possibly causing harm to the cardiovascular system.
    The research linking PM2.5 with health generally does not consider what types of materials are burning, so “at this point we are concerned about all PM2.5 regardless of source,” says Anthony Wexler, who studies particulate pollutants at the University of California, Davis. “But the source is likely important.”
    Historically, wildfires have burned mostly plant matter. But many of the recent devastating fires in the western U.S., such as the Camp Fire that destroyed the town of Paradise, Calif., in 2018, have devoured human-made structures (SN: 11/15/18). “Houses have paint and solvents and plastics and all this other terrible stuff going up in smoke, too, which may be increasing the toxicity of the material that’s being emitted,” says Wexler. He is currently preparing an experiment to compare the toxicity of the smoke from burnt household materials with that from woody materials.
    The impact of extended exposures to wildfire smoke also needs more research. Wildfires put a lot of pollution into the air, more than what’s generally produced from industrial and traffic sources, Reid says. But it’s often for a short period of time. “What’s going on right now in Oregon and Washington and California, where they’ve had essentially a week of very unhealthy levels of air pollution, is less common,” she says.
    Recent fires in the western United States have consumed not only trees but many buildings like this one, in Butte County, Calif., which went up in flames on September 9. Some researchers are concerned that plastics and other materials in homes may make smoke more toxic. (Photo: Noah Berger/Associated Press)
    What are the immediate health risks from wildfire smoke?
    Breathing in smoky air can irritate the respiratory tract, leading to coughing, sore throats and itchy, watery eyes. The foul air can also cause headaches and fatigue.
    Hospital visits for lung care go up during wildfires compared to periods without them, according to studies of emergency department traffic. For instance, an increase in PM2.5 exposure related to wildfires in northern California in 2008 was associated with an increase in risk for emergency department visits and hospitalizations for asthma, Reid and colleagues reported in Environmental Research in 2016. The 2012 wildfires in Colorado were linked to a rise in emergency department visits for asthma and chronic obstructive pulmonary disease, according to a 2016 study in Environmental Health. There’s some evidence of increased trips to the hospital for cardiovascular health problems during wildfires as well.
    Medical visits for kids go up during wildfires too. During the 2017 Lilac Fire in San Diego county, visits for respiratory problems to a children’s hospital rose due to increased exposure to PM2.5, according to a 2020 study in the Annals of the American Thoracic Society.
    Children, especially the very young and those with diseases like asthma, can be more vulnerable to health effects from wildfires. “They breathe more air per minute compared to adults” to meet their physiological needs, says Marissa Hauptman, a pediatrician at Boston Children’s Hospital. That can add up to more exposure. And developing lungs “are more susceptible to injury,” she says. 
    A developing fetus may also be at risk from exposure to PM2.5. In a 2012 study in Environmental Health Perspectives, Reid and colleagues reported a slight decrease in birth weight for infants from pregnancies that occurred during the 2003 wildfires in Southern California. Mothers exposed to smoke from Colorado wildfires during the second trimester were more likely to give birth prematurely, according to a 2019 study in the International Journal of Environmental Research and Public Health. Infants born early or smaller than usual can face developmental delays.
    What’s known about long-term health risks from wildfire smoke?
    Not much. But a few studies provide some initial clues.
    One examined how wildfires that scorched large areas of Indonesia in 1997 impacted health 10 years later. This population-wide study found that males and the elderly were worse off in 2007 for health measures such as lung function, the researchers reported in Economics & Human Biology in 2017.
    In the United States, the wildfire smoke that plagued the Seeley Lake community in Montana in 2017 has parallels to the prolonged, hazardous exposures happening now in the West. The wildfires produced extremely high levels of PM2.5 from July 31 to September 18 that year; the daily average was 221 micrograms per cubic meter of air. Christopher Migliaccio, a respiratory immunology researcher at the University of Montana in Missoula, and his colleagues screened adults in the community right after the last day of increased smoke and two more times in each of the following two years.
    Compared with members of a Montana community that hadn’t been exposed to the same levels of smoke, the participants from the Seeley Lake area had poorer lung function one and two years out, Migliaccio and his colleagues reported in Toxics in August. “I thought people might be worse right after,” he says, “but it’s a little bit of a delayed response.”
    Migliaccio and colleagues had planned to screen the participants again this year, but COVID-19 got in the way. Eventually they hope to see whether, in participants that still have worse lung function, the condition is treatable or if it’s “the new normal.”
    Can a mask protect you from wildfire smoke?
    It depends on the type of mask. “Cloth masks, which are effective at preventing transmission of SARS-CoV-2 [the virus that causes COVID-19] … don’t do anything to protect the wearer from exposure to wildfire smoke,” Balmes says (SN: 6/26/20). Surgical masks provide some protection. But “an N95 is the best protection.” N95 masks are designed to filter out at least 95 percent of airborne particles.
    But N95 masks are in short supply, and those masks have not been certified for use by children as they don’t fit properly. So the best protection is to avoid exposure. “People should stay indoors as much as possible with the windows closed,” Balmes says.

    How can people keep indoor air clean?
    “If they have central ventilation, they should turn that to recirculation,” Balmes says. That can reduce the amount of smoke that enters the home. People can also use a high-efficiency particulate air, or HEPA, purifier to filter the smoke from a single room. And those who cannot afford a HEPA cleaner can put together a makeshift purifier using a MERV-13 furnace filter and a box fan, Balmes says. “They’re not as good as the proper devices, but they do provide some protection.”
    People hunkered down indoors can also keep the air clear by not cooking on gas stoves, burning candles or even vacuuming, all of which can add or stir up particles inside the home.
    But some people don’t have a home to escape to. King County in Washington announced on September 11 the opening of a clean air shelter for people experiencing homelessness.
    How else might wildfires be harming health?
    The toll that the wildfires have on mental health could also be significant. The past month in the Pacific Northwest has brought images reminiscent of a science fiction novel: hazy, deep orange skies that sometimes completely obscured the sun, turning day to night.
    Extreme wildfires, with the potential for long periods of time in which the air is a danger, can upend people’s lives and add to stress levels. One of the few respites from the COVID-19 pandemic — going out for a breath of fresh air — has been shut off for millions of people. And there are many who have no choice but to work or live outdoors, exposed to hazardous air. “There could be a psychological impact of that,” says Reid. “That needs to be explored.”

  • New design principles for spin-based quantum materials

    As our lives become increasingly intertwined with technology — whether supporting communication while working remotely or streaming our favorite show — so too does our reliance on the data these devices create. Data centers supporting these technology ecosystems produce a significant carbon footprint — and consume 200 terawatt hours of energy each year, greater than the annual energy consumption of Iran. To balance ecological concerns yet meet growing demand, advances in microelectronic processors — the backbone of many Internet of Things (IoT) devices and data hubs — must be efficient and environmentally friendly.
    Northwestern University materials scientists have developed new design principles that could help spur development of future quantum materials used to advance IoT devices and other resource-intensive technologies while limiting ecological damage.
    “New path-breaking materials and computing paradigms are required to make data centers more energy-lean in the future,” said James Rondinelli, professor of materials science and engineering and the Morris E. Fine Professor in Materials and Manufacturing at the McCormick School of Engineering, who led the research.
    The study marks an important step in Rondinelli’s efforts to create new materials that are non-volatile, energy efficient, and generate less heat — important aspects of future ultrafast, low-power electronics and quantum computers that can help meet the world’s growing demand for data.
    Whereas conventional semiconductor transistors use the electron’s charge to perform computation, solid-state spin-based materials utilize the electron’s spin and have the potential to support low-energy memory devices. In particular, materials with a high-quality persistent spin texture (PST) can exhibit a long-lived persistent spin helix (PSH), which can be used to track or control the spin-based information in a transistor.
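    For orientation, persistent spin textures are usually introduced with the textbook two-dimensional Rashba plus Dresselhaus model rather than the oxide-specific models used in this study; the sketch below states that standard condition, with the Rashba and Dresselhaus coupling strengths as the only parameters.
    ```latex
    % Textbook Rashba (\alpha) + Dresselhaus (\beta) spin-orbit Hamiltonian for
    % a two-dimensional electron gas (an illustration only, not the
    % oxide-specific model of the Northwestern study):
    \[
      H_{\mathrm{SO}} \;=\; \alpha\,(k_x\sigma_y - k_y\sigma_x)
                     \;+\; \beta\,(k_x\sigma_x - k_y\sigma_y)
    \]
    % When the two couplings are tuned to equal strength, \alpha = \pm\beta, the
    % effective spin-orbit field points along a single axis for every momentum k,
    % so spins precess about a fixed axis with a momentum-independent pitch.
    % The resulting persistent spin helix has wavevector
    \[
      q_{\mathrm{PSH}} \;=\; \frac{4 m^{*} \alpha}{\hbar^{2}},
    \]
    % and is protected against the usual Dyakonov-Perel spin relaxation, which
    % is why such states can be very long-lived.
    ```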
    Although many spin-based materials already encode information using spins, that information can be corrupted as the spins propagate in the active portion of the transistor. The researchers’ novel PST protects that spin information in helix form, making it a potential platform for ultralow-energy, ultrafast spin-based logic and memory devices.
    The research team used quantum-mechanical models and computational methods to develop a framework to identify and assess the spin textures in a group of non-centrosymmetric crystalline materials. The ability to control and optimize the spin lifetimes and transport properties in these materials is vital to realizing the future of quantum microelectronic devices that operate with low energy consumption.
    “The limiting characteristic of spin-based computing is the difficulty in attaining both long-lived and fully controllable spins from conventional semiconductor and magnetic materials,” Rondinelli said. “Our study will help future theoretical and experimental efforts aimed at controlling spins in otherwise non-magnetic materials to meet future scaling and economic demands.”
    Rondinelli’s framework used microscopic effective models and group theory to identify three materials design criteria that would produce useful spin textures: carrier density (the number of electrons propagating through an effective magnetic field); Rashba anisotropy (the ratio between the intrinsic spin-orbit coupling parameters of the material); and momentum-space occupation (the region of the electronic band structure in which the PST is active). These features were then assessed using quantum-mechanical simulations to discover high-performing PSHs in a range of oxide-based materials.
    The researchers used these principles and numerical solutions to a series of differential spin-diffusion equations to assess the spin texture of each material and predict the spin lifetimes for the helix in the strong spin-orbit coupling limit. They also found they could adjust and improve the PST performance using atomic distortions at the picoscale. The group determined an optimal PST material, Sr3Hf2O7, which showed a substantially longer spin lifetime for the helix than in any previously reported material.
    “Our approach provides a unique chemistry-agnostic strategy to discover, identify, and assess symmetry-protected persistent spin textures in quantum materials using intrinsic and extrinsic criteria,” Rondinelli said. “We proposed a way to expand the number of space groups hosting a PST, which may serve as a reservoir from which to design future PST materials, and found yet another use for ferroelectric oxides — compounds with a spontaneous electrical polarization. Our work also will help guide experimental efforts aimed at implementing the materials in real device structures.”

    Story Source:
    Materials provided by Northwestern University. Original written by Alex Gerage. Note: Content may be edited for style and length.

  • Solar storm forecasts for Earth improved with help from the public

    Solar storm analysis carried out by an army of citizen scientists has helped researchers devise a new and more accurate way of forecasting when Earth will be hit by harmful space weather. Scientists at the University of Reading added analysis carried out by members of the public to computer models designed to predict when coronal mass ejections (CMEs) — huge solar eruptions that are harmful to satellites and astronauts — will arrive at Earth.
    The team found forecasts were 20% more accurate, and uncertainty was reduced by 15%, when they incorporated information about the size and shape of the CMEs from the volunteer analysis. The data was captured by thousands of members of the public during the latest activity in the Solar Stormwatch citizen science project, which was devised by Reading researchers and has been running since 2010.
    The findings support the inclusion of wide-field CME imaging cameras on board space weather monitoring missions currently being planned by agencies like NASA and ESA.
    Dr Luke Barnard, space weather researcher at the University of Reading’s Department of Meteorology, who led the study, said: “CMEs are sausage-shaped blobs made up of billions of tonnes of magnetised plasma that erupt from the Sun’s atmosphere at a million miles an hour. They are capable of damaging satellites, overloading power grids and exposing astronauts to harmful radiation.
    “Predicting when they are on a collision course with Earth is therefore extremely important, but is made difficult by the fact the speed and direction of CMEs vary wildly and are affected by solar wind, and they constantly change shape as they travel through space.
    “Solar storm forecasts are currently based on observations of CMEs as soon as they leave the Sun’s surface, meaning they come with a large degree of uncertainty. The volunteer data offered a second stage of observations at a point when the CME was more established, which gave a better idea of its shape and trajectory.
    “The value of additional CME observations demonstrates how useful it would be to include cameras on board spacecraft in future space weather monitoring missions. More accurate predictions could help prevent catastrophic damage to our infrastructure and could even save lives.”
    In the study, published in AGU Advances, the scientists used a brand new solar wind model, developed by Reading co-author Professor Mathew Owens, for the first time to create CME forecasts.
    The simplified model is able to run up to 200 simulations — compared to around 20 currently used by more complex models — to provide improved estimates of the solar wind speed and its impact on the movement of CMEs, the most harmful of which can reach Earth in 15-18 hours.
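    To illustrate why many fast runs are useful, the toy ensemble below estimates a CME arrival time with a deliberately crude constant-speed approximation; it is not the Reading group's solar wind model, and the speed and drag distributions are hypothetical numbers chosen only to show how the spread of the ensemble becomes the forecast uncertainty.
    ```python
    # Toy ensemble forecast of CME arrival time at Earth (illustrative only;
    # not the University of Reading model). Each ensemble member draws a
    # plausible launch speed and an effective drag toward the ambient solar
    # wind speed, then the transit time over 1 au is computed.

    import random
    import statistics

    AU_KM = 1.496e8  # Sun-Earth distance in kilometres

    def transit_time_hours(cme_speed_kms: float, wind_speed_kms: float,
                           blend: float) -> float:
        """Crude estimate: the CME travels at a weighted average of its launch
        speed and the ambient solar wind speed (a stand-in for drag)."""
        effective_speed = blend * wind_speed_kms + (1.0 - blend) * cme_speed_kms
        return AU_KM / effective_speed / 3600.0

    def ensemble_forecast(n_members: int = 200, seed: int = 1) -> tuple[float, float]:
        """Return (mean arrival time, standard deviation) in hours after launch."""
        rng = random.Random(seed)
        times = []
        for _ in range(n_members):
            cme_speed = rng.gauss(1500.0, 300.0)  # hypothetical launch speed spread, km/s
            wind_speed = rng.gauss(400.0, 50.0)   # hypothetical ambient wind, km/s
            blend = rng.uniform(0.2, 0.5)         # how strongly drag slows the CME
            times.append(transit_time_hours(max(cme_speed, 500.0), wind_speed, blend))
        return statistics.mean(times), statistics.stdev(times)

    if __name__ == "__main__":
        mean_t, spread = ensemble_forecast()
        print(f"Arrival estimate: {mean_t:.0f} h after launch, +/- {spread:.0f} h (1 sigma)")
        # Tighter input constraints (for example from the citizen-science CME
        # observations) shrink the spread, which is the sense in which extra
        # observations reduce forecast uncertainty.
    ```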
    Adding the public CME observations to the model’s predictions helped provide a clearer picture of the likely path the CME would take through space, reducing the uncertainty in the forecast. The new method could also be applied to other solar wind models.
    The Solar Stormwatch project was led by Reading co-author Professor Chris Scott. It asked volunteers to trace the outline of thousands of past CMEs captured by Heliospheric Imagers — specialist, wide-angle cameras — on board two NASA STEREO spacecraft, which orbit the Sun and monitor the space between it and Earth.
    The scientists retrospectively applied their new forecasting method to the same CMEs the volunteers had analysed to test how much more accurate their forecasts were with the additional observations.
    Using the new method for future solar storm forecasts would require swift real-time analysis of the images captured by the spacecraft camera, which would provide warning of a CME being on course for Earth several hours or even days in advance of its arrival.

    Story Source:
    Materials provided by University of Reading. Note: Content may be edited for style and length.

  • Biologists create new genetic systems to neutralize gene drives

    In the past decade, researchers have engineered an array of new tools that control the balance of genetic inheritance. Based on CRISPR technology, such gene drives are poised to move from the laboratory into the wild where they are being engineered to suppress devastating diseases such as mosquito-borne malaria, dengue, Zika, chikungunya, yellow fever and West Nile. Gene drives carry the power to immunize mosquitoes against malarial parasites, or act as genetic insecticides that reduce mosquito populations.
    Although the newest gene drives have been proven to spread efficiently as designed in laboratory settings, concerns have been raised regarding the safety of releasing such systems into wild populations. Questions have emerged about the predictability and controllability of gene drives and whether, once let loose, they can be recalled in the field if they spread beyond their intended application region.
    Now, scientists at the University of California San Diego and their colleagues have developed two new active genetic systems that address such risks by halting or eliminating gene drives in the wild. On Sept. 18, 2020, in the journal Molecular Cell, research led by Xiang-Ru Xu, Emily Bulger and Valentino Gantz in the Division of Biological Sciences offers two new solutions based on elements developed in the common fruit fly.
    “One way to mitigate the perceived risks of gene drives is to develop approaches to halt their spread or to delete them if necessary,” said Distinguished Professor Ethan Bier, the paper’s senior author and science director for the Tata Institute for Genetics and Society. “There’s been a lot of concern that there are so many unknowns associated with gene drives. Now we have saturated the possibilities, both at the genetic and molecular levels, and developed mitigating elements.”
    The first neutralizing system, called e-CHACR (erasing Constructs Hitchhiking on the Autocatalytic Chain Reaction), is designed to halt the spread of a gene drive by “shooting it with its own gun.” e-CHACRs use the CRISPR enzyme Cas9 carried on a gene drive to copy themselves, while simultaneously mutating and inactivating the Cas9 gene. Xu says an e-CHACR can be placed anywhere in the genome.
    “Without a source of Cas9, it is inherited like any other normal gene,” said Xu. “However, once an e-CHACR confronts a gene drive, it inactivates the gene drive in its tracks and continues to spread across several generations ‘chasing down’ the drive element until its function is lost from the population.”
    The second neutralizing system, called ERACR (Element Reversing the Autocatalytic Chain Reaction), is designed to eliminate the gene drive altogether. ERACRs are designed to be inserted at the site of the gene drive, where they use the Cas9 from the gene drive to attack either side of the Cas9, cutting it out. Once the gene drive is deleted, the ERACR copies itself and replaces the gene drive.
    “If the ERACR is also given an edge by carrying a functional copy of a gene that is disrupted by the gene drive, then it races across the finish line, completely eliminating the gene drive with unflinching resolve,” said Bier.
    The researchers rigorously tested and analyzed e-CHACRs and ERACRs, as well as the resulting DNA sequences, in meticulous detail at the molecular level. Bier estimates that the research team, which includes mathematical modelers from UC Berkeley, spent an estimated combined 15 years of effort to comprehensively develop and analyze the new systems. Still, he cautions there are unforeseen scenarios that could emerge, and the neutralizing systems should not be used with a false sense of security for field-implemented gene drives.
    “Such braking elements should just be developed and kept in reserve in case they are needed since it is not known whether some of the rare exceptional interactions between these elements and the gene drives they are designed to corral might have unintended activities,” he said.
    According to Bulger, gene drives have enormous potential to alleviate suffering, but responsibly deploying them depends on having control mechanisms in place should unforeseen consequences arise. ERACRs and eCHACRs offer ways to stop the gene drive from spreading and, in the case of the ERACR, can potentially revert an engineered DNA sequence to a state much closer to the naturally-occurring sequence.
    “Because ERACRs and e-CHACRs do not possess their own source of Cas9, they will only spread as far as the gene drive itself and will not edit the wild type population,” said Bulger. “These technologies are not perfect, but we now have a much more comprehensive understanding of why and how unintended outcomes influence their function and we believe they have the potential to be powerful gene drive control mechanisms should the need arise.”

  • Engineers produce a fisheye lens that's completely flat

    To capture panoramic views in a single shot, photographers typically use fisheye lenses — ultra-wide-angle lenses made from multiple pieces of curved glass, which distort incoming light to produce wide, bubble-like images. Their spherical, multipiece design makes fisheye lenses inherently bulky and often costly to produce.
    Now engineers at MIT and the University of Massachusetts at Lowell have designed a wide-angle lens that is completely flat. It is the first flat fisheye lens to produce crisp, 180-degree panoramic images. The design is a type of “metalens,” a wafer-thin material patterned with microscopic features that work together to manipulate light in a specific way.
    In this case, the new fisheye lens consists of a single flat, millimeter-thin piece of glass covered on one side with tiny structures that precisely scatter incoming light to produce panoramic images, just as a conventional curved, multielement fisheye lens assembly would. The lens works in the infrared part of the spectrum, but the researchers say it could be modified to capture images using visible light as well.
    The new design could potentially be adapted for a range of applications, with thin, ultra-wide-angle lenses built directly into smartphones and laptops, rather than physically attached as bulky add-ons. The low-profile lenses might also be integrated into medical imaging devices such as endoscopes, as well as in virtual reality glasses, wearable electronics, and other computer vision devices.
    “This design comes as somewhat of a surprise, because some have thought it would be impossible to make a metalens with an ultra-wide-field view,” says Juejun Hu, associate professor in MIT’s Department of Materials Science and Engineering. “The fact that this can actually realize fisheye images is completely outside expectation. This isn’t just light-bending — it’s mind-bending.”
    Hu and his colleagues have published their results in the journal Nano Letters. Hu’s MIT coauthors are Mikhail Shalaginov, Fan Yang, Peter Su, Dominika Lyzwa, Anuradha Agarwal, and Tian Gu, along with Sensong An and Hualiang Zhang of UMass Lowell.
    Design on the back side
    Metalenses, while still largely at an experimental stage, have the potential to significantly reshape the field of optics. Previously, scientists have designed metalenses that produce high-resolution and relatively wide-angle images of up to 60 degrees. To expand the field of view further would traditionally require additional optical components to correct for aberrations, or blurriness — a workaround that would add bulk to a metalens design.
    Hu and his colleagues instead came up with a simple design that does not require additional components and keeps a minimum element count. Their new metalens is a single transparent piece made from calcium fluoride with a thin film of lead telluride deposited on one side. The team then used lithographic techniques to carve a pattern of optical structures into the film.
    Each structure, or “meta-atom,” as the team refers to them, is shaped into one of several nanoscale geometries, such as a rectangular or a bone-shaped configuration, that refracts light in a specific way. For instance, light may take longer to scatter, or propagate off one shape versus another — a phenomenon known as phase delay.
    In conventional fisheye lenses, the curvature of the glass naturally creates a distribution of phase delays that ultimately produces a panoramic image. The team determined the corresponding pattern of meta-atoms and carved this pattern into the back side of the flat glass.
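    For intuition about how a pattern of phase delays can replace glass curvature, the standard phase profile for a flat lens that focuses normally incident light is shown below; the actual wide-angle fisheye design uses a different and more involved profile, so treat this only as the textbook single-focus case, with the wavelength, focal length and radial position as the only quantities involved.
    ```latex
    % Standard phase profile for a flat metalens focusing normally incident
    % light of wavelength \lambda to a point at focal distance f (textbook
    % single-focus case, not the wide-angle profile used in this work):
    \[
      \varphi(r) \;=\; -\,\frac{2\pi}{\lambda}\left(\sqrt{r^{2} + f^{2}} - f\right)
    \]
    % Here r is the radial distance from the lens centre. Each meta-atom is
    % chosen so that its phase delay matches \varphi at its position, which is
    % exactly the role the glass curvature plays in a conventional lens.
    ```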

    “We’ve designed the back side structures in such a way that each part can produce a perfect focus,” Hu says.
    On the front side, the team placed an optical aperture, or opening for light.
    “When light comes in through this aperture, it will refract at the first surface of the glass, and then will get angularly dispersed,” Shalaginov explains. “The light will then hit different parts of the backside, from different and yet continuous angles. As long as you design the back side properly, you can be sure to achieve high-quality imaging across the entire panoramic view.”
    Across the panorama
    In one demonstration, the new lens is tuned to operate in the mid-infrared region of the spectrum. The team used an imaging setup equipped with the metalens to snap pictures of a striped target. They then compared the quality of pictures taken at various angles across the scene, and found the new lens produced images of the stripes that were crisp and clear, even at the edges of the camera’s view, spanning nearly 180 degrees.
    “It shows we can achieve perfect imaging performance across almost the whole 180-degree view, using our methods,” Gu says.
    In another study, the team designed the metalens to operate at a near-infrared wavelength using amorphous silicon nanoposts as the meta-atoms. They plugged the metalens into a simulation used to test imaging instruments. Next, they fed the simulation a scene of Paris, composed of black and white images stitched together to make a panoramic view. They then ran the simulation to see what kind of image the new lens would produce.
    “The key question was, does the lens cover the entire field of view? And we see that it captures everything across the panorama,” Gu says. “You can see buildings and people, and the resolution is very good, regardless of whether you’re looking at the center or the edges.”
    The team says the new lens can be adapted to other wavelengths of light. To make a similar flat fisheye lens for visible light, for instance, Hu says the optical features may have to be made smaller than they are now, to better refract that particular range of wavelengths. The lens material would also have to change. But the general architecture that the team has designed would remain the same.
    The researchers are exploring applications for their new lens, not just as compact fisheye cameras, but also as panoramic projectors, as well as depth sensors built directly into smartphones, laptops, and wearable devices.
    “Currently, all 3D sensors have a limited field of view, which is why when you put your face away from your smartphone, it won’t recognize you,” Gu says. “What we have here is a new 3D sensor that enables panoramic depth profiling, which could be useful for consumer electronic devices.”

  • Promising computer simulations for stellarator plasmas

    For the fusion researchers at IPP, who want to develop a power plant based on the model of the sun, the formation of turbulence in its fuel — a hydrogen plasma — is a central research topic. The small eddies carry particles and heat out of the hot plasma centre and thus reduce the thermal insulation of the magnetically confined plasma. Because the size, and thus the electricity price, of a future fusion power plant depends on this turbulent transport, one of the most important goals is to understand, predict and influence it.
    Since an exact computational description of plasma turbulence would require solving highly complex systems of equations and executing countless computational steps, code development aims at reasonable simplifications. The GENE code developed at IPP is based on a set of simplified, so-called gyrokinetic equations, which disregard all phenomena in the plasma that do not play a major role in turbulent transport. Although the computational effort can be reduced by many orders of magnitude in this way, the world’s fastest and most powerful supercomputers have always been needed to further develop the code. GENE is now able to describe well the formation and propagation of small, low-frequency plasma eddies in the plasma interior, and to reproduce and explain experimental results — but originally only for fusion devices of the tokamak type, which are comparatively simple in construction because they are axisymmetric.
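    The nature of those simplifications can be summarised by the standard gyrokinetic ordering, given here in its generic textbook form rather than as a statement about GENE specifically.
    ```latex
    % Standard gyrokinetic ordering (schematic, textbook form): turbulent
    % fluctuations are slow compared with the ion gyrofrequency \Omega_i,
    % background gradients vary on scales L long compared with the ion
    % gyroradius \rho_i, and fluctuation amplitudes are small:
    \[
      \frac{\omega}{\Omega_i} \;\sim\; \frac{\rho_i}{L} \;\sim\;
      \frac{\delta n}{n} \;\sim\; \frac{e\,\delta\phi}{T_e} \;\sim\; \epsilon \ll 1
    \]
    % Averaging over the fast gyration removes the gyro-phase angle (one
    % velocity-space dimension) and the gyrofrequency time scale, which is why
    % gyrokinetic codes such as GENE are orders of magnitude cheaper than
    % fully kinetic ones.
    ```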
    For example, calculations with GENE showed that fast ions can greatly reduce turbulent transport in tokamak plasmas. Experiments at the ASDEX Upgrade tokamak at Garching confirmed this result. The required fast ions were provided by plasma heating using radio waves of the ion cyclotron frequency.
    A tokamak code for stellarators
    In stellarators, this turbulence suppression by fast ions has not yet been observed experimentally. However, the latest calculations with GENE now suggest that this effect should also exist in stellarator plasmas: in the Wendelstein 7-X stellarator at IPP in Greifswald, it could theoretically reduce turbulence by more than half. As IPP scientists Alessandro Di Siena, Alejandro Bañón Navarro and Frank Jenko show in the journal Physical Review Letters, the optimal ion temperature depends strongly on the shape of the magnetic field. Professor Frank Jenko, head of the Tokamak Theory department at IPP in Garching, says: “If this calculated result is confirmed in future experiments with Wendelstein 7-X in Greifswald, this could open up a path to interesting high-performance plasmas.”
    In order to use GENE for turbulence calculations in the more complicated plasma shapes of stellarators, major code adjustments were necessary. Without the axial symmetry of tokamaks, the code has to cope with a much more complex geometry.
    For Professor Per Helander, head of the Stellarator Theory department at IPP in Greifswald, the stellarator simulations performed with GENE are “very exciting physics.” He hopes that the results can be verified in the Wendelstein 7-X stellarator at Greifswald. “Whether the plasma values in Wendelstein 7-X are suitable for such experiments can be investigated when, in the coming experimental period, the radio wave heating system will be put into operation in addition to the current microwave and particle heating,” says Professor Robert Wolf, whose department is responsible for plasma heating.
    GENE becomes GENE-3D
    According to Frank Jenko, it was another “enormous step” to make GENE not only approximately, but completely fit for the complex, three-dimensional shape of stellarators. After almost five years of development work, the code GENE-3D, now presented in the “Journal of Computational Physics” by Maurice Maurer and co-authors, provides a “fast and yet realistic turbulence calculation also for stellarators,” says Frank Jenko. In contrast to other stellarator turbulence codes, GENE-3D describes the full dynamics of the system, i.e. the turbulent motion of the ions and also of the electrons over the entire inner volume of the plasma, including the resulting fluctuations of the magnetic field.

    Story Source:
    Materials provided by Max-Planck-Institut für Plasmaphysik (IPP). Note: Content may be edited for style and length.

  • New mathematical tool can select the best sensors for the job

    In the 2019 Boeing 737 Max crash, the black box recovered from the wreckage hinted that a failed angle-of-attack sensor may have caused the ill-fated aircraft to nose dive. This incident and others have fueled a larger debate on sensor selection, number and placement to prevent the recurrence of such tragedies.
    Texas A&M University researchers have now developed a comprehensive mathematical framework that can help engineers make informed decisions about which sensors to use and where they must be positioned in aircraft and other machines.
    “During the early design stage for any control system, critical decisions have to be made about which sensors to use and where to place them so that the system is optimized for measuring certain physical quantities of interest,” said Dr. Raktim Bhattacharya, associate professor in the Department of Aerospace Engineering. “With our mathematical formulation, engineers can feed the model with information on what needs to be sensed and with what precision, and the model’s output will be the fewest sensors needed and their accuracies.”
    The researchers detailed their mathematical framework in the June issue of the Institute of Electrical and Electronics Engineers’ Control Systems Letters.
    Whether a car or an airplane, complex systems have internal properties that need to be measured. For instance, in an airplane, sensors for angular velocity and acceleration are placed at specific locations to estimate the velocity.
    Sensors can also have different accuracies. In technical terms, accuracy is measured by the noise or the wiggles in the sensor measurements. This noise impacts how accurately the internal properties can be predicted. However, accuracies may be defined differently depending on the system and the application. For instance, some systems may require that the noise in the predictions does not exceed a certain amount, while others may need the square of the noise to be as small as possible. In all cases, prediction accuracy has a direct impact on the cost of the sensor.
    “If you want to get sensor accuracy that is two times more accurate, the cost is likely to be more than double,” said Bhattacharya. “Furthermore, in some cases, very high accuracy is not even required. For example, an expensive 4K HD vehicle camera for object detection is unnecessary because first, fine features are not needed to distinguish humans from other cars and second, data processing from high-definition cameras becomes an issue.”
    Bhattacharya added that even if the sensors are extremely precise, knowing where to put the sensor is critical because one might place an expensive sensor at a location where it is not needed. Thus, he said the ideal solution balances cost and precision by optimizing the number of sensors and their positions.
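    The published framework poses this as an optimization problem over sensor precisions; the sketch below is not that formulation but a brute-force toy in Python that captures the idea, using a small made-up linear system, hypothetical sensor noise levels and costs, and a steady-state Kalman filter to score each candidate subset.
    ```python
    # Toy illustration of sensor-subset selection (not the Texas A&M formulation):
    # for a small, made-up linear system, evaluate every subset of candidate
    # sensors with a steady-state Kalman filter and report the cheapest subset
    # whose estimation error is close to the best achievable. Requires numpy/scipy.

    from itertools import combinations

    import numpy as np
    from scipy.linalg import solve_discrete_are

    # Hypothetical discrete-time system: x[k+1] = A x[k] + w,  y[k] = C x[k] + v
    A = np.array([[0.99, 0.10, 0.00],
                  [0.00, 0.95, 0.10],
                  [0.00, 0.00, 0.90]])
    Q = 0.01 * np.eye(3)  # assumed process-noise covariance

    # Candidate sensors: (name, measurement row, noise variance, relative cost)
    CANDIDATES = [
        ("accelerometer", np.array([0.0, 1.0, 0.0]), 0.02, 1.0),
        ("rate gyro",     np.array([0.0, 0.0, 1.0]), 0.01, 2.0),
        ("pressure",      np.array([1.0, 0.5, 0.0]), 0.05, 1.5),
        ("angle sensor",  np.array([1.0, 0.0, 0.0]), 0.03, 3.0),
    ]

    def estimation_error(subset):
        """Trace of the steady-state Kalman-filter error covariance for a sensor set."""
        C = np.vstack([row for _, row, _, _ in subset])
        R = np.diag([var for _, _, var, _ in subset])
        P = solve_discrete_are(A.T, C.T, Q, R)  # filtering Riccati via the dual DARE
        return float(np.trace(P))

    results = []
    for k in range(1, len(CANDIDATES) + 1):
        for subset in combinations(CANDIDATES, k):
            err = estimation_error(subset)
            cost = sum(c for _, _, _, c in subset)
            results.append((err, cost, [name for name, _, _, _ in subset]))

    best_err = min(err for err, _, _ in results)
    # Cheapest subset whose error is within 25% of the best achievable error.
    cheapest = min((r for r in results if r[0] <= 1.25 * best_err), key=lambda r: r[1])
    print("Best achievable error (all sensors allowed): %.4f" % best_err)
    print("Cheapest near-optimal subset:", cheapest[2],
          "error %.4f, cost %.1f" % (cheapest[0], cheapest[1]))
    ```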
    To test this rationale, Bhattacharya and his team designed a mathematical model using a set of equations that described an F-16 aircraft. In their study, the researchers’ objective was to estimate the forward velocity, the angle of the oncoming air relative to the airplane (the angle of attack), the angle between where the airplane is pointed and the horizon (the pitch angle) and the pitch rate for this aircraft. Available to them were sensors that are normally in aircraft for measuring acceleration, angular velocity, pitch rate, pressure and the angle of attack. In addition, the model was also provided with expected accuracies for each sensor.
    Their model revealed that not all of the sensors were needed to accurately estimate forward velocity; readings from angular velocity sensors and pressure sensors were enough. Also, these sensors were enough to estimate the other physical states, like the angle of attack, precluding the need for an additional angle-of-attack sensor. In fact, these sensors, although a surrogate for measuring the angle of attack, had the effect of introducing redundancy in the system, resulting in higher system reliability.
    Bhattacharya said the mathematical framework has been designed so that it always indicates the fewest sensors needed, even if it is provided with a repertoire of sensors to choose from.
    “Let’s assume a designer wants to put every type of sensor everywhere. The beauty of our mathematical model is that it will take out the unnecessary sensors and then give you the minimum number of sensors needed and their position,” he said.
    Furthermore, the researchers noted that although the study is from an aerospace engineering perspective, their mathematical model is very general and can impact other systems as well.
    “As engineering systems become bigger and more complex, the question of where to put the sensor becomes more and more difficult,” said Bhattacharya. “So, for example, if you are building a really long wind turbine blade, some physical properties of the system need to be estimated using sensors and these sensors need to be placed at optimal locations to make sure the structure does not fail. This is nontrivial and that’s where our mathematical framework comes in.”