More stories

  • Exposure assessment for Deepwater Horizon oil spill: Health outcomes

    Nearly 12 years after the Deepwater Horizon oil spill, scientists are still examining the potential health effects on workers and volunteers who experienced oil-related exposures.
    To help shape future prevention efforts, one West Virginia University researcher — Caroline Groth, assistant professor in the School of Public Health’s Department of Epidemiology and Biostatistics — has developed novel statistical methods for assessing airborne exposure. Working with collaborators from multiple institutions, Groth has made it possible for researchers to characterize oil spill exposures in greater detail than has ever been done before.
    Because very few Ph.D. biostatisticians work in occupational health, there were few appropriate statistical methodologies for assessing inhalation exposures in the GuLF STUDY, a study launched by the National Institute of Environmental Health Sciences shortly after the Deepwater Horizon oil spill. The study, the largest ever conducted following an oil spill, examines the health of people involved in the response and clean-up efforts. Groth was part of the exposure assessment team, led by Patricia Stewart and Mark Stenzel, that was tasked with characterizing worker exposures.
    Groth’s statistical methods, which she began developing in 2012, laid the framework for a crucial step: determining whether there are associations between health outcomes and exposures from the oil spill and clean-up work, which involved more than 9,000 vessels deployed in Gulf of Mexico waters off Alabama, Florida, Louisiana and Mississippi, as well as tens of thousands of workers on the water and on land.
    The Deepwater Horizon oil spill is considered the largest marine oil spill in the history of the U.S.
    “Workers were exposed differently based on their activities, time of exposure, etc., and our research team’s goal was to develop exposure estimates for each of those scenarios and then link them to the participants’ work history through an ‘exposure matrix,'” Groth said.
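    The “exposure matrix” idea can be pictured as a lookup from exposure scenario to an estimated airborne level, combined with each participant’s work history. The sketch below is purely illustrative; the scenario names and numbers are hypothetical, not data or methods from the GuLF STUDY.

```python
# Illustrative job-exposure matrix: (scenario, month) -> estimated
# airborne exposure level (arbitrary units). Values are hypothetical.
exposure_matrix = {
    ("vessel_deck", "2010-06"): 2.4,
    ("beach_cleanup", "2010-06"): 0.8,
    ("decontamination", "2010-07"): 1.5,
}

# One participant's work history: (scenario, month, days worked).
work_history = [
    ("vessel_deck", "2010-06", 10),
    ("beach_cleanup", "2010-06", 5),
]

# Cumulative exposure: sum of level * days across the work history.
cumulative = sum(
    exposure_matrix[(scenario, month)] * days
    for scenario, month, days in work_history
)
print(cumulative)  # 2.4*10 + 0.8*5 = 28.0
```

    Epidemiologists can then relate such cumulative estimates to health outcomes across thousands of participants.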

  • Predicting the most stable boron nitride structure with quantum simulations

    Boron nitride (BN) is a versatile material with applications in a variety of engineering and scientific fields. This is largely due to an interesting property of BN called “polymorphism,” characterized by the ability to crystallize into more than one type of structure. This generally occurs as a response to changes in temperature, pressure, or both. Furthermore, the different structures, called “polymorphs,” differ remarkably in their physical properties despite having the same chemical formula. As a result, polymorphs play an important role in material design, and a knowledge of how to selectively favor the formation of the desired polymorph is crucial in this regard.
    However, BN polymorphs pose a particular problem. Despite several experiments assessing their relative stabilities, no consensus has emerged on the topic. While computational methods are often the go-to approach for such problems, BN polymorphs have posed serious challenges to standard computational techniques owing to the weak “van der Waals (vdW) interactions” between their layers, which are not accounted for in these computations. Moreover, the four stable BN polymorphs, namely rhombohedral (rBN), hexagonal (hBN), wurtzite (wBN), and zinc-blende (cBN), lie within a narrow energy range, making the capture of small energy differences together with vdW interactions even more challenging.
    Fortunately, an international research team led by Assistant Professor Kousuke Nakano from Japan Advanced Institute of Science and Technology (JAIST) has now provided evidence to settle the debate. In their study, they addressed the issue with a state-of-the-art first-principles calculation framework, namely fixed-node diffusion Monte Carlo (FNDMC) simulations. FNDMC is a variant of the popular quantum Monte Carlo method, in which a parametrized many-body quantum “wavefunction” is first optimized toward the ground state and then supplied to the FNDMC calculation.
    Additionally, the team computed the Gibbs energy (the useful work obtainable from a system at constant pressure and temperature) of the BN polymorphs at different temperatures and pressures using density functional theory (DFT) and phonon calculations. The paper was made available online on March 24, 2022, and published in The Journal of Physical Chemistry C.
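    For context, the Gibbs energy compared here is the standard thermodynamic potential (textbook definition, not a formula from the paper):

```latex
G = U + pV - TS = H - TS
```

    where U is the internal energy, p the pressure, V the volume, T the temperature, S the entropy, and H the enthalpy. At a given temperature and pressure, the polymorph with the lowest G is the thermodynamically stable one.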
    According to the FNDMC results, hBN was the most stable structure, followed by rBN, cBN, and wBN. These results were consistent at both 0 K and 300 K (room temperature). However, the DFT estimations yielded conflicting results for two different approximations. Dr. Nakano explains these contradictory findings: “Our results reveal that the estimation of relative stabilities is greatly influenced by the exchange correlational functional, or the approximation used in the DFT calculation. As a result, a quantitative conclusion cannot be reached using DFT findings, and a more accurate approach, such as FNDMC, is required.”
    Notably, the FNDMC results were in agreement with those generated by other refined computational methods, such as “coupled cluster,” suggesting that FNDMC is an effective tool for dealing with polymorphs, especially those governed by vdW forces. The team also showed that FNDMC can provide other important information, such as reliable reference energies, when experimental data are unavailable.
    Dr. Nakano is excited about the future prospects of the method in the area of materials science. “Our study demonstrates the ability of FNDMC to detect tiny energy changes involving vdW forces, which will stimulate the use of this method for other van der Waals materials,” he says. “Moreover, molecular simulations based on this accurate and reliable method could empower material designs, enabling the development of medicines and catalysts.”

  • Hybrid quantum bit based on topological insulators

    With their superior properties, topological qubits could help achieve a breakthrough in the development of a quantum computer designed for universal applications. So far, no one has yet succeeded in unambiguously demonstrating a quantum bit, or qubit for short, of this kind in a lab. However, scientists from Forschungszentrum Jülich have now gone some way to making this a reality. For the first time, they succeeded in integrating a topological insulator into a conventional superconducting qubit. Just in time for “World Quantum Day” on 14 April, their novel hybrid qubit made it to the cover of the latest issue of the journal Nano Letters.
    Quantum computers are regarded as the computers of the future. Using quantum effects, they promise to deliver solutions for highly complex problems that cannot be processed by conventional computers in a realistic time frame. However, the widespread use of such computers is still a long way off. Current quantum computers generally contain only a small number of qubits. The main problem is that they are highly prone to error. The bigger the system, the more difficult it is to fully isolate it from its environment.
    Many hopes are therefore pinned on a new type of quantum bit — the topological qubit. This approach is being pursued by several research groups as well as companies such as Microsoft. This type of qubit exhibits the special feature that it is topologically protected; the particular geometric structure of the superconductors as well as their special electronic material properties ensure that quantum information is retained. Topological qubits are therefore considered to be particularly robust and largely immune to external sources of decoherence. They also appear to enable fast switching times comparable to those achieved by the conventional superconducting qubits used by Google and IBM in current quantum processors.
    However, it is not yet clear whether we will ever succeed in actually producing topological qubits. This is because a suitable material basis is still lacking to experimentally generate the special quasiparticles required for this without any doubt. These quasiparticles are also known as Majorana states. Until now, they could only be unambiguously demonstrated in theory, but not in experiments. Hybrid qubits, as they have now been constructed for the first time by the research group led by Dr. Peter Schüffelgen at the Peter Grünberg Institute (PGI-9) of Forschungszentrum Jülich, are now opening up new possibilities in this area. They already contain topological materials at crucial points. Therefore, this novel type of hybrid qubit provides researchers with a new experimental platform to test the behaviour of topological materials in highly sensitive quantum circuits.
    Story Source:
    Materials provided by Forschungszentrum Juelich. Note: Content may be edited for style and length.

  • Graphene-hBN breakthrough to spur new LEDs, quantum computing

    In a discovery that could speed research into next-generation electronics and LED devices, a University of Michigan research team has developed the first reliable, scalable method for growing single layers of hexagonal boron nitride on graphene.
    The process, which can produce large sheets of high-quality hBN with the widely used molecular-beam epitaxy process, is detailed in a study in Advanced Materials.
    Graphene-hBN structures can power LEDs that generate deep-UV light, which is impossible in today’s LEDs, said Zetian Mi, U-M professor of electrical engineering and computer science and a corresponding author of the study. Deep-UV LEDs could drive smaller size and greater efficiency in a variety of devices including lasers and air purifiers.
    “The technology used to generate deep-UV light today is mercury-xenon lamps, which are hot, bulky, inefficient and contain toxic materials,” Mi said. “If we can generate that light with LEDs, we could see an efficiency revolution in UV devices similar to what we saw when LED light bulbs replaced incandescents.”
    Hexagonal boron nitride is the world’s thinnest insulator, while graphene is the thinnest of a class of materials called semimetals, which have highly malleable electrical properties and are important for their role in computers and other electronics.
    Bonding hBN and graphene together in smooth, single-atom-thick layers unleashes a treasure trove of exotic properties. In addition to deep-UV LEDs, graphene-hBN structures could enable quantum computing devices, smaller and more efficient electronics and optoelectronics and a variety of other applications.

  • Researchers create miniature wide-angle camera with flat metalenses

    Researchers have designed a new compact camera that acquires high-quality wide-angle images using an array of metalenses — flat nanopatterned surfaces used to manipulate light. By eliminating the bulky and heavy lenses typically required for this type of imaging, the new approach could enable wide-angle cameras to be incorporated into smartphones and portable imaging devices for vehicles such as cars or drones.
    Tao Li and colleagues from Nanjing University in China report their new ultrathin camera in Optica, Optica Publishing Group’s journal for high-impact research. The new camera, which is just 0.3 centimeters thick, can produce clear images of a scene with a viewing angle of more than 120 degrees.
    Wide-angle imaging is useful for capturing large amounts of information that can create stunning, high-quality images. For machine vision applications such as autonomous driving and drone-based surveillance, wide-angle imaging can enhance performance and safety, for example by revealing an obstacle you couldn’t otherwise see while backing up in a vehicle.
    “To create an extremely compact wide-angle camera, we used an array of metalenses that each capture certain parts of the wide-angle scene,” said Li. “The images are then stitched together to create a wide-angle image without any degradation in image quality.”
    Miniaturizing the wide-angle lens
    Wide-angle imaging is usually accomplished with a fish-eye compound lens or other type of multilayer lens. Although researchers have previously tried to use metalenses to create wide-angle cameras, those cameras tend to suffer from poor image quality or other drawbacks.
    In the new work, the researchers used an array of metalenses that are each carefully designed to focus a different range of illumination angles. This allows each lens to clearly image part of a wide-angle object or scene. The clearest parts of each image can then be computationally stitched together to create the final image.
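    The stitching step can be sketched in a few lines. The function below is a toy illustration — 1-D “images” with hand-assigned sharpness bands — not the authors’ reconstruction pipeline; in practice, each metalens is designed for a known range of field angles, so the band assignment is fixed by the optics.

```python
# Toy sketch of angle-based stitching: each metalens yields a
# sub-image that is sharp only over its own band of field angles,
# and the final wide-angle image keeps, per band, the sub-image
# responsible for it.
def stitch(sub_images, bands_per_image):
    """sub_images: equal-length pixel rows (lists of floats).
    bands_per_image: for each image, (start, end) column indices of
    the band where that image is sharpest."""
    width = len(sub_images[0])
    out = [0.0] * width
    for img, (start, end) in zip(sub_images, bands_per_image):
        out[start:end] = img[start:end]  # copy only the sharp band
    return out

left = [1.0] * 8    # sub-image sharp over the left half of the field
right = [2.0] * 8   # sub-image sharp over the right half
print(stitch([left, right], [(0, 4), (4, 8)]))
# [1.0, 1.0, 1.0, 1.0, 2.0, 2.0, 2.0, 2.0]
```

    Because each sub-image only contributes where it is sharpest, the stitched result keeps quality across the full field of view.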
    “Thanks to the flexible design of the metasurfaces, the focusing and imaging performance of each lens can be optimized independently,” said Li. “This gives rise to a high-quality final wide-angle image after a stitching process. What’s more, the array can be manufactured using just one layer of material, which helps keep costs down.”
    Seeing more with flat lenses
    To demonstrate the new approach, the researchers used nanofabrication to create a metalens array and mounted it directly to a CMOS sensor, creating a planar camera that measured about 1 cm × 1 cm × 0.3 cm. They then used this camera to image a wide-angle scene created by using two projectors to illuminate a curved screen surrounding the camera at a distance of 15 cm.
    They compared their new planar camera with one based on a single traditional metalens while imaging the words “Nanjing University” projected across the curved screen. The planar camera produced an image that showed every letter clearly and had a viewing angle larger than 120°, more than three times larger than that of the camera based on a traditional metalens.
    The researchers note that the planar camera demonstrated in this research used individual metalenses just 0.3 millimeters in diameter. They plan to enlarge these to about 1 to 5 millimeters to increase the camera’s imaging quality. After optimization, the array could be mass produced to reduce the cost of each device.
    Story Source:
    Materials provided by Optica. Note: Content may be edited for style and length.

  • Coastal cities around the globe are sinking

    Coastal cities around the globe are sinking by up to several centimeters per year, on average, satellite observations reveal. The one-two punch of subsiding land and rising seas means that these coastal regions are at greater risk for flooding than previously thought, researchers report in the April 16 Geophysical Research Letters.

    Matt Wei, an earth scientist at the University of Rhode Island in Narragansett, and colleagues studied 99 coastal cities on six continents. “We tried to balance population and geographic location,” he says. While subsidence has been measured in cities previously, earlier research has tended to focus on just one city or region. This investigation is different, Wei says. “It’s one of the first to really use data with global coverage.”

    Wei and his team relied on observations made from 2015 to 2020 by a pair of European satellites. Instruments onboard beam microwave signals toward Earth and then record the waves that bounce back. By measuring the timing and intensity of those reflected waves, the team determined the height of the ground with millimeter accuracy. And because each satellite flies over the same part of the planet every 12 days, the researchers were able to trace how the ground deformed over time.
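    The measurement principle described above reduces to two small formulas: range from the two-way travel time of the radar pulse, and line-of-sight ground motion from the interferometric phase change between repeat passes. The sketch below uses an assumed C-band wavelength typical of Sentinel-1-class satellites; it is an illustration of the principle, not the study’s processing chain.

```python
import math

# Assumed constants (not figures from the study):
C = 299_792_458.0      # speed of light, m/s
WAVELENGTH = 0.0555    # approx. C-band radar wavelength, m

def slant_range(round_trip_time_s):
    """Distance to the ground: half the two-way travel distance."""
    return C * round_trip_time_s / 2.0

def los_displacement(phase_change_rad):
    """Line-of-sight ground motion between two passes: a full 2*pi
    phase cycle corresponds to lambda/2 of motion, because the extra
    distance is traveled twice (out and back)."""
    return WAVELENGTH * phase_change_rad / (4.0 * math.pi)

# One full interferometric fringe corresponds to lambda/2 of motion.
print(f"{los_displacement(2 * math.pi) * 1000:.2f} mm per fringe")
```

    Millimeter-level accuracy comes from measuring the phase of the returned wave, which resolves a small fraction of one wavelength.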

    The largest subsidence rates — up to five centimeters per year — are mostly in Asian cities like Tianjin, China; Karachi, Pakistan; and Manila, Philippines, the team found. What’s more, one-third, or 33, of the analyzed cities are sinking in some places by more than a centimeter per year.

    That’s a worrying trend, says Darío Solano-Rojas, an earth scientist at the National Autonomous University of Mexico in Mexico City who was not involved in the research. These cities are being hit with a double whammy: At the same time that sea levels are rising due to climate change, the land is sinking (SN: 8/15/18). “Understanding that part of the problem is a big deal,” Solano-Rojas says.

    Wei and his colleagues think that the subsidence is largely caused by people. When the researchers looked at Google Earth imagery of the regions within cities that were rapidly sinking, the team saw mostly residential or commercial areas. That’s a tip-off that the culprit is groundwater extraction, the team concluded. Landscapes tend to settle as water is pumped out of aquifers (SN: 10/22/12).

    But there’s reason to be hopeful. In the past, cities such as Shanghai and Indonesia’s Jakarta were sinking by more than 10 centimeters per year, on average. But now subsidence in those places has slowed, possibly due to recent governmental regulations limiting groundwater extraction.

  • Tear-free hair brushing? All you need is math

    As anyone who has ever had to brush long hair knows, knots are a nightmare. But with enough experience, most learn the tricks of detangling with the least amount of pain — start at the bottom, work your way up to the scalp with short, gentle brushes, and apply detangler when necessary.
    L. Mahadevan, the Lola England de Valpine Professor of Applied Mathematics, of Organismic and Evolutionary Biology, and of Physics, learned the mechanics of combing years ago while brushing his young daughter’s hair.
    “I recall that detangling spray seemed to work sometimes, but I still had to be careful to comb gently, by starting from the free ends,” said Mahadevan. “But I was soon fired from the job as I was not very patient.”
    While Mahadevan lost his role as hairdresser, he was still a scientist, and the topology, geometry and mechanics of detangling posed interesting mathematical questions relevant to a range of applications, including textile manufacturing and chemical processes such as polymer processing.
    In a new paper, published in the journal Soft Matter, Mahadevan and co-authors Thomas Plumb Reyes and Nicholas Charles, explore the mathematics of combing and explain why the brushing technique used by so many is the most effective method to detangle a bundle of fibers.
    To simplify the problem, the researchers simulated two helically entwined filaments, rather than a whole head of hair.
    “Using this minimal model, we study the detangling of the double helix via a single stiff tine that moves along it, leaving two untangled filaments in its wake,” said Plumb-Reyes, a graduate student at SEAS. “We measured the forces and deformations associated with combing and then simulated it numerically.”
    “Short strokes that start at the free end and move toward the clamped end remove tangles by creating a flow of a mathematical quantity called the ‘link density,’ which characterizes the degree to which hair strands are braided with each other, consistent with simulations of the process,” said Nicholas Charles, a graduate student at SEAS.
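    The “link density” in Charles’s description builds on the classical linking number of two closed curves, given by the Gauss linking integral (a standard result, stated here for context rather than taken from the paper):

```latex
\mathrm{Lk}(\gamma_1, \gamma_2) = \frac{1}{4\pi} \oint_{\gamma_1} \oint_{\gamma_2}
\frac{(\mathbf{r}_1 - \mathbf{r}_2) \cdot (\mathrm{d}\mathbf{r}_1 \times \mathrm{d}\mathbf{r}_2)}{|\mathbf{r}_1 - \mathbf{r}_2|^{3}}
```

    Intuitively, short strokes from the free end transport this link toward the free boundary, where it can escape — which is why the familiar bottom-up brushing technique works.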
    The researchers also identified the optimal minimum length for each stroke — any shorter and it would take forever to comb out all the tangles; any longer and it would be too painful.
    The mathematical principles of brushing developed by Plumb-Reyes, Charles and Mahadevan were recently used by Professor Daniela Rus and her team at MIT to design algorithms for brushing hair by a robot.
    Next, the team aims to study the mechanics of brushing curlier hair and how it responds to humidity and temperature, which may lead to a mathematical understanding of a fact every person with curly hair knows: never brush dry hair.
    This research was supported by funds from the US National Science Foundation, and the Henri Seydoux Fund.

  • Joystick-operated robot could help surgeons treat stroke remotely

    MIT engineers have developed a telerobotic system to help surgeons quickly and remotely treat patients experiencing a stroke or aneurysm. With a modified joystick, surgeons in one hospital may control a robotic arm at another location to safely operate on a patient during a critical window of time that could save the patient’s life and preserve their brain function.
    The robotic system, whose movement is controlled through magnets, is designed to remotely assist in endovascular intervention — a procedure performed in emergency situations to treat strokes caused by a blood clot. Such interventions normally require a surgeon to manually guide a thin wire to the clot, where it can physically clear the blockage or deliver drugs to break it up.
    One limitation of such procedures is accessibility: Neurovascular surgeons are often based at major medical institutions that are difficult to reach for patients in remote areas, particularly during the “golden hour” — the critical period after a stroke’s onset, during which treatment should be administered to minimize any damage to the brain.
    The MIT team envisions that its robotic system could be installed at smaller hospitals and remotely guided by trained surgeons at larger medical centers. The system includes a medical-grade robotic arm with a magnet attached to its wrist. With a joystick and live imaging, an operator can adjust the magnet’s orientation and manipulate the arm to guide a soft and thin magnetic wire through arteries and vessels.
    The researchers demonstrated the system in a “phantom,” a transparent model with vessels replicating complex arteries of the brain. With just an hour of training, neurosurgeons were able to remotely control the robot’s arm to guide a wire through a maze of vessels to reach target locations in the model.
    “We imagine, instead of transporting a patient from a rural area to a large city, they could go to a local hospital where nurses could set up this system. A neurosurgeon at a major medical center could watch live imaging of the patient and use the robot to operate in that golden hour. That’s our future dream,” says Xuanhe Zhao, a professor of mechanical engineering and of civil and environmental engineering at MIT.