More stories

  • An atom chip interferometer that could detect quantum gravity

    Physicists in Israel have created a quantum interferometer on an atom chip. The device can be used to explore the fundamentals of quantum theory by studying the interference pattern between two beams of atoms. University of Groningen physicist Anupam Mazumdar describes how it could be adapted to use mesoscopic particles instead of atoms, which would greatly expand its range of applications. A description of the device, together with Mazumdar’s theoretical considerations on its applications, was published on 28 May in the journal Science Advances.
    The device that the scientists from Ben-Gurion University of the Negev created is a so-called Stern-Gerlach interferometer, first proposed one hundred years ago by the German physicists Otto Stern and Walther Gerlach. Their original aim of creating an interferometer with freely propagating atoms exposed to gradients from macroscopic magnets had not been practically realized until now. ‘Such experiments have been done using photons, but never with atoms’, explains Anupam Mazumdar, Professor of Theoretical Physics at the University of Groningen and one of the co-authors of the article in Science Advances.
    Diamonds
    The Israeli scientists, led by Professor Ron Folman, created the interferometer on an atom chip, which can confine and manipulate atoms. A beam of rubidium atoms is levitated over the chip using magnets. Magnetic gradients are used to split the beam according to the spin values of the individual atoms. Spin is a quantum property with an associated magnetic moment that can take one of two values: up or down. The spin-up and spin-down atoms are separated by a magnetic gradient. Subsequently, the two divergent beams are brought together again and recombined. The spin values are then measured, and an interference pattern is formed. Spin is a quantum phenomenon, and throughout this interferometer, the opposing spins are entangled. This makes the interferometer sensitive to other quantum phenomena.
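    The logic of such a device can be captured in a few lines. Below is a minimal numerical sketch of an idealized two-path spin-1/2 interferometer, not the Folman group’s actual apparatus or control code: a pulse splits the state into two spin paths, the paths acquire a relative phase, and recombination converts that phase into measurable spin populations.

```python
import numpy as np

# Idealized spin-1/2 interferometer (illustration only).

def beam_splitter():
    """Unitary that puts |up> into an equal superposition of up and down."""
    return np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def spin_up_probability(phase):
    """Split, apply a relative phase between the paths, recombine, measure."""
    psi = np.array([1.0, 0.0], dtype=complex)       # start in |up>
    psi = beam_splitter() @ psi                     # split into two spin paths
    psi = np.diag([1.0, np.exp(1j * phase)]) @ psi  # paths acquire relative phase
    psi = beam_splitter() @ psi                     # recombine the paths
    return abs(psi[0]) ** 2                         # population of |up>

for phi in np.linspace(0.0, 2.0 * np.pi, 5):
    print(f"phase {phi:4.2f} rad -> P(up) = {spin_up_probability(phi):.2f}")
```

    Sweeping the phase traces out a full interference fringe in the spin populations; in the real device, the phase is set by the magnetic gradients the atoms experience along their two paths.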
    Mazumdar was not involved in the construction of the chip, but he contributed theoretical insights to the paper. Together with a number of his colleagues, he previously proposed an experiment to determine whether gravity is in fact a quantum phenomenon using entangled mesoscopic objects, namely tiny diamonds that can be brought into a state of quantum superposition. ‘It would be possible to use these diamonds instead of the rubidium atoms in this interferometer’, he explains. However, this would be highly complex, as the device, which is currently operated at room temperature, would need to be cooled down to around 1 Kelvin for the mesoscopic experiment.
    Free fall
    If this is realized, two of these atom chips could be put into free fall together (to neutralize external gravity), so that any interaction occurring between them would depend on the gravitational pull between the two chips. Mazumdar and his colleagues aim to determine whether quantum entanglement of the pair occurs during free fall, which would mean that the force of gravity between the diamonds is indeed a quantum phenomenon. Another application of this experiment is the detection of gravitational waves; their deformation of space-time should be visible in the interference pattern.
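    The scale of the proposed effect can be estimated with the scaling commonly quoted for such gravitationally induced entanglement proposals, phi ~ Gm²t/(ħd). The sketch below is a back-of-the-envelope estimate in which the mass, separation and free-fall time are illustrative assumptions, not numbers from the paper:

```python
# Order-of-magnitude estimate of the gravitationally induced phase between two
# massive superpositions, using the commonly quoted scaling
#   phi ~ G * m^2 * t / (hbar * d).
# All parameter values below are illustrative assumptions, not from the paper.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s

m = 1e-14          # mass of each mesoscopic diamond, kg (assumed)
d = 250e-6         # separation between the superposed paths, m (assumed)
t = 2.5            # free-fall interaction time, s (assumed)

phi = G * m**2 * t / (HBAR * d)
print(f"entanglement phase ~ {phi:.2f} rad")  # ~0.6 rad with these inputs
```

    A phase of order one radian is what makes such a measurement conceivable in principle, although keeping mesoscopic diamonds coherent for seconds of free fall remains a formidable experimental challenge.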
    The actual implementation of this experiment is still a long way off, but Mazumdar is very excited now that the interferometer has been created. ‘It is already [a] quantum sensor, although we still have to work out exactly what it can detect. The experiment is like the first steps of a baby — now, we have to guide it to reach maturity.’
    Story Source:
    Materials provided by University of Groningen.

  • Using HPC and experiment, researchers continue to refine graphene production

    Graphene may be among the most exciting scientific discoveries of the last century. While it is strikingly familiar to us — graphene is considered an allotrope of carbon, meaning that it is essentially the same substance as graphite but with a different atomic structure — graphene also opened up a new world of possibilities for designing and building new technologies.
    The material is two-dimensional, meaning that each “sheet” of graphene is only one atom thick, but its bonds make it as strong as some of the world’s hardest metal alloys while remaining lightweight and flexible. This valuable, unique mix of properties has piqued the interest of scientists from a wide range of fields, leading to research into using graphene for next-generation electronics, new coatings on industrial instruments and tools, and new biomedical technologies.
    It is perhaps graphene’s immense potential that has created one of its biggest challenges: graphene is difficult to produce in large volumes, and demand for the material is continually growing. Recent research indicates that using a liquid copper catalyst may be a fast, efficient way to produce graphene, but researchers have only a limited understanding of the molecular interactions that occur during the brief, chaotic moments that lead to graphene formation, meaning they cannot yet use the method to reliably produce flawless graphene sheets.
    In order to address these challenges and help develop methods for quicker graphene production, a team of researchers at the Technical University of Munich (TUM) has been using the JUWELS and SuperMUC-NG high-performance computing (HPC) systems at the Jülich Supercomputing Centre (JSC) and Leibniz Supercomputing Centre (LRZ) to run high-resolution simulations of graphene formation on liquid copper.
    A window into experiment
    Graphene’s appeal primarily stems from the material’s perfectly uniform crystal structure, meaning that producing graphene with impurities is wasted effort. For laboratory settings or circumstances where only a small amount of graphene is needed, researchers can place a piece of scotch tape onto a graphite crystal and “peel” away atomic layers of the graphite using a technique that resembles how one would use tape or another adhesive to help remove pet hair from clothing. While this reliably produces flawless graphene layers, the process is slow and impractical for creating graphene for large-scale applications.

  • Let's talk about the elephant in the data

    You would not be surprised to see an elephant in the savanna or a plate in your kitchen. Based on your prior experiences and knowledge, you know that is where elephants and plates are often to be found. If you saw a mysterious object in your kitchen, how would you figure out what it was? You would rely on your expectations or prior knowledge. Should a computer approach the problem in the same way? The answer may surprise you. Cold Spring Harbor Laboratory Professor Partha Mitra described how he views problems like these in a “Perspective” in Nature Machine Intelligence. He hopes his insights will help researchers teach computers how to analyze complex systems more effectively.
    Mitra thinks it helps to understand the nature of knowledge. Mathematically speaking, many data scientists try to create a model that can “fit an elephant,” or a set of complex data points. Mitra asks researchers to consider what philosophical framework would work best for a particular machine learning task:
    “In philosophical terms, the idea is that there are these two extremes. One, you could say ‘rationalist,’ and the other ‘empiricist’ points of view. And really, it’s about the role of prior knowledge or prior assumptions.”
    Rationalists versus empiricists
    A rationalist views the world through the lens of prior knowledge. They expect a plate to be in a kitchen and an elephant in a savanna.
    An empiricist analyzes the data exactly as it is presented. When they visit the savanna, they no more expect to see an elephant than they do a plate.
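    In machine-learning practice, this contrast often shows up as the choice of regularization: a prior belief that model parameters should stay small plays the rationalist role. Below is a toy sketch of that idea; the data, polynomial degree and penalty are invented for illustration and are not from Mitra’s Perspective:

```python
import numpy as np

# Toy contrast between an "empiricist" fit (trust the data alone) and a
# "rationalist" fit (impose prior knowledge via a ridge penalty).

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 15)
y = np.sin(np.pi * x) + rng.normal(0, 0.2, x.size)  # noisy observations

X = np.vander(x, N=12)  # degree-11 polynomial features: plenty of room to overfit

def fit(X, y, alpha):
    """Least squares; alpha > 0 acts as a prior that coefficients are small."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

for alpha, label in [(0.0, "empiricist (no prior)"),
                     (1e-2, "rationalist (ridge prior)")]:
    w = fit(X, y, alpha)
    print(f"{label:28s} largest |coefficient| = {np.abs(w).max():.1f}")
```

    With no penalty, the high-degree polynomial happily “fits the elephant,” and its coefficients tend to blow up as it chases noise; the small ridge prior keeps them tame.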

  • A better way to introduce digital tech in the workplace

    When bringing technologies into the workplace, it pays to be realistic. Often, for instance, bringing new digital technology into an organization does not radically improve a firm’s operations. Despite high-level planning, a more frequent result is the messy process of frontline employees figuring out how they can get tech tools to help them to some degree.
    That task can easily fall on overburdened workers who have to grapple with getting things done, but don’t always have much voice in an organization. So isn’t there a way to think systematically about implementing digital technology in the workplace?
    MIT Professor Kate Kellogg thinks there is, and calls it “experimentalist governance of digital technology”: Let different parts of an organization experiment with the technology — and then centrally remove roadblocks to adopt the best practices that emerge, firm-wide.
    “If you want to get value out of new digital technology, you need to allow local teams to adapt the technology to their setting,” says Kellogg, the David J. McGrath Jr. Professor of Management and Innovation at the MIT Sloan School of Management. “You also need to form a central group that’s tracking all these local experiments, and revising processes in response to problems and possibilities. If you just let everyone do everything locally, you’re going to see resistance to the technology, particularly among frontline employees.”
    Kellogg’s perspective comes after she conducted an 18-month close ethnographic study of a teaching hospital, examining many facets of its daily workings — including things like the integration of technology into everyday medical practices.
    Some of the insights from that organizational research now appear in a paper Kellogg has written, “Local Adaptation Without Work Intensification: Experimentalist Governance of Digital Technology for Mutually Beneficial Role Reconfiguration in Organizations,” recently published online in the journal Organization Science.

  • Shoot better drone videos with a single word

    Shooting a good drone video takes effort. First, it takes skill to fly the often expensive pieces of equipment smoothly and without crashing. And once you’ve mastered flying, there are camera angles, panning speeds, trajectories and flight paths to plan.
    With all the sensors and processing power onboard a drone and embedded in its camera, there must be a better way to capture the perfect shot.
    “Sometimes you just want to tell the drone to make an exciting video,” said Rogerio Bonatti, a Ph.D. candidate in Carnegie Mellon University’s Robotics Institute.
    Bonatti was part of a team from CMU, the University of Sao Paulo and Facebook AI Research that developed a model that enables a drone to shoot a video based on a desired emotion or viewer reaction. The drone uses camera angles, speeds and flight paths to generate a video that could be exciting, calm, enjoyable or nerve-wracking — depending on what the filmmaker tells it.
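    Conceptually, the system maps a high-level feeling to low-level shot parameters. The sketch below is a purely illustrative stand-in for that mapping; the actual CMU/FAIR model learns it from human reactions to footage, and every name and number here is hypothetical:

```python
# Hypothetical mapping from a requested feeling to shot parameters.
# Nothing below comes from the team's paper; it only illustrates the concept.

SHOT_STYLES = {
    "exciting":       {"speed_m_s": 8.0, "altitude_m": 3.0,  "pan_rate_deg_s": 40.0},
    "calm":           {"speed_m_s": 2.0, "altitude_m": 12.0, "pan_rate_deg_s": 5.0},
    "nerve-wracking": {"speed_m_s": 6.0, "altitude_m": 1.5,  "pan_rate_deg_s": 25.0},
}

def plan_shot(emotion: str) -> dict:
    """Return trajectory/camera parameters for the requested emotion."""
    if emotion not in SHOT_STYLES:
        raise ValueError(f"no style defined for {emotion!r}")
    return SHOT_STYLES[emotion]

print(plan_shot("exciting"))
```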
    The team presented their paper on the work at the 2021 International Conference on Robotics and Automation this month.

  • Researchers design simulation tool to predict disease, pest spread

    North Carolina State University researchers have developed a computer simulation tool to predict when and where pests and diseases will attack crops or forests, and also test when to apply pesticides or other management strategies to contain them.
    “It’s like having a bunch of different Earths to experiment on to test how something will work before spending the time, money and effort to do it,” said the study’s lead author Chris Jones, research scholar at North Carolina State University’s Center for Geospatial Analytics.
    In the journal Frontiers in Ecology and the Environment, researchers reported on their efforts to develop and test the tool, which they called “PoPS,” for the Pest or Pathogen Spread Forecasting Platform. Working with the U.S. Department of Agriculture’s Animal and Plant Health Inspection Service, they created the tool to forecast any type of disease or pathogen, no matter the location.
    Their computer modeling system works by combining information on climate conditions suitable for spread of a certain disease or pest with data on where cases have been recorded, the reproductive rate of the pathogen or pest and how it moves in the environment. Over time, the model improves as natural resource managers add data they gather from the field. This repeated feedback with new data helps the forecasting system get better at predicting future spread, the researchers said.
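    In spirit, such a model is a reproduction-and-dispersal simulation on a grid of locations, with climate suitability modulating both steps. The following toy sketch shows that general structure; the grid, dispersal kernel and all parameter values are assumptions for illustration, not the actual PoPS code or configuration:

```python
import numpy as np

# Toy version of the kind of model PoPS implements: infested cells produce
# propagules (a reproductive rate scaled by climate suitability), which
# disperse to nearby cells and may establish there.

rng = np.random.default_rng(1)
size = 50
infested = np.zeros((size, size), dtype=bool)
infested[25, 25] = True                            # initial detection
suitability = rng.uniform(0.2, 1.0, (size, size))  # climate suitability per cell
rate = 1.5                                         # propagules per infested cell per step

for step in range(20):
    sources = np.argwhere(infested)
    offspring = rng.poisson(rate * suitability[infested])
    for (r, c), k in zip(sources, offspring):
        for _ in range(k):
            dr, dc = rng.normal(0, 2, size=2).round().astype(int)  # Gaussian dispersal (assumed)
            rr = np.clip(r + dr, 0, size - 1)
            cc = np.clip(c + dc, 0, size - 1)
            if rng.random() < suitability[rr, cc]:  # establishment depends on climate
                infested[rr, cc] = True

print(f"infested cells after 20 steps: {infested.sum()}")
```

    In PoPS itself, quantities such as the reproductive rate are calibrated against observed case data, which is how the field feedback the researchers describe makes the forecasts better over time.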
    “We have a tool that can be put into the hands of a non-technical user to learn about disease dynamics and management, and how management decisions will affect spread in the future,” Jones said.
    The tool is needed as state and federal agencies charged with controlling pests and crop diseases face an increasing number of threats to crops, trees and other important natural resources. These pests threaten food supplies and biodiversity in forests and ecosystems.
    “The biggest problem is the sheer number of new pests and pathogens that are coming in,” Jones said. “State and federal agencies charged with managing them have an ever-decreasing budget to spend on an ever-increasing number of pests. They have to figure out how to spend that money as wisely as possible.”
    Already, researchers have been using PoPS to track the spread of eight different emerging pests and diseases. In the study, they described honing the model to track sudden oak death, a disease that has killed millions of trees in California since the 1990s. A new, more aggressive strain of the disease has been detected in Oregon.
    They are also improving the model to track spotted lanternfly, an invasive pest in the United States that primarily infests a certain invasive type of tree known as “tree of heaven.” Spotted lanternfly has been infesting fruit crops in Pennsylvania and neighboring states since 2014. It can attack grape, apple and cherry crops, as well as almonds and walnuts.
    The researchers said that just as meteorologists incorporate data into models to forecast weather, ecological scientists are using data to improve forecasting of environmental events — including pest or pathogen spread.
    “There’s a movement in ecology to forecast environmental conditions,” said Megan Skrip, a study co-author and science communicator at the Center for Geospatial Analytics. “If we can forecast the weather, can we forecast where there will be an algal bloom, or what species will be in certain areas at certain times? This paper is one of the first demonstrations of doing this for the spread of pests and pathogens.”
    Story Source:
    Materials provided by North Carolina State University. Original written by Laura Oleniacz.

  • New algorithm for modern quilting

    Stanford University computer science graduate student Mackenzie Leake has been quilting since age 10, but she never imagined the craft would be the focus of her doctoral dissertation. Included in that work is new prototype software that can facilitate pattern-making for a form of quilting called foundation paper piecing, which involves using a backing made of foundation paper to lay out and sew a quilted design.
    Developing a foundation paper piece quilt pattern — which looks similar to a paint-by-numbers outline — is often non-intuitive. There are few formal guidelines for patterning, and those that do exist are insufficient to ensure a successful result.
    “Quilting has this rich tradition and people make these very personal, cherished heirlooms but paper piece quilting often requires that people work from patterns that other people designed,” said Leake, who is a member of the lab of Maneesh Agrawala, the Forest Baskett Professor of Computer Science and director of the Brown Institute for Media Innovation at Stanford. “So, we wanted to produce a digital tool that lets people design the patterns that they want to design without having to think through all of the geometry, ordering and constraints.”
    A paper describing this work will be presented at the computer graphics conference SIGGRAPH 2021 in August.
    Respecting the craft
    In describing the allure of paper piece quilts, Leake cites the modern aesthetic and high level of control and precision. The seams of the quilt are sewn through the paper pattern and, as the seaming process proceeds, the individual pieces of fabric are flipped over to form the final design. All of this “sew and flip” action means the pattern must be produced in a careful order.
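    At its core, the ordering problem resembles dependency resolution: a piece can only be added once the pieces its seam builds on are already in place. The sketch below casts a hypothetical five-piece pattern as precedence constraints and asks for any valid sewing order, a simplification of the geometric reasoning Leake’s tool actually performs:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Minimal sketch: treat "piece A must be sewn before piece B" as a dependency
# graph and ask for any valid sewing order. The five-piece pattern below is
# hypothetical; the real tool must also derive such constraints from the
# pattern's geometry rather than being handed them.

constraints = {
    "B": {"A"},       # B is flipped over a seam shared with A
    "C": {"A"},
    "D": {"B", "C"},  # D's seam builds on both B and C
    "E": {"D"},
}

order = list(TopologicalSorter(constraints).static_order())
print("one valid sewing order:", order)  # e.g. ['A', 'B', 'C', 'D', 'E']
```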

  • Something mysteriously wiped out about 90 percent of sharks 19 million years ago

    About 19 million years ago, something terrible happened to sharks.

    Fossils gleaned from sediments in the Pacific Ocean reveal a previously unknown and dramatic shark extinction event, during which populations of the predators abruptly dropped by up to 90 percent, researchers report in the June 4 Science. And scientists don’t know what might have caused the die-off.

    “It’s a great mystery,” says Elizabeth Sibert, a paleobiologist and oceanographer at Yale University. “Sharks have been around for 400 million years. They’ve been through hell and back. And yet this event wiped out [up to] 90 percent of them.”

    Sharks suffered losses of 30 to 40 percent in the aftermath of the asteroid strike that killed off all nonbird dinosaurs 66 million years ago (SN: 8/2/18). But after that, sharks enjoyed about 45 million years of peaceful ocean dominance, sailing through even large climate disruptions such as the Paleocene-Eocene Thermal Maximum — an episode about 56 million years ago marked by a sudden spike in global carbon dioxide and soaring temperatures — without much trouble (SN: 5/7/15).

    Now, clues found in the fine red clay sediments beneath two vast regions of the Pacific add a new, surprising chapter to sharks’ story.

    Sibert and Leah Rubin, then an undergraduate student at the College of the Atlantic in Bar Harbor, Maine, sifted through fish teeth and shark scales buried in sediment cores collected during previous research expeditions to the North and South Pacific oceans.

    “The project came out of a desire to better understand the natural background variability of these fossils,” Sibert says. Sharks’ bodies are made of mostly cartilage, which doesn’t tend to fossilize. But their skin is covered in tiny scales, or dermal denticles, each about the width of a human hair follicle. These scales make for an excellent record of past shark abundance: Like shark teeth, the scales are made of the mineral bioapatite, which is readily preserved in sediments. “And we will find several hundred more denticles compared to a tooth,” Sibert says.

    Researchers sorted fossil shark scales, or denticles, into two main types: those with linear striations (left) and those with geometric shapes and no striations (right). Following the shark extinction event 19 million years ago, the geometric denticles all but disappeared from ocean sediments. (Image: E.C. Sibert and L.D. Rubin/Science 2021)

    The researchers weren’t expecting to see anything particularly startling. From 66 million years ago to about 19 million years ago, the ratio of fish teeth to shark scales in the sediments held steady at about 5 to 1. But abruptly — the team estimates within 100,000 years, and possibly even faster — that ratio dramatically changed, to 100 fish teeth for every 1 shark scale.
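    A rough sense of what that shift implies: if fish tooth accumulation stayed roughly constant across the event (a simplifying assumption; fish abundance may also have changed), the move from 5:1 to 100:1 means shark scale accumulation fell by a factor of about 20.

```python
# Back-of-the-envelope reading of the ratio shift, assuming fish tooth
# accumulation stayed roughly constant across the event (a simplification;
# fish abundance may have changed as well).

teeth_per_scale_before = 5    # ~5 fish teeth per shark scale before the event
teeth_per_scale_after = 100   # ~100 fish teeth per shark scale after

relative_scale_flux = teeth_per_scale_before / teeth_per_scale_after
print(f"shark-scale accumulation fell to {relative_scale_flux:.0%} of its prior level")
# -> 5%, i.e. roughly a 95% drop, in line with the reported ~90 percent decline
```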

    The sudden disappearance of shark scales coincided with a change in the abundances of shark scale shapes, which give some clues to changes in biodiversity. Most modern sharks have linear striations on their scales, which may offer some boost to their swimming efficiency. But some sharks lack these striations; instead, the scales come in a variety of geometric shapes. By analyzing the change in the different shapes’ abundances before and after 19 million years ago, the researchers estimated a loss of shark biodiversity of between 70 and 90 percent. The extinction event was “selective,” says Rubin, now a marine scientist at the State University of New York College of Environmental Science and Forestry in Syracuse. After the event, the geometric scales “were almost gone, and never really showed up again in the diversity that they [previously] did.”

    There’s no obvious climate event that might explain such a massive shark population shift, Sibert says. “Nineteen million years ago is not known as a formative time in Earth’s history.” Solving the mystery of the die-off is at the top of a long list of questions she hopes to answer. Other questions include better understanding how the different denticles might relate to shark lineages, and what impact the sudden loss of so many big predators might have had on other ocean dwellers.

    It’s a question with modern implications, as paleobiologist Catalina Pimiento of the University of Zurich and paleobiologist Nicholas Pyenson of the Smithsonian National Museum of Natural History in Washington, D.C., write in a commentary in the same issue of Science. In just the last 50 years, shark abundances in the oceans have dramatically declined by more than 70 percent as a result of overfishing and ocean warming. The loss of sharks — and other top marine predators, such as whales — from the oceans has “profound, complex and irreversible ecological consequences,” the researchers write.

    Indeed, one way to view the study is as a cautionary tale about modern conservation’s limits, says marine conservation biologist Catherine Macdonald of the University of Miami, who was not involved with this study. “Our power to act to protect what remains does not include an ability to fully reverse or undo the effects of the massive environmental changes we have already made.”

    Populations of top ocean predators can be important indicators of those changes — and unraveling how the ocean ecosystem responded to their loss in the past could help researchers anticipate what may happen in the near future, Sibert says. “The sharks are trying to tell us something,” she adds, “and I can’t wait to find out what it is.”