More stories

  • Evading the uncertainty principle in quantum physics

The uncertainty principle, first introduced by Werner Heisenberg in the late 1920s, is a fundamental concept of quantum mechanics. In the quantum world, particles like the electrons that power all electrical products can also behave like waves. As a result, particles cannot have a well-defined position and momentum simultaneously. For instance, measuring the momentum of a particle disturbs its position, so the position cannot be precisely defined.
    In recent research, published in Science, a team led by Prof. Mika Sillanpää at Aalto University in Finland has shown that there is a way to get around the uncertainty principle. The team included Dr. Matt Woolley from the University of New South Wales in Australia, who developed the theoretical model for the experiment.
    Instead of elementary particles, the team carried out the experiments using much larger objects: two vibrating drumheads one-fifth of the width of a human hair. The drumheads were carefully coerced into behaving quantum mechanically.
    “In our work, the drumheads exhibit a collective quantum motion. The drums vibrate in an opposite phase to each other, such that when one of them is in an end position of the vibration cycle, the other is in the opposite position at the same time. In this situation, the quantum uncertainty of the drums’ motion is cancelled if the two drums are treated as one quantum-mechanical entity,” explains the lead author of the study, Dr. Laure Mercier de Lepinay.
This means that the researchers were able to simultaneously measure the position and the momentum of the two drumheads — which should not be possible according to the Heisenberg uncertainty principle. Breaking the rule allows them to characterize extremely weak forces driving the drumheads.
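    In textbook notation, the single-oscillator bound and the loophole the collective mode exploits can be sketched as follows (a standard two-mode argument, with symbols assumed here rather than taken from the paper):
```latex
% One drum: position X and momentum P do not commute, so their
% uncertainties obey Heisenberg's bound.
[\hat{X},\hat{P}] = i\hbar \quad\Longrightarrow\quad \Delta X\,\Delta P \ge \frac{\hbar}{2}

% Two drums: form the collective variables
\hat{X}_- = \hat{X}_1 - \hat{X}_2, \qquad \hat{P}_+ = \hat{P}_1 + \hat{P}_2

% These commute, because the two single-drum contributions cancel:
[\hat{X}_-,\hat{P}_+] = [\hat{X}_1,\hat{P}_1] - [\hat{X}_2,\hat{P}_2] = i\hbar - i\hbar = 0

% so no uncertainty relation constrains \Delta X_- \, \Delta P_+ , and both
% collective quantities can in principle be measured precisely when the
% two drums are treated as one entity.
```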
    “One of the drums responds to all the forces of the other drum in the opposing way, kind of with a negative mass,” Sillanpää says.
    Furthermore, the researchers also exploited this result to provide the most solid evidence to date that such large objects can exhibit what is known as quantum entanglement. Entangled objects cannot be described independently of each other, even though they may have an arbitrarily large spatial separation. Entanglement allows pairs of objects to behave in ways that contradict classical physics, and is the key resource behind emerging quantum technologies. A quantum computer can, for example, carry out the types of calculations needed to invent new medicines much faster than any supercomputer ever could.
In macroscopic objects, quantum effects like entanglement are very fragile and are easily destroyed by disturbances from the surrounding environment. Therefore, the experiments were carried out at a very low temperature, only a hundredth of a degree above absolute zero (about -273 degrees Celsius).
    In the future, the research group will use these ideas in laboratory tests aiming at probing the interplay of quantum mechanics and gravity. The vibrating drumheads may also serve as interfaces for connecting nodes of large-scale, distributed quantum networks.
    Story Source:
Materials provided by Aalto University. Note: Content may be edited for style and length.

  • Trial demonstrates early AI-guided detection of heart disease in routine practice

    Heart disease can take a number of forms, but some types of heart disease, such as asymptomatic low ejection fraction, can be hard to recognize, especially in the early stages when treatment would be most effective. The ECG AI-Guided Screening for Low Ejection Fraction, or EAGLE, trial set out to determine whether an artificial intelligence (AI) screening tool developed to detect low ejection fraction using data from an EKG could improve the diagnosis of this condition in routine practice. Study findings are published in Nature Medicine.
Systolic low ejection fraction is defined as the heart’s inability to contract strongly enough with each beat to pump at least 50% of the blood from its chamber. An echocardiogram can readily diagnose low ejection fraction, but this time-consuming imaging test requires more resources than a 12-lead EKG, which is fast, inexpensive and readily available. The AI-enabled EKG algorithm was developed and tested using a convolutional neural network and validated in subsequent studies.
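    The article does not spell out the network’s design, but as a rough illustration of what a 1-D convolutional classifier over 12-lead EKG traces can look like, here is a minimal PyTorch-style sketch; every layer size, the signal length and the class name are assumptions, not the Mayo model:
```python
import torch
import torch.nn as nn

class EcgCnn(nn.Module):
    """Toy 1-D CNN mapping a 12-lead EKG to a low-EF probability.

    Illustrative only: the real EAGLE/Mayo model is not described in the
    article, so every layer size below is an assumption.
    """

    def __init__(self, n_leads: int = 12):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_leads, 16, kernel_size=7, padding=3),  # temporal filters across leads
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                           # collapse the time axis
        )
        self.classifier = nn.Linear(32, 1)                     # one logit: low EF vs. not

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, leads, samples) -> probability of low ejection fraction
        z = self.features(x).squeeze(-1)
        return torch.sigmoid(self.classifier(z))

if __name__ == "__main__":
    model = EcgCnn()
    fake_ekg = torch.randn(2, 12, 5000)   # two synthetic 10-second, 500 Hz recordings
    print(model(fake_ekg))                # two probabilities in (0, 1)
```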
    The EAGLE trial took place in 45 medical institutions in Minnesota and Wisconsin, including rural clinics, and community and academic medical centers. In all, 348 primary care clinicians from 120 medical care teams were randomly assigned to usual care or intervention. The intervention group was alerted to a positive screening result for low ejection fraction via the electronic health record, prompting them to order an echocardiogram to confirm.
    “The AI-enabled EKG facilitated the diagnosis of patients with low ejection fraction in a real-world setting by identifying people who previously would have slipped through the cracks,” says Peter Noseworthy, M.D., a Mayo Clinic cardiac electrophysiologist. Dr. Noseworthy is senior author on the study.
    In eight months, 22,641 adult patients had an EKG under the care of the clinicians in the trial. The AI found positive results in 6% of the patients. The proportion of patients who received an echocardiogram was similar overall, but among patients with a positive screening result, a higher percentage of intervention patients received an echocardiogram.
    “The AI intervention increased the diagnosis of low ejection fraction overall by 32% relative to usual care. Among patients with a positive AI result, the relative increase of diagnosis was 43%,” says Xiaoxi Yao, Ph.D., a health outcomes researcher in cardiovascular diseases at Mayo Clinic and first author on the study. “To put it in absolute terms, for every 1,000 patients screened, the AI screening yielded five new diagnoses of low ejection fraction over usual care.”
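    A back-of-envelope check shows how the relative and absolute figures fit together, assuming both refer to the same screened population:
```python
# Back-of-envelope arithmetic on the reported EAGLE figures (assumption:
# the 32% relative increase and the 5-per-1,000 absolute yield describe
# the same screened population).
absolute_gain_per_1000 = 5          # new low-EF diagnoses per 1,000 screened
relative_increase = 0.32            # 32% more diagnoses than usual care

# If usual care yields x diagnoses per 1,000, then 1.32 * x = x + 5.
usual_care_per_1000 = absolute_gain_per_1000 / relative_increase
print(round(usual_care_per_1000, 1))                             # ~15.6 per 1,000 under usual care
print(round(usual_care_per_1000 * (1 + relative_increase), 1))   # ~20.6 per 1,000 with AI screening
```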
“With EAGLE, the information was readily available in the electronic health record, and care teams could see the results and decide how to use that information,” says Dr. Noseworthy. “The takeaway is that we are likely to see more AI use in the practice of medicine as time goes on. It’s up to us to figure out how to use this in a way that improves care and health outcomes but does not overburden front-line clinicians.”
Also, the EAGLE trial used a positive deviance approach to evaluate the top five care team users and the top five nonusers of the AI screening information. Dr. Yao says this cycle of learning and feedback from physicians will demonstrate ways of improving the adaptation and application of AI technology in practice.
EAGLE is one of the first large-scale trials to demonstrate the value of AI in routine practice. The low ejection fraction algorithm, which has received Food and Drug Administration breakthrough designation, is one of several algorithms developed by Mayo and licensed to Anumana Inc., a new company focused on unlocking hidden biomedical knowledge to enable early detection and accelerate treatment of heart disease. The low ejection fraction algorithm was also previously licensed to Eko Devices Inc., specifically for hand-held devices that are externally applied to the chest.
    The EAGLE trial was funded by Mayo Clinic’s Robert D. and Patricia E. Kern Center for the Science of Health Care Delivery, in collaboration with the departments of Cardiovascular Medicine and Family Medicine, and the Division of Community Internal Medicine.
    Story Source:
Materials provided by Mayo Clinic. Original written by Terri Malloy. Note: Content may be edited for style and length.

  • Quantum drum duet measured

    Like conductors of a spooky symphony, researchers at the National Institute of Standards and Technology (NIST) have “entangled” two small mechanical drums and precisely measured their linked quantum properties. Entangled pairs like this might someday perform computations and transmit data in large-scale quantum networks.
    The NIST team used microwave pulses to entice the two tiny aluminum drums into a quantum version of the Lindy Hop, with one partner bopping in a cool and calm pattern while the other was jiggling a bit more. Researchers analyzed radar-like signals to verify that the two drums’ steps formed an entangled pattern — a duet that would be impossible in the everyday classical world.
    What’s new is not so much the dance itself but the researchers’ ability to measure the drumbeats, rising and falling by just one-quadrillionth of a meter, and verify their fragile entanglement by detecting subtle statistical relationships between their motions.
    The research is described in the May 7 issue of Science.
    “If you analyze the position and momentum data for the two drums independently, they each simply look hot,” NIST physicist John Teufel said. “But looking at them together, we can see that what looks like random motion of one drum is highly correlated with the other, in a way that is only possible through quantum entanglement.”
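    A toy simulation (not the NIST data or analysis) illustrates the kind of statistical signature Teufel describes: each record looks like featureless noise on its own, while the pair is strongly correlated and their difference is far quieter.
```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Toy model: a large shared fluctuation plus small independent noise on each drum.
shared = rng.normal(size=n)
x1 = shared + 0.1 * rng.normal(size=n)   # drum 1 position record (arbitrary units)
x2 = shared + 0.1 * rng.normal(size=n)   # drum 2 position record

# Individually, each record just looks like broadband ("hot") noise.
print(np.std(x1), np.std(x2))

# Jointly, the records are almost perfectly correlated, and their difference
# is far quieter than either record alone. (A classical stand-in for the
# cross-correlations NIST used to witness entanglement, not the real thing.)
print(np.corrcoef(x1, x2)[0, 1])   # close to 1
print(np.std(x1 - x2))             # ~0.14, versus ~1.0 for each drum alone
```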
Quantum mechanics was originally conceived as the rulebook for light and matter at atomic scales. However, in recent years researchers have shown that the same rules can apply to increasingly larger objects such as the drums. Their back-and-forth motion makes them a type of system known as a mechanical oscillator. Such systems were entangled for the first time at NIST about a decade ago, and in that case the mechanical elements were single atoms.

  • Physicists find a novel way to switch antiferromagnetism on and off

    When you save an image to your smartphone, those data are written onto tiny transistors that are electrically switched on or off in a pattern of “bits” to represent and encode that image. Most transistors today are made from silicon, an element that scientists have managed to switch at ever-smaller scales, enabling billions of bits, and therefore large libraries of images and other files, to be packed onto a single memory chip.
    But growing demand for data, and the means to store them, is driving scientists to search beyond silicon for materials that can push memory devices to higher densities, speeds, and security.
    Now MIT physicists have shown preliminary evidence that data might be stored as faster, denser, and more secure bits made from antiferromagnets.
Antiferromagnetic, or AFM, materials are the lesser-known cousins of ferromagnets, or conventional magnetic materials. Where the electrons in ferromagnets spin in synchrony — a property that allows a compass needle to point north, collectively following the Earth’s magnetic field — electrons in an antiferromagnet prefer the opposite spin to their neighbors, in an “antialignment” that effectively quenches magnetization even at the smallest scales.
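    The “antialignment quenches magnetization” point can be seen with a toy spin chain (an illustrative sketch, not the MIT material or model):
```python
import numpy as np

n_spins = 10

ferromagnet = np.ones(n_spins)                                   # all spins up: +1 +1 +1 ...
antiferromagnet = np.array([(-1) ** i for i in range(n_spins)])  # alternating: +1 -1 +1 -1 ...

# Net magnetization is simply the sum of the spins.
print(ferromagnet.sum())        # 10 -> a net moment an external field (or compass) can feel
print(antiferromagnet.sum())    # 0  -> neighbors cancel, so there is nothing to erase magnetically
```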
    The absence of net magnetization in an antiferromagnet makes it impervious to any external magnetic field. If they were made into memory devices, antiferromagnetic bits could protect any encoded data from being magnetically erased. They could also be made into smaller transistors and packed in greater numbers per chip than traditional silicon.
Now the MIT team has found that by doping extra electrons into an antiferromagnetic material, they can turn its collective antialigned arrangement on and off in a controllable way. They found this magnetic transition is reversible and sufficiently sharp, similar to switching a transistor’s state from 0 to 1. The results, published today in Physical Review Letters, demonstrate a potential new pathway to use antiferromagnets as a digital switch.

  • These climate-friendly microbes recycle carbon without producing methane

    Earth’s hot springs and hydrothermal vents are home to a previously unidentified group of archaea. And, unlike similar tiny, single-celled organisms that live deep in sediments and munch on decaying plant matter, these archaea don’t produce the climate-warming gas methane, researchers report April 23 in Nature Communications.

    “Microorganisms are the most diverse and abundant form of life on Earth, and we just know 1 percent of them,” says Valerie De Anda, an environmental microbiologist at the University of Texas at Austin. “Our information is biased toward the organisms that affect humans. But there are a lot of organisms that drive the main chemical cycles on Earth that we just don’t know.”

    Archaea are a particularly mysterious group (SN: 2/14/20). It wasn’t until the late 1970s that they were recognized as a third domain of life, distinct from bacteria and eukaryotes (which include everything else, from fungi to animals to plants).

    For many years, archaea were thought to exist only in the most extreme environments on Earth, such as hot springs. But archaea are actually everywhere, and these microbes can play a big role in how carbon and nitrogen cycle between Earth’s land, oceans and atmosphere. One group of archaea, Thaumarchaeota, are the most abundant microbes in the ocean, De Anda says (SN: 11/28/17). And methane-producing archaea in cows’ stomachs cause the animals to burp large amounts of the gas into the atmosphere (SN: 11/18/15).

Now, De Anda and her colleagues have identified an entirely new phylum — a large branch of related organisms on the tree of life — of archaea. The first evidence of these new organisms was found within sediments from seven hot springs in China as well as from deep-sea hydrothermal vents in the Guaymas Basin in the Gulf of California. Within these sediments, the team found bits of DNA that it meticulously assembled into the genetic blueprints, or genomes, of 15 different archaea.

    The researchers then compared the genetic information of the genomes with that of thousands of previously identified genomes of microbes described in publicly available databases. But “these sequences were completely different from anything that we know,” De Anda says.
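    One simple way to picture how an assembled genome is compared against databases of known organisms is a k-mer overlap score; the sketch below is a toy version of that general idea, not the team’s actual pipeline, and the sequences are made up:
```python
def kmer_set(seq: str, k: int = 8) -> set:
    """All overlapping k-letter substrings of a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard_similarity(seq_a: str, seq_b: str, k: int = 8) -> float:
    """Fraction of shared k-mers: ~1 for near-identical sequences, ~0 for unrelated ones."""
    a, b = kmer_set(seq_a, k), kmer_set(seq_b, k)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical fragments: a known reference vs. a newly assembled sequence.
reference = "ATGGCGTACGTTAGCATCGGATCCGTAGCTAGGCTA"
new_genome = "TTACGGACCTGATCGGTACCAGTTGACCTAGGTACA"
print(jaccard_similarity(reference, reference))   # 1.0 -- identical
print(jaccard_similarity(reference, new_genome))  # ~0  -- "different from anything that we know"
```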

    She and her colleagues gave the new group the name Brockarchaeota, for Thomas Brock, a microbiologist who was the first to grow archaea in the laboratory and who died in April. Brock’s discovery paved the way for polymerase chain reaction, or PCR, a Nobel Prize–winning technique used to copy small bits of DNA, and currently used in tests for COVID-19 (SN: 3/6/20).

    Brockarchaeota, it turns out, actually live all over the world — but until now, they were overlooked, undescribed and unnamed. Once De Anda and her team had pieced together the new genomes and then hunted for them in public databases, they discovered that bits of these previously unknown organisms had been found in hot springs, geothermal and hydrothermal vent sediments from South Africa to Indonesia to Rwanda.

    Within the new genomes, the team also hunted for genes related to the microbes’ metabolism — what nutrients they consume and what kind of waste they produce. Initially, the team expected that — like other archaea previously found in such environments — these archaea would be methane producers. They do munch on the same materials that methane-producing archaea do: one-carbon compounds like methanol or methylsulfide. “But we couldn’t identify the genes that produce methane,” De Anda says. “They are not present in Brockarchaeota.”

    That means that these archaea must have a previously undescribed metabolism, through which they can recycle carbon — for example in sediments on the seafloor — without producing methane. And, given how widespread they are, De Anda says, these organisms could be playing a previously hidden but significant role in Earth’s carbon cycle.

    “It’s twofold interesting — it’s a new phylum and a new metabolism,” says Luke McKay, a microbial ecologist of extreme environments at Montana State University in Bozeman. The fact that this entire group could have remained under the radar for so long, he adds, “is an indication of where we are in the state of microbiology.”

But, McKay adds, the discovery is also a testament to the power of metagenomics, the technique by which researchers can painstakingly tease individual genomes out of a large hodgepodge of microbes in a given sample of water or sediments. Thanks to this technique, researchers are identifying more and more parts of the previously mysterious microbial world.

“There’s so much out there,” De Anda says. And “every time you sequence more DNA, you start to realize that there’s more out there that you weren’t able to see the first time.”

  • Open source tool can help identify gerrymandering in voting maps

    With state legislatures nationwide preparing for the once-a-decade redrawing of voting districts, a research team has developed a better computational method to help identify improper gerrymandering designed to favor specific candidates or political parties.
In an article in the Harvard Data Science Review, the researchers describe the improved mathematical methodology of an open source tool called GerryChain (https://github.com/mggg/GerryChain). The tool can help observers detect gerrymandering in a voting district plan by creating a pool, or ensemble, of alternate maps that also meet legal voting criteria. This map ensemble can show whether the proposed plan is an extreme outlier — one that deviates sharply from the norm of plans generated without bias and is therefore likely to have been drawn with partisan goals in mind.
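    A minimal ensemble run with GerryChain looks roughly like the following, a sketch based on the library’s documented quickstart; the input file and column names here are placeholders, not taken from the article:
```python
from gerrychain import Graph, Partition, MarkovChain
from gerrychain.updaters import Tally, cut_edges
from gerrychain.proposals import propose_random_flip
from gerrychain.constraints import single_flip_contiguous
from gerrychain.accept import always_accept

# Placeholder inputs: a dual graph of precincts with a population column
# ("TOTPOP") and an enacted district assignment ("DISTRICT").
graph = Graph.from_json("precincts.json")

enacted_plan = Partition(
    graph,
    assignment="DISTRICT",
    updaters={"population": Tally("TOTPOP", alias="population"),
              "cut_edges": cut_edges},
)

# Random walk over valid plans: each step flips one precinct while keeping
# districts contiguous, building an ensemble of alternative maps.
chain = MarkovChain(
    proposal=propose_random_flip,
    constraints=[single_flip_contiguous],
    accept=always_accept,
    initial_state=enacted_plan,
    total_steps=10_000,
)

ensemble_stats = [len(partition["cut_edges"]) for partition in chain]
# Comparing a statistic of the enacted plan against the ensemble's
# distribution is how an "extreme outlier" gets flagged.
```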
    An earlier version of GerryChain was used to analyze maps proposed to remedy the Virginia House of Delegates districts that a federal court ruled in 2018 were unconstitutional racial gerrymanders. The updated tool will likely play a role in the upcoming redistricting using new census data.
    “We wanted to build an open-source software tool and make that available to people interested in reform, especially in states where there are skewed baselines,” said Daryl DeFord, assistant mathematics professor at Washington State University and a co-lead author on the paper. “It can be an impactful way for people to get involved in this process, particularly going into this year’s redistricting cycle where there are going to be a lot of opportunities for pointing out less than optimal behavior.”
The GerryChain tool, first created by a team led by DeFord as a part of the 2018 Voting Rights Data Institute, has already been downloaded 20,000 times. The new paper, authored by DeFord along with Moon Duchin of Tufts University and Justin Solomon of the Massachusetts Institute of Technology, focuses on how the mathematical and computational models implemented in GerryChain can be used to put proposed voting districting plans into context by creating large samples of alternative valid plans for comparison. These alternate plans are often used when a voting plan is challenged in court as unfair, as well as to analyze the potential impacts of redistricting reform.
For instance, the enacted 2010 House of Delegates plan in Virginia had 12 voting districts with a Black voting-age population at or above 55%. By comparing that plan against an ensemble of alternate plans that all fit the legal criteria, advocates showed that the map was an extreme outlier of what was possible. In other words, it was likely drawn intentionally to “pack” some districts with Black voters and thereby “crack” other districts, breaking up the influence of those voters.

  • T-GPS processes a graph with a trillion edges on a single computer

A KAIST research team has developed a new technology that enables a large-scale graph algorithm to be processed without storing the graph in main memory or on disk. Named T-GPS (Trillion-scale Graph Processing Simulation) by its developer, Professor Min-Soo Kim of the School of Computing at KAIST, it can process a graph with one trillion edges using a single computer.
Graphs are widely used to represent and analyze real-world objects in many domains such as social networks, business intelligence, biology, and neuroscience. As the number of graph applications increases rapidly, developing and testing new graph algorithms is becoming more important than ever before. Nowadays, many industrial applications require a graph algorithm to process a large-scale graph (e.g., one trillion edges). So, when developing and testing graph algorithms for such large-scale graphs, a synthetic graph is usually used instead of a real graph, because sharing and using large-scale real graphs is very limited: they are often proprietary or practically impossible to collect.
    Conventionally, developing and testing graph algorithms is done via the following two-step approach: generating and storing a graph and executing an algorithm on the graph using a graph processing engine.
    The first step generates a synthetic graph and stores it on disks. The synthetic graph is usually generated by either parameter-based generation methods or graph upscaling methods. The former extracts a small number of parameters that can capture some properties of a given real graph and generates the synthetic graph with the parameters. The latter upscales a given real graph to a larger one so as to preserve the properties of the original real graph as much as possible.
The second step loads the stored graph into the main memory of a graph processing engine, such as Apache GraphX, and executes a given graph algorithm on the engine. Since the graph is too large to fit in the main memory of a single computer, the graph engine typically runs on a cluster of several tens or hundreds of computers. Therefore, the cost of the conventional two-step approach is very high.
The research team solved the problem of the conventional two-step approach. T-GPS does not generate and store a large-scale synthetic graph. Instead, it just loads the initial small real graph into main memory. Then, T-GPS processes a graph algorithm on the small real graph as if the large-scale synthetic graph that would be generated from the real graph existed in main memory. After the algorithm is done, T-GPS returns exactly the same result as the conventional two-step approach.
The key idea of T-GPS is to generate, on the fly, only the part of the synthetic graph that the algorithm needs to access, and to modify the graph processing engine so that it treats the part generated on the fly as if it were part of a synthetic graph that had actually been generated.
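    The article does not give T-GPS’s internals, but the “generate only what the algorithm touches” idea can be sketched abstractly (a hypothetical illustration, not KAIST’s code): the engine asks a generator for a vertex’s synthetic neighbors only when the algorithm actually visits that vertex, and re-derives them deterministically each time instead of storing them.
```python
import random
from collections import deque

def synthetic_neighbors(vertex: int, n_vertices: int, degree: int = 4) -> list[int]:
    """Expand one vertex's neighborhood on demand.

    Hypothetical generation rule: seeding by vertex id means the same
    synthetic edges come back every time they are requested, without
    ever materializing the full graph.
    """
    rng = random.Random(vertex)
    return [rng.randrange(n_vertices) for _ in range(degree)]

def bfs_on_the_fly(source: int, n_vertices: int) -> int:
    """Breadth-first search that holds only the visited set and frontier in memory."""
    visited = {source}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for u in synthetic_neighbors(v, n_vertices):   # edges created only when touched
            if u not in visited:
                visited.add(u)
                queue.append(u)
    return len(visited)

print(bfs_on_the_fly(source=0, n_vertices=100_000))    # reachable vertices; graph never stored
```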
The research team showed that T-GPS can process a graph of 1 trillion edges using a single computer, while the conventional two-step approach can only process a graph of 1 billion edges using a cluster of eleven computers of the same specification. Thus, T-GPS outperforms the conventional approach by 10,000 times in terms of computing resources. The team also showed that processing an algorithm in T-GPS is up to 43 times faster than the conventional approach, because T-GPS has no network communication overhead, while the conventional approach has a lot of communication overhead among computers.
    Prof. Kim believes that this work will have a large impact on the IT industry where almost every area utilizes graph data, adding, “T-GPS can significantly increase both the scale and efficiency of developing a new graph algorithm.”
This work was supported by the National Research Foundation (NRF) of Korea and the Institute of Information & Communications Technology Planning & Evaluation (IITP).

  • Better way to determine safe drug doses for children

    Determining safe yet effective drug dosages for children is an ongoing challenge for pharmaceutical companies and medical doctors alike. A new drug is usually first tested on adults, and results from these trials are used to select doses for pediatric trials. The underlying assumption is typically that children are like adults, just smaller, which often holds true, but may also overlook differences that arise from the fact that children’s organs are still developing.
    Compounding the problem, pediatric trials don’t always shed light on other differences that can affect recommendations for drug doses. There are many factors that limit children’s participation in drug trials — for instance, some diseases simply are rarer in children — and consequently, the generated datasets tend to be very sparse.
    To make drugs and their development safer for children, researchers at Aalto University and the pharmaceutical company Novartis have developed a method that makes better use of available data.
‘This is a method that could help determine safe drug doses more quickly and with fewer observations than before,’ says co-author Aki Vehtari, an associate professor of computer science at Aalto University and the Finnish Center for Artificial Intelligence FCAI.
    In their study, the research team created a model that improves our understanding of how organs develop.
    ‘The size of an organ is not necessarily the only thing that affects its performance. Kids’ organs are simply not as efficient as those of adults. In drug modeling, if we assume that size is the only thing that matters, we might end up giving too large of doses,’ explains Eero Siivola, first author of the study and doctoral student at Aalto University.
    Whereas the standard approach of assessing pediatric data relies on subjective evaluations of model diagnostics, the new approach, based on Gaussian process regression, is more data-driven and consequently less prone to bias. It is also better at handling small sample sizes as uncertainties are accounted for.
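    As a flavor of how Gaussian process regression attaches explicit uncertainty to predictions from a small dataset, here is a generic scikit-learn sketch; the data and kernel choices are made up for illustration and are not the Aalto/Novartis model:
```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical sparse pediatric data: age (years) vs. some measure of
# drug clearance, with only a handful of observations.
age = np.array([[0.5], [1.0], [2.0], [6.0], [12.0], [16.0]])
clearance = np.array([1.1, 1.8, 2.9, 5.2, 7.4, 8.1])

kernel = RBF(length_scale=5.0) + WhiteKernel(noise_level=0.1)  # smooth trend + observation noise
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(age, clearance)

# Predictions come with standard deviations, so sparsely observed age ranges
# are flagged as uncertain instead of being silently extrapolated.
new_ages = np.array([[0.25], [4.0], [9.0], [18.0]])
mean, std = gp.predict(new_ages, return_std=True)
for a, m, s in zip(new_ages.ravel(), mean, std):
    print(f"age {a:>5.2f} y: predicted clearance {m:.2f} +/- {s:.2f}")
```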
    The research comes out of FCAI’s research programme on Agile and probabilistic AI, offering a great example of a method that makes the best out of even very scarce datasets.
    In the study, the researchers demonstrate their approach by re-analyzing a pediatric trial investigating Everolimus, a drug used to prevent the rejection of organ transplants. But the possible benefits of their method are far reaching.
    ‘It works for any drug whose concentration we want to examine,’ Vehtari says, like allergy and pain medication.
    The approach could be particularly useful for situations where a new drug is tested on a completely new group — of children or adults — which is small in size, potentially making the trial phase much more efficient than it currently is. Another promising application relates to extending use of an existing drug to other symptoms or diseases; the method could support this process more effectively than current practices.
    Story Source:
Materials provided by Aalto University. Note: Content may be edited for style and length.