More stories

  • 'IcePic' algorithm outperforms humans in predicting ice crystal formation

    Cambridge scientists have developed an artificially intelligent algorithm capable of beating scientists at predicting how and when different materials form ice crystals.
    The program — IcePic — could help atmospheric scientists improve climate change models in the future. Details are published today in the journal PNAS.
    Water has some unusual properties, such as expanding when it turns into ice. Understanding water and how it freezes around different molecules has wide-reaching implications in a broad range of areas, from weather systems that can affect whole continents to storing biological tissue samples in a hospital.
    The Celsius temperature scale was designed on the premise that 0°C is the transition temperature between water and ice; however, whilst ice always melts at 0°C, water doesn't necessarily freeze at 0°C. Water can still be in liquid form at -40°C, and it is impurities in water that enable ice to form at higher temperatures. One of the biggest aims of the field has been to predict the ability of different materials to promote the formation of ice, known as a material's "ice nucleation ability."
    Researchers at the University of Cambridge have developed a 'deep learning' tool that can predict the ice nucleation ability of different materials, and that was able to beat scientists in an online 'quiz' asking them to predict when ice crystals would form.
    Deep learning is how artificial intelligence (AI) learns to draw insights from raw data. It finds its own patterns in the data, freeing it of the need for human input so that it can process results faster and more precisely. In the case of IcePic, it can infer different ice crystal formation properties around different materials. IcePic has been trained on thousands of images so that it can look at completely new systems and infer accurate predictions from them.
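    As a rough, hypothetical illustration of that kind of image-based inference, the sketch below defines a small convolutional network in PyTorch that maps a single greyscale image of water near a surface to one predicted nucleation score. The architecture, input size and the name NucleationCNN are assumptions for illustration only, not the actual IcePic model.
        # Illustrative sketch only -- not the IcePic code. A tiny CNN that maps an
        # image of interfacial water to one scalar "ice nucleation ability" score.
        import torch
        import torch.nn as nn

        class NucleationCNN(nn.Module):  # hypothetical name
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.head = nn.Sequential(
                    nn.Flatten(),
                    nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                    nn.Linear(64, 1),  # single predicted nucleation score
                )

            def forward(self, x):
                return self.head(self.features(x))

        model = NucleationCNN()
        image = torch.randn(1, 1, 64, 64)  # stand-in for one 64x64 greyscale image of water
        print(model(image).item())         # scalar prediction for this image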
    The team set up a quiz in which scientists were asked to predict when ice crystals would form in different conditions shown by 15 different images. These results were then measured against IcePic’s performance. When put to the test, IcePic was far more accurate in determining a material’s ice nucleation ability than over 50 researchers from across the globe. Moreover, it helped identify where humans were going wrong.
    Michael Davies, a PhD student in the ICE lab at the Yusuf Hamied Department of Chemistry, Cambridge, and at University College London, and first author of the study, said: “It was fascinating to learn that the images of water we showed IcePic contain enough information to actually predict ice nucleation.
    “Despite us — that is, human scientists — having a 75-year head start in terms of the science, IcePic was still able to do something we couldn’t.”
    Determining the formation of ice has become especially relevant in climate change research.
    Water continuously moves within the Earth and its atmosphere, condensing to form clouds, and precipitating in the form of rain and snow. Different foreign particles affect how ice forms in these clouds; smoke particles from pollution, for example, affect it differently than smoke particles from a volcano. Understanding how different conditions affect our cloud systems is essential for more accurate weather predictions.
    “The nucleation of ice is really important for the atmospheric science community and climate modelling,” said Davies. “At the moment there is no viable way to predict ice nucleation other than direct experiments or expensive simulations. IcePic should open up a lot more applications for discovery.”
    Story Source:
    Materials provided by University of Cambridge. Note: Content may be edited for style and length.

  • A new leap in understanding nickel oxide superconductors

    A new study shows that nickel oxide superconductors, which conduct electricity with no loss at higher temperatures than conventional superconductors do, contain a type of quantum matter called charge density waves, or CDWs, that can accompany superconductivity.
    The presence of CDWs shows that these recently discovered materials, also known as nickelates, are capable of forming correlated states — “electron soups” that can host a variety of quantum phases, including superconductivity, researchers from the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University reported in Nature Physics today.
    “Unlike in any other superconductor we know about, CDWs appear even before we dope the material by replacing some atoms with others to change the number of electrons that are free to move around,” said Wei-Sheng Lee, a SLAC lead scientist and investigator with the Stanford Institute for Materials and Energy Science (SIMES) who led the study.
    “This makes the nickelates a very interesting new system — a new playground for studying unconventional superconductors.”
    Nickelates and cuprates
    In the 35 years since the first unconventional “high-temperature” superconductors were discovered, researchers have been racing to find one that could carry electricity with no loss at close to room temperature. This would be a revolutionary development, allowing things like perfectly efficient power lines, maglev trains and a host of other futuristic, energy-saving technologies.

  • Using AI to train teams of robots to work together

    When communication lines are open, individual agents such as robots or drones can work together to collaborate and complete a task. But what if they aren’t equipped with the right hardware or the signals are blocked, making communication impossible? University of Illinois Urbana-Champaign researchers started with this more difficult challenge. They developed a method to train multiple agents to work together using multi-agent reinforcement learning, a type of artificial intelligence.
    “It’s easier when agents can talk to each other,” said Huy Tran, an aerospace engineer at Illinois. “But we wanted to do this in a way that’s decentralized, meaning that they don’t talk to each other. We also focused on situations where it’s not obvious what the different roles or jobs for the agents should be.”
    Tran said this scenario is much more complex and a harder problem because it’s not clear what one agent should do versus another agent.
    “The interesting question is how do we learn to accomplish a task together over time,” Tran said.
    Tran and his collaborators used machine learning to solve this problem by creating a utility function that tells the agent when it is doing something useful or good for the team.
    “With team goals, it’s hard to know who contributed to the win,” he said. “We developed a machine learning technique that allows us to identify when an individual agent contributes to the global team objective. If you look at it in terms of sports, one soccer player may score, but we also want to know about actions by other teammates that led to the goal, like assists. It’s hard to understand these delayed effects.”
    The algorithms the researchers developed can also identify when an agent or robot is doing something that doesn’t contribute to the goal. “It’s not so much the robot chose to do something wrong, just something that isn’t useful to the end goal.”
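    As a rough illustration of both points, the hedged sketch below implements a "difference reward" style credit signal: each agent is scored by how much the team objective changes when its action is swapped for a no-op. The team_score function and the capture-the-flag style actions are invented for this example; this is not the researchers' published algorithm.
        # Illustrative sketch only -- a "difference reward" style credit signal,
        # not the published method. Each agent is credited with its marginal
        # contribution to a shared team objective.

        def team_score(actions):
            """Toy global objective: number of distinct targets the team covers."""
            return len({a for a in actions if a is not None})

        def agent_utility(actions, i):
            """Credit for agent i: team score with its action minus the score
            when that action is replaced by a no-op (None)."""
            counterfactual = list(actions)
            counterfactual[i] = None
            return team_score(actions) - team_score(counterfactual)

        joint_actions = ["flag_A", "flag_A", "flag_B"]  # agents 0 and 1 duplicate effort
        print([agent_utility(joint_actions, i) for i in range(3)])  # -> [0, 0, 1]
    In this toy case the two agents covering the same target earn zero credit, the kind of "not wrong, just not useful" action described above, while a learned utility would also have to handle the delayed effects, such as assists, that Tran mentions.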
    They tested their algorithms using simulated games like Capture the Flag and StarCraft, a popular computer game.
    You can watch a video of Huy Tran demonstrating related research using deep reinforcement learning to help robots evaluate their next move in Capture the Flag.
    “StarCraft can be a little bit more unpredictable — we were excited to see our method work well in this environment too.”
    Tran said this type of algorithm is applicable to many real-life situations, such as military surveillance, robots working together in a warehouse, traffic signal control, autonomous vehicles coordinating deliveries, or controlling an electric power grid.
    Tran said Seung Hyun Kim did most of the theory behind the idea when he was an undergraduate student studying mechanical engineering, with Neale Van Stralen, an aerospace student, helping with the implementation. Tran and Girish Chowdhary advised both students. The work was recently presented to the AI community at the Autonomous Agents and Multi-Agent Systems peer-reviewed conference.
    Story Source:
    Materials provided by University of Illinois Grainger College of Engineering. Original written by Debra Levey Larson. Note: Content may be edited for style and length.

  • The structure of the smallest semiconductor was elucidated

    A semiconductor is a material whose conductivity lies somewhere between that of a conductor and an insulator. This property allows semiconductors to serve as the base material for modern electronics and transistors. It is no exaggeration to say that the technological progress in the latter part of the 20th century was largely spearheaded by the semiconductor industry.
    Technological advancements in semiconductor nanocrystals are ongoing today. For example, quantum dots and wires made from semiconducting materials are of great interest for displays, photocatalysis, and other electronic devices. However, numerous aspects of colloidal nanocrystals remain to be understood at the fundamental level. An important one among them is the molecular-level mechanism by which the nanocrystals form and grow.
    These semiconducting nanocrystals are grown starting from tiny individual precursors made of a small number of atoms. These precursors are called “nanoclusters.” Isolation and molecular structure determination of such nanoclusters (or simply clusters) have been the subject of immense interest in the past several decades. The structural details of clusters, typical nuclei of the nanocrystals, are anticipated to provide critical insights into the evolution of the properties of the nanocrystals.
    Different ‘seed’ nanoclusters result in the growth of different nanocrystals. As such, it is important to have a homogeneous mixture of identical nanoclusters if one wishes to grow identical nanocrystals. However, the synthesis of nanoclusters often results in clusters with all sorts of different sizes and configurations, and purifying the mixture to obtain only the desirable particles is very challenging.
    Therefore, producing nanoclusters with homogeneous sizes is important. “Magic-sized nanoclusters” (MSCs), which form preferentially over random sizes in a uniform manner, range in size from 0.5 to 3.0 nm. Among these, MSCs with a non-stoichiometric (non-1:1) ratio of cadmium to chalcogenide are the most studied. A new class of MSCs with a 1:1 stoichiometric metal-to-chalcogenide ratio has come under the spotlight owing to the prediction of intriguing structures. For example, Cd13Se13, Cd33Se33 and Cd34Se34, which consist of equal numbers of cadmium and selenium atoms, have been synthesized and characterized.
    Recently, researchers at the Center for Nanoparticle Research (led by Professor HYEON Taeghwan) within the Institute for Basic Science (IBS), in collaboration with teams at Xiamen University (led by Professor Nanfeng ZHENG) and at the University of Toronto (led by Professor Oleksandr VOZNYY), reported the colloidal synthesis and atomic-level structure of a stoichiometric cadmium selenide (CdSe) cluster. This is the smallest such nanocluster synthesized to date.

  • Next generation atomic clocks are a step closer to real world applications

    Quantum clocks are shrinking, thanks to new technologies developed at the University of Birmingham-led UK Quantum Technology Hub Sensors and Timing.
    Working in collaboration with and partly funded by the UK’s Defence Science and Technology Laboratory (Dstl), a team of quantum physicists have devised new approaches that not only reduce the size of their clock, but also make it robust enough to be transported out of the laboratory and employed in the ‘real world’.
    Quantum — or atomic — clocks are widely seen as essential for increasingly precise approaches to areas such as online communications across the world, navigation systems, or global trading in stocks, where fractions of seconds could make a huge economic difference. Atomic clocks with optical clock frequencies can be 10,000 times more accurate than their microwave counterparts, opening up the possibility of redefining the second, the standard (SI) unit of time.
    Even more advanced optical clocks could one day make a significant difference both in everyday life and in fundamental science. By allowing longer periods between needing to resynchronise than other kinds of clock, they offer increased resilience for national timing infrastructure and unlock future positioning and navigation applications for autonomous vehicles. The unparalleled accuracy of these clocks can also help us see beyond standard models of physics and understand some of the most mysterious aspects of the universe, including dark matter and dark energy. Such clocks will also help to address fundamental physics questions, such as whether the fundamental constants really are ‘constants’ or whether they vary with time.
    Lead researcher, Dr Yogeshwar Kale, said: “The stability and precision of optical clocks make them crucial to many future information networks and communications. Once we have a system that is ready for use outside the laboratory, we can use them, for example, in ground-based navigation networks where all such clocks are connected via optical fibre and talk to each other. Such networks will reduce our dependence on GPS systems, which can sometimes fail.
    “These transportable optical clocks not only will help to improve geodetic measurements — the fundamental properties of the Earth’s shape and gravity variations — but will also serve as precursors to monitor and identify geodynamic signals like earthquakes and volcanoes at early stages.”
    Although such quantum clocks are advancing rapidly, key barriers to deploying them are their size — current models come in a van or in a car trailer and are about 1500 litres — and their sensitivity to environmental conditions, which limits their transport between different places.

  • Boosting memory performance by strong ion bombardment

    Recently, new technology has emerged that dramatically improves the performance of flash memory through a strong ion bombardment process. This memory platform can reliably store multiple data levels in a single device, making it applicable to future neuromorphic computing as well as to increasing memory capacity.
    POSTECH professor Yoonyoung Chung (Department of Electrical Engineering and Department of Semiconductor Engineering) and Ph.D. candidate Seongmin Park (Department of Electrical Engineering), in joint research with Samsung Electronics, have developed a flash memory with increased data storage by intentionally generating defects.
    As artificial intelligence technology advances, novel semiconductor devices optimized for neural networks with multilevel data are required. New materials and device structures have been developed for neuromorphic hardware, but they have limitations in durability, scalability, and storage capacity compared to flash memory, which is widely used as a storage device for various applications.
    To overcome these issues, the research team implemented a strong plasma bombardment process during the deposition of the data-storage layer to generate artificial defect sites in a flash memory device. The researchers confirmed that more electrons can be stored in generated defects, dramatically increasing the amount of data storage compared to conventional flash memory.
    A memory with multiple data levels can be realized by gradually filling electrons into the defect-rich data-storage layer. The multilevel flash memory developed in this study can reliably distinguish eight data levels.
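    For context, eight reliably distinguishable levels correspond to log2(8) = 3 bits stored per cell, versus one bit for conventional single-level flash. The toy sketch below only illustrates that counting argument; the read boundaries are invented placeholders, not values from the study.
        # Toy illustration only: 8 distinguishable storage levels = 3 bits per cell.
        # The voltage boundaries below are made-up numbers, not device measurements.
        import math

        LEVELS = 8
        print(math.log2(LEVELS))  # 3.0 bits of data per cell

        def read_level(cell_voltage, boundaries=(0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5)):
            """Map a sensed cell voltage to one of 8 levels (0-7) using 7 boundaries."""
            return sum(cell_voltage >= b for b in boundaries)

        print(read_level(1.7))  # -> level 3, i.e. the 3-bit value 0b011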
    The findings are significant in that they avoid the risks of developing an entirely new semiconductor material or structure while significantly advancing flash memory, which offers excellent performance and scalability for AI applications. When applied to neuromorphic systems, inference accuracy and reliability are expected to improve dramatically compared to conventional devices.
    Recently published in Materials Today Nano, a renowned international academic journal in the field of nanotechnology, this study was supported by Samsung Electronics and the Next-generation Intelligence-Type Semiconductor Development Program.
    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  • Ant colonies behave like neural networks when making decisions

    Temperatures are rising, and one colony of ants will soon have to make a collective decision. Each ant feels the rising heat beneath its feet but carries on as usual until, suddenly, the ants reverse course. The whole group rushes out as one — a decision to evacuate has been made. It is almost as if the colony of ants has a greater, collective mind.
    A new study suggests that, indeed, ants as a group behave similarly to networks of neurons in a brain.
    Rockefeller’s Daniel Kronauer and postdoctoral associate Asaf Gal developed a new experimental setup to meticulously analyze decision-making in ant colonies. As reported in the Proceedings of the National Academy of Sciences, they found that when a colony evacuates due to rising temperatures, its decision is a function of both the magnitude of the heat increase and the size of the ant group.
    The findings suggest that ants combine sensory information with the parameters of their group to arrive at a group response — a process similar to neural computations giving rise to decisions.
    “We pioneered an approach to understand the ant colony as a cognitive-like system that perceives inputs and then translates them into behavioral outputs,” says Kronauer, head of the Laboratory of Social Evolution and Behavior. “This is one of the first steps toward really understanding how insect societies engage in collective computation.”
    A new paradigm
    At its most basic level, decision-making boils down to a series of computations meant to maximize benefits and minimize costs. For instance, in a common type of decision-making called sensory response thresholding, an animal has to detect sensory input like heat past a certain level to produce a certain costly behavior, like moving away. If the rise in temperature isn’t big enough, it won’t be worth it.
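    As a rough sketch of that idea (an invented toy example, not the study's model), a thresholded response simply compares a noisy sensory reading against a fixed trigger level and only pays the cost of acting when the stimulus clears it:
        # Toy sketch of sensory response thresholding; all numbers are invented.
        import random

        def responds(temperature_c, threshold_c=34.0, sensor_noise=0.5):
            """An individual acts (e.g. moves away) only if its noisy reading
            of the stimulus exceeds its response threshold."""
            reading = temperature_c + random.gauss(0.0, sensor_noise)
            return reading > threshold_c

        random.seed(1)
        for temp in (30.0, 33.5, 36.0):
            print(temp, responds(temp))  # responses become likely only past the threshold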
    Kronauer and Gal wanted to investigate how this type of information processing occurs at the collective level, where group dynamics come into play. They developed a system in which they could precisely perturb an ant colony with controlled temperature increases. To track the behavioral responses of individual ants and the entire colony, they marked each insect with different colored dots and followed their movements with a tracking camera.
    As the researchers expected, colonies of a set size of 36 workers and 18 larvae dependably evacuated their nest when the temperature hit around 34 degrees Celsius. This finding makes intuitive sense, Kronauer says, because “if you become too uncomfortable, you leave.”
    However, the researchers were surprised to find that the ants were not merely responding to temperature itself. When they increased the size of the colony from 10 to 200 individuals, the temperature necessary to trigger the decision to vacate increased. Colonies of 200 individuals, for example, held out until temperatures soared past 36 degrees. “It seems that the threshold isn’t fixed. Rather, it’s an emergent property that changes depending on the group size,” Kronauer says.
    Individual ants are unaware of the size of their colony, so how can their decision depend on it? He and Gal suspect that the explanation has to do with the way pheromones, the invisible messengers that pass information between ants, scale their effect when more ants are present. They use a mathematical model to show that such a mechanism is indeed plausible. But they do not know why larger colonies would require higher temperatures to pack up shop. Kronauer ventures that it could simply be that the larger the colony, the more onerous it is to relocate, pushing up the critical temperature at which relocation happens.
    In future studies, Kronauer and Gal hope to refine their theoretical model of the decision-making process in the ant colony by interfering with more parameters and seeing how the insects respond. For example, they can tamper with the level of pheromones in the ants’ enclosure or create genetically altered ants with different abilities to detect temperature changes. “What we’ve been able to do so far is to perturb the system and measure the output precisely,” Kronauer says. “In the long term, the idea is to reverse engineer the system to deduce its inner workings in more and more detail.”
    Story Source:
    Materials provided by Rockefeller University. Note: Content may be edited for style and length.

  • New method can improve explosion detection

    Computers can be trained to better detect distant nuclear detonations, chemical blasts and volcano eruptions by learning from artificial explosion signals, according to a new method devised by a University of Alaska Fairbanks scientist.
    The work, led by UAF Geophysical Institute postdoctoral researcher Alex Witsil, was published recently in the journal Geophysical Research Letters.
    Witsil, at the Geophysical Institute’s Wilson Alaska Technical Center, and colleagues created a library of synthetic infrasound explosion signals to train computers to recognize the source of an infrasound signal. Infrasound is sound at frequencies too low to be heard by humans, and it travels farther than higher-frequency audible waves.
    “We used modeling software to generate 28,000 synthetic infrasound signals, which, though generated in a computer, could hypothetically be recorded by infrasound microphones deployed hundreds of kilometers from a large explosion,” Witsil said.
    The artificial signals reflect variations in atmospheric conditions, which can alter an explosion’s signal regionally or globally as the sound waves propagate. Those changes can make it difficult to detect an explosion’s origin and type from a great distance.
    Why create artificial sounds of explosions rather than use real-world examples? Because explosions haven’t occurred at every location on the planet and the atmosphere constantly changes, there aren’t enough real-world examples to train generalized machine-learning detection algorithms.
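    The workflow the article describes, generating labelled synthetic signals and then training a classifier on them, might look roughly like the hedged sketch below. The waveform model, sampling rate, and use of a random forest are illustrative assumptions, not details from the study.
        # Illustrative sketch only -- not the study's modeling software or classifier.
        # Generate toy labelled "infrasound" waveforms with randomised propagation
        # effects, then train an off-the-shelf classifier to recognise the source type.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 500)  # 10 s of signal at low, sub-audible frequencies

        def synth(kind):
            f = 0.5 if kind == 0 else 2.0       # toy dominant frequency (Hz) per source type
            decay = rng.uniform(0.2, 1.0)       # stand-in for variable atmospheric attenuation
            wave = np.exp(-decay * t) * np.sin(2 * np.pi * f * t)
            return wave + 0.3 * rng.standard_normal(t.size)  # propagation and sensor noise

        X = np.array([synth(k) for k in (0, 1) for _ in range(500)])  # 1,000 synthetic records
        y = np.array([k for k in (0, 1) for _ in range(500)])         # source-type labels

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[::2], y[::2])                 # train on half of the synthetic library
        print(clf.score(X[1::2], y[1::2]))      # accuracy on the held-out half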