More stories

  • Development of a versatile, accurate AI prediction technique even with a small number of experiments

    NIMS, Asahi Kasei, Mitsubishi Chemical, Mitsui Chemicals and Sumitomo Chemical have used the chemical materials open platform framework to develop an AI technique capable of increasing the accuracy of machine learning-based predictions of material properties (e.g., strength, brittleness) through efficient use of material structural data obtained from only a small number of experiments. This technique may expedite the development of various materials, including polymers.
    Materials informatics research exploits machine learning models to predict the physical properties of materials of interest based on compositional and processing parameters (e.g., temperature and pressure), an approach that has accelerated materials development. When the physical properties of materials are known to be strongly influenced by their post-processing microstructures, a model’s prediction accuracy can be effectively improved by incorporating microstructure-related data (e.g., x-ray diffraction (XRD) and differential scanning calorimetry (DSC) data). However, these types of data can only be obtained by actually analyzing processed materials; improving prediction accuracy therefore requires both these analyses and predetermined parameters (e.g., material compositions).
    This research group developed an AI technique capable of first selecting potentially promising material candidates for fabrication and then accurately predicting their physical properties using XRD, DSC and other measurement data obtained from only a small number of actually synthesized materials. This technique selects candidate materials using Bayesian optimization and other methods and repeats the AI-based selection process while incorporating measurement data into machine learning models. To verify the technique’s effectiveness, the group used it to predict the physical properties of polyolefins. As a result, this technique was found to improve the material property prediction accuracy of machine learning models with a smaller sample set of actually synthesized materials than methods in which candidate materials were randomly selected.
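    The select-synthesize-measure loop described above can be sketched as a simple active-learning cycle. The sketch below is an illustrative stand-in only: the surrogate model, the farthest-point acquisition rule, and the toy structure-property function are all hypothetical, whereas the actual technique uses Bayesian optimization over real XRD/DSC measurement data.

```python
import random

random.seed(0)

# Hypothetical candidate pool: each material is described by two
# normalized composition/processing parameters in [0, 1].
candidates = [(random.random(), random.random()) for _ in range(200)]

def dist2(x, y):
    return (x[0] - y[0]) ** 2 + (x[1] - y[1]) ** 2

def run_experiment(x):
    """Stand-in for synthesizing a material and measuring its property
    (in the real workflow this step also yields XRD/DSC structural data)."""
    a, b = x
    return 3.0 * a - 2.0 * b * b        # toy structure-property relationship

def predict(x, measured):
    """Toy surrogate model: property of the nearest already-measured material."""
    return min(measured, key=lambda m: dist2(x, m[0]))[1]

def acquire(pool, measured):
    """Pick the candidate farthest from everything measured so far: a crude
    uncertainty-driven acquisition standing in for Bayesian optimization."""
    return max(pool, key=lambda x: min(dist2(x, m[0]) for m in measured))

# Active-learning loop: seed with a few experiments, then let the model
# choose which material to synthesize next.
measured = [(c, run_experiment(c)) for c in candidates[:3]]
pool = candidates[3:]
for _ in range(12):                      # small experimental budget
    nxt = acquire(pool, measured)
    pool.remove(nxt)
    measured.append((nxt, run_experiment(nxt)))

errors = [abs(predict(x, measured) - run_experiment(x)) for x in pool]
print(len(measured), "experiments; mean prediction error",
      round(sum(errors) / len(errors), 3))
```

    Even this crude loop illustrates the point of the study: model-guided selection spreads the few expensive experiments over the design space, so the surrogate improves faster than it would with randomly chosen samples.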
    The use of this prediction accuracy improvement technique may enable a more thorough understanding of the relationship between materials’ structures and physical properties, which would facilitate investigation of fundamental causes of material properties and the formulation of more efficient materials development guidelines. Furthermore, this technique is expected to be applicable to the development of a wide range of materials in addition to polyolefins and other polymers, thereby promoting digital transformation (DX) in materials development.
    Story Source:
    Materials provided by National Institute for Materials Science, Japan. Note: Content may be edited for style and length.

  • Resolving the puzzles of graphene superconductivity

    A single layer of carbon atoms arranged in a honeycomb lattice makes up the promising nanomaterial called graphene. Research on a setup of three sheets of graphene stacked on top of one another so that their lattices are aligned but shifted — forming rhombohedral trilayer graphene — revealed an unexpected state of superconductivity. In this state electrical resistance vanishes due to the quantum nature of the electrons. The discovery was published and debated in Nature, whilst the origins remained elusive. Now, Professor Maksym Serbyn and Postdoc Areg Ghazaryan from the Institute of Science and Technology (IST) Austria in collaboration with Professor Erez Berg and Postdoc Tobias Holder from the Weizmann Institute of Science, Israel, developed a theoretical framework of unconventional superconductivity, which resolves the puzzles posed by the experimental data.
    The Puzzles and their Resolution
    Superconductivity relies on the pairing of free electrons in the material, despite the repulsion arising from their equal negative charges. In conventional superconductors, at least, this pairing happens between electrons of opposite spin through vibrations of the crystal lattice. (Spin is a quantum property of particles comparable, but not identical, to rotation.) “Applied to trilayer graphene,” co-lead author Ghazaryan points out, “we identified two puzzles that seem difficult to reconcile with conventional superconductivity.”
    First, above a threshold temperature of roughly -260 °C electrical resistance should rise in equal steps with increasing temperature. However, in the experiments it remained constant up to -250 °C. Second, pairing between electrons of opposite spin implies a coupling that contradicts another experimentally observed feature, namely the presence of a nearby configuration with fully aligned spins, which we know as magnetism. “In the paper, we show that both observations are explainable,” group leader Maksym Serbyn summarizes, “if one assumes that an interaction between electrons provides the ‘glue’ that holds electrons together. This leads to unconventional superconductivity.”
    When one draws all possible states that electrons can have on a chart and then separates the occupied states from the unoccupied ones with a line, this separation line is called a Fermi surface. Experimental data from graphene shows two Fermi surfaces, creating a ring-like shape. In their work, the researchers draw on a theory by Kohn and Luttinger from the 1960s and demonstrate that such circular Fermi surfaces favor a mechanism for superconductivity based only on electron interactions. They also suggest experimental setups to test their argument and offer routes towards raising the critical temperature at which superconductivity appears.
    The Benefits of Graphene Superconductivity
    While superconductivity has been observed in other trilayer and bilayer graphene systems, those materials must be specifically engineered and may be hard to control because of their low stability. Rhombohedral trilayer graphene, although rare, occurs naturally. The proposed theoretical solution has the potential to shed light on long-standing problems in condensed matter physics and to open the way to potential applications of both superconductivity and graphene.
    Story Source:
    Materials provided by Institute of Science and Technology Austria. Note: Content may be edited for style and length.

  • AI models microprocessor performance in real-time

    Computer engineers at Duke University have developed a new AI method for accurately predicting the power consumption of any type of computer processor more than a trillion times per second while barely using any computational power itself. Dubbed APOLLO, the technique has been validated on real-world, high-performance microprocessors and could help improve the efficiency and inform the development of new microprocessors.
    The approach is detailed in a paper published at MICRO-54: 54th Annual IEEE/ACM International Symposium on Microarchitecture, one of the top-tier conferences in computer architecture, where it was selected as the conference’s best publication.
    “This is an intensively studied problem that has traditionally relied on extra circuitry to address,” said Zhiyao Xie, first author of the paper and a PhD candidate in the laboratory of Yiran Chen, professor of electrical and computer engineering at Duke. “But our approach runs directly on the microprocessor in the background, which opens many new opportunities. I think that’s why people are excited about it.”
    In modern computer processors, cycles of computations are made on the order of 3 trillion times per second. Keeping track of the power consumed by such intensely fast transitions is important to maintain the entire chip’s performance and efficiency. If a processor draws too much power, it can overheat and cause damage. Sudden swings in power demand can cause internal electromagnetic complications that can slow the entire processor down.
    By implementing software that can predict and stop these undesirable extremes from happening, computer engineers can protect their hardware and increase its performance. But such schemes come at a cost. Keeping pace with modern microprocessors typically requires precious extra hardware and computational power.
    “APOLLO approaches an ideal power estimation algorithm that is both accurate and fast and can easily be built into a processing core at a low power cost,” Xie said. “And because it can be used in any type of processing unit, it could become a common component in future chip design.”
    The secret to APOLLO’s power comes from artificial intelligence. The algorithm developed by Xie and Chen uses AI to identify and select just 100 of a processor’s millions of signals that correlate most closely with its power consumption. It then builds a power consumption model from those 100 signals and monitors them to predict the entire chip’s performance in real-time.
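    A heavily simplified sketch of that two-step idea follows: rank signals by how strongly their toggling correlates with measured power, then model power from the selected few. Everything here is a hypothetical toy (synthetic traces, plain correlation ranking, unit-weight prediction); APOLLO’s actual on-chip selection and model are far more sophisticated.

```python
import random

random.seed(1)
N_SIGNALS, N_CYCLES, K = 300, 2000, 10   # toy scale; APOLLO picks ~100 of millions

# Hypothetical per-cycle toggle traces: signal j is 0 or 1 in each cycle.
traces = [[random.randint(0, 1) for _ in range(N_CYCLES)] for _ in range(N_SIGNALS)]

# Toy ground truth: per-cycle power is driven by a small hidden subset
# of signals, plus a little measurement noise.
hidden = random.sample(range(N_SIGNALS), K)
power = [sum(traces[j][t] for j in hidden) + random.random() * 0.1
         for t in range(N_CYCLES)]

def corr(xs, ys):
    """Pearson correlation between two equally long sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / ((sxx * syy) ** 0.5 or 1.0)

# Step 1: keep the K signals whose toggling correlates most strongly with
# measured power (a stand-in for APOLLO's pruning-based selection).
ranked = sorted(range(N_SIGNALS), key=lambda j: abs(corr(traces[j], power)),
                reverse=True)
selected = ranked[:K]

# Step 2: predict per-cycle power from the selected signals alone
# (unit weights suffice in this toy setup).
def predict(t):
    return sum(traces[j][t] for j in selected)

mean_err = sum(abs(predict(t) - power[t]) for t in range(N_CYCLES)) / N_CYCLES
print(sorted(selected) == sorted(hidden), round(mean_err, 3))
```

    Because only K signals ever need to be monitored after selection, the runtime cost of the prediction step stays tiny, which is the property that lets such a model run on the processor itself.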
    Because this learning process is autonomous and data-driven, it can be implemented on almost any computer processor architecture — even those that have yet to be invented. And while it doesn’t require any human designer expertise to do its job, the algorithm could help human designers do theirs.
    “After the AI selects its 100 signals, you can look at the algorithm and see what they are,” Xie said. “A lot of the selections make intuitive sense, but even if they don’t, they can provide feedback to designers by informing them which processes are most strongly correlated with power consumption and performance.”
    The work is part of a collaboration with Arm Research, a computer engineering research organization that aims to analyze the disruptions impacting industry and create advanced solutions, many years ahead of deployment. With the help of Arm Research, APOLLO has already been validated on some of today’s highest performing processors. But according to the researchers, the algorithm still needs testing and comprehensive evaluations on many more platforms before it would be adopted by commercial computer manufacturers.
    “Arm Research works with and receives funding from some of the biggest names in the industry, like Intel and IBM, and predicting power consumption is one of their major priorities,” Chen added. “Projects like this offer our students an opportunity to work with these industry leaders, and these are the types of results that make them want to continue working with and hiring Duke graduates.”
    This work was conducted under the high-performance AClass CPU research program at Arm Research and was partially supported by the National Science Foundation (NSF-2106828, NSF-2112562) and the Semiconductor Research Corporation (SRC).
    Story Source:
    Materials provided by Duke University. Original written by Ken Kingery. Note: Content may be edited for style and length.

  • Doctoral student finds alternative cell option for organs-on-chips

    Organ-on-a-chip technology has provided a push to discover new drugs for a variety of rare and neglected diseases for which current models either don’t exist or lack precision. In particular, these platforms can include the cells of a patient, thus enabling patient-specific drug discovery.
    As an example, even though sickle cell disease was first described in the early 1900s, the range of severity in the disease causes challenges when trying to treat patients. Since this disease is most prevalent among economically poor and underrepresented minorities, there has been a general lack of stimulus to discover new treatment strategies due to socioeconomic inequity, making it one of the most serious orphan conditions globally.
    Tanmay Mathur, doctoral student in Dr. Abhishek Jain’s lab in the Department of Biomedical Engineering at Texas A&M University, is developing personalized blood vessels to improve knowledge and derive treatments against the vascular dysfunction seen in sickle cell disease and other rare diseases of the blood and vessels.
    Current blood vessel models use induced pluripotent stem cells (iPSCs), which can be derived from a patient’s endothelial cells. However, Mathur said these cells have limitations — they expire quickly and can’t be stored for long periods of time.
    Mathur’s research offers an alternative — blood outgrowth endothelial cells (BOECs), which can be isolated from a patient’s blood. All that is needed is 50 to 100 milliliters of blood.
    “The equipment and the reagents involved are also very cheap and available in most clinical settings,” Mathur said. “These cells are progenitor endothelial cells, meaning they have high proliferation, so if you keep giving them the food they want, within a month, we will have enough cells so that we can successfully keep on subculturing them forever.”
    However, the question is whether BOECs work like iPSCs in the context of organs-on-chips, microdevices that allow researchers to create these blood vessel models. That’s a question Mathur recently answered in a paper published in the Journal of the American Heart Association.

  • Real-world study shows the potential of gait authentication to enhance smartphone security

    Real-world tests have shown that gait authentication could be a viable means of protecting smartphones and other mobile devices from cyber crime, according to new research.
    A study led by the University of Plymouth asked smartphone users to go about their daily activities while motion sensors within their mobile devices captured data about their stride patterns.
    The results showed the system was on average around 85% accurate in recognising an individual’s gait, with that figure rising to almost 90% for normal and fast walking.
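    The matching pipeline implied above (motion-sensor windows reduced to stride features, then compared against an enrolled profile) can be sketched in miniature. Everything below is hypothetical: synthetic accelerometer traces, a two-feature descriptor, and a nearest-centroid matcher standing in for the study’s actual recognition system.

```python
import math
import random

random.seed(2)
FS = 50                     # assumed accelerometer sampling rate (Hz)

def gait_signal(step_hz, amp, n=FS * 4):
    """Synthetic vertical-acceleration trace for one 4-second walking window."""
    return [amp * math.sin(2 * math.pi * step_hz * t / FS) + random.gauss(0, 0.1)
            for t in range(n)]

def features(sig):
    """Two toy stride features: signal energy and zero-crossing rate."""
    energy = sum(x * x for x in sig) / len(sig)
    zero_cross = sum(1 for a, b in zip(sig, sig[1:]) if a * b < 0) / len(sig)
    return (energy, zero_cross)

# Two hypothetical users with different cadence and step intensity.
users = {"alice": (1.8, 1.0), "bob": (2.4, 1.6)}
templates = {u: features(gait_signal(*p)) for u, p in users.items()}

def authenticate(sig):
    """Match a fresh window against each enrolled template (nearest centroid)."""
    f = features(sig)
    return min(templates, key=lambda u: sum((a - b) ** 2
                                            for a, b in zip(f, templates[u])))

correct = sum(authenticate(gait_signal(*users[u])) == u
              for u in users for _ in range(20))
print(correct, "of 40 test windows matched to the right user")
```

    Real gait data is far noisier than these clean sinusoids, which is why the study’s accuracy tops out near 90% rather than the near-perfect matching a toy like this achieves.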
    There are currently more than 6.3 billion smartphone users around the world, who use their devices to access a wide range of services and to store sensitive and confidential information.
    While authentication mechanisms — such as passwords, PINs and biometrics — exist, studies have shown the level of security and usability of such approaches varies considerably.
    Writing in Computers & Security, the researchers say the study illustrates that — within an appropriate framework — gait recognition could be a viable technique for protecting individuals and their data from potential crime.

  • Artificial intelligence that can discover hidden physical laws in various data

    Researchers at Kobe University and Osaka University have successfully developed artificial intelligence technology that can extract hidden equations of motion from regular observational data and create a model that is faithful to the laws of physics.
    This technology could enable us to discover the hidden equations of motion behind phenomena for which the laws were considered unexplainable. For example, it may be possible to use physics-based knowledge and simulations to examine ecosystem sustainability.
    The research group consisted of Associate Professor YAGUCHI Takaharu and PhD student CHEN Yuhan (Graduate School of System Informatics, Kobe University), and Associate Professor MATSUBARA Takashi (Graduate School of Engineering Science, Osaka University).
    These research achievements were made public on December 6, 2021, and were presented at the Thirty-fifth Conference on Neural Information Processing Systems (NeurIPS 2021), a meeting on artificial intelligence technologies. This research was among the top 3% selected for the spotlight category.
    Main Points
    Being able to model (formularize) physical phenomena using artificial intelligence could result in extremely precise, high-speed simulations. Current AI-based methods require transformed data that fits the assumed equation of motion, so it is difficult to apply them to actual observational data for which the equations of motion are unknown. The research group used geometry to develop artificial intelligence that can find the hidden equation of motion in the supplied observational data (regardless of its format) and model it accordingly. In the future, it may become possible to discover the hidden physical laws behind phenomena that had previously been considered incompatible with Newton’s Laws, such as ecosystem changes. This would enable investigations and simulations of such phenomena based on the laws of physics, which could reveal previously unknown properties.
    Ordinarily, predictions of physical phenomena are carried out via simulations using supercomputers. These simulations use mathematical models based on the laws of physics; however, if the model is not highly reliable, the results will also lack reliability. It is therefore essential to develop a method of producing highly reliable models from the observational data of phenomena. Furthermore, in recent years the range of physics applications has expanded beyond expectations, and it has been demonstrated that Newton’s Laws can be applied elsewhere, for example as part of a model of ecosystem changes. In many such cases, however, a concrete equation of motion has not yet been identified.
    Research Methodology
    This research study developed a method of discovering novel equations of motion in observational data for phenomena to which Newton’s Laws can be applied. Research has previously been conducted into discovering equations of motion from data, but the prior method required the data to be in an appropriate format matching its assumed special form of the equation of motion. In reality, however, there are many cases where it is not clear which data format is best to use, making it difficult to apply the method to realistic data.
    In response to this, the researchers considered that the appropriate transformation of observational data is akin to coordinate transformation in geometry, thus resolving the issue by applying the geometric idea of coordinate transformation invariance found in physics. For this, it is necessary to illuminate the unknown geometric properties behind phenomena. The research team subsequently succeeded in developing AI that can find these geometric properties in data. If equations of motion can be extracted from data, then it will be possible to use these equations to create models and simulations that are faithful to physical laws.
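    The flavor of the task can be shown with a deliberately tiny example, far simpler than the geometry-aware AI of the study: assume a one-parameter model family x'' = -k x (a harmonic oscillator, an assumption the real method does not need) and recover the hidden coefficient from raw trajectory observations alone.

```python
import math

dt = 0.01
# Observed trajectory only: position samples of some oscillating system.
xs = [math.cos(3.0 * i * dt) for i in range(1000)]   # hidden truth: x'' = -9 x

# Estimate accelerations from the raw observations by finite differences.
acc = [(xs[i - 1] - 2 * xs[i] + xs[i + 1]) / dt ** 2
       for i in range(1, len(xs) - 1)]
pos = xs[1:-1]

# Least-squares fit of the assumed model acc = -k * pos:
#   k = -sum(acc * pos) / sum(pos * pos)
k = -sum(a * p for a, p in zip(acc, pos)) / sum(p * p for p in pos)
print(round(k, 3))   # close to the hidden stiffness 9.0
```

    The study’s contribution is precisely that it removes the need to guess the model family and data format in advance: the geometric structure is learned from the observations themselves.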
    Further Developments
    Physics simulations are carried out in a wide range of fields, including weather forecasting, drug discovery, building analyses, and car design, but they usually require extensive calculations. However, if AI can learn from the data of specific phenomena and construct small-scale models using the proposed method, then this will simplify and speed up calculations that are faithful to the laws of physics. This will contribute towards the development of the aforementioned fields.
    Furthermore, this method can be applied to areas that at first glance appear unrelated to physics. If equations of motion can be extracted in such cases, it will become possible to carry out physics-based investigations and simulations even for phenomena that have been considered impossible to explain using physics. For example, it may be possible to find a hidden equation of motion in animal population data that tracks the change in the number of individuals. This could be used to investigate ecosystem sustainability by applying the appropriate physical laws (e.g., the law of conservation of energy).
    Story Source:
    Materials provided by Kobe University. Note: Content may be edited for style and length.

  • Key step toward personalized medicine: Modeling biological systems

    A new study by the Oregon State University College of Engineering shows that machine learning techniques can offer powerful new tools for advancing personalized medicine, care that optimizes outcomes for individual patients based on unique aspects of their biology and disease features.
    The research with machine learning, a branch of artificial intelligence in which computer systems use algorithms and statistical models to look for trends in data, tackles long-unsolvable problems in biological systems at the cellular level, said Oregon State’s Brian D. Wood, who conducted the study with then-OSU Ph.D. student Ehsan Taghizadeh and Helen M. Byrne of the University of Oxford.
    “Those systems tend to have high complexity — first because of the vast number of individual cells and second, because of the highly nonlinear way in which cells can behave,” said Wood, a professor of environmental engineering. “Nonlinear systems present a challenge for upscaling methods, which is the primary means by which researchers can accurately model biological systems at the larger scales that are often the most relevant.”
    A linear system in science or mathematics means any change to the system’s input results in a proportional change to the output; a linear equation, for example, might describe a slope that gains 2 feet vertically for every foot of horizontal distance.
    Nonlinear systems don’t work that way, and many of the world’s systems, including biological ones, are nonlinear.
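    The distinction can be checked in a couple of lines via the superposition property that defines linearity, f(a + b) = f(a) + f(b). This is a generic illustration, not code from the study:

```python
# Superposition check: a linear response satisfies f(a + b) = f(a) + f(b);
# a nonlinear one generally does not.
def linear(x):
    return 2 * x          # the slope from the text: 2 feet up per foot across

def nonlinear(x):
    return x ** 2

a, b = 3.0, 4.0
print(linear(a + b) == linear(a) + linear(b))            # True
print(nonlinear(a + b) == nonlinear(a) + nonlinear(b))   # False
```

    The failure of superposition is exactly what makes nonlinear systems hard to upscale: the behavior of the whole is not the sum of the behavior of the parts.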
    The new research, funded in part by the U.S. Department of Energy and published in the Journal of Computational Physics, is one of the first examples of using machine learning to address issues with modeling nonlinear systems and understanding complex processes that might occur in human tissues, Wood said.

  • Community of ethical hackers needed to prevent AI's looming 'crisis of trust'

    The Artificial Intelligence industry should create a global community of hackers and “threat modellers” dedicated to stress-testing the harm potential of new AI products in order to earn the trust of governments and the public before it’s too late.
    This is one of the recommendations made by an international team of risk and machine-learning experts, led by researchers at the University of Cambridge’s Centre for the Study of Existential Risk (CSER), who have authored a new “call to action” published today in the journal Science.
    They say that companies building intelligent technologies should harness techniques such as “red team” hacking, audit trails and “bias bounties” — paying out rewards for revealing ethical flaws — to prove their integrity before releasing AI for use on the wider public.
    Otherwise, the industry faces a “crisis of trust” in the systems that increasingly underpin our society, as public concern continues to mount over everything from driverless cars and autonomous drones to secret social media algorithms that spread misinformation and provoke political turmoil.
    The novelty and “black box” nature of AI systems, together with ferocious competition in the race to the marketplace, have hindered the development and adoption of auditing and third-party analysis, according to lead author Dr Shahar Avin of CSER.
    The experts argue that incentives to increase trustworthiness should not be limited to regulation, but must also come from within an industry yet to fully comprehend that public trust is vital for its own future — and trust is fraying.