More stories

  • A new approach to artificial intelligence that builds in uncertainty

    They call it artificial intelligence — not because the intelligence is somehow fake. It’s real intelligence, but it’s still made by humans. That means AI — a power tool that can add speed, efficiency, insight and accuracy to a researcher’s work — has many limitations.
    It’s only as good as the methods and data it has been given. On its own, it doesn’t know whether information is missing, how much weight to give different kinds of information, or whether the data it draws on are incorrect or corrupted. It can’t deal precisely with uncertainty or random events — unless it learns how. And because machine-learning models usually rely exclusively on data, they do not leverage the knowledge experts have accumulated over years or the physical models that underpin physical and chemical phenomena. It has also been hard to teach computers to organize and integrate information from widely different sources.
    Now researchers at the University of Delaware and the University of Massachusetts-Amherst have published details of a new approach to artificial intelligence that builds uncertainty, error, physical laws, expert knowledge and missing data into its calculations and leads ultimately to much more trustworthy models. The new method provides guarantees typically lacking from AI models, showing how valuable — or not — the model can be for achieving the desired result.
    Joshua Lansford, a doctoral student in UD’s Department of Chemical and Biomolecular Engineering, and Prof. Dion Vlachos, director of UD’s Catalysis Center for Energy Innovation, are co-authors on the paper published Oct. 14 in the journal Science Advances. Also contributing were Jinchao Feng and Markos Katsoulakis of the Department of Mathematics and Statistics at the University of Massachusetts-Amherst.
    The new mathematical framework could produce greater efficiency, precision and innovation for computer models used in many fields of research. Such models provide powerful ways to analyze data, study materials and complex interactions and tweak variables in virtual ways instead of in the lab.
    “Traditionally in physical modeling, we build a model first using only our physical intuition and expert knowledge about the system,” Lansford said. “Then after that, we measure uncertainty in predictions due to error in underlying variables, often relying on brute-force methods, where we sample, then run the model and see what happens.”
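    As a rough illustration of that brute-force step, a minimal sketch might sample uncertain inputs of a toy rate model and propagate them through it; the model form, parameter values and spreads below are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_rate_model(activation_energy_ev, prefactor_per_s, temperature_k=300.0):
        """Hypothetical Arrhenius-type rate expression, used only for illustration."""
        k_b = 8.617e-5  # Boltzmann constant in eV/K
        return prefactor_per_s * np.exp(-activation_energy_ev / (k_b * temperature_k))

    # Assume Gaussian uncertainty on each input (made-up means and standard deviations).
    n_samples = 10_000
    e_a = rng.normal(0.75, 0.05, n_samples)       # activation energy, eV
    a = rng.normal(1.0e13, 1.0e12, n_samples)     # pre-exponential factor, 1/s

    rates = toy_rate_model(e_a, a)
    low, high = np.percentile(rates, [2.5, 97.5])
    print(f"mean rate: {rates.mean():.3e} 1/s, 95% interval: [{low:.3e}, {high:.3e}]")
    ```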
    Effective, accurate models save time and resources and point researchers to more efficient methods, new materials, greater precision and innovative approaches they might not otherwise consider.

    The paper describes how the new mathematical framework works in a chemical reaction known as the oxygen reduction reaction, but it is applicable to many kinds of modeling, Lansford said.
    “The chemistries and materials we need to make things faster or even make them possible — like fuel cells — are highly complex,” he said. “We need precision…. And if you want to make a more active catalyst, you need to have bounds on your prediction error. By intelligently deciding where to put your efforts, you can tighten the area to explore.”
    “Uncertainty is accounted for in the design of our model,” Lansford said. “Now it is no longer a deterministic model. It is a probabilistic one.”
    With these new mathematical developments in place, the model itself identifies what data are needed to reduce model error, he said. Then a higher level of theory can be used to produce more accurate data or more data can be generated, leading to even smaller error boundaries on the predictions and shrinking the area to explore.
    “Those calculations are time-consuming to generate, so we’re often dealing with small datasets — 10-15 data points. That’s where the need comes in to apportion error.”
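    One hedged way to picture how a model can point at the data worth improving is a one-at-a-time check of how much the prediction spread would shrink if a single uncertain input were pinned down; the sketch below reuses the same hypothetical toy model and invented numbers as above.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_rate_model(activation_energy_ev, prefactor_per_s, temperature_k=300.0):
        """Same hypothetical Arrhenius-type expression as in the earlier sketch."""
        k_b = 8.617e-5  # eV/K
        return prefactor_per_s * np.exp(-activation_energy_ev / (k_b * temperature_k))

    n = 10_000
    inputs = {
        "activation_energy_ev": rng.normal(0.75, 0.05, n),
        "prefactor_per_s": rng.normal(1.0e13, 1.0e12, n),
    }

    full_spread = toy_rate_model(**inputs).std()
    for name in inputs:
        # Pretend this input were measured exactly (pinned to its mean) and see how
        # much the prediction spread would shrink; the biggest drop marks the most
        # valuable data to collect next.
        pinned = {k: (v.mean() if k == name else v) for k, v in inputs.items()}
        print(f"pin {name}: spread {full_spread:.2e} -> {toy_rate_model(**pinned).std():.2e}")
    ```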
    That’s still not a money-back guarantee that using a specific substance or approach will deliver precisely the product desired. But it is much closer to a guarantee than you could get before.

    This new method of model design could greatly enhance work in renewable energy, battery technology, climate change mitigation, drug discovery, astronomy, economics, physics, chemistry and biology, to name just a few examples.
    Artificial intelligence doesn’t mean human expertise is no longer needed. Quite the opposite.
    The expert knowledge that emerges from the laboratory and the rigors of scientific inquiry is essential, foundational material for any computational model.

  • An ultrasonic projector for medicine

    A chip-based technology that generates sound profiles with high resolution and intensity could create new options for ultrasound therapy, making it more effective and easier to apply. A team of researchers led by Peer Fischer from the Max Planck Institute for Intelligent Systems and the University of Stuttgart has developed a projector that flexibly modulates three-dimensional ultrasound fields with comparatively little technical effort. Dynamic sound pressure profiles can thus be generated with higher resolution and sound pressure than current technology allows. It should soon be easier to tailor ultrasound profiles to individual patients, and new medical applications for ultrasound may even emerge.
    Ultrasound is widely used as a diagnostic tool in both medicine and materials science. It can also be used therapeutically. In the US, for example, tumours of the uterus and prostate are treated with high-power ultrasound. The ultrasound destroys the cancer cells by selectively heating the diseased tissue. Researchers worldwide are using ultrasound to combat tumours and other pathological changes in the brain. “In order to avoid damaging healthy tissue, the sound pressure profile must be precisely shaped,” explains Peer Fischer, Research Group Leader at the Max Planck Institute for Intelligent Systems and professor at the University of Stuttgart. Tailoring an intense ultrasound field to diseased tissue is somewhat more difficult in the brain, because the skullcap distorts the sound wave. The Spatial Ultrasound Modulator (SUM) developed by researchers in Fischer’s group should help to remedy this situation and make ultrasound treatment more effective and easier in other cases as well. It allows the three-dimensional shape of even very intense ultrasound waves to be varied with high resolution — and with less technical effort than is currently required to modulate ultrasound profiles.
    High-intensity sound pressure profiles with 10,000 pixels
    Conventional methods vary sound fields by superimposing the waves from several individual sound sources, which can be shifted against each other. However, because the individual sound sources cannot be miniaturized at will, the resolution of these sound pressure profiles is limited to about 1,000 pixels. At that point, the sound transmitters are so small that the sound pressure is sufficient for diagnostic but not for therapeutic purposes. With the new technology, the researchers first generate an ultrasonic wave and then modulate its sound pressure profile separately, essentially killing two birds with one stone. “In this way, we can use much more powerful ultrasonic transducers,” explains postdoctoral fellow Kai Melde, who is part of the team that developed the SUM. “Thanks to a chip with 10,000 pixels that modulates the ultrasonic wave, we can generate a much finer-resolved profile.”
    “In order to modulate the sound pressure profile, we take advantage of the different acoustic properties of water and air,” says Zhichao Ma, a post-doctoral fellow in Fischer’s group, who was instrumental in developing the new SUM technology: “While an ultrasonic wave passes through a liquid unhindered, it is completely reflected by air bubbles.” The research team from Stuttgart thus constructed a chip the size of a thumbnail on which they can produce hydrogen bubbles by electrolysis (i.e. the splitting of water into oxygen and hydrogen with electricity) on 10,000 electrodes in a thin water film. The electrodes each have an edge length of less than a tenth of a millimetre and can be controlled individually.
    A picture show with ultrasound
    If you send an ultrasonic wave through the chip with a transducer (a kind of miniature loudspeaker), it passes through the chip unhindered. But as soon as the sound wave hits the water with the hydrogen bubbles, it continues to travel only through the liquid. Like a mask, this creates a sound pressure profile with cut-outs at the points where the air bubbles are located. To form a different sound profile, the researchers first wipe the hydrogen bubbles away from the chip and then generate gas bubbles in a new pattern.
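    As a minimal numerical sketch of that masking idea, the toy example below applies a made-up bubble pattern to a uniform incident wave; the pixel count and pattern are illustrative, not the device’s actual layout.

    ```python
    import numpy as np

    # Toy picture of the SUM principle: pixels covered by a hydrogen bubble reflect
    # the wave (transmission 0), water-filled pixels let it pass (transmission 1).
    side = 100
    bubble_mask = np.zeros((side, side), dtype=bool)
    bubble_mask[40:60, 20:80] = True              # hypothetical bar-shaped bubble pattern

    incident_pressure = np.ones((side, side))     # uniform plane wave from the transducer
    transmitted_profile = np.where(bubble_mask, 0.0, incident_pressure)

    print(f"fraction of the field blocked: {bubble_mask.mean():.1%}")
    print(f"transmitted power fraction: {transmitted_profile.mean():.1%}")
    ```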
    The researchers demonstrated how precisely and variably the new projector for ultrasound works by writing the alphabet in a kind of picture show of sound pressure profiles. To make the letters visible, they caught micro-particles in the various sound pressure profiles. Depending on the sound pattern, the particles arranged themselves into the individual letters.
    Organoid models for drug testing
    For similar images, the scientists collaborating with Peer Fischer, Kai Melde, and Zhichao Ma previously arranged micro-particles with sound pressure profiles, which they modelled using a slightly different technique. They used special plastic stencils to deform the pressure profile of an ultrasonic wave like a hologram and arrange small particles — as well as biological cells in a liquid — into a desired pattern. However, the plastic holograms only provided still images. For each new pattern, they had to make a different plastic template. Using the ultrasound projector, the Stuttgart team is able to generate a new sound profile in about 10 seconds. “With other chips, we could significantly increase the frame rate,” says Kai Melde, who led the hologram development team.
    The technique could be used not only for diagnostic and therapeutic purposes but also in biomedical laboratories, for example to arrange cells into organoid models. “Such organoids enable useful tests of active pharmaceutical ingredients and could therefore at least partially replace animal experiments,” says Fischer.

  • Detecting early-stage failure in electric power conversion devices

    Power electronics regulate and modify electric power. They are in computers, power steering systems, solar cells, and many other technologies. Researchers are seeking to enhance power electronics by using silicon carbide semiconductors. However, wear-out failures such as cracks remain problematic. To help researchers improve future device designs, early damage detection in power electronics before complete failure is required.
    In a study recently published in IEEE Transactions on Power Electronics, researchers from Osaka University monitored in real time the propagation of cracks in a silicon carbide Schottky diode during power cycling tests. The researchers used an analysis technique known as acoustic emission, which had not previously been reported for this purpose.
    During the power cycling test, the researchers mimicked repeatedly turning the device on and off and monitored the resulting damage to the diode over time. Increasing acoustic emission corresponds to progressive damage to the aluminum ribbons affixed to the silicon carbide Schottky diode. The researchers correlated the monitored acoustic emission signals to specific stages of device damage that eventually led to failure.
    “A transducer converts acoustic emission signals during power cycling tests to an electrical output that can be measured,” explains lead author ChanYang Choe. “We observed burst-type waveforms, which are consistent with fatigue cracking in the device.”
    The traditional method of checking whether a power device is damaged is to monitor anomalous increases in the forward voltage during power cycling tests. Using the traditional method, the researchers found that there was an abrupt increase in the forward voltage, but only when the device was near complete failure. In contrast, acoustic emission counts were much more sensitive. Instead of an all-or-none response, there were clear trends in the acoustic emission counts during power cycling tests.
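    To make the contrast concrete, the synthetic sketch below compares a forward voltage that jumps only near end of life with acoustic-emission hits that accumulate as damage grows; all numbers are invented for illustration and are not from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    cycles = np.arange(10_000)
    # Forward voltage stays flat and jumps abruptly only near complete failure.
    forward_voltage = 1.5 + np.where(cycles > 9_500, 0.3, 0.0)
    # Acoustic-emission amplitude drifts upward as cracks initiate and propagate.
    ae_amplitude = rng.normal(0.0, 1.0, cycles.size) + cycles / 4_000.0

    threshold = 2.5                                  # hypothetical hit threshold
    cumulative_hits = np.cumsum(ae_amplitude > threshold)

    first_voltage_alarm = cycles[forward_voltage > 1.6][0]
    print(f"voltage-based warning only at cycle {first_voltage_alarm}")
    print(f"AE hits already accumulated by cycle 5,000: {cumulative_hits[5_000]}")
    ```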
    “Unlike forward voltage plots, acoustic emission plots indicate all three stages of crack development,” says senior author Chuantong Chen. “We detected crack initiation, crack propagation, and device failure, and confirmed our interpretations by microscopic imaging.”
    To date, there has been no sensitive early-warning method for detecting the fatigue cracks that lead to complete failure in silicon carbide Schottky diodes. Acoustic emission monitoring, as reported here, is such a method. In the future, this development will help researchers determine why silicon carbide devices fail and improve future designs in both common and advanced technologies.

    Story Source:
    Materials provided by Osaka University.

  • Physicist joins international effort to unveil the behavior of 'strange metals'

    Landau’s theory of the Fermi liquid (FL), established in the first half of the 20th century, is the foundation of the scientific and industrial use of metallic materials in our society and the basis of our current understanding of metals. In the second half of the 20th century, however, more and more metallic materials were discovered that behave very differently. The non-Fermi-liquid (NFL) behaviour of these “strange metals” remains a puzzle to physicists, and there is no established theory to explain them.
    Recently, a joint research team comprising Dr Zi Yang MENG, Associate Professor in the Department of Physics at the University of Hong Kong (HKU), Dr Avraham KLEIN and Professor Andrey CHUBUKOV from the University of Minnesota, Dr Kai SUN, Associate Professor at the University of Michigan, and Dr Xiao Yan XU from the University of California at San Diego, has solved the puzzle of NFL behaviour in interacting electron systems and provided a protocol for establishing new paradigms in quantum metals, through quantum many-body computation and analytical calculations. The findings were recently published in npj Quantum Materials. The work was supported by the Research Grants Council of HKSAR and the Ministry of Science and Technology of China.
    Breaking discoveries of mysterious NFL behaviour
    Landau’s Fermi liquid theory successfully explained many features of simple metals such as copper, silver, gold and iron, for example the simple way their resistivity, heat capacity and other properties vary with temperature T (resistivity follows ρ ~ T² and heat capacity follows C ~ T, independent of material details). The success of the Fermi liquid theory lies in the central assumption that the electrons, the “droplets” of the Fermi liquid, do not interact with each other but behave identically in the material.
    However, many metallic materials discovered after the FL theory was established do NOT behave as Fermi liquids. For example, in the so-called high-temperature superconductor compounds — copper oxides and iron pnictides — the resistivity is linear in temperature, ρ ~ T, before the system becomes superconducting (at which point the resistivity is zero). Such systems are generally dubbed non-Fermi liquids (NFL). Unlike in a simple FL, the electrons of an NFL, the droplets, strongly interact with each other.
    NFLs have potential application in solving the energy crisis
    Physicists still do not have much of a clue about NFLs, which makes it very difficult to make concrete predictions. Yet these systems are essential for the continued prosperity of human society, as NFLs hold the key to making use of high-temperature superconducting materials that could help solve the energy crisis. Currently, the so-called high-temperature superconducting materials still work only at a temperature scale of about -100 degrees Celsius; they are called high-temperature in comparison with the FL superconductors, which work at a temperature scale of about -200 degrees Celsius. It is therefore still hard to put high-temperature superconductors into daily use at room temperature, yet only then could we enjoy the key property of such materials: that electric power is not lost as heat due to resistivity. Only when we understand how the NFL in high-temperature superconductors works at -100 degrees Celsius can we design the ultimate material that works at room temperature. The complete understanding of NFLs is therefore of vital importance.

    Physicists from an analytical background have been trying to understand NFLs for about half a century. The problem with analytical calculation is that, due to the quantum many-body nature of the NFL, the convergence and accuracy of many theoretical predictions cannot be controlled or guaranteed; one needs unbiased quantum many-body computation to verify these propositions.
    Key revelation to the puzzle is computation
    On the numerical front, there have been many previous attempts, but the results obtained have always differed from the analytical prediction. For example, the most important quantity of the NFL, the self-energy Σ, which describes the level of electron interactions in the material, is expected to have a power-law frequency dependence such as Σ ~ ω^(2/3). However, the computed self-energy does not follow such a power law; it shows a slowly diverging behaviour, that is, the computed self-energy does not go to zero as the frequency is reduced but instead gets larger and larger. This discrepancy made the situation even more perplexing.
    After a very inspiring discussion between Dr Meng, Professor Chubukov and Dr Klein, they realized that the setting of the numerical simulation is actually different from that of the analytical calculation. The subtlety comes from the fact that the model simulations are performed on a finite system at finite temperature, that is, T ≠ 0, whereas the analytical expectations hold strictly at zero temperature, T = 0. In other words, the numerical data contain both the zero-temperature NFL contribution and the contribution from fluctuations at finite temperature. To reveal the NFL behaviour from lattice-model simulations in such a setting, one needs to subtract the finite-temperature contribution.
    This turned out to be the key to the puzzle of the NFL. Dr Klein, Dr Sun and Professor Chubukov derived the analytical form of the finite-temperature contribution (with input from the lattice model designed by Dr Meng and Dr Xu), which Dr Meng and Dr Xu then subtracted from the numerical data. To everyone’s surprise and delight, the results after the subtraction perfectly exhibit the expected NFL behaviour: from finite temperature all the way down to zero temperature, the power law is revealed. It is the first time that such clear NFL behaviour has been obtained from unbiased numerical simulation.
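    As a toy illustration of that subtraction-and-fit logic, the sketch below builds synthetic data containing both a finite-temperature offset and the expected ω^(2/3) piece, removes the offset, and reads off the exponent; the numbers are illustrative, not the published results.

    ```python
    import numpy as np

    omega = np.logspace(-2, 0, 30)           # frequencies in arbitrary units
    finite_t_piece = 0.05                     # hypothetical finite-temperature contribution
    sigma_measured = omega ** (2.0 / 3.0) + finite_t_piece

    # Subtract the analytically known finite-temperature piece, then read off the
    # power-law exponent from the slope on a log-log scale.
    sigma_corrected = sigma_measured - finite_t_piece
    slope, _ = np.polyfit(np.log(omega), np.log(sigma_corrected), 1)
    print(f"fitted exponent: {slope:.3f} (expected 2/3 ≈ 0.667)")
    ```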
    Bringing a better future to society
    Dr Meng said this work is expected to inspire many follow-up theoretical and experimental studies; in fact, promising results for the further identification of NFL behaviour in another model system have already been obtained in subsequent investigations. He said: “This research work resolves a puzzle about non-Fermi liquids that has stood for several decades and paves the avenue for establishing new paradigms of quantum metals beyond those of more than half a century ago. Eventually, we will be able to understand NFL materials such as high-temperature superconductors as well as we now understand simple metals such as copper and silver, and such new understanding will help solve the energy crisis and bring better industrial and personal applications to society.”

  • Quantum engines with entanglement as fuel?

    To make a car run, its engine burns gasoline and converts the energy from the heat of the combusting gasoline into mechanical work. In the process, however, energy is wasted; a typical car converts only around 25 percent of the energy in gasoline into useful energy to make it run.
    Engines that run with 100 percent efficiency are still more science fiction than science fact, but new research from the University of Rochester may bring scientists one step closer to demonstrating an ideal transfer of energy within a system.
    Andrew Jordan, a professor of physics at Rochester, was recently awarded a three-year, $1 million grant from the Templeton Foundation to research quantum measurement engines — engines that use the principles of quantum mechanics to run with 100 percent efficiency. The research, to be carried out with co-principal investigators in France and at Washington University in St. Louis, could answer important questions about the laws of thermodynamics in quantum systems and contribute to technologies such as more efficient engines and quantum computers.
    “The grant deals with several Big Questions about our natural world,” Jordan says.
    PHYSICS AT A SMALL LEVEL
    The researchers have previously described the concept of quantum measurement engines, but the theory has never been demonstrated experimentally.

    In the microscopic quantum world, particles exhibit unique properties that do not align with the classical laws of physics as we know them. Jordan and his colleagues will use superconducting circuits to design experiments that can be carried out within a realistic quantum system. Through these experiments, the researchers will study how the laws of energy, work, power, efficiency, heat, and entropy function at the quantum level. These concepts are currently poorly understood in quantum mechanics.
    MICROSCOPIC POWER TASKS
    Quantum measurement engines may work in microscopic environments for very small power tasks such as moving around an atom or charging a miniaturized circuit. In these capacities, they may be important components for quantum computers.
    This type of engine couldn’t currently be used to power a car, however; the power in a quantum measurement engine is measured in picowatts, with one picowatt equal to one trillionth (a millionth of a millionth) of a watt. For comparison, a typical incandescent lightbulb uses about 60 watts of power.
    “The power scales involved — numbers like picowatts — indicate the large gap between our human interests and these tiny engines,” Jordan says.

    One way to make quantum measurement engines for human-scale activities may be “through massive parallelization,” Jordan says. “Each device only outputs a tiny amount of energy, but by making billions of them working together, you could make a macroscopic engine from the ground up.”
    A NEW TYPE OF FUEL
    Jordan and his team will also investigate another major area of research: how it might be possible to extract work from a system using entanglement as a fuel. In entanglement — one of the basic concepts of quantum physics — the properties of one particle are interlinked with the properties of another, even when the particles are separated by a large distance. Using entanglement as a fuel has the possibly revolutionary feature of creating a non-local engine; half of an engine could be in New York, while the other half could be in California. The energy would not be held by either half of the system, yet the two parts could still share energy to fuel both halves efficiently.
    “We will show that the engine can, in principle, be perfectly efficient,” Jordan says. “That is, there would be an ideal transfer of energy from the measurement apparatus to the quantum system.”
    The foundation award reflects the significance of quantum technology as a national and international priority, and Rochester’s key role in the enterprise. The project itself builds on Rochester’s robust history of research in optics and physics and current efforts to better unravel the mysteries of quantum mechanics.
    “The University of Rochester has an existing strength in quantum physics, and indeed was the birthplace of the field of quantum optics,” Jordan says. “We have a good collection of quality researchers in place, a historical legacy of quantum physics, and ongoing University support of quantum physics.”

    Story Source:
    Materials provided by University of Rochester. Original written by Lindsey Valich.

  • Molecular design strategy reveals near infrared-absorbing hydrocarbon

    Nagoya University researchers have synthesized a unique molecule with a surprising property: it can absorb near infrared light. The molecule is made only of hydrogen and carbon atoms and offers insights for making organic conductors and batteries. The details were published in the journal Nature Communications.
    Organic chemist Hiroshi Shinokubo and physical organic chemist Norihito Fukui of Nagoya University work on designing new, interesting molecules using organic, or carbon-containing, compounds. In the lab, they synthesized an aromatic hydrocarbon called methoxy-substituted as-indacenoterrylene. This molecule has a unique structure, as its methoxy groups are located internally rather than at its periphery.
    “Initially, we wanted to see if this hydrocarbon demonstrated novel phenomena due to its unique structure,” says Fukui.
    But during their investigations, the researchers discovered they could convert it into a new bowl-shaped hydrocarbon called as-indacenoterrylene.
    “We were surprised to find that this new molecule exhibits near infrared absorption up to 1300 nanometers,” Shinokubo explains.
    What’s unique about as-indacenoterrylene is not that it absorbs near infrared light. Other hydrocarbons can do this as well. as-indacenoterrylene is interesting because it does this despite being made of only 34 carbon and 14 hydrogen atoms, without containing other kinds of stabilizing atoms at its periphery.
    When the scientists conducted electrochemical measurements, theoretical calculations, and other tests, they found that as-indacenoterrylene was intriguingly stable and also had a remarkably narrow gap between its highest occupied molecular orbital (HOMO) and its lowest unoccupied molecular orbital (LUMO). This means that the molecule has two electronically different subunits, one that donates and another that withdraws electrons. The narrow HOMO-LUMO gap makes it easier for electrons to become excited within the molecule.
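    For a sense of scale, the photon energy at the 1300-nanometre absorption edge follows from E = hc/λ; the short calculation below, using standard constants rather than values from the paper, puts the corresponding HOMO-LUMO gap at roughly 0.95 electron volts.

    ```python
    # Photon energy at the absorption edge: E = hc / lambda, with hc ≈ 1239.8 eV·nm.
    hc_ev_nm = 1239.8
    wavelength_nm = 1300.0
    print(f"photon energy at {wavelength_nm:.0f} nm: {hc_ev_nm / wavelength_nm:.2f} eV")
    ```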
    “The study offers an effective guideline for the design of hydrocarbons with a narrow HOMO-LUMO gap, which is to fabricate molecules with coexisting electron-donating and electron-withdrawing subunits,” says Fukui. “These molecules will be useful for the development of next-generation solid-state materials, such as organic conductors and organic batteries.”
    The team next plans to synthesize other near infrared-absorbing aromatic hydrocarbons based on the design concepts garnered in this current study.

    Story Source:
    Materials provided by Nagoya University.

  • Internet connectivity is oxygen for research and development work

    Fast and reliable internet access is fundamental for research and development activity around the world. Seamless connectivity is a privilege we often take for granted. But in developing nations, technological limitations can become stumbling blocks to efficient communication and cause significant disadvantages.
    Pete Goldsmith, director of the Soybean Innovation Lab (SIL) at the University of Illinois, works closely with partner organizations in several African countries. He noticed that his African colleagues were often dealing with technological problems that made communication very challenging. For example, sometimes they had to rely on their cell phones because their institution’s internet access was unreliable.
    Goldsmith teamed up with two IT experts at U of I, former Chief Information Officer Paul Hixson and Director of Research IT and Innovation Tracy Smith, to investigate technological challenges facing institutions in developing countries.
    “Connectivity is the oxygen organizations run on,” Hixson says. “It’s such a basic requirement that it’s often not even recognized as an issue. But lack of connectivity severely hinders an organization’s ability to perform simple functions, conduct research, and compete for grants.”
    Goldsmith, Hixson, and Smith conducted an in-depth case study of information communication technology (ICT) infrastructure at the Savannah Agricultural Research Institute (SARI), a leading research station in Ghana and a close collaborator of SIL.
    The case study included focus groups, interviews, and a technological analysis of SARI’s equipment and connectivity. Based on this study, the research team developed the ICT Health Checkup, an assessment procedure for IT administrators to methodically assess the current state of their system, identify gaps affecting performance, and document steps for remediation.

    The ICT Health Checkup tool systematically evaluates four key elements of ICT infrastructure. The first step focuses on connectivity and bandwidth, identifying the required bandwidth to accommodate the institution’s needs and whether the institution has an uninterrupted fiber-based connection to the global internet. The second step analyzes core physical infrastructure, including dependable electricity, local network design, and both wired and wireless connectivity capabilities.
    The third step looks at available intranet service offerings for researchers such as local storage, data backup procedures, access control, security procedures, email service, and cloud access. Finally, the fourth step deals with the human resources and technical support requirements for planning and managing the institution’s IT infrastructure.
    “With this tool, institutions can go through a checklist, and at each point there is a ‘stoplight’. If it’s red, you know there is something that needs to be fixed, because there are conditions that will act as a block and you can’t go on until they are fixed — until there’s a green light. So turning things from red to green at each step is crucial; methodically going through each step at a time and making sure it’s fixed before moving on to the next one,” Hixson explains.
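    As a purely hypothetical sketch of that stoplight walkthrough, the snippet below encodes the four steps as an ordered checklist and stops at the first red light; the step names paraphrase the elements described above, and the pass/fail criteria are invented placeholders rather than the actual ICT Health Checkup thresholds.

    ```python
    # Each step is evaluated in order and the walkthrough stops at the first red light.
    CHECKUP_STEPS = [
        ("Connectivity and bandwidth", lambda s: s["bandwidth_mbps"] >= s["required_mbps"]),
        ("Core physical infrastructure", lambda s: s["reliable_power"] and s["wired_and_wireless"]),
        ("Intranet services", lambda s: s["backups"] and s["email"] and s["cloud_access"]),
        ("Human resources and support", lambda s: s["it_staff"] >= 1),
    ]

    def run_checkup(site):
        for name, passes in CHECKUP_STEPS:
            light = "green" if passes(site) else "red"
            print(f"{name}: {light}")
            if light == "red":
                print("-> remediate this step before moving on")
                return False
        return True

    example_site = {
        "bandwidth_mbps": 40, "required_mbps": 100, "reliable_power": True,
        "wired_and_wireless": True, "backups": False, "email": True,
        "cloud_access": True, "it_staff": 2,
    }
    run_checkup(example_site)
    ```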
    The researchers compare the ICT Health Checkup to a medical health exam; it measures the current conditions and can be used as a benchmarking tool to measure improvements.
    Goldsmith says the tool can be used to empower organizations so they can be self-sufficient. “With proper connectivity you can manage and store research data, compete for grants, and manage awards,” he notes. “It’s the foundation that allows institutions to participate fully in a global context.”
    The research team is currently expanding the study, collecting data from nine institutions and five networking organizations operating in three countries, in order to create a more robust picture of internet connectivity challenges and potential solutions across Africa.
    They are also collaborating with the National Research and Education Networks (NRENs) in each of the sub-Saharan African countries that SIL operates in. These African NRENs are comparable to Internet2, which has been an instrumental partner in the expansion and adoption of advanced computing technologies at U of I and is one of the leading NRENs in the U.S., serving the country’s research and higher-education communities.
    “With the ICT health checkup, our partner African NRENs now have an actual assessment tool they can use with their member institutions. It’s becoming a continent-wide approach as they are starting to adopt this new instrument created at the U of I to be their benchmark and measurement tool,” Goldsmith says.
    “The U of I is ideally positioned to provide this knowledge, because of the university’s continued leadership in the computational and network administration space,” he adds. “Now we are extending that to have real impact overseas.”

  • A new ultrafast control scheme of ferromagnet for energy-efficient data storage

    The digital data generated around the world every year is now counted in zettabytes, or trillions of billions of bytes — equivalent to delivering data for hundreds of millions of books every second. The amount of data generated continues to grow. If existing technologies remained constant, all the current global electricity consumption would be devoted to data storage by 2040.
    Researchers at the Université de Lorraine in France and Tohoku University reported on an innovative technology that leads to a drastic reduction in energy for data storage.
    The new technology utilizes an ultrafast laser pulse whose duration is as short as 30 femtoseconds — equal to 0.00000000000003 seconds. The laser pulse is applied to a heterostructure consisting of ferrimagnetic GdFeCo, nonmagnetic Cu and ferromagnetic Co/Pt layers.
    Previous research, conducted by a subset of the current research group, had observed magnetic switching of the ferromagnetic layer after the ferrimagnetic layer had been switched. This time, the researchers uncovered the mechanism accounting for this peculiar phenomenon and found that a flow of electron spin, referred to as a spin current, accompanying the switching of the ferrimagnetic GdFeCo plays a crucial role in inducing the switching of the ferromagnetic Co/Pt.
    Based on this insight, they demonstrated much faster and less energy-consuming switching of the ferromagnet, driven by a single laser pulse without switching the ferrimagnetic layer. “This is very good news for future data-storage applications, as this technology can provide an efficient scheme for writing digital information to a magnetic medium, which is currently based on magnetic-field-induced switching,” says Shunsuke Fukami, co-author of the study.

    Story Source:
    Materials provided by Tohoku University.