More stories

  • Material found in house paint may spur technology revolution

    The development of a new method for making non-volatile computer memory may have removed a roadblock that has been holding back machine learning, with the potential to revolutionize technologies like voice recognition, image processing and autonomous driving.
    A team from Sandia National Laboratories, working with collaborators from the University of Michigan, published a paper in the peer-reviewed journal Advanced Materials detailing the new method. By using a common material found in house paint in an analog memory device, it gives the chips that power machine-learning applications more processing power and enables highly energy-efficient machine-inference operations.
    “Titanium oxide is one of the most commonly made materials. Every paint you buy has titanium oxide in it. It’s cheap and nontoxic,” explains Sandia materials scientist Alec Talin. “It’s an oxide, there’s already oxygen there. But if you take a few out, you create what are called oxygen vacancies. It turns out that when you create oxygen vacancies, you make this material electrically conductive.”
    Those oxygen vacancies can now store electrical data, giving almost any device more computing power. Talin and his team create the oxygen vacancies by heating a computer chip with a titanium oxide coating above 302 degrees Fahrenheit (150 degrees Celsius) and then using electrochemistry to pull some of the oxygen out of the material, leaving vacancies behind.
    “When it cools off, it stores any information you program it with,” Talin said.
    Energy efficiency a boost to machine learning
    Right now, computers generally store data in one place and process it in another, so they must constantly shuttle data back and forth, wasting energy and computing power.

    The paper’s lead author, Yiyang Li, is a former Truman Fellow at Sandia and now an assistant professor of materials science at the University of Michigan. He explained how their process has the potential to completely change how computers work.
    “What we’ve done is make the processing and the storage at the same place,” Li said. “What’s new is that we’ve been able to do it in a predictable and repeatable manner.”
    Both he and Talin see the use of oxygen vacancies as a way to help machine learning overcome a big obstacle holding it back right now — power consumption.
    “If we are trying to do machine learning, that takes a lot of energy because you are moving it back and forth and one of the barriers to realizing machine learning is power consumption,” Li said. “If you have autonomous vehicles, making decisions about driving consumes a large amount of energy to process all the inputs. If we can create an alternative material for computer chips, they will be able to process information more efficiently, saving energy and processing a lot more data.”
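    The idea can be made concrete with a toy calculation. The sketch below is illustrative only, not the Sandia team’s implementation: it treats an array of resistive memory cells as a matrix of conductances, so that applying input voltages performs a neural-network-style matrix-vector multiplication right where the weights are stored, with no data shuttling.
    ```python
    # A minimal sketch (not the Sandia team's implementation) of analog
    # in-memory computing: each memory cell's conductance G[i][j] stores a
    # weight (here, an oxygen-vacancy state idealized as an analog value),
    # and applying input voltages V yields the matrix-vector product as
    # output currents I = G @ V in one physical step.
    import numpy as np

    rng = np.random.default_rng(0)

    G = rng.uniform(1e-6, 1e-4, size=(4, 8))   # conductances, siemens
    V = rng.uniform(0.0, 0.2, size=8)          # input voltages, volts

    # Kirchhoff's current law sums each row's contributions for free:
    I = G @ V                                  # output currents, amperes

    # A conventional machine gets the same numbers only by fetching every
    # weight from memory and multiplying it in a separate processor:
    I_von_neumann = np.array([sum(G[i, j] * V[j] for j in range(8))
                              for i in range(4)])

    assert np.allclose(I, I_von_neumann)
    print(I)
    ```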
    Research has everyday impact
    Talin sees the potential in the performance of everyday devices.
    “Think about your cell phone,” he said. “If you want to give it a voice command, you need to be connected to a network that transfers the command to a central hub of computers that listen to your voice and then send a signal back telling your phone what to do. With chips like these, voice recognition and other functions could happen right in your phone.”
    Talin said the team is working on refining several processes and testing the method on a larger scale. The project is funded through Sandia’s Laboratory Directed Research and Development program.

    Story Source:
    Materials provided by DOE/Sandia National Laboratories. Note: Content may be edited for style and length.

  • With deep learning algorithms, standard CT technology produces spectral images

    Bioimaging technologies are the eyes that allow doctors to see inside the body in order to diagnose, treat, and monitor disease. Ge Wang, an endowed professor of biomedical engineering at Rensselaer Polytechnic Institute, has received significant recognition for devoting his research to coupling those imaging technologies with artificial intelligence in order to improve physicians’ “vision.”
    In research published today in Patterns, a team of engineers led by Wang demonstrated how a deep learning algorithm can be applied to a conventional computerized tomography (CT) scan in order to produce images that would typically require a higher level of imaging technology known as dual-energy CT.
    Wenxiang Cong, a research scientist at Rensselaer, is first author on this paper. Wang and Cong were joined by coauthors from Shanghai First-Imaging Tech and researchers from GE Research.
    “We hope that this technique will help extract more information from a regular single-spectrum X-ray CT scan, make it more quantitative, and improve diagnosis,” said Wang, who is also the director of the Biomedical Imaging Center within the Center for Biotechnology and Interdisciplinary Studies (CBIS) at Rensselaer.
    Conventional CT scans produce images that show the shape of tissues within the body, but they don’t give doctors sufficient information about the composition of those tissues. Even with iodine and other contrast agents, which are used to help doctors differentiate between soft tissue and vasculature, it’s hard to distinguish between subtle structures.
    A higher-level technology called dual-energy CT gathers two datasets in order to produce images that reveal both tissue shape and information about tissue composition. However, this imaging approach often requires a higher dose of radiation and is more expensive because of the additional hardware it requires.
    “With traditional CT, you take a grayscale image, but with dual-energy CT you take an image with two colors,” Wang said. “With deep learning, we try to use the standard machine to do the job of dual-energy CT imaging.”
    In this research, Wang and his team demonstrated how their neural network was able to produce those more complex images using single-spectrum CT data. The researchers used images produced by dual-energy CT to train their model and found that it was able to produce high-quality approximations with a relative error of less than 2%.
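    The published paper is the authoritative source for the architecture; the sketch below is only a minimal PyTorch stand-in with an invented toy network (SingleToDualEnergyNet) to show the shape of the task: a single-spectrum CT slice goes in, and the two channels a dual-energy scan would provide come out.
    ```python
    # A schematic stand-in (not the authors' published network) for the
    # image-to-image mapping described above: one single-energy CT channel
    # in, two dual-energy-like channels out.
    import torch
    import torch.nn as nn

    class SingleToDualEnergyNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 2, 3, padding=1),  # two output "energies"
            )

        def forward(self, x):       # x: (batch, 1, H, W) single-energy slice
            return self.body(x)     # (batch, 2, H, W) dual-energy estimate

    model = SingleToDualEnergyNet()
    ct_slice = torch.randn(1, 1, 64, 64)     # stand-in for a real CT slice
    pred = model(ct_slice)

    # Training would minimize the error against real dual-energy images;
    # the paper reports relative errors below 2% on its test data.
    target = torch.randn(1, 2, 64, 64)       # stand-in dual-energy pair
    relative_error = (pred - target).norm() / target.norm()
    print(pred.shape, float(relative_error))
    ```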
    “Professor Wang and his team’s expertise in bioimaging is giving physicians and surgeons ‘new eyes’ in diagnosing and treating disease,” said Deepak Vashishth, director of CBIS. “This research effort is a prime example of the partnership needed to personalize and solve persistent human health challenges.”

    Story Source:
    Materials provided by Rensselaer Polytechnic Institute. Original written by Torie Wells. Note: Content may be edited for style and length.

  • A new approach to artificial intelligence that builds in uncertainty

    They call it artificial intelligence — not because the intelligence is somehow fake. It’s real intelligence, but it’s still made by humans. That means AI — a power tool that can add speed, efficiency, insight and accuracy to a researcher’s work — has many limitations.
    It’s only as good as the methods and data it has been given. On its own, it doesn’t know if information is missing, how much weight to give differing kinds of information or whether the data it draws on is incorrect or corrupted. It can’t deal precisely with uncertainty or random events — unless it learns how. Relying exclusively on data, as machine-learning models usually do, it does not leverage the knowledge experts have accumulated over years or the physical models that underpin physical and chemical phenomena. And it has been hard to teach computers to organize and integrate information from widely different sources.
    Now researchers at the University of Delaware and the University of Massachusetts-Amherst have published details of a new approach to artificial intelligence that builds uncertainty, error, physical laws, expert knowledge and missing data into its calculations and leads ultimately to much more trustworthy models. The new method provides guarantees typically lacking from AI models, showing how valuable — or not — the model can be for achieving the desired result.
    Joshua Lansford, a doctoral student in UD’s Department of Chemical and Biomolecular Engineering, and Prof. Dion Vlachos, director of UD’s Catalysis Center for Energy Innovation, are co-authors on the paper published Oct. 14 in the journal Science Advances. Also contributing were Jinchao Feng and Markos Katsoulakis of the Department of Mathematics and Statistics at the University of Massachusetts-Amherst.
    The new mathematical framework could produce greater efficiency, precision and innovation for computer models used in many fields of research. Such models provide powerful ways to analyze data, study materials and complex interactions and tweak variables in virtual ways instead of in the lab.
    “Traditionally in physical modeling, we build a model first using only our physical intuition and expert knowledge about the system,” Lansford said. “Then after that, we measure uncertainty in predictions due to error in underlying variables, often relying on brute-force methods, where we sample, then run the model and see what happens.”
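    That brute-force procedure is easy to illustrate. The sketch below uses an Arrhenius-type rate law with invented numbers as a stand-in for a real chemistry model (it is not the paper’s oxygen-reduction model): an uncertain input is sampled many times, the model is run for each sample, and the spread of the outputs is the propagated uncertainty.
    ```python
    # Brute-force uncertainty propagation: sample an uncertain parameter,
    # run the model for each draw, read off the spread of predictions.
    # The Arrhenius rate law and all numbers here are illustrative only.
    import numpy as np

    rng = np.random.default_rng(42)

    k_B = 8.617e-5          # Boltzmann constant, eV/K
    T = 300.0               # temperature, K

    # Assumed uncertain input: activation energy 0.50 +/- 0.05 eV
    E_a_samples = rng.normal(loc=0.50, scale=0.05, size=100_000)

    def rate(E_a, prefactor=1e13):
        """Arrhenius rate law: k = A * exp(-E_a / (k_B * T))."""
        return prefactor * np.exp(-E_a / (k_B * T))

    k_samples = rate(E_a_samples)

    # A modest 10% input error becomes orders of magnitude in the rate.
    print(f"median rate: {np.median(k_samples):.3e} 1/s")
    print(f"68% interval: {np.percentile(k_samples, 16):.3e} "
          f"to {np.percentile(k_samples, 84):.3e} 1/s")
    ```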
    Effective, accurate models save time and resources and point researchers to more efficient methods, new materials, greater precision and innovative approaches they might not otherwise consider.

    The paper describes how the new mathematical framework works in a chemical reaction known as the oxygen reduction reaction, but it is applicable to many kinds of modeling, Lansford said.
    “The chemistries and materials we need to make things faster or even make them possible — like fuel cells — are highly complex,” he said. “We need precision…. And if you want to make a more active catalyst, you need to have bounds on your prediction error. By intelligently deciding where to put your efforts, you can tighten the area to explore.
    “Uncertainty is accounted for in the design of our model,” Lansford said. “Now it is no longer a deterministic model. It is a probabilistic one.”
    With these new mathematical developments in place, the model itself identifies what data are needed to reduce model error, he said. Then a higher level of theory can be used to produce more accurate data or more data can be generated, leading to even smaller error boundaries on the predictions and shrinking the area to explore.
    “Those calculations are time-consuming to generate, so we’re often dealing with small datasets — 10-15 data points. That’s where the need comes in to apportion error.”
    That’s still not a money-back guarantee that using a specific substance or approach will deliver precisely the product desired. But it is much closer to a guarantee than you could get before.

    This new method of model design could greatly enhance work in renewable energy, battery technology, climate change mitigation, drug discovery, astronomy, economics, physics, chemistry and biology, to name just a few examples.
    Artificial intelligence doesn’t mean human expertise is no longer needed. Quite the opposite.
    The expert knowledge that emerges from the laboratory and the rigors of scientific inquiry is essential, foundational material for any computational model.

  • An ultrasonic projector for medicine

    A chip-based technology that generates sound profiles with high resolution and intensity could create new options for ultrasound therapy, making it more effective and easier to administer. A team of researchers led by Peer Fischer from the Max Planck Institute for Intelligent Systems and the University of Stuttgart has developed a projector that flexibly modulates three-dimensional ultrasound fields with comparatively little technical effort. Dynamic sound pressure profiles can thus be generated with higher resolution and sound pressure than current technology allows. It should soon be easier to tailor ultrasound profiles to individual patients, and new medical applications for ultrasound may even emerge.
    Ultrasound is widely used as a diagnostic tool in both medicine and materials science. It can also be used therapeutically. In the US, for example, tumours of the uterus and prostate are treated with high-power ultrasound, which destroys the cancer cells by selectively heating the diseased tissue. Researchers worldwide are also using ultrasound to combat tumours and other pathological changes in the brain. “In order to avoid damaging healthy tissue, the sound pressure profile must be precisely shaped,” explains Peer Fischer, Research Group Leader at the Max Planck Institute for Intelligent Systems and professor at the University of Stuttgart. Tailoring an intense ultrasound field to diseased tissue is especially difficult in the brain, because the skull distorts the sound waves. The Spatial Ultrasound Modulator (SUM) developed by researchers in Fischer’s group should help to remedy this situation and make ultrasound treatment more effective and easier in other cases as well. It allows the three-dimensional shape of even very intense ultrasound waves to be varied with high resolution — and with less technical effort than is currently required to modulate ultrasound profiles.
    High-intensity sound pressure profiles with 10,000 pixels
    Conventional methods vary sound fields using several individual sound sources whose waves can be superimposed and shifted against each other. However, because the individual sound sources cannot be miniaturized at will, the resolution of these sound pressure profiles is limited to 1000 pixels. The sound transmitters are then so small that the sound pressure is sufficient for diagnostic but not for therapeutic purposes. With the new technology, the researchers first generate an ultrasonic wave and then modulate its sound pressure profile separately, essentially killing two birds with one stone. “In this way, we can use much more powerful ultrasonic transducers,” explains postdoctoral fellow Kai Melde, who is part of the team that developed the SUM. “Thanks to a chip with 10,000 pixels that modulates the ultrasonic wave, we can generate a much finer-resolved profile.”
    “In order to modulate the sound pressure profile, we take advantage of the different acoustic properties of water and air,” says Zhichao Ma, a post-doctoral fellow in Fischer’s group, who was instrumental in developing the new SUM technology: “While an ultrasonic wave passes through a liquid unhindered, it is completely reflected by air bubbles.” The research team from Stuttgart thus constructed a chip the size of a thumbnail on which they can produce hydrogen bubbles by electrolysis (i.e. the splitting of water into oxygen and hydrogen with electricity) on 10,000 electrodes in a thin water film. The electrodes each have an edge length of less than a tenth of a millimetre and can be controlled individually.
    A picture show with ultrasound
    If you send an ultrasonic wave through the chip with a transducer (a kind of miniature loudspeaker), it passes through the chip unhindered. But as soon as the sound wave hits the water with the hydrogen bubbles, it continues to travel only through the liquid. Like a mask, this creates a sound pressure profile with cut-outs at the points where the air bubbles are located. To form a different sound profile, the researchers first wipe the hydrogen bubbles away from the chip and then generate gas bubbles in a new pattern.
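    The masking principle fits in a few lines. The toy sketch below (idealized, and not the group’s control software) treats the chip as a 100 × 100 grid, i.e. 10,000 pixels, in which bubble-covered electrodes reflect the incoming wave and bubble-free pixels transmit it unchanged.
    ```python
    # Toy model of the SUM chip's binary amplitude mask: bubbles reflect
    # the ultrasound (transmission 0), water transmits it (transmission 1).
    import numpy as np

    N = 100                                  # 100 x 100 = 10,000 pixels
    incident = np.ones((N, N))               # uniform incoming wavefront

    # Hypothetical bubble pattern: a bar of switched-on electrodes
    bubbles = np.zeros((N, N), dtype=bool)
    bubbles[40:60, :] = True

    transmitted = np.where(bubbles, 0.0, incident)

    # The transmitted field now carries the bubble pattern as a mask; to
    # show the next "frame", wipe the bubbles and electrolyze a new one.
    print(f"transmitted power fraction: {transmitted.mean():.2f}")
    ```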
    The researchers demonstrated how precisely and variably the new projector for ultrasound works by writing the alphabet in a kind of picture show of sound pressure profiles. To make the letters visible, they caught micro-particles in the various sound pressure profiles. Depending on the sound pattern, the particles arranged themselves into the individual letters.
    Organoid models for drug testing
    For similar images, the scientists working with Peer Fischer, Kai Melde, and Zhichao Ma had previously arranged micro-particles with sound pressure profiles, which they shaped using a slightly different technique. They used special plastic stencils to deform the pressure profile of an ultrasonic wave like a hologram and arrange small particles — as well as biological cells in a liquid — into a desired pattern. However, the plastic holograms only provided still images; for each new pattern, they had to make a different plastic template. Using the ultrasound projector, the Stuttgart team can generate a new sound profile in about 10 seconds. “With other chips, we could significantly increase the frame rate,” says Kai Melde, who led the hologram development team.
    The technique could be used not only for diagnostic and therapeutic purposes but also in biomedical laboratories, for example to arrange cells into organoid models. “Such organoids enable useful tests of active pharmaceutical ingredients and could therefore at least partially replace animal experiments,” says Fischer.

  • Detecting early-stage failure in electric power conversion devices

    Power electronics regulate and modify electric power. They are in computers, power steering systems, solar cells, and many other technologies. Researchers are seeking to enhance power electronics by using silicon carbide semiconductors; however, wear-out failures such as cracks remain problematic. Detecting damage in power electronics early, before complete failure, is needed to help researchers improve future device designs.
    In a study recently published in IEEE Transactions on Power Electronics, researchers from Osaka University monitored in real time the propagation of cracks in a silicon carbide Schottky diode during power cycling tests. The researchers used an analysis technique, known as acoustic emission, that has not previously been reported for this purpose.
    During the power cycling tests, the researchers repeatedly switched the device on and off to mimic real operation and monitored the resulting damage to the diode over time. Increasing acoustic emission corresponded to progressive damage to the aluminum ribbons affixed to the silicon carbide Schottky diode. The researchers correlated the monitored acoustic emission signals with specific stages of device damage that eventually led to failure.
    “A transducer converts acoustic emission signals during power cycling tests to an electrical output that can be measured,” explains lead author ChanYang Choe. “We observed burst-type waveforms, which are consistent with fatigue cracking in the device.”
    The traditional method of checking whether a power device is damaged is to monitor anomalous increases in the forward voltage during power cycling tests. Using the traditional method, the researchers found that there was an abrupt increase in the forward voltage, but only when the device was near complete failure. In contrast, acoustic emission counts were much more sensitive. Instead of an all-or-none response, there were clear trends in the acoustic emission counts during power cycling tests.
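    The contrast between the two signals can be mimicked with synthetic data. The numbers below are invented for illustration and are not the Osaka team’s measurements; they only reproduce the qualitative behaviour described: acoustic-emission bursts that accumulate steadily as cracks grow, versus a forward voltage that stays flat until just before failure.
    ```python
    # Synthetic comparison of the two failure indicators during power
    # cycling (illustrative numbers only, not measured data).
    import numpy as np

    rng = np.random.default_rng(7)
    n_cycles = 1000

    # AE bursts: rare at first, ramping up as cracks propagate
    burst_rate = 0.02 + 0.3 * (np.arange(n_cycles) / n_cycles) ** 2
    ae_counts = rng.poisson(burst_rate).cumsum()

    # Forward voltage: flat until an abrupt rise near complete failure
    v_forward = 1.50 + 0.40 * (np.arange(n_cycles) > 950)

    for cycle in (100, 500, 900, 990):
        print(f"cycle {cycle:4d}: cumulative AE counts = "
              f"{ae_counts[cycle]:4d}, forward voltage = "
              f"{v_forward[cycle]:.2f} V")
    # The AE trend is visible hundreds of cycles before the voltage moves.
    ```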
    “Unlike forward voltage plots, acoustic emission plots indicate all three stages of crack development,” says senior author Chuantong Chen. “We detected crack initiation, crack propagation, and device failure, and confirmed our interpretations by microscopic imaging.”
    To date, there has been no sensitive early-warning method for detecting the fatigue cracks that lead to complete failure in silicon carbide Schottky diodes. Acoustic emission monitoring, as reported here, is such a method. In the future, this development will help researchers determine why silicon carbide devices fail and improve future designs in both common and advanced technologies.

    Story Source:
    Materials provided by Osaka University. Note: Content may be edited for style and length.

  • Physicist joins international effort to unveil the behavior of 'strange metals'

    Landau’s Fermi liquid (FL) theory, established in the first half of the 20th century, is the foundation of the scientific and industrial use of metallic materials in our society and the basis of our current understanding of metals. However, in the second half of the 20th century, more and more metallic materials were discovered that behave very differently. The non-Fermi liquid (NFL) behaviour of these “strange metals” remains a puzzle to physicists, and there is no established theory to explain them.
    Recently, a joint research team comprising Dr Zi Yang MENG, Associate Professor of the Department of Physics at the University of Hong Kong (HKU); Dr Avraham KLEIN and Professor Andrey CHUBUKOV from the University of Minnesota; Dr Kai SUN, Associate Professor at the University of Michigan; and Dr Xiao Yan XU from the University of California, San Diego, has solved the puzzle of NFL behaviour in interacting electron systems and provided a protocol for establishing new paradigms of quantum metals through quantum many-body computation and analytical calculations. The findings have recently been published in npj Quantum Materials. The work was supported by the Research Grants Council of HKSAR and the Ministry of Science and Technology of China.
    Groundbreaking discoveries of mysterious NFL behaviour
    Landau’s Fermi liquid theory successfully explained many features of simple metals like copper, silver, gold and iron: as the temperature changes, their resistivity, heat capacity and other properties follow simple functional forms of the temperature T (for example, resistivity follows ρ ~ T² and heat capacity follows C ~ T, independent of material details). The success of the Fermi liquid theory lies in the central assumption that the electrons, the droplets of the Fermi liquid, do not interact with each other but behave identically in the material.
    However, many metallic materials discovered after FL theory was established do NOT behave as Fermi liquids. For example, in the so-called high-temperature superconductor compounds — copper oxides and iron pnictides — the resistivity is linear in temperature, ρ ~ T, before the system becomes superconducting (the resistivity is then zero); such systems are generally dubbed non-Fermi liquids (NFL). Unlike in a simple FL, the electrons of an NFL, the droplets, interact strongly with each other.
    NFLs have potential application in solving the energy crisis
    Physicists still do not have much of a clue about NFLs, which makes it very difficult to make concrete predictions. Still, these systems are essential for the continued prosperity of human society, as NFLs hold the key to making use of the high-temperature superconducting materials that could help solve the energy crisis. Currently, the so-called high-temperature superconducting materials still only work at a temperature scale of -100 degrees Celsius — they are called high-temperature in comparison with the FL superconductors, which work at a temperature scale of -200 degrees Celsius — so it is still hard to put high-temperature superconductors into daily use. Only at room temperature could we enjoy the key property of such materials: electrical power would not be lost as heat, because there is no resistivity. And only when we understand how the NFL in a high-temperature superconductor works at -100 degrees Celsius can we design the ultimate material that works at room temperature. A complete understanding of NFLs is therefore of vital importance.

    Physicists from the analytical side have been trying to understand NFLs for about half a century. The problem with analytical calculations is that, due to the quantum many-body nature of the NFL, the convergence and accuracy of many theoretical predictions cannot be controlled or guaranteed; one would need unbiased quantum computation to verify these propositions.
    Computation holds the key to the puzzle
    On the numerical front, there have been many previous attempts, but the results obtained have always differed from the analytical predictions. For example, the most important quantity of the NFL, the self-energy Σ, which describes the strength of the electron interactions in the material, is expected to have a power-law frequency dependence, Σ ~ ω^(2/3). However, the computed self-energy does not follow such a power law; it shows a slowly diverging behaviour. That is, the computed self-energy does not go to zero as the frequency is reduced, but instead gets larger and larger. This discrepancy made the situation even more perplexing.
    After a very inspiring discussion between Dr Meng, Professor Chubukov and Dr Klein, they realized that the setting of the numerical simulation is actually different from that of the analytical calculation. The subtlety comes from the fact that the model simulations are performed on finite systems at finite temperature, T ≠ 0, whereas the analytical expectations hold strictly at zero temperature, T = 0. In other words, the numerical data contain both the zero-temperature NFL contribution and a contribution from fluctuations at finite temperature. To reveal the NFL behaviour from lattice model simulations in such a setting, one needs to subtract the finite-temperature contribution.
    This turned out to be the key to the puzzle of the NFL. Dr Klein, Dr Sun and Professor Chubukov derived the analytical form of the finite-temperature contribution (with input from the lattice model designed by Dr Meng and Dr Xu), which Dr Meng and Dr Xu then subtracted from the numerical data. To everyone’s surprise and delight, the results after the subtraction perfectly exhibit the expected NFL behaviour: from finite temperature all the way down to zero temperature, the power law is revealed. It is the first time that such clear NFL behaviour has been obtained from unbiased numerical simulation.
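    The subtraction step can be illustrated schematically. In the sketch below the thermal term is invented for the demonstration (the paper derives the true analytical form); the point is only that removing the known finite-temperature contribution from the simulated self-energy uncovers the underlying Σ ~ ω^(2/3) power law.
    ```python
    # Schematic demo: subtracting a finite-temperature contribution from
    # the "measured" self-energy reveals the hidden NFL power law. The
    # thermal term below is made up for illustration.
    import numpy as np

    T = 0.05                                   # temperature, arbitrary units
    omega = np.linspace(0.02, 1.0, 50)         # Matsubara-like frequencies

    sigma_nfl = 0.8 * omega ** (2.0 / 3.0)     # hidden NFL power law
    sigma_thermal = 0.3 * T / omega            # invented thermal term that
                                               # grows as frequency drops
    sigma_measured = sigma_nfl + sigma_thermal # what a simulation "sees"

    # Subtract the (analytically known) thermal piece, then fit the slope
    sigma_corrected = sigma_measured - sigma_thermal
    slope, _ = np.polyfit(np.log(omega), np.log(sigma_corrected), 1)
    print(f"fitted exponent after subtraction: {slope:.3f} (expect 2/3)")
    ```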
    Bringing a better future to society
    Dr Meng said the work is expected to inspire many follow-up theoretical and experimental studies, and promising results further identifying NFL behaviour in another model system have in fact already been obtained. He said: “This research work resolves the decades-old puzzle of the non-Fermi liquid and paves the way for establishing new paradigms of quantum metals beyond those set down more than half a century ago. Eventually, we will be able to understand NFL materials such as high-temperature superconductors the way we understand simple metals such as copper and silver now, and such new understanding will help solve the energy crisis and bring better industrial and personal applications to society.”

  • Quantum engines with entanglement as fuel?

    To make a car run, its engine burns gasoline and converts the heat of the combusting gasoline into mechanical work. In the process, however, energy is wasted; a typical car only converts around 25 percent of the energy in gasoline into useful energy to make it run.
    Engines that run with 100 percent efficiency are still more science fiction than science fact, but new research from the University of Rochester may bring scientists one step closer to demonstrating an ideal transfer of energy within a system.
    Andrew Jordan, a professor of physics at Rochester, was recently awarded a three-year, $1 million grant from the Templeton Foundation to research quantum measurement engines — engines that use the principles of quantum mechanics to run with 100 percent efficiency. The research, to be carried out with co-principal investigators in France and at Washington University in St. Louis, could answer important questions about the laws of thermodynamics in quantum systems and contribute to technologies such as more efficient engines and quantum computers.
    “The grant deals with several Big Questions about our natural world,” Jordan says.
    PHYSICS AT A SMALL LEVEL
    The researchers have previously described the concept of quantum measurement engines, but the theory has never been demonstrated experimentally.

    In the microscopic quantum world, particles exhibit unique properties that do not align with the classical laws of physics as we know them. Jordan and his colleagues will use superconducting circuits to design experiments that can be carried out within a realistic quantum system. Through these experiments, the researchers will study how the laws of energy, work, power, efficiency, heat, and entropy function at the quantum level. These concepts are currently poorly understood in quantum mechanics.
    MICROSCOPIC POWER TASKS
    Quantum measurement engines may work in microscopic environments for very small power tasks such as moving around an atom or charging a miniaturized circuit. In these capacities, they may be important components for quantum computers.
    This type of engine couldn’t currently be used to power a car, however; the power of a quantum measurement engine is measured in picowatts, and one picowatt is just one trillionth (a millionth of a millionth) of a watt. For comparison, a typical incandescent lightbulb draws about 60 watts of power.
    “The power scales involved — numbers like picowatts — indicate the large gap between our human interests and these tiny engines,” Jordan says.
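    A back-of-envelope calculation makes that gap explicit, using only the one-picowatt engine scale and a 60-watt bulb:
    ```python
    # How many picowatt-scale engines would match a 60-watt lightbulb?
    bulb_power = 60.0        # watts
    engine_power = 1e-12     # one picowatt, in watts

    engines_needed = bulb_power / engine_power
    print(f"{engines_needed:.0e} engines")   # 6e+13: tens of trillions
    ```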

    One way to make quantum measurement engines for human-scale activities may be “through massive parallelization,” Jordan says. “Each device only outputs a tiny amount of energy, but by making billions of them working together, you could make a macroscopic engine from the ground up.”
    A NEW TYPE OF FUEL
    Jordan and his team will also investigate another major area of research: how it might be possible to extract work from a system using entanglement as a fuel. In entanglement — one of the basic concepts of quantum physics — the properties of one particle are interlinked with the properties of another, even when the particles are separated by a large distance. Using entanglement as a fuel has the possibly revolutionary feature of enabling a non-local engine; half of an engine could be in New York, while the other half could be in California. The energy would not be held by either half of the system, yet the two parts could still share energy to fuel both halves efficiently.
    “We will show that the engine can, in principle, be perfectly efficient,” Jordan says. “That is, there would be an ideal transfer of energy from the measurement apparatus to the quantum system.”
    The foundation award reflects the significance of quantum technology as a national and international priority, and Rochester’s key role in the enterprise. The project itself builds on Rochester’s robust history of research in optics and physics and current efforts to better unravel the mysteries of quantum mechanics.
    “The University of Rochester has an existing strength in quantum physics, and indeed was the birthplace of the field of quantum optics,” Jordan says. “We have a good collection of quality researchers in place, a historical legacy of quantum physics, and ongoing University support of quantum physics.”

    Story Source:
    Materials provided by University of Rochester. Original written by Lindsey Valich. Note: Content may be edited for style and length.

  • Molecular design strategy reveals near infrared-absorbing hydrocarbon

    Nagoya University researchers have synthesized a unique molecule with a surprising property: it can absorb near infrared light. The molecule is made only of hydrogen and carbon atoms and offers insights for making organic conductors and batteries. The details were published in the journal Nature Communications.
    Organic chemist Hiroshi Shinokubo and physical organic chemist Norihito Fukui of Nagoya University work on designing new, interesting molecules using organic, or carbon-containing, compounds. In the lab, they synthesized an aromatic compound called methoxy-substituted as-indacenoterrylene. This molecule has a unique structure, as its methoxy groups are located internally rather than at its periphery.
    “Initially, we wanted to see if this hydrocarbon demonstrated novel phenomena due to its unique structure,” says Fukui.
    But during their investigations, the researchers discovered they could convert it into a new bowl-shaped hydrocarbon called as-indacenoterrylene.
    “We were surprised to find that this new molecule exhibits near infrared absorption up to 1300 nanometers,” Shinokubo explains.
    What’s unique about as-indacenoterrylene is not that it absorbs near infrared light. Other hydrocarbons can do this as well. as-indacenoterrylene is interesting because it does this despite being made of only 34 carbon and 14 hydrogen atoms, without containing other kinds of stabilizing atoms at its periphery.
    When the scientists conducted electrochemical measurements, theoretical calculations, and other tests, they found that as-indacenoterrylene was intriguingly stable and also had a remarkably narrow gap between its highest occupied molecular orbital (HOMO) and its lowest unoccupied molecular orbital (LUMO). This means that the molecule has two electronically different subunits, one that donates and another that withdraws electrons. The narrow HOMO-LUMO gap makes it easier for electrons to become excited within the molecule.
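    For reference, the reported absorption onset translates directly into an energy gap via E = hc/λ; at 1300 nanometers this comes to roughly 0.95 electron volts, which is unusually narrow for a pure hydrocarbon.
    ```python
    # Convert the 1300 nm absorption onset into the optical gap it implies.
    h = 4.135667696e-15      # Planck constant, eV*s
    c = 2.99792458e17        # speed of light, nm/s

    wavelength_nm = 1300.0
    gap_eV = h * c / wavelength_nm           # E = h*c / lambda
    print(f"optical gap ~ {gap_eV:.2f} eV")  # about 0.95 eV
    ```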
    “The study offers an effective guideline for the design of hydrocarbons with a narrow HOMO-LUMO gap, which is to fabricate molecules with coexisting electron-donating and electron-withdrawing subunits,” says Fukui. “These molecules will be useful for the development of next-generation solid-state materials, such as organic conductors and organic batteries.”
    The team next plans to synthesize other near infrared-absorbing aromatic hydrocarbons based on the design concepts garnered in this current study.

    Story Source:
    Materials provided by Nagoya University. Note: Content may be edited for style and length.