More stories

  • Tapping hidden visual information: An all-in-one detector for thousands of colors

    Spectrometers are widely used throughout industry and research to detect and analyse light. They measure the spectrum of light, that is, its intensity at different wavelengths, like the colours in a rainbow, and are an essential tool for identifying and analysing specimens and materials. Integrated on-chip spectrometers would be of great benefit to a variety of technologies, including quality inspection platforms, security sensors, biomedical analysers, healthcare systems, environmental monitoring tools, and space telescopes.
    An international research team led by researchers at Aalto University has developed high-sensitivity spectrometers with high wavelength accuracy, high spectral resolution, and broad operation bandwidth, using only a single microchip-sized detector. The research behind this new ultra-miniaturised spectrometer was published today in the journal Science.
    ‘Our single-detector spectrometer is an all-in-one device. We designed this optoelectronic-lab-on-a-chip with artificial intelligence replacing conventional hardware, such as optical and mechanical components. Therefore, our computational spectrometer does not require separate bulky components or array designs to disperse and filter light. It can achieve a high resolution comparable to benchtop systems but in a much smaller package,’ says Postdoctoral Researcher Hoon Hahn Yoon.
    ‘With our spectrometer, we can measure light intensity at each wavelength beyond the visible spectrum using a device at our fingertips. The device is entirely electrically controllable, so it has enormous potential for scalability and integration. Integrating it directly into portable devices such as smartphones and drones could advance our daily lives. Imagine that the next generation of our smartphone cameras could be fitted with hyperspectral cameras that outperform colour cameras,’ he adds.
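    The computational reconstruction that a single-detector spectrometer of this kind relies on can be illustrated with a toy inverse problem: if the detector’s photoresponse under each electrical setting corresponds to a known spectral responsivity curve, the unknown spectrum can be recovered from the vector of photocurrent readings by regularized least squares. The sketch below is a generic illustration of that idea with invented Gaussian responsivities and synthetic data; it is not the Aalto team’s device model or algorithm, which the article says relies on artificial intelligence rather than conventional optics.
    ```python
    # Toy computational-spectrometer reconstruction (illustrative only).
    # Assumption: the detector is read out under M electrical settings, each with
    # a known spectral responsivity R[m, :] over N wavelength bins.
    import numpy as np

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(400, 900, 200)      # nm, N = 200 bins
    M = 64                                        # number of detector settings

    # Hypothetical responsivities: broad, overlapping Gaussian bands.
    centers = np.linspace(420, 880, M)
    R = np.exp(-0.5 * ((wavelengths - centers[:, None]) / 60.0) ** 2)

    # A "true" test spectrum with two emission peaks.
    s_true = (np.exp(-0.5 * ((wavelengths - 550) / 8) ** 2)
              + 0.6 * np.exp(-0.5 * ((wavelengths - 720) / 15) ** 2))

    # Measured photocurrents = responsivity matrix times spectrum, plus noise.
    y = R @ s_true + 0.01 * rng.standard_normal(M)

    # Tikhonov-regularized least squares: argmin ||R s - y||^2 + lam ||s||^2.
    lam = 1e-2
    s_hat = np.linalg.solve(R.T @ R + lam * np.eye(len(wavelengths)), R.T @ y)

    print("relative reconstruction error:",
          np.linalg.norm(s_hat - s_true) / np.linalg.norm(s_true))
    ```
    In a real device, the responsivity matrix would come from calibration measurements, and the plain linear inversion could be replaced by a learned reconstruction model.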
    Shrinking computational spectrometers is essential for their use in chips and implantable applications. Professor Zhipei Sun, the head of the research team, says, ‘Conventional spectrometers are bulky because they need optical and mechanical components, so their on-chip applications are limited. There is an emerging demand in this field to improve the performance and usability of spectrometers. From this point of view, miniaturised spectrometers are very important to offer high performance and new functions in all fields of science and industry.’
    Professor Pertti Hakonen adds that ‘Finland and Aalto have invested in photonics research in recent years. For example, there has been great support from the Academy of Finland’s Centre of Excellence on quantum technology, Flagship on Photonics Research and Innovation, InstituteQ, and the Otanano Infrastructure. Our new spectrometer is a clear demonstration of the success of these collaborative efforts. I believe that with further improvements in resolution and efficiency, these spectrometers could provide new tools for quantum information processing.’
    In addition to Postdoctoral Researcher Hoon Hahn Yoon and Professors Zhipei Sun and Pertti Hakonen, the key Aalto members linked to the work included Postdoctoral Researchers Henry A. Fernandez and Faisal Ahmed, Doctoral Researchers Fedor Nigmatulin, Xiaoqi Cui, and Md Gius Uddin, and Professor Harri Lipsanen. Professor Ethan D. Minot, from Oregon State University, joined this work as a visiting scholar at Aalto University for one year. The international research team led by Aalto University also included Professors Weiwei Cai (Shanghai Jiao Tong University), Zongyin Yang (Zhejiang University), Hanxiao Cui (Sichuan University), Kwanpyo Kim (Yonsei University), and Tawfique Hasan (University of Cambridge).
    Story Source:
    Materials provided by Aalto University. Note: Content may be edited for style and length.

  • New computing architecture: Deep learning with light

    Ask a smart home device for the weather forecast, and it takes several seconds for the device to respond. One reason for this latency is that connected devices don’t have enough memory or power to store and run the enormous machine-learning models needed for the device to understand what a user is asking of it. The model is instead stored in a data center that may be hundreds of miles away, where the answer is computed and sent to the device.
    MIT researchers have created a new method for computing directly on these devices, which drastically reduces this latency. Their technique shifts the memory-intensive steps of running a machine-learning model to a central server where components of the model are encoded onto light waves.
    The waves are transmitted to a connected device using fiber optics, which enables tons of data to be sent lightning-fast through a network. The receiver then employs a simple optical device that rapidly performs computations using the parts of a model carried by those light waves.
    This technique leads to more than a hundredfold improvement in energy efficiency when compared to other methods. It could also improve security, since a user’s data do not need to be transferred to a central location for computation.
    This method could enable a self-driving car to make decisions in real time while using just a tiny percentage of the energy currently required by power-hungry computers. It could also allow a user to have a latency-free conversation with their smart home device, be used for live video processing over cellular networks, or even enable high-speed image classification on a spacecraft millions of miles from Earth.
    “Every time you want to run a neural network, you have to run the program, and how fast you can run the program depends on how fast you can pipe the program in from memory. Our pipe is massive — it corresponds to sending a full feature-length movie over the internet every millisecond or so. That is how fast data comes into our system. And it can compute as fast as that,” says senior author Dirk Englund, an associate professor in the Department of Electrical Engineering and Computer Science (EECS) and member of the MIT Research Laboratory of Electronics.
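    The division of labour described here, with the full model held centrally and a thin client doing only the arithmetic as the weights stream past, can be sketched in ordinary software. The toy example below streams one layer at a time to a client that never stores more than a single layer; it is a conceptual analogue only (the reported system encodes the weights onto light and performs the multiply-accumulate operations optically), and every name in it is invented.
    ```python
    # Conceptual analogue of "stream the model, compute locally" (illustrative only).
    # A server holds the full network; the client holds only the current layer.
    import numpy as np

    rng = np.random.default_rng(1)

    def server_layers(layer_sizes):
        """Yield one weight matrix and bias at a time, standing in for
        model components arriving over an optical link."""
        for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
            W = rng.standard_normal((n_out, n_in)) / np.sqrt(n_in)
            b = np.zeros(n_out)
            yield W, b

    def thin_client_forward(x, layer_stream):
        """Run inference while holding only one layer's weights at a time."""
        for W, b in layer_stream:
            x = np.maximum(W @ x + b, 0.0)   # multiply-accumulate + ReLU (every layer, for simplicity)
        return x

    sizes = [784, 256, 256, 10]
    x = rng.standard_normal(784)             # a dummy input "sensor reading"
    y = thin_client_forward(x, server_layers(sizes))
    print("output scores:", np.round(y, 2))
    ```
    In the optical scheme described above, the stream arrives at fibre-optic rates and the multiply-accumulate step happens in the analogue optical domain rather than in a digital processor, which is where the reported speed and energy gains come from.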

  • A drop in the sea of electrons

    Recent Australian-led research has provided the world’s first measurement of interactions between Fermi polarons in an atomically thin 2D semiconductor, using ultrafast spectroscopy capable of probing complex quantum materials.
    Researchers at Swinburne University of Technology found the signatures of interactions between exciton-polarons in experiments on the 2D semiconductor monolayer tungsten-disulfide.
    FLEET collaborators at Monash University and RMIT developed a theoretical model to explain the experimental signals. They found that repulsive interactions at long range are mediated by a phase-space filling effect, while attractive interactions at short range lead to the formation of a cooperatively bound exciton-exciton-electron state.
    The Material
    Tungsten-disulfide (WS2) belongs to the family of semiconducting transition metal dichalcogenides (TMDCs). When the bulk material is exfoliated down to a single atomic monolayer (less than 1 nanometre thick), the physics of these 2D materials becomes particularly interesting and controllable.
    Much of the intriguing physics is described by the creation and interactions of quasiparticles. Excitons are one such quasiparticle, and they dominate the optical response of monolayer WS2. Excitons are formed when electrons from the valence band are excited into the conduction band. The vacancy left behind (a hole) can then bind to the excited electron through Coulomb forces, forming the exciton.

  • How can flying insects and drones tell up from down?

    While drones typically use accelerometers to estimate the direction of gravity, the way flying insects achieve this has been shrouded in mystery until now, as they have no specific sense of acceleration. In this study, a European team of scientists[1] led by the Delft University of Technology in the Netherlands and involving a CNRS researcher has shown that drones can assess the direction of gravity by combining visual motion detection with motion modelling.
    To develop this new principle, the scientists investigated optical flow, that is, how an individual perceives movement relative to its environment: the visual motion that sweeps across our retina when we move. For example, when we are on a train, trees next to the tracks pass by faster than distant mountains. Optical flow alone is not enough for an insect to know the direction of gravity.
    However, the research team discovered that this direction can be found by combining optical flow with a model of one’s own movement, i.e. a prediction of how one will move. The conclusions of the article show that with this principle it was possible to find the direction of gravity in almost all situations, except in a few rare and specific cases, such as when the subject was completely immobile.
    During such perfectly stationary flight, the inability to find the direction of gravity briefly destabilizes the drone and therefore sets it in motion, allowing it to recover the direction of gravity at the next instant. These movements generate slight oscillations reminiscent of insect flight.
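    The combination of visual motion and a motion model described above can be caricatured in a small 2D example. The sketch below assumes that flow-derived velocity estimates in the body frame are already available and that the thrust-to-mass ratio is known; the single unknown tilt angle, and with it the direction of gravity in the body frame, is then fitted from the mismatch between measured and model-predicted accelerations. All quantities are invented, and this is not the estimator used in the study.
    ```python
    # Toy 2D illustration: recovering the gravity direction by combining
    # flow-derived accelerations with a simple thrust (motion) model.
    import numpy as np

    rng = np.random.default_rng(2)
    g = 9.81                     # gravity magnitude, m/s^2
    T = 11.0                     # known thrust-to-mass ratio along the body axis, m/s^2
    theta_true = np.deg2rad(12)  # unknown tilt of the body axis from vertical

    # In the body frame (ignoring rotation in this toy), the specific acceleration is
    #   a = [0, T] + g_body,  with  g_body = (-g*sin(theta), -g*cos(theta)).
    a_true = np.array([-g * np.sin(theta_true), T - g * np.cos(theta_true)])

    # "Measurements": accelerations differenced from noisy flow-derived velocities.
    samples = a_true + 0.5 * rng.standard_normal((200, 2))

    # Fit the single unknown angle from the averaged measurements.
    a_mean = samples.mean(axis=0)
    theta_hat = np.arctan2(-a_mean[0], T - a_mean[1])
    g_body_hat = np.array([-np.sin(theta_hat), -np.cos(theta_hat)])  # unit gravity direction

    print(f"true tilt {np.rad2deg(theta_true):.1f} deg, estimated {np.rad2deg(theta_hat):.1f} deg")
    print("estimated gravity direction in body frame:", np.round(g_body_hat, 3))
    ```
    A real implementation would also have to extract velocities from the raw optical flow and handle rotation, which this toy deliberately ignores.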
    Using this new principle in robotics could meet a major challenge that nature has also faced: how to obtain a fully autonomous system while limiting payload. Future drone prototypes could be made lighter by omitting accelerometers, which is very promising for the smallest, insect-sized models.
    Though this theory may explain how flying insects determine gravity, we still need confirmation that they actually use this mechanism. Specific new biological experiments are needed to prove the existence of these neural processes that are difficult to observe in flight. This publication shows how the synergy between robotics and biology can lead to technological advances and new biological research avenues.
    Notes
    [1] This research results from a European collaboration between two laboratories: the Micro Air Vehicle Laboratory of the Faculty of Aerospace Engineering at the Delft University of Technology in the Netherlands and the Institut des Sciences du Mouvement (CNRS/Aix Marseille Université) in France.
    Story Source:
    Materials provided by CNRS. Note: Content may be edited for style and length.

  • Looking to move to a galaxy far, far away? Innovative system evaluates habitability of distant planets

    The climate crisis presents a huge challenge to all people on Earth. It has led many scientists to look for exoplanets, planets outside our solar system that humans could potentially settle.
    The James Webb Space Telescope was developed as part of this search to provide detailed observational data about Earth-like exoplanets in the coming years. A new project, led by Dr. Assaf Hochman at the Fredy & Nadine Herrmann Institute of Earth Sciences at the Hebrew University of Jerusalem (HU), in collaboration with Dr. Paolo De Luca at the Barcelona Supercomputing Center and Dr. Thaddeus D. Komacek at the University of Maryland, has successfully developed a framework to study the atmospheres of distant planets and locate those planets fit for human habitation, without having to visit them physically. Their joint research study was published in the Astrophysical Journal.
    Classifying climate conditions and measuring climate sensitivity are central elements when assessing the viability of exoplanets as potential candidates for human habitation. In the current study, the research team examined TRAPPIST-1e, a planet located some 40 light years from the Earth and scheduled to be documented by the James Webb Space Telescope in the coming year. The researchers looked at the sensitivity of the planet’s climate to increases in greenhouse gases and compared it with conditions on Earth. Using a computerized simulation of the climate on TRAPPIST-1e, they could assess the impact of changes in greenhouse gas concentration.
    The study focused on the effect of an increase in carbon dioxide on extreme weather conditions, and on the rate of changes in weather on the planet. “These two variables are crucial for the existence of life on other planets, and they are now being studied in depth for the first time in history,” explained Hochman.
    According to the research team, studying the climate variability of Earth-like exoplanets provides a better understanding of the climate changes we are currently experiencing on Earth. Additionally, this kind of research offers a new understanding of how planet Earth’s atmosphere might change in the future.
    Hochman and his research partners found that planet TRAPPIST-1e has a significantly more sensitive atmosphere than planet Earth. They estimate that an increase in greenhouse gases there could lead to more extreme climate changes than we would experience here on Earth, because one side of TRAPPIST-1e constantly faces its own sun, in the same way that our moon always has one side facing the Earth.
    As Hochman concluded, “the research framework we developed, along with observational data from the Webb Space Telescope, will enable scientists to efficiently assess the atmospheres of many other planets without having to send a space crew to visit them physically. This will help us make informed decisions in the future about which planets are good candidates for human settlement and perhaps even to find life on those planets.”
    Story Source:
    Materials provided by The Hebrew University of Jerusalem. Note: Content may be edited for style and length.

  • High entropy alloys: Structural disorder and magnetic properties

    High entropy alloys, or HEAs, consist of five or more different metallic elements and are an extremely interesting class of materials with a great diversity of potential applications. Since their macroscopic properties depend strongly on interatomic interactions, it is particularly interesting to probe the local structure and structural disorder around each individual element with element-specific techniques. Now, a team has examined a so-called Cantor alloy, a model system for studying high-entropy effects on the local and macroscopic scales.
    A toolbox at BESSY II
    To investigate the local environment of the individual components, the team used multi-edge extended X-ray absorption fine structure (EXAFS) spectroscopy at BESSY II and then the reverse Monte Carlo method to analyse the collected data. The magnetic properties of each element of the alloy were additionally probed using the X-ray magnetic circular dichroism (XMCD) technique. By conventional magnetometry, the scientists confirmed the presence of magnetic phase transitions and found signatures of a complex magnetic ordering with a coexistence of different magnetic phases.
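    Reverse Monte Carlo refinement, in general, works by proposing small random displacements of atoms in a model structure and accepting or rejecting each move according to whether it improves the agreement with the measured data. The sketch below is a heavily simplified, generic version of that loop which fits a pair-distance histogram instead of actual EXAFS spectra; it conveys only the shape of the algorithm and is not the analysis performed at BESSY II.
    ```python
    # Generic reverse Monte Carlo sketch (illustrative only): move atoms at random
    # and accept moves that bring a computed pair-distance histogram closer to a target.
    import numpy as np

    rng = np.random.default_rng(3)
    N, L = 64, 10.0                          # atoms in a cubic box of side L (arbitrary units)
    bins = np.linspace(0.5, 5.0, 40)

    def pair_histogram(pos):
        d = pos[:, None, :] - pos[None, :, :]
        d -= L * np.round(d / L)             # minimum-image convention
        r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(N, k=1)]
        h, _ = np.histogram(r, bins=bins)
        return h / h.sum()

    # "Experimental" target: histogram of a hidden reference configuration.
    target = pair_histogram(rng.uniform(0, L, (N, 3)))

    pos = rng.uniform(0, L, (N, 3))          # starting model
    chi2 = ((pair_histogram(pos) - target) ** 2).sum()
    sigma2 = 1e-5                            # tolerance controlling acceptance

    for step in range(5000):
        i = rng.integers(N)
        trial = pos.copy()
        trial[i] = (trial[i] + 0.2 * rng.standard_normal(3)) % L
        chi2_new = ((pair_histogram(trial) - target) ** 2).sum()
        # Metropolis-style acceptance on the misfit.
        if chi2_new < chi2 or rng.random() < np.exp((chi2 - chi2_new) / sigma2):
            pos, chi2 = trial, chi2_new

    print("final misfit:", chi2)
    ```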
    Common trends in bulk and nanofilm samples
    The nanocrystalline film made of this alloy shows some trends in common with a bulk sample, for example the largest lattice relaxations for chromium and the intriguing magnetic behaviour of manganese, which are consistent with the macroscopic magnetic behaviour of the film.
    “High-entropy alloys are an extremely diverse and exciting class of materials,” says Dr. Alevtina Smekhova, physicist at HZB and first author of the paper. “By probing the behaviour of individual components at the atomic scale, we can gain valuable clues for the further development of new complex systems with the desired multifunctionality,” she says.

  • Advances in water-splitting catalysts

    Creating a hydrogen economy is no small task, but Rice University engineers have discovered a method that could make oxygen evolution catalysis in acids, one of the most challenging topics in water electrolysis for producing clean hydrogen fuels, more economical and practical.
    The lab of chemical and biomolecular engineer Haotian Wang at Rice’s George R. Brown School of Engineering has replaced rare and expensive iridium with ruthenium, a far more abundant precious metal, as the positive-electrode catalyst in a reactor that splits water into hydrogen and oxygen.
    The lab’s successful addition of nickel to ruthenium dioxide (RuO2) resulted in a robust anode catalyst that produced hydrogen from water electrolysis for thousands of hours under ambient conditions.
    “There’s huge industry interest in clean hydrogen,” Wang said. “It’s an important energy carrier and also important for chemical fabrication, but its current production contributes a significant portion of carbon emissions in the chemical manufacturing sector globally. We want to produce it in a more sustainable way, and water-splitting using clean electricity is widely recognized as the most promising option.”
    Iridium costs roughly eight times more than ruthenium, he said, and it could account for 20% to 40% of the expense in commercial device manufacturing, especially in future large-scale deployments.
    The process developed by Wang, Rice postdoctoral associate Zhen-Yu Wu and graduate student Feng-Yang Chen, and colleagues at the University of Pittsburgh and the University of Virginia is detailed in Nature Materials.
    Water splitting involves the oxygen and hydrogen evolution reactions by which polarized catalysts rearrange water molecules to release oxygen and hydrogen. “Hydrogen is produced by the cathode, which is a negative electrode,” Wu said. “At the same time, it has to balance the charge by oxidizing water to generate oxygen on the anode side.”
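    For reference, the two half-reactions Wu describes can be written out for acidic electrolyte; the anode-side oxygen evolution reaction is the sluggish step that the iridium or ruthenium catalyst has to accelerate:
    ```latex
    % Water-splitting half-reactions in acid (standard textbook chemistry)
    \begin{align*}
    \text{Cathode (hydrogen evolution):}\quad & 4\,\mathrm{H^{+}} + 4\,e^{-} \longrightarrow 2\,\mathrm{H_{2}} \\
    \text{Anode (oxygen evolution):}\quad & 2\,\mathrm{H_{2}O} \longrightarrow \mathrm{O_{2}} + 4\,\mathrm{H^{+}} + 4\,e^{-} \\
    \text{Overall:}\quad & 2\,\mathrm{H_{2}O} \longrightarrow 2\,\mathrm{H_{2}} + \mathrm{O_{2}}
    \end{align*}
    ```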

  • Smartphone data can help create global vegetation maps

    Gaps in our knowledge of the global distribution of plant traits could be filled with data from species-identification apps. Researchers from Leipzig University, the German Centre for Integrative Biodiversity Research (iDiv) and other institutions were able to demonstrate this using data from the popular iNaturalist app. Supplemented with data on plant traits, iNaturalist observations yield considerably more precise maps than previous approaches based on extrapolation from limited databases. Among other things, the new maps provide an improved basis for understanding plant-environment interactions and for Earth system modelling. The study has been published in the journal Nature Ecology & Evolution.
    Nature and climate are mutually dependent. Plant growth depends strongly on climate, but climate is, in turn, strongly influenced by plants: a forest, for example, evaporates large amounts of water. Accurate predictions about how the living world may develop require extensive knowledge of vegetation characteristics at different locations, for example leaf surface size, tissue properties and plant height. However, such data usually have to be recorded manually by professional scientists in a painstaking, time-consuming process. Consequently, the available worldwide plant trait data are very sparse and cover only certain regions.
    The TRY database, managed by iDiv and the Max Planck Institute for Biogeochemistry in Jena, currently provides such trait data for almost 280,000 plant species, making it one of the most comprehensive plant trait databases in the world. Up to now, global maps of plant traits have been created by extrapolating (estimating beyond the original observation range) from this geographically limited database, but the resulting maps are not particularly reliable.
    In order to fill large data gaps, the Leipzig researchers have now taken a different approach. Instead of extrapolating existing trait data geographically from the TRY database, they have linked it to the vast dataset from the citizen science project iNaturalist.
    With iNaturalist, users of the associated smartphone app share their observations of nature, providing species names, photos and geolocation. In this way, more than 19 million data points have been recorded, worldwide, for terrestrial plants alone. The data also feeds the world’s largest biodiversity database, the Global Biodiversity Information Facility (GBIF). This is accessible to the public and also serves as an important database for biodiversity research.
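    In software terms, the core of this approach is a join-and-aggregate step: each geolocated observation is matched to its species' mean trait values, and the trait values of all observations falling into a grid cell are then averaged to produce a map. The sketch below uses invented column names, invented trait values and a coarse 2-degree grid; it illustrates the general workflow, not the authors' actual pipeline.
    ```python
    # Toy version of "observations + trait table -> gridded trait map" (illustrative only).
    # Column names, trait values and the 2-degree grid are assumptions for this sketch.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)

    # Geolocated species observations, as a citizen-science app might provide.
    obs = pd.DataFrame({
        "species": rng.choice(["Quercus robur", "Picea abies", "Poa annua"], size=1000),
        "lat": rng.uniform(35, 70, 1000),
        "lon": rng.uniform(-10, 40, 1000),
    })

    # Species-level mean traits, as a trait database might provide (values invented).
    traits = pd.DataFrame({
        "species": ["Quercus robur", "Picea abies", "Poa annua"],
        "plant_height_m": [25.0, 35.0, 0.2],
    })

    # 1) Join: attach trait values to every observation via the species name.
    merged = obs.merge(traits, on="species", how="inner")

    # 2) Aggregate: average the trait within 2-degree grid cells.
    merged["lat_cell"] = (merged["lat"] // 2) * 2
    merged["lon_cell"] = (merged["lon"] // 2) * 2
    trait_map = (merged.groupby(["lat_cell", "lon_cell"])["plant_height_m"]
                       .mean()
                       .reset_index())

    print(trait_map.head())
    ```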
    To test their accuracy, the maps based on combining iNaturalist observations with TRY plant traits were compared with trait estimates based on sPlotOpen; the iDiv sPlot platform is the world’s largest archive of plant community data. It contains nearly two million datasets with complete lists of the plant species occurring in the locations (plots) studied by professional researchers. The database is also enhanced with plant trait data from the TRY database.
    The conclusion: The new iNaturalist-based map corresponded to the sPlot data map significantly more closely than previous map products based on extrapolation. “That the new maps, based on the citizen science data, seem to be even more precise than the extrapolations was both surprising and impressive,” says first author Sophie Wolf, a doctoral researcher at Leipzig University. “Particularly because iNaturalist and our reference sPlotOpen are very different in structure.”
    “Our study convincingly demonstrates the potential of volunteer-collected data for research,” says last author Dr Teja Kattenborn from Leipzig University and iDiv. “It is encouraging to see increasing use of the synergies between data gathered by thousands of citizens and by professional scientists.”
    “This work is the result of an initiative of the National Research Data Infrastructure for Biodiversity Research (NFDI4Biodiversity), with which we are pushing for a change in culture towards the open provision of data,” says co-author Prof Miguel Mahecha, head of the working group Modelling Approaches in Remote Sensing at Leipzig University and iDiv. “The free availability of data is an absolute prerequisite for a better understanding of our planet.”