More stories

  •

    Cyborg cells could be tools for health and environment

    Biomedical engineers at the University of California, Davis, have created semi-living “cyborg cells.” Retaining the capabilities of living cells but unable to replicate, the cyborg cells could have a wide range of applications, from producing therapeutic drugs to cleaning up pollution. The work was published Jan. 11, 2023, in Advanced Science.
    Synthetic biology aims to engineer cells that can carry out novel functions. There are essentially two approaches in use, said Cheemeng Tan, associate professor of biomedical engineering at UC Davis and senior author on the paper. One is to take a living bacterial cell and remodel its DNA with new genes that give it new functions. The other is to create an artificial cell from scratch, with a synthetic membrane and biomolecules.
    The first approach, an engineered living cell, offers great flexibility but can also reproduce itself, which may not be desirable. A completely artificial cell cannot reproduce, but it is less complex and capable of only a limited range of tasks.
    Infused with artificial polymer
    Tan and the UC Davis team came up with a third approach. They infused living bacterial cells with the basic units of an artificial polymer. Once inside the cell, the polymer was cross-linked into a hydrogel matrix by exposure to ultraviolet light. The cells maintained their biological activity but could not reproduce.
    “The cyborg cells are programmable, do not divide, preserve essential cellular activities, and gain nonnative abilities,” Tan said.
    The cyborg cells were more resistant to stressors that would kill normal cells, such as exposure to hydrogen peroxide, antibiotics or high pH, the researchers found.
    The researchers were also able to engineer the cyborg cells to invade cancer cells grown in the lab.
    The group is carrying out further research on how to create and control cyborg cells and on the effects of different matrix materials. They also hope to explore their use in a wide range of applications, from meeting environmental challenges to diagnosing and treating diseases.
    “Finally, we are interested in the bioethics of applying cyborg cells as they are cell-derived biomaterials that are neither cells nor materials,” Tan said.
    Additional co-authors on the paper are: Luis Contreras-Llano, Conary Meyer, Ofelya Baghdasaryan, Shahid Khan, and Aijun Wang, UC Davis Department of Biomedical Engineering; Tanner Henson, UC Davis Department of Surgery; Yu-Han Liu, Chi-Long Lin, Che-Ming J. Hu, Academia Sinica, Taiwan.
    An application has been submitted for a provisional patent on the process. The work was supported in part by grants from the National Institutes of Health.

  •

    Satellites can be used to detect waste sites on Earth

    A new computational system uses satellite data to identify sites on land where people dispose of waste, providing a new tool to monitor waste and revealing sites that may leak plastic into waterways. Caleb Kruse of Earthrise Media in Berkeley, California, Dr. Fabien Laurier from the Minderoo Foundation in Washington DC, and colleagues present this method in the open-access journal PLOS ONE on January 18, 2023.
    Every year, millions of metric tons of plastic waste end up in oceans, harming hundreds of species and their ecosystems. Most of this waste comes from land-based sources that leak into watersheds. Efforts to address this issue require better understanding of where people dispose of waste on land, but resources to detect and monitor such sites — both official sites and informal or illegal ones — are lacking.
    In recent years, the use of computational tools known as neural networks to analyze satellite data has shown great value in the field of remote sensing. Building on that work, Kruse and colleagues developed a new system of neural networks to analyze data from the European Space Agency’s Sentinel-2 satellites and demonstrated its potential for use in monitoring waste sites on land.
    To evaluate the performance of the new system, the researchers first applied it to Indonesia, where it detected 374 waste sites — more than twice the number of sites reported in public records. Broadening to all countries across Southeast Asia, the system identified a total of 966 waste sites — nearly three times the number of publicly recorded sites — that were subsequently confirmed to exist via other methods.
    The researchers demonstrated that their new system can be used to monitor waste sites over time. In addition, they showed that nearly 20 percent of the waste sites they detected are found within 200 meters of a waterway, with some visibly spilling into rivers that eventually reach the ocean.
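    The proximity analysis behind that 20-percent figure amounts to a distance check between each detected site and the nearest point on a waterway. A minimal Python sketch with invented coordinates (the coordinate pairs below are illustrative, not the study's data; the real pipeline works on georeferenced Sentinel-2 detections):

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two (lat, lon) points."""
        r = 6_371_000  # mean Earth radius in meters
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dlat = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Hypothetical (waste site, nearest waterway point) coordinate pairs.
    pairs = [
        ((-6.2001, 106.8001), (-6.2001, 106.8010)),  # roughly 100 m apart
        ((-6.3000, 106.9000), (-6.3200, 106.9000)),  # roughly 2.2 km apart
        ((-6.4000, 107.0000), (-6.4000, 107.0015)),  # roughly 170 m apart
    ]

    # Keep the sites within the 200-meter threshold used in the article.
    near = [site for site, water in pairs if haversine_m(*site, *water) <= 200]
    frac = len(near) / len(pairs)
    ```

    With these made-up points, two of the three sites fall within 200 meters of a waterway.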
    These findings, as well as future findings using this system, could help inform waste-management policies and decision-making. The data are publicly available, so stakeholders can use them to advocate for action within their communities. Looking ahead, the researchers plan to refine the system and expand waste-site monitoring globally.
    The authors add: “For the first time, Global Plastic Watch arms governments and researchers around the world with data that can guide better waste management interventions, ensuring land-based waste doesn’t end up in our oceans.”

  •

    Microelectronics give researchers a remote control for biological robots

    First, they walked. Then, they saw the light. Now, miniature biological robots have gained a new trick: remote control.
    The hybrid “eBiobots” are the first to combine soft materials, living muscle and microelectronics, said researchers at the University of Illinois Urbana-Champaign, Northwestern University and collaborating institutions. They described their centimeter-scale biological machines in the journal Science Robotics.
    “Integrating microelectronics allows the merger of the biological world and the electronics world, both with many advantages of their own, to now produce these electronic biobots and machines that could be useful for many medical, sensing and environmental applications in the future,” said study co-leader Rashid Bashir, an Illinois professor of bioengineering and dean of the Grainger College of Engineering.
    Bashir’s group has pioneered the development of biobots, small biological robots powered by mouse muscle tissue grown on a soft 3D-printed polymer skeleton. They demonstrated walking biobots in 2012 and light-activated biobots in 2016. The light activation gave the researchers some control, but practical applications were limited by the question of how to deliver the light pulses to the biobots outside of a lab setting.
    The answer to that question came from Northwestern University professor John A. Rogers, a pioneer in flexible bioelectronics, whose team helped integrate tiny wireless microelectronics and battery-free micro-LEDs. This allowed the researchers to remotely control the eBiobots.
    “This unusual combination of technology and biology opens up vast opportunities in creating self-healing, learning, evolving, communicating and self-organizing engineered systems. We feel that it’s a very fertile ground for future research with specific potential applications in biomedicine and environmental monitoring,” said Rogers, a professor of materials science and engineering, biomedical engineering and neurological surgery at Northwestern University and director of the Querrey Simpson Institute for Bioelectronics.

    To give the biobots the freedom of movement required for practical applications, the researchers set out to eliminate bulky batteries and tethering wires. The eBiobots use a receiver coil to harvest power and provide a regulated output voltage to power the micro-LEDs, said co-first author Zhengwei Li, an assistant professor of biomedical engineering at the University of Houston.
    The researchers can send a wireless signal to the eBiobots that prompts the LEDs to pulse. The LEDs stimulate the light-sensitive engineered muscle to contract, moving the polymer legs so that the machines “walk.” The micro-LEDs are so targeted that they can activate specific portions of muscle, making the eBiobot turn in a desired direction.
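    The steering principle, stronger activation on one side turns the machine, can be caricatured as a differential-drive update. A hedged toy model in Python (the gait function, stride and turn-gain values are invented for illustration and are not taken from the paper):

    ```python
    import math

    def step(x, y, heading, left_pulse, right_pulse, stride=0.5, turn_gain=0.2):
        """One toy gait cycle: each wireless pulse lights one side's micro-LED,
        contracting that side's muscle. In this kinematic caricature, asymmetric
        activation turns the bot and symmetric activation walks it forward."""
        heading += turn_gain * (right_pulse - left_pulse)  # asymmetry steers
        dist = stride * (left_pulse + right_pulse) / 2     # symmetry propels
        return x + dist * math.cos(heading), y + dist * math.sin(heading), heading

    x, y, h = 0.0, 0.0, 0.0
    for _ in range(10):                 # equal pulses: straight walk along +x
        x, y, h = step(x, y, h, 1, 1)
    straight = (x, y, h)
    for _ in range(5):                  # right-side pulses only: the bot turns
        x, y, h = step(x, y, h, 0, 1)
    ```

    Equal pulses leave the heading unchanged, while one-sided pulses rotate it, mirroring how targeted micro-LED activation turns the eBiobot.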
    The researchers used computational modeling to optimize the eBiobot design and component integration for robustness, speed and maneuverability. Illinois professor of mechanical sciences and engineering Mattia Gazzola led the simulation and design of the eBiobots. The iterative design and additive 3D printing of the scaffolds allowed for rapid cycles of experiments and performance improvement, said Gazzola and co-first author Xiaotian Zhang, a postdoctoral researcher in Gazzola’s lab.
    The design allows for possible future integration of additional microelectronics, such as chemical and biological sensors, or 3D-printed scaffold parts for functions like pushing or transporting things that the biobots encounter, said co-first author Youngdeok Kim, who completed the work as a graduate student at Illinois.
    The integration of electronic sensors or biological neurons would allow the eBiobots to sense and respond to toxins in the environment, biomarkers for disease and more possibilities, the researchers said.
    “In developing a first-ever hybrid bioelectronic robot, we are opening the door for a new paradigm of applications for health care innovation, such as in-situ biopsies and analysis, minimally invasive surgery or even cancer detection within the human body,” Li said.
    The National Science Foundation and the National Institutes of Health supported this work.
    Video: https://youtu.be/MI__Nm6EzvA

  •

    Two technical breakthroughs make high-quality 2D materials possible

    Researchers have been looking to replace silicon in electronics with materials that offer higher performance, lower power consumption and scalability. An international team is addressing that need with a promising process for growing high-quality 2D materials that could power next-generation electronics.
    Sang-Hoon Bae, an assistant professor of mechanical engineering and materials science at the McKelvey School of Engineering at Washington University in St. Louis, is one of three researchers leading the multi-institutional work published Jan. 18 in Nature, together with his doctoral student Justin S. Kim and postdoctoral research associate Yuan Meng.
    The work, which includes two technical breakthroughs, is the first to report a method of growing semiconductor materials known as transition metal dichalcogenides (TMDs) that could make devices faster and more power-efficient.
    The team, co-led by Jeehwan Kim, an associate professor of mechanical engineering and of materials science and engineering at the Massachusetts Institute of Technology, and Jin-Hong Park, a professor of information and communication engineering and of electronic and electrical engineering at Sungkyunkwan University, had to overcome three extremely difficult challenges to create the new materials: securing single crystallinity at wafer scale, preventing irregular thickness during wafer-scale growth, and achieving vertical heterostructures at wafer scale.
    Bae said 3D materials go through a process of roughening and smoothing to become an even-surfaced material. However, 2D materials don’t allow this process, resulting in an uneven surface that makes it difficult to have a large-scale, high-quality, uniform 2D material.
    “We designed a geometric-confined structure that facilitates kinetic control of 2D materials so that all grand challenges in high-quality 2D material growth are resolved,” Bae said. “Thanks to the facilitated kinetic control, we only needed to grow self-defined seeding for a shorter growing time.”
    The team made another technical breakthrough by demonstrating single-domain heterojunction TMDs at the wafer scale through layer-by-layer growth. To confine the growth of the nuclei, they used substrates made from various chemical compounds. These substrates formed a physical barrier that prevented lateral epitaxy and forced vertical growth.
    “We believe that our confined growth technique can bring all the great findings in physics of 2D materials to the level of commercialization by allowing the construction of single domain layer-by-layer heterojunctions at the wafer-scale,” Bae said.
    Bae said other researchers are studying this material at very small sizes of tens to hundreds of micrometers.
    “We scaled up because we can solve the issue by producing the high-quality material at large scale,” Bae said. “Our achievement will lay a strong foundation for 2D materials to fit into industrial settings.”
    This research was supported by funding from Intel; DARPA (029584-00001 and 2018-JU-2776); and the Institute for Basic Science (IBS-R034-D1).

  •

    A window into the nanoworld: Scientists develop new technique to image fluctuations in materials

    A team of scientists, led by researchers from the Max Born Institute in Berlin and Helmholtz-Zentrum Berlin in Germany and from Brookhaven National Laboratory and the Massachusetts Institute of Technology in the United States, has developed a revolutionary new method for capturing high-resolution images of fluctuations in materials at the nanoscale using powerful X-ray sources. The technique, which they call Coherent Correlation Imaging (CCI), allows for the creation of sharp, detailed movies without damaging the sample by excessive radiation. By using an algorithm to detect patterns in underexposed images, CCI opens paths to previously inaccessible information. The team demonstrated CCI on samples made of thin magnetic layers, and their results have been published in Nature.
    The microscopic realm of the world is constantly in motion and marked by unceasing alteration. Even in seemingly unchanging solid materials, these fluctuations can give rise to unusual properties; one example being the lossless transmission of electrical current in high-temperature superconductors. Fluctuations are particularly pronounced during phase transitions, where a material changes its state, such as from solid to liquid during melting. Scientists also investigate very different phase transitions, such as from non-conductive to conductive, non-magnetic to magnetic, and changes in crystal structure. Many of these processes are utilized in technology, and also play a crucial role in the functioning of living organisms.
    The problem: Too much illumination might damage the sample
    Studying these processes in detail, however, is a difficult task, and capturing a movie of these fluctuation patterns is even more challenging. This is because the fluctuations happen quickly and take place at the nanometer scale — a millionth of a millimeter. Even the most advanced high-resolution X-ray and electron microscopes are unable to capture this rapid, random motion. The problem is fundamental, as a familiar principle of photography illustrates: in order to capture a clear image of an object, a certain level of illumination is required. To magnify the object, that is to “zoom in,” more illumination is needed. Even more light is necessary when attempting to capture fast motion with a short exposure time. Ultimately, increasing the resolution and decreasing the exposure time leads to a point where the object would be damaged or even destroyed by the illumination required. This is exactly the point science has reached in recent years: snapshots taken with free-electron lasers, the most intense X-ray sources available today, inevitably led to the destruction of the sample under study. As a result, capturing a multi-image movie of these random processes has been deemed impossible.
    New approach: using an algorithm to detect patterns in dimly lit pictures
    An international team of scientists has now found a solution to this problem. The key to their solution was the realization that the fluctuation patterns in materials are often not entirely random. By focusing on a small portion of the sample, the researchers observed that certain spatial patterns repeatedly emerged, but the exact timing and frequency of these patterns were unpredictable.
    The scientists have developed a novel non-destructive imaging method called Coherent Correlation Imaging (CCI). To create a movie, they take multiple snapshots of the sample in quick succession while reducing the illumination enough to keep the sample intact. However, this results in individual images where the fluctuation pattern in the sample becomes indistinct. Nevertheless, the images still contain sufficient information to separate them into groups. To accomplish this, the team first had to create a new algorithm that analyzes the correlations between the images, hence the method’s name. The snapshots within each group are very similar and thus likely to originate from the same specific fluctuation pattern. It is only when all shots in a group are viewed together that a clear image of the sample emerges. The scientists are now able to rewind the film and associate each snapshot with a clear image of the sample’s state at that moment in time.
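    The grouping step can be illustrated with a small numpy sketch on synthetic data (the patterns, photon counts and simple correlation threshold below are invented; the authors' actual algorithm is more sophisticated): dim snapshots of two alternating patterns are grouped by pairwise correlation, and averaging one group recovers a clear image that no single frame shows.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two hidden domain patterns the sample flips between (toy 8x8 "images").
    pattern_a = (rng.random((8, 8)) > 0.5).astype(float)
    pattern_b = 1.0 - pattern_a

    # 40 underexposed snapshots: a few photons per bright pixel plus background,
    # so no single frame shows the pattern clearly.
    truth = [pattern_a if rng.random() < 0.5 else pattern_b for _ in range(40)]
    frames = np.array([rng.poisson(3.0 * t + 0.2) for t in truth], dtype=float)

    # Correlate every pair of frames: frames from the same hidden pattern
    # correlate positively, frames from opposite patterns negatively.
    corr = np.corrcoef(frames.reshape(len(frames), -1))
    group = [i for i in range(len(frames)) if corr[0, i] > 0.0]

    # Averaging within the group recovers a sharp image of that pattern.
    recovered = frames[group].mean(axis=0)
    ```

    The key point is that the dim frames still carry enough information to be sorted by mutual correlation before averaging, which is what lets the method stay below the sample's damage threshold.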
    An example: Filming the “dance of domains” in magnetic layers
    The scientists created this new method to tackle a specific problem in the field of magnetism: microscopic patterns that occur in thin ferromagnetic layers. These layers are divided into regions known as domains, in which the magnetization points either upward or downward. Similar magnetic films are used in modern hard drives where the two different types of domains encode bits with “0” or “1.” Until now, it was believed that these patterns were extremely stable. But is this really true?
    To answer this question, the team investigated a sample consisting of such a magnetic layer at the National Synchrotron Light Source II on Long Island near New York City, using the newly developed CCI method. Indeed, the patterns remained unchanged at room temperature. But at a slightly elevated temperature of 37°C (98.6°F), the domains began to move back and forth erratically, displacing each other. The scientists observed this “dance of the domains” for several hours. Subsequently, they created a map showing the preferred locations of the boundaries between the domains. This map and the movie of the movements led to a better understanding of the magnetic interactions in the materials, promoting future applications in advanced computer architectures.
    New opportunities for materials research at X-ray sources
    The scientists’ next objective is to employ the novel imaging method on free-electron lasers, such as the European XFEL in Hamburg, to gain deeper insights into even faster processes at the smallest length scales. They are confident that this method will improve our understanding of the role of fluctuations and stochastic processes in the properties of modern materials, and as a result, discover new methods of utilizing them in a more directed manner.

  •

    Can you trust your quantum simulator?

    At the scale of individual atoms, physics gets weird. Researchers are working to reveal, harness, and control these strange quantum effects using quantum analog simulators — laboratory experiments that involve super-cooling tens to hundreds of atoms and probing them with finely tuned lasers and magnets.
    Scientists hope that any new understanding gained from quantum simulators will provide blueprints for designing new exotic materials, smarter and more efficient electronics, and practical quantum computers. But in order to reap the insights from quantum simulators, scientists first have to trust them.
    That is, they have to be sure that their quantum device has “high fidelity” and accurately reflects quantum behavior. For instance, if a system of atoms is easily influenced by external noise, researchers could assume a quantum effect where there is none. But there has been no reliable way to characterize the fidelity of quantum analog simulators, until now.
    In a study appearing in Nature, physicists from MIT and Caltech report a new quantum phenomenon: They found that there is a certain randomness in the quantum fluctuations of atoms and that this random behavior exhibits a universal, predictable pattern. Behavior that is both random and predictable may sound like a contradiction. But the team confirmed that certain random fluctuations can indeed follow a predictable, statistical pattern.
    What’s more, the researchers have used this quantum randomness as a tool to characterize the fidelity of a quantum analog simulator. They showed through theory and experiments that they could determine the accuracy of a quantum simulator by analyzing its random fluctuations.
    The team developed a new benchmarking protocol that can be applied to existing quantum analog simulators to gauge their fidelity based on their pattern of quantum fluctuations. The protocol could help to speed the development of new exotic materials and quantum computing systems.

    “This work would allow characterizing many existing quantum devices with very high precision,” says study co-author Soonwon Choi, assistant professor of physics at MIT. “It also suggests there are deeper theoretical structures behind the randomness in chaotic quantum systems than we have previously thought about.”
    The study’s authors include MIT graduate student Daniel Mark and collaborators at Caltech, the University of Illinois at Urbana-Champaign, Harvard University, and the University of California at Berkeley.
    Random evolution
    The new study was motivated by a 2019 advance from Google, whose researchers built a digital quantum computer, dubbed “Sycamore,” that could carry out a specific computation more quickly than a classical computer.
    Whereas the computing units in a classical computer are “bits” that exist as either a 0 or a 1, the units in a quantum computer, known as “qubits,” can exist in a superposition of multiple states. When multiple qubits interact, they can in theory run special algorithms that solve difficult problems far faster than any classical computer.

    The Google researchers engineered a system of superconducting loops to behave as 53 qubits, and showed that the “computer” could carry out a specific calculation that would normally be too thorny for even the fastest supercomputer in the world to solve.
    Google also showed that it could quantify the system’s fidelity. By randomly changing the state of individual qubits and comparing the resulting states of all 53 qubits with the predictions of quantum mechanics, the researchers were able to measure the system’s accuracy.
    Choi and his colleagues wondered whether they could use a similar, randomized approach to gauge the fidelity of quantum analog simulators. But there was one hurdle to clear: unlike the qubits in Google’s digital system, the individual atoms and other qubits in analog simulators are incredibly difficult to manipulate, and therefore to control randomly.
    But through some theoretical modeling, Choi realized that the collective effect of individually manipulating qubits in Google’s system could be reproduced in an analog quantum simulator by simply letting the qubits naturally evolve.
    “We figured out that we don’t have to engineer this random behavior,” Choi says. “With no fine-tuning, we can just let the natural dynamics of quantum simulators evolve, and the outcome would lead to a similar pattern of randomness due to chaos.”
    Building trust
    As an extremely simplified example, imagine a system of five qubits. Each qubit can exist simultaneously as a 0 or a 1, until a measurement is made, whereupon the qubits settle into one or the other state. With any one measurement, the qubits can take on one of 32 different combinations: 0-0-0-0-0, 0-0-0-0-1, and so on.
    “These 32 configurations will occur with a certain probability distribution, which people believe should be similar to predictions of statistical physics,” Choi explains. “We show they agree on average, but there are deviations and fluctuations that exhibit a universal randomness that we did not know. And that randomness looks the same as if you ran those random operations that Google did.”
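    The flavor of this universal randomness can be reproduced numerically. For random pure states, like those produced by chaotic dynamics, the rescaled outcome probabilities approximately follow an exponential (Porter-Thomas) distribution, so roughly 1/e of outcomes exceed the mean. A toy sketch (illustrative only, not the study's benchmarking protocol):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    dim, n_states = 32, 2000          # 32 basis configurations for 5 qubits

    all_probs = []
    for _ in range(n_states):
        # A random pure state: complex Gaussian amplitudes, normalized.
        # This mimics the output of chaotic quantum evolution.
        amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
        amps /= np.linalg.norm(amps)
        all_probs.append(np.abs(amps) ** 2)   # the 32 outcome probabilities
    p = np.concatenate(all_probs)

    # Porter-Thomas prediction: rescaled probabilities dim * p are close to
    # exponentially distributed, so the fraction of outcomes whose probability
    # exceeds the uniform value 1/dim should be near 1/e ~ 0.37.
    frac = float((dim * p > 1).mean())
    ```

    On average every configuration occurs with probability 1/32, yet the individual probabilities scatter widely around that mean in this statistically predictable way.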
    The researchers hypothesized that if they could develop a numerical simulation that precisely represents the dynamics and universal random fluctuations of a quantum simulator, they could compare the predicted outcomes with the simulator’s actual outcomes. The closer the two are, the more accurate the quantum simulator must be.
    To test this idea, Choi teamed up with experimentalists at Caltech, who engineered a quantum analog simulator comprising 25 atoms. The physicists shone a laser on the experiment to collectively excite the atoms, then let the qubits naturally interact and evolve over time. They measured the state of each qubit over multiple runs, gathering 10,000 measurements in all.
    Choi and colleagues also developed a numerical model to represent the experiment’s quantum dynamics, and incorporated an equation that they derived to predict the universal, random fluctuations that should arise. The researchers then compared their experimental measurements with the model’s predicted outcomes and observed a very close match — strong evidence that this particular simulator can be trusted as reflecting pure, quantum mechanical behavior.
    More broadly, the results demonstrate a new way to characterize almost any existing quantum analog simulator.
    “The ability to characterize quantum devices forms a very basic technical tool to build increasingly larger, more precise and complex quantum systems,” Choi says. “With our tool, people can know whether they are working with a trustable system.”
    This research was funded, in part, by the U.S. National Science Foundation, the Defense Advanced Research Projects Agency, the Army Research Office, and the Department of Energy.

  •

    Engineers grow 'perfect' atom-thin materials on industrial silicon wafers

    True to Moore’s Law, the number of transistors on a microchip has doubled roughly every two years since the 1960s. But this trajectory is predicted to soon plateau because silicon — the backbone of modern transistors — loses its electrical properties once devices made from this material dip below a certain size.
    Enter 2D materials — delicate, two-dimensional sheets of perfect crystals that are as thin as a single atom. At the scale of nanometers, 2D materials can conduct electrons far more efficiently than silicon. The search for next-generation transistor materials therefore has focused on 2D materials as potential successors to silicon.
    But before the electronics industry can transition to 2D materials, scientists have to first find a way to engineer the materials on industry-standard silicon wafers while preserving their perfect crystalline form. And MIT engineers may now have a solution.
    The team has developed a method that could enable chip manufacturers to fabricate ever-smaller transistors from 2D materials by growing them on existing wafers of silicon and other materials. The new method is a form of “nonepitaxial, single-crystalline growth,” which the team used for the first time to grow pure, defect-free 2D materials onto industrial silicon wafers.
    With their method, the team fabricated a simple functional transistor from a type of 2D materials called transition-metal dichalcogenides, or TMDs, which are known to conduct electricity better than silicon at nanometer scales.
    “We expect our technology could enable the development of 2D semiconductor-based, high-performance, next-generation electronic devices,” says Jeehwan Kim, associate professor of mechanical engineering at MIT. “We’ve unlocked a way to catch up to Moore’s Law using 2D materials.”
    Kim and his colleagues detail their method in a paper appearing in Nature. The study’s MIT co-authors include Ki Seok Kim, Doyoon Lee, Celesta Chang, Seunghwan Seo, Hyunseok Kim, Jiho Shin, Sangho Lee, Jun Min Suh, and Bo-In Park, along with collaborators at the University of Texas at Dallas, the University of California at Riverside, Washington University in Saint Louis, and institutions across South Korea.

    A crystal patchwork
    To produce a 2D material, researchers have typically employed a manual process by which an atom-thin flake is carefully exfoliated from a bulk material, like peeling away the layers of an onion.
    But most bulk materials are polycrystalline, containing multiple crystals that grow in random orientations. Where one crystal meets another, the “grain boundary” acts as an electric barrier. Any electrons flowing through one crystal suddenly stop when met with a crystal of a different orientation, damping a material’s conductivity. Even after exfoliating a 2D flake, researchers must then search the flake for “single-crystalline” regions — a tedious and time-intensive process that is difficult to apply at industrial scales.
    Recently, researchers have found other ways to fabricate 2D materials, by growing them on wafers of sapphire — a material with a hexagonal pattern of atoms which encourages 2D materials to assemble in the same, single-crystalline orientation.
    “But nobody uses sapphire in the memory or logic industry,” Kim says. “All the infrastructure is based on silicon. For semiconductor processing, you need to use silicon wafers.”
    However, wafers of silicon lack sapphire’s hexagonal supporting scaffold. When researchers attempt to grow 2D materials on silicon, the result is a random patchwork of crystals that merge haphazardly, forming numerous grain boundaries that stymie conductivity.

    “It’s considered almost impossible to grow single-crystalline 2D materials on silicon,” Kim says. “Now we show you can. And our trick is to prevent the formation of grain boundaries.”
    Seed pockets
    The team’s new “nonepitaxial, single-crystalline growth” does not require peeling and searching flakes of 2D material. Instead, the researchers use conventional vapor deposition methods to pump atoms across a silicon wafer. The atoms eventually settle on the wafer and nucleate, growing into two-dimensional crystal orientations. If left alone, each “nucleus,” or seed of a crystal, would grow in random orientations across the silicon wafer. But Kim and his colleagues found a way to align each growing crystal to create single-crystalline regions across the entire wafer.
    To do so, they first covered a silicon wafer in a “mask” — a coating of silicon dioxide that they patterned into tiny pockets, each designed to trap a crystal seed. Across the masked wafer, they then flowed a gas of atoms that settled into each pocket to form a 2D material — in this case, a TMD. The mask’s pockets corralled the atoms and encouraged them to assemble on the silicon wafer in the same, single-crystalline orientation.
    “That is a very shocking result,” Kim says. “You have single-crystalline growth everywhere, even if there is no epitaxial relation between the 2D material and silicon wafer.”
    With their masking method, the team fabricated a simple TMD transistor and showed that its electrical performance was just as good as a pure flake of the same material.
    They also applied the method to engineer a multilayered device. After covering a silicon wafer with a patterned mask, they grew one type of 2D material to fill half of each square, then grew a second type of 2D material over the first layer to fill the rest of the squares. The result was an ultrathin, single-crystalline bilayer structure within each square. Kim says that going forward, multiple 2D materials could be grown and stacked together in this way to make ultrathin, flexible, and multifunctional films.
    “Until now, there has been no way of making 2D materials in single-crystalline form on silicon wafers, thus the whole community has almost given up on pursuing 2D materials for next-generation processors,” Kim says. “Now we have completely solved this problem, with a way to make devices smaller than a few nanometers. This will change the paradigm of Moore’s Law.”
    This research was supported in part by DARPA, Intel, the IARPA MicroE4AI program, MicroLink Devices, Inc., ROHM Co., and Samsung.


    Light-based tech could inspire Moon navigation and next-gen farming

    Super-thin chips made from lithium niobate are set to overtake silicon chips in light-based technologies, according to world-leading scientists in the field, with potential applications ranging from remotely detecting ripening fruit on Earth to navigation on the Moon.
    They say the artificial crystal offers the platform of choice for these technologies due to its superior performance and recent advances in manufacturing capabilities.
    RMIT University’s Distinguished Professor Arnan Mitchell and University of Adelaide’s Dr Andy Boes led this team of global experts to review lithium niobate’s capabilities and potential applications in the journal Science.
    The international team, including scientists from Peking University in China and Harvard University in the United States, is working with industry to make navigation systems that are planned to help rovers drive on the Moon later this decade.
    As it is impossible to use global positioning system (GPS) technology on the Moon, navigation systems in lunar rovers will need to use an alternative system, which is where the team’s innovation comes in.
    By detecting tiny changes in laser light, the lithium-niobate chip can be used to measure movement without needing external signals, according to Mitchell.

    “This is not science fiction — this artificial crystal is being used to develop a range of exciting applications. And competition to harness the potential of this versatile technology is heating up,” said Mitchell, Director of the Integrated Photonics and Applications Centre.
    He said while the lunar navigation device was in the early stages of development, the lithium niobate chip technology was “mature enough to be used in space applications.”
    “Our lithium niobate chip technology is also flexible enough to be rapidly adapted to almost any application that uses light,” Mitchell said.
    “We are focused on navigation now, but the same technology could also be used for linking internet on the Moon to the internet on Earth.”
    What is lithium niobate and how can it be used?
    Lithium niobate is an artificial crystal that was first discovered in 1949 but is “back in vogue,” according to Boes.

    “Lithium niobate has new uses in the field of photonics — the science and technology of light — because unlike other materials it can generate and manipulate electromagnetic waves across the full spectrum of light, from microwave to UV frequencies,” he said.
    “Silicon was the material of choice for electronic circuits, but its limitations have become increasingly apparent in photonics.
    “Lithium niobate has come back into vogue because of its superior capabilities, and advances in manufacturing mean that it is now readily available as thin films on semiconductor wafers.”
    A layer of lithium niobate about 1,000 times thinner than a human hair is placed on a semiconductor wafer, Boes said.
    “Photonic circuits are printed into the lithium niobate layer, which are tailored according to the chip’s intended use. A fingernail-sized chip may contain hundreds of different circuits,” he said.
    How does the lunar navigation tech work?
    The team is working with the Australian company Advanced Navigation to create optical gyroscopes, where laser light is launched in both clockwise and anticlockwise directions in a coil of fibre, Mitchell said.
    “As the coil is moved, the light path is slightly shorter in one direction than the other, according to Albert Einstein’s theory of relativity,” he said.
    “Our photonic chips are sensitive enough to measure this tiny difference and use it to determine how the coil is moving. If you can keep track of your movements, then you know where you are relative to where you started. This is called inertial navigation.”
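    The gyroscope principle Mitchell describes is the Sagnac effect: counter-propagating beams in a rotating coil accumulate a small phase difference proportional to the rotation rate. A minimal sketch of the standard Sagnac formula follows; the coil radius, turn count, and wavelength are illustrative assumptions, not parameters from the Advanced Navigation device.

    ```python
    import math

    def sagnac_phase_shift(coil_radius_m, n_turns, rotation_rate_rad_s,
                           wavelength_m=1.55e-6):
        """Phase difference between counter-propagating beams in a fibre coil.

        Standard Sagnac formula: delta_phi = 8 * pi * N * A * Omega / (lambda * c),
        where A is the area enclosed by a single loop of the coil.
        """
        c = 299_792_458.0  # speed of light in vacuum, m/s
        area = math.pi * coil_radius_m ** 2  # area of one loop, m^2
        return (8 * math.pi * n_turns * area * rotation_rate_rad_s
                / (wavelength_m * c))

    # Illustrative example: Earth's rotation rate sensed by a
    # 1,000-turn coil of 5 cm radius at a 1550 nm telecom wavelength.
    earth_rate = 7.2921159e-5  # rad/s
    print(sagnac_phase_shift(0.05, 1000, earth_rate))
    ```

    Even for Earth’s rotation the resulting phase shift is tiny (tens of microradians for these parameters), which is why the article stresses the sensitivity of the photonic chips used to read it out.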
    Potential applications closer to home
    This technology can also be used to remotely detect the ripeness of fruit.
    “Gas emitted by ripe fruit absorbs light in the mid-infrared part of the spectrum,” Mitchell said.
    “A drone hovering in an orchard would transmit light to another drone, which would sense the degree to which the light is absorbed and therefore when fruit is ready for harvesting.
    “Our microchip technology is much smaller, cheaper and more accurate than current technology and can be used with very small drones that won’t damage fruit trees.”
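    The sensing scheme Mitchell outlines is absorption spectroscopy, commonly modelled with the Beer-Lambert law: the transmitted fraction of light falls off exponentially with the absorbing gas concentration along the path. A minimal sketch follows; the cross-section, number density, and path length are made-up placeholder values, not measured data for any real fruit-ripening gas.

    ```python
    import math

    def transmitted_fraction(cross_section_cm2, number_density_cm3, path_cm):
        """Beer-Lambert law: I / I0 = exp(-sigma * n * L).

        sigma: absorption cross-section per molecule (cm^2)
        n:     number density of the absorbing gas (molecules / cm^3)
        L:     optical path length (cm)
        """
        return math.exp(-cross_section_cm2 * number_density_cm3 * path_cm)

    # Illustrative values only: a 10 m drone-to-drone path through
    # a hypothetical trace-gas plume.
    frac = transmitted_fraction(1e-19, 2.5e13, 1000.0)
    print(f"transmitted fraction: {frac:.4f}")
    ```

    Comparing the transmitted fraction against a reference beam at a non-absorbed wavelength is how such a system would distinguish gas absorption from ordinary scattering losses.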
    Next steps
    Australia could become a global hub for manufacturing integrated photonic chips from lithium niobate that would have a major impact on applications in technology that use every part of the spectrum of light, Mitchell said.
    “We have the technology to manufacture these chips in Australia and we have the industries that will use them,” he said.
    “Photonic chips can now transform industries well beyond optical fibre communications.”