More stories

  • A ferroelectric transistor that stores and computes at scale

    The Big Data revolution has strained the capabilities of state-of-the-art electronic hardware, challenging engineers to rethink almost every aspect of the microchip. With ever more enormous data sets to store, search and analyze at increasing levels of complexity, these devices must become smaller, faster and more energy efficient to keep up with the pace of data innovation.
    Ferroelectric field effect transistors (FE-FETs) are among the most intriguing answers to this challenge. Like traditional silicon-based transistors, FE-FETs are switches, turning on and off at incredible speed to communicate the 1s and 0s computers use to perform their operations.
    But FE-FETs have an additional function that conventional transistors do not: their ferroelectric properties allow them to hold on to electrical charge.
    This property allows them to serve as non-volatile memory devices as well as computing devices. Able to both store and process data, FE-FETs are the subject of a wide range of research and development projects. A successful FE-FET design would dramatically undercut the size and energy usage thresholds of traditional devices, as well as increase speed.
    Researchers at the University of Pennsylvania School of Engineering and Applied Science have introduced a new FE-FET design that demonstrates record-breaking performance in both computing and memory.
    A recent study published in Nature Nanotechnology led by Deep Jariwala, Associate Professor in the Department of Electrical and Systems Engineering (ESE), and Kwan-Ho Kim, a Ph.D. candidate in his lab, debuted the design. They collaborated with fellow Penn Engineering faculty members Troy Olsson, also Associate Professor in ESE, and Eric Stach, Robert D. Bent Professor of Engineering in the Department of Materials Science and Engineering (MSE) and Director of the Laboratory for Research on the Structure of Matter (LRSM).

    The transistor layers a two-dimensional semiconductor called molybdenum disulfide (MoS2) on top of a ferroelectric material called aluminum scandium nitride (AlScN), demonstrating for the first time that these two materials can be effectively combined to create transistors at scales attractive to industrial manufacturing.
    “Because we have made these devices combining a ferroelectric insulator material with a 2D semiconductor, both are very energy efficient,” says Jariwala. “You can use them for computing as well as memory — interchangeably and with high efficiency.”
    The Penn Engineering team’s device is notable for its unprecedented thinness, allowing for each individual device to operate with a minimum amount of surface area. In addition, the tiny devices can be manufactured in large arrays scalable to industrial platforms.
    “With our semiconductor, MoS2, at a mere 0.7 nanometers, we weren’t sure it could survive the amount of charge that our ferroelectric material, AlScN, would inject into it,” says Kim. “To our surprise, not only did both of them survive, but the amount of current this enables the semiconductor to carry was also record-breaking.”
    The more current a device can carry, the faster it can operate for computing applications. The lower the resistance, the faster the access speed for memory.
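    As a rough rule of thumb (standard device physics, not figures from the paper), the switching delay of a transistor driving a load capacitance C through a voltage swing V at on-current I_on, and the access time of a read path with resistance R and capacitance C, scale as

      \[
        t_{\text{switch}} \approx \frac{C\,V}{I_{\mathrm{on}}},
        \qquad
        t_{\text{access}} \sim R\,C .
      \]

    Higher on-current charges the load faster and lower resistance shortens the RC time constant, which is why record current translates directly into speed for both computing and memory.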

    This MoS2 and AlScN combination is a true breakthrough in transistor technology. Other research teams’ FE-FETs have been consistently stymied by a loss of ferroelectric properties as devices miniaturize to approach industry-appropriate scales.
    Until this study, miniaturizing FE-FETs has resulted in severe shrinking of the “memory window.” This means that as engineers reduce the size of the transistor design, the device develops an unreliable memory, mistaking 1s for 0s and vice versa, compromising its overall performance.
    The Jariwala lab and collaborators achieved a design that keeps the memory window large with impressively small device dimensions. With AlScN at 20 nanometers, and MoS2 at 0.7 nanometers, the FE-FET dependably stores data for quick access.
    “The key,” says Olsson, “is our ferroelectric material, AlScN. Unlike many ferroelectric materials, it maintains its unique properties even when very thin. In a recent paper from my group, we showed that it can retain its unique ferroelectric properties at even smaller thicknesses: 5 nanometers.”
    The Penn Engineering team’s next steps are focused on this further miniaturization to produce devices that operate with voltages low enough to be compatible with leading-edge consumer device manufacturing.
    “Our FE-FETs are incredibly promising,” says Jariwala. “With further development, these versatile devices could have a place in almost any technology you can think of, especially those that are AI-enabled and consume, generate or process vast amounts of data — from sensing to communications and more.”

  • Participating in genetic studies is in your genes

    Why do some people take part in genetic studies while others do not? The answer may lie within our genetic makeup. According to a groundbreaking study by Oxford’s Leverhulme Centre for Demographic Science and Big Data Institute, people who participate in genetic studies are genetically more likely to do so, leaving detectable ‘footprints’ in genetic data. This breakthrough equips researchers with the ability to identify and address participation bias, a significant challenge in genetic research.
    Stefania Benonisdottir, lead author of the study and a Doctoral candidate from Oxford’s Big Data Institute, explains, ‘Currently, most genetic studies are based on genetic databases which contain large numbers of participants and a wealth of information. However, some people are more likely to be included in these databases than others, which can create a problem called ascertainment bias, where the genetic data collected is not representative of the intended study population.’
    To study this link between genetic data and participation bias, the researchers turned to one of the largest biomedical databases in the world, the UK Biobank, which contains information from half a million participants.
    Using UK Biobank data, the researchers found a genetic component to people’s probability of participating, one that is correlated with, but distinct from, other human traits. Published today in Nature Genetics, the study highlights that participation could be an important and previously underappreciated human trait, and it introduces a statistical framework that could lead to more accurate analyses of genetic data.
    Professor Augustine Kong, senior author from the Leverhulme Centre for Demographic Science and the Big Data Institute, notes, ‘Ascertainment bias poses a statistical challenge in genetics research, particularly in the era of big data. Adjustments for this bias often rely on known differences between participants and non-participants, introducing imperfections when answering questions involving variables only observed for participants, such as genotypes. Our study identifies detectable footprints of participation bias in the genetic data of participants, which can be exploited statistically to enhance research accuracy for both participants and non-participants alike.’
    Genome-wide association studies offer important insights into the role of genetics in human health and diseases. However, such studies can be affected by biases, which arise when genetic databases are not representative of the intended study population. Now, the identified genetic inclination to participate can help scientists assess the representativeness of their study sample.
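    As a toy illustration of the ascertainment bias described above (a minimal sketch of our own, not the study’s method), the short Python simulation below assumes a hypothetical trait and a participation probability that rises with it; the participant average then overstates the population average, even though the underlying population is unchanged.

      # Toy ascertainment-bias demo: selection correlated with a trait
      # biases estimates made from participants alone. All numbers are
      # illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(0)
      trait = rng.normal(0.0, 1.0, 1_000_000)   # trait in the full population

      # Hypothetical participation probability, increasing with the trait.
      p_join = 1.0 / (1.0 + np.exp(-(trait - 1.0)))
      joined = rng.random(trait.size) < p_join

      print(f"population mean:  {trait.mean():+.3f}")           # ~ +0.000
      print(f"participant mean: {trait[joined].mean():+.3f}")   # clearly > 0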
    By analysing the genetic data of over 30,000 related participants of white British ancestry from the UK Biobank, the researchers found that the genetic component underlying participation in the study is correlated with, but distinct from, the genetic components of traits such as educational attainment and body mass index.
    For example, the correlation between the genetic components underlying participation in the UK Biobank and educational attainment is estimated to be 36.6%. This result is consistent with some of the previously reported differences between participants and non-participants, but it also shows that participation bias is not fully captured by these previously known differences. In other words, participation is not simply a consequence of these other traits and characteristics.
    The study also found the genetic component of participation can be passed down through families and may affect people’s participation in many different studies over their lifetimes. This highlights the potential for bias in genetic research and underscores the importance of accounting for such biases in study design and analysis.
    Professor Melinda Mills, Director of the Leverhulme Centre for Demographic Science, concludes, ‘As our GWAS Diversity Monitor shows, the road to improving diversity in genome-wide association studies is long. However, this statistical framework is a huge step in the right direction to mitigate the risk of incomplete or inaccurate data analysis and ensure that genetic research truly benefits everyone.’

  • Controlling signal routing in quantum information processing

    Routing signals and isolating them against noise and back-reflections are essential in many practical situations in classical communication as well as in quantum processing. In a collaboration between theory and experiment, a team led by Andreas Nunnenkamp from the University of Vienna and Ewold Verhagen, based at the research institute AMOLF in Amsterdam, has achieved unidirectional transport of signals in pairs of “one-way streets.” This research, published in Nature Physics, opens up new possibilities for more flexible signaling devices.
    Devices that can route signals, for example signals carried by light or sound waves, are essential in many practical situations. This is, for instance, the case in quantum information processing, where the states of the quantum computer have to be amplified to be read out, without noise from the amplification process corrupting them. That is why devices that confine signals to a one-way channel, such as isolators or circulators, are much sought after. At present, however, such devices are lossy, bulky, and require large magnetic fields that break time-reversal symmetry to achieve unidirectional behaviour. These limitations have prompted strong efforts to find alternatives that take up less space and do not rely on magnetic fields.
    The new study published in Nature Physics introduces a new class of systems characterized by a phenomenon the authors call “quadrature nonreciprocity.” Quadrature nonreciprocity exploits interference between two distinct physical processes. Each of the processes produces a wave that contributes to the transmitted signal. Like water waves produced by two thrown pebbles, the two waves can either cancel or amplify each other, in a phenomenon known as interference.
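    Schematically (a generic two-path interference picture, not the paper’s full model), if the two processes contribute amplitudes t_1 e^{iθ} and t_2 e^{±iφ}, with the sign of φ set by the direction of travel, the total transmission amplitude is

      \[
        t_{\pm} = t_1 e^{i\theta} + t_2 e^{\pm i\phi},
      \]

    so that for t_1 = t_2 and θ = φ = π/2 the two waves add in one direction (|t_+| = 2|t_1|) and cancel in the other (|t_-| = 0). In quadrature nonreciprocity, which quadrature of the signal sees the constructive combination depends on the direction, as described next.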
    This allows for unidirectional transmission of signals without breaking time-reversal symmetry and leads to a distinctive dependence on the phase, i.e., the quadrature, of the signal. “In these devices, transmission depends not only on the direction of the signal, but also on the signal quadrature,” says Clara Wanjura, the theoretical lead author of the study. “This realizes a ‘dual carriageway’ for signals: one quadrature is transmitted in one direction and the other quadrature in the opposite direction. Time-reversal symmetry then enforces that the quadratures always travel pairwise along opposite directions in two separate lanes.”
    The experimental team at AMOLF has demonstrated this phenomenon in a nanomechanical system in which interactions among mechanical vibrations of small silicon strings are orchestrated by laser light. Laser light exerts forces on the strings, thereby mediating interactions between their different vibration ‘tones’. Jesse Slim, the experimental lead author of the study, says: “We have developed a versatile experimental toolbox that allowed us to control the two different types of interactions that are needed to implement quadrature nonreciprocity. This way we could reveal the resulting unidirectional transport of the signals experimentally.”
    The work opens up new possibilities for signal routing and quantum-limited amplification, with potential applications in quantum information processing and sensing.

  • The economic life of cells

    A team from the University of Tokyo has combined economic theory with biology to understand how natural systems respond to change. The researchers noticed a similarity between consumers’ shopping behavior and the behavior of metabolic systems, which convert food into energy in our bodies. The team focused on predicting how different metabolic systems might respond to environmental change by using an economic tool called the Slutsky equation. Their calculations indicated that very different metabolic systems actually share previously unknown universal properties, and can be understood using tools from other academic fields. Metabolic processes are used in drug development, bioengineering, food production and other industries, so being able to predict how such systems will respond to change can offer many benefits.
    Where do you get your energy from? Perhaps a long night’s sleep, or a good breakfast and some exercise? These activities can all help as they support a healthy metabolism, the chemical processes by which our bodies convert food and drink into energy. Understanding how individual metabolic reactions behave and predicting how they may change under different circumstances is a big challenge. There are thousands of different reactions which enable us to move, think, grow — in short, to live. In recent years, it has become possible to predict some reactions through numerical simulations, but this requires large amounts of data. However, researchers at the University of Tokyo have derived previously unknown universal properties of metabolic systems by applying microeconomic theory to their data.
    “Until this research, we thought that metabolic systems varied so much among species and cell types that there were no common properties among them,” explained Assistant Professor Tetsuhiro Hatakeyama from the Graduate School of Arts and Sciences. “However, we were very excited to demonstrate that all metabolic systems have universal properties, and that these properties can be expressed by very simple laws.” According to the researchers, this theory does not require as much detailed background data to be collected as other methods. It can also be effectively applied whether you are trying to understand the behavior of all metabolic processes in a cell or focusing on just one part — say, for example, how much oxygen it is using.
    Hatakeyama, a biophysicist, was looking at some metabolic system diagrams when he noticed a striking similarity to diagrams used in economics. This realization inspired him to try an interdisciplinary approach and apply economic theory, which he had briefly studied, to his biology research. Along with co-author Jumpei Yamagishi, a graduate student in the same lab, he decided to explore how both consumers and cells optimize their “spending” to maximize gain: whereas we as consumers spend money, cells “spend” nutrients. They reasoned that if the two behaved similarly, then perhaps the same theories used to identify patterns in consumer behavior under changing financial circumstances could also identify patterns in cellular metabolic behavior under changing environments.
    More specifically, the researchers focused on the Slutsky equation, which is used to understand changes in consumer demand. In particular, it is used to understand so-called Giffen goods, which counterintuitively go up in demand when the price increases and go down in demand when the price decreases. According to Hatakeyama, this is similar to cellular metabolic behavior in response to a disturbance. For example, respiration demand (the Giffen good in this case) in cancer cells goes up, counterintuitively, with increased drug dosage (the “price”), even though this is not beneficial to the growth rate of the cancer. From this analysis, the team uncovered a universal law for how metabolic systems respond to change.
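    For reference, the Slutsky equation of consumer theory decomposes the response of demand x_i to a change in price p_j into a substitution effect (via the compensated, or Hicksian, demand h_i) and an income effect (via income w):

      \[
        \frac{\partial x_i}{\partial p_j}
          = \frac{\partial h_i}{\partial p_j}
          - x_j \frac{\partial x_i}{\partial w} .
      \]

    A Giffen good is an inferior good (\partial x_i / \partial w < 0) whose income effect is strong enough to outweigh the substitution effect, so that demand rises with its own price (i = j). In the study’s analogy, drug dosage plays the role of the price and respiration demand that of the Giffen good.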
    One of the key benefits of this law is that it can be used to understand metabolic systems about which few details are known. “Disturbances in metabolic systems lead to a variety of diseases, and our research could be used to propose new treatment strategies for diseases for which treatments are not fully understood,” said Hatakeyama. “In addition, many foods and medicines are made using the metabolic systems of organisms. By applying the simple equation found in this study, we can know how to increase the output of products made with these systems.” Hatakeyama hopes that through further interdisciplinary research, more universal laws might be discovered that will lead to a variety of useful applications.

  • Targeted prevention helps stop homelessness before it starts

    Homelessness has become an increasingly worrisome crisis in our nation over the past several years, but a new study from the University of Notre Dame shows that efforts to prevent homelessness work.
    The issue has reached such proportions in California, for example, that mayors of several major cities have declared a state of emergency on homelessness. In response, leaders in California have invested billions in homelessness programs, including some that target prevention.
    Prevention efforts, however, have led to questions — even from organizations committed to addressing homelessness — as to whether such programs are effective, due to the difficulty of targeting assistance to those with the greatest risk of becoming homeless. To test the impact of providing financial assistance to those susceptible to losing their housing, researchers at Notre Dame conducted a randomized controlled trial to evaluate the effect of emergency financial assistance (EFA) on families receiving support through the Santa Clara County Homelessness Prevention System, which is co-led by Destination: Home, a nonprofit organization dedicated to ending homelessness in Silicon Valley.
    David Phillips, a research professor in the Wilson Sheehan Lab for Economic Opportunities (LEO) within Notre Dame’s economics department, and James Sullivan, a professor of economics and co-founder of LEO, found that people offered EFA were 81 percent less likely to become homeless within six months of enrollment and 73 percent less likely within 12 months, as reported in their study recently published by The Review of Economics and Statistics.
    The study evaluated individuals and families at imminent risk of being evicted or becoming homeless who were allocated EFA between July 2019 and December 2020, with the average household receiving nearly $2,000. Recipients were chosen from among a larger group of people eligible for the program based on their vulnerability to homelessness and on a randomized system set up by LEO and Destination: Home. This temporary financial assistance helped pay rent, utilities or other housing-related expenses on their behalf.
    A common approach to fighting homelessness is to provide shelter to those who are already homeless, but the researchers argued that once a family or individual becomes homeless, they face even more difficulties — such as finding permanent housing, basic necessities and health care. They are also more likely to become involved in the criminal justice system and experience frequent hospital visits. LEO’s study found that a preventive approach focusing directly on helping those who are on the brink of homelessness can also be effective.
    “Our estimates suggest that the benefits to homelessness prevention exceed the costs,” the researchers said. They estimated that communities get $2.47 back in benefits per net dollar spent on emergency financial assistance.
    “Policymakers at all levels are struggling to make really hard decisions about how to allocate scarce resources to address this pervasive problem,” Sullivan said. “But this study shows that you can actually target the intervention to those at risk, which moves the needle on homelessness enough to justify making the investment.”
    Phillips added that while homelessness prevention programs are not a panacea for the other problems often associated with the most visible forms of homelessness, such as health and substance abuse issues, they are still an effective way to help people.
    “Every person who ends up homeless is a little different from the next, and the reasons they’re there are different, but it’s the kind of help they need at the moment they need it, before everything falls apart,” Phillips said.
    One of LEO’s main tenets is to take a rigorous approach to fighting poverty by helping service providers apply scientific evaluation methods to better understand and share effective poverty interventions. Said Sullivan, “A big part of LEO’s mission is to create evidence that helps improve the lives of those most vulnerable. Because we have far greater needs than we have resources to address them, we have a real incentive to allocate those resources to the programs that are most effective. This evidence helps shape the decisions of those on the front lines fighting homelessness and poverty.”
    Jennifer Loving, chief executive officer of Destination: Home, said the LEO study has implications both locally and nationally. “This could inspire other jurisdictions to stand up their own homelessness prevention systems, using this research as a model or starting point for how to do that on their own — as well as justification to policymakers for funding,” Loving said.

  • Surgical and engineering innovations enable unprecedented control over every finger of a bionic hand

    Prosthetic limbs are the most common solution for replacing a lost extremity. However, they are hard to control, often unreliable, and typically offer only a couple of movements. Remnant muscles in the residual limb are the preferred source of control for bionic hands. This is because patients can contract muscles at will, and the electrical activity generated by the contractions can be used to tell the prosthetic hand what to do, for instance, open or close. A major problem at higher amputation levels, such as above the elbow, is that not many muscles remain to command the many robotic joints needed to truly restore the function of an arm and hand.
    A multidisciplinary team of surgeons and engineers has circumvented this problem by reconfiguring the residual limb and integrating sensors and a skeletal implant that connect with a prosthesis electrically and mechanically. By dissecting the peripheral nerves and redistributing them to new muscle targets that serve as biological amplifiers, the team gave the bionic prosthesis access to much more information, so the user can command many robotic joints at will (video: https://youtu.be/h1N-vKku0hg).
    The research was led by Professor Max Ortiz Catalan, Founding Director of the Center for Bionics and Pain Research (CBPR) in Sweden, Head of Neural Prosthetics Research at the Bionics Institute in Australia, and Professor of Bionics at Chalmers University of Technology in Sweden.
    “In this article, we show that rewiring nerves to different muscle targets in a distributed and concurrent manner is not only possible but also conducive to improved prosthetic control. A key feature of our work is that we have the possibility to clinically implement more refined surgical procedures and embed sensors in the neuromuscular constructs at the time of the surgery, which we then connect to the electronic system of the prosthesis via an osseointegrated interface. A.I. algorithms take care of the rest,” says Ortiz Catalan.
    Prosthetic limbs are commonly attached to the body by a socket that compresses the residual limb, causing discomfort and mechanical instability. An alternative to socket attachment is a titanium implant placed within the residual bone, where it becomes strongly anchored, a technique known as osseointegration. Such skeletal attachment allows for a comfortable and more efficient mechanical connection of the prosthesis to the body.
    “It is rewarding to see that our cutting-edge surgical and engineering innovation can provide such a high level of functionality for an individual with an arm amputation. This achievement is based on over 30 years of gradual development of the concept, to which I am proud to have contributed,” comments Dr. Rickard Brånemark, research affiliate at MIT, associate professor at Gothenburg University and CEO of Integrum, a leading expert on osseointegration for limb prostheses, who conducted the implantation of the interface.
    The surgery took place at the Sahlgrenska University Hospital, Sweden, where CBPR is located. The neuromuscular reconstruction procedure was conducted by Dr. Paolo Sassu, who also led the first hand transplantation performed in Scandinavia.
    “The incredible journey we have undertaken together with the bionic engineers at CBPR has allowed us to combine new microsurgical techniques with sophisticated implanted electrodes that provide single-finger control of a prosthetic arm as well as sensory feedback. Patients who have suffered from an arm amputation might now see a brighter future,” says Dr. Sassu, who is presently working at the Istituto Ortopedico Rizzoli in Italy.
    The Science Translational Medicine article illustrates how the transferred nerves progressively connected to their new host muscles. Once the innervation process had advanced enough, the researchers connected the constructs to the prosthesis so the patient could control every finger of a prosthetic hand as if it were his own (video: https://youtu.be/FdDdZQg58kc). The researchers also demonstrated how the system responds in activities of daily life (video: https://youtu.be/yC24WRoGIe8) and are currently working to further improve the controllability of the bionic hand.

  • Robot team on lunar exploration tour

    On the Moon, there are raw materials that humanity could one day mine and use. Various space agencies, such as the European Space Agency (ESA), are already planning missions to better explore Earth’s satellite and find minerals. This calls for appropriate exploration vehicles. Swiss researchers led by ETH Zurich are now pursuing the idea of sending not just one solitary rover on an exploration tour, but rather an entire team of vehicles and flying devices that complement each other.
    The researchers equipped three ANYmal robots (a type of legged robot developed at ETH) with a range of measuring and analysis instruments that could make them suitable exploration devices in the future. They tested these robots on various terrains in Switzerland and at the European Space Resources Innovation Centre (ESRIC) in Luxembourg, where, a few months ago, the Swiss team together with colleagues from Germany won a European competition for lunar exploration robots. The competition involved finding and identifying minerals on a test site modelled after the surface of the Moon. In the latest issue of the journal Science Robotics, the scientists describe how they go about exploring unknown terrain using a team of robots.
    Insurance against failure
    “Using multiple robots has two advantages,” explains Philip Arm, a doctoral student in the group led by ETH Professor Marco Hutter. “The individual robots can take on specialised tasks and perform them simultaneously. Moreover, thanks to its redundancy, a robot team is able to compensate for a teammate’s failure.” Redundancy in this case means that important measuring equipment is installed on several robots; specialisation, by contrast, means concentrating a capability in a single robot that is particularly good at it. The two are therefore opposing goals. “Getting the benefits of both is a matter of finding the right balance,” Arm says.
    The researchers at ETH Zurich and the Universities of Basel, Bern and Zurich solved this problem by equipping two of the legged robots as specialists. One robot was programmed to be particularly good at mapping the terrain and classifying the geology. It used a laser scanner and several cameras — some of them capable of spectral analysis — to gather initial clues about the mineral composition of the rock. The other specialist robot was taught to precisely identify rocks using a Raman spectrometer and a microscopy camera.
    The third robot was a generalist: it was able to both map the terrain and identify rocks, which meant that it had a broader range of tasks than the specialists. However, its equipment meant that it could perform these tasks with less precision. “This makes it possible to complete the mission should any one of the robots malfunction,” Arm says.
    Combination is key
    At the ESRIC and ESA Space Resources Challenge, the jury was particularly impressed that the researchers had built redundancy into their exploration system to make it resilient to potential failures. As a prize, the Swiss scientists and their colleagues from the FZI Research Center for Information Technology in Karlsruhe were awarded a one-year research contract to further develop this technology. In addition to legged robots, this work will also involve wheeled robots, building on the FZI researchers’ experience with such robots.
    “Legged robots like our ANYmal cope well in rocky and steep terrain, for example when it comes to climbing down into a crater,” explains Hendrik Kolvenbach, a senior scientist in Professor Hutter’s group. Robots with wheels are at a disadvantage in these kinds of conditions, but they can move faster on less challenging terrain. For a future mission, it would therefore make sense to combine robots that differ in terms of their mode of locomotion. Flying robots could also be added to the team.
    The researchers also plan to make the robots more autonomous. Presently, all data from the robots flows into a control centre, where an operator assigns tasks to the individual robots. In the future, semi-autonomous robots could directly assign certain tasks to each other, with control and intervention options for the operator.
    Video: https://youtu.be/bqwbQzVrzkQ

  • Generative AI ‘fools’ scientists with artificial data, bringing automated data analysis closer

    The same AI technology used to mimic human art can now synthesize artificial scientific data, advancing efforts toward fully automated data analysis.
    Researchers at the University of Illinois Urbana-Champaign have developed an AI that generates artificial data from microscopy experiments commonly used to characterize atomic-level material structures. Drawing on the technology underlying art generators, the AI lets the researchers incorporate background noise and experimental imperfections into the generated data, so that material features can be detected much faster and more efficiently than before.
    “Generative AIs take information and generate new things that haven’t existed before in the world, and now we’ve leveraged that for the goal of automated data analysis,” said Pinshane Huang, a U. of I. professor of materials science and engineering and a project co-lead. “What is used to make paintings of llamas in the style of Monet on the internet can now make scientific data so good it fools me and my colleagues.”
    Other forms of AI and machine learning are routinely used in materials science to assist with data analysis, but they require frequent, time-consuming human intervention. Making these analysis routines more efficient requires a large set of labeled data to show the program what to look for. Moreover, to be effective, the data set needs to account for a wide range of background noise and experimental imperfections, effects that are difficult to model.
    Since collecting and labeling such a vast data set using a real microscope is infeasible, Huang worked with U. of I. physics professor Bryan Clark to develop a generative AI that could create a large set of artificial training data from a comparatively small set of real, labeled data. To achieve this, the researchers used a cycle generative adversarial network, or CycleGAN.
    “You can think of a CycleGAN as a competition between two entities,” Clark said. “There’s a ‘generator’ whose job is to imitate a provided data set, and there’s a ‘discriminator’ whose job is to spot the differences between the generator and the real data. They take turns trying to foil each other, improving themselves based on what the other was able to do. Ultimately, the generator can produce artificial data that is virtually indistinguishable from the real data.”
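    As a minimal sketch of that competition (a generic GAN training loop on toy one-dimensional data, written with PyTorch; the networks, data and settings are illustrative assumptions and far simpler than the authors’ CycleGAN):

      # Generator G imitates a target distribution; discriminator D tries
      # to tell G's samples from real ones. They improve by turns.
      import torch
      import torch.nn as nn

      torch.manual_seed(0)

      def real_batch(n=64):
          # "Real" data: a Gaussian the generator must learn to imitate.
          return torch.randn(n, 1) * 0.5 + 2.0

      G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
      D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
      opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
      opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
      bce = nn.BCEWithLogitsLoss()

      for step in range(2000):
          # Discriminator turn: label real samples 1, generated samples 0.
          real, fake = real_batch(), G(torch.randn(64, 8)).detach()
          loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
          opt_d.zero_grad(); loss_d.backward(); opt_d.step()

          # Generator turn: try to make the discriminator call fakes real.
          fake = G(torch.randn(64, 8))
          loss_g = bce(D(fake), torch.ones(64, 1))
          opt_g.zero_grad(); loss_g.backward(); opt_g.step()

      # After training, G's samples should be hard to tell from real data.
      print(real_batch(1000).mean().item(), G(torch.randn(1000, 8)).mean().item())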
    By providing the CycleGAN with a small sample of real microscopy images, the AI learned to generate images that were used to train the analysis routine. It is now capable of recognizing a wide range of structural features despite the background noise and systematic imperfections.
    “The remarkable part of this is that we never had to tell the AI what things like background noise and imperfections like aberration in the microscope are,” Clark said. “That means even if there’s something that we hadn’t thought about, the CycleGAN can learn it and run with it.”
    Huang’s research group has incorporated the CycleGAN into their experiments to detect defects in two-dimensional semiconductors, a class of materials that is promising for applications in electronics and optics but is difficult to characterize without the aid of AI. However, she observed that the method has a much broader reach.
    “The dream is to one day have a ‘self-driving’ microscope, and the biggest barrier was understanding how to process the data,” she said. “Our work fills in this gap. We show how you can teach a microscope how to find interesting things without having to know what you’re looking for.”