More stories

  • Autonomous excavator constructs a 6-meter-high dry-stone wall

    Until now, dry-stone wall construction has involved vast amounts of manual labour. A multidisciplinary team of ETH Zurich researchers has developed a method for using an autonomous excavator to construct a dry-stone wall that is six metres high and sixty-five metres long. Dry-stone walls are resource efficient as they use locally sourced materials, such as concrete slabs that are low in embodied energy.
    ETH Zurich researchers deployed an autonomous excavator, called HEAP, to build the six-metre-high, sixty-five-metre-long dry-stone wall, which is embedded in a digitally planned and autonomously excavated landscape and park.
    The team of researchers included Gramazio Kohler Research, the Robotic Systems Lab, the Vision for Robotics Lab, and the Chair of Landscape Architecture. They developed this innovative design application as part of the National Centre of Competence in Research for Digital Fabrication (NCCR dfab).
    Using sensors, the excavator can autonomously draw a 3D map of the construction site and localise existing building blocks and stones for the wall’s construction. Specifically designed tools and machine vision approaches enable the excavator to scan and grab large stones in its immediate environment. It can also register their approximate weight as well as their centre of gravity.
    An algorithm determines the best position for each stone, and the excavator then carries out the task itself, placing the stones in the desired locations. The autonomous machine can place 20 to 30 stones in a single consignment, about as many as one delivery can supply. More
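
    To make the placement step concrete, here is a minimal greedy selection sketch in Python. It is purely hypothetical: the data structures, scoring terms, and numbers are illustrative assumptions and are not taken from the ETH Zurich HEAP system.

    ```python
    import math
    from dataclasses import dataclass

    # Hypothetical sketch of a greedy stone-selection step, loosely following the
    # article's description (scan stones, estimate weight and centre of gravity,
    # pick the best candidate for the next gap). Not the ETH Zurich HEAP planner.

    @dataclass
    class Stone:
        id: int
        weight: float        # estimated weight in kg
        cog_height: float    # estimated centre-of-gravity height in m
        footprint: float     # approximate contact area in m^2

    def placement_score(stone, gap_area, max_weight):
        """Prefer stones that fill the gap snugly, sit low, and stay liftable."""
        if stone.weight > max_weight:
            return -math.inf              # too heavy to lift safely
        fit = 1.0 - abs(stone.footprint - gap_area) / gap_area
        stability = 1.0 / (1.0 + stone.cog_height)
        return fit + stability

    def choose_stone(stones, gap_area, max_weight=1500.0):
        return max(stones, key=lambda s: placement_score(s, gap_area, max_weight))

    stones = [
        Stone(1, 800.0, 0.2, 0.6),
        Stone(2, 1900.0, 0.1, 0.7),       # exceeds the payload limit, will be skipped
        Stone(3, 650.0, 0.4, 0.5),
    ]
    print(choose_stone(stones, gap_area=0.55).id)   # -> 1
    ```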

  • Hybrid transistors set stage for integration of biology and microelectronics

    Your phone may have more than 15 billion tiny transistors packed into its microprocessor chips. The transistors are made of silicon, metals like gold and copper, and insulators that together take an electric current and convert it to 1s and 0s to communicate information and store it. The transistor materials are inorganic, basically derived from rock and metal.
    But what if you could make these fundamental electronic components part biological, able to respond directly to the environment and change like living tissue?
    This is what a team at Tufts University’s Silklab did when they created transistors in which the insulating material is replaced with biological silk. They reported their findings in Advanced Materials.
    Silk fibroin — the structural protein of silk fibers — can be precisely deposited onto surfaces and easily modified with other chemical and biological molecules to change its properties. Silk functionalized in this manner can pick up and detect a wide range of components from the body or environment.
    The team’s first demonstration of a prototype device used the hybrid transistors to make a highly sensitive and ultrafast breath sensor, detecting changes in humidity. Further modifications of the silk layer could enable devices to detect some cardiovascular and pulmonary diseases, as well as sleep apnea, or pick up carbon dioxide levels and other gases and molecules in the breath that might provide diagnostic information. Used with blood plasma, they could potentially provide information on levels of oxygenation and glucose, circulating antibodies, and more.
    Prior to the development of the hybrid transistors, the Silklab, led by Fiorenzo Omenetto, the Frank C. Doble Professor of Engineering, had already used fibroin to make bioactive inks for fabrics that can detect changes in the environment or on the body, sensing tattoos that can be placed under the skin or on the teeth to monitor health and diet, and sensors that can be printed on any surface to detect pathogens such as the virus responsible for COVID-19.
    How It Works
    A transistor is simply an electrical switch, with a metal electrical lead coming in and another going out. In between the leads is the semiconductor material, so-called because it’s not able to conduct electricity unless coaxed.

    Another source of electrical input called a gate is separated from everything else by an insulator. The gate acts as the “key” to turn the transistor on and off. It triggers the on-state when a threshold voltage, which we will call “1”, creates an electric field across the insulator, priming electron movement in the semiconductor and starting the flow of current through the leads.
    In a biological hybrid transistor, a silk layer is used as the insulator, and when it absorbs moisture, it acts like a gel carrying whatever ions (electrically charged molecules) are contained within. The gate triggers the on-state by rearranging ions in the silk gel. By changing the ionic composition in the silk, the transistor operation changes, allowing it to be triggered by any gate value between zero and one.
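
    To make the contrast with a conventional, fixed-threshold transistor concrete, here is a toy numerical sketch. It is an illustration under made-up numbers, not a model of the Silklab device: the smooth switching function and the threshold values are assumptions.

    ```python
    import numpy as np

    # Toy picture: a conventional transistor switches on when the gate voltage
    # crosses a fixed threshold, whereas the silk-gated hybrid lets that
    # effective threshold be shifted anywhere between 0 and 1 by changing the
    # ions held in the silk insulator. Values are illustrative only.

    def drain_current(gate_v, threshold, steepness=40.0):
        """Smooth switch: near 0 below the threshold, near 1 above it."""
        return 1.0 / (1.0 + np.exp(-steepness * (gate_v - threshold)))

    gate_sweep = np.linspace(0.0, 1.0, 11)
    for threshold in (0.5, 0.3, 0.7):   # 0.5 ~ fixed threshold; others ~ ion-tuned silk
        print(f"threshold={threshold}:", np.round(drain_current(gate_sweep, threshold), 2))
    ```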
    “You could imagine creating circuits that make use of information that is not represented by the discrete binary levels used in digital computing, but can process variable information as in analog computing, with the variation caused by changing what’s inside the silk insulator,” said Omenetto. “This opens up the possibility of introducing biology into computing within modern microprocessors.” Of course, the most powerful known biological computer is the brain, which processes information with variable levels of chemical and electrical signals.
    The technical challenge in creating hybrid biological transistors was to achieve silk processing at the nanoscale, down to 10 nm, or less than 1/10,000th the diameter of a human hair. “Having achieved that, we can now make hybrid transistors with the same fabrication processes that are used for commercial chip manufacturing,” said Beom Joon Kim, postdoctoral researcher at the School of Engineering. “This means you can make a billion of these with capabilities available today.”
    Having billions of transistor nodes with connections reconfigured by biological processes in the silk could lead to microprocessors that act like the neural networks used in AI. “Looking ahead, one could imagine having integrated circuits that train themselves, respond to environmental signals, and record memory directly in the transistors rather than sending it to separate storage,” said Omenetto.
    Devices detecting and responding to more complex biological states, as well as large-scale analog and neuromorphic computing are yet to be created. Omenetto is optimistic for future opportunities. “This opens up a new way of thinking about the interface between electronics and biology, with many important fundamental discoveries and applications ahead.” More

  • AI for perovskite solar cells: Key to better manufacturing

    Tandem solar cells based on perovskite semiconductors convert sunlight to electricity more efficiently than conventional silicon solar cells. To make this technology ready for the market, further improvements in stability and manufacturing processes are required. Researchers at Karlsruhe Institute of Technology (KIT) and two Helmholtz platforms, Helmholtz Imaging at the German Cancer Research Center (DKFZ) and Helmholtz AI, have found a way to predict the quality of the perovskite layers and, consequently, that of the resulting solar cells: assisted by Machine Learning and new methods in Artificial Intelligence (AI), it is possible to assess their quality from variations in light emission already during the manufacturing process.
    Perovskite tandem solar cells combine a perovskite solar cell with a conventional solar cell, for example one based on silicon. These cells are considered a next-generation technology: they currently boast an efficiency of more than 33 percent, much higher than that of conventional silicon solar cells. Moreover, they use inexpensive raw materials and are easily manufactured. To achieve this level of efficiency, an extremely thin, high-grade perovskite layer, only a fraction of the thickness of a human hair, has to be produced. “Manufacturing these high-grade, multi-crystalline thin layers without any deficiencies or holes using low-cost and scalable methods is one of the biggest challenges,” says tenure-track professor Ulrich W. Paetzold, who conducts research at the Institute of Microstructure Technology and the Light Technology Institute of KIT. Even under apparently perfect lab conditions, there may be unknown factors that cause variations in semiconductor layer quality: “This drawback eventually prevents a quick start of industrial-scale production of these highly efficient solar cells, which are needed so badly for the energy turnaround,” explains Paetzold.
    AI Finds Hidden Signs of Effective Coating
    To find the factors that influence coating, an interdisciplinary team consisting of the perovskite solar cell experts of KIT has joined forces with specialists for Machine Learning and Explainable Artificial Intelligence (XAI) of Helmholtz Imaging and Helmholtz AI at the DKFZ in Heidelberg. The researchers developed AI methods that train and analyze neural networks using a huge dataset. This dataset includes video recordings that show the photoluminescence of the thin perovskite layers during the manufacturing process. Photoluminescence refers to the radiant emission of the semiconductor layers that have been excited by an external light source. “Since even experts could not see anything particular on the thin layers, the idea was born to train an AI system for Machine Learning (Deep Learning) to detect hidden signs of good or poor coating from the millions of data items on the videos,” Lukas Klein and Sebastian Ziegler from Helmholtz Imaging at the DKFZ explain.
    To filter and analyze the widely scattered indications output by the Deep Learning AI system, the researchers subsequently relied on methods of Explainable Artificial Intelligence.
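
    As a schematic of what such a pipeline can look like, the sketch below shows a small convolutional regressor that maps a single photoluminescence frame to a predicted efficiency, with a simple gradient-based saliency map standing in for the more sophisticated XAI attribution methods used in the study. The architecture, shapes, and data are assumptions, not the KIT/Helmholtz setup.

    ```python
    import torch
    import torch.nn as nn

    # Illustrative only: predict a quality/efficiency score from one
    # photoluminescence frame, then ask which pixels drove the prediction.
    class PLRegressor(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(8), nn.Flatten(),
                nn.Linear(16 * 8 * 8, 1),
            )

        def forward(self, x):
            return self.net(x)

    model = PLRegressor()
    frame = torch.rand(1, 1, 64, 64, requires_grad=True)   # one synthetic PL frame
    predicted_efficiency = model(frame)
    predicted_efficiency.backward()                         # gradients w.r.t. input pixels
    saliency = frame.grad.abs().squeeze()                   # crude "explanation" map
    print(predicted_efficiency.item(), saliency.shape)
    ```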
    “A Blueprint for Follow-Up Research”
    The researchers found out experimentally that the photoluminescence varies during production and that this phenomenon has an influence on the coating quality. “Key to our work was the targeted use of XAI methods to see which factors have to be changed to obtain a high-grade solar cell,” Klein and Ziegler say. This is not the usual approach. In most cases, XAI is only used as a kind of guardrail to avoid mistakes when building AI models. “This is a change of paradigm: Gaining highly relevant insights in materials science in such a systematic way is a totally new experience.” It was indeed the conclusion drawn from the photoluminescence variation that enabled the researchers to take the next step. After the neural networks had been trained accordingly, the AI was able to predict whether each solar cell would achieve a low or a high level of efficiency based on which variation of light emission occurred at what point in the manufacturing process. “These are extremely exciting results,” emphasizes Ulrich W. Paetzold. “Thanks to the combined use of AI, we have a solid clue and know which parameters need to be changed in the first place to improve production. Now we are able to conduct our experiments in a more targeted way and are no longer forced to look blindfolded for the needle in a haystack. This is a blueprint for follow-up research that also applies to many other aspects of energy research and materials science.” More

  • First experimental evidence of hopfions in crystals opens up new dimension for future technology

    Hopfions, magnetic spin structures predicted decades ago, have become a hot and challenging research topic in recent years. In a study published in Nature today, the first experimental evidence is presented by a Swedish-German-Chinese research collaboration.
    “Our results are important from both a fundamental and applied point of view, as a new bridge has emerged between experimental physics and abstract mathematical theory, potentially leading to hopfions finding an application in spintronics,” says Philipp Rybakov, researcher at the Department of Physics and Astronomy at Uppsala University, Sweden.
    A deeper understanding of how different components of materials function is important for the development of innovative materials and future technology. The research field of spintronics, for example, which studies the spin of electrons, has opened up promising possibilities for combining the electron’s electricity and magnetism in applications such as new electronics.
    Magnetic skyrmions and hopfions are topological structures, well-localized field configurations that have been a hot research topic over the past decade owing to their unique particle-like properties, which make them promising objects for spintronic applications. Skyrmions are two-dimensional, resembling vortex-like strings, while hopfions are three-dimensional structures within a magnetic sample volume, resembling closed, twisted skyrmion strings that in the simplest case take the shape of a donut-like ring.
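
    For readers who want the topology spelled out, the invariants behind these labels can be written down explicitly. These are textbook definitions in one common convention (signs and prefactors vary between papers); they are not reproduced from the Nature study itself.

    ```latex
    % Skyrmion number of a two-dimensional magnetisation texture n(x, y):
    N_{\mathrm{sk}} = \frac{1}{4\pi} \int \mathbf{n} \cdot
        \left( \partial_x \mathbf{n} \times \partial_y \mathbf{n} \right) \, dx \, dy

    % Hopf index of a three-dimensional texture n(x, y, z), written via the
    % emergent field F_i = (1/8\pi) \, \epsilon_{ijk} \, \mathbf{n} \cdot
    % ( \partial_j \mathbf{n} \times \partial_k \mathbf{n} )
    % and a vector potential A chosen so that \nabla \times \mathbf{A} = \mathbf{F}:
    Q_{\mathrm{H}} = \int \mathbf{F} \cdot \mathbf{A} \; d^{3}x
    ```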
    Despite extensive research in recent years, direct observation of magnetic hopfions has only been reported in synthetic material. This current work is the first experimental evidence of such states stabilised in a crystal of B20-type FeGe plates using transmission electron microscopy and holography. The results are highly reproducible and in full agreement with micromagnetic simulations. The researchers provide a unified skyrmion-hopfion homotopy classification and offer an insight into the diversity of topological solitons in three-dimensional chiral magnets.
    The findings open up new fields in experimental physics: identifying other crystals in which hopfions are stable, studying how hopfions interact with electric and spin currents, hopfion dynamics, and more.
    “Since the object is new and many of its interesting properties remain to be discovered, it is difficult to make predictions about specific spintronic applications. However, we can speculate that hopfions may be of greatest interest when upgrading to the third dimension of almost any technology being developed with magnetic skyrmions: racetrack memory, neuromorphic computing, and qubits (basic unit of quantum information). Compared to skyrmions, hopfions have an additional degree of freedom due to three-dimensionality and thus can move in three rather than two dimensions,” explains Rybakov. More

  • Medical AI tool gets human thumbs-up

    A new artificial intelligence computer program created by researchers at the University of Florida and NVIDIA can generate doctors’ notes so well that two physicians couldn’t tell the difference, according to an early study from both groups.
    In this proof-of-concept study, physicians reviewed patient notes — some written by actual medical doctors while others were created by the new AI program — and the physicians identified the correct author only 49% of the time.
    A team of 19 researchers from NVIDIA and the University of Florida said their findings, published Nov. 16 in the Nature journal npj Digital Medicine, open the door for AI to support health care workers with groundbreaking efficiencies.
    The researchers used supercomputers to train a new model, GatorTronGPT, which functions similarly to ChatGPT and can generate medical records. The free versions of GatorTron™ models have more than 430,000 downloads from Hugging Face, an open-source AI website. GatorTron™ models are the site’s only models available for clinical research, according to the article’s lead author Yonghui Wu, Ph.D., from the UF College of Medicine’s department of health outcomes and biomedical informatics.
    “In health care, everyone is talking about these models. GatorTron™ and GatorTronGPT are unique AI models that can power many aspects of medical research and health care. Yet, they require massive data and extensive computing power to build. We are grateful to have this supercomputer, HiPerGator, from NVIDIA to explore the potential of AI in health care,” Wu said.
    UF alumnus and NVIDIA co-founder Chris Malachowsky is the namesake of UF’s new Malachowsky Hall for Data Science & Information Technology. A public-private partnership between UF and NVIDIA helped to fund this $150 million structure. In 2021, UF upgraded its HiPerGator supercomputer to elite status with a multimillion-dollar infrastructure package from NVIDIA, the first at a university.
    For this research, Wu and his colleagues developed a large language model that allows computers to mimic natural human language. These models work well with standard writing or conversations, but medical records bring additional hurdles, such as needing to protect patients’ privacy and being highly technical. Digital medical records cannot be Googled or shared on Wikipedia.

    To overcome these obstacles, the researchers stripped UF Health medical records of identifying information from 2 million patients while keeping 82 billion useful medical words. Combining this set with another dataset of 195 billion words, they trained the GatorTronGPT model to analyze the medical data using the GPT-3 (Generative Pre-trained Transformer) architecture, a form of neural network. That allowed GatorTronGPT to write clinical text similar to medical doctors’ notes.
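
    For a sense of what training a GPT-style model on clinical text means mechanically, here is a minimal, purely illustrative sketch using the Hugging Face transformers library. The model size, tokenizer, and example note are placeholder assumptions with no relation to the actual GatorTronGPT configuration or data.

    ```python
    from transformers import GPT2Config, GPT2LMHeadModel, GPT2TokenizerFast

    # Minimal causal (next-token) language-modelling step on a de-identified
    # clinical note. Sizes and text are placeholders, not the GatorTronGPT setup.
    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    config = GPT2Config(vocab_size=tokenizer.vocab_size, n_layer=4, n_head=4, n_embd=256)
    model = GPT2LMHeadModel(config)

    note = "chief complaint: chest pain for two days. history of hypertension."
    batch = tokenizer(note, return_tensors="pt")
    out = model(**batch, labels=batch["input_ids"])   # loss for predicting each next token
    out.loss.backward()                               # one step of "learning to write notes"
    print(float(out.loss))
    ```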
    “This GatorTronGPT model is one of the first major products from UF’s initiative to incorporate AI across the university. We are so pleased with how the partnership with NVIDIA is already bearing fruit and setting the stage for the future of medicine,” said Elizabeth Shenkman, Ph.D., a co-author and chair of UF’s department of health outcomes and biomedical informatics.
    Of the many possible uses for a medical GPT, one idea involves replacing the tedium of documentation with notes recorded and transcribed by AI. Wu says that UF has an innovation center that is pursuing a commercial version of the software.
    For an AI tool to reach such parity with human writing, programmers spend weeks programming supercomputers with clinical vocabulary and language usage based on billions upon billions of words. One resource providing the necessary clinical data is the OneFlorida+ Clinical Research Network, coordinated at UF and representing many health care systems.
    “It’s critical to have such massive amounts of UF Health clinical data not only available but ready for AI. Only a supercomputer could handle such a big dataset of 277 billion words. We are excited to implement GatorTron™ and GatorTronGPT models to real-world health care at UF Health,” said Jiang Bian, Ph.D., a co-author and UF Health’s chief data scientist and chief research information officer.
    A cross-section of 14 UF and UF Health faculty contributed to this study, including researchers from Research Computing, Integrated Data Repository Research Services within the Clinical and Translational Science Institute, and from departments and divisions within the College of Medicine, including neurosurgery, endocrinology, diabetes and metabolism, cardiovascular medicine, and health outcomes and biomedical informatics.
    The study was partially funded by grants from the Patient-Centered Outcomes Research Institute, the National Cancer Institute and the National Institute on Aging.
    Here are two paragraphs that reference two patient cases, one written by a human and one created by GatorTronGPT. Can you tell whether the author was machine or human? More

  • Computer simulation suggests mutant strains of COVID-19 emerged in response to human behavior

    Using artificial intelligence technology and mathematical modeling, a research group led by Nagoya University has revealed that human behavior, such as lockdowns and isolation measures, affects the evolution of new strains of COVID-19. SARS-CoV-2, the virus that causes COVID-19, evolved to become more transmissible earlier in its lifecycle. The researchers’ findings, published in Nature Communications, provide new insights into the relationship between how people behave and disease-causing agents.
    As with any other living organism, viruses evolve over time. Those with survival advantages become dominant in the gene pool. Many environmental factors influence this evolution, including human behavior. By isolating sick people and using lockdowns to control outbreaks, humans may alter virus evolution in complicated ways. Predicting how these changes occur is vital to develop adaptive treatments and interventions.
    An important concept in this interaction is viral load, which refers to the amount or concentration of a virus present per ml of a bodily fluid. In SARS-CoV-2, a higher viral load in respiratory secretions increases the risk of transmission through droplets. Viral load relates to the potential to transmit a virus to others. For example, a virus like Ebola has an exceptionally high viral load, whereas the common cold has a low one. However, viruses must perform a careful balancing act, as increasing the maximum viral load can be advantageous, but an excessive viral load may cause individuals to become too sick to transmit the virus to others.
    The research group led by Professor Shingo Iwami at the Nagoya University Graduate School of Science identified trends using mathematical modeling with an artificial intelligence component to investigate previously published clinical data. They found that the SARS-CoV-2 variants that were most successful at spreading had an earlier and higher peak in viral load. However, as the virus evolved from the pre-Alpha to the Delta variants, it had a shorter duration of infection. The researchers also found that the decreased incubation period and the increased proportion of asymptomatic infections recorded as the virus mutated affected its evolution.
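
    The kind of mathematical modeling referred to here typically means fitting a small system of within-host ordinary differential equations to viral-load measurements. The sketch below shows a standard target-cell-limited model with made-up parameter values; it illustrates the general approach, not the group’s actual model or estimates.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Standard target-cell-limited model of within-host viral dynamics.
    # T: uninfected target cells, I: infected cells, V: free virus.
    # Parameter values are illustrative assumptions, not fitted estimates.
    beta, delta, p, c = 1e-6, 1.0, 100.0, 3.0   # infection, cell death, production, clearance

    def model(t, y):
        T, I, V = y
        return [-beta * T * V,
                beta * T * V - delta * I,
                p * I - c * V]

    sol = solve_ivp(model, (0, 21), [1e7, 0.0, 1.0], dense_output=True)
    t = np.linspace(0, 21, 211)
    V = sol.sol(t)[2]
    print(f"peak viral load {V.max():.2e} on day {t[V.argmax()]:.1f}")
    ```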
    The results showed a clear difference. As the virus evolved from the Wuhan strain to the Delta strain, they found a 5-fold increase in the maximum viral load and a 1.5-fold increase in the number of days before the viral load peaked.
    Iwami and his colleagues suggest that human behavioral changes in response to the virus, designed to limit transmission, increased the selection pressure on the virus. This caused SARS-CoV-2 to be transmitted mainly during the asymptomatic and presymptomatic periods, which occur earlier in its infectious cycle. As a result, the viral load peak shifted toward this earlier window, allowing the virus to spread more effectively during the presymptomatic stages.
    When evaluating public health strategies in response to COVID-19 and any future potentially pandemic-causing pathogens, it is necessary to consider the impact of changes in human behavior on virus evolution patterns. “We expect that immune pressure from vaccinations and/or previous infections drives the evolution of SARS-CoV-2,” Iwami said. “However, our study found that human behavior can also contribute to the virus’s evolution in a more complicated manner, suggesting the need to reevaluate virus evolution.”
    Their study suggests the possibility that new strains of coronavirus evolved because of a complex interaction between clinical symptoms and human behavior. The group hopes that their research will speed up the establishment of testing regimes for adaptive treatment, effective screening, and isolation strategies. More

  • How we play together

    Intense focus pervades the EEG laboratory at the University of Konstanz on this day of experimentation. In separate labs, two participants, connected by screens, engage in the computer game Pacman. The burning question: Can strangers, unable to communicate directly, synchronize their efforts to conquer the digital realm together?
    Doctoral candidate Karl-Philipp Flösch is leading today’s experiment. He states: “Our research revolves around cooperative behaviour and the adoption of social roles.” However, understanding brain processes underlying cooperative behaviour is still in its infancy, presenting a central challenge for cognitive neuroscience. How can cooperative behaviour be brought into a highly structured EEG laboratory environment without making it feel artificial or boring for study participants?
    Pacman as a scientific “playground”
    The research team, led by Harald Schupp, Professor of Biological Psychology at the University of Konstanz, envisioned using the well-known computer game Pacman as a natural medium to study cooperative behaviour in the EEG laboratory. Conducting the study as part of the Cluster of Excellence Centre for the Advanced Study of Collective Behaviour, they recently published their findings in Psychophysiology.
    “Pacman is a cultural icon. Many have navigated the voracious Pacman through mazes in their youth, aiming to devour fruits and outsmart hostile ghosts,” reminisces Karl-Philipp Flösch. Collaborating with colleagues, co-author Tobias Flaisch adapted the game. In the EEG version, two players instead of one must collaboratively guide Pacman to the goal. Flaisch explains: “Success hinges on cooperative behaviour, as players must seamlessly work together.”
    However, the researchers built in a special hurdle: the labyrinth’s path is concealed. Only one of the two players can see where Pacman is going next. Flösch elaborates: “The active player can communicate the direction to the partner, but only indirectly, using pre-agreed symbols communicated solely through the computer screen.” Anyone who fails to recall quickly enough that a crescent moon on the screen means Pacman should move right, and that only the banana key on the keyboard makes Pacman move to the right, will make mistakes. “From the perspective of classical psychological research, the game combines various skills inherent in natural social situations,” notes Harald Schupp.
    EEG measures event-related potentials
    During each game, the players’ brain reactions were measured using EEG. Calculating event-related potentials provides a detailed view of the effects elicited by different game roles with millisecond-level temporal precision. The team hypothesized that the game role significantly influences brain reactions. Therefore, they examined the P3 component, a well-studied brain reaction exhibiting a stronger deflection in the presence of significant and task-relevant stimuli. The results confirmed their assumption: “The P3 was increased not only when the symbol indicated the next move’s direction but also when observing whether the game partner selected the correct symbol,” says Flösch. The team concludes that the role we take on during cooperation determines the informational value of environmental stimuli situationally. EEG measurements allow the brain processes involved to be dynamically mapped.
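
    For readers unfamiliar with the method, here is a minimal sketch of how an event-related potential such as the P3 is obtained: the continuous EEG is cut into epochs time-locked to each stimulus and averaged, so that activity unrelated to the event cancels out. The synthetic data and numbers below are assumptions for illustration, not the Konstanz pipeline.

    ```python
    import numpy as np

    fs = 250                                     # sampling rate in Hz
    n_trials, epoch_len = 60, fs                 # 60 stimuli, 1-second epochs
    rng = np.random.default_rng(0)

    t = np.arange(epoch_len) / fs
    p3 = 5e-6 * np.exp(-((t - 0.35) ** 2) / (2 * 0.05 ** 2))     # ~5 µV bump near 350 ms
    epochs = rng.normal(0.0, 20e-6, (n_trials, epoch_len)) + p3  # noisy single trials

    erp = epochs.mean(axis=0)                    # averaging reveals the P3
    print(f"ERP peak at {1000 * t[erp.argmax()]:.0f} ms")
    ```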
    “Cooperative role adoption structures our entire society,” summarizes Schupp, providing context for the study. “An individual achieves little alone, but collectively, humanity even reaches the moon. Our technological society hinges on cooperative behavior,” says Flösch, adding that children take on individual roles early in life, thereby learning the art of complex cooperation. Consequently, this role adoption occurs nearly effortlessly and automatically for us every day. “Our brains are practically ‘built’ for it, as evidenced by the results of our study.” More

  • Long in the Bluetooth: Scientists develop a more efficient way to transmit data between our devices

    University of Sussex researchers have developed a more energy-efficient alternative for transmitting data that could potentially replace Bluetooth in mobile phones and other tech devices. With more and more of us owning smartphones and wearable tech, this more efficient way of connecting our devices promises longer battery life. Applied to wearable devices, it could even see us unlocking doors by touch or exchanging phone numbers by shaking hands.
    Professor Robert Prance and Professor Daniel Roggen, of the University of Sussex, have developed the use of electric waves, rather than electromagnetic waves, for a low-power way to transmit data at close range, while maintaining the high throughput needed for multimedia applications.
    Bluetooth, Wi-Fi, and 5G currently rely on electromagnetic modulation, a form of wireless technology that was developed over 125 years ago. In the late 19th century, the focus was to transmit data over long distances using electromagnetic waves. By contrast, electric field modulation uses short-range electric waves, an approach that consumes much less power than Bluetooth.
    As we tend to be in close proximity to our devices, electric field modulation offers a proven, more efficient method of connecting our devices, enabling longer lasting battery life when streaming music to headphones, taking calls, using fitness trackers, or interacting with smart home tech.
    The development could advance how we use tech in our day-to-day lives and enable a wide range of futuristic applications too. For example, a bracelet using this technology could allow phone numbers to be exchanged simply by shaking hands, or a door could be unlocked just by touching the handle.
    Daniel Roggen, Professor of Engineering and Design at the University of Sussex, explains:
    “We no longer need to rely on electromagnetic modulation, which is inherently battery hungry. We can improve the battery life of wearable technology and home assistants, for example, by using electric field modulation instead of Bluetooth. This solution will not only make our lives much more efficient, but it also opens novel opportunities to interact with devices in smart homes.
    “The technology is also low cost, meaning it could be rolled out to society quickly and easily. If this were mass produced, the solution can be miniaturised to a single chip and cost just a few pence per device, meaning that it could be used in all devices in the not-too-distant future.”
    The University of Sussex researchers are now seeking industrial partnerships to help further miniaturize the technology for personal devices. More