More stories

  •

    A chip with natural blood vessels

    How can we investigate the effects of a new drug? How can we better understand the interaction between different organs to grasp the systemic response? In biomedical research, so-called organs-on-a-chip, also referred to as microphysiological systems, are becoming increasingly important: by cultivating tissue structures in precisely controlled microfluidic chips, it is possible to conduct research much more accurately than in experiments involving living humans or animals.
    However, there has been a major obstacle: such mini-organs are incomplete without blood vessels. To facilitate systematic studies and ensure meaningful comparisons with living organisms, a network of perfusable blood vessels and capillaries must be created — in a way that is precisely controllable and reproducible. This is exactly what has now been achieved at TU Wien: the team established a method using ultrashort laser pulses to create tiny blood vessels in a rapid and reproducible manner. Experiments show that these vessels behave just like those in living tissue. Liver lobules have been created on a chip with great success.
    Real Cells in Artificial Microchannels
    “If you want to study how certain drugs are transported, metabolized and absorbed in different human tissues, you need the finest vascular networks,” says Alice Salvadori, a member of the Research Group 3D Printing and Biofabrication established by Prof. Aleksandr Ovsianikov at TU Wien.
    Ideally, such blood vessels have to be created directly within special materials called hydrogels. Hydrogels provide structural support for living cells while remaining permeable, much like natural tissue. By creating tiny channels within these hydrogels, it becomes possible to guide the formation of blood vessel-like structures: endothelial cells — the cells that line the inside of real blood vessels in the human body — can settle inside these channel networks. This creates a model that closely mimics the structure and function of natural blood vessels.
    The major challenge so far has been geometry: the shape and size of these microvascular networks have been difficult to control. In self-organization based approaches, vessel geometry varies significantly from one sample to another. This makes it impossible to run reproducible, precisely controlled experiments — yet that is exactly what is needed for reliable biomedical research.
    Improved Hydrogel and Laser Precision
    The team at TU Wien therefore relied on advanced laser technology: with the help of ultrashort laser pulses in the femtosecond range, highly precise 3D structures can be written directly into the hydrogel — quickly and efficiently.

    “We can create channels spaced only a hundred micrometers apart. That’s essential when you would like to replicate the natural density of blood vessels in specific organs,” says Aleksandr Ovsianikov.
    But it’s not just about precision: the artificial blood vessels have to be formed quickly and also remain structurally stable once they are populated with living cells. “We know that cells actively remodel their environment. That can lead to deformations or even to the collapse of vessels,” explains Alice Salvadori. “That’s why we also improved the material preparation process.”
    Instead of using the standard single-step gelation method, the team used a two-step thermal curing process: the hydrogel is warmed in two phases, at two different temperatures, rather than just one. This alters its network structure, producing a more stable material. The vessels formed within such material remain open and maintain their shape over time.
    “We have not only shown that we can produce artificial blood vessels that can actually be perfused. The even more important thing is: We have developed a scalable technology that can be used on an industrial scale,” says Aleksandr Ovsianikov. “It takes only 10 minutes to pattern 30 channels, which is at least 60 times faster than other techniques.”
    Simulating Inflammation: Natural Reactions on a Chip
    If biological processes are to be realistically modeled on a chip, the artificial tissues must behave like their natural counterparts. And this, too, has now been demonstrated:
    “We showed that these artificial blood vessels are colonized by endothelial cells that respond just like real ones in the body,” says Alice Salvadori. “For example, they react to inflammation in the same way — becoming more permeable, just like real blood vessels.”

    This marks an important step toward establishing lab-on-a-chip technology as an industrial standard in many fields of medical research.
    Big Success with Liver Tissue
    “Using this approach, we were able to vascularize a liver model. In collaboration with Keio University (Japan), we developed a liver lobule-on-chip that incorporates a controlled 3D vascular network, closely mimicking the in vivo arrangement of the central vein and sinusoids,” says Aleksandr Ovsianikov.
    “Replicating the liver’s dense and intricate microvasculature has long been a challenge in organ-on-chip research. By building multiple layers of microvessels spanning the entire tissue volume, we were able to ensure adequate nutrient and oxygen supply — which, in turn, led to improved metabolic activity in the liver model. We believe that these advancements bring us a step closer to integrating Organ-on-a-chip technology into preclinical drug discovery,” says Masafumi Watanabe (Keio University).
    “OoC technology and advanced laser technology work well together to create more reliable models of blood vessels and liver tissues. One important breakthrough is the ability to build tiny tissues on a chip that allow liquid to flow through them, similar to how blood flows in the body. This helps researchers better understand how blood flow affects cells. OoC technology also makes it possible to closely observe how cells react under a microscope. These models will help scientists study how the body works and may lead to better treatments and healthcare in the future,” says Prof. Ryo Sudo at Keio University.

  •

    Emotional responses crucial to attitudes about self-driving cars

    When it comes to public attitudes toward using self-driving cars, understanding how the vehicles work is important — but so are less obvious characteristics like feelings of excitement or pleasure and a belief in technology’s social benefits.
    Those are key insights of a new study from researchers at Washington State University, who are examining attitudes toward self-driving cars as the technology creeps toward the commercial market — and as questions persist about whether people will readily adopt them.
    The study, published in the journal Transportation Research, surveyed 323 people on their perceptions of autonomous vehicles. Researchers found that considerations such as how much people understand and trust the cars are important in determining whether they would eventually choose to use them.
    “But in addition, we found that some of the non-functional aspects of autonomous vehicles are also very important,” said Wei Peng, an assistant professor in the Edward R. Murrow College of Communication at WSU.
    These included the emotional value associated with using the cars, such as feelings of excitement, enjoyment or novelty; beliefs about the broader impact on society; and curiosity about learning how the technology works and its potential role in the future, Peng said.
    In addition, they found that respondents would want to give the technology a test drive before adopting it.
    “This is not something where you watch the news and say, ‘I want to buy it or I want to use it,'” Peng said. “People want to try it first.”
    The new paper is the latest research on the subject from Peng and doctoral student Kathryn Robinson-Tay. In a paper published in 2023, they examined whether people believed the vehicles were safe, finding that simply knowing more about how the cars work did not improve perceptions about risk — people needed to have more trust in them, too.

    The new study examined the next step in the decision-making chain: What would motivate people to actually use an autonomous vehicle?
    Answering that question is important as the technology moves toward becoming a reality on the roads. Already, carmakers are adding autonomous features to models, and self-driving taxis have begun operating in a handful of U.S. cities, such as Phoenix, San Francisco and Los Angeles. Fully self-driving vehicles could become available by 2035.
    It is estimated they could prevent 90% of accidents while improving mobility for people with limited access to transportation. However, achieving those benefits would require widespread, rapid adoption — a big hurdle given that public attitudes toward the cars have been persistently negative and the rollout of “robotaxis” has been bumpy, with some high-profile accidents and recalls. In a national survey by AAA released in February, 60 percent of respondents said they were afraid to use the cars.
    Widespread adoption would be crucial because roadways shared by self-driving and human-driven cars may not bring about safety improvements, in part because self-driving cars may not be able to anticipate and respond to unpredictable human drivers.
    One surprise in the study is that respondents did not trust vehicles more when they discovered they were easy to use — which opens a new question for future research: “What is it about thinking the car is easy to use that makes people trust it less?” Robinson-Tay asked.
    Attitudes about self-driving cars depend heavily on individual circumstances, and can be nuanced in surprising ways. For example, those with a strong “car-authority identity” — a personal investment in driving and displaying knowledge about automobiles — and more knowledge about self-driving cars were more likely to believe the cars would be easy to use.

    But respondents with more knowledge were less likely to view the cars as useful — a separate variable from ease of use.
    Other considerations also play a role. Those who can’t drive due to disability or other reasons may have a stronger motivation to use them, as might drivers with significant concerns about heavy traffic or driving in inclement weather.
    “If I really worry about snowy weather, like we experience in Pullman in winter, is it going to help?” Peng said. “If I really worry about weather, I might get a car like that if it would help me steer clear of dangerous weather conditions.”

  •

    Controlling quantum motion and hyper-entanglement

    Manuel Endres, professor of physics at Caltech, specializes in finely controlling single atoms using devices known as optical tweezers. He and his colleagues use the tweezers, made of laser light, to manipulate individual atoms within an array of atoms to study fundamental properties of quantum systems. Their experiments have led to, among other advances, new techniques for erasing errors in simple quantum machines; a new device that could lead to the world’s most precise clocks; and a record-breaking quantum system controlling more than 6,000 individual atoms.
    One nagging factor in this line of work has been the normal jiggling motion of atoms, which makes the systems harder to control. Now, reporting in the journal Science, the team has flipped the problem on its head and used this atomic motion to encode quantum information, a process underlying quantum technologies.
    “We show that atomic motion, which is typically treated as a source of unwanted noise in quantum systems, can be turned into a strength,” says Adam Shaw (PhD ’24), a co-lead author on the study along with Pascal Scholl and Ran Finkelstein. Shaw was formerly a graduate student at Caltech during these experiments and is now a postdoctoral scholar at Stanford University. Scholl served as a postdoc at Caltech and is now working at the quantum computing company Pasqal. Finkelstein held the Troesh Postdoctoral Prize Fellowship at Caltech and is now a professor at Tel Aviv University.
    Ultimately, the experiment not only encoded quantum information in the motion of the atoms but also led to a state known as hyper-entanglement. In basic entanglement, two particles remain connected even when separated by vast distances. When researchers measure the particles’ states, they observe this correlation: For example, if one particle is in a state known as spin up (in which the orientation of the angular momentum is pointing up), the other will always be spin down.
    In hyper-entanglement, two characteristics of a particle pair are correlated. As a simple analogy, this would be like a set of twins separated at birth having both the same names and same types of cars: The two traits are correlated between the twins. In the new study, Endres and his team were able to hyper-entangle pairs of atoms such that their individual states of motion and their individual electronic states — their internal energy levels — were correlated among the atoms. What is more, this experimental demonstration implies that even more traits could be entangled at the same time.
    “This allows us to encode more quantum information per atom,” Endres explains. “You get more entanglement with fewer resources.”
    The experiment is the first demonstration of hyper-entanglement in massive particles, such as neutral atoms or ions (earlier demonstrations used photons).
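    As a toy illustration of why hyper-entanglement packs more correlated information into each pair, the outcomes can be sketched with a small state-vector calculation. This is a minimal NumPy sketch with made-up 0/1 labels, not the actual atomic levels or measurement scheme used in the experiment:

```python
import numpy as np

# Toy model: each atom carries two qubits, an electronic state and a
# motional state (the 0/1 labels are illustrative, not real atomic levels).
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)

# Hyper-entangled pair of atoms: a Bell state in the electronic degree of
# freedom AND, simultaneously, a Bell state in the motional degree of freedom.
state = np.kron(bell, bell)            # 16-dimensional joint state
probs = np.abs(state) ** 2

# Only outcomes where BOTH degrees of freedom are perfectly correlated
# (atom 1 and atom 2 always agree) carry any probability.
for idx in np.flatnonzero(probs > 1e-12):
    e1, e2 = (idx >> 3) & 1, (idx >> 2) & 1   # electronic outcomes
    m1, m2 = (idx >> 1) & 1, idx & 1          # motional outcomes
    print(e1, e2, m1, m2, probs[idx])          # e1 == e2 and m1 == m2 throughout
```

    Four equally likely outcomes survive, each with probability 0.25, compared with two for ordinary entanglement: two bits of correlated structure per pair instead of one, which is the sense in which fewer atoms carry more entanglement.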

    For these experiments, the team cooled down an array of individual alkaline-earth neutral atoms confined inside optical tweezers. They demonstrated a novel form of cooling via “detection and subsequent active correction of thermal motional excitations,” says Endres, which he compares to James Clerk Maxwell’s famous 1867 thought experiment invoking a demon that measures and sorts particles in a chamber. “We essentially measure the motion of each atom and apply an operation depending on the outcome, atom-by-atom, similar to Maxwell’s demon.”
    The method, which outperformed the best-known laser cooling techniques, caused the atoms to come to nearly a complete standstill.
    From there, the researchers induced the atoms to oscillate like a swinging pendulum, but with an amplitude of approximately 100 nanometers, which is much smaller than the width of a human hair. They were able to excite the atoms into two distinct oscillations simultaneously, causing the motion to be in a state of superposition. Superposition is a quantum state in which a particle exhibits opposite traits simultaneously, like a particle’s spin being both up and down at the same time.
    “You can think of an atom moving in this superposition state like a kid on a swing who starts getting pushed by two parents on opposite sides, but simultaneously,” Endres says. “In our everyday world, this would certainly lead to a parental conflict; in the quantum world, we can remarkably make use of this!”
    They then entangled the individual, swinging atoms to partner atoms, creating a correlated state of motion over several micrometers of distance. After the atoms were entangled, the team then hyper-entangled them in such a way that both the motion and the electronic states of the atoms were correlated.
    “Basically, the goal here was to push the boundaries on how much we could control these atoms,” Endres says. “We are essentially building a toolbox: We knew how to control the electrons within an atom, and we now learned how to control the external motion of the atom as a whole. It’s like an atom toy that you have fully mastered.”
    The findings could lead to new ways to perform quantum computing as well as quantum simulations designed to probe fundamental questions in physics. “Motional states could become a powerful resource for quantum technology, from computing to simulation to precision measurements,” Endres says.

  •

    New biosensor solves old quantum riddle

    Putting hypersensitive quantum sensors in a living cell is a promising path for tracking cell growth and diagnosing diseases — even cancers — in their early stages.
    Many of the best, most powerful quantum sensors can be created in small bits of diamond, but that leads to a separate issue: It’s hard to stick a diamond in a cell and get it to work.
    “All kinds of those processes that you really need to probe on a molecular level, you cannot use something very big. You have to go inside the cell. For that, we need nanoparticles,” said University of Chicago Pritzker School of Molecular Engineering PhD candidate Uri Zvi. “People have used diamond nanocrystals as biosensors before, but they discovered that they perform worse than what we would expect. Significantly worse.”
    Zvi is the first author of a paper published in Proceedings of the National Academy of Sciences that tackles this issue. Together with researchers from UChicago PME and the University of Iowa, Zvi united insights from cellular biology, quantum computing, old-fashioned semiconductors and high-definition TVs to create a revolutionary new quantum biosensor. In doing so, they also shed light on a longstanding mystery in quantum materials.
    By encasing a diamond nanoparticle with a specially engineered shell — a technique inspired by QLED televisions — the team created not only a quantum biosensor ideal for a living cell, but also uncovered new insights into how a material’s surface can be modified to enhance its quantum properties.
    “It’s already one of the most sensitive things on earth, and now they’ve figured out a way to enhance that further in a number of different environments,” said Zvi’s principal investigator, UChicago PME Prof. Aaron Esser-Kahn, a co-author of the paper.
    A cell full of diamonds
    Qubits hosted in diamond nanocrystals maintain quantum coherence even when the particles are small enough to be “taken up” by a living cell — the cell effectively swallows and chews on them without spitting them out. But the smaller the diamond particles, the weaker the quantum signal.

    “It excited people for a while that these quantum sensors can be brought into living cells and, in principle, be useful as a sensor,” said UChicago PME Asst. Prof. Peter Maurer, a co-author of the paper. “However, while these kind of quantum sensors inside of a big piece of bulk diamond have really good quantum properties, when they are in nano diamonds, the coherent properties, the quantum properties, are actually significantly reduced.”
    Here, Zvi turned to an unlikely source for inspiration — quantum dot LED televisions. QLED TVs use vibrant fluorescent quantum dots to broadcast in rich, full colors. In the early days, the colors were bright but unstable, prone to suddenly blinking off.
    “Researchers found that surrounding the quantum dots with carefully designed shells suppresses detrimental surface effects and increases their emission,” Zvi said. “And today you can use a previously unstable quantum dot as part of your TV.”
    Working with UChicago PME and Chemistry Department quantum dot expert Prof. Dmitri Talapin, a co-author of the paper, Zvi reasoned that since both sets of issues — the quantum dots’ fluorescence and the nanodiamond weakened signal — originated with the surface state, a similar approach might work.
    But since the sensor is meant to go within a living body, not every shell would work. An immunoengineering expert, Esser-Kahn helped develop a silicon-oxygen (siloxane) shell that would both enhance the quantum properties and not tip off the immune system that something is awry.
    “The surface properties of most of these materials are sticky and disordered in a way that the immune cells can tell it’s not supposed to be there. They look like a foreign object to an immune cell,” Esser-Kahn said. “Siloxane-coated things look like a big, smooth blob of water. And so the body is much more happy to engulf and then chew on a particle like that.”
    Previous efforts to improve the quantum properties of diamond nanocrystals through surface engineering had shown limited success. As a result, the team expected only modest gains. Instead, they saw up to fourfold improvements in spin coherence.

    That increase — as well as a 1.8-fold increase in fluorescence and separate significant increases to charge stability — was a riddle both baffling and enthralling.
    Better and better
    “I would try to go to bed at night but stay up thinking ‘What’s happening there? The spin coherence is getting better — but why?’” said University of Iowa Asst. Prof. Denis Candido, second author of the new paper. “I’d think ‘What if we do this experiment? What if we do this calculation?’ It was very, very exciting, and in the end, we found the underlying reason for the improvement of the coherence.”
    The interdisciplinary team — bioengineer-turned-quantum-scientist Zvi, immunoengineer Esser-Kahn and quantum engineers Maurer and Talapin — brought Candido and University of Iowa Physics and Astronomy Prof. Michael Flatté in to provide some of the theoretical framework for the research.
    “What I found really exciting about this is that some old ideas that were critical for semiconductor electronic technology turned out to be really important for these new quantum systems,” Flatté said.
    They found that adding the silica shell didn’t just protect the diamond surface. It fundamentally altered the quantum behavior inside. The material interface was driving electron transfer from the diamond into the shell. Depleting electrons from the atoms and molecules that normally reduce the quantum coherence yielded a more sensitive and stable way to read signals from living cells.
    This enabled the team to identify the specific surface sites that degrade coherence and make quantum devices less effective — solving a long-standing mystery in the quantum sensing field and opening new doors for both engineering innovation and fundamental research.
    “The end impact is not just a better sensor, but a new, quantitative framework for engineering coherence and charge stability in quantum nanomaterials,” Zvi said.

  •

    Quantum eyes on energy loss: Diamond quantum imaging for next-gen power electronics

    Diamond quantum sensors can be used to analyze the magnetization response of soft magnetic materials used in power electronics, report scientists from a collaborative research team. Using a novel imaging technique, they developed quantum protocols to simultaneously image both the amplitude and phase of AC stray fields over a wide frequency range up to 2.3 MHz. Their results demonstrate that quantum sensing is a powerful tool for developing advanced magnetic materials across diverse applications.
    Improving energy conversion efficiency in power electronics is vital for a sustainable society, with wide-bandgap semiconductors like GaN and SiC power devices offering advantages due to their high-frequency capabilities. However, energy losses in passive components at high frequencies hinder efficiency and miniaturization. This underscores the need for advanced soft magnetic materials with lower energy losses.
    In a recent study published in Communications Materials, a research team led by Professor Mutsuko Hatano from the School of Engineering, Institute of Science Tokyo, Japan, developed a novel method for analyzing such losses by simultaneously imaging the amplitude and phase of alternating current (AC) stray fields, which are key to understanding hysteresis losses. Using a diamond quantum sensor with nitrogen-vacancy (NV) centers and developing two protocols — Qubit Frequency Tracking (Qurack) for kHz and quantum heterodyne (Qdyne) imaging for MHz frequencies — they realized wide-range AC magnetic field imaging. This study was carried out in collaboration with Harvard University and Hitachi, Ltd.
    The researchers conducted a proof-of-principle wide-frequency-range magnetic field imaging experiment by applying an AC current to a 50-turn coil and sweeping the frequency from 100 Hz to 200 kHz for Qurack and 237 kHz to 2.34 MHz for Qdyne. As expected, the uniform AC Ampere magnetic field’s amplitude and phase were imaged using NV centers with high spatial resolution (2-5 µm), validating both measurement protocols.
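    The core signal-processing idea, recovering both the amplitude and the phase of a field oscillating at a known frequency, can be sketched with simple quadrature (lock-in style) demodulation. This is a generic illustration of that principle, not the actual Qurack or Qdyne pulse protocol:

```python
import math

def demodulate(samples, fs, f):
    """Recover amplitude and phase of s(t) = A*sin(2*pi*f*t + phi)
    sampled at rate fs, via quadrature demodulation against reference
    cosine/sine waves at the known drive frequency f."""
    n = len(samples)
    i = 2.0 / n * sum(s * math.cos(2 * math.pi * f * k / fs)
                      for k, s in enumerate(samples))
    q = 2.0 / n * sum(s * math.sin(2 * math.pi * f * k / fs)
                      for k, s in enumerate(samples))
    return math.hypot(i, q), math.atan2(i, q)   # (amplitude, phase)

# Synthetic AC field: amplitude 2.0, phase 0.5 rad, 100 Hz, one second of data
fs, f, amp_true, phi_true = 10_000.0, 100.0, 2.0, 0.5
samples = [amp_true * math.sin(2 * math.pi * f * k / fs + phi_true)
           for k in range(int(fs))]
amp, phi = demodulate(samples, fs, f)
print(amp, phi)   # recovers 2.0 and 0.5 (up to floating-point error)
```

    Extracting both quantities per pixel, rather than amplitude alone, is what lets phase delay (and hence dissipation) be mapped across the sample.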
    Using this innovative imaging system, the team could simultaneously map the amplitude and phase of stray magnetic fields from the CoFeB-SiO2 thin films, which have been developed for high-frequency inductors. Their findings revealed that these films exhibit near-zero phase delay up to 2.3 MHz, indicating negligible energy losses along the hard axis. Moreover, they observed that energy loss depends on the material’s magnetic anisotropy — when magnetization is driven along the easy axis, phase delay increases with frequency, signifying higher energy dissipation.
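    Why a phase delay translates into energy dissipation can be seen from the standard relation for a sinusoidally driven magnetic material: the energy lost per cycle grows with the sine of the phase lag. The sketch below uses this textbook elliptical-loop relation with illustrative amplitudes, not values from the study:

```python
import math

def loss_per_cycle(B0, H0, delta):
    """Energy dissipated per unit volume per AC cycle (J/m^3) for a
    sinusoidal drive with flux-density amplitude B0 (T), field amplitude
    H0 (A/m) and phase lag delta (rad): W = pi * B0 * H0 * sin(delta)."""
    return math.pi * B0 * H0 * math.sin(delta)

B0, H0 = 0.01, 100.0                              # illustrative amplitudes
print(loss_per_cycle(B0, H0, 0.0))                # zero lag: no hysteresis loss
print(loss_per_cycle(B0, H0, 0.001))              # near-zero lag (hard axis): tiny loss
print(loss_per_cycle(B0, H0, math.radians(10)))   # larger lag (easy axis): much more
```

    A near-zero phase delay, as observed along the hard axis up to 2.3 MHz, therefore directly implies negligible hysteresis loss, while the growing easy-axis phase delay implies rising dissipation.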
    Overall, the results showcase how quantum sensing can be used to analyze soft magnetic materials operating at higher frequencies, which is considered to be a major challenge in developing highly efficient electronic systems. Notably, the capacity to resolve domain wall motion, one of the magnetization mechanisms strongly related to energy losses, is a pivotal step, leading to important practical advances and optimizations in electronics.
    Looking forward, the researchers hope to further improve the proposed techniques in various ways. “The Qurack and Qdyne techniques used in this study can be enhanced by engineering improvements,” says Hatano. “Qurack’s performance can be enhanced by adopting high-performance signal generators to extend its amplitude range, whereas optimizing spin coherence time and microwave control speed would broaden Qdyne’s frequency detection range.”
    “Simultaneous imaging of the amplitude and phase of AC magnetic fields across a broad frequency range offers numerous potential applications in power electronics, electromagnets, non-volatile memory, and spintronics technologies,” remarks Hatano. “This success contributes to the acceleration of quantum technologies, particularly in sectors related to sustainable development goals and well-being.”

  •

    Efficiency upgrade for OLED screens: A route to blue PHOLED longevity

    Blue phosphorescent OLEDs can now last as long as the green phosphorescent OLEDs already in devices, University of Michigan researchers have demonstrated, paving the way for further improving the energy efficiency of OLED screens.
    “This moves the blues into the domain of green lifetimes,” said Stephen Forrest, the Peter A. Franken Distinguished University Professor of Electrical Engineering and corresponding author of the study in Nature Photonics.
    “I can’t say the problem is completely solved — of course it’s not solved until it enters your display — but I think we’ve shown the path to a real solution that has been evading the community for two decades.”
    OLED screens are standard in flagship smartphones and high-end televisions, providing high contrast and energy efficiency as variations in brightness are achieved by the light emitters rather than a liquid crystal layer over the top. However, not all OLEDs are equally energy efficient.
    In current displays, red and green OLEDs produce light through the highly efficient phosphorescent route, whereas blue OLEDs still use fluorescence. This means while red and green OLEDs have a theoretical maximum of one photon for every electron running through the device, blue OLEDs cap out at a far lower efficiency.
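    The efficiency gap comes from exciton spin statistics: electrical excitation produces singlet and triplet excitons in a 1:3 ratio, and a fluorescent emitter can radiate only from the singlets, while a phosphorescent one harvests both. A quick back-of-the-envelope sketch, using these standard textbook numbers rather than figures from the article:

```python
# Electrically generated excitons form in a 1:3 singlet-to-triplet ratio.
singlets, triplets = 1, 3
singlet_fraction = singlets / (singlets + triplets)

# Fluorescence harvests only singlets; phosphorescence (via spin-orbit
# coupling) harvests singlets and triplets alike.
iqe_fluorescent = singlet_fraction                                # 25% internal cap
iqe_phosphorescent = singlet_fraction + (1 - singlet_fraction)    # 100% internal cap

print(f"fluorescent cap:    {iqe_fluorescent:.0%}")
print(f"phosphorescent cap: {iqe_phosphorescent:.0%}")
```

    This is why moving blue from fluorescence to phosphorescence matters so much for display power consumption.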
    The trouble is that blue light is the highest-energy light that an RGB device must produce: The molecules in blue phosphorescent OLEDs (PHOLEDs) need to handle higher energies than their red and green counterparts. Most of the energy leaves in the form of blue light, but when it is trapped, it can instead break down the color-producing molecules.
    Previously, Forrest’s team discovered that there was a way to get that trapped energy out faster by including a coating on the negative electrode that helps the energy convert into blue light. Haonan Zhao, a recent Ph.D. graduate in physics, said it was like creating a fast lane.

    “On a road that doesn’t have enough lanes, impatient drivers can crash into one another, cutting off all traffic — just like two excitons bumping into one another create a lot of hot energy that destroys the molecule,” said Zhao, first author of that study as well as the new one. “The plasmon exciton polariton is our optical design for an exciton fast lane.”
    The details are based in quantum mechanics. When an electron comes in through the negative electrode, it creates what’s called an excited state in one of the molecules that produces blue light. That state is a negatively charged electron that jumps into a higher energy level and a positively charged “hole” that the electron leaves behind — together, they make an exciton.
    Ideally, the electron would quickly jump back to its original state and fire off a blue photon, but excitons that use the phosphorescent route tend to hang around: simply relaxing into their original state would violate quantum mechanical selection rules. However, excitons very near the electrode produce photons faster because the shiny surface supports another quantum quasiparticle — surface plasmons. These are like ripples in the pond of electrons on the surface of the metal.
    If the exciton in the light-emitting material is close enough to the electrode, it gets a little help with the conversion to blue light because it can dump its energy into a surface plasmon — a phenomenon known as the Purcell effect. It does this because the exciton oscillates a little like a broadcast antenna, which creates waves in the electrons in the electrode. This isn’t automatically helpful, though, as not all surface plasmons produce photons. To get the photon, the exciton must attach itself to the surface plasmon, producing a plasmon exciton polariton.
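    The strength of this emission speed-up is conventionally quantified by the Purcell factor. As a rough illustration only (the idealized cavity-QED formula below is the standard textbook expression; the plasmonic geometry in this work is more complicated), the scaling with quality factor Q and mode volume V looks like:

```python
import math

def purcell_factor(wavelength, n, Q, V):
    """Idealized cavity-QED Purcell factor:
    F_P = (3 / (4*pi^2)) * (wavelength/n)^3 * (Q / V),
    with wavelength and V in consistent units (e.g. m and m^3)."""
    return (3.0 / (4.0 * math.pi ** 2)) * (wavelength / n) ** 3 * (Q / V)

# Illustrative numbers for a blue emitter: 460 nm light, refractive index 1.8,
# a modest Q of 50, and a tightly confined (0.1 um)^3 mode volume.
F = purcell_factor(460e-9, 1.8, 50, (0.1e-6) ** 3)
print(F)   # F > 1 means faster emission than in free space
```

    Tight confinement (small V) is exactly what a surface plasmon provides, which is why parking the exciton's energy near the electrode shortens its lifetime and reduces the window for destructive exciton-exciton collisions.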
    Forrest’s team encouraged this route by adding a thin layer of a carbon-based semiconductor onto the shiny electrode that encourages the exciton to transfer its energy and resonate in the right way. It also extends the effect deeper into the light-emitting material, so excitons further from the electrode can benefit.
    The team reported on this last year, and they have since been putting this effect together with other approaches to finally produce a blue PHOLED that can last as long and burn as bright as a green one. These are the highlights of the design:

    • Two light-emitting layers (a tandem OLED): this cuts the light-emitting burden of each layer in half, reducing the odds that two excitons merge.
    • A layer near both electrodes that helps the excitons resonate with surface plasmons, so that both emitting layers have access to the fast lane.
    • An optical cavity: the whole structure lets blue light resonate between the two mirror-like electrodes, pushing the color of the photons deeper into the blue range.

    This study was supported in part by the Department of Energy and Universal Display Corporation.
    Claire Arneson, a Ph.D. student in physics at U-M, also contributed to this study.
    The device was built in the Lurie Nanofabrication Facility and studied at the Michigan Center for Materials Characterization.
    The team has patented the technology with the assistance of U-M Innovation Partnerships and has licensed it to Universal Display Corp. Forrest and the University of Michigan have a financial interest in Universal Display Corp.
    Forrest is also the Paul G. Goebel Professor of Engineering and a professor of electrical and computer engineering, materials science and engineering, physics and applied physics.

  •

    Charge radius of Helium-3 measured with unprecedented precision

    A research team led by Professor Randolf Pohl from the Institute of Physics at Johannes Gutenberg University Mainz (JGU) has achieved a significant breakthrough in determining fundamental properties of atomic nuclei. For the first time, the team conducted laser spectroscopy experiments on muonic helium-3 at the Paul Scherrer Institute in Switzerland. Muonic helium-3 is a special form of helium in which the atom’s two electrons are replaced by a single, much heavier muon. The results have now been published in the journal Science.
    “Our experiments with muonic helium-3 provide the most accurate value to date for the charge radius of this nucleus,” says Randolf Pohl, who is also a member of the PRISMA+ Cluster of Excellence at JGU. “This is primarily because the heavy muon orbits much closer to the atomic nucleus than electrons in regular atoms, making it far more sensitive to the nucleus’s size and structure.” Helium nuclei always consist of two protons — this is what defines them as helium. Different isotopes are distinguished by the number of neutrons in the nucleus: helium-3 contains one neutron alongside the two protons, while the heavier helium-4 contains two neutrons. Professor Pohl’s team had already successfully measured helium-4 using laser spectroscopy and muons several years ago.
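    The "orbits much closer" point follows from standard hydrogen-like scaling (a textbook estimate, not a figure from the article): the Bohr radius of a bound particle shrinks in proportion to its mass, and the muon is about 207 times heavier than the electron.

    ```python
    # Back-of-the-envelope estimate: Bohr radius scales as 1 / (mass * Z),
    # so a muon bound to a helium nucleus (Z = 2) sits roughly 200 times
    # closer than an electron would (reduced-mass corrections ignored).
    A0 = 5.29177e-11   # Bohr radius of hydrogen, metres
    M_RATIO = 206.77   # muon mass / electron mass
    Z = 2              # helium

    r_electron = A0 / Z              # hydrogen-like electron orbit
    r_muon = A0 / (Z * M_RATIO)      # muonic orbit

    print(f"electron: {r_electron * 1e12:.1f} pm")        # ~26.5 pm
    print(f"muon:     {r_muon * 1e15:.0f} fm")            # ~128 fm
    print(f"ratio:    {r_electron / r_muon:.0f}x closer")  # ~207x
    ```

    At roughly 128 femtometers, the muonic orbit is only about 65 times the nuclear charge radius itself, which is why the measurement is so much more sensitive to nuclear size.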
    Measurement Confirms Theoretical Models
    Traditionally, nuclear radii are determined using particle accelerators, such as MAMI at JGU or, in the future, MESA. However, the new value obtained from muonic helium measurements is fifteen times more precise, coming in at 1.97007 ± 0.00097 femtometers. Laser spectroscopy with electrons has previously been successfully applied to the lightest atomic nuclei, such as hydrogen and deuterium. For helium, highly precise measurements also exist, but the presence of two electrons in the helium atom makes theoretical calculations more complex, preventing accurate determination of the nuclear radius from such measurements alone. Nevertheless, it has already been possible to determine the difference in charge radii between various helium isotopes (nuclei with the same number of protons but different numbers of neutrons). The new results from muonic helium measurements align well with recent experiments on regular helium conducted by a research team in Amsterdam, also published in Science yesterday.
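    As a quick sanity check on the quoted numbers (simple arithmetic, not from the article): the stated uncertainty corresponds to a relative precision of about 0.05 percent, and "fifteen times more precise" implies an accelerator-based uncertainty of roughly 0.015 fm.

    ```python
    # Arithmetic check of the quoted charge-radius value for helium-3.
    radius = 1.97007        # fm
    uncertainty = 0.00097   # fm

    rel = uncertainty / radius            # relative uncertainty
    implied_accel = 15 * uncertainty      # uncertainty implied for accelerator methods

    print(f"relative uncertainty: {rel:.2e}")   # ~4.92e-04
    print(f"implied accelerator uncertainty: ~{implied_accel} fm")
    ```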
    “In combination with our earlier results on muonic helium-4, which were published in Nature in 2021, we have now been able to precisely determine the difference in charge radii between helium-3 and helium-4 — an important advancement,” says Pohl.
    Close Interaction Between Theory and Experiment in the PRISMA+ Cluster of Excellence
    The strong agreement between the measurements conducted by the Mainz and Amsterdam teams confirms existing knowledge about the nuclear physics of the lightest atomic nuclei. This knowledge is based in part on key theoretical calculations of nuclear structure, also carried out within the PRISMA+ Cluster of Excellence.

    Professor Sonia Bacca’s team calculated the influence of the muon on the structure of the helium nucleus, while Professor Marc Vanderhaeghen and Dr. Franziska Hagelstein explored the roles of protons and neutrons. These theoretical foundations enabled the extraction of reliable information about the atomic nucleus from the precise experimental data.
    “Accurate knowledge of nuclear charge radii is essential for determining fundamental physical constants such as the Rydberg constant,” Pohl explains. “It is also crucial for the search for new physics — particles and forces not yet included in the Standard Model. The previous lack of precise data in this area introduced significant uncertainties.”
    Precise Measurement of Additional Atomic Nuclei Planned
    Looking ahead, the team of experimental and theoretical physicists at Mainz plans to apply their methods to other atomic nuclei — from lithium to neon — with ten times the accuracy of particle accelerator-based methods. Instead of lasers, they will use innovative X-ray detectors. This work, like the previous experiments led by Pohl’s group, is supported by the German Research Foundation (DFG) as part of the Collaborative Research Center 1660 at JGU.

  • in

    Researchers make breakthrough in semiconductor technology set to supercharge 6G delivery

    Self-driving cars that eliminate traffic jams, getting a healthcare diagnosis instantly without leaving your home, or feeling the touch of loved ones on the other side of the continent may sound like the stuff of science fiction.
    But new research, led by the University of Bristol and published today in the journal Nature Electronics, could make all this and more a step closer to reality thanks to a radical breakthrough in semiconductor technology.
    The futuristic concepts rely on the ability to communicate and transfer vast volumes of data much faster than existing networks. So physicists have developed an innovative way to accelerate this process between scores of users, potentially across the globe.
    Co-lead author Martin Kuball, Professor of Physics at the University of Bristol, said: “Within the next decade, previously almost unimaginable technologies to transform a wide range of human experiences could be widely available. The possible benefits are also far-reaching, including advances in healthcare with remote diagnostics and surgery, virtual classrooms and even virtual holiday tourism.
    “In addition, there is considerable potential for advanced driver assistance systems to improve road safety and industrial automation for greater efficiency. The list of possible 6G applications is endless, with the limit just being human imagination. So our innovative semiconductor discoveries are hugely exciting and will help drive forward these developments at speed and scale.”
    It is widely acknowledged that the shift from 5G to 6G will demand a radical upgrade of semiconductor technology, circuits, systems, and associated algorithms. For instance, the main semiconductor components involved — the radio frequency amplifiers made from a wonder semiconductor called gallium nitride (GaN) — need to be much quicker, emit greater power, and be more reliable.
    The team of international scientists and engineers has tested a new architecture, catapulting these special GaN amplifiers to unprecedented heights. This was achieved by discovering a latch-effect in GaN, which unlocked a much greater radio frequency device performance. These next generation devices use parallel channels, which require side fins less than 100 nanometers wide — narrow fin-shaped structures that control the flow of current passing through the devices.

    Co-lead author Dr Akhil Shaji, Honorary Research Associate at the University of Bristol, explained: “We have piloted a device technology, working with collaborators, called superlattice castellated field effect transistors (SLCFETs), in which more than 1,000 fins with sub-100 nm width help drive the current. Although SLCFETs have demonstrated the highest performance in the W-band frequency range, equating to 75 to 110 gigahertz (GHz), the physics behind it was unknown.
    “We recognised it was a latch-effect in GaN, which enables the high radio frequency performance.”
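    For context on the W-band figures quoted above (standard free-space wavelength arithmetic, not from the article), 75 to 110 GHz corresponds to millimetre-scale wavelengths, which is why this range is often called the millimetre-wave band.

    ```python
    # Free-space wavelength lambda = c / f for the W-band edge frequencies.
    C = 299_792_458  # speed of light in vacuum, m/s

    wavelength_mm = {f_ghz: C / (f_ghz * 1e9) * 1e3 for f_ghz in (75, 110)}
    for f_ghz, lam in wavelength_mm.items():
        print(f"{f_ghz} GHz -> {lam:.1f} mm")  # 75 GHz -> 4.0 mm, 110 GHz -> 2.7 mm
    ```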
    The researchers then needed to pinpoint exactly where this effect occurred, using ultra-precision electrical measurements and optical microscopy simultaneously, so it could be further studied and understood. After analysing more than 1,000 fins, they traced the effect to the widest fin.
    Prof Kuball, who is also Royal Academy of Engineering Chair in Emerging Technologies, added: “We also developed a 3D model using a simulator to further verify our observations. The next challenge was to study the reliability of the latch effect for practical applications. Rigorous testing of the device over a long duration showed that it has no detrimental effect on device reliability or performance.
    “We found a key aspect driving this reliability was a thin layer of dielectric coating around each of the fins. But the main takeaway was clear — the latch effect can be exploited for countless practical applications, which could help transform people’s lives in many different ways in years to come.”
    Next steps for the work include further increasing the power density the devices can deliver, so they can offer even higher performance and serve wider audiences. Industry partners will also be bringing such next generation devices to a commercial market.
    Researchers at the University of Bristol are at the forefront of improving electrical performance and efficiency in a wide range of different applications and settings.
    Professor Kuball leads the Centre for Device Thermography and Reliability (CDTR), which is developing next generation semiconductor electronic devices for net zero, and for communications and radar technology. It also works on improving device thermal management, electrical performance and reliability, using wide and ultra-wide bandgap semiconductors.