More stories

  • New biosensor solves old quantum riddle

    Putting hypersensitive quantum sensors in a living cell is a promising path for tracking cell growth and diagnosing diseases — even cancers — in their early stages.
    Many of the best, most powerful quantum sensors can be created in small bits of diamond, but that leads to a separate issue: It’s hard to stick a diamond in a cell and get it to work.
    “All kinds of those processes that you really need to probe on a molecular level, you cannot use something very big. You have to go inside the cell. For that, we need nanoparticles,” said University of Chicago Pritzker School of Molecular Engineering PhD candidate Uri Zvi. “People have used diamond nanocrystals as biosensors before, but they discovered that they perform worse than what we would expect. Significantly worse.”
    Zvi is the first author of a paper published in Proceedings of the National Academy of Sciences that tackles this issue. Together with researchers from UChicago PME and the University of Iowa, Zvi united insights from cellular biology, quantum computing, old-fashioned semiconductors and high-definition TVs to create a revolutionary new quantum biosensor and, in doing so, shed light on a longstanding mystery in quantum materials.
    By encasing a diamond nanoparticle in a specially engineered shell — a technique inspired by QLED televisions — the team not only created a quantum biosensor ideal for a living cell but also uncovered new insights into how a material’s surface can be modified to enhance its quantum properties.
    “It’s already one of the most sensitive things on earth, and now they’ve figured out a way to enhance that further in a number of different environments,” said Zvi’s principal investigator, UChicago PME Prof. Aaron Esser-Kahn, a co-author of the paper.
    A cell full of diamonds
    Qubits hosted in diamond nanocrystals maintain quantum coherence even when the particles are small enough to be “taken up” by a living cell — a good metaphor is the cell swallowing and chewing on them without spitting them out. But the smaller the diamond particles, the weaker the quantum signal.

    “It excited people for a while that these quantum sensors can be brought into living cells and, in principle, be useful as a sensor,” said UChicago PME Asst. Prof. Peter Maurer, a co-author of the paper. “However, while these kind of quantum sensors inside of a big piece of bulk diamond have really good quantum properties, when they are in nano diamonds, the coherent properties, the quantum properties, are actually significantly reduced.”
    Here, Zvi turned to an unlikely source for inspiration — quantum dot LED televisions. QLED TVs use vibrant fluorescent quantum dots to broadcast in rich, full colors. In the early days, the colors were bright but unstable, prone to suddenly blinking off.
    “Researchers found that surrounding the quantum dots with carefully designed shells suppresses detrimental surface effects and increases their emission,” Zvi said. “And today you can use a previously unstable quantum dot as part of your TV.”
    Working with UChicago PME and Chemistry Department quantum dot expert Prof. Dmitri Talapin, a co-author of the paper, Zvi reasoned that since both sets of issues — the quantum dots’ unstable fluorescence and the nanodiamonds’ weakened signal — originated in the surface state, a similar approach might work.
    But since the sensor is meant to go within a living body, not every shell would work. An immunoengineering expert, Esser-Kahn helped develop a silicon-oxygen (siloxane) shell that would both enhance the quantum properties and not tip off the immune system that something is awry.
    “The surface properties of most of these materials are sticky and disordered in a way that the immune cells can tell it’s not supposed to be there. They look like a foreign object to an immune cell,” Esser-Kahn said. “Siloxane-coated things look like a big, smooth blob of water. And so the body is much more happy to engulf and then chew on a particle like that.”
    Previous efforts to improve the quantum properties of diamond nanocrystals through surface engineering had shown limited success. As a result, the team expected only modest gains. Instead, they saw up to fourfold improvements in spin coherence.

    That increase — as well as a 1.8-fold increase in fluorescence and separate significant increases in charge stability — was a riddle both baffling and enthralling.
    Better and better
    “I would try to go to bed at night but stay up thinking, ‘What’s happening there? The spin coherence is getting better — but why?’” said University of Iowa Asst. Prof. Denis Candido, second author of the new paper. “I’d think, ‘What if we do this experiment? What if we do this calculation?’ It was very, very exciting, and in the end, we found the underlying reason for the improvement of the coherence.”
    The interdisciplinary team — bioengineer-turned-quantum-scientist Zvi, immunoengineer Esser-Kahn and quantum engineers Maurer and Talapin — brought Candido and University of Iowa Physics and Astronomy Prof. Michael Flatté in to provide some of the theoretical framework for the research.
    “What I found really exciting about this is that some old ideas that were critical for semiconductor electronic technology turned out to be really important for these new quantum systems,” Flatté said.
    They found that adding the silica shell didn’t just protect the diamond surface. It fundamentally altered the quantum behavior inside. The material interface was driving electron transfer from the diamond into the shell, and depleting electrons from the atoms and molecules that normally degrade quantum coherence made the particles a more sensitive and stable way to read signals from living cells.
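    A minimal way to see why this helps, assuming the standard picture in which independent noise sources contribute additively to the decoherence rate (an illustration, not a calculation from the paper):

    $$ \frac{1}{T_2^{\mathrm{total}}} = \frac{1}{T_2^{\mathrm{bulk}}} + \frac{1}{T_2^{\mathrm{surface}}} $$

    Shrinking the surface term (here, by passivating and depleting the charge traps at the diamond-shell interface) lengthens the overall coherence time $T_2$, consistent with the up-to-fourfold improvement described above.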
    This enabled the team to identify the specific surface sites that degrade coherence and make quantum devices less effective — solving a long-standing mystery in the quantum sensing field and opening new doors for both engineering innovation and fundamental research.
    “The end impact is not just a better sensor, but a new, quantitative framework for engineering coherence and charge stability in quantum nanomaterials,” Zvi said.

  • Quantum eyes on energy loss: Diamond quantum imaging for next-gen power electronics

    Diamond quantum sensors can be used to analyze the magnetization response of soft magnetic materials used in power electronics, scientists report based on collaborative research. Using a novel imaging technique, they developed quantum protocols to simultaneously image both the amplitude and phase of AC stray fields over a wide frequency range up to 2.3 MHz. Their results demonstrate that quantum sensing is a powerful tool for developing advanced magnetic materials across diverse applications.
    Improving energy conversion efficiency in power electronics is vital for a sustainable society, with wide-bandgap semiconductors like GaN and SiC power devices offering advantages due to their high-frequency capabilities. However, energy losses in passive components at high frequencies hinder efficiency and miniaturization. This underscores the need for advanced soft magnetic materials with lower energy losses.
    In a recent study published in Communications Materials, a research team led by Professor Mutsuko Hatano from the School of Engineering, Institute of Science Tokyo, Japan, developed a novel method for analyzing such losses by simultaneously imaging the amplitude and phase of alternating current (AC) stray fields, which are key to understanding hysteresis losses. Using a diamond quantum sensor with nitrogen-vacancy (NV) centers and developing two protocols — Qubit Frequency Tracking (Qurack) for kHz and quantum heterodyne (Qdyne) imaging for MHz frequencies — they realized wide-range AC magnetic field imaging. This study was carried out in collaboration with Harvard University and Hitachi, Ltd.
    The researchers conducted a proof-of-principle wide-frequency-range magnetic field imaging experiment by applying an AC current to a 50-turn coil and sweeping the frequency from 100 Hz to 200 kHz for Qurack and 237 kHz to 2.34 MHz for Qdyne. As expected, the uniform AC Ampere magnetic field’s amplitude and phase were imaged using NV centers with high spatial resolution (2-5 µm), validating both measurement protocols.
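    As an illustration of the kind of per-pixel analysis such imaging relies on (a hedged sketch, not the authors’ protocol or code; the signal model, noise level and drive frequency below are assumptions), the following demodulates a simulated time trace at a known drive frequency to recover the AC field’s amplitude and phase:

```python
import numpy as np

def demodulate(signal, t, f_drive):
    """Recover the amplitude and phase of the component of `signal` at f_drive
    by projecting onto in-phase and quadrature references (lock-in style)."""
    ref_i = np.cos(2 * np.pi * f_drive * t)
    ref_q = np.sin(2 * np.pi * f_drive * t)
    comp_i = 2 * np.mean(signal * ref_i)   # in-phase component
    comp_q = 2 * np.mean(signal * ref_q)   # quadrature component
    return np.hypot(comp_i, comp_q), np.arctan2(comp_q, comp_i)

# Simulated per-pixel time trace: a 10 uT field at 200 kHz with a 0.3 rad phase lag, plus noise.
f_drive = 200e3                               # drive frequency in Hz (assumed value)
t = np.arange(0, 1e-3, 1 / (20 * f_drive))    # 1 ms trace, 20 samples per period
b0, phi0 = 10e-6, 0.3
signal = b0 * np.cos(2 * np.pi * f_drive * t - phi0) + 1e-7 * np.random.randn(t.size)

amplitude, phase = demodulate(signal, t, f_drive)
print(f"amplitude ~ {amplitude * 1e6:.2f} uT, phase ~ {phase:.2f} rad")
```

    In the actual experiment the per-pixel signal comes from NV-center readout rather than a direct field recording, but amplitude and phase are the per-pixel quantities being mapped.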
    Using this innovative imaging system, the team could simultaneously map the amplitude and phase of stray magnetic fields from the CoFeB-SiO2 thin films, which have been developed for high-frequency inductors. Their findings revealed that these films exhibit near-zero phase delay up to 2.3 MHz, indicating negligible energy losses along the hard axis. Moreover, they observed that energy loss depends on the material’s magnetic anisotropy — when magnetization is driven along the easy axis, phase delay increases with frequency, signifying higher energy dissipation.
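    One textbook relation (an illustrative estimate, not taken from the paper) connects the measured phase delay to dissipation: for a linear magnetic response with permeability $\mu = \mu' - i\mu''$ driven at frequency $f$ and field amplitude $H_0$, the power dissipated per unit volume is

    $$ P = \pi f \mu_0 \mu'' H_0^2, \qquad \tan\delta = \frac{\mu''}{\mu'}, $$

    so a near-zero phase delay $\delta$ implies a near-zero lossy component $\mu''$ and negligible hysteresis loss, as observed along the films’ hard axis.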
    Overall, the results showcase how quantum sensing can be used to analyze soft magnetic materials operating at higher frequencies, which is considered to be a major challenge in developing highly efficient electronic systems. Notably, the capacity to resolve domain wall motion, one of the magnetization mechanisms strongly related to energy losses, is a pivotal step, leading to important practical advances and optimizations in electronics.
    Looking forward, the researchers hope to further improve the proposed techniques in various ways. “The Qurack and Qdyne techniques used in this study can be enhanced by engineering improvements,” says Hatano. “Qurack’s performance can be enhanced by adopting high-performance signal generators to extend its amplitude range, whereas optimizing spin coherence time and microwave control speed would broaden Qdyne’s frequency detection range.”
    “Simultaneous imaging of the amplitude and phase of AC magnetic fields across a broad frequency range offers numerous potential applications in power electronics, electromagnets, non-volatile memory, and spintronics technologies,” remarks Hatano. “This success contributes to the acceleration of quantum technologies, particularly in sectors related to sustainable development goals and well-being.”

  • Efficiency upgrade for OLED screens: A route to blue PHOLED longevity

    Blue phosphorescent OLEDs can now last as long as the green phosphorescent OLEDs already in devices, University of Michigan researchers have demonstrated, paving the way for further improving the energy efficiency of OLED screens.
    “This moves the blues into the domain of green lifetimes,” said Stephen Forrest, the Peter A. Franken Distinguished University Professor of Electrical Engineering and corresponding author of the study in Nature Photonics.
    “I can’t say the problem is completely solved — of course it’s not solved until it enters your display — but I think we’ve shown the path to a real solution that has been evading the community for two decades.”
    OLED screens are standard in flagship smartphones and high-end televisions, providing high contrast and energy efficiency as variations in brightness are achieved by the light emitters rather than a liquid crystal layer over the top. However, not all OLEDs are equally energy efficient.
    In current displays, red and green OLEDs produce light through the highly efficient phosphorescent route, whereas blue OLEDs still use fluorescence. This means while red and green OLEDs have a theoretical maximum of one photon for every electron running through the device, blue OLEDs cap out at a far lower efficiency.
    The trouble is that blue light is the highest-energy light an RGB device must produce: the molecules in blue phosphorescent OLEDs (PHOLEDs) need to handle higher energies than their red and green counterparts. Most of the energy leaves in the form of blue light, but when it is trapped, it can instead break down the color-producing molecules.
    Previously, Forrest’s team discovered that there was a way to get that trapped energy out faster by including a coating on the negative electrode that helps the energy convert into blue light. Haonan Zhao, a recent Ph.D. graduate in physics, said it was like creating a fast lane.

    “On a road that doesn’t have enough lanes, impatient drivers can crash into one another, cutting off all traffic — just like two excitons bumping into one another create a lot of hot energy that destroys the molecule,” said Zhao, first author of that study as well as the new one. “The plasmon exciton polariton is our optical design for an exciton fast lane.”
    The details are based in quantum mechanics. When an electron comes in through the negative electrode, it creates what’s called an excited state in one of the molecules that produces blue light. That state is a negatively charged electron that jumps into a higher energy level and a positively charged “hole” that the electron leaves behind — together, they make an exciton.
    Ideally, the electron would quickly jump back to its original state and fire off a blue photon, but excitons that use the phosphorescent route tend to hang around. Simply relaxing into their original state would violate a law of quantum mechanics. However, excitons very near the electrode produce photons faster because the shiny surface supports another quantum quasiparticle — surface plasmons. These are like ripples in the pond of electrons on the surface of the metal.
    If the exciton in the light-emitting material is close enough to the electrode, it gets a little help with the conversion to blue light because it can dump its energy into a surface plasmon — a phenomenon known as the Purcell effect. It does this because the exciton oscillates a little like a broadcast antenna, which creates waves in the electrons in the electrode. This isn’t automatically helpful, though, as not all surface plasmons produce photons. To get the photon, the exciton must attach itself to the surface plasmon, producing a plasmon exciton polariton.
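    For context, the strength of this kind of rate enhancement is conventionally quantified by the Purcell factor; in its textbook cavity form (quoted here as background, with $Q$ the quality factor, $V$ the mode volume, $\lambda$ the emission wavelength and $n$ the refractive index; the plasmonic case is analogous but with a very small effective mode volume):

    $$ F_P = \frac{3}{4\pi^2}\left(\frac{\lambda}{n}\right)^{3}\frac{Q}{V} $$

    A larger $F_P$ means the exciton hands its energy off faster, which is the “fast lane” described above.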
    Forrest’s team encouraged this route by adding a thin layer of a carbon-based semiconductor onto the shiny electrode that encourages the exciton to transfer its energy and resonate in the right way. It also extends the effect deeper into the light-emitting material, so excitons further from the electrode can benefit.
    The team reported on this last year, and they have since been putting this effect together with other approaches to finally produce a blue PHOLED that can last as long and burn as bright as a green one. These are the highlights of the design:
    • Two light-emitting layers (a tandem OLED): This cuts the light-emitting burden of each layer in half, reducing the odds that two excitons merge.
    • A layer near each electrode that helps the excitons resonate with surface plasmons, so that both emitting layers have access to the fast lane.
    • An optical cavity structure: blue light resonates between the two mirror-like electrodes, pushing the color of the photons deeper into the blue range.
    This study was supported in part by the Department of Energy and Universal Display Corporation.
    Claire Arneson, a Ph.D. student in physics at U-M, also contributed to this study.
    The device was built in the Lurie Nanofabrication Facility and studied at the Michigan Center for Materials Characterization.
    The team has patented the technology with the assistance of U-M Innovation Partnerships and has licensed it to Universal Display Corp. Forrest and the University of Michigan have a financial interest in Universal Display Corp.
    Forrest is also the Paul G. Goebel Professor of Engineering and a professor of electrical and computer engineering, materials science and engineering, physics, and applied physics.

  • Charge radius of Helium-3 measured with unprecedented precision

    A research team led by Professor Randolf Pohl from the Institute of Physics at Johannes Gutenberg University Mainz (JGU) has achieved a significant breakthrough in determining fundamental properties of atomic nuclei. For the first time, the team conducted laser spectroscopy experiments on muonic helium-3 at the Paul Scherrer Institute in Switzerland. Muonic helium-3 is a special form of helium in which the atom’s two electrons are replaced by a single, much heavier muon. The results were published yesterday in the journal Science.
    “Our experiments with muonic helium-3 provide the most accurate value to date for the charge radius of this nucleus,” says Randolf Pohl, who is also a member of the PRISMA+ Cluster of Excellence at JGU. “This is primarily because the heavy muon orbits much closer to the atomic nucleus than electrons in regular atoms, making it far more sensitive to the nucleus’s size and structure.” Helium nuclei always consist of two protons — this is what defines them as helium. Different isotopes are distinguished by the number of neutrons in the nucleus: helium-3 contains one neutron alongside the two protons, while the heavier helium-4 contains two neutrons. Professor Pohl’s team had already successfully measured helium-4 using laser spectroscopy and muons several years ago.
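    A rough scaling argument (standard hydrogen-like estimates, not numbers from the paper) shows why the muon is so sensitive to nuclear size: the Bohr radius scales inversely with the orbiting particle’s mass, and the finite-size energy shift scales with the probability of finding that particle at the nucleus,

    $$ a_\mu \approx \frac{m_e}{m_\mu}\, a_e \approx \frac{a_e}{207}, \qquad \Delta E_{\mathrm{size}} \propto |\psi(0)|^2 \langle r^2 \rangle \propto \frac{\langle r^2 \rangle}{a^3}, $$

    so the shift is roughly $207^3 \approx 9 \times 10^6$ times larger in a muonic atom than in its electronic counterpart (reduced-mass and multi-electron corrections modify the exact factor).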
    Measurement Confirms Theoretical Models
    Traditionally, nuclear radii are determined using particle accelerators, such as MAMI at JGU or, in the future, MESA. However, the new value obtained from muonic helium measurements is fifteen times more precise, coming in at 1.97007 ± 0.00097 femtometers. Laser spectroscopy with electrons has previously been successfully applied to the lightest atomic nuclei, such as hydrogen and deuterium. For helium, highly precise measurements also exist, but the presence of two electrons in the helium atom makes theoretical calculations more complex, preventing accurate determination of the nuclear radius from such measurements alone. Nevertheless, it has already been possible to determine the difference in charge radii between various helium isotopes (nuclei with the same number of protons but different numbers of neutrons). The new results from muonic helium measurements align well with recent experiments on regular helium conducted by a research team in Amsterdam, also published in Science yesterday.
    “In combination with our earlier results on muonic helium-4, which were published in Nature in 2021, we have now been able to precisely determine the difference in charge radii between helium-3 and helium-4 — an important advancement,” says Pohl.
    Close Interaction Between Theory and Experiment in the PRISMA+ Cluster of Excellence
    The strong agreement between the measurements conducted by the Mainz and Amsterdam teams confirms existing knowledge about the nuclear physics of the lightest atomic nuclei. This knowledge is based in part on key theoretical calculations of nuclear structure, also carried out within the PRISMA+ Cluster of Excellence.

    Professor Sonia Bacca’s team calculated the influence of the muon on the structure of the helium nucleus, while Professor Marc Vanderhaeghen and Dr. Franziska Hagelstein explored the roles of protons and neutrons. These theoretical foundations enabled the extraction of reliable information about the atomic nucleus from the precise experimental data.
    “Accurate knowledge of nuclear charge radii is essential for determining fundamental physical constants such as the Rydberg constant,” Pohl explains. “It is also crucial for the search for new physics — particles and forces not yet included in the Standard Model. The previous lack of precise data in this area introduced significant uncertainties.”
    Precise Measurement of Additional Atomic Nuclei Planned
    Looking ahead, the team of experimental and theoretical physicists at Mainz plans to apply their methods to other atomic nuclei — from lithium to neon — with ten times the accuracy of particle accelerator-based methods. Instead of lasers, they will use innovative X-ray detectors. This work, like the previous experiments led by Pohl’s group, is supported by the German Research Foundation (DFG) as part of the Collaborative Research Center 1660 at JGU.

  • Researchers make breakthrough in semiconductor technology set to supercharge 6G delivery

    Self-driving cars that eliminate traffic jams, instant healthcare diagnoses without leaving your home, and the touch of loved ones from across the continent may sound like the stuff of science fiction.
    But new research, led by the University of Bristol and published today in the journal Nature Electronics, could make all this and more a step closer to reality thanks to a radical breakthrough in semiconductor technology.
    The futuristic concepts rely on the ability to communicate and transfer vast volumes of data much faster than existing networks. So physicists have developed an innovative way to accelerate this process between scores of users, potentially across the globe.
    Co-lead author Martin Kuball, Professor of Physics at the University of Bristol, said: “Within the next decade, previously almost unimaginable technologies to transform a wide range of human experiences could be widely available. The possible benefits are also far-reaching, including advances in healthcare with remote diagnostics and surgery, virtual classrooms and even virtual holiday tourism.
    “In addition, there is considerable potential for advanced driver assistance systems to improve road safety and industrial automation for greater efficiency. The list of possible 6G applications is endless, with the limit just being human imagination. So our innovative semiconductor discoveries are hugely exciting and will help drive forward these developments at speed and scale.”
    It is widely acknowledged that the shift from 5G to 6G will demand a radical upgrade of semiconductor technology, circuits, systems, and associated algorithms. For instance, the main semiconductor components involved, namely the radio-frequency amplifiers made from a wonder semiconductor called gallium nitride (GaN), need to be much quicker, emit greater power, and be more reliable.
    The team of international scientists and engineers has tested a new architecture, catapulting these special GaN amplifiers to unprecedented heights. This was achieved by discovering a latch-effect in GaN, which unlocked much greater radio-frequency device performance. These next-generation devices use parallel channels, which in turn require sub-100 nm side fins — a type of transistor structure that controls the flow of current passing through the device.

    Co-lead author Dr Akhil Shaji, Honorary Research Associate at the University of Bristol, explained: “We have piloted a device technology, working with collaborators, called superlattice castellated field effect transistors (SLCFETs), in which more than 1,000 fins with sub-100 nm width help drive the current. Although SLCFETs have demonstrated the highest performance in the W-band frequency range, equating to 75-110 gigahertz (GHz), the physics behind it was unknown.
    “We recognised it was a latch-effect in GaN, which enables the high radio frequency performance.”
    The researchers then needed to pinpoint exactly where this effect occurred, simultaneously using ultra-precision electrical measurements and optical microscopy, so it could be further studied and understood. After analysing more than 1,000 fins, they traced the effect to the widest fin.
    Prof Kuball, who is also Royal Academy of Engineering Chair in Emerging Technologies, added: “We also developed a 3D model using a simulator to further verify our observations. The next challenge was to study the reliability aspects of latch effect for practical applications. The rigorous testing of the device over a long duration of time showed it has no detrimental effect on device reliability or performance.
    “We found a key aspect driving this reliability was a thin layer of dielectric coating around each of the fins. But the main takeaway was clear — the latch effect can be exploited for countless practical applications, which could help transform people’s lives in many different ways in years to come.”
    Next steps for the work include further increasing the power density the devices can deliver, so they can offer even higher performance and serve wider audiences. Industry partners will also be bringing such next generation devices to a commercial market.
    Researchers at the University of Bristol are at the forefront of improving electrical performance and efficiency in a wide range of different applications and settings.
    Professor Kuball leads the Centre for Device Thermography and Reliability (CDTR), which is developing next generation semiconductor electronic devices for net zero, and for communications and radar technology. It also works on improving device thermal management, electrical performance and reliability, using wide and ultra-wide bandgap semiconductors.

  • A faster, more reliable method for simulating the plasmas used to make computer chips

    Plasma — the electrically charged fourth state of matter — is at the heart of many important industrial processes, including those used to make computer chips and coat materials. Simulating those plasmas can be challenging, however, because millions of math operations must be performed for thousands of points in the simulation, many times per second. Even with the world’s fastest supercomputers, scientists have struggled to create a kinetic simulation — which considers individual particles — that is detailed and fast enough to help them improve those manufacturing processes.
    Now, a new method offers improved stability and efficiency for kinetic simulations of what’s known as inductively coupled plasmas. The method was implemented in a code developed as part of a public-private partnership between the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) and chip equipment maker Applied Materials Inc., which is already using the tool. Researchers from the University of Alberta, PPPL and Los Alamos National Laboratory contributed to the project.
    Detailed simulations of these plasmas are important to gain a better understanding of how plasma forms and evolves for various manufacturing processes. The more realistic the simulation, the more accurate the distribution functions it provides. These measures show, for example, the probability that a particle is at a particular location moving at a particular speed. Ultimately, understanding these details could lead to realizations about how to use the plasma in a more refined way to etch patterns onto silicon for even faster chips or memory with greater storage, for example.
    “This is a big step forward in our capabilities,” said Igor Kaganovich, a principal research physicist at PPPL and co-author of a journal article published in Physics of Plasmas that details the simulation findings.
    Making the code reliable
    The initial version of the code was developed using an old method that proved unreliable. Dmytro Sydorenko, a research associate at the University of Alberta and first author of the paper, said that significant modifications of the method were made to make the code much more stable. “We changed the equations, so the simulation immediately became very reliable and there were no crashes anymore,” he said. “So now we have a usable tool for the simulation of inductively coupled plasmas into two spatial dimensions.”
    The code was improved, in part, by changing the way one of the electric fields was calculated. An electric field is like an invisible force field that surrounds electric charges and currents. It exerts forces on particles. In an inductively coupled plasma, a wire coil carrying an electric current generates a changing magnetic field, which, in turn, generates an electric field that heats the plasma. It is this field, known as the solenoidal electric field, that the team focused its efforts on.
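    In field terms, the quantity the team focused on is the induced, divergence-free part of the electric field given by Faraday’s law (stated here simply as background):

    $$ \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \cdot \mathbf{E}_{\mathrm{sol}} = 0, $$

    which is why it is called the solenoidal electric field, in contrast to the electrostatic field produced by space charge.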

    The code calculates electromagnetic fields based on procedures developed by Salomon Janhunen from Los Alamos National Laboratory. These procedures were optimized by PPPL’s Jin Chen, who acted as a bridge between physics, mathematics and computer science aspects of the challenge. “For a complicated problem, the improvement is significant,” Chen said.
    The simulation is known as a particle-in-cell code because it tracks individual particles (or small groups of particles clumped together as so-called macroparticles) while they move in space from one grid cell to another. This approach works particularly well for the plasmas used in industrial devices where the gas pressure is low. A fluid approach doesn’t work for such plasmas because it uses average values instead of tracking individual particles.
    Obeying the law of conservation of energy
    “This new simulation allows us to model larger plasmas quickly while accurately conserving energy, helping to ensure the results reflect real physical processes rather than numerical artifacts,” said Kaganovich.
    In the real world, energy doesn’t randomly appear or disappear. It follows the law of conservation of energy. But a small mistake in a computer simulation can accumulate with each step. Because each simulation might involve thousands or even millions of steps, a small error throws off the results significantly. Making sure energy is conserved helps keep the simulation faithful to a real plasma.
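    To make the particle-in-cell idea and the energy bookkeeping concrete, here is a deliberately minimal sketch (not the PPPL/Applied Materials code, and all parameters are illustrative): a set of macroparticles is pushed through a periodic 1D grid with a kick-drift update, the electric field is solved on the grid each step, and total kinetic plus field energy is printed so any numerical drift is visible.

```python
import numpy as np

# Deliberately minimal 1D electrostatic particle-in-cell sketch (illustrative only).
np.random.seed(0)
nx, length = 64, 1.0                   # number of grid cells, domain length
dx = length / nx
n_p = 4000                             # number of macroparticles
dt = 0.05
weight = length / n_p                  # charge (and mass) carried by each macroparticle

x = np.random.uniform(0, length, n_p)  # macroparticle positions
v = 0.01 * np.random.randn(n_p)        # macroparticle velocities

def solve_fields(x):
    """Deposit charge on the grid (nearest cell), solve Poisson via FFT, return cell index and E."""
    idx = (x / dx).astype(int) % nx
    rho = np.bincount(idx, minlength=nx) * weight / dx - 1.0    # uniform neutralizing background
    rho_k = np.fft.rfft(rho)
    k = 2 * np.pi * np.fft.rfftfreq(nx, d=dx)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:]**2          # Poisson equation: -phi'' = rho  =>  phi_k = rho_k / k^2
    e_grid = np.fft.irfft(-1j * k * phi_k, n=nx)   # E = -dphi/dx
    return idx, e_grid

for step in range(201):
    idx, e_grid = solve_fields(x)
    v += e_grid[idx] * dt                 # "kick": accelerate with the field in the particle's cell
    x = (x + v * dt) % length             # "drift": move particles with periodic boundaries
    kinetic = 0.5 * weight * np.sum(v**2)
    field = 0.5 * dx * np.sum(e_grid**2)
    if step % 50 == 0:
        print(f"step {step:3d}  kinetic={kinetic:.5f}  field={field:.5f}  total={kinetic + field:.5f}")
```

    A production code like the one described here is far more sophisticated (two spatial dimensions, inductive heating, collisions, careful charge deposition), but the same bookkeeping, pushing particles, updating fields and checking energy, is what keeps results physical rather than numerical artifacts.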
    PPPL’s Stéphane Ethier also worked on the new simulation code. The work was supported by a Cooperative Research and Development Agreement between Applied Materials Inc. and PPPL, under contract number DE-AC02-09CH11466.

  • AI is here to stay, let students embrace the technology, experts urge

    A new study from UBC Okanagan says students appear to be using generative artificial intelligence (GenAI) responsibly, and as a way to speed up tasks, not just boost their grades.
    Dr. Meaghan MacNutt, who teaches professional ethics in the UBCO School of Health and Exercise Sciences (HES), recently published a study in Advances in Physiology Education. The paper — titled Reflective writing assignments in the era of GenAI: student behaviour and attitudes suggest utility, not futility — contradicts common concerns about student use of AI.
    Students in three different courses, almost 400 participants, anonymously completed a survey about their use of AI on at least five reflective writing assignments. All three courses used an identical AI policy and students had the option to use the tool for their writing.
    “GenAI tools like ChatGPT allow users to interface with large language models. They offer incredible promise to enhance student learning, however, they are also susceptible to misuse in completion of writing assignments,” says Dr. MacNutt. “This potential has raised concerns about GenAI as a serious threat to academic integrity and to the learning that occurs when students draft and revise their own written work.”
    While UBC offers guidance to students and faculty about the risks and benefits of using GenAI, policies regarding its use in courses are at the discretion of individual instructors.
    Dr. MacNutt, who completed the study with doctoral student and HES lecturer Tori Stranges, notes that discipline-specific factors contribute to the perception that many courses in HES are particularly challenging and many students strive for excellence, often at the expense of their mental wellbeing.
    So, how often were the students using AI and what was motivating their use?

    While only about one-third of the students used AI, the majority of users, 81 per cent, reported their GenAI use was inspired by at least one of the following factors: speed and ease in completing the assignment, a desire for high grades and a desire to learn. About 15 per cent of the students said they were motivated by all three factors, with more than 50 per cent using it to save time on the assignment.
    Dr. MacNutt notes that most students used AI to initiate the paper or revise sections. Only 0.3 per cent of assignments were mostly written by GenAI.
    “There is a lot of speculation when it comes to student use of AI,” she says. “However, students in our study reported that GenAI use was motivated more by learning than by grades, and they are using GenAI tools selectively and in ways they believe are ethical and supportive of their learning. This was somewhat unexpected due to the common perception that undergraduate students have become increasingly focused on grades at the expense of learning.”
    The study does raise some cautions, she warns. GenAI can be a useful tool for students learning English or people with reading and writing disabilities. But there is also the potential that if paid versions are better, students who can afford to use a more effective platform might have an advantage over others — creating further classroom inequities.
    MacNutt says continued research in this area can only provide a better understanding of student behaviour and attitudes as GenAI technologies continue to advance. She also suggests that, as AI is used more and more frequently, institutions and educators adopt an approach that embodies “collaboration with” rather than “surveillance of” students.
    “Our findings contradict common concerns about widespread student misuse and overuse of GenAI at the expense of academic integrity and learning,” says Dr. MacNutt. “But as we move forward with our policies, or how we’re teaching students how to use it, we have to keep in mind that students are coming from really different places. And they have different ways of benefiting or being harmed by these technologies.”

  • Breakthrough AI model could transform how we prepare for natural disasters

    As climate-related disasters grow more intense and frequent, an international team of researchers has introduced Aurora — a groundbreaking AI model designed to deliver faster, more accurate, and more affordable forecasts for air quality, ocean waves, and extreme weather events. The model has been trained on over a million hours of data and, according to the researchers, could revolutionize the way we prepare for natural disasters and respond to climate change.
    From deadly floods in Europe to intensifying tropical cyclones around the world, the climate crisis has made timely and precise forecasting more essential than ever. Yet traditional forecasting methods rely on highly complex numerical models developed over decades, requiring powerful supercomputers and large teams of experts. According to its developers, Aurora offers a powerful and efficient alternative using artificial intelligence.
    Machine learning at the core
    ‘Aurora uses state-of-the-art machine learning techniques to deliver superior forecasts for key environmental systems — air quality, weather, ocean waves, and tropical cyclones,’ explains Max Welling, machine learning expert at the University of Amsterdam and one of the researchers behind the model. Unlike conventional methods, Aurora requires far less computational power, making high-quality forecasting more accessible and scalable — especially in regions that lack expensive infrastructure.
    Trained on a million hours of earth data
    Aurora is built on a 1.3 billion parameter foundation model, trained on more than one million hours of Earth system data. It has been fine-tuned to excel in a range of forecasting tasks:
    • Air quality: outperforms traditional models in 74% of cases
    • Ocean waves: exceeds numerical simulations on 86% of targets
    • Tropical cyclones: beats seven operational forecasting centres in 100% of tests
    • High-resolution weather: surpasses leading models in 92% of scenarios, especially during extreme events
    Forecasting that’s fast, accurate, and inclusive
    As climate volatility increases, rapid and reliable forecasts are crucial for disaster preparedness, emergency response, and climate adaptation. The researchers believe Aurora can help by making advanced forecasting more accessible.

    ‘Development cycles that once took years can now be completed in just weeks by small engineering teams,’ notes AI researcher Ana Lucic, also of the University of Amsterdam. ‘This could be especially valuable for countries in the Global South, smaller weather services, and research groups focused on localised climate risks.’ ‘Importantly, this acceleration builds on decades of foundational research and the vast datasets made available through traditional forecasting methods,’ Welling adds.
    Aurora is available freely online for anyone to use. If someone wants to fine-tune it for a specific task, they will need to provide data for that task. ‘But the “initial” training is done, we don’t need these vast datasets anymore, all the information from them is baked into Aurora already’, Lucic explains.
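    For readers unfamiliar with the workflow, fine-tuning a pretrained foundation model on task-specific data typically looks like the generic, hypothetical PyTorch sketch below (it does not use Aurora’s actual API; the model, shapes and data are placeholders):

```python
import torch
import torch.nn as nn

# Hypothetical placeholder standing in for a pretrained foundation-model backbone.
class PretrainedBackbone(nn.Module):
    def __init__(self, dim=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(64, dim), nn.GELU(), nn.Linear(dim, dim))
    def forward(self, x):
        return self.encoder(x)

backbone = PretrainedBackbone()
# In real use, pretrained weights would be loaded here, e.g. backbone.load_state_dict(torch.load(...)).

# Freeze the backbone and train only a small task-specific head on the user's data.
for p in backbone.parameters():
    p.requires_grad = False
head = nn.Linear(256, 1)  # e.g. predict a single scalar target for the new task

opt = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Task-specific data supplied by the user (random tensors stand in for real inputs and targets).
inputs = torch.randn(128, 64)
targets = torch.randn(128, 1)

for epoch in range(5):
    pred = head(backbone(inputs))
    loss = loss_fn(pred, targets)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

    The key point matches Lucic’s remark: the expensive pretraining is already done, and the user only supplies comparatively small task data and trains a lightweight adaptation on top.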
    A future-proof forecasting tool
    Although current research focuses on the four applications mentioned above, the researchers say Aurora is flexible and can be used for a wide range of future scenarios. These could include forecasting flood risks, wildfire spread, seasonal weather trends, agricultural yields, and renewable energy output. ‘Its ability to process diverse data types makes it a powerful and future-ready tool’, states Welling.
    As the world faces more extreme weather — from heatwaves to hurricanes — innovative models like Aurora could shift the global approach from reactive crisis response to proactive climate resilience, the study concludes.