More stories

  • Using AI to detect cancer from patient data securely

    A new way of using artificial intelligence to predict cancer from patient data without putting personal information at risk has been developed by a team including University of Leeds medical scientists.
    Artificial intelligence (AI) can analyse large amounts of data, such as images or trial results, and can identify patterns often undetectable by humans, making it highly valuable in speeding up disease detection, diagnosis and treatment.
However, using the technology in medical settings is controversial because of the risk of accidental data release, and because many systems are owned and controlled by private companies, giving them access to confidential patient data — and the responsibility for protecting it.
    The researchers set out to discover whether a form of AI, called swarm learning, could be used to help computers predict cancer in medical images of patient tissue samples, without releasing the data from hospitals.
    Swarm learning trains AI algorithms to detect patterns in data in a local hospital or university, such as genetic changes within images of human tissue. The swarm learning system then sends this newly trained algorithm — but importantly no local data or patient information — to a central computer. There, it is combined with algorithms generated by other hospitals in an identical way to create an optimised algorithm. This is then sent back to the local hospital, where it is reapplied to the original data, improving detection of genetic changes thanks to its more sensitive detection capabilities.
By repeating this process several times, the algorithm is gradually improved until a single version is created that works on all the data sets. This means the technique can be applied without any data being released to third-party companies or sent between hospitals or across international borders.
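The merge-and-redistribute cycle described above is closely related to federated averaging: each site trains on its own data and only model parameters travel to the coordinator. The sketch below illustrates the idea with a toy least-squares model; the function names, data, and training rule are illustrative assumptions, not the team's actual swarm learning implementation.

```python
import numpy as np

def train_locally(weights, local_data, lr=0.1):
    """Stand-in for one round of local training at a hospital:
    a single gradient step of least-squares regression on private (X, y)."""
    X, y = local_data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def merge(weight_sets):
    """Central coordinator: average the parameter sets. Only weights move."""
    return np.mean(weight_sets, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])          # the pattern all sites should learn

# Three 'hospitals', each holding data that never leaves the site
sites = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    sites.append((X, y))

w = np.zeros(2)
for _round in range(200):
    local = [train_locally(w, d) for d in sites]  # train in parallel, locally
    w = merge(local)                              # share weights, not raw data

print(w)   # converges toward true_w without pooling the data
```

The coordinator only ever sees parameter vectors, which is the privacy property the article describes: the optimised model benefits from every site's data while the data itself stays put.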
The team trained AI algorithms on study data from three groups of patients from Northern Ireland, Germany and the USA. The algorithms were tested on two large sets of data images generated at Leeds, and were found to have successfully learned how to predict the presence of different subtypes of cancer in the images.
    The research was led by Jakob Nikolas Kather, Visiting Associate Professor at the University of Leeds’ School of Medicine and Researcher at the University Hospital RWTH Aachen. The team included Professors Heike Grabsch and Phil Quirke, and Dr Nick West from the University of Leeds’ School of Medicine.
    Dr Kather said: “Based on data from over 5,000 patients, we were able to show that AI models trained with swarm learning can predict clinically relevant genetic changes directly from images of tissue from colon tumors.”
Phil Quirke, Professor of Pathology in the University of Leeds’ School of Medicine, said: “We have shown that swarm learning can be used in medicine to train independent AI algorithms for any image analysis task. This means it is possible to overcome the need for data transfer without institutions having to relinquish secure control of their data.
    “Creating an AI system which can perform this task improves our ability to apply AI in the future.”
    Story Source:
Materials provided by University of Leeds. Note: Content may be edited for style and length.

  • Exotic magnetic structures created with laser light

    Research at Lund University in Sweden has found a new way to create nano-sized magnetic particles using ultrafast laser light pulses. The discovery could pave the way for new and more energy-efficient technical components and become useful in the quantum computers of the future.
    Magnetic skyrmions are sometimes described as magnetic vortices. Unlike ferromagnetic states — which occur in conventional magnets such as compasses and refrigerator magnets — the skyrmion state is quite peculiar: the orientation of the magnetization does not point in the same direction everywhere in the material, but is instead best described as a kind of swirling magnetism.
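The “swirling magnetism” described above has a simple closed form. As an illustrative sketch (not the study’s model), the unit magnetization of a Néel-type skyrmion can be written with a polar angle that rotates from “down” at the core to “up” far from it:

```python
import numpy as np

def skyrmion_m(x, y, radius=1.0):
    """Unit magnetization of a Néel-type skyrmion centred at the origin:
    points down at the core, up far away, and lies in-plane at r = radius."""
    r = np.hypot(x, y)
    phi = np.arctan2(y, x)
    theta = 2 * np.arctan2(radius, r)   # pi at the core, 0 far away
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

core = skyrmion_m(1e-9, 0.0)    # magnetization near the centre
far = skyrmion_m(100.0, 0.0)    # magnetization far from the centre
mid = skyrmion_m(1.0, 0.0)      # magnetization on the skyrmion radius
print(core[2], far[2], mid)
```

The direction of the magnetization thus varies continuously across the texture while its magnitude stays fixed, which is what distinguishes a skyrmion from a uniform ferromagnetic state.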
Skyrmions are of great interest to both basic research and industry, as they can be used to manufacture more compact computer memories. However, that is easier said than done. Using skyrmions for technical purposes requires efficient ways of writing, erasing and manipulating the particles on short time scales, and with high spatial precision.
In a new study, researchers Claudio Verdozzi from Lund University and Emil Viñas Boström and Angel Rubio from the Max Planck Institute for the Structure and Dynamics of Matter in Hamburg have found a new approach.
    “In our study, we have theoretically shown how it is possible to meet one of these requirements, that is, how to create magnetic skyrmions in ultra-short time scales using pulses of laser light,” says Claudio Verdozzi, physics researcher at Lund University.
    The research team has identified a microscopic mechanism that explains an experimental protocol that has been shown to be useful in creating the strange skyrmions. Using femtosecond laser pulses — light pulses that last one millionth of a billionth of a second — the researchers showed that it is possible to create skyrmions ultra-fast.
    “Our results are of great relevance to creating more energy-efficient technical components. Our study shows that light can be used to manipulate localized magnetic excitations in very short time scales,” says Claudio Verdozzi.
    There are a number of applications that the new discovery can lead to, including quantum technology — an area where quantum mechanical properties are used to solve extremely advanced calculations that traditional computers cannot handle. Magnetic excitations such as skyrmions and so-called spin waves are also believed to be able to help reduce energy consumption in technological components, and can thus help to achieve future climate goals.
    “Skyrmions are in focus for both theoretical and experimental research thanks to their technological potential. Also, their exotic magnetic patterns possess a conceptually and mathematically beautiful appeal that makes them very interesting,” concludes Claudio Verdozzi.
    Story Source:
Materials provided by Lund University. Note: Content may be edited for style and length.

  • Computing: Resilient system using only non-volatile memory

    A KAIST research team has developed hardware and software technology that ensures both data and execution persistence. The Lightweight Persistence Centric System (LightPC) makes the systems resilient against power failures by utilizing only non-volatile memory as the main memory.
“We mounted non-volatile memory on a system board prototype and created an operating system to verify the effectiveness of LightPC,” said Professor Myoungsoo Jung. The team confirmed that LightPC preserved program execution across power cycles in the middle of a run, while delivering up to eight times more memory capacity, 4.3 times faster application execution, and 73% lower power consumption compared to traditional systems.
    Professor Jung said that LightPC can be utilized in a variety of fields such as data centers and high-performance computing to provide large-capacity memory, high performance, low power consumption, and service reliability.
In general, a power failure on a legacy system can lead to the loss of data stored in the DRAM-based main memory. Unlike volatile memory such as DRAM, non-volatile memory retains its data without power. Although non-volatile memory offers lower power consumption and larger capacity than DRAM, it is typically relegated to secondary storage because of its lower write performance. For this reason, non-volatile memory is usually paired with DRAM. However, modern systems that employ non-volatile memory as main memory experience unexpected performance degradation due to their complicated memory microarchitecture.
To make both data and execution persistent in legacy systems, it is necessary to transfer the data from the volatile memory to the non-volatile memory. Checkpointing is one possible solution. It periodically transfers the data in preparation for a sudden power failure. While this technology is essential for ensuring high mobility and reliability for users, checkpointing also has fatal drawbacks. It takes additional time and power to move data and requires a data recovery process as well as restarting the system.
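Checkpointing in its simplest form looks like the sketch below: periodically serialize the in-memory state to durable storage, and restore it after a failure. The pause-copy-resume step in the loop is exactly the recurring overhead, and the restore path the restart cost, that LightPC is designed to eliminate. This is a generic illustration, not KAIST’s code; the file name and state layout are invented for the demo.

```python
import json
import os
import tempfile

CHECKPOINT = os.path.join(tempfile.gettempdir(), "lightpc_demo.ckpt")

def save_checkpoint(state):
    """Durably persist state: write a temp file, fsync, then atomically rename."""
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
        f.flush()
        os.fsync(f.fileno())          # force the bytes onto stable storage
    os.replace(tmp, CHECKPOINT)       # atomic: a crash leaves old or new, never a torn file

def load_checkpoint():
    """Resume from the last checkpoint, or start fresh."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"i": 0, "total": 0}

if os.path.exists(CHECKPOINT):        # start clean for this demo run
    os.remove(CHECKPOINT)

state = load_checkpoint()
while state["i"] < 100:
    state["total"] += state["i"]      # the actual computation
    state["i"] += 1
    if state["i"] % 10 == 0:          # periodic pause-copy-resume overhead
        save_checkpoint(state)

print(state["total"])
```

With non-volatile main memory, the working state is already durable, so neither the periodic copy nor the recovery-and-reboot path is needed, which is the article’s point.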
In order to address these issues, the research team developed a processor and memory controller to raise the performance of non-volatile memory-only systems. LightPC matches the performance of DRAM by minimizing the internal volatile memory components of the non-volatile memory, exposing the non-volatile memory (PRAM) media directly to the host, and increasing parallelism to service on-the-fly requests as quickly as possible.
The team also presented operating system technology that quickly makes the execution states of running processes persistent without a separate checkpointing process. To maintain consistency, the operating system briefly idles all program execution before transferring data, completing the transfer well within the standard power hold-up time of about 16 milliseconds. When power is restored, the computer revives itself almost immediately and resumes all interrupted processes without the need for a boot process.
    The researchers will present their work (LightPC: Hardware and Software Co-Design for Energy-Efficient Full System Persistence) at the International Symposium on Computer Architecture (ISCA) 2022 in New York in June.
    Story Source:
Materials provided by The Korea Advanced Institute of Science and Technology (KAIST). Note: Content may be edited for style and length.

  • Breakthrough for efficient and high-speed spintronic devices

    Sharing real-time information requires complex networks of systems. A promising approach for speeding up data storage devices consists of switching the magnetization, or the electrons’ spin, of magnetic materials with ultra-short femtosecond laser pulses. But, how the spin evolves in the nanoworld on extremely short time scales, in one millionth of one billionth of a second, has remained largely mysterious. The team of Professor François Légaré at the Institut national de la recherche scientifique (INRS) has made a major breakthrough in this field, in collaboration with TU Wien, Austria, the French national synchrotron facility (SOLEIL) and other international partners. Their work was published in the journal Optica.
So far, studies on the subject have relied heavily on large X-ray facilities with limited access, such as free-electron lasers and synchrotrons. The team demonstrates, for the first time, a tabletop ultrafast soft X-ray microscope to spatio-temporally resolve the spin dynamics inside rare earth materials, which are promising for spintronic devices.
    This new soft X-ray source based on a high-energy Ytterbium laser represents a critical advance for studying future energy-efficient and high-speed spintronic devices and could be used for many applications in physics, chemistry, and biology.
“Our approach provides a robust, cost-efficient, energy-scalable and elegant solution for many laboratories. It allows the study of ultrafast dynamics in nanoscale and mesoscale structures with both nanometre spatial and femtosecond temporal resolutions, as well as with element specificity,” says Professor Andrius Baltuska, at TU Wien.
    Bright X-ray pulses to watch the spin
    With this bright source of X-ray photons, a series of snapshot images of the nanoscale rare earth magnetic structures have been recorded. They clearly expose the fast demagnetization process, and the results provide rich information on the magnetic properties that are as accurate as those obtained using large-scale X-ray facilities.
“Development of ultrafast tabletop X-ray sources is exciting for cutting-edge technological applications and modern fields of science. We are excited about our results, which could be helpful for future research in spintronics, as well as other potential fields,” says INRS postdoctoral researcher, Dr. Guangyu Fan.
“Rare earth systems are trending in the community because of their nanometer size, faster speed, and topologically protected stability. The X-ray source is very attractive for many studies on future spintronic devices composed of rare-earth elements,” says Nicolas Jaouen, senior scientist at the French national synchrotron facility.
    Professor Légaré emphasizes the collaborative work between experts in the development of state-of-the-art light sources and ultrafast dynamics in magnetic materials at the nanoscale. “Considering the quick emergence of high-power Ytterbium laser technology, this work represents huge potential for high-performance soft X-ray sources. This new generation of lasers, which will be available soon at the Advanced Laser Light Source (ALLS), will have many future applications for the fields of physics, chemistry, and even biology,” he says.
    Story Source:
Materials provided by Institut national de la recherche scientifique – INRS. Original written by Audrey-Maude Vézina. Note: Content may be edited for style and length.

  • Scientists turn a hydrogen molecule into a quantum sensor

    Physicists at the University of California, Irvine have demonstrated the use of a hydrogen molecule as a quantum sensor in a terahertz laser-equipped scanning tunneling microscope, a technique that can measure the chemical properties of materials at unprecedented time and spatial resolutions.
    This new technique can also be applied to analysis of two-dimensional materials which have the potential to play a role in advanced energy systems, electronics and quantum computers.
Writing in Science, the researchers in UCI’s Department of Physics & Astronomy and Department of Chemistry describe how they positioned two bound atoms of hydrogen between the silver tip of the STM and a sample composed of a flat copper surface arrayed with small islands of copper nitride. With pulses of the laser lasting trillionths of a second, the scientists were able to excite the hydrogen molecule and detect changes in its quantum states at cryogenic temperatures and in the ultrahigh vacuum environment of the instrument, rendering atomic-scale, time-lapsed images of the sample.
    “This project represents an advance in both the measurement technique and the scientific question the approach allowed us to explore,” said co-author Wilson Ho, Bren Professor of physics & astronomy and chemistry. “A quantum microscope that relies on probing the coherent superposition of states in a two-level system is much more sensitive than existing instruments that are not based on this quantum physics principle.”
Ho said the hydrogen molecule is an example of a two-level system because its orientation flips between two positions, up and down, with a slight horizontal tilt. Through a laser pulse, the scientists can coax the system to go from a ground state to an excited state in a cyclical fashion resulting in a superposition of the two states. The duration of the cyclic oscillations is vanishingly brief — lasting mere tens of picoseconds — but by measuring this “decoherence time” and the cyclic periods the scientists were able to see how the hydrogen molecule was interacting with its environment.
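The oscillation-and-decay behaviour described here follows a standard textbook form: the excited-state population of a resonantly driven two-level system oscillates at the Rabi frequency inside an exponential decoherence envelope. The sketch below illustrates that model; the frequency and decoherence time are round illustrative numbers, not UCI’s measured values.

```python
import math

def excited_population(t, rabi_hz, t2):
    """Excited-state probability for a resonantly driven two-level system:
    Rabi oscillations damped by an exponential decoherence envelope exp(-t/t2)."""
    return 0.5 * (1.0 - math.cos(2 * math.pi * rabi_hz * t) * math.exp(-t / t2))

RABI_HZ = 50e9   # illustrative 50 GHz Rabi frequency -> 20 ps oscillation period
T2 = 30e-12      # illustrative 30 ps decoherence time

p_start = excited_population(0.0, RABI_HZ, T2)    # begins in the ground state
p_half = excited_population(10e-12, RABI_HZ, T2)  # half a Rabi period: mostly excited
p_late = excited_population(1e-9, RABI_HZ, T2)    # long after T2: washed out to 0.5

print(p_start, p_half, p_late)
```

Fitting curves of this shape to the measured signal is how a decoherence time is typically extracted: the oscillation period gives the drive strength, and the decay of the envelope gives the coupling to the environment.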
    “The hydrogen molecule became part of the quantum microscope in the sense that wherever the microscope scanned, the hydrogen was there in between the tip and the sample,” said Ho. “It makes for an extremely sensitive probe, allowing us to see variations down to 0.1 angstrom. At this resolution, we could see how the charge distributions change on the sample.”
    The space between the STM tip and the sample is almost unimaginably small, about six angstroms or 0.6 nanometers. The STM that Ho and his team assembled is equipped to detect minute electrical current flowing in this space and produce spectroscopic readings proving the presence of the hydrogen molecule and sample elements. Ho said this experiment represents the first demonstration of a chemically sensitive spectroscopy based on terahertz-induced rectification current through a single molecule.
    The ability to characterize materials at this level of detail based on hydrogen’s quantum coherence can be of great use in the science and engineering of catalysts, since their functioning often depends on surface imperfections at the scale of single atoms, according to Ho.
    “As long as hydrogen can be adsorbed onto a material, in principle, you can use hydrogen as a sensor to characterize the material itself through observations of their electrostatic field distribution,” said study lead author Likun Wang, UCI graduate student in physics & astronomy.
    Joining Ho and Wang on this project, which was supported by the U.S. Department of Energy Office of Basic Energy Sciences, was Yunpeng Xia, UCI graduate student in physics & astronomy.
    Story Source:
Materials provided by University of California – Irvine. Note: Content may be edited for style and length.

  • Artificial intelligence helps physicians better assess the effectiveness of bladder cancer treatment

    In a small but multi-institutional study, an artificial intelligence-based system improved providers’ assessments of whether patients with bladder cancer had complete response to chemotherapy before a radical cystectomy (bladder removal surgery).
    Yet the researchers caution that AI isn’t a replacement for human expertise and that their tool shouldn’t be used as such.
    “If you use the tool smartly, it can help you,” said Lubomir Hadjiyski, Ph.D., a professor of radiology at the University of Michigan Medical School and the senior author of the study.
    When patients develop bladder cancer, surgeons often remove the entire bladder in an effort to keep the cancer from returning or spreading to other organs or areas. More evidence is building, though, that surgery may not be necessary if a patient has zero evidence of disease after chemotherapy.
    However, it’s difficult to determine whether the lesion left after treatment is simply tissue that’s become necrotic or scarred as a result of treatment or whether cancer remains. The researchers wondered if AI could help.
    “The big question was when you have such an artificial device next to you, how is it going to affect the physician?” Hadjiyski said. “Is it going to help? Is it going to confuse them? Is it going to raise their performance or will they simply ignore it?”
Fourteen physicians from different specialties — including radiology, urology and oncology — as well as two fellows and a medical student looked at pre- and post-treatment scans of 157 bladder tumors. The providers gave ratings for three measures that assessed the level of response to chemotherapy as well as a recommendation for the next treatment to be done for each patient (radiation or surgery).

  • This algorithm has opinions about your face

    When two people meet, they instantly size each other up, making snap judgments about everything from the other person’s age to their intelligence or trustworthiness based solely on the way they look. Those first impressions, though often inaccurate, can be extremely powerful, shaping our relationships and impacting everything from hiring decisions to criminal sentencing.
Researchers at Stevens Institute of Technology, in collaboration with Princeton University and University of Chicago, have now taught an AI algorithm to model these first impressions and accurately predict how people will be perceived based on a photograph of their face. The work appears in the April 21 issue of the Proceedings of the National Academy of Sciences.
    “There’s a wide body of research that focuses on modeling the physical appearance of people’s faces,” said Jordan W. Suchow, a cognitive scientist and AI expert at the School of Business at Stevens. “We’re bringing that together with human judgments and using machine learning to study people’s biased first impressions of one another.”
Suchow and team, including Joshua Peterson and Thomas Griffiths at Princeton, and Stefan Uddenberg and Alex Todorov at Chicago Booth, asked thousands of people to give their first impressions of over 1,000 computer-generated photos of faces, rating each on criteria such as how intelligent, electable, religious, trustworthy, or outgoing a photograph’s subject appeared to be. The responses were then used to train a neural network to make similar snap judgments about people based solely on photographs of their faces.
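The training setup described above is ordinary supervised learning: face representations in, crowd-averaged trait ratings out. The sketch below illustrates the idea with synthetic data and a ridge-regularized linear model standing in for the neural network; none of it is the authors’ actual pipeline, and the “embeddings” and “ratings” are generated, not real.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins: 1,000 'faces' as 32-dim embeddings, and a hidden
# linear rule generating crowd-averaged ratings for one trait on a 0-1 scale.
n_faces, dim = 1000, 32
embeddings = rng.normal(size=(n_faces, dim))
true_coef = rng.normal(size=dim) / np.sqrt(dim)
raw = embeddings @ true_coef + rng.normal(scale=0.1, size=n_faces)
ratings = 1 / (1 + np.exp(-raw))        # e.g. perceived 'trustworthiness'

# Fit a ridge-regularized linear model on the rating logits
logits = np.log(ratings / (1 - ratings))
A = embeddings.T @ embeddings + 1e-3 * np.eye(dim)
coef = np.linalg.solve(A, embeddings.T @ logits)

# The fitted model now predicts the crowd's snap judgment for unseen faces
pred = 1 / (1 + np.exp(-(embeddings @ coef)))
corr = np.corrcoef(pred, ratings)[0, 1]
print(round(corr, 3))
```

The key point mirrors the study’s framing: the model learns to reproduce human judgments, biases included, rather than any objective property of the person photographed.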
    “Given a photo of your face, we can use this algorithm to predict what people’s first impressions of you would be, and which stereotypes they would project onto you when they see your face,” Suchow explained.
Many of the algorithm’s findings align with common intuitions or cultural assumptions: people who smile tend to be seen as more trustworthy, for instance, while people with glasses tend to be seen as more intelligent. In other cases, it’s a little harder to understand exactly why the algorithm attributes a particular trait to a person.

  • New algorithm could simplify decisions for ship channel dredging

    A new decision-support tool could become a game changer in the dredging of ship channels. Millions of dollars are at stake every time a major ship channel is cleaned up. Delays in dredging can cost even more by triggering increased risks, repeated maintenance and lost revenue. In either case, the task cannot be put off indefinitely.
All ship channels must be regularly cleared of sand, sediment that settles on the bottom (called shoal), and miscellaneous trash. That means ship channel management regularly faces the mighty task of dredging. How do managers make the wisest decisions with the best timing?
    “The quandary involves weighing factors for the optimal decision of channel dredging and disposal activities,” said Zheyong Bian, assistant professor of construction management at the University of Houston College of Technology and the lead author of a study published in the journal Transportation Research Part E: Logistics and Transportation Review. “Some factors are static, like geographical features of navigation channels and confined disposal facilities. Others vary substantially, such as navigability condition deterioration (shoaling), traffic, economic values, annual budget and more.”
    Bian, then a doctoral student at Rutgers University, The State University of New Jersey, collaborated with project leader Yun Bai from the Center for Advanced Infrastructure and Transportation (CAIT) at Rutgers, to develop a dredging planning optimization model (DPOM) and a dynamic prioritization planning (DPP) algorithm that factor in known variables, such as the volume of debris expected and availability of nearby confined disposal facilities. It also considers grouping phases of the project and how costs can be affected by interest rates and inflation.
    But — this is a key advantage, Bian stressed — the algorithm also holds the flexibility that weighs input from local professionals. In other words, it values boots-on-the-ground opinions and learns from experience.
    Once all factors are included, the algorithm suggests timing, prioritization and the grouping of projects. It also projects costs, with interest and inflation included, as well as monies likely to be recouped (through repurposing of sand, for example).
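A dynamic prioritization rule of the kind described can be sketched as a score that trades off urgency against discounted net cost. The scoring formula, weights, and channel data below are invented for illustration; they are not the published DPOM/DPP model.

```python
from dataclasses import dataclass

@dataclass
class Channel:
    name: str
    shoal_rate: float      # navigability deterioration per year (fraction)
    dredge_cost: float     # present-day dredging cost, USD millions
    traffic_value: float   # annual economic value at risk, USD millions
    sand_recovery: float   # expected resale value of dredged sand, USD millions

def priority(ch, years_deferred, rate=0.04):
    """Higher score = dredge sooner. Urgency grows with deferral and shoaling;
    net cost is the inflated future cost minus value recouped from the sand."""
    urgency = ch.traffic_value * ch.shoal_rate * (1 + years_deferred)
    net_cost = ch.dredge_cost * (1 + rate) ** years_deferred - ch.sand_recovery
    return urgency / max(net_cost, 1e-9)

# Hypothetical channels with illustrative numbers
channels = [
    Channel("A", shoal_rate=0.15, dredge_cost=40.0, traffic_value=120.0, sand_recovery=5.0),
    Channel("B", shoal_rate=0.05, dredge_cost=25.0, traffic_value=90.0, sand_recovery=2.0),
    Channel("C", shoal_rate=0.30, dredge_cost=60.0, traffic_value=200.0, sand_recovery=8.0),
]
ranked = sorted(channels, key=lambda c: priority(c, years_deferred=1), reverse=True)
print([c.name for c in ranked])   # fastest-shoaling, highest-value channel first
```

In the published system the scores would additionally incorporate disposal-facility capacity, project grouping, budget constraints, and expert input, but the structure is the same: recompute priorities as conditions change and dredge the highest-scoring channels first.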
    In these days of tightened budgets, public funds are deployed with ever more care. The new DPOM model and DPP algorithm could strengthen cost efficiencies at ship channels around the world. The New Jersey Department of Transportation Office of Maritime Resources provided funding and data to support this study.
    Story Source:
Materials provided by University of Houston. Original written by Sally Strong. Note: Content may be edited for style and length.