More stories

  • An easier way to teach robots new skills

    With e-commerce orders pouring in, a warehouse robot picks mugs off a shelf and places them into boxes for shipping. Everything is humming along, until the warehouse processes a change and the robot must now grasp taller, narrower mugs that are stored upside down.
    Reprogramming that robot involves hand-labeling thousands of images that show it how to grasp these new mugs, then training the system all over again.
    But a new technique developed by MIT researchers would require only a handful of human demonstrations to reprogram the robot. This machine-learning method enables a robot to pick up and place never-before-seen objects that are in random poses it has never encountered. Within 10 to 15 minutes, the robot would be ready to perform a new pick-and-place task.
    The technique uses a neural network specifically designed to reconstruct the shapes of 3D objects. With just a few demonstrations, the system uses what the neural network has learned about 3D geometry to grasp new objects that are similar to those in the demos.
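    The article does not give implementation details, but the core idea of transferring a demonstrated grasp to a new object through a learned representation of 3D shape can be sketched roughly as follows. The shape_descriptor, record_demo, and transfer_grasp functions below are hypothetical stand-ins for illustration, not the researchers' code.

```python
import numpy as np

# Hypothetical sketch: transfer a demonstrated grasp to a new object by
# matching points through a learned 3D shape descriptor. shape_descriptor
# stands in for the pretrained network described above; the crude
# distance-based signature used here is purely illustrative.

def shape_descriptor(point_cloud: np.ndarray, query: np.ndarray) -> np.ndarray:
    """Placeholder for a learned descriptor of the object's geometry at a point."""
    dists = np.linalg.norm(point_cloud - query, axis=1)
    return np.sort(dists)[:32]              # simple geometric signature

def record_demo(demo_cloud: np.ndarray, grasp_point: np.ndarray) -> np.ndarray:
    """Store the descriptor observed at the human-demonstrated grasp point."""
    return shape_descriptor(demo_cloud, grasp_point)

def transfer_grasp(new_cloud: np.ndarray, demo_feature: np.ndarray) -> np.ndarray:
    """Choose the point on a new object whose descriptor best matches the demo."""
    errors = [np.linalg.norm(shape_descriptor(new_cloud, p) - demo_feature)
              for p in new_cloud]
    return new_cloud[int(np.argmin(errors))]

# Toy usage: a "new" mug that is a scaled, shifted copy of the demo object.
rng = np.random.default_rng(0)
demo_cloud = rng.normal(size=(500, 3))
new_cloud = demo_cloud * 1.3 + 0.05
feature = record_demo(demo_cloud, demo_cloud[42])
print(transfer_grasp(new_cloud, feature))   # candidate grasp point
```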
    In simulations and using a real robotic arm, the researchers show that their system can effectively manipulate never-before-seen mugs, bowls, and bottles, arranged in random poses, using only 10 demonstrations to teach the robot.
    “Our major contribution is the general ability to much more efficiently provide new skills to robots that need to operate in more unstructured environments where there could be a lot of variability. The concept of generalization by construction is a fascinating capability because this problem is typically so much harder,” says Anthony Simeonov, a graduate student in electrical engineering and computer science (EECS) and co-lead author of the paper.

  • Automated nutrition app can help people follow healthier diet

    People could benefit from fully automated, personalised nutrition advice, as a new research paper shows that an app improved diet quality in a clinical trial.
    A paper published in the Journal of Medical Internet Research today (Monday 25 April 2022) shows how the eNutri app, developed by researchers in human nutrition and biomedical engineering at the University of Reading, helped people to eat more healthily. Participants who were given automated personalised nutrition advice improved their healthy diet score by 6% compared to a control group who were given general healthy eating guidance.
    Dr Roz Fallaize, Dietitian and Research Fellow at the University of Reading’s Department of Food and Nutritional Sciences, said:
    “The research demonstrates that the eNutri technology is effective in helping users to improve their healthy food intake, with a significant improvement in diet quality for the group who had access to automated, personalised nutrition advice.”
    “While having a registered nutritionist or dietitian giving personalised dietary advice is ideal, this is often only available to those with health concerns or with the financial resource to pay. There is also growing interest in nutrition apps and web services, but many commercial apps tend to focus on weight loss or calorie counting rather than healthy eating.”
    “Nutritional advice should always be focused on healthy, balanced diets and positive relationships with food, and I’m pleased that our study helped people eat better. One exciting aspect of the eNutri system is the potential to offer it to lots of people at low cost.”
    Dr Rodrigo Zenun Franco, a PhD graduate from the University of Reading and lead author of the paper said:
    “The eNutri app prioritises healthy eating based on evidence and uniquely uses a diet scoring system to provide food-based advice that is tailored to the individual user.”
    “We are continuing to develop eNutri to suit specific population groups, including those with heart conditions, and hope to make this available to the public in the future.”
    People were either assigned to receive personalised nutrition advice or given general healthy eating advice. Those in the personalised group then had their diets scored according to 11 criteria based on UK dietary guidance. The eNutri app gave an automated assessment of diet quality, giving each user a ‘healthy diet score’ out of 100.
    The ‘healthy diet score’ includes assessments of intake of fruit, vegetables, wholegrains, and red and processed meat, with higher points awarded when users meet the recommended intakes. The personalised advice then targets the areas of their diet they would benefit most from changing.
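    The paper's exact criteria and weights are not reproduced here, but the general shape of a food-based diet-quality score out of 100 can be illustrated with a short sketch. The components, targets, and equal weighting below are assumptions chosen for illustration, not the eNutri scoring rules.

```python
# Illustrative diet-quality score out of 100. The component names, recommended
# targets, and equal weighting are assumptions for illustration only; the
# eNutri app scores 11 criteria defined by UK dietary guidance.

TARGETS = {
    "fruit_veg_portions": 5.0,      # more is better, capped at the target
    "wholegrain_portions": 3.0,     # more is better, capped at the target
    "red_processed_meat_g": 70.0,   # less is better
}

def component_score(item: str, intake: float) -> float:
    """Score one dietary component between 0 and 1."""
    target = TARGETS[item]
    if item == "red_processed_meat_g":              # penalise excess intake
        return 1.0 if intake <= target else max(0.0, target / intake)
    return min(1.0, intake / target)                # reward intake up to target

def healthy_diet_score(intakes: dict) -> float:
    """Equal-weighted score scaled to 0-100."""
    scores = [component_score(item, amount) for item, amount in intakes.items()]
    return 100.0 * sum(scores) / len(scores)

print(healthy_diet_score({"fruit_veg_portions": 4,
                          "wholegrain_portions": 2,
                          "red_processed_meat_g": 90}))
```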
    Story Source:
    Materials provided by University of Reading.

  • Using AI to detect cancer from patient data securely

    A new way of using artificial intelligence to predict cancer from patient data without putting personal information at risk has been developed by a team including University of Leeds medical scientists.
    Artificial intelligence (AI) can analyse large amounts of data, such as images or trial results, and can identify patterns often undetectable by humans, making it highly valuable in speeding up disease detection, diagnosis and treatment.
    However, using the technology in medical settings is controversial because of the risk of accidental data release, and because many systems are owned and controlled by private companies, which gives them access to confidential patient data and with it the responsibility for protecting it.
    The researchers set out to discover whether a form of AI, called swarm learning, could be used to help computers predict cancer in medical images of patient tissue samples, without releasing the data from hospitals.
    Swarm learning trains AI algorithms to detect patterns in data held at a local hospital or university, such as genetic changes within images of human tissue. The swarm learning system then sends this newly trained algorithm — but importantly no local data or patient information — to a central computer. There, it is combined with algorithms generated by other hospitals in an identical way to create an optimised algorithm. This is then sent back to the local hospital, where it is reapplied to the original data, and its improved sensitivity allows genetic changes to be detected more reliably.
    By undertaking this several times, the algorithm can be improved and one created that works on all the data sets. This means that the technique can be applied without the need for any data to be released to third party companies or to be sent between hospitals or across international borders.
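    The loop described above (train locally, share only model parameters, merge them centrally, send the merged model back, and repeat) can be sketched as simple parameter averaging. The linear model, the train_locally routine, and plain averaging below are illustrative assumptions, not the study's actual swarm learning implementation.

```python
import numpy as np

# Minimal sketch of the training loop described above: each hospital trains on
# its own data, only the model parameters (never the patient data) are shared,
# the parameters are merged, and the merged model is sent back for another
# round. The linear model and plain averaging are assumptions for illustration.

def train_locally(weights, X, y, lr=0.1, epochs=5):
    """One hospital updates the shared model on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)     # gradient of mean squared error
        w -= lr * grad
    return w

def swarm_round(weights, hospital_data):
    """Merge locally trained parameters without moving any patient data."""
    local_models = [train_locally(weights, X, y) for X, y in hospital_data]
    return np.mean(local_models, axis=0)       # the 'optimised' merged model

# Toy example: three hospitals, ten rounds of swarm training.
rng = np.random.default_rng(0)
hospitals = [(rng.normal(size=(50, 4)), rng.normal(size=50)) for _ in range(3)]
w = np.zeros(4)
for _ in range(10):
    w = swarm_round(w, hospitals)
```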
    The team trained AI algorithms on study data from three groups of patients from Northern Ireland, Germany and the USA. The algorithms were tested on two large sets of images generated at Leeds, and were found to have successfully learned how to predict the presence of different subtypes of cancer in the images.
    The research was led by Jakob Nikolas Kather, Visiting Associate Professor at the University of Leeds’ School of Medicine and Researcher at the University Hospital RWTH Aachen. The team included Professors Heike Grabsch and Phil Quirke, and Dr Nick West from the University of Leeds’ School of Medicine.
    Dr Kather said: “Based on data from over 5,000 patients, we were able to show that AI models trained with swarm learning can predict clinically relevant genetic changes directly from images of tissue from colon tumors.”
    Phil Quirke, Professor of Pathology in the University of Leeds’ School of Medicine, said: “We have shown that swarm learning can be used in medicine to train independent AI algorithms for any image analysis task. This means it is possible to overcome the need for data transfer without institutions having to relinquish secure control of their data.
    “Creating an AI system which can perform this task improves our ability to apply AI in the future.”
    Story Source:
    Materials provided by University of Leeds.

  • Exotic magnetic structures created with laser light

    Research at Lund University in Sweden has found a new way to create nano-sized magnetic particles using ultrafast laser light pulses. The discovery could pave the way for new and more energy-efficient technical components and become useful in the quantum computers of the future.
    Magnetic skyrmions are sometimes described as magnetic vortices. Unlike ferromagnetic states — which occur in conventional magnets such as compasses and refrigerator magnets — the skyrmion state is quite peculiar: the orientation of the magnetization does not point in the same direction everywhere in the material, but is instead best described as a kind of swirling magnetism.
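    As textbook background rather than a result of the Lund study, the ‘swirling’ character of a skyrmion is usually quantified by a topological winding number computed from the local magnetization direction:

```latex
% Topological charge (winding number) of a two-dimensional spin texture
% \mathbf{m}(x, y) with |\mathbf{m}| = 1: a single skyrmion carries N = \pm 1,
% whereas a uniform ferromagnetic state gives N = 0.
N = \frac{1}{4\pi} \int \mathbf{m} \cdot
    \left( \partial_x \mathbf{m} \times \partial_y \mathbf{m} \right)
    \, \mathrm{d}x \, \mathrm{d}y
```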
    Skyrmions are of great interest to both basic research and industry, as they can be used to manufacture more compact computer memories. However, that is easier said than done. Using skyrmions for technical purposes requires efficient ways of writing, erasing and manipulating the particles on short time scales and with high spatial precision.
    In a new study, researchers Claudio Verdozzi from Lund University and Emil Viñas Boström and Angel Rubio from the Max Planck Institute for the Structure and Dynamics of Matter in Hamburg have found a new approach.
    “In our study, we have theoretically shown how it is possible to meet one of these requirements, that is, how to create magnetic skyrmions in ultra-short time scales using pulses of laser light,” says Claudio Verdozzi, physics researcher at Lund University.
    The research team has identified a microscopic mechanism that explains an experimental protocol that has been shown to be useful in creating the strange skyrmions. Using femtosecond laser pulses — light pulses that last one millionth of a billionth of a second — the researchers showed that it is possible to create skyrmions ultra-fast.
    “Our results are of great relevance to creating more energy-efficient technical components. Our study shows that light can be used to manipulate localized magnetic excitations in very short time scales,” says Claudio Verdozzi.
    There are a number of applications that the new discovery could lead to, including quantum technology — an area where quantum mechanical properties are used to perform calculations that traditional computers cannot handle. Magnetic excitations such as skyrmions and so-called spin waves are also believed to be able to help reduce energy consumption in technological components, and could thus help to achieve future climate goals.
    “Skyrmions are in focus for both theoretical and experimental research thanks to their technological potential. Also, their exotic magnetic patterns possess a conceptually and mathematically beautiful appeal that makes them very interesting,” concludes Claudio Verdozzi.
    Story Source:
    Materials provided by Lund University.

  • Computing: Resilient system using only non-volatile memory

    A KAIST research team has developed hardware and software technology that ensures both data and execution persistence. The Lightweight Persistence Centric System (LightPC) makes the systems resilient against power failures by utilizing only non-volatile memory as the main memory.
    “We mounted non-volatile memory on a system board prototype and created an operating system to verify the effectiveness of LightPC,” said Professor Myoungsoo Jung. The team confirmed that LightPC kept programs executing correctly even when the power was cut and restored in the middle of execution, while providing up to eight times more memory capacity, 4.3 times faster application execution, and 73% lower power consumption compared to traditional systems.
    Professor Jung said that LightPC can be utilized in a variety of fields such as data centers and high-performance computing to provide large-capacity memory, high performance, low power consumption, and service reliability.
    In general, power failures on legacy systems can lead to the loss of data stored in the DRAM-based main memory. Unlike volatile memory such as DRAM, non-volatile memory can retain its data without power. Although non-volatile memory consumes less power and offers larger capacity than DRAM, it is typically relegated to secondary storage because of its lower write performance. For this reason, non-volatile memory is often used alongside DRAM. However, modern systems employing non-volatile memory-based main memory experience unexpected performance degradation due to the complicated memory microarchitecture.
    To make both data and execution persistent in legacy systems, it is necessary to transfer the data from the volatile memory to the non-volatile memory. Checkpointing is one possible solution: it periodically transfers the data in preparation for a sudden power failure. While this technology is essential for ensuring high mobility and reliability for users, checkpointing has serious drawbacks. It takes additional time and power to move data and requires a data recovery process as well as restarting the system.
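    Checkpointing itself is a generic technique, and a minimal sketch shows where its overhead comes from. The snapshot file, interval, and toy workload below are illustrative assumptions; LightPC operates at the hardware and operating-system level rather than in application code.

```python
import pickle
import time

# Minimal sketch of application-level checkpointing: periodically copy volatile
# state to durable storage so it can be restored after a power failure. This is
# exactly the copy-and-recover overhead that a non-volatile-memory-only main
# memory avoids.

CHECKPOINT_FILE = "state.ckpt"   # hypothetical snapshot location
INTERVAL_S = 1.0                 # how often to persist state

def save_checkpoint(state: dict) -> None:
    with open(CHECKPOINT_FILE, "wb") as f:
        pickle.dump(state, f)                 # extra time and power to move data

def restore_checkpoint() -> dict:
    try:
        with open(CHECKPOINT_FILE, "rb") as f:
            return pickle.load(f)             # recovery step after a restart
    except FileNotFoundError:
        return {"step": 0, "total": 0}        # fresh start if no snapshot exists

state = restore_checkpoint()
last_save = time.monotonic()
while state["step"] < 100:
    state["total"] += state["step"]           # the actual computation
    state["step"] += 1
    time.sleep(0.05)                          # simulated work
    if time.monotonic() - last_save >= INTERVAL_S:
        save_checkpoint(state)                # periodic persistence overhead
        last_save = time.monotonic()
```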
    In order to address these issues, the research team developed a processor and memory controller that raise the performance of a main memory built only from non-volatile memory. LightPC matches the performance of DRAM by minimizing the internal volatile components of the non-volatile memory, exposing the non-volatile memory (PRAM) media directly to the host, and increasing parallelism so that requests are serviced on the fly as quickly as possible.
    The team also presented operating system technology that quickly makes the execution states of running processes persistent without the need for a checkpointing process. The operating system keeps all program executions idle and prevents modifications to execution states and data while the data is transferred, so that consistency is guaranteed within a period much shorter than the standard power hold-up time of about 16 milliseconds. When the power is recovered, the computer almost immediately revives itself and resumes all the offline processes without the need for a boot process.
    The researchers will present their work (LightPC: Hardware and Software Co-Design for Energy-Efficient Full System Persistence) at the International Symposium on Computer Architecture (ISCA) 2022 in New York in June.
    Story Source:
    Materials provided by The Korea Advanced Institute of Science and Technology (KAIST).

  • Breakthrough for efficient and high-speed spintronic devices

    Sharing real-time information requires complex networks of systems. A promising approach for speeding up data storage devices consists of switching the magnetization, or the electrons’ spin, of magnetic materials with ultra-short femtosecond laser pulses. But, how the spin evolves in the nanoworld on extremely short time scales, in one millionth of one billionth of a second, has remained largely mysterious. The team of Professor François Légaré at the Institut national de la recherche scientifique (INRS) has made a major breakthrough in this field, in collaboration with TU Wien, Austria, the French national synchrotron facility (SOLEIL) and other international partners. Their work was published in the journal Optica.
    So far, studies on the subject rely heavily on large X-ray facilities with limited access, such as free-electron lasers and synchrotrons. The team demonstrates, for the first time, a tabletop ultrafast soft X-ray microscope that spatio-temporally resolves the spin dynamics inside rare-earth materials, which are promising for spintronic devices.
    This new soft X-ray source based on a high-energy Ytterbium laser represents a critical advance for studying future energy-efficient and high-speed spintronic devices and could be used for many applications in physics, chemistry, and biology.
    “Our approach provides a robust, cost-efficient, energy-scalable and elegant solution for many laboratories. It allows the study of ultrafast dynamics in nanoscale and mesoscale structures with both nanometre spatial and femtosecond temporal resolution, as well as with element specificity,” says Professor Andrius Baltuska of TU Wien.
    Bright X-ray pulses to watch the spin
    With this bright source of X-ray photons, a series of snapshot images of the nanoscale rare-earth magnetic structures was recorded. The images clearly expose the fast demagnetization process, and they provide information on the magnetic properties that is as rich and accurate as that obtained using large-scale X-ray facilities.
    “Development of ultrafast tabletop X-ray sources is exciting for cutting-edge technological applications and modern fields of science. We are excited about our results, which could be helpful for future research on spintronics as well as other potential fields,” says INRS postdoctoral researcher Dr. Guangyu Fan.
    “Rare-earth systems are trending in the community because of their nanometer size, faster speed, and topologically protected stability. The X-ray source is very attractive for many studies on future spintronic devices composed of rare-earth elements,” says Nicolas Jaouen, senior scientist at the French national synchrotron facility.
    Professor Légaré emphasizes the collaboration between experts in the development of state-of-the-art light sources and experts in the ultrafast dynamics of magnetic materials at the nanoscale. “Considering the quick emergence of high-power Ytterbium laser technology, this work represents huge potential for high-performance soft X-ray sources. This new generation of lasers, which will soon be available at the Advanced Laser Light Source (ALLS), will have many future applications in the fields of physics, chemistry, and even biology,” he says.
    Story Source:
    Materials provided by Institut national de la recherche scientifique – INRS. Original written by Audrey-Maude Vézina.

  • Scientists turn a hydrogen molecule into a quantum sensor

    Physicists at the University of California, Irvine have demonstrated the use of a hydrogen molecule as a quantum sensor in a terahertz laser-equipped scanning tunneling microscope (STM), a technique that can measure the chemical properties of materials at unprecedented time and spatial resolutions.
    This new technique can also be applied to the analysis of two-dimensional materials, which have the potential to play a role in advanced energy systems, electronics and quantum computers.
    Today in Science, the researchers in UCI’s Department of Physics & Astronomy and Department of Chemistry describe how they positioned two bound atoms of hydrogen between the silver tip of the STM and a sample composed of a flat copper surface arrayed with small islands of copper nitride. With laser pulses lasting trillionths of a second, the scientists were able to excite the hydrogen molecule and detect changes in its quantum states at cryogenic temperatures and in the ultrahigh vacuum environment of the instrument, rendering atomic-scale, time-lapsed images of the sample.
    “This project represents an advance in both the measurement technique and the scientific question the approach allowed us to explore,” said co-author Wilson Ho, Bren Professor of physics & astronomy and chemistry. “A quantum microscope that relies on probing the coherent superposition of states in a two-level system is much more sensitive than existing instruments that are not based on this quantum physics principle.”
    Ho said the hydrogen molecule is an example of a two-level system because its orientation shifts between two positions, up and down and slightly horizontally tilted. Through a laser pulse, the scientists can coax the system to go from a ground state to an excited state in a cyclical fashion resulting in a superposition of the two states. The duration of the cyclic oscillations is vanishingly brief — lasting mere tens of picoseconds — but by measuring this “decoherence time” and the cyclic periods the scientists were able to see how the hydrogen molecule was interacting with its environment.
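    As general background on two-level dynamics, and not the specific model used in the paper, a resonant drive places the molecule in a coherent superposition of the ground and excited states, and the coherence between the two components decays on the decoherence time:

```latex
% Generic resonantly driven two-level system: the state oscillates between the
% ground state |g> and the excited state |e> at the Rabi frequency \Omega, and
% the coherence between the two components decays on the decoherence time T_2.
|\psi(t)\rangle = \cos\!\left(\tfrac{\Omega t}{2}\right)|g\rangle
    \;-\; i\,\sin\!\left(\tfrac{\Omega t}{2}\right)|e\rangle ,
\qquad
|\rho_{ge}(t)| \propto e^{-t/T_2}
```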
    “The hydrogen molecule became part of the quantum microscope in the sense that wherever the microscope scanned, the hydrogen was there in between the tip and the sample,” said Ho. “It makes for an extremely sensitive probe, allowing us to see variations down to 0.1 angstrom. At this resolution, we could see how the charge distributions change on the sample.”
    The space between the STM tip and the sample is almost unimaginably small, about six angstroms or 0.6 nanometers. The STM that Ho and his team assembled is equipped to detect minute electrical current flowing in this space and produce spectroscopic readings proving the presence of the hydrogen molecule and sample elements. Ho said this experiment represents the first demonstration of a chemically sensitive spectroscopy based on terahertz-induced rectification current through a single molecule.
    The ability to characterize materials at this level of detail based on hydrogen’s quantum coherence can be of great use in the science and engineering of catalysts, since their functioning often depends on surface imperfections at the scale of single atoms, according to Ho.
    “As long as hydrogen can be adsorbed onto a material, in principle you can use hydrogen as a sensor to characterize the material itself through observations of its electrostatic field distribution,” said study lead author Likun Wang, a UCI graduate student in physics & astronomy.
    Joining Ho and Wang on this project, which was supported by the U.S. Department of Energy Office of Basic Energy Sciences, was Yunpeng Xia, UCI graduate student in physics & astronomy.
    Story Source:
    Materials provided by University of California – Irvine.

  • Artificial intelligence helps physicians better assess the effectiveness of bladder cancer treatment

    In a small but multi-institutional study, an artificial intelligence-based system improved providers’ assessments of whether patients with bladder cancer had complete response to chemotherapy before a radical cystectomy (bladder removal surgery).
    Yet the researchers caution that AI isn’t a replacement for human expertise and that their tool shouldn’t be used as such.
    “If you use the tool smartly, it can help you,” said Lubomir Hadjiyski, Ph.D., a professor of radiology at the University of Michigan Medical School and the senior author of the study.
    When patients develop bladder cancer, surgeons often remove the entire bladder in an effort to keep the cancer from returning or spreading to other organs or areas. More evidence is building, though, that surgery may not be necessary if a patient has zero evidence of disease after chemotherapy.
    However, it’s difficult to determine whether the lesion left after treatment is simply tissue that’s become necrotic or scarred as a result of treatment or whether cancer remains. The researchers wondered if AI could help.
    “The big question was when you have such an artificial device next to you, how is it going to affect the physician?” Hadjiyski said. “Is it going to help? Is it going to confuse them? Is it going to raise their performance or will they simply ignore it?”
    Fourteen physicians from different specialties — including radiology, urology and oncology — as well as two fellows and a medical student looked at pre- and post-treatment scans of 157 bladder tumors. The providers gave ratings for three measures that assessed the level of response to chemotherapy, as well as a recommendation for the next treatment for each patient (radiation or surgery).