More stories

  • Multi-state data storage leaving binary behind

    Electronic data is being produced at a breathtaking rate.
    The total amount of data stored in data centres around the globe is of the order of ten zettabytes (a zettabyte is a trillion gigabytes), and that amount is estimated to double every couple of years.
    With 8% of global electricity already consumed by information and communication technology (ICT), low-energy data storage is a key priority.
    To date there is no clear winner in the race for a next-generation memory that is non-volatile, highly energy-efficient, low-cost, and high-density, with great endurance and fast access operation.
    The joint international team comprehensively reviews ‘multi-state memory’ data storage, which steps ‘beyond binary’ to store more data than just 0s and 1s.
    MULTI-STATE MEMORY: MORE THAN JUST ZEROES AND ONES
    Multi-state memory is an extremely promising technology for future data storage, with the ability to store data in more than a single bit (i.e., 0 or 1) allowing much higher storage density (the amount of data stored per unit area).

    This circumvents the plateauing of benefits historically offered by ‘Moore’s Law’, under which component size halved about every two years. In recent years, the long-predicted plateau has been observed, with charge leakage and spiralling research and fabrication costs putting the final nail in the Moore’s Law coffin.
    Non-volatile multi-state memory (NMSM) offers energy efficiency, high density, non-volatility, fast access, and low cost.
    Storage density is dramatically enhanced without scaling down the dimensions of the memory cell, making memory devices more efficient and less expensive.
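    As a rough illustration of the density gain (a sketch, not a figure from the review): a cell that distinguishes n stable states stores log2(n) bits, so a 4-state cell holds twice the data of a binary cell in the same area.

```python
import math

def bits_per_cell(n_states: int) -> float:
    """Bits stored by one memory cell that can hold n_states distinguishable levels."""
    return math.log2(n_states)

# Density gain of multi-state cells relative to binary (2-state) cells.
for n in (2, 4, 8, 16):
    gain = bits_per_cell(n) / bits_per_cell(2)
    print(f"{n}-state cell: {bits_per_cell(n):.1f} bits/cell ({gain:.0f}x binary density)")
```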
    NEUROMORPHIC COMPUTING: MIMICKING THE HUMAN BRAIN
    Multi-state memory also enables the proposed future technology of neuromorphic computing, which would mirror the structure of the human brain. This radically different, brain-inspired computing regime could potentially provide the economic impetus for adoption of a novel technology such as NMSM.
    NMSMs allow analog calculation, which could be vital to intelligent, neuromorphic networks, as well as potentially helping us finally unravel the working mechanism of the human brain itself.
    THE STUDY
    The paper reviews device architectures, working mechanisms, material innovation, challenges, and recent progress for leading NMSM candidates, including:
    flash memory
    magnetic random-access memory (MRAM)
    resistive random-access memory (RRAM)
    ferroelectric random-access memory (FeRAM)
    phase-change memory (PCM)

  • New project to build nano-thermometers could revolutionize temperature imaging

    Cheaper refrigerators? Stronger hip implants? A better understanding of human disease? All of these could be possible and more, someday, thanks to an ambitious new project underway at the National Institute of Standards and Technology (NIST).
    NIST researchers are in the early stages of a massive undertaking to design and build a fleet of tiny ultra-sensitive thermometers. If they succeed, their system will be the first to make real-time measurements of temperature on the microscopic scale in an opaque 3D volume — which could include medical implants, refrigerators, and even the human body.
    The project is called Thermal Magnetic Imaging and Control (Thermal MagIC), and the researchers say it could revolutionize temperature measurements in many fields: biology, medicine, chemical synthesis, refrigeration, the automotive industry, plastic production — “pretty much anywhere temperature plays a critical role,” said NIST physicist Cindi Dennis. “And that’s everywhere.”
    The NIST team has now finished building its customized laboratory spaces for this unique project and has begun the first major phase of the experiment.
    Thermal MagIC will work by using nanometer-sized objects whose magnetic signals change with temperature. The objects would be incorporated into the liquids or solids being studied — the melted plastic that might be used as part of an artificial joint replacement, or the liquid coolant being recirculated through a refrigerator. A remote sensing system would then pick up these magnetic signals, meaning the system being studied would be free from wires or other bulky external objects.
    The final product could make temperature measurements that are 10 times more precise than state-of-the-art techniques, acquired in one-tenth the time in a volume 10,000 times smaller. This equates to measurements accurate to within 25 millikelvin (thousandths of a kelvin) in as little as a tenth of a second, in a volume just a hundred micrometers (millionths of a meter) on a side. The measurements would be “traceable” to the International System of Units (SI); in other words, the readings could be accurately related to the fundamental definition of the kelvin, the world’s basic unit of temperature.

    The system aims to measure temperatures over the range from 200 to 400 kelvin (K), which is about -99 to 260 degrees Fahrenheit (F). This would cover most potential applications — at least the ones the Thermal MagIC team envisions will be possible within the next 5 years. Dennis and her colleagues see potential for a much larger temperature range, stretching from 4 K to 600 K, which would encompass everything from supercooled superconductors to molten lead. But that is not a part of current development plans.
    “This is a big enough sea change that we expect that if we can develop it — and we have confidence that we can — other people will take it and really run with it and do things that we currently can’t imagine,” Dennis said.
    Potential applications are mostly in research and development, but Dennis said the increase in knowledge would likely trickle down to a variety of products, possibly including 3D printers, refrigerators, and medicines.
    What Is It Good For?
    Whether it’s the thermostat in your living room or a high-precision standard instrument that scientists use for laboratory measurements, most thermometers used today can only measure relatively big areas — on a macroscopic as opposed to microscopic level. These conventional thermometers are also intrusive, requiring sensors to penetrate the system being measured and to connect to a readout system by bulky wires.

    Infrared thermometers, such as the forehead instruments used at many doctors’ offices, are less intrusive. But they still only make macroscopic measurements and cannot see beneath surfaces.
    Thermal MagIC should let scientists get around both these limitations, Dennis said.
    Engineers could use Thermal MagIC to study, for the first time, how heat transfer occurs within different coolants on the microscale, which could aid their quest to find cheaper, less energy-intensive refrigeration systems.
    Doctors could use Thermal MagIC to study diseases, many of which are associated with temperature increases — a hallmark of inflammation — in specific parts of the body.
    And manufacturers could use the system to better control 3D printing machines that melt plastic to build custom objects such as medical implants and prostheses. Without the ability to measure temperature on the microscale, 3D printing developers are missing crucial information about what’s going on inside the plastic as it solidifies into an object. More knowledge could improve the strength and quality of 3D-printed materials someday, by giving engineers more control over the 3D printing process.
    Giving It OOMMF
    The first step in making this new thermometry system is creating nano-sized magnets that will give off strong magnetic signals in response to temperature changes. To keep particle concentrations as low as possible, the magnets will need to be 10 times more sensitive to temperature changes than any objects that currently exist.
    To get that kind of signal, Dennis said, researchers will likely need to use multiple magnetic materials in each nano-object. A core of one substance will be surrounded by other materials like the layers of an onion.
    The trouble is that there are practically endless combinations of properties that can be tweaked, including the materials’ composition, size, shape, the number and thickness of the layers, or even the number of materials. Going through all of these potential combinations and testing each one for its effect on the object’s temperature sensitivity could take multiple lifetimes to accomplish.
    To help them get there in months instead of decades, the team is turning to sophisticated software: the Object Oriented MicroMagnetic Framework (OOMMF), a widely used modeling program developed by NIST researchers Mike Donahue and Don Porter.
    The Thermal MagIC team will use this program to create a feedback loop. NIST chemists Thomas Moffat, Angela Hight Walker and Adam Biacchi will synthesize new nano-objects. Then Dennis and her team will characterize the objects’ properties. And finally, Donahue will help them feed that information into OOMMF, which will make predictions about what combinations of materials they should try next.
    “We have some very promising results from the magnetic nano-objects side of things, but we’re not quite there yet,” Dennis said.
    Each Dog Is a Voxel
    So how do they measure the signals given out by tiny concentrations of nano-thermometers inside a 3D object in response to temperature changes? They do it with a machine called a magnetic particle imager (MPI), which surrounds the sample and measures a magnetic signal coming off the nanoparticles.
    Effectively, they measure changes to the magnetic signal coming off one small volume of the sample, called a “voxel” — basically a 3D pixel — and then scan through the entire sample one voxel at a time.
    But it’s hard to focus a magnetic field, said NIST physicist Solomon Woods. So they achieve their goal in reverse.
    Consider a metaphor. Say you have a dog kennel, and you want to measure how loud each individual dog is barking. But you only have one microphone. If multiple dogs are barking at once, your mic will pick up all of that sound, but with only one mic you won’t be able to distinguish one dog’s bark from another’s.
    However, if you could quiet each dog somehow — perhaps by occupying its mouth with a bone — except for a single cocker spaniel in the corner, then your mic would still be picking up all the sounds in the room, but the only sound would be from the cocker spaniel.
    In theory, you could do this with each dog in sequence — first the cocker spaniel, then the mastiff next to it, then the labradoodle next in line — each time leaving just one dog bone-free.
    In this metaphor, each dog is a voxel.
    Basically, the researchers max out the ability of all but one small volume of their sample to respond to a magnetic field. (This is the equivalent of stuffing each dog’s mouth with a delicious bone.) Then, measuring the change in magnetic signal from the entire sample effectively lets you measure just that one little section.
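    The saturate-and-scan idea can be sketched in a few lines of Python (a toy illustration, not the NIST instrument): drive every voxel except one into saturation, where its response is fixed and known, so any variation in the total signal is attributable to the one unsaturated voxel.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sample: each voxel's (unknown) temperature-dependent response amplitude.
true_response = rng.uniform(0.5, 1.5, size=8)
SATURATED = 1.0  # a saturated voxel contributes a fixed, known response

def total_signal(free_voxel: int) -> float:
    """Aggregate signal when every voxel except `free_voxel` is saturated."""
    return sum(r if i == free_voxel else SATURATED
               for i, r in enumerate(true_response))

# Scan voxel by voxel: subtract the known saturated background to recover
# each voxel's individual response from whole-sample measurements.
n = len(true_response)
recovered = np.array([total_signal(i) - SATURATED * (n - 1) for i in range(n)])
assert np.allclose(recovered, true_response)
```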
    MPI systems similar to this exist but are not sensitive enough to measure the kind of tiny magnetic signal that would come from a small change in temperature. The challenge for the NIST team is to boost the signal significantly.
    “Our instrumentation is very similar to MPI, but since we have to measure temperature, not just measure the presence of a nano-object, we essentially need to boost our signal-to-noise ratio over MPI by a thousand or 10,000 times,” Woods said.
    They plan to boost the signal using state-of-the-art technologies. For example, Woods may use superconducting quantum interference devices (SQUIDs), cryogenic sensors that measure extremely subtle changes in magnetic fields, or atomic magnetometers, which detect how energy levels of atoms are changed by an external magnetic field. Woods is working on which are best to use and how to integrate them into the detection system.
    The final part of the project is making sure the measurements are traceable to the SI, a project led by NIST physicist Wes Tew. That will involve measuring the nano-thermometers’ magnetic signals at different temperatures that are simultaneously being measured by standard instruments.
    Other key NIST team members include Thinh Bui, Eric Rus, Brianna Bosch Correa, Mark Henn, Eduardo Correa and Klaus Quelhas.
    Before finishing their new laboratory space, the researchers were able to complete some important work. In a paper published last month in the International Journal on Magnetic Particle Imaging, the group reported that they had found and tested a “promising” nanoparticle material made of iron and cobalt, with temperature sensitivities that varied in a controllable way depending on how the team prepared the material. Adding an appropriate shell material to encase this nanoparticle “core” would bring the team closer to creating a working temperature-sensitive nanoparticle for Thermal MagIC.
    In the past few weeks, the researchers have made further progress testing combinations of materials for the nanoparticles.
    “Despite the challenge of working during the pandemic, we have had some successes in our new labs,” Woods said. “These achievements include our first syntheses of multi-layer nanomagnetic systems for thermometry, and ultra-stable magnetic temperature measurements using techniques borrowed from atomic clock research.”

  • 'Universal law of touch' will enable new advances in virtual reality

    Seismic waves, commonly associated with earthquakes, have been used by scientists to develop a universal scaling law for the sense of touch. A team, led by researchers at the University of Birmingham, used Rayleigh waves to create the first scaling law for touch sensitivity. The results are published in Science Advances.
    The researchers are part of a European consortium (H-Reality) that is already using the theory to develop new Virtual Reality technologies that incorporate the sense of touch.
    Rayleigh waves are created by impact between objects and are commonly thought to travel only along surfaces. The team discovered that, when it comes to touch, the waves also travel through layers of skin and bone and are picked up by the body’s touch receptor cells.
    Using mathematical modelling of these touch receptors, the researchers showed how the receptors were located at depths that allowed them to respond to Rayleigh waves. The interaction of these receptors with the Rayleigh waves will vary across species, but the ratio of receptor depth to wavelength remains the same, enabling the universal law to be defined.
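    The quantity behind the scaling law can be illustrated numerically (a sketch with assumed, order-of-magnitude values; the wave speed, frequency and receptor depth below are illustrative, not figures from the study): a Rayleigh wave's wavelength follows λ = v / f, and the law concerns the dimensionless ratio of receptor depth to λ.

```python
def rayleigh_wavelength(speed_m_s: float, freq_hz: float) -> float:
    """Wavelength of a Rayleigh wave: lambda = v / f."""
    return speed_m_s / freq_hz

# Illustrative (assumed) values only: a surface-wave speed in soft tissue
# and a vibrotactile frequency in the range touch receptors respond to.
speed = 5.0    # m/s, assumed
freq = 250.0   # Hz, assumed
depth = 2e-3   # m, assumed receptor depth

lam = rayleigh_wavelength(speed, freq)
ratio = depth / lam  # the dimensionless quantity the universal law fixes
print(f"wavelength = {lam*1000:.0f} mm, depth/wavelength = {ratio:.2f}")
```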
    The mathematics used by the researchers to develop the law is based on approaches first developed over a hundred years ago to model earthquakes. The law supports predictions made by the Nobel-Prize-winning physicist Georg von Békésy who first suggested the mathematics of earthquakes could be used to explore connections between Rayleigh waves and touch.
    The team also found that the interaction of the waves and receptors held even when the stiffness of the outermost layer of skin changed. The ability of the receptors to respond to Rayleigh waves remained unchanged despite the many variations in this outer layer caused by age, gender, profession, or even hydration.
    Dr Tom Montenegro-Johnson, of the University of Birmingham’s School of Mathematics, led the research. He explains: “Touch is a primordial sense, as important to our ancient ancestors as it is to modern day mammals, but it’s also one of the most complex and therefore least understood. While we have universal laws to explain sight and hearing, for example, this is the first time that we’ve been able to explain touch in this way.”
    James Andrews, co-author of the study at the University of Birmingham, adds: “The principles we’ve defined enable us to better understand the different experiences of touch among a wide range of species. For example, if you indent the skin of a rhinoceros by 5mm, they would have the same sensation as a human with a similar indentation — it’s just that the forces required to produce the indentation would be different. This makes a lot of sense in evolutionary terms, since it’s connected to relative danger and potential damage.”
    The work was funded by the European Union’s Horizon 2020 research and innovation programme, under collaborative project “H-Reality.” The other institutions involved in the project are Ultraleap Ltd. (UK), Actronika (France), TU Delft (The Netherlands), and CNRS (France).

    Story Source:
    Materials provided by University of Birmingham. Note: Content may be edited for style and length.

  • Researchers use artificial intelligence language tools to decode molecular movements

    By applying natural language processing tools to the movements of protein molecules, University of Maryland scientists created an abstract language that describes the multiple shapes a protein molecule can take and how and when it transitions from one shape to another.
    A protein molecule’s function is often determined by its shape and structure, so understanding the dynamics that control shape and structure can open a door to understanding everything from how a protein works to the causes of disease and the best way to design targeted drug therapies. This is the first time a machine learning algorithm has been applied to biomolecular dynamics in this way, and the method’s success provides insights that can also help advance artificial intelligence (AI). A research paper on this work was published on October 9, 2020, in the journal Nature Communications.
    “Here we show the same AI architectures used to complete sentences when writing emails can be used to uncover a language spoken by the molecules of life,” said the paper’s senior author, Pratyush Tiwary, an assistant professor in UMD’s Department of Chemistry and Biochemistry and Institute for Physical Science and Technology. “We show that the movement of these molecules can be mapped into an abstract language, and that AI techniques can be used to generate biologically truthful stories out of the resulting abstract words.”
    Biological molecules are constantly in motion, jiggling around in their environment. Their shape is determined by how they are folded and twisted. They may remain in a given shape for seconds or days before suddenly springing open and refolding into a different shape or structure. The transition from one shape to another occurs much like the stretching of a tangled coil that opens in stages. As different parts of the coil release and unfold, the molecule assumes different intermediary conformations.
    But the transition from one form to another occurs in picoseconds (trillionths of a second) or faster, which makes it difficult for experimental methods such as high-powered microscopy and spectroscopy to capture exactly how the unfolding happens, what parameters affect the unfolding and what different shapes are possible. The answers to those questions form the biological story that Tiwary’s new method can reveal.
    Tiwary and his team applied Newton’s laws of motion — which can predict the movement of atoms within a molecule — with powerful supercomputers, including UMD’s Deepthought2, to develop statistical physics models that simulate the shape, movement and trajectory of individual molecules.
    Then they fed those models into a machine learning algorithm, like the one Gmail uses to automatically complete sentences as you type. The algorithm approached the simulations as a language in which each molecular movement forms a letter that can be strung together with other movements to make words and sentences. By learning the rules of syntax and grammar that determine which shapes and movements follow one another and which don’t, the algorithm predicts how the protein untangles as it changes shape and the variety of forms it takes along the way.
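    The language analogy can be sketched with a toy model (an illustration, not the authors' method: the paper used a recurrent neural network, while this sketch substitutes a simple bigram "grammar" over discretized conformational states):

```python
from collections import Counter, defaultdict

# Toy trajectory: a molecule's conformational states discretized into letters,
# e.g. 'f' = folded, 'i' = intermediate, 'u' = unfolded (hypothetical data).
trajectory = "fffiifffiuuufffiiuuf"

# Learn the bigram "grammar": which state tends to follow which.
counts = defaultdict(Counter)
for a, b in zip(trajectory, trajectory[1:]):
    counts[a][b] += 1

def next_state_probs(state: str) -> dict:
    """Predicted distribution over the next conformational state."""
    total = sum(counts[state].values())
    return {s: c / total for s, c in counts[state].items()}

# In this toy data the unfolded state 'u' is only reached via the
# intermediate 'i', never directly from 'f' - the model learns that rule.
print(next_state_probs("f"))
print(next_state_probs("i"))
```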
    To demonstrate that their method works, the team applied it to a small biomolecule called a riboswitch, which had previously been analyzed using spectroscopy. The results, which revealed the various forms the riboswitch could take as it was stretched, matched the results of the spectroscopy studies.
    “One of the most important uses of this, I hope, is to develop drugs that are very targeted,” Tiwary said. “You want to have potent drugs that bind very strongly, but only to the thing that you want them to bind to. We can achieve that if we can understand the different forms that a given biomolecule of interest can take, because we can make drugs that bind only to one of those specific forms at the appropriate time and only for as long as we want.”
    An equally important part of this research is the knowledge gained about the language processing system Tiwary and his team used, which is generally called a recurrent neural network, and in this specific instance a long short-term memory network. The researchers analyzed the mathematics underpinning the network as it learned the language of molecular motion. They found that the network used a kind of logic that was similar to an important concept from statistical physics called path entropy. Understanding this opens opportunities for improving recurrent neural networks in the future.
    “It is natural to ask if there are governing physical principles making AI tools successful,” Tiwary said. “Here we discover that, indeed, it is because the AI is learning path entropy. Now that we know this, it opens up more knobs and gears we can tune to do better AI for biology and perhaps, ambitiously, even improve AI itself. Anytime you understand a complex system such as AI, it becomes less of a black-box and gives you new tools for using it more effectively and reliably.”

  • New model may explain rarity of certain malaria-blocking mutations

    A new computational model suggests that certain mutations that block infection by the most dangerous species of malaria have not become widespread in people because of the parasite’s effects on the immune system. Bridget Penman of the University of Warwick, U.K., and Sylvain Gandon of the CNRS and Montpellier University, France, present these findings in the open-access journal PLOS Computational Biology.
    Malaria is a potentially lethal, mosquito-borne disease caused by parasites of the Plasmodium genus. Several protective adaptations to malaria have spread widely among humans, such as the sickle-cell mutation. Laboratory experiments suggest that certain other mutations could be highly protective against the most dangerous human-infecting malaria species, Plasmodium falciparum. However, despite being otherwise benign, these mutations have not become widespread.
    To help clarify why some protective mutations may remain rare, Penman and colleagues developed a computational model that simulates the epidemiology of malaria infection, as well as the evolution of protective mutations. Importantly, the model also incorporates mechanisms of adaptive immunity, in which the immune system “learns” to recognize and attack specific pathogens, such as P. falciparum.
    Analysis of the model’s predictions suggests that if people rapidly gain adaptive immunity to the severe effects of P. falciparum malaria, mutations capable of blocking P. falciparum infection are unlikely to spread among the population. The fewer the number of infections it takes for people to become immune to the severe effects of malaria, the less likely it is that malaria infection-blocking mutations will arise.
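    The intuition can be sketched with a deliberately simplified toy calculation (not the authors' model, which is a full epidemiological simulation with adaptive immunity): if people become immune to severe disease after k infections, an infection-blocking mutation can spare its carrier at most k severe episodes, so its selective benefit shrinks as k falls.

```python
def severe_episodes_avoided(k_infections_to_immunity: int,
                            lifetime_infections: int = 50) -> int:
    """Toy proxy for the benefit of an infection-blocking mutation:
    severe episodes occur only before adaptive immunity is acquired,
    i.e. during the first k infections (a deliberate simplification)."""
    return min(k_infections_to_immunity, lifetime_infections)

# The faster immunity is acquired (small k), the smaller the benefit of
# blocking infection outright - so the mutation is less likely to spread.
for k in (1, 2, 5, 10):
    print(f"immunity after {k} infections -> at most "
          f"{severe_episodes_avoided(k)} severe episodes avoided")
```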
    “Understanding why a potential human malaria adaptation has not succeeded could be just as important as understanding those which have succeeded,” Penman says. “Our results highlight the need for further detailed genetic studies of populations living in regions impacted by malaria in order to better understand malaria-human interactions.”
    Ultimately, understanding how humans have adapted to malaria could help open up new avenues for treatment.

    Story Source:
    Materials provided by PLOS. Note: Content may be edited for style and length.

  • Engineering team develops novel miniaturized organic semiconductor

    Field Effect Transistors (FETs) are the core building blocks of modern electronics such as integrated circuits, computer CPUs and display backplanes. Organic Field Effect Transistors (OFETs), which use an organic semiconductor as the channel for current flow, have the advantage of being flexible compared with their inorganic counterparts such as silicon.
    OFETs, given their high sensitivity, mechanical flexibility, biocompatibility, property tunability and low-cost fabrication, are considered to have great potential for new applications in wearable electronics, conformal health monitoring sensors, and bendable displays. Imagine TV screens that can be rolled up; smart wearable electronic devices and clothing worn close to the body to collect vital body signals for instant biofeedback; or mini-robots made of harmless organic materials working inside the body for disease diagnosis, targeted drug delivery, mini-surgeries and other treatments.
    Until now, the main limitation on enhanced performance and mass production of OFETs lies in the difficulty in miniaturising them. Products currently using OFETs in the market are still in their primitive forms, in terms of product flexibility and durability.
    An engineering team led by Dr Paddy Chan Kwok Leung at the Department of Mechanical Engineering of the University of Hong Kong (HKU) has made an important breakthrough in developing the staggered structure monolayer Organic Field Effect Transistors, which sets a major cornerstone to reduce the size of OFETs. The result has been published in the academic journal Advanced Materials. A US patent has been filed for the innovation.
    The major problem now confronting scientists in reducing the size of OFETs is that the performance of the transistor will drop significantly with a reduction in size, partly due to the problem of contact resistance, i.e. resistance at interfaces which resists current flows. When the device gets smaller, its contact resistance will become a dominating factor in significantly downgrading the device’s performance.
    The staggered structure monolayer OFETs created by Dr Chan’s team demonstrate a record low normalized contact resistance of 40 Ω cm. Compared with conventional devices with a contact resistance of 1000 Ω cm, the new device can save 96% of the power dissipated at the contacts when running at the same current level. More importantly, apart from energy saving, the excessive heat generated in the system, a common cause of semiconductor failure, can be greatly reduced.
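    The 96% figure follows directly from the resistance ratio: at a fixed drive current, the power dissipated at the contacts scales linearly with contact resistance (P = I²R), so cutting R from 1000 to 40 saves 1 − 40/1000 = 96%. A quick check:

```python
def contact_power_saving(r_new: float, r_old: float) -> float:
    """Fractional power saving at the contacts for the same drive current:
    P = I^2 * R, so the saving depends only on the resistance ratio."""
    return 1.0 - r_new / r_old

saving = contact_power_saving(40.0, 1000.0)
print(f"power saving: {saving:.0%}")  # 96%
```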
    “On the basis of our achievement, we can further reduce the dimensions of OFETs and push them to a sub-micrometer scale, a level compatible with their inorganic counterparts, while still functioning effectively to exhibit their unique organic properties. This is critical for meeting the requirement for commercialisation of related research,” Dr Chan said.
    “If flexible OFET works, many traditional rigid based electronics such as display panels, computers and cell phones would transform to become flexible and foldable. These future devices would be much lighter in weight, and with low production cost.”
    “Moreover, given their organic nature, they are more likely to be biocompatible for advanced medical applications such as sensors in tracking brain activities or neural spike sensing, and in precision diagnosis of brain related illness such as epilepsy.” Dr Chan added.
    Dr Chan’s team is currently working with researchers at the HKU Faculty of Medicine and biomedical engineering experts at CityU to integrate the miniaturised OFETs into a flexible circuit on a polymer microprobe for in-vivo neural spike detection in a mouse brain under different external stimulations. They also plan to integrate the OFETs onto surgical tools such as catheter tubes, which can then be placed inside animals’ brains for direct sensing of brain activity to locate abnormal activation in the brain.
    “Our OFETs provide a much better signal-to-noise ratio. Therefore, we expect we can pick up weak signals that could not be detected before using conventional bare electrodes for sensing.”
    “It has been our goal to connect applied research with fundamental science. Our research achievement would hopefully open a blue ocean for OFETs research and applications. We believe that the setting and achievement on OFETs are now ready for applications in large area display backplane and surgical tools.” Dr Chan concluded.

  • Study uses mathematical modeling to identify an optimal school return approach

    In a recent study, NYU Abu Dhabi Professor of Practice in Mathematics Alberto Gandolfi has developed a mathematical model to identify the number of days students could attend school to allow them a better learning experience while mitigating infections of COVID-19.
    Published in the journal Physica D, the study shows that blended models, with almost periodic alternations of in-class and remote teaching days or weeks, would be ideal. In a prototypical example, the optimal strategy has the school open 90 days out of 200, with the number of COVID-19 cases among individuals related to the school increasing by about 66 percent, instead of the almost 250 percent increase predicted should schools fully reopen.
    The study features five different groups: students susceptible to infection, students exposed to infection, students displaying symptoms, asymptomatic students, and recovered students. In addition, Gandolfi’s study models other factors, including a seven-hour school day as the window for transmission, and the risk of students getting infected outside of school.
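    The five-compartment structure can be sketched as a discrete-time toy simulation (an illustrative simplification with assumed rates and an assumed blended schedule, not Gandolfi's actual optimized model):

```python
# Compartments: S (susceptible), E (exposed), Is (symptomatic),
# Ia (asymptomatic), R (recovered). All rates below are assumed.
def step(s, e, i_s, i_a, r, beta=0.3, sigma=0.2, p_sym=0.6, gamma=0.1,
         school_open=True):
    n = s + e + i_s + i_a + r
    contact = beta if school_open else 0.05 * beta  # remote days cut contact
    new_exposed = contact * s * (i_s + i_a) / n
    new_infectious = sigma * e
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i_s + p_sym * new_infectious - gamma * i_s,
            i_a + (1 - p_sym) * new_infectious - gamma * i_a,
            r + gamma * (i_s + i_a))

state = (990.0, 5.0, 3.0, 2.0, 0.0)
for day in range(100):
    open_today = (day % 10) < 4  # toy blended schedule: 4 days in, 6 remote
    state = step(*state, school_open=open_today)
print(f"susceptible remaining after 100 days: {state[0]:.0f}")
```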
    Speaking on the development of this model, Gandolfi commented: “The research comes as over one billion students around the world are using remote learning models in the face of the global pandemic, and educators are in need of plans for the upcoming 2020 — 2021 academic year. Given that children come in very close contact within the classrooms, and that the incubation period lasts several days, the study shows that full re-opening of the classrooms is not a viable possibility in most areas. On the other hand, with the development of a vaccine still in its formative stages, studies have placed the potential impact of COVID-19 on children as losing 30 percent of usual progress in reading and 50 percent or more in math.”
    He added: “The approach aims to provide a viable solution for schools that are planning activities ahead of the 2020 — 2021 academic year. Each school, or group thereof, can adapt the study to its current situation in terms of local COVID-19 diffusion and relative importance assigned to COVID-19 containment versus in-class teaching; it can then compute an optimal opening strategy. As these are mixed solutions in most cases, other aspects of socio-economic life in the area could then be built around the schools’ calendar. This way, children can benefit as much as possible from a direct, in class experience, while ensuring that the spread of infection is kept under control.”
    Using the prevalence of active COVID-19 cases in a region as a proxy for the chance of getting infected, the study gives a first indication, for each country, of the possibilities for school reopening: schools can fully reopen in a few countries, while in most others blended solutions can be attempted, with strict physical distancing, and frequent, generalized, even if not necessarily extremely reliable, testing.

    Story Source:
    Materials provided by New York University. Note: Content may be edited for style and length.

  • Biochip innovation combines AI and nanoparticle printing for cancer cell analysis

    Electrical engineers, computer scientists and biomedical engineers at the University of California, Irvine have created a new lab-on-a-chip that can help study tumor heterogeneity to reduce resistance to cancer therapies.
    In a paper published today in Advanced Biosystems, the researchers describe how they combined artificial intelligence, microfluidics and nanoparticle inkjet printing in a device that enables the examination and differentiation of cancers and healthy tissues at the single-cell level.
    “Cancer cell and tumor heterogeneity can lead to increased therapeutic resistance and inconsistent outcomes for different patients,” said lead author Kushal Joshi, a former UCI graduate student in biomedical engineering. The team’s novel biochip addresses this problem by allowing precise characterization of a variety of cancer cells from a sample.
    “Single-cell analysis is essential to identify and classify cancer types and study cellular heterogeneity. It’s necessary to understand tumor initiation, progression and metastasis in order to design better cancer treatment drugs,” said co-author Rahim Esfandyarpour, UCI assistant professor of electrical engineering & computer science as well as biomedical engineering. “Most of the techniques and technologies traditionally used to study cancer are sophisticated, bulky, expensive, and require highly trained operators and long preparation times.”
    He said his group overcame these challenges by combining machine learning techniques with accessible inkjet printing and microfluidics technology to develop low-cost, miniaturized biochips that are simple to prototype and capable of classifying various cell types.
    In the apparatus, samples travel through microfluidic channels with carefully placed electrodes that monitor differences in the electrical properties of diseased versus healthy cells in a single pass. The UCI researchers’ innovation was to devise a way to prototype key parts of the biochip in about 20 minutes with an inkjet printer, allowing for easy manufacturing in diverse settings. Most of the materials involved are reusable or, if disposable, inexpensive.
    Another aspect of the invention is the incorporation of machine learning to manage the large amount of data the tiny system produces. This branch of AI accelerates the processing and analysis of large datasets, finding patterns and associations, predicting precise outcomes, and aiding in rapid and efficient decision-making.
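    As a toy illustration of the machine-learning step (a sketch with assumed, synthetic feature values, not the UCI team's model or data): each cell passing the electrodes yields electrical features, say impedance magnitude and phase shift, and a simple classifier separates the two populations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed, synthetic features (impedance magnitude, phase shift) for two
# cell populations; a real device measures these per cell in one pass.
healthy = rng.normal([1.0, 0.2], 0.05, size=(50, 2))
cancer = rng.normal([1.4, 0.5], 0.05, size=(50, 2))

# Nearest-centroid classifier: assign a cell to the closer class mean.
centroids = {"healthy": healthy.mean(axis=0), "cancer": cancer.mean(axis=0)}

def classify(cell: np.ndarray) -> str:
    return min(centroids, key=lambda k: np.linalg.norm(cell - centroids[k]))

print(classify(np.array([1.05, 0.22])))
print(classify(np.array([1.38, 0.48])))
```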
    By including machine learning in the biochip’s workflow, the team has improved the accuracy of analysis and reduced the dependency on skilled analysts, which can also make the technology appealing to medical professionals in the developing world, Esfandyarpour said.
    “The World Health Organization says that nearly 60 percent of deaths from breast cancer happen because of a lack of early detection programs in countries with meager resources,” he said. “Our work has potential applications in single-cell studies, in tumor heterogeneity studies and, perhaps, in point-of-care cancer diagnostics — especially in developing nations where cost, constrained infrastructure and limited access to medical technologies are of the utmost importance.”

    Story Source:
    Materials provided by University of California – Irvine. Note: Content may be edited for style and length.