More stories

  • A pulse of innovation: AI at the service of heart research

    Understanding heart function and disease, as well as testing new drugs for heart conditions, has long been a complex and time-consuming task. A promising way to study disease and test new drugs is to use cellular and engineered tissue models in a dish, but existing methods for studying heart cell contraction and calcium handling require a good deal of manual work, are prone to errors, and need expensive specialized equipment. There is a clear medical need for a more efficient, accurate, and accessible way to study heart function, and artificial intelligence (AI) and machine learning offer a promising basis for one.
    BeatProfiler, a new tool to rapidly analyze heart cell function
    Researchers at Columbia Engineering unveiled a groundbreaking new tool today that addresses these challenges head-on. BeatProfiler is comprehensive software that automates the analysis of heart cell function from video data and is the first system to integrate the analysis of different heart function indicators, such as contractility, calcium handling, and force output, into one tool, speeding up the process significantly and reducing the chance for errors. BeatProfiler enabled the researchers not only to distinguish between different diseases and levels of their severity but also to rapidly and objectively test drugs that affect heart function. The study was published on April 8 in IEEE Open Journal of Engineering in Medicine and Biology.
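    To give a sense of the kind of analysis being automated, here is a minimal Python sketch that extracts a beating signal from video by frame-to-frame intensity change and then detects peaks. It illustrates the concept only; it is not BeatProfiler’s actual pipeline, and the function names and parameters are hypothetical:

        import numpy as np
        from scipy.signal import find_peaks

        def beat_signal(frames):
            """Mean absolute frame-to-frame difference as a crude motion trace.

            frames: grayscale video as an array of shape (n_frames, height, width).
            """
            diffs = np.abs(np.diff(frames.astype(float), axis=0))
            return diffs.mean(axis=(1, 2))

        def beat_rate(frames, fps):
            """Estimate beats per minute from detected motion peaks."""
            trace = beat_signal(frames)
            peaks, _ = find_peaks(trace, prominence=trace.std())
            return len(peaks) / (len(frames) / fps) * 60.0

        # Usage sketch: rate = beat_rate(video_array, fps=30)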
    “This is truly a transformative tool,” said project leader Gordana Vunjak-Novakovic, University Professor and the Mikati Foundation Professor of Biomedical Engineering, Medical Sciences, and Dental Medicine at Columbia. “It’s fast, comprehensive, automated, and compatible with a broad range of computer platforms so it is easily accessible to investigators and clinicians.”
    Software is open-source
    The team, which included Barry Fine, assistant professor of medicine (in Cardiology) at Columbia University Irving Medical Center, elected not to file a patent application and is instead offering the AI software as open source, so it can be used directly — for free — by any lab. The researchers believe this is important for disseminating the results of their research, as well as for getting feedback from users in academic, clinical, and commercial labs that can help them further refine the software.
    The need to diagnose heart disease quickly and accurately
    This project was driven, like much of Vunjak-Novakovic’s research, by a clinical need to diagnose heart diseases more quickly and accurately. The project was several years in the making, with the team adding features piece by piece. While the overarching goal was a tool that could better capture the function of the cardiac models the team was building to study cardiac diseases and assess the efficacy of potential therapeutics, the researchers also had an urgent, practical need to assess the function of those models quickly, accurately, and in real time.

    As the lab was making more and more cardiac tissues through innovations such as milliPillar and multiorgan tissue models, the increased capabilities of the tissues required the researchers to develop a method to more rapidly quantify the function of cardiomyocytes (heart muscle cells) and tissues to enable studies exploring genetic cardiomyopathies, cosmic radiation, immune-mediated inflammation, and drug discovery.
    Collaborators in software development, machine learning, and more
    In the last year and a half, lead author Youngbin Kim and his coauthors developed a graphical user interface (GUI) on top of the code so that biomedical researchers with no coding expertise could easily analyze their data with just a few clicks. The effort brought together expertise in software development (building the GUI), machine learning (developing the computer vision technology and disease/drug classifiers), signal processing (processing contractile and calcium signals), engineering (translating pillar deflection on the cardiac platform into mechanical force), and user experience (lab members gave feedback to improve the interface).
    The results
    The study showed that BeatProfiler could accurately analyze cardiomyocyte function, outperforming existing tools by being faster — up to 50 times in some cases — and more reliable. It detected subtle changes in engineered heart tissue force response that other tools might miss.
    “This level of analysis speed and versatility is unprecedented in cardiac research,” said Kim, a PhD candidate in Vunjak-Novakovic’s lab at Columbia Engineering. “Using machine learning, the functional measurements analyzed by BeatProfiler helped us to distinguish between diseased and healthy heart cells with high accuracy and even to classify different cardiac drugs based on how they affect the heart.”
    What’s next
    The team is working to expand BeatProfiler’s capabilities for new applications in heart research, including a full spectrum of diseases that affect the pumping of the heart, and drug development. To ensure that BeatProfiler can be applied to a wide variety of research questions, they are testing and validating its performance across additional in vitro cardiac models, including different engineered heart tissue models. They are also refining their machine-learning algorithm to extend and generalize its use to a variety of heart diseases and drug effect classification. The long-term goal is to adapt BeatProfiler to pharmaceutical settings to speed up the testing of hundreds of thousands of candidate drugs at once.

  • Engineers design soft and flexible ‘skeletons’ for muscle-powered robots

    Our muscles are nature’s perfect actuators — devices that turn energy into motion. For their size, muscle fibers are more powerful and precise than most synthetic actuators. They can even heal from damage and grow stronger with exercise.
    For these reasons, engineers are exploring ways to power robots with natural muscles. They’ve demonstrated a handful of “biohybrid” robots that use muscle-based actuators to power artificial skeletons that walk, swim, pump, and grip. But for every bot, there’s a very different build, and no general blueprint for how to get the most out of muscles for any given robot design.
    Now, MIT engineers have developed a spring-like device that could be used as a basic skeleton-like module for almost any muscle-bound bot. The new spring, or “flexure,” is designed to get the most work out of any attached muscle tissues. Like a leg press that’s fit with just the right amount of weight, the device maximizes the amount of movement that a muscle can naturally produce.
    The researchers found that when they fit a ring of muscle tissue onto the device, much like a rubber band stretched around two posts, the muscle pulled on the spring reliably and repeatedly, stretching it five times more than previous device designs.
    The team sees the flexure design as a new building block that can be combined with other flexures to build any configuration of artificial skeletons. Engineers can then fit the skeletons with muscle tissues to power their movements.
    “These flexures are like a skeleton that people can now use to turn muscle actuation into multiple degrees of freedom of motion in a very predictable way,” says Ritu Raman, the Brit and Alex d’Arbeloff Career Development Professor in Engineering Design at MIT. “We are giving roboticists a new set of rules to make powerful and precise muscle-powered robots that do interesting things.”
    Raman and her colleagues report the details of the new flexure design in a paper appearing in the journal Advanced Intelligent Systems. The study’s MIT co-authors include Naomi Lynch ’12, SM ’23; undergraduate Tara Sheehan; graduate students Nicolas Castro, Laura Rosado, and Brandon Rios; and professor of mechanical engineering Martin Culpepper.

    Muscle pull
    When left alone in a petri dish in favorable conditions, muscle tissue will contract on its own but in directions that are not entirely predictable or of much use.
    “If muscle is not attached to anything, it will move a lot, but with huge variability, where it’s just flailing around in liquid,” Raman says.
    To get a muscle to work like a mechanical actuator, engineers typically attach a band of muscle tissue between two small, flexible posts. As the muscle band naturally contracts, it can bend the posts and pull them together, producing some movement that would ideally power part of a robotic skeleton. But in these designs, muscles have produced limited movement, mainly because the tissues are so variable in how they contact the posts. Depending on where the muscles are placed on the posts, and how much of the muscle surface is touching the post, the muscles may succeed in pulling the posts together but at other times may wobble around in uncontrollable ways.
    Raman’s group looked to design a skeleton that focuses and maximizes a muscle’s contractions regardless of exactly where and how it is placed on a skeleton, to generate the most movement in a predictable, reliable way.
    “The question is: How do we design a skeleton that most efficiently uses the force the muscle is generating?” Raman says.

    The researchers first considered the multiple directions that a muscle can naturally move. They reasoned that if a muscle is to pull two posts together along a specific direction, the posts should be connected to a spring that only allows them to move in that direction when pulled.
    “We need a device that is very soft and flexible in one direction, and very stiff in all other directions, so that when a muscle contracts, all that force gets efficiently converted into motion in one direction,” Raman says.
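    As a toy illustration of this design principle (illustrative numbers, not the team’s actual model), the flexure can be treated as a linear spring that is 100 times stiffer off-axis than on-axis; the same muscle force then produces nearly all of its displacement along the compliant axis:

        import numpy as np

        # Toy model: flexure as an anisotropic linear spring (illustrative values)
        k_soft = 1.0     # stiffness along the intended motion axis
        k_stiff = 100.0  # stiffness along every other axis
        K = np.diag([k_soft, k_stiff])   # stiffness matrix

        F = np.array([0.8, 0.6])         # muscle force applied slightly off-axis
        x = np.linalg.solve(K, F)        # displacement x = K^-1 F
        print(x)   # [0.8, 0.006]: motion is almost entirely along the soft axis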
    Soft flex
    As it turns out, Raman found many such devices in Professor Martin Culpepper’s lab. Culpepper’s group at MIT specializes in the design and fabrication of machine elements such as miniature actuators, bearings, and other mechanisms that can be built into machines and systems to enable ultraprecise movement, measurement, and control for a wide variety of applications. Among the group’s precision-machined elements are flexures — spring-like devices, often made from parallel beams, that can flex and stretch with nanometer precision.
    “Depending on how thin and far apart the beams are, you can change how stiff the spring appears to be,” Raman says.
    She and Culpepper teamed up to design a flexure specifically tailored with a configuration and stiffness to enable muscle tissue to naturally contract and maximally stretch the spring. The team designed the device’s configuration and dimensions based on numerous calculations they carried out to relate a muscle’s natural forces with a flexure’s stiffness and degree of movement.
    The flexure they ultimately designed is 1/100 the stiffness of muscle tissue itself. The device resembles a miniature, accordion-like structure, the corners of which are pinned to an underlying base by a small post, which sits near a neighboring post that is fit directly onto the base. Raman then wrapped a band of muscle around the two corner posts (the team molded the bands from live muscle fibers that they grew from mouse cells), and measured how close the posts were pulled together as the muscle band contracted.
    The team found that the flexure’s configuration enabled the muscle band to contract mostly along the direction between the two posts. This focused contraction allowed the muscle to pull the posts much closer together — five times closer — compared with previous muscle actuator designs.
    “The flexure is a skeleton that we designed to be very soft and flexible in one direction, and very stiff in all other directions,” Raman says. “When the muscle contracts, all the force is converted into movement in that direction. It’s a huge magnification.”
    The team found they could use the device to precisely measure muscle performance and endurance. When they varied the frequency of muscle contractions (for instance, stimulating the bands to contract once versus four times per second), they observed that the muscles “grew tired” at higher frequencies, and didn’t generate as much pull.
    “Looking at how quickly our muscles get tired, and how we can exercise them to have high-endurance responses — this is what we can uncover with this platform,” Raman says.
    The researchers are now adapting and combining flexures to build precise, articulated, and reliable robots, powered by natural muscles.
    “An example of a robot we are trying to build in the future is a surgical robot that can perform minimally invasive procedures inside the body,” Raman says. “Technically, muscles can power robots of any size, but we are particularly excited about making small robots, as this is where biological actuators excel in terms of strength, efficiency, and adaptability.”

  • Researchers developed new method for detecting heart failure with a smartphone

    The new technology, which was created at the University of Turku and developed by the company CardioSignal, uses a smartphone to analyse heart movement and detect heart failure. The study involved five organisations from Finland and the United States.
    Heart failure is a condition affecting tens of millions of people worldwide, in which the heart is unable to perform its normal function of pumping blood to the body. It is a serious condition that develops as a result of a number of cardiovascular diseases and its symptoms may require repeated hospitalisation.
    Heart failure is challenging to diagnose because its symptoms, such as shortness of breath, abnormal fatigue on exertion, and swelling, can be caused by a number of conditions. There is no simple test available to detect it and diagnostics relies on an examination by a doctor, blood tests, and sophisticated imaging, such as an ultrasound scan of the heart.
    Gyrocardiography is a non-invasive technique for measuring cardiac vibrations on the chest. The smartphone’s built-in motion sensors can detect and record these vibrations, including those that doctors cannot hear with a stethoscope. The method has been developed over the last 10 years by researchers at the University of Turku and CardioSignal.
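    A rough sketch of the signal-processing idea (with an assumed sampling rate and filter band; this is not CardioSignal’s actual algorithm) is to band-pass filter the phone’s motion signals to the band where cardiac vibrations live:

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 200.0  # assumed motion-sensor sampling rate, Hz

        def cardiac_band(signal, low=10.0, high=30.0):
            """Band-pass filter a chest-vibration trace to an assumed cardiac band."""
            b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
            return filtfilt(b, a, signal)

        # Synthetic test: noise plus a 1.2 Hz train of 20 Hz vibration bursts
        t = np.arange(0, 10, 1 / fs)
        bursts = np.sin(2 * np.pi * 20 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.95)
        trace = bursts + 0.3 * np.random.randn(t.size)

        filtered = cardiac_band(trace)
        print(filtered.std())  # a crude amplitude feature; real systems use richer ones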
    The researchers’ latest study on using smartphone motion sensors to detect heart failure was carried out at the Turku and Helsinki University Hospitals in Finland and Stanford University Hospital in the US. Approximately 1,000 people took part in the study, of whom around 200 were patients suffering from heart failure. The study compared the data provided by the motion sensors in the heart failure patients and patients without heart disease.
    “The results we obtained with this new method are promising and may in the future make it easier to detect heart failure,” says cardiologist Antti Saraste, one of the two main authors of the research article and Professor of Cardiovascular Medicine at the University of Turku, Finland.
    Precise detection uncovers heart failure
    The researchers found that heart failure is associated with typical changes in the motion sensor data collected by a smartphone. On the basis of this data, the researchers were able to identify the majority of patients with heart failure.

    The analysis of the movements detected by the gyroscope and accelerometer is so accurate that in the future it could provide healthcare professionals with a quick and easy way to detect heart failure.
    “Primary healthcare has very limited tools for detecting heart failure. We can create completely new treatment options for remote monitoring of at-risk groups and for monitoring already diagnosed patients after hospitalisation,” says CardioSignal’s founding member and CEO, cardiologist Juuso Blomster.
    As in several other European countries, heart failure affects around 1-2% of the population in Finland, but it is much more common in older adults, affecting around one in ten people aged 70 or older. Detecting heart failure is important, as effective treatment can help to alleviate its symptoms. Accurate diagnosis and timely access to treatment can also reduce healthcare costs, which are driven up by emergency room visits and hospital stays, especially during exacerbations.
    The joint research projects between CardioSignal and the University of Turku aim to promote people’s health and reduce healthcare costs through innovation, improved disease diagnostics, and prevention of serious complications.

  • Novel robust-optimal controllers based on fuzzy descriptor system

    Nonlinear systems have applications in many diverse fields, from robotics to economics. Unlike in linear systems, the output is not proportional to the input in such systems. A classic example is the motion of a pendulum. Due to the inherent nature of nonlinear systems, their mathematical modelling and, consequently, their control is difficult. In this context, the Takagi-Sugeno (T-S) fuzzy system emerges as a highly effective tool. It leverages fuzzy logic to map input and output values so as to approximate a nonlinear system as a blend of multiple linear systems, which are easier to model. Fuzzy logic is a form of mathematical logic in which, instead of requiring all statements to be true (1) or false (0), the truth values can be any value between 0 and 1. The T-S fuzzy system has thus served as the foundation for several nonlinear control methods, with the Parallel Distributed Compensator (PDC) method being the most prominent.
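    A minimal sketch of the T-S idea (with illustrative membership functions and a unit-length pendulum, not the paper’s model): the nonlinear dynamics are approximated by blending two local linear models with fuzzy weights that depend on the state.

        import numpy as np

        # Pendulum: theta_ddot = -g*sin(theta); two sector-bound linear models
        A1 = np.array([[0.0, 1.0], [-9.8, 0.0]])              # near 0: sin(x) ~ x
        A2 = np.array([[0.0, 1.0], [-9.8 * 2 / np.pi, 0.0]])  # near pi/2: sin(x) ~ (2/pi)x

        def memberships(theta):
            """Fuzzy weights in [0, 1] that sum to one (illustrative rules)."""
            w1 = max(0.0, 1.0 - abs(theta) / (np.pi / 2))
            return w1, 1.0 - w1

        def ts_dynamics(x):
            """Blend the local models: x_dot = sum_i w_i(theta) * A_i @ x."""
            w1, w2 = memberships(x[0])
            return (w1 * A1 + w2 * A2) @ x

        print(ts_dynamics(np.array([0.3, 0.0])))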
    Furthermore, scientists have developed an enhanced version of this system, known as the fuzzy descriptor system (FDS). It combines the T-S fuzzy system with the powerful state-space representation, which describes a physical system in terms of state variables, input variables, and output variables. Despite extensive research, optimal control strategies in the context of T-S FDSs remain largely unexplored. Additionally, while robust control methods, which protect against disturbances, have been explored for T-S FDSs using tools like linear matrix inequalities (LMIs), these methods introduce additional complexity and optimization challenges.
    To overcome these limitations, a group of researchers, led by Associate Professor Ngoc-Tam Bui from the Innovative Global Program of the College of Engineering at Shibaura Institute of Technology in Japan and including Thi-Van-Anh Nguyen, Quy-Thinh Dao, and Duc-Binh Pham, all from Hanoi University of Science and Technology, developed novel optimal and robust-optimal controllers based on the T-S fuzzy descriptor model. Their study was published in the journal Scientific Reports on March 07, 2024.
    To develop the controllers, the team first utilized the powerful Lyapunov stability theory to establish the stability conditions for the mathematical model of the FDS. However, these stability conditions cannot be directly used. As Dr. Bui explains, “The stability conditions for the FDS model make it difficult to solve using established mathematical tools. To make them more amenable, we systematically transformed them into LMI.” These modified conditions formed the basis for developing three controllers: the stability controller which uses PDC to manage deviations, the optimal controller which minimizes a cost function to obtain optimal control, and the robust-optimal controller which combines the benefits of both of them.
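    To make the LMI step concrete, here is a hedged sketch of the simplest such condition (a generic Lyapunov LMI for a single stable linear subsystem, not the paper’s full descriptor conditions), solved with the cvxpy convex-optimization library:

        import cvxpy as cp
        import numpy as np

        # Find P > 0 with A'P + PA < 0, certifying stability of x_dot = A x
        A = np.array([[0.0, 1.0], [-2.0, -3.0]])

        P = cp.Variable((2, 2), symmetric=True)
        eps = 1e-6
        constraints = [P >> eps * np.eye(2),
                       A.T @ P + P @ A << -eps * np.eye(2)]
        cp.Problem(cp.Minimize(0), constraints).solve()
        print(P.value)  # any feasible P is a Lyapunov certificate for this subsystem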
    The researchers demonstrated the effectiveness of these controllers by controlling a rotary inverted pendulum, a challenging system comprising an inverted pendulum mounted on a rotating base. The problem is to keep the pendulum upright by controlling the rotation of the base. The researchers tested the performance of the controllers in distinct simulation scenarios. Simulations revealed that the stability controller effectively stabilized the system when the initial displacement angle was small; with larger initial angles, there were more oscillations and the settling time was longer. The long settling time was effectively addressed by the optimal controller, which reduced it from 13 seconds to 2 seconds, a more than six-fold reduction. Moreover, it also reduced the maximum amplitudes during oscillations.
    The robust-optimal controller was tested in two further scenarios. In the first, the mass of the pendulum bar was changed; in the second, white noise was introduced into the control input. It performed on par with the optimal controller in the first scenario. In the second, however, it was considerably better, showing no oscillations where the optimal controller oscillated clearly. Notably, the robust-optimal controller also showed the lowest error values.
    These results highlight the adaptability and potential of these controllers in practical scenarios. “The research findings hold promising implications for various real-life applications where stable control in dynamic and uncertain environments is paramount. Specifically, autonomous vehicles and industrial robots can achieve enhanced performance and adaptability using the proposed controllers,” remarks Dr. Bui. “Overall, our research opens avenues for advancing control strategies in various domains, ultimately contributing to more capable autonomous systems, making transportation safer, healthcare more effective, and manufacturing more efficient.”

  • Protecting art and passwords with biochemistry

    Security experts fear Q-Day, the day when quantum computers become so powerful that they can crack today’s passwords. Some experts estimate that this day will come within the next ten years. Password checks are based on cryptographic one-way functions, which calculate an output value from an input value. This makes it possible to check the validity of a password without transmitting the password itself: the one-way function converts the password into an output value that can then be used to check its validity in, say, online banking. What makes one-way functions special is that it’s impossible to use their output value to deduce the input value — in other words, the password. At least not with today’s resources. However, future quantum computers could make this kind of inverse calculation easier.
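    For comparison, here is a minimal example of today’s digital approach, using the well-known SHA-256 hash as the one-way function (simplified for illustration; production systems use dedicated password-hashing schemes such as bcrypt or Argon2):

        import hashlib, os

        def hash_password(password: str, salt: bytes) -> bytes:
            # One-way: easy to compute, infeasible to invert with classical computers
            return hashlib.sha256(salt + password.encode()).digest()

        salt = os.urandom(16)
        stored = hash_password("correct horse battery staple", salt)

        # Verification recomputes the hash; the password itself is never stored
        print(hash_password("correct horse battery staple", salt) == stored)  # True
        print(hash_password("wrong guess", salt) == stored)                   # False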
    Researchers at ETH Zurich have now presented a cryptographic one-way function that works differently from today’s and will also be secure in the future. Rather than processing the data using arithmetic operations, it is stored as a sequence of nucleotides — the chemical building blocks of DNA.
    Based on true randomness
    “Our system is based on true randomness. The input and output values are physically linked, and it’s only possible to get from the input value to the output value, not the other way round,” explains Robert Grass, a professor in the Department of Chemistry and Applied Biosciences. “Since it’s a physical system and not a digital one, it can’t be decoded by an algorithm, not even by one that runs on a quantum computer,” adds Anne Lüscher, a doctoral student in Grass’s group. She is the lead author of the paper, which was published in the journal Nature Communications.
    The researchers’ new system can serve as a counterfeit-proof way of certifying the authenticity of valuable objects such as works of art. The technology could also be used to trace raw materials and industrial products.
    How it works
    The new biochemical one-way function is based on a pool of one hundred million different DNA molecules. Each of the molecules contains two segments featuring a random sequence of nucleotides: one segment for the input value and one for the output value. There are several hundred identical copies of each of these DNA molecules in the pool, and the pool can also be divided into several pools; these are identical because they contain the same random DNA molecules. The pools can be located in different places, or they can be built into objects.

    Anyone in possession of this DNA pool holds the security system’s lock. The polymerase chain reaction (PCR) can be used to test a key, or input value, which takes the form of a short sequence of nucleotides. During the PCR, this key searches the pool of hundreds of millions of DNA molecules for the molecule with the matching input value, and the PCR then amplifies the output value located on the same molecule. DNA sequencing is used to make the output value readable.
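    A toy software analogue of that lookup (hypothetical sequences and a much smaller pool; in the real system the search is done chemically by PCR, not in code):

        import random

        random.seed(0)

        def random_seq(n, bases="ACGT"):
            return "".join(random.choice(bases) for _ in range(n))

        # Each molecule pairs a random input segment with a random output segment
        pool = {random_seq(20): random_seq(20) for _ in range(100_000)}

        key = next(iter(pool))   # an input value known to be in the pool
        print(pool[key])         # PCR plus sequencing would reveal this output value
        # Without the physical pool there is nothing to invert: the outputs are pure randomness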
    At first glance, the principle seems complicated. “However, producing DNA molecules with built-in randomness is cheap and easy,” Grass says. The production costs for a DNA pool that can be divided up in this way are less than 1 Swiss franc. Using DNA sequencing to read out the output value is more time-consuming and expensive, but many biology laboratories already possess the necessary equipment.
    Securing valuable goods and supply chains
    ETH Zurich has applied for a patent on this new technology. The researchers now want to optimise and refine it to bring it to market. Because using the method calls for specialised laboratory infrastructure, the scientists think the most likely application for this form of password verification is currently for highly sensitive goods or for access to buildings with restricted access. This technology won’t be an option for the broader public to check passwords until DNA sequencing in particular becomes easier.
    A little more thought has already gone into the idea of using the technology for the forgery-proof certification of works of art. For instance, if there are ten copies of a picture, the artist can mark them all with the DNA pool — perhaps by mixing the DNA into the paint, spraying it onto the picture or applying it to a specific spot.
    If several owners later wish to have the authenticity of these artworks confirmed, they can get together, agree on a key (i.e. an input value) and carry out the DNA test. All the copies for which the test produces the same output value will have been proven genuine. The new technology could also be used to link crypto-assets such as NFTs, which exist only in the digital world, to an object and thus to the physical world.
    Furthermore, it would support counterfeit-proof tracking along supply chains of industrial goods or raw materials. “The aviation industry, for example, has to be able to provide complete proof that it uses only original components. Our technology can guarantee traceability,” Grass says. In addition, the method could be used to label the authenticity of original medicines or cosmetics.

  • How scientists are accelerating chemistry discoveries with automation

    A new automated workflow developed by scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) has the potential to allow researchers to analyze the products of their reaction experiments in real time, a key capability needed for future automated chemical processes.
    The developed workflow — which applies statistical analysis to process data from nuclear magnetic resonance (NMR) spectroscopy — could help speed the discovery of new pharmaceutical drugs, and accelerate the development of new chemical reactions.
    The Berkeley Lab scientists who developed the groundbreaking technique say that the workflow can quickly identify the molecular structure of products formed by chemical reactions that have never been studied before. They recently reported their findings in the Journal of Chemical Information and Modeling.
    In addition to drug discovery and chemical reaction development, the workflow could also help researchers who are developing new catalysts. Catalysts are substances that facilitate a chemical reaction in the production of useful new products like renewable fuels or biodegradable plastics.
    “What excites people the most about this technique is its potential for real-time reaction analysis, which is an integral part of automated chemistry,” said first author Maxwell C. Venetos, a former researcher in Berkeley Lab’s Materials Sciences Division and former graduate student researcher in materials sciences at UC Berkeley. He completed his doctoral studies last year. “Our workflow really allows you to start pursuing the unknown. You are no longer constrained by things that you already know the answer to.”
    The new workflow can also identify isomers, which are molecules with the same chemical formula but different atomic arrangements. This could greatly accelerate synthetic chemistry processes in pharmaceutical research, for example. “This workflow is the first of its kind where users can generate their own library, and tune it to the quality of that library, without relying on an external database,” Venetos said.
    Advancing new applications
    In the pharmaceutical industry, drug developers currently use machine-learning algorithms to virtually screen hundreds of chemical compounds to identify potential new drug candidates that are more likely to be effective against specific cancers and other diseases. These screening methods comb through online libraries or databases of known compounds (or reaction products) and match them with likely drug “targets” in cells.

    But if a drug researcher is experimenting with molecules so new that their chemical structures don’t yet exist in a database, they must typically spend days in the lab to sort out the mixture’s molecular makeup: First, by running the reaction products through a purification machine, and then using one of the most useful characterization tools in a synthetic chemist’s arsenal, an NMR spectrometer, to identify and measure the molecules in the mixture one at a time.
    “But with our new workflow, you could feasibly do all of that work within a couple of hours,” Venetos said. The time-savings come from the workflow’s ability to rapidly and accurately analyze the NMR spectra of unpurified reaction mixtures that contain multiple compounds, a task that is impossible through conventional NMR spectral analysis methods.
    “I’m very excited about this work as it applies novel data-driven methods to the age-old problem of accelerating synthesis and characterization,” said senior author Kristin Persson, a faculty senior scientist in Berkeley Lab’s Materials Sciences Division and UC Berkeley professor of materials science and engineering who also leads the Materials Project.
    Experimental results
    In addition to being much faster than benchtop purification methods, the new workflow has the potential to be just as accurate. NMR simulation experiments performed using the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab with support from the Materials Project showed that the new workflow can correctly identify compound molecules in reaction mixtures that produce isomers, and also predict the relative concentrations of those compounds.
    To ensure high statistical accuracy, the research team analyzed the NMR spectra with Hamiltonian Monte Carlo, a sophisticated Markov chain Monte Carlo sampling algorithm. They also performed advanced theoretical calculations based on a method called density functional theory.
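    As an illustrative stand-in for that statistical step (ordinary non-negative least squares rather than the authors’ Hamiltonian Monte Carlo machinery, with made-up peak positions), a mixture spectrum can be fit as a non-negative combination of simulated component spectra to recover relative concentrations:

        import numpy as np
        from scipy.optimize import nnls

        def lorentzian(x, center, width=0.02):
            return width**2 / ((x - center)**2 + width**2)

        ppm = np.linspace(0, 10, 2000)   # chemical-shift axis

        # Simulated spectra of three candidate products (illustrative peaks)
        library = np.column_stack([
            lorentzian(ppm, 1.2) + lorentzian(ppm, 3.4),
            lorentzian(ppm, 2.1) + lorentzian(ppm, 7.3),
            lorentzian(ppm, 5.5),
        ])

        true_conc = np.array([0.6, 0.3, 0.1])
        mixture = library @ true_conc + 0.01 * np.random.randn(ppm.size)

        conc, _ = nnls(library, mixture)   # non-negative least squares fit
        print(conc / conc.sum())           # recovered relative concentrations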

    Venetos designed the automated workflow as open source so that users can run it on an ordinary desktop computer. That convenience will come in handy for anyone from industry or academia.
    The technique sprouted from conversations between the Persson group and experimental collaborators Masha Elkin and Connor Delaney, former postdoctoral researchers in the John Hartwig group at UC Berkeley. Elkin is now a professor of chemistry at the Massachusetts Institute of Technology, and Delaney a professor of chemistry at the University of Texas at Dallas.
    “In chemistry reaction development, we are constantly spending time to figure out what a reaction made and in what ratio,” said John Hartwig, a senior faculty scientist in Berkeley Lab’s Chemical Sciences Division and UC Berkeley professor of chemistry. “Certain NMR spectrometry methods are precise, but if one is deciphering the contents of a crude reaction mixture containing a bunch of unknown potential products, those methods are far too slow to have as part of a high-throughput experimental or automated workflow. And that’s where this new capability to predict the NMR spectrum could help,” he said.
    Now that they’ve demonstrated the automated workflow’s potential, Persson and team hope to incorporate it into an automated laboratory that analyzes the NMR data of thousands or even millions of new chemical reactions at a time.
    Other authors on the paper include Masha Elkin, Connor Delaney, and John Hartwig at UC Berkeley.
    NERSC is a DOE Office of Science user facility at Berkeley Lab.
    The work was supported by the U.S. Department of Energy’s Office of Science, the U.S. National Science Foundation, and the National Institutes of Health.

  • Scientists release state-of-the-art spike-sorting software Kilosort4

    How do researchers make sense of the mountains of data collected from recording the simultaneous activity of hundreds of neurons? Neuroscientists all over the world rely on Kilosort, software that enables them to tease apart spikes from individual neurons to understand how the brain’s cells and circuits work together to process information.
    Now, researchers at HHMI’s Janelia Research Campus, led by Group Leader Marius Pachitariu, have released Kilosort4, an updated version of the popular spike-sorting software that has improved processing, requires less manual work, and is more accurate and easier to use than previous versions.
    “Over the past eight years, I’ve been refining the algorithm to make it more and more human-independent so people can use it out of the box,” Pachitariu says.
    Kilosort has become indispensable for many neuroscientists, but it may never have been developed if Pachitariu hadn’t decided he wanted to try something new.
    Pachitariu’s PhD work was in computational neuroscience and machine learning, but he yearned to work on more real-world applications, and he almost left academia for industry after he completed his PhD. Instead, Pachitariu opted for a postdoc in the joint lab of Kenneth Harris and Matteo Carandini at University College London where he could do more experimental neuroscience.
    The lab was then part of a consortium testing a probe called Neuropixels, developed at HHMI’s Janelia Research Campus. Pachitariu had no idea how to use the probes, which record activity from hundreds of neurons simultaneously, but he knew how to develop algorithms to keep up with the enormous amount of data his labmates were generating.
    In the first year of his postdoc, Pachitariu developed the initial version of Kilosort. The software, which was 50 times faster than previous approaches, allowed researchers to process the millions of data points generated by the Neuropixels probes. Eight years later, the probes and the software are staples in neuroscience labs worldwide, allowing researchers to identify and classify the spikes of individual neurons.
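    A toy version of the spike-sorting task (threshold detection plus clustering of waveform shapes; Kilosort itself uses far more sophisticated GPU-accelerated template matching, so this is only a schematic with made-up data):

        import numpy as np
        from scipy.signal import find_peaks
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        fs = 30000                       # assumed sampling rate, Hz
        trace = rng.normal(0, 1, fs)     # one second of synthetic noise
        # Inject two units with different spike shapes at random times
        for decay in (5, 12):
            shape = -np.exp(-np.arange(30) / decay) * 8
            for t in rng.integers(0, fs - 30, 40):
                trace[t:t + 30] += shape

        # 1) detect threshold crossings, 2) extract waveforms, 3) cluster into units
        peaks, _ = find_peaks(-trace, height=5)
        waveforms = np.stack([trace[p:p + 30] for p in peaks if p + 30 < trace.size])
        labels = KMeans(n_clusters=2, n_init=10).fit_predict(waveforms)
        print(waveforms.shape, np.bincount(labels))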

    In 2017, Pachitariu became a group leader at Janelia, where he and his team seek to understand how thousands of neurons work together to enable animals to think, decide, and act. These days, Pachitariu spends most of his time doing experiments and analyzing data, but he still finds time to work on improving Kilosort. The newly released Kilosort4 is the best in its class, outperforming other algorithms and correctly identifying even hard-to-detect neurons, according to the researchers.
    Pachitariu says it is much easier to squeeze in work on projects like Kilosort at Janelia than at other institutions where he would have to spend time writing grants and teaching.
    “Every now and then, I can put a few months into spearheading a new version and writing new code,” he says.
    Pachitariu says he also enjoys refining Kilosort, which allows him to use the core set of skills he developed during his PhD work.

  • Proof-of-principle demonstration of 3-D magnetic recording

    Research groups from NIMS, Seagate Technology, and Tohoku University have made a breakthrough in the field of hard disk drives (HDDs) by demonstrating the feasibility of multi-level recording, which uses a three-dimensional magnetic recording medium to store digital information. They have shown that this technology can increase the storage capacity of HDDs, which could lead to more efficient and cost-effective data storage solutions in the future.
    Data centers increasingly store vast amounts of data on hard disk drives (HDDs) that use perpendicular magnetic recording (PMR) to store information at areal densities of around 1.5 Tbit/in². Transitioning to higher areal densities, however, requires a high-anisotropy magnetic recording medium consisting of FePt grains combined with heat-assisted laser writing. This method, known as heat-assisted magnetic recording (HAMR), is capable of sustaining areal recording densities of up to 10 Tbit/in². Densities larger than 10 Tbit/in² become possible based on a newly demonstrated principle: storing three or four recording levels per location rather than the binary (two-level) recording used in current HDD technology.
    In this study, we succeeded in arranging the FePt recording layers three-dimensionally by fabricating lattice-matched FePt/Ru/FePt multilayer films, with Ru as a spacer layer. Magnetization measurements show that the two FePt layers have different Curie temperatures, which means that three-dimensional recording becomes possible by adjusting the laser power during writing. In addition, we have demonstrated the principle of 3D recording through recording simulations, using a media model that mimics the microstructure and magnetic properties of the fabricated media.
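    A schematic of the write scheme this implies (hypothetical Curie temperatures and laser powers, purely illustrative): a high-power pulse heats both layers above their Curie points and writes both, after which a low-power pulse rewrites only the layer with the lower Curie temperature, giving four distinct states per location.

        # Toy model of dual-layer heat-assisted writing (illustrative numbers only)
        TC_TOP, TC_BOTTOM = 700.0, 550.0   # hypothetical Curie temperatures, K

        def write(state, bit_top, bit_bottom, peak_temp):
            """Only layers heated above their Curie temperature take the new bit."""
            top, bottom = state
            if peak_temp > TC_TOP:
                top = bit_top
            if peak_temp > TC_BOTTOM:
                bottom = bit_bottom
            return (top, bottom)

        state = write((0, 0), 1, 1, peak_temp=750)   # high power writes both layers
        state = write(state, 0, 0, peak_temp=600)    # low power rewrites bottom only
        print(state)   # (1, 0): four states per location, i.e. two bits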
    The three-dimensional magnetic recording method can increase recording capacity by stacking recording layers in three dimensions. This means that more digital information can be stored with fewer HDDs, leading to energy savings for data centers. In the future, we plan to develop processes to reduce the size of the FePt grains, improve their orientation and magnetic anisotropy, and stack more FePt layers to realize a media structure suitable for practical use in high-density HDDs.