More stories


    Heart attack on a chip

    Researchers at the University of Southern California Alfred E. Mann Department of Biomedical Engineering have developed a “heart attack on a chip,” a device that could one day serve as a testbed to develop new heart drugs and even personalized medicines.
    “Our device replicates some key features of a heart attack in a relatively simple and easy to use system,” said Megan McCain, an associate professor of biomedical engineering and stem cell biology and regenerative medicine, who developed the device with postdoctoral researcher Megan Rexius-Hall.
    “This enables us to more clearly understand how the heart is changing after a heart attack. From there, we and others can develop and test drugs that will be most effective for limiting the further degradation of heart tissue that can occur after a heart attack,” added McCain.
    McCain, a “cardiac tissue engineer,” whose work previously included co-developing a heart on a chip, and Rexius-Hall detail their findings in a recently released article in the journal Science Advances titled “A Myocardial Infarct Border-Zone-On-A-Chip Demonstrates Distinct Regulation of Cardiac Tissue Function by an Oxygen Gradient.”
    America’s No. 1 killer
    Coronary heart disease is America’s No. 1 killer. In 2018, 360,900 Americans succumbed to it, making heart disease responsible for 12.6% of all deaths in the United States, according to the American Heart Association (AHA). Severe coronary heart disease can cause a heart attack, which accounts for much of that toll. Heart attacks occur when fat, cholesterol and other substances in the coronary arteries severely reduce the flow of oxygen-rich blood to part of the heart. Between 2005 and 2014, an average of 805,000 Americans per year had heart attacks.


    A novel, space-time coding antenna promotes 6G and secure wireless communications

    A research team co-led by a scientist at City University of Hong Kong (CityU) has developed a novel antenna that allows manipulation of the direction, frequency and amplitude of the radiated beam, and is expected to play an important role in integrated sensing and communications (ISAC) for sixth-generation (6G) wireless communications.
    The structure and characteristics of traditional antennas cannot be changed once fabricated. However, the direction, frequency, and amplitude of the electromagnetic waves from this new-generation antenna, which is called a “sideband-free space-time-coding (STC) metasurface antenna,” can be changed through space-time coding (i.e. software control), enabling great user flexibility.
    The key to this innovative feature is that the response of the metasurface (artificial, thin-sheet material with sub-wavelength thickness and made of several sub-wavelength meta-atoms) can be changed by switching the meta-atoms on its surface between radiating and non-radiating states, like turning on and off switches, by controlling the electric current. This allows the STC metasurface antenna to realize complicated wave manipulation in the space and frequency domains through software control, and to create a desired radiation pattern and a highly directed beam.
    Professor Chan Chi-hou, Acting Provost and Chair Professor of Electronic Engineering in the Department of Electrical Engineering at CityU, who led the research, highlighted that the antenna relies on the successful combination of two research advances, namely amplitude-modulated (AM) leaky-wave antennas and space-time coding techniques.
    Dr Wu Gengbo, postdoctoral fellow in the State Key Laboratory of Terahertz and Millimeter Waves (SKLTMW) at CityU, first proposed the new concept of AM leaky-wave antennas in 2020 in his PhD studies at CityU. “The concept provides an analytical approach to synthesize antennas with the desired radiation patterns for different specific uses by simply changing the antennas’ shape and structure,” explained Dr Wu.
    But as with other antennas, once the AM leaky-wave antenna is fabricated, its radiation characteristics are fixed. At about that time, Dr Dai Junyan, from a research group led by Academician Cui Tiejun and Professor Cheng Qiang of Southeast University in Nanjing, China, who pioneered STC technologies, joined Professor Chan’s group at CityU. “Dr Dai’s expertise in space-time coding and digital metasurfaces to dynamically reconfigure antenna performance added a new, important dimension to the antenna research at the SKLTMW,” said Professor Chan, who is also Director of the SKLTMW at CityU.
    However, the time modulation of electromagnetic waves on metasurfaces usually generates unwanted harmonic frequencies, called sidebands. These sidebands carry part of the radiated electromagnetic wave energy and interfere with the useful communication channels of the antenna, leading to “spectrum pollution.” But Professor Chan and his team proposed a novel design, which makes use of a waveguide (a line for transmitting electromagnetic waves by successive reflection from the inner wall) and successfully suppressed the undesired harmonics, achieving a high-directivity beam and enabling secure communication.
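    The sideband problem described above can be seen in a minimal simulation (an illustrative sketch, not the CityU design): switching a carrier on and off with a periodic coding sequence spreads energy into harmonics of the switching rate on either side of the carrier.

```python
import numpy as np

# Illustrative sketch: a carrier gated on/off by a periodic coding
# sequence acquires sidebands at odd multiples of the modulation
# frequency -- the "spectrum pollution" the new design suppresses.
fs = 10_000          # sample rate (Hz)
f_carrier = 1_000    # carrier frequency (Hz)
f_mod = 100          # on/off switching rate (Hz)
t = np.arange(0, 1, 1 / fs)

carrier = np.cos(2 * np.pi * f_carrier * t)
switching = (np.sin(2 * np.pi * f_mod * t) > 0).astype(float)  # 0/1 square wave
modulated = carrier * switching

spectrum = np.abs(np.fft.rfft(modulated))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The main peak stays at the carrier, but strong sidebands appear
# at f_carrier +/- f_mod, +/- 3*f_mod, and so on.
peak = freqs[np.argmax(spectrum)]
sideband = spectrum[np.argmin(np.abs(freqs - (f_carrier + f_mod)))]
print(f"main peak at {peak:.0f} Hz, first upper sideband level {sideband:.0f}")
```

    In this toy model roughly a third of the radiated amplitude leaks into the first pair of sidebands, which is why suppressing them matters for keeping communication channels clean.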
    “With the AM leaky-wave antenna and space-time coding technologies, we achieve the designated radiation characteristics by controlling the on-off sequences and duration of the ‘switches’ on the antenna through software,” said Professor Chan.
    “A high-directivity beam can be generated with the new antenna, allowing a wide range of radiation performance without having to redesign the antenna, except for using different STC inputs,” added Dr Wu.
    The energy from the radiated beam of the STC metasurface antenna can be focused to a focal point with fixed or varying focal lengths, which can be used for real-time imaging and treated as a type of radar to scan the environment and feed back data. “The invention plays an important role in the ISAC for 6G wireless communications,” Professor Chan explained. “For example, the radiated beam can scan a person and create an image of the person, allowing mobile phone users to talk to each other with 3D hologram imaging. It also performs better against eavesdropping than the conventional transmitter architecture.”


    Energy-efficient computing with tiny magnetic vortices

    A large percentage of energy used today is consumed in the form of electrical power for processing and storing data and for running the relevant terminal equipment and devices. According to predictions, the level of energy used for these purposes will increase even further in the future. Innovative concepts, such as neuromorphic computing, employ energy-saving approaches to solve this problem. In a joint project undertaken by experimental and theoretical physicists at Johannes Gutenberg University Mainz (JGU) with the funding of an ERC Synergy Grant, such an approach, known as Brownian reservoir computing, has now been realized. The results were also recently featured as an Editors’ Highlight in the Devices section of the scientific journal Nature Communications.
    Brownian computing uses ambient thermal energy
    Brownian reservoir computing is a combination of two unconventional computing methods. Brownian computing exploits the fact that computing processes typically run at room temperature, so the surrounding thermal energy can be harnessed to cut down on electricity consumption. The thermal energy used in the computing system is essentially the random movement of particles, known as Brownian motion, which explains the name of this computing method.
    Reservoir computing is ideal for exceptionally efficient data processing
    Reservoir computing utilizes the complex response of a physical system to external stimuli, resulting in an extremely resource-efficient way of processing data. Most of the computation is performed by the system itself, which does not require additional energy. Furthermore, this type of reservoir computer can easily be customized to perform various tasks as there is no need to adjust the solid-state system to suit specific requirements.
    A team headed by Professor Mathias Kläui of the Institute of Physics at Mainz University, supported by Professor Johan Mentink of Radboud University Nijmegen in the Netherlands, has now succeeded in developing a prototype that combines these two computing methods. This prototype is able to perform Boolean logic operations, which can be used as standard tests for the validation of reservoir computing.
    The solid-state system selected in this instance consists of metallic thin films exhibiting magnetic skyrmions. These magnetic vortices behave like particles and can be driven by electrical currents. The behavior of skyrmions is influenced not only by the applied current but also by their own Brownian motion. This Brownian motion of skyrmions can result in significantly increased energy savings as the system is automatically reset after each operation and prepared for the next computation.
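    The division of labor in reservoir computing can be illustrated with a minimal software analogue (a sketch, not the skyrmion device itself): a fixed random nonlinear "reservoir" transforms the input, and only a simple linear readout is trained, here on XOR, a Boolean operation of the kind used to validate such systems.

```python
import numpy as np

# Minimal software analogue of reservoir computing: the reservoir
# (random projection + tanh) is fixed and never trained; only the
# linear readout is fitted. Here it learns the XOR truth table.
rng = np.random.default_rng(0)
n_reservoir = 50

W_in = rng.normal(size=(n_reservoir, 2))       # fixed random input weights
inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
targets = np.array([0, 1, 1, 0], dtype=float)  # XOR truth table

# Reservoir response: a fixed nonlinear expansion of the inputs.
states = np.tanh(inputs @ W_in.T)

# Train only the linear readout (least squares with a bias column).
X = np.hstack([states, np.ones((4, 1))])
w, *_ = np.linalg.lstsq(X, targets, rcond=None)

predictions = (X @ w > 0.5).astype(int)
print(predictions)  # should reproduce the XOR truth table
```

    In the Mainz prototype the role of the random nonlinear transformation is played by the physical dynamics of the skyrmions themselves, so that part of the computation costs essentially no extra energy.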
    First prototype developed in Mainz
    Although there have been many theoretical concepts for skyrmion-based reservoir computing in recent years, the researchers in Mainz succeeded in developing the first functional prototype only when combining these concepts with the principle of Brownian computing. “The prototype is easy to produce from a lithographic point of view and can theoretically be reduced to a size of just nanometers,” said experimental physicist Klaus Raab. “We owe our success to the excellent collaboration between the experimental and theoretical physicists here at Mainz University,” emphasized theoretical physicist Maarten Brems. Project coordinator Professor Mathias Kläui added: “I’m delighted that the funding provided through a Synergy Grant from the European Research Council enabled us to collaborate with outstanding colleagues in the Department of Theoretical Physics in Nijmegen, and it was this collaboration that resulted in our achievement. I see great potential in unconventional computing, a field which also receives extensive support here at Mainz through funding from the Carl Zeiss Foundation for the Emergent Algorithmic Intelligence Center.”
    Story Source:
    Materials provided by Johannes Gutenberg Universitaet Mainz. Note: Content may be edited for style and length.


    New quantum dots study uncovers implications for biological imaging

    A new study involving researchers at the University of Illinois Chicago achieved a milestone in the synthesis of multifunctional photonic nanomaterials.
    In a paper published in the American Chemical Society’s journal Nano Letters, they report the synthesis of semiconductor “giant” core-shell quantum dots with record-breaking emissive lifetimes. In addition, the lifetimes can be tuned by making a simple alteration to the material’s internal structure.
    The group, which included collaborators from Princeton University and Pennsylvania State University, demonstrated a new structure-property concept that imparts the ability to spatially localize electrons or holes within a core/shell heterostructure by tuning the charge carrier’s kinetic energy on a parabolic potential energy surface.
    According to UIC chemist Preston Snee, this charge carrier separation results in extended radiative lifetimes and in continuous emission at the single-nanoparticle level.
    “These properties enable new applications for optics, facilitate novel approaches such as time-gated single-particle imaging and create inroads for the development of other new advanced materials,” said Snee, UIC associate professor of chemistry and the study’s senior co-author.
    Snee and the study’s first author, Marcell Pálmai, UIC postdoctoral research associate in chemistry, teamed with Haw Yang of Princeton and others to excite the quantum dots with light to put them in the “exciton” state. The exciton is an electron/hole charge pair, and in the new materials, the electron becomes displaced from the center to the shell, where it becomes trapped for upwards of 500 nanoseconds, which represents the record for such nanomaterials.
    “As emissive materials, quantum dots hold the promise of creating more energy-efficient displays and can be used as fluorescent probes for biomedical research due to their highly robust optical properties. They are 10 times to 100 times more absorptive than organic dyes and are nearly impervious to photobleaching, which is why they are used in the new Samsung QLED-TV,” they write.
    These new particles have great efficacy for fundamental biological discovery, according to the researchers.
    The quantum dots presented in their paper emit at red wavelengths, which minimizes scattering, while the long lifetimes allow biological imaging to be performed with less background noise. At the single-particle level, the new quantum dots emit continuously, so a research scientist can tag proteins relevant to cancer and follow the biological dynamics without losing track of the signal, which is currently a common problem with such studies.
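    Why long lifetimes reduce background can be shown with a back-of-the-envelope calculation (illustrative numbers; the background lifetime and gate delay are assumed, the 500 ns quantum-dot lifetime comes from the study): if detection is gated to start some tens of nanoseconds after excitation, short-lived autofluorescence has almost entirely decayed while most of the long-lived quantum-dot emission remains.

```python
import math

# Time-gated detection sketch: fraction of photons emitted after the
# gate opens is exp(-t_gate / tau) for an exponential decay.
tau_background = 3.0   # ns, typical autofluorescence lifetime (assumed)
tau_qdot = 500.0       # ns, long-lived quantum-dot emission (from the study)
gate_delay = 30.0      # ns, detection starts after this delay (assumed)

signal_kept = math.exp(-gate_delay / tau_qdot)          # ~94% of QD photons
background_kept = math.exp(-gate_delay / tau_background)  # ~4.5e-5 of background

print(f"QD signal kept:  {signal_kept:.3f}")
print(f"background kept: {background_kept:.2e}")
```

    Under these assumptions the gate rejects background by four orders of magnitude while sacrificing only a few percent of the signal, which is the essence of the time-gated single-particle imaging mentioned above.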
    In future research, the group plans to demonstrate that the materials make good components for optical devices such as micron-sized lasers.
    Co-authors of the paper are Marcell Pálmai, Eun Byoel Kim, Prakash Parajuli, Kyle Tomczak, Kai Wang, Bibash Sapkota, Nan Jiang and Robert F. Klie of UIC; Joseph S. Beckwith, Nyssa T. Emerson, Shuhui Yin and Tian Zhao of Princeton; and Ming Tien of Pennsylvania State University.
    Funding from the University of Illinois Chicago primarily supported this work. The research was also supported by funding from the American Chemical Society Petroleum Research Fund and grants from the U.S. Department of Energy (DE-SC0019364), the Fonds National Suisse de la Recherche Scientifique (P2GEP2_191208) and the National Science Foundation (CHE-1944796).
    Story Source:
    Materials provided by University of Illinois Chicago. Note: Content may be edited for style and length.


    New instrument measures supercurrent flow, data has applications in quantum computing

    Jigang Wang offered a quick walk-around of a new sort of microscope that can help researchers understand, and ultimately develop, the inner workings of quantum computing.
    Wang, an Iowa State University professor of physics and astronomy who’s also affiliated with the U.S. Department of Energy’s Ames National Laboratory, described how the instrument works in extreme scales of space, time and energy — billionths of a meter, quadrillionths of a second and trillions of electromagnetic waves per second.
    Wang pointed out and explained the control systems, the laser source, the maze of mirrors that make an optical path for light pulsing at trillions of cycles per second, the superconducting magnet that surrounds the sample space, the custom-made atomic force microscope, the bright yellow cryostat that lowers sample temperatures down to the temperature of liquid helium, about -450 degrees Fahrenheit.
    Wang calls the instrument a Cryogenic Magneto-Terahertz Scanning Near-field Optical Microscope. (That’s cm-SNOM for short.) It’s based at the Ames National Laboratory’s Sensitive Instrument Facility just northwest of Iowa State’s campus.
    It took five years and $2 million — $1.3 million from the W.M. Keck Foundation of Los Angeles and $700,000 from Iowa State and Ames National Laboratory — to build the instrument. It has been gathering data and contributing to experiments for less than a year.
    “No one has it,” Wang said of the extreme-scale nanoscope. “It’s the first in the world.”
    It can focus down to about 20 nanometers, or 20 billionths of a meter, while operating below liquid-helium temperatures and in strong, tesla-scale magnetic fields. That’s small enough to get a read on the superconducting properties of materials in these extreme environments.


    Finding the right AI for you

    The human genome is three billion letters of code, and each person has millions of variations. While no human can realistically sift through all that code, computers can. Artificial intelligence (AI) programs can find patterns in the genome related to disease much faster than humans can. They also spot things that humans miss. Someday, AI-powered genome readers may even be able to predict the incidence of diseases from cancer to the common cold. Unfortunately, AI’s recent popularity surge has led to a bottleneck in innovation.
    “It’s like the Wild West right now. Everyone’s just doing whatever the hell they want,” says Cold Spring Harbor Laboratory (CSHL) Assistant Professor Peter Koo. Just like Frankenstein’s monster was a mix of different parts, AI researchers are constantly building new algorithms from various sources. And it’s difficult to judge whether their creations will be good or bad. After all, how can scientists judge “good” and “bad” when dealing with computations that are beyond human capabilities?
    That’s where GOPHER, the Koo lab’s newest invention, comes in. GOPHER (short for GenOmic Profile-model compreHensive EvaluatoR) is a new method that helps researchers identify the most efficient AI programs to analyze the genome. “We created a framework where you can compare the algorithms more systematically,” explains Ziqi Tang, a graduate student in Koo’s laboratory.
    GOPHER judges AI programs on several criteria: how well they learn the biology of our genome, how accurately they predict important patterns and features, their ability to handle background noise, and how interpretable their decisions are. “AI are these powerful algorithms that are solving questions for us,” says Tang. But, she notes: “One of the major issues with them is that we don’t know how they came up with these answers.”
    GOPHER helped Koo and his team dig up the parts of AI algorithms that drive reliability, performance, and accuracy. The findings help define the key building blocks for constructing the most efficient AI algorithms going forward. “We hope this will help people in the future who are new to the field,” says Shushan Toneyan, another graduate student at the Koo lab.
    Imagine feeling unwell and being able to determine exactly what’s wrong at the push of a button. AI could someday turn this science-fiction trope into a feature of every doctor’s office. Similar to video-streaming algorithms that learn users’ preferences based on their viewing history, AI programs may identify unique features of our genome that lead to individualized medicine and treatments. The Koo team hopes GOPHER will help optimize such AI algorithms so that we can trust they’re learning the right things for the right reasons. Toneyan says: “If the algorithm is making predictions for the wrong reasons, they’re not going to be helpful.”
    Story Source:
    Materials provided by Cold Spring Harbor Laboratory. Original written by Luis Sandoval. Note: Content may be edited for style and length.


    Purchasing loot boxes in video games associated with problem gambling risk, says study

    Gamers who buy ‘loot boxes’ are up to two times more likely to gamble, shows new research published today in the peer-reviewed journal Addiction Research & Theory.
    They are also more likely to have a gambling problem compared with the gamers who don’t purchase these ‘virtual’ treasure chests, according to the findings based on more than 1,600 adults in Canada.
    The authors say the results cast doubt on the theory that psychological factors create the link between gambling and loot boxes — banned by some countries including Belgium and discussed for legislation in many others worldwide.
    Their study demonstrates that the association between these video game features and gambling exists even when childhood neglect, depression and other known risk factors for gambling are taken into account.
    The authors say their findings have potential implications for policymakers and for healthcare. They are calling for more research into the benefit of harm minimization features, with some online platforms having already implemented these — such as telling players the odds of winning when they buy a loot box.
    “Findings indicate that loot box purchasing represents an important marker of risk for gambling and problem gambling among people who play video games,” says Sophie Coelho, a PhD student at York University, Toronto.


    Changing the color of quantum light on an integrated chip

    Optical photons are ideal carriers of quantum information. But to work together in a quantum computer or network, they need to have the same color — or frequency — and bandwidth. Changing a photon’s frequency requires altering its energy, which is particularly challenging on integrated photonic chips.
    Recently, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) developed an integrated electro-optic modulator that can efficiently change the frequency and bandwidth of single photons. The device could be used for more advanced quantum computing and quantum networks.
    The research is published in Light: Science & Applications.
    Converting a photon from one color to another is usually done by sending the photon into a crystal with a strong laser shining through it, a process that tends to be inefficient and noisy. Phase modulation, in which the oscillation of the photon’s wave is accelerated or slowed to change its frequency, offers a more efficient method, but the device required for such a process, an electro-optic phase modulator, has proven difficult to integrate on a chip.
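    The idea behind phase-modulation frequency conversion can be sketched with a classical field (an analogue, not the SEAS device): multiplying a wave by a linear phase ramp, as a phase modulator driven at a constant rate does, shifts its entire spectrum by the ramp rate, with no second optical field involved.

```python
import numpy as np

# Classical-field sketch of frequency shifting by phase modulation:
# a linear phase ramp exp(i*2*pi*f_shift*t) moves the spectrum of a
# tone from f0 to f0 + f_shift.
fs = 100_000                 # sample rate (arbitrary units)
t = np.arange(0, 0.1, 1 / fs)
f0, f_shift = 5_000, 2_000   # original tone and desired shift

field = np.exp(2j * np.pi * f0 * t)       # input field at frequency f0
ramp = np.exp(2j * np.pi * f_shift * t)   # linear phase ramp from the modulator
shifted = field * ramp

freqs = np.fft.fftfreq(len(t), 1 / fs)
peak_in = freqs[np.argmax(np.abs(np.fft.fft(field)))]
peak_out = freqs[np.argmax(np.abs(np.fft.fft(shifted)))]
print(peak_in, peak_out)  # the peak moves from f0 to f0 + f_shift
```

    A time-varying ramp rate generalizes this to the "time lens" mentioned below: a quadratic phase in time reshapes a pulse's spectrum the way a lens reshapes a beam in space.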
    One material may be uniquely suited for such an application — thin-film lithium niobate.
    “In our work, we adopted a new modulator design on thin-film lithium niobate that significantly improved the device performance,” said Marko Lončar, the Tiantsai Lin Professor of Electrical Engineering at SEAS and senior author of the study. “With this integrated modulator, we achieved record-high terahertz frequency shifts of single photons.”
    The team also used the same modulator as a “time lens” — a magnifying glass that bends light in time instead of space — to change the spectral shape of a photon from fat to skinny.