More stories

  •

    Printing plastic webs to protect the cellphone screens of the future

    Follow the unbreakable bouncing phone! A Polytechnique Montréal team recently demonstrated that a fabric designed using additive manufacturing absorbs up to 96% of impact energy — all without breaking. The journal Cell Reports Physical Science has published an article detailing the innovation, which paves the way for the creation of unbreakable plastic coverings.
    The concept and accompanying research revealed in the article are relatively simple. Professors Frédérick Gosselin and Daniel Therriault from Polytechnique Montréal’s Department of Mechanical Engineering, along with doctoral student Shibo Zou, wanted to demonstrate how plastic webbing could be incorporated into a glass pane to prevent it from shattering on impact.
    It seems a simple enough concept, but further reflection reveals that there’s nothing simple about this plastic web.
    The researchers’ design was inspired by spider webs and their amazing properties. “A spider web can resist the impact of an insect colliding with it, due to its capacity to deform via sacrificial links at the molecular level, within silk proteins themselves,” Professor Gosselin explains. “We were inspired by this property in our approach.”
    Biomimicry via 3D printing
    Researchers used polycarbonate to achieve their results; when heated, polycarbonate becomes viscous like honey. Using a 3D printer, Professor Gosselin’s team harnessed this property to “weave” a series of fibres less than 2 mm thick, then repeated the process by printing a new series of fibres perpendicularly, moving fast, before the entire web solidified.
    It turns out that the magic is in the process itself — that’s where the final product acquires its key properties.
    As it’s slowly extruded by the 3D printer to form a fibre, the molten plastic creates circles that ultimately form a series of loops. “Once hardened, these loops turn into sacrificial links that give the fibre additional strength. When impact occurs, those sacrificial links absorb energy and break to maintain the fibre’s overall integrity — similar to silk proteins,” researcher Gosselin explains.
    In an article published in 2015, Professor Gosselin’s team demonstrated the principles behind the manufacturing of these fibres. The latest Cell Reports Physical Science article reveals how these fibres behave when intertwined to take the shape of a web.
    Study lead author Shibo Zou used the opportunity to illustrate how such a web could behave when located inside a protective screen. After embedding a series of webs in transparent resin plates, he conducted impact tests. The result? Plastic wafers dispersed up to 96% of impact energy without breaking. Instead of cracking, they deformed in certain places, preserving the wafers’ overall integrity.
    According to Professor Gosselin, this nature-inspired innovation could lead to the manufacture of a new type of bullet-proof glass, or to the production of more durable protective plastic smartphone screens. “It could also be used in aeronautics as a protective coating for aircraft engines,” Professor Gosselin notes. In the meantime, he certainly intends to explore the possibilities that this approach may open up for him.

    Story Source:
    Materials provided by Polytechnique Montréal. Note: Content may be edited for style and length.

  •

    Machine learning predicts anti-cancer drug efficacy

    With the advent of pharmacogenomics, machine learning research is well underway to predict individual patients’ drug responses using algorithms derived from previously collected data on drug responses. Entering high-quality learning data that reflects a person’s drug response as closely as possible is the starting point for improving the accuracy of the prediction. Previously, preclinical studies in animal models were used, as such data were relatively easier to obtain than human clinical data.
    In light of this, a research team led by Professor Sanguk Kim in the Department of Life Sciences at POSTECH is drawing attention by successfully increasing the accuracy of anti-cancer drug response predictions by using data closest to a real person’s response. The team developed this machine learning technique through algorithms that learn the transcriptome information from artificial organoids derived from actual patients instead of animal models. These research findings were published in the international journal Nature Communications on October 30.
    Even patients with the same cancer react differently to anti-cancer drugs, so customized treatment is considered paramount in treatment development. However, current predictions have been based on the genetic information of cancer cells, limiting their accuracy. Because of unnecessary biomarker information, machine learning tended to learn from false signals.
    To increase the predictive accuracy, the research team introduced machine learning algorithms that use the protein interaction network surrounding the target proteins, as well as the transcriptomes of individual proteins directly related to the drug targets. The approach guides the model to learn the transcriptome signatures of proteins that are functionally close to the target protein. In this way, only relevant biomarkers are learned, rather than the false biomarkers that conventional machine learning tended to pick up, which increases the accuracy.
    In addition, data from patient-derived organoids — not animal models — were used to narrow the discrepancy with responses in actual patients. With this method, the predicted responses of colorectal cancer patients treated with 5-fluorouracil and of bladder cancer patients treated with cisplatin were comparable to actual clinical results.
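    As a rough sketch of this idea (not the authors’ published code), the snippet below restricts the transcriptome features to genes that sit near a drug target in a protein-protein interaction network and then fits a simple regression model on organoid data. The edge list, expression table, response values and hop count are all hypothetical placeholders.

      # Minimal sketch of network-guided biomarker selection for drug-response
      # prediction. Not the published POSTECH implementation; all inputs are hypothetical.
      import networkx as nx
      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import cross_val_score

      def target_neighborhood(ppi_edges, drug_targets, hops=1):
          """Genes within `hops` steps of any drug-target protein in the PPI network."""
          g = nx.Graph(ppi_edges)
          selected = set(drug_targets)
          for t in drug_targets:
              if t in g:
                  selected |= set(nx.single_source_shortest_path_length(g, t, cutoff=hops))
          return sorted(selected)

      def fit_response_model(expression, response, ppi_edges, drug_targets):
          """expression: organoid-by-gene DataFrame; response: drug sensitivity per organoid."""
          genes = [g for g in target_neighborhood(ppi_edges, drug_targets) if g in expression.columns]
          X = expression[genes].to_numpy()   # only network-selected biomarkers, not all genes
          y = np.asarray(response)
          model = Ridge(alpha=1.0)
          score = cross_val_score(model, X, y, cv=5).mean()
          return model.fit(X, y), score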

    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  •

    Cockroaches and lizards inspire new robot

    A new high-speed amphibious robot inspired by the movements of cockroaches and lizards, developed by Ben-Gurion University of the Negev (BGU) researchers, swims and runs on top of water at high speeds and crawls on difficult terrain.
    The mechanical design of the AmphiSTAR robot and its control system were presented virtually last week at the IROS (International Conference on Intelligent Robots and Systems) by Dr. David Zarrouk, director, Bioinspired and Medical Robotics Laboratory in BGU’s Department of Mechanical Engineering, and graduate student Avi Cohen.
    “The AmphiSTAR uses a sprawling mechanism inspired by cockroaches, and it is designed to run on water at high speeds like the basilisk lizard,” says Zarrouk. “We envision that AmphiSTAR can be used for agricultural, search and rescue and excavation applications, where both crawling and swimming are required.”
    The palm-size AmphiSTAR, part of the family of STAR robots developed at the lab, is a wheeled robot fitted with four propellers underneath, whose axes can be tilted using the sprawl mechanism. The propellers act as wheels over ground and as fins that propel the robot over water, letting it swim and run on the water’s surface at speeds of up to 1.5 m/s. Two air tanks enable it to float and to transition smoothly from high speeds when hovering on the water to lower speeds when swimming, and from crawling to swimming and vice versa.
    The experimental robot can crawl over gravel, grass and concrete as fast as the original STAR robot and can attain speeds of 3.6 m/s (about 8 mph).
    “Our future research will focus on the scalability of the robot and on underwater swimming,” Zarrouk says.
    Video: https://www.youtube.com/watch?v=qXgPQ7_yld0&t=0s
    This study was supported in part by the BGU Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Initiative, and by the Marcus Endowment Fund, both at Ben-Gurion University of the Negev. The Marcus legacy gift, of over $480 million, was donated in 2016 to American Associates, Ben-Gurion University of the Negev by Dr. Howard and Lottie Marcus. The donation is the largest gift given to any Israeli university and is believed to be the largest gift to any Israeli institution.

    Story Source:
    Materials provided by American Associates, Ben-Gurion University of the Negev. Note: Content may be edited for style and length.

  •

    An underwater navigation system powered by sound

    GPS isn’t waterproof. The navigation system depends on radio waves, which break down rapidly in liquids, including seawater. To track undersea objects like drones or whales, researchers rely on acoustic signaling. But devices that generate and send sound usually require batteries — bulky, short-lived batteries that need regular changing. Could we do without them?
    MIT researchers think so. They’ve built a battery-free pinpointing system dubbed Underwater Backscatter Localization (UBL). Rather than emitting its own acoustic signals, UBL reflects modulated signals from its environment. That provides researchers with positioning information, at net-zero energy. Though the technology is still developing, UBL could someday become a key tool for marine conservationists, climate scientists, and the U.S. Navy.
    These advances are described in a paper being presented this week at the Association for Computing Machinery’s Hot Topics in Networks workshop, by members of the Media Lab’s Signal Kinetics group. Research Scientist Reza Ghaffarivardavagh led the paper, along with co-authors Sayed Saad Afzal, Osvy Rodriguez, and Fadel Adib, who leads the group and is the Doherty Chair of Ocean Utilization as well as an associate professor in the MIT Media Lab and the MIT Department of Electrical Engineering and Computer Science.
    “Power-hungry”
    It’s nearly impossible to escape GPS’ grasp on modern life. The technology, which relies on satellite-transmitted radio signals, is used in shipping, navigation, targeted advertising, and more. Since its introduction in the 1970s and ’80s, GPS has changed the world. But it hasn’t changed the ocean. If you had to hide from GPS, your best bet would be underwater.
    Because radio waves quickly deteriorate as they move through water, subsea communications often depend on acoustic signals instead. Sound waves travel faster and further underwater than through air, making them an efficient way to send data. But there’s a drawback.

    “Sound is power-hungry,” says Adib. For tracking devices that produce acoustic signals, “their batteries can drain very quickly.” That makes it hard to precisely track objects or animals for a long time-span — changing a battery is no simple task when it’s attached to a migrating whale. So, the team sought a battery-free way to use sound.
    Good vibrations
    Adib’s group turned to a unique resource they’d previously used for low-power acoustic signaling: piezoelectric materials. These materials generate their own electric charge in response to mechanical stress, like getting pinged by vibrating soundwaves. Piezoelectric sensors can then use that charge to selectively reflect some soundwaves back into their environment. A receiver translates that sequence of reflections, called backscatter, into a pattern of 1s (for soundwaves reflected) and 0s (for soundwaves not reflected). The resulting binary code can carry information about ocean temperature or salinity.
    In principle, the same technology could provide location information. An observation unit could emit a soundwave, then clock how long it takes that soundwave to reflect off the piezoelectric sensor and return to the observation unit. The elapsed time could be used to calculate the distance between the observer and the piezoelectric sensor. But in practice, timing such backscatter is complicated, because the ocean can be an echo chamber.
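    As a back-of-the-envelope illustration of that timing idea (assuming a nominal sound speed of roughly 1,500 m/s in seawater, a figure not given in the article), the distance is simply the sound speed times the round-trip time, halved because the ping travels out and back:

      # Round-trip acoustic ranging, in the spirit of the description above.
      # The sound speed is a nominal seawater value, not a figure from the paper.
      SOUND_SPEED_SEAWATER = 1500.0  # m/s, approximate

      def distance_from_echo(round_trip_time_s, sound_speed=SOUND_SPEED_SEAWATER):
          """Distance to the backscatter node: halve the out-and-back path."""
          return sound_speed * round_trip_time_s / 2.0

      print(distance_from_echo(0.0006))  # an echo arriving after 0.6 ms implies about 0.45 m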
    The sound waves don’t just travel directly between the observation unit and sensor. They also careen between the surface and seabed, returning to the unit at different times. “You start running into all of these reflections,” says Adib. “That makes it complicated to compute the location.” Accounting for reflections is an even greater challenge in shallow water — the short distance between seabed and surface means the confounding rebound signals are stronger.

    The researchers overcame the reflection issue with “frequency hopping.” Rather than sending acoustic signals at a single frequency, the observation unit sends a sequence of signals across a range of frequencies. Each frequency has a different wavelength, so the reflected sound waves return to the observation unit at different phases. By combining information about timing and phase, the observer can pinpoint the distance to the tracking device. Frequency hopping was successful in the researchers’ deep-water simulations, but they needed an additional safeguard to cut through the reverberating noise of shallow water.
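    A hedged sketch of how multi-frequency measurements can yield a distance estimate: because the round-trip phase grows linearly with frequency, the slope of phase versus frequency encodes the delay. This illustrates only the general principle, not the researchers’ algorithm, and the hop frequencies and sound speed below are invented.

      # Toy frequency-hopping range estimate: the phase slope vs. frequency gives the delay.
      import numpy as np

      SOUND_SPEED = 1500.0                  # m/s, nominal seawater value (assumption)
      freqs = np.linspace(20e3, 30e3, 16)   # hypothetical hop frequencies, Hz

      def simulate_phases(distance_m, freqs):
          tau = 2 * distance_m / SOUND_SPEED                  # round-trip delay
          return np.angle(np.exp(2j * np.pi * freqs * tau))   # wrapped phase per hop

      def estimate_distance(phases, freqs):
          slope = np.polyfit(freqs, np.unwrap(phases), 1)[0]  # d(phase)/d(freq) = 2*pi*tau
          return SOUND_SPEED * (slope / (2 * np.pi)) / 2

      print(estimate_distance(simulate_phases(0.45, freqs), freqs))  # ~0.45 m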
    Where echoes run rampant between the surface and seabed, the researchers had to slow the flow of information. They reduced the bitrate, essentially waiting longer between each signal sent out by the observation unit. That allowed the echoes of each bit to die down before potentially interfering with the next bit. Whereas a bitrate of 2,000 bits/second sufficed in simulations of deep water, the researchers had to dial it down to 100 bits/second in shallow water to obtain a clear signal reflection from the tracker. But a slow bitrate didn’t solve everything.
    To track moving objects, the researchers actually had to boost the bitrate. One thousand bits/second was too slow to pinpoint a simulated object moving through deep water at 30 centimeters/second. “By the time you get enough information to localize the object, it has already moved from its position,” explains Afzal. At a speedy 10,000 bits/second, they were able to track the object through deep water.
    Efficient exploration
    Adib’s team is working to improve the UBL technology, in part by solving challenges like the conflict between low bitrate required in shallow water and the high bitrate needed to track movement. They’re working out the kinks through tests in the Charles River. “We did most of the experiments last winter,” says Rodriguez. That included some days with ice on the river. “It was not very pleasant.”
    Conditions aside, the tests provided a proof-of-concept in a challenging shallow-water environment. UBL estimated the distance between a transmitter and a backscatter node at ranges up to nearly half a meter. The team is working to increase UBL’s range in the field, and they hope to test the system with their collaborators at the Woods Hole Oceanographic Institution on Cape Cod.
    They hope UBL can help fuel a boom in ocean exploration. Ghaffarivardavagh notes that scientists have better maps of the moon’s surface than of the ocean floor. “Why can’t we send out unmanned underwater vehicles on a mission to explore the ocean? The answer is: We will lose them,” he says.
    UBL could one day help autonomous vehicles stay found underwater, without spending precious battery power. The technology could also help subsea robots work more precisely, and provide information about climate change impacts in the ocean. “There are so many applications,” says Adib. “We’re hoping to understand the ocean at scale. It’s a long-term vision, but that’s what we’re working toward and what we’re excited about.”
    This work was supported, in part, by the Office of Naval Research.

  •

    Ultrapotent COVID-19 vaccine candidate designed via computer

    An innovative nanoparticle vaccine candidate for the pandemic coronavirus produces virus-neutralizing antibodies in mice at levels ten times greater than those seen in people who have recovered from COVID-19 infections. Designed by scientists at the University of Washington School of Medicine in Seattle, the vaccine candidate has been transferred to two companies for clinical development.
    Compared to vaccination with the soluble SARS-CoV-2 Spike protein, which is what many leading COVID-19 vaccine candidates are based on, the new nanoparticle vaccine produced ten times more neutralizing antibodies in mice, even at a six-fold lower vaccine dose. The data also show a strong B-cell response after immunization, which can be critical for immune memory and a durable vaccine effect. When administered to a single nonhuman primate, the nanoparticle vaccine produced neutralizing antibodies targeting multiple different sites on the Spike protein. Researchers say this may ensure protection against mutated strains of the virus, should they arise. The Spike protein is part of the coronavirus infectivity machinery.
    The findings are published in Cell. The lead authors of this paper are Alexandra Walls, a research scientist in the laboratory of David Veesler, who is an associate professor of biochemistry at the UW School of Medicine; and Brooke Fiala, a research scientist in the laboratory of Neil King, who is an assistant professor of biochemistry at the UW School of Medicine.
    The vaccine candidate was developed using structure-based vaccine design techniques invented at UW Medicine. It is a self-assembling protein nanoparticle that displays 60 copies of the SARS-CoV-2 Spike protein’s receptor-binding domain in a highly immunogenic array. The molecular structure of the vaccine roughly mimics that of a virus, which may account for its enhanced ability to provoke an immune response.
    “We hope that our nanoparticle platform may help fight this pandemic that is causing so much damage to our world,” said King, inventor of the computational vaccine design technology at the Institute for Protein Design at UW Medicine. “The potency, stability, and manufacturability of this vaccine candidate differentiate it from many others under investigation.”
    Hundreds of candidate vaccines for COVID-19 are in development around the world. Many require large doses, complex manufacturing, and cold-chain shipping and storage. An ultrapotent vaccine that is safe, effective at low doses, simple to produce and stable outside of a freezer could enable vaccination against COVID-19 on a global scale.
    “I am delighted that our studies of antibody responses to coronaviruses led to the design of this promising vaccine candidate,” said Veesler, who spearheaded the concept of a multivalent receptor-binding domain-based vaccine.
    The lead vaccine candidate from this report is being licensed non-exclusively and royalty-free during the pandemic by the University of Washington. One licensee, Icosavax, Inc., a Seattle biotechnology company co-founded in 2019 by King, is currently advancing studies to support regulatory filings and has initiated manufacturing under the U.S. Food and Drug Administration’s Good Manufacturing Practice.
    To accelerate progress by Icosavax to the clinic, Amgen, Inc. has agreed to manufacture a key intermediate for these initial clinical studies. Another licensee, SK bioscience Co., Ltd., based in South Korea, is also advancing its own studies to support clinical and further development.
    The research reported in Cell was supported by the National Institute of Allergy and Infectious Diseases (DP1AI158186, HHSN272201700059C, 3U01AI42001-02S1), National Institute of General Medical Sciences (R01GM120553, R01GM099989), Bill & Melinda Gates Foundation (OPP1156262, OPP1126258, OPP1159947), Defense Threat Reduction Agency (HDTRA1-18-1-0001), Pew Biomedical Scholars Award, Investigators in the Pathogenesis of Infectious Disease Award from the Burroughs Wellcome Fund, Fast Grants, Animal Models Contract HHSN272201700036I-75N93020F00001, University of Washington’s Proteomics Resource (UWPR95794), North Carolina Coronavirus Relief Fund, and gifts from Jodi Green and Mike Halperin and from The Audacious Project.

    Story Source:
    Materials provided by University of Washington Health Sciences/UW Medicine. Original written by Ian Haydon, UW Medicine Institute for Protein Design. Note: Content may be edited for style and length.

  •

    Birdwatching from afar: Amazing new AI-enabled camera system to target specific behaviors

    A research team from Osaka University has developed an innovative new animal-borne data-collection system that, guided by artificial intelligence (AI), has led to the witnessing of previously unreported foraging behaviors in seabirds.
    Bio-logging is a technique involving the mounting of small lightweight video cameras and/or other data-gathering devices onto the bodies of wild animals. The systems then allow researchers to observe various aspects of that animal’s life, such as its behaviors and social interactions, with minimal disturbance.
    However, the considerable battery capacity these high-cost bio-logging systems require has so far proven limiting. “Since bio-loggers attached to small animals have to be small and lightweight, they have short runtimes, and it was therefore difficult to record interesting but infrequent behaviors,” explains study corresponding author Takuya Maekawa.
    “We have developed a new AI-equipped bio-logging device that allows us to automatically detect and record the specific target behaviors of interest based on data from low-cost sensors such as accelerometers and geographic positioning systems (GPS).” The low-cost sensors then limit the use of the high-cost sensors, such as video cameras, to just the periods of time when they are most likely to capture the specific target behavior.
    The use of these systems in combination with machine learning techniques can focus data collection with the expensive sensors directly onto interesting but infrequent behaviors, greatly increasing the likelihood that those behaviors will be detected.
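    A minimal sketch of that gating logic follows, assuming a hypothetical camera object and hand-picked features and threshold (the authors’ actual detector and parameters are not described at this level of detail):

      # Lightweight behavior detector on accelerometer windows that switches the
      # power-hungry camera on only when the target behavior looks likely.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def features(window):
          """Summary statistics of one 3-axis accelerometer window (n_samples x 3)."""
          return np.concatenate([window.mean(axis=0), window.std(axis=0),
                                 [np.linalg.norm(window, axis=1).max()]])

      def train_detector(X_train, y_train):
          """X_train: rows of features(); y_train: 1 = target behavior (e.g. foraging)."""
          return RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

      def maybe_record(detector, accel_window, camera, threshold=0.8):
          """Start the camera only when the predicted behavior probability is high."""
          p = detector.predict_proba(features(accel_window).reshape(1, -1))[0, 1]
          if p >= threshold:
              camera.start_recording()
          else:
              camera.stop_recording()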
    The new AI-assisted video camera system was tested on black-tailed gulls and streaked shearwaters in their natural environment on islands off the coast of Japan. “The new method improved the detection of foraging behaviors in the black-tailed gulls 15-fold compared with the random sampling method,” says lead author Joseph Korpela. “In the streaked shearwaters, we applied a GPS-based AI-equipped system to detect specific local flight activities of these birds. The GPS-based system had a precision of 0.59 — far higher than the 0.07 of a periodic sampling method involving switching the camera on every 30 minutes.”
    There are many potential applications for the use of AI-equipped bio-loggers in the future, not least the further development of the systems themselves. “These systems have a huge range of possible applications including detection of poaching activity using anti-poaching tags,” says Maekawa. “We also anticipate that this work will be used to reveal the interactions between human society and wild animals that transmit epidemics such as coronavirus.”

    Story Source:
    Materials provided by Osaka University. Note: Content may be edited for style and length.

  •

    Intelligent cameras enhance human perception

    Intelligent cameras are the next milestone in image and video processing. A team of researchers at the Chair of Multimedia Communications and Signal Processing at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) has developed an intelligent camera that achieves not only high spatial and temporal but also spectral resolution. The camera has a wide range of applications that can improve environmental protection and resource conservation measures as well as autonomous driving or modern agriculture. The findings of the research have been published as an open-access publication.
    ‘Research up to now has mainly focused on increasing spatial and temporal resolution, which means the number of megapixels or images per second,’ explains lecturer Dr. Jürgen Seiler. ‘Spectral resolution — the wavelength and thus the perception of colours — has largely been adjusted to match human sight during the development of cameras, which merely corresponds to measuring the colours red, green, and blue. However, much more information is hidden in the light spectrum that can be used for a wide range of tasks. For example, we know that some animals use additional light spectra for hunting and searching for food.’
    Three resolutions in one camera
    Seiler, who is an electrical engineer, has therefore developed a high-resolution multi-spectral camera that enhances human perception with his team at the Chair of Multimedia Communications and Signal Processing (LMS) led by Prof. Dr. Kaup at FAU. It combines all three resolutions — spatial, temporal and spectral — in a cost-efficient solution. ‘Up to now, there were only extremely expensive and complex methods for measuring the ultraviolet or infrared ranges of light or individual spectral bands for special industrial applications,’ says Seiler. ‘We looked for a cost-efficient model and we were able to develop a very cost-effective multi-spectral camera.’
    The researchers connected several inexpensive standard cameras with various spectral filters to form a multi-spectral camera array. ‘We then calculated an image in order to combine the various spectral information from each sensor,’ explains Nils Genser, research associate at LMS. ‘This new concept enables us to precisely determine the materials of each object captured using just one single image.’
    At the same time, the new camera is greatly superior to existing systems in terms of its spatial, temporal and spectral resolution. As the surroundings are recorded by several ‘eyes’ as is the case with human sight, the system also provides a precise indication of depth. This means that the system not only precisely determines the colour and certain material properties of objects it captures, but also the distance between them and the camera.
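    Conceptually, combining the filtered views amounts to registering each camera’s image onto a common reference and stacking the results into a per-pixel spectrum. The sketch below shows only that stacking step using OpenCV-style homography warps; the FAU pipeline’s calibration, parallax handling and depth reconstruction are omitted, and all inputs are hypothetical.

      # Assemble a multi-spectral cube from a camera array (illustrative only).
      import numpy as np
      import cv2  # OpenCV, used here for the perspective warps

      def build_spectral_cube(images, homographies):
          """Warp each filtered view onto the reference view and stack into (H, W, bands).
          `homographies` map each camera's image plane onto the reference camera."""
          ref_h, ref_w = images[0].shape[:2]
          bands = [cv2.warpPerspective(img, H, (ref_w, ref_h))
                   for img, H in zip(images, homographies)]
          return np.stack(bands, axis=-1)

      # cube[y, x] is then a per-pixel spectrum that can be used to tell
      # materials apart, e.g. different types of plastic.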
    Ideal for autonomous driving and environmental technology
    Autonomous driving is a potential application for these new intelligent cameras. ‘A whole range of solutions to various problems has now opened up thanks to our new technology,’ says Seiler. ‘In the infrared range, for example, we can differentiate between real people and signposts using the thermal signature. For night driving, we can detect animals crossing the road with sufficient warning.’
    The high-resolution multi-spectral cameras could also be used for protecting the environment and conserving resources. ‘Several plastics differ significantly from each other in various ranges of the spectrum, which is something the new intelligent camera can reliably detect,’ Genser emphasises. ‘Large amounts of plastics are simply burned instead of separated for recycling as they have a similar appearance. We can now separate them reliably.’

    Story Source:
    Materials provided by University of Erlangen-Nuremberg. Note: Content may be edited for style and length.

  •

    A new spin on atoms gives scientists a closer look at quantum weirdness

    When atoms get extremely close, they develop intriguing interactions that could be harnessed to create new generations of computing and other technologies. These interactions in the realm of quantum physics have proven difficult to study experimentally due to the basic limitations of optical microscopes.
    Now a team of Princeton researchers, led by Jeff Thompson, an assistant professor of electrical engineering, has developed a new way to control and measure atoms that are so close together no optical lens can distinguish them.
    Described in an article published Oct. 30 in the journal Science, their method excites closely-spaced erbium atoms in a crystal using a finely tuned laser in a nanometer-scale optical circuit. The researchers take advantage of the fact that each atom responds to slightly different frequencies, or colors, of laser light, allowing the researchers to resolve and control multiple atoms, without relying on their spatial information.
    In a conventional microscope, the space between two atoms effectively disappears when their separation is below a key distance called the diffraction limit, which is roughly equal to the light’s wavelength. This is analogous to two distant stars that appear as a single point of light in the night sky. However, this is also the scale at which atoms start to interact and give rise to rich and interesting quantum mechanical behavior.
    “We always wonder, at the most fundamental level — inside solids, inside crystals — what do atoms actually do? How do they interact?” said physicist Andrei Faraon, a professor at the California Institute of Technology who was not involved in the research. “This [paper] opens the window to study atoms that are in very, very close proximity.”
    Studying atoms and their interactions at tiny distances allows scientists to explore and control a quantum property known as spin. As a form of momentum, spin is usually described as being either up or down (or both, but that’s another story). When the distance between two atoms grows vanishingly small — mere billionths of a meter — the spin of one exerts influence over the spin of the other, and vice versa. As spins interact in this realm, they can become entangled, a term scientists use to describe two or more particles that are inextricably linked. Entangled particles behave as if they share one existence, no matter how far apart they later become. Entanglement is the essential phenomenon that separates quantum mechanics from the classical world, and it’s at the center of the vision for quantum technologies. The new Princeton device is a stepping stone for scientists to study these spin interactions with unprecedented clarity.

    One important feature of the new Princeton device is its potential to address hundreds of atoms at a time, providing a rich quantum laboratory in which to gather empirical data. It’s a boon for physicists who hope to unlock reality’s deepest mysteries, including the spooky nature of entanglement.
    Such inquiry is not merely esoteric. Over the past three decades, engineers have sought to use quantum phenomena to create complex technologies for information processing and communication, from the logical building blocks of emerging quantum computers, capable of solving otherwise impossible problems, to ultrasecure communication methods that can link machines into an unhackable quantum Internet. To develop these systems further, scientists will need to entangle particles reliably and exploit their entanglement to encode and process information.
    Thompson’s team saw an opportunity in erbium. Traditionally used in lasers and magnets, erbium was not widely explored for use in quantum systems because it is difficult to observe, according to the researchers. The team made a breakthrough in 2018, developing a way to enhance the light emitted by these atoms, and to detect that signal extremely efficiently. Now they’ve shown they can do it all en masse.
    When the laser illuminates the atoms, it excites them just enough for them to emit a faint light at a unique frequency, but delicately enough to preserve and read out the atoms’ spins. These frequencies change ever so subtly according to the atoms’ different states, so that “up” has one frequency and “down” has another, and each individual atom has its own pair of frequencies.
    “If you have an ensemble of these qubits, they all emit light at very slightly different frequencies. And so by tuning the laser carefully to the frequency of one or the frequency of the other, we can address them, even though we have no ability to spatially resolve them,” Thompson said. “Each atom sees all of the light, but they only listen to the frequency they’re tuned to.”
    The light’s frequency is then a perfect proxy for the spin. Switching the spins up and down gives researchers a way to make calculations. It’s akin to transistors that are either on or off in a classical computer, giving rise to the zeroes and ones of our digital world.
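    A toy illustration of that frequency addressing (made-up numbers, not the experiment’s control code): two atoms that cannot be separated spatially still have distinct “up” and “down” transition frequencies, so a narrow laser tuned to one line drives essentially only that atom and that spin state.

      # Spectral addressing of two emitters that overlap spatially.
      atoms = {
          "atom_A": {"up": 195.2101e12, "down": 195.2103e12},  # Hz, hypothetical lines
          "atom_B": {"up": 195.2116e12, "down": 195.2118e12},  # near erbium's ~195 THz band
      }
      LINEWIDTH = 50e6  # Hz, hypothetical optical linewidth

      def excitation(laser_freq, transition_freq, linewidth=LINEWIDTH):
          """Lorentzian response: appreciable only when the laser sits on resonance."""
          d = laser_freq - transition_freq
          return (linewidth / 2) ** 2 / (d ** 2 + (linewidth / 2) ** 2)

      laser = atoms["atom_A"]["up"]  # tune the laser to atom A's "up" transition
      for name, lines in atoms.items():
          print(name, {state: round(excitation(laser, f), 4) for state, f in lines.items()})
      # atom_A's "up" line responds fully (~1.0); every other line barely responds.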

    To form the basis of a useful quantum processor, these qubits will need to go a step further.
    “The strength of the interaction is related to the distance between the two spins,” said Songtao Chen, a postdoctoral researcher in Thompson’s lab and one of the paper’s two lead authors. “We want to make them close so we can have this mutual interaction, and use this interaction to create a quantum logic gate.”
    A quantum logic gate requires two or more entangled qubits, making it capable of performing uniquely quantum operations, such as computing the folding patterns of proteins or routing information on the quantum internet.
    Thompson, who holds a leadership position at the U.S. Department of Energy’s new $115M quantum science initiative, is on a mission to bring these qubits to heel. Within the materials thrust of the Co-Design Center for Quantum Advantage, he leads the sub-thrust on qubits for computing and networking.
    His erbium system, a new kind of qubit that is especially useful in networking applications, can operate using the existing telecommunications infrastructure, sending signals in the form of encoded light over silicon devices and optical fibers. These two properties give erbium an industrial edge over today’s most advanced solid-state qubits, which transmit information through visible light wavelengths that don’t work well with optical-fiber communication networks.
    Still, to operate at scale, the erbium system will need to be further engineered.
    While the team can control and measure the spin state of its qubits no matter how close they get, and use optical structures to produce high-fidelity measurement, they can’t yet arrange the qubits as needed to form two-qubit gates. To do that, engineers will need to find a different material to host the erbium atoms. The study was designed with this future improvement in mind.
    “One of the major advantages of the way we have done this experiment is that it has nothing to do with what host the erbium sits in,” said Mouktik Raha, a sixth-year graduate student in electrical engineering and one of the paper’s two lead authors. “As long as you can put erbium inside it and it doesn’t jitter around, you’re good to go.”
    Christopher M. Phenicie and Salim Ourari, both electrical engineering graduate students, also contributed to the paper. The work was carried out in conjunction with the Princeton Quantum Initiative, and funded in part by the National Science Foundation, the Princeton Center for Complex Materials, the Young Investigator Program of the Air Force Office of Scientific Research, and the Defense Advanced Research Projects Agency.