More stories

  •

    New quantum receiver the first to detect entire radio frequency spectrum

    A new quantum sensor can analyze the full spectrum of radio frequency and real-world signals, unleashing new potential for soldier communications, spectrum awareness and electronic warfare.
    Army researchers built the quantum sensor, which can sample the radio-frequency spectrum — from zero frequency up to 20 GHz — and detect AM and FM radio, Bluetooth, Wi-Fi and other communication signals.
    The Rydberg sensor uses laser beams to create highly excited Rydberg atoms directly above a microwave circuit, to boost and home in on the portion of the spectrum being measured. The Rydberg atoms are sensitive to the circuit’s voltage, enabling the device to be used as a sensitive probe for the wide range of signals in the RF spectrum.
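    As a rough illustration of the signal processing such a wideband sensor enables (a generic numerical sketch only, not the Rydberg device’s physics or any Army analysis code; the sample rate and carrier frequencies below are made up), the snippet samples a voltage trace containing two tones and uses a Fourier transform to report which bands are occupied:

```python
# Generic spectrum-analysis sketch (illustration only, not the Rydberg sensor):
# sample a voltage trace containing two made-up carrier tones and use an FFT
# to find which parts of the 0-20 GHz band are occupied.
import numpy as np

sample_rate = 50e9                        # 50 GS/s, enough to cover 0-20 GHz (assumed)
t = np.arange(0, 2e-6, 1 / sample_rate)   # 2 microseconds of signal

# Hypothetical carriers: one near the FM broadcast band, one near Wi-Fi.
voltage = (np.sin(2 * np.pi * 100e6 * t) +         # 100 MHz tone
           0.5 * np.sin(2 * np.pi * 2.4e9 * t))    # 2.4 GHz tone

spectrum = np.abs(np.fft.rfft(voltage))
freqs = np.fft.rfftfreq(t.size, d=1 / sample_rate)

# Report the two strongest spectral peaks.
for i in np.argsort(spectrum)[-2:][::-1]:
    print(f"signal detected near {freqs[i] / 1e9:.2f} GHz")
```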
    “All previous demonstrations of Rydberg atomic sensors have only been able to sense small and specific regions of the RF spectrum, but our sensor now operates continuously over a wide frequency range for the first time,” said Dr. Kevin Cox, a researcher at the U.S. Army Combat Capabilities Development Command, now known as DEVCOM, Army Research Laboratory. “This is a really important step toward proving that quantum sensors can provide a new, and dominant, set of capabilities for our Soldiers, who are operating in an increasingly complex electromagnetic battlespace.”
    The Rydberg spectrum analyzer has the potential to surpass fundamental limitations of traditional electronics in sensitivity, bandwidth and frequency range. Because of this, the lab’s Rydberg spectrum analyzer and other quantum sensors have the potential to unlock a new frontier of Army sensors for spectrum awareness, electronic warfare, sensing and communications — part of the Army’s modernization strategy.
    “Devices that are based on quantum constituents are one of the Army’s top priorities to enable technical surprise in the competitive future battlespace,” said Army researcher Dr. David Meyer. “Quantum sensors in general, including the one demonstrated here, offer unparalleled sensitivity and accuracy to detect a wide range of mission-critical signals.”
    The peer-reviewed journal Physical Review Applied published the researchers’ findings, Waveguide-coupled Rydberg spectrum analyzer from 0 to 20 GHz, co-authored by Army researchers Drs. David Meyer, Paul Kunz, and Kevin Cox.
    The researchers plan additional development to improve the signal sensitivity of the Rydberg spectrum analyzer, aiming to outperform existing state-of-the-art technology.
    “Significant physics and engineering effort is still necessary before the Rydberg analyzer can integrate into a field-testable device,” Cox said. “One of the first steps will be understanding how to retain and improve the device’s performance as the sensor size is decreased. The Army has emerged as a leading developer of Rydberg sensors, and we expect more cutting-edge research to result as this futuristic technology concept quickly becomes a reality.”

    Story Source:
    Materials provided by U.S. Army Research Laboratory. Note: Content may be edited for style and length.

  •

    State-funded pre-K may enhance math achievement

    In the first longitudinal study to follow Georgia pre-K students through middle school, Stacey Neuharth-Pritchett, associate dean for academic programs and professor in UGA’s Mary Frances Early College of Education, found that participating in pre-K programs positively predicted mathematical achievement in students through seventh grade.
    “Students who participated in the study were twice as likely to meet the state standards in their mathematics achievement,” said Neuharth-Pritchett. “School becomes more challenging as one progresses through the grades, and so if in middle school, students are still twice as likely to meet the state standards, it’s clear that something that happened early on was influencing their trajectory.”
    The study found that, in fourth through seventh grades, the odds of a pre-K participant in the study meeting Georgia’s state academic standards on the statewide standardized test were 1.67-2.10 times greater than the odds for a nonparticipant, providing evidence of sustained benefits of state-funded pre-K programs.
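    For readers unfamiliar with odds ratios: the odds of an outcome are the number of students meeting the standard divided by the number not meeting it, and the odds ratio compares those odds across groups. The counts below are hypothetical, chosen only to show how a figure near the reported range arises; they are not the study’s data.

```python
# Worked odds-ratio example with made-up counts (not the study's data).
prek_pass, prek_fail = 68, 32         # hypothetical: 68 of 100 pre-K students meet the standard
other_pass, other_fail = 50, 50       # hypothetical: 50 of 100 comparison students do

odds_prek = prek_pass / prek_fail     # 2.125
odds_other = other_pass / other_fail  # 1.0
print(f"odds ratio: {odds_prek / odds_other:.2f}")   # 2.12 -> odds about 2.1 times greater
```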
    “Pre-K is a critical space where children experience success, and it sets them on a trajectory for being successful as they make the transition to kindergarten,” she said. “The hope is that when children are successful early in school, they are more likely to be engaged as they progress and more likely to complete high school.”
    Although quality learning experiences during the early years of development have been shown to provide the skills and knowledge for later mathematics achievement, access and entry to high-quality preschool programs remain unequal across the nation.
    “Our study looked at a high-needs school district that enrolled children from vulnerable situations in terms of economics and access to early learning experiences,” said Neuharth-Pritchett. “A number of the children in the study had not had any other formative experiences before they went to kindergarten.”
    Educational experiences are seen as foundational to later school success, with some studies documenting other beneficial outcomes for students who attend pre-K, including a higher chance of completing high school, fewer mental health concerns, less reliance on the welfare system and more. However, students from low-income families often have more limited opportunities to learn at home as well as in pre-K programs.

    While some families are knowledgeable about providing their children with basic mathematical concepts and other foundational skills for a smooth transition from home to school, other families might not be aware of the expectations for having mastered a number of these foundational skills before entering kindergarten.
    “Equal access to pre-K education has a long history that goes all the way back to the war on poverty. Part of the thinking during the 1960s was that such early learning opportunities would provide the high-quality preschool education that could level the educational playing field between those with economic resources and those without,” she said. “Our study indicated sustained benefits for children’s early learning experiences that persist into the elementary and middle school years.”
    Some implications of the study for policymakers to consider include ensuring more equitable access to pre-K programs and hiring highly skilled teachers to promote children’s learning and development. More than half of the pre-K teachers involved in the study held either a master’s or specialist degree, indicating the importance and influence of high-quality, experienced instructors on children’s academic success.
    Because of a change in program support for the Georgia Prekindergarten Program during Gov. Nathan Deal’s term, a high proportion of pre-K teachers are now very early in their teaching careers.
    Along with Jisu Han, an assistant professor at Kyung Hee University and co-author of the study, Neuharth-Pritchett plans to continue following the study’s participants as they progress through high school.
    “The state of Georgia invests substantial resources into this program, so it’s good that these outcomes can be cited for its efficacy,” said Neuharth-Pritchett. “The data from this study gives a much more longitudinal view of success and suggests these programs contribute to children’s education and success. Our results ultimately contribute to evidence supporting early learning and factors influencing long-term academic success for Georgia’s children.”

    Story Source:
    Materials provided by University of Georgia. Note: Content may be edited for style and length.

  •

    Scientists propose new way to detect emotions using wireless signals

    A novel artificial intelligence (AI) approach based on wireless signals could help to reveal our inner emotions, according to new research from Queen Mary University of London.
    The study, published in the journal PLOS ONE, demonstrates the use of radio waves to measure heart rate and breathing signals and predict how someone is feeling even in the absence of any other visual cues, such as facial expressions.
    Participants were initially asked to watch a video selected by researchers for its ability to evoke one of four basic emotion types: anger, sadness, joy and pleasure. Whilst the individual was watching the video, the researchers emitted harmless radio signals, like those transmitted from any wireless system, including radar or WiFi, towards the individual and measured the signals that bounced back off them. By analysing changes to these signals caused by slight body movements, the researchers were able to reveal ‘hidden’ information about an individual’s heart and breathing rates.
    Previous research has used similar non-invasive or wireless methods of emotion detection; however, in those studies, data analysis depended on the use of classical machine learning approaches, whereby an algorithm is used to identify and classify emotional states within the data. For this study, the scientists instead employed deep learning techniques, where an artificial neural network learns its own features from time-dependent raw data, and showed that this approach could detect emotions more accurately than traditional machine learning methods.
    Achintha Avin Ihalage, a PhD student at Queen Mary, said: “Deep learning allows us to assess data in a similar way to how a human brain would work looking at different layers of information and making connections between them. Most of the published literature that uses machine learning measures emotions in a subject-dependent way, recording a signal from a specific individual and using this to predict their emotion at a later stage.
    “With deep learning we’ve shown we can accurately measure emotions in a subject-independent way, where we can look at a whole collection of signals from different individuals and learn from this data and use it to predict the emotion of people outside of our training database.”
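    As a rough sketch of what subject-independent deep learning on raw physiological time series looks like in code (a generic illustration, not the authors’ network or data; the architecture, tensor shapes, labels and subject split below are all assumptions), a small 1-D convolutional classifier can be trained on segments from some people and tested on people it has never seen:

```python
# Minimal sketch: a 1-D convolutional network that learns features directly
# from raw heart-rate/breathing-style time series and is evaluated
# subject-independently. All shapes, labels and the layout are illustrative.
import torch
import torch.nn as nn

N_SUBJECTS, SEGMENTS_PER_SUBJECT, TIMESTEPS, N_EMOTIONS = 8, 20, 512, 4

# Fake dataset: one signal channel per segment, an emotion label
# (0=anger, 1=sadness, 2=joy, 3=pleasure) and the ID of the person it came from.
x = torch.randn(N_SUBJECTS * SEGMENTS_PER_SUBJECT, 1, TIMESTEPS)
y = torch.randint(0, N_EMOTIONS, (N_SUBJECTS * SEGMENTS_PER_SUBJECT,))
subject = torch.arange(N_SUBJECTS).repeat_interleave(SEGMENTS_PER_SUBJECT)

model = nn.Sequential(                  # learns its own features from raw data
    nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.AdaptiveAvgPool1d(1),
    nn.Flatten(), nn.Linear(32, N_EMOTIONS),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Subject-independent split: hold out two people entirely for testing.
test_mask = subject >= N_SUBJECTS - 2
train_x, train_y = x[~test_mask], y[~test_mask]
test_x, test_y = x[test_mask], y[test_mask]

for epoch in range(5):                  # tiny full-batch training loop on toy data
    opt.zero_grad()
    loss = loss_fn(model(train_x), train_y)
    loss.backward()
    opt.step()

with torch.no_grad():
    acc = (model(test_x).argmax(dim=1) == test_y).float().mean().item()
print(f"accuracy on unseen subjects: {acc:.2f}")   # ~chance level on random data
```

    Evaluating on held-out people, rather than on held-out segments from the same people, is what makes the test subject-independent.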
    Traditionally, emotion detection has relied on the assessment of visible signals such as facial expressions, speech, body gestures or eye movements. However, these methods can be unreliable, as they do not effectively capture an individual’s internal emotions, and researchers are increasingly looking towards ‘invisible’ signals, such as ECG, to understand emotions.

    ECG signals detect electrical activity in the heart, providing a link between the nervous system and heart rhythm. To date the measurement of these signals has largely been performed using sensors that are placed on the body, but recently researchers have been looking towards non-invasive approaches that use radio waves, to detect these signals.
    Methods to detect human emotions are often used by researchers involved in psychological or neuroscientific studies but it is thought that these approaches could also have wider implications for the management of health and wellbeing.
    In the future, the research team plan to work with healthcare professionals and social scientists on public acceptance and ethical concerns around the use of this technology.
    Ahsan Noor Khan, a PhD student at Queen Mary and first author of the study, said: “Being able to detect emotions using wireless systems is a topic of increasing interest for researchers as it offers an alternative to bulky sensors and could be directly applicable in future ‘smart’ home and building environments. In this study, we’ve built on existing work using radio waves to detect emotions and show that the use of deep learning techniques can improve the accuracy of our results.”
    “We’re now looking to investigate how we could use low-cost existing systems, such as WiFi routers, to detect emotions of a large number of people gathered, for instance in an office or work environment. This type of approach would enable us to classify emotions of people on individual basis while performing routine activities. Moreover, we aim to improve the accuracy of emotion detection in a work environment using advanced deep learning techniques.”
    Professor Yang Hao, the project lead, added: “This research opens up many opportunities for practical applications, especially in areas such as human/robot interaction and healthcare and emotional wellbeing, which has become increasingly important during the current Covid-19 pandemic.”

  •

    Einstein’s theory of general relativity unveiled a dynamic and bizarre cosmos

    Albert Einstein’s mind reinvented space and time, foretelling a universe so bizarre and grand that it has challenged the limits of human imagination. An idea born in a Swiss patent office that evolved into a mature theory in Berlin set forth a radical new picture of the cosmos, rooted in a new, deeper understanding of gravity.
    Out was Newton’s idea, which had reigned for nearly two centuries, of masses that appeared to tug on one another. Instead, Einstein presented space and time as a unified fabric distorted by mass and energy. Objects warp the fabric of spacetime like a weight resting on a trampoline, and the fabric’s curvature guides their movements. With this insight, gravity was explained.
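    In modern notation (not spelled out in the article itself), that insight is summarized by Einstein’s field equations, which set the curvature of spacetime on the left equal to its mass and energy content on the right:

    \[ G_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu} \]

    Here \(G_{\mu\nu}\) encodes how spacetime is curved and \(T_{\mu\nu}\) describes the matter and energy doing the curving.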
    Einstein presented his general theory of relativity at the end of 1915 in a series of lectures in Berlin. But it wasn’t until a solar eclipse in 1919 that everyone took notice. His theory predicted that a massive object — say, the sun — could distort spacetime nearby enough to bend light from its straight-line course. Distant stars would thus appear not exactly where expected. Photographs taken during the eclipse verified that the position shift matched Einstein’s prediction. “Lights all askew in the heavens; men of science more or less agog,” declared a New York Times headline.
    Even a decade later, a story in Science News Letter, the predecessor of Science News, wrote of “Riots to understand Einstein theory” (SN: 2/1/30, p. 79). Apparently extra police had to be called in to control a crowd of 4,500 who “broke down iron gates and mauled each other” at the American Museum of Natural History in New York City to hear an explanation of general relativity.

    By 1931, physicist Albert A. Michelson, the first American to win a Nobel Prize in the sciences, called the theory “a revolution in scientific thought unprecedented in the history of science.”
    But for all the powers of divination we credit to Einstein today, he was a reluctant soothsayer. We now know that general relativity offered much more than Einstein was willing or able to see. “It was a profoundly different way of looking at the universe,” says astrophysicist David Spergel of the Simons Foundation’s Flatiron Institute in New York City, “and it had some wild implications that Einstein himself didn’t want to accept.” What’s more, says Spergel (a member of the Honorary Board of the Society for Science, publisher of Science News), “the wildest aspects of general relativity have all turned out to be true.”
    What had been masquerading as a quiet, static, finite place is instead a dynamic, ever-expanding arena filled with its own riot of space-bending beasts. Galaxies congregate in superclusters on scales vastly greater than anything experts had considered before the 20th century. Within those galaxies reside not only stars and planets, but also a zoo of exotic objects illustrating general relativity’s propensity for weirdness, including neutron stars, which pack a fat star’s worth of mass into the size of a city, and black holes, which pervert spacetime so strongly that no light can escape. And when these behemoths collide, they shake spacetime, blasting out ginormous amounts of energy. Our cosmos is violent, evolving and filled with science fiction–like possibilities that actually come straight out of general relativity.
    “General relativity opened up a huge stage of stuff for us to look at and try out and play with,” says astrophysicist Saul Perlmutter of the University of California, Berkeley. He points to the idea that the universe changes dramatically over its lifetime — “the idea of a lifetime of a universe at all is a bizarre concept” — and the idea that the cosmos is expanding, plus the thought that it could collapse and come to an end, and even that there might be other universes. “You get to realize that the world could be much more interesting even than we already ever imagined it could possibly be.”

    General relativity has become the foundation for today’s understanding of the cosmos. But the current picture is far from complete. Plenty of questions remain about mysterious matter and forces, about the beginnings and the end of the universe, about how the science of the big meshes with quantum mechanics, the science of the very small. Some astronomers believe a promising route to answering some of those unknowns is another of general relativity’s initially underappreciated features — the power of bent light to magnify features of the cosmos.
    Today’s scientists continue to poke and prod at general relativity to find clues to what they might be missing. General relativity is now being tested to a level of precision previously impossible, says astrophysicist Priyamvada Natarajan of Yale ​University. “General relativity expanded our cosmic view, then gave us sharper focus on the cosmos, and then turned the tables on it and said, ‘now we can test it much more strongly.’ ” It’s this testing that will perhaps uncover problems with the theory that might point the way to a fuller picture.
    And so, more than a century after general relativity debuted, there’s plenty left to foretell. The universe may turn out to be even wilder yet.
    Ravenous beasts
    Just over a century after Einstein unveiled general relativity, scientists obtained visual confirmation of one of its most impressive beasts. In 2019, a global network of telescopes revealed a mass warping spacetime with such fervor that nothing, not even light, could escape its snare. The Event Horizon Telescope released the first image of a black hole, at the center of galaxy M87 (SN: 4/27/19, p. 6).
    In 2019, the Event Horizon Telescope Collaboration released this first-ever image of a black hole, at the heart of galaxy M87. The image shows the shadow of the monster surrounded by a bright disk of gas. Credit: Event Horizon Telescope Collaboration
    “The power of an image is strong,” says Kazunori Akiyama, an astrophysicist at the MIT Haystack Observatory in Westford, Mass., who led one of the teams that created the image. “I somewhat expected that we might see something exotic,” Akiyama says. But after looking at the first image, “Oh my God,” he recalls thinking, “it’s just perfectly matching with our expectation of general relativity.”
    For a long time, black holes were mere mathematical curiosities. Evidence that they actually reside out in space didn’t start coming in until the second half of the 20th century. It’s a common story in the annals of physics. An oddity in some theorist’s equation points to a previously unknown phenomenon, which kicks off a search for evidence. Once the data are attainable, and if physicists get a little lucky, the search gives way to discovery.
    In the case of black holes, German physicist Karl Schwarzschild came up with a solution to Einstein’s equations near a single spherical mass, such as a planet or a star, in 1916, shortly after Einstein proposed general relativity. Schwarzschild’s math revealed how the curvature of spacetime would differ around stars of the same mass but increasingly smaller sizes — in other words, stars that were more and more compact. Out of the math came a limit to how small a mass could be squeezed. Then in the 1930s, J. Robert Oppenheimer and Hartland Snyder described what would happen if a massive star collapsing under the weight of its own gravity shrank past that critical size — today known as the “Schwarzschild radius” — reaching a point from which its light could never reach us. Still, Einstein — and most others — doubted that what we now call black holes were plausible in reality.
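    That critical size has a simple form, which follows directly from Schwarzschild’s solution:

    \[ r_s = \frac{2GM}{c^{2}} \]

    For an object with the mass of the sun it is about 3 kilometers; squeeze a solar mass inside that radius and its light can never climb back out.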
    The term “black hole” first appeared in print in Science News Letter. It was in a 1964 story by Ann Ewing, who was covering a meeting in Cleveland of the American Association for the Advancement of Science (SN: 1/18/64, p. 39). That’s also about the time that hints in favor of the reality of black holes started coming in.
    Just a few months later, Ewing reported the discovery of quasars — describing them in Science News Letter as “the most distant, brightest, most violent, heaviest and most puzzling sources of light and radio waves” (SN: 8/15/64, p. 106). Though not linked to black holes at the time, quasars hinted at some cosmic powerhouses needed to provide such energy. The use of X-ray astronomy in the 1960s revealed new features of the cosmos, including bright beacons that could come from a black hole scarfing down a companion star. And the motions of stars and gas clouds near the centers of galaxies pointed to something exceedingly dense lurking within.  
    Quasars (one illustrated) are so bright that they can outshine their home galaxies. Though baffling when first discovered, these outbursts are powered by massive, feeding black holes. Credit: Mark Garlick/Science Source
    Black holes stand out among other cosmic beasts for how extreme they are. The largest are many billion times the mass of the sun, and when they rip a star apart, they can spit out particles with 200 trillion electron volts of energy. That’s some 30 times the energy of the protons that race around the world’s largest and most powerful particle accelerator, the Large Hadron Collider.
    As evidence built into the 1990s and up to today, scientists realized these great beasts not only exist, but also help shape the cosmos. “These objects that general relativity predicted, that were mathematical curiosities, became real, then they were marginal. Now they’ve become central,” says Natarajan.
    We now know supermassive black holes reside at the centers of most if not all galaxies, where they generate outflows of energy that affect how and where stars form. “At the center of the galaxy, they define everything,” she says.
    Though visual confirmation is recent, it feels as though black holes have long been familiar. They are a go-to metaphor for any unknowable space, any deep abyss, any endeavor that consumes all our efforts while giving little in return.
    Real black holes, of course, have given plenty back: answers about our cosmos plus new questions to ponder, wonder and entertainment for space fanatics, a lost album from Weezer, numerous episodes of Doctor Who, the Hollywood blockbuster Interstellar.
    For physicist Nicolas Yunes of the University of Illinois at Urbana-Champaign, black holes and other cosmic behemoths continue to amaze. “Just thinking about the dimensions of these objects, how large they are, how heavy they are, how dense they are,” he says, “it’s really breathtaking.”
    Spacetime waves
    When general relativity’s behemoths collide, they disrupt the cosmic fabric. Ripples in spacetime called gravitational waves emanate outward, a calling card of a tumultuous and most energetic tango.
    Einstein’s math predicted such waves could be created, not only by gigantic collisions but also by explosions and other accelerating bodies. But for a long time, spotting any kind of spacetime ripple was a dream beyond measure. Only the most dramatic cosmic doings would create signals that were large enough for direct detection. Einstein, who called the waves Gravitationswellen, was unaware that any such big events existed in the cosmos.
    Gravitational waves ripple away from two black holes that orbit each other before merging (shown in this simulation). The merging black holes created a new black hole that’s much larger than those found in previous collisions. Credit: Deborah Ferguson, Karan Jani, Deirdre Shoemaker and Pablo Laguna/Georgia Tech, Maya Collaboration
    Beginning in the 1950s, when others were still arguing whether gravitational waves existed in reality, physicist Joseph Weber sank his career into trying to detect them. After a decade-plus effort, he claimed detection in 1969, identifying an apparent signal perhaps from a supernova or from a newly discovered type of rapidly spinning star called a pulsar. In the few years after reporting the initial find, Science News published more than a dozen stories on what it began calling the “Weber problem” (SN: 6/21/69, p. 593). Study after study could not confirm the results. What’s more, no sources of the waves could be found. A 1973 headline read, “The deepening doubt about Weber’s waves” (SN: 5/26/73, p. 338).
    Weber stuck by his claim until his death in 2000, but his waves were never verified. Nonetheless, scientists increasingly believed gravitational waves would be found. In 1974, radio astronomers Russell Hulse and Joseph Taylor spotted a neutron star orbiting a dense companion. Over the following years, the neutron star and its companion appeared to be getting closer together by the distance that would be expected if they were losing energy to gravitational waves. Scientists soon spoke not of the Weber problem, but of what equipment could possibly pick up the waves. “Now, although they have not yet seen, physicists believe,” Dietrick E. Thomsen wrote in Science News in 1984 (SN: 8/4/84, p. 76).
    It was a different detection strategy, decades in the making, that would provide the needed sensitivity. The Advanced Laser Interferometer Gravitational-Wave Observatory, or LIGO, which reported the first confirmed gravitational waves in 2016, relies on two detectors, one in Hanford, Wash., and one in Livingston, La. Each detector splits the beam of a powerful laser in two, with each beam traveling down one of the detector’s two arms. In the absence of gravitational waves, the two beams recombine and cancel each other out. But if gravitational waves stretch one arm of the detector while squeezing the other, the laser light no longer matches up.
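    A few lines of arithmetic capture that cancellation (a minimal sketch of the interferometer principle, not LIGO’s instrument model; the 1064-nanometer wavelength is the laser light LIGO uses, and the normalization is illustrative):

```python
# Minimal sketch of the interferometer principle described above: two beams
# recombine, and the output power depends on the difference in arm length.
import numpy as np

wavelength = 1064e-9                       # meters; LIGO's laser wavelength
delta_L = np.linspace(0, wavelength, 5)    # differential arm-length change

# Phase difference between the returning beams and resulting (normalized)
# output power. With delta_L = 0 the beams cancel at the dark port; a wave
# that stretches one arm while squeezing the other breaks the cancellation.
phase = 2 * np.pi * (2 * delta_L) / wavelength   # factor 2: light goes out and back
power_out = np.sin(phase / 2) ** 2

for dL, p in zip(delta_L, power_out):
    print(f"delta_L = {dL:.3e} m -> normalized output power = {p:.3f}")
```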

    The machines are an incredible feat of engineering. Even spacetime ripples detected from colliding black holes might stretch an arm of the LIGO detector by as little as one ten-thousandth of the width of a proton.
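    Taking that figure at face value, and assuming a proton diameter of roughly \(1.7\times10^{-15}\) meters and LIGO’s 4-kilometer arms (numbers supplied here for the arithmetic, not quoted from the article), the corresponding fractional length change, or strain, is on the order of

    \[ h = \frac{\Delta L}{L} \approx \frac{10^{-4} \times 1.7\times10^{-15}\,\text{m}}{4\times10^{3}\,\text{m}} \approx 4\times10^{-23}. \]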
    When the first detection, from two colliding black holes, was announced, the discovery was heralded as the beginning of a new era in astronomy. It was Science News’ story of the year in 2016, and such a big hit that the pioneers of the LIGO detector won the Nobel Prize in physics the following year.
    Scientists with LIGO and another gravitational wave detector, Virgo, based in Italy, have by now logged dozens more detections (SN: 1/30/21, p. 30). Most of the waves have emanated from mergers of black holes, though a few events have featured neutron stars. Smashups so far have revealed the previously unknown birthplaces of some heavy elements and pointed to a bright jet of charged subatomic particles that could offer clues to mysterious flashes of high-energy light known as gamma-ray bursts. The waves also have revealed that midsize black holes, between 100 and 100,000 times the sun’s mass, do in fact exist — along with reconfirming that Einstein was right, at least so far.
    Researchers at two gravitational wave observatories, LIGO in the United States and Virgo in Italy, have reported dozens of detections of black hole smashups, as well as neutron star mergers, in the last five years. Credit: The Virgo Collaboration
    Just five years in, some scientists are already eager for something even more exotic. In a Science News article about detecting black holes orbiting wormholes via gravitational waves, physicist Vítor Cardoso of Instituto Superior Técnico in Lisbon, Portugal, suggested a coming shift to more unusual phenomena: “We need to look for strange but exciting signals,” he said (SN: 8/29/20, p. 12).
    Gravitational wave astronomy is truly only at its beginnings. Improved sensitivity at existing Earth-based detectors will turn up the volume on gravitational waves, allowing detections from less energetic and more distant sources. Future detectors, including the space-based LISA, planned for launch in the 2030s, will get around the troublesome noise that interferes when Earth’s surface shakes.
    “Perhaps the most exciting thing would be to observe a small black hole falling into a big black hole, an extreme mass-ratio inspiral,” Yunes says. In such an event, the small black hole would zoom back and forth, back and forth, swirling in different directions as it followed wildly eccentric orbits, perhaps for years. That could offer the ultimate test of Einstein’s equations, revealing whether we truly understand how spacetime is warped in the extreme.

  •

    Researchers create novel photonic chip

    Researchers at the George Washington University and the University of California, Los Angeles, have developed and demonstrated for the first time a photonic digital-to-analog converter that operates without leaving the optical domain. Such novel converters can advance next-generation data processing hardware with high relevance for data centers, 6G networks, artificial intelligence and more.
    Current optical networks, through which most of the world’s data is transmitted, as well as many sensors, require a digital-to-analog conversion, which links digital systems synergistically to analog components.
    Using a silicon photonic chip platform, Volker J. Sorger, an associate professor of electrical and computer engineering at GW, and his colleagues have created a digital-to-analog converter that does not require the signal to be converted in the electrical domain, thus showing the potential to satisfy the demand for high data-processing capabilities while acting on optical data, interfacing to digital systems, and performing in a compact footprint, with both short signal delay and low power consumption.
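    As background on the function being moved into the optical domain (a generic sketch of what any digital-to-analog converter does, with an assumed bit width and reference level; it says nothing about how the photonic chip itself is built):

```python
# Generic digital-to-analog conversion: map a binary code to a proportional
# analog level. Bit width and reference level are assumptions for illustration.
def dac(bits, v_ref=1.0):
    """Map a list of bits (MSB first) to an analog level between 0 and v_ref."""
    n = len(bits)
    code = sum(b << (n - 1 - i) for i, b in enumerate(bits))
    return v_ref * code / (2 ** n - 1)

print(dac([1, 0, 1, 1]))   # 4-bit code 1011 -> 11/15 of the reference level
```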
    “We found a way to seamlessly bridge the gap that exists between these two worlds, analog and digital,” Sorger said. “This device is a key stepping stone for next-generation data processing hardware.”
    This work was funded by the Air Force Office of Scientific Research (FA9550-19-1-0277) and the Office of Naval Research (N00014-19-1-2595 of the Electronic Warfare Program).

    Story Source:
    Materials provided by George Washington University. Note: Content may be edited for style and length.

  •

    A new hands-off probe uses light to explore electron behavior in a topological insulator

    Topological insulators are one of the most puzzling quantum materials — a class of materials whose electrons cooperate in surprising ways to produce unexpected properties. The edges of a topological insulator, or TI, are electron superhighways where electrons flow with no loss, ignoring any impurities or other obstacles in their path, while the bulk of the material blocks electron flow.
    Scientists have studied these puzzling materials since their discovery just over a decade ago with an eye to harnessing them for things like quantum computing and information processing.
    Now researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have invented a new, hands-off way to probe the fastest and most ephemeral phenomena within a TI and clearly distinguish what its electrons are doing on the superhighway edges from what they’re doing everywhere else.
    The technique takes advantage of a phenomenon called high harmonic generation, or HHG, which shifts laser light to higher energies and higher frequencies — much like pressing a guitar string produces a higher note — by shining it through a material. By varying the polarization of laser light going into a TI and analyzing the shifted light coming out, researchers got strong and separate signals that told them what was happening in each of the material’s two contrasting domains.
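    A toy numerical model makes the frequency-shifting idea concrete (a generic nonlinear-response illustration with assumed coefficients, not the topological-insulator physics of the study): drive a medium whose response includes cubic and quintic terms at a single frequency, and the output spectrum picks up peaks at odd multiples of that frequency.

```python
# Toy sketch of high harmonic generation: a strong drive at one frequency,
# passed through an odd-symmetric nonlinear response, produces output at
# odd multiples of the drive frequency. Coefficients are made up.
import numpy as np

f_drive = 1.0                            # drive frequency (arbitrary units)
t = np.linspace(0, 20, 4096, endpoint=False)
drive = np.cos(2 * np.pi * f_drive * t)

# Nonlinear response of the medium to the strong field (assumed form).
response = drive + 0.3 * drive**3 + 0.1 * drive**5

spectrum = np.abs(np.fft.rfft(response))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])

# Peaks appear at 1x, 3x and 5x the drive frequency.
for harmonic in (1, 3, 5):
    idx = np.argmin(np.abs(freqs - harmonic * f_drive))
    print(f"harmonic {harmonic}: amplitude {spectrum[idx]:.1f}")
```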
    “What we found out is that the light coming out gives us information about the properties of the superhighway surfaces,” said Shambhu Ghimire, a principal investigator with the Stanford PULSE Institute at SLAC, where the work was carried out. “This signal is quite remarkable, and its dependence on the polarization of the laser light is dramatically different from what we see in conventional materials. We think we have a potentially novel approach for initiating and probing quantum behaviors that are supposed to be present in a broad range of quantum materials.”
    The research team reported the results in Physical Review A today.

    Light in, light out
    Starting in 2010, a series of experiments led by Ghimire and PULSE Director David Reis showed HHG can be produced in ways that were previously thought unlikely or even impossible: by beaming laser light into a crystal, a frozen argon gas or an atomically thin semiconductor material. Another study described how to use HHG to generate attosecond laser pulses, which can be used to observe and control the movements of electrons, by shining a laser through ordinary glass.
    In 2018, Denitsa Baykusheva, a Swiss National Science Foundation Fellow with a background in HHG research, joined the PULSE group as a postdoctoral researcher. Her goal was to study the potential for generating HHG in topological insulators — the first such study in a quantum material. “We wanted to see what happens to the intense laser pulse used to generate HHG,” she said. “No one had actually focused such a strong laser light on these materials before.”
    But midway through those experiments, the COVID-19 pandemic hit and the lab shut down in March 2020 for all but essential research. So the team had to think of other ways to make progress, Baykusheva said.
    “In a new area of research like this one, theory and experiment have to go hand in hand,” she explained. “Theory is essential for explaining experimental results and also predicting the most promising avenues for future experiments. So we all turned ourselves into theorists” — first working with pen and paper and then writing code and doing calculations to feed into computer models.

    An illuminating result
    To their surprise, the results predicted that circularly polarized laser light, whose waves spiral around the beam like a corkscrew, could be used to trigger HHG in topological insulators.
    “One of the interesting things we observed is that circularly polarized laser light is very efficient at generating harmonics from the superhighway surfaces of the topological insulator, but not from the rest of it,” Baykusheva said. “This is something very unique and specific to this type of material. It can be used to get information about electrons that travel the superhighways and those that don’t, and it can also be used to explore other types of materials that can’t be probed with linearly polarized light.”
    The results lay out a recipe for continuing to explore HHG in quantum materials, said Reis, who is a co-author of the study.
    “It’s remarkable that a technique that generates strong and potentially disruptive fields, which takes electrons in the material and jostles them around and uses them to probe the properties of the material itself, can give you such a clear and robust signal about the material’s topological states,” he said.
    “The fact that we can see anything at all is amazing, not to mention the fact that we could potentially use that same light to change the material’s topological properties.”
    Experiments at SLAC have resumed on a limited basis, Reis added, and the results of the theoretical work have given the team new confidence that they know exactly what they are looking for.
    Researchers from the Max Planck POSTECH/KOREA Research Initiative also contributed to this report. Major funding for the study came from the DOE Office of Science and the Swiss National Science Foundation.

  •

    Highly deformable piezoelectric nanotruss for tactile electronics

    With the importance of non-contact environments growing due to COVID-19, tactile electronic devices using haptic technology are gaining traction as new mediums of communication.
    Haptic technology is being applied in a wide array of fields, such as robotics and interactive displays; haptic gloves, for example, are being used for augmented information and communication technology. Efficient piezoelectric materials that can convert various mechanical stimuli into electrical signals and vice versa are a prerequisite for advancing high-performing haptic technology.
    A research team led by Professor Seungbum Hong confirmed the potential of tactile devices by developing ceramic piezoelectric materials that are three times more deformable than their bulk counterparts. For the fabrication of highly deformable nanomaterials, the research team built a zinc oxide hollow nanostructure using proximity-field nanopatterning and atomic layer deposition. The piezoelectric coefficient was measured to be approximately 9.2 pm/V, and the nanopillar compression test showed an elastic strain limit of approximately 10%, which is more than three times greater than that of bulk zinc oxide.
    Piezoelectric ceramics have a high piezoelectric coefficient with a low elastic strain limit, whereas the opposite is true for piezoelectric polymers. Therefore, it has been very challenging to obtain good performance in both high piezoelectric coefficients as well as high elastic strain limits. To break the elastic limit of piezoelectric ceramics, the research team introduced a 3D truss-like hollow nanostructure with nanometer-scale thin walls.
    According to the Griffith criterion, the fracture strength of a material is inversely proportional to the square root of the preexisting flaw size. However, a large flaw is less likely to occur in a small structure, which, in turn, enhances the strength of the material. Therefore, implementing the form of a 3D truss-like hollow nanostructure with nanometer-scale thin walls can extend the elastic limit of the material. Furthermore, a monolithic 3D structure can withstand large strains in all directions while simultaneously preventing the loss from the bottleneck. Previously, the fracture property of piezoelectric ceramic materials was difficult to control, owing to the large variance in crack sizes. However, the research team structurally limited the crack sizes to manage the fracture properties.
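    Written in its standard fracture-mechanics form (textbook notation, not symbols taken from the paper), that statement reads

    \[ \sigma_f = \frac{K_{Ic}}{Y\sqrt{\pi a}}, \]

    where \(\sigma_f\) is the fracture strength, \(K_{Ic}\) the fracture toughness, \(Y\) a geometry factor and \(a\) the size of the largest preexisting flaw; keeping \(a\) at the nanometer scale therefore raises the strength, and with it the elastic strain, that the thin walls can sustain.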
    Professor Hong’s results demonstrate the potential for the development of highly deformable ceramic piezoelectric materials by improving the elastic limit using a 3D hollow nanostructure. Since zinc oxide has a relatively low piezoelectric coefficient compared to other piezoelectric ceramic materials, applying the proposed structure to those materials promises better results in terms of piezoelectric activity.
    “With the advent of the non-contact era, the importance of emotional communication is increasing. Through the development of novel tactile interaction technologies, in addition to the current visual and auditory communication, humankind will enter a new era where they can communicate with anyone using all five senses regardless of location as if they are with them in person,” Professor Hong said.
    “While additional research must be conducted to realize the application of the proposed designs for haptic enhancement devices, this study holds high value in that it resolves one of the most challenging issues in the use of piezoelectric ceramics, specifically opening new possibilities for their application by overcoming their mechanical constraints.”

  •

    Beyond qubits: Next big step to scale up quantum computing

    Scientists and engineers at the University of Sydney and Microsoft Corporation have opened the next chapter in quantum technology with the invention of a single chip that can generate control signals for thousands of qubits, the building blocks of quantum computers.
    “To realise the potential of quantum computing, machines will need to operate thousands if not millions of qubits,” said Professor David Reilly, a designer of the chip who holds a joint position with Microsoft and the University of Sydney.
    “The world’s biggest quantum computers currently operate with just 50 or so qubits,” he said. “This small scale is partly because of limits to the physical architecture that control the qubits.”
    “Our new chip puts an end to those limits.”
    The results have been published in Nature Electronics.
    Most quantum systems require quantum bits, or qubits, to operate at temperatures close to absolute zero (-273.15 degrees Celsius). This is to prevent them losing their ‘quantumness’, the character of matter or light that quantum computers need to perform their specialised computations.

    In order for quantum devices to do anything useful, they need instructions. That means sending and receiving electronic signals to and from the qubits. With current quantum architecture, that involves a lot of wires.
    “Current machines create a beautiful array of wires to control the signals; they look like an inverted gilded birds’ nest or chandelier. They’re pretty, but fundamentally impractical. It means we can’t scale the machines up to perform useful calculations. There is a real input-output bottleneck,” said Professor Reilly, also a Chief Investigator at the ARC Centre for Engineered Quantum Systems (EQUS).
    Microsoft Senior Hardware Engineer, Dr Kushal Das, a joint inventor of the chip, said: “Our device does away with all those cables. With just two wires carrying information as input, it can generate control signals for thousands of qubits.
    “This changes everything for quantum computing.”
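    As a generic illustration of how very few wires can carry settings for thousands of channels (a sketch of serial addressing in the abstract, with made-up frame widths; it is not the architecture of the Sydney/Microsoft chip), a stream of fixed-width frames on a single data line can be unpacked into one control register per qubit channel:

```python
# Generic serial-addressing sketch (an assumption, not the chip's design):
# each fixed-width frame on the serial line holds a channel address and a
# control value, and is latched into a register for that channel.
ADDR_BITS, VALUE_BITS = 12, 8            # 12 address bits -> up to 4096 channels

def decode_frames(bitstream):
    """Split a serial bitstream into {channel: value} settings."""
    frame = ADDR_BITS + VALUE_BITS
    registers = {}
    for i in range(0, len(bitstream) - frame + 1, frame):
        word = bitstream[i:i + frame]
        channel = int(word[:ADDR_BITS], 2)
        value = int(word[ADDR_BITS:], 2)
        registers[channel] = value       # latch the new setting for that channel
    return registers

# Two frames on the serial line: set channel 5 to 200 and channel 1023 to 17.
stream = f"{5:012b}{200:08b}" + f"{1023:012b}{17:08b}"
print(decode_frames(stream))             # {5: 200, 1023: 17}
```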
    The control chip was developed at the Microsoft Quantum Laboratories at the University of Sydney, a unique industry-academic partnership that is changing the way scientists tackle engineering challenges.

    “Building a quantum computer is perhaps the most challenging engineering task of the 21st century. This can’t be achieved working with a small team in a university laboratory in a single country but needs the scale afforded by a global tech giant like Microsoft,” Professor Reilly said.
    “Through our partnership with Microsoft, we haven’t just suggested a theoretical architecture to overcome the input-output bottleneck, we’ve built it.
    “We have demonstrated this by designing a custom silicon chip and coupling it to a quantum system,” he said. “I’m confident to say this is the most advanced integrated circuit ever built to operate at deep cryogenic temperatures.”
    If realised, quantum computers promise to revolutionise information technology by solving problems beyond the scope of classical computers in fields as diverse as cryptography, medicine, finance, artificial intelligence and logistics.
    POWER BUDGET
    Quantum computers are at a similar stage that classical computers were in the 1940s. Machines like ENIAC, the world’s first electronic computer, required rooms of control systems to achieve any useful function.
    It has taken decades to overcome the scientific and engineering challenges that now allow billions of transistors to fit into your mobile phone.
    “Our industry is facing perhaps even bigger challenges to take quantum computing beyond the ENIAC stage,” Professor Reilly said.
    “We need to engineer highly complex silicon chips that operate at 0.1 Kelvin,” he said. “That’s an environment 30 times colder than deep space.”
    Dr Sebastian Pauka’s doctoral research at the University of Sydney encompassed much of the work to interface quantum devices with the chip. He said: “Operating at such cold temperatures means we have an incredibly low power budget. If we try to put more power into the system, we overheat the whole thing.”
    In order to achieve their result, the scientists at Sydney and Microsoft built the most advanced integrated circuit to operate at cryogenic temperatures.
    “We have done this by engineering a system that operates in close proximity to the qubits without disturbing their operations,” Professor Reilly said.
    “Current control systems for qubits are removed metres away from the action, so to speak. They exist mostly at room temperature.
    “In our system we don’t have to come off the cryogenic platform. The chip is right there with the qubits. This means lower power and higher speeds. It’s a real control system for quantum technology.”
    YEARS OF ENGINEERING
    “Working out how to control these devices takes years of engineering development,” Professor Reilly said. “For this device we started four years ago when the University of Sydney started its partnership with Microsoft, which represents the single biggest investment in quantum technology in Australia.
    “We built lots of models and design libraries to capture the behaviour of transistors at deep cryogenic temperatures. Then we had to build devices, get them verified, characterised and finally connect them to qubits to see them work in practice.”
    Vice-Chancellor and Principal of the University of Sydney, Professor Stephen Garton, said: “The whole university community is proud of Professor Reilly’s success and we look forward to many years of continued partnership with Microsoft.”
    Professor Reilly said the field has now fundamentally changed. “It’s not just about ‘here is my qubit’. It’s about how you build all the layers and all the tech to build a real machine.
    “Our partnership with Microsoft allows us to work with academic rigour, with the benefit of seeing our results quickly put into practice.”
    The Deputy Vice-Chancellor (Research), Professor Duncan Ivison, said: “Our partnership with Microsoft has been about realising David Reilly’s inspired vision to enable quantum technology. It’s great to see that vision becoming a reality.”
    Professor Reilly said: “If we had remained solely in academia this chip would never have been built.”
    The Australian scientist said he isn’t stopping there.
    “We are just getting started on this new wave of quantum innovation,” he said. “The great thing about the partnership is we don’t just publish a paper and move on. We can now continue with the blueprint to realise quantum technology at the industrial scale.”