More stories

  • Single-photon source paves the way for practical quantum encryption

    Researchers have developed a new high-purity single-photon source that can operate at room temperature. The source is an important step toward practical applications of quantum technology, such as highly secure communication based on quantum key distribution (QKD).
    “We developed an on-demand way to generate photons with high purity in a scalable and portable system that operates at room temperature,” said Helen Zeng, a member of the research team from the University of Technology Sydney in Australia. “Our single-photon source could advance the development of practical QKD systems and can be integrated into a variety of real-world quantum photonic applications.”
    In the Optica Publishing Group journal Optics Letters, Zeng and colleagues from Australia’s University of New South Wales and Macquarie University describe their new single-photon source and show that it can produce over ten million single photons per second at room temperature. They also incorporated the single-photon source into a fully portable device that can perform QKD.
    The new single-photon source uniquely combines a 2D material called hexagonal boron nitride with an optical component known as a hemispherical solid immersion lens, which increases the source’s efficiency by a factor of six.
    Single photons at room temperature
    QKD offers encryption for data communication that is, in principle, unbreakable, because it uses the quantum properties of light to generate secure random keys for encrypting and decrypting data. QKD systems require robust and bright sources that emit light as a string of single photons. However, most of today’s single-photon sources don’t perform well unless operated at cryogenic temperatures hundreds of degrees below zero, which limits their practicality.
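    A minimal sketch of the key-generation step, in the style of BB84 basis sifting (the protocol family most QKD systems build on), is shown below. This is a classical toy simulation only; the photon count and function names are illustrative, not from the paper.
    ```python
    import secrets

    def random_bits(n):
        """Return n uniformly random bits as a list of 0/1 ints."""
        return [secrets.randbits(1) for _ in range(n)]

    def bb84_sift(n_photons=32):
        """Toy BB84 basis sifting: keep only the bits for which Alice's
        and Bob's randomly chosen measurement bases happen to agree."""
        alice_bits = random_bits(n_photons)    # raw key material
        alice_bases = random_bits(n_photons)   # 0 = rectilinear, 1 = diagonal
        bob_bases = random_bits(n_photons)     # Bob guesses independently
        # When the bases match, Bob's measurement reproduces Alice's bit;
        # mismatched bases give a random outcome and are discarded.
        return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
                if a == b]

    key = bb84_sift()
    print(f"sifted key ({len(key)} bits):", "".join(map(str, key)))
    ```
    On average about half the transmitted photons survive sifting, which is one reason source brightness (here, over ten million photons per second) matters for practical key rates.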

  • Quantum sensors: Measuring even more precisely

    Atomic clocks are the best sensors humankind has ever built. Today, they can be found in national standards institutes or satellites of navigation systems. Scientists all over the world are working to further optimize the precision of these clocks. Now, a research group led by Peter Zoller, a theorist from Innsbruck, Austria, has developed a new concept that can be used to operate sensors with even greater precision irrespective of which technical platform is used to make the sensor. “We answer the question of how precise a sensor can be with existing control capabilities, and give a recipe for how this can be achieved,” explain Denis Vasilyev and Raphael Kaubrügger from Peter Zoller’s group at the Institute of Quantum Optics and Quantum Information at the Austrian Academy of Sciences in Innsbruck.
    For this purpose, the physicists use a method from quantum information processing: variational quantum algorithms describe a circuit of quantum gates that depends on free parameters. Through optimization routines, the sensor autonomously finds the best settings for an optimal result. “We applied this technique to a problem from metrology — the science of measurement,” Vasilyev and Kaubrügger explain. “This is exciting because historically advances in atomic physics were motivated by metrology, and in turn quantum information processing emerged from that. So, we’ve come full circle here,” Peter Zoller enthuses. With the new approach, scientists can optimize quantum sensors until they reach the best precision that is technically achievable.
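    As a rough illustration of the variational idea (not the group’s actual algorithm), one can picture a measurement whose figure of merit depends on a free circuit parameter, with a classical outer loop searching for the best setting. All names and values below are invented for the sketch.
    ```python
    import numpy as np

    def phase_variance(theta, n_atoms=100):
        """Toy cost function: shot-noise-limited variance of a Ramsey-style
        phase estimate whose signal contrast is set by a free angle theta."""
        contrast = np.sin(theta) ** 2
        return 1.0 / (n_atoms * (contrast + 1e-6))

    # Classical optimization loop: scan the free parameter and keep the best,
    # standing in for the feedback optimizers used on real hardware.
    thetas = np.linspace(0.01, np.pi - 0.01, 500)
    best = min(thetas, key=phase_variance)
    print(f"best theta ≈ {best:.3f} rad, variance ≈ {phase_variance(best):.2e}")
    ```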
    Better measurements with little extra effort
    For some time, it has been understood that atomic clocks could run even more accurately by exploiting quantum mechanical entanglement. However, there has been a lack of methods to realize robust entanglement for such applications. The Innsbruck physicists are now using tailor-made entanglement that is precisely tuned to real-world requirements. With their method, they generate exactly the combination consisting of quantum state and measurements that is optimal for each individual quantum sensor. This allows the precision of the sensor to be brought close to the optimum possible according to the laws of nature, with only a slight increase in overhead. “In the development of quantum computers, we have learned to create tailored entangled states,” says Christian Marciniak from the Department of Experimental Physics at the University of Innsbruck. “We are now using this knowledge to build better sensors.”
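    For context, the gain entanglement can buy is bounded by two textbook limits of quantum metrology (standard results, not numbers from the paper): with N uncorrelated atoms, the phase uncertainty of a Ramsey measurement scales at the standard quantum limit, while suitably entangled states can approach the Heisenberg limit:
    ```latex
    \Delta\phi_{\mathrm{SQL}} \propto \frac{1}{\sqrt{N}},
    \qquad
    \Delta\phi_{\mathrm{HL}} \propto \frac{1}{N}
    ```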
    Demonstrating quantum advantage with sensors
    This theoretical concept has now been implemented in practice for the first time at the University of Innsbruck, as the research group led by Thomas Monz and Rainer Blatt reports in Nature. The physicists performed frequency measurements based on variational quantum calculations on their ion-trap quantum computer. Because the interactions used in linear ion traps are still relatively easy to simulate on classical computers, the theory colleagues were able to check the necessary parameters on a supercomputer at the University of Innsbruck. Although the experimental setup is by no means perfect, the results agree surprisingly well with the theoretically predicted values. Since such simulations are not feasible for all sensors, the scientists demonstrated a second approach: they used methods to automatically optimize the parameters without prior knowledge. “Similar to machine learning, the programmable quantum computer finds its optimal mode autonomously as a high-precision sensor,” says experimental physicist Thomas Feldker, describing the underlying mechanism.
    “Our concept makes it possible to demonstrate the advantage of quantum technologies over classical computers on a problem of practical relevance,” emphasizes Peter Zoller. “We have demonstrated a crucial component of quantum-enhanced atomic clocks with our variational Ramsey interferometry. Running this in a dedicated atomic clock is the next step. What has so far only been shown for calculations of questionable practical relevance could now be demonstrated with a programmable quantum sensor in the near future — quantum advantage.”
    The research was financially supported by the Austrian Science Fund FWF, the Research Promotion Agency FFG, the European Union within the framework of the Quantum Flagship and the Federation of Austrian Industries Tyrol, among others.
    Story Source: Materials provided by University of Innsbruck.

  • Don’t underestimate undulating graphene

    Lay some graphene down on a wavy surface, and you’ll get a guide to one possible future of two-dimensional electronics.
    Rice University scientists put forth the idea that growing atom-thick graphene on a gently textured surface creates peaks and valleys in the sheets that turn them into “pseudo-electromagnetic” devices.
    These peaks and valleys create their own minute but detectable magnetic fields. According to a study by materials theorist Boris Yakobson, alumnus Henry Yu and research scientist Alex Kutana at Rice’s George R. Brown School of Engineering, these could facilitate nanoscale optical devices like converging lenses or collimators.
    Their study appears in the American Chemical Society’s Nano Letters.
    They also promise a way to achieve a Hall effect — a voltage difference across the strongly conducting graphene — that could facilitate valleytronics applications that manipulate how electrons are trapped in “valleys” in an electronic band structure.
    Valleytronics is related to spintronics, in which a device’s memory bits are defined by an electron’s quantum spin state. In valleytronics, by contrast, electrons have degrees of freedom in the multiple momentum states (or valleys) they occupy. These can also be read as bits.
    This is all possible because graphene, while it may be one of the strongest known structures, is pliable enough to conform to a surface as it adheres during chemical vapor deposition.
    “Substrate sculpting imparts deformation, which in turn alters the material electronic structure and changes its optical response or electric conductivity,” said Yu, now a postdoctoral researcher at Lawrence Livermore National Laboratory. “For sharper substrate features beyond the pliability of the material, one can engineer defect placements in the materials, which creates even more drastic changes in material properties.”
    Yakobson compared the process to depositing a sheet of graphene on an egg crate. The bumps in the crate deform the graphene, stressing it in a way that creates an electromagnetic field even without electrical or magnetic input.
    “The endless designs of substrate shapes allow for countless optical devices that can be created, making possible 2D electron optics,” Yakobson said. “This technology is a precise and efficient way of transmitting material carriers in 2D electronic devices, compared to traditional methods.”
    Yakobson is the Karl F. Hasselmann Professor of Materials Science and NanoEngineering and a professor of chemistry.
    The Office of Naval Research (N00014-18-1-2182) and the Army Research Office (W911NF-16-1-0255) supported the research.
    Story Source: Materials provided by Rice University. Original written by Mike Williams.

  • Artificial intelligence tool may help predict heart attacks

    Investigators from Cedars-Sinai have created an artificial intelligence-enabled tool that may make it easier to predict if a person will have a heart attack.
    The tool, described in The Lancet Digital Health, accurately predicted which patients would experience a heart attack within five years based on the amount and composition of plaque in arteries that supply blood to the heart.
    Plaque buildup can cause arteries to narrow, which makes it difficult for blood to get to the heart, increasing the likelihood of a heart attack. A medical test called a coronary computed tomography angiography (CTA) takes 3D images of the heart and arteries and can give doctors an estimate of how much a patient’s arteries have narrowed. Until now, however, there has not been a simple, automated and rapid way to measure the plaque visible in the CTA images.
    “Coronary plaque is often not measured because there is not a fully automated way to do it,” said Damini Dey, PhD, director of the quantitative image analysis lab in the Biomedical Imaging Research Institute at Cedars-Sinai and senior author of the study. “When it is measured, it takes an expert at least 25 to 30 minutes, but now we can use this program to quantify plaque from CTA images in five to six seconds.”
    Dey and colleagues analyzed CTA images from 1,196 people who underwent a coronary CTA at 11 sites in Australia, Germany, Japan, Scotland and the United States. The investigators trained the AI algorithm to measure plaque by having it learn from coronary CTA images from 921 people that had already been analyzed by trained doctors.
    The algorithm works by first outlining the coronary arteries in 3D images, then identifying the blood and plaque deposits within the coronary arteries. Investigators found the tool’s measurements corresponded with plaque amounts seen in coronary CTAs. They also matched results with images taken by two invasive tests considered to be highly accurate in assessing coronary artery plaque and narrowing: intravascular ultrasound and catheter-based coronary angiography.
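    A minimal sketch of what such a two-stage pipeline looks like in code is given below. It is purely illustrative: the thresholds and function names are hypothetical stand-ins, and a real tool like the one described would use trained 3D neural networks rather than fixed intensity cutoffs.
    ```python
    import numpy as np

    def segment_arteries(cta_volume):
        """Stage 1 (stand-in): mark bright, vessel-like voxels."""
        return cta_volume > 300  # hypothetical Hounsfield-unit cutoff

    def plaque_volume_ml(cta_volume, artery_mask):
        """Stage 2 (stand-in): split artery voxels into contrast-filled
        lumen vs. plaque by intensity, then report plaque volume."""
        lumen = artery_mask & (cta_volume > 400)  # hypothetical threshold
        plaque = artery_mask & ~lumen
        voxel_ml = (0.5 ** 3) / 1000.0            # 0.5 mm isotropic voxels
        return plaque.sum() * voxel_ml

    scan = np.random.default_rng(0).normal(150, 120, (64, 64, 64))  # fake CTA
    mask = segment_arteries(scan)
    print(f"estimated plaque volume: {plaque_volume_ml(scan, mask):.2f} mL")
    ```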
    Finally, the investigators discovered that measurements made by the AI algorithm from CTA images accurately predicted heart attack risk within five years for 1,611 people who were part of a multicenter trial called the SCOT-HEART trial.
    “More studies are needed, but it’s possible we may be able to predict if and how soon a person is likely to have a heart attack based on the amount and composition of the plaque imaged with this standard test,” said Dey, who is also professor of Biomedical Sciences at Cedars-Sinai.
    Dey and colleagues are continuing to study how well their AI algorithm quantifies plaque deposits in patients who undergo coronary CTA.
    Funding: The study was funded by the National Heart, Lung, and Blood Institute under award number 1R01HL148787-01A1.
    Story Source: Materials provided by Cedars-Sinai Medical Center.

  • Design tweak helps prevent malfunction in yarns designed to store energy

    In a new study, North Carolina State University researchers found a way to prevent electrical malfunctions in yarns designed to store electrical energy. Ultimately, the findings could help advance the development of “smart textiles” that would capture energy from the wearer’s movements and power sensors and wearable electronics.
    The researchers reported in npj Flexible Electronics that they were able to prevent short-circuiting in yarns that act as supercapacitors — which are electrical devices that store energy — by wrapping the yarns with an insulating thread. They also tested the strength and durability of the yarns to make sure they could still work after going through knitting and weaving processes.
    “A supercapacitor functions like a battery, but in this case, we’re working on a flexible battery shaped as a textile yarn that you could weave or knit into your T-shirt or sweater,” said Wei Gao, associate professor of textile engineering, chemistry and science and a University Faculty Scholar at NC State. “In this study, we have woven this yarn into a piece of fabric so that it can store electrical energy, and eventually we want to use it to power whatever electronic devices you need, whether it be a sensor, a light or even a cell phone.”
    While research into these so-called “yarn-shaped supercapacitors” is promising, researchers say developers face a consistent problem with their design: the yarn-shaped supercapacitors are more likely to short-circuit as their length increases. Short-circuiting occurs when electric current flows through an unintended path. It is a safety concern because a short circuit can result in a burst of heat energy or even a fire.
    “Everybody is trying to make smart electronics that can be incorporated into cloth or fabric,” Gao said. “What we found is if you try to make a supercapacitor yarn longer than 8 inches, it’s pretty easy for this device to short-circuit. It’s pretty dangerous, and it’s something nobody wants to encounter when wearing a smart suit.”
    To solve that problem, the researchers tested what would happen when they wrapped the supercapacitor yarn electrodes with insulating threads. The idea was that the threads would act as a physical barrier, keeping the opposite electrodes from contacting each other and preventing short-circuiting. They tested their device’s performance by connecting the electrodes to a power source and recording the device’s current response. They also tested how well the yarns were able to hold a charge. They found that the yarns kept 90% of the initial energy after charging and discharging them 10,000 times.
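    The 90% figure is a retention ratio: the remaining capacitance (and hence stored energy at a fixed voltage) divided by its initial value after cycling. A toy computation with invented numbers:
    ```python
    import numpy as np

    # Hypothetical capacitance sampled every 1,000 charge/discharge cycles,
    # decaying toward ~90% of its initial value as in the reported test.
    cycles = np.arange(0, 10_001, 1_000)
    capacitance = 0.90 + 0.10 * np.exp(-cycles / 3_000)  # farads, toy data

    retention = capacitance[-1] / capacitance[0] * 100
    print(f"retention after {cycles[-1]:,} cycles: {retention:.1f}%")
    ```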

  • Social media data could help predict the next COVID surge

    In the summer of 2021, as the third wave of the COVID-19 pandemic wore on in the United States, infectious disease forecasters began to call attention to a disturbing trend.
    The previous January, as models warned that U.S. infections would continue to rise, cases plummeted instead. In July, as forecasts predicted infections would flatten, the Delta variant soared, leaving public health agencies scrambling to reinstate mask mandates and social distancing measures.
    “Existing forecast models generally did not predict the big surges and peaks,” said geospatial data scientist Morteza Karimzadeh, an assistant professor of geography at CU Boulder. “They failed when we needed them most.”
    New research from Karimzadeh and his colleagues suggests that a new approach, using artificial intelligence and vast, anonymized datasets from Facebook, could not only yield more accurate COVID-19 forecasts, but also revolutionize the way we track other infectious diseases, including the flu.
    Their findings, published in the International Journal of Data Science and Analytics, conclude that this short-term forecasting method significantly outperforms conventional models for projecting COVID trends at the county level.
    Karimzadeh’s team is now one of about a dozen, including those from Columbia University and the Massachusetts Institute of Technology (MIT), submitting weekly projections to the COVID-19 Forecast Hub, a repository that aggregates the best data possible to create an “ensemble forecast” for the Centers for Disease Control. Their forecasts generally rank in the top two for accuracy each week.
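    An “ensemble forecast” of the kind the hub produces can be sketched as a weighted combination of member models; the model names, weights and case counts below are invented for illustration.
    ```python
    import numpy as np

    # Hypothetical one-week-ahead case forecasts from three member models
    # for three counties.
    forecasts = {
        "model_a": np.array([120, 340, 95]),
        "model_b": np.array([140, 310, 80]),
        "model_c": np.array([110, 355, 100]),
    }
    # Weight each member by its (hypothetical) recent accuracy score.
    weights = {"model_a": 0.5, "model_b": 0.3, "model_c": 0.2}

    ensemble = sum(w * forecasts[name] for name, w in weights.items())
    print("ensemble forecast per county:", np.round(ensemble).astype(int))
    ```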

  • Tiny, cheap solution for quantum-secure encryption

    It’s fairly reasonable to assume that an encrypted email can’t be seen by prying eyes. That’s because in order to break through most of the encryption systems we use on a day-to-day basis, unless you are the intended recipient, you’d need the answer to a mathematical problem that’s nearly impossible for a computer to solve in a reasonable amount of time.
    Nearly impossible for modern-day computers, at least.
    “If quantum computing becomes a reality, however, some of those problems are not hard anymore,” said Shantanu Chakrabartty, the Clifford W. Murphy Professor and vice dean for research and graduate education in the Preston M. Green Department of Electrical & Systems Engineering at the McKelvey School of Engineering.
    Already these new computing paradigms are becoming a reality and could soon be deployable. Hackers are already preparing by storing encrypted transactions now with the expectation they can decipher the information later.
    Chakrabartty’s lab at Washington University in St. Louis proposes a security system that is not only resistant to quantum attacks, but is also inexpensive, more convenient, and scalable without the need for fancy new equipment.
    This research will appear in IEEE Transactions on Information Forensics and Security.
    Security is often managed today by key distribution systems in which one person sends information hidden behind a key, maybe a long string of seemingly unassociated numbers. The receiver of that information can access it if they possess another specific key. The two keys are related in a mathematical way that is nearly impossible to guess, but the relationship can be solved easily with the right algorithm or a quantum computer.
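    The textbook example of such a mathematically related key pair is Diffie–Hellman key exchange, sketched below with a deliberately small modulus. Recovering a secret exponent from the public values is the discrete-logarithm problem: infeasible for classical computers at realistic key sizes, but efficiently solvable by Shor’s algorithm on a large quantum computer.
    ```python
    import secrets

    # Toy Diffie-Hellman; real deployments use groups of 2048 bits or more.
    P = 4294967291          # small prime modulus (illustrative only)
    G = 5                   # public generator

    alice_secret = secrets.randbelow(P - 2) + 1
    bob_secret = secrets.randbelow(P - 2) + 1

    alice_public = pow(G, alice_secret, P)   # published openly
    bob_public = pow(G, bob_secret, P)

    # Both sides derive the same shared key; an eavesdropper would have to
    # recover a secret exponent from the public values (discrete log).
    assert pow(bob_public, alice_secret, P) == pow(alice_public, bob_secret, P)
    print("shared key:", pow(bob_public, alice_secret, P))
    ```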

  • Tackling large data sets and many parameter problems in particle physics

    One of the major challenges in particle physics is how to interpret large data sets that consist of many different observables in the context of models with different parameters.
    A new paper published in EPJ Plus, authored by Ursula Laa from the Institute of Statistics at BOKU University, Vienna, and German Valencia from the School of Physics and Astronomy, Monash University, Clayton, Australia, looks at simplifying large-data-set, many-parameter problems using tools that split large parameter spaces into a small number of regions.
    “We applied our tools to the so-called B-anomaly problem. In this problem there is a large number of experimental results and a theory that predicts them in terms of several parameters,” Laa says. “The problem has received much attention because the preferred parameters to explain the observations do not correspond to those predicted by the standard model of particle physics, and as such the results would imply new physics.”
    Valencia continues by explaining the paper shows how the Pandemonium tool can provide an interactive graphical way to study the connections between characteristics in the observations and regions of parameter space.
    “In the B-anomaly problem, for example, we can clearly visualise the tension between two important observables that have been singled out in the past,” Valencia says. “We can also see which improved measurements would be best to address that tension.
    “This can be most helpful in prioritising future experiments to address unresolved questions.”
    Laa elaborates by explaining that the methods developed and used by the duo are applicable to many other problems, in particular for models and observables that are less well understood than the applications discussed in the paper, such as multi-Higgs models.
    “A challenge is the visualization of multidimensional parameter spaces; the current interface only allows the user to visualise high-dimensional data spaces interactively,” Laa concludes. “The challenge is to automate this, which will be addressed in future work, using techniques from dimension reduction.”
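    Dimension reduction of the sort Laa mentions can be illustrated with a generic technique such as principal component analysis, projecting a many-parameter space down to two coordinates for plotting. This is a generic NumPy sketch, not the Pandemonium implementation.
    ```python
    import numpy as np

    # Hypothetical sample of model points in a 6-parameter space.
    rng = np.random.default_rng(1)
    points = rng.normal(size=(500, 6))

    # Plain PCA via the SVD of the centered data matrix.
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    projected = centered @ vt[:2].T   # 2D coordinates suitable for plotting

    print("projected shape:", projected.shape)  # (500, 2)
    ```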
    Story Source: Materials provided by Springer.