More stories

  • Record-breaking laser link could help us test whether Einstein was right

    Scientists from the International Centre for Radio Astronomy Research (ICRAR) and The University of Western Australia (UWA) have set a world record for the most stable transmission of a laser signal through the atmosphere.
    In a study published today in the journal Nature Communications, Australian researchers teamed up with researchers from the French National Centre for Space Studies (CNES) and the French metrology lab Systèmes de Référence Temps-Espace (SYRTE) at Paris Observatory.
    The team set the world record for the most stable laser transmission by combining the Australian-developed ‘phase stabilisation’ technology with advanced self-guiding optical terminals.
    Together, these technologies allowed laser signals to be sent from one point to another without interference from the atmosphere.
    Lead author Benjamin Dix-Matthews, a PhD student at ICRAR and UWA, said the technique effectively eliminates the effect of atmospheric turbulence.
    “We can correct for atmospheric turbulence in 3D, that is, left-right, up-down and, critically, along the line of flight,” he said.

    “It’s as if the moving atmosphere has been removed and doesn’t exist.
    “It allows us to send highly-stable laser signals through the atmosphere while retaining the quality of the original signal.”
    The result is the world’s most precise method for comparing the flow of time between two separate locations using a laser system transmitted through the atmosphere.
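    In rough outline, phase stabilisation is a feedback loop: part of the light is reflected back from the far terminal, the transmitter measures the accumulated round-trip phase against its local reference, and an actuator pre-distorts the outgoing phase so the one-way signal arrives clean. The toy simulation below is a minimal sketch of that loop, not the team's actual system; the noise model, gain and all numbers are made up for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    atmos = np.cumsum(rng.normal(0.0, 0.01, n))  # atmospheric phase noise (rad), random walk

    gain = 0.5         # integrator gain of the feedback loop (arbitrary illustrative value)
    correction = 0.0
    residual = np.empty(n)

    for t in range(n):
        # Outgoing light passes through the corrector and the atmosphere; the
        # retro-reflected light samples both again on the way back, so the
        # measured round-trip error is twice the one-way error.
        round_trip_error = 2.0 * (atmos[t] - correction)
        residual[t] = atmos[t] - correction          # phase error seen at the far terminal
        correction += gain * round_trip_error / 2.0  # steer the corrector toward the path noise

    print(f"uncorrected RMS phase: {atmos.std():.3f} rad")
    print(f"stabilised RMS phase:  {residual.std():.3f} rad")
    ```

    Run over ten thousand steps, the servo tracks the random-walk noise and the residual phase error collapses by well over an order of magnitude, which is the sense in which the moving atmosphere "doesn't exist" for the link.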
    ICRAR-UWA senior researcher Dr Sascha Schediwy said the research has exciting applications.
    “If you have one of these optical terminals on the ground and another on a satellite in space, then you can start to explore fundamental physics,” he said.

    “Everything from testing Einstein’s theory of general relativity more precisely than ever before, to discovering if fundamental physical constants change over time.”
    The technology’s precise measurements also have practical uses in earth science and geophysics.
    “For instance, this technology could improve satellite-based studies of how the water table changes over time, or to look for ore deposits underground,” Dr Schediwy said.
    There are further potential benefits for optical communications, an emerging field that uses light to carry information.
    Optical communications can securely transmit data between satellites and Earth with much higher data rates than current radio communications.
    “Our technology could help us increase the data rate from satellites to ground by orders of magnitude,” Dr Schediwy said.
    “The next generation of big data-gathering satellites would be able to get critical information to the ground faster.”
    The phase stabilisation technology behind the record-breaking link was originally developed to synchronise incoming signals for the Square Kilometre Array telescope.
    The multi-billion-dollar telescope is set to be built in Western Australia and South Africa from 2021.

  • Bringing atoms to a standstill: Miniaturizing laser cooling

    It’s cool to be small. Scientists at the National Institute of Standards and Technology (NIST) have miniaturized the optical components required to cool atoms down to a few thousandths of a degree above absolute zero, the first step in employing them on microchips to drive a new generation of super-accurate atomic clocks, enable navigation without GPS, and simulate quantum systems.
    Cooling atoms is equivalent to slowing them down, which makes them a lot easier to study. At room temperature, atoms whiz through the air at nearly the speed of sound, some 343 meters per second. The rapid, randomly moving atoms have only fleeting interactions with other particles, and their motion can make it difficult to measure transitions between atomic energy levels. When atoms slow to a crawl — about 0.1 meters per second — researchers can measure the particles’ energy transitions and other quantum properties accurately enough to use as reference standards in a myriad of navigation and other devices.
    For more than two decades, scientists have cooled atoms by bombarding them with laser light, a feat for which NIST physicist Bill Phillips shared the 1997 Nobel Prize in physics. Although laser light would ordinarily energize atoms, causing them to move faster, if the frequency and other properties of the light are chosen carefully, the opposite happens. Upon striking the atoms, the laser photons reduce the atoms’ momentum until they are moving slowly enough to be trapped by a magnetic field.
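    The momentum bookkeeping behind that trick is easy to sketch. Each scattered photon changes an atom's velocity by the recoil velocity ħk/m, so stopping a room-temperature atom takes tens of thousands of absorption events. The numbers below are textbook values for rubidium-87 and its 780 nm cooling line, chosen purely for illustration; the article does not specify the atom used.

    ```python
    import math

    hbar = 1.054_571_817e-34   # reduced Planck constant, J*s
    m_rb = 1.443e-25           # mass of a rubidium-87 atom, kg
    wavelength = 780e-9        # Rb-87 D2 cooling transition, m

    k = 2 * math.pi / wavelength        # photon wavenumber
    v_recoil = hbar * k / m_rb          # velocity change per scattered photon
    v_initial = 343.0                   # m/s, roughly the speed of sound (see text)

    print(f"recoil velocity per photon: {v_recoil * 1e3:.2f} mm/s")
    print(f"photons needed to stop the atom: {v_initial / v_recoil:,.0f}")
    ```

    Each photon slows the atom by only about 6 mm/s, so on the order of 60,000 scattering events are needed, which is why the light's frequency must be tuned so atoms preferentially absorb photons opposing their motion.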
    But to prepare the laser light so that it has the properties to cool atoms typically requires an optical assembly as big as a dining-room table. That’s a problem because it limits the use of these ultracold atoms outside the laboratory, where they could become a key element of highly accurate navigation sensors, magnetometers and quantum simulations.
    Now NIST researcher William McGehee and his colleagues have devised a compact optical platform, only about 15 centimeters (5.9 inches) long, that cools and traps gaseous atoms in a 1-centimeter-wide region. Although other miniature cooling systems have been built, this is the first one that relies solely on flat, or planar, optics, which are easy to mass produce.
    “This is important as it demonstrates a pathway for making real devices and not just small versions of laboratory experiments,” said McGehee. The new optical system, while still about 10 times too big to fit on a microchip, is a key step toward employing ultracold atoms in a host of compact, chip-based navigation and quantum devices outside a laboratory setting. Researchers from the Joint Quantum Institute, a collaboration between NIST and the University of Maryland in College Park, along with scientists from the University of Maryland’s Institute for Research in Electronics and Applied Physics, also contributed to the study.

    The apparatus, described online in the New Journal of Physics, consists of three optical elements. First, light is launched from an optical integrated circuit using a device called an extreme mode converter. The converter enlarges the narrow laser beam, initially about 500 nanometers (nm) in diameter (about five thousandths the thickness of a human hair), to 280 times that width. The enlarged beam then strikes a carefully engineered, ultrathin film known as a “metasurface” that’s studded with tiny pillars about 600 nm long and 100 nm wide.
    The nanopillars act to further widen the laser beam by another factor of 100. The dramatic widening is necessary for the beam to efficiently interact with and cool a large collection of atoms. Moreover, by accomplishing that feat within a small region of space, the metasurface miniaturizes the cooling process.
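    Those factors can be sanity-checked with simple arithmetic on the numbers quoted in the text (a back-of-the-envelope sketch, not a measurement from the paper):

    ```python
    beam = 500e-9          # m, beam diameter leaving the photonic circuit
    beam *= 280            # extreme mode converter expansion (factor from the text)
    print(f"after converter:   {beam * 1e6:.0f} micrometers")   # 140 micrometers
    beam *= 100            # metasurface expansion (factor from the text)
    print(f"after metasurface: {beam * 1e2:.1f} centimeters")   # 1.4 cm, of order the 1 cm trap region
    ```

    The two stages take the half-micrometer beam to roughly the centimeter scale of the trapping region.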
    The metasurface reshapes the light in two other important ways, simultaneously altering the intensity and polarization (direction of vibration) of the light waves. Ordinarily, the intensity follows a bell-shaped curve, in which the light is brightest at the center of the beam, with a gradual falloff on either side. The NIST researchers designed the nanopillars so that the tiny structures modify the intensity, creating a beam that has a uniform brightness across its entire width. The uniform brightness allows more efficient use of the available light. Polarization of the light is also critical for laser cooling.
    The expanding, reshaped beam then strikes a diffraction grating that splits the single beam into three new beams angled back toward the incoming light. Combined with the incoming beam and an applied magnetic field, the four beams, pushing on the atoms from opposing directions, serve to trap the cooled atoms.
    Each component of the optical system — the converter, the metasurface and the grating — had been developed at NIST but was in operation at separate laboratories on the two NIST campuses, in Gaithersburg, Maryland and Boulder, Colorado. McGehee and his team brought the disparate components together to build the new system.
    “That’s the fun part of this story,” he said. “I knew all the NIST scientists who had independently worked on these different components, and I realized the elements could be put together to create a miniaturized laser cooling system.”
    Although the optical system will have to be 10 times smaller to laser-cool atoms on a chip, the experiment “is proof of principle that it can be done,” McGehee added.
    “Ultimately, making the light preparation smaller and less complicated will enable laser-cooling based technologies to exist outside of laboratories,” he said.

  • Designing customized 'brains' for robots

    Contemporary robots can move quickly. “The motors are fast, and they’re powerful,” says Sabrina Neuman.
    Yet in complex situations, like interactions with people, robots often don’t move quickly. “The hang-up is what’s going on in the robot’s head,” she adds.
    Perceiving stimuli and calculating a response takes a “boatload of computation,” which limits reaction time, says Neuman, who recently graduated with a PhD from the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Neuman has found a way to fight this mismatch between a robot’s “mind” and body. The method, called robomorphic computing, uses a robot’s physical layout and intended applications to generate a customized computer chip that minimizes the robot’s response time.
    The advance could fuel a variety of robotics applications, including, potentially, frontline medical care of contagious patients. “It would be fantastic if we could have robots that could help reduce risk for patients and hospital workers,” says Neuman.
    Neuman will present the research at this April’s International Conference on Architectural Support for Programming Languages and Operating Systems. MIT co-authors include graduate student Thomas Bourgeat and Srini Devadas, the Edwin Sibley Webster Professor of Electrical Engineering and Neuman’s PhD advisor. Other co-authors include Brian Plancher, Thierry Tambe, and Vijay Janapa Reddi, all of Harvard University. Neuman is now a postdoctoral NSF Computing Innovation Fellow at Harvard’s School of Engineering and Applied Sciences.
    There are three main steps in a robot’s operation, according to Neuman. The first is perception, which includes gathering data using sensors or cameras. The second is mapping and localization: “Based on what they’ve seen, they have to construct a map of the world around them and then localize themselves within that map,” says Neuman. The third step is motion planning and control — in other words, plotting a course of action.

    These steps can take time and an awful lot of computing power. “For robots to be deployed into the field and safely operate in dynamic environments around humans, they need to be able to think and react very quickly,” says Plancher. “Current algorithms cannot be run on current CPU hardware fast enough.”
    Neuman adds that researchers have been investigating better algorithms, but she thinks software improvements alone aren’t the answer. “What’s relatively new is the idea that you might also explore better hardware.” That means moving beyond the standard-issue CPU that serves as a robot’s brain, with the help of hardware acceleration.
    Hardware acceleration refers to the use of a specialized hardware unit to perform certain computing tasks more efficiently. A commonly used hardware accelerator is the graphics processing unit (GPU), a chip specialized for parallel processing. These devices are handy for graphics because their parallel structure allows them to simultaneously process thousands of pixels. “A GPU is not the best at everything, but it’s the best at what it’s built for,” says Neuman. “You get higher performance for a particular application.” Most robots are designed with an intended set of applications and could therefore benefit from hardware acceleration. That’s why Neuman’s team developed robomorphic computing.
    The system creates a customized hardware design to best serve a particular robot’s computing needs. The user inputs the parameters of a robot, like its limb layout and how its various joints can move. Neuman’s system translates these physical properties into mathematical matrices. These matrices are “sparse,” meaning they contain many zero values that roughly correspond to movements that are impossible given a robot’s particular anatomy. (Similarly, your arm’s movements are limited because it can only bend at certain joints — it’s not an infinitely pliable spaghetti noodle.)
    The system then designs a hardware architecture specialized to run calculations only on the non-zero values in the matrices. The resulting chip design is therefore tailored to maximize efficiency for the robot’s computing needs. And that customization paid off in testing.
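    A toy version of the idea, for a hypothetical 4-joint arm rather than the matrices or tool flow from the paper: the zero pattern is fixed by the robot's anatomy, so it can be recorded once offline, and a specialized datapath only needs multipliers for entries that can ever be nonzero.

    ```python
    import numpy as np

    # Hypothetical 3x4 morphology matrix for a 4-joint arm: entry (i, j) maps
    # joint-j velocity to axis-i end-effector velocity. The zeros are structural:
    # they follow from the limb layout and never change at runtime.
    J = np.array([
        [0.7, 0.0, 0.3, 0.0],
        [0.0, 0.9, 0.0, 0.0],
        [0.2, 0.0, 0.0, 0.5],
    ])

    # "Synthesis" step, done once offline: record the nonzero pattern.
    nz = [(i, j) for i in range(J.shape[0]) for j in range(J.shape[1]) if J[i, j] != 0.0]
    print(f"multipliers needed: {len(nz)} instead of {J.size}")

    def specialized_matvec(q_dot):
        """Runtime step: only the wires and multipliers for nonzero entries exist."""
        out = np.zeros(J.shape[0])
        for i, j in nz:
            out[i] += J[i, j] * q_dot[j]
        return out

    print(specialized_matvec(np.array([1.0, 0.5, -0.2, 0.3])))
    ```

    On a fixed-function chip the saving is not a loop skipping zeros, as in this software sketch, but multiplier circuits that are never laid down in silicon at all.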

    Hardware architecture designed using this method for a particular application outperformed off-the-shelf CPU and GPU units. While Neuman’s team didn’t fabricate a specialized chip from scratch, they programmed a customizable field-programmable gate array (FPGA) chip according to their system’s suggestions. Despite operating at a slower clock rate, that chip performed eight times faster than the CPU and 86 times faster than the GPU.
    “I was thrilled with those results,” says Neuman. “Even though we were hamstrung by the lower clock speed, we made up for it by just being more efficient.”
    Plancher sees widespread potential for robomorphic computing. “Ideally we can eventually fabricate a custom motion-planning chip for every robot, allowing them to quickly compute safe and efficient motions,” he says. “I wouldn’t be surprised if 20 years from now every robot had a handful of custom computer chips powering it, and this could be one of them.” Neuman adds that robomorphic computing might allow robots to relieve humans of risk in a range of settings, such as caring for COVID-19 patients or manipulating heavy objects.
    Neuman next plans to automate the entire system of robomorphic computing. Users will simply drag and drop their robot’s parameters, and “out the other end comes the hardware description. I think that’s the thing that’ll push it over the edge and make it really useful.”
    This research was funded by the National Science Foundation, the Computing Research Association, the CIFellows Project, and the Defense Advanced Research Projects Agency.

  • Why older adults must go to the front of the vaccine line

    Vaccinating older adults for COVID-19 first will save substantially more U.S. lives than prioritizing other age groups, and the slower the vaccine rollout and more widespread the virus, the more critical it is to bring them to the front of the line.
    That’s one key takeaway from a new University of Colorado Boulder paper, published today in the journal Science, which uses mathematical modeling to make projections about how different distribution strategies would play out in countries around the globe.
    The research has already informed policy recommendations by the Centers for Disease Control and Prevention and the World Health Organization to prioritize older adults after medical workers.
    Now, as policymakers decide how and whether to carry out that advice, the paper — which includes an interactive tool (https://vaxfirst.colorado.edu/) — presents the numbers behind the tough decision.
    “Common sense would suggest you want to protect the older, most vulnerable people in the population first. But common sense also suggests you want to first protect front-line essential workers (like grocery store clerks and teachers) who are at higher risk of exposure,” said senior author Daniel Larremore, a computational biologist in the Department of Computer Science and CU Boulder’s BioFrontiers Institute. “When common sense leads you in two different directions, math can help you decide.”
    For the study, Larremore and lead author Kate Bubar, a graduate student in the Department of Applied Mathematics, teamed up with colleagues at the Harvard T.H. Chan School of Public Health and the University of Chicago.

    They drew on demographic information from different countries, as well as up-to-date data on how many people have already tested positive for COVID-19, how quickly the virus is spreading, how fast vaccines are rolling out and their estimated efficacy.
    Then they modeled what would happen in five different scenarios. In four of them, a different group got vaccinated first: children and teenagers; adults ages 20 to 49; adults 20 or older; or adults 60 or older (assuming that about 30% of those eligible might decline). In the fifth scenario, anyone who wanted a vaccine got one while supplies lasted.
    Results from the United States, Belgium, Brazil, China, India, Poland, South Africa, Spain and Zimbabwe are included in the paper, with more countries included in the online tool.
    Different strategies worked better or worse, depending on local circumstances, but a few key findings jumped out.
    In most scenarios, across countries, prioritizing adults 60+ saved the most lives.

    “Age is the strongest predictor of vulnerability,” said Larremore, noting that while pre-existing conditions like asthma boost risk of severe illness or death, age boosts vulnerability more. “You have an exponentially higher likelihood of dying from COVID-19 as you get older.”
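    The shape of that argument can be captured with back-of-the-envelope numbers. The values below are purely illustrative, not the study's parameters: even if younger adults face higher exposure, the far higher infection-fatality rate at older ages dominates the count of direct deaths averted per dose.

    ```python
    # Illustrative only: hypothetical infection risks and infection-fatality
    # rates (IFRs), not figures from the paper. Direct deaths averted is
    # roughly doses * P(future infection) * IFR.
    doses = 1_000_000
    groups = {
        "adults 20-49": {"infection_risk": 0.30, "ifr": 0.0005},  # more exposed, rarely die
        "adults 60+":   {"infection_risk": 0.20, "ifr": 0.0500},  # less exposed, far more vulnerable
    }
    for name, g in groups.items():
        averted = doses * g["infection_risk"] * g["ifr"]
        print(f"{name}: ~{averted:,.0f} direct deaths averted")
    ```

    With these made-up numbers, the same million doses avert roughly 150 deaths if given to younger adults and 10,000 if given to older adults; indirect benefits from blocking transmission would have to be very large before they could overturn a gap that wide.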
    The authors also note that, while the vaccines being distributed now are believed to have about a 90 to 95% chance of protecting against severe disease, researchers don’t yet know how well they block infection and transmission. If they don’t block it well and asymptomatic spreaders abound, it again makes the most sense to vaccinate older adults. If nothing else, they’ll be personally protected against grave disease.
    Only in scenarios where the virus is under control and the vaccine is known to block infection and transmission well does it make sense to move younger adults to the front of the line. That is not the situation in the United States right now.
    “For essential workers who might be frustrated that they are not first, we hope this study offers some clarity,” said Bubar. “We realize it is a big sacrifice for them to make but our study shows it will save lives.”
    So will a faster rollout, they found.
    For instance, all other things being equal, if the rollout speed were doubled from current rates under current transmission conditions, COVID-19 mortality could be reduced by about 23%, or 65,000 lives, over the next three months.
    The paper also suggests that in some situations where COVID-19 has already infected large swaths of the population and the vaccine is in short supply, it might make sense to ask younger adults who have already tested positive to step to the back of the line.
    “Our research suggests that prioritizing people who have not yet had COVID could allow hard-hit communities to stretch those first doses further and possibly get to some of the herd immunity effects sooner,” said Larremore.
    The authors stress that vaccines alone are not the only tactic for helping win the race against COVID.
    “To allow the vaccine to get to folks before the virus does, we need to not only roll out the vaccine quickly and get it to the most vulnerable people. We have to also keep our foot on the virus brake with masks, distancing and smart policies,” said Larremore.

  • Mathematical framework enables accurate characterization of shapes

    In nature, many things have evolved that differ in size, color and, above all, shape. While the color or size of an object can be described easily, describing a shape is more complicated. In a study now published in Nature Communications, Jacqueline Nowak of the Max Planck Institute of Molecular Plant Physiology and her colleagues have outlined a new and improved way to describe shapes, based on a network representation that can also be used to reassemble and compare shapes.
    Jacqueline Nowak designed a novel approach that relies on a network-based shape representation, named a visibility graph, along with a tool for analyzing shapes, termed GraVis. The visibility graph represents the shape of an object, defined by its surrounding contour, through a set of nodes placed equidistantly around that contour. The nodes are then connected by edges that neither cross nor align with the shape boundary. Testing this condition for all pairs of nodes specifies the visibility graph for the analyzed shape.
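    The construction is straightforward to prototype. The sketch below uses the shapely geometry library to test visibility between contour nodes; it illustrates the general recipe only and is not the GraVis code or its interface. For brevity, the polygon's corners stand in for the equidistant contour nodes.

    ```python
    from itertools import combinations
    from shapely.geometry import LineString, Polygon

    # A simple non-convex test shape (an "L").
    shape = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 3), (0, 3)])
    nodes = list(shape.exterior.coords)[:-1]   # drop the repeated closing vertex

    edges = []
    for (i, p), (j, q) in combinations(enumerate(nodes), 2):
        segment = LineString([p, q])
        # `within` holds only if the segment passes through the shape's interior,
        # so candidate edges that leave the shape or merely run along its
        # boundary are rejected, matching the visibility criterion described above.
        if segment.within(shape):
            edges.append((i, j))

    print(f"{len(nodes)} nodes, {len(edges)} visibility edges: {edges}")
    ```

    For a convex shape, every pair of nodes is mutually visible; the more lobed and indented the contour, the sparser the visibility graph, which is what makes the representation a usable fingerprint of shape complexity.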
    In this study, Jacqueline Nowak used visibility graphs and the GraVis tool to compare different shapes. To test the power of the new approach, visibility graphs of simple triangular, rectangular and circular shapes, as well as of complex shapes such as sand grains, fish outlines and leaves, were compared with one another.
    Using different machine-learning approaches, the researchers demonstrated that the method can distinguish shapes according to their complexity. Visibility graphs can also resolve differences among highly complex shapes, as shown for epidermal pavement cells in plants, which resemble pieces of a jigsaw puzzle. For these cells, distinct shape parameters such as lobe length, neck width and cell area can be accurately quantified with GraVis. “The quantification of the lobe number of epidermal cells with GraVis outperforms existing tools, showing that it is a powerful tool to address particular questions relevant to shape analysis,” says Zoran Nikoloski, GraVis project leader, head of the research group “Systems Biology and Mathematical Modelling” at the Max Planck Institute of Molecular Plant Physiology and Professor of Bioinformatics at the University of Potsdam.
    In the future, the scientists want to apply visibility graphs of epidermal cells and entire leaves to gain biological insight into key cellular processes that influence shape. In addition, shape features of different plant cells quantified by GraVis can facilitate genetic screens to determine the genetic basis of morphogenesis. Finally, the application of GraVis should help build a deeper understanding of the interrelation between cell and organ shapes in nature.

    Story Source:
    Materials provided by Max-Planck-Gesellschaft.

  • Using VR training to boost our sense of agency and improve motor control

    With Japan’s society rapidly aging, there has been a sharp increase in patients who experience motor dysfunctions. Rehabilitation is key to overcoming such ailments.
    A researcher from Tohoku University has developed a new virtual reality (VR)-based method that can benefit rehabilitation and sports training by increasing bodily awareness and improving motor control.
    His research was published in the journal Scientific Reports.
    Not only can we see and touch our body, but we can sense it too. Our body constantly fires off information to our brain, telling us where our limbs are in real time. This process makes us aware of our body and gives us ownership over it. Meanwhile, our ability to voluntarily control the movement and actions of our body parts affords us agency over our body.
    Ownership and agency are highly integrated and are related to our motor control. However, separating our sense of body ownership from our sense of agency has long evaded researchers, making it difficult to ascertain whether both ownership and agency truly affect motor control.
    Professor Kazumichi Matsumiya from the Graduate School of Information Sciences at Tohoku University was able to isolate these two senses using VR. Participants viewed a computer-generated hand, and Matsumiya independently measured their sense of ownership and agency over the hand.
    “I found that motor control is improved when participants experienced a sense of agency over the artificial body, regardless of their sense of body ownership,” said Matsumiya. “Our findings suggest that artificial manipulation of agency will enhance the effectiveness of rehabilitation and aid sports training techniques to improve overall motor control.”

    Story Source:
    Materials provided by Tohoku University.

  • Researchers improve data readout by using 'quantum entanglement'

    Researchers say they have been able to greatly improve the readout of data from digital memories — thanks to a phenomenon known as ‘quantum entanglement’.
    The research team, which included researchers from the Italian Institute of Metrological Research (INRIM) and the University of York, say the findings could have major applications for digital storage devices, including optical memories such as CD or Blu-ray discs.
    This is the first experimental demonstration that quantum sources of light can enhance the readout of information from digital memories, an advance that could lead to faster data access in large databases and to memories with higher capacities in next-generation computers.
    In an optical memory, bits are read by shining a laser beam over the reflecting surface of the disk. In the memory, each microscopic cell has one of two possible levels of reflectivity, representing the values “zero” and “one” of a bit.
    As a result, the laser beam reflected from a cell may be more or less intense depending on the value of the bit. The intensity of the beam is then registered by a detector and finally translated into an electrical signal.
    However, when the intensity of the laser beam becomes too low, for example as a result of an increased speed of the disk, energy fluctuations prevent the correct retrieval of the bits, introducing too many errors.
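    That error floor is easy to reproduce numerically. The sketch below simulates only the classical shot-noise problem, with made-up reflectivities; the entangled-light readout that removes the excess fluctuations is not modeled here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    r0, r1 = 0.3, 0.7            # hypothetical reflectivities encoding bits 0 and 1
    n_cells = 100_000
    bits = rng.integers(0, 2, n_cells)

    for n_photons in (5, 50, 500):                      # mean photons probing each cell
        mean_reflected = n_photons * np.where(bits == 1, r1, r0)
        detected = rng.poisson(mean_reflected)          # shot noise at the detector
        guess = detected > n_photons * (r0 + r1) / 2    # simple threshold readout
        error_rate = np.mean(guess != bits)
        print(f"{n_photons:4d} photons/cell -> bit error rate {error_rate:.4f}")
    ```

    As the light level drops, Poisson fluctuations blur the two reflectivity levels together and the bit-error rate climbs steeply; quantum correlations in the probe light are what let the researchers read reliably below this classical limit.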
    The study showed how to fix this problem by resorting to more sophisticated light sources, where the use of quantum entanglement completely removes the unwanted fluctuations.
    The researchers say the consequences of the study go far beyond applications to digital memories. In fact, the same principle can be used in spectroscopy and the measurement of biological samples, chemical compounds and other materials.
    The scheme also paves the way for non-invasive, ultra-sensitive measurements by greatly reducing the optical power without reducing the amount of information recovered from the systems.
    Another promising perspective explored by the researchers is to extend the method to the recognition of complex patterns in conjunction with modern machine-learning algorithms, with potential implications for bio-imaging.
    Professor Stefano Pirandola, from the Department of Computer Science at the University of York, said: “This experiment finally shows how we can harness quantum entanglement to better read information from memory devices and other physical systems.”

    Story Source:
    Materials provided by University of York.

  • Innovations through hair-thin optical fibers

    Scientists at the University of Bonn have built hair-thin optical fiber filters in a very simple way. They are not only extremely compact and stable, but also color-tunable. This means they can be used in quantum technology and as sensors for temperature or for detecting atmospheric gases. The results have been published in the journal Optics Express.
    Optical fibers not much thicker than a human hair today not only constitute the backbone of worldwide information exchange; they are also the basis for extremely compact and robust sensors of very high sensitivity for temperature, chemical analysis and much more.
    Optical resonators or filters are important components that cut very narrow spectral lines out of white light sources. In the simplest case, such filters are built from two opposing mirrors that toss light back and forth as precisely as the pendulum of a clock. The color of the filtered light is set by the mirror separation.
    High-quality mirrors have been integrated onto the ends of such hair-like fibers for some time. Researchers at the University of Bonn have now succeeded in building such hair-like optical fiber resonators in a simple way. They are not only extremely compact and stable but also color-tunable: the team glued the mirror-carrying fiber ends into a common ferrule, which can be stretched by means of a piezo crystal to control the mirror separation.
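    The tuning mechanism follows from the textbook resonance condition for such two-mirror (Fabry-Pérot) filters: light is transmitted when an integer number of half-wavelengths fits between the mirrors, so stretching the ferrule by tens of nanometers shifts every transmitted line. The sketch below uses a hypothetical mirror separation, not figures from the paper.

    ```python
    def resonant_wavelengths(separation, lo=400e-9, hi=800e-9):
        """Visible wavelengths satisfying m * (wavelength / 2) = separation."""
        m_min = max(1, int(2 * separation / hi))
        m_max = int(2 * separation / lo)
        return [2 * separation / m for m in range(m_min, m_max + 1)
                if lo <= 2 * separation / m <= hi]

    L = 2.0e-6                      # 2 micrometer mirror separation (hypothetical)
    for sep in (L, L + 50e-9):      # piezo stretch of 50 nm
        lines = ", ".join(f"{w * 1e9:.1f} nm" for w in resonant_wavelengths(sep))
        print(f"separation {sep * 1e6:.2f} um -> {lines}")
    ```

    A 50-nanometer stretch, well within the reach of a piezo crystal, visibly slides the whole comb of transmitted colors.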
    “The miniaturized optical filter makes a further contribution to making photonics and quantum technologies the decisive technology of the 21st century,” says Prof. Dr. Dieter Meschede from the Institute of Applied Physics at the University of Bonn. The scientist is a member of the “Matter and Light for Quantum Computing” (ML4Q) Cluster of Excellence of the Universities of Bonn and Cologne and RWTH Aachen University, and also a member of the Transdisciplinary Research Area “Building Blocks of Matter and Fundamental Interactions” at the University of Bonn.
    Miniaturized, highly stable optical precision filters promise multiple applications: they can store light energy within such a small volume that even single photons can be efficiently stored and manipulated. Their high sensitivity lends itself to extremely compact and selective sensors, for example for detecting atmospheric gases. And with even more stable ferrule materials, tiny optical clockworks with extremely high frequency stability could be built.

    Story Source:
    Materials provided by University of Bonn.