More stories

  • Breakthrough quantum algorithm

    City College of New York physicist Pouyan Ghaemi and his research team are claiming significant progress in using quantum computers to study and predict how the state of a large number of interacting quantum particles evolves over time. They did this by developing a quantum algorithm that they ran on an IBM quantum computer. “To the best of our knowledge, such particular quantum algorithm which can simulate how interacting quantum particles evolve over time has not been implemented before,” said Ghaemi, associate professor in CCNY’s Division of Science.
    Entitled “Probing geometric excitations of fractional quantum Hall states on quantum computers,” the study appears in the journal Physical Review Letters.
    “Quantum mechanics is known to be the underlying mechanism governing the properties of elementary particles such as electrons,” said Ghaemi. “But unfortunately there is no easy way to use equations of quantum mechanics when we want to study the properties of large number of electrons that are also exerting force on each other due to their electric charge.”
    His team’s discovery, however, changes this and raises other exciting possibilities.
    “On the other front, recently, there have been extensive technological developments in building the so-called quantum computers. This new class of computers utilizes the laws of quantum mechanics to perform calculations which are not possible with classical computers.”
    “We know that when electrons in a material interact with each other strongly, interesting properties such as high-temperature superconductivity could emerge,” Ghaemi noted. “Our quantum computing algorithm opens a new avenue to study the properties of materials resulting from strong electron-electron interactions. As a result, it can potentially guide the search for useful materials such as high-temperature superconductors.”
    He added that based on their results, they can now potentially look at using quantum computers to study many other phenomena that result from strong interaction between electrons in solids. “There are many experimentally observed phenomena that could be potentially understood using the development of quantum algorithms similar to the one we developed.”
    The research was done at CCNY — and involved an interdisciplinary team from the physics and electrical engineering departments — in collaboration with experts from Western Washington University, Leeds University in the UK, and the Schlumberger-Doll Research Center in Cambridge, Massachusetts. The research was funded by the National Science Foundation and Britain’s Engineering and Physical Sciences Research Council.
    Story Source:
    Materials provided by City College of New York. Note: Content may be edited for style and length.

  • Quantum entanglement makes quantum communication even more secure

    Stealthy communication just got more secure, thanks to quantum entanglement.

    Quantum physics provides a way to share secret information that’s mathematically proven to be safe from the prying eyes of spies. But until now, demonstrations of the technique, called quantum key distribution, rested on an assumption: The devices used to create and measure quantum particles have to be known to be flawless. Hidden defects could allow a stealthy snoop to penetrate the security unnoticed.

    Now, three teams of researchers have demonstrated the ability to perform secure quantum communication without prior confirmation that the devices are foolproof. Called device-independent quantum key distribution, the method is based on quantum entanglement, a mysterious relationship between particles that links their properties even when separated over long distances.

    In everyday communication, such as the transmission of credit card numbers over the internet, a secret code, or key, is used to garble the information, so that it can be read only by someone else with the key. But there’s a quandary: How can a distant sender and receiver share that key with one another while ensuring that no one else has intercepted it along the way?
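
    For the technically minded, the “garbling” described above can be as simple as combining each bit of the message with a matching bit of secret key. The sketch below is a minimal one-time-pad illustration in Python, not a scheme from the studies discussed here; the message and variable names are made up.

        import secrets

        message = b"credit card number: 4111 1111 1111 1111"   # illustrative plaintext
        key = secrets.token_bytes(len(message))                # shared secret key, same length

        # Encrypt: XOR each message byte with the matching key byte.
        ciphertext = bytes(m ^ k for m, k in zip(message, key))

        # Decrypt: XOR with the same key again to recover the original bytes.
        recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
        assert recovered == message

    The scheme is only as secure as the secrecy of the shared key, which is exactly the distribution problem quantum physics is being asked to solve.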

    Quantum physics provides a way to share keys by transmitting a series of quantum particles, such as particles of light called photons, and performing measurements on them. By comparing notes, the users can be sure that no one else has intercepted the key. Those secret keys, once established, can then be used to encrypt the sensitive intel (SN: 12/13/17). By comparison, standard internet security rests on a relatively shaky foundation of math problems that are difficult for today’s computers to solve, which could be vulnerable to new technology, namely quantum computers (SN: 6/29/17).
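
    A rough feel for the “comparing notes” step comes from a classical simulation of key sifting in the style of BB84, a standard prepare-and-measure protocol (the experiments described below use an entanglement-based variant instead). Everything here is a toy: real systems must also estimate error rates and distill the final key.

        import secrets

        n = 32
        alice_bits  = [secrets.randbelow(2) for _ in range(n)]
        alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = rectilinear, 1 = diagonal
        bob_bases   = [secrets.randbelow(2) for _ in range(n)]

        # If Bob measures in Alice's basis he recovers her bit; otherwise his outcome is random.
        bob_bits = [a if ab == bb else secrets.randbelow(2)
                    for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

        # Sifting: publicly compare bases (never the bits) and keep only the matching positions.
        sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
                  if ab == bb]

        # Comparing a random sample of sifted bits would expose an eavesdropper as an error rate;
        # with no eavesdropper and no noise, as here, the sifted bits agree perfectly.
        assert all(a == b for a, b in sifted)
        shared_key = [a for a, _ in sifted]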

    But quantum communication typically has a catch. “There cannot be any glitch that is unforeseen,” says quantum physicist Valerio Scarani of the National University of Singapore. For example, he says, imagine that your device is supposed to emit one photon but, unknown to you, it emits two photons. Any such flaws would mean that the mathematical proof of security no longer holds up. A hacker could sniff out your secret key, even though the transmission seems secure.

    Device-independent quantum key distribution can rule out such flaws. The method builds off of a quantum technique known as a Bell test, which involves measurements of entangled particles. Such tests can prove that quantum mechanics really does have “spooky” properties, namely nonlocality, the idea that measurements of one particle can be correlated with those of a distant particle. In 2015, researchers performed the first “loophole-free” Bell tests, which certified beyond a doubt that quantum physics’ counterintuitive nature is real (SN: 12/15/15).

    “The Bell test basically acts as a guarantee,” says Jean-Daniel Bancal of CEA Saclay in France. A faulty device would fail the test, so “we can infer that the device is working properly.”
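
    For readers who want the arithmetic behind passing or failing such a test, the sketch below evaluates the standard CHSH combination of correlations: no local, deterministic strategy can push it past 2, while quantum theory’s prediction for entangled particles at the usual measurement angles reaches about 2.83. The angles and formulas are textbook values, not numbers from the three new experiments.

        from itertools import product
        from math import cos, pi

        # CHSH combination: S = E(a,b) - E(a,b') + E(a',b) + E(a',b')

        # 1) The best any deterministic local-hidden-variable strategy can do is |S| = 2.
        best_local = max(
            abs(A0 * B0 - A0 * B1 + A1 * B0 + A1 * B1)
            for A0, A1, B0, B1 in product((-1, 1), repeat=4)
        )

        # 2) Quantum prediction for a spin singlet, E(x, y) = -cos(x - y), at standard angles.
        E = lambda x, y: -cos(x - y)
        a, a2, b, b2 = 0.0, pi / 2, pi / 4, 3 * pi / 4
        S_quantum = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

        print(best_local)           # 2
        print(round(S_quantum, 3))  # 2.828, i.e. 2*sqrt(2): the Bell inequality is violated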

    In their study, Bancal and colleagues used entangled, electrically charged strontium atoms separated by about two meters. Measurements of those ions certified that their devices were behaving properly, and the researchers generated a secret key, the team reports in the July 28 Nature.

    Typically, quantum communication is meant for long-distance dispatches. (To share a secret with someone two meters away, it would be easier to simply walk across the room.) So Scarani and colleagues studied entangled rubidium atoms 400 meters apart. The setup had what it took to produce a secret key, the researchers report in the same issue of Nature. But the team didn’t follow the process all the way through: The extra distance meant that producing a key would have taken months.

    In the third study, published in the July 29 Physical Review Letters, researchers wrangled entangled photons rather than atoms or ions. Physicist Wen-Zhao Liu of the University of Science and Technology of China in Hefei and colleagues also demonstrated the capability to generate keys, at distances up to 220 meters. This is particularly challenging to do with photons, Liu says, because photons are often lost in the process of transmission and detection.

    Loophole-free Bell tests are already no easy feat, and these techniques are even more challenging, says physicist Krister Shalm of the National Institute of Standards and Technology in Boulder, Colo. “The requirements for this experiment are so absurdly high that it’s just an impressive achievement to be able to demonstrate some of these capabilities,” says Shalm, who wrote a perspective in the same issue of Nature.

    That means that the technique won’t see practical use anytime soon, says physicist Nicolas Gisin of the University of Geneva, who was not involved with the research.

    Still, device-independent quantum key distribution is “a totally fascinating idea,” Gisin says. Bell tests were designed to answer a philosophical question about the nature of reality — whether quantum physics really is as weird as it seems. “To see that this now becomes a tool that enables something else,” he says, “this is the beauty.”

  • Humans may not be able to handle as much heat as scientists thought

    More than 2,000 people dead from extreme heat and wildfires raging in Portugal and Spain. High temperature records shattered from England to Japan. Overnights that fail to cool.

    Brutal heat waves are quickly becoming the hallmark of the summer of 2022.

    And even as climate change continues to crank up the temperature, scientists are working fast to understand the limits of humans’ resilience to heat extremes. Recent research suggests that heat stress tolerance in people may be lower than previously thought. If true, millions more people could be at risk of succumbing to dangerous temperatures sooner than expected.

    “Bodies are capable of acclimating over a period of time” to temperature changes, says Vivek Shandas, an environmental planning and climate adaptation researcher at Portland State University in Oregon. Over geologic time, there have been many climate shifts that humans have weathered, Shandas says. “[But] we’re in a time when these shifts are happening much more quickly.”

    Just halfway through 2022, heat waves have already ravaged many countries. The heat arrived early in southern Asia: In April, Wardha, India, saw a high of 45° Celsius (113° Fahrenheit); in May, recorded temperatures in Nawabshah, Pakistan, rose to 49.5° C (121.1° F).

    Extreme heat alerts blared across Europe beginning in June and continuing through July, with the rising temperatures exacerbating drought and sparking wildfires. The United Kingdom shattered its all-time temperature record on July 19 when temperatures reached 40.3° C in the English village of Coningsby. The heat fueled fires in France, forcing thousands to evacuate from their homes.

    And the litany goes on: Japan experienced its worst June heat wave since record-keeping began in 1875, leading to the country’s highest-ever recorded June temperature of 40.2° C.  China’s coastal megacities, from Shanghai to Chengdu, were hammered by heat waves in July as temperatures in the region also rose above 40° C. And in the United States, a series of heat waves gripped the Midwest, the South and the West in June and July. Temperatures soared to 42° C in North Platte, Neb., and to 45.6° C in Phoenix.

    The current global rate of warming on Earth is unprecedented (SN: 7/24/19). And scientists have long predicted that human-caused climate change will increase the occurrence of heat waves. Globally, humans’ exposure to extreme heat tripled from 1983 to 2016, particularly in South Asia.

    The heat already is taking an increasing toll on human health. It can cause heat cramps, heat exhaustion and heat stroke, which is often fatal. Dehydration can lead to kidney and heart disease. Extreme heat can even change how we behave, increasing aggression and decreasing our ability to focus (SN: 8/18/21).

    Staying cool

    The human body has various ways to shed excess heat and keep the core of the body at an optimal temperature of about 37° C (98.6° F). The heart pumps faster, speeding up blood flow that carries heat to the skin (SN: 4/3/18). Air passing over the skin can wick away some of that heat. Evaporative cooling — sweating — also helps.

    But there’s a limit to how much heat humans can endure. In 2010, scientists estimated the theoretical heat stress limit to be a “wet bulb” temperature of 35° C. Wet bulb temperatures depend on a combination of humidity and “dry bulb” air temperature measured by a thermometer. Those variables mean a place could hit a wet bulb temperature of 35° C in different ways — for instance, if the air is that temperature and there’s 100 percent humidity, or if the air temperature is 45° C and there’s 50 percent humidity. The difference is due to evaporative cooling.

    When water evaporates from the skin or another surface, it steals away energy in the form of heat, briefly cooling that surface. That means that in drier regions, the wet bulb temperature — where that ephemeral cooling effect happens readily — will be lower than the actual air temperature. In humid regions, however, wet and dry bulb temperatures are similar, because the air is so moist it’s difficult for sweat to evaporate quickly.
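
    For readers who want to play with these numbers, the short function below uses Stull’s 2011 empirical approximation (valid roughly for humidities above 5 percent at near sea-level pressure, and not the method used in the studies described here) to turn an air temperature and relative humidity into an approximate wet bulb temperature. It reproduces the two combinations mentioned above.

        from math import atan, sqrt

        def wet_bulb_c(temp_c, rh_percent):
            """Approximate wet bulb temperature (Stull 2011 empirical fit),
            from dry bulb temperature in deg C and relative humidity in percent."""
            t, rh = temp_c, rh_percent
            return (t * atan(0.151977 * sqrt(rh + 8.313659))
                    + atan(t + rh) - atan(rh - 1.676331)
                    + 0.00391838 * rh ** 1.5 * atan(0.023101 * rh)
                    - 4.686035)

        # The article's two examples, both of which land near a 35 deg C wet bulb:
        print(round(wet_bulb_c(35, 100), 1))   # about 35
        print(round(wet_bulb_c(45, 50), 1))    # about 35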

    So when thinking about heat stress on the body, scientists use wet bulb temperatures because they are a measure of how much cooling through evaporation is possible in a given climate, says Daniel Vecellio, a climate scientist at Penn State.

    “Both hot/dry and warm/humid environments can be equally dangerous,” Vecellio says — and this is where the body’s different cooling strategies come into play. In hot, dry areas, where the outside temperature may be much hotter than skin temperature, human bodies rely entirely on sweating to cool down, he says. In warm, humid areas, where the air temperature may actually be cooler than skin temperatures (but the humidity makes it seem warmer than it is), the body can’t sweat as efficiently. Instead, the cooler air passing over the skin can draw away the heat.

    How hot is too hot?

    Given the complexity of the body’s cooling system, and the diversity of human bodies, there isn’t really a one-size-fits-all threshold temperature for heat stress for everybody. “No one’s body runs at 100 percent efficiency,” Vecellio says. Different body sizes, the ability to sweat, age and acclimation to a regional climate all have a role.

    Still, for the last decade, that theoretical wet bulb 35° C number has been considered to be the point beyond which humans can no longer regulate their bodies’ temperatures. But recent laboratory-based research by Vecellio and his colleagues suggests that a general, real-world threshold for human heat stress is much lower, even for young and healthy adults.

    The researchers tracked heat stress in two dozen subjects ranging in age from 18 to 34, under a variety of controlled climates. In the series of experiments, the team varied humidity and temperature conditions within an environmental chamber, sometimes holding temperature constant while varying the humidity, and sometimes vice versa.

    The subjects exerted themselves within the chamber just enough to simulate minimal outdoor activity, walking on a treadmill or pedaling slowly on a bike with no resistance. During these experiments, which lasted for 1.5 to two hours, the researchers measured the subjects’ skin temperatures using wireless probes and assessed their core temperatures using a small telemetry pill that the subjects swallowed.

    In warm and humid conditions, the subjects in the study were unable to tolerate heat stress at wet bulb temperatures closer to 30° or 31° C, the team estimates. In hot and dry conditions, that wet bulb temperature was even lower, ranging from 25° to 28° C, the researchers reported in the February Journal of Applied Physiology. For context, in a very dry environment at about 10 percent humidity, a wet bulb temperature of 25° C would correspond to an air temperature of about 50° C (122° F).
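
    Plugging that dry-environment example into the Stull approximation sketched earlier gives a consistent figure, within the accuracy of that empirical fit.

        # Using wet_bulb_c() from the earlier sketch:
        print(round(wet_bulb_c(50, 10), 1))   # roughly 24, close to the ~25 deg C wet bulb cited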

    These results suggest that there is much more work to be done to understand what humans can endure under real-world heat and humidity conditions, but that the threshold may be much lower than thought, Vecellio says. The 2010 study’s theoretical finding of 35° C may still be “the upper limit,” he adds. “We’re showing the floor.”

    And that’s for young, healthy adults doing minimal activity. Thresholds for heat stress are expected to be lower for outdoor workers required to exert themselves, or for the elderly or children. Assessing laboratory limits for more at-risk people is the subject of ongoing work for Vecellio and his colleagues.

    A worker wipes away sweat in Toulouse, France, on July 13. An intense heat wave swept across Europe in mid-July, engulfing Spain, Portugal, France, England and other countries. (Valentine Chapuis/AFP via Getty Images)

    If the human body’s tolerance for heat stress is generally lower than scientists have realized, that could mean millions more people will be at risk from the deadliest heat sooner than scientists have realized. As of 2020, there were few reports of wet bulb temperatures around the world reaching 35° C, but climate simulations project that limit could be regularly exceeded in parts of South Asia and the Middle East by the middle of the century.

    Some of the deadliest heat waves in the last two decades were at lower wet bulb temperatures: Neither the 2003 European heat wave, which caused an estimated 30,000 deaths, nor the 2010 Russian heat wave, which killed over 55,000 people, exceeded wet bulb temperatures of 28° C.

    Protecting people

    How best to inform the public about heat risk is “the part that I find to be tricky,” says Shandas, who wasn’t involved in Vecellio’s research. Shandas developed the scientific protocol for the National Integrated Heat Health Information System’s Urban Heat Island mapping campaign in the United States.

    It’s very useful to have this physiological data from a controlled, precise study, Shandas says, because it allows us to better understand the science behind humans’ heat stress tolerance. But physiological and environmental variability still make it difficult to know how best to apply these findings to public health messaging, such as extreme heat warnings, he says. “There are so many microconsiderations that show up when we’re talking about a body’s ability to manage [its] internal temperature.”

    One of those considerations is the ability of the body to quickly acclimate to a temperature extreme. Regions that aren’t used to extreme heat may experience greater mortality, even at lower temperatures, simply because people there aren’t used to the heat. The 2021 heat wave in the Pacific Northwest wasn’t just extremely hot — it was extremely hot for that part of the world at that time of year, which makes it more difficult for the body to adapt, Shandas says (SN: 6/29/21).

    Heat that arrives unusually early and right on the heels of a cool period can also be more deadly, says Larry Kalkstein, a climatologist at the University of Miami and the chief heat science advisor for the Washington, D.C.–based nonprofit Adrienne Arsht-Rockefeller Foundation Resilience Center. “Often early season heat waves in May and June are more dangerous than those in August and September.”

    One way to improve communities’ resilience to the heat may be to treat heat waves like other natural disasters — including giving them names and severity rankings (SN: 8/14/20). As developed by an international coalition known as the Extreme Heat Resilience Alliance, those rankings form the basis for a new type of heat wave warning that explicitly considers the factors that impact heat stress, such as wet bulb temperature and acclimation, rather than just temperature extremes.

    The rankings also consider factors such as cloud cover, wind and how hot the temperatures are overnight. “If it’s relatively cool overnight, there’s not as much negative health outcome,” says Kalkstein, who created the system. But overnight temperatures aren’t getting as low as they used to in many places. In the United States, for example, the average minimum temperatures at nighttime are now about 0.8° C warmer than they were during the first half of the 20th century, according to the country’s Fourth National Climate Assessment, released in 2018 (SN: 11/28/18).

    By naming heat waves like hurricanes, officials hope to increase citizens’ awareness of the dangers of extreme heat. Heat wave rankings could also help cities tailor their interventions to the severity of the event. Six cities are currently testing the system’s effectiveness: four in the United States, plus Athens, Greece, and Seville, Spain. On July 24, with temperatures heading toward 42° C, Seville became the first city in the world to officially name a heat wave, sounding the alarm for Heat Wave Zoe.

    As 2022 continues to smash temperature records around the globe, such warnings may come not a moment too soon.

  • Anti-butterfly effect enables new benchmarking of quantum-computer performance

    Research drawing on the quantum “anti-butterfly effect” solves a longstanding experimental problem in physics and establishes a method for benchmarking the performance of quantum computers.
    “Using the simple, robust protocol we developed, we can determine the degree to which quantum computers can effectively process information, and it applies to information loss in other complex quantum systems, too,” said Bin Yan, a quantum theorist at Los Alamos National Laboratory.
    Yan is corresponding author of the paper on benchmarking information scrambling published today in Physical Review Letters. “Our protocol quantifies information scrambling in a quantum system and unambiguously distinguishes it from fake positive signals in the noisy background caused by quantum decoherence,” he said.
    Noise in the form of decoherence erases all the quantum information in a complex system such as a quantum computer as it couples with the surrounding environment. Information scrambling through quantum chaos, on the other hand, spreads information across the system, protecting it and allowing it to be retrieved.
    Coherence is a quantum state that enables quantum computing, and decoherence refers to the loss of that state as information leaks to the surrounding environment.
    “Our method, which draws on the quantum anti-butterfly effect we discovered two years ago, evolves a system forward and backward through time in a single loop, so we can apply it to any system with time-reversing the dynamics, including quantum computers and quantum simulators using cold atoms,” Yan said.
    The Los Alamos team demonstrated the protocol with simulations on IBM cloud-based quantum computers.
    The inability to distinguish decoherence from information scrambling has stymied experimental research into the phenomenon. First studied in black-hole physics, information scrambling has proved relevant across a wide range of research areas, including quantum chaos in many-body systems, phase transition, quantum machine learning and quantum computing. Experimental platforms for studying information scrambling include superconductors, trapped ions and cloud-based quantum computers.
    Practical application of the quantum anti-butterfly effect
    Yan and co-author Nikolai Sinitsyn published a paper in 2020 proving that evolving quantum processes backwards on a quantum computer to damage information in the simulated past causes little change when returned to the present. In contrast, a classical-physics system smears the information irrecoverably during the back-and-forth time loop.
    Building on this discovery, Yan, Sinitsyn and co-author Joseph Harris, a University of Edinburgh graduate student who worked on the current paper as a participant in the Los Alamos Quantum Computing Summer School, developed the protocol. It prepares a quantum system and subsystem, evolves the full system forward in time, causes a change in a different subsystem, then evolves the system backward for the same amount of time. Measuring the overlap of information between the two subsystems shows how much information has been preserved by scrambling and how much lost to decoherence.
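    The forward-perturb-backward loop can be illustrated with a small out-of-time-order-correlator calculation in plain numpy. This toy is not the Los Alamos protocol (in particular, it does not model the decoherence that the real protocol is designed to separate from scrambling); it only shows that the overlap measured after such a loop stays at 1 for trivial dynamics and collapses when the dynamics scrambles information. The system size and operator choices are illustrative.

        import numpy as np

        n = 5                         # toy system size (illustrative)
        dim = 2 ** n
        rng = np.random.default_rng(1)

        def haar_unitary(d):
            """Random unitary, standing in for chaotic, information-scrambling dynamics."""
            z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
            q, r = np.linalg.qr(z)
            return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        I2 = np.eye(2, dtype=complex)

        def local_op(op, k):
            """op acting on qubit k, identity on the rest."""
            full = np.array([[1.0 + 0j]])
            for i in range(n):
                full = np.kron(full, op if i == k else I2)
            return full

        W = local_op(X, 0)            # perturbation applied to one subsystem
        V = local_op(Z, n - 1)        # probe on a different subsystem
        psi = np.zeros(dim, dtype=complex)
        psi[0] = 1.0                  # initial state |00...0>

        def loop_overlap(U):
            """Evolve forward (U), perturb (W), evolve backward (U†), then probe with V:
            the out-of-time-order correlator <psi| W(t)† V† W(t) V |psi>, with W(t) = U† W U."""
            Wt = U.conj().T @ W @ U
            return abs(psi.conj() @ (Wt.conj().T @ V.conj().T @ Wt @ V @ psi))

        print(loop_overlap(np.eye(dim)))        # trivial dynamics: overlap = 1.0
        print(loop_overlap(haar_unitary(dim)))  # scrambling dynamics: overlap drops toward 0
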
    Story Source:
    Materials provided by DOE/Los Alamos National Laboratory. Note: Content may be edited for style and length.

  • Engineering roboticists discover alternative physics

    A precursor step to understanding physics is identifying relevant variables. Columbia Engineers developed an AI program to tackle a longstanding problem: whether it is possible to identify state variables from only high-dimensional observational data. Using video recordings of a variety of physical dynamical systems, the algorithm discovered the intrinsic dimension of the observed dynamics and identified candidate sets of state variables — without prior knowledge of the underlying physics.
    Energy, Mass, Velocity. These three variables make up Einstein’s iconic equation E = mc². But how did Einstein know about these concepts in the first place? A precursor step to understanding physics is identifying relevant variables. Without the concept of energy, mass, and velocity, not even Einstein could discover relativity. But can such variables be discovered automatically? Doing so could greatly accelerate scientific discovery.
    This is the question that researchers at Columbia Engineering posed to a new AI program. The program was designed to observe physical phenomena through a video camera, then try to search for the minimal set of fundamental variables that fully describe the observed dynamics. The study was published on July 25 in Nature Computational Science.
    The researchers began by feeding the system raw video footage of phenomena for which they already knew the answer. For example, they fed a video of a swinging double-pendulum known to have exactly four “state variables” — the angle and angular velocity of each of the two arms. After a few hours of analysis, the AI outputted the answer: 4.7.
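    To give a sense of what “discovering the intrinsic dimension” means, the sketch below applies a generic two-nearest-neighbour estimator (not the Columbia group’s neural-network approach) to synthetic data that secretly has only two underlying coordinates, and recovers a value close to 2 from distances alone.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic test data: 2 latent coordinates nonlinearly embedded in 10 dimensions,
        # so the observed vectors are 10-dimensional but the intrinsic dimension is 2.
        n_points = 2000
        latent = rng.uniform(0.0, 1.0, size=(n_points, 2))
        data = np.tanh(latent @ rng.normal(size=(2, 10)))

        # TwoNN estimator (Facco et al. 2017): the ratio of the 2nd- to 1st-nearest-neighbour
        # distance follows a Pareto law whose exponent is the intrinsic dimension.
        sq = np.sum(data ** 2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * data @ data.T, 0.0)
        np.fill_diagonal(d2, np.inf)
        two_smallest = np.sort(d2, axis=1)[:, :2]
        mu = np.sqrt(two_smallest[:, 1] / two_smallest[:, 0])
        d_estimate = n_points / np.sum(np.log(mu))
        print(round(d_estimate, 2))   # close to 2, the true number of latent variables
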
    “We thought this answer was close enough,” said Hod Lipson, director of the Creative Machines Lab in the Department of Mechanical Engineering, where the work was primarily done. “Especially since all the AI had access to was raw video footage, without any knowledge of physics or geometry. But we wanted to know what the variables actually were, not just their number.”
    The researchers then proceeded to visualize the actual variables that the program identified. Extracting the variables themselves was not easy, since the program cannot describe them in any intuitive way that would be understandable to humans. After some probing, it appeared that two of the variables the program chose loosely corresponded to the angles of the arms, but the other two remain a mystery. “We tried correlating the other variables with anything and everything we could think of: angular and linear velocities, kinetic and potential energy, and various combinations of known quantities,” explained Boyuan Chen PhD ’22, now an assistant professor at Duke University, who led the work. “But nothing seemed to match perfectly.” The team was confident that the AI had found a valid set of four variables, since it was making good predictions, “but we don’t yet understand the mathematical language it is speaking,” he explained.

  • Seeing the light: Researchers develop new AI system using light to learn associatively

    Researchers at Oxford University’s Department of Materials, working in collaboration with colleagues from Exeter and Münster, have developed an on-chip optical processor capable of detecting similarities in datasets up to 1,000 times faster than conventional machine learning algorithms running on electronic processors.
    The new research published in Optica took its inspiration from Nobel Prize laureate Ivan Pavlov’s discovery of classical conditioning. In his experiments, Pavlov found that by providing another stimulus during feeding, such as the sound of a bell or metronome, his dogs began to link the two experiences and would salivate at the sound alone. The repeated associations of two unrelated events paired together could produce a learned response — a conditional reflex.
    Co-first author Dr James Tan You Sian, who did this work as part of his DPhil in the Department of Materials, University of Oxford said: ‘Pavlovian associative learning is regarded as a basic form of learning that shapes the behaviour of humans and animals — but adoption in AI systems is largely unheard of. Our research on Pavlovian learning in tandem with optical parallel processing demonstrates the exciting potential for a variety of AI tasks.’
    The neural networks used in most AI systems often require a substantial number of data examples during a learning process — training a model to reliably recognise a cat could use up to 10,000 cat/non-cat images — at a computational and processing cost.
    Rather than relying on backpropagation favoured by neural networks to ‘fine-tune’ results, the Associative Monadic Learning Element (AMLE) uses a memory material that learns patterns to associate together similar features in datasets — mimicking the conditional reflex observed by Pavlov in the case of a ‘match’.
    The AMLE inputs are paired with the correct outputs to supervise the learning process, and the memory material can be reset using light signals. In testing, the AMLE could correctly identify cat/non-cat images after being trained with just five pairs of images.
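    The paper’s device details aren’t given in this summary, but the flavour of associative learning, as opposed to backpropagation, can be shown in a few lines of plain Python: strengthen connections whenever a pattern and a label occur together, then recall the label from the pattern alone. The patterns, labels and sizes below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(42)

        # Five illustrative "image" feature vectors (entries +1/-1) and their labels
        # (+1 = cat, -1 = not cat), echoing the five training pairs mentioned above.
        n_features = 64
        patterns = np.sign(rng.normal(size=(5, n_features)))
        labels = np.array([1, -1, 1, -1, 1])

        # Hebbian association: each weight grows with the co-occurrence of feature and label.
        # One pass over the five pairs, no gradients and no backpropagation.
        weights = (labels[:, None] * patterns).sum(axis=0)

        # Recall: a slightly corrupted version of a trained pattern still evokes its label.
        noisy = patterns[0] * np.sign(rng.normal(size=n_features) + 2.0)  # flips ~2% of features
        print(int(np.sign(weights @ noisy)))   # prints 1 ("cat"), the label paired with pattern 0
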
    The considerable performance gains of the new optical chip over a conventional electronic chip come down to two key differences in design: a unique network architecture incorporating associative learning as a building block rather than using neurons and a neural network, and the use of ‘wavelength-division multiplexing’ to send multiple optical signals on different wavelengths along a single channel to increase computational speed. The chip hardware uses light to send and retrieve data to maximise information density: several signals on different wavelengths are sent simultaneously for parallel processing, which increases the detection speed of recognition tasks, with each additional wavelength adding computational speed.
    Professor Wolfram Pernice, co-author from Münster University, explained: ‘The device naturally captures similarities in datasets while doing so in parallel using light to increase the overall computation speed — which can far exceed the capabilities of conventional electronic chips.’
    An associative learning approach could complement neural networks rather than replace them, clarified co-first author Professor Zengguang Cheng, now at Fudan University.
    ‘It is more efficient for problems that don’t need substantial analysis of highly complex features in the datasets,’ said Professor Cheng. ‘Many learning tasks are volume based and don’t have that level of complexity — in these cases, associative learning can complete the tasks more quickly and at a lower computational cost.’
    ‘It is increasingly evident that AI will be at the centre of many innovations we will witness in the coming phase of human history. This work paves the way towards realising fast optical processors that capture data associations for particular types of AI computations, although there are still many exciting challenges ahead,’ said Professor Harish Bhaskaran, who led the study.
    Story Source:
    Materials provided by University of Oxford. Note: Content may be edited for style and length.

  • Improving image sensors for machine vision

    Image sensors measure light intensity, but angle, spectrum, and other aspects of light must also be extracted to significantly advance machine vision.
    In Applied Physics Letters, published by AIP Publishing, researchers at the University of Wisconsin-Madison, Washington University in St. Louis, and OmniVision Technologies highlight the latest nanostructured components integrated on image sensor chips that are most likely to make the biggest impact in multimodal imaging.
    The developments could enable autonomous vehicles to see around corners instead of just a straight line, biomedical imaging to detect abnormalities at different tissue depths, and telescopes to see through interstellar dust.
    “Image sensors will gradually undergo a transition to become the ideal artificial eyes of machines,” co-author Yurui Qu, from the University of Wisconsin-Madison, said. “An evolution leveraging the remarkable achievement of existing imaging sensors is likely to generate more immediate impacts.”
    Image sensors, which convert light into electrical signals, are composed of millions of pixels on a single chip. The challenge is how to combine and miniaturize multifunctional components as part of the sensor.
    In their own work, the researchers detailed a promising approach to detect multiple-band spectra by fabricating an on-chip spectrometer. They deposited photonic crystal filters made up of silicon directly on top of the pixels to create complex interactions between incident light and the sensor.
    The pixels beneath the films record the distribution of light energy, from which light spectral information can be inferred. The device — less than a hundredth of a square inch in size — is programmable to meet various dynamic ranges, resolution levels, and almost any spectral regime from visible to infrared.
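    The inference step, working back from pixel readings to a spectrum, is essentially a linear inverse problem. The sketch below illustrates the idea with a made-up filter response matrix and a regularized least-squares solve; the real chip’s filter responses and reconstruction algorithm are not described in this summary.

        import numpy as np

        rng = np.random.default_rng(7)

        n_bands = 40      # spectral bands to recover (illustrative)
        n_pixels = 60     # pixels, each behind a differently structured photonic-crystal filter

        # Made-up filter transmission curves: each pixel weights the spectrum differently.
        response = rng.uniform(0.0, 1.0, size=(n_pixels, n_bands))

        # A toy "true" spectrum with two emission peaks.
        bands = np.arange(n_bands)
        true_spectrum = np.exp(-((bands - 10) / 3.0) ** 2) + 0.5 * np.exp(-((bands - 28) / 4.0) ** 2)

        # What the sensor records: each pixel integrates the spectrum through its filter, plus noise.
        readings = response @ true_spectrum + 0.01 * rng.normal(size=n_pixels)

        # Inference: recover the spectrum from the readings with ridge-regularized least squares.
        lam = 1e-2
        recovered = np.linalg.solve(response.T @ response + lam * np.eye(n_bands),
                                    response.T @ readings)
        print(round(float(np.corrcoef(true_spectrum, recovered)[0, 1]), 3))  # near 1: good fit
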
    The researchers built a component that detects angular information to measure depth and construct 3D shapes at subcellular scales. Their work was inspired by directional hearing sensors found in animals, like geckos, whose heads are too small to determine where sound is coming from in the same way humans and other animals can. Instead, they use coupled eardrums to measure the direction of sound within a size that is orders of magnitude smaller than the corresponding acoustic wavelength.
    Similarly, pairs of silicon nanowires were constructed as resonators to support optical resonance. The optical energy stored in two resonators is sensitive to the incident angle. The wire closest to the light sends the strongest current. By comparing the strongest and weakest currents from both wires, the angle of the incoming light waves can be determined.
    Millions of these nanowires can be placed on a 1-square-millimeter chip. The research could support advances in lensless cameras, augmented reality, and robotic vision.
    Story Source:
    Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

  • How to make jet fuel from sunlight, air and water vapor

    Jet fuel can now be siphoned from the air.

    Or at least that’s the case in Móstoles, Spain, where researchers demonstrated that an outdoor system could produce kerosene, used as jet fuel, with three simple ingredients: sunlight, carbon dioxide and water vapor. Solar kerosene could replace petroleum-derived jet fuel in aviation and help stabilize greenhouse gas emissions, the researchers report in the July 20 Joule.

    Burning solar-derived kerosene releases carbon dioxide, but only as much as is used to make it, says Aldo Steinfeld, an engineer at ETH Zurich. “That makes the fuel carbon neutral, especially if we use carbon dioxide captured directly from the air.”

    Kerosene is the fuel of choice for aviation, a sector responsible for around 5 percent of human-caused greenhouse gas emissions. Finding sustainable alternatives has proven difficult, especially for long-distance aviation, because kerosene is packed with so much energy, says chemical physicist Ellen Stechel of Arizona State University in Tempe who was not involved in the study.

    In 2015, Steinfeld and his colleagues synthesized solar kerosene in the laboratory, but no one had produced the fuel entirely in a single system in the field. So Steinfeld and his team positioned 169 sun-tracking mirrors to reflect and focus radiation equivalent to about 2,500 suns into a solar reactor atop a 15-meter-tall tower. The reactor has a window to let the light in, ports that supply carbon dioxide and water vapor, and porous ceria, a material used to catalyze the chemical reactions.

    Within the solar reactor, porous ceria gets heated by sunlight and reacts with carbon dioxide and water vapor to produce syngas, a mixture of hydrogen gas and carbon monoxide. (Image: ETH Zurich)

    When heated with solar radiation, the ceria reacts with carbon dioxide and water vapor in the reactor to produce syngas — a mixture of hydrogen gas and carbon monoxide. The syngas is then piped to the tower’s base where a machine converts it into kerosene and other hydrocarbons.
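
    Solar reactors built around porous ceria generally run a two-step redox cycle. In schematic form (with δ the oxygen nonstoichiometry; the exact operating conditions of the Móstoles tower are not given in this summary), sunlight first strips some oxygen from the ceria, and the reduced ceria then splits water and carbon dioxide:

        $$\mathrm{CeO_2} \;\xrightarrow{\;\text{solar heat}\;}\; \mathrm{CeO_{2-\delta}} + \tfrac{\delta}{2}\,\mathrm{O_2}$$
        $$\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{CeO_2} + \delta\,\mathrm{H_2}$$
        $$\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{CO_2} \;\longrightarrow\; \mathrm{CeO_2} + \delta\,\mathrm{CO}$$

    The hydrogen and carbon monoxide together are the syngas described above, and the regenerated ceria can be cycled again.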

    Over nine days of operation, the researchers found that the tower converted about 4 percent of the solar energy it took in into roughly 5,191 liters of syngas, which was used to synthesize both kerosene and diesel. This proof-of-principle setup produced about a liter of kerosene a day, Steinfeld says.

    “It’s a major milestone,” Stechel says, though the efficiency needs to be improved for the technology to be useful to industry. For context, a Boeing 747 passenger jet burns around 19,000 liters of fuel during takeoff and the ascent to cruising altitude. Recovering heat unused by the system and improving the ceria’s heat absorption could boost the tower’s efficiency to more than 20 percent, making it economically practical, the researchers say.