More stories

  • New theory of decision-making seeks to explain why humans don't make optimal choices

    A new theory of economic decision-making from Mina Mahmoudi, a lecturer in the Department of Economics at Rensselaer Polytechnic Institute, offers an explanation as to why humans, in general, make decisions that are simply adequate, not optimal.
    In research published today in the Review of Behavioral Economics, Dr. Mahmoudi theorizes that, as an aspect of relative thinking, people may use ratios in their decision-making when they should use only absolute differences. The converse is also possible.
    To explain this behavioral anomaly, Dr. Mahmoudi has developed a ratio-difference theory that gives weight to both ratio and difference comparisons. This theory seeks to more accurately capture the manner by which a boundedly rational decision-maker might operationally distinguish whether one alternative is better than another.
    “Effectively solving some economic problems requires one to think in terms of differences while others require one to think in terms of ratios,” Dr. Mahmoudi said. “Because both types of thinking are necessary, it is reasonable to think people develop and apply both types. However, it is also reasonable to expect that people misapply the two types of thinking, especially when less experienced with the context.”
    Past studies have shown that when given the opportunity to save, for example, $5 on a $25 item or on a $500 item, people in general will put more effort into saving the money on the lower-cost product than on the more expensive one. They believe they are getting a better deal because the ratio of savings to price is higher. In fact, the $5 saved is the same for both items, and the optimal choice would be to look at the absolute savings and work equally hard to save each $5. People should use differences to solve this problem, but many seem to make suboptimal decisions because they apply ratio thinking.
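    To make the arithmetic concrete, here is a minimal sketch (illustrative only, not code from the paper) that scores the two offers by absolute difference and by ratio:

```python
# Minimal sketch: comparing two discount offers by absolute difference
# versus by ratio, to show why ratio thinking makes the $5-off-$25 deal
# feel better than the $5-off-$500 deal even though the savings are equal.

def absolute_saving(price, discounted_price):
    """Difference comparison: dollars saved."""
    return price - discounted_price

def relative_saving(price, discounted_price):
    """Ratio comparison: fraction of the price saved."""
    return (price - discounted_price) / price

offers = {"cheap item": (25, 20), "expensive item": (500, 495)}

for name, (price, discounted) in offers.items():
    print(f"{name}: saves ${absolute_saving(price, discounted)} "
          f"({relative_saving(price, discounted):.0%} of the price)")
# Both offers save $5, but the cheap item saves 20% versus 1%, which is
# why ratio thinking overweights the lower-priced deal.
```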
    “Understanding how the cognitive and motivational characteristics of human beings and the operating procedures of organizations influence the working of economic systems is of critical importance,” Dr. Mahmoudi said. “Many economic behaviors such as imitation occur and many economic institutions like inventories exist because people cannot maximize or because markets are not in equilibrium. Our model provides an example of a behavior that occurs because people cannot maximize.”
    This model can be applied to a variety of behavioral economics experiments, in the gambling industry and financial markets among other areas.
    Story Source:
    Materials provided by Rensselaer Polytechnic Institute. Original written by Jeanne Hedden Gallagher. Note: Content may be edited for style and length.

  • Merging physical domain knowledge with AI improves prediction accuracy of battery capacity

    Electric vehicles (EVs) are now seen everywhere, from passenger cars and buses to taxis. EVs have the advantages of being eco-friendly and having low maintenance costs, but their owners must remain wary of fatal accidents should the battery run out or reach the end of its life. Precise capacity and lifespan predictions for the lithium-ion batteries commonly used in EVs are therefore vital.
    A POSTECH research team led by Professor Seungchul Lee and Ph.D. candidate Sung Wook Kim of the Department of Mechanical Engineering collaborated with Professor Ki-Yong Oh of Hanyang University to develop a novel artificial intelligence (AI) technology that can accurately predict the capacity and lifespan of lithium-ion batteries. This research breakthrough, which considerably improved prediction accuracy by merging physical domain knowledge with AI, has recently been published in Applied Energy, an international academic journal in the energy field.
    There are two methods of predicting battery capacity: a physics-based model, which simplifies the intricate internal structure of the battery, and an AI model, which uses the battery's electrical and mechanical responses. However, the conventional AI model required large amounts of training data, and its prediction accuracy was very low when applied to untrained data, which called for a next-generation AI technology.
    To effectively predict battery capacity with less training data, the research team combined a feature extraction strategy that differs from conventional methods with physical domain knowledge-based neural networks. As a result, the battery prediction accuracy for testing batteries with various capacities and lifespan distributions improved by up to 20%. Its reliability was ensured by confirming the consistency of the results. These outcomes are anticipated to lay the foundation for applying highly dependable physical domain knowledge-based AI to various industries.
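    The article does not spell out the team's model, but the general idea of physics-informed learning can be sketched: an assumed, simple capacity-fade law supplies the physical backbone, and a data-driven term fits only the residual that the physics cannot explain. Everything below, from the square-root fade law to the synthetic measurements, is illustrative.

```python
# Illustrative sketch of physics-informed capacity prediction (not the
# POSTECH team's model): a simple empirical fade law plus a data-driven
# correction fitted to the residual.
import numpy as np

rng = np.random.default_rng(0)

def physics_capacity(cycle, c0=2.0, k=0.004):
    # Hypothetical square-root capacity-fade law (a common empirical form).
    return c0 * (1.0 - k * np.sqrt(cycle))

cycles = np.arange(1, 501)
# Synthetic "measurements": the fade law plus an unmodeled linear loss and noise.
measured = physics_capacity(cycles) - 0.0002 * cycles + rng.normal(0, 0.005, cycles.size)

# Data-driven part: fit only the residual the physics law cannot explain.
residual = measured - physics_capacity(cycles)
coeffs = np.polyfit(cycles, residual, deg=2)      # stand-in for a neural network
hybrid = physics_capacity(cycles) + np.polyval(coeffs, cycles)

print("RMSE, physics only:", np.sqrt(np.mean((measured - physics_capacity(cycles)) ** 2)))
print("RMSE, hybrid model:", np.sqrt(np.mean((measured - hybrid) ** 2)))
```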
    Professor Lee of POSTECH remarked, “The limitations of data-based AI have been overcome using physics knowledge. The difficulty of building big data has also been alleviated thanks to the development of the differentiated feature extraction technique.”
    Professor Oh of Hanyang University added, “Our research is significant in that it will contribute to popularizing EVs by enabling accurate predictions of the remaining lifespan of batteries in next-generation EVs.”
    This study was supported by the Institute of Civil Military Technology Cooperation and the National Research Foundation of Korea.
    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  • Breakthrough paves way for photonic sensing at the ultimate quantum limit

    Sensors are a constant feature of our everyday lives. Although they often go unnoticed, sensors provide critical information essential to modern healthcare, security, and environmental monitoring. Modern cars alone contain over 100 sensors, and this number will only increase.
    Quantum sensing is poised to revolutionise today’s sensors, significantly boosting the performance they can achieve. More precise, faster, and reliable measurements of physical quantities can have a transformative effect on every area of science and technology, including our daily lives.
    However, the majority of quantum sensing schemes rely on special entangled or squeezed states of light or matter that are hard to generate and detect. This is a major obstacle to harnessing the full power of quantum-limited sensors and deploying them in real-world scenarios.
    In a paper published today, a team of physicists at the Universities of Bristol, Bath and Warwick have shown it is possible to perform high precision measurements of important physical properties without the need for sophisticated quantum states of light and detection schemes.
    The key to this breakthrough is the use of ring resonators — tiny racetrack structures that guide light in a loop and maximize its interaction with the sample under study. Importantly, ring resonators can be mass manufactured using the same processes as the chips in our computers and smartphones.
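    As a rough illustration of how such a loop filters light, the sketch below evaluates the standard ring-resonator resonance condition, m × λ = n_eff × L; the effective index and circumference are assumed values for illustration, not parameters from the paper.

```python
# Resonance condition of a ring resonator: light builds up at wavelengths
# where an integer number of wavelengths fits the optical path n_eff * L.
n_eff = 2.4              # assumed effective refractive index of the waveguide
circumference_um = 60.0  # assumed ring circumference in micrometres

for m in range(90, 95):  # resonance orders near the telecom band (~1.55 um)
    wavelength_um = n_eff * circumference_um / m
    print(f"order m = {m}: resonant wavelength = {wavelength_um:.4f} um")
```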
    Alex Belsley, Quantum Engineering Technology Labs (QET Labs) PhD student and lead author of the work, said: “We are one step closer to all integrated photonic sensors operating at the limits of detection imposed by quantum mechanics.”
    Employing this technology to sense absorption or refractive-index changes makes it possible to identify and characterise a wide range of materials and biochemical samples, with topical applications ranging from monitoring greenhouse gases to cancer detection.
    Associate Professor Jonathan Matthews, co-Director of QET Labs and co-author of the work, stated: “We are really excited by the opportunities this result enables: we now know how to use mass manufacturable processes to engineer chip scale photonic sensors that operate at the quantum limit.”
    Story Source:
    Materials provided by University of Bristol. Note: Content may be edited for style and length.

  • A quantum drum that stores quantum states for record-long times

    Researchers at the Niels Bohr Institute, University of Copenhagen, have improved the coherence time of a previously developed quantum membrane dramatically. The improvement will expand the usability of the membrane for a variety of different purposes. With a coherence time of one hundred milliseconds, the membrane can for example store sensitive quantum information for further processing in a quantum computer or network. The result has now been published in Nature Communications.
    The quantum drum is now connected to a read-out unit
    As a first step, the team of researchers has combined the membrane with a superconducting microwave circuit, which enables precise readouts from the membrane. That is, it has become “plugged in,” as required for virtually any application. With this development, the membrane can be connected to various other devices that process or transmit quantum information.
    Cooling the quantum drum system to reach quantum ground state
    Since the temperature of the environment determines the level of random forces disturbing the membrane, it is imperative to reach a sufficiently low temperature to prevent the quantum state of motion from being washed out. The researchers achieve this by means of a helium-based refrigerator. With the help of the microwave circuit, they can then control the quantum state of the membrane motion. In their recent work, the researchers could prepare the membrane in the quantum ground state, meaning that its motion is dominated by quantum fluctuations. The quantum ground state corresponds to an effective temperature of 0.00005 degrees above absolute zero, which is −273.15 °C.
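    For a sense of scale, the sketch below uses the Bose-Einstein distribution to estimate the mean thermal phonon number at a few temperatures; the 1 MHz membrane frequency is an assumed, illustrative value rather than a figure from the paper.

```python
# Mean thermal phonon number n = 1 / (exp(h*f / (k_B*T)) - 1): at microkelvin
# effective temperatures a megahertz mechanical mode has an occupation of
# order one or below, i.e. it sits near its quantum ground state.
import math

h = 6.62607015e-34    # Planck constant, J*s
k_B = 1.380649e-23    # Boltzmann constant, J/K
f = 1.0e6             # assumed membrane mode frequency: 1 MHz

for T in (4.0, 0.01, 5e-5):   # kelvin: liquid helium, dilution fridge, effective T
    n = 1.0 / math.expm1(h * f / (k_B * T))
    print(f"T = {T:g} K  ->  mean phonon number ~ {n:.3g}")
```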
    Applications for the plugged in quantum membrane are many
    One could use a slightly modified version of this system, one that feels forces from both microwave and optical signals, to build a quantum transducer from microwaves to optics. Quantum information can be transported at room temperature in optical fibers over kilometers without perturbation. The information, however, is typically processed inside a cooling unit capable of reaching temperatures low enough for superconducting circuits, such as the one coupled to the membrane, to operate. Connecting these two systems — superconducting circuits to optical fibers — could therefore enable the construction of a quantum internet: several quantum computers linked together with optical fibers. No computer has infinite capacity, so the possibility of distributing computation across connected quantum computers would greatly enhance the ability to solve complicated problems.
    Gravity — not well understood in quantum mechanics, but crucial — can now be explored
    The role of gravity in the quantum regime is a yet unanswered, fundamental question in physics. This is another area where the long coherence time of the membrane demonstrated here may be put to use. One hypothesis is that gravity has the potential to destroy some quantum states over time. With a device as big as the membrane, such hypotheses may be tested in the future.
    Story Source:
    Materials provided by University of Copenhagen – Faculty of Science. Note: Content may be edited for style and length.

  • A novel all-optical switching method makes optical computing and communication systems more power-efficient

    Photonics researchers have introduced a novel method to control one light beam with another through a unique plasmonic metasurface in a linear medium at ultra-low power. This simple linear switching method makes nanophotonic devices such as optical computing and communication systems more sustainable, requiring only a low intensity of light.
    All-optical switching is the modulation of signal light due to control light in such a way that it possesses the ON/OFF conversion function. In general, a light beam can be modulated with another intense laser beam in the presence of a nonlinear medium.
    The switching method developed by the researchers is fundamentally based on the quantum optical phenomenon known as Enhancement of Index of Refraction (EIR).
    “Our work is the first experimental demonstration of this effect on the optical system and its utilization for linear all-optical switching. The research also enlightens the scientific community to achieve loss-compensated plasmonic devices operating at resonance frequencies through extraordinary enhancement of refractive index without using any gain media or nonlinear processes,” says Humeyra Caglayan, Associate Professor (tenure track) in Photonics at Tampere University.
    Optical switching enabled with ultrafast speed
    High-speed switching and a low-loss medium that avoids strong dissipation of the signal during propagation are the basis for developing integrated photonic technology, in which photons are used as information carriers instead of electrons. To realize on-chip ultrafast all-optical switch networks and photonic central processing units, all-optical switching must offer an ultrafast switching time, ultralow threshold control power, ultrahigh switching efficiency, and nanoscale feature size.

  • Study explores the promises and pitfalls of evolutionary genomics

    The second-century Alexandrian astronomer and mathematician Claudius Ptolemy had a grand ambition. Hoping to make sense of the motion of stars and the paths of planets, he published a magisterial treatise on the subject, known as the Almagest. Ptolemy created a complex mathematical model of the universe that seemed to recapitulate the movements of the celestial objects he observed.
    Unfortunately, a fatal flaw lay at the heart of his cosmic scheme. Following the prejudices of his day, Ptolemy worked from the premise that the Earth was the center of the universe. The Ptolemaic universe, composed of complex “epicycles” to account for planet and star movements, has long since been consigned to the history books, though its conclusions remained the scientific dogma for over 1200 years.
    The field of evolutionary biology is no less subject to misguided theoretical approaches, sometimes producing impressive models that nevertheless fail to convey the true workings of nature as it shapes the dizzying assortment of living forms on Earth.
    A new study examines mathematical models designed to draw inferences about how evolution operates at the level of populations of organisms. The study concludes that such models must be constructed with the greatest care, avoiding unwarranted initial assumptions, weighing the quality of existing knowledge and remaining open to alternate explanations.
    Failure to apply strict procedures in null model construction can lead to theories that seem to square with certain aspects of available data derived from DNA sequencing, yet fail to correctly elucidate underlying evolutionary processes, which are often highly complex and multifaceted.
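    As a generic illustration of what a null model looks like in this setting (not the study's own analysis), the sketch below simulates neutral Wright-Fisher drift to build a null distribution of allele-frequency change, against which an observed shift could be compared before invoking selection or other forces; the population size, time span, and observed shift are all hypothetical.

```python
# Neutral Wright-Fisher drift as a null model: binomial resampling of an
# allele frequency each generation, repeated over many replicate populations.
import numpy as np

rng = np.random.default_rng(1)

def neutral_drift(p0=0.5, pop_size=1000, generations=50, replicates=10_000):
    """Final allele frequencies under pure drift."""
    p = np.full(replicates, p0)
    for _ in range(generations):
        p = rng.binomial(2 * pop_size, p) / (2 * pop_size)
    return p

null = neutral_drift()
observed_shift = 0.12   # hypothetical observed change in allele frequency
p_value = np.mean(np.abs(null - 0.5) >= observed_shift)
print(f"fraction of neutral replicates shifting by >= {observed_shift}: {p_value:.4f}")
```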
    Such theoretical frameworks may offer compelling but ultimately flawed pictures of how evolution actually acts on populations over time, be these populations of bacteria, shoals of fish, or human societies and their various migrations during prehistory.

  • Bumps could smooth quantum investigations

    Atoms do weird things when forced out of their comfort zones. Rice University engineers have thought up a new way to give them a nudge.
    Materials theorist Boris Yakobson and his team at Rice’s George R. Brown School of Engineering have a theory that changing the contour of a layer of 2D material, thus changing the relationships between its atoms, might be simpler to do than previously thought.
    While others twist 2D bilayers — two layers stacked together — of graphene and the like to change their topology, the Rice researchers suggest through computational models that growing or stamping single-layer 2D materials on a carefully designed undulating surface would achieve “an unprecedented level of control” over their magnetic and electronic properties.
    They say the discovery opens a path to explore many-body effects, the interactions among multiple microscopic particles, including in quantum systems.
    The paper by Yakobson and two alumni of his lab, co-lead author Sunny Gupta and Henry Yu, appears in Nature Communications.
    The researchers were inspired by recent discoveries that twisting or otherwise deforming 2D material bilayers like bilayer graphene into “magic angles” induced interesting electronic and magnetic phenomena, including superconductivity.

  • Growing wildfire threats loom over the birthplace of the atomic bomb

    There are things I will always remember from my time in New Mexico. The way the bark of towering ponderosa pines smells of vanilla when you lean in close. Sweeping vistas, from forested mountaintops to the Rio Grande Valley, that embellish even the most mundane shopping trip. The trepidation that comes with the tendrils of smoke rising over nearby canyons and ridges during the dry, wildfire-prone summer months.

    There were no major wildfires near Los Alamos National Laboratory during the year and a half that I worked in public communications there and lived just across Los Alamos Canyon from the lab. I’m in Maryland now, and social media this year has brought me images and video clips of the wildfires that have been devastating parts of New Mexico, including the Cerro Pelado fire in the Jemez Mountains just west of the lab.

    Wherever they pop up, wildfires can ravage the land, destroy property and displace residents by the tens of thousands. The Cerro Pelado fire is small compared with others raging east of Santa Fe — it grew only to the size of Washington, D.C. The fire, which started mysteriously on April 22, is now mostly contained. But at one point it came within 5.6 kilometers of the lab, seriously threatening the place that’s responsible for creating and maintaining key portions of fusion bombs in our nation’s nuclear arsenal.

    That close call may be just a hint of growing fire risks to come for the weapons lab as the Southwest suffers in the grip of an epic drought made worse by human-caused climate change (SN: 4/16/20). May and June typically mark the start of the state’s wildfire season. This year, fires erupted in April and were amplified by a string of warm, dry and windy days. The Hermits Peak and Calf Canyon fires east of Santa Fe have merged to become the largest wildfire in New Mexico’s recorded history.

    Los Alamos National Lab is in northern New Mexico, about 56 kilometers northwest of Santa Fe. The lab’s primary efforts revolve around nuclear weapons, accounting for 71 percent of its $3.9 billion budget, according to the lab’s fiscal year 2021 numbers. The budget covers a ramp-up in production of hollow plutonium spheres, known as “pits” because they are the cores of nuclear bombs, to 30 per year beginning in 2026. That’s triple the lab’s current capability of 10 pits per year. The site is also home to radioactive waste and debris that has been a consequence of weapons production since the first atomic bomb was built in Los Alamos in the early 1940s (SN: 8/6/20).

    What is the danger due to fire approaching the lab’s nuclear material and waste? According to literature that Peter Hyde, a spokesperson for the lab, sent to me to ease my concern, not much.

    Over the last 3½ years, the lab has removed 3,500 tons of trees and other potential wildfire fuel from the sprawling, 93-square-kilometer complex. Lab facilities, a lab pamphlet says, “are designed and operated to protect the materials that are inside, and radiological and other potentially hazardous materials are stored in containers that are engineered and tested to withstand extreme environments, including heat from fire.”

    What’s more, most of the roughly 20,000 drums full of nuclear waste that were stored under tents on the lab’s grounds have been removed. They were a cause for anxiety during the last major fire to threaten the lab, in 2011. According to the most recent numbers on the project’s website, all but 3,812 of those drums have been shipped off to be stored 655 meters underground at the Waste Isolation Pilot Plant near Carlsbad, N.M.

    But there’s still 3,500 cubic meters of nuclear waste in the storage area, according to a March 2022 DOE strategic planning document for Los Alamos. That’s enough to fill 17,000 55-gallon drums. So potentially disastrous quantities of relatively exposed nuclear waste remain at the lab — a single drum from the lab site that exploded after transport to Carlsbad in 2014 resulted in a two-year shutdown of the storage facility. With a total budgeted cleanup cost of $2 billion, the incident is one of the most expensive nuclear accidents in the nation’s history.
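    A quick back-of-the-envelope check of that drum figure (assuming US gallons and standard 55-gallon drums):

```python
# Convert 3,500 cubic meters of waste into equivalent 55-gallon drums.
GALLON_M3 = 0.003785411784        # one US gallon in cubic meters
drum_m3 = 55 * GALLON_M3          # volume of a 55-gallon drum, ~0.208 m^3
waste_m3 = 3500
print(round(waste_m3 / drum_m3))  # ~16,800, i.e. roughly 17,000 drums
```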

    Since the 2011 fire, a wider buffer space around the tents has been cleared of vegetation. That, in conjunction with fire suppression systems, makes it unlikely that wildfire will be a danger to the waste-filled drums, according to a 2016 risk analysis of extreme wildfire scenarios conducted by the lab.

    But a February 2021 audit by the U.S. Department of Energy’s Office of Inspector General is less rosy. It found that, despite the removal of most of the waste drums and the multiyear wildfire mitigation efforts that the lab describes, the lab’s wildfire protection is still lacking.

    According to the 20-page federal audit, the lab at that time had not developed a “comprehensive, risk-based approach to wildland fire management” in accordance with federal policies related to wildland fire management. The report also noted compounding issues, including the absence of federal oversight of the lab’s wildfire management activities.

    A canyon on lab grounds that runs alongside the adjacent city of Los Alamos (two spots shown) was called out in an audit by the Department of Energy’s Office of Inspector General because it was packed with about 400 to 500 trees per acre. The ideal number from a wildfire management viewpoint is 40 to 50 trees per acre. Credit: The Department of Energy’s Wildland Fire Prevention Efforts at the Los Alamos National Laboratory

    Among the ongoing risks, not all fire roads were maintained well enough to provide a safe route for firefighters and others, “which could create dangerous conditions for emergency responders and delay response times,” the auditors wrote.

    And a canyon that runs between the lab and the adjacent town of Los Alamos was identified in the report as being packed with 10 times the number of trees that would be ideal, from a wildfire safety perspective. To make matters worse, there’s a hazardous waste site at the bottom of the canyon that could, the auditors wrote, “produce a health risk to the environment and to human health during a fire.”

    “The report was pretty stark,” says Edwin Lyman, director of nuclear power safety at the Union of Concerned Scientists. “And certainly, after all the warnings, if they’re still not doing all they need to do to fully mitigate the risk, then that’s just foolishness.”

    A 2007 federal audit of Los Alamos, as well as nuclear weapons facilities in Washington state and Idaho, showed similar problems. In short, it seems little has changed at Los Alamos in the 14-year span between 2007 and 2021. Lab spokespeople did not respond to my questions about the lab’s efforts to address the specific problems identified in the 2021 report, despite repeated requests. 

    The Los Alamos area has experienced three major wildfires since the lab was founded — the Cerro Grande fire in 2000, Las Conchas in 2011 and Cerro Pelado this year. But we probably can’t count on 11-year gaps between future wildfires near Los Alamos, according to Alice Hill, the senior fellow for energy and the environment with the Council on Foreign Relations, who’s based in Washington, D.C.

    The changing climate is expected to dramatically affect wildfire risks in years to come, turning Los Alamos and surrounding areas into a tinderbox. A study in 2018 in Climatic Change found that the region extending from the higher elevations in New Mexico, where Los Alamos is located, into Colorado and Arizona will experience the greatest increase in wildfire probabilities in the Southwest. A new risk projection tool that was recommended by Hill, called Risk Factor, also shows increasing fire risk in the Los Alamos area over the next 30 years.

    “We are at the point where we are imagining, as we have to, things that we’ve never experienced,” Hill says. “That is fundamentally different than how we have approached these problems throughout human history, which is to look to the past to figure out how to be safer in the future…. The nature of wildfire has changed as more heat is added [to the planet], as temperatures rise.”

    Increased plutonium pit production will add to the waste that needs to be shipped to Carlsbad. “Certainly, the radiological assessments in sort of the worst case of wildfire could lead to a pretty significant release of radioactivity, not only affecting the workers onsite but also the offsite public. It’s troubling,” says Lyman, who suggests that nuclear labs like Los Alamos should not be located in such fire-prone areas.

    The Los Alamos Neutron Science Center (shown in March of 2019), a key facility at Los Alamos National Laboratory, was evacuated in March 2019 when power lines sparked a nearby wildfire. It could be damaged or even destroyed if a high-intensity wildfire burned through a nearby heavily forested canyon, according to an audit by the Department of Energy’s Office of Inspector General. Credit: The Department of Energy’s Wildland Fire Prevention Efforts at the Los Alamos National Laboratory

    For now, some risks from the Cerro Pelado wildfire will persist, according to Jeff Surber, operations section chief for the U.S. Department of Agriculture Forest Service’s efforts to fight the fire. Large wildfires like Cerro Pelado “hold heat for so long and they continue to smolder in the interior where it burns intermittently,” he said in a May 9 briefing to Los Alamos County residents, and to concerned people like me watching online.

    It will be vital to monitor the footprint of the fire until rain or snow finally snuffs it out late in the year. Even then, some danger will linger in the form of “zombie fires” that can flame up long after wildfires appear to have been extinguished (SN: 5/19/21). “We’ve had fires come back in the springtime because there was a root underground that somehow stayed lit all winter long,” said Surber.

    So the Cerro Pelado fire, and its occasional smoky tendrils, will probably be a part of life in northern New Mexico for months still. And the future seems just as fiery, if not worse. That’s something all residents, including the lab, need to be preparing for.

    Meantime, if you make it out to the mountains of New Mexico soon enough, be sure to sniff a vanilla-flavored ponderosa while you still can. I know I will.