More stories

  • One photon is all it takes to kick off photosynthesis

    For photosynthesis, one photon is all it takes.

    Only a single particle of light is required to spark the first steps of the biological process that converts light into chemical energy, scientists report June 14 in Nature.

    While scientists have long assumed that the reactions of photosynthesis begin upon the absorption of just one photon, that hadn’t yet been demonstrated, says physical chemist Graham Fleming, of the University of California, Berkeley. He and colleagues decided “we would just look to see was it really true that one photon was enough to start the whole thing off.”

    The sunlight that falls on Earth’s surface seems brilliant to human eyes. But on small scales, that translates to a dribble of photons. Only a few tens of photons of the appropriate wavelengths of sunlight fall on a square nanometer per second, the scale of the tiny chlorophyll and bacteriochlorophyll molecules that are central to photosynthesis in plants and bacteria.
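    For a rough sense of that scale, here is a back-of-the-envelope estimate in Python. The irradiance value and the fraction of sunlight assumed to fall in a usable absorption band are illustrative inputs, not numbers from the study:

        # Rough photon flux at Earth's surface, per square nanometer.
        h = 6.626e-34          # Planck constant (J*s)
        c = 3.0e8              # speed of light (m/s)
        irradiance = 1000.0    # W/m^2, approximate full sun at the surface (assumed)
        wavelength = 870e-9    # m, near bacteriochlorophyll's absorption peak (assumed)
        band_fraction = 0.01   # assumed fraction of sunlight in the usable band

        photon_energy = h * c / wavelength            # ~2.3e-19 J per photon
        flux_per_m2 = irradiance / photon_energy      # photons / m^2 / s
        flux_per_nm2 = flux_per_m2 * 1e-18            # 1 nm^2 = 1e-18 m^2
        print(f"{flux_per_nm2 * band_fraction:.0f} photons per nm^2 per second")
        # prints roughly 44 -- a few tens, as described above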

    Many laboratory experiments on photosynthesis use lasers, much more powerful light sources, to kick off the reactions. Instead, Fleming and colleagues used a source of light that produces just two photons at a time. One photon served as a herald, going off to a detector to let researchers know when two photons were released. The other photon went into a solution containing photon-absorbing structures from the photosynthetic bacterium Rhodobacter sphaeroides. These structures, called light-harvesting 2 complexes, or LH2, are made up of two rings of bacteriochlorophyll and other molecules.

    In a normal photosynthesis reaction, LH2 absorbs a photon and passes its energy to another LH2 complex, and then another, like a game of hot potato. Eventually the energy reaches another type of ring, called the light-harvesting 1 complex, or LH1, which then passes it to the reaction center where the energy is finally converted into a form that the bacterium can use.

    In the experiment, there was no LH1, so the LH2 instead emitted a photon of a different wavelength than the first, a sign that energy had been transferred from the first ring of LH2 to the second, a first step of photosynthesis. The researchers detected that second photon, and by comparing the detection times to those of the initial herald photons, confirmed that the LH2 needed to absorb only one photon to kick things off.
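    The timing comparison works like a coincidence measurement: fluorescence photons cluster at nanosecond delays after their heralds, while unrelated detections spread out evenly. A toy version in Python, with entirely made-up rates (a sketch of the idea, not the team's analysis):

        import numpy as np

        rng = np.random.default_rng(0)

        # Herald detection times over one second (assumed rate of 10,000/s).
        herald_times = np.sort(rng.uniform(0.0, 1.0, 10_000))
        # Assume 10% of heralded photons produce an LH2 fluorescence photon
        # about a nanosecond later; add some uncorrelated background counts.
        emitted = rng.random(herald_times.size) < 0.1
        fluor = herald_times[emitted] + rng.exponential(1e-9, emitted.sum())
        background = rng.uniform(0.0, 1.0, 2_000)
        detections = np.sort(np.concatenate([fluor, background]))

        # Delay between each detection and the most recent herald.
        idx = np.searchsorted(herald_times, detections) - 1
        ok = idx >= 0
        delays = detections[ok] - herald_times[idx[ok]]
        counts, _ = np.histogram(delays, bins=50, range=(0.0, 10e-9))
        # A sharp peak at nanosecond delays above a flat background is the
        # one-photon-in, one-photon-out signature.
        print(counts[:10], counts[-10:])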

    Plants and bacteria use different processes for photosynthesis, but the initial steps are similar enough that in plants, too, a single photon would set off the initial steps, Fleming says. However, in plants, multiple independently absorbed photons are needed in order to complete the reaction.

    The role of single photons isn’t surprising, says biochemist Richard Cogdell of the University of Glasgow in Scotland. The important thing the researchers have done, he says, is to demonstrate the new technique. “By doing this you’re able to essentially interrogate what will be happening in nature,” he says.  

    Some scientists suspect that photosynthesis relies on quantum physics (SN: 2/3/10). While it’s not clear whether the new technique could resolve the role of quantum effects, it could help scientists disentangle natural effects from artifacts of using intense sources of light in studies of photosynthesis.

    “You can really work out what’s happening in the early reactions in photosynthesis as it were outside,” says Cogdell, “[as if] you could shrink yourself down and watch these photons moving around.”

  • A step toward safe and reliable autopilots for flying

    In the film “Top Gun: Maverick,” Maverick, played by Tom Cruise, is charged with training young pilots to complete a seemingly impossible mission — to fly their jets deep into a rocky canyon, staying so low to the ground they cannot be detected by radar, then rapidly climb out of the canyon at an extreme angle, avoiding the rock walls. Spoiler alert: With Maverick’s help, these human pilots accomplish their mission.
    A machine, on the other hand, would struggle to complete the same pulse-pounding task. For an autonomous aircraft, the most straightforward path toward the target conflicts with what the machine must do to avoid colliding with the canyon walls or to stay undetected. Many existing AI methods cannot overcome this conflict, known as the stabilize-avoid problem, and would be unable to reach their goal safely.
    MIT researchers have developed a new technique that can solve complex stabilize-avoid problems better than other methods. Their machine-learning approach matches or exceeds the safety of existing methods while providing a tenfold increase in stability, meaning the agent reaches and remains stable within its goal region.
    In an experiment that would make Maverick proud, their technique effectively piloted a simulated jet aircraft through a narrow corridor without crashing into the ground.
    “This has been a longstanding, challenging problem. A lot of people have looked at it but didn’t know how to handle such high-dimensional and complex dynamics,” says Chuchu Fan, the Wilson Assistant Professor of Aeronautics and Astronautics, a member of the Laboratory for Information and Decision Systems (LIDS), and senior author of a new paper on this technique.
    Fan is joined by lead author Oswin So, a graduate student. The paper will be presented at the Robotics: Science and Systems conference.

    The stabilize-avoid challenge
    Many approaches tackle complex stabilize-avoid problems by simplifying the system so they can solve it with straightforward math, but the simplified results often don’t hold up to real-world dynamics.
    More effective techniques use reinforcement learning, a machine-learning method where an agent learns by trial-and-error with a reward for behavior that gets it closer to a goal. But there are really two goals here — remain stable and avoid obstacles — and finding the right balance is tedious.
    The MIT researchers broke the problem down into two steps. First, they reframe the stabilize-avoid problem as a constrained optimization problem. In this setup, solving the optimization enables the agent to reach and stabilize to its goal, meaning it stays within a certain region. By applying constraints, they ensure the agent avoids obstacles, So explains.
    Then for the second step, they reformulate that constrained optimization problem into a mathematical representation known as the epigraph form and solve it using a deep reinforcement learning algorithm. The epigraph form lets them bypass the difficulties other methods face when using reinforcement learning.
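    In generic terms (the paper's exact objective and constraint functions are not given here), the epigraph form trades the objective for an auxiliary variable z and an extra constraint:

        \min_{\pi} \; J(\pi) \quad \text{s.t.} \quad g(\pi) \le 0
        \qquad \Longrightarrow \qquad
        \min_{\pi,\, z} \; z \quad \text{s.t.} \quad J(\pi) \le z, \quad g(\pi) \le 0

    The two problems have the same solutions, but in the second the objective is a bare variable, which avoids hand-tuning a weighted sum of the stability reward and the safety penalty.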

    “But deep reinforcement learning isn’t designed to solve the epigraph form of an optimization problem, so we couldn’t just plug it into our problem. We had to derive the mathematical expressions that work for our system. Once we had those new derivations, we combined them with some existing engineering tricks used by other methods,” So says.
    No points for second place
    To test their approach, they designed a number of control experiments with different initial conditions. For instance, in some simulations, the autonomous agent needs to reach and stay inside a goal region while making drastic maneuvers to avoid obstacles that are on a collision course with it.
    When compared with several baselines, their approach was the only one that could stabilize all trajectories while maintaining safety. To push their method even further, they used it to fly a simulated jet aircraft in a scenario one might see in a “Top Gun” movie. The jet had to stabilize to a target near the ground while maintaining a very low altitude and staying within a narrow flight corridor.
    The simulated jet model, open-sourced in 2018, had been designed by flight control experts as a testing challenge: could researchers create a scenario that their controller could not fly? But the model was so complicated it was difficult to work with, and it still couldn’t handle complex scenarios, Fan says.
    The MIT researchers’ controller was able to prevent the jet from crashing or stalling while stabilizing to the goal far better than any of the baselines.
    In the future, this technique could be a starting point for designing controllers for highly dynamic robots that must meet safety and stability requirements, like autonomous delivery drones. Or it could be implemented as part of a larger system. Perhaps the algorithm is only activated when a car skids on a snowy road, to help the driver safely navigate back to a stable trajectory.
    Navigating extreme scenarios that a human wouldn’t be able to handle is where their approach really shines, So adds.
    “We believe that a goal we should strive for as a field is to give reinforcement learning the safety and stability guarantees that we will need to provide us with assurance when we deploy these controllers on mission-critical systems. We think this is a promising first step toward achieving that goal,” he says.
    Moving forward, the researchers want to enhance their technique so it is better able to take uncertainty into account when solving the optimization. They also want to investigate how well the algorithm works when deployed on hardware, since there will be mismatches between the dynamics of the model and those in the real world.
    The work is funded, in part, by MIT Lincoln Laboratory under the Safety in Aerobatic Flight Regimes program.

  • Four-legged robot traverses tricky terrains thanks to improved 3D vision

    Researchers led by the University of California San Diego have developed a new model that trains four-legged robots to see more clearly in 3D. The advance enabled a robot to autonomously cross challenging terrain with ease — including stairs, rocky ground and gap-filled paths — while clearing obstacles in its way.
    The researchers will present their work at the 2023 Conference on Computer Vision and Pattern Recognition (CVPR), which will take place from June 18 to 22 in Vancouver, Canada.
    “By providing the robot with a better understanding of its surroundings in 3D, it can be deployed in more complex environments in the real world,” said study senior author Xiaolong Wang, a professor of electrical and computer engineering at the UC San Diego Jacobs School of Engineering.
    The robot is equipped with a forward-facing depth camera on its head. The camera is tilted downwards at an angle that gives it a good view of both the scene in front of it and the terrain beneath it.
    To improve the robot’s 3D perception, the researchers developed a model that first takes 2D images from the camera and translates them into 3D space. It does this by looking at a short video sequence that consists of the current frame and a few previous frames, then extracting pieces of 3D information from each 2D frame. That includes information about the robot’s leg movements such as joint angle, joint velocity and distance from the ground. The model compares the information from the previous frames with information from the current frame to estimate the 3D transformation between the past and the present.
    The model fuses all that information together so that it can use the current frame to synthesize the previous frames. As the robot moves, the model checks the synthesized frames against the frames that the camera has already captured. If they are a good match, then the model knows that it has learned the correct representation of the 3D scene. Otherwise, it makes corrections until it gets it right.
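    A minimal sketch of that synthesize-and-compare training signal, with hypothetical module and tensor names (the authors' actual model, a neural volumetric memory, is more involved):

        import torch
        import torch.nn.functional as F

        def reconstruction_loss(encoder, renderer, frames, motions):
            """frames:  (T, C, H, W) depth-camera frames; frames[-1] is current.
            motions: (T-1, 6) estimated camera motion from each past frame
            to the present (hypothetical parameterization)."""
            # Lift the current frame into a 3D feature representation.
            memory = encoder(frames[-1].unsqueeze(0))
            loss = frames.new_zeros(())
            for t in range(frames.shape[0] - 1):
                # Re-render the scene as it would have looked at time t.
                synthesized = renderer(memory, motions[t])
                # If the 3D representation is right, this matches what the
                # camera actually captured back then.
                loss = loss + F.l1_loss(synthesized, frames[t])
            return loss / (frames.shape[0] - 1)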

    The 3D representation is used to control the robot’s movement. By synthesizing visual information from the past, the robot is able to remember what it has seen, as well as the actions its legs have taken before, and use that memory to inform its next moves.
    “Our approach allows the robot to build a short-term memory of its 3D surroundings so that it can act better,” said Wang.
    The new study builds on the team’s previous work, where researchers developed algorithms that combine computer vision with proprioception — which involves the sense of movement, direction, speed, location and touch — to enable a four-legged robot to walk and run on uneven ground while avoiding obstacles. The advance here is that by improving the robot’s 3D perception (and combining it with proprioception), the researchers show that the robot can traverse more challenging terrain than before.
    “What’s exciting is that we have developed a single model that can handle different kinds of challenging environments,” said Wang. “That’s because we have created a better understanding of the 3D surroundings that makes the robot more versatile across different scenarios.”
    The approach has its limitations, however. Wang notes that their current model does not guide the robot to a specific goal or destination. When deployed, the robot simply takes a straight path and if it sees an obstacle, it avoids it by walking away via another straight path. “The robot does not control exactly where it goes,” he said. “In future work, we would like to include more planning techniques and complete the navigation pipeline.”
    Video: https://youtu.be/vJdt610GSGk
    Paper title: “Neural Volumetric Memory for Visual Locomotion Control.” Co-authors include Ruihan Yang, UC San Diego, and Ge Yang, Massachusetts Institute of Technology.
    This work was supported in part by the National Science Foundation (CCF-2112665, IIS-2240014, 1730158 and ACI-1541349), an Amazon Research Award and gifts from Qualcomm.

  • The chatbot will see you now

    The informed consent process in biomedical research is biased towards people who can meet with clinical study staff during the working day. Even for those who can make time for a consent conversation, the time burden can be off-putting. Professor Eric Vilain, from the Department of Paediatrics, University of California, Irvine, USA, will tell the European Society of Human Genetics annual conference today (Tuesday 13 June) how his team’s study of a chatbot in the consent process (GIA, the ‘Genetics Information Assistant’ developed by Invitae Corporation) shows that it encourages inclusivity and leads to faster completion and high levels of understanding. Since such consent is the cornerstone of all research studies, clinicians have long sought ways to cut the time it takes without diminishing participants’ understanding.
    Working with their institutional review board (IRB), Prof Vilain’s team from across the University of California Irvine, Children’s National Hospital, and Invitae Corporation transformed the trial consent form and protocol into a logic flow and script for the GIA chatbot. Unlike conventional methods of obtaining consent, the bot was able to quiz participants to assess the knowledge they had attained. It could also be accessed at any time, allowing individuals with less free time to use it outside normal business hours. “We saw that more than half of our participants interacted with the bot at these times, and this shows its utility in decreasing the barriers to entry to research. Currently, most people who participate in biomedical research have time to do so as well as the knowledge that studies exist,” says Prof Vilain.
    The researchers involved 72 families in the consent process during a six-month time period as part of the US national GREGoR consortium, a National Institutes of Health initiative to advance rare disease research. A total of 37 families completed consent using the traditional process, while 35 used the chatbot. The researchers found that the median length of the consent conversation was shorter for those using the bot, at 44 rather than 76 minutes, and the time from referral to the study to consent completion was also faster, at five as opposed to 16 days. The level of understanding of those who had used the bot was assessed with a 10-question quiz that 96% of participants passed, and a request for feedback showed that 86% thought that they had had a positive experience.
    “I was surprised and pleased that a significant number of people would feel comfortable communicating with a chatbot,” says Prof Vilain. “But we worked hard with our IRB to ensure that it didn’t ‘hallucinate’ (make mistakes) and to ensure that knowledge was conveyed correctly. When the bot was unable to answer a question, it encouraged the participant to speak with a member of the study team.”
    While it is not possible to give an accurate account of the cost savings, the researchers say the staff time saved was substantial. Because people can pause and resume the chatbot consent process at any time, it can be completed much more quickly; four participants, for example, finished within 24 hours. Of the quick consent conversations (less than an hour), 83% were with the chatbot, while 66% of the longer conversations (between one and two hours) were with a study staff member.
    “But it’s far from being just about speed,” says Prof Vilain. “The traditional method of consenting does not have a mechanism to verify understanding objectively. It is based on the conviction of the study staff member hosting the conversation that the consent has been informed properly and the individual understands what they are consenting to. The chat-based method can test comprehension more objectively. It does not allow users who do not show understanding to give consent, and puts them in touch with a genetic counsellor to figure out why knowledge transmission did not occur.
    “We believe that our work has made an important contribution to the obtention of properly-informed consent, and would now like to see it used in different languages to reach global populations,” he concludes.
    Professor Alexandre Reymond, chair of the conference, said: “The keystone to informed consent should be that it is by definition ‘informed’, and we should explore all possibilities to ensure this in the future.”

  • Loneliness, insomnia linked to work with AI systems

    Employees who frequently interact with artificial intelligence systems are more likely to experience loneliness that can lead to insomnia and increased after-work drinking, according to research published by the American Psychological Association.
    Researchers conducted four experiments in the U.S., Taiwan, Indonesia and Malaysia. Findings were consistent across cultures. The research was published online in the Journal of Applied Psychology.
    In a prior career, lead researcher Pok Man Tang, PhD, worked in an investment bank where he used AI systems, which led to his interest in researching the timely issue.
    “The rapid advancement in AI systems is sparking a new industrial revolution that is reshaping the workplace with many benefits but also some uncharted dangers, including potentially damaging mental and physical impacts for employees,” said Tang, an assistant professor of management at the University of Georgia. “Humans are social animals, and isolating work with AI systems may have damaging spillover effects into employees’ personal lives.”
    At the same time, working with AI systems may have some benefits. The researchers found that employees who frequently used AI systems were more likely to offer help to fellow employees, but that response may have been triggered by their loneliness and need for social contact.
    Furthermore, the studies found that participants with higher levels of attachment anxiety — the tendency to feel insecure and worried about social connections — responded more strongly to working with AI systems, with both positive reactions, such as helping others, and negative ones, such as loneliness and insomnia.

    In one experiment, 166 engineers at a Taiwanese biomedical company who worked with AI systems were surveyed over three weeks about their feelings of loneliness, attachment anxiety and sense of belonging. Coworkers rated individual participants on their helpful behaviors, and family members reported on participants’ insomnia and after-work alcohol consumption. Employees who interacted more frequently with AI systems were more likely to experience loneliness, insomnia and increased after-work alcohol consumption, but also showed some helping behaviors toward fellow employees.
    In another experiment with 126 real estate consultants in an Indonesian property management company, half were instructed not to use AI systems for three consecutive days while the other half were told to work with AI systems as much as possible. The findings for the latter group were similar to the previous experiment, except there was no association between the frequency of AI use and after-work alcohol consumption.
    There were similar findings from an online experiment with 214 full-time working adults in the U.S. and another with 294 employees at a Malaysian tech company.
    The research findings are correlational and don’t prove that work with AI systems causes loneliness or the other responses, just that there is an association among them.
    Tang said that moving forward, developers of AI technology should consider equipping AI systems with social features, such as a human voice, to emulate human-like interactions. Employers also could limit the frequency of work with AI systems and offer opportunities for employees to socialize.
    Team decision-making and other tasks where social connections are important could be done by people, while AI systems could focus more on tedious and repetitive tasks, Tang added.
    “Mindfulness programs and other positive interventions also might help relieve loneliness,” Tang said. “AI will keep expanding so we need to act now to lessen the potentially damaging effects for people who work with these systems.”

  • Mori3: A polygon shape-shifting robot for space travel

    Jamie Paik and her team of researchers at EPFL’s School of Engineering have created an origami-like robot that can change shape, move around and interact with objects and people.
    By combining inspiration from the digital world of polygon meshing and the biological world of swarm behavior, the Mori3 robot can morph from 2D triangles into almost any 3D object. The EPFL research, which has been published in Nature Machine Intelligence, shows the promise of modular robotics for space travel. “Our aim with Mori3 is to create a modular, origami-like robot that can be assembled and disassembled at will depending on the environment and task at hand,” says Jamie Paik, director of the Reconfigurable Robotics Lab. “Mori3 can change its size, shape and function.”
    A polygon robot
    The individual modules of the Mori3 robot are triangular in shape. The modules easily join together to create polygons of different sizes and configurations in a process known as polygon meshing. “We have shown that polygon meshing is a viable robotic strategy,” says Christoph Belke, a postdoctoral researcher in robotics. To achieve this, the team had to push the boundaries of various aspects of robotics, including the mechanical and electronic design, computer systems and engineering. “We had to rethink the way we understand robotics,” explains Belke. “These robots can change their own shape, attach to each other, communicate and reconfigure to form functional and articulated structures.” This proof of concept is a success: Mori3 robots are good at the three things robots should be able to do, namely moving around, handling and transporting objects, and interacting with users.
    Destined for space
    What is the advantage of creating modular and multi-functional robots? Paik explains that, to perform a wide range of tasks, robots need to be able to change their shape or configuration. “Polygonal and polymorphic robots that connect to one another to create articulated structures can be used effectively for a variety of applications,” she says. “Of course, a general-purpose robot like Mori3 will be less effective than specialized robots in certain areas. That said, Mori3’s biggest selling point is its versatility.” Mori3 robots were designed in part to be used in spacecraft, which don’t have the room to store different robots for each individual task that needs to be carried out. The researchers hope that Mori3 robots will be used for communication purposes and external repairs.

  • Liquid metal sticks to surfaces without a binding agent

    Everyday materials such as paper and plastic could be transformed into electronic “smart devices” by using a simple new method to apply liquid metal to surfaces, according to scientists in Beijing, China. The study, published June 9 in the journal Cell Reports Physical Science, demonstrates a technique for applying a liquid metal coating to surfaces that do not easily bond with liquid metal. The approach is designed to work at a large scale and may have applications in wearable testing platforms, flexible devices, and soft robotics.
    “Before, we thought that it was impossible for liquid metal to adhere to non-wetting surfaces so easily, but here it can adhere to various surfaces only by adjusting the pressure, which is very interesting,” said Bo Yuan, a scientist at Tsinghua University and the first author of the study.
    Scientists seeking to combine liquid metal with traditional materials have been impeded by liquid metal’s extremely high surface tension, which prevents it from binding with most materials, including paper. To overcome this issue, previous research has mainly focused on a technique called “transfer printing,” which involves using a third material to bind the liquid metal to the surface. But this strategy comes with drawbacks — adding more materials can complicate the process and may weaken the end product’s electrical, thermal, or mechanical performance.
    To explore an alternative approach that would allow them to directly print liquid metal on substrates without sacrificing the metal’s properties, Yuan and colleagues applied two different liquid metals (eGaIn and BiInSn) to various silicone and silicone polymer stamps, then applied different forces as they rubbed the stamps onto paper surfaces.
    “At first, it was hard to realize stable adhesion of the liquid metal coating on the substrate,” said Yuan. “However, after a lot of trial and error, we finally had the right parameters to achieve stable, repeatable adhesion.”
    The researchers found that rubbing the liquid metal-covered stamp against the paper with a small amount of force enabled the metal droplets to bind effectively to the surface, while applying larger amounts of force prevented the droplets from staying in place.
    Next, the team folded the metal-coated paper into a paper crane, demonstrating that the coated paper can still be folded as usual and that it retains its other usual properties.
    While the technique appears promising, Yuan noted that the researchers are still figuring out how to guarantee that the liquid metal coating stays in place after it has been applied. For now, a packaging material can be added to the paper’s surface, but the team hopes to figure out a solution that won’t require it.
    “Just like wet ink on paper can be wiped off by hand, the liquid metal coating without packaging here also can be wiped off by the object it touches as it is applied,” said Yuan. “The properties of the coating itself will not be greatly affected, but objects in contact may be soiled.”
    In the future, the team also plans to build on the method so that it can be used to apply liquid metal to a greater variety of surfaces, including metal and ceramic.
    “We also plan to construct smart devices using materials treated by this method,” said Yuan.
    This work was supported by the China Postdoctoral Science Foundation, the National Natural Science Foundation of China, and cooperation funding between Nanshan and Tsinghua SIGS in science and technology.

  • Novel ferroelectrics for more efficient microelectronics

    When we communicate with others over wireless networks, information is sent to data centers where it is collected, stored, processed, and distributed. As computational energy usage continues to grow, it is on pace to potentially become the leading source of energy consumption in this century. Memory and logic are physically separated in most modern computers, so shuttling data between the two components for access, manipulation, and re-storage is very energy intensive. A team of researchers from Carnegie Mellon University and Penn State University is exploring materials that could allow memory to be integrated directly on top of the transistor. By changing the architecture of the microcircuit, processors could be much more efficient and consume less energy. In addition to bringing these components closer together, the nonvolatile materials under study could eliminate the need for computer memory systems to be refreshed regularly.
    Their recent work, published in Science, explores materials that are ferroelectric, meaning they have a spontaneous electric polarization that can be reversed by applying an external electric field. Recently discovered wurtzite ferroelectrics, which are mainly composed of materials already incorporated in semiconductor technology for integrated circuits, allow for the integration of new power-efficient devices for applications such as non-volatile memory, electro-optics, and energy harvesting. One of the biggest challenges with wurtzite ferroelectrics is that the gap between the electric fields required for operation and the breakdown field is very small.
    “Significant efforts are devoted to increasing this margin, which demands a thorough understanding of the effect of films’ composition, structure, and architecture on the polarization switching ability at practical electric fields,” said Carnegie Mellon post-doctoral researcher Sebastian Calderon, who is the lead author of the paper.
    The two institutions were brought together to collaborate on this study through the Center for 3D Ferroelectric Microelectronics (3DFeM), an Energy Frontier Research Center (EFRC) program led by Penn State University through funding from the U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES).
    Carnegie Mellon’s materials science and engineering department, led by Professor Elizabeth Dickey, was tapped for this project because of its background in studying the role of the structure of materials in the functional properties at very small scales through electron microscopy.
    “Professor Dickey’s group brings a particular topical expertise in measuring the structure of these materials at very small length scales, as well as a focus on the particular electronic materials of interest of this project,” said Jon-Paul Maria, professor of Materials Science and Engineering at Penn State University.
    Together, the research team designed an experiment combining the strong expertise of both institutions on the synthesis, characterization and theoretical modeling of wurtzite ferroelectrics. By observing and quantifying real-time polarization switching using scanning transmission electron microscopy (STEM), the study resulted in a fundamental understanding of how such novel ferroelectric materials switch at the atomic level. As research in this area progresses, the goal is to scale the materials to a size in which they can be used in modern microelectronics.
    This material is based upon work supported by the Center for 3D Ferroelectric Microelectronics (3DFeM), an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences Energy Frontier Research Centers program under Award Number DE-SC0021118.