More stories

  • Landscape Explorer shows how much the American West has changed

    With the click of a mouse, a new mapping tool shows how places in the American West have changed over the last 70 years.

    With just a Web browser, anyone can open Landscape Explorer, which will pull up a modern Google map of the United States beside a black-and-white aerial image of the western states circa 1950. A slider button allows for scrolling back and forth between past and present.

    You can type a place or address into the search bar, then zoom in or out. Search for “Lake Powell” and watch the Colorado River’s red rock canyons of the past turn into a reservoir. Type in “Las Vegas” and see Sin City’s sprawling grid of streets disappear into desert arroyos as you swipe back in time.

    The free tool is an easy way for anyone with an interest in the American West to peruse the past. But Landscape Explorer also has a loftier purpose: helping government agencies, landowners and conservation professionals make complex decisions about how to manage land.

    The powerful visual contrast between the historical snapshot and modern-day satellite imagery “allows us to go from zero to 100” in terms of understanding ecosystem changes, says Scott Morford, an applied spatial ecologist at the University of Montana in Missoula who led the development of Landscape Explorer. The project was supported by Working Lands for Wildlife, a conservation initiative led by the U.S. Department of Agriculture, and other partners. The impetus, Morford says, was to “give us a reference for how rapidly things are changing across biomes that we care about.”

    Before Landscape Explorer, most collections of historical imagery of large-scale landscapes went back to only the 1980s. Finding earlier imagery of large landscapes was expensive and time-consuming. While some previous projects have stitched together historical imagery at small scales to look at how a particular watershed or county has changed, “the real revolution is that we were able to figure out how to do it at scale,” Morford says. “We wanted to make something that was universal and accessible” for everyone, not just remote sensing specialists.

    Morford and colleagues processed about 170,000 aerial images that were taken by U.S. Army pilots during the Cold War and later digitized. To create a continuous mosaic, the team used specialized software to stitch together images of adjacent patches of land, the researchers explained in a paper published in July in Remote Sensing in Ecology and Conservation. The final step was pairing the mosaic with satellite imagery using Google Earth Engine.
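    The stitching step can be illustrated with a toy mosaic: estimate how much two adjacent frames overlap, then blend them into one strip. This is a minimal NumPy sketch of the idea only; the actual pipeline used specialized photogrammetry software, and the brute-force alignment scan below is purely illustrative.

```python
import numpy as np

def align_by_overlap(left, right, max_shift=20):
    """Estimate the horizontal overlap (in pixels) between two adjacent
    aerial frames by testing candidate shifts and scoring pixel agreement.
    Illustrative only -- production mosaicking uses feature matching and
    bundle adjustment, not this brute-force scan."""
    best_shift, best_score = 0, -np.inf
    for s in range(1, max_shift + 1):
        a, b = left[:, -s:], right[:, :s]
        score = -np.mean((a.astype(float) - b.astype(float)) ** 2)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

def mosaic(left, right, overlap):
    """Paste two frames into one continuous strip, averaging the pixels
    in the shared overlap region."""
    h, wl = left.shape
    wr = right.shape[1]
    out = np.zeros((h, wl + wr - overlap))
    out[:, :wl] = left
    out[:, wl:] = right[:, overlap:]
    out[:, wl - overlap:wl] = (left[:, -overlap:] + right[:, :overlap]) / 2
    return out
```

    Repeating this pairwise alignment across ~170,000 frames is what turns individual Cold War photographs into one seamless historical basemap.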

    Landscape Explorer began as a small project to assess the extent of woody encroachment onto grasslands in western Montana. Due to fire suppression, conifers like western juniper or eastern red cedar are taking over ecosystems that were historically treeless, such as sagebrush steppe and prairies. A monoculture of these water-guzzling trees is bad news for local biodiversity and increases the risk of catastrophic wildfire.

    Third-generation Montana rancher Bruce Peterson says that seeing historical and current aerial imagery side by side made him realize how the steady infiltration of trees had devalued his family’s livestock pastures. “It’s a little bit like losing your hearing or your vision with these trees. They eat away a little of your land at a time, and then by the time you get hearing aids or glasses, you realize it’s gotten really out of hand,” Peterson says.

    Using Landscape Explorer, Peterson and dozens of other landowners involved in the Southwest Montana Sagebrush Partnership have prioritized where to remove invading trees. The group has restored nearly 50,000 acres of treeless rangeland since 2020, according to the Nature Conservancy, a member of the partnership.

    Landscape Explorer also helped the Clark Fork Coalition, a Montana-based nonprofit that protects and restores waterways, to see how urban and industrial development has impacted floodplains. “This tool gives us the power of time travel. It’s like a time-lapse showing all that’s been lost and where the continued pressure is very real,” says Karen Knudsen, the coalition’s executive director.

    After seeing the successes in Montana, the makers of Landscape Explorer extended the tool to 17 states in the West to show where forests, grasslands or rivers are most at risk of disappearing and where intact habitats can still be preserved.

    Since the expanded tool was released in September, researchers have used it to assess glacial retreat in the Pacific Northwest, measure the historical extent of sand dunes in coastal California and pinpoint where wetlands have dried up. Morford is excited to see all the ways Landscape Explorer can help land managers. “It’s going to be used in ways we haven’t even thought of yet.”

  • Artificial intelligence makes gripping more intuitive

    Different types of grasps and bionic design: technological developments in recent decades have already led to advanced artificial hands. They can enable amputees who have lost a hand through accident or illness to regain some movement. Some of these modern prostheses allow independent finger movements and wrist rotation. These movements can be selected via a smartphone app or by using muscle signals from the forearm, typically detected by two sensors.
    For instance, the activation of wrist flexor muscles can be used to close the fingers together to grip a pen. If the wrist extensor muscles are contracted, the fingers re-open and the hand releases the pen. The same approach makes it possible to control different finger movements that are selected with the simultaneous activation of both flexor and extensor muscle groups. “These are movements that the patient has to learn during rehabilitation,” says Cristina Piazza, a professor of rehabilitation and assistive robotics at TUM. Now, Prof. Piazza’s research team has shown that artificial intelligence can enable patients to control advanced hand prostheses more intuitively by using the “synergy principle” and with the help of 128 sensors on the forearm.
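    The conventional two-sensor scheme described above amounts to a small decision rule. The sketch below assumes normalized sensor readings and a single threshold; the threshold value and command labels are illustrative, not taken from any clinical system.

```python
def decode(flexor, extensor, threshold=0.5):
    """Map two forearm EMG readings (normalized to 0..1) to a hand
    command under the classic two-sensor control scheme."""
    f = flexor > threshold
    e = extensor > threshold
    if f and e:
        return "switch_grip"   # co-contraction selects a different grip
    if f:
        return "close"         # flexor activity closes the fingers
    if e:
        return "open"          # extensor activity opens the hand
    return "hold"              # below threshold: keep current posture
```

    It is exactly this rigid mapping, which patients must learn in rehabilitation, that the synergy-based approach aims to replace with something closer to natural coordination.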
    The synergy principle: the brain activates a pool of muscle cells
    What is the synergy principle? “It is known from neuroscientific studies that repetitive patterns are observed in experimental sessions, both in kinematics and muscle activation,” says Prof. Piazza. These patterns can be interpreted as the way in which the human brain copes with the complexity of the biological system. That means that the brain activates a pool of muscle cells, also in the forearm. The professor adds: “When we use our hands to grasp an object, for example a ball, we move our fingers in a synchronized way and adapt to the shape of the object when contact occurs.” The researchers are now using this principle to design and control artificial hands by creating new learning algorithms. This is necessary for intuitive movement: When controlling an artificial hand to grasp a pen, for example, multiple steps take place. First, the patient orients the artificial hand according to the grasping location, slowly moves the fingers together, and then grabs the pen. The goal is to make these movements more and more fluid, so that it is hardly noticeable that numerous separate movements make up an overall process. “With the help of machine learning, we can understand the variations among subjects and improve the control adaptability over time and the learning process,” concludes Patricia Capsi Morales, the senior scientist in Prof. Piazza’s team.
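    Muscle synergies of this kind are commonly extracted with non-negative matrix factorization (NMF), which factors a channels-by-time activation matrix into a few shared patterns and their time courses. This is a generic sketch of that standard technique (Lee-Seung multiplicative updates), not the TUM team's own algorithm.

```python
import numpy as np

def extract_synergies(V, k, iters=500, seed=0):
    """Factor a non-negative muscle-activation matrix V (channels x time)
    into k synergies W (channels x k) and activations H (k x time) so
    that V ~ W @ H, using multiplicative NMF updates."""
    rng = np.random.default_rng(seed)
    n, t = V.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, t)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)  # update time courses
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)  # update synergy patterns
    return W, H
```

    Each column of W is one candidate "pool of muscle cells" activated together; controlling a prosthesis in synergy space means driving a few such patterns instead of every channel independently.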
    Discovering patterns from 128 signal channels
    Experiments with the new approach already indicate that conventional control methods could soon be empowered by more advanced strategies. To study what is happening at the level of the central nervous system, the researchers are working with two films: one for the inside and one for the outside of the forearm. Each contains up to 64 sensors to detect muscle activation. The method also estimates which electrical signals the spinal motor neurons have transmitted. “The more sensors we use, the better we can record information from different muscle groups and find out which muscle activations are responsible for which hand movements,” explains Prof. Piazza. Depending on whether a person intends to make a fist, grip a pen or open a jam jar, “characteristic features of muscle signals” result, according to Dr. Capsi Morales — a prerequisite for intuitive movements.
    Wrist and hand movement: Eight out of ten people prefer the intuitive way
    Current research concentrates on the movement of the wrist and the whole hand. It shows that most people (eight out of ten) prefer the intuitive way of moving wrist and hand, which is also the more efficient way. But two out of ten learn to handle the less intuitive way and, in the end, become even more precise. “Our goal is to investigate the learning effect and find the right solution for each patient,” Dr. Capsi Morales explains. “This is a step in the right direction,” says Prof. Piazza, who emphasizes that each system consists of individual mechanics and properties of the hand, special training with patients, interpretation and analysis, and machine learning.
    Current challenges of advanced control of artificial hands
    There are still some challenges to address: The learning algorithm, which is based on the information from the sensors, has to be retrained every time the film slips or is removed. In addition, the sensors must be prepared with a gel to guarantee the necessary conductivity to record the signals from the muscles precisely. “We use signal processing techniques to filter out the noise and get usable signals,” explains Dr. Capsi Morales. Every time a new patient wears the cuff with the many sensors over their forearm, the algorithm must first identify the activation patterns for each movement sequence to later detect the user’s intention and translate it into commands for the artificial hand.
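    The noise-filtering step can be sketched with a classic EMG conditioning chain: remove the baseline offset, rectify, and smooth into an activation envelope. This is a generic textbook stand-in for the article's unspecified pipeline; real systems also band-pass and notch-filter the raw signal.

```python
import numpy as np

def emg_envelope(signal, fs=1000, window_ms=150):
    """Turn a raw EMG trace into an activation envelope:
    1. subtract the mean to remove DC offset,
    2. full-wave rectify,
    3. smooth with a moving average over `window_ms` milliseconds."""
    x = np.abs(signal - np.mean(signal))
    w = int(fs * window_ms / 1000)
    kernel = np.ones(w) / w
    return np.convolve(x, kernel, mode="same")
```

    The envelope, not the raw trace, is what a decoder compares across the 128 channels to recognize each movement's characteristic activation pattern.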

  • AI accelerates problem-solving in complex scenarios

    While Santa Claus may have a magical sleigh and nine plucky reindeer to help him deliver presents, for companies like FedEx, the optimization problem of efficiently routing holiday packages is so complicated that they often employ specialized software to find a solution.
    This software, called a mixed-integer linear programming (MILP) solver, splits a massive optimization problem into smaller pieces and uses generic algorithms to try to find the best solution. However, the solver could take hours — or even days — to arrive at one.
    The process is so onerous that a company often must stop the software partway through, accepting a solution that is not ideal but the best that could be generated in a set amount of time.
    Researchers from MIT and ETH Zurich used machine learning to speed things up.
    They identified a key intermediate step in MILP solvers that has so many potential solutions it takes an enormous amount of time to unravel, which slows the entire process. The researchers employed a filtering technique to simplify this step, then used machine learning to find the optimal solution for a specific type of problem.
    Their data-driven approach enables a company to use its own data to tailor a general-purpose MILP solver to the problem at hand.
    This new technique sped up MILP solvers between 30 and 70 percent, without any drop in accuracy. One could use this method to obtain an optimal solution more quickly or, for especially complex problems, a better solution in a tractable amount of time.

    This approach could be used wherever MILP solvers are employed, such as by ride-hailing services, electric grid operators, vaccination distributors, or any entity faced with a thorny resource-allocation problem.
    “Sometimes, in a field like optimization, it is very common for folks to think of solutions as either purely machine learning or purely classical. I am a firm believer that we want to get the best of both worlds, and this is a really strong instantiation of that hybrid approach,” says senior author Cathy Wu, the Gilbert W. Winslow Career Development Assistant Professor in Civil and Environmental Engineering (CEE), and a member of the Laboratory for Information and Decision Systems (LIDS) and the Institute for Data, Systems, and Society (IDSS).
    Wu wrote the paper with co-lead authors Sirui Li, an IDSS graduate student, and Wenbin Ouyang, a CEE graduate student; as well as Max Paulus, a graduate student at ETH Zurich. The research will be presented at the Conference on Neural Information Processing Systems.
    Tough to solve
    MILP problems have an exponential number of potential solutions. For instance, say a traveling salesperson wants to find the shortest path to visit several cities and then return to their city of origin. If there are many cities which could be visited in any order, the number of potential solutions might be greater than the number of atoms in the universe.
    “These problems are called NP-hard, which means it is very unlikely there is an efficient algorithm to solve them. When the problem is big enough, we can only hope to achieve some suboptimal performance,” Wu explains.
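    The tour-counting claim is easy to check: a closed tour through n cities, ignoring the starting point and direction of travel, can be ordered in (n-1)!/2 distinct ways.

```python
from math import factorial

def distinct_tours(n):
    """Number of distinct closed tours through n cities, ignoring the
    start city and travel direction: (n-1)!/2."""
    return factorial(n - 1) // 2

ATOMS_IN_UNIVERSE = 10 ** 80   # common order-of-magnitude estimate

# Find the smallest city count whose tour count passes the estimate.
n = 3
while distinct_tours(n) <= ATOMS_IN_UNIVERSE:
    n += 1
```

    With that common estimate of 10^80 atoms, the loop stops at n = 61: a tour of just a few dozen cities already has more possible orderings than atoms in the observable universe.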

    An MILP solver employs an array of techniques and practical tricks that can achieve reasonable solutions in a tractable amount of time.
    A typical solver uses a divide-and-conquer approach, first splitting the space of potential solutions into smaller pieces with a technique called branching. Then, the solver employs a technique called cutting to tighten up these smaller pieces so they can be searched faster.
    Cutting uses a set of rules that tighten the search space without removing any feasible solutions. These rules are generated by a few dozen algorithms, known as separators, that have been created for different kinds of MILP problems.
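    The branch-and-bound skeleton that cutting accelerates can be shown on a toy problem. Below is a minimal sketch for a 0/1 knapsack, assuming positive weights: "branching" fixes one item in or out of the bag, and a fractional-relaxation bound prunes pieces of the search space that cannot beat the best solution found so far. The separator machinery discussed next is omitted entirely.

```python
def solve_knapsack(values, weights, capacity):
    """Minimal branch-and-bound for the 0/1 knapsack problem.
    Assumes all weights are positive."""
    # Consider items in decreasing value density (value per unit weight).
    items = sorted(zip(values, weights),
                   key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0

    def bound(i, value, room):
        # Optimistic LP-relaxation bound: fill remaining room greedily,
        # allowing a fractional piece of one item.
        for v, w in items[i:]:
            if w <= room:
                value += v
                room -= w
            else:
                return value + v * room / w
        return value

    def branch(i, value, room):
        nonlocal best
        best = max(best, value)
        if i == len(items) or bound(i, value, room) <= best:
            return  # prune: this subtree cannot beat the incumbent
        v, w = items[i]
        if w <= room:
            branch(i + 1, value + v, room - w)  # branch: take item i
        branch(i + 1, value, room)              # branch: skip item i

    branch(0, 0, capacity)
    return best
```

    Cutting planes tighten the `bound` step: a better bound means more subtrees are pruned, which is why choosing the right separators matters so much.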
    Wu and her team found that the process of identifying the ideal combination of separator algorithms to use is, in itself, a problem with an exponential number of solutions.
    “Separator management is a core part of every solver, but this is an underappreciated aspect of the problem space. One of the contributions of this work is identifying the problem of separator management as a machine learning task to begin with,” she says.
    Shrinking the solution space
    She and her collaborators devised a filtering mechanism that reduces this separator search space from more than 130,000 potential combinations to around 20 options. The filtering mechanism draws on the principle of diminishing marginal returns, which says that most of the benefit comes from a small set of algorithms, and that adding more brings little extra improvement.
    Then they use a machine-learning model to pick the best combination of algorithms from among the 20 remaining options.
    This model is trained with a dataset specific to the user’s optimization problem, so it learns to choose algorithms that best suit the user’s particular task. Since a company like FedEx has solved routing problems many times before, using real data gleaned from past experience should lead to better solutions than starting from scratch each time.
    The model learns iteratively through contextual bandits, a form of reinforcement learning: it picks a potential solution, gets feedback on how good it was, and then tries again to find a better one.
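    A stripped-down (non-contextual) version of that loop is an epsilon-greedy bandit over the ~20 shortlisted separator configurations. The arm count, epsilon, and reward signal below are placeholders; the actual model also conditions its choice on features of the problem instance.

```python
import random

class SeparatorBandit:
    """Epsilon-greedy bandit over a shortlist of separator
    configurations.  Reward would be e.g. the relative solver speedup
    observed after using the chosen configuration."""
    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.rng = random.Random(seed)
        self.epsilon = epsilon
        self.counts = [0] * n_arms
        self.means = [0.0] * n_arms

    def choose(self):
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.means))  # explore
        return max(range(len(self.means)),
                   key=self.means.__getitem__)           # exploit

    def update(self, arm, reward):
        # Running mean of observed rewards for this configuration.
        self.counts[arm] += 1
        self.means[arm] += (reward - self.means[arm]) / self.counts[arm]
```

    Over repeated solves on a company's own instances, the bandit concentrates on whichever configuration actually speeds up that workload, which is the "tailor a general-purpose solver with your own data" idea in miniature.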
    This data-driven approach accelerated MILP solvers between 30 and 70 percent without any drop in accuracy. Moreover, the speedup was similar when they applied it to a simpler, open-source solver and a more powerful, commercial solver.
    In the future, Wu and her collaborators want to apply this approach to even more complex MILP problems, where gathering labeled data to train the model could be especially challenging. Perhaps they can train the model on a smaller dataset and then tweak it to tackle a much larger optimization problem, she says. The researchers are also interested in interpreting the learned model to better understand the effectiveness of different separator algorithms.
    This research is supported, in part, by Mathworks, the National Science Foundation (NSF), the MIT Amazon Science Hub, and MIT’s Research Support Committee.

  • in

    Using AI to find microplastics

    An interdisciplinary research team from the University of Waterloo is using artificial intelligence (AI) to identify microplastics faster and more accurately than ever before.
    Microplastics are commonly found in food and are dangerous pollutants that cause severe environmental damage — finding them is the key to getting rid of them.
    The research team’s advanced imaging identification system could help wastewater treatment plants and food production industries make informed decisions to mitigate the potential impact of microplastics on the environment and human health.
    A comprehensive risk analysis and action plan requires quality information based on accurate identification. In search of a robust analytical tool that could enumerate, identify and describe the many microplastics that exist, project lead Dr. Wayne Parker and his team employed an advanced spectroscopy method that exposes particles to a range of wavelengths of light. Different types of plastics produce different signals in response to the light exposure. These signals are like fingerprints that can be used to classify particles as microplastic or not.
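    To make the fingerprint idea concrete, here is a toy matcher that compares an unknown spectrum against a small reference library by cosine similarity. The peak positions and library entries are invented for illustration; real spectral libraries and the team's actual pipeline are far richer.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    wavelengths = np.linspace(400, 1800, 200)  # arbitrary spectral axis

    def peak(center, width=30):
        """A Gaussian absorption peak on the spectral axis."""
        return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

    # Hypothetical reference "fingerprints" for two plastic types.
    library = {
        "polyethylene": peak(720) + peak(1460),
        "polystyrene": peak(700) + peak(1600),
    }

    def identify(spectrum):
        """Match an unknown spectrum to the closest library fingerprint."""
        def cos(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        return max(library, key=lambda name: cos(spectrum, library[name]))

    # A noisy polystyrene-like particle still matches its fingerprint.
    sample = library["polystyrene"] + rng.normal(0, 0.05, wavelengths.size)
    print(identify(sample))  # polystyrene
    ```

    The additives and fillers discussed below effectively distort these fingerprints, which is what makes a simple nearest-match scheme unreliable in practice and motivates a learned classifier.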
    The challenge researchers often find is that microplastics come in wide varieties due to the presence of manufacturing additives and fillers that can blur the “fingerprints” in a lab setting. This makes identifying microplastics from organic material, as well as the different types of microplastics, often difficult. Human intervention is usually required to dig out subtle patterns and cues, which is slow and prone to error.
    “Microplastics are hydrophobic materials that can soak up other chemicals,” said Parker, a professor in Waterloo’s Department of Civil and Environmental Engineering. “Science is still evolving in terms of how bad the problem is, but it’s theoretically possible that microplastics are enhancing the accumulation of toxic substances in the food chain.”
    Parker approached Dr. Alexander Wong, a professor in Waterloo’s Department of Systems Design Engineering and the Canada Research Chair in Artificial Intelligence and Medical Imaging, for assistance. With his help, the team developed an AI tool called PlasticNet that enables researchers to rapidly analyze large numbers of particles approximately 50 per cent faster than prior methods and with 20 per cent more accuracy.

    The tool is the latest sustainable technology designed by Waterloo researchers to protect our environment and engage in research that will contribute to a sustainable future.
    “We built a deep learning neural network to enhance microplastic identification from the spectroscopic signals,” said Wong. “We trained it on data from existing literature sources and our own generated images to understand the varied make-up of microplastics and spot the differences quickly and correctly — regardless of the fingerprint quality.”
    Parker’s former PhD student, Frank Zhu, tested the system on microplastics isolated from a local wastewater treatment plant. Results show that it can identify microplastics with unprecedented speed and accuracy. This information can empower treatment plants to implement effective measures to control and eliminate these substances.
    The next steps involve continued learning and testing, as well as feeding the PlasticNet system more data to increase the quality of its microplastics identification capabilities for application across a broad range of needs.

  • in

    Diamonds and rust help unveil ‘impossible’ quasi-particles

    Researchers have discovered magnetic monopoles — isolated magnetic charges — in a material closely related to rust, a result that could be used to power greener and faster computing technologies.
    Researchers led by the University of Cambridge used a technique known as diamond quantum sensing to observe swirling textures and faint magnetic signals on the surface of hematite, a type of iron oxide.
    The researchers observed that magnetic monopoles in hematite emerge through the collective behaviour of many spins (the angular momentum of a particle). These monopoles glide across the swirling textures on the surface of the hematite, like tiny hockey pucks of magnetic charge. This is the first time that naturally occurring emergent monopoles have been observed experimentally.
    The research has also shown the direct connection between the previously hidden swirling textures and the magnetic charges of materials like hematite, as if there is a secret code linking them together. The results, which could be useful in enabling next-generation logic and memory applications, are reported in the journal Nature Materials.
    According to the equations of James Clerk Maxwell, a giant of Cambridge physics, magnetic objects, whether a fridge magnet or the Earth itself, must always exist as a pair of magnetic poles that cannot be isolated.
    “The magnets we use every day have two poles: north and south,” said Professor Mete Atatüre, who led the research. “In the 19th century, it was hypothesised that monopoles could exist. But in one of his foundational equations for the study of electromagnetism, James Clerk Maxwell disagreed.”
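    The equation in question is Gauss's law for magnetism, which says the magnetic field has no isolated sources. A hypothetical monopole charge density $\rho_m$ would appear on its right-hand side:

    ```latex
    \underbrace{\nabla \cdot \mathbf{B} = 0}_{\text{Maxwell: no isolated poles}}
    \qquad \longrightarrow \qquad
    \underbrace{\nabla \cdot \mathbf{B} = \mu_0\,\rho_m}_{\text{if monopoles existed}}
    ```

    The emergent monopoles described below do not violate this law for the microscopic field; they arise from the collective arrangement of many spins.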
    Atatüre is Head of Cambridge’s Cavendish Laboratory, a position once held by Maxwell himself. “If monopoles did exist, and we were able to isolate them, it would be like finding a missing puzzle piece that was assumed to be lost,” he said.

    About 15 years ago, scientists suggested how monopoles could exist in a magnetic material. This theoretical result relied on the extreme separation of north and south poles so that locally each pole appeared isolated in an exotic material called spin ice.
    However, there is an alternative strategy for finding monopoles, involving the concept of emergence: the combination of many physical entities can give rise to properties that are more than, or different from, the sum of their parts.
    Working with colleagues from the University of Oxford and the National University of Singapore, the Cambridge researchers used emergence to uncover monopoles spread over two-dimensional space, gliding across the swirling textures on the surface of a magnetic material.
    The swirling topological textures are found in two main types of materials: ferromagnets and antiferromagnets. Of the two, antiferromagnets are more stable than ferromagnets, but they are more difficult to study, as they don’t have a strong magnetic signature.
    To study the behaviour of antiferromagnets, Atatüre and his colleagues use an imaging technique known as diamond quantum magnetometry. This technique uses a single spin — the inherent angular momentum of an electron — in a diamond needle to precisely measure the magnetic field on the surface of a material, without affecting its behaviour.
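    In simplified form, the readout works because the sensing spin's two resonance lines split in proportion to the magnetic field along its axis. The sketch below uses the standard nitrogen-vacancy (NV) centre constants (zero-field splitting ~2.87 GHz, gyromagnetic ratio ~28 GHz/T) and ignores strain and off-axis fields; the experimental details of this study are not reproduced here.

    ```python
    # How a single spin in a diamond needle reads out a magnetic field:
    # its two resonance lines split in proportion to the field along the spin axis.
    GAMMA = 28.0e9   # NV gyromagnetic ratio, ~28 GHz per tesla
    D = 2.87e9       # NV zero-field splitting, Hz

    def odmr_frequencies(b_parallel):
        """Resonance frequencies (Hz) for a field b_parallel (tesla) along the spin axis."""
        return D - GAMMA * b_parallel, D + GAMMA * b_parallel

    # A few microtesla of stray field from a weakly magnetic surface
    # shifts the two lines apart by a readily resolvable amount.
    f_minus, f_plus = odmr_frequencies(5e-6)
    print((f_plus - f_minus) / 1e3, "kHz splitting")  # 280.0 kHz
    ```

    This sensitivity to microtesla-scale fields is what makes the technique suitable for antiferromagnets, whose external magnetic signatures are faint.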
    For the current study, the researchers used the technique to look at hematite, an antiferromagnetic iron oxide material. To their surprise, they found hidden patterns of magnetic charges within hematite, including monopoles, dipoles and quadrupoles.

    “Monopoles had been predicted theoretically, but this is the first time we’ve actually seen a two-dimensional monopole in a naturally occurring magnet,” said co-author Professor Paolo Radaelli, from the University of Oxford.
    “These monopoles are a collective state of many spins that twirl around a singularity rather than a single fixed particle, so they emerge through many-body interactions. The result is a tiny, localised stable particle with diverging magnetic field coming out of it,” said co-first author Dr Hariom Jani, from the University of Oxford.
    “We’ve shown how diamond quantum magnetometry could be used to unravel the mysterious behaviour of magnetism in two-dimensional quantum materials, which could open up new fields of study in this area,” said co-first author Dr Anthony Tan, from the Cavendish Laboratory. “The challenge has always been direct imaging of these textures in antiferromagnets due to their weaker magnetic pull, but now we’re able to do so, with a nice combination of diamonds and rust.”
    The study not only highlights the potential of diamond quantum magnetometry but also underscores its capacity to uncover and investigate hidden magnetic phenomena in quantum materials. If controlled, these swirling textures dressed in magnetic charges could power super-fast and energy-efficient computer memory logic.
    The research was supported in part by the Royal Society, the Sir Henry Royce Institute, the European Union, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

  • in

    Exposure to soft robots decreases human fears about working with them

    Seeing robots made with soft, flexible parts in action appears to lower people’s anxiety about working with them or even being replaced by them.
    A Washington State University study found that watching videos of a soft robot working with a person at picking and placing tasks lowered the viewers’ safety concerns and feelings of job insecurity. This was true even when the soft robot was shown working in close proximity to the person. This finding shows soft robots hold a potential psychological advantage over rigid robots made of metal or other hard materials.
    “Prior research has generally found that the closer you are to a rigid robot, the more negative your reactions are, but we didn’t find those outcomes in this study of soft robots,” said lead author Tahira Probst, a WSU psychology professor.
    Currently, human and rigid robotic workers have to maintain a set distance for safety reasons, but as this study indicates, proximity to soft robots could be not only physically safer but also more psychologically accepted.
    “This finding needs to be replicated, but if it holds up, that means humans could work together more closely with the soft robots,” Probst said.
    The study, published in the journal IISE Transactions on Occupational Ergonomics and Human Factors, did find that faster interactions with a soft robot tended to cause more negative responses, but when the study participants had previous experience with robots, faster speed did not bother them. In fact, they preferred the faster interactions. This reinforces the finding that greater familiarity increased overall comfort with soft robots.
    About half of all occupations are highly likely to involve some type of automation within the next couple of decades, said Probst, particularly those related to production, transportation, extraction and agriculture.

    Soft robots, which are made with flexible materials like fabric and rubber, are still relatively new technology compared to rigid robots which are already widely in use in manufacturing.
    Rigid robots have many limitations including their high cost and high safety concerns — two problems soft robots can potentially solve, said study co-author Ming Luo, an assistant professor in WSU’s School of Mechanical and Materials Engineering.
    “We make soft robots that are naturally safe, so we don’t have to focus a lot on expensive hardware and sensors to guarantee safety like has to be done with rigid robots,” said Luo.
    As an example, Luo noted that one rigid robot used for apple picking could cost around $30,000 whereas the current research and development cost for one soft robot, encompassing all components and manufacturing, is under $5,000. Also, that cost could be substantially decreased if production were scaled up.
    Luo’s team is in the process of developing soft robots for a range of functions, including fruit picking, pruning and pollinating. Soft robots also have the potential to help elderly or disabled people in home or health care settings. Much more development has to be done before this can be a reality, Luo said, but his engineering lab has partnered with Probst’s psychology team to better understand human-robot interactions early in the process.
    “It’s good to know how humans will react to the soft robots in advance and then incorporate that information into the design,” said Probst. “That’s why we’re working in tandem, where the psychology side is informing the technical development of these robots in their infancy.”
    To further test this study’s findings, the researchers are planning to bring participants into the lab to interact directly with soft robots. In addition to collecting participants’ self-reported surveys, they will also measure physical stress reactions, such as heart rate and galvanic skin responses, which are changes in the skin’s electrical resistance in reaction to emotional stress.

  • in

    Enhanced AI tracks neurons in moving animals

    Recent advances allow imaging of neurons inside freely moving animals. However, to decode circuit activity, these imaged neurons must be computationally identified and tracked. This becomes particularly challenging when the brain itself moves and deforms inside an organism’s flexible body, e.g. in a worm. Until now, the scientific community has lacked the tools to address the problem.
    Now, a team of scientists from EPFL and Harvard have developed a pioneering AI method to track neurons inside moving and deforming animals. The study, now published in Nature Methods, was led by Sahand Jamal Rahi at EPFL’s School of Basic Sciences.
    The new method is based on a convolutional neural network (CNN), which is a type of AI that has been trained to recognize and understand patterns in images. This involves a process called “convolution,” which looks at small parts of the picture — like edges, colors, or shapes — at a time and then combines all that information together to make sense of it and to identify objects or patterns.
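    The "small parts at a time" idea can be made concrete with a minimal 2D convolution in plain NumPy (this illustrates the operation itself, not the paper's network):

    ```python
    import numpy as np

    def conv2d(image, kernel):
        """Slide a small kernel over the image and sum element-wise products:
        the basic operation a CNN layer repeats with learned kernels."""
        kh, kw = kernel.shape
        h, w = image.shape
        out = np.zeros((h - kh + 1, w - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    # A vertical-edge kernel responds where a dark region meets a bright one.
    image = np.zeros((5, 6))
    image[:, 3:] = 1.0
    edge_kernel = np.array([[-1.0, 1.0]])
    response = conv2d(image, edge_kernel)
    print(response.max())  # 1.0, at the edge column
    ```

    A CNN stacks many such kernels, learned from data, so later layers can combine simple edge- and blob-like responses into detections of whole objects, such as neurons.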
    The problem is that to identify and track neurons in a movie of an animal’s brain, many images have to be labeled by hand, because the animal appears very different across time owing to its many body deformations. Given the diversity of the animal’s postures, manually generating enough annotations to train a CNN can be daunting.
    To address this, the researchers developed an enhanced CNN featuring ‘targeted augmentation’. The innovative technique automatically synthesizes reliable annotations for reference out of only a limited set of manual annotations. The result is that the CNN effectively learns the internal deformations of the brain and then uses them to create annotations for new postures, drastically reducing the need for manual annotation and double-checking.
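    The core idea of synthesizing annotations from deformations can be sketched as follows. Note this is only an illustration: a hand-picked sinusoidal bending stands in for the internal deformations that the actual method learns from the data.

    ```python
    import numpy as np

    # One manually annotated frame: 2D positions of a few neurons.
    annotated = np.array([[10.0, 20.0], [15.0, 22.0], [30.0, 40.0]])

    def synthetic_annotation(points, amplitude=2.0, wavelength=25.0, phase=0.0):
        """Generate a plausible new annotation by bending the body with a
        smooth sinusoidal deformation (a stand-in for learned deformations)."""
        deformed = points.copy()
        deformed[:, 1] += amplitude * np.sin(points[:, 0] / wavelength + phase)
        return deformed

    # Many synthetic postures from a single manual annotation.
    augmented = [synthetic_annotation(annotated, phase=p)
                 for p in np.linspace(0, 2 * np.pi, 10)]
    print(len(augmented), augmented[0].shape)  # 10 synthetic frames, each (3, 2)
    ```

    Because each synthetic frame carries its neuron labels for free, the network sees far more postures than were ever annotated by hand.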
    The new method is versatile, being able to identify neurons whether they are represented in images as individual points or as 3D volumes. The researchers tested it on the roundworm Caenorhabditis elegans, whose 302 neurons have made it a popular model organism in neuroscience.
    Using the enhanced CNN, the scientists measured activity in some of the worm’s interneurons (neurons that bridge signals between neurons). They found that they exhibit complex behaviors, for example changing their response patterns when exposed to different stimuli, such as periodic bursts of odors.
    The team have made their CNN accessible, providing a user-friendly graphical user interface that integrates targeted augmentation, streamlining the process into a comprehensive pipeline, from manual annotation to final proofreading.
    “By significantly reducing the manual effort required for neuron segmentation and tracking, the new method increases analysis throughput three times compared to full manual annotation,” says Sahand Jamal Rahi. “The breakthrough has the potential to accelerate research in brain imaging and deepen our understanding of neural circuits and behaviors.”

  • in

    Underwater vehicle AI model could be used in other adaptive control systems

    Unmanned Underwater Vehicles (UUVs) are used around the world to conduct difficult environmental, remote, oceanic, defence and rescue missions in often unpredictable and harsh conditions.
    A new study led by Flinders University and French researchers has used a novel bio-inspired artificial intelligence solution to improve the ability of UUVs and other adaptive control systems to operate more reliably in rough seas and other unpredictable conditions.
    This innovative approach, using the Biologically-Inspired Experience Replay (BIER) method, has been published by the Institute of Electrical and Electronics Engineers journal IEEE Access.
    Unlike conventional methods, BIER aims to overcome data inefficiency and performance degradation by leveraging incomplete but valuable recent experiences, explains first author Dr Thomas Chaffre.
    “The outcomes of the study demonstrated that BIER surpassed standard Experience Replay methods, achieving optimal performance twice as fast as the latter in the assumed UUV domain.
    “The method showed exceptional adaptability and efficiency, exhibiting its capability to stabilize the UUV in varied and challenging conditions.”
    The method incorporates two memory buffers, one focusing on recent state-action pairs and the other emphasising positive rewards.
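    A two-buffer replay in that spirit might look like the sketch below. The class name, buffer sizes, and sampling split are assumptions for illustration; BIER's actual buffer definitions and sampling rules are specified in the paper.

    ```python
    import random
    from collections import deque

    random.seed(0)

    class TwoBufferReplay:
        """Sketch of a dual-buffer experience replay: one buffer keeps the most
        recent transitions, the other keeps positively rewarded transitions;
        training batches are drawn from both."""

        def __init__(self, recent_size=1000, reward_size=1000):
            self.recent = deque(maxlen=recent_size)
            self.positive = deque(maxlen=reward_size)

        def add(self, state, action, reward, next_state):
            transition = (state, action, reward, next_state)
            self.recent.append(transition)
            if reward > 0:                 # emphasise rewarding experience
                self.positive.append(transition)

        def sample(self, batch_size=32, recent_fraction=0.5):
            n_recent = min(int(batch_size * recent_fraction), len(self.recent))
            n_pos = min(batch_size - n_recent, len(self.positive))
            return (random.sample(list(self.recent), n_recent)
                    + random.sample(list(self.positive), n_pos))

    buf = TwoBufferReplay()
    for t in range(200):
        buf.add(state=t, action=t % 4, reward=random.uniform(-1, 1), next_state=t + 1)
    batch = buf.sample()
    print(len(batch))
    ```

    Mixing recent and high-reward experience lets the learner exploit incomplete but valuable recent data without forgetting the transitions that worked well.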

    To test the effectiveness of the proposed method, the researchers ran simulated scenarios in a robot operating system (ROS)-based UUV simulator, gradually increasing the scenarios’ complexity.
    These scenarios varied in target velocity values and the intensity of current disturbances.
    Senior author Flinders University Associate Professor in AI and Robotics Paulo Santos says the BIER method’s success holds promise for enhancing adaptability and performance in various fields requiring dynamic, adaptive control systems.
    UUVs’ capabilities in mapping, imaging and sensor control are rapidly improving, aided by Deep Reinforcement Learning (DRL), which is advancing the adaptive control responses UUVs can make to underwater disturbances.
    However, the efficiency of these methods gets challenged when faced with unforeseen variations in real-world applications.
    The complex dynamics of the underwater environment limit the observability of UUV manoeuvring tasks, making it difficult for existing DRL methods to perform optimally.
    The introduction of BIER marks a significant step forward in enhancing the effectiveness of deep reinforcement learning methods in general.
    Its ability to efficiently navigate uncertain and dynamic environments signifies a promising advancement in the area of adaptive control systems, researchers conclude.
    Acknowledgements: This work was funded by Flinders University and ENSTA Bretagne with support from the Government of South Australia (Australia), the Région Bretagne (France) and Naval Group.