More stories

  • 'Metalens' could disrupt vacuum UV market

    Rice University photonics researchers have created a potentially disruptive technology for the ultraviolet optics market.
    By precisely etching hundreds of tiny triangles on the surface of a microscopic film of zinc oxide, nanophotonics pioneer Naomi Halas and colleagues created a “metalens” that transforms incoming long-wave UV (UV-A) into a focused output of vacuum UV (VUV) radiation. VUV is used in semiconductor manufacturing, photochemistry and materials science and has historically been costly to work with, in part because it is absorbed by almost all types of glass used to make conventional lenses.
    “This work is particularly promising in light of recent demonstrations that chip manufacturers can scale up the production of metasurfaces with CMOS-compatible processes,” said Halas, co-corresponding author of a metalens demonstration study published in Science Advances. “This is a fundamental study, but it clearly points to a new strategy for high-throughput manufacturing of compact VUV optical components and devices.”
    Halas’ team showed its microscopic metalens could convert 394-nanometer UV into a focused output of 197-nanometer VUV. The disc-shaped metalens is a transparent sheet of zinc oxide that is thinner than a sheet of paper and just 45 millionths of a meter in diameter. In the demonstration, a 394-nanometer UV-A laser was shined at the back of the disc, and researchers measured the light that emerged from the other side.
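    The reported conversion corresponds to an exact halving of the wavelength, which means each output photon carries twice the energy of an input photon. Below is a minimal sketch of that arithmetic using only the standard relation E = hc/λ; it is illustrative and not code from the study.

    ```python
    # Photon-energy arithmetic for the reported wavelength conversion.
    # Uses only the standard relation E = h*c / wavelength (CODATA constants).

    H = 6.62607015e-34    # Planck constant, J*s
    C = 2.99792458e8      # speed of light, m/s
    EV = 1.602176634e-19  # joules per electronvolt

    def photon_energy_ev(wavelength_nm: float) -> float:
        """Photon energy in eV for a given wavelength in nanometers."""
        return H * C / (wavelength_nm * 1e-9) / EV

    e_in = photon_energy_ev(394.0)   # UV-A pump reported in the study
    e_out = photon_energy_ev(197.0)  # VUV output reported in the study

    print(f"394 nm photon: {e_in:.2f} eV")
    print(f"197 nm photon: {e_out:.2f} eV (ratio {e_out / e_in:.2f})")
    # The 2x energy ratio is what you would expect when two pump photons
    # are combined into a single output photon.
    ```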
    Study co-first author Catherine Arndt, an applied physics graduate student in Halas’ research group, said the key feature of the metalens is its interface, a front surface that is studded with concentric circles of tiny triangles.
    “The interface is where all of the physics is happening,” she said. “We’re actually imparting a phase shift, changing both how quickly the light is moving and the direction it’s traveling. We don’t have to collect the light output because we use electrodynamics to redirect it at the interface where we generate it.”
    Violet light has the shortest wavelength visible to humans. Ultraviolet has even shorter wavelengths, ranging from 400 nanometers down to 10 nanometers. Vacuum UV, with wavelengths between 100 and 200 nanometers, is so named because it is strongly absorbed by oxygen. Using VUV light today typically requires a vacuum chamber or other specialized environment, as well as machinery to generate and focus VUV.

  • New shape memory alloy discovered through artificial intelligence framework

    Funded by the National Science Foundation’s Designing Materials to Revolutionize Our Engineering Future (DMREF) Program, researchers from the Department of Materials Science and Engineering at Texas A&M University used an Artificial Intelligence Materials Selection framework (AIMS) to discover a new shape memory alloy. The alloy showed the highest operational efficiency achieved thus far for nickel-titanium-based materials. In addition, the data-driven framework offers a proof of concept for future materials development.
    Shape memory alloys are utilized in various fields where compact, lightweight and solid-state actuation is needed, replacing hydraulic or pneumatic actuators because they can deform when cold and then return to their original shape when heated. This unique property is critical for applications such as airplane wings, jet engines and automotive components that must withstand repeated, recoverable large-shape changes.
    There have been many advancements in shape memory alloys since their beginnings in the mid-1960s, but at a cost. Understanding and discovering new shape memory alloys has required extensive research through experimentation and ad hoc trial and error. Although many of these efforts have been documented and have helped advance shape memory alloy applications, new alloy discoveries have come only about once a decade, when a significant composition or system is found. Moreover, even with these advances, shape memory alloys are hindered by low energy efficiency caused by incompatibilities in their microstructure during the large shape change. Further, they are notoriously difficult to design from scratch.
    To address these shortcomings, Texas A&M researchers combined experimental data into the AIMS computational framework, which can determine both optimal material compositions and how to process those materials; this led to the discovery of a new shape memory alloy composition.
    “When designing materials, sometimes you have multiple objectives or constraints that conflict, which is very difficult to work around,” said Dr. Ibrahim Karaman, Chevron Professor I and materials science and engineering department head. “Using our machine-learning framework, we can use experimental data to find hidden correlations between different materials’ features to see if we can design new materials.”
    The shape memory alloy discovered with AIMS was predicted, and then experimentally confirmed, to achieve the narrowest hysteresis ever recorded. In other words, the material showed the lowest energy loss when converting thermal energy to mechanical work. It showed high efficiency when subjected to thermal cycling due to its extremely small transformation temperature window, and it exhibited excellent cyclic stability under repeated actuation.
    A nickel-titanium-copper composition is typical for shape memory alloys. Nickel-titanium-copper alloys usually contain 50% titanium and form a single-phase material. Using machine learning, the researchers predicted a different composition, with 47% titanium and 21% copper. Although this composition lies in the two-phase region and forms particles, those particles help enhance the material’s properties, explained William Trehern, doctoral student and graduate research assistant in the materials science and engineering department and the publication’s first author.
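    As a rough, hypothetical illustration of the data-driven idea, and not the actual AIMS framework, the sketch below fits a surrogate model on synthetic composition-versus-hysteresis data and then queries it for the composition with the smallest predicted hysteresis. All numbers and the assumed trend are made up for illustration.

    ```python
    # Schematic of data-driven alloy selection: fit a surrogate model on
    # (composition -> hysteresis) data, then pick the composition with the
    # smallest predicted hysteresis. Synthetic data; not the AIMS framework.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Hypothetical training set: columns are atomic % Ti and % Cu (Ni is the
    # balance); target is measured thermal hysteresis in kelvin.
    X_train = rng.uniform(low=[44.0, 0.0], high=[51.0, 25.0], size=(200, 2))
    y_train = (
        0.8 * np.abs(X_train[:, 0] - 47.0)    # made-up trend with a minimum near Ti ~ 47%
        + 0.3 * np.abs(X_train[:, 1] - 20.0)  # and Cu ~ 20%, loosely echoing the article
        + rng.normal(scale=0.5, size=200)     # measurement noise
    )

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

    # Query a grid of candidate compositions and keep the best prediction.
    ti = np.linspace(44.0, 51.0, 71)
    cu = np.linspace(0.0, 25.0, 126)
    grid = np.array([[t, c] for t in ti for c in cu])
    best = grid[np.argmin(model.predict(grid))]
    print(f"Candidate with smallest predicted hysteresis: Ti {best[0]:.1f}%, Cu {best[1]:.1f}%")
    ```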
    In particular, this high-efficiency shape memory alloy lends itself to thermal energy harvesting, which requires materials that can capture waste energy produced by machines and put it to use, and thermal energy storage, which is used for cooling electronic devices.
    More notably, the AIMS framework offers the opportunity to use machine-learning techniques in materials science. The researchers see potential to discover more shape memory alloy chemistries with desired characteristics for various other applications.
    “It is a revelation to use machine learning to find connections that our brain or known physical principles may not be able to explain,” said Karaman. “We can use data science and machine learning to accelerate the rate of materials discovery. I also believe that we can potentially discover new physics or mechanisms behind materials behavior that we did not know before if we pay attention to the connections machine learning can find.”
    Other contributors include Dr. Raymundo Arróyave and Dr. Kadri Can Atli, professors in the materials science and engineering department, and materials science and engineering undergraduate student Risheil Ortiz-Ayala.
    “While machine learning is now widely used in materials science, most approaches to date focus on predicting the properties of a material without necessarily explaining how to process it to achieve target properties,” said Arróyave. “Here, the framework looked not only at the chemical composition of candidate materials, but also the processing necessary to attain the properties of interest.”
    Story Source:
    Materials provided by Texas A&M University. Original written by Michelle Revels.

  • How some sunscreens damage coral reefs

    One common chemical in sunscreen can have devastating effects on coral reefs. Now, scientists know why.

    Sea anemones, which are closely related to corals, and mushroom coral can turn oxybenzone — a chemical that protects people against ultraviolet light — into a deadly toxin that’s activated by light. The good news is that algae living alongside the creatures can soak up the toxin and blunt its damage, researchers report in the May 6 Science.

    But that also means that bleached coral reefs lacking algae may be more vulnerable to death. Heat-stressed corals and anemones can eject helpful algae that provide oxygen and remove waste products, which turns reefs white. Such bleaching is becoming more common as a result of climate change (SN: 4/7/20).

    The findings hint that sunscreen pollution and climate change combined could be a greater threat to coral reefs and other marine habitats than either would be separately, says Craig Downs. He is a forensic ecotoxicologist with the nonprofit Haereticus Environmental Laboratory in Amherst, Va., and was not involved with the study.

    Previous work suggested that oxybenzone can kill young corals or prevent adult corals from recovering after tissue damage. As a result, some places, including Hawaii and Thailand, have banned oxybenzone-containing sunscreens.

    In the new study, environmental chemist Djordje Vuckovic of Stanford University and colleagues found that glass anemones (Exaiptasia pallida) exposed to oxybenzone and UV light add sugars to the chemical. While such sugary add-ons would typically help organisms detoxify chemicals and clear them from the body, the oxybenzone-sugar compound instead becomes a toxin that’s activated by light.

    Anemones exposed to either simulated sunlight or oxybenzone alone survived the full 21 days of the experiment, the team showed. But all anemones exposed to fake sunlight while submerged in water containing the chemical died within 17 days.

    Algae can soak up oxybenzone and its toxic by-products, a study shows. Sea anemones lacking algae (white) died sooner than animals with algae (brown) when exposed to oxybenzone and UV light. (Image credit: Djordje Vuckovic and Christian Renicke)

    The anemones’ algal friends absorbed much of the oxybenzone and the toxin that the animals were exposed to in the lab. Anemones lacking algae died days sooner than anemones with algae.

    In similar experiments, algae living inside mushroom coral (Discosoma sp.) also soaked up the toxin, a sign that algal relationships are a safeguard against its harmful effects. The coral’s algae seem to be particularly protective: Over eight days, no mushroom corals died after being exposed to oxybenzone and simulated sunlight.

    It’s still unclear what amount of oxybenzone might be toxic to coral reefs in the wild. Another lingering question, Downs says, is whether other sunscreen components that are similar in structure to oxybenzone might have the same effects. Pinning that down could help researchers make better, reef-safe sunscreens.

  • Newly proposed search strategies improve computational cost of the bicycle-sharing problem

    Bicycle sharing systems (BSSs) are transport solutions wherein users can rent a bicycle from a depot or ‘port,’ travel, and then return the bike to the same or a different port. BSSs are growing in popularity around the world because they are eco-friendly, reduce traffic congestion, and offer added health benefits to users. Eventually, however, a port in a BSS becomes either full or empty, meaning users are no longer able to rent a bike (when empty) or return one (when full). To address this issue, bikes need to be rebalanced among the ports in a BSS so that users are always able to use them. This rebalancing must also be carried out in a way that is beneficial to BSS companies, reducing labor costs as well as carbon emissions from rebalancing vehicles.
    Several approaches to BSS rebalancing exist; however, most solution algorithms are computationally expensive and take a long time to find an ‘exact’ solution when there are a large number of ports. Even finding an approximate solution is computationally expensive. Previously, a research team led by Prof. Tohru Ikeguchi from Tokyo University of Science proposed a ‘multiple-vehicle bike sharing system routing problem with soft constraints’ (mBSSRP-S) that can find the shortest travel times for multiple bike rebalancing vehicles, with the caveat that the optimal solution can sometimes violate the real-world limitations of the problem. Now, in a recent study published in MDPI’s Applied Sciences, the team has proposed two strategies for searching for approximate solutions to the mBSSRP-S that reduce computational costs without affecting performance. The research team also included PhD student Ms. Honami Tsushima of Tokyo University of Science and Prof. Takafumi Matsuura of Nippon Institute of Technology.
    Describing their research, Prof. Ikeguchi says, “Earlier, we had proposed the mBSSRP-S and that offered improved performance as compared to our original mBSSRP, which did not allow the violation of constraints. But the mBSSRP-S also increased the overall computational cost of the problem because it had to calculate both the feasible and infeasible solutions of the mBSSRP. Therefore, we have now proposed two consecutive search strategies to address this problem.”
    The proposed search strategies find feasible solutions in a much shorter time than the method originally proposed for the mBSSRP-S. The first strategy reduces the number of ‘neighboring’ solutions (solutions that are numerically close to a solution to the optimization problem) before a feasible solution is found, employing two well-known algorithms called ‘Or-opt’ and ‘CROSS-exchange’ to reduce the overall time taken to compute a solution. A feasible solution here refers to values that satisfy the constraints of the mBSSRP.
    The second strategy changes the problem to be solved, either the mBSSRP or the mBSSRP-S, based on the feasible solution found, and then searches for good near-optimal solutions in a short time using either Or-opt or CROSS-exchange.
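    For readers unfamiliar with the operations named above, the sketch below illustrates the spirit of an Or-opt move on a single rebalancing route. It is a simplified illustration with made-up route values, not the authors' implementation; CROSS-exchange (not shown) instead swaps segments between two routes.

    ```python
    # Simplified Or-opt move for one vehicle route (a list of port indices).
    # Or-opt relocates a short run of consecutive ports (here 1-3) to another
    # position in the route. Illustrative only, not the authors' implementation.
    from typing import List

    def or_opt_move(route: List[int], start: int, seg_len: int, insert_pos: int) -> List[int]:
        """Remove route[start:start+seg_len] and reinsert it before insert_pos,
        where insert_pos indexes the route *after* the segment was removed."""
        segment = route[start:start + seg_len]
        remainder = route[:start] + route[start + seg_len:]
        return remainder[:insert_pos] + segment + remainder[insert_pos:]

    route = [0, 3, 7, 2, 9, 5, 1]   # 0 = depot, other numbers are ports
    new_route = or_opt_move(route, start=2, seg_len=2, insert_pos=5)
    print(new_route)                # [0, 3, 9, 5, 1, 7, 2]
    ```

    In a local search, each such neighboring route would be evaluated for total travel time and constraint satisfaction, and improving moves would be kept.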
    The research team then performed numerical experiments to evaluate the computational cost and performance of their algorithms. “With the application of these two strategies, we have succeeded in reducing computational time while maintaining performance,” reveals Prof. Ikeguchi. “We also found that once we calculated the feasible solution, we could find short travel times for the rebalancing vehicles quickly by solving the hard constraint problem, mBSSRP, instead of mBSSRP-S.”
    The popularity of BSSs is only expected to grow in the future. The new solution-search strategies proposed here will go a long way towards realizing convenient and comfortable BSSs that benefit users, companies, and the environment.
    Story Source:
    Materials provided by Tokyo University of Science.

  • Researchers now able to predict battery lifetimes with machine learning

    Technique could reduce costs of battery development.
    Imagine a psychic telling your parents, on the day you were born, how long you would live. A similar experience is possible for battery chemists who are using new computational models to calculate battery lifetimes based on as little as a single cycle of experimental data.
    In a new study, researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have turned to the power of machine learning to predict the lifetimes of a wide range of different battery chemistries. By using experimental data gathered at Argonne from a set of 300 batteries representing six different battery chemistries, the scientists can accurately determine just how long different batteries will continue to cycle.
    In a machine learning algorithm, scientists train a computer program to make inferences on an initial set of data, and then take what it has learned from that training to make decisions on another set of data.
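    A schematic of that train-then-infer workflow, with synthetic data and hypothetical early-cycle features rather than Argonne's actual model or measurements, might look like this:

    ```python
    # Schematic train/test workflow for predicting battery cycle life from
    # early-cycle measurements. Synthetic data and hypothetical features;
    # not the Argonne model or dataset.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n_cells = 300  # the study used data from 300 batteries across six chemistries

    # Hypothetical early-cycle features: first-cycle capacity (Ah), coulombic
    # efficiency, and internal resistance (ohms).
    X = np.column_stack([
        rng.normal(1.1, 0.05, n_cells),
        rng.normal(0.99, 0.005, n_cells),
        rng.normal(0.02, 0.003, n_cells),
    ])
    # Made-up relationship: cycle life loosely tied to the features plus noise.
    y = 2000 * X[:, 1] - 8000 * X[:, 2] + rng.normal(0, 100, n_cells)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

    print("Predicted cycle life for 5 held-out cells:", model.predict(X_test[:5]).round())
    ```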
    “For every different kind of battery application, from cell phones to electric vehicles to grid storage, battery lifetime is of fundamental importance for every consumer,” said Argonne computational scientist Noah Paulson, an author of the study. “Having to cycle a battery thousands of times until it fails can take years; our method creates a kind of computational test kitchen where we can quickly establish how different batteries are going to perform.”
    “Right now, the only way to evaluate how the capacity in a battery fades is to actually cycle the battery,” added Argonne electrochemist Susan “Sue” Babinec, another author of the study. “It’s very expensive and it takes a long time.”
    According to Paulson, the process of establishing a battery lifetime can be tricky. “The reality is that batteries don’t last forever, and how long they last depends on the way that we use them, as well as their design and their chemistry,” he said. “Until now, there’s really not been a great way to know how long a battery is going to last. People are going to want to know how long they have until they have to spend money on a new battery.”

  • 'Nanomagnetic' computing can provide low-energy AI

    Researchers have shown it is possible to perform artificial intelligence using tiny nanomagnets that interact like neurons in the brain.
    The new method, developed by a team led by Imperial College London researchers, could slash the energy cost of artificial intelligence (AI), which is currently doubling globally every 3.5 months.
    In a paper published today in Nature Nanotechnology, the international team have produced the first proof that networks of nanomagnets can be used to perform AI-like processing. The researchers showed nanomagnets can be used for ‘time-series prediction’ tasks, such as predicting and regulating insulin levels in diabetic patients.
    Artificial intelligence that uses ‘neural networks’ aims to replicate the way parts of the brain work, where neurons talk to each other to process and retain information. A lot of the maths used to power neural networks was originally invented by physicists to describe the way magnets interact, but at the time it was too difficult to use magnets directly as researchers didn’t know how to put data in and get information out.
    Instead, software run on traditional silicon-based computers was used to simulate the magnet interactions, in turn simulating the brain. Now, the team have been able to use the magnets themselves to process and store data — cutting out the middleman of the software simulation and potentially offering enormous energy savings.
    Nanomagnetic states
    Nanomagnets can come in various ‘states’, depending on their direction. Applying a magnetic field to a network of nanomagnets changes the state of the magnets based on the properties of the input field, but also on the states of surrounding magnets.
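    Physical systems like this are often used for time-series prediction in a reservoir-style setup, where the hard-to-modify dynamics act as a fixed feature generator and only a simple linear readout is trained. The sketch below simulates that general idea in software, with a random recurrent map standing in for the magnets; treating the nanomagnet network this way is our assumption for illustration, not the team's hardware or code.

    ```python
    # Reservoir-style time-series prediction sketch: a fixed random recurrent
    # map generates states, and only a linear readout is trained (ridge regression).
    # A software stand-in for a physical reservoir; not the nanomagnet hardware.
    import numpy as np

    rng = np.random.default_rng(1)
    T, n_res = 2000, 100

    # Input signal: a noisy sine wave; the task is one-step-ahead prediction.
    u = np.sin(np.linspace(0, 60, T)) + 0.05 * rng.normal(size=T)

    # Fixed (untrained) reservoir weights.
    W_in = rng.normal(scale=0.5, size=n_res)
    W = rng.normal(scale=1.0, size=(n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale for stable dynamics

    states = np.zeros((T, n_res))
    x = np.zeros(n_res)
    for t in range(T):
        x = np.tanh(W @ x + W_in * u[t])
        states[t] = x

    # Train only the linear readout to predict u[t+1] from the state at time t.
    X_train, y_train = states[200:1500], u[201:1501]
    ridge = 1e-3
    W_out = np.linalg.solve(X_train.T @ X_train + ridge * np.eye(n_res), X_train.T @ y_train)

    pred = states[1500:-1] @ W_out
    err = np.sqrt(np.mean((pred - u[1501:]) ** 2))
    print(f"One-step-ahead RMSE on held-out data: {err:.3f}")
    ```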

  • How much does eating meat affect nations’ greenhouse gas emissions?

    The food we eat is responsible for an astounding one-third of global greenhouse gas emissions caused by human activities, according to two comprehensive studies published in 2021.

    “When people talk about food systems, they always think about the cow in the field,” says statistician Francesco Tubiello, lead author of one of the reports, appearing in last June’s Environmental Research Letters. True, cows are a major source of methane, which, like other greenhouse gases, traps heat in the atmosphere. But methane, carbon dioxide and other planet-warming gases are released from several other sources along the food production chain.

    Before 2021, scientists like Tubiello, of the Food and Agriculture Organization of the United Nations, were well aware that agriculture and related land use changes made up roughly 20 percent of the planet’s greenhouse gas emissions. Such land use changes include cutting down forests to make way for cattle grazing and pumping groundwater to flood fields for the sake of agriculture.

    But new modeling techniques used by Tubiello and colleagues, plus a study from a group at the European Commission Tubiello worked with, brought to light another big driver of emissions: the food supply chain. All the steps that take food from the farm to our plates to the landfill — transportation, processing, cooking and food waste — bring food-related emissions up from 20 percent to 33 percent.

    To slow climate change, the foods we eat deserve major attention, just like fossil fuel burning, says Amos Tai, an environmental scientist at the Chinese University of Hong Kong. The fuller picture of food-related emissions demonstrates that the world needs to make drastic changes to the food system if we are to reach international goals for reducing global warming.

    Change from developing countries

    Scientists have gained a clearer understanding of global human-related emissions in recent years through databases like EDGAR, or Emissions Database for Global Atmospheric Research, developed by the European Union. The database covers every country’s emission-producing human activities, from energy production to landfill waste, from 1970 to the present. EDGAR uses a unified methodology to calculate emissions for all economic sectors, says Monica Crippa, a scientific officer at the European Commission’s Joint Research Centre.

    Crippa and colleagues, with help from Tubiello, built a companion database of food system–related emissions called EDGAR-FOOD. Using that database, the researchers arrived at the same one-third estimate as Tubiello’s group.

    Crippa’s team’s calculations, reported in Nature Food in March 2021, split food system emissions into four broad categories: land (including both agriculture and related land use changes), energy (used for producing, processing, packaging and transporting goods), industry (including the production of chemicals used in farming and materials used to package food) and waste (from unused food).

    The land sector is the biggest culprit in food system emissions, Crippa says, accounting for about 70 percent of the global total. But the picture looks different across different nations. The United States and other developed countries rely on highly centralized megafarms for much of their food production, so the energy, industry and waste categories make up more than half of these countries’ food system emissions.

    In developing countries, agriculture and changing land use are far greater contributors. Emissions in historically less developed countries have also been rising in the last 30 years, as these countries have cut down wild areas to make way for industrial farming and started eating more meat, another major contributor to emissions with impacts across all four categories.

    As a result, agriculture and related landscape shifts have driven major increases in food system emissions among developing countries in recent decades, while emissions in developed countries have not grown.

    For instance, China’s food emissions shot up by almost 50 percent from 1990 to 2018, largely due to a rise in meat-eating, according to the EDGAR-FOOD database. In 1980, the average Chinese person ate about 30 grams of meat a day, Tai says. In 2010, the average person in China ate almost five times as much, or just under 150 grams of meat a day.

    Top-emitting economies

    In recent years, Crippa says, six economies, the top emitters, have been responsible for more than half of total global food emissions. These economies, in order, are China, Brazil, the United States, India, Indonesia and the European Union. The immense populations of China and India help drive their high numbers. Brazil and Indonesia make the list because large swaths of their rainforests have been cut down to make room for farming. When those trees come down, vast amounts of carbon flow into the atmosphere (SN: 7/3/21 & 7/17/21, p. 24).

    The United States and the European Union are on the list because of heavy meat consumption. In the United States, meat and other animal products contribute the vast majority of food-related emissions, says Richard Waite, a researcher at the World Resources Institute’s food program in Washington, D.C.

    Waste is also a huge issue in the United States: More than one-third of food produced never actually gets eaten, according to a 2021 report from the U.S. Environmental Protection Agency. When food goes uneaten, the resources used to produce, transport and package it are wasted. Plus, the uneaten food goes into landfills, which produce methane, carbon dioxide and other gases as the food decomposes.

    Meat consumption drives emissions

    Climate advocates who want to reduce food emissions often focus on meat consumption, as animal products lead to far greater emissions than plants. Animal production uses more land than plant production, and “meat production is heavily inefficient,” Tai says.

    “If we eat 100 calories of grain, like maize or soybeans, we get that 100 calories,” he explains. All the energy from the food is delivered directly to the person who eats it. But if the 100 calories’ worth of grain is instead fed to a cow or a pig, when the animal is killed and processed for food, just one-tenth of the energy from that 100 calories of grain goes to the person eating the animal.

    Methane production from “the cow in the field” is another factor in meat consumption: Cows release this gas via their manure, burps and flatulence. Methane traps more heat per ton emitted than carbon dioxide, Tubiello says. So emissions from cattle farms can have an outsize impact (SN: 11/28/15, p. 22). These livestock emissions account for about one-third of global methane emissions, according to a 2021 U.N. report.

    Shifting from meats to plants

    U.S. residents should consider how they can shift to what Brent Kim calls “plant-forward” diets. “Plant-forward doesn’t mean vegan. It means reducing animal product intake, and increasing the share of plant foods that are on the plate,” says Kim, program officer at the Johns Hopkins Center for a Livable Future.

    Kim and colleagues estimated food emissions by diet and food group for 140 countries and territories, using a modeling framework similar to EDGAR-FOOD. However, their framework includes only food production emissions (i.e., agriculture and land use), not the processing, transportation and other pieces of the food system incorporated in EDGAR-FOOD.

    Producing the average U.S. resident’s diet generates more than 2,000 kilograms of greenhouse gas emissions per year, the researchers reported in 2020 in Global Environmental Change. The group measured emissions in terms of “CO2 equivalents,” a standardized unit allowing for direct comparisons between CO2 and other greenhouse gases like methane.

    Going meatless one day a week brings that figure down to about 1,600 kilograms of CO2 equivalents per year, per person. Going vegan — a diet without any meat, dairy or other animal products — cuts it by 87 percent, to under 300 kilograms. Even going two-thirds vegan offers a sizable drop, to 740 kilograms of CO2 equivalents.
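    Put in percentage terms, using simple arithmetic on the figures reported above rather than any independent estimate:

    ```python
    # Percentage reductions implied by the reported figures (kg CO2-equivalents
    # per person per year). Arithmetic on the article's numbers only.
    baseline = 2000          # "more than 2,000 kg" for the average U.S. diet
    scenarios = {
        "meatless one day a week": 1600,
        "two-thirds vegan": 740,
        "fully vegan": 260,  # derived from the reported 87% cut, i.e. to under 300 kg
    }
    for name, kg in scenarios.items():
        cut = 100 * (baseline - kg) / baseline
        print(f"{name}: ~{kg} kg/yr, roughly a {cut:.0f}% reduction")
    ```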

    Kim’s modeling also offers a “low food chain” option, which brings emissions down to about 300 kilograms of CO2 equivalents per year, per person. Eating low on the food chain combines a mostly plant-based diet with animal products that come from more climate-friendly sources that do not disturb ecological systems. Examples include insects, smaller fish like sardines, and oysters and other mollusks.

    Tai agrees that not everybody needs to become a vegetarian or vegan to save the planet, as meat can have important cultural and nutritional value. If you want to “start from the biggest polluter,” he says, focus on cutting beef consumption.

    But enough people need to make these changes to “send a signal back to the market” that consumers want more plant-based options, Tubiello says. Policy makers at the federal, state and local levels can also encourage climate-friendly farming practices, reduce food waste in government operations and take other actions to cut down the resources used in food production, Waite says.

    For example, the World Resources Institute, where Waite works, is part of an initiative called the Cool Food Pledge, in which companies, universities and city governments have signed on to reduce the climate impacts of the food they serve. The institutions agree to track the food they purchase every year to ensure they are progressing toward their goals, Waite says.

    Developed countries like the United States — which have been heavy meat consumers for decades — can have a big impact by changing food choices. Indeed, a paper published in Nature Food in January shows that if the populations of 54 high-income nations switched to a plant-focused diet, annual emissions from these countries’ agricultural production could drop by more than 60 percent.

  • Bye, bye, biopsy? Handheld device could painlessly identify skin cancers

    Skin biopsies are no fun: doctors carve away small lumps of tissue for laboratory testing, leaving patients with painful wounds that can take weeks to heal. That’s a price worth paying if it enables early cancer treatment. However, in recent years, aggressive diagnostic efforts have seen the number of biopsies grow around four times faster than the number of cancers detected, with about 30 benign lesions now biopsied for every case of skin cancer that’s found.
    Researchers at Stevens Institute of Technology are now developing a low-cost handheld device that could cut the rate of unnecessary biopsies in half and give dermatologists and other frontline physicians easy access to laboratory-grade cancer diagnostics. “We aren’t trying to get rid of biopsies,” said Negar Tavassolian, director of the Bio-Electromagnetics Laboratory at Stevens. “But we do want to give doctors additional tools and help them to make better decisions.”
    The team’s device uses millimeter-wave imaging — the same technology used in airport security scanners — to scan a patient’s skin. (In earlier work, Tavassolian and her team had to work with already biopsied skin for the device to detect if it was cancerous.)
    Healthy tissue reflects millimeter-wave rays differently than cancerous tissue, so it’s theoretically possible to spot cancers by monitoring contrasts in the rays reflected back from the skin. To bring that approach into clinical practice, the researchers used algorithms to fuse signals captured by multiple different antennas into a single ultrahigh-bandwidth image, reducing noise and quickly capturing high-resolution images of even the tiniest mole or blemish.
    Spearheaded by Amir Mirbeik Ph.D. ’18, the team used a tabletop version of their technology to examine 71 patients during real-world clinical visits, and found their methods could accurately distinguish benign and malignant lesions in just a few seconds. Using their device, Tavassolian and Mirbeik could identify cancerous tissue with 97% sensitivity and 98% specificity — a rate competitive with even the best hospital-grade diagnostic tools.
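    For context on what those two figures mean, the sketch below computes sensitivity and specificity from a confusion matrix; the counts are hypothetical, chosen only to illustrate the definitions, not the study's patient-level results.

    ```python
    # Sensitivity and specificity from a confusion matrix.
    # Hypothetical counts for illustration; not the study's patient-level data.
    true_positives  = 33   # malignant lesions correctly flagged
    false_negatives = 1    # malignant lesions missed
    true_negatives  = 49   # benign lesions correctly cleared
    false_positives = 1    # benign lesions incorrectly flagged

    sensitivity = true_positives / (true_positives + false_negatives)
    specificity = true_negatives / (true_negatives + false_positives)

    print(f"Sensitivity: {sensitivity:.0%}")  # share of cancers the device catches
    print(f"Specificity: {specificity:.0%}")  # share of benign lesions correctly ruled out
    ```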
    “There are other advanced imaging technologies that can detect skin cancers, but they’re big, expensive machines that aren’t available in the clinic,” said Tavassolian, whose work appears in the March 23 issue of Scientific Reports. “We’re creating a low-cost device that’s as small and as easy to use as a cellphone, so we can bring advanced diagnostics within reach for everyone.”
    Because the team’s technology delivers results in seconds, it could one day be used instead of a magnifying dermatoscope in routine checkups, giving extremely accurate results almost instantly. “That means doctors can integrate accurate diagnostics into routine checkups, and ultimately treat more patients,” said Tavassolian.
    Unlike many other imaging methods, millimeter-wave rays harmlessly penetrate about 2mm into human skin, so the team’s imaging technology provides a clear 3D map of scanned lesions. Future improvements to the algorithm powering the device could significantly improve mapping of lesion margins, enabling more precise and less invasive biopsying for malignant lesions.
    The next step is to pack the team’s diagnostic kit onto an integrated circuit, which could soon allow functional handheld millimeter-wave diagnostic devices to be produced for as little as $100 apiece — a fraction of the cost of existing hospital-grade diagnostic equipment. The team is already working to commercialize the technology and hopes to start putting devices in clinicians’ hands within the next two years.
    “The path forward is clear, and we know what we need to do,” said Tavassolian. “After this proof of concept, we need to miniaturize our technology, bring the price down, and bring it to the market.”
    Story Source:
    Materials provided by Stevens Institute of Technology.