More stories

  • Scientists devise new technique to increase chip yield from semiconductor wafer

    Scientists from the Nanyang Technological University, Singapore (NTU Singapore) and the Korea Institute of Machinery & Materials (KIMM) have developed a technique to create a highly uniform and scalable semiconductor wafer, paving the way to higher chip yield and more cost-efficient semiconductors.
    Semiconductor chips commonly found in smartphones and computers are difficult and complex to make, requiring highly advanced machines and special environments to manufacture.
    Their fabrication is typically done on silicon wafers and then diced into the small chips that are used in devices. However, the process is imperfect and not all chips from the same wafer work or operate as desired. These defective chips are discarded, lowering semiconductor yield while increasing production cost.
    The ability to produce uniform wafers at the desired thickness is the most important factor in ensuring that every chip fabricated on the same wafer performs correctly.
    Nanotransfer-based printing — a process that uses a polymer mould to print metal onto a substrate through pressure, or ‘stamping’ — has gained traction in recent years as a promising technology for its simplicity, relative cost-effectiveness, and high throughput.
    However, the technique relies on a chemical adhesive layer that causes surface defects and performance degradation when printed at scale, and that poses hazards to human health. For these reasons, mass adoption of the technology, and of the chips it produces, has been limited.

  • What's the prevailing opinion on social media? Look at the flocks, says researcher

    A University at Buffalo communication researcher has developed a framework for measuring the slippery concept of social media public opinion.
    These collective views on a topic or issue expressed on social media, distinct from the conclusions determined through survey-based public opinion polling, have never been easy to measure. But the “murmuration” framework developed and tested by Yini Zhang, PhD, an assistant professor of communication in the UB College of Arts and Sciences, and her collaborators addresses challenges that are characteristic of these digital battlegrounds of public discourse, such as identifying online demographics and accounting for opinion manipulation.
    Murmuration identifies meaningful groups of social media actors based on the “who-follows-whom” relationship. The actors attract like-minded followers to form “flocks,” which serve as the units of analysis. As opinions form and shift in response to external events, the flocks’ unfolding opinions move like the fluid murmuration of airborne starlings.
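    As a rough sketch of that idea (not the authors' actual pipeline), accounts can be grouped into flocks by their shared follow ties using off-the-shelf community detection; the account names below are hypothetical.
    ```python
    # Hypothetical sketch: group accounts into "flocks" from who-follows-whom ties.
    # This illustrates the concept only; it is not the murmuration framework itself.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Directed follow relationships, written as (follower, followed) pairs.
    follows = [
        ("user_a", "pundit_1"), ("user_b", "pundit_1"), ("user_a", "pundit_3"),
        ("user_b", "pundit_3"), ("user_c", "pundit_2"), ("user_d", "pundit_2"),
    ]

    # An undirected projection is enough for simple community detection.
    G = nx.Graph()
    G.add_edges_from(follows)

    # Accounts that share many follow ties land in the same community ("flock").
    flocks = greedy_modularity_communities(G)
    for i, flock in enumerate(flocks):
        print(f"flock {i}: {sorted(flock)}")
    ```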
    The framework and the findings from an analysis of social network structure and opinion expression from over 193,000 Twitter accounts, which followed more than 1.3 million other accounts, suggest that flock membership can predict opinion and that the murmuration framework reveals distinct patterns of opinion intensity. The researchers studied Twitter because of the ability to see who is following whom, information that is not publicly accessible on other platforms.
    The results, published in the Journal of Computer-Mediated Communication, further support the echo chamber tendencies prevalent on social media, while adding important nuance to existing knowledge.
    “By identifying different flocks and examining the intensity, temporal pattern and content of their expression, we can gain deeper insights far beyond where liberals and conservatives stand on a certain issue,” says Zhang, an expert in social media and political communication. “These flocks are segments of the population, defined not by demographic variables of questionable salience, like white women aged 18-29, but by their online connections and response to events.
    “As such, we can observe opinion variations within an ideological camp and opinions of people that might not be typically assumed to have an opinion on certain issues. We see the flocks as naturally occurring, responding to things as they happen, in ways that take a conversational element into consideration.”
    Zhang says it’s important not to confuse public opinion, as measured by survey-based polling methods, and social media public opinion.
    “Arguably, social media public opinion is twice removed from the general public opinion measured by surveys,” says Zhang. “First, not everyone uses social media. Second, among those who do, only a subset of them actually express opinions on social media. They tend to be strongly opinionated and thus more willing to express their views publicly.”
    Murmuration offers insights that can complement information gathered through survey-based polling. It also moves away from mining social media for the text of specific tweets, taking full advantage of social media’s dynamic nature: when text is removed from its context, it becomes difficult to answer questions about what led to a discussion, when it began, and how it evolved over time.
    “Murmuration can allow for research that makes better use of social media data to study public opinion as a form of social interaction and reveal underlying social dynamics,” says Zhang.
    Story Source:
    Materials provided by University at Buffalo. Original written by Bert Gambini. Note: Content may be edited for style and length.

  • Pivotal technique harnesses cutting-edge AI capabilities to model and map the natural environment

    Scientists have developed a pioneering new technique that harnesses the cutting-edge capabilities of AI to model and map the natural environment in intricate detail.
    A team of experts, including Charlie Kirkwood from the University of Exeter, has created a sophisticated new approach to modelling the Earth’s natural features with greater detail and accuracy.
    The new technique can recognise intricate features and aspects of the terrain far beyond the capabilities of more traditional methods and use these to generate enhanced-quality environmental maps.
    Crucially, the new system could also pave the way to new discoveries about relationships within the natural environment that may help tackle some of the greatest climate and environmental issues of the 21st century.
    The study is published in leading journal Mathematical Geosciences, as part of a special issue on geostatistics and machine learning.
    Modelling and mapping the environment is a lengthy, time-consuming and expensive process. Cost limits the number of observations that can be obtained, which means that creating comprehensive, spatially continuous maps depends on filling in the gaps between these observations.
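    The underlying problem can be sketched in a few lines: given scattered observations, estimate values everywhere else. The simple interpolation below only illustrates that gap-filling setup; the paper's AI-based approach is far more sophisticated.
    ```python
    # Minimal gap-filling illustration: interpolate a spatially continuous map
    # from sparse point observations. Purely illustrative; not the paper's method.
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)
    obs_xy = rng.uniform(0, 10, size=(50, 2))              # 50 scattered sample locations
    obs_val = np.sin(obs_xy[:, 0]) + 0.1 * obs_xy[:, 1]    # measured quantity at each site

    # Dense grid of prediction points (the "spatially continuous map").
    gx, gy = np.meshgrid(np.linspace(0, 10, 100), np.linspace(0, 10, 100))
    grid_estimate = griddata(obs_xy, obs_val, (gx, gy), method="cubic")
    print(grid_estimate.shape)  # (100, 100) map inferred from 50 observations
    ```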

  • Tiny battery-free devices float in the wind like dandelion seeds

    Wireless sensors can monitor how temperature, humidity or other environmental conditions vary across large swaths of land, such as farms or forests.
    These tools could provide unique insights for a variety of applications, including digital agriculture and monitoring climate change. One problem, however, is that it is currently time-consuming and expensive to physically place hundreds of sensors across a large area.
    Inspired by how dandelions use the wind to distribute their seeds, a University of Washington team has developed a tiny sensor-carrying device that can be blown by the wind as it tumbles toward the ground. This system is about 30 times as heavy as a 1 milligram dandelion seed, but it can still travel up to 100 meters, about the length of a football field, in a moderate breeze from where it was released by a drone. Once on the ground, the device, which can hold at least four sensors, uses solar panels to power its onboard electronics and can share sensor data up to 60 meters away.
    The team published these results March 16 in Nature.
    “We show that you can use off-the-shelf components to create tiny things. Our prototype suggests that you could use a drone to release thousands of these devices in a single drop. They’ll all be carried by the wind a little differently, and basically you can create a 1,000-device network with this one drop,” said senior author Shyam Gollakota, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “This is amazing and transformational for the field of deploying sensors, because right now it could take months to manually deploy this many sensors.”
    Because the devices have electronics on board, it’s challenging to make the whole system as light as an actual dandelion seed. The first step was to develop a shape that would allow the system to take its time falling to the ground so that it could be tossed around by a breeze. The researchers tested 75 designs to determine what would lead to the smallest “terminal velocity,” or the maximum speed a device would have as it fell through the air.
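    For intuition, terminal velocity is the speed at which aerodynamic drag balances weight. The toy estimate below plugs assumed numbers into that balance; the area and drag coefficient are placeholders, not values from the study.
    ```python
    # Back-of-the-envelope terminal velocity for a lightweight falling structure.
    # All parameter values are assumptions for illustration, not measurements.
    import math

    m = 30e-6     # mass, kg (~30 mg, i.e. roughly 30x a 1 mg dandelion seed)
    g = 9.81      # gravitational acceleration, m/s^2
    rho = 1.2     # air density, kg/m^3
    A = 5e-4      # projected area of the porous structure, m^2 (assumed)
    Cd = 1.5      # drag coefficient for a bristled disk shape (assumed)

    # Steady fall speed where drag equals weight: v = sqrt(2 m g / (rho A Cd)).
    v_terminal = math.sqrt(2 * m * g / (rho * A * Cd))
    print(f"estimated terminal velocity: {v_terminal:.2f} m/s")
    ```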

  • Toward a quantum computer that calculates molecular energy

    Quantum computers are getting bigger, but there are still few practical ways to take advantage of their extra computing power. To get over this hurdle, researchers are designing algorithms to ease the transition from classical to quantum computers. In a new study in Nature, researchers unveil an algorithm that reduces the statistical errors, or noise, produced by quantum bits, or qubits, in crunching chemistry equations.
    Developed by Columbia chemistry professor David Reichman and postdoc Joonho Lee with researchers at Google Quantum AI, the algorithm uses up to 16 qubits on Sycamore, Google’s 53-qubit computer, to calculate ground state energy, the lowest energy state of a molecule. “These are the largest quantum chemistry calculations that have ever been done on a real quantum device,” Reichman said.
    The ability to accurately calculate ground state energy will enable chemists to develop new materials, said Lee, who is also a visiting researcher at Google Quantum AI. The algorithm could be used to design materials to speed up nitrogen fixation for farming and hydrolysis for making clean energy, among other sustainability goals, he said.
    The algorithm uses quantum Monte Carlo, a family of methods for calculating probabilities when a large number of random, unknown variables are at play, as in a game of roulette. Here, the researchers used their algorithm to determine the ground state energy of three systems: the hydrogen cluster H4, using eight qubits for the calculation; molecular nitrogen (N2), using 12 qubits; and solid diamond, using 16 qubits.
    Ground state energy is influenced by variables such as the number of electrons in a molecule, the direction in which they spin, and the paths they take as they orbit a nucleus. This electronic energy is encoded in the Schrödinger equation. Solving the equation on a classical computer becomes exponentially harder as molecules get bigger, although methods for estimating the solution have made the process easier. How quantum computers might circumvent the exponential scaling problem has been an open question in the field.
    In principle, quantum computers should be able to handle exponentially larger and more complex calculations, like those needed to solve the Schrödinger equation, because the qubits that make them up take advantage of quantum states. Unlike binary digits, or bits, made up of ones and zeros, qubits can exist in two states simultaneously. Qubits, however, are fragile and error-prone: the more qubits used, the less accurate the final answer. Lee’s algorithm harnesses the combined power of classical and quantum computers to solve chemistry equations more efficiently while minimizing the quantum computer’s mistakes.
    “It’s the best of both worlds,” Lee said. “We leveraged tools that we already had as well as tools that are considered state-of-the-art in quantum information science to refine quantum computational chemistry.”
    A classical computer can handle most of Lee’s quantum Monte Carlo simulation. Sycamore jumps in for the last, most computationally complex step: the calculation of the overlap between a trial wave function — a guess at the mathematical description of the ground state energy that can be implemented by the quantum computer — and a sample wave function, which is part of the Monte Carlo’s statistical process. This overlap provides a set of constraints, known as the boundary condition, to the Monte Carlo sampling, which ensures the statistical efficiency of the calculation.
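    A heavily simplified numerical sketch of this division of labour follows: random vectors stand in for wave functions, a small symmetric matrix stands in for the Hamiltonian, and plain NumPy computes the overlaps that the quantum processor evaluates in the real experiment. It shows only the structure of the hybrid scheme, not the published algorithm.
    ```python
    # Toy hybrid quantum Monte Carlo sketch (conceptual only, not the Nature result).
    import numpy as np

    rng = np.random.default_rng(1)
    dim = 16                                     # toy Hilbert-space dimension

    def normalize(v):
        return v / np.linalg.norm(v)

    psi_trial = normalize(rng.normal(size=dim))                       # trial wave function (the guess)
    walkers = [normalize(rng.normal(size=dim)) for _ in range(200)]   # Monte Carlo sample states

    H = rng.normal(size=(dim, dim))
    H = (H + H.T) / 2                            # toy symmetric Hamiltonian

    # "Quantum" step: overlaps <psi_trial|phi> for each walker, the quantity the
    # quantum processor would evaluate.
    overlaps = np.array([psi_trial @ phi for phi in walkers])

    # Classical step: use the overlaps as a crude constraint (keep positive-overlap
    # walkers) and form a mixed energy estimate
    # E = sum_i <psi_trial|H|phi_i> / sum_i <psi_trial|phi_i>.
    keep = overlaps > 0
    numerators = np.array([psi_trial @ H @ phi for phi in walkers])
    energy_estimate = numerators[keep].sum() / overlaps[keep].sum()
    print(f"toy mixed-estimator energy: {energy_estimate:.3f}")
    ```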
    The prior record for solving ground state energy used 12 qubits and a method called the variational quantum eigensolver, or VQE. But VQE ignored the effects of interacting electrons, an important variable in calculating ground state energy that Lee’s quantum Monte Carlo algorithm now includes. Adding virtual correlation techniques from classical computers could help chemists tackle even larger molecules, Lee said.
    The hybrid classical-quantum calculations in this new work were found to be as accurate as some of the best classical methods. This suggests that problems could be solved more accurately and/or quickly with a quantum computer than without — a key milestone for quantum computing. Lee and his colleagues will continue to tweak their algorithm to make it more efficient, while engineers work to build better quantum hardware.
    “The feasibility of solving larger and more challenging chemical problems will only increase with time,” Lee said. “This gives us hope that quantum technologies that are being developed will be practically useful.”
    Story Source:
    Materials provided by Columbia University. Original written by Ellen Neff. Note: Content may be edited for style and length.

  • This fabric can hear your heartbeat

    Someday our clothing may eavesdrop on the soundtrack of our lives, capturing the noises around and inside us.

    A new fiber acts as a microphone — picking up speech, rustling leaves and chirping birds — and turns those acoustic signals into electrical ones. Woven into a fabric, the material can even hear handclaps and faint sounds, such as its wearer’s heartbeat, researchers report March 16 in Nature. Such fabrics could provide a comfortable, nonintrusive — even fashionable — way to monitor body functions or aid with hearing.

    Acoustic fabrics have existed for perhaps hundreds of years, but they’re used to dampen sound, says Wei Yan, a materials scientist at Nanyang Technological University in Singapore. Fabric as a microphone is “totally a different concept,” says Yan, who worked on the fabric while at MIT.

    Yan and his colleagues were inspired by the human eardrum. Sound waves cause vibrations in the eardrum, which are converted to electrical signals by the cochlea. “It turns out that this eardrum is made of fibers,” says Yoel Fink, a materials scientist at MIT. In the eardrum’s inner layers, collagen fibers radiate from the center, while others form concentric rings. The crisscrossing fibers play a role in hearing and look similar to the fabrics people weave, Fink says.

    Analogous to what’s happening in an eardrum, sound vibrates fabric at the nanoscale. In the new fabric, cotton fibers and others of a somewhat stiff material called Twaron efficiently convert incoming sound to vibrations. Woven together with these threads is a single fiber that contains a blend of piezoelectric materials, which produce a voltage when pressed or bent (SN: 8/22/17). The buckling and bending of the piezoelectric-containing fiber create electrical signals that can be sent through a tiny circuit board to a device that reads and records the voltage.
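
    The signal chain can be pictured with a toy model: treat the fiber's output as a voltage roughly proportional to the acoustic pressure bending it. The sensitivity figure below is an arbitrary placeholder, not a measured property of the fiber.

    ```python
    # Idealized piezoelectric readout: pressure waveform in, voltage trace out.
    # The sensitivity value is assumed for illustration, not measured.
    import numpy as np

    fs = 8000                                      # sample rate, Hz
    t = np.arange(0, 0.02, 1 / fs)                 # 20 ms window
    pressure = 0.5 * np.sin(2 * np.pi * 440 * t)   # 440 Hz test tone (arbitrary input)

    sensitivity = 2e-3                             # volts per unit pressure (assumed)
    voltage = sensitivity * pressure               # idealized response to bending

    # A small readout circuit would digitize this trace for logging or playback.
    print(f"peak output: {voltage.max() * 1e3:.2f} mV")
    ```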

    The fabric microphone is sensitive to a range of noise levels, from a quiet library to heavy traffic, the team reports, although it is continuing to investigate what signal processing is needed to disentangle target sounds from ambient noise. Integrated into clothing, this sound-sensing fabric feels like regular fabric, Yan says. And it continued to work as a microphone after being washed 10 times.

    Woven into fabric, a specialized fiber (pictured, center) creates electrical signals when bent or buckled, turning the entire material into a microphone. (Fink Lab/MIT, Elizabeth Meiklejohn/RISD, Greg Hren)

    Piezoelectric materials have “huge potential” for applications from observing the function of bodies to monitoring the integrity of aircraft materials, says Vijay Thakur, a materials scientist at Scotland’s Rural College in Edinburgh who was not part of this work. They’ve even been proposed for energy generation, but, he says, many uses have been limited by the tiny voltages they produce (SN: 10/1/15). The way the fibers are made in this fabric — sandwiching a blend of piezoelectric materials between other components, including a flexible, stretchy outer material — concentrates the energy from the vibrations into the piezoelectric layer, enhancing the signal it produces.

    As a proof of concept, the team incorporated the fabric into a shirt, which could hear its wearer’s heart like a stethoscope does. Used this way, the fabric microphone could listen for murmurs and may someday be able to provide information similar to an echocardiogram, an ultrasound of the heart, Thakur says. If it proves effective as a monitoring and diagnostic tool, placing such microphones into clothing may someday make it easier for doctors to track heart conditions in young children, who have trouble keeping still, he says.

    The team also anticipates that fabric microphones could aid hearing and communication. Another shirt the team created had two piezoelectric fibers spaced apart on the shirt’s back. Based on when each fiber picked up the sound, this shirt can be used to detect the direction a clap came from. And when hooked up to a power source, the fabric microphones can project sound as a speaker.

    “For the past 20 years, we’ve been trying to introduce a new way of thinking about fabrics,” Fink says. Beyond providing beauty and warmth, fabrics may help solve technological problems. And perhaps, Fink says, they can beautify technology too.

  • AI to predict antidepressant outcomes in youth

    Mayo Clinic researchers have taken the first step in using artificial intelligence (AI) to predict early outcomes with antidepressants in children and adolescents with major depressive disorder, in a study published in The Journal of Child Psychology and Psychiatry. This work resulted from a collaborative effort between the departments of Molecular Pharmacology and Experimental Therapeutics, and Psychiatry and Psychology, at Mayo Clinic, with support from Mayo Clinic’s Center for Individualized Medicine.
    “This preliminary work suggests that AI has promise for assisting clinical decisions by informing physicians on the selection, use and dosing of antidepressants for children and adolescents with major depressive disorder,” says Paul Croarkin, D.O., a Mayo Clinic psychiatrist and senior author of the study. “We saw improved predictions of treatment outcomes in samples of children and adolescents across two classes of antidepressants.”
    In the study, researchers identified variation in six depressive symptoms: difficulty having fun, social withdrawal, excessive fatigue, irritability, low self-esteem and depressed feelings.
    They assessed these symptoms with the Children’s Depression Rating Scale-Revised to predict outcomes of 10 to 12 weeks of antidepressant pharmacotherapy. Measured at four to six weeks, the six symptoms predicted 10- to 12-week outcomes in fluoxetine testing datasets with an average accuracy of 73%, and in duloxetine testing datasets with an average accuracy of 76%. In placebo-treated patients, the accuracy of predicting response and remission was significantly lower, at 67%.
    These outcomes show the potential of AI and patient data to ensure children and adolescents receive treatment that has the highest likelihood of delivering therapeutic benefits with minimized side effects, explains Arjun Athreya, Ph.D., a Mayo Clinic researcher and lead author of the study.
    “We designed the algorithm to mimic a clinician’s logic of treatment management at an interim time point based on their estimated guess of whether a patient will likely or not benefit from pharmacotherapy at the current dose,” says Dr. Athreya. “Hence, it was essential for me as a computer engineer to embed and observe the practice closely to not only understand the needs of the patient, but also how AI can be consumed and useful to the clinician to benefit the patient.”
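    The shape of the prediction task can be sketched with generic tools: six symptom scores at weeks four to six go in, and a predicted 10- to 12-week outcome comes out. The data below are synthetic and the model is a stand-in, not the study's actual method.
    ```python
    # Hypothetical sketch of the prediction task: six interim symptom scores ->
    # predicted treatment outcome. Synthetic data; not the Mayo Clinic models.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 300
    # Six CDRS-R item scores at weeks 4-6 (difficulty having fun, social withdrawal, ...).
    symptoms_wk6 = rng.integers(1, 8, size=(n, 6)).astype(float)
    # Synthetic label: 1 = responded by weeks 10-12, 0 = did not.
    responded = (symptoms_wk6.mean(axis=1) + rng.normal(scale=1.0, size=n) < 4.5).astype(int)

    model = LogisticRegression(max_iter=1000)
    accuracy = cross_val_score(model, symptoms_wk6, responded, cv=5).mean()
    print(f"cross-validated accuracy on synthetic data: {accuracy:.2f}")
    ```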
    Next steps
    The research findings are a foundation for future work incorporating physiological information, brain-based measures and pharmacogenomic data for precision medicine approaches in treating youth with depression. This will improve the care of young patients with depression, and help clinicians initiate and dose antidepressants in patients who benefit most.
    “Technological advances are understudied tools that could enhance treatment approaches,” says Liewei Wang, M.D., Ph.D., the Bernard and Edith Waterman Director of the Pharmacogenomics Program and Director of the Center for Individualized Medicine at the Mayo Clinic. “Predicting outcomes in children and adolescents treated for depression is critical in managing what could become a lifelong disease burden.”
    Acknowledgments
    This work was supported by Mayo Clinic Foundation for Medical Education and Research; the National Science Foundation under award No. 2041339; and the National Institute of Mental Health under awards R01MH113700, R01MH124655 and R01AA027486. The content is solely the authors’ responsibility and does not necessarily represent the official views of the funding agencies. The authors have declared no competing or potential conflicts of interest.
    Story Source:
    Materials provided by Mayo Clinic. Original written by Colette Gallagher. Note: Content may be edited for style and length.

  • Nuclear reactor power levels can be monitored using seismic and acoustic data

    Seismic and acoustic data recorded 50 meters away from a research nuclear reactor could predict whether the reactor was in an on or off state with 98% accuracy, according to a new study published in Seismological Research Letters.
    By applying several machine learning models to the data, researchers at Oak Ridge National Laboratory could also predict when the reactor was transitioning between on and off, and estimate its power levels, with about 66% accuracy.
    The findings provide another tool for the international community to cooperatively verify and monitor nuclear reactor operations in a minimally invasive way, said the study’s lead author Chengping Chai, a geophysicist at Oak Ridge. “Nuclear reactors can be used for both benign and nefarious activities. Therefore, verifying that a nuclear reactor is operating as declared is of interest to the nuclear nonproliferation community.”
    Although seismic and acoustic data have long been used to monitor earthquakes and the structural properties of infrastructure such as buildings and bridges, some researchers now use the data to take a closer look at the movements associated with industrial processes. In this case, Chai and colleagues deployed seismic and acoustic sensors around the High Flux Isotope Reactor at Oak Ridge, a research reactor used to produce neutrons for studies in physics, chemistry, biology, engineering and materials science.
    The reactor’s operation is a thermal process, with a cooling tower that dissipates heat. “We found that seismo-acoustic sensors can record the mechanical signatures of vibrating equipment, such as fans and pumps at the cooling tower, with enough accuracy to shed light on operational questions,” Chai said.
    The researchers then compared a number of machine learning algorithms to discover which were best at estimating the reactor’s power state from specific seismo-acoustic signals. The algorithms were trained with seismic-only, acoustic-only and both types of data collected over a year. The combined data produced the best results, they found.
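    A hypothetical sketch of that comparison follows: train a classifier on seismic-only, acoustic-only and combined feature sets, then compare cross-validated accuracy at predicting the reactor's on/off state. The features and model are assumptions, not details from the study.
    ```python
    # Hypothetical modality comparison for on/off classification. Synthetic
    # features; the study's actual features and models differ.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    n = 500
    seismic = rng.normal(size=(n, 8))        # e.g. band-limited spectral features
    acoustic = rng.normal(size=(n, 8))
    # Synthetic on/off label loosely tied to both modalities.
    state = (seismic[:, 0] + acoustic[:, 0] + rng.normal(scale=0.5, size=n) > 0).astype(int)

    for name, X in [("seismic", seismic),
                    ("acoustic", acoustic),
                    ("combined", np.hstack([seismic, acoustic]))]:
        acc = cross_val_score(RandomForestClassifier(n_estimators=100), X, state, cv=5).mean()
        print(f"{name:9s} accuracy: {acc:.2f}")
    ```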
    “The seismo-acoustic signals associated with different power levels show complicated patterns that are difficult to analyze with traditional techniques,” Chai explained. “The machine learning approaches are able to infer the complex relationship between different reactor systems and their seismo-acoustic fingerprint and use it to predict power levels.”
    Chai and colleagues detected some interesting signals during the course of their study, including the vibrations of a noisy pump in the reactor’s off state, which disappeared when the pump was replaced.
    Chai said it is a long-term and challenging goal to associate seismic and acoustic signatures with different industrial activities and equipment. For the High Flux Isotope Reactor, preliminary research shows that fans and pumps have different seismo-acoustic fingerprints, and that different fan speeds have their own unique signatures.
    “Some normal but less frequent activities such as yearly or incidental maintenance need to be distinguished in seismic and acoustic data,” Chai said. To better understand how these signatures relate to specific operations, “we need to study both the seismic and acoustic signatures of instruments and the background noise at various industrial facilities.”
    Story Source:
    Materials provided by Seismological Society of America. Note: Content may be edited for style and length.