More stories

  • Do Alexa and Siri make kids bossier? New research suggests you might not need to worry

    Chatting with a robot is now part of many families’ daily lives, thanks to conversational agents such as Apple’s Siri or Amazon’s Alexa. Recent research has shown that children are often delighted to find that they can ask Alexa to play their favorite songs or call Grandma.
    But does hanging out with Alexa or Siri affect the way children communicate with their fellow humans? Probably not, according to a recent study led by the University of Washington that found that children are sensitive to context when it comes to these conversations.
    The team had a conversational agent teach 22 children between the ages of 5 and 10 to use the word “bungo” to ask it to speak more quickly. The children readily used the word whenever the agent slowed its speech. While most children did use bungo in conversations with their parents, it became a source of play or an inside joke about acting like a robot. But when a researcher spoke slowly to the children, the kids rarely used bungo, and often waited patiently for the researcher to finish talking before responding.
    The researchers published their findings in June at the 2021 Interaction Design and Children conference.
    “We were curious to know whether kids were picking up conversational habits from their everyday interactions with Alexa and other agents,” said senior author Alexis Hiniker, a UW assistant professor in the Information School. “A lot of the existing research looks at agents designed to teach a particular skill, like math. That’s somewhat different from the habits a child might incidentally acquire by chatting with one of these things.”
    The researchers recruited 22 families from the Seattle area to participate in a five-part study. This project took place before the COVID-19 pandemic, so each child visited a lab with one parent and one researcher. For the first part of the study, children spoke to a simple animated robot or cactus on a tablet screen that also displayed the text of the conversation.

  • Researchers develop new tool for analyzing large superconducting circuits

    The next generation of computing and information processing lies in the intriguing world of quantum mechanics. Quantum computers are expected to be capable of solving large, extremely complex problems that are beyond the capacity of today’s most powerful supercomputers.
    New research tools are needed to advance the field and fully develop quantum computers. Now Northwestern University researchers have developed and tested a theoretical tool for analyzing large superconducting circuits. These circuits use superconducting quantum bits, or qubits, the smallest units of a quantum computer, to store information.
    Circuit size is important since protection from detrimental noise tends to come at the cost of increased circuit complexity. Currently there are few tools that tackle the modeling of large circuits, making the Northwestern method an important contribution to the research community.
    “Our framework is inspired by methods originally developed for the study of electrons in crystals and allows us to obtain quantitative predictions for circuits that were previously hard or impossible to access,” said Daniel Weiss, corresponding and first author of the paper. He is a fourth-year graduate student in the research group of Jens Koch, an expert in superconducting qubits.
    Koch, an associate professor of physics and astronomy in Weinberg College of Arts and Sciences, is a member of the Superconducting Quantum Materials and Systems Center (SQMS) and the Co-design Center for Quantum Advantage (C2QA). Both national centers were established last year by the U.S. Department of Energy (DOE). SQMS is focused on building and deploying a beyond-state-of-the-art quantum computer based on superconducting technologies. C2QA is building the fundamental tools necessary to create scalable, distributed and fault-tolerant quantum computer systems.
    “We are excited to contribute to the missions pursued by these two DOE centers and to add to Northwestern’s visibility in the field of quantum information science,” Koch said.
    In their study, the Northwestern researchers illustrate the use of their theoretical tool by extracting from a protected circuit quantitative information that was unobtainable using standard techniques.
    Details were published today (Sept. 13) in the open access journal Physical Review Research.
    The researchers specifically studied protected qubits. These qubits are protected from detrimental noise by design and could yield coherence times (how long quantum information is retained) that are much longer than current state-of-the-art qubits.
    These superconducting circuits are necessarily large, and the Northwestern tool is a means for quantifying the behavior of these circuits. There are some existing tools that can analyze large superconducting circuits, but each works well only when certain conditions are met. The Northwestern method is complementary and works well when these other tools may give suboptimal results.
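    The Northwestern framework itself is not reproduced here, but the workflow it builds on, writing down a circuit Hamiltonian in a convenient basis and diagonalizing it numerically, can be illustrated with the simplest superconducting circuit, the transmon. A minimal sketch (standard textbook treatment, not the authors' method; the parameter values are arbitrary):

```python
import numpy as np

def transmon_levels(EC=0.2, EJ=10.0, ng=0.0, ncut=20):
    """Diagonalize the transmon Hamiltonian in the truncated charge basis.

    H = 4*EC*(n - ng)^2 - (EJ/2) * sum_n (|n><n+1| + |n+1><n|)
    Energies in GHz; ncut truncates the Cooper-pair number basis.
    """
    n = np.arange(-ncut, ncut + 1)
    dim = 2 * ncut + 1
    H = np.diag(4.0 * EC * (n - ng) ** 2)                 # charging energy
    H += -0.5 * EJ * (np.eye(dim, k=1) + np.eye(dim, k=-1))  # Josephson tunneling
    return np.linalg.eigvalsh(H)

levels = transmon_levels()
f01 = levels[1] - levels[0]              # qubit transition frequency
anharm = (levels[2] - levels[1]) - f01   # anharmonicity, roughly -EC
print(f"f01 = {f01:.3f} GHz, anharmonicity = {anharm:.3f} GHz")
```

    For this EJ/EC ratio of 50 the transition frequency lands near the familiar transmon estimate sqrt(8*EC*EJ) - EC. The point of tools like the Northwestern framework is that large protected circuits cannot be handled by this kind of brute-force basis truncation, which scales exponentially with the number of circuit nodes.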
    Story Source:
    Materials provided by Northwestern University. Original written by Megan Fellman. Note: Content may be edited for style and length.

  • Star attraction: Magnetism generated by star-like arrangement of molecules

    A 2D nanomaterial consisting of organic molecules linked to metal atoms in a specific atomic-scale geometry shows non-trivial electronic and magnetic properties due to strong interactions between its electrons.
    A new study, published today, shows the emergence of magnetism in a 2D organic material due to strong electron-electron interactions; these interactions are the direct consequence of the material’s unique, star-like atomic-scale structure.
    This is the first observation of local magnetic moments emerging from interactions between electrons in an atomically thin 2D organic material.
    The findings have potential for applications in next-generation electronics based on organic nanomaterials, where tuning of interactions between electrons can lead to a vast range of electronic and magnetic phases and properties.
    STRONG ELECTRON-ELECTRON INTERACTIONS IN A 2D ORGANIC KAGOME MATERIAL
    The Monash University study investigated a 2D metal-organic nanomaterial composed of organic molecules arranged in a kagome geometry, that is, following a ‘star-like’ pattern.
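    What makes the kagome ('star-like') geometry special is that its nearest-neighbour band structure contains a completely flat band: kinetic energy is quenched there, so electron-electron interactions dominate and local magnetic moments can emerge. A minimal tight-binding sketch of this textbook property (illustrative only, not the Monash group's calculation):

```python
import numpy as np

# Nearest-neighbour tight-binding model on the kagome lattice (hopping t).
# d holds the half-bond vectors between the three sublattice sites.
t = 1.0
d = np.array([[1.0, 0.0], [0.5, np.sqrt(3) / 2], [-0.5, np.sqrt(3) / 2]])

def kagome_bands(k):
    """3x3 Bloch Hamiltonian; returns sorted band energies at momentum k."""
    c = np.cos(d @ k)
    H = -2 * t * np.array([[0,    c[0], c[1]],
                           [c[0], 0,    c[2]],
                           [c[1], c[2], 0   ]])
    return np.linalg.eigvalsh(H)

# Sample the Brillouin zone: the top band is flat at E = 2t everywhere.
rng = np.random.default_rng(0)
ks = rng.uniform(-np.pi, np.pi, size=(500, 2))
top = np.array([kagome_bands(k)[-1] for k in ks])
print(f"top band: mean {top.mean():.6f}, spread {top.std():.2e}")
```

    Electrons in the flat band have no kinetic energy scale to compete with their mutual Coulomb repulsion, which is the standard route by which kagome systems develop interaction-driven magnetism.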

  • Quantum materials cut closer than ever

    DTU and Graphene Flagship researchers have taken the art of patterning nanomaterials to the next level. Precise patterning of 2D materials is a route to computation and storage using 2D materials, which can deliver better performance and much lower power consumption than today’s technology.
    One of the most significant recent discoveries in physics and materials technology is the class of two-dimensional materials such as graphene. Graphene is stronger, smoother, lighter, and better at conducting heat and electricity than any other known material.
    Perhaps their most distinctive feature is their programmability: by creating delicate patterns in these materials, we can change their properties dramatically and possibly make precisely what we need.
    At DTU, scientists have spent more than a decade advancing the state of the art in patterning 2D materials, using sophisticated lithography machines in the 1,500 m² cleanroom facility. Their work is based in DTU’s Center for Nanostructured Graphene, supported by the Danish National Research Foundation and a part of The Graphene Flagship.
    The electron beam lithography system in DTU Nanolab can write details down to 10 nanometers. Computer calculations can predict exactly the shape and size of patterns in the graphene to create new types of electronics. They can exploit the charge of the electron and quantum properties such as spin or valley degrees of freedom, leading to high-speed calculations with far less power consumption. These calculations, however, ask for higher resolution than even the best lithography systems can deliver: atomic resolution.
    “If we really want to unlock the treasure chest for future quantum electronics, we need to go below 10 nanometers and approach the atomic scale,” says professor and group leader at DTU Physics, Peter Bøggild.

  • A universal system for decoding any type of data sent across a network

    Every piece of data that travels over the internet — from paragraphs in an email to 3D graphics in a virtual reality environment — can be altered by the noise it encounters along the way, such as electromagnetic interference from a microwave or Bluetooth device. The data are coded so that when they arrive at their destination, a decoding algorithm can undo the negative effects of that noise and retrieve the original data.
    Since the 1950s, most error-correcting codes and decoding algorithms have been designed together. Each code had a structure that corresponded with a particular, highly complex decoding algorithm, which often required the use of dedicated hardware.
    Researchers at MIT, Boston University, and Maynooth University in Ireland have now created the first silicon chip that is able to decode any code, regardless of its structure, with maximum accuracy, using a universal decoding algorithm called Guessing Random Additive Noise Decoding (GRAND). By eliminating the need for multiple, computationally complex decoders, GRAND enables increased efficiency that could have applications in augmented and virtual reality, gaming, 5G networks, and connected devices that rely on processing a high volume of data with minimal delay.
    The research at MIT is led by Muriel Médard, the Cecil H. and Ida Green Professor in the Department of Electrical Engineering and Computer Science, and was co-authored by Amit Solomon and Wei Ann, both graduate students at MIT; Rabia Tugce Yazicigil, assistant professor of electrical and computer engineering at Boston University; Arslan Riaz and Vaibhav Bansal, both graduate students at Boston University; Ken R. Duffy, director of the Hamilton Institute at the National University of Ireland at Maynooth; and Kevin Galligan, a Maynooth graduate student. The research will be presented at the European Solid-State Device Research and Circuits Conference next week.
    Focus on noise
    One way to think of these codes is as redundant hashes (in this case, a series of 1s and 0s) added to the end of the original data. The rules for the creation of that hash are stored in a specific codebook.
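    The core loop of GRAND can be sketched in a few lines: guess noise patterns from most likely to least likely (on a binary symmetric channel, that means increasing Hamming weight), remove each guess from the received word, and stop at the first result that appears in the codebook. A toy Python sketch, using a simple repetition code purely as an example codebook (GRAND itself never inspects the code's structure; this is not the chip's implementation):

```python
import itertools

def grand_decode(received, codebook, max_weight=3):
    """Guessing Random Additive Noise Decoding (toy version).

    Try noise patterns from most to least likely: for a binary symmetric
    channel that means increasing Hamming weight. Flip those bits out of
    the received word and return the first result found in the codebook.
    """
    n = len(received)
    for w in range(max_weight + 1):
        for flips in itertools.combinations(range(n), w):
            guess = list(received)
            for i in flips:
                guess[i] ^= 1              # undo the guessed noise bits
            guess = tuple(guess)
            if guess in codebook:          # membership test works for any code
                return guess
    return None                            # give up: noise too heavy

# Example codebook: a 3x repetition code (5 data bits -> 15 coded bits).
msgs = itertools.product([0, 1], repeat=5)
codebook = {tuple(b for b in m for _ in range(3)) for m in msgs}

sent = tuple(b for b in (1, 0, 1, 1, 0) for _ in range(3))
noisy = list(sent)
noisy[7] ^= 1                              # one channel bit-flip
decoded = grand_decode(tuple(noisy), codebook)
print(decoded == sent)                     # the flip is guessed and undone
```

    Because the decoder only ever asks "is this word a codeword?", the same hardware can serve any code, which is the universality the chip exploits.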

  • Emissions from computing and ICT could be worse than previously thought

    Global computing could be responsible for a greater share of greenhouse gas emissions than previously thought and these emissions will continue to rise significantly unless action is taken, a new study highlights.
    A team of researchers from Lancaster University and sustainability consultancy Small World Consulting Ltd claim that previous calculations of ICT’s share of global greenhouse gas emissions, estimated at 1.8-2.8%, likely fall short of the sector’s real climate impact because they show only a partial picture.
    The researchers point out that some of these prior estimates do not account for the full life-cycle and supply chain of ICT products and infrastructure — such as: the energy expended in manufacturing the products and equipment; the carbon cost associated with all of their components and the operational carbon footprint of the companies behind them; the energy consumed when using the equipment; and also their disposal after they have fulfilled their purpose.
    The researchers argue ICT’s true proportion of global greenhouse gas emissions could be around 2.1-3.9%, though they stress that significant uncertainties remain around these calculations. Although like-for-like comparisons are difficult, these figures suggest ICT’s emissions exceed those of the aviation industry, which accounts for around 2% of global emissions.
    In addition, the paper warns that new trends in computing and ICT such as big data and AI, the Internet of Things, as well as blockchain and cryptocurrencies, risk driving further substantial growth in ICT’s greenhouse gas footprint.
    In their new paper ‘The real climate and transformative impact of ICT: A critique of estimates, trends and regulations’ published today by the journal Patterns, the researchers looked at two central issues — ICT’s own carbon footprint, as well as ICT’s impact on the rest of the economy.
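    The life-cycle accounting the researchers call for is, at its core, a sum over stages rather than a single use-phase number. A trivial sketch with hypothetical placeholder figures for one laptop (none of these numbers come from the study):

```python
# Full life-cycle accounting as a trivial sum over stages.
# All numbers are hypothetical placeholders, not figures from the study.
footprint_kg_co2e = {
    "manufacturing (chips, components)": 250.0,
    "supply chain / corporate operations": 60.0,
    "use-phase electricity (4 years)": 120.0,
    "end-of-life disposal and recycling": 15.0,
}

total = sum(footprint_kg_co2e.values())
use_only_share = footprint_kg_co2e["use-phase electricity (4 years)"] / total
print(f"life-cycle total: {total:.0f} kg CO2e")
print(f"use phase alone: {use_only_share:.0%} of the total")
```

    Even with made-up numbers the point is visible: an estimate built on the use phase alone can miss most of the footprint.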

  • AI can make better clinical decisions than humans: Study

    It’s an old adage: there’s no harm in getting a second opinion. But what if that second opinion could be generated by a computer, using artificial intelligence? Would it come up with better treatment recommendations than the ones your practitioner proposes?
    A pair of Canadian mental-health researchers believe it can. In a study published in the Journal of Applied Behavior Analysis, Marc Lanovaz of Université de Montréal and Kieva Hranchuk of St. Lawrence College, in Ontario, make a case for using AI in treating behavioural problems.
    “Medical and educational professionals frequently disagree on the effectiveness of behavioral interventions, which may cause people to receive inadequate treatment,” said Lanovaz, an associate professor who heads the Applied Behavioural Research Lab at UdeM’s School of Psychoeducation.
    To find a better way, Lanovaz and Hranchuk, a professor of behavioural science and behavioural psychology at St. Lawrence, compiled simulated data from 1,024 individuals receiving treatment for behavioral issues.
    The researchers then compared the treatment conclusions drawn in each case by five doctoral-level behavior analysts with those produced by a computer model the two academics developed using machine learning.
    “The five professionals only came to the same conclusions approximately 75 per cent of the time,” said Lanovaz. “More importantly, machine learning produced fewer decision-making errors than did all the professionals.”
    Given these very positive results, the next step would be to “integrate our models in an app that could automatically make decisions or provide feedback about how treatment is progressing,” he added.
    The goal, the researchers believe, should be to use machine learning to facilitate the work of professionals, not actually replace them, while also making treatment decisions more consistent and predictable.
    “For example, doctors could someday use the technology to help them decide whether to continue or terminate the treatment of people with disorders as varied as autism, ADHD, anxiety and depression,” Lanovaz said.
    “Individualized clinical and educational decision-making is one of the cornerstones of psychological and behavioral treatment. Our study may thus lead to better treatment options for the millions of individuals who receive these types of services worldwide.”
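    The study's data and models are not reproduced here, but the shape of the comparison, a learned decision rule versus several idiosyncratic human thresholds applied to noisy simulated cases, can be sketched with made-up numbers (every parameter below is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(n_cases):
    """Simulated single-case records: observed change score + ground truth.

    Half the cases have a true treatment effect of 2 points, half have
    none; the observed change is blurred by measurement noise.
    """
    effect = rng.choice([0.0, 2.0], size=n_cases)
    observed = effect + rng.normal(0.0, 0.45, size=n_cases)
    return observed, (effect > 0)

# "Model": pick the decision threshold that minimizes training error.
train_x, train_y = simulate(600)
grid = np.linspace(-1, 3, 201)
errs = [np.mean((train_x > c) != train_y) for c in grid]
cut = grid[int(np.argmin(errs))]

test_x, test_y = simulate(400)
model_err = np.mean((test_x > cut) != test_y)

# "Clinicians": idiosyncratic thresholds plus extra judgement noise.
rater_cuts = [0.2, 0.4, 1.7, 0.1, 1.9]
rater_errs = [np.mean((test_x + rng.normal(0, 0.8, test_x.size) > c) != test_y)
              for c in rater_cuts]

print(f"model error {model_err:.2%}, mean clinician error {np.mean(rater_errs):.2%}")
```

    The mechanism mirrors the study's finding: raters who apply inconsistent criteria disagree with each other and err more often than a single rule fitted to the data, even when that rule is very simple.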
    Story Source:
    Materials provided by University of Montreal. Note: Content may be edited for style and length.

  • Largest virtual universe free for anyone to explore

    Forget about online games that promise you a “whole world” to explore. An international team of researchers has generated an entire virtual UNIVERSE, and made it freely available on the cloud to everyone.
    Uchuu (meaning “Outer Space” in Japanese) is the largest and most realistic simulation of the Universe to date. The Uchuu simulation consists of 2.1 trillion particles in a computational cube an unprecedented 9.63 billion light-years to a side. For comparison, that’s about three-quarters the distance between Earth and the most distant observed galaxies. Uchuu will allow us to study the evolution of the Universe on a level of both size and detail inconceivable until now.
    Uchuu focuses on the large-scale structure of the Universe: mysterious halos of dark matter which control not only the formation of galaxies, but also the fate of the entire Universe itself. The scale of these structures ranges from the largest galaxy clusters down to the smallest galaxies. Individual stars and planets aren’t resolved, so don’t expect to find any alien civilizations in Uchuu. But one way that Uchuu wins big in comparison to other virtual worlds is the time domain; Uchuu simulates the evolution of matter over almost the entire 13.8 billion year history of the Universe from the Big Bang to the present. That is over 30 times longer than the time since animal life first crawled out of the seas on Earth.
    Julia F. Ereza, a Ph.D. student at IAA-CSIC who uses Uchuu to study the large-scale structure of the Universe explains the importance of the time domain, “Uchuu is like a time machine: we can go forward, backward and stop in time, we can ‘zoom in’ on a single galaxy or ‘zoom out’ to visualize a whole cluster, we can see what is really happening at every instant and in every place of the Universe from its earliest days to the present, being an essential tool to study the Cosmos.”
    An international team of researchers from Japan, Spain, U.S.A., Argentina, Australia, Chile, France, and Italy created Uchuu using ATERUI II, the world’s most powerful supercomputer dedicated to astronomy. Even with all this power, it still took a year to produce Uchuu. Tomoaki Ishiyama, an associate professor at Chiba University who developed the code used to generate Uchuu, explains, “To produce Uchuu we have used … all 40,200 processors (CPU cores) available exclusively for 48 hours each month. Twenty million supercomputer hours were consumed, and 3 Petabytes of data were generated, the equivalent of 894,784,853 pictures from a 12-megapixel cell phone.”
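    The quoted resource figures can be sanity-checked with back-of-envelope arithmetic (our own, not from the release):

```python
# Back-of-envelope checks on the quoted Uchuu resource figures.
cores = 40_200
hours_per_month = 48
months = 12                      # production reportedly took about a year

core_hours = cores * hours_per_month * months
print(f"core-hours per year: {core_hours:,}")   # roughly twenty million, as quoted

data_bytes = 3e15                # 3 petabytes of simulation output
photos = 894_784_853
print(f"implied size per 12-MP photo: {data_bytes / photos / 1e6:.2f} MB")
```

    Both numbers come out consistent with the quote: about 23 million core-hours per year of 48-hour monthly runs, and an implied photo size of a few megabytes, typical for a 12-megapixel image.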
    Before you start worrying about download time, the research team used high-performance computational techniques to compress information on the formation and evolution of dark matter haloes in the Uchuu simulation into a 100-terabyte catalog. This catalog is now available to everyone on the cloud in an easy to use format thanks to the computational infrastructure skun6 located at the Instituto de Astrofísica de Andalucía (IAA-CSIC), the RedIRIS group, and the Galician Supercomputing Center (CESGA). Future data releases will include catalogues of virtual galaxies and gravitational lensing maps.
    Big Data science products from Uchuu will help astronomers learn how to interpret Big Data galaxy surveys expected in coming years from facilities like the Subaru Telescope and the ESA Euclid space mission.
    Story Source:
    Materials provided by National Institutes of Natural Sciences. Note: Content may be edited for style and length.