More stories

  • Using a video game to understand the origin of emotions

    Emotions are complex phenomena that influence our minds, bodies and behaviour. A number of studies have sought to link given emotions, such as fear or pleasure, to specific areas of the brain, but without success. Some theoretical models suggest instead that emotions emerge through the coordination of multiple mental processes triggered by an event: the brain orchestrates an adapted emotional response by synchronising motivational, expressive and visceral mechanisms. To investigate this hypothesis, a research team from the University of Geneva (UNIGE) studied brain activity using functional MRI, analysing the feelings, expressions and physiological responses of volunteers while they played a video game specially developed to arouse different emotions depending on the progress of the game. The results, published in the journal PLOS Biology, show that different emotional components recruit, in parallel, several neural networks distributed throughout the brain, and that their transient synchronisation generates an emotional state. The somatosensory and motor pathways are among the areas involved in this synchronisation, supporting the idea that emotion is grounded in action-oriented functions that allow an adapted response to events.
    Most studies use passive stimulation to understand the emergence of emotions: they typically present volunteers with photos, videos or images evoking fear, anger, joy or sadness while recording the cerebral response using electroencephalography or imaging. The goal is to pinpoint the specific neural networks for each emotion. “The problem is, these regions overlap for different emotions, so they’re not specific,” begins Joana Leitão, a post-doctoral fellow in the Department of Fundamental Neurosciences (NEUFO) in UNIGE’s Faculty of Medicine and at the Swiss Centre for Affective Sciences (CISA). “What’s more, it’s likely that, although these images represent emotions well, they don’t evoke them.”
    A question of perspective
    Several neuroscientific theories have attempted to model the emergence of an emotion, although none has so far been proven experimentally. The UNIGE research team subscribe to the postulate that emotions are “subjective”: two individuals faced with the same situation may experience a different emotion. “A given event is not assessed in the same way by each person because the perspectives are different,” continues Dr Leitão.
    In a theoretical model known as the component process model (CPM) — devised by Professor Klaus Scherer, the retired founding director of CISA — an event generates multiple responses in the organism. These relate to components of cognitive assessment (novelty or concordance with a goal or norms), motivation, physiological processes (sweating or heart rate), and expression (smiling or shouting). In a situation that sets off an emotional response, these different components influence each other dynamically. It is their transient synchronisation that might correspond to an emotional state.
    Emotional about Pacman
    The Geneva neuroscientists devised a video game to evaluate the applicability of this model. “The aim is to evoke emotions that correspond to different forms of evaluation,” explains Dr Leitão. “Rather than viewing simple images, participants play a video game that puts them in situations they’ll have to evaluate so they can advance and win rewards.” It is an arcade game similar to the famous Pacman: players have to grab coins, touch the “nice monsters,” ignore the “neutral monsters” and avoid the “bad guys” to win points and advance to the next level.
    The scenario involves situations that trigger the four components of the CPM model differently. At the same time, the researchers measured brain activity via imaging; facial expression by analysing the zygomatic muscles; feelings via questions; and physiology by skin and cardiorespiratory measurements. “All of these components involve different circuits distributed throughout the brain,” says the Geneva-based researcher. “By cross-referencing the imaging data with computational modelling, we were able to determine how these components interact over time and at what point they synchronise to generate an emotion.”
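    To make “transient synchronisation” concrete, here is a minimal illustrative sketch (not the authors’ actual model; the signals below are simulated placeholders): it scores how strongly several component time series co-vary inside a sliding window, and windows where the score spikes would mark candidate emotional episodes.

    ```python
    import numpy as np

    # Hypothetical time series for the four CPM components (appraisal,
    # motivation, physiology, expression); real inputs would be fMRI,
    # zygomatic, questionnaire and cardiorespiratory recordings.
    rng = np.random.default_rng(0)
    components = rng.standard_normal((4, 600))

    def sync_index(x, window=20):
        """Mean pairwise correlation of component signals in a sliding window."""
        n, t = x.shape
        scores = np.full(t - window, np.nan)
        for start in range(t - window):
            r = np.corrcoef(x[:, start:start + window])
            scores[start] = (r.sum() - n) / (n * (n - 1))  # mean off-diagonal r
        return scores

    s = sync_index(components)
    candidate_episodes = np.where(s > 0.5)[0]  # transiently synchronised windows
    ```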
    A made-to-measure emotional response
    The results also indicate that a region deep in the brain called the basal ganglia is involved in this synchronisation. This structure is known to act as a convergence point between multiple cortical regions, each equipped with specialised affective, cognitive or sensorimotor processes. The other regions involved include the sensorimotor network, the posterior insula and the prefrontal cortex. “The involvement of the somatosensory and motor zones accords with the postulate of theories that consider emotion as a preparatory mechanism for action that enables the body to promote an adaptive response to events,” concludes Patrik Vuilleumier, full professor at NEUFO and senior author of the study.

    Story Source:
    Materials provided by Université de Genève. Note: Content may be edited for style and length.

  • Tech makes it possible to digitally communicate through human touch

    Instead of inserting a card or scanning a smartphone to make a payment, what if you could simply touch the machine with your finger?
    A prototype developed by Purdue University engineers would essentially let your body act as the link between your card or smartphone and the reader or scanner, making it possible for you to transmit information just by touching a surface.
    The prototype doesn’t transfer money yet, but it’s the first technology that can send any information through the direct touch of a fingertip. While wearing the prototype as a watch, a user’s body can be used to send information such as a photo or password when touching a sensor on a laptop, the researchers show in a new study.
    “We’re used to unlocking devices using our fingerprints, but this technology wouldn’t rely on biometrics — it would rely on digital signals. Imagine logging into an app on someone else’s phone just by touch,” said Shreyas Sen, a Purdue associate professor of electrical and computer engineering.
    “Whatever you touch would become more powerful because digital information is going through it.”
    The study is published in Transactions on Computer-Human Interaction, a journal of the Association for Computing Machinery. Shovan Maity, a Purdue alum, led the study as a Ph.D. student in Sen’s lab. The researchers will also present their findings at the Association for Computing Machinery’s Computer Human Interaction (ACM CHI) conference in May.

    The technology works by establishing an “internet” within the body that smartphones, smartwatches, pacemakers, insulin pumps and other wearable or implantable devices can use to send information. These devices typically communicate using Bluetooth signals that tend to radiate out from the body. A hacker could intercept those signals from 30 feet away, Sen said.
    Sen’s technology instead keeps signals confined within the body by coupling them in a so-called “Electro-Quasistatic range,” which sits much lower on the electromagnetic spectrum than typical Bluetooth communication. This mechanism is what enables information transfer by merely touching a surface.
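    A rough way to see why such low frequencies behave “quasistatically” on the body: at a sub-MHz carrier (the 1 MHz value below is an assumption for illustration, not a figure from the study), the wavelength is hundreds of metres, far longer than a human body, so the signal couples to the body capacitively rather than radiating the way a 2.4 GHz Bluetooth signal does.

    ```python
    # Back-of-the-envelope wavelength comparison (illustrative carrier values).
    c = 3.0e8            # speed of light, m/s
    f_eqs = 1.0e6        # assumed electro-quasistatic carrier, 1 MHz
    f_bluetooth = 2.4e9  # Bluetooth carrier, 2.4 GHz

    def wavelength(f_hz):
        """Free-space wavelength in metres for a given frequency."""
        return c / f_hz

    print(wavelength(f_eqs))        # 300 m, far larger than a ~2 m body: field stays bound
    print(wavelength(f_bluetooth))  # 0.125 m, smaller than the body: radiates outward
    ```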
    Even if your finger hovered just one centimeter above a surface, information wouldn’t transfer through this technology without a direct touch. This would prevent a hacker from stealing private information such as credit card credentials by intercepting the signals.
    The researchers demonstrated this capability in the lab by having a person interact with two adjacent surfaces. Each surface was equipped with an electrode to touch, a receiver to get data from the finger and a light to indicate that data had transferred. If the finger directly touched an electrode, only the light of that surface turned on. The fact that the light of the other surface stayed off indicated that the data didn’t leak out.
    Similarly, if a finger hovered as close as possible over a laptop sensor, a photo wouldn’t transfer. But a direct touch could transfer a photo.

    Credit card machines and apps such as Apple Pay use a more secure alternative to Bluetooth signals — called near-field communication — to receive a payment from tapping a card or scanning a phone. Sen’s technology would add the convenience of making a secure payment in a single gesture.
    “You wouldn’t have to bring a device out of your pocket. You could leave it in your pocket or on your body and just touch,” Sen said.
    The technology could also replace key fobs or cards that currently use Bluetooth communication to grant access into a building. Instead, a person might just touch a door handle to enter.
    As with today’s machines that scan coupons, gift cards and other information from a phone, using this technology in real life would require surfaces everywhere to have the right hardware to recognize your finger.
    The software on the device that a person is wearing would also need to be configured to send signals through the body to the fingertip — and have a way to turn off so that information, such as a payment, wouldn’t be transferred to every surface equipped to receive it.
    The researchers believe that the applications of this technology would go beyond how we interact with devices today.
    “Anytime you are enabling a new hardware channel, it gives you more possibilities. Think of big touch screens that we have today — the only information that the computer receives is the location of your touch. But the ability to transfer information through your touch would change the applications of that big touch screen,” Sen said.
    A video about the research is available on YouTube at https://youtu.be/-2oscW5i5DQ.

    Story Source:
    Materials provided by Purdue University. Original written by Kayla Wiles. Note: Content may be edited for style and length.

  • Mapping quantum structures with light to unlock their capabilities

    A new tool that uses light to map out the electronic structures of crystals could reveal the capabilities of emerging quantum materials and pave the way for advanced energy technologies and quantum computers, according to researchers at the University of Michigan, University of Regensburg and University of Marburg.
    A paper on the work is published in Science.
    Applications include LED lights, solar cells and artificial photosynthesis.
    “Quantum materials could have an impact way beyond quantum computing,” said Mackillo Kira, professor of electrical engineering and computer science at the University of Michigan, who led the theory side of the new study. “If you optimize quantum properties right, you can get 100% efficiency for light absorption.”
    Silicon-based solar cells are already becoming the cheapest form of electricity, although their sunlight-to-electricity conversion efficiency is rather low, at about 30%. Emerging “2D” semiconductors, which consist of a single layer of crystal, could do much better, potentially using up to 100% of the sunlight. They could also elevate quantum computing to room temperature from the near-absolute-zero machines demonstrated so far.
    “New quantum materials are now being discovered at a faster pace than ever,” said Rupert Huber, professor of physics at the University of Regensburg in Germany, who led the experimental work. “By simply stacking such layers one on top of the other under variable twist angles, and with a wide selection of materials, scientists can now create artificial solids with truly unprecedented properties.”
    The ability to map these properties down to the atoms could help streamline the process of designing materials with the right quantum structures. But these ultrathin materials are much smaller and messier than earlier crystals, and the old analysis methods don’t work. Now, 2D materials can be measured with the new laser-based method at room temperature and pressure.

    The measurable operations include processes that are key to solar cells, lasers and optically driven quantum computing. Essentially, electrons pop between a “ground state,” in which they cannot travel, and states in the semiconductor’s “conduction band,” in which they are free to move through space. They do this by absorbing and emitting light.
    The quantum mapping method uses a 100 femtosecond (100 quadrillionths of a second) pulse of red laser light to pop electrons out of the ground state and into the conduction band. Next the electrons are hit with a second pulse of infrared light. This pushes them so that they oscillate up and down an energy “valley” in the conduction band, a little like skateboarders in a halfpipe.
    The team uses the dual wave/particle nature of electrons to create a standing wave pattern that looks like a comb. They discovered that when the peak of this electron comb overlaps with the material’s band structure — its quantum structure — electrons emit light intensely. That powerful light emission, along with the narrow width of the comb lines, helped create a picture so sharp that researchers call it super-resolution.
    By combining that precise location information with the frequency of the light, the team was able to map out the band structure of the 2D semiconductor tungsten diselenide. Not only that, but they could also get a read on each electron’s orbital angular momentum through the way the front of the light wave twisted in space. Manipulating an electron’s orbital angular momentum, also known as a pseudospin, is a promising avenue for storing and processing quantum information.
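    The mapping leans on a basic relation: an emitted photon of frequency f carries energy E = h·f, so emission frequencies translate directly into band separations. A toy illustration (the frequency below is an assumed generic value, not a measurement from the paper):

    ```python
    # Convert an emitted photon's frequency into a transition energy, E = h*f.
    H = 6.62607015e-34    # Planck constant, J*s
    EV = 1.602176634e-19  # joules per electronvolt

    def transition_energy_ev(freq_hz):
        """Photon energy in electronvolts for a given emission frequency."""
        return H * freq_hz / EV

    # Illustrative red/near-infrared emission frequency (assumed value):
    print(transition_energy_ev(4.0e14))  # ~1.65 eV, the order of WSe2's optical gap
    ```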
    In tungsten diselenide, the orbital angular momentum identifies which of two different “valleys” an electron occupies. The messages that the electrons send out can show researchers not only which valley the electron was in but also what the landscape of that valley looks like and how far apart the valleys are, which are the key elements needed to design new semiconductor-based quantum devices.
    For instance, when the team used the laser to push electrons up the side of one valley until they fell into the other, the electrons emitted light at that drop point, too. That light gives clues about the depths of the valleys and the height of the ridge between them. With this kind of information, researchers can figure out how the material would fare for a variety of purposes.
    The paper is titled, “Super-resolution lightwave tomography of electronic bands in quantum materials.” This research was funded by the Army Research Office, German Research Foundation and U-M College of Engineering Blue Sky Research Program.

  • Chaotic early solar system collisions resembled 'Asteroids' arcade game

    One Friday evening in 1992, a meteorite ended a journey of more than 150 million miles by smashing into the trunk of a red Chevrolet Malibu in Peekskill, New York. The car’s owner reported that the 30-pound remnant of the earliest days of our solar system was still warm and smelled of sulfur.
    Nearly 30 years later, a new analysis of that same Peekskill meteorite and 17 others by researchers at The University of Texas at Austin and the University of Tennessee, Knoxville, has led to a new hypothesis about how asteroids formed during the early years of the solar system.
    The meteorites studied in the research originated from asteroids and serve as natural samples of the space rocks. They indicate that the asteroids formed through violent bombardment and subsequent reassembly, a finding that runs counter to the prevailing idea that the young solar system was a peaceful place.
    The study was published in print Dec. 1 in the journal Geochimica et Cosmochimica Acta.
    The research began when co-author Nick Dygert was a postdoctoral fellow at UT’s Jackson School of Geosciences studying terrestrial rocks using a method that could measure the cooling rates of rocks from very high temperatures, up to 1,400 degrees Celsius.
    Dygert, now an assistant professor at the University of Tennessee, realized that this method — called a rare earth element (REE)-in-two-pyroxene thermometer — could work for space rocks, too.

    “This is a really powerful new technique for using geochemistry to understand geophysical processes, and no one had used it to measure meteorites yet,” Dygert said.
    Since the 1970s, scientists have been measuring minerals in meteorites to figure out how they formed. The work suggested that meteorites cooled very slowly from the outside inward in layers. This “onion shell model” is consistent with a relatively peaceful young solar system where chunks of rock orbited unhindered. But those studies could only measure cooling rates from temperatures near 500 degrees Celsius.
    When Dygert and Michael Lucas, a postdoctoral scholar at the University of Tennessee who led the work, applied the REE-in-two-pyroxene method, with its much higher sensitivity to peak temperature, they found unexpected results. From around 900 degrees Celsius down to 500 degrees Celsius, cooling rates were 1,000 to 1 million times faster than at lower temperatures.
    How could these two very different cooling rates be reconciled?
    The scientists proposed that asteroids formed in stages. If the early solar system was, much like the old Atari game “Asteroids,” rife with bombardment, large rocks would have been smashed to bits. Those smaller pieces would have cooled quickly. Afterward, when the small pieces reassembled into larger asteroids we see today, cooling rates would have slowed.
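    As a purely illustrative caricature of that two-stage history (the rates below are invented placeholders chosen to fall inside the reported 1,000-to-1,000,000-fold contrast, not values from the study), fragment cooling is fast until reassembly and slow afterwards:

    ```python
    import numpy as np

    def temperature_c(t_myr, t_reassembly=0.001, fast=4.0e5, slow=50.0):
        """Two-stage cooling: `fast` deg C/Myr from ~900 C until reassembly,
        then `slow` deg C/Myr inside the insulating rubble pile."""
        T = 900.0 - fast * np.minimum(t_myr, t_reassembly)
        T -= slow * np.clip(t_myr - t_reassembly, 0.0, None)
        return np.maximum(T, 20.0)  # floor near ambient temperature

    t = np.array([0.0, 0.001, 1.0, 5.0])  # time in millions of years
    print(temperature_c(t))               # [900.   500.   450.05 250.05]
    ```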

    To test this rubble pile hypothesis, Jackson School Professor Marc Hesse and first-year doctoral student Jialong Ren built the first computational model of a two-stage thermal history of rubble pile asteroids.
    Because of the vast number of pieces in a rubble pile — 10¹⁵, or a thousand trillion — and the vast array of their sizes, Ren had to develop new techniques to account for changes in mass and temperature before and after bombardment.
    “This was an intellectually significant contribution,” Hesse said.
    The resulting model supports the rubble pile hypothesis and provides other insights as well. One implication is that cooling slowed so much after reassembly not because the rock gave off heat in layers, but because the rubble pile contained pores.
    “The porosity reduces how fast you can conduct heat,” Hesse said. “You actually cool slower than you would have if you hadn’t fragmented because all of the rubble makes kind of a nice blanket. And that’s sort of unintuitive.”
    Tim Swindle of the Lunar and Planetary Laboratory at the University of Arizona, who studies meteorites but was not involved in the research, said that this work is a major step forward.
    “This seems like a more complete model, and they’ve added data to part of the question that people haven’t been talking about, but should have been. The jury is still out, but this is a strong argument.”
    The biggest implication of the new rubble pile hypothesis, Dygert said, is that these collisions characterized the early days of the solar system.
    “They were violent, and they started early on,” he said.
    The research was supported by NASA. The Smithsonian National Museum of Natural History supplied samples of meteorites for the study.

  • New machine learning tool tracks urban traffic congestion

    A new machine learning algorithm is poised to help urban transportation analysts relieve bottlenecks and chokepoints that routinely snarl city traffic.
    The tool, called TranSEC, was developed at the U.S. Department of Energy’s Pacific Northwest National Laboratory to help urban traffic engineers get access to actionable information about traffic patterns in their cities.
    Currently, publicly available traffic information at the street level is sparse and incomplete. Traffic engineers have generally relied on isolated traffic counts, collision statistics and speed data to determine roadway conditions. The new tool uses traffic datasets collected from Uber drivers and other publicly available traffic sensor data to map street-level traffic flow over time. It creates a big picture of city traffic using machine learning tools and the computing resources available at a national laboratory.
    “What’s novel here is the street level estimation over a large metropolitan area,” said Arif Khan, a PNNL computer scientist who helped develop TranSEC. “And unlike other models that only work in one specific metro area, our tool is portable and can be applied to any urban area where aggregated traffic data is available.”
    Uber-fast traffic analysis
    TranSEC (which stands for transportation state estimation capability) differentiates itself from other traffic monitoring methods by its ability to analyze sparse and incomplete information. It uses machine learning to connect segments with missing data, which allows it to make near-real-time, street-level estimates.

    In contrast, the map features on our smartphones can help us optimize our journey through a city landscape, pointing out chokepoints and suggesting alternate routes. But smartphone tools only work for an individual driver trying to get from point A to point B. City traffic engineers are concerned with helping all vehicles get to their destinations efficiently. Sometimes a route that seems efficient for an individual driver leads to too many vehicles trying to access a road that wasn’t designed to handle that volume of traffic.
    Using public data from the entire 1,500-square-mile Los Angeles metropolitan area, the team reduced the time needed to create a traffic congestion model by an order of magnitude, from hours to minutes. The speed-up, accomplished with high-performance computing resources at PNNL, makes near-real-time traffic analysis feasible. The research team recently presented that analysis at the August 2020 virtual Urban Computing Workshop as part of the Knowledge Discovery and Data Mining (SIGKDD) conference, and in September 2020 they sought the input of traffic engineers at a virtual meeting on TranSEC.
    “TranSEC has the potential to initiate a paradigm shift in how traffic professionals monitor and predict system mobility performance,” said Mark Franz, a meeting attendee and a research engineer at the Center for Advanced Transportation Technology, University of Maryland, College Park. “TranSEC overcomes the inherent data gaps in legacy data collection methods and has tremendous potential.”
    Machine learning improves accuracy over time
    The machine learning feature of TranSEC means that it becomes more refined and useful as more data is acquired and processed. This kind of analysis is used to understand how disturbances spread across networks. Given enough data, the machine learning element will be able to predict impacts so that traffic engineers can create corrective strategies.

    “We use a graph-based model together with novel sampling methods and optimization engines to learn both the travel times and the routes,” said Arun Sathanur, a PNNL computer scientist and a lead researcher on the team. “The method has significant potential to be expanded to other modes of transportation, such as transit and freight traffic. As an analytic tool, it is capable of investigating how a traffic condition spreads.”
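    A minimal sketch of that graph framing (the data structure and the crude imputation rule are illustrative assumptions, not PNNL’s actual TranSEC code): intersections become nodes, road segments become edges with travel times learned from aggregated trip data, and a segment with no observations borrows an estimate from its neighbourhood.

    ```python
    import networkx as nx

    # Road network: nodes are intersections, edges are segments; "tt" is the
    # segment travel time in seconds learned from aggregated trip data.
    G = nx.Graph()
    G.add_edge("A", "B", tt=40.0)
    G.add_edge("B", "C", tt=55.0)
    G.add_edge("C", "D")  # sparse data: no observations for this segment

    for u, v, data in G.edges(data=True):
        if "tt" not in data:
            # impute from observed segments that share an intersection
            nearby = [d["tt"] for _, _, d in G.edges([u, v], data=True) if "tt" in d]
            data["tt"] = sum(nearby) / len(nearby) if nearby else None

    print(G["C"]["D"]["tt"])  # 55.0, borrowed from the adjacent observed segment
    ```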
    With PNNL’s data-driven approach, users can upload real-time data and update TranSEC on a regular basis in a transportation control center. Engineers can use short-term forecasts for decision support to manage traffic issues. PNNL’s approach is also extensible to include weather or other data that affect conditions on the road.
    Computing power for transportation planners nationwide
    Just as situational awareness of conditions informs an individual driver’s decisions, TranSEC’s approach provides situational awareness on a system-wide basis to help reduce urban traffic congestion.
    “Traffic engineers nationwide have not had a tool to give them anywhere near real-time estimation of transportation network states,” said Robert Rallo, PNNL computer scientist and principal investigator on the TranSEC project. “Being able to predict conditions an hour or more ahead would be very valuable, to know where the blockages are going to be.”
    While running a full-scale city model still requires high-performance computing resources, TranSEC is scalable. For example, a road network with only the major highways and arterials could be modeled on a powerful desktop computer.
    “We are working toward making TranSEC available to municipalities nationwide,” said Katherine Wolf, project manager for TranSEC.
    Eventually, after further development, TranSEC could be used to help program autonomous vehicle routes, according to the research team.
    Video: https://www.youtube.com/watch?v=8S4bLv9CtOo
    The project was supported by the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy’s Vehicle Technologies Office, Energy Efficient Mobility Systems Program.

  • Self-repairing gelatin-based film could be a smart move for electronics

    Dropping a cell phone can sometimes cause superficial cracks to appear. But other times, the device can stop working altogether because fractures develop in the material that stores data. Now, researchers have made an environmentally friendly, gelatin-based film that can repair itself multiple times and still maintain the electronic signals needed to access a device’s data. The material could be used someday in smart electronics and health-monitoring devices.