More stories

  • Computer scientists set benchmarks to optimize quantum computer performance

    Computer scientists have shown that existing compilers, which tell quantum computers how to use their circuits to execute quantum programs, inhibit the computers’ ability to achieve optimal performance. Specifically, their research has revealed that improving quantum compilation design could help achieve computation speeds up to 45 times faster than currently demonstrated.
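    As a rough, hypothetical illustration of what a quantum compiler does (this is not the benchmark suite or compiler design from the study), the short sketch below uses Qiskit's transpile() to map a small abstract circuit onto a restricted gate set and connectivity at different optimization levels; the compiled circuit depth is one of the properties such research tries to reduce.

    # Illustrative only: shows what a quantum compiler ("transpiler") does,
    # not the benchmarks or compiler designs evaluated in the study.
    from qiskit import QuantumCircuit, transpile

    # A small abstract program: prepare a 3-qubit GHZ state.
    qc = QuantumCircuit(3)
    qc.h(0)
    qc.cx(0, 1)
    qc.cx(1, 2)

    # Compile for a hypothetical device that supports only this basis gate set
    # and nearest-neighbor connectivity; compare optimization levels.
    basis = ["rz", "sx", "x", "cx"]
    coupling = [[0, 1], [1, 2]]  # assumed linear qubit connectivity

    for level in (0, 3):
        compiled = transpile(qc, basis_gates=basis, coupling_map=coupling,
                             optimization_level=level)
        print(f"optimization_level={level}: depth={compiled.depth()}, "
              f"ops={dict(compiled.count_ops())}")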

  • An AI algorithm to help identify homeless youth at risk of substance abuse

    While many programs and initiatives have been implemented to address the prevalence of substance abuse among homeless youth in the United States, they don’t always include data-driven insights about environmental and psychological factors that could contribute to an individual’s likelihood of developing a substance use disorder.
    Now, an artificial intelligence (AI) algorithm developed by researchers at the College of Information Sciences and Technology at Penn State could help predict susceptibility to substance use disorder among young homeless individuals, and suggest personalized rehabilitation programs for highly susceptible homeless youth.
    “Proactive prevention of substance use disorder among homeless youth is much more desirable than reactive mitigation strategies such as medical treatments for the disorder and other related interventions,” said Amulya Yadav, assistant professor of information sciences and technology and principal investigator on the project. “Unfortunately, most previous attempts at proactive prevention have been ad-hoc in their implementation.”
    “To assist policymakers in devising effective programs and policies in a principled manner, it would be beneficial to develop AI and machine learning solutions which can automatically uncover a comprehensive set of factors associated with substance use disorder among homeless youth,” added Maryam Tabar, a doctoral student in informatics and lead author on the project paper that will be presented at the Knowledge Discovery in Databases (KDD) conference in late August.
    In that project, the research team built the model using a dataset collected from approximately 1,400 homeless youth, ages 18 to 26, in six U.S. states. The dataset was collected by the Research, Education and Advocacy Co-Lab for Youth Stability and Thriving (REALYST), which includes Anamika Barman-Adhikari, assistant professor of social work at the University of Denver and co-author of the paper.
    The researchers then identified environmental, psychological and behavioral factors associated with substance use disorder among them — such as criminal history, victimization experiences and mental health characteristics. They found that adverse childhood experiences and physical street victimization were more strongly associated with substance use disorder than other types of victimization (such as sexual victimization) among homeless youth. Additionally, PTSD and depression were found to be more strongly associated with substance use disorder than other mental health disorders among this population, according to the researchers.
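    The article does not specify the team's exact modeling pipeline, but a minimal sketch of the general workflow it describes (fit a predictive model on survey features, then read off which factors are most strongly associated with the outcome) might look like the following; the file name and column names are hypothetical placeholders, not the actual REALYST variables.

    # Minimal sketch (assumed workflow, hypothetical file and column names):
    # predict substance use disorder from survey features, then inspect which
    # factors carry the strongest (correlational, not causal) associations.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("realyst_survey.csv")          # hypothetical file
    features = ["adverse_childhood_experiences",    # hypothetical columns
                "physical_street_victimization", "sexual_victimization",
                "ptsd", "depression", "criminal_history"]
    X, y = df[features], df["substance_use_disorder"]

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

    # Refit on all data and rank factors by coefficient magnitude.
    model.fit(X, y)
    coefs = pd.Series(model.named_steps["logisticregression"].coef_[0],
                      index=features).sort_values(key=abs, ascending=False)
    print(coefs)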

    Next, the researchers divided their dataset into six smaller datasets to analyze geographical differences. The team trained a separate model to predict substance use disorder among homeless youth in each of the six states — which have varying environmental conditions, drug legalization policies and gang associations. The team observed several location-specific variations in the association levels of some factors, according to Tabar.
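    The per-state analysis could be sketched in the same spirit, again with placeholder column names: split the data by state, fit one model per state, and compare how strongly each factor is associated with the outcome across locations.

    # Sketch of a per-state analysis (hypothetical columns, assumed workflow):
    # one model per state, then compare association strengths across states.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("realyst_survey.csv")   # hypothetical file with a "state" column
    features = ["adverse_childhood_experiences", "physical_street_victimization",
                "ptsd", "depression", "criminal_history"]

    per_state = {}
    for state, group in df.groupby("state"):
        model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        model.fit(group[features], group["substance_use_disorder"])
        per_state[state] = model.named_steps["logisticregression"].coef_[0]

    # Rows = states, columns = factors; differences across rows hint at
    # location-specific variation in association levels.
    print(pd.DataFrame(per_state, index=features).T.round(2))
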
    “By looking at what the model has learned, we can effectively find out factors which may play a correlational role with people suffering from substance abuse disorder,” said Yadav. “And once we know these factors, we are much more accurately able to predict whether somebody suffers from substance use.”
    He added, “So if a policy planner or interventionist were to develop programs that aim to reduce the prevalence of substance abuse disorder, this could provide useful guidelines.”
    Other authors on the KDD paper include Dongwon Lee, associate professor, and Stephanie Winkler, doctoral student, both in the Penn State College of Information Sciences and Technology; and Heesoo Park of Sungkyunkwan University.
    Yadav and Barman-Adhikari are collaborating on a similar project through which they have developed a software agent that designs personalized rehabilitation programs for homeless youth suffering from opioid addiction. Their simulation results show that the software agent — called CORTA (Comprehensive Opioid Response Tool Driven by Artificial Intelligence) — outperforms baselines by approximately 110% in minimizing the number of homeless youth suffering from opioid addiction.

    “We wanted to understand what the causative issues are behind people developing opiate addiction,” said Yadav. “And then we wanted to assign these homeless youth to the appropriate rehabilitation program.”
    Yadav explained that data collected from more than 1,400 homeless youth in the U.S. was used to build AI models to predict the likelihood of opioid addiction among this population. After examining issues that could be the underlying cause of opioid addiction — such as foster care history or exposure to street violence — CORTA solves novel optimization formulations to assign personalized rehabilitation programs.
    “For example, if a person developed an opioid addiction because they were isolated or didn’t have a social circle, then perhaps as part of their rehabilitation program they should talk to a counselor,” explained Yadav. “On the other hand, if someone developed an addiction because they were depressed because they couldn’t find a job or pay their bills, then a career counselor should be a part of the rehabilitation plan.”
    Yadav added, “If you just treat the condition medically, once they go back into the real world, since the causative issue still remains, they’re likely to relapse.”
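    The article does not spell out CORTA's optimization formulation, so the following is only a generic sketch of the idea it describes: given predicted benefit scores for each youth-program pair and limited program capacity, assign each person to the program expected to help most. The program names, capacities and scores below are invented for illustration.

    # Generic capacity-constrained assignment sketch (not CORTA's actual
    # formulation): maximize total predicted benefit of program assignments.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    programs = ["counseling", "career_support", "medical_treatment"]  # invented
    capacity = {"counseling": 2, "career_support": 2, "medical_treatment": 1}

    # Hypothetical predicted benefit of each program for each of 5 youths,
    # e.g. produced by upstream predictive models.
    benefit = np.array([
        [0.8, 0.2, 0.5],
        [0.3, 0.9, 0.4],
        [0.6, 0.5, 0.7],
        [0.4, 0.7, 0.2],
        [0.7, 0.3, 0.6],
    ])

    # Expand each program into one column per available slot, then solve the
    # assignment problem (minimizing negative benefit = maximizing benefit).
    slots = [p for p in programs for _ in range(capacity[p])]
    cost = -benefit[:, [programs.index(p) for p in slots]]
    rows, cols = linear_sum_assignment(cost)
    for youth, slot in zip(rows, cols):
        program = slots[slot]
        score = benefit[youth, programs.index(program)]
        print(f"youth {youth} -> {program} (predicted benefit {score:.1f})")
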
    Yadav and Barman-Adhikari will present their paper on CORTA, “Optimal and Non-Discriminative Rehabilitation Program Design for Opioid Addiction Among Homeless Youth,” at the International Joint Conference on Artificial Intelligence-Pacific Rim International Conference on Artificial Intelligence (IJCAI-PRICAI), which was to be held in July 2020 but is being rescheduled due to the novel coronavirus pandemic.
    Other collaborators on the CORTA project include Penn State doctoral students Roopali Singh (statistics), Nikolas Siapoutis (statistics) and Yu Liang (informatics).

  • Linking sight and movement

    To get a better look at the world around them, animals are constantly in motion. Primates and people use complex eye movements to focus their vision (as humans do when reading, for instance); birds, insects, and rodents do the same by moving their heads, and can even estimate distances that way. Yet how these movements play out in the elaborate circuitry of neurons that the brain uses to “see” is largely unknown. And it could become a problem as scientists create artificial neural networks that mimic how vision works in self-driving cars.
    To better understand the relationship between movement and vision, a team of Harvard researchers looked at what happens in one of the brain’s primary regions for analyzing imagery when animals are free to roam naturally. The results of the study, published Tuesday in the journal Neuron, suggest that image-processing circuits in the primary visual cortex are not only more active when animals move, but also receive signals from a movement-controlling region of the brain that is independent of the region that processes what the animal is looking at. In fact, the researchers describe two sets of movement-related patterns in the visual cortex that are based on head motion and whether an animal is in the light or the dark.
    The movement-related findings were unexpected, since vision tends to be thought of as a feed-forward computation system in which visual information enters through the retina and travels on neural circuits that operate on a one-way path, processing the information piece by piece. What the researchers saw here is more evidence that the visual system has many more feedback components, in which information can travel in the opposite direction, than had been thought.
    These results offer a nuanced glimpse into how neural activity works in a sensory region of the brain, and add to a growing body of research that is rewriting the textbook model of vision in the brain.
    “It was really surprising to see this type of [movement-related] information in the visual cortex because traditionally people have thought of the visual cortex as something that only processes images,” said Grigori Guitchounts, a postdoctoral researcher in the Neurobiology Department at Harvard Medical School and the study’s lead author. “It was mysterious, at first, why this sensory region would have this representation of the specific types of movements the animal was making.”
    While the scientists weren’t able to definitively say why this happens, they believe it has to do with how the brain perceives what’s around it.

    “The model explanation for this is that the brain somehow needs to coordinate perception and action,” Guitchounts said. “You need to know when a sensory input is caused by your own action as opposed to when it’s caused by something out there in the world.”
    For the study, Guitchounts teamed up with former Department of Molecular and Cellular Biology Professor David Cox, alumnus Javier Masis, M.A. ’15, Ph.D. ’18, and postdoctoral researcher Steffen B.E. Wolff. The work started in 2017 and wrapped up in 2019 while Guitchounts was a graduate researcher in Cox’s lab. A preprint version of the paper was published in January.
    The typical setup of past experiments on vision worked like this: Animals, like mice or monkeys, were sedated, restrained so their heads were in fixed positions, and then shown visual stimuli, like photographs, so researchers could see which neurons in the brain reacted. The approach was pioneered by Harvard scientists David H. Hubel and Torsten N. Wiesel in the 1960s, and in 1981 they won a Nobel Prize in medicine for their efforts. Many experiments since then have followed their model, but it did little to illuminate how movement affects the neurons that analyze visual information.
    Researchers in this latest experiment wanted to explore that, so they watched 10 rats going about their days and nights. The scientists placed each rat in an enclosure, which doubled as its home, and continuously recorded its head movements. Using implanted electrodes, they measured the brain activity in the primary visual cortex as the rats moved.
    Half of the recordings were taken with the lights on. The other half were recorded in total darkness. The researchers wanted to compare what the visual cortex was doing when there was visual input versus when there wasn’t. To be sure the room was pitch black, they taped shut any crevice that could let in light, since rats have notoriously good vision at night.

    The data showed that on average, neurons in the rats’ visual cortices were more active when the animals moved than when they rested, even in the dark. That caught the researchers off guard: In a pitch-black room, there is no visual data to process. This suggested that the activity was being driven by the animals’ own movements rather than by an external image.
    The team also noticed that the neural patterns in the visual cortex that were firing during movement differed in the dark and light, meaning they weren’t directly connected. Some neurons that were ready to activate in the dark were in a kind of sleep mode in the light.
    Using a machine-learning algorithm, the researchers decoded both sets of patterns. That let them not only tell which way a rat was moving its head just by looking at the neural activity in its visual cortex, but also predict the movement several hundred milliseconds before the rat made it.
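    The study's decoder is not described here beyond "a machine-learning algorithm," so the sketch below only illustrates the general idea: treat a window of visual-cortex firing rates as features and train a classifier to predict the direction of the upcoming head turn. The data are synthetic stand-ins, not recordings from the study.

    # Illustrative decoding sketch (synthetic data, assumed approach): predict
    # the upcoming head-turn direction from visual-cortex spike counts.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_neurons = 600, 60
    direction = rng.integers(0, 2, n_trials)        # 0 = left turn, 1 = right turn

    # Synthetic spike counts in a window ending a few hundred ms before the turn;
    # some neurons weakly prefer one turn direction.
    preference = rng.normal(0.0, 0.6, n_neurons)
    mean_rate = np.clip(5.0 + np.outer(2 * direction - 1, preference), 0.5, None)
    spike_counts = rng.poisson(mean_rate)           # shape: (trials, neurons)

    decoder = LogisticRegression(max_iter=2000)
    acc = cross_val_score(decoder, spike_counts, direction, cv=5).mean()
    print(f"cross-validated decoding accuracy: {acc:.2f}")
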
    The researchers confirmed that the movement signals came from the motor area of the brain by focusing on the secondary motor cortex. They surgically destroyed it in several rats, then ran the experiments again. The rats in which this area was lesioned no longer showed the movement-related signals in the visual cortex. However, the researchers were not able to determine whether the signal originates in the secondary motor cortex; it could simply be a region the signal passes through, they said.
    Furthermore, the scientists pointed out some limitations of their findings. For instance, they measured only head movements, not eye movements. The study is also based on rodents, which are nocturnal; their visual systems share similarities with those of humans and primates but differ in complexity. Still, the paper adds to new lines of research, and the findings could potentially be applied to neural networks that control machine vision, like those in autonomous vehicles.
    “It’s all to better understand how vision actually works,” Guitchounts said. “Neuroscience is entering into a new era where we understand that perception and action are intertwined loops. … There’s no action without perception and no perception without action. We have the technology now to measure this.”
    This work was supported by the Harvard Center for Nanoscale Systems and the National Science Foundation Graduate Research Fellowship.

  • This online calculator can predict your stroke risk

    Doctors can predict patients’ risk for ischemic stroke based on the severity of their metabolic syndrome, a conglomeration of conditions that includes high blood pressure, abnormal cholesterol levels and excess body fat around the abdomen and waist, a new study finds.
    The study found that stroke risk increased consistently with metabolic syndrome severity even in patients without diabetes. Doctors can use this information — and a scoring tool developed by a UVA Children’s pediatrician and his collaborator at the University of Florida — to identify patients at risk and help them reduce that risk.
    “We had previously shown that the severity of metabolic syndrome was linked to future coronary heart disease and type 2 diabetes,” said UVA’s Mark DeBoer, MD. “This study showed further links to future ischemic strokes.”
    Ischemic Stroke Risk
    DeBoer developed the scoring tool, an online calculator to assess the severity of metabolic syndrome, with Matthew J. Gurka, PhD, of the Department of Health Outcomes and Biomedical Informatics at the University of Florida, Gainesville. The tool is available for free at https://metscalc.org/.
    To evaluate the association between ischemic stroke and metabolic syndrome, DeBoer and Gurka reviewed data on more than 13,000 participants in prior studies and their stroke outcomes. Among that group, there were 709 ischemic strokes over a mean follow-up period of 18.6 years. (Ischemic strokes are caused when blood flow to the brain is obstructed by blood clots or clogged arteries. Hemorrhagic strokes, on the other hand, are caused when blood vessels rupture.)
    The researchers used their tool to calculate “Z scores” measuring the severity of metabolic syndrome among the study participants. They could then analyze the association between metabolic syndrome and ischemic stroke risk.
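    The published severity score uses sex- and race/ethnicity-specific weights (the calculator at https://metscalc.org/ implements them), so the snippet below is only a simplified stand-in: each component is standardized against assumed reference values and averaged into a rough composite Z score.

    # Simplified illustration only: the published MetS severity score at
    # https://metscalc.org/ uses sex- and race/ethnicity-specific weights derived
    # from national survey data; here each component is just standardized against
    # made-up reference means/SDs and averaged into a rough composite Z score.
    REFERENCE = {  # component: (assumed reference mean, assumed reference SD)
        "waist_cm": (95.0, 14.0),
        "triglycerides_mg_dl": (130.0, 60.0),
        "hdl_mg_dl": (50.0, 14.0),          # higher HDL is protective
        "systolic_bp_mm_hg": (122.0, 15.0),
        "fasting_glucose_mg_dl": (98.0, 15.0),
    }

    def mets_severity_z(measurements: dict) -> float:
        """Average of standardized components; HDL is sign-flipped because
        higher HDL lowers metabolic syndrome severity."""
        zs = []
        for name, (mean, sd) in REFERENCE.items():
            z = (measurements[name] - mean) / sd
            zs.append(-z if name == "hdl_mg_dl" else z)
        return sum(zs) / len(zs)

    print(mets_severity_z({
        "waist_cm": 108, "triglycerides_mg_dl": 190, "hdl_mg_dl": 38,
        "systolic_bp_mm_hg": 138, "fasting_glucose_mg_dl": 110,
    }))  # positive = more severe than the assumed reference population
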
    The subgroup with the highest association between metabolic syndrome and risk for ischemic stroke was white women, the researchers found. In this group, the research team was able to identify relationships between the individual contributors to metabolic syndrome, such as high blood pressure, and stroke risk.
    The researchers note that race and sex did not seem to make a major difference in stroke risk overall, and they caution that the increased risk seen in white women could be the result of chance alone. “Nevertheless,” they write in a new scientific article outlining their findings, “these results are notable enough that they may warrant further study into race and sex differences.”
    The overall relationship between metabolic syndrome severity and stroke risk was clear, however. And this suggests people with metabolic syndrome can make lifestyle changes to reduce that risk. Losing weight, exercising more, choosing healthy foods — all can help address metabolic syndrome and its harmful effects.
    DeBoer hopes that the tool he and Gurka developed will help doctors guide patients as they seek to reduce their stroke risk and improve their health and well-being.
    “In case there are still individuals out there debating whether to start exercising or eating a healthier diet,” DeBoer said, “this study provides another wake-up call to motivate us all toward lifestyle changes.”

    Story Source:
    Materials provided by University of Virginia Health System.

  • Sounds of action: Using ears, not just eyes, improves robot perception

    People rarely use just one sense to understand the world, but robots usually only rely on vision and, increasingly, touch. Carnegie Mellon University researchers find that robot perception could improve markedly by adding another sense: hearing.
    In what they say is the first large-scale study of the interactions between sound and robotic action, researchers at CMU’s Robotics Institute found that sounds could help a robot differentiate between objects, such as a metal screwdriver and a metal wrench. Hearing also could help robots determine what type of action caused a sound and help them use sounds to predict the physical properties of new objects.
    “A lot of preliminary work in other fields indicated that sound could be useful, but it wasn’t clear how useful it would be in robotics,” said Lerrel Pinto, who recently earned his Ph.D. in robotics at CMU and will join the faculty of New York University this fall. He and his colleagues found the performance rate was quite high, with robots that used sound successfully classifying objects 76 percent of the time.
    The results were so encouraging, he added, that it might prove useful to equip future robots with instrumented canes, enabling them to tap on objects they want to identify.
    The researchers presented their findings last month during the virtual Robotics: Science and Systems conference. Other team members included Abhinav Gupta, associate professor of robotics, and Dhiraj Gandhi, a former master’s student who is now a research scientist at Facebook Artificial Intelligence Research’s Pittsburgh lab.
    To perform their study, the researchers created a large dataset, simultaneously recording video and audio of 60 common objects — such as toy blocks, hand tools, shoes, apples and tennis balls — as they slid or rolled around a tray and crashed into its sides. They have since released this dataset, cataloging 15,000 interactions, for use by other researchers.
    The team captured these interactions using an experimental apparatus they called Tilt-Bot — a square tray attached to the arm of a Sawyer robot. It was an efficient way to build a large dataset; they could place an object in the tray and let Sawyer spend a few hours moving the tray in random directions with varying levels of tilt as cameras and microphones recorded each action.
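    As a hedged sketch of how such a dataset could be used for the object-classification result described above (this is not the CMU team's actual model, and the file layout is hypothetical), one could summarize each audio clip with averaged MFCC features and train a standard classifier.

    # Sketch of object classification from impact sounds (assumed approach and
    # hypothetical file layout; not the CMU team's actual model). Each clip is
    # summarized by averaged MFCC features and fed to an SVM.
    import glob, os
    import numpy as np
    import librosa
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = [], []
    for path in glob.glob("tiltbot_audio/*/*.wav"):      # hypothetical layout:
        label = os.path.basename(os.path.dirname(path))  # tiltbot_audio/<object>/<clip>.wav
        audio, sr = librosa.load(path, sr=16000)
        mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
        X.append(mfcc.mean(axis=1))                      # average over time
        y.append(label)

    X, y = np.array(X), np.array(y)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                        stratify=y, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))
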
    They also collected some data beyond the tray, using Sawyer to push objects on a surface.
    Though the size of this dataset is unprecedented, other researchers have also studied how intelligent agents can glean information from sound. For instance, Oliver Kroemer, assistant professor of robotics, led research into using sound to estimate the amount of granular materials, such as rice or pasta, by shaking a container, or estimating the flow of those materials from a scoop.
    Pinto said the usefulness of sound for robots was therefore not surprising, though he and the others were surprised at just how useful it proved to be. They found, for instance, that a robot could use what it learned about the sound of one set of objects to make predictions about the physical properties of previously unseen objects.
    “I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” he said. For instance, a robot couldn’t use sound to tell the difference between a red block and a green block. “But if it was a different object, such as a block versus a cup, it could figure that out.”
    The Defense Advanced Research Projects Agency and the Office of Naval Research supported this research.

    Story Source:
    Materials provided by Carnegie Mellon University. Original written by Byron Spice.

  • Many medical 'rainy day' accounts aren't getting opened or filled

    One-third of the people who could benefit from a special type of savings account to cushion the blow of their health plan deductible aren’t doing so, according to a new study.
    And even among people who do open a health savings account (HSA), half haven’t put any money into it in the past year. This means they may be missing a chance to avoid taxes on money that they can use to pay for their health insurance deductible and other health costs.
    The study also finds that those who buy their health insurance themselves, and select a high-deductible plan on an exchange such as www.healthcare.gov, are less likely to open an HSA than those who get their insurance from employers who offer only a high-deductible option.
    HSAs are different from the flexible spending accounts that some employers offer. HSAs can be opened only by people in health plans that require them to pay a deductible of at least $1,400 for an individual or $2,800 for a family before their insurance benefits kick in.
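    As a small illustrative sketch (using only the thresholds quoted above, which are the 2020 minimum deductibles for HSA-qualified plans; the marginal tax rate is an assumed example), the eligibility rule and the tax advantage could be expressed like this:

    # Rough sketch using the thresholds quoted above (the 2020 minimum deductibles
    # for HSA-qualified plans); the marginal tax rate is an assumed example, and
    # real eligibility involves additional IRS rules not modeled here.
    MIN_DEDUCTIBLE = {"individual": 1400, "family": 2800}

    def hsa_eligible(coverage: str, deductible: float) -> bool:
        """True if the plan's deductible meets the minimum for HSA eligibility."""
        return deductible >= MIN_DEDUCTIBLE[coverage]

    def estimated_tax_savings(contribution: float, marginal_rate: float = 0.22) -> float:
        """Approximate income tax avoided by contributing pre-tax dollars."""
        return contribution * marginal_rate

    print(hsa_eligible("family", 3000))       # True
    print(estimated_tax_savings(2000))        # ~$440 at an assumed 22% rate
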
    In a new paper in JAMA Network Open, a team led by researchers at the University of Michigan and VA Ann Arbor Healthcare System reports results from a national survey of more than 1,600 participants in high-deductible health plans.
    Key findings
    They note that half of those who had an HSA and put money into their account in the past year had socked away $2,000 or more. And among those who hadn’t put money in, 40% said it was because they already had enough savings to cover their costs.

    But those with lower levels of education were much less likely to have opened an HSA, and to have contributed to it even if they did open one. Those with lower levels of understanding of health insurance concepts, called health insurance literacy, were also less likely to put money in their HSA if they had one.
    And one-third of those who didn’t put money into their HSA said it was because they couldn’t afford to save up for health costs.
    “These findings are concerning, given that nearly half of Americans with private insurance now have high-deductible plans,” says Jeffrey Kullgren, M.D., M.S., MPH, who led the study and has done other research on HDHPs and health care consumerism. “While policymakers have focused on expanding availability and permitted uses of HSAs, and increasing how much money they can hold, work is needed to help eligible enrollees open them and use them to get the care they need at a price they can afford.”
    Change needed
    In the new paper and in a report from the U-M Institute for Healthcare Policy and Innovation, Kullgren and his colleagues call for more efforts to increase uptake of HSAs, and contributions to HSAs, by employers, health insurers and the health systems that provide care and bill insurers for that care.

    Targeted interventions, especially those aimed at people with lower levels of education or health insurance literacy, should be developed.
    The researchers note that as federal and state exchanges prepare to open for 2021 enrollment, expanding the types of exchange plans that are eligible to be linked to an HSA will be important. Currently, just 7% of the plans bought on exchanges are eligible for an HSA, even though many exchange plans come with a high deductible.
    The study data come from an online survey of English-speaking adults under age 65; the study population was weighted to include a higher proportion of people with chronic health conditions than the national population.
    For the survey, the researchers asked respondents about HSAs using the National Health Interview Survey definition of an HSA as “a special account or fund that can be used to pay for medical expenses” that are “sometimes referred to as Health Savings Accounts (HSAs), Health Reimbursement Accounts (HRAs), Personal Care accounts, Personal Medical funds, or Choice funds, and are different from Flexible Spending Accounts.”

  • Academia from home

    As the uncertainty around reopening college and university campuses this fall continues, those who work, study, teach and conduct research are navigating the uncertain terrain of the “new normal.” They are balancing physical distancing and other COVID-19 prevention practices with productivity, creating home workspaces and mastering communications and teamwork across time and space.
    Turns out, there’s a group of people for whom these challenges are not new. Postdoctoral researchers — people in the critical phase between graduate school and permanent academic positions — are part of a small but growing cohort that has been turning to remote work to meet the challenges of their young careers. Often called upon to relocate multiple times for short-term, full-time appointments, postdocs and their families have to endure heightened financial costs, sacrificed career opportunities and separations from their support communities.
    But with the right practices and perspectives, remote work can level the playing field, especially for those in underrepresented groups, according to Kurt Ingeman, a postdoctoral researcher in UC Santa Barbara’s Department of Ecology, Evolution and Marine Biology. And, like it or not, with COVID-19 factoring into virtually every decision we now make, he noted, it’s an idea whose time has come.
    “We started this project in the pre-pandemic times but it seems more relevant than ever as academics are forced to embrace work-from-home,” said Ingeman, who makes the case for embracing remote postdoctoral work in the journal PLOS Computational Biology. Family and financial considerations drove his own decision to design a remote position; many early-career researchers face the same concerns, he said.
    It takes a shift in perspective to overcome resistance to having remote research teammates. Principal investigators often don’t perceive a remote postdoc as a fully functional member of the lab, and they worry about the loss of the spontaneous, informal interactions that can generate new ideas, Ingeman said.
    “These are totally valid concerns,” he said. “We suggest (in the paper) ways to use digital tools to fully integrate remote postdocs into lab activities, like mentoring graduate students or coding and writing together. These same spaces are valuable for virtual coffee chats and other informal interactions.”
    Communication enabled by technology is in fact foundational to a good remote postdoc experience, according to Ingeman and co-authors, who advocate for investment in and use of reliable videoconferencing tools that can help create rapport between team members, and the creation of digital spaces to share documents and files. Transparency and early expectation setting are keys to a good start. In situations where proximity would have naturally led to interaction, the researchers recommend having a robust communications plan. Additionally, postdocs would benefit from establishing academic connections within their local community to combat isolation.

    There are benefits to reap from such arrangements and practices, the researchers continued. For the postdoc, it could mean less stress and hardship, and more focus on work. For the team, it could mean a wider network overall.
    “For me, remote postdoc work was a real bridge to becoming an independent researcher,” said Ingeman, who “struggled with isolation early on,” but has since gained a local academic community, resulting in productive new research collaborations.
    Additionally, opening the postdoc pool to remote researchers can result in a more diverse set of applicants.
    “The burdens of relocating for a temporary postdoc position often fall hardest on members of underrepresented groups,” Ingeman added. “So the idea of supporting remote work really stands out to me as an equity issue.”
    Of course, not all postdoc positions can be remote; lab and field work still require a presence. But as social distancing protocols and pandemic safety measures are forcing research teams to minimize in-person contact or undergo quarantine at a moment’s notice, developing remote research skills may well become a valuable part of any early-career researcher’s toolkit.
    “Even labs and research groups that are returning to campus in a limited way may face periodic campus closures, so it makes sense to integrate remote tools now,” Ingeman said. “Our suggestions for remote postdocs are absolutely applicable to other lab members working from home during closures.”