More stories

  • Linking sight and movement

    To get a better look at the world around them, animals are constantly in motion. Primates and people use complex eye movements to focus their vision (as humans do when reading, for instance); birds, insects, and rodents do the same by moving their heads, and can even estimate distances that way. Yet how these movements play out in the elaborate circuitry of neurons that the brain uses to “see” is largely unknown. That gap could become a problem as scientists build artificial neural networks that mimic how vision works in self-driving cars.
    To better understand the relationship between movement and vision, a team of Harvard researchers looked at what happens in one of the brain’s primary regions for analyzing imagery when animals are free to roam naturally. The results of the study, published Tuesday in the journal Neuron, suggest that image-processing circuits in the primary visual cortex are not only more active when animals move, but also receive signals from a movement-controlling region of the brain that is independent of the region that processes what the animal is looking at. In fact, the researchers describe two sets of movement-related patterns in the visual cortex that depend on head motion and on whether the animal is in the light or the dark.
    The movement-related findings were unexpected, since vision tends to be thought of as a feed-forward system in which visual information enters through the retina and travels along one-way neural circuits that process the information piece by piece. What the researchers saw here is more evidence that the visual system has far more feedback components, in which information travels in the opposite direction, than had been thought.
    These results offer a nuanced glimpse into how neural activity works in a sensory region of the brain, and add to a growing body of research that is rewriting the textbook model of vision in the brain.
    “It was really surprising to see this type of [movement-related] information in the visual cortex because traditionally people have thought of the visual cortex as something that only processes images,” said Grigori Guitchounts, a postdoctoral researcher in the Neurobiology Department at Harvard Medical School and the study’s lead author. “It was mysterious, at first, why this sensory region would have this representation of the specific types of movements the animal was making.”
    While the scientists weren’t able to definitively say why this happens, they believe it has to do with how the brain perceives what’s around it.

    “The model explanation for this is that the brain somehow needs to coordinate perception and action,” Guitchounts said. “You need to know when a sensory input is caused by your own action as opposed to when it’s caused by something out there in the world.”
    For the study, Guitchounts teamed up with former Department of Molecular and Cellular Biology Professor David Cox, alumnus Javier Masis, M.A. ’15, Ph.D. ’18, and postdoctoral researcher Steffen B.E. Wolff. The work started in 2017 and wrapped up in 2019 while Guitchounts was a graduate researcher in Cox’s lab. A preprint version of the paper was posted in January.
    The typical setup of past experiments on vision worked like this: Animals, like mice or monkeys, were sedated, restrained so their heads were in fixed positions, and then given visual stimuli, like photographs, so researchers could see which neurons in the brain reacted. The approach was pioneered by Harvard scientists David H. Hubel and Torsten N. Wiesel in the 1960s, and in 1981 they won the Nobel Prize in Physiology or Medicine for their efforts. Many experiments since then have followed their model, but it did not illuminate how movement affects the neurons that analyze visual input.
    Researchers in this latest experiment wanted to explore that, so they watched 10 rats going about their days and nights. The scientists placed each rat in an enclosure, which doubled as its home, and continuously recorded its head movements. Using implanted electrodes, they measured the brain activity in the primary visual cortex as the rats moved.
    Half of the recordings were taken with the lights on. The other half were recorded in total darkness. The researchers wanted to compare what the visual cortex was doing when there was visual input versus when there wasn’t. To be sure the room was pitch black, they taped shut any crevice that could let in light, since rats have notoriously good vision at night.

    The data showed that on average, neurons in the rats’ visual cortices were more active when the animals moved than when they rested, even in the dark. That caught the researchers off guard: In a pitch-black room, there is no visual data to process. This meant that the activity was coming from the motor cortex, not an external image.
    The team also noticed that the neural patterns in the visual cortex that were firing during movement differed in the dark and light, meaning they weren’t directly connected. Some neurons that were ready to activate in the dark were in a kind of sleep mode in the light.
    Using a machine-learning algorithm, the researchers decoded both patterns. That let them not only tell which way a rat was moving its head just by looking at the neural activity in its visual cortex, but also predict the movement several hundred milliseconds before the rat made it.
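    The study’s own analysis pipeline isn’t reproduced here, but the decoding step can be illustrated with a minimal sketch: fit a standard classifier to binned firing rates and ask it to predict head-turn direction. The data below are synthetic, and the logistic-regression decoder is an assumption for illustration, not the method used in the paper.

```python
# Minimal sketch of decoding head-turn direction from visual-cortex activity.
# All data here are synthetic; the study's recordings, preprocessing, and
# decoder are not reproduced.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_neurons = 2000, 50
# Hypothetical binned firing rates (trials x neurons).
rates = rng.poisson(lam=5.0, size=(n_trials, n_neurons)).astype(float)
# Hypothetical labels: 0 = head turning left, 1 = head turning right.
labels = rng.integers(0, 2, size=n_trials)
# Inject a weak movement-related signal into a few neurons so the decoder
# has something to find (a stand-in for the real movement signal).
rates[labels == 1, :10] += 1.5

X_train, X_test, y_train, y_test = train_test_split(
    rates, labels, test_size=0.25, random_state=0)

decoder = LogisticRegression(max_iter=1000)
decoder.fit(X_train, y_train)
print(f"decoding accuracy: {decoder.score(X_test, y_test):.2f}")
```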
    The researchers confirmed that the movement signals came from the motor area of the brain by focusing on the secondary motor cortex. They surgically destroyed it in several rats, then ran the experiments again. The lesioned rats no longer showed the movement-related signals in the visual cortex. However, the researchers could not determine whether the signal originates in the secondary motor cortex; it may only pass through that region, they said.
    The scientists also pointed out some limitations of their findings. For instance, they measured only head movement, not eye movement. The study is also based on rodents, which are nocturnal; their visual systems share similarities with those of humans and primates but differ in complexity. Still, the paper adds to emerging lines of research, and the findings could be applied to neural networks that control machine vision, like those in autonomous vehicles.
    “It’s all to better understand how vision actually works,” Guitchounts said. “Neuroscience is entering into a new era where we understand that perception and action are intertwined loops. … There’s no action without perception and no perception without action. We have the technology now to measure this.”
    This work was supported by the Harvard Center for Nanoscale Systems and the National Science Foundation Graduate Research Fellowship.

  • This online calculator can predict your stroke risk

    Doctors can predict patients’ risk for ischemic stroke based on the severity of their metabolic syndrome, a conglomeration of conditions that includes high blood pressure, abnormal cholesterol levels and excess body fat around the abdomen and waist, a new study finds.
    The study found that stroke risk increased consistently with metabolic syndrome severity even in patients without diabetes. Doctors can use this information — and a scoring tool developed by a UVA Children’s pediatrician and his collaborator at the University of Florida — to identify patients at risk and help them reduce that risk.
    “We had previously shown that the severity of metabolic syndrome was linked to future coronary heart disease and type 2 diabetes,” said UVA’s Mark DeBoer, MD. “This study showed further links to future ischemic strokes.”
    Ischemic Stroke Risk
    DeBoer developed the scoring tool, an online calculator to assess the severity of metabolic syndrome, with Matthew J. Gurka, PhD, of the Department of Health Outcomes and Biomedical Informatics at the University of Florida, Gainesville. The tool is available for free at https://metscalc.org/.
    To evaluate the association between ischemic stroke and metabolic syndrome, DeBoer and Gurka reviewed data on more than 13,000 participants in prior studies and their stroke outcomes. Among that group, there were 709 ischemic strokes over a mean follow-up period of 18.6 years. (Ischemic strokes are caused when blood flow to the brain is obstructed by blood clots or clogged arteries. Hemorrhagic strokes, on the other hand, are caused when blood vessels rupture.)
    The researchers used their tool to calculate “Z scores” measuring the severity of metabolic syndrome among the study participants. They could then analyze the association between metabolic syndrome and ischemic stroke risk.
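    The published severity score is built from sex- and race-specific weights that the authors derived in earlier work, and those coefficients are not reproduced here. The sketch below only illustrates the general idea behind such a calculator: standardize each metabolic syndrome component against reference values and combine the results into one composite number. The reference means and standard deviations are placeholders, not the calculator’s actual parameters.

```python
# Illustrative composite "severity" score for metabolic syndrome components.
# The real MetS severity Z score uses sex- and race-specific weights from
# factor analysis; the reference values below are placeholders only.

REFERENCE = {  # component: (placeholder population mean, placeholder SD)
    "waist_cm":        (94.0, 13.0),
    "triglycerides":   (130.0, 60.0),   # mg/dL
    "hdl":             (50.0, 14.0),    # mg/dL (higher is better)
    "systolic_bp":     (122.0, 14.0),   # mm Hg
    "fasting_glucose": (98.0, 15.0),    # mg/dL
}

def component_z(name, value):
    mean, sd = REFERENCE[name]
    z = (value - mean) / sd
    return -z if name == "hdl" else z  # higher HDL lowers severity

def severity_score(measurements):
    """Average of standardized components; a stand-in for the real weighted score."""
    zs = [component_z(name, value) for name, value in measurements.items()]
    return sum(zs) / len(zs)

patient = {"waist_cm": 104, "triglycerides": 190, "hdl": 38,
           "systolic_bp": 138, "fasting_glucose": 112}
print(f"illustrative severity z-score: {severity_score(patient):+.2f}")
```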
    The subgroup with the highest association between metabolic syndrome and risk for ischemic stroke was white women, the researchers found. In this group, the research team was able to identify relationships between the individual contributors to metabolic syndrome, such as high blood pressure, and stroke risk.
    The researchers note that race and sex did not seem to make a major difference in stroke risk overall, and they caution that the increased risk seen in white women could be the result of chance alone. “Nevertheless,” they write in a new scientific article outlining their findings, “these results are notable enough that they may warrant further study into race and sex differences.”
    The overall relationship between metabolic syndrome severity and stroke risk was clear, however. And this suggests people with metabolic syndrome can make lifestyle changes to reduce that risk. Losing weight, exercising more, choosing healthy foods — all can help address metabolic syndrome and its harmful effects.
    DeBoer hopes that the tool he and Gurka developed will help doctors guide patients as they seek to reduce their stroke risk and improve their health and well-being.
    “In case there are still individuals out there debating whether to start exercising or eating a healthier diet,” DeBoer said, “this study provides another wake-up call to motivate us all toward lifestyle changes.”

    Story Source:
    Materials provided by University of Virginia Health System.

  • Sounds of action: Using ears, not just eyes, improves robot perception

    People rarely use just one sense to understand the world, but robots usually only rely on vision and, increasingly, touch. Carnegie Mellon University researchers find that robot perception could improve markedly by adding another sense: hearing.
    In what they say is the first large-scale study of the interactions between sound and robotic action, researchers at CMU’s Robotics Institute found that sounds could help a robot differentiate between objects, such as a metal screwdriver and a metal wrench. Hearing also could help robots determine what type of action caused a sound and help them use sounds to predict the physical properties of new objects.
    “A lot of preliminary work in other fields indicated that sound could be useful, but it wasn’t clear how useful it would be in robotics,” said Lerrel Pinto, who recently earned his Ph.D. in robotics at CMU and will join the faculty of New York University this fall. He and his colleagues found the performance rate was quite high, with robots that used sound successfully classifying objects 76 percent of the time.
    The results were so encouraging, he added, that it might prove useful to equip future robots with instrumented canes, enabling them to tap on objects they want to identify.
    The researchers presented their findings last month during the virtual Robotics Science and Systems conference. Other team members included Abhinav Gupta, associate professor of robotics, and Dhiraj Gandhi, a former master’s student who is now a research scientist at Facebook Artificial Intelligence Research’s Pittsburgh lab.
    To perform their study, the researchers created a large dataset, simultaneously recording video and audio of 60 common objects — such as toy blocks, hand tools, shoes, apples and tennis balls — as they slid or rolled around a tray and crashed into its sides. They have since released this dataset, cataloging 15,000 interactions, for use by other researchers.
    The team captured these interactions using an experimental apparatus they called Tilt-Bot — a square tray attached to the arm of a Sawyer robot. It was an efficient way to build a large dataset; they could place an object in the tray and let Sawyer spend a few hours moving the tray in random directions with varying levels of tilt as cameras and microphones recorded each action.
    They also collected some data beyond the tray, using Sawyer to push objects on a surface.
    Though the size of this dataset is unprecedented, other researchers have also studied how intelligent agents can glean information from sound. For instance, Oliver Kroemer, assistant professor of robotics, led research into using sound to estimate the amount of granular materials, such as rice or pasta, by shaking a container, or estimating the flow of those materials from a scoop.
    Pinto said the usefulness of sound for robots was therefore not surprising, though he and the others were surprised at just how useful it proved to be. They found, for instance, that a robot could use what it learned about the sound of one set of objects to make predictions about the physical properties of previously unseen objects.
    “I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” he said. For instance, a robot couldn’t use sound to tell the difference between a red block and a green block. “But if it was a different object, such as a block versus a cup, it could figure that out.”
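    The CMU model itself is not described in enough detail here to reproduce, but the general kind of pipeline (an audio clip turned into spectral features and fed to a classifier) can be sketched roughly as follows. The clips are synthetic, and the feature extraction and random-forest classifier are assumptions for illustration, not the team’s approach.

```python
# Minimal sketch of classifying objects from impact sounds: synthetic audio
# clips -> coarse magnitude-spectrum features -> classifier. This is NOT the
# CMU team's model; it only illustrates the general pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
SAMPLE_RATE, CLIP_LEN = 16000, 16000  # 1-second clips

def synthetic_clip(base_freq):
    """Stand-in for a recorded impact sound: decaying tone plus noise."""
    t = np.arange(CLIP_LEN) / SAMPLE_RATE
    tone = np.sin(2 * np.pi * base_freq * t) * np.exp(-4 * t)
    return tone + 0.3 * rng.standard_normal(CLIP_LEN)

def spectral_features(clip, n_bins=64):
    """Coarse magnitude spectrum as a fixed-length feature vector."""
    spectrum = np.abs(np.fft.rfft(clip))
    usable = spectrum[: n_bins * (len(spectrum) // n_bins)]
    return usable.reshape(n_bins, -1).mean(axis=1)

# Two hypothetical object classes with different characteristic resonances.
clips = [synthetic_clip(f) for f in [440] * 200 + [880] * 200]
labels = np.array([0] * 200 + [1] * 200)  # 0 = "screwdriver", 1 = "wrench"
X = np.array([spectral_features(c) for c in clips])

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"classification accuracy on held-out clips: {clf.score(X_test, y_test):.2f}")
```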
    The Defense Advanced Research Projects Agency and the Office of Naval Research supported this research.

    Story Source:
    Materials provided by Carnegie Mellon University. Original written by Byron Spice.

  • Hurricanes have names. Some climate experts say heat waves should, too

    Hurricane Maria and Heat Wave Henrietta?
    For decades, meteorologists have named hurricanes and ranked them according to severity. Naming and categorizing heat waves too could increase public awareness of the extreme weather events and their dangers, contends a newly formed group that includes public health and climate experts. Developing such a system is one of the first priorities of the international coalition, called the Extreme Heat Resilience Alliance.
    Hurricanes get attention because they cause obvious physical damage, says Jennifer Marlon, a climate scientist at Yale University who is not involved in the alliance. Heat waves, however, have less visible effects, since the primary damage is to human health.
    Heat waves kill more people in the United States than any other weather-related disaster (SN: 4/3/18). Data from the National Weather Service show that from 1986 to 2019, there were 4,257 deaths as a result of heat. By comparison, there were fewer deaths by floods (2,907), tornadoes (2,203) or hurricanes (1,405) over the same period.
    What’s more, climate change is amplifying the dangers of heat waves by increasing the likelihood of high temperature events worldwide. Heat waves linked to climate change include the powerful event that scorched Europe during June 2019 (SN: 7/2/19) and sweltering heat in Siberia during the first half of 2020 (SN: 7/15/20).
    Some populations are particularly vulnerable to health problems as a result of high heat, including people over 65 and those with chronic medical conditions, such as neurodegenerative diseases and diabetes. Historical racial discrimination also places minority communities at disproportionately higher risk, says Aaron Bernstein, a pediatrician at Boston Children’s Hospital and a member of the new alliance. Due to housing policies, communities of color are more likely to live in urban heat islands, areas that lack the green spaces that help cool down neighborhoods (SN: 3/27/09).
    Part of the naming and ranking process will involve defining exactly what a heat wave is. No single definition currently exists. The National Weather Service issues an excessive heat warning when the maximum heat index — which reflects how hot it feels by taking humidity into account — is forecasted to exceed about 41° Celsius (105° Fahrenheit) for at least two days and nighttime air temperatures stay above roughly 24° C (75° F). The World Meteorological Organization and World Health Organization more broadly describe heat waves as periods of excessively hot weather that cause health problems.
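    As a rough illustration, the sketch below checks a daily forecast against the approximate thresholds quoted above; actual National Weather Service warning criteria vary by region and forecast office.

```python
# Sketch of the approximate criteria quoted above: heat index above about
# 105 °F (roughly 41 °C) for at least two consecutive days, with nighttime
# temperatures staying above about 75 °F (roughly 24 °C). Actual NWS criteria
# vary by region; this only illustrates the rule of thumb.

HEAT_INDEX_THRESHOLD_F = 105.0
NIGHT_TEMP_THRESHOLD_F = 75.0
MIN_CONSECUTIVE_DAYS = 2

def meets_excessive_heat_criteria(daily_max_heat_index_f, nightly_min_temp_f):
    """True if any run of >= 2 consecutive days exceeds both thresholds."""
    run = 0
    for heat_index, night_temp in zip(daily_max_heat_index_f, nightly_min_temp_f):
        if heat_index > HEAT_INDEX_THRESHOLD_F and night_temp > NIGHT_TEMP_THRESHOLD_F:
            run += 1
            if run >= MIN_CONSECUTIVE_DAYS:
                return True
        else:
            run = 0
    return False

# Hypothetical five-day forecast (°F).
print(meets_excessive_heat_criteria(
    [101, 107, 109, 106, 99],   # daily maximum heat index
    [72, 77, 78, 76, 70]))      # nightly minimum temperature -> True
```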
    Without a universally accepted definition of a heat wave, “we don’t have a common understanding of the threat we face,” Bernstein says. He has been studying the health effects of global environmental changes for nearly 20 years and is interim director of the Center for Climate, Health and the Global Environment at the Harvard T.H. Chan School of Public Health.
    Defined categories for heat waves could help local officials better prepare to address potential health problems in the face of rising temperatures. And naming and categorizing heat waves could increase public awareness of the health risks posed by these silent killers.
    “Naming [heat waves] will make something invisible more visible,” says climate communicator Susan Joy Hassol of Climate Communication, a project of the Aspen Global Change Institute, a nonprofit organization based in Colorado that’s not part of the new alliance. “It also makes it more real and concrete, rather than abstract.”
    The alliance is in ongoing conversations with the National Oceanic and Atmospheric Administration, the World Meteorological Organization and other institutions to develop a standard naming and ranking practice.
    “People know when a hurricane’s coming,” Hassol says. “It’s been named and it’s been categorized, and they’re taking steps to prepare. And that’s what we need people to do with heat waves.”

  • Many medical 'rainy day' accounts aren't getting opened or filled

    One-third of the people who could benefit from a special type of savings account to cushion the blow of their health plan deductible haven’t opened one, according to a new study.
    And even among people who do open a health savings account (HSA), half haven’t put any money into it in the past year. This means they may be missing a chance to avoid taxes on money that they can use to pay for their health insurance deductible and other health costs.
    The study also finds that those who buy their health insurance themselves, and select a high-deductible plan on an exchange such as www.healthcare.gov, are less likely to open an HSA than those who get their insurance from employers who offer only a high-deductible option.
    HSAs are different from the flexible spending accounts that some employers offer. HSAs can only be opened by people in health plans that require them to pay a deductible of at least $1,400 for an individual or $2,800 for a family before their insurance benefits kick in.
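    As a simple illustration of that deductible rule, the sketch below checks whether a plan’s deductible meets the HSA-qualifying minimums cited above. The full IRS eligibility rules, such as out-of-pocket limits and disqualifying other coverage, are not modeled.

```python
# Sketch of the deductible rule described above: an HSA can be paired only
# with a plan whose deductible is at least $1,400 (self-only) or $2,800
# (family). Other IRS eligibility requirements are not modeled here.

MIN_DEDUCTIBLE = {"self_only": 1_400, "family": 2_800}

def hsa_eligible_deductible(coverage_type, deductible):
    """True if the plan's deductible meets the HSA-qualifying minimum."""
    return deductible >= MIN_DEDUCTIBLE[coverage_type]

print(hsa_eligible_deductible("self_only", 1_500))  # True
print(hsa_eligible_deductible("family", 2_500))     # False
```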
    In a new paper in JAMA Network Open, a team led by researchers at the University of Michigan and VA Ann Arbor Healthcare System reports results from a national survey of more than 1,600 participants in high-deductible health plans.
    Key findings
    They note that half of those who had an HSA and put money into their account in the past year had socked away $2,000 or more. And among those who hadn’t put money in, 40% said it was because they already had enough savings to cover their costs.

    But those with lower levels of education were much less likely to have opened an HSA, and to have contributed to it even if they did open one. Those with lower levels of understanding of health insurance concepts, called health insurance literacy, were also less likely to put money in their HSA if they had one.
    And one-third of those who didn’t put money into their HSA said it was because they couldn’t afford to save up for health costs.
    “These findings are concerning, given that nearly half of Americans with private insurance now have high-deductible plans,” says Jeffrey Kullgren, M.D., M.S., MPH, who led the study and has done other research on HDHPs and health care consumerism. “While policymakers have focused on expanding availability and permitted uses of HSAs, and increasing how much money they can hold, work is needed to help eligible enrollees open them and use them to get the care they need at a price they can afford.”
    Change needed
    In the new paper and in a report from the U-M Institute for Healthcare Policy and Innovation, Kullgren and his colleagues call for more efforts to increase uptake of HSAs, and contributions to HSAs, by employers, health insurers and the health systems that provide care and bill insurers for that care.

    Targeted interventions, especially those aimed at people with lower levels of education or health insurance literacy, should be developed.
    The researchers note that as federal and state exchanges prepare to open for 2021 enrollment, expanding the types of exchange plans that are eligible to be linked to an HSA will be important. Currently, just 7% of the plans bought on exchanges are eligible for an HSA, even though many exchange plans come with a high deductible.
    The study data come from an online survey of English-speaking adults under age 65; the study population was weighted to include a higher proportion of people with chronic health conditions than the national population.
    For the survey, the researchers asked respondents about HSAs using the National Health Interview Survey definition of an HSA as “a special account or fund that can be used to pay for medical expenses” that are “sometimes referred to as Health Savings Accounts (HSAs), Health Reimbursement Accounts (HRAs), Personal Care accounts, Personal Medical funds, or Choice funds, and are different from Flexible Spending Accounts.”

  • Academia from home

    As the uncertainty around reopening college and university campuses this fall continues, those who work, study, teach and conduct research are navigating the uncertain terrain of the “new normal.” They are balancing physical distancing and other COVID-19 prevention practices with productivity, creating home workspaces and mastering communications and teamwork across time and space.
    Turns out, there’s a group of people for whom these challenges are not new. Postdoctoral researchers — people in the critical phase between graduate school and permanent academic positions — are part of a small but growing cohort that has been turning to remote work to meet the challenges of their young careers. Often called upon to relocate multiple times for short-term, full-time appointments, postdocs and their families have to endure heightened financial costs, sacrificed career opportunities and separations from their support communities.
    But with the right practices and perspectives, remote work can level the playing field, especially for those in underrepresented groups, according to Kurt Ingeman, a postdoctoral researcher in UC Santa Barbara’s Department of Ecology, Evolution and Marine Biology. And, like it or not, with COVID-19 factoring into virtually every decision we now make, he noted, it’s an idea whose time has come.
    “We started this project in the pre-pandemic times but it seems more relevant than ever as academics are forced to embrace work-from-home,” said Ingeman, who makes the case for embracing remote postdoctoral work in the journal PLOS Computational Biology. Family and financial considerations drove his own decision to design a remote position; many early-career researchers face the same concerns, he said.
    It takes a shift in perspective to overcome resistance to having remote research teammates. Principal investigators often don’t perceive a remote postdoc as a fully functional member of the lab, and they worry about losing the spontaneous, informal interactions that can generate new ideas, Ingeman said.
    “These are totally valid concerns,” he said. “We suggest (in the paper) ways to use digital tools to fully integrate remote postdocs into lab activities, like mentoring graduate students or coding and writing together. These same spaces are valuable for virtual coffee chats and other informal interactions.”
    Communication enabled by technology is in fact foundational to a good remote postdoc experience, according to Ingeman and co-authors, who advocate for investment in and use of reliable videoconferencing tools that can help create rapport between team members, and the creation of digital spaces to share documents and files. Transparency and early expectation setting are keys to a good start. In situations where proximity would have naturally led to interaction, the researchers recommend having a robust communications plan. Additionally, postdocs would benefit from establishing academic connections within their local community to combat isolation.

    There are benefits to reap from such arrangements and practices, the researchers continued. For the postdoc, it could mean less stress and hardship, and more focus on work. For the team, it could mean a wider network overall.
    “For me, remote postdoc work was a real bridge to becoming an independent researcher,” said Ingeman, who “struggled with isolation early on,” but has since gained a local academic community, resulting in productive new research collaborations.
    Additionally, opening the postdoc pool to remote researchers can result in a more diverse set of applicants.
    “The burdens of relocating for a temporary postdoc position often fall hardest on members of underrepresented groups,” Ingeman added. “So the idea of supporting remote work really stands out to me as an equity issue.”
    Of course, not all postdoc positions can be remote; lab and field work still require a presence. But as social distancing protocols and pandemic safety measures are forcing research teams to minimize in-person contact or undergo quarantine at a moment’s notice, developing remote research skills may well become a valuable part of any early-career researcher’s toolkit.
    “Even labs and research groups that are returning to campus in a limited way may face periodic campus closures, so it makes sense to integrate remote tools now,” Ingeman said. “Our suggestions for remote postdocs are absolutely applicable to other lab members working from home during closures.”

  • Simple mod makes quantum states last 10,000 times longer

    If we can harness it, quantum technology promises fantastic new possibilities. But first, scientists need to coax quantum systems into staying coherent for longer than a few millionths of a second.
    A team of scientists at the University of Chicago’s Pritzker School of Molecular Engineering announced the discovery of a simple modification that allows quantum systems to stay operational — or “coherent” — 10,000 times longer than before. Though the scientists tested their technique on a particular class of quantum systems called solid-state qubits, they think it should be applicable to many other kinds of quantum systems and could thus revolutionize quantum communication, computing and sensing.
    The study was published Aug. 13 in Science.
    “This breakthrough lays the groundwork for exciting new avenues of research in quantum science,” said study lead author David Awschalom, the Liew Family Professor in Molecular Engineering, senior scientist at Argonne National Laboratory and director of the Chicago Quantum Exchange. “The broad applicability of this discovery, coupled with a remarkably simple implementation, allows this robust coherence to impact many aspects of quantum engineering. It enables new research opportunities previously thought impractical.”
    Down at the level of atoms, the world operates according to the rules of quantum mechanics — very different from what we see around us in our daily lives. These different rules could translate into technology like virtually unhackable networks or extremely powerful computers; the U.S. Department of Energy released a blueprint for the future quantum internet in an event at UChicago on July 23. But fundamental engineering challenges remain: Quantum states need an extremely quiet, stable space to operate, as they are easily disturbed by background noise coming from vibrations, temperature changes or stray electromagnetic fields.
    Thus, scientists try to find ways to keep the system coherent as long as possible. One common approach is physically isolating the system from the noisy surroundings, but this can be unwieldy and complex. Another technique involves making all of the materials as pure as possible, which can be costly. The scientists at UChicago took a different tack.

    “With this approach, we don’t try to eliminate noise in the surroundings; instead, we ‘trick’ the system into thinking it doesn’t experience the noise,” said postdoctoral researcher Kevin Miao, the first author of the paper.
    In tandem with the usual electromagnetic pulses used to control quantum systems, the team applied an additional continuous alternating magnetic field. By precisely tuning this field, the scientists could rapidly rotate the electron spins and allow the system to “tune out” the rest of the noise.
    “To get a sense of the principle, it’s like sitting on a merry-go-round with people yelling all around you,” Miao explained. “When the ride is still, you can hear them perfectly, but if you’re rapidly spinning, the noise blurs into a background.”
    This small change allowed the system to stay coherent up to 22 milliseconds, four orders of magnitude higher than without the modification — and far longer than any previously reported electron spin system. (For comparison, a blink of an eye takes about 350 milliseconds). The system is able to almost completely tune out some forms of temperature fluctuations, physical vibrations, and electromagnetic noise, all of which usually destroy quantum coherence.
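    A toy numerical sketch can convey the merry-go-round idea: an ensemble of spins with slowly varying random detunings loses coherence quickly on its own, while a fast continuous drive keeps the accumulated phase bounded. The noise and drive parameters below are placeholders, and the model is a cartoon of motional narrowing rather than the silicon carbide spin dynamics reported in the paper.

```python
# Cartoon of how a fast continuous drive averages out slow detuning noise.
# Parameters are placeholders; this is not the paper's actual spin model.
import numpy as np

rng = np.random.default_rng(2)

n_spins = 5000
sigma_noise = 2 * np.pi * 100      # rms quasi-static detuning, rad/s (placeholder)
drive_rate = 2 * np.pi * 1e6       # fast continuous drive, rad/s (placeholder)
detuning = sigma_noise * rng.standard_normal(n_spins)

def coherence(times, phase_fn):
    """|<exp(i*phi)>| over the ensemble at each time point."""
    return np.array([np.abs(np.mean(np.exp(1j * phase_fn(t)))) for t in times])

times = np.linspace(0, 10e-3, 6)   # 0 to 10 ms

# Without the drive, each spin's phase grows linearly with its detuning.
undriven = coherence(times, lambda t: detuning * t)
# With the fast drive, the detuning's contribution oscillates and stays
# bounded near detuning / drive_rate instead of growing with time.
driven = coherence(times, lambda t: detuning * np.sin(drive_rate * t) / drive_rate)

for t, u, d in zip(times, undriven, driven):
    print(f"t = {t * 1e3:4.1f} ms   undriven {u:.3f}   driven {d:.3f}")
```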
    The simple fix could unlock discoveries in virtually every area of quantum technology, the scientists said.
    “This approach creates a pathway to scalability,” said Awschalom. “It should make storing quantum information in electron spin practical. Extended storage times will enable more complex operations in quantum computers and allow quantum information transmitted from spin-based devices to travel longer distances in networks.”
    Though their tests were run in a solid-state quantum system using silicon carbide, the scientists believe the technique should have similar effects in other types of quantum systems, such as superconducting quantum bits and molecular quantum systems. This level of versatility is unusual for such an engineering breakthrough.
    “There are a lot of candidates for quantum technology that were pushed aside because they couldn’t maintain quantum coherence for long periods of time,” Miao said. “Those could be re-evaluated now that we have this way to massively improve coherence.
    “The best part is, it’s incredibly easy to do,” he added. “The science behind it is intricate, but the logistics of adding an alternating magnetic field are very straightforward.”

    Story Source:
    Materials provided by University of Chicago. Original written by Louise Lerner.

  • COVID-19 symptom tracker ensures privacy during isolation

    An online COVID-19 symptom tracking tool developed by researchers at Georgetown University Medical Center ensures a person’s confidentiality while being able to actively monitor their symptoms. The tool is not proprietary and can be used by entities that are not able to develop their own tracking systems.
    Identifying and monitoring people infected with COVID-19, or exposed to people with infection, is critical to preventing widespread transmission of the disease. Details of the COVID-19 Symptom Tracker and a pilot study were published August 13, 2020, in the Journal of Medical Internet Research (JMIR).
    “One of the major impediments to tracking people with, or at risk of, COVID-19 has been an assurance of privacy and confidentiality,” says infectious disease expert Seble G. Kassaye, MD, MS, lead author and associate professor of medicine at Georgetown University Medical Center. “Our online system provides a method for efficient, active monitoring of large numbers of individuals under quarantine or home isolation, while maintaining privacy.”
    The Georgetown internet tool assigns a unique identifier as people enter their symptoms and other relevant demographic data. One function in the system allows institutions to generate reports about items on which people can act, such as symptoms that might require medical attention. Additionally, people using the system are provided with information and links to Centers for Disease Control and Prevention COVID-19 recommendations and instructions for how people with symptoms should seek care.
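    The article does not detail how the tool is implemented, but the privacy-preserving design it describes, an opaque identifier with symptom entries keyed to it and no personal details stored, can be sketched roughly as follows. The function names and the flagging rule are illustrative assumptions, not the Georgetown system’s actual code.

```python
# Minimal sketch of a non-identifying symptom tracker: issue a random opaque
# token and key symptom reports to it, storing no personal details. This is
# an illustration, not the Georgetown system's actual design.
import secrets
from datetime import datetime, timezone

reports = {}  # token -> list of symptom entries

def register_user():
    """Return a random opaque identifier; no name, email, or phone is stored."""
    return secrets.token_urlsafe(16)

def log_symptoms(token, symptoms, temperature_f=None):
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "symptoms": sorted(symptoms),
        "temperature_f": temperature_f,
    }
    # Hypothetical flag for entries that might require medical attention.
    if "shortness of breath" in symptoms or (temperature_f or 0) >= 103:
        entry["flag"] = "seek medical guidance"
    reports.setdefault(token, []).append(entry)
    return entry

user = register_user()
print(log_symptoms(user, {"cough", "fever"}, temperature_f=101.2))
```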
    Development of the system was rapid — it took five days to design. The joint project included Georgetown University’s J.C. Smart, PhD, chief scientist of AvesTerra, a knowledge management environment that supports data integration and synthesis to identify actionable events and maintain privacy, and Georgetown’s vice president for research and chief technology officer, Spiros Dimolitsas, PhD.
    “We knew that time was of the essence and the challenges of traditional contact tracing became very clear to us based on one of our first patients who had over 500 exposures,” says Kassaye. “This was what motivated us to work on this, essentially day and night.”
    The tool launched on March 20, followed by initial testing of the system with the voluntary participation of 48 Georgetown University School of Medicine students or their social contacts. Participants were asked to enter data twice daily for three days between March 31 and April 5, 2020.
    “The lack of identifying data being collected in the system should reassure individual users and alleviate personal inhibitions that appear to be the Achilles’ heel of other digital contact tracing apps that require identifying information,” says Kassaye. She also noted that this system could be used by health-related organizations during the reopening of businesses to provide reassurance to their users that the enterprise is actively, rather than passively, monitoring its staff.
    Feedback from healthcare groups using the platform led to the release of a Spanish language version. As the data currently needs to be entered through the website, development of an app for cellphone use could greatly enhance the usability of the tool, said the investigators. For places where internet access is problematic, the researchers are also pursuing development of a voice activated version.
    The tracker can be viewed at: https://www.covidgu.org
    In addition to Kassaye, Amanda B. Spence of Georgetown University Medical Center contributed to this work. Other authors include Edwin Lau and John Cederholm, LEDR Technologies Inc.; and David M. Bridgeland, Hanging Steel Productions LLC.
    This work was partially supported by a National Institutes of Health grant UL1TR001409. The authors report no conflicts of interest.