Forge AI Health Friday Roundup - October 15, 2021

In today’s Roundup: COVID’s impact on “fly-in” medical missions; alarm and debate over FHIR hacking report; real-world AI study finds “negligible” tradeoff between fairness, accuracy; breast cancer poses greater risks for Black women; seeking clarity on ivermectin; convolutional neural networks gaining ground in facial recognition; mixing COVID vaccines and boosters; FDA seeks lower sodium levels; developing trustworthy AI; NISO seeks to make paper retraction more visible; much more:


Photograph of a human-shaped robot with its arms thrown up in the air.
Image credit: Adam Lukomski/Unsplash

DEEP BREATHS

  • According to this report from San Francisco CBS affiliate KPIX, the Robot Apocalypse may be delayed on account of being stuck in traffic: “They come all day, right to the end of 15th Avenue, where there’s nothing else to do but make some kind of multi-point turn and head out the way they came in. Not long after that car is gone, there will be another, which will make the same turn and leave, before another car shows up and does the exact same thing. And while there are some pauses, it never really stops….The cars, packed with technology, stop in a queue as if they are completely baffled by the dead end.” (H/T @CaseyNewton)
  • “I hope I never recover from this.” The official record holder for the title of oldest person to fly in space is now…Captain Kirk? Star Trek actor William Shatner has displaced previous record holder Wally Funk, who just a few weeks earlier also rode the Jeff Bezos-funded Blue Origin rocket past the Kármán line that demarcates (by consensus) the edge of outer space. The New York Times’ Daniel E. Slotnik has the story.
  • This is not necessarily new, but Theo Sanderson’s “Up-Goer Five Text Editor” (based on an XKCD comic by Randall Munroe) will challenge your ability to express yourself as simply and directly as possible (H/T @jamesian).

AI, STATISTICS & DATA SCIENCE

Photograph of flames against a dark background.
Image credit: Cullan Smith/Unsplash
  • “While the report found that the EHR platforms examined in the study had good security in place, third-party clinical data aggregators and mobile apps were a completely different story: with “widely systemic” vulnerabilities that allowed access to EHR data….The report makes it clear that the vulnerabilities aren’t inherent to FHIR, rather, it’s how the blueprint is implemented as it’s up to the developer.” An article by Jessica Davis at SC Media reports findings from a study by cybersecurity expert Alissa Knight that identifies what Knight characterizes as serious vulnerabilities in health apps designed to meet the FHIR (Fast Healthcare Interoperability Resources) standard. The report is generating substantial controversy and concern among the health tech, cybersecurity, and patient advocacy communities and could have significant implications for the burgeoning FHIR API ecosystem (H/T @BraveBosom).
  • “…few studies have examined the practical trade-offs between fairness and accuracy in real-world settings to understand how these bounds and methods translate into policy choices and impact on society. Our empirical study fills this gap by investigating the impact of mitigating disparities on accuracy, focusing on the common context of using machine learning to inform benefit allocation in resource-constrained programmes across education, mental health, criminal justice and housing safety. Here we describe applied work in which we find fairness–accuracy trade-offs to be negligible in practice.” A study by Rodolfa and colleagues, recently published in Nature Machine Intelligence, examines real-world tradeoffs between fairness and accuracy in applying machine learning to policy decisions (H/T @rayidghani).
  • “This article focuses on addressing three key research communities because: trustworthy AI adds new desired properties above and beyond those for trustworthy computing; AI systems require new formal methods techniques, and in particular, the role of data raises brand new research questions; and AI systems can likely benefit from the scrutiny of formal methods for ensuring trustworthiness.” A review article by Jeannette M. Wing, published in Communications of the ACM, surveys the case for the development of “trustworthy AI.”
  • “The extent to which today’s computer scientists and mathematicians will be able to keep pace with the hype surrounding AI at the moment is unclear. What’s crystal clear, however, is that machine learning and deep learning are being stitched into our health systems for the long run. AI is here to stay. But the path forward requires substantial trust in science. It also requires extraordinary evidence that the technology works.” In the Fall issue of Harvard Public Health, Chris Sweeney provides a sweeping historical perspective on the application of AI in public health research, and why the current moment may represent a major inflection point – for good or ill – in the use of the technology in public health.
  • “Traditional Neural Network based models find it challenging to recognize faces that are partially occluded with hands, scarves, or other barriers. But Convolutional Neural Networks (CNNs) are highly capable of learning features and accurately identifying the image, even when the only fully visible part of the face is the eyes. Adding gradient descent optimization to the CNN approach has resulted in a better result for many researchers, making advances in identification from other features: eyebrows, cheeks, or foreheads. As a result, facial recognition systems research using CNNs is booming.” An article at Data Science Central by Stephanie Glen reports that rapid refinements to facial recognition software – particularly the varieties making use of convolutional neural networks – may be able to identify individuals on the basis of only a small bit of exposed facial features, and countermeasures such as lower-face masks and sunglasses will no longer be proof against the algorithms.
  • “Natural language processing (NLP) systems have become a central technology in communication, education, medicine, artificial intelligence, and many other domains of research and development. While the performance of NLP methods has grown enormously over the last decade, this progress has been restricted to a minuscule subset of the world's 6,500 languages. We introduce a framework for estimating the global utility of language technologies as revealed in a comprehensive snapshot of recent publications in NLP.” A preprint research paper by Blasi and colleagues, available at arXiv, surveys global inequities in research and applications of natural language processing, particularly among the large body of languages that fall outside the small handful in which NLP studies most frequently take place.
  • “Artificial intelligence (AI) is increasingly becoming a tool for researchers in other science and technology fields, forging collaborations across disciplines. Stanford University in California, which produces an index that tracks AI-related data, finds in its 2021 report that the number of AI journal publications grew by 34.5% from 2019 to 2020; up from 19.6% between 2018 and 2019.” A feature article at Nature by Jack Leeming investigates how AI is being applied across a wide variety of scientific disciplines.
  • “Now, after 15 years of advocating for and developing “interpretable” machine learning algorithms that allow humans to see inside AI, Rudin’s contributions to the field have earned her the $1 million Squirrel AI Award for Artificial Intelligence for the Benefit of Humanity from the Association for the Advancement of Artificial Intelligence (AAAI). Founded in 1979, AAAI serves as the prominent international scientific society serving AI researchers, practitioners and educators.” Duke’s Ken Kingery, writing for the Pratt School of Engineering website, highlights a recent major honor bestowed on Duke engineering professor and AI expert Cynthia Rudin.

BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH

  • “Black women, in particular, experience marked inequities in breast cancer treatment and survival. Recent statistics from the American Cancer Society demonstrate widening breast cancer mortality rates between Black and white women, with a 41% higher death rate for Black women. This striking disparity reflects a combination of factors, including more advanced disease at diagnosis among Black women and differential treatment receipt. Black women experience greater socioeconomic barriers and bias from oncologists.” An opinion article published in the Philadelphia Inquirer by radiologist Christine Edmonds and breast surgeon Oluwadamilola “Lola” Fayanju seeks to raise awareness of the elevated breast cancer risk borne by Black women in the U.S.
    African American woman receiving a mammogram, assisted by a healthcare worker.
    Image credit: National Cancer Institute
  • A progress report on a National Academies roundtable devoted to genomics and precision health, authored by Duke’s Geoffrey Ginsburg and colleagues, has been published in the American Journal of Human Genetics.
  • “If endangered hosts are highly connected in host–parasite networks, then future host extinctions will also drive parasite extinctions, destabilizing ecological networks. If threatened hosts are not highly connected, however, then network structure should not be greatly affected by the loss of threatened hosts.” A paper published by Herrera and colleagues in Philosophical Transactions of the Royal Society B reports on analyses that suggest the growing threat of extinction for the world’s primate species is likely to have complex ecological follow-on effects – including the resultant extinction of specialized parasite species (H/T @DukeGenomics).
  • “Ivermectin, when used correctly, has prevented millions of potentially fatal and debilitating infectious diseases. It’s meant to be prescribed only to treat infections caused by parasites. It’s not meant to be prescribed by parasites looking to extract money from desperate people during a pandemic. It’s my sincere hope that this unfortunate and tragic chapter in the otherwise incredible story of a lifesaving medication will come to a quick end.” In an explainer for The Conversation, pharmacy professor Jeffrey R. Aeschlimann describes why ivermectin is a world-changing drug, how the process of drug repurposing works, and why dosing oneself with the antiparasitic agent as a prophylactic or treatment measure against COVID is both risky and unsupported by current scientific evidence.
  • “The story of how Ravkoo reinvented itself as an ivermectin supplier reveals how the telemedicine boom, accelerated by the pandemic, has left patients vulnerable. U.S. health officials have warned for more than a year that ‘rogue online pharmacies’ could seize on misinformation and medical distrust to peddle unproven and potentially dangerous prescription drugs to treat COVID-19.” And speaking of which: this investigative report by Time’s Vera Bergengruen draws back the curtain on an astonishing scheme to fill prescriptions for unproven COVID treatments, including ivermectin and hydroxychloroquine, with the pharmacy company involved charging sky-high prices – and then on at least some occasions, failing to deliver any product at all.
  • “For many decades, medical missions have flown doctors and nurses from wealthy countries into poorer nations, set up temporary clinics, treated as many patients as they could in a week or two, then flown their staff back home. These visits may have helped those who were in the front of the line for care — but don't help those who can't be seen during a limited visit or need follow-up care — until and unless the mission returns.” A story by NPR’s Joanne Silberner and Adela Wu for the “Goats and Soda” blog examines how the COVID pandemic has changed so-called “fly-in” missions in which physicians, nurses, and other specialists from wealthy nations are deployed to nations or regions with fewer healthcare resources (with quotes from Duke neurosurgeon Michael Haglund).
  • “…we show that the majority of newly formed oesophageal tumours are eliminated through competition with mutant clones in the adjacent normal epithelium. We followed the fate of nascent, microscopic, pre-malignant tumours in a mouse model of oesophageal carcinogenesis and found that most were rapidly lost with no indication of tumour cell death, decreased proliferation or an anti-tumour immune response. However, deep sequencing of ten-day-old and one-year-old tumours showed evidence of selection on the surviving neoplasms.” A thought-provoking paper published in Nature by Colom and colleagues examines the mechanisms that seemingly curb tumor proliferation, even as genetic errors and mutations pile up with increasing age.
  • NPR correspondent Rob Stein reports on the release of a new paper from NIH investigators (available as a preprint from medRxiv) that may open the door to mixing and matching of vaccine boosters: “If you got the Johnson & Johnson vaccine as your first COVID-19 shot, a booster dose of either the Moderna or Pfizer-BioNTech vaccine apparently could produce a stronger immune response than a second dose of J&J's vaccine. That's the finding of a highly anticipated study released Wednesday….And if you started out with either Pfizer or Moderna, it probably doesn't matter that much, the research suggests, as long as you get one of the two mRNA vaccines as a booster.”

Leak
Image credit: Joe Zlomek/Unsplash

COMMUNICATIONS & DIGITAL SOCIETY

  • “It is a critically important issue. Trust in scholarly communications, and the scientific process that it represents, is what makes scholarly publishing different from other forms of publishing. We see this as there being a problem with retracted science and the communication about retracted status. We hope to, through this process, improve the ecosystem awareness of retracted status.” Retraction Watch interviews Todd Carpenter, executive director of the US National Information Standards Organization (NISO), which is on a mission to make problematic papers – ones with retractions or expressions of concern – more readily identifiable in the scientific literature.
  • “Tuesday’s announcement stated that Facebook plans to comb through some of the online discussion groups to remove individuals whose work isn’t related to safety and security. The changes will occur in ‘the coming months’ and ‘with the expectation that sensitive Integrity discussions will happen in closed, curated forums in the future.’” The New York Times’ Ryan Mac has a rather, er, recursive story about Facebook, in the wake of a whistleblowing episode and ongoing scrutiny, attempting to clamp down on leaks of internal discussions in an internal discussion that was promptly leaked….
  • “It may not be over even when physical disease, measured in illness and mortality, has greatly subsided. It may continue as the economy recovers and life returns to a semblance of normality. The lingering psychological shock of having lived in prolonged fear of severe illness, isolation and painful death takes long to fade.” New York Times’ medical correspondent Gina Kolata pulls back the focus to provide a long view of the historical course of current and past pandemics, and suggests that it’s unlikely we’ll reach a clear-cut point where the COVID pandemic is officially over and done with.
  • In The Verge this week, we have multiple stories involving robots being variously charming (skating, slacklining, etc.) or terrifying (quadrupedal all-terrain robots, outfitted with automatic assault rifles).
  • “If scholars increasingly cite beyond ‘top’ journals, cite work in languages other than English, and connect their own work with broader contexts, this could help to promote a more equitable publishing environment. We suggest that when developing our own academic papers, and when critiquing the work of others, it is important to increasingly adopt and encourage these simple practices.” A post at the London School of Economics’ Impact Blog by Shannon Mason and Margaret Merga makes the case for citing less prominent scholarly journals as a means of expanding the diversity of cited research.

Photograph showing a cook sprinkling salt into a dish.
Image credit: Bank Phrom/Unsplash

POLICY

  • “In this cross-sectional study of 216 drugs granted accelerated approval from 1992 through 2020, relative to all drugs paid for by Medicaid, products with accelerated approval comprised less than 1% of use. Despite their infrequent use, annual net spending on drugs with accelerated approval represented 6.4% to 9.1% of net spending on all drugs covered by Medicaid (in 2015 and 2018, respectively).” An analysis published in JAMA’s Health Forum by Sachs and colleagues suggests that drugs approved under the US Food and Drug Administration’s accelerated approval program account for a disproportionate share of state Medicaid spending (H/T @DusetzinaS).
  • “In most cases, White men were paid a higher median compensation than men of other races/ethnicities and women of all races/ethnicities…Of faculty of the same race/ethnicity, men had a higher median compensation than women in most cases, indicating that gender is the primary factor driving compensation inequities.” The American Association of Medical Colleges (AAMC) has recently released a report examining salary and compensation information from US medical schools according to gender and race/ethnicity (H/T @UREssien).
  • Content warning: the following article contains discussion of suicide. “Students say that virtual learning has been intellectually and emotionally draining for many of them, and that public-health restrictions on campuses designed to reduce the spread of the coronavirus have increased loneliness and isolation.” At the Chronicle of Higher Education, Sarah Brown reports on growing concern at the US Department of Education and Justice Department over the risk of suicide among students – risks that have been heightened by stressors associated with the COVID pandemic’s effects.
  • “Over the next 2.5 years, the FDA’s target sodium levels aim to cut average intake by 12% — from 3,400 to 3,000 milligrams a day. That would still leave average intake above the federally recommended limit of 2,300 milligrams a day for people 14 and older. But the agency says it will monitor industry progress and keep issuing updated targets to bring levels closer to the recommended limit over time.” The Associated Press reports on the release of new (and for the time being, voluntary) FDA guidelines on sodium levels for food companies.