Forge Friday Roundup - July 23, 2021

July 23, 2021

In today’s Roundup: AI, robots, and human cognitive biases; mental health burdens among physicians; how important is explainable AI in medicine?; falling vaccination rates and rising COVID cases; the new era of multiomics research; “borg” DNA assimilates genes in its path; sharp divide on privacy issues roils Web’s governing consortium; “tortured phrases” betray dubious publications; states that didn’t expand Medicaid saw growth in medical debt; assessing how well social media managed the 2020 election misinformation blitz; much more:


A 2019 photo shows an uncrewed Blue Origin spacecraft capsule returning to the ground. The capsule is just above scrub terrain in West Texas, with three large blue and orange parachutes deployed.
Image credit: Blue Origin

DEEP BREATHS

  • The launch this week of Amazon founder Jeff Bezos’ Blue Origin spacecraft (dubbed “New Shepard”), carrying a group of mostly very well-heeled passengers (including Bezos himself), has absorbed a great deal of media attention, but you’d have to look long and hard to find a better and more thoughtful perspective on the flight than that provided in the New York Times by one of our favorite science fiction authors, Mary Robinette Kowal: “As we move forward into the world where commercial spaceflight offers opportunities to go based, not on skills, but on the amount of money in one’s wallet, we will have to continue to ask the question: Who is space really for? But for the moment, for those four minutes of Blue Origin’s flight on Tuesday, space will be for Wally Funk, and those three men who are fortunate enough to be able to witness her joy firsthand.” And as a nice chaser for that article, consider this update from NASA’s Jet Propulsion Laboratory, which reminds us that the Perseverance Mars rover is about to start its search for signs of life on the red planet in earnest.

AI, STATISTICS & DATA SCIENCE

  • “…the grand visions of the past are gone. Today, instead of being a shorthand for technological prowess, Watson stands out as a sobering example of the pitfalls of technological hype and hubris around A.I…The march of artificial intelligence through the mainstream economy, it turns out, will be more step-by-step evolution than cataclysmic revolution.” An article by the New York Times’ Steve Lohr traces the tortuous path of IBM’s Watson AI project, once touted as a major breakthrough capable of transforming medical practice, but now trying to emerge from its status as a cautionary example of the downside to overheated marketing (H/T @MarkRDeLong).
  • “The success of AlphaFold in this paper may not come as a big shock to many scientists; rather, more as confirmation of the already-suspected capabilities of such technology, says Andrei Lupas, the director of the Max Planck Institute for Developmental Biology and an assessor at CASP. Similar systems are following close behind. Academics from the University of Washington have already designed a protein prediction tool similar to AlphaFold 2, called RoseTTAFold….Fundamentally, the news shows that this is something that AI can just do better.” Wired’s Grace Browne explores some of the implications of a recent publication (and release of a public dataset) from DeepMind’s AlphaFold AI-guided protein-folding project (H/T @ShannonValor).
  • “If explainability should not be a strict requirement for AI/ML in health care, what then? Regulators like the FDA should focus on those aspects of the AI/ML system that directly bear on its safety and effectiveness—in particular, how does it perform in the hands of its intended users? To accomplish this, regulators should place more emphasis on well-designed clinical trials, at least for some higher-risk devices, and less on whether the AI/ML system can be explained.” A thought-provoking policy paper by Babic and colleagues in Science asks whether chasing “explainable” AI in healthcare algorithms (as opposed to accepting “black box” technologies) might be something of a “red herring” when it comes to ensuring safety and effectiveness (H/T @rusincovitch).
    Photograph of a “solved” Rubik’s Cube puzzle with all faces showing a single color, sitting on a white surface with the cube’s shadow extending behind it.
    Image credit: Etienne Boulanger/Unsplash
  • “By watching a video of a robot hand solving a Rubik’s Cube at OpenAI, an AI research lab, we think that the AI can perform all other simpler tasks because it can perform such a complex one. We overlook the fact that this AI’s neural network was only trained for a limited type of task: solving the Rubik’s Cube in that configuration. If the situation changes—for example, holding the cube upside down while manipulating it—the algorithm does not work as well as might be expected.” An article at IEEE Spectrum by MIT roboticist Sangbae Kim explores the cognitive biases that permeate the fields of AI and robotics research – and that greatly influence the public narratives around these technologies.
  • “Whatever the acronym, all these techniques aim to glean complex biological insights that might be undetectable using any single method. But the task is computationally challenging, and making sense of the resulting data even more so. A fast-growing suite of software tools can help.” At Nature, Jeffrey M. Perkel documents the emergence of a new era of computationally intensive tools for wrangling enormous (and related) layers of “multiomics” data in basic science research.
  •  “Support for healthcare data sharing for direct care without explicit consent is broad but not universal. There is net support for the sharing of de-identified data for research to the NHS, academia, and the charitable sector, but not the commercial sector. A single national NHS-hosted system for patients to control the use of their NHS data for clinical purposes and for research would have broad public support.” A preprint by Jones and colleagues available from medRxiv offers a snapshot of survey findings regarding public attitudes toward the sharing of data from the UK’s National Health Service.
  • “Unfortunately, though, fake papers aren’t only generated as pranks. Entire companies make money writing gibberish papers and submitting them to predatory journals that hardly reject anything because they charge a fee for publishing. Such companies, also dubbed paper mills, are getting more and more sophisticated in their methods.” In an essay at Towards Data Science, Rhea Moutafis considers some of the ways that technology is complicating issues of authorship and ethics in scientific publication.
  • It seems slightly odd that one can get better internet on the International Space Station, whizzing around in low earth orbit, than one can at McMurdo Research Station in Antarctica, but that does appear to be the case, as Chelsea Harvey reports at Scientific American: “…high volumes of data pose a challenge for scientists at McMurdo. The limited bandwidth means there’s only so much material they can transmit back to the U.S. Often, researchers are unable to fully analyze their data until they lug it home to their labs on hard drives. That’s a drag on the scientific process. It means important findings could go months before being detected. It can also cause problems for researchers out in the field.”
  • “For my students who suffer ethics ennui, I believe they possess a genuine interest in values of equity. They assert that the classic texts of CRT speak to them, that they are moved by the ideas and eloquence of W.E.B. DuBois, Patricia Williams, Kimberlé Crenshaw, and others…The most lauded avenues of action annoy them. They tell me that the politics of refusal is for people who are otherwise gainfully employed and secure in their communities.” In an essay on her Substack, Stanford ethics professor Ruth Starkman ponders the disconnect between highly touted AI ethics initiatives and actual substantive efforts that go beyond decorative “ethicswashing” for the tech industry.

BASIC SCIENCE, CLINICAL RESEARCH & PUBLIC HEALTH

Photograph of graffiti.
Image credit: Jerome/Unsplash
  • “The impact of vaccines is remarkable. They’re standing up to the variants nature has thrown our way. The overwhelming majority of hospitalizations and deaths — some 98% to 99% of the latter — are among people not fully vaccinated. (No vaccine prevents all severe outcomes.)…But so many people remain unvaccinated that, nationwide, cases have more than doubled in recent weeks — a jump driven not just by Delta, but also the country’s lapsing of mitigation efforts and people traveling and reconnecting socially.” At STAT News, Andrew Joseph takes stock of US progress in fighting the COVID pandemic – and areas where we risk backsliding.
  • “Novavax’s quest to scale up operations underscores how difficult it can be to launch a vaccine ― even with the formula and technology in hand. So what happened? It has had the financial backing of the U.S. government and full faith of international agencies. Everything took longer than expected: hiring necessary researchers and scientists, getting supplies and transferring its vaccine technology. It didn’t move at warp speed.” At Kaiser Health News, Sarah Jane Tribble and Rachana Pradhan explore what happened as Novavax – one of the early hot prospects for COVID vaccine development – has fallen behind in the stretch.
  • “Mental health professionals often tell people that there is no shame in being mentally ill. The illness is a disease, or the product of traumatic experiences, or both, depending on your particular camp — in any case, it isn’t the patient’s fault. And yet clinicians fervently guard their own histories of mental illness. A physician patient of mine drives to a different town to fill prescriptions so he is not recognized as a doctor by the pharmacist. Another asks me to provide her with drug samples so her insurance company won’t be informed of her psychiatric medications.” A poignant opinion article at STAT News by psychiatrist Susan T. Mahler plumbs the often hidden burden of mental illness shouldered by clinicians.
  • “These extra-long DNA strands, which the scientists named in honour of the aliens, join a diverse collection of genetic structures — circular plasmids, for example — known as extrachromosomal elements (ECEs). Most microbes have one or two chromosomes that encode their primary genetic blueprint. But they can host, and often share between them, many distinct ECEs. These carry non-essential but useful genes, such as those for antibiotic resistance.” Nature’s Amber Dance reports on a recent study by Al-Shayeb and colleagues (available as a preprint from bioRxiv) that describes a potentially new (and remarkably large) extrachromosomal element, informally named “the Borg” due to its assimilationist habits – a strand of DNA, recently identified in a species of archaea known as Methanoperedens, that hoovers up genes from surrounding microorganisms.
  • “With every passing day, the pace of vaccinations only seems to drag a little closer to the gutter. As of July 12, it had fallen off by half again. The Great Vaccine Decline now appears to be an ugly force of nature. If it continues, further horrors are all but guaranteed to follow. Sadly, those horrors may be the only thing that stops it.” At The Atlantic, Daniel Engber confronts a grim vista: COVID-19 vaccination rates across the US are falling just as the more contagious Delta variant of the virus drives an acceleration in cases and hospitalizations. Engber plumbs the possible reasons for the fading enthusiasm for immunization.
  • “We found that the rural counties with the highest proportion of residents age 85+ face unique challenges to supporting successful aging among the oldest old, including resource constraints, limited services, isolated locations, and widespread service areas. Still, interviewees identified particular reasons why the oldest old remain in their counties, with many highlighting positive aspects of rural environments and community.” An interview study by Henning-Smith, Lahr, and Tanem published in the journal Research on Aging explores a sometimes overlooked segment of the US population – the “oldest old” (persons 85 years or older), many of whom live in rural counties where access to care and resources may present difficulties.
  • “Impending doom”: a wrenching article by AL.com’s Dennis Pillion provides a window into the worsening trajectory of the COVID pandemic as the Delta variant makes its way through Alabama, where only about a third of the population has been vaccinated.
  • “These findings suggest that intentional and targeted financial investment in structural, scalable, and sustainable place-based interventions in neighborhoods that are still experiencing the lasting consequences of structural racism and segregation is a vital step toward achieving health equity.” A cross-sectional study published by South and colleagues in JAMA Network Open examines correlations between the upkeep of housing in lower-income neighborhoods and the incidence of crime.

COMMUNICATIONS & DIGITAL SOCIETY

Black and white closeup photo of a spiderweb with drops of dew clinging to the strands.
Image credit: Gabe Rebra/Unsplash
  • “…lately, that spirit of collaboration has been under intense strain as the W3C has become a key battleground in the war over web privacy. Over the last year, far from the notice of the average consumer or lawmaker, the people who actually make the web run have converged on this niche community of engineers to wrangle over what privacy really means, how the web can be more private in practice and how much power tech giants should have to unilaterally enact this change.” A fascinating essay by Issie Lapowsky at Protocol draws back the curtain on the deliberations of an obscure yet influential group of engineers and computer scientists – the World Wide Web Consortium (W3C) – over contentious issues related to the tracking of user activity across websites.
  • “The fact that the words were Bourdain’s doesn’t make it any less eerie that an artificial voice speaks them. And in a genre like documentary, that purports to reflect reality — albeit one inevitably tempered by what’s included, what’s omitted, and the filmmaker’s particular agenda and biases — it’s especially jarring. It’s also a slippery slope, because if you’re going to fake Bourdain’s voice, why not go all-in and recreate scenes with him using CGI?” At The Input, Craig Wilson reports on the decision by documentary filmmakers to use an AI application to create a counterfeit version of the late Anthony Bourdain’s voice that is then used to “read” personal correspondence (in this case, an email sent by Bourdain to another recipient).
  • “Tortured phrases are what happens to words that get translated from English into a foreign language, then back to English — perhaps by a computer trying to generate a scholarly publication for a group of unscrupulous authors.” Retraction Watch highlights a recent paper that describes a worrisome (if also occasionally amusing) trend at the more dubious margins of scientific publishing – the use of automated translations to “launder” plagiarized or auto-generated text – a phenomenon that can be detected by scanning for so-called “tortured phrases.”
  • “The spyware that infiltrated seven of the analyzed phones is called Pegasus. It secretly unlocks the contents of a target’s mobile phone and transforms it into a listening device. NSO says it licenses the tool exclusively to government agencies to combat terrorism and other serious crimes. In India, use of the spyware appears to have gone well beyond those objectives.” The Washington Post’s Joanna Slater and Niha Masih report on revelations that the Indian government appears to be making widespread use of a phone-hacking application known as Pegasus. Marketed as a counter-terrorism tool, the spyware has been found on phones of government officials, journalists, and critics of the current government.
  • “In the individual studies, the interventions with significant effects on the peer review process included the addition of a statistical reviewer, the use of checklists and guidelines, editorial prescreening of manuscripts, the assignment of a shorter deadline to accept the invitation to review, and the blinding of the reviewers to authors' identity. In the meta‐analysis groupings, Gaudino et al found that reviewer‐level interventions led to a small improvement in quality measures at the expense of increased review time.” A meta-analysis by Gaudino and colleagues (with accompanying editorial by Barry London) published in the Journal of the American Heart Association scrutinizes experimental approaches designed to improve several different aspects of the process of scientific peer review (H/T @RetractionWatch).
  • “The news is a worrying development for the field. CrowdTangle has enabled and shaped years’ worth of coverage of misinformation, especially since Facebook acquired it. The tool offers unique access to trending topics, public accounts and communities, and viral posts on Facebook, Instagram and Reddit that would otherwise be largely inaccessible.” With Facebook once again embroiled in controversy about the propagation of misinformation on the platform, First Draft’s Tommy Shane asks what the apparent sidelining of Facebook’s CrowdTangle analytics tool will mean for journalists and researchers trying to understand and counter the infodemic.
  • “Paid political speech is a valuable channel for democratic debate. While further study is needed, our analysis finds that banning political ads may not yield the promised benefits. We also recommend that the government make it easier for a wider range of platforms, particularly smaller platforms, to run political ads.” A report authored by Scott Babwah Brennen and Matt Perault from the Duke Center on Science and Technology Policy takes stock of the effectiveness of attempts by various social media platforms to limit the spread of misinformation around the 2020 US presidential election and provides some recommendations for the future.

POLICY

A pink ceramic piggy bank with a surgical mask over its face stands next to stacks of coins.
Image credit: Konstantin Evdokimov/Unsplash
  • A study by Kluender and colleagues published this week in JAMA examines the extent and distribution of medical debt in the US from 2009 to 2020: “…an estimated 17.8% of individuals in the US had medical debt in collections in June 2020 (reflecting care provided prior to the COVID-19 pandemic). Medical debt was highest among individuals who lived in the South and in zip codes in the lowest income deciles and became more concentrated in lower-income communities in states that did not expand Medicaid.” (H/T @jross119).
  • “…public health is asking people to sacrifice their personal preferences and interests for the sake of others and sometimes telling people they have to do that; it’s kind of like an enforced altruism. So how does bioethics deal with this? It starts by acknowledging that this is about asking or requiring people to recognize that we are part of a system in which we limit some aspect of our liberties for the sake of others.” At Issues in Science and Technology, William Kearney interviews bioethicist R. Alta Charo about a range of contentious issues, including COVID-related measures, that have recently emerged from the biomedical arena.
  • “Our findings draw attention to the history of research abuses against people of colour in Western psychedelic research. In light of these findings, we urge a call-to-action to current psychedelic researchers to prioritise culturally inclusive and socially responsible research methods in current and future studies.” An article in the BMJ’s Journal of Medical Ethics by Strauss and colleagues revisits a historical episode in which vulnerable populations were subjected to exploitative research involving psychedelic drugs.
  • “…these results provide little evidence that UCCs replace costly ER visits or that they crowd out visits to patients' regular doctors. Instead, the evidence is consistent with the possibility that UCCs—which are increasingly owned by or contract with hospital systems—induce greater spending on hospital care.” A working paper by Janet Currie, Anastasia Karpova, and Dan Zeltzer, available from the National Bureau of Economic Research, examines the impact on health care spending when new urgent care centers open in neighborhoods that previously lacked one.
  • “Researchers and people who run diabetes prevention efforts said participation is low because of the way Medicare has set up the program. It pays program providers too little: a maximum of $704 per participant, and usually much less, for dozens of classes over two years. It also imposes cumbersome billing rules, doesn’t adequately publicize the programs and requires in-person classes with no online options, except during the pandemic emergency period.” A joint Seattle Times/Kaiser Health News investigation by Harris Meyer explores why a Medicare diabetes prevention program designed to help people at risk of developing type 2 diabetes is currently falling short of its full potential – and what CMS is doing to change that.
  • NBC News’ Elisha Fieldstadt reports on the American Academy of Pediatrics’ recent recommendation for a “layered approach” to COVID prevention as school opens in the fall – an approach that includes rigorous masking protocols for staff and students.
