In today’s Wednesday Roundup: artificial nerves for robots, using AI for clinical documentation, the need for (computational) speed, a new look at an old psychological experiment, AHRQ’s National Guideline Clearinghouse to be shuttered, and much more.
Where AI and Hardware Meet
- Teaching robots to (literally) feel? SingularityHub has an overview of research on creating artificial nerves for robots and prosthetics, including this work recently reported in Science.
- The Economist reports on hardware “bottlenecking” for AI applications – and the resulting growth of interest in developing specialized chips capable of handling the computational challenges. Meanwhile, the US reasserts its claim to the world’s fastest computer, capable of 200 petaflops.
Data Science: Methods and Applications
- “This reimagining of clinical documentation, powered by AI, fundamentally alters the relationship between the physician and the EHR and moves the industry closer to the true spirit and vision of ‘meaningful use.’” In Mayo Clinic Proceedings, Lin and colleagues look at the promise and complexities of using AI to lighten the burdens of clinical documentation.
- A post on Medium reports on the initial experiences of a joint IBM-Geisinger Health System project that applied machine learning to build a predictive model for sepsis risk.
- Complicated data in psychiatry: Mental illness is a function of both genetics and exposure, but how each contributes and the directions of causality are not always apparent. In this article in JAMA Psychiatry, Guloksuz and colleagues discuss the challenges of modeling risk for mental illness.
- In a preprint available from arXiv, O’Sullivan describes a device designed to make EEG readings more accessible in the NICU, using AI to translate the visual EEG signal into interpretable audio feedback.
- An article published in JAMA as part of its Guide to Statistics and Methods explores the use of permuted blocks and stratification in randomized trials.
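For readers unfamiliar with the method discussed in the JAMA piece, a minimal sketch of permuted-block randomization (with stratification handled as one independent sequence per stratum) might look like the following. The function name, parameters, and strata labels are our own illustrations, not taken from the article.

```python
import random

def permuted_block_sequence(n, block_size=4, arms=("A", "B"), seed=0):
    """Generate a treatment-allocation sequence using permuted blocks.

    Within each block, every arm appears equally often, so group sizes
    stay balanced throughout enrollment. Illustrative sketch only.
    """
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)
    per_arm = block_size // len(arms)
    sequence = []
    while len(sequence) < n:
        # Build one balanced block, then shuffle it so order is unpredictable.
        block = [arm for arm in arms for _ in range(per_arm)]
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[:n]

# Stratification: keep a separate randomization sequence per stratum
# (e.g., study site), so balance holds within each stratum as well.
strata = {site: permuted_block_sequence(8, seed=i)
          for i, site in enumerate(["site1", "site2"])}
```

Because each block is internally balanced, the two arms can never drift more than half a block apart at any point in enrollment, which is the property the trial-design literature is after.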
- Your algorithm might be biased (and it's your fault). If you train a machine with stereotypical data, it will learn (and perpetuate) stereotypes. An article from the Princeton Alumni Weekly explains why it pays to be careful with training data and explores how to avoid this problem.
Media and Communication
- In the age of fragmentation, viral disinformation, and growing skepticism of institutions, how can we restore trust in news? Nieman Lab takes a look.
- You know all about the Stanford Prison Experiment. Or do you? On Medium, author Ben Blum unwinds the tangled history and mythology surrounding the famous psychological study.
- Social media has opened a window onto the intersection of the personal and the professional as physicians increasingly share candid accounts of their own challenges and frustrations. In BMJ, ethicist Daniel Sokol questions whether it might be better to maintain some boundaries. The question is particularly fraught at a time when concerns about burnout and mental and emotional pressures on physicians are gaining wider attention.
- More from Nieman Lab: media transparency organizations DocumentCloud and MuckRock are merging.
Clinical Care and Public Health
- In the American Journal of Preventive Medicine, Moran and colleagues report on surges in soda marketing that coincided with periodic payment of SNAP benefits. The Washington Post has additional context.
- “Home-time represents a novel, easily measured, patient-centered endpoint that may reflect effectiveness of interventions in future HF studies.” A recent study published in the Journal of the American College of Cardiology leverages registry and claims data to advance a novel endpoint for heart failure.
- “…the reinvention of cancer therapy needs time, patience and diligence — and, yes, skepticism.” Siddhartha Mukherjee argues in the New York Times for a more expansive view of “precision medicine” in cancer treatment.
- A clearinghouse for clinical practice guidelines maintained by the Agency for Healthcare Research and Quality is being shut down, STAT News reports.