CLOSLER
Moving Us Closer To Osler
A Miller Coulson Academy of Clinical Excellence Initiative

Book review of “The Algorithm Will See You Now”

Takeaway

In this medical thriller set in the not-too-distant future, diagnosis and treatment are performed entirely by AI. The novel serves as a reminder that AI must be used thoughtfully and responsibly.  

Lifelong Learning in Clinical Excellence | January 23, 2024 | 1 min read

By Joan Naidorf, DO

The year is 2035, and AI has been so thoroughly integrated into medicine that life-and-death decisions are made by an algorithm that predicts whether cancer treatment will succeed. What could possibly go wrong? In her novel, Dr. Jennifer Lycette imagines this for us. 

Surgery resident Hope Kestrel works in the oncology unit of the Seattle-based health system Prognostic Intelligent Medical Algorithms (PRIMA). The use of computerized surveillance and ominous acronyms reads like an Orwellian fantasy. The omnipresent robotic Online Speech and Recognition System (OSLR) alludes to Sir William Osler; its voice can be summoned from anywhere in the hospital. 

At PRIMA, diagnosis and treatment are performed entirely by AI and OSLR. Using each patient’s genetic input and prior data within the Algorithm, it is determined in advance whether an individual will benefit from a particular treatment or is instead likely to be a “non-responder.” The goal, of course, is to optimize. 

“AI frees both patients and doctors from the fallacy of choice,” Kestrel proclaims. “The algorithms are more trustworthy than people.” 

Eventually, some of the physicians, including Kestrel, suspect that something is rotten at the core of the Algorithm. It turns out that PRIMA seeks to leverage its technology in a corporate takeover of both regional and national cancer care. The antagonist, hospital administrator Maddox, removes dissenters, pits residents against one another, and strong-arms her way past any threat to the plan. 

Lycette’s multilayered story also explores issues of racial bias in treatment, sexual harassment, and the removal of human decision and interaction in healthcare.  

Here are three things I took away from the novel: 

1. AI is only as good as the data we feed it; we must capture fair, representative data to avoid bias. 

2. Principled and determined healthcare professionals can still make a difference when fighting unethical entities in healthcare. 

3. We cannot guarantee a cure for our patients, but we can always provide comfort.  

This piece expresses the views solely of the author. It does not necessarily represent the views of any organization, including Johns Hopkins Medicine.