I walked into the hospital room of “Gladys,” a 54-year-old woman who, like practically a quarter of ER patients, had belly pain,1 but something didn’t seem right. I’m not trying to be deliberately, annoyingly vague: something I couldn’t articulate kept me in her room.2 Her vital signs looked normal and she wasn’t in terrible distress. Yet I loitered despite the relentless administrative pressure to see more patients, faster.
Gladys and I were chatting about her impending grandchild when, in less time than it takes you to read this sentence, Gladys sat up, vomited a huge quantity of blood, her blood pressure bottomed out, and she lost consciousness. I had to intubate, attempt a balloon tamponade, and initiate massive transfusion protocol—quickly. The ER team was able to stabilize her—barely. Luckily, I was already in the room.
Textbooks can’t describe and standardized tests can’t detect gestalt. Or, as Gen Z would say, “vibes.” A lot of ER docs call it “spidey-sense”—the tingle that whispers: “order a CT scan to rule out a life-threatening diagnosis,” though the patient’s symptoms don’t exactly indicate it. Gestalt is built and honed by seeing thousands of patients. Do emergency medicine for 80 hours a week for three to four years—the length of an ER residency—and you will have spent around 10,000 hours on direct patient care. It’s during those encounters that doctors are (supposed to be) guided towards developing and deepening the fundamental mental models that run in their cognitive background while evaluating each new patient.
But how does a doctor know if their models are accurate or adequate?
Answering that question really means asking whether the education student doctors receive both adequately teaches the fundamentals and teaches resident physicians how to think about and evaluate their own thought processes. And Dan Luu’s “Why don’t schools teach debugging” got me thinking about the way science and medical education universally teaches the fundamentals: badly. As Luu recounts:
When I suggested to the professor that he spend half an hour reviewing algebra for those students who never had the material covered cogently in high school, I was told in no uncertain terms that it would be a waste of time because some people just can't hack it in engineering. I was told that I wouldn't be so naive once the semester was done, because some people just can't hack it in engineering. I was told that helping students with remedial material was doing them no favors; they wouldn't be able to handle advanced courses anyway because some students just can't hack it in engineering. I was told that Purdue has a loose admissions policy and that I should expect a high failure rate, because some students just can't hack it in engineering.
There seems to be a mass delusion in the sciences that someone—not you, but someone—must have, or at least should have already, taught a student the fundamentals by the time they get in front of you, so that you can focus on the interesting, juicy, complex conversations, presumably with the “smart” people who already get it, the smart people who can hack it. But most of the people who get it had to get it somewhere3—why shouldn’t that somewhere be with you?
In medicine, we often mistake the speed of initial understanding for a student’s capacity for mastery. This expectation starts in pre-med courses. Organic chemistry (“O-chem”) is the big pre-med “weed out” course because it both requires high-volume memorization and is one of the first times students have to learn a new way of thinking. In general chemistry, basic chemical equations are algebraic. The process of balancing two sides of an equation works off a mathematical model most students see in high school. O-chem, however, demands that you think in the language O-chem provides: a long list of chemical reactions whose effects you memorize like vocabulary, which you then have to figure out how to use to solve the puzzle of turning one chemical into another. You memorize the facts, then have to think in a new way to understand the fundamentals. You can’t just regurgitate. O-chem problems, much like complex patients, have multiple solutions. That’s why it’s hard.
Learning the language of chemistry to talk about chemistry resembles the way doctors learn the language of the body so they can think about the body. That’s the only logical reason, apart from a de facto IQ test, I can come up with for having to take organic chemistry, because there’s no practical utility for O-chem in day-to-day doctoring. I suspect it’s also a test—one of many—to see if we’re willing to torture ourselves and jump through hoops. Extra points to anyone who asks “How high?” Masochism is heavily selected for in medical students, which is an essay topic for another time.
My O-chem teacher was a lot like Luu’s engineering prof, who assumed that “hacking it” had more to do with inherent student capability than with the quality of the teaching. On the first day of class, he announced that a quarter of the class would drop by the end of the month, and that half of what remained would get a C or lower in a non-curved course. It seems like the sign of a lazy educator to announce that he wouldn’t be capable of adequately explaining the material to his students. Was half the room too stupid or lazy to understand? “Too stupid to understand” doesn’t usually sign up to be 25% of a 315-person O-chem class at 7:30 a.m.
I had a really hard time getting it. Although the professor had office hours three times a week, showing up to them meant trading his ire for his help, which I (and many others) did. He said that, if I was having so much trouble understanding early on, I should leave, because some people won’t get it. Instead, I found an outside tutor who helped me learn how to approach problems and shift to a less linear way of thinking, and O-chem finally clicked. It was as if I could take all the words I’d memorized and finally speak the language, and think in that language, without having to translate. Was I slow on the uptake? Was my professor a bad teacher? Probably. Neither, however, precluded me from eventually getting it, suggesting both that I had the necessary processing power and that someone could teach the material.
Speed of understanding, however, only becomes more important as medical education continues into residency.
A doctor’s foundational clinical mental models are built during residency, but the apprenticeship model of residency has flaws. An attending physician (an attending is a physician who has completed residency) may be a skilled clinician but a poor educator, or may not have the time, patience, or inclination to educate residents. A resident may have only three years in which to gain the practical, clinical knowledge they need to practice independently for the rest of their career. This puts a great deal of pressure on a student. But it should also place the burden of that education on attending physicians, many of whom aren’t given adequate tools and time to teach during their own workdays.
An ER shift usually goes like this: One attending physician oversees an ER “pod” (an area of the department) and between one and three residents. Patients are assigned to that pod, residents assign themselves to patients, and the attending is responsible for seeing and evaluating all the patients while supervising the residents. An ER doctor sees, on average, 2-4 patients per hour over an 8-to-12-hour shift, with multiple patients juggled at once (I’ve cared for more than 20 active patients at a time). Patients who require intensive resuscitation or procedures may need an hour or more of sustained attention while the board (the list of patients assigned to a pod) backs up. Being an emergency physician is about interruptions and fragmented time.
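To make the squeeze concrete, here’s a minimal back-of-envelope sketch in Python, using only the ranges above (the variable names are mine, purely for illustration):

```python
# Rough per-shift patient load implied by the ranges above (illustrative only).
patients_per_hour = (2, 4)   # average ER attending throughput, low and high
shift_hours = (8, 12)        # typical shift length, low and high

low = patients_per_hour[0] * shift_hours[0]    # 16 patients on a light shift
high = patients_per_hour[1] * shift_hours[1]   # 48 patients on a heavy one
print(f"Patients per attending per shift: {low} to {high}")
```

Even at the low end, that works out to roughly half an hour per patient for everything: evaluation, charting, procedures, supervision, and, in theory, teaching. And remember that the attending is also responsible for every patient their residents pick up.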
Even if you haven’t been in an ER lately, stories of worsening overcrowding, longer boarding times, and an uptick in patient visits are all over the news. It's worse than you imagine. ER attendings are beholden to any number of administrative metrics—patient satisfaction, charting completion, door-to-doc times—but especially throughput speed. You have to “move the meat”—ER lingo for getting patients in and out of the department quickly—to try to keep up with that endless stream of patients. Correcting a resident’s incorrect treatment plan only takes a moment, but stopping to interrogate the thought process that led that resident to the wrong answer takes time, during which another patient arrives in anaphylactic shock, someone is bleeding onto the floor, another three patients are vomiting, and a gunshot wound is being wheeled into the trauma bay. Those patients have to come first.
Teaching during a shift interrupts a busy workflow and means that attendings have to trade time completing their charts, for example, in order to teach, which then results in having to stay late or bring work home. Teaching or staying late doesn’t (usually) come with extra compensation, so the motivation needs to be intrinsic.
There are attempts to standardize resident education and overcome the variables that affect teaching on-shift, mostly with weekly “conferences” consisting of educational lectures, the quality of which also varies extensively depending on the lecturer. Let’s just say most wouldn’t be invited onto a TEDx stage. It’s not a bad way to learn the basics and facts: things like the biochemical changes caused by stage III kidney disease, how to perform a simple interrupted suture, or how to calculate cardiac risk stratification scores.
But the fundamentals—which I’m defining as how you use and manipulate those basic facts to solve real problems in real patients, like when to use that simple interrupted suture or whether other factors would recommend a mattress suture, how a patient’s cardiac risk score interacts with other facts to influence a treatment plan, or what to do when your kidney failure patient is coding and you don’t know their complete medication history—are learned mostly in real time, on shift. How to think about facts, how to use facts, is where the art of medicine lies.
Even simulation-lab patient encounters don’t adequately recreate the challenges of rapid decision making in a busy ER, and they tend to focus on common patient presentations. But patients rarely read the textbook and present accordingly. Basic facts are only tools. The facts don’t teach you to think like a doctor, any more than being able to identify a hammer makes someone a handyman.
There are a lot of misaligned incentives in resident education: attendings are judged by metrics of speed and patient satisfaction, residents want to learn and be seen as “good” so they can graduate and be recommended for a job, and hospitals want more patients to be seen, faster.
Because of the relentless, top-to-bottom pressure to move the meat, a “good” resident, a resident who can “hack it,” is a resident who is able to work quickly with minimal risk to patients. Who wouldn’t want to work with a resident whose incentives are aligned with yours? I know I feel more optimistic about the day when I see that I’m working with a resident who moves quickly and can help me do my job more smoothly.
But that means that someone— not me, but someone—must have, or should have already, taught the student the fundamentals by the time they get in front of me, so that I’m not slowed down and can focus on the interesting, juicy, complex conversations with a resident who “gets it.” Residents are smart. They know that’s what’s desirable: already knowing the things that they are really there to learn.
The residents who master material in a way that allows them to work quickly, whether or not they understand deeply, are prized and praised. The problem is that deep understanding usually requires sacrificing speed (initially), and there’s an inevitable bottleneck when someone is laying the groundwork for fluency in a new skill.
How can we tell when a resident is quick and right by luck or guessing—and when they’re quick and right because they understand? We can’t, really, not until the right situation presents itself. And the truth is, doctors can get away with a lot of algorithmic thinking before a patient presents who is complex in unexpected ways, and in ways that might kill them if you get it wrong.
That’s what separates physicians from many other members of the medical team: the training to get away from the algorithm and use a deep understanding to come up with novel solutions. That’s also why algorithmic thinking can be so dangerous. So many patients never bother to read the flowcharts before they arrive, or to present only with the allowed symptoms.
Skill can be confused with speed of mastery, and competence can be confused with confidence, because we want them to be. What I keep coming back to is that so much of science and medical training comes down to the perception of skill, as opposed to actual skill.
For example, when I was in residency, a friend was given feedback that she wasn’t seeing enough patients on shifts. When she asked how many she saw compared to other residents, she was told that they didn’t have the numbers, but they could see she was slow, and she needed to show she could keep up or she wouldn’t be able to hack it. The electronic medical record had a search option where you could pull up the number of patients you’d seen in the last six months. So she looked. And then she looked up the numbers of all the other residents. Out of thirteen residents, she was ranked #6. When my friend brought this information back to the program director, she was told that it didn’t matter what the numbers showed: she gave the perception of being slow and needed to fix it. Her program director wouldn’t even look at the data.
I remember presenting a patient to one of my attendings and saying that, given his clinical picture and my list of possible diagnoses, I wasn’t sure what the best next test would be.
“I want you to be sure,” my attending said.
“Yes, I want to be sure as well, but I’ve never seen this and so I’m not sure.”
“You need to be more confident in your plans.”
I countered that I’d be more confident in my plan if I could discuss the few different plans I was considering and learn which was the most appropriate for the patient and why, so that the next time I saw a similar patient, I’d have a better understanding of why I was doing what I was doing instead of just being perceived as knowing it. That’s when I’d be confident. Plus, I only had three years to get that kind of feedback before I was the one providing it to others. There’s never another time during their career when a doc has the opportunity to run every single patient they see by a more experienced clinician.
I’m as skeptical of residents (and attendings) with too much confidence as of those with none: I don’t want my residents to be too confident, too thoroughly convinced of their own rightness and way. I want them to have the freedom to admit when they don’t know everything. Because they don’t. I don’t, either. In The Name of the Rose, William of Baskerville is a monk but also a proto-detective in the mold of Sherlock Holmes, and when he’s trying to solve a series of increasingly bizarre murders, he tells his sidekick, Adso, that “we mustn’t dismiss any hypothesis, no matter how farfetched.” And so it often is in medicine. Being okay with uncertainty will make both residents’ and attendings’ lives better and, more importantly, patients’ lives safer.
The problem of perception as the most important metric of skill runs through most forms of physician evaluation. At your quarterly review, your boss, who doesn’t directly watch you interact with patients, uses nursing and peer perceptions of your skills, along with certain metrics of speed and patient satisfaction scores, to determine the depth of your knowledge and your ability to care for patients.
There’s no direct, objective observation of your interactions with patients, no discussion of how you break down complex problems, of your philosophy for approaching new patients, or of what you do when you’re faced with an unfamiliar problem, all of which would give a much deeper understanding of who someone really is as a physician. When you need a new job, you’re required to get letters of recommendation from colleagues who have also never watched you interact with patients directly, and who don’t have any idea what kind of doctor you actually are, just what kind of doctor you appear to be.
This is a systems problem. Most physicians who are hired to work in residencies didn’t get formal training on how to educate. They happen to work at a site that has residents, and so they have to teach. We base how we teach on how we were taught. We praise for what we were praised for. I’m not immune. I catch myself doing it, too. We teach our residents how to succeed in a system where a doctor’s success and a patient’s successful care don’t always spring from the well of thought.
Incentivizing deep learning and deep thought means reducing the time pressure on both attendings and residents. If hospitals valued people over profits, they’d hire more attendings to both see patients and supervise, spreading out both the patient-care and educational workload. The existing argument that this is cost-prohibitive is laughable. For example, David Reich, CEO of Mount Sinai Hospital in NYC, made $1,808,577 in 2023 (excluding bonuses, which can be impressive). According to Glassdoor (and on par with my experience), the salary for a full-time ER physician in NYC is between $200k and $275k a year.
The Centers for Medicare and Medicaid Services is the primary source of graduate medical education (residency) funding. Per the Graham Center’s interactive GME data tool, Mount Sinai received around $175k a year per resident and pays them a salary of $84,479 a year, leaving about $90k to pay for their “education.” Remember that ER patients are being billed, and physician pay comes out of hospital profit. There should be plenty of room to hire a few additional physicians, if administrators stopped to remember that the residents being trained will one day be the attendings caring for them.
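If you want to check my math, here’s the same arithmetic as a minimal Python sketch, using only the figures cited above (the variable names are mine, purely for illustration):

```python
# Back-of-envelope residency-funding arithmetic, using the figures cited above.
gme_funding_per_resident = 175_000   # approximate CMS GME dollars per resident, per year
resident_salary = 84_479             # Mount Sinai resident salary, per year
leftover = gme_funding_per_resident - resident_salary
print(f"Left over per resident for 'education': ${leftover:,}")  # $90,521

ceo_pay_2023 = 1_808_577                     # CEO compensation, excluding bonuses
attending_salary = (200_000 + 275_000) / 2   # midpoint of the Glassdoor NYC ER range
print(f"ER attendings fundable on one CEO salary: {ceo_pay_2023 / attending_salary:.1f}")  # ~7.6
```

One CEO salary, in other words, is roughly seven to eight full-time ER attendings’ worth of patient care and teaching.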
Until incentives align, and the hospitals reward and pay physicians for doing the work of educating in addition to their clinical work; until teaching attendings have adequate training on how to educate; until hospitals are willing to staff adequately so there’s time to teach, the system will remain broken.
Part Two (coming soon): How can we teach our students (and ourselves) to think better within the system we have?
If you’ve gotten this far, consider the GoFundMe that’s funding my husband Jake’s ongoing cancer treatment. Essays and archives are not paywalled, but your support gives us more time to focus on both writing and each other, which we appreciate!
If you enjoyed reading, let me know by giving the heart button below a tap, commenting, sharing, and subscribing, if you don’t already.
1. Though it feels to docs like everyone has belly pain, that is not, strictly speaking, true.
2. ER docs are tracked and judged on the amount of time we spend with each patient, so no one takes extra time in a patient room needlessly.
3. Obvious exceptions can be made for the true statistical IQ outliers. But most people aren’t Paul Erdős, who began doing mathematics at the age of three, although, like Erdős, many medical students and residents are fueled by amphetamines. You can thank Freud for that. Incidentally, Erdős is responsible for one of my favorite science quotes: “A mathematician is a machine for turning coffee into theorems.”