There seems to be a great deal of misunderstanding about Medical School. There are valid questions about curriculum, defined as what should be taught, and when, and who should teach it. But recent calls for students to “gain fluency in [health] systems” are completely misplaced. Here’s why.
What do you need to know to be a doctor?
What kind of doctor are you going to be, and what kind of setting are you going to practice in (if you’re going to practice at all)? Because what you need to know to be a self-employed general surgeon in a rural area is completely different from what you need to know as a hospital-employed pathologist. Or a suburban solo family doctor. Or an urban pediatrician. Or an academic rheumatologist.
It’s neither possible nor appropriate for medical school to claim to teach everything every kind of doctor needs.
The good news is that actual medical schools and other institutions involved in medical education really do understand this. As always, it’s the suits, bureaucrats, consumers, politicians, and policy makers who stridently insist that this or that “needs to be taught in medical school,” when most times all they’re doing is taking valuable time away from what really needs to happen in medical school.
What is medical school really about? Two things.
First, you need to learn a huge volume of basic information about the human body, how it works, and how things go wrong with it. That includes anatomy (gross and micro), biochemistry, physiology, pathology, microbiology, and pharmacology. That’s your first two years right there. Most of the other stuff (statistics, sexuality, ethics) is to keep you from going crazy, but the firehose of information is the whole point. Although the testing process makes it seem as if you need to memorize it all, you really don’t. There’s nothing wrong with knowing where to look things up. BUT you need to know about it. That’s the main difference between medical school and PA or NP training (and what leads to the not-knowing-what-you-don’t-know debacle).
Next, you need to learn to apply all this information to the process of dealing with actual humans. This includes learning how to elicit information from your patients (how to take a history, a skill you then spend a lifetime refining), performing a physical exam, and interpreting diagnostic tests and imaging.
Notice that you don’t learn a whole lot about treatment. That’s because treatment is what you learn AFTER you are technically a doctor, in postgraduate (residency) education. That’s where people learn to be whatever kind of doctor they’re going to be. Each specialty uses the material from medical school in different proportions. Surgeons use their anatomy knowledge a hell of a lot more than psychiatrists. Pathologists don’t use as much of their pharmacology as pediatricians. And so on.
I like to say that undergrad teaches you what you need to get through the first two years of medical school. The first two years of med school teach you how to get through the second two years. And medical school teaches you what you need to know for residency, which is where you actually learn to be a doctor.
All this other crap, like “systems” and contracting and payment models, is important to know about, but not in medical school. Valid concerns have been raised about the current structure of med school curricula, but losing sight of its primary aim — learning what you need to know TO BECOME a doctor, not TO BE one — isn’t going to help.