Clinician shortages have been a fact of life for patients for a while, and COVID just served to highlight how dire the situation could get. Oh, and it also made things worse by exacerbating burnout, disenchantment, and stress. The National Institutes of Health reports a loss of over 300,000 clinicians in 2021. That’s a lot.
The Association of American Medical Colleges estimates a shortage of 86,000 physicians by 2036 (and that figure does not include nurses, physician assistants, and other clinicians who help distribute the workload more evenly). Spread over 50 states and five territories, that may not sound like much, but those of us who already wait months for appointments, or who can’t find a doctor when ours retires, can tell you how bad it is now, let alone how bad it will be then.
And my research tells me we kinda did this to ourselves. In 1980, an advisory committee sent the Department of Health and Human Services a report predicting a surplus of doctors.
So, what did we do?
Instead of letting the market take care of it, we, in our wisdom, imposed a 25-year moratorium on both the creation of new medical schools and increased enrollment at existing ones. Both measures were successful. Add in high medical school debt, the obstacles placed in the way of already underrepresented student populations, and dropping salaries, and it’s no wonder the shortage grows every year.
I wonder how loud the yelling would get if 2020 HHS could talk to 1980 HHS.
But I digress.
Shortages in all areas are dangerous but, in my estimation, primary care shortages are the most dangerous, for several reasons.
First, primary care is where we first build trust, if we build trust at all. We may not see our primary care clinicians often – hopefully only once a year – but they are who we turn to when something just isn’t right. We go to them for every cough that won’t go away and every boo-boo that isn’t healing properly, and we call them in the middle of the night when something goes wrong. They are often local – a part of our communities. And every year, they take a look at us inside and out and tell us we’re ok. Or that we’re not.
Second, primary care is where we start. Primary care clinicians have a bird’s-eye view. Ideally, they have copies of our records from any specialists we have seen, so they are tuned in to all the possibilities rather than focused on one system or organ. They know our baselines, so they recognize when something changes – and when the breadth of a change could indicate trouble in more than one system. They can begin treatment themselves or send us to specialists if necessary.
Third, the patient population isn’t static. Americans are aging, which brings more diagnoses. We are also more highly insured since the Affordable Care Act, which means more people are seeking care. And, of course, the technology to diagnose conditions, especially chronic conditions, is getting better every day (yay!). But when more and more of us are properly diagnosed, more and more of us require clinician care (boo!).
Fourth, even if we could self-diagnose, many of us have limited access to the specialists we need. Insurance, or the specialist practice itself, often requires a primary care visit before granting access.
So, demand is rising, and supply is dropping. Sounds like a recipe for disaster to me. The powers that be are aware of the problem – and have been for a while – and the search for solutions is on.
But I am afraid that, without access to a place to start, patients will face delayed diagnoses, which means symptoms that worsen and multiply before they are treated. Or they will skip medical treatment altogether, which means more emergency room visits, a greater likelihood of bounce-back hospitalizations, and higher mortality rates. Bad news all around.
We need to fix this.