Saturday, March 30, 2013

National Doctors' Day

When I began my career in medicine, there was no Doctors' Day.  At least that's the way I remember it.  But I do like it. Over the years the hospitals where I've practiced have marked the day by doing nice things to express appreciation for the doctors on staff.  These have included a free breakfast or some small gift.  The fact that the gift was the kind one might get "free" with the purchase of a fragrance at a department store - I have collected a few gym bags over the years - was always reassuring, in that I wouldn't have wanted them to spend the hospital's limited funds on anything extravagant.  I recall one year when people who worked in the hospital's upper management came to work in old clothes and washed the cars parked in the doctors' parking lot.  It was a small hospital with a small number of doctors on staff, and the gesture was personal and much appreciated.

So this year, when I got a very nice compact umbrella, I tried to recall how far back this custom went and decided to look it up.  It appears March 30, 1991 was the first National Doctors' Day.

What does it take to establish a national day (or week or month) for something? As you probably know, it's an Act of Congress.  Specifically, this was a Joint Resolution "to designate March 30, 1991 as National Doctors Day."

Whereas society owes a debt of gratitude to physicians for the contributions of physicians in enlarging the reservoir of scientific knowledge, increasing the number of scientific tools, and expanding the ability of health professionals to use the knowledge and tools effectively in the never-ending fight against disease; and
Whereas society owes a debt of gratitude to physicians for the sympathy and compassion of physicians in ministering to the sick and in alleviating human suffering: Now, therefore, be it
    Resolved by the Senate and House of Representatives of the United States of America in Congress assembled, That--
    (1) March 30, 1991, is designated as `National Doctors Day'; and
    (2) the President is authorized and requested to issue a proclamation calling on the people of the United States to observe the day with appropriate programs, ceremonies, and activities.

[For those of you who have trouble with apostrophes, that's supposed to be a plural possessive, and the way it's spelled in the title of this essay is correct.  The only argument supporting the official spellings "Mother's Day" and "Father's Day" is that each of us has only one mother and only one father.  While that's true biologically, our biological parents aren't necessarily the ones we're honoring on those days, and with blended families and same-sex couples....  But I digress. The American Medical Association spells it correctly.  The original spelling as found in the Congressional Record is incorrect.  What a surprise.]

Crawford W. Long, M.D.
Why March 30th?  The answer seems to be Crawford Williamson Long, MD.  Dr. Long was a north Georgia physician credited with the first use of ether as a medical anesthetic on March 30, 1842.  The first Doctors' Day was observed in Georgia on March 30, 1933.  The US House of Representatives adopted a resolution commemorating March 30, 1958 as Doctors' Day.  But it was not until 1990 that the above-cited Joint Resolution, signed into law by George H.W. Bush, made it official for 1991.

This is very fitting.  The medical profession is devoted to alleviating suffering, and Dr. Long's pioneering effort had the purpose of enabling doctors to perform therapeutic procedures for patients without inflicting pain.

The Congressional Record is surely full of designated national days.  However, a fair amount of Web searching this morning failed to turn up a compiled list of them. There are plenty of Web sites that have lists, but they all seem to include many that are invented, whimsical, or just plain silly.  Not that Congress has never done that sort of thing, but I couldn't find a list of days that had been officially designated by Congress.  Maybe I just wasn't using the right search terms.

But why one for doctors?  Are they under-appreciated?  Remember, we're talking about a profession that is highly respected and whose practitioners are financially rewarded for the many years of education and training required to become physicians, and for the many additional years of their lives they devote to saving lives, restoring health, helping patients manage chronic disease, and comforting the afflicted.  Don't they feel sufficiently appreciated without a special day to honor them?

Maybe they do.  But on National Doctors' Day I think about one group of doctors who definitely are under-appreciated.  The new ones.  The young physicians still in training.  The ones who have graduated from medical school and are in graduate medical education programs learning their chosen specialties.  It is my privilege to work with them every day at an academic medical center, where I see just how under-appreciated they are.  They work more hours than anyone else in the building.  They have a high-stress job for which they are under-paid.  They never have enough time to sleep, to get a little exercise to stay fit and healthy, to spend with their families, or just to relax and read a non-medical book once in a while.  They literally (and I mean that in the correct sense of actually, truly, and really) exude blood, sweat, and tears in putting forth the effort required of them to care for their patients.  And yet they are, all too often, treated - by patients, by nurses, and by their teachers - as though they are not yet "real doctors."

Will we have a national day to honor and appreciate them?  Unlikely.  We just expect them to wait, as they do for everything else, until they have completed their training and have arrived at the point at which they will be considered "real doctors."  But the doctoring they do each day they come to work is very real. And I want them to know they are appreciated now.

National Doctors' Day is for you, my friends.  You know who you are, and you know you earn this appreciation every day.


Tuesday, March 26, 2013

Shorter Work Hours for Medical Interns Harming Patients?

Can health reporting in the popular press get any worse?  Every time I think it has reached a new low in accuracy and reliability, somebody comes along and blows that last new low right out of the water.

I have a dear friend who runs a news clipping service.  From her base as a teacher of emergency medicine in Saginaw, Michigan, she posts links to articles about issues in our health care system, some from trade publications (e.g., Health Affairs), many from the popular press.  Knowing, as I do, that Kathleen is esteemed by her colleagues in academic emergency medicine and beloved by her trainees, when she posts something, I am much inclined to read it.

Today there was a link to an article in USA Today that said medical interns working shorter hours were making more mistakes.  Citing a study published in a major medical journal, the article reported, "Most concerning: Medical errors harming patients increased 15% to 20% among residents compared with residents who worked longer shifts."

I will admit to a dual bias here.  First, I am naturally skeptical about anything that is counter-intuitive, and I was once a medical intern whose quality of decision making unquestionably deteriorated with sleep deprivation.  Second, when I read something in the popular press about a medical study that doesn't make sense, I naturally assume that the medical reporter got it wrong.

In 1984 Libby Zion was a freshman at Bennington College.  She got sick and was admitted to a New York hospital.  A serious error was made in her medical care, and she died the next day.  The error was thought to be due, at least in part, to long hours worked by medical residents.  Sleep-deprived doctors are more likely to make bad decisions, or so thought the Bell Commission, whose recommendations to limit the hours of doctors in training were adopted by the state.  Similar restrictions were adopted by the Accreditation Council for Graduate Medical Education (ACGME) in 2003.

Then in 2011 the ACGME further modified the work rules.  One of the changes was to limit shift length for first-year residents (sometimes called interns) to 16 hours.

A study just published in the American Medical Association's internal medicine journal examined the effects of this recent change by surveying medical interns before and after the 2011 change in the rules.  Interns were asked about their work hours, how much sleep they were getting, their overall state of well-being, whether they were having symptoms of depression, and whether they thought they were making mistakes on the job.

One would expect, intuitively, that doctors who don't have to work as many hours, and who don't have to work more than 16 hours at a stretch (which in most other lines of work is called a "double" shift), would get more rest and have an overall improved sense of well-being.  If the Bell Commission was right, they would also make fewer mistakes.

So one of the findings of this study was a bit of a surprise.  The interns thought they were making more mistakes after the new rules took effect.

Notice I didn't say they actually were making more mistakes, or that they were making more mistakes that caused harm to patients.  The authors of the study didn't say that, because that isn't what their findings showed.  But that didn't keep the writer for USA Today from saying that.

[Is she a complete idiot?  Is this all about selling newspapers?  Both?]

So what did the authors find?  Well, to begin with, you have to know what they asked.  Interns were asked, on a survey, whether they were concerned about having made any major medical errors in the preceding three months.  Before the change in work rules, 19.9% said yes; after the change, 22.3% said yes.  So, first, although this change reached statistical significance, it's a pretty small change.
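For those of you who like to see the arithmetic, here is a little sketch in Python of how a difference that small can still clear the bar of statistical significance when the survey groups are large enough.  The group sizes below are hypothetical placeholders chosen only to illustrate the point; they are not the study's actual numbers.

    # Sketch: a small difference in proportions can still be "statistically
    # significant" when the groups are large.  Group sizes are hypothetical,
    # not taken from the study.
    from math import sqrt, erf

    def two_proportion_z(p1, n1, p2, n2):
        """Two-sided two-proportion z-test; returns (z, p_value)."""
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p2 - p1) / se
        # two-sided p-value from the standard normal distribution
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    # 19.9% concerned about errors before the rule change, 22.3% after.
    print(two_proportion_z(0.199, 2500, 0.223, 2500))  # z about 2.1, p about 0.04
    print(two_proportion_z(0.199, 500, 0.223, 500))    # z about 0.9, p about 0.35

With 2,500 interns in each group, a 2.4 percentage point difference squeaks under the conventional p < 0.05 threshold; with 500 in each group, the very same difference does not come close.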

Second, this is a matter of perception.  They were more concerned about having made major medical errors.  That doesn't mean they actually had made more errors.  It also doesn't mean they made errors that harmed patients.  No one asked them to sit down at the end of every shift and write down whether they thought they had made any serious mistakes - and what they were, so a researcher could read it and decide whether it really was a serious error.  The fact that this is a perception is very important, because in recent years doctors have been hearing a lot about medical errors and the importance of identifying and reporting them in order to learn from them and reduce their incidence.  It only stands to reason that in this climate, the perception of the occurrence of medical errors might increase, over time, without a change in the reality.

Let us imagine that the perception and the reality do have some connection. What might be the explanation for that?  Well, first, the medical interns might be expected to complete the same amount of work in fewer hours.  That is likely to increase their level of stress and contribute to error.  Second, if shifts are shorter, there will be more changes of shift, meaning more handing off of responsibility for patient care from one doctor to another.  Hand-offs are associated with error.  We know this.  It is not only intuitive, but we have abundant evidence to prove it.  And we're working on that.  We are devoting a lot of time and attention to making hand-offs better.  You see, if hand-offs are of high quality, there is reason to believe they can increase, rather than decrease, the quality of care - and reduce, rather than increase, the frequency of errors.  If hand-offs are well done, they provide an opportunity for two doctors to engage in a discussion of a patient's case and learn from each other.  Do you think that discussion will be more fruitful if the doctor passing the baton to her colleague is finishing a really long shift, or a shorter one?  The doctor finishing the longer shift is mentally exhausted and just wants to get the heck out of the hospital and go home to sleep.

Certainly if we are going to have hospital-based trainees working fewer hours overall, we have to recognize and deal with the consequences.  Unless we want them to have to complete the same amount of work in fewer hours, we have to have more of them, and we have to have ways of limiting the quantity of work, not just the number of hours.  And we are working diligently to accomplish that.

We also have to realize that if a doctor is in training for five years after medical school, and we say he cannot work more than 80 hours per week, he's not going to get as much experience in those five years as someone who has worked 100 or 120 hours per week.  There is a famous story in which the chairman of the department of surgery at a major medical school says that the disadvantage of being on call (which means staying in the hospital around the clock) every other night is that you miss half the interesting cases.  But I can tell you from my own experience as a trainee, and from many years as a teacher, that one does not learn well when one is not adequately rested.  I have concluded that, in medical training, the concept of "quality time" is very real.

Part of my keen interest in this subject is that I care deeply about the future of my profession, and I want to make sure the training that future generations of doctors get is optimal for the acquisition of knowledge and skills.  And part of it is paternalistic, in the sense that my current trainees are the same age as my children, and I care about their well-being as I would if they were my own.

So when a health reporter for a major news outlet gets something so completely and inexcusably wrong, I am beyond exasperated.  What, I ask you, will it take for them to do what they must to get the facts right?


Thursday, March 14, 2013

Get Your Snake Oil Here!

An earnest, ruggedly handsome, middle-aged male looks into the camera.

"My wife takes Dr. Bob's Snake Oil every day.  I haven't been, though.  Just not convinced."

A thoughtful look comes across his face.

"Then I found out about a study on the long-term health benefits of snake oil.

"It turns out Dr. Bob's Snake Oil is what they used in the study.
"I guess my wife was right."

Of course his wife was right.  That goes without saying.  Wives are always right - oh, and yes, the corollary that husbands are always wrong is also true - especially when it comes to matters of health.  That's just the way things are in our society.  Families depend on wives and mothers to make the right decisions about our health.

Remember, "choosy moms choose Jif."  They could never have made that commercial work as well by saying kids like it better.  No, Jif is a peanut butter of clearly superior quality, so it only makes sense that it would be the peanut butter of choice for choosy moms.  If you don't care about what your kids put into their stomachs - which (gasp!) could be some other brand, like Skippy, that just doesn't measure up to the standard of choosy moms - then buy any old kind.  But if you're a choosy mom....

So our friend the commercial actor realizes that his imaginary (or at least off-camera) wife must be right because someone did a study.  He doesn't say who did the study.  Nothing about how it was done, whether it was done in such a way that it could have proved anything at all, or what possible long-term health benefits were being measured.  No, our friend was convinced by the mere fact that someone did a study.

This seems patently ridiculous.  But it is exactly the sort of thing we read and hear in commercial advertisements all the time.  A product is described as being "clinically tested."  That sounds great if you overlook the fact that we're not being told anything about the results of the clinical testing.

Quite often things that have been "clinically tested" are shown to be completely worthless.  Sometimes they are shown to be harmful.  If I told you that getting smacked in the side of the head with a 2x4 has been "clinically tested," would you assume that means it's good for you?  I hope not.  I hope you'd be thinking, yeah, it's been clinically tested and found to be a very effective way to get a concussion - or at least a headache.

Of course some things we believe are good for us have not been proven helpful in clinical trials.  A favorite example used by many of us who are ardent advocates of practicing medicine based on scientific evidence is the parachute. No one has done a study to prove that parachutes are good for you when you jump out of an airplane.

Imagine the recruitment pitch: We're doing a study to figure out whether parachutes are good for you.  We are recruiting subjects.  You will be assigned at random to jump out of an airplane, either with or without a parachute.  Doing such a study is ethically acceptable only if we really don't know which is better.  It might seem intuitively obvious, but the fact is no one has ever really studied it.  Can we sign you up?

Some studies will not be done, and that's OK.  But if a study is done, that tells you nothing unless you know the particulars of how it was done and what it showed.  It is hard to imagine how our friend in the television commercial for Dr. Bob's Snake Oil could be so clueless.

Well, here's how.  Not only is his wife a long-time, loyal consumer, but the product isn't Dr. Bob's Snake Oil.  It's Centrum Silver, a daily multivitamin "for people over 50."  So the proposition that it is worthwhile seems quite plausible.  Not as intuitively appealing, perhaps, as using a parachute when you go jumping out of a plane, but certainly plausible.

But what does this commercial tell us?  It tells us that someone decided to do a study to determine whether there might be long-term health benefits associated with taking a daily multivitamin.  It tells us nothing about what potential long-term health benefits were considered.  It also tells us nothing about whether the possibility of long-term harm to health was also considered and whether the study was designed to detect any of that.  It tells us nothing about whether any health benefits were actually found - only that someone looked for them.

How many in the television audience decide to investigate such claims for themselves?  I'm guessing the number is small.  And you're guessing the number includes me.  Yep.

It turns out that this was the Harvard Physicians' Health Study II.  This study was actually pretty decent science, at least as far as I have been able to determine. About 15,000 mostly healthy male doctors aged 50 or older were recruited to participate in a decade-long study to see if taking vitamins was beneficial to health.  One arm of the study randomized participants to taking either Centrum Silver or placebo.  At the end of the study, a statistically significant difference was found.  For every 1,000 men in the study, there were 17 new cancers detected among those taking Centrum Silver and 18 among those taking placebo.

Other outcome measures included cardiovascular disease, cataracts, macular degeneration, and "early cognitive decline" (dementia).  No differences were reported for those.  Just as important, the study did not detect any harm caused by taking the multivitamin - because no one looked for harm.

If you have an inclination toward thinking about numbers, you probably already realize that the difference of one cancer per 1,000 participants in the study seems very small - an absolute difference of one-tenth of one percentage point, which (taking the numbers at face value) works out to roughly a thousand men taking the vitamin for the duration of the study to prevent a single cancer.  For those of you who really like math, read the next bit for additional insight.  Everyone else can skip ahead.

We often talk about the risk of something happening in terms of what is called a "hazard ratio."  If the hazard ratio is 1.0 between two groups, then there is no difference, because the likelihood of something happening is exactly the same in one group as the other.  If the hazard ratio is less than one for the treatment group, that's better (or lower risk).  If it's more than one, that's worse (or higher risk).  For any hazard ratio, or any other way of reporting differences, we use what statisticians call confidence intervals.  That's a way of expressing how much the result might have been swayed by chance - how different it might plausibly look if we could do the study again on a much larger group of subjects.  As you can imagine, differences are much more meaningful when they come from bigger studies.  If you flip a coin twice, and you get one head and one tail, that doesn't give you nearly so much confidence that this is a 50-50 proposition as it would if you flipped it ten thousand times and got 5,000 heads and 5,000 tails.  So we calculate a 95% confidence interval, which means, roughly, that if we repeated the study many times, about 95% of the intervals calculated this way would contain the true answer.  So for this difference between 17 and 18 cancers per thousand subjects, the hazard ratio was 0.92.  The confidence interval was 0.86-0.998.  Notice that the upper end of the 95% confidence interval is so close to 1.0 (no difference) that the calculation had to be carried out to three decimal places to show that it was ever so slightly less than 1.0.
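If you want to see the coin-flip point in actual numbers, here is a little sketch in Python.  It uses the simple normal-approximation ("Wald") interval for a proportion, which is only a rough stand-in for the survival-analysis methods used in the actual study, but it shows how the interval tightens as the sample grows.

    # Sketch: how a 95% confidence interval narrows as the sample gets bigger.
    # Uses the simple normal-approximation interval for a proportion - an
    # illustration of the idea, not the method used in the actual study.
    from math import sqrt

    def wald_ci_95(heads, flips):
        """Approximate 95% confidence interval for the true probability of heads."""
        p = heads / flips
        margin = 1.96 * sqrt(p * (1 - p) / flips)
        return p - margin, p + margin

    print(wald_ci_95(1, 2))          # roughly (-0.19, 1.19): tells us almost nothing
    print(wald_ci_95(5000, 10000))   # roughly (0.49, 0.51): hugs 50-50 tightly

The study's hazard ratio works the same way: its interval, 0.86 to 0.998, excludes 1.0 (no difference) only by a whisker.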

This study was published in the Journal of the American Medical Association (JAMA) last fall.  Pfizer, the maker of Centrum Silver, had its marketing campaign all set to go.  To its credit, Pfizer's television commercial - the one with the earnest fellow admitting his wife was right - says nothing about reducing cancer risk.

I think I know why the one difference that barely - just barely - achieved statistical significance is not being touted by the advertising.  It may be entirely illusory. You see, as it turns out, among the study participants, there were some who had a history of cancer already when they enrolled.  And the very modest reduction in risk of subsequent development of cancer was seen in that group.  If one looks at the others, with no history of cancer, the difference in risk of subsequent development of cancer disappears.

Why wouldn't Pfizer tell us any of that?  Well, if you say taking their vitamin helps just a little, if you've already had cancer, to reduce the risk of developing further cancer, but otherwise not, then you have a much smaller target market.  And the one thing you definitely don't want to say is that this study showed no difference in cancer mortality.

So what is the bottom line on this study?  You knew it when you started reading: snake oil.


Friday, March 8, 2013

Health Care as a Right

The Constitution of the United States of America has a Bill of Rights.  A careful reading shows that it enumerates rights of the people that may not be infringed by the Congress.  For the most part, that barring of infringement has been extended to the states by the Doctrine of Incorporation.  There are some guarantees of rights in other parts of the Constitution, and like those mentioned in the Bill of Rights, these are protected from infringement by the government.

So, by and large, there are no "positive" rights.  You do not have a right to food, clothing, or shelter.  This is because we have no governing document that requires anyone to provide anyone else with these things.

In fact, you don't even have most of the rights listed in the first ten amendments to the Constitution, at least not in any absolute sense.  If you are in your workplace, having a conversation with your boss, and you call him a name that is insulting, offensive, and not acceptable in polite company, you may be fired.  That does not violate your right to free speech.  The First Amendment says, "Congress shall make no law ... abridging the freedom of speech," not that you can mouth off to your employer without fear of consequences.

Constitutional rights are mainly restrictions on things government can do that would interfere with your freedoms.  What about the Declaration of Independence, which says we have certain "unalienable" rights that include life, liberty, and the pursuit of happiness?  One could argue that a right to life includes a right to basic health care.  Yeah, good luck with that argument.  And the Declaration is not a governing document for this nation.  It is simply an explanation of why we decided we would no longer be part of the British Empire.

With one important exception, there is no right to health care.  You can buy health care.  No one is obligated to give it to you free.  With one (temporary) exception.

Back in the 1980s, lawmakers in our nation's capital became aware that some hospitals were refusing to provide emergency medical care to patients who lacked the means to pay for it, and some people had bad outcomes, including death, because of that.  If you went to a private hospital and lacked the ability to pay, you could be turned away and told to go to a public hospital (meaning one that was run with public dollars), where your lack of money would not be an obstacle.  If that public hospital was many miles away, and you might die before you got there, that was just too bad.

Congress enacted a law to put a stop to that.  It later became known as the Emergency Medical Treatment & Labor Act (EMTALA).  It created certain obligations for hospitals.  If you go to a hospital emergency department (ED), you are entitled under the law to a medical screening examination (MSE), which means whatever it takes to figure out whether you have an emergency medical condition (EMC).  (EMC has a statutory definition, which means it's not an emergency just because you say it is.  A woman who is in active labor is covered by this law, which is why that's part of the name.)  If you have an EMC, the hospital must provide stabilizing care.  And hospitals must do this without regard for your ability to pay.  That doesn't mean they cannot bill you later.  EMTALA does not create a right to free health care.  It just means a hospital must provide an MSE, and stabilizing treatment if you have an EMC, before asking you to pay.

If your health problem is not an EMC, no one has any obligation to provide you with anything beyond the MSE (which determines that you have no EMC).  You're on your own.

Nowadays, everybody knows about this right, because hospitals are required to post signs about it, and at least two high-profile politicians (George W. Bush and Mitt Romney) have said poor folks are not without health care, because they can just go to a hospital ED.

But not everyone understands that this right, at least according to the law, applies only to emergencies.

Recently a colleague (and former resident) of mine posted a Facebook status update that said, "I'm feeling a strong urge to define the word emergency this morning!"

As I've said, EMC has a statutory definition.  She knows that.  A new definition is not needed.  But what is needed, in her view, is some way of making patients aware that the word "emergency" on the sign outside her department has a definition, and that the ED is for emergencies.

Those of us who work in hospital EDs have long been bemused by the fact that some people, when they see the words "emergency department" on the sign, read instead "24-hour, primary care, walk-in free clinic."

The EMTALA sign tells people what they are entitled to when they come to the emergency department.  It tells them about their right to an MSE and their right to stabilizing care if they have an EMC, and their right to be transferred to another hospital that can provide the level of care they need if this one cannot.  Nowhere on the sign does it say that they are entitled to nothing if, once they've had their MSE, it is determined that they don't have an EMC.  The sign does not say, "If your problem is not an emergency, you are entitled to exactly nothing."  The sign also does not say, "Nothing this sign says means the hospital cannot, or will not, send you a bill."

There are many reasons people seek care in EDs for problems that even they readily recognize are not emergencies.  These include lack of access to primary care, lack of financial resources, and the 24-7 convenience of the ED, among others.  And sometimes they have fears or anxieties or lack of knowledge that keep them from understanding when something isn't really an emergency.

When medical students decide to pursue post-graduate (residency) training in emergency medicine, it's not because they yearn to spend their days caring for people with colds, sore throats, and strained lower backs.  They want to take care of patients with heart attacks, strokes, and life-threatening infections, and people who've been in car crashes, or fallen off roofs, or been stabbed or shot.  In other words, they want to snatch people from the jaws of death.

Not long ago there was a public education campaign in New Zealand about how emergency departments are for emergencies.  Patients there have excellent access to primary care, with extended hours.  They are strongly encouraged to use it when the problem is not an emergency.

We could do that in the United States if we were determined to build a health care system with a sound infrastructure, which requires robust availability of primary care.

Oh, and one other really important thing: New Zealand was one of the first countries in the world to provide universal health care.  Will the United States be the last?  I think we already know the answer to that question is yes.  The remaining question is how many more decades will pass before we do the right thing.