Wednesday, December 4, 2013

Truth and the Internet

Everyone who travels by air likes to whine about the experience.  It's mostly about the TSA, but sometimes it's about the behavior of other passengers.  I wrote about that myself a couple of months ago.  Recently there was a Twitter-based account of the interaction between a fellow who works as a producer on an ABC network television show and a fellow passenger (Diane from 7A).  The story was later apparently outed as entirely fictional by its author.  Perspectives on that ranged from amusement to annoyance to musings on the assumptions people make about the veracity of what they read online.

This brought to mind the many conversations I have had with friends and relatives about our reading habits.  Not about reading online, or newspapers or magazines - but about our selection of books.  My reading of books is exclusively nonfiction, mostly history, and especially US political history.  I favor biographies, especially those of important figures in American history, and even more particularly American presidents.  I find history fascinating.  However, I have learned that even the most scholarly works of historiography involve a prism.  I am looking at something through the eyes of another human being.  I am on my third biography of Lincoln, and the three biographers see him in ways that are significantly different from each other.  Getting such varied perspectives affords me insight into the art of interpreting the historical record.

I find history much more interesting than fiction.  My sense is that, if I were reading a novel, and found myself becoming engrossed in the events of someone's life, I would stop every so often and think, oh, this didn't really happen (except in someone's imagination).  The implication inherent in that thought is that, because it didn't really happen, it doesn't matter.

So the story of "Diane from 7A" got me wondering about the way people look at such things.  Does it matter if the account of the interaction between two passengers on a plane as related via Twitter really happened? If you found it funny, is it any less funny if it didn't really happen?  My experience with funny stories people tell each other in conversation is that if something is truly amusing, they may want to know whether that really happened or was just made up, but not because it is any less funny if it's fictional.  No, it seems the reason they ask is so they will know whether the person telling the story has a great sense of humor.  After all, which is the person you want to be around: the person who has a few true funny stories (because, after all, most of us don't have funny things happen around us very often), or the person who takes ordinary experiences and weaves tales around them for everyone else's amusement?

When I read something from a conventional news outlet, whether in print or online, I expect it to be factual.  The reporters and editors have a responsibility to their readers to get the facts right and to report them as objectively as possible. But just about everyone else is, in my view, entitled to some literary license, or at least to an individualistic interpretation that may be strongly colored by an array of biases.

Tom Hulce as Mozart in the film Amadeus
When I was in college, I took three music courses.  Two of them were on specific periods (Classical and Romantic) and were taught by the same professor.  The term paper assignment was to select a composer and write about his life and works.  For the Classical period, I chose Mozart.  This was before the Internet, so I went to the big Free Library in downtown Philadelphia and read all of the biographies of Mozart I could find.  Some years later the film "Amadeus" appeared in theaters.  The story was related, at least in part, from the perspective of rival composer Antonio Salieri.  Even so, I found the portrayal of the title character jarring.  I reflected upon my readings from the decade before, when I wrote my term paper.  I looked at the film's protagonist and couldn't help thinking, "That's not the Mozart I knew."  But this was a movie, not a documentary produced for the BBC, and not a history-for-television-audiences such as Ken Burns might have produced.  The fellow who wrote the screenplay and the film's director are surely entitled to all kinds of license.  I thought they portrayed Mozart as a silly ass, and I didn't like that, but then I didn't really know Mozart.  I knew him only as his biographers depicted him, and traditionally biographies have been written by admirers.

As a lad, when I went to a public library and saw books neatly characterized as fiction or nonfiction, I thought life was just that simple.  Either it really happened, just as described, or it's the product of someone's imagination.  But life is not just that simple, and the Internet is certainly not.  My assumption nowadays is that everything I read online comes from someone who has considered himself at liberty to take license in telling a story.  Yes, I still expect "straight news" journalists to try to be factual and objective.  But they are human, and all humans have a point of view.  So I look for a variety of points of view and compare them, not assuming that any one is more reliable than others.

Sometimes when I look at things I am pretty sure they are fictional.  Some photos, for example, strain credulity, and - despite being something of a technophobe (or at least a techno-naïf) - I recognize that they've been altered.  It isn't always quite as obvious as the image to the right, but for anyone but a true babe in the woods, it should be plain to see that it's the product of an inventive mind.

"Debunking" websites like Snopes.com are a wonderful resource.  I use them all the time, taking advantage of the time and effort they put into investigating Internet-based stories and letting the rest of us know whether they have found truth or fiction.  On social media, people are always posting things that are then promptly outed as "hoaxes" by their friends who have done a little checking.  This sort of skepticism is a good thing, I suppose.  And it seems to have become the rule, rather than the exception, at least in my social media circles.  A recent posting about the death of a minor celebrity was immediately followed by a series of skeptical postings noting that all the links that turned up in a Google search seemed to stem from one source, which then was the subject of a discussion of its reliability (as might well be appropriate for an Internet source that mainly reports on celebrities).

Some of the stories I see are inspiring.  Some are funny.  Mostly I don't care whether they are truth or fiction, because their effect on my mood at the moment is independent of their grounding in reality.  It just doesn't matter whether Diane ever really sat in seat 7A.

It does matter to me whether there was anything Abraham Lincoln might have done to prevent the Civil War.  And it matters to me what he really believed, at the core of his being, about African slavery, and why he thought certain infringements upon civil liberties were justifiable in time of war.  And I guess maybe that's why I have such a strong preference for nonfiction: I want to spend my time reading about things that really matter.


Monday, December 2, 2013

The Truth About Trans Fat

A few weeks ago the Food and Drug Administration announced the opening of a comment period on its regulatory proposal to ban the addition of trans fats to foods sold by industry (processed foods) and by restaurants.

The latest available data suggest that the current average consumption of trans fats is about one gram per day (according to the FDA, as reported by CNN last month).  Given that experts on nutrition recommend we get no more than 1% of our daily caloric intake from trans fat, it appears the average American is already well below that.  (A gram of fat yields about 9 calories, so one gram of trans fat would be 1% of a 900-calorie diet, far less than what most people must consume to meet their needs for energy from food calories.)
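The parenthetical arithmetic can be sanity-checked in a few lines of Python; the 2,000-calorie daily intake used below is just an illustrative assumption for a typical adult, not a figure from the FDA:

```python
# Average daily trans fat consumption, per the FDA figure cited above.
grams_per_day = 1.0
calories_per_gram_fat = 9  # fat yields roughly 9 calories per gram

trans_fat_calories = grams_per_day * calories_per_gram_fat  # 9.0 calories

# Total daily intake at which those 9 calories would hit the 1% guideline.
threshold_diet = trans_fat_calories * 100  # 900.0 calories

# Share of an assumed (illustrative) 2,000-calorie diet.
share_percent = trans_fat_calories / 2000 * 100

print(threshold_diet)           # 900.0
print(round(share_percent, 2))  # 0.45 -- well under the 1% guideline
```

In other words, unless someone eats fewer than 900 calories a day, the average one-gram intake already falls below the 1% recommendation.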

So it would seem the elimination of trans fats is unlikely to make much difference in the public health, because consumption will change little.  Certain foods that we still eat, though, contain notable amounts of these substances, and so if your diet includes significant numbers of, say, doughnuts, your trans fat intake might be a good bit higher than average.

Let's back up a little to look at how trans fats became part of our diet.

Several decades back the experts told us animal fat was bad for our health.  And we responded by switching to vegetable oils.  So the first thing to understand is the difference between a fat and an oil.  In simplest terms, these substances are all built on molecules that are esters of glycerol (glycerides).  If such a substance is liquid at room temperature, we call it an oil; if it's solid, we call it a fat.  In nature, most of the glyceride-based substances from animals are fats, while most from vegetables are oils.  There are well-known exceptions.  Coconut oil, for example, is solid at room temperature but liquefies slightly above that.

On the molecular level, when you look at the bonding of carbon atoms to each other and to hydrogen atoms in the hydrocarbon chain that is the backbone of a glyceride, oils have fewer hydrogen atoms.  Instead of participating in bonds with hydrogen atoms, some of the carbon atoms' electrons form double bonds between adjacent carbon atoms.

If we take an oil, which has fewer hydrogen atoms, and add some, we get a hydrogenated oil, which then behaves more like a fat (solid at room temperature).  The result is called a "partially hydrogenated oil," and you've probably seen phrases like that in the lists of ingredients on packages of food.  In nature, the hydrogen atoms flanking a double bond tend to be on the same side of the carbon chain, a configuration called "cis" in organic chemistry.  When an oil is partially hydrogenated, some of the remaining double bonds are rearranged so that the hydrogen atoms wind up on opposite sides of the chain, and chemists call that "trans."  There is very little "trans" in naturally occurring animal fat.

So is trans fat bad for us because it's different from naturally occurring cis fat?

We used to think just the opposite.  Those of us who are old enough to remember when margarine was first introduced will recall that it was thought to be much better for our health than butter.  Margarine (made from partially hydrogenated vegetable oil) contains trans fat.  Butter is naturally occurring animal (cis) fat.

As the decades passed, there was intriguing evidence from population studies that trans fats might actually be worse than cis fats.  They seemed to be associated with an increased incidence of colon cancer.  And while we thought margarine was more "heart healthy" than butter, trans fats seem to be associated with worse, rather than better, findings when we have our blood tested for lipids. Specifically, trans fats are associated with higher levels of low-density lipoprotein (LDL) - the so-called "bad cholesterol."

My regular readers know that I am fascinated by the notion of truth in science. As knowledge advances, we are convinced that we are getting inexorably closer to the truth.  We believe that what we "know" now is always more likely to be the truth, or closer to the truth, than what we "knew" before.  We believe we are steadily getting better at discovering scientific truth, that our scientific methods are constantly bringing us closer to a complete and accurate understanding of scientific phenomena, including anything having to do with the life sciences - and therefore with health and nutrition.

We may be right about this.  I'd certainly like to think so.  But I believe it is important to bear in mind that evidence isn't better just because it's newer.  Any time we have new evidence, we should regard it with a certain level of skepticism - not because we are clinging to our "old" truths, but because evidence should always be examined closely, and its quality and meaning judged carefully.

So let's imagine how we might go about answering the question of whether trans fats are bad.  We'd start with a very large number of people willing to be assigned at random to any of several groups whose consumption of trans fats would be carefully controlled over a period of many years.  We'd assign them to getting, say, 1%, 3%, 5% (and so on) of their daily caloric consumption from trans fats. And then we'd follow them for two or three decades, closely monitoring health outcomes we think might be affected by consumption of trans fats.

As you might surmise, no one has done a study like that.  Instead, we have epidemiologic evidence.  We ask people to tell us about their dietary habits, we hope that their recall and reporting are correct, and we compare them with other people whose diets (as they recall and report them) are different but who are otherwise similar in ways we think might influence their health.

As you likely realize, the second kind of study is much easier to do, but it is highly vulnerable to what scientists call "confounding."  Many other factors, dietary and otherwise, may affect people's health, and trying to identify them all and make adjustments in our statistical analysis is tremendously challenging.

Thus a study done prospectively, in which the use of large numbers of subjects and the process of randomization minimize confounders, is much more reliable. We still have the question of whether subjects will do what they're told, but we can ask them to keep an accurate diary and see for ourselves whether they followed instructions.  A diary maintained contemporaneously tends to be much more reliable than recall and self-reporting.

Trans fats are bad.  That is the message people have been hearing for quite a while now.  And some laws have already been enacted.  Restaurants have been prohibited from using trans fats in New York City, for example.  Thus our consumption of trans fats has already declined very substantially.

So we can predict that, if the FDA declares that trans fats are no longer "generally recognized as safe" and prohibits their use in commercially prepared foods, the impact on the public health may be modest at best.  Might it affect an individual who eats a lot of doughnuts much more?  Maybe.

Doughnuts and some other baked goods are made with trans fats to give them a lighter texture.  Without trans fats, doughnuts would likely seem more oily.  Partially hydrogenated oils stay in the "matrix" of the food and do not seep out of it.  If doughnuts wind up being greasier because of a new FDA regulation, some people won't like it.  I suspect the folks who make doughnuts will figure out a way to solve that problem, but I'm no food chemist, so I really don't know.

Let us keep in mind what we're talking about here.  This is a proposed government regulation that would say, "Thou shalt not" use trans fats - unless you can convince the FDA that doughnuts made with trans fats really are safe.

What level of evidence should we expect for such an edict?  There are some things for which we very reasonably believe we don't need evidence from a prospective, randomized, controlled trial.  No one would seriously suggest, to offer my favorite example, that we need to do a study to find out whether it really is safer to jump out of an airplane with a parachute than without one.

When it comes to public health and the use of case control studies to try to figure something out, however, there are so many sources of error, bias, and confounding, that I have to wonder whether what we "know" today is clearly much closer to the truth than what we "knew" when the food scientists first created margarine to save us from animal fat.  Just a quick review of the dizzying array of opinions on whether polyunsaturated versus monounsaturated vegetable oils are best for us will give you an idea of how challenging it is to be certain about things in this realm.

I don't have a problem with the government telling us what is good for us or outlawing what it thinks is bad for us.

Ha!  I probably caught at least a few of you not paying close attention, reading that sentence and thinking I really meant that.  The fact is, I do have a problem with that, especially when the evidence for what is good or bad for us is somewhat less than compelling.  And this, I believe, is an instance in which it is quite reasonable to doubt the evidence and to ask whether what we "know" today is scientific truth.


Wednesday, November 20, 2013

Is It Time to Scrap the Second Amendment?

Earlier this month I wrote about the controversy stirred up among gun owners when a writer for the special interest magazine Guns & Ammo suggested some gun controls are not "infringements" on the right to keep and bear arms (RKBA). Most surprising to me was that a longtime gun writer seemed to have a basic misunderstanding of the meaning of the term "well regulated" as used in the text of the Second Amendment.

The following week a law professor from Texas A&M University, Mary Margaret Penrose, spoke as a member of a panel at a symposium held at the law school of the University of Connecticut.  Penrose advocated ditching the Second Amendment, as part of a broader call for a constitutional convention to draft a comprehensive revision of the U.S. Constitution.  Penrose believes many things in the Constitution, written more than two centuries ago, inadequately address the issues facing modern society.  Although we have judicial interpretation to apply its provisions to current legal questions, and the document itself provides a mechanism for amendment, Penrose prefers a wider approach.

Penrose at UConn

Specifically regarding the Second Amendment, Penrose said the Framers included it in the Bill of Rights because of the 18th-century aversion to standing armies.  The idea was that standing armies enabled oppression of a people by their rulers, and it was preferable to avoid having them.  The alternative was that all of the citizenry be armed - or at least able-bodied adult males, who - as a group - would make up a sort of "unorganized" militia (as distinguished from various state militias, which were "organized").

Now, of course, we have grown accustomed to having "standing armies."  The United States has quite a large number of uniformed personnel in organized forces, full time, around the globe.  So we obviously do not need an armed citizenry: it is no longer, in the words of the Second Amendment, "necessary to the security of a free state" that all able-bodied citizens have privately owned firearms and be practiced in their use.

Gun rights advocates would point out that an armed citizenry may not be essential for national security, as we have delegated responsibility for that to the national government, but the other broad societal purpose of an armed citizenry in the minds of the framers was as a "bulwark against tyranny" by our own government.  The idea was that our central government wouldn't get "too big for its britches" and be tempted to oppress the populace if the people were armed and clearly intolerant of an oppressive regime.

Do we still need an armed citizenry to restrain our own government, lest it become oppressive and exhibit too little regard for the people's civil liberties?  We could talk about that at great length, but chances are those who say yes would be labeled paranoid, while those on the other side of the argument would be called naive.

But we're missing a very important point.  The Framers wrote the Second Amendment to restrain the central government, and so their writings are focused on the relationship between the government and the people.  The government, they wrote, must not infringe upon this right that existed to guarantee the people's ability to resist tyranny.  The Framers didn't write about hunting for sustenance or armed self defense against criminal attack.  The importance of gun ownership for those purposes was so universally understood that it did not require exposition. Furthermore, that aspect of gun ownership was not connected, in the minds of the Framers, with the relationship between the government and the people, and the right to use guns for those purposes was not thought of as a political right.

Penrose suggests we should drop the Second Amendment and leave it up to the states to regulate the ownership and use of firearms as they see fit.

Coincidentally, the Fall 2013 issue of Tufts Magazine (obviously New England is a hotbed of intellectual curiosity) includes a fascinating article about how the United States can be divided into eleven regions, with marked differences in attitudes about things like gun rights, gun control, and violence as a social problem - and the proper solutions to that problem.

Tufts Magazine: Eleven Nations

The thesis that there are stark regional differences in people's beliefs about such things dovetails rather nicely with the contention by Penrose that we should leave gun rights up to the states.

There is, however, an obvious flaw in this reasoning.  While there may be dramatic differences in people's tolerance for, or willingness to accept, stringent controls on the private ownership of firearms from state to state, these differences are seen on a societal level and cannot be assumed to reflect the beliefs or desires of individuals.  Penrose seems to think if we leave gun rights up to the states, people can just sort themselves out.  If I live in Massachusetts, where gun rights are little respected and gun controls are strict, I can just move to Arkansas.  On the other hand, if I live in Vermont and am appalled that one may carry a concealed handgun without a permit, I can just move to our nation's capital, where there are no such permits issued to ordinary citizens, or to one of the states where permits are extremely difficult to obtain.

Of course she believes the enlightened folks who live in states with strict controls will need help from the federal government to keep them safe from illegal trafficking across state lines.  As you may know, New York City Mayor Michael Bloomberg is convinced that gun crime would disappear from the Big Apple if he could just shut off the flow of guns from Virginia.

I consider myself fortunate to live in a state (Pennsylvania) with a modest regimen of controls.  If I want to buy a handgun, all I need is money to pay the asking price and a clean record, so when the dealer queries the National Instant Criminal Background Check System, the sale will be approved.  To get a permit to carry a concealed handgun, I must do some paperwork, pay a reasonable fee, and have a clean record and character references willing to vouch for me.  But if I lived in any of a number of other states, it would be much more difficult, at least in some locales.  Try getting a permit in New York City or most counties in California.

And so the question arises whether my right to keep and bear arms should depend on my zip code.  If one views RKBA as a political right, then the answer is yes.  We make political decisions about political rights.  From the time of our nation's origin, we have restricted the right to vote.  Early on it belonged only to adult males, and only to whites.  We have repeatedly expanded suffrage, to include all races, then to include women, and then to include everyone at least 18 years of age.  But the right to vote is a political right, and so we have made political decisions, as an electorate, about how broad that right should be.

Gun ownership is different.  We live in a time and a set of social circumstances in which the gun is widely considered to be an essential implement of self defense. Certainly other options exist, including training in martial arts and the use of other weapons of varying lethality.  But for most people effective self defense is available mainly through personal ownership of a gun and the achievement of proficiency in its use.  Many advocates of strict gun control (or even outright bans) deny that this is so, insisting that gun ownership makes people less, rather than more, safe and secure.  But I have read the published literature on this at great length, and it has convinced me that the intended victim of a criminal assault is considerably less likely to be injured or killed if armed than if not.

Effective self defense is a fundamental or natural right.  In political philosophy and jurisprudence, these are terms of art with carefully elucidated meanings.  But suffice it to say that fundamental, natural human rights exist independent of political constructs.  We may guarantee them against infringement by governments in our constitutions, but such rights would exist even without those guarantees.

If you accept that characterization of the right to effective self defense, and you accept that in modern society effective self defense is most readily, and realistically, available through personal ownership of guns, then it becomes clear that it makes no sense for restrictions on RKBA to vary from one political jurisdiction to another.

That answers the question of whether the Second Amendment should be deleted from our constitution because we no longer need it or because it is more appropriate for the states to be given free rein to determine gun rights.

[You may have noticed that, as a general rule, people to the left of center on the political spectrum are unfriendly to gun rights and quite friendly to abortion rights.  They staunchly oppose giving the states free rein over the latter but are quite content to have the states that wish to restrict gun rights do just that, and the more the better.]

No, the Second Amendment should stay put.  And the question of what sort of regulation of private ownership of firearms it permits should be decided by that arbiter of what the United States Constitution allows: the Supreme Court.  There is no guarantee that the Supreme Court will always respect the fundamental, natural right of armed self defense and strike down laws that unduly restrict RKBA.  But I am inclined toward greater confidence in the high court than in 50 state legislatures.


Friday, November 8, 2013

A Well-Regulated Militia

One of the stories on CNN this morning was about the resignation of the editor of the special interest magazine Guns & Ammo.  For some years I subscribed to that publication, but life is so full of demands on my time that all of my magazine subscriptions have long since lapsed.  Most of my non-medical reading is in the realm of history, especially American political history.

The CNN story caught my eye because a highlight blurb ran across the bottom of my TV screen while I was eating breakfast.  I don't expect to see "Guns & Ammo" in that news ticker.  So I had to investigate what the fuss was all about.

The magazine's editor, Jim Bequette, who was soon to retire from his position anyway, resigned after a column by longtime gun writer Dick Metcalf generated an impressive volume of negative feedback from readers.  Metcalf's column was titled, "Let's Talk Limits: Do certain firearms regulations really constitute infringement?"  Bequette thought Metcalf's opinion piece "would generate a healthy exchange of ideas on gun rights."

It certainly generated an exchange of ideas.  Gun rights and regulations have long been a subject of intense controversy, at least since the 1960s, when the federal government enacted a sweeping new law after the assassinations of Bobby Kennedy and Martin Luther King, Jr.  In the years that followed, the National Rifle Association became, in large measure, an advocacy organization with a single-minded focus on opposing new gun laws.  Other organizations were formed in response to the perception that the NRA was not sufficiently committed to the fight: Gun Owners of America and the Citizens' Committee for the Right to Keep and Bear Arms among them.  The view that the NRA was too willing to compromise was personified in Neal Knox, a leader in the NRA who went off to start his own hard-line group, The Firearms Coalition, in the early '80s.

The readers of Guns & Ammo were so offended by Metcalf's column that Bequette terminated the relationship between the magazine and Metcalf and then promptly resigned his position as editor, accelerating the timetable of his departure, which had been scheduled for the end of the calendar year.  Bequette penned an apology to readers, saying he made a mistake in publishing the column.

I read the column, and a sample of the comments posted on many media websites in response.  No surprises: gun rights advocates slammed Metcalf's opinion, while others, claiming to be gun owners but not "gun nuts," said Metcalf's view was eminently reasonable.

The essence of Metcalf's perspective is that some regulation makes sense.  It's hard to argue with that, and courts (including the U.S. Supreme Court) have repeatedly held that some regulation of guns is consistent with the Second Amendment's proscription on infringement of the right to keep and bear arms.

As always, the devil is in the details of proposed regulations, and I will come to that shortly.  But I was perplexed by Metcalf's misapprehension of the language of the Second Amendment.  It is quite common, but not something I would expect from someone who has written about guns for decades.

Metcalf quoted the Second Amendment:
A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed.
He then proceeded to focus on the words "well regulated" as justification for government regulation of firearms.  What came immediately to mind was the Internet meme originating from the 1987 movie "The Princess Bride."  It is so well suited to pointing out when people are misusing the language: "You keep using that word.  I do not think it means what you think it means."

The meaning of the word "regulated" seems straightforward, but the Framers of the Constitution were not writing about government regulations (which didn't exist in those days).  The reference to a "well regulated militia" meant that a militia, in order to be an effective fighting force, should have requisite supplies (including guns) and should regularly assemble and drill.  The inclusion of this clause in the Amendment was intended to demonstrate the importance of private ownership of, and proficiency with, firearms, which members of the militia typically provided themselves.  The "militia," as the term was used in this context, meant essentially all able-bodied adult males.  Furthermore, this introductory clause was illustrative of a key reason for prohibiting infringement of the right: that an armed citizenry was essential to the common defense and to maintaining liberty.  It was not intended to mean that the right was limited to members of some organized militia (such as the National Guard).

I could not help being surprised that someone well versed in matters related to firearms would base his argument, at least in part, on a misinterpretation of the language of the Second Amendment.  Advocates of gun control do this all the time, and I always wonder whether they do so knowingly or not, but for a gun writer to make this error in a gun magazine seemed quite odd.

Linguistics aside, though, we certainly do regulate guns, and there is overwhelming consensus on particular points, such as prohibiting violent criminals and the dangerously mentally ill from acquiring firearms.

Metcalf pointed to the example of his native Illinois, which only recently ceased being the only state that did not allow concealed carry of handguns.  The change in Illinois was even more pronounced than that suggests, because there are other states in which carry permits are available but are rarely issued except to VIPs.  Now, Illinois is a "shall issue" state, while others remain categorized as having "severely restrictive, may issue" laws and practices.

The new law in Illinois requires 16 hours of training for an applicant to be issued a permit.  What could be wrong with that?

The answer to that question lies in the fundamental premise that regulations can be used to infringe upon a right.  Metcalf's subtitle asks whether regulations constitute infringements.  The answer, as is so often the case, is: it depends.  Suppose a state legislature decides to require that gun owners, or those issued carry permits, provide proof that they have liability insurance.  If such insurance were very expensive (think malpractice insurance for doctors), the average gun owner couldn't afford it, and it would thus constitute infringement.

What about this requirement for training?  Sixteen hours doesn't seem like a lot. Suppose, instead, it was 40 hours.  Would that be too much?  What if the legislature decided the only training that would be accepted would be a course held at one of the campuses in the state college system, and the course was two hours a night, two nights a week, for a 16-week semester (total 64 hours)?  And suppose the tuition for the course was $1,000.  And suppose further that the course could be cancelled, at any of the campuses, if the number of students who signed up for it was too small.

I like the idea that people who are issued concealed carry permits should be required to achieve and maintain proficiency with a firearm and that they know something about the legal and ethical principles governing the use of lethal force in self defense.  But it is easy to see how writing such requirements into law is difficult to do while assuring that compliance is not so burdensome as to amount to infringement of a right guaranteed against infringement by the U.S. Constitution.

I live in a "shall issue" state (Pennsylvania) with minimal requirements.  Fill out an application, list two character references, and pay a fee (which covers the cost of doing a background check to make sure the applicant has a clean record).  No training requirement, no marksmanship test.  Is that good?  I admit to some ambivalence.  There are several states (Alaska, Arizona, Arkansas, Vermont, and Wyoming) that do not require a permit for concealed carry.  The rationale is that no permit should be required to exercise a constitutional right.

In discussions of these issues, people often draw an analogy with driving a car, which requires passing a test to get a license.  Typically the test involves answering questions about traffic laws and demonstrating one's ability to operate a motor vehicle.  When I took the test at the age of 16, I had had less than an hour of practice and thought the test was ridiculously easy.  Was it better than nothing?  I suppose so.  Is the analogy a good one?  Well, both cars and guns can be lethal weapons.  There is no constitutional right to operate a motor vehicle on a public thoroughfare.  Of course there were no motor vehicles in the late 18th Century, and firearms were firmly connected, in the minds of the Framers, with fundamental liberty interests.  So the analogy is imperfect.  Perhaps, given the unique role of guns in human societies, there is no perfect analogy, or even a very good one.

At the most basic level, the problem is the perception of the "slippery slope," as I described it in a March 2012 essay for this blog:

http://bobsolomon.blogspot.com/2012/03/gun-control-and-slippery-slope.html

Advocates for gun rights see every regulation, every requirement, as holding the potential for undue, unwarranted restriction, which then amounts to infringement.  Any requirement that might not seem, at first blush, to be an infringement will surely lead to a more restrictive one that will infringe.

The slippery slope problem is based on a fundamental mistrust between advocates for gun rights and advocates for gun control.  Until we figure out a way around that, we will be fighting an uphill battle to get to civil discourse, and writers like Metcalf who are "just trying to be reasonable" will be pilloried as traitors.

Wednesday, October 30, 2013

The Affordable Care Act: Blessing or Curse?

This morning the Secretary of Health and Human Services, Kathleen Sebelius, was testifying before a committee of the United States House of Representatives. Nominally, the subject of the hearing was the mess that is the healthcare.gov Website, about which we've been hearing so much since people started trying to use it on October 1.  To say the Website crashed and burned would be very charitable.

But of course the give-and-take at the hearing ranged much more widely, and there was much discourse on the Affordable Care Act that went far beyond the fact that people haven't been able to sign up for new health insurance coverage through the Website.  That is as it should be, because the hearing would have been very short if it were about nothing else:

Rep. Gerrymander: "The Website is a disaster."

Sebelius: "Yes, it is.  President Obama said to me, 'You didn't build that,' and he's right, I didn't.  If I had, I should be fired.  But it's a mess, and I'll see to it that it gets straightened out.  It's a temporary problem, and it won't keep anyone from signing up long before the deadline.  Oh, and by the way, a Website that seems to have been designed by incompetent fools doesn't have anything to do with the merits of the law or the wisdom of imposing an individual mandate."

Rep. Gerrymander: "Thank you, Madame Secretary.  Mr. Chairman, I yield the balance of my time to the American people, so the 24-hour news channels can go back to their special coverage on twerking."

But members of the committee had a lot to say about the Affordable Care Act. Occasionally they actually asked Secretary Sebelius a question instead of just prattling on, and every so often they even allowed her as much time to answer a question as they spent asking it.

The dividing line between Democrats, who like the ACA, and Republicans, who don't, was not subtle.  One Democratic representative said the ACA would achieve universal coverage, putting an end to the problem of having 50 million uninsured Americans.  That is so obviously delusional that I'm sure the only reason she wasn't immediately removed to a psychiatric institution is that she is a Member of Congress.  Even the most ardent proponents of the ACA know it will not do that.

Some people who will be newly eligible for Medicaid won't sign up.  Some who are not, and who will face a penalty if they don't sign up for health insurance and pay for it, will choose to pay the penalty, because even with whatever subsidy they might qualify for, they still would have to pay more than they think they can afford in premiums, and the penalty is a lot cheaper.

Some who will be newly eligible for Medicaid live in states where Medicaid is not being expanded.  The Supreme Court told Congress it doesn't have the power to force the states to expand Medicaid, and some won't.  Those people are out in the cold, and they'll stay there.

Some people will actually go from having some sort of coverage to having none.  Their small-business employers will drop coverage, or their employment status will be changed so they don't qualify, and they'll be forced into the exchanges, where they will discover the penalty is cheap compared with the premiums (see above).  Others have individual coverage that doesn't meet the requirements of the law.  Those policies will be cancelled, and the new policies they could get instead, through the exchanges, will be more expensive, and they will pay the much cheaper penalty and go without insurance.

So the Congresswoman who made that claim for the ACA was either indulging in hyperbole (to put it mildly), or she cannot do arithmetic and thinks reducing the number of uninsured by about 60% (from 50 million to 20 million) is the same as reducing it to zero.

[Oh, by the way, if you're wondering about the 10-12 million undocumented immigrants, they're not counted in the 50 million, so if you want to count them, make it 60 million uninsured, and the total left uninsured after the ACA is fully implemented will be 30 million.]

So the glass is half full.  Not full.  I hope the Congresswoman isn't too thirsty.

Much of the rest of what I heard was also untrue, and I gave a little thought - but only a little, because it may not really matter - to whether the people saying these things really believed them and were just mistaken, or whether they are liars.  The fact that all this was taking place on Capitol Hill strongly favors both possibilities, because these are people highly skilled in saying false things, who say them so often that they begin to believe them.

Sebelius, for example, said that the newly insured will be able to see primary care doctors and stop going to emergency departments, where the care they receive is very expensive and ineffective.  Multiple lies in one statement.  Wow!  An artist!

The newly insured may have coverage through expanded Medicaid.  Try calling a primary care doctor's office to become a new patient and wait for the response when you say your insurance coverage is Medicaid.  Dr. Smith is not taking new patients.  (Translation: Your kind ain't welcome 'round here.)  Maybe your new insurance is not Medicaid.  Dr. Smith may very well still not be taking new patients, because there is a serious shortage of primary care doctors.  My internist, who has a strictly office-based practice, providing adult primary care, hasn't been taking new patients for years.  When I first called his office for an appointment, the answer I got was that he wasn't taking new patients.  Then I made the case, to the receptionist, for being an exception: (1) my mother-in-law had been his patient; (2) my wife was his patient; (3) I was a resident when he was junior faculty, and we'd known each other for 30 years.  If you think those things made it a slam dunk, think again.  I wouldn't have been surprised if she'd still said no.

So you have insurance now, but you still can't find a primary care doctor, or when you do find one, getting a timely appointment is no mean feat.  So, you call the office when you're sick, and you are offered an appointment the second Thursday of next week.  By then, you figure, you'll be over whatever it is that ails you, or you'll be dead, and it doesn't seem prudent to take the "watchful waiting" approach to see which it is.  "Really?" you ask.  "Nothing sooner than that?"  "If you think you need to be seen right away, go to the ER," says the receptionist, trying to be helpful but condemning you to a two-hour wait on a Monday afternoon, when every emergency department in the galaxy is swamped.

Secretary Sebelius doesn't really believe the ACA will reduce ED visits by people who can be seen in a primary care doctor's office instead.  She knows better. She was governor of a state, and she's a cabinet secretary, and it's not possible to have that resume and be stupid.  Well, maybe it is, but I've listened to her, and I'm pretty sure she's not stupid.  She cannot possibly believe what she said.  And that makes her a liar.  Surprise!

Then there is the part about emergency departments being terribly expensive and ineffective.  Is that so?  Well, let's look at those claims.  If you go to the emergency department because you have a fever and a cough and are worried you have pneumonia, does it cost more to provide the care than it would if you go to your doctor's office?  Well, you see a doctor and are examined.  Then you have a chest x-ray.  The doctor looks at the x-ray, and tells you that you do have pneumonia.  Discussion ensues about whether you are sick enough to be treated in hospital.  You and the doctor agree that you are not.  You leave with a prescription for an antibiotic suitable for pneumonia.  The cost of providing that care is no higher than if you'd gone to the doctor's office.  But there is no way for you to know that.  You don't get a bill showing the cost.  You get a bill showing the price.  And the price will be high compared to what you'd pay if you went to the doctor's office.  That's partly because the ED has higher overhead, but mostly because the ED has to provide care to people who cannot pay, and everyone else must pay more to make up for that.  Your doctor's office has no such obligation.

How about the "ineffective" part?  At the emergency department you didn't need an appointment, although you may have had to wait.  The ED never closes.  You saw a doctor, and a chest x-ray was ordered.  Where did you go for that?  Down the hall.  Some primary care doctors' offices can do basic x-rays and lab tests. Many cannot.  Your doctor read the x-ray himself.  He does that all the time, and he's really good at it.  No waiting for a radiologist to read it, although one may have read it right after it was done, depending on how busy she was.  Oh, and by the way, while your regular doctor may be pretty good with illnesses, how about injuries?  Emergency physicians see all comers, all the time.  They have to be knowledgeable about everything, and what they are best at is figuring out if what you have is going to cause you serious harm or kill you.

Does that sound "ineffective" to you?  When I heard that word, I didn't take it personally.  I know Secretary Sebelius is just saying something that many folks inside the beltway say all the time, even the ones who know better.  And I know if she ever needs to see a doctor right away for something that's really worrying her, and she goes to an ED for that purpose, "ineffective" will be the word farthest from her mind.

There is no question that the ACA, when fully implemented, will reduce the number of uninsured in this country by 50-60%.  And that is a good thing.  But having insurance is not the same thing as having timely access to high-quality medical care.  That is what people want.

There is one place where people have timely access to high-quality medical care 24-7-365.  You know where that is.  It's where I work.  Can we take care of everyone's health care needs in the emergency department?  Of course not.  But in a health care system that serves so many so poorly, providing care in a way that is highly fragmented and often chaotic, in the ED we do our best - which is pretty darned good -  to pull it all together and provide excellent care to everyone who shows up.

If Secretary Sebelius, or anyone else inside the beltway, thinks we are too expensive, or ineffective, or that the ACA means people won't need us except for true emergencies, they are seriously misinformed or just being careless with the truth.  I'd like to give them the benefit of the doubt, assume they are seriously misinformed, and keep striving to correct that.  It might help if this blog were required reading for them.

Monday, October 28, 2013

Space on a Plane and the Loss of Civility

Earlier this month I flew from Pittsburgh to Seattle for a medical conference.

I am not a frequent flyer any more.
 
When I concluded my six-year tenure as a member of the Board of Directors of the American College of Emergency Physicians, I suddenly had much less reason to travel, and the airlines are interested only in how much you have flown lately, not historically.

Once one is not a frequent flyer, all the little things one took for granted disappear: the shorter line (when there is one) at TSA Security Theater (acknowledgement to a lovely colleague from Georgia who calls it that); an earlier boarding zone; no extra fee for checked bags; and an upgrade once in a chartreuse moon (blue moons happened more often than upgrades for me).

So, bereft of all those little perks I had come to take for granted, I was not surprised that changing planes at Chicago's O'Hare, the nation's second busiest airport, was not the least pleasant thing about this trip.

However, as much as I think the airlines, and TSA, and the folks who run the airports could all do things to make air travel less of an ordeal, especially for those of us who are not "preferred" on any airline, this trip made me realize that there are things we can do to make life better for each other.

Repeatedly I found myself wondering whatever happened to civility.

Examples of its disappearance abounded.  There was the fellow who saw that I was taking longer than he would like to get my stuff into those plastic bins at security - and believe me, I'm very efficient, having done it so many times that even someone my former-schoolteacher mother would have called a slow learner could do it with blinding speed.  So he looked at me, heaved a loud sigh of exasperation, and made a great show of barging past me to put his own stuff on the conveyor ahead of mine.

As a result of this, he got to wait for the tram to the gates 30 seconds longer than I did.

Then there is the waiting for the tram.  At the Pittsburgh airport, the people exiting the tram get out first, from the other side, before the doors open for those boarding.  Some inexperienced travelers don't know this, and so they stand back from the doors to allow room for exiting passengers.  I watch in amazement as some who know this isn't necessary slide right in front of them to take positions that will allow them to board the tram first.  The line from the movie "Norma Rae" comes to mind: "You must be from New York."

At the gate, I am not flying Southwest, the only airline that organizes boarding in very small groups, thereby minimizing the rush and confusion attendant upon all other boarding schemes, in which all 14 thousand people in Zone 2 want to be the first of their group to board.  If you fly, you've seen it: some passengers behave as though they are boarding in Zone 1, only to hang back toward the end, then pause - revealing they aren't holding boarding passes for Zone 1 at all - before stepping briskly forward to begin boarding as soon as the gate agent reaches for the microphone to announce Zone 2.  I am convinced there is some prize for being the very first to board in one's zone that I've just not heard about yet.

Why are people so eager to get onto the plane?  Why do they want to sit in seats just large enough for elementary school children any longer than they must?  You see, space in the overhead bins for carry-on luggage is limited.  The sooner one boards, the more likely it is that there will be space available.  Being in a later-boarding zone and finding that the space is all full is quite frustrating, as one tries to work one's way back toward the front of the plane and hand the bag off to a flight attendant to be checked with all the luggage that people will be waiting for at baggage claim (for one of life's longer versions of eternity).

Walking down the aisle of the plane, I happen upon yet another example of incivility.  I see a young man smartly hoist his carry-on bag into a space in the bin directly over row 8, where I am going to sit, and then keep walking down the aisle.  I then see that there is no more space left in an overhead bin anywhere near row 8.  So I keep walking, and eventually locate space above row 22.  Guess who's sitting in row 22?  Indeed!  The same fellow whose bag is now above my seat.  As I place my bag in the bin above him, I realize I will now have to make my way back toward the front while the flow of boarding passengers down the aisle, barely wide enough for one person, is moving in the opposite direction.  As I'm doing my impression of a salmon swimming upstream, I wonder why he thought this was a good idea.  Did he really think all the space in the bins closer to his seat would be full?  No, I'm pretty sure he didn't.  I'm pretty sure he didn't think at all, about anything other than grabbing the first space he saw, and that was probably not thought but a reflex, something from our reptilian ancestors.  Yes, that's it.  No thinking at all, because "thoughtless" is the word that best fits his behavior.  And at the end of the flight, I will have to wait for nearly all of the passengers to disembark (I cannot bring myself to use the absurd "deplane") before I can head back to row 22 to retrieve my bag.  Fortunately, I do not have a tight connection at O'Hare.

Did I mention - yes, I think I did - how busy an airport O'Hare is?  I have time enough to grab a bite to eat and a drink.  I'm pretty sure I can find something far better than what will be "available for purchase" on the flight from Chicago to Seattle.  (No complimentary meals nowadays, although thoughts of such things are great when one is in a nostalgic mood.)  But it is early evening, and O'Hare has meal facilities that are adequate for the number of passengers changing planes there at 4 AM, not 5 PM.  After considerable searching for a restaurant with seats, I give up on that idea and notice - joyfully - a place that is selling decent-looking sandwiches with a line short enough that I should get to the cash register before my flight is boarding, with perhaps a nanosecond to spare.

On the flight from Chicago to Seattle, I know I will be in my child-sized seat for about four hours, and I realize I should use the time to get some work done on my notebook computer.  It's a 15-inch MacBook Pro - not the smallest choice, but not overly large, either, and I've paid the extra money for a seat in the part of the plane United calls Economy Plus, which means my knees are not pressing into the back of the seat in front of me.

The plane reaches cruising altitude, and we have been given permission to use our electronic devices - as long as smartphones are in airplane mode, so you can use them only to play games and do other things that don't involve sending and receiving signals.  I don't quite understand this, because I'm pretty sure there are no signals to receive at 30,000 feet, and I'm also pretty sure I cannot use my smartphone to direct the plane to land in Tahiti instead of Seattle.

Now it's time to take out my computer and get to work.  Then I notice that the fellow in the seat in front of me is reclining.  This means I can open my computer just barely far enough to see the screen and have room to get my fingers onto the keyboard to type.  To make the effort more interesting, every so often, he repositions his body in his seat, hurling himself against the seatback, during his flopping-fish imitation, with sufficient force to cause me to snatch my computer off the tray and pull it all the way back against my chest to keep it from being damaged by the shockwave.

This mystifies me.  By "this," I mean two things.  First, why are seats built to recline, when the only thing reclining accomplishes is making the passenger immediately behind even more miserable, while changing the angle of the seatback far too little to make a difference in the ability of the person "reclining" to fall asleep?  Second, why doesn't the person who wants to move the seatback to that useless angle realize this and think, "Oh, that won't help me, it will only torture the passenger behind me, so I won't do it"?

I have a personal rule.  I rarely think of reclining my seatback, but if it crosses my mind, I do it only if the seat behind me is unoccupied.  If that seat is occupied, I do not look around and ask the occupant if s/he minds if I recline.  Doing that would put my fellow passenger in an awkward position, having two choices.  First would be to lie: "No, I don't mind at all if you put your seat back, reducing my personal space when I hoped it was already at an irreducible minimum."  Second would be to say, "Well, actually I do mind, and I would rather you didn't."  A fellow passenger is unlikely to tell the truth and pick option 2, for fear of being perceived as a jerk.  So, my thought on putting the seat back: it is an act of hostility, something one does only because one delights in torturing the person seated directly behind.

I'm trying to soft-pedal this.  Notice I called it an act of hostility, not an act of war. Those familiar with the language of geopolitics will see the difference.  I didn't call it an act of war, nor did I - tempting though it was - call it a war crime.  I did think about calling it that.  You see, I am convinced that, were they evaluating the practice of placing coach passengers in those tiny seats for flights lasting longer than an hour or so, Amnesty International would declare it torture.  And so, a passenger who deliberately makes it worse for the person directly behind may, indeed, be committing a war crime.

By now you know how much I enjoyed my transcontinental adventure.  And I was flying United Airlines - you know, that airline that uses theme music from George Gershwin (who, I am quite certain, would strongly disapprove) and the slogan "Fly the Friendly Skies."  Friendly?  What is the appropriate response to that?  I think a guffaw fits rather nicely, don't you?

But if we want the skies to be friendly, what say we begin with each other?


Friday, October 25, 2013

Turn in Your Drugs? Just Say No!

Tomorrow (October 26, 2013) is National Drug Take-Back Day.  The Drug Enforcement Administration (DEA) says this day "aims to provide a safe, convenient, and responsible means of disposing of prescription drugs, while also educating the general public about the potential for abuse of medications."
This has been a twice-a-year event since October 2010, and tomorrow will be the seventh such specially designated day.

On the last National Drug Take-Back Day this past April, the public turned in nearly three quarters of a million pounds (or 371 tons, if you are among my "tons of friends" who like the word tons better). Press releases from the DEA offer some insight into the rationale underlying the program:
According to the 2011 Substance Abuse and Mental Health Services Administration’s National Survey on Drug Use and Health (NSDUH), twice as many Americans regularly abused prescription drugs than the number of those who regularly used cocaine, hallucinogens, heroin, and inhalants combined.   That same study revealed more than 70 percent of people abusing prescription pain relievers got them through friends or relatives, a statistic that includes raiding the family medicine cabinet.  
Although I am neither a lawyer nor a statistician - nor do I play either of them on TV, nor did I stay at a Holiday Inn Express last night - I love to look at sentences like that and examine what they really mean.

More than 70% of people abusing prescription pain relievers got them through "friends or relatives."  I am walking the streets and happen to see my buddy Jim.  "Hey Jim! Ya got any Oxys?"  (For the uninitiated, this is short for oxycodone, a potent prescription analgesic derived from thebaine, a constituent of the opium poppy.  These drugs are all generically referred to as opioids.)  Jim says he does.  We settle on a price, and both walk away happy.  Is Jim a drug dealer?  You could say that.  But he is also my friend, and he counts in the statistic cited above.

So what are we really worried about here?  Mama is going out to get the mail. She slips on wet grass and falls, resulting in a rather nasty fracture of her ankle. She goes to the hospital and gets operated on by an orthopedic surgeon.  When she is discharged, the orthopedist estimates how much pain medicine she'll need for how long and writes a prescription for that.  As it turns out, she doesn't use anywhere near all of it, and it sits in the medicine cabinet.  Her teenage son or daughter or friends of theirs visiting the house discover the pills, and you know what might happen from there.

So what should you do?  Well, first, before you open the medicine cabinet, look in the mirror on the outside of it.  Ask yourself if you are the sort of person who might get addicted to opioids.  This requires a bit of introspection.  When you were taking the pills prescribed for some painful condition, did you always take them solely because you were in pain?  Or did you ever take them partly because you liked the way they made you feel?  Because they seemed to make the day go better?  Because when you took them your mood was better, you were less likely to get into an argument with your spouse, less likely to yell at the kids when they didn't really deserve that?  All of those other reasons for taking the pain pills - reasons other than straightforward pain relief - suggest you may have a propensity for getting addicted.  And maybe once you no longer need the pills for the problem they were prescribed to help with, it would be better to get rid of them.

As you might guess, though, that propensity toward addiction is more - much more - of a problem for people with chronic pain, who can't just get rid of the pills any more than they can just get rid of the source of chronic pain.  So those unfortunate folks have to ask themselves frequently why they're taking the pills, if they want to make sure they aren't slipping from seeking pain relief into seeking escape from life's annoyances or seeking mood elevation.

If, however, you are confident that you are not inclined toward addiction - maybe the pills just don't really do anything for your mood (not everyone experiences euphoria from opioids), or maybe you just don't have those kinds of at-risk personality traits - you probably don't see any good reason to get rid of your pain pills.  After all, you never know when you might need them for something else, and as long as they're still "good," why throw them away?  That's just wasteful. (By the way, most prescription drugs have a shelf life that goes far beyond the expiration date on the bottle.)

If you're not worried about yourself, and you have no irresponsible people living in your household, then why would you turn in your drugs?

You wouldn't.  No, you would, instead, respond to this program by asking, "What? Have these people lost their minds?"  And you might make a list of the reasons why this is a dumb idea.  (1) It's wasteful to throw away perfectly good medicine. (2) If I need a strong pain reliever in the future, I can just go to the medicine cabinet instead of bothering my doctor.  (3) If I had turned in my drugs and had to go see my doctor for an evaluation resulting in the issuance of a new prescription, his records would reflect the old prescription and the new one, and he might get the wrong idea about how much pain medicine I'm using, unless I remembered to tell him I used only half of what he gave me the last time and turned in the rest. (4) My primary care doctor never likes to give anyone more than ten pills of a controlled substance, because he always feels like Big Brother at the DEA is watching him, and any time I ask him for pain medicine, I get the uneasy feeling that he thinks maybe I'm a whiner who should just "walk it off."  So if I hold on to the extra pills left over from the orthopedist who operated on my broken ankle, I can avoid all that.

I'm sure you could come up with some more reasons of your own to add to this list.

You probably know, if you're a regular reader - and especially if you read my essay on New York City Mayor Michael Bloomberg's anti-Big Gulp campaign - that I am an avowed skeptic regarding any government programs based on what folks in government think is good for us.

Just in case you thought I was going to overlook an opportunity to skewer Big Pharma, there is funding for a public education campaign about this program from a company that makes opioids.  Turn in your drugs, and we get to sell more of them when you need them later.  Yes, I know, I'm just a cynic.

So, do I think you should turn in your drugs?  If you're like most people, Just Say No (credit to Nancy Reagan, although she was looking at a different aspect of the drug problem when that tag line was developed).  But there is a certain value in this program.  I hope it will get people to think about whether their own use of prescription opioids might not be solely for pain relief, that maybe they like the mood elevating effect, and that could suggest they're at risk for getting addicted. And I hope it will get people to think about whether prescription opioids in the home might be found and abused by teenagers or other irresponsible persons, in which case taking steps to secure potentially dangerous drugs is the logical solution.  This might mean turning them in, or it might mean locking them up.  If you have a potentially irresponsible person in your household, you should no more leave a dangerous drug unsecured than you would leave a firearm unsecured.

So, tomorrow ... have a nice day.  And hold on to your drugs, unless you're taking them just to have a nice day, or you have another good reason to turn them in.

Wednesday, September 25, 2013

Obamacare is Coming, and the Sky is Not Falling

First the forces arrayed against it tried to defeat it but failed.  Then the cry was, "Repeal and Replace."  The question "Replace with what?" was never adequately answered.  And so the opposition moved on to "defund Obamacare."  After all, no legislation ever enacted by Congress and signed into law by the president can be implemented without funding, and that funding was not guaranteed in perpetuity in the original bill.  So perhaps we could pass a budget for the fiscal year about to begin without funding for the Affordable Care Act (ACA).  Or we could say we won't go along the next time we have to raise the ceiling on the national debt unless we abandon funding for the ACA.

What's so bad about the ACA?  Let me count the ways it will destroy our health care system, in the view of its detractors.

(1) It will put the government in charge of your health care.  This one is being promoted by a television advertisement described by everyone in the media as "creepy."  It shows a young woman in a doctor's office, apparently there for a certain kind of physical examination that is common for young women but has few (if any) fans, and the leering practitioner is wearing an Uncle Sam Halloween costume.  Needless to say, the woman in the commercial is instantly far less amenable to proceeding with the visit than she was at the start.

This makes me laugh.  Those of us in the health care industry know just how much the government is already in charge of your health care.  This is because the feds already have very extensive control over how health care is financed, and when you control that....  Well, this seems pretty obvious.  There are so many rules about what we can and cannot do that emanate from Washington. The degree of such control has been steadily increasing since the enactment of federal health insurance programs in the 1960s.

(2) It will create death panels.  No, it won't.  What it will do is quite far from that - and arguably falls quite short of what we should be doing in this area.  We spend a very large amount of money on end-of-life care.  A substantial part of that spending pays for care that is very unlikely to benefit dying patients.  It may prolong life without any meaningful quality.  It may prolong suffering without any sort of trade-off that the patient would find worthwhile.  The most common reason for the expenditure of vast sums for non-beneficial care is that no one spent time with patients and families to talk about options and how they fit with the patient's personal values.  In our health care system, the default is "do everything," and the default is what happens when the patient and family have not given careful consideration, with advice and guidance from a trusted physician, to what they really do and don't want.  In the early debates over the ACA, there were provisions that would have required doctors to talk to patients about such things, and from the reaction I thought euthanasia for everyone over 70 was the topic of discussion.  When I was a medical student I had a frank discussion with my grandmother about her options.  I knew what her state of health was.  I knew what CPR was like.  I thought she might not want it.  I talked to her about it at length, so she could give it some thought and make decisions about what sort of care she wanted.  Guess who else talked with her about such things?  That's right.  Nobody.  And in the 30 years since she died, little has changed.

(3) It will raise the cost of health insurance for everyone.  This one is rather more complicated.  If you currently are paying nothing for health insurance, because you choose not to buy it, your costs will certainly rise.  The idea here is that insurance is a mechanism for spreading risk, and risk cannot be spread equitably if some people opt out.  This involves a very direct trade-off.  If we say everyone has to have insurance, then we take away the only plausible excuse the insurers have for excluding sick people.  They say people will just wait to buy insurance until they need it, like someone who buys automobile insurance after his car is stolen or wrecked.  We wouldn't allow that.  So if we say everyone has to have health insurance, we can tell insurers they may no longer exclude people with "pre-existing conditions" - or charge them higher premiums, which can effectively do the same thing as denial.

In some states, current rates of denial of coverage exceed 30%.  The ACA will put an end to that, and requiring everyone to have coverage is essential to making that work.

Similarly, we must require everyone to have insurance at some basic minimum level.  Certain things must be covered, with reasonable limits on out-of-pocket expenses.  If we don't do that, then we have the same problem as when some people opt out.  My dad didn't have a very good opinion of mental health services. I think at some level he thought people with mental illness were just weak characters who should buck up and get a grip on life.  He didn't want to pay premiums for health insurance that included mental health services, which he was sure he would never need, to subsidize those who really just needed some life lessons or a sympathetic ear.  And there we have the same problem: if we allow those who think they don't need coverage to opt out, we're not spreading risk effectively.

The same principle applies to "catastrophic coverage."  If I can afford to pay $50,000 a year out of pocket for health care, I can find a really cheap policy to cover me for expenses beyond that, because the actuarial risk that the insurer will ever pay anything is low.  But very few people can accept risk of that magnitude.  If the system lets me do it because I can, then once again risk is not being effectively spread.  So anyone who currently has coverage that doesn't kick in until spending is in the catastrophic range is going to pay more under the ACA.

On the other hand, many low-income folks will qualify for Medicaid who currently have nothing.  And many more who aren't poor enough to qualify for Medicaid will be able, especially with the help of tax credits, to afford health insurance, when up until now it just hasn't been an option.  We tend to think of the ones whose employers haven't offered health insurance, but there are plenty of hardworking people who have had "access" to employer-based health coverage, but it has simply exceeded what they could afford to pay.  They will no longer have to go without, because under the ACA there is a limit on the percentage of your income that you're expected to pay for coverage before you qualify for a subsidy.

So we are requiring people who heretofore have opted out to enter the risk pool, and requiring people to buy health insurance that has a higher level of coverage than what they would otherwise have chosen, thereby making insurance as a mechanism for spreading the risk truly workable.  At the same time, we are adding people to the Medicaid rolls and subsidizing premiums for the working poor who are not quite poor enough to qualify for Medicaid (which will require expenditure of tax dollars).  What, then, are we doing?  Just ask Joe the Plumber: we are spreading the wealth around.

By now my regular readers know I'm innately conservative.  I think the government wastes a lot of money.  I like the Jeffersonian ideal of smaller, less intrusive government.  I believe in personal responsibility.  So why on earth would I like the ACA?

In all honesty, I don't really like the ACA, because I believe the goal is universal coverage, and this will leave us well short of that.  We will still have at least 20 million without health insurance coverage.  That is my best estimate.  Call it pessimistic, but I prefer to be pessimistic and then be pleasantly surprised if things go better than I thought they would.  This means I have to "like" the ACA because I am not willing to let the perfect be the enemy of the good.

Does it spread the wealth around?  Absolutely.  Will it personally cost me more in health insurance premiums, or taxes, or both?  Guaran-damn-teed.  So why am I for it?

I have spent the last 30 years practicing medicine and observing a very painful fact of life.  I live in the wealthiest nation in the history of the world, a nation that is at the same time the only modern, industrialized nation on this globe that fails to provide a universal system of health care.  Every single day of my working life I am face to face with people whose health has been neglected because the resources are simply not available to them to tend to it.  The consequences of that neglect land them in the emergency department, far worse off than they ever should have been.

Eighty-five percent of Americans have health insurance.  Many of them really don't care about the other 15%.  I am ashamed to live in a society where that is true.  I am heartened by the thought that those of us who do care are in the majority.  The latest public opinion polls say more than 60% oppose the "defunding" of the ACA.  Maybe many of the other 40% dislike the ACA for other than selfish reasons.  But to those who oppose it because they do not support the ideal of decent health care for all as a societal responsibility, I say shame on you.

 

Sunday, September 8, 2013

Syria: A Stamp of Disapproval?

Bashar al-Assad has been directing the use of chemical weapons against his own country's people, in violation of "international norms" - and of an agreement to which Syria is not a signatory.

The "international community" is outraged.  Let me point out that there really is no such thing as the international community outside of works of fiction.  There is an organization called the United Nations.  It is ineffectual and often ignored, but it actually exists, unlike the "international community."

So what about that outrage?  If the outrage is international, what sort of international action is planned based on the outrage?  Right.  None.  The UN is doing nothing.  There is no "coalition of the willing." The French support action by the United States.  That's worth something, I suppose, given the Franco-American friendship that goes back to the 18th century.  But it's hardly tangible.

So what are we going to do?  The president is talking about using Tomahawk cruise missiles, perhaps 200 of them, at a cost of about $1M apiece.  That is a $200M Stamp of Disapproval.  At what will we target them?  Chemical weapons stockpiles?  Obviously a bad idea, the notion of blowing them up and dispersing sarin gas across the countryside.  How about the manufacturing facilities?  Well, they're buried underground, where cruise missiles don't reach, and they're probably buried well enough that our supply of "bunker buster" bombs, which haven't been upgraded in the last decade, won't be effective.  We could disrupt "command and control" - for a few weeks, maybe.  Not exactly a well-chosen objective.

We're not aiming for regime change, which is just as well.  What we'd like in Syria is a secular democracy, such as one finds in any number of other countries in the region.  That number happens to be one, and it's Israel.  The idea of secular democracy is about as likely to catch on in the Arab Middle East as Miley Cyrus is to be invited to put on a "twerking" demonstration at the Saudi Royal Palace in Riyadh.

So just what is the objective?  Two days from now, on the eve of the twelfth anniversary of the attacks on the World Trade Center and the Pentagon, President Obama will address the nation on prime time television.  Presumably he will answer that question.  Until then, I am left to speculate.  And, as my daughters and other young adults might say, "I got nothin'."  At least nothing that makes any sense or seems achievable.

We certainly cannot look to the examples of Afghanistan, Iraq, or Libya to find good results, but then when you are presuming to choose among various groups who all despise the Judeo-Christian West and its values, you really cannot expect to accomplish good things (from our perspective) by replacing one with another.

Why do we care about the ghastly happenings in Syria enough to risk initiating a large-scale regional war?  In 1994 in Rwanda, half a million people were killed in a genocide of shocking proportions over a period of one hundred days.  And we did ... nothing.  The "international community" also sat on the sidelines.  Oh, that's right, it created the International Criminal Court, which has been marvelously effective.  In the decade since its creation, it has yielded one conviction (now being appealed).  President Clinton has described US inaction as one of his administration's major foreign policy failures.  Hello, Captain Obvious!

But we don't really care about Africa, do we?  It's the Dark Continent.  That makes it easy for us to pretend we don't see what happens there.  There isn't any oil (except in Nigeria).  Geopolitical instability there doesn't seem to affect US interests.  The Mideast is another story altogether.  Lots of US interests there. The world's economy is very directly affected by the stability (or lack thereof) of the region.  So, if anyone tells you that our interest in stopping the killing in Syria is humanitarian, repeat after me: "Yeah, right."

So the president may be able to convince us that important US national security interests are at stake in the region and that intervening in some fashion in the Syrian government's appalling attacks on the country's own population will serve American purposes.  I've been trying to connect those dots since this became the lead story in every day's news reporting, and I'm still not seeing it.

I don't see how we will do anything to stop Assad's war crimes.  I don't see any prospects for starting Syria on a path to becoming a secular democracy.  And I think those are the only two goals worth pursuing.  The president has not made his case yet.  Most surveys of public opinion show that his style of "leading from behind," as it is derisively called by Republicans, has convinced very few Americans that his plans for intervention make any sense at all.  I'll be working Tuesday evening, but I expect to watch him later, courtesy of my DVR, or at least read his speech online.  I'm open-minded.

Mr. President, you got some 'splainin' to do.

 

Sunday, August 11, 2013

The Business of Shaving

Recently a friend (high school classmate, now a Facebook friend whom I've not seen since high school) posted a status update that afforded a glimpse into his exploration of the art of shaving (the straightforward meaning of that phrase, not the website of that name).  I was intrigued.

Over the years I've used everything from the old-fashioned safety razor, holding a single blade that cost 10¢ in my youth, to the current high-tech cartridges that have five blades and cost upwards of $4.

I've also used electric shavers from all of the major manufacturers.

[I will briefly dispense with the electrics.  Battery life - how long the battery lasts before it will no longer hold a charge, at which point it's no longer a cordless rechargeable - is pathetic.  Keeping the blades sharp and lubricated, finding the right pre-shave conditioner to suit one's beard, and developing the technique needed to get the closest and smoothest possible shave are all far more trouble than it's worth.  Even if you are meticulous about all of that and spend $200 for a top model, you still cannot get a shave as close and smooth as you can with a blade.]

[I must also say a few words about the old-fashioned straight razor.  I go to a barber who is a traditional Italian practitioner of the art, and he assures me it's far more difficult to learn to use a straight razor on oneself than on a customer.  That, he says, is why the safety razor got its name - and why he uses one on himself.]

The post by Bill (my old friend, a talented writer) piqued my interest.  He was going old school, to the traditional wet shave with a safety razor and shave soap applied with a brush.  Gee, I wondered, was that back-to-the-basics approach economical?

Some prefer exotic materials for the handle other than wood, which might not stand up to moisture over time.
Well, that depends.  Blades for a safety razor can be had for as little as 25¢ apiece, although the fancy German steel versions cost more than a dollar.  That's still a lot cheaper than $4+ for the high-tech cartridges.  But then what about shaving cream or soap?  Again, you can spend a little or a lot.  You can buy a traditional cream that comes in a tube that costs less than $10 and claims to be enough for 100 shaves when applied in the recommended thin layer.  You can spend $50 for a similar amount of really fancy stuff.  And how are you going to apply it?  With your hand, or a brush?  A shaving brush made of synthetic material is cheap.  Boar bristles cost a good bit more.

Apologies to animal rights activists
The traditional material, introduced by the French centuries ago, is badger hair, and you can spend $200 for an elegant brush with an exotic handle. The aficionados say there is nothing like badger to prepare the beard for the best possible shave.

The most common choice for lathering is the stuff that comes in a can and that every supermarket and drugstore carries, and that will probably cost somewhere around 10¢ a shave, give or take, depending on how much you use.

So, at the cheap end, buy a handle for a few bucks, blades that cost 25¢ apiece in bulk, and the inexpensive shave cream or gel in an aerosol can, and you're probably going to spend about 15¢ a shave.  But so many of us spend so much more.  And big companies like Gillette put a lot of money into marketing to get us to do just that.  They are very successful in getting us to spend more than $4 apiece for their high-tech cartridges.  (The best online price I found for the Fusion Pro-Glide was about $3.50.)

Remember, Gillette is the company that brought us the original double-edged safety razor blade, patented in 1904 and supplied to American troops in World War I.  But Gillette does a lot of research on design, and their engineers are quite convinced that each advance, adding blades up to their current five-blade design, has meant a better shave: smoother, closer, easier, faster, less dependent on perfect technique, and with less irritation.  If you shave every day and put in a new cartridge every week, which is a common pattern, you'll be spending 50¢ a shave just for the blades.  Some men use a cartridge far longer than a week, but I can tell the difference between shave #1 and shave #7, so I find the weekly routine sensible.
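For anyone who wants to check my arithmetic, here is a quick sketch of the per-shave cost comparison.  The figures come from the post itself, except that the number of shaves I get from a 25¢ safety-razor blade (about five) is my own assumption, back-solved from the ~15¢-a-shave total:

```python
# Back-of-the-envelope per-shave cost comparison, using the figures
# quoted in this post. Shaves-per-blade for the safety razor is an
# illustrative assumption, not a manufacturer's claim.

def per_shave_cost(blade_price, shaves_per_blade, lather_per_shave):
    """Amortized blade cost per shave, plus the cost of lather."""
    return blade_price / shaves_per_blade + lather_per_shave

# Safety razor: 25¢ blade assumed to last about 5 shaves,
# plus roughly 10¢ worth of canned shaving cream.
safety = per_shave_cost(0.25, 5, 0.10)

# Cartridge: a $3.50 Fusion Pro-Glide cartridge changed weekly
# (7 shaves), with the same 10¢ of lather.
cartridge = per_shave_cost(3.50, 7, 0.10)

print(f"safety razor: ${safety:.2f} per shave")   # about 15¢
print(f"cartridge:    ${cartridge:.2f} per shave")  # about 60¢
```

On those assumptions the cartridge shaver pays about four times as much per shave, which is the gap the rest of this post is about.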

With a bit of guidance from my friend Bill, I learned that there are others selling competitive cartridges for less.  And that made me wonder: are they really as good?  How can that be?  If they are sacrificing nothing in quality of materials and manufacturing process, if they are attentive to quality control and spending enough on marketing to have a successful business model, how can their product be that much cheaper?  It has to be profit.  And my regular readers know how I am fascinated by profit and the profit motive.

In my online research I have found two companies making razors and cartridges clearly intended to compete with Gillette's top-of-the-line Fusion Pro-Glide model. Both of them cost about a third less.  I have tried out one of them and have judged it to be of comparable quality and performance.  That made me really keen to investigate profit.

An article published in 2009 gave me some answers.  The Gillette Fusion Pro-Glide cartridge costs less than 10¢ to manufacture.  Add another few cents for packaging.  Each cartridge brings about $2.50 in profit for Gillette and another 75¢ profit for the retailer.  Of course there are some distribution costs.

But even considering the cost of research and development, that per-cartridge profit for Gillette is eye-popping.  It's no wonder that Gillette is the most profitable division of parent company Procter & Gamble, with a profit margin upwards of 30%. Gillette makes Big Pharma look like pikers.

I was raised by parents who belonged to labor unions.  My dad was not quite a socialist, but he was a staunch believer in workers' rights and the importance of protecting them.  And he saw corporate greed almost everywhere he looked.  Maybe he sometimes saw it when it really wasn't there, but there is so much of it in America that one really needn't use any imagination to see it around every corner or in every nook and cranny.  I'm sure this is why, at least in part, I am always ready to believe that every manufacturer is engaging in price gouging.  Very consistently, over the years, my willingness to believe has been supported by cold, hard facts.  This is one of those instances.

But hey, I thought, maybe the undercut-pricing competitors are making their blades overseas, where labor costs are lower, while Gillette's razors are manufactured right here in the good ol' US of A.  Sure enough, Dollar Shave Club, which started up early last year, makes its stuff in Asia.  So what about Gillette, the Goliath to such Davids?  Seven years ago Gillette opened a new facility, which is its largest manufacturing plant for razors and blades in ... Poland.  If you want to help the Poles and think the Asians are already quite sufficiently economically successful without any more help from exports to the US, then go ahead and stick with Gillette.  If you think manufacturing is now global and it really makes no difference, or that you're happy to buy American, but if it's not American it doesn't matter where it's from as long as it's not sweatshop labor, then have a look at the competitors.  (There's another one, called Dorco, for which I cannot vouch, because I have not tried their products, but they certainly deserve mention.)

For me the calculus is simple - so simple that it is not, of course, calculus, but mere arithmetic.  Sacrifice profit to give me a product of equal worth at a lower price, and I will be a customer.

One more thing: don't lie to me.  Last year, convinced - and I wonder what produced this epiphany - that men were dismayed about high prices, Gillette's marketing geniuses initiated a campaign to tell us the cartridges for the Fusion Pro-Glide need be changed only once every 5 weeks.  The obvious goal was to get us to think that the lower-priced alternatives weren't going to save us that much money over time, and that we could get real savings right away just by using the blades longer.  If we all did that, their sales could be cut in half overnight, but they knew that wouldn't happen.  They just don't want to see their two-thirds share of the market shrink.  But when I was using the cartridges for two weeks, I felt a big difference between shave #14 one day and shave #1 the next.  So don't tell me five weeks.

Gillette (and the other big boys) could get serious about the competition and lower their prices.  That's the American way.  Just ask the Walton family.  They could drive their competitors out of business in short order.  Or they could do it the other American way and try to buy them out.  It will be interesting to see how this story unfolds.