The phone rang in the emergency department at two o'clock in the morning. The unit clerk answered, spoke briefly, hung up, and turned to tell me a patient upstairs had died, and the nurse needed me to come and pronounce death.
It is an oddity of state law where I practice that, when a patient dies in the hospital, a physician must pronounce death. A nurse can identify the absence of breathing and a heartbeat every bit as well as I can. But nobody asks me about things like this when they periodically revise the Nurse Practice Act.
I am at a community hospital that is part of our health network. My primary position for the last three years has been at a big teaching hospital, where there are residents (doctors training in specialties) in the hospital around the clock, so there is no shortage of physicians to attend to such matters. Here, however, late at night there is only one doctor in the building - the emergency physician - and so all responsibilities requiring a doctor belong to that one person.
Sometimes that means attending to emergencies involving inpatients: critically low blood pressure, respiratory distress, prolonged seizures. Sometimes it means seeing patients at the very end of life.
As I walked into the patient's room, I was reminded of one of the very first times I did this. A very recent medical school graduate in my first year of residency training, I was accompanied to that patient's room by an equally newly minted nurse. I entered the room and gazed at the ceiling. The new graduate nurse asked me what I was looking at, and I explained that when a person has just died, if you watch closely, you can see the spirit rise. I was lucky she didn't smack me for that twisted sense of humor, instead briefly thinking I meant it and then saying, "Oh, you're teasing me."
Tonight I placed my stethoscope on the patient's chest and listened for a heartbeat as I watched for the rise and fall of the chest that would be present if she were breathing. No heartbeat. No breathing. The nurse handed me the patient's chart, and I signed the "pronouncement of death" portion of the state form.
My note on the chart said, "Called to see patient who had ceased to breathe. No heartbeat or respirations. Death pronounced." I signed my name.
I looked at the patient's face and thought about whether her expression was peaceful. And I realized I knew nothing about her other than that she was now dead. I could review the chart to see if I could figure out the cause. When I read newspaper obituaries, they rarely say anything about the cause of death, unless it's a famous person. And I often wonder.
But that wasn't what I really wanted to know. I had a much longer list of questions.
What was she like as a child? What were her hopes and dreams? What about as a young woman? Did she fall in love and get married? Did she bear and raise children? Was she a homemaker, a wife and mother? Did she work outside the home, pursue a career? Was she a homebody, or did she travel? What were her aspirations for herself, her husband, her children? Were they fulfilled?
Did she have grandchildren? How many? What was she like as a grandmother? Did she try to make up for all the mistakes she had made as a mother through lack of experience? Did she try to give the benefit of that parenting experience to her children and their partners raising that next generation?
If she had married and had children and grandchildren, what were their memories of her, and which ones were their most cherished? Did she outlive her husband? How badly, and for how long, had she missed him after his passing? If he had survived her, how would her death affect him?
Had she had any regrets? Had she come to terms with them?
Had she ever made a bucket list? How many of the things on that list had she been able to cross off? How many were left? How many were only dreams, things she knew would never be crossed off but she thought still belonged there?
I will never know the answers to those questions, but I will ask them just the same, about every patient I care for who dies, and every one I see for the first time at the very end of life. I will write on the chart, but nothing I write there is what matters.
Every one of us is a collection of answers to such a list of questions. Each time I reflect on them, I think about what the answers will be at the end of my own life. Sometimes I think I should not dwell on my own mortality. But I believe keeping in mind that the journey is finite can sharpen our focus on making the most of it. And I happen to be in a profession in which reminders that the journey is finite are all around me, all the time.
Thursday, March 13, 2014
Do "Stand Your Ground" Laws Make Things Better or Worse?
A recent report on National Public Radio examined this question. Has the enactment of "Stand Your Ground" laws made things worse instead of better?
To review, for those who weren't paying attention when the public spotlight shone on such laws in the aftermath of George Zimmerman's killing of Trayvon Martin in Florida: such a law says, in essence, that a person who is anywhere he or she has a legal right to be has no duty to retreat before using lethal force in self defense.
Florida's "Stand Your Ground" law wasn't actually relevant to the case, as Zimmerman claimed self-defense as justification for the shooting without reference to the provisions of the law, but that was a legal distinction that was lost on the general public because it was largely ignored in the news reporting.
The NPR report relied heavily on a new study in the Journal of Human Resources by Cheng and Hoekstra from Texas A&M University. These investigators are economists who used the tools of social scientists to examine empirical data, comparing states that adopted new laws of this nature with others that did not, over time.
The first problem with the study, which is not a major flaw, is that it combines two different kinds of laws. The first is the Stand Your Ground law, and the second is the Castle Doctrine. The castle doctrine, simply, is the idea that one has no obligation to retreat from criminal assault in one's own home. The authors assert that the castle doctrine is a feature of English common law (which is historically true) and commonly applied in the US, sometimes in statute, sometimes in case law (which is a bit misleading, because there are some states - New York a prime example - in which the duty to retreat, even from one's own dwelling, remains).
So, the incorporation of the castle doctrine into statute and the enactment of stand your ground laws are combined for purposes of this study. The researchers then looked at the effect of such changes on homicide rates. The paper is long (43 pages) and, as one might expect of a study that incorporates statistical analysis and modeling, rather dense. But it is methodologically rigorous. Like any such study, it necessarily relies upon some assumptions that are subject to question or criticism, but the authors (to their credit) acknowledge this and (in some particulars) show how alternative assumptions affect the results of the analysis.
The central conclusion is that these laws have been associated with an 8% increase in homicides relative to states that did not adopt such laws. To be clear, that does not include killings that were reported to the national database as justifiable homicides, although the investigators note that only a small fraction (probably no more than 20%) of justifiable homicides are reported that way.
Using the 20% assumption, they estimate that about half the difference in killings between adopting and non-adopting states can be accounted for by justifiable homicide. They also acknowledge that if they had used a 10% assumption, which they grant is entirely within the realm of reasonable possibility, all of the additional homicides would then (statistically) be accounted for by the "justifiable" category. As a matter of definition, "justifiable" in this context means that lethal force was used to stop a felony in progress in a situation in which law enforcement or a prosecutor considered it proper for a civilian to do so.
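To make the sensitivity to that reporting assumption concrete, here is a rough back-of-the-envelope sketch of the arithmetic just described. The counts are purely illustrative placeholders, not figures from the Cheng and Hoekstra paper; only the scaling logic matters.

```python
# Back-of-the-envelope sketch of the reporting-rate sensitivity (illustrative numbers only).

extra_homicides = 80        # hypothetical additional homicides in adopting states
reported_justifiable = 8    # hypothetical portion of those reported as justifiable

for reporting_rate in (0.20, 0.10):
    # If only this fraction of justifiable homicides gets reported as such,
    # scale the reported count up to estimate the true number of justifiable killings.
    estimated_justifiable = reported_justifiable / reporting_rate
    share_of_increase = estimated_justifiable / extra_homicides
    print(f"reporting rate {reporting_rate:.0%}: about {estimated_justifiable:.0f} "
          f"justifiable killings, {share_of_increase:.0%} of the increase")
# 20% -> about 40 justifiable (half the increase); 10% -> about 80 (all of it).
```

Halving the assumed reporting rate doubles the estimated number of justifiable killings, which is why the conclusion swings between "about half" and "essentially all" of the increase.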
[Oh, about this commonly accepted 20% assumption: we must realize it is based on the work of a criminologist whose methods are also very rigorous but who is perceived by pro-gun-control advocates as supportive of the right of armed self defense. Any time you're looking at work in this field, it is important to know whether the researchers have any biases, either real or perceived.]
So where does that leave us? What does it mean if homicides increased by 8%, which is significant, and half of that increase was justifiable? Is it bad to have more homicides whether they were justifiable or not? Did the justifiable homicides take some career criminals off the streets, thereby preventing future crimes? What about the other half of the additional homicides (and remember the underlying assumption), the ones that may not have been justifiable? Who committed them, and why?
The challenging questions to answer are the ones about the myriad subtle effects of these laws. Do they increase the number of people who choose to own guns for personal protection? The number who carry guns as they go about their daily lives? The willingness of those who keep and bear arms to use lethal force in self defense? Do they lower the psychological threshold for using lethal force? How do they affect the inclination of those who are armed to do everything possible to deescalate conflict, which is the legally and ethically correct thing to do? Do they increase (or decrease) the likelihood that a verbal argument will escalate into a fistfight and then a shooting?
We can all speculate about the answers to these questions, and our answers will be strongly influenced by our own biases. But the answers to these questions are unknown.
It would be interesting to examine the difference in homicide rates following adoption of these laws at a more granular level. How many more homicides were committed by people with a previously clean record, in lawful possession of a gun (with a permit, if not in their own homes), where the killing was ultimately found not to be justifiable? Although there is an abundance of data showing that holders of carry permits very rarely use their guns in the commission of crimes, we don't really know the answers to these questions. Zimmerman was acquitted in the killing of Martin, but even a cursory analysis of the incident as reported in the news suggests that there was plenty of opportunity for deescalation and avoidance of the shooting.
My own personal perspective on these laws is simple. If I ever (and I strongly prefer never) am forced to use lethal force in self defense, I want the burden of proof to be on the prosecutor to show that I acted unreasonably. If that cannot be shown according to the standard required under criminal law (reasonable doubt), I want to be shielded from civil liability, where the standard is much lower (preponderance of the evidence). The first part seems to me to be of self-evident necessity to someone who uses lethal force to preserve his own life. The second part may be every bit as important, because in a civil negligence case one has no access to a public defender, and even a successful defense is likely to be financially ruinous.
The broader question remains, I think, unanswered. Do these laws merely protect the intended victim of a crime from being victimized by the legal system, or do they lower people's threshold for using lethal force and make ours a more violent society? Cheng and Hoekstra have given us an interesting look at the data. They conclude that these laws do not deter crime, and they worry that the increase in the homicide rate might be the result of many killings that should not have occurred - and would not have occurred absent a change in the law. They are right to worry about that. We all should. But we cannot yet draw firm conclusions.
Thursday, February 13, 2014
Undo the Flu
You cannot be serious, I thought.
[Flashback to the early 1980s. Professional tennis player John McEnroe was my favorite, not because of his whiny, bad boy personality but because he was a magician at the net. I see him standing on the court, hands on hips, one of them holding a tennis racquet, staring in disbelief at an official who has just made a call with which McEnroe plainly disagrees. "You cannot be serious!" McEnroe yells. This phrase later became the title of his autobiography.]
"Undo the Flu." You cannot be serious. This is wrong, on so many levels.
First, one cannot "undo the flu." One can treat the symptoms: fever, sore throat, cough, headache, muscle aches. This constellation of symptoms comprises what we call an "influenza-like illness." At the peak of flu season, the statistical likelihood that such an illness is caused by the influenza virus, as opposed to one of several other viruses that can cause the same syndrome, is about 50%. Why does this matter? We have anti-viral drugs that have activity against the influenza virus but not the others. So, if you get "the flu," there's only a 50-50 chance it's influenza and a drug that works against the virus might help.
There is a test (by swabbing the inside of your nose) that is pretty fair at distinguishing whether it really is the influenza virus. If it's positive, the doctor might prescribe an anti-viral drug. If you watch TV, you've probably seen the commercial for the most popular one, called oseltamivir, marketed under the trade name Tamiflu. When it first came out, I joked that I would prescribe it only for patients named Tami. (I'd be flexible on the spelling, so Tammy could have a prescription, too.)
Why was I unimpressed with it? For the typical patient with influenza, if the drug is started within 48 hours of onset of illness, it shortens the duration by about a day. Beyond 48 hours, it probably makes little or no difference. There are some patients for whom the drug is clearly recommended: patients sick enough with influenza to require hospitalization, for example. But for most people it will make a very modest difference, and only if it's actually the influenza virus, and only if it's started very early on.
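To put a rough number on just how modest that is, here is a small illustrative calculation. The 50% likelihood and the one-day reduction come from the discussion above; the fraction of patients who actually manage to start the drug within 48 hours is purely my own assumption, included only to show the shape of the arithmetic.

```python
# Rough expected-benefit sketch for an influenza-like illness at peak flu season.
# Illustrative assumptions only; not clinical guidance.

p_influenza = 0.5          # chance the illness is actually influenza (from the text)
days_saved_if_treated = 1  # typical reduction in duration if started within 48 hours
p_started_in_time = 0.7    # assumed fraction of patients who start the drug in time

expected_days_saved = p_influenza * p_started_in_time * days_saved_if_treated
print(f"Expected reduction in illness duration: about {expected_days_saved:.1f} days")
# Roughly a third of a day, on average -- the "very modest difference" described above.
```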
It does not "undo the flu." If you really want to undo the flu as a public health problem, get on the bandwagon of advocacy for widespread vaccination. The vaccine is not perfect, because it is not 100% effective, and (like any medical intervention) it can have side effects, and it doesn't protect against those other viruses. But if undoing the flu is your goal, prepare your immune system to fight it off before you get it.
Is there anything else you could accomplish by seeing a doctor for an influenza-like illness? The symptoms can all be treated with medicine you can buy in a drugstore without a prescription. If you're not sure what to buy, because you don't watch TV commercials, just ask the pharmacist. What can a doctor prescribe that's better? Well, there is one thing. You can get a prescription for a narcotic.
For many centuries, human beings have known the effects of the opium poppy. The active substance derived from that plant is morphine, and we've made numerous modifications to the morphine molecule. Some of them are used as pain relievers and cough suppressants. They also elevate mood for most people, which is what makes them potentially addictive. So if you want to feel better by suppressing cough, relieving pain, and improving your mood, you could hope the doctor will prescribe a narcotic. But don't count on that, because the heavy emphasis on how prescription narcotics are turning us into a nation of addicts, and killing us in droves through overdoses, has made many doctors skittish about prescribing them for anyone who doesn't have cancer pain or a broken bone.
By now it should be obvious that I'm saying influenza is - for most people - an acute, self-limited illness. Translation from medical jargon: it comes and it goes, and nothing much makes any difference in the natural history of the illness. Medicines may ameliorate the symptoms, and most of them can be had without a prescription. Nothing makes the illness go away faster, with the very modest exception of oseltamivir, prescribed under just the right circumstances, as I've explained.
So the "Undo the Flu" billboard is encouraging you to consult a health care professional for an illness which, most likely, will not benefit from professional health care.
I am now picturing the look on the face of Joseph Califano if he were reading this billboard. Califano was Secretary of Health, Education & Welfare (HEW) in the Carter Administration. Califano believed that in medicine, supply generates demand. He was the architect of US health policy that led to a dramatic slowing of the growth in the supply of doctors. As a result, that supply has not kept pace with demand, and today everyone agrees there is a shortage of doctors, especially in primary care, and that for the next few decades there won't be enough of them to meet the medical needs of an aging population.
But the proliferation of urgent care centers, and the marketing of their services, is a spectacular proof of Califano's thesis about supply generating demand, and nothing captures the phenomenon better than a billboard encouraging people to seek professional health care of very modest (if any) benefit.
Urgent care facilities certainly have a niche to fill. Plenty of patients do need to see a doctor (or a nurse practitioner) for episodic care, and not everyone has a primary care doctor who can offer a timely or convenient appointment. But we have developed a habit, in the US and many other affluent nations, of seeking professional health care for every minor illness and every trivial injury. That is a costly habit. I don't know the size of its contribution to our ever-increasing national health care budget, but I think it is significant. And I think we should not be fostering it.

A few years ago I had the privilege of caring for a professional bull rider who'd been thrown and stomped. He had serious injuries to internal organs in the chest and abdomen. Over his protests, his buddies put him in the back of a pickup truck and brought him to the hospital. When I explained my findings and told him he needed to be under the care of a trauma surgeon, he told me he really thought he'd be fine. He thought he could just walk it off.
Of course he was wrong, but an awful lot of us see doctors when we really could just walk it off, because seeing a doctor won't make any real difference. Maybe we all need a little bit more of the cowboy in us.
Wednesday, January 15, 2014
Responsible Gun Ownership
Two recent stories featured by media outlets have gotten me thinking about responsible gun ownership. Both involved people who appeared to be law-abiding, rational citizens of the sort who can be trusted with concealed carry of a handgun.
The first incident was one of an "accidental discharge" of a semiautomatic pistol by a woman who is a member of the Kentucky state legislature (Rep. Leslie Combs). News reports indicated she was unloading the pistol in her office, although the reason for this was not entirely clear. She said something about having decided she did not wish to use the gun any more, according to a few news accounts, although most reports gave no reason at all for her actions. This doesn't really make any sense. If a member of the legislature with a concealed carry permit decided that she no longer wished to go about armed with a loaded handgun, why would she suddenly decide to unload it in her office?
Setting aside that question, the incident is illustrative of a few important points. First is that one must be thoroughly familiar with the operations of any handgun one carries or otherwise possesses. She clearly had a lapse: apparently she did not realize the gun had a round in the chamber when she pulled the trigger. Presumably she made the error of thinking it was unloaded after removing the magazine. This error is sufficiently common that some pistols are designed with a "magazine safety." This makes it impossible to fire a round that is in the chamber if there is no magazine in place in the gun's receiver.
[In case you're wondering, there is a reason some pistols do not have a magazine safety. Although such a safety makes a pistol somewhat more idiot-resistant, it also makes it impossible to fire the round in the chamber if one is in the midst of changing magazines or has somehow accidentally pressed the button that ejects the magazine. In a life-threatening situation, being unable to fire the round in the chamber because there is no magazine in the receiver could have tragic consequences. On the other hand, making a pistol more idiot-resistant has great appeal. When I teach people about pistols, I tell them they must be aware of how a magazine safety operates, decide whether they want the pistol they are going to keep or carry to have such a safety or not, and know whether any pistol they possess is so designed. I also teach them how to find out, if they aren't sure, by having a round in the chamber, removing the magazine, pointing the gun downrange, and squeezing the trigger.]
So my guess is that Rep. Combs had removed the magazine and had not checked for a round in the chamber. Then, when she pulled the trigger, she learned two things: there was a round in the chamber, and her pistol did not have a magazine safety. Such foolishness on the part of a gun owner is most unfortunate and proved quite embarrassing, given that the incident occurred in her office. Did she do anything right? Absolutely. She obeyed the first and most important rule of gun safety: she had the gun pointed in a safe direction. (That rule is commonly stated in either of two ways: always have the gun pointed in a safe direction, or never point a gun at anything you do not intend to destroy. Notice the inherent assumption that any gun is loaded, no matter how sure you are that it isn't.) The bullet struck the base of a bookcase.
Why did this happen? Inadequate training? Mental lapse? We cannot tell from the news reports. Despite the statement that she had decided she no longer wished to have a loaded gun in her possession, for unstated reasons, she remains a staunch supporter of the right to keep and bear arms.
The other news story was much more disturbing.
A retired police officer got into an argument with another patron of a movie theater, who was texting on a phone. The movie hadn't started yet. The officer went to report the other patron to cinema personnel, who reportedly took no action. The texting patron was nevertheless annoyed about having been reported. The argument escalated. The texting patron reportedly threw popcorn at the officer. There may be missing details in the reports of the sequence of events, but the accounts I've read make no mention of any blows being struck before the officer drew a pistol and shot the other patron and that man's wife. The man who was texting died, and his wife was wounded and taken to the hospital.
At the cinema near my home, where I go to see a movie on infrequent occasions, there is a very emphatic announcement, which goes on at some length, about how the use of a phone for talking, texting, web browsing, etc. during the movie will not be tolerated and will result in the removal of the patron. That seems reasonable to me. Confronting someone who is texting before the movie has begun does not. If I didn't know there was going to be an announcement, I might say to my fellow moviegoer, in the most congenial manner I could summon, that I hoped he wouldn't do that during the movie. But probably not. And if his reaction was unfriendly, I'd probably just move.
I've read a great deal about the decision to carry a concealed handgun, and one of the recurring themes is that this places upon the person carrying the gun a great burden of responsibility to avoid conflict, and to do everything possible to deescalate any conflict that occurs. A verbal argument that has the potential to escalate into a fistfight becomes a much more serious undertaking if it has the potential to escalate into a shooting.
The striking thing about this story is that the shooter was a retired police officer. If anyone has been trained to use, in any sort of interpersonal conflict, all manner of behaviors short of physical force, and all manner of physical force short of lethal force, it is a police officer. Therefore, the shooter's background is a compelling reminder that training is no guarantee that a person will adhere to important principles of nonviolent conflict resolution. Such education and training is a good thing, and not only for people who have guns. In my work as an emergency physician, I see people every day who could have benefited from it and perhaps avoided injuries from punches, kicks, impact weapons, and sharp implements.
As I'm sure you would guess, these stories have generated many comments on news websites and in social media. Those who dislike guns cite these incidents as evidence that human beings simply cannot be trusted to behave carefully and rationally, even when they appear to be the sort who could be relied upon to do just that. The conclusion they draw is that people just should not be allowed to go about armed.
As you might guess if you are a regular reader, that is not the conclusion I reach.
This is because I believe in the right of self defense as a fundamental, natural human right, and I believe we live in a society in which, for some of us, the exercise of that right may require the availability of lethal force in the form of a gun. Without that, the elderly, the frail, the weak, and the slow will always be at the mercy of those who are malevolent and who are young, strong, and quick, even if they are not themselves armed.
So the conclusions I draw are straightforward. All who possess firearms have an obligation to be well trained, to practice frequently, and to learn thoroughly and internalize profoundly the legal and ethical principles governing the use of lethal force in self defense.
Wednesday, December 4, 2013
Truth and the Internet
Everyone who travels by air likes to whine about the experience. It's mostly about the TSA, but sometimes it's about the behavior of other passengers. I wrote about that myself a couple of months ago. Recently there was a Twitter-based account of the interaction between a fellow who works as a producer on an ABC network television show and a fellow passenger (Diane from 7A). The story was later apparently outed as entirely fictional by its author. Perspectives on that ranged from amusement to annoyance to musings on the assumptions people make about the veracity of what they read online.
This brought to mind the many conversations I have had with friends and relatives about our reading habits. Not about reading online, or newspapers or magazines - but about our selection of books. My reading of books is exclusively nonfiction, mostly history, and especially US political history. I favor biographies, and especially those of important figures in American history, and even more particularly American presidents. I find history fascinating. However, I have learned that even the most scholarly works of historiography involve a prism. I am looking at something through the eyes of another human being. I am on my third biography of Lincoln, and three biographers see him in ways that are significantly different from each other. Getting such varied perspectives affords me insight into the art of interpreting the historical record.
I find history much more interesting than fiction. My sense is that, if I were reading a novel, and found myself becoming engrossed in the events of someone's life, I would stop every so often and think, oh, this didn't really happen (except in someone's imagination). The implication inherent in that thought is that, because it didn't really happen, it doesn't matter.
So the story of "Diane from 7A" got me wondering about the way people look at such things. Does it matter if the account of the interaction between two passengers on a plane as related via Twitter really happened? If you found it funny, is it any less funny if it didn't really happen? My experience with funny stories people tell each other in conversation is that if something is truly amusing, they may want to know whether that really happened or was just made up, but not because it is any less funny if it's fictional. No, it seems the reason they ask is so they will know whether the person telling the story has a great sense of humor. After all, which is the person you want to be around: the person who has a few true funny stories (because, after all, most of us don't have funny things happen around us very often), or the person who takes ordinary experiences and weaves tales around them for everyone else's amusement?
When I read something from a conventional news outlet, whether in print or online, I expect it to be factual. The reporters and editors have a responsibility to their readers to get the facts right and to report them as objectively as possible. But just about everyone else is, in my view, entitled to some literary license, or at least to an individualistic interpretation that may be strongly colored by an array of biases.
When I was in college, I took three music courses. Two of them were on specific periods (Classical and Romantic) and were taught by the same professor. The term paper assignment was to select a composer and write about his life and works. For the Classical period, I chose Mozart. This was before the Internet, so I went to the big Free Library in downtown Philadelphia and read all of the biographies of Mozart I could find. Some years later the film "Amadeus" appeared in theaters. The story was related, at least in part, from the perspective of rival composer Antonio Salieri. Even so, I found the portrayal of the title character jarring. I reflected upon my readings from the decade before, when I wrote my term paper. I looked at the film's protagonist and couldn't help thinking, "That's not the Mozart I knew." But this was a movie, not a documentary produced for the BBC, and not a history-for-television-audiences such as Ken Burns might have produced. The fellow who wrote the screenplay and the film's director are surely entitled to all kinds of license. I thought they portrayed Mozart as a silly ass, and I didn't like that, but then I didn't really know Mozart. I knew him only as his biographers depicted him, and traditionally biographies have been written by admirers.
As a lad, when I went to a public library and saw books neatly characterized as fiction or nonfiction, I thought life was just that simple. Either it really happened, just as described, or it's the product of someone's imagination. But life is not just that simple, and the Internet is certainly not. My assumption nowadays is that everything I read online comes from someone who has considered himself at liberty to take license in telling a story. Yes, I still expect "straight news" journalists to try to be factual and objective. But they are human, and all humans have a point of view. So I look for a variety of points of view and compare them, not assuming that any one is more reliable than others.
Sometimes when I look at things I am pretty sure they are fictional. Some photos, for example, strain credulity, and - despite being something of a technophobe (or at least a techno-naïf) - I recognize that they've been altered. It isn't always obvious, but for anyone but a true babe in the woods, it should be plain to see that such an image is the product of an inventive mind.
"Debunking" websites like Snopes.com are a wonderful resource. I use them all the time, taking advantage of the time and effort they put into investigating Internet-based stories and letting the rest of us know whether they have found truth or fiction. On social media, people are always posting things that are then promptly outed as "hoaxes" by their friends who have done a little checking. This sort of skepticism is a good thing, I suppose. And it seems to have become the rule, rather than the exception, at least in my social media circles. A recent posting about the death of a minor celebrity was immediately followed by a series of skeptical postings noting that all the links that turned up in a Google search seemed to stem from one source, which then was the subject of a discussion of its reliability (as might well be appropriate for an Internet source that mainly reports on celebrities).
Some of the stories I see are inspiring. Some are funny. Mostly I don't care whether they are truth or fiction, because their effect on my mood at the moment is independent of their grounding in reality. It just doesn't matter whether Diane ever really sat in seat 7A.
It does matter to me whether there was anything Abraham Lincoln might have done to prevent the Civil War. And it matters to me what he really believed, at the core of his being, about African slavery, and why he thought certain infringements upon civil liberties were justifiable in time of war. And I guess maybe that's why I have such a strong preference for nonfiction: I want to spend my time reading about things that really matter.
[Image: Tom Hulce as Mozart in the film Amadeus]
Monday, December 2, 2013
The Truth About Trans Fat
A few weeks ago the Food & Drug Administration announced the opening of a comment period on its regulatory proposal to ban the addition of trans fats to foods sold by industry (processed foods) and by restaurants.
The latest available data suggest that the current average consumption of trans fats is about one gram per day (according to the FDA, as reported by CNN last month). Given that experts on nutrition recommend we get no more than 1% of our daily caloric intake from trans fat, it appears the average American is already well below that threshold. (A gram of fat yields about 9 calories, so one gram per day would equal 1% of the intake only on a 900-calorie diet, far less than what most people must consume to meet their energy needs; on any realistic diet, one gram is well under 1%.)
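For readers who like to see the arithmetic spelled out, here is a tiny worked example. The one gram per day and the 9 calories per gram come from the paragraph above; the 2,000-calorie daily intake is my own assumption of a typical adult energy budget.

```python
# Quick arithmetic behind the trans fat figures above (illustrative only).

calories_per_gram_fat = 9        # energy yield of fat, kcal per gram
trans_fat_grams_per_day = 1.0    # reported average US intake (from the text)
daily_calories = 2000            # assumed typical adult energy intake, kcal

trans_fat_calories = trans_fat_grams_per_day * calories_per_gram_fat
share_of_diet = trans_fat_calories / daily_calories
grams_allowed_at_1_percent = 0.01 * daily_calories / calories_per_gram_fat

print(f"{trans_fat_calories:.0f} kcal/day from trans fat, "
      f"or {share_of_diet:.2%} of a {daily_calories} kcal diet")
print(f"The 1% ceiling on that diet works out to about "
      f"{grams_allowed_at_1_percent:.1f} g of trans fat per day")
# Output: 9 kcal/day, 0.45% of the diet; the 1% ceiling is about 2.2 g/day.
```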
So it would seem the elimination of trans fats is unlikely to make much difference in the public health, because consumption will change little. Certain foods that we still eat, though, contain notable amounts of these substances, and so if your diet includes significant numbers of, say, doughnuts, your trans fat intake might be a good bit higher than average.
Let's back up a little to look at how trans fats became part of our diet.
Several decades back the experts told us animal fat was bad for our health. And we responded by switching to vegetable oils. So the first thing to understand is the difference between a fat and an oil. In simplest terms, these substances are all built on molecules that are esters of glycerol (glycerides). If such a substance is liquid at room temperature, we call it an oil; if it's solid, we call it a fat. In nature, most of the glyceride-based substances from animals are fats, while most from vegetables are oils. There are well-known exceptions. Coconut oil, for example, is solid at room temperature but liquefies slightly above that.
On the molecular level, when you look at the bonding of carbon atoms to each other and to hydrogen atoms in the hydrocarbon chains that form the backbone of a glyceride, oils have fewer hydrogen atoms. Instead of participating in bonds with hydrogen atoms, some of the carbon atoms' electrons form double bonds between adjacent carbon atoms.
If we take an oil, which has fewer hydrogen atoms, and add some, we get a hydrogenated oil, which then behaves more like a fat (solid at room temperature). This is called a "partially hydrogenated oil," and you've probably seen phrases like that in the lists of ingredients on packages of food. In nature, the hydrogen atoms tend to be on the same side of the carbon chain, a configuration called "cis" in organic chemistry. When we add hydrogen atoms artificially, they tend to go on opposite sides of the chain, and chemists call that "trans." There is very little "trans" in naturally occurring animal fat.
So is trans fat bad for us because it's different from naturally occurring cis fat?
We used to think just the opposite. Those of us who are old enough to remember when margarine was first introduced will recall that it was thought to be much better for our health than butter. Margarine (partially hydrogenated vegetable oil) is trans fat. Butter is naturally occurring animal (cis) fat.
As the decades passed, there was intriguing evidence from population studies that trans fats might actually be worse than cis fats. They seemed to be associated with an increased incidence of colon cancer. And while we thought margarine was more "heart healthy" than butter, trans fats seem to be associated with worse, rather than better, findings when we have our blood tested for lipids. Specifically, trans fats are associated with higher levels of low-density lipoprotein (LDL) - the so-called "bad cholesterol."
My regular readers know that I am fascinated by the notion of truth in science. As knowledge advances, we are convinced that we are getting inexorably closer to the truth. We believe that what we "know" now is always more likely to be the truth, or closer to the truth, than what we "knew" before. We believe we are steadily getting better at discovering scientific truth, that our scientific methods are constantly bringing us closer to a complete and accurate understanding of scientific phenomena, including anything having to do with the life sciences - and therefore with health and nutrition.
We may be right about this. I'd certainly like to think so. But I believe it is important to bear in mind that evidence isn't better just because it's newer. Any time we have new evidence, we should regard it with a certain level of skepticism - not because we are clinging to our "old" truths, but because evidence should always be examined closely, and its quality and meaning judged carefully.
So let's imagine how we might go about answering the question of whether trans fats are bad. We'd start with a very large number of people willing to be assigned at random to any of several groups whose consumption of trans fats would be carefully controlled over a period of many years. We'd assign them to getting, say, 1%, 3%, 5% (and so on) of their daily caloric consumption from trans fats. And then we'd follow them for two or three decades, closely monitoring health outcomes we think might be affected by consumption of trans fats.
As you might surmise, no one has done a study like that. Instead, we have epidemiologic evidence. We ask people to tell us about their dietary habits, we hope that their recall and reporting are correct, and we compare them with other people whose diets (as they recall and report them) are different but who are otherwise similar in ways we think might influence their health.
As you likely realize, the second kind of study is much easier to do, but it is highly vulnerable to what scientists call "confounding." Many other factors, dietary and otherwise, may affect people's health, and trying to identify them all and adjust for them in our statistical analysis is tremendously challenging.
Thus a study done prospectively, in which the use of large numbers of subjects and the process of randomization minimize confounders, is much more reliable. We still have the question of whether subjects will do what they're told, but we can ask them to keep an accurate diary and see for ourselves whether they followed instructions. A diary maintained contemporaneously tends to be much more reliable than recall and self-reporting.
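As a toy illustration of why confounding matters, here is a small simulation sketch; every number in it is invented for the purpose and drawn from no real study. A hidden lifestyle factor drives both a high-trans-fat diet and disease, so the observational comparison exaggerates the diet's effect, while random assignment of the same diet does not:

import random

random.seed(0)
N = 100_000

def simulate(randomize_diet: bool) -> tuple[float, float]:
    """Return disease rates among high- and low-trans-fat eaters.

    'sedentary' is a hidden lifestyle factor that raises disease risk and,
    in the observational case, also makes a high-trans-fat diet more likely.
    All probabilities are invented for illustration.
    """
    high_cases = low_cases = high_n = low_n = 0
    for _ in range(N):
        sedentary = random.random() < 0.5
        if randomize_diet:
            high_fat = random.random() < 0.5                           # assigned by coin flip
        else:
            high_fat = random.random() < (0.7 if sedentary else 0.3)   # self-selected diet
        # Toy model: the diet itself adds a small risk; the lifestyle factor adds a large one.
        p_disease = 0.05 + (0.02 if high_fat else 0.0) + (0.10 if sedentary else 0.0)
        disease = random.random() < p_disease
        if high_fat:
            high_n += 1
            high_cases += disease
        else:
            low_n += 1
            low_cases += disease
    return high_cases / high_n, low_cases / low_n

for label, flag in (("observational", False), ("randomized", True)):
    rate_high, rate_low = simulate(flag)
    print(f"{label:13s}: {rate_high:.3f} (high trans fat) vs {rate_low:.3f} (low trans fat)")
# The observational comparison roughly triples the true 2-percentage-point dietary effect,
# because sedentary people cluster in the high-trans-fat group; randomization removes that bias.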
Trans fats are bad. That is the message people have been hearing for quite a while now. And some laws have already been enacted. Restaurants have been prohibited from using trans fats in New York City, for example. Thus our consumption of trans fats has already declined very substantially.
So we can predict that, if the FDA declares that trans fats are no longer "generally recognized as safe" and prohibits their use in commercially prepared foods, the impact on the public health may be modest at best. Might it affect an individual who eats a lot of doughnuts much more? Maybe.
Doughnuts and some other baked goods are made with trans fats to give them a lighter texture. Without trans fats, doughnuts would likely seem more oily. Partially hydrogenated oils stay in the "matrix" of the food and do not exude out of it. If doughnuts wind up being greasier because of a new FDA regulation, some people won't like it. I suspect the folks who make doughnuts will figure out a way to solve that problem, but I'm no food chemist, so I really don't know.
Let us keep in mind what we're talking about here. This is a proposed government regulation that would say, "Thou shalt not" use trans fats - unless you can convince the FDA that doughnuts made with trans fats really are safe.
What level of evidence should we expect for such an edict? There are some things for which we very reasonably believe we don't need evidence from a prospective, randomized, controlled trial. No one would seriously suggest, to offer my favorite example, that we need to do a study to find out whether it really is safer to jump out of an airplane with a parachute than without one.
When it comes to public health and the use of case-control studies to try to figure something out, however, there are so many sources of error, bias, and confounding that I have to wonder whether what we "know" today is clearly much closer to the truth than what we "knew" when the food scientists first created margarine to save us from animal fat. Just a quick review of the dizzying array of opinions on whether polyunsaturated or monounsaturated vegetable oils are better for us will give you an idea of how challenging it is to be certain about things in this realm.
I don't have a problem with the government telling us what is good for us or outlawing what it thinks is bad for us.
Ha! I probably caught at least a few of you not paying close attention, reading that sentence and thinking I really meant that. The fact is, I do have a problem with that, especially when the evidence for what is good or bad for us is somewhat less than compelling. And this, I believe, is an instance in which it is quite reasonable to doubt the evidence and to ask whether what we "know" today is scientific truth.
Wednesday, November 20, 2013
Is It Time to Scrap the Second Amendment?
Earlier this month I wrote about the controversy stirred up among gun owners when a writer for the special interest magazine Guns & Ammo suggested some gun controls are not "infringements" on the right to keep and bear arms (RKBA). Most surprising to me was that a longtime gun writer seemed to have a basic misunderstanding of the meaning of the term "well regulated" as used in the text of the Second Amendment.
The following week a law professor from Texas A&M University, Mary Margaret Penrose, spoke as a member of a panel at a symposium held at the law school of the University of Connecticut. Penrose advocated ditching the Second Amendment, as part of a broader call for a constitutional convention to draft a comprehensive revision of the U.S. Constitution. Penrose believes many things in the Constitution, written more than two centuries ago, inadequately address the issues facing modern society. Although we have judicial interpretation to apply its provisions to current legal questions, and the document itself provides a mechanism for amendment, Penrose prefers a wider approach.
Penrose at UConn
Specifically regarding the Second Amendment, Penrose said the Framers included it in the Bill of Rights because of the 18th-century aversion to standing armies. The idea was that standing armies enabled oppression of a people by their rulers, and it was preferable to avoid having them. The alternative was that all of the citizenry be armed - or at least able-bodied adult males, who, as a group, would make up a sort of "unorganized" militia (as distinguished from the various state militias, which were "organized").
Now, of course, we have grown accustomed to having "standing armies." The United States has quite a large number of uniformed personnel in organized forces, full time, around the globe. So we obviously do not need an armed citizenry: it is no longer, in the words of the Second Amendment, "necessary to the security of a free state" that all able-bodied citizens have privately owned firearms and be practiced in their use.
Gun rights advocates would point out that an armed citizenry may not be essential for national security, since we have delegated that responsibility to the national government. But the Framers saw a second broad societal purpose for an armed citizenry: a "bulwark against tyranny" by our own government. The idea was that our central government wouldn't get "too big for its britches" and be tempted to oppress the populace if the people were armed and clearly intolerant of an oppressive regime.
Do we still need an armed citizenry to restrain our own government, lest it become oppressive and exhibit too little regard for the people's civil liberties? We could talk about that at great length, but chances are those who say yes would be labeled paranoid, while those on the other side of the argument would be called naive.
But we're missing a very important point. The Framers wrote the Second Amendment to restrain the central government, and so their writings are focused on the relationship between the government and the people. The government, they wrote, must not infringe upon this right that existed to guarantee the people's ability to resist tyranny. The Framers didn't write about hunting for sustenance or armed self defense against criminal attack. The importance of gun ownership for those purposes was so universally understood that it did not require exposition. Furthermore, that aspect of gun ownership was not connected, in the minds of the Framers, with the relationship between the government and the people, and the right to use guns for those purposes was not thought of as a political right.
Penrose suggests we should drop the Second Amendment and leave it up to the states to regulate the ownership and use of firearms as they see fit.
Coincidentally, the Fall 2013 issue of Tufts Magazine (obviously New England is a hotbed of intellectual curiosity) includes a fascinating article about how the United States can be divided into eleven regions, with marked differences in attitudes about things like gun rights, gun control, and violence as a social problem - and the proper solutions to that problem.
Tufts Magazine: Eleven Nations
The thesis that there are stark regional differences in people's beliefs about such things dovetails rather nicely with the contention by Penrose that we should leave gun rights up to the states.
There is, however, an obvious flaw in this reasoning. While there may be dramatic differences in people's tolerance for, or willingness to accept, stringent controls on the private ownership of firearms from state to state, these differences are seen on a societal level and cannot be assumed to reflect the beliefs or desires of individuals. Penrose seems to think if we leave gun rights up to the states, people can just sort themselves out. If I live in Massachusetts, where gun rights are little respected and gun controls are strict, I can just move to Arkansas. On the other hand, if I live in Vermont and am appalled that one may carry a concealed handgun without a permit, I can just move to our nation's capital, where there are no such permits issued to ordinary citizens, or to one of the states where permits are extremely difficult to obtain.
Of course she believes the enlightened folks who live in states with strict controls will need help from the federal government to keep them safe from illegal trafficking across state lines. As you may know, New York City Mayor Michael Bloomberg is convinced that gun crime would disappear from the Big Apple if he could just shut off the flow of guns from Virginia.
I consider myself fortunate to live in a state (Pennsylvania) with a modest regimen of controls. If I want to buy a handgun, all I need is money to pay the asking price and a clean record, so when the dealer queries the National Instant Criminal Background Check System (NICS), the sale will be approved. To get a permit to carry a concealed handgun, I must do some paperwork, pay a reasonable fee, and have a clean record and character references willing to vouch for me. But if I lived in any of a number of other states, it would be much more difficult, at least in some locales. Try getting a permit in New York City or most counties in California.
And so the question arises whether my right to keep and bear arms should depend on my zip code. If one views RKBA as a political right, then the answer is yes. We make political decisions about political rights. From the time of our nation's origin, we have restricted the right to vote. Early on it belonged only to adult males, and only to whites. We have repeatedly expanded suffrage, to include all races, then to include women, and then to include everyone at least 18 years of age. But the right to vote is a political right, and so we have made political decisions, as an electorate, about how broad that right should be.
Gun ownership is different. We live in a time and a set of social circumstances in which the gun is widely considered to be an essential implement of self defense. Certainly other options exist, including training in martial arts and the use of other weapons of varying lethality. But for most people effective self defense is available mainly through personal ownership of a gun and the achievement of proficiency in its use. Many advocates of strict gun control (or even outright bans) deny that this is so, insisting that gun ownership makes people less, rather than more, safe and secure. But I have read the published literature on this at great length, and it has convinced me that the intended victim of a criminal assault is considerably less likely to be injured or killed if armed than if not.
Effective self defense is a fundamental or natural right. In political philosophy and jurisprudence, these are terms of art with carefully elucidated meanings. But suffice it to say that fundamental, natural human rights exist independent of political constructs. We may guarantee them against infringement by governments in our constitutions, but such rights would exist even without those guarantees.
If you accept that characterization of the right to effective self defense, and you accept that in modern society effective self defense is most readily, and realistically, available through personal ownership of guns, then it becomes clear that it makes no sense for restrictions on RKBA to vary from one political jurisdiction to another.
That answers the question of whether the Second Amendment should be deleted from our Constitution because we no longer need it or because it is more appropriate for the states to be given free rein to determine gun rights.
[You may have noticed that, as a general rule, people to the left of center on the political spectrum are unfriendly to gun rights and quite friendly to abortion rights. They staunchly oppose giving the states free rein over the latter but are quite content to have the states that wish to restrict gun rights do just that, and the more the better.]
No, the Second Amendment should stay put. And the question of what sort of regulation of private ownership of firearms it permits should be decided by that arbiter of what the United States Constitution allows: the Supreme Court. There is no guarantee that the Supreme Court will always respect the fundamental, natural right of armed self defense and strike down laws that unduly restrict RKBA. But I am inclined toward greater confidence in the high court than in 50 state legislatures.