The Growing Perils of Pauline’s Pregnancy

It’s not easy to be a mother these days. Despite all of the advances in gender equality, the rearing of children still remains by default “women’s work.” This is not to say that fathers are not increasingly involved in caring for their kids, but most studies have shown that women still do the bulk of the work. Not only do they have to put up with nine months of discomfort while pregnant, but once the child comes, mothers are more likely than fathers to be responsible for changing diapers, looking after a sick kid, arranging for daycare and play dates, and even cooking, cleaning, laundry and other household chores.

You can now add to this some new fears: post-partum depression and the Zika virus. Earlier this week, for example, the US Preventive Services Task Force – an independent and non-partisan group of healthcare experts – recommended that all pregnant women and new mothers be screened for clinical depression. Despite several decades’ worth of research showing that a significant percentage of pregnant women and new mothers (nearly 1 in 10) will experience a major depressive episode, the condition goes largely undiagnosed. Untreated depression is the leading cause of prenatal and maternal morbidity in the US, and is associated with an increased risk of substance abuse and suicidal ideation among new mothers.

Given the Preventive Services Task Force’s new recommendations on depression, it is likely that most health insurance companies will soon cover the costs of screening. It may also spur Congress to pass legislation, introduced last year, that would fully fund mental health screening and treatment for all pregnant women and new mothers. Treatment, however, will likely remain a contentious issue.

Even when pre- or post-partum depression is diagnosed, whether because of perceptive doctors, the concern of family members, or a known history of mental illness, many afflicted women still go untreated. This is in part because antidepressant drugs – particularly the family of medications known as selective serotonin reuptake inhibitors (SSRIs), a class of drugs that includes such popular medicines as Paxil, Prozac, and Zoloft – have been linked to miscarriage, premature birth, low birth weight, birth defects and, in one small and still controversial study, autism. Although the absolute risk of these adverse birth outcomes is small, it is not insignificant, leaving many mothers and their physicians with the difficult challenge of weighing the known risks of untreated depression against the potential harms to the unborn child.

If that weren’t enough to make some think twice about having a child, pregnant women worldwide must now worry about a mosquito-borne virus known as Zika. Although the virus was first discovered nearly 70 years ago in African monkeys, there were almost no cases of human infection until recently. The first known outbreak occurred in Micronesia in 2007, followed by an outbreak in French Polynesia in 2013. However, these outbreaks didn’t raise any real concerns among doctors and public health officials, because the number of people infected was small and the symptoms relatively benign. In fact, over three-fourths of people infected with Zika don’t experience any clinical symptoms at all. Those who do become sick tend to develop a fever, headache and joint and muscle pain, but the symptoms are relatively mild and resolve within seven days.

So why is concern growing about the Zika virus now, especially for pregnant women? Starting in early 2015, public health officials in Brazil reported an outbreak of Zika in the northern part of that country. Within a short period of time, the virus spread to 21 other countries in the Americas. This includes the United States, where 20 cases of Zika have been reported among people who have traveled to Brazil or elsewhere in Central and South America.

Shortly thereafter, Brazilian authorities noted a sharp increase in the number of cases of a birth defect known as microcephaly, a rare neurological condition in which an infant is born with a smaller-than-usual brain. While some children born with microcephaly develop normally, many will experience lifelong symptoms including developmental delays and disabilities, difficulties with coordination and movement, hyperactivity, and seizures.

Since the outbreak began in Brazil, that country has reported nearly 4,000 suspected cases of microcephaly. In the previous year, the number of cases was less than 150. Although a causal link between Zika and microcephaly has not yet been proven, data now suggest that the risk of having an afflicted child increases 30-fold if a woman is infected with the virus while pregnant. So alarming are these figures that the US Centers for Disease Control and Prevention (CDC) has advised pregnant women to postpone travel to regions of the world where Zika outbreaks are actively occurring. Similarly, government officials in El Salvador, Colombia, Jamaica and Ecuador have taken the unprecedented step of recommending that women in those countries avoid getting pregnant until the Zika outbreak is contained.

Even if you are not planning to travel to Rio for the Olympic Games, however, you have reason to be concerned. The mosquito that transmits Zika is already present in the United States. Moreover, the warm and wet winter caused by this year’s El Niño event makes it likely that these mosquitoes will thrive in the coming months. That, coupled with the likelihood of the virus continuing to enter the United States with ever-increasing trade and travel with Latin America, means that Zika will likely take root in our fertile American soil. All we can do now is wait, plan and hope that a safe and effective vaccine for Zika is developed soon.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 28, 2016, and is available on the WAMC website.]

Posted in Health Care, Public Health, Reproductive Rights, Vaccines

Penning a Solution to the War on Drugs

After nearly six months on the run, Joaquin Guzman Loera — the Mexican drug lord known as “El Chapo” — was recaptured by Mexican authorities. He is now back in the prison from which he made his daring escape, awaiting extradition to the United States to face charges of drug trafficking and murder.

Interestingly enough, El Chapo’s apprehension was both aided and hindered by two celebrities, Oscar-winning American actor Sean Penn and Mexican telenovela starlet Kate del Castillo. In October, Mr. Penn travelled to Mexico for a secret meeting and interview with the infamous drug lord. At the time, Mexican law enforcement agents delayed a scheduled raid of El Chapo’s hideout in order to protect the safety of Mr. Penn and Ms. del Castillo, giving El Chapo time to escape. However, because of continued communication between El Chapo and the two actors – fueled by the drug lord’s narcissistic desire to have a movie made about his life – Mexican authorities were later able to track and capture him.

While Mr. Penn and Ms. del Castillo’s actions may not be illegal, in my opinion they certainly were unethical. More importantly, after reading Mr. Penn’s interview with El Chapo in Rolling Stone this morning, I am struck by how foolish and naïve the actor is. While paying lip service to the many law enforcement officers killed by El Chapo and other narcotics traffickers, Sean Penn nevertheless idolizes the drug lord, going so far as to justify his interview by saying, “I’m drawn to explore what may be inconsistent with the portrayals our government and media brand upon their declared enemies.”

Let us not forget that drug lords like El Chapo, both here in the US and overseas, are directly responsible for hundreds of murders and indirectly responsible for thousands of deaths associated with narcotics trafficking and the rampant use of illicit drugs. That said, I hope the controversy over the Rolling Stone interview and the pending trial of El Chapo leads us to consider a bigger issue: the failure of the so-called War on Drugs.

Since then-President Nixon first declared a war on drug use in 1971, federal, state and local law enforcement agencies have spent hundreds of billions of dollars combating drug use. But despite the money spent, true victories in the war are few and far between. Even when a large-scale drug lord like El Chapo is captured, he and his organized crime syndicate are quickly replaced by an even more ruthless gang of drug dealers. The killings continue unabated and there is but a short-lived dip in the amount of illicit narcotics flowing into the US.

Moreover, in addition to the billions of dollars spent hunting drug traffickers like El Chapo abroad, the government also spends billions of dollars annually imprisoning those who sell or use drugs here in the US. Because of the desire of our elected officials to appear tough on crime, those convicted of even minor drug-related offenses are sentenced to decades (or even life) behind bars. For example, almost half of all of the inmates in the federal prison system are there for nonviolent drug-related offenses, with marijuana being the leading drug involved. Since the start of the War on Drugs, the number of Americans in prison for selling or possessing narcotics has increased ten-fold, yet our drug epidemic remains unchecked.

Not only are the economic and political costs astronomical, but the social impact on many communities – particularly inner-city neighborhoods and communities of color — has been devastating. Much of the hostility of these communities towards law enforcement agencies (currently at a boiling point following police shootings of young African-Americans like Michael Brown and Tamir Rice) can be traced back to the War on Drugs and the subsequent militarization of the police.

No wonder then that many top law enforcement officials, including the police chiefs of Los Angeles, Chicago, and Houston, are beginning to decry our current approach to dealing with illegal drug use. One such group of police chiefs, known as Law Enforcement Leaders to Reduce Crime and Incarceration, is calling for reforms in our current judicial approach. Specifically, they want to reclassify many drug-related crimes from felonies to misdemeanors as well as reduce or remove mandatory sentencing minimums; a 19-year-old arrested for possessing a small amount of heroin for personal use will no longer be sentenced to 3-to-5 years in a state penitentiary, but will instead be referred to a substance abuse program.

While this is a good start, I wonder if reforming our judicial system is enough. Perhaps it’s time to have a national dialogue about illegal drugs and their use, including considering proposals to legalize some recreational drug use (as has been done for marijuana in four states and the District of Columbia). Creating a legalized yet highly regulated market might not only address the problem of drug trafficking and violence, but also generate tax revenues that could support much-needed substance abuse programs. In addition, legalizing some recreational drug use would put cartels run by vicious criminals like El Chapo out of business; one study found that key drug traffickers like Mexico’s notorious Sinaloa cartel would lose more than half of their annual income if the US simply legalized the use of relatively benign drugs like marijuana.

America has a serious drug problem, but it is clear after over four decades that our current approach has failed. The War on Drugs is simply bad policy, both economically and politically. As hard as it is for Americans to concede, it’s time for us to give up this losing strategy and look at other ways of combating the problem of drug use at home and abroad.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 14, 2016, and is available on the WAMC website.]

Posted in Crime, Drugs, Policy, Uncategorized

Dreaming of a White Christmas

Most of my friends, upon learning that I was raised in sunny California, are shocked to find that winter is my favorite season. Since first moving to this area in the mid-90s, I’ve relished the fact that I now live in a place with seasons, a region of the country that enjoys subzero temperatures and frequent snow during the darkest months of the year. No wonder then that the Snow Miser from the classic cartoon ‘The Year Without a Santa Claus’ is my yuletide Facebook avatar.

Much to my dismay and my local friends’ glee, that has not been the case so far this year. In fact, until this past Tuesday when we received a dusting of snow hardened into place by relentless freezing rain, this region of the country has had abnormally warm temperatures. It was so warm on Christmas Eve, a stunning 72°F, that my husband and I drove out to my in-laws’ house with the convertible top down on my car. For the Northeast, this December will go on record as the warmest in over 200 years.

A lot of the warmth that we are experiencing, and the similarly unusual weather in other parts of the country – including unexpectedly low temperatures in the Southwest, snow and ice storms in the Southeast, tornadoes in Texas, and flooding in Missouri – can be chalked up to the cyclical weather pattern known as El Niño. But global climate change is also likely playing a major role in this year’s (and future years’) extreme weather patterns.

El Niño, more appropriately called the warm phase of the El Niño Southern Oscillation (or ENSO), is a global pattern of climatic variation that occurs when an unusually warm band of seawater develops in the equatorial region of the Eastern Pacific Basin. It normally develops around December, thus giving this natural event its name; in Spanish, El Niño can mean “little boy,” but more often refers to the Christ Child.

When it occurs, El Niño creates increased rainfall across the east-central and eastern Pacific Ocean. While the effects of El Niño are more direct and stronger in South America, it is also associated with warmer weather in the western and northern US states, and heavy rainfalls in the south and southeast.

El Niño is associated not only with the natural disasters that we have seen on the news this past week — where dozens of people in the Midwest and South lost their lives and hundreds more lost their homes to raging floodwaters and swirling tornadoes — but also with outbreaks of infectious diseases that threaten the health and lives of millions more.

Until 1991, for example, the entire Western hemisphere had been free of cholera for more than 100 years. When the disease re-emerged in Peru, later spreading throughout South and Central America, it coincided with an El Niño event that resulted in much warmer than normal coastal waters.

Among the many hypotheses about the re-emergence of cholera in this part of the world is that the bacterium that causes the disease, Vibrio cholerae, was able to proliferate in these unusually warm waters, which set the stage for increased exposure and transmission to humans. Cholera is now re-established in Central and South America, and it is only a matter of time before it re-emerges in the US. That may occur this year. The US National Oceanic and Atmospheric Administration (NOAA) predicts that this year’s event “could be among the strongest in the historical record,” and the average temperature of US coastal waters has similarly surpassed prior historical accounts.

Cholera is not the only disease that US public health officials are worried about with this year’s unusual weather. Cases of the disease Cryptosporidiosis can also be linked to unusual and increased rainfall patterns. The disease is caused by a chlorine-resistant parasite that normally infects cattle and waterfowl, but which can be transmitted to humans when our drinking water becomes contaminated with agricultural waste.

In the US, the largest outbreak of Cryptosporidiosis occurred in Milwaukee, Wisconsin, in 1993 when agricultural runoff from local pastures contaminated the water supplies of the Howard Avenue Water Purification Plant. Over 400,000 people contracted the illness and nearly 100 died, with one study suggesting that the outbreak cost nearly $100 million in medical treatments and lost productivity.

While the US has not had an outbreak on that scale since, sporadic epidemics often occur during times of flooding. I would not be surprised if there is a localized outbreak in Missouri in light of the record floods that El Niño has caused.

There are, in fact, a whole host of endemic, emerging and re-emerging diseases that we should worry about given this year’s unusual weather. These include not only diarrheal diseases like cholera and Cryptosporidiosis, but also mosquito-borne illnesses like dengue fever, chikungunya and malaria, as well as Lyme disease, rabies and spongiform encephalopathy.

However, while El Niño makes this situation bad, global climate change makes the problem even worse. During the last several decades, the number of El Niño events has increased, and studies of historical data suggest that the increased frequency and intensity of these events is linked to global climate change.

As the average temperature of our planet increases, so too will the likelihood of weather-triggered outbreaks of disease. Given this state of affairs, it is unfortunate then that most of our leaders in Washington, and the current slate of Presidential candidates, seem to be largely dismissive of the threat that climate change poses to the public health. Until the economic costs of disaster relief, medical treatment and lost productivity directly affect their bank accounts and those of their corporate backers, however, I fear that little will change.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 31, 2015, and is available on the WAMC website.]

Posted in Climate Change, Politics, Public Health

Tempering Sheen’s Shame

Television actor and Hollywood bad boy Charlie Sheen revealed that he was HIV positive last month, breaking four years of silence during which he allegedly paid out millions of dollars to extortionists in an attempt to keep his diagnosis private.

Public reaction to Mr. Sheen’s announcement was swift but mixed. Many praised Charlie’s so-called ‘courage’ in making his diagnosis known in a live television interview. Others were quick to condemn or blame him, suggesting that he got what he deserved after engaging in years of unsafe and risky behavior. Still others, myself included, had a more tempered response, wishing Mr. Sheen well but questioning the circumstances and manner in which he revealed his HIV status.

When Charlie initially made his public announcement, some of the first questions that he was asked were about his behaviors in the years since his first diagnosis. Did he inform his intimate partners about his HIV status? Did he have unprotected sex or engage in other activities that could put others at risk of acquiring HIV?

For the most part, I believe that those sorts of questions are inappropriate. What Mr. Sheen does in his private life is none of our business, despite Charlie’s penchant for living his life very publicly. That said, I think it is worth discussing the responsibilities of those who know that they are HIV positive: to whom are they required to disclose their status, and what precautions must they take to prevent spreading the virus to others?

If you were to ask someone as knowledgeable as former Playboy Playmate and anti-vaccination activist Jenny McCarthy, the answer is quite clear: you should reveal your HIV status to everyone and spend the rest of your life avoiding all human contact.

Ms. McCarthy is apparently upset that she wasn’t told when she played Charlie’s love interest on a few episodes of the television show Two and a Half Men, including scenes in which the script called for her to hug and kiss Mr. Sheen. It is possible that she missed the health science class in which HIV transmission was discussed, and that she truly doesn’t know that the virus cannot be spread by hugging, kissing, or drinking from the same glass.

More likely, as is still commonplace in the US and the world, Jenny sees HIV as something to be ashamed of. She has internalized the belief that those who are infected with the virus are somehow dirty and depraved, and are the kind of people that virtuous citizens like Jenny McCarthy have the right to avoid.

What about some of the others that Charlie has been involved with since his diagnosis? Clearly they have a potentially stronger moral claim for knowing Mr. Sheen’s HIV status than Jenny McCarthy.

It now seems that a number of women that Mr. Sheen has been intimate with in the past four years plan to sue, starting with his former fiancée Scottine Ross. Last week Ms. Ross filed a lawsuit seeking $1 million in damages for emotional distress. According to the court documents filed, she found out that Charlie was HIV positive only after finding his antiviral medications in the medicine cabinet, a discovery that occurred months after they had begun a sexual relationship.

Mr. Sheen, of course, denies these allegations, setting up a classic “he said-she said” argument that the courts will be required to settle. Unfortunately for Charlie Sheen, he could also face criminal prosecution, as California is one of 33 US states that make it illegal to expose an uninfected individual to HIV through sex, shared needles or other routes of transmission.

Here’s the problem with this case. First and foremost, the varying legal requirements for disclosing HIV status aside (a topic I’ve discussed in a previous commentary and in an article in the International Journal of Law in Context), consensual sex implies that there is a tacit agreement between the partners involved. By becoming intimate with Mr. Sheen, Scottine Ross herself had the obligation to ask Charlie about his HIV status and, based on the answer and her own status, to negotiate acceptable behaviors and limits. There may be a greater onus on Mr. Sheen, but unless Charlie knowingly lied to her about his diagnosis Ms. Ross also bears some responsibility in this case.

Moreover, there is an asymmetry here. We assume that those who are HIV positive have a moral obligation to disclose their status. By contrast, people who frequently engage in unsafe sexual encounters but who do not know that they are HIV infected — even if they have a strong reason to assume so, based on behavior — are under no such obligation to disclose this information to a partner.

The moral obligation in this case is not disclosure. Instead, it’s the more general obligation to protect your partner from known risks, be it HIV, herpes, or any other sexually transmitted infection. But we know that treatment of those with HIV is as effective in reducing the risk of viral transmission as using condoms. Modern antiretroviral drugs can essentially render a person non-infectious, and we know that Charlie Sheen – unlike so many others currently living with HIV – is on treatment. He and his doctor have said so publicly, and have released medical records to prove this.

As I see it, Mr. Sheen took the precautions that make transmission exceedingly unlikely, meeting his moral obligation to protect his partners from intentional and malicious infection with HIV. Disclosing his HIV status to his partners, or even to the public, was laudable but not morally obligatory.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 17, 2015, and is available on the WAMC website.]

Posted in Celebrities, Clinical Care, HIV/AIDS

A Smack Upside the Head

Like many Americans, my in-laws have a Thanksgiving Day tradition of watching football and a Black Friday tradition of going shopping. Both of these are full contact sports, but only one of them will prove to be deadly for thousands of people.

That is because concussions — known clinically as mild traumatic brain injury (or MTBI) — are common in football. They are also common in other contact sports like soccer, hockey, boxing and martial arts.

Concussions are one of the most frequent traumatic brain injuries, occurring more than 1.5 million times a year in the United States alone. They happen when a blow to the head or body, a fall, or some other impact causes the brain to smash into the skull.

Symptoms of a concussion can range from a mild headache, blurred vision and disorientation to a loss of consciousness, convulsions and even memory loss. These symptoms usually subside within a few hours, but can last for days, weeks and even months. Unfortunately, there is no real treatment for a concussion other than physical and cognitive rest.

For most people who suffer a concussion, there are thankfully no lingering or long-term effects. But that is not the case for many athletes. Repeated concussions can result in a condition known as chronic traumatic encephalopathy (CTE), a progressive neurodegenerative disease characterized by impaired speech, deafness, amnesia, depression, anger and dementia. We now know that a significant percentage of amateur and professional athletes are likely to be suffering from CTE.

When retired Chicago Bears player Dave Duerson killed himself in 2011, for example, his subsequent autopsy found that he was suffering from CTE. So too was Hall-of-Fame linebacker Junior Seau, who took his own life in 2012. Just last week, the family of Frank Gifford revealed that the football legend also suffered from the debilitating effects of this disease.

In fact, one study of men who had histories of repeated concussions found that 80 percent showed evidence of CTE; most of these men had played football in high school, college or professionally. Similarly, a posthumous study of 91 brains of former professional football players found that 87 (a whopping 96%) tested positive for that neurodegenerative disease.

Despite this, and despite numerous lawsuits by former players and their families, professional organizations like the National Football League (NFL) have been slow to act.

In 1994, the League established its Mild Traumatic Brain Injury Committee with the stated goal of studying the effects of concussions and other head injuries among NFL players.

However, it wasn’t until 2003, nine years later, that the Committee produced its first report. Despite mounting evidence to the contrary, that report concluded that there were no long-term health consequences associated with the repeated concussions sustained by many professional athletes.

It took another six years, and a Congressional inquiry, before the League established clear rules on when a player can return to the field after suffering a suspected concussion.

It wasn’t until this season that the NFL created an independent spotter system, allowing athletic trainers and health professionals stationed on the sidelines and in the press box to stop the game for a medical timeout should they suspect that a player has suffered concussive injury.

But even that system seems to be ineffectual. During last Sunday’s game against the Seattle Seahawks, Pittsburgh Steelers quarterback Ben Roethlisberger took a hit that clearly left him staggering. He had difficulty getting off the ground after being tackled, and was clearly disoriented. Nevertheless, the game wasn’t stopped, and Roethlisberger completed an additional nine plays before removing himself from the game for “vision problems” (a clear sign of a concussion).

The same thing happened two weeks ago, when St. Louis Rams quarterback Case Keenum suffered an obvious concussion but continued to play. In neither case, however, have the teams been penalized despite these clear violations of the NFL’s own concussion protocol.

In a commentary that I wrote nearly two years ago, I railed against the reluctance of the NFL and other professional sports leagues to take the problem of concussions seriously. Nothing has changed. Despite new protocols designed to prevent a concussed player from returning to the field, the teams, the coaches and the players continue to ignore the rules. NFL-led investigations into cases like Roethlisberger’s and Keenum’s go nowhere; there are no fines or penalties imposed. It’s simply business as usual.

And that’s exactly the problem. Professional football is a business, and a lucrative one at that. Even the NFL’s lauded $1 billion concussion settlement, a one-time payout that would provide medical care and treatment for thousands of former players suffering from CTE, is a drop in the bucket for a league that makes tens of billions of dollars a year in tickets, merchandise, sponsorships and broadcast deals.

On the gridiron, money matters more than lives. Until we tackle that problem head on — be it through regulations, policies or heavy fines — nothing is going to change.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 3, 2015, and is available on the WAMC website.]

Posted in Athletics, Clinical Care, Public Health

A Withering Shine for Sheen

Earlier this week, troubled actor Charlie Sheen announced that he is HIV positive. Charlie now joins the 1 million Americans and nearly 40 million people worldwide living with HIV/AIDS. He also joins a small list of celebrities — NBA star Earvin “Magic” Johnson, professional tennis player Arthur Ashe, Olympic diver Greg Louganis, fellow actor Danny Pintauro, and a handful of others — who have gone public with their diagnoses.

The mainstream and social media response to this announcement has been interesting. A lot of Charlie’s fellow celebrities have expressed sympathy or applauded him for being “brave” in revealing his diagnosis publicly. Others have shaken their heads in disbelief. A vocal minority has even suggested that he only has himself to blame, that years of risky (and often public) behaviors made his infection with HIV inevitable.

I do have a lot of sympathy for Mr. Sheen. No one deserves to be infected with HIV, regardless of how irresponsible they act. I’m also glad that he is on treatment with antiretroviral drugs, both to protect his own health and to prevent transmitting the virus to his partners. There are, however, a number of things about his case that temper my response.

First and foremost is the fact that he hid his diagnosis for years, reportedly paying millions of dollars in hush money to those who knew so that they would keep his HIV status quiet. He might be brave in coming forward now, but for nearly five years he was a bit of a coward. I also fear that he is a self-serving jerk.

What do I mean by that? Doesn’t it make sense that Mr. Sheen would want to hide his HIV status in order to protect his family and his reputation? After all, stigma and discrimination against those living with HIV/AIDS is still rampant, even in 2015.

In the US, HIV is still seen as a disease of gay men, prostitutes and drug users, even though some of the highest rates of new infection occur among heterosexual women of color. Those who reveal their HIV status still run the risk of losing their friends, their family, their jobs, their insurance and health care, and even their homes. For those living overseas, the problem of stigma and discrimination is even worse, often culminating in rape, violence and even death.

Despite this, I don’t buy the argument that he was afraid to reveal his HIV status because it would damage his television and movie career. He damaged his career himself many years before his HIV diagnosis through increasingly outrageous public behaviors that eventually resulted in his being fired as the star of a top-rated television comedy. The main reason he is coming forward now, I suspect, is to court sympathy and to rehabilitate his image.

Charlie has also said that he is going public with his diagnosis now so that he can stop the so-called “shakedowns” by those who knew that he was HIV positive. But the mere fact that Charlie could pay these extortionists millions of dollars to keep his status quiet shows just how atypical his experience is. Charlie Sheen is wealthy and lives a life of privilege. He will never face the same challenges that the vast majority of those living with HIV/AIDS deal with on a day-to-day basis. For example, he has ready access to treatment and care, the average cost of which can run almost $20,000 a year. By contrast, there are millions of people for whom these drugs are unavailable, unattainable, or simply too expensive to buy. He has the money and access to keep his infection under control and to keep his partners safe, unlike the majority of those with HIV worldwide.

Don’t get me wrong. I am glad that Mr. Sheen has gone public about his HIV status; I only wish he’d done so sooner and in a more appropriate way. He could have used his announcement to talk about how lucky he is compared to most of those living with HIV/AIDS. He could have talked about the deplorable lack of access to HIV treatment and care worldwide, a problem that could be solved for less than the cost of one month of military spending in Afghanistan and Iraq. He could have even used his mistakes to warn others about the risks of HIV and the importance of being tested. But he did not.

I wish Charlie Sheen well, and I hope that this announcement marks a turning point in a tumultuous and troubled life. But we shouldn’t place Charlie on a pedestal, or make him the latest “poster boy” for those living with HIV/AIDS.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on November 19, 2015, and is available on the WAMC website.]

Posted in Celebrities, HIV/AIDS, Media, Uncategorized

Let Them Eat Bacon!

That Earth-shattering noise that you heard last week was the sound of a billion bacon-lovers, myself included, screaming out in agony after the World Health Organization (WHO) classified processed meats as a definite human carcinogen and also classified red meat as a probable human carcinogen. After newspapers proclaimed that eating bacon was as dangerous as smoking cigarettes — one such headline in the Guardian, a British newspaper, proclaimed that “Processed Meats Rank Alongside Smoking as Cancer Cause” — carnivores around the world were left wondering if they would need to give up their beloved meaty treats.

The answer is no. What the WHO’s International Agency for Research on Cancer — a group of 22 independent public health and cancer experts — found after reviewing 800 studies looking at environmental and lifestyle factors that contribute to cancer is this: eating 50 grams or more of processed meats like bacon or sausage daily raises an individual’s lifetime risk of colorectal cancer by 18%. Similarly, regularly eating 100 grams of red meat is associated with a 17% increase in risk. Based on these findings, the WHO classified processed meats as group 1 carcinogens, the same cancer-causing category as tobacco.

This does not mean that eating bacon is as bad as smoking.

First of all, the International Agency for Research on Cancer classifies potential carcinogens into five categories based only on the weight of evidence that they are causally linked to cancer, not on the degree of cancer risk. Group 1 carcinogens like tobacco and asbestos are known to cause cancer. Group 2A and 2B carcinogens are probably or possibly linked to cancer, respectively. Group 3 compounds are not classifiable, and group 4 materials are probably not carcinogenic. Bacon and other processed meats are now considered group 1 carcinogens; the nitrates and nitrites used to cure these meats turn into cancer-causing N-nitroso compounds in the gut. But there are lots of things that are also in that same category of cancer-causing agents, including alcohol, birth control pills, smog and (gasp) sunlight.

So when you sit out on your sunny patio on a lazy Sunday morning having a Bloody Mary and eating a bacon-and-cheese omelet, you are exposing yourself to all sorts of cancer-causing agents. But the likelihood that any one of these things will cause you to get cancer is slim. This is what all the fear mongering about bacon and red meat gets wrong: the risk of developing cancer from eating processed meat is much lower than the risk associated with smoking. Moreover, the relative risk of a bacon lover getting colorectal cancer doesn’t amount to much in terms of their absolute risk of developing the disease.

Colorectal cancer, to which eating bacon and other processed meats is now linked, is the third most common form of cancer in the US. According to the American Cancer Society, about 130,000 US residents will be diagnosed with colorectal cancer in 2015. About 50,000 will die of the disease. For the average American adult with no familial history of this disease, the lifetime risk of developing colorectal cancer is 1.8%; about 1-in-55 people will be diagnosed with this form of cancer. Eating two slices of bacon every day increases that lifetime risk to about 2.1%. By contrast, having a relative with colorectal cancer increases that risk to 3.4%, and having two or more relatives with a history of colorectal cancer increases it to 6.9%.
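
To make the relative-versus-absolute distinction concrete, here is a minimal sketch of the arithmetic using only the figures cited above (the function and variable names are mine, purely for illustration):

```python
# Back-of-the-envelope illustration of relative vs. absolute risk, using the figures
# cited in this post. A sketch for illustration, not an epidemiological model.

def risk_after_relative_increase(baseline: float, relative_increase: float) -> float:
    """Convert a baseline absolute risk plus a relative increase into a new absolute risk."""
    return baseline * (1 + relative_increase)

baseline_lifetime_risk = 0.018   # ~1.8%, roughly 1 in 55 (no family history)
bacon_relative_increase = 0.18   # IARC figure for 50 g of processed meat per day

new_risk = risk_after_relative_increase(baseline_lifetime_risk, bacon_relative_increase)

print(f"Baseline lifetime risk:         {baseline_lifetime_risk:.1%}")
print(f"Lifetime risk with daily bacon: {new_risk:.1%}")                          # ~2.1%
print(f"Absolute increase:              {new_risk - baseline_lifetime_risk:.2%}")  # ~0.3 percentage points
```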

What about smoking and lung cancer? Lung cancer is the most common (and most preventable) form of cancer in the US. This year, about 225,000 Americans will be diagnosed with lung cancer and 160,000 will die of the disease. But rates of diagnosis vary widely between smokers and non-smokers. An adult male who smokes is 25 times (or 2500%) more likely to get lung cancer than a non-smoker. Tobacco smoke, it turns out, is 140 times more carcinogenic than bacon.

This isn’t to say that colorectal cancer is not something to be concerned about. It’s a serious and often deadly illness. Treatment can include chemotherapy, radiation, and even colostomy (surgically diverting the colon to an opening in the abdominal wall through which digestive waste is collected). However, there are ways of reducing your risk of developing or dying of this cancer without forgoing bacon or red meat. You can eat a diet that is also high in fiber and antioxidant-rich foods, you can exercise regularly, and you can get screened by colonoscopy starting at age 50 (or even earlier if you have a familial history of the disease).

Of all the things in this world that are likely to kill us, bacon should be the least of our worries. Don’t let the media, the WHO, or your vegan friends try to convince you otherwise.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on November 4, 2015, and is available on the WAMC website.]

Posted in Cancer, health literacy, Media, Public Health

Reconsidering Cancer Screening Programs

In a public commentary that aired a little over a year ago, I caused quite a stir when I discussed the case of Amy Robach, the then-40-year-old ABC News correspondent who was diagnosed with breast cancer after receiving an on-air mammogram conducted as part of a Good Morning America story about cancer screening programs. Ms. Robach underwent a double mastectomy shortly after her diagnosis and is currently cancer free.

In that commentary, I raised concerns about the message that story presented to the American public about the utility of breast cancer screening programs. Specifically, I worried about the idea, promoted by organizations like the American Cancer Society and Susan G. Komen for the Cure, that all women should undergo screening as early as age 40. Better safe than sorry, right?

But not everyone recommends routine mammography for women in their 40s — including women like Amy Robach — unless they have a familial history of breast cancer. That is the position of guidelines issued in 2009 by the US Preventive Services Task Force, an independent and non-partisan group of healthcare experts. The Task Force concluded that most women should not undergo regular mammography until they are at least 50 years old.

This recommendation may seem counterintuitive. After all, breast cancer is a very serious public health issue. There are few families that it hasn’t touched, including mine. My aunt Kathryn recently passed after battling breast cancer for nearly two decades.

Breast cancer is currently the second leading cause of cancer death among American women. Nearly 250,000 new cases are diagnosed each year, and over 40,000 women die of invasive breast cancer. Moreover, for all of the hype around Angelina Jolie and testing for cancer-related genes like BRCA1, the vast majority of cases of breast cancer are spontaneous; that is, they occur in women with no familial history or genetic predisposition to breast cancer.

For women with no family history of breast cancer, the likelihood of developing it is 1 in 70 for those in their 40s. That rises to 1 in 35 for those in their 50s, and to 1 in 25 for women in their 60s. Overall, an American woman has about a 1 in 10 chance of developing breast cancer during her lifetime.

Those are pretty significant odds, so why shouldn’t women be screened annually for breast cancer? Why shouldn’t every woman in America have mammograms as early and as often as possible?

After reviewing decades of epidemiologic data, what the Preventive Services Task Force found was this: unless a woman has a familial history of breast cancer, routine mammograms before age 50 actually yield little benefit. For every 2,000 young women screened for breast cancer by mammography, only a single cancer-related death was prevented.

This is due in part to the fact that rates of breast cancer are lower among women in their 40s, and in part to the fact that mammography is a notoriously inaccurate method of screening younger women. The breasts of younger women tend to have more glandular (milk-producing) and connective tissue, while older women have breasts that are more fatty. This glandular and connective tissue is mammographically dense, appearing white on the X-ray film. Abnormalities like tumors also appear white, making them difficult to detect.

As a result, mammography misses an average of 20 to 30 percent of all cases of cancer in younger women (so-called false negative results). For women in their 40s with no familial history of breast cancer, non-invasive screening methods — feeling for lumps or looking for other symptoms of breast abnormality — can be as effective at detecting nascent breast cancer as mammography.

Mammography also has a high rate of false positives: findings that look like cancer but are later determined to be benign after additional testing, including invasive biopsies. After just 10 yearly mammograms, over 50% of women will have at least one false positive test result. False positives are particularly common among younger women, again because they have mammographically denser breast tissue.
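
That cumulative figure follows from simple compounding. The sketch below is my own illustration; the commentary cites only the 10-year number, so the per-screen false positive rate used here is an assumed value:

```python
# Chance of at least one false positive over repeated screening rounds.
# The per-screen rate is an assumption for illustration: with roughly a 7% false
# positive rate per mammogram, ten annual screens already push the cumulative
# probability of at least one false positive past 50%.

per_screen_fp_rate = 0.07   # assumed false positive rate per mammogram
num_screens = 10            # ten yearly mammograms

p_at_least_one_fp = 1 - (1 - per_screen_fp_rate) ** num_screens
print(f"Probability of at least one false positive: {p_at_least_one_fp:.0%}")  # ~52%
```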

The psychological effects of a false positive test result can be profound. Hearing that you may have cancer can be emotionally devastating, as suggested in recent television commercials produced by the American Cancer Society. The effects of hearing that diagnosis can linger, even after subsequent tests rule out cancer. One study found that a significant number of women who received a false positive result suffered from anxiety and depression. In some women these symptoms continued for years, even after cancer had been definitively ruled out.

In fact, based on all of these data, the American Cancer Society recently changed its breast cancer screening recommendations to be more in line with those of the US Preventive Services Task Force. Previously, the Society recommended annual mammograms starting at age 40. They issued new guidelines this past Tuesday, recommending that women without a familial history of breast cancer start having mammograms every year starting at 45, then every other year once they are 55. Other groups like Susan G. Komen for the Cure are still promoting earlier and more frequent screening.

Just like a false positive result, the differing recommendations from Komen, the American Cancer Society and the Preventive Services Task Force are likely to leave women anxious and depressed. Women and their doctors are now left to sort through conflicting messages about cancer screening and decide what is best for them based on their own personal circumstances, medical histories and prevention goals.

And that is the very point that I tried (and failed) to make in my original commentary. Cancer screening and prevention programs should not take a one-size-fits-all approach. The “worried well” might want to undergo frequent screening, so long as their desire to know whether or not they are cancer free is tempered with an understanding and appreciation of the potential harms associated with a false positive or overdiagnosis. Other women might choose to undergo less frequent screening, weighing the benefits of early diagnosis against the risks of mammography.

Screening saves lives, but not everyone needs to be screened. And not everyone needs to be screened early and often.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on October 22, 2015, and is available on the WAMC website.]

Posted in Cancer, Clinical Care, Public Health, Women

Orphan Drug Economics 101

Last month, the cost of a drug called Daraprim, used to treat a potentially life-threatening infection in cancer and AIDS patients, increased 5500% overnight.

Daraprim is used to treat toxoplasmosis, a disease that is caused by the parasite Toxoplasma gondii. Infection with this ubiquitous parasite can cause flu-like symptoms and, in some cases, encephalitis (an inflammation of the brain) that can result in blindness, brain damage or death. Babies can also be infected in utero if their mothers are exposed to the parasite. In those cases, known as congenital toxoplasmosis, infection can lead to miscarriage or stillbirth, or leave the child with lifelong developmental disabilities and hearing and vision loss.

The price of Daraprim increased from $13.50 to $750 per dose after a startup company called Turing Pharmaceuticals, founded by Martin Shkreli, a 32-year-old former hedge fund manager, bought the exclusive US marketing rights to the drug. Turing acquired these rights for $55 million from Impax Laboratories, which had itself obtained them only five months earlier when it bought a company called CorePharma.

The outcry from clinicians, medical organizations and patient advocacy groups was swift, particularly given that Daraprim costs only about $1 per pill to produce, is off patent, and has been on the market for over 60 years. For example, the Infectious Disease Society of America and the HIV Medicine Association condemned the price increase, declaring in a joint statement that, “this cost is unjustifiable for the medically vulnerable patient population in need of this medication and unsustainable for the healthcare system.” Similarly, advocacy groups like the Center for American Progress urged the federal government to start using its “march-in” rights to force licensing of drug patents to generic manufacturers, although that really wouldn’t apply for an off-patent drug like Daraprim.

Finally, Democratic presidential candidates Bernie Sanders and Hillary Clinton also spoke out, with Clinton unveiling a plan to cap out-of-pocket costs for drugs like Daraprim. The fifteen Republican candidates have been silent.

Turing CEO Shkreli was initially unapologetic, going so far as to call the price increase “altruistic” since the company plans to use the increased revenue to research new treatments for toxoplasmosis. He has since backed down, however, and has promised to lower the cost of Daraprim.

Now this is not the first time that Martin Shkreli has been linked to legal but morally questionable drug price practices. Just last year, Retrophin, a biotech company that Shkreli founded in 2011, acquired the rights to a drug called Thiola. He then jacked up the price of that drug, which is used to treat a chronic condition that causes recurrent kidney stones, from $1.50 to $30 per dose (raising the annual treatment cost from $2,700 to $55,000 for patients with this disease).
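
For context, the per-dose and annual figures quoted above are consistent with one another; the sketch below simply derives the implied regimen (the doses-per-year number is my own calculation, not something stated in the post):

```python
# Sanity check on the Thiola numbers cited above. Only the per-dose prices and the
# old annual cost come from the post; the implied doses per year is derived here.

old_dose_price, new_dose_price = 1.50, 30.00   # dollars per dose
old_annual_cost = 2_700.00                     # dollars per year at the old price

doses_per_year = old_annual_cost / old_dose_price     # ~1,800 doses, roughly 5 per day
new_annual_cost = doses_per_year * new_dose_price     # ~$54,000, in line with the $55,000 cited

print(f"Implied doses per year:       {doses_per_year:,.0f}")
print(f"Annual cost at the new price: ${new_annual_cost:,.0f}")
print(f"Per-dose increase:            {new_dose_price / old_dose_price:.0f}x")  # a 20-fold jump
```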

So how can folks like Shkreli get away with this? In part because medications like Daraprim and Thiola are so-called “orphan drugs.” Orphan drugs are used to treat rare illnesses or conditions, usually ones so uncommon that most drug manufacturers see little to no profit in developing and marketing treatments.

Let’s consider the case of Daraprim. The US market for this drug is actually quite small. Last year, American sales of Daraprim totaled only $5 million. While that would still amount to a roughly 9% annual return on Turing’s $55 million initial investment, it is far below the billions of dollars in revenue and 30% profit margin of most blockbuster pharmaceuticals.
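
Here is that back-of-the-envelope math, using only the figures cited in this post and treating gross sales as the return (which, since it ignores production and distribution costs, overstates the true return):

```python
# Rough market arithmetic for Daraprim at the old price, based on the figures above.

purchase_price = 55_000_000            # what Turing paid for the US marketing rights
annual_us_sales = 5_000_000            # US Daraprim sales in the prior year
old_price, new_price = 13.50, 750.00   # dollars per dose

gross_annual_return = annual_us_sales / purchase_price   # ~9% of the purchase price
years_to_recoup = purchase_price / annual_us_sales        # 11 years, ignoring costs
price_increase = (new_price - old_price) / old_price      # ~5,456%, i.e. roughly 5,500%

print(f"Gross annual return at the old price: {gross_annual_return:.0%}")
print(f"Years to recoup the purchase price:   {years_to_recoup:.0f}")
print(f"Per-dose price increase:              {price_increase:,.0%}")
```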

The reason for the small market is this: even though 1-in-6 Americans is infected with the parasite that causes toxoplasmosis — through the consumption of undercooked meat or unwashed vegetables or from their pets (cats readily transmit the parasite through their feces, which is why pregnant women or cancer patients should never clean the litter box) — most never develop symptoms. Only babies infected in utero and those who are immunocompromised because of extreme age or illness, such as cancer patients, transplant recipients or those living with HIV/AIDS, are at risk of developing the disease.

In 2014, fewer than 5,000 people were hospitalized and 327 died from toxoplasmosis in the US. That same year, only 2,000 prescriptions for Daraprim were written. Thus, there is little to no market incentive for other drug manufacturers to seek FDA approval to market generic versions of this drug and other orphan drugs.

Given this, it doesn’t make sense to respond to the Daraprim scandal by talking about patent march-in rights or proposing caps on out-of-pocket costs (which, when announced by Hillary Clinton on Twitter, caused the 144-member Nasdaq Biotechnology Index to drop by almost 5%). The problem is a lack of financial incentives for generic drug manufacturers to make orphan drugs like Daraprim or Thiola, such that the market for each of these drugs is monopolized by a single producer that can charge whatever the market will bear.

We need to get creative. We need to consider expanding laws like the Orphan Drug Act of 1983 — which provided pharmaceutical companies tax benefits, research subsidies, and extended patent protection and marketing rights for new drugs — in order to financially incentivize companies to produce generic versions of older orphan drugs. Without competition in the generic market, particularly for off-patent orphan drugs, the predatory acquisition and price spiking practices of pharmaceutical CEOs like Martin Shkreli will continue. It’s Economics 101, which, given the response to the current Daraprim scandal, many policymakers and advocates seem to have flunked.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on September 24, 2015, and is available on the WAMC website.]

Posted in Clinical Care, Pharmaceuticals, Policy, Regulation

A Question of Conscience

On Tuesday, in front of a large crowd of supporters that included Republican presidential candidates Mike Huckabee and Ted Cruz, Kim Davis was released from federal custody.

As you all undoubtedly know, Kim Davis is the Kentucky county clerk who refused to issue marriage licenses following the U.S. Supreme Court ruling that legalized same-sex marriage. Kentucky law requires marriage licenses be issued under the authority of an elected county clerk like Ms. Davis. A devout Apostolic Christian, however, Kim is opposed to same-sex marriage. She believes that being forced to issue marriage licenses to same-sex couples would violate her religious liberties.

Although she has six deputy clerks working under her, Ms. Davis also refused to let them issue marriage licenses. Since her name appears on the marriage license itself, she believes that issuing these licenses to a same-sex couple constitutes her “stamp of approval” of something she believes to be a sin.

After two straight and two gay couples sued Ms. Davis for refusing to issue them marriage licenses, she was ordered to do so by U.S. District Judge David Bunning. Despite Judge Bunning’s order and despite her appeals to the Sixth Circuit and U.S. Supreme Court being rejected, she remained defiant and was eventually jailed for contempt of court. She was released only after her deputy clerks began issuing marriage licenses this week, but could again face contempt charges if she interferes with the license process.

Now Ms. Davis isn’t the only government official who is refusing to issue marriage licenses to same-sex couples. At least two other clerks in Kentucky have also denied gay couples such licenses. Similarly, an Oregon circuit court judge has refused to marry same-sex couples. In all of these cases, these so-called public officials claim that doing so would be a violation of their First Amendment right to freely exercise their religious faith.

But is this really the case? While the First Amendment bars Congress from passing laws that prohibit the free exercise of religion, for now at least this applies only to an individual’s personal life. Once you enter the public realm, all bets are off. The First Amendment does not bestow upon an individual the right to impose their religious values on others, or to exempt them from laws that might be inconsistent with their spiritual beliefs. The State of California, for example, prohibits discrimination on the basis of marital status. Despite the protections of the First Amendment, a Catholic landlord in that state cannot (at least not yet) refuse to rent an apartment to an unmarried couple simply because he believes living together is a sin.

Despite all of the media hoopla, political spin and claims of religious freedom, this is actually a straightforward case of a public official failing to do her job. Kim Davis is not a private citizen and her refusal to issue marriage licenses is not a matter of private conscience. She is the elected clerk of Rowan County, Kentucky. It doesn’t matter if Ms. Davis is a thrice-divorced Evangelical Christian opposed to same-sex marriage, a teetotaling Seventh Day Adventist opposed to the consumption of alcohol, or a celibate Quaker opposed to handguns. If her job requires her to issue marriage licenses, liquor licenses or gun permits, then she should do her job or resign. At least that is the situation for now.

Unfortunately, it is unlikely that Ms. Davis’s fifteen minutes of fame have been completely used up. Self-serving politicians like Mike Huckabee and conservative organizations like the Liberty Counsel will continue to use cases like hers to push their religious agenda, resulting in what legal analyst Dahlia Lithwick calls conscience creep, “the slow but systematic effort to use religious conscience claims to sidestep laws that should apply to everyone.”

Over the past 40 years, for example, the US Congress has passed laws like the Religious Freedom Restoration Act. That law requires that the federal government demonstrate a “compelling interest” before placing limits on an individual’s religious freedom. Twenty states have passed similar laws, and six other states have religious freedom legislation pending. This includes a newly introduced law in Kentucky that would protect county clerks like Ms. Davis who refuse to issue marriage licenses to same-sex couples on religious grounds.

We are already seeing the impact of these conscience claims and religious freedom laws on our personal and professional lives. For example, most states have laws that allow physicians and other health care providers to refuse to provide abortions. Some thirteen states also allow doctors to refuse to provide contraceptive services. In seven of those states, pharmacists can refuse to fill prescriptions for emergency contraceptives. Similarly, in many states doctors can refuse to treat gay and lesbian patients solely because of their personal, religious or moral beliefs. Secular for-profit companies can refuse to provide employees with access to federally mandated birth control if doing so would run counter to the religious beliefs of the owners. Private adoption and foster care agencies can even refuse to place a child in a single parent or same-sex home if that would violate the agency’s religious or moral convictions.

But this is wrong, and it leads the U.S. down a very dangerous path. As I see it, the problem with all of these conscience claims and religious freedom laws is this: they upset a long-standing balance between religious liberty and civil rights. As written, these new laws increasingly allow individuals and organizations to opt out of civil rights and anti-discrimination laws that they object to on spiritual grounds. Your religious freedoms will soon trump my civil rights. You will be able to deny me medical care and access to prescription drugs, refuse to serve me in a restaurant, refuse to rent me a hotel room, and deny me access to all sorts of public goods and services, all under the banner of righteousness.

That is what is truly unconscionable.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on September 10, 2015, and is available on the WAMC website.]

Posted in Celebrities, Discrimination, Homosexuality, Human Rights