The Breast Intentions Are Fraught With Disappointment

About once a year on average, I seem to create a bit of a stir with a commentary on breast cancer and screening guidelines. In those commentaries, I sometimes question the message that American women are given about the utility of breast cancer screening programs. In the weeks that follow, both my email inbox and my answering machine tend to fill up with messages from people suggesting that I am wrong, sharing personal tales of invasive cancers that were detected only because of screening, and sometimes (although very rarely) hoping that a relative of mine is stricken with the disease.

Given this, let me start by saying that I take the topic of breast cancer very seriously. Breast cancer is one of the leading causes of cancer death among American women, second only to lung cancer. Nearly 250,000 new cases are detected each year in the United States, and over 40,000 women die annually from the disease.

All told, nearly 1 in 8 American women will be diagnosed with breast cancer during their lifetime, most of whom have no familial history or genetic predisposition to the disease. Few families thus remain untouched by breast cancer, including mine. My aunt Kathryn finally succumbed to the disease in 2015 after battling it for nearly two decades.

Breast cancer is a public health crisis and one that deserves a strong, concerted and well-reasoned response. The problem, however, is that current public health messages about breast cancer screening and treatment are disjointed at best and dangerous at worst. Currently, different professional organizations in the US offer differing and often contradictory advice about if, how and when women should be screened.

For instance, the American College of Radiology takes a very aggressive stance on screening and treatment, recommending that all women get annual mammograms starting at the age of 40. Private organizations like Susan G. Komen for the Cure similarly promote earlier and more frequent screening.

By contrast, the US Preventive Services Task Force, an independent and non-partisan group of healthcare experts that looks at the risks and benefits of clinical screening and disease prevention programs, recommends that most women should delay getting regular mammograms until after they turn 50. The Task Force also recommends that screening be done every other year, not annually. Only women with a familial history of breast cancer should be screened earlier and more often.

Finally, groups like the American Cancer Society have staked out the middle ground. That organization, for example, recommends that women get annual mammograms from age 45 to 54, followed by screenings every other year once they turn 55.

This is all very confusing for most women, and it is about to be made even more so as a result of a Danish study published this week in the Annals of Internal Medicine. That study, which looked at nearly 100,000 women diagnosed with breast cancer between 1980 and 2010, found that as many as one-third of those women might have been over-diagnosed and over-treated.

By comparing the medical records of those who participated in a mammographic screening program with those who did not, the Danish researchers discovered that there was no significant difference in the number of invasive tumors detected in the two groups. There was also no significant difference in the number of lives lost to cancer. Moreover, the number of non-malignant or slow-growing tumors detected was much higher in women who underwent regular mammography. Over 30 percent of the women in this group were diagnosed with a condition known as ductal carcinoma in situ, or DCIS.

Experts disagree on whether or not DCIS should be treated. It isn’t immediately life-threatening, but some doctors still recommend treating it quickly and aggressively with chemotherapy, radiation or surgery in order to prevent it from becoming cancerous. Others, however, argue that DCIS carries such a low risk of becoming invasive that it should only be monitored. The risks of treatment, these experts believe, outweigh the benefits. The Danish data, particularly the fact that the number of women with advanced tumors did not decrease despite an increase in the detection of ductal carcinoma in situ, would seem to support this view.

As I have suggested previously, the information being given to women and to their doctors about if, when and how often to get mammograms is becoming increasingly confusing and contradictory. Rather than rely on one-size-fits-all guidelines from the American Cancer Society, American College of Radiology or the US Preventive Services Task Force, women should instead decide what is best for them based on their own personal circumstances, medical histories and prevention goals.

Someone with a familial history of cancer, or someone who worries about their individual risk of cancer, might choose to undergo more frequent screening, so long as they understand the potential harms of over-diagnosis. Others might instead choose to undergo less frequent screening, concluding that the chance of catching an invasive cancer early doesn’t outweigh the risks of over-treatment.

I’ve said it before and I’ll say it again: screening saves lives, but not everyone needs to be screened as early and as often as some experts suggest.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 12, 2017, and is available on the WAMC website.]

Posted in Cancer, Clinical Care, Women

A New Hope for Mental Illness

Every year, my husband and I throw a big New Year’s Eve party. Most of the time, we celebrate the coming of a new year with food, champagne and the company of good friends. This weekend’s party will be particularly poignant for me. I will be toasting not to the coming of 2017 but, rather, to the end of 2016.

This year has been particularly tumultuous for me, characterized by significant professional challenges and two recent hospitalizations. It was also capped off by the passing of my mother, who recently succumbed to the very health problem that I have been struggling with for the past three months. The only positive thing to say about 2016 is that I have a newfound appreciation for all that I have, and a plan to achieve better work-life balance in the coming year.

Of course, I am not the only person who has faced personal and professional challenges this year. In fact, my own struggles cannot compare to those whose lives have been irreparably changed by the war in Syria, the gun violence in Chicago, or the terror attacks in Belgium, Florida, France, Germany and elsewhere.

We’ve also lost what seems to be an extremely long list of political figures, sports legends, and celebrated entertainers in 2016, including Cuban revolutionary Fidel Castro, boxing champion and political activist Muhammad Ali, and award-winning artist Prince. In the past couple of days, we’ve even lost two of my teenage idols: musician George Michael and actress-writer Carrie Fisher [Author’s Note: After this commentary was written and recorded, it was announced that Carrie’s mother Debbie Reynolds also passed away unexpectedly]. While I don’t normally pay too much attention to the comings and goings of celebrities like George and Carrie, I think it is worth commenting on the tragic deaths of these two public figures.

Both Carrie Fisher and George Michael – just like far too many celebrities and average folk – struggled with addiction. Mr. Michael’s abuse of cocaine and heroin was largely secret, coming to light (and inappropriately so, I might add) as the result of post-mortem interviews and tweets by his so-called friends and colleagues.

By contrast, Ms. Fisher’s battle with alcohol and prescription drug abuse was well known, chronicled by the actress herself in the autobiographical novel Postcards from the Edge. In going public about her struggles with sobriety – she was one of only a handful of celebrities to have done so willingly – Carrie helped to humanize the problem of addiction. One need only look to the flood of posts on social media following the news of Ms. Fisher’s death to understand just how much of an impact she had in inspiring others to come to terms with their own addiction. Moreover, it wasn’t just alcoholism that Carrie Fisher struggled with. She was also an outspoken advocate for those living with other mental illnesses, courageously sharing her own experience with bipolar disorder.

Sometimes called manic depression, bipolar disorder is a common illness that is characterized by sudden and extreme mood swings. As many as six million Americans — almost 3% of the adult population — suffer from the illness, experiencing dramatic shifts in mood and energy that range from euphoric highs (mania or hypomania) to crippling lows (depression). A related illness, major depressive disorder, affects approximately 15 million Americans; in any given year, nearly 7% of the adult population in the US will suffer from depression.

In most cases, depressive and bipolar disorders can be controlled with medication and psychological counseling. Because of the social stigma associated with mental illness, however, more than half of those living with these disorders go undiagnosed or untreated. Sadly, untreated depressive and bipolar disorders are a leading cause of suicide in the US, accounting for nearly two-thirds of the 30,000 suicides reported annually. Studies also suggest that 55% of those whose illness is untreated abuse illicit or prescription drugs while 45% abuse alcohol. Carrie Fisher herself believed that her years-long battle with addiction was a result of her undiagnosed bipolar disorder.

Along with other well-deserved epithets, 2016 will be remembered as the year that the mental health community lost a remarkable advocate. Despite Ms. Fisher’s untimely passing, however, there is still some hope (even “a new hope,” if we want to play off the subtitle of Star Wars Episode IV, the movie that made Carrie a household name). Shortly after her death last Tuesday, the hashtag #InHonorOfCarrie began to trend on Twitter. Within a couple of hours, nearly 200,000 people had used the hashtag to open up publicly about their own fight with mental illness.

Like her on-screen persona Leia Organa, Carrie Fisher seems to have led a revolt against the stigma of living with mental illness, inspiring a generation with her words. As she so famously stated in her 2008 memoir Wishful Drinking, “being bipolar can be an all-consuming challenge, requiring a lot of stamina and even more courage, so if you’re living with this illness and functioning at all, it’s something to be proud of, not ashamed of. They should issue medals along with the steady stream of medication.”

Like so many who have left us this year, Carrie will be missed, but the Rebellion that she once commanded will continue on.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 29, 2016, and is available on the WAMC website.]

Posted in Celebrities, Discrimination, Uncategorized

Means to an End

My mother passed away last Wednesday. She was found unresponsive on the floor of her kitchen early Tuesday, in severe septic shock from untreated peritonitis and a perforated intestine. Although she was admitted to the intensive care unit and given aggressive medical treatment, she never regained consciousness. Because of her age and her poor health – exacerbated by the fact that she had ignored the signs and symptoms of sepsis for nearly a month – her body was simply not strong enough to fight the infection. Less than thirty-six hours after she was admitted to the hospital, we let her peacefully and painlessly slip away.

I’m telling you this story not to garner sympathy, but rather to share with you a lesson that I learned. Because my mother was unmarried and because she was unable to consent to treatment, according to the laws of the state in which she lived, I was the de facto decision maker about her medical care. This is quite common. Unless otherwise indicated, family members – usually the spouse, adult children, adult siblings, and parents, in that order – are assumed to be the surrogate decision makers for a patient who cannot provide consent.

The decisions that I had to make, most of which were made at 2:30 in the morning after chatting briefly with the clinical care team, included the decision to make my mother DNR (‘do not resuscitate’) after her heart stopped for the third time. I also made the difficult decision to stop aggressive medical treatment and to move my mother to comfort care after a neurologist concluded that she had suffered extensive and irreversible brain damage.

A surrogate like me is supposed to make these decisions using a concept that we ethicists call substituted judgment: the surrogate should try to make the choice that the patient would have made had they been able to decide on their own behalf. Hopefully, I made the same decisions about my mother’s care that she would have made had she been conscious and able to speak.

In reality, however, all of my decisions were made without any sense of what my mother would actually want. Although I am a bioethicist – part of my job involves teaching students about the importance of planning for situations like this – my own mother had not made any decisions about her end-of-life care. For example, she did not have an Advance Directive. Sometimes called a Living Will, an Advance Directive is a legal document that specifies the type of medical treatment a patient would want or not want should they be unable to make decisions for themselves.

My mother was not alone in lacking an Advance Directive. According to a recent survey of nearly 8,000 Americans, over two-thirds do not have an Advance Directive, Living Will, Health Care Proxy or similar document like the Physician Orders for Life-Sustaining Treatment (POLST) form. They don’t have these documents because they don’t know about them or because they assume their families already know their end-of-life wishes.

Unfortunately, the few studies that have looked at the accuracy of family decision-making have also found that most health care proxies, like myself, might as well just guess what their loved one wants. Surrogate accuracy is only slightly better than chance, with rates of accuracy running about 50-65%. This is largely because too many people avoid conversations about end-of-life planning. Talking about death is difficult even under the best of circumstances, and talking about our own end-of-life wishes is harder still. Even when we do touch upon the issue, it’s usually some glib remark about “not wanting to be a ‘vegetable’”.

This was exactly the situation that I faced. Other than an off-the-cuff remark over a decade ago about the Terri Schiavo case, my mother had never spoken about her end-of-life wishes with me, my sister, or even her unmarried partner. Moreover, because my sister and I had a relationship with my mother that could be described as complex at best and tumultuous at worst, the likelihood that she would actually have been open to having this conversation was slightly less than zero. I knew she wouldn’t want to spend years in a persistent vegetative state, but I had no idea whether she would want to be intubated, whether she would be willing to spend months in rehab, or whether she would be content spending her remaining years in a long-term care facility.

I’m at peace with the choices that I made regarding my mother’s care. Given the severity of her situation, it was not a question of if she would pass away. It was a question of when, and the decisions that I made in consultation with the critical care resident ensured that her last hours were comfortable and pain-free.

I consider myself lucky in that regard. While I regret the fact that I will never again have the chance to address the unresolved issues that made my relationship with my mother so challenging, I wasn’t forced to make any decisions that could have resulted in weeks, months or years of a slow and lingering death. Sadly, far too many spouses, adult children, and other surrogate decision makers aren’t so blessed; they spend months or years as caregivers and health care surrogates, watching a person they love battle illness without ever knowing if they made the right decisions.

And this is why it is so important that we all talk about end-of-life decision making with our loved ones, no matter our age or current health status. We all expect to live for decades more. But life is unpredictable, and the only thing that is certain is that none of us get out of it alive. While it might be difficult to contemplate our own mortality, we owe it to those that we love to make sure that they know what we want when the inevitable comes.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 15, 2016, and is available on the WAMC website.]

 

Posted in Decision Making, End-of-Life, Health Care

Leadership. Commitment. Hype.

Today is World AIDS Day. It is, in fact, the 29th annual World AIDS Day, which is held every year on December 1st to honor the 35 million people who have died from the disease and to support the 40 million who currently live with HIV/AIDS. The theme for this year’s event, at least according to the US federal government? “Leadership. Commitment. Impact.” You’ll have to excuse me if I scoff openly at the audacity of that motto.

Let’s consider the leadership and commitment of our politicians in fighting HIV/AIDS. When AIDS was first identified in 1981, it was seen as a disease that primarily affected socially marginalized populations, notably gay men, injection drug users and immigrants from poor Caribbean countries like Haiti. As long as it was confined to those ‘undesirable’ groups, there was no need for upstanding American citizens to pay it much heed. Following the lead of then-President Ronald Reagan – who didn’t even mention the word ‘AIDS’ publicly until 1985, and then only sparingly – politicians and other members of his conservative administration largely ignored the looming public health crisis.

American leadership failed when it was needed the most, by refusing to tackle the nascent AIDS crisis with measures like comprehensive education, blunt messaging, and the active promotion and widespread distribution of condoms. Had federal officials not been so afraid of ruffling conservative feathers, it is entirely possible that the HIV/AIDS epidemic might have been thwarted then and there.

This isn’t to say that there haven’t been a few times when our political leaders actually stepped up to the plate and contributed to the fight against HIV/AIDS. In 1988, despite opposition by the more conservative members of the Reagan Administration, then-Surgeon General Dr. C. Everett Koop mailed every American household detailed information on the use of condoms to prevent the spread of HIV.

Similarly, in response to public pressure from celebrity advocates and radical activist groups, Reagan dramatically increased funding for AIDS research and established what would eventually be the first permanent advisory council on HIV/AIDS. Subsequent Administrations have increased funding and support for treatment and prevention efforts even more, including the establishment of the President’s Emergency Plan for AIDS Relief (PEPFAR), a global initiative spearheaded by George W. Bush that provides lifesaving antiretroviral treatment to millions of people living with HIV/AIDS in the hardest hit countries around the world.

That being said, political leaders at both the state and the federal level are more likely to stymie than promote efforts to prevent the spread of HIV/AIDS, most often for ideological reasons. Despite decades of research demonstrating that needle exchange programs greatly reduce rates of HIV transmission among injection drug users, for example, the use of federal funds to support these programs was largely banned until 2015. Opposition to federal support of needle exchange programs was largely based on the erroneous fear that they actively promote drug use among clients, in spite of evidence to the contrary. This decades-long ban likely led to the otherwise preventable infection of thousands of drug users and their partners.

Under the leadership of Vice President-elect Mike Pence, Indiana legislators eliminated all state funding for Planned Parenthood because of their ideological opposition to abortion. As many who have been following the political wrangling over Planned Parenthood know, however, legal termination of pregnancy accounts for less than 3% of the total number of medical procedures and services offered by that organization. Rather than a place to get abortions, for many poor Americans Planned Parenthood is the only source for a variety of desperately needed health care services, including: family planning; pregnancy testing and prenatal care; screening for breast, cervical and testicular cancer; testing and treatment for sexually-transmitted diseases; and HIV testing and education. When Pence and his colleagues cut funding for Planned Parenthood in Indiana, five clinics were shuttered. This included one clinic that was the sole provider of HIV counseling and testing in Scott County. Soon after, that rural community saw a 16-fold increase in the number of new HIV infections.

Similar increases in the spread of HIV are likely to be seen nationally should newly emboldened conservative politicians make good on many of their campaign promises. During the recent campaign, now President-elect Donald Trump called for the complete elimination of all public funding for organizations like Planned Parenthood. Congressman Tom Price, Trump’s nominee for Secretary of Health and Human Services, is an ardent pro-life advocate who has championed those efforts. Representative Price has also called for a rollback of Medicaid, including slashing funding of programs that provide HIV-positive patients with low-cost access to care and treatment, and is a strong supporter of faith-based abstinence-only education programs that are widely known to be ineffective in educating teens and young adults about HIV/AIDS.

Support for HIV prevention and treatment efforts has been slipping for years, in part because of public fatigue and in part because of hype that new drug regimens have turned HIV/AIDS from a once-deadly disease into a chronic condition that can be managed like diabetes or heart disease. AIDS is no longer seen as a serious public health crisis, which explains why both private and public funding for HIV/AIDS programs has been stagnant for almost a decade.

This problem is only going to get worse in the coming years, if the current political climate is any indication of where American priorities lie. Unless our newly-elected politicians show true leadership and commitment to the fight against HIV/AIDS – instead of their usual self-serving and ideologically-motivated efforts to promote themselves and enrich their donors – I fear that the hard-won gains that we have made since HIV was first discovered will soon be lost.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 1, 2016, and is available on the WAMC website.]

Posted in disadvantaged, Discrimination, HIV/AIDS, Policy, Politics, Uncategorized

Under the Knife

I nearly died last month. This is not an exaggeration. What started out as a bad bout of influenza quickly developed into something more. After five days sick in bed, I was struck with stabbing abdominal pains, a fever that spiked over 105° F, and a severe case of sepsis. Had I not gotten myself to the emergency room, I might have ended up in a coma, or worse, as a result of the raging infection coursing through my bloodstream.

I spent a total of 16 days in the hospital, including an overnight stay in the intensive care unit (ICU), as a team of doctors and specialists furiously tried to bring my infection under control. I’m now convalescing at home, 23 pounds lighter and with 6 holes in my chest and abdomen.

For someone who considers himself to be healthier than most men his age, this was a terrifying experience. For a bioethicist who reads, writes and teaches about clinical care, this was also a very humbling experience. Other than a couple of out-patient procedures to fix orthopedic problems, this was the first (and longest) time I have ever spent being treated for a severe medical issue. I learned a lot about what it means to be a patient, lessons that will undoubtedly influence my own research and writing about modern medical policies and practices. In particular, there are five lessons that I want to share.

Lesson 1: Modern medicine is an inexact science. Over the course of eight days I underwent eight X-rays, three ultrasounds, two CT scans, two hepatobiliary (HIDA) scans, an MRI, and a sigmoidoscopy. I also had dozens of blood tests. These test results were inconclusive and confusing, leading one surgical resident to admit to me that clinicians often just make their best guess as to what’s wrong, treating the symptoms and letting the body heal itself. In my case, it was only after they opened me up in the operating room that the doctors realized that I had peritonitis, peri-appendicitis, and several perforations of my small and large intestine. They still don’t know the cause of my illness.

Lesson 2: Modern medicine is very expensive. I have already received eight bills for my care, totaling nearly $30,000. Still looming are the charges for the operation, all of the medical tests, and the night in the ICU. My total bill is likely to be over $100,000. Thankfully, I have medical insurance and my total out-of-pocket costs are capped at $3,500, an amount I can afford. By contrast, there are many who live paycheck to paycheck for whom even a few thousand dollars would be a financial hardship, and that doesn’t include the 10% of Americans who are uninsured and would likely be driven into bankruptcy if they had to deal with a $100,000 hospital bill.

Lesson 3: The looming “superbug” crisis is even more frightening than I thought. I have written in the past about one of the most deadly threats to human health since the bubonic plague: the coming epidemic of antibiotic-resistant bacteria. I’m even more worried now. It took the doctors 16 days, infusing me intravenously with some of the strongest antibiotics known, to bring my infection under control. As soon as bacteria resistant to those drugs emerge – a question of when rather than if – there will be nothing available to treat such severe infections in patients like myself. Unless we address this problem head on, in the coming years millions of patients will die as a result of untreatable infections.

Lesson 4: Nurses and medical technicians are the under-appreciated heroes of modern medical practice. This is not to say that the doctors didn’t give me great care, but during my 16 days in the hospital I rarely saw them. They would pop into my room at random hours, check my vitals and palpate my abdomen, and then go out into the hall to issue new orders to the nursing staff. The nurses and medical technicians on the ward were the ones that provided the front line care that I needed. They treated my pain, drew my blood, gave me antibiotic infusions, managed my fever, bathed me, took me to the bathroom, sat with me, and provided me an unlimited supply of cold ginger ale to soothe my parched throat. They did so for me and all the other patients on the ward unflinchingly, despite the fact that many patients and family members (not me) often take out their anger, fear and pain by yelling at the nursing staff.

Lesson 5: Never underestimate the importance of friends and family in the treatment and recovery process. In some ways this is the most important lesson I learned. When I was at my lowest point – my abdomen distended, my legs swollen from the 15 liters of IV fluid pumped into me, and my pain controlled only by frequent injections of morphine – it was the visits from my friends and family that gave me the strength to soldier on. They didn’t need to talk (and sometimes they didn’t, as I was often barely lucid from the painkillers). Rather, they sat with me and held my hand. Without them, particularly my husband, I’m not sure I would have survived. Remember that next time a friend, family member or even a distant acquaintance is in the hospital. Make sure you visit them, even if it is only for a few minutes. Your presence is the most powerful medicine there is.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on November 3, 2016, and is available on the WAMC website.]

Posted in Uncategorized

The Age of the Superbug

With all of the media hullabaloo about Hillary Clinton’s pneumonia, Donald Trump’s physical exam, Brangelina’s impending divorce, and poisoned Skittles, you may have missed one of the biggest and most important health stories of this year.

Just yesterday, the United Nations General Assembly held a day-long meeting in New York City to discuss one of the most deadly threats to human health since the bubonic plague: antibiotic-resistant bacteria. This is only the fourth time in history that the General Assembly has met to address a health issue, having met twice in 2011 to talk about HIV/AIDS and chronic diseases, respectively, and again in 2014 to discuss the West African Ebola outbreak.

Antibiotic-resistant bacteria pose an even greater threat than Ebola, HIV/AIDS, and heart disease combined. According to the US Centers for Disease Control and Prevention (CDC), antibiotic-resistant forms of common bacteria like E. coli, Staphylococcus aureus, Streptococcus pneumoniae, and Mycobacterium tuberculosis – among others – infect nearly 2 million people a year in the United States, killing at least 25,000.

Worldwide, the number of people infected is several orders of magnitude greater; an estimated 750,000 people died from antibiotic-resistant infections in 2015. Within just a couple of decades, that number is expected to increase by nearly 1500%, yielding over 10 million “superbug”-related deaths annually by 2050.

Antibiotic-resistant infections will soon account for one-third of all deaths globally, a startling turnabout from 1967. That year, thanks to the widespread use of antibiotics and still effective public immunization programs, then Surgeon General William Stewart famously stated that, “the time has come to close the book on infectious diseases. We have basically wiped out infection in the United States.”

However, it turns out that it was the very successes that Dr. Stewart was touting –including copious use of antibiotics – that resulted in the grave crisis that we face today. It was the overprescribing and misuse of antibiotics over the past 50 years that allowed these superbugs to emerge.

Commonly used antibiotics like amoxicillin, cephalexin, azithromycin and ciprofloxacin still kill most bacteria, but a small percentage of these microorganisms are naturally resistant. Naturally occurring resistance has been seen for every antibiotic that has ever been developed. Thus, whenever an antibiotic is used, the drug-sensitive bacteria die off but the resistant bacteria survive. Eventually, if a particular antibiotic is used enough, the resistant bacteria take over. This is why antibiotics should only be used sparingly.

Unfortunately, we haven’t been so thoughtful in our use of these drugs. Ever since the first antibiotics were prescribed to treat serious infections among the soldiers fighting in World War II, we have used them more and more liberally.

It is not uncommon, for example, for a physician to prescribe an antibiotic like azithromycin to a patient with the flu, even though these drugs do not work on viruses like influenza. They may do so because they are hurried, because they misdiagnose the illness, because they want to prevent potential secondary infections, or (most likely) because their patients expect them to.

We also use antibiotics for non-medical purposes. Nearly 80% of the antibiotics produced annually in the US are not used to treat infections, but instead are used by farmers as growth promoters. Antibiotics are routinely added to the feed or water of agricultural livestock – cattle, pigs and poultry – in order to make these animals fatter.

Given the high dosages used, many of these drugs pass through the digestive system un-metabolized and are thus present in animal waste. This waste eventually enters the ecosystem through agricultural run-off or sewage spills, contaminating the ground, local streams and rivers, and underground aquifers. In agriculturally intensive regions of the world, pharmaceutically active concentrations of antibiotics are routinely found in soil and water samples. One study of the Yangtze and Pearl Rivers in China, for example, detected more than 60 different antibiotics in those waterways, often at levels that were 10,000 times greater than the normal human treatment dose.

As a result, the world is quickly running out of effective antibiotics. Despite this, there are few new antibiotic drugs in development. This is not because the need isn’t there; it is simply too expensive and too difficult for pharmaceutical researchers and drug manufacturers to develop, test and market new antibiotics when the bacteria adapt so quickly.

Unless we change our current practices, we will soon be entering a post-antibiotic era. We need to stop prescribing antibiotics for every little cold. We need to stop using these drugs to satisfy our desire for cheap meat, milk and eggs. We need a global effort to develop new drugs and treatments for the myriad of drug-resistant bugs we already face. And we need to do it before it is too late.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on September 22, 2016, and is available on the WAMC website.]

Posted in Clinical Care, Disasters, Drugs

How to Die in California

Late last month, Betsy Davis died at her home in Ojai, California. The 41-year-old performance artist was suffering from ALS, a progressive neurodegenerative disorder, also known as Lou Gehrig’s disease, which had already robbed her of the ability to stand, to walk, and to speak clearly. Facing the prospect of a slow and lingering death as she lost her capacity to move, to eat and, eventually, to breathe, Ms. Davis took her own life by taking a lethal dose of barbiturates.

In doing so, Betsy Davis became the first terminally ill patient to die under California’s End of Life Options Act. That law, which went into effect in June of this year, allows a terminally ill resident of California to be prescribed a lethal dose of drugs so long as they meet certain medical criteria, make two oral and one written request for physician aid-in-dying, and have the ability to take the drugs without assistance.

In passing the End of Life Options Act, California became the fifth state to legalize physician aid-in-dying. Oregon was the first state to legalize the practice by popular referendum, implementing it in 1998. Washington and Vermont followed suit in 2008 and 2013, respectively. Montana is the only other state where the practice is allowed, the result of a Montana Supreme Court ruling that nothing in that state’s laws prevents the practice.

California’s Act was passed largely thanks to the efforts of two advocacy groups. The first is Compassion and Choices, a national organization that has been working to expand end-of-life treatment options (including physician aid-in-dying) for the past 30 years. The other organization involved, and the one that has been the most instrumental in changing public opinion about physician aid-in-dying, was the Brittany Maynard Fund.

The Fund was established in 2014 following the death of Brittany Maynard. Earlier that year, the then 29-year-old Californian was diagnosed with astrocytoma, a rare and aggressive form of brain cancer. Despite treatment, which included a partial craniotomy and the removal of part of her temporal lobe, Brittany’s cancer continued to progress and she was given a terminal diagnosis.

Facing a brief future filled with pain as she slowly lost her memory, her vision, and her ability to walk and to speak, Brittany sought to end her life on her own terms. Unfortunately, at that time physician aid-in-dying was not available in her home state of California. In response, she and her husband left their family, friends and home behind and moved to the neighboring state of Oregon. It was there, on November 1, 2014, that Ms. Maynard ended her life peacefully by taking a lethal overdose of drugs that was prescribed to her under that state’s Death with Dignity Act.

In going public with her story, Ms. Maynard became one of the most visible faces of the right-to-die movement. A young, beautiful and talented woman, she presented to the public an image that was very different from what most pictured when thinking of the terminally ill: she wasn’t old, she wasn’t depressed or suicidal, and she wrote and spoke bluntly but eloquently about her terminal diagnosis and her desire to die on her own terms and in her own way. She also had the support of her husband of three years, Dan Diaz, who founded the Brittany Maynard Fund in honor of her memory.

In one of those rare coincidences, earlier this month I had the good fortune to meet Dan Diaz. Purely by chance, my husband and I happened to be dining in a bar-restaurant in New York City’s Little Italy when Mr. Diaz sat on the stool next to us. He was in town briefly as part of his unceasing efforts to lobby for expanded end-of-life options in New York and the other 44 states where physician aid-in-dying is still illegal.

Over the next two hours, we talked openly and honestly about Brittany, her experience, the arguments for and against aid-in-dying, and whether or not New Yorkers would be open to making the practice legal in this state.

Neither he nor I know the answer to that last question. The arguments for legalizing physician aid-in-dying are compelling, but so too are many of the concerns raised by critics. For example, physicians opposed to the practice believe that hastening death runs counter to the moral duties outlined in the 2500-year-old Hippocratic Oath. Others fear that terminally ill patients will be pushed into ending their lives because of the emotional and financial burden placed upon their loved ones. Disability rights advocates worry that legalizing physician aid-in-dying devalues the lives of those living with physical or mental limitations, and point to the recent legalization of euthanasia for severely disabled children in the Netherlands as proof that we are but one step away from legitimizing their murder.

These are all valid concerns, and ones that need to be respected and addressed as we begin to debate the issue of physician aid-in-dying more and more publicly. Moreover, we need to bluntly discuss the issue of death itself, by talking with our friends, families and physicians about what a ‘good death’ means for each of us, by planning for the inevitable with our loved ones and our lawyers, and by exploring and expanding alternative end-of-life treatment options like hospice and palliative care. That is the true legacy of women like Brittany and Betsy: not the hastening of death but the celebration of life, no matter how short.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on August 25, 2016, and is available on the WAMC website.]

Posted in Cancer, Celebrities, Disability, End-of-Life, Physician Aid-in-Dying, Policy, Uncategorized