Drop the Kleenex and Put Your Hands Up!

For the past week, mainstream, alternative, and social media outlets here in the United States and abroad have been consumed with discussion and debate about the legality and morality of President Trump’s recent travel ban. However, the so-called Muslim travel ban is not the only set of potentially controversial restrictions put into place recently.

Unbeknownst to most, the federal government is also planning to greatly expand the power of the US Centers for Disease Control and Prevention (CDC) to detain people who are suspected of carrying a dangerous communicable illness. Also known as quarantine – a term that comes from the Italian word for forty, a nod to the early Renaissance Venetian practice of requiring trading vessels to remain anchored offshore for 40 days before entering port – the detention, isolation, and even forcible treatment of those potentially exposed to an infectious disease like tuberculosis or Ebola is one of the most powerful and most contentious tools in the public health arsenal.

The authority of local, state, and federal officials to do this comes from the parens patriae powers of the state. Latin for “parent of the nation,” parens patriae refers to the legal doctrine that the government has a responsibility to protect those who cannot care for themselves. This includes, for example, the power of the state to intervene against an abusive or negligent parent. More broadly, it also encompasses the government’s responsibility to protect the health and welfare of the general population, which is accomplished through public health policies and practices like food safety inspections, fluoridation and chlorination of municipal water supplies, immunization programs and requirements, and the use of isolation and quarantine to prevent the spread of disease.

The decision to quarantine a person is not something to be taken lightly. Doing so places restrictions on an individual’s civil rights, including their right of movement and their right of assembly. They may also experience significant economic, psychological, social, and even physical injuries as a result of being quarantined.

Thus, quarantine can be justified only if it is absolutely necessary to protect others. To put it another way, the forcible detention of someone believed to be infected with a dangerous infectious disease is ethically and legally defensible only if it meets the standards of the harm principle, as originally articulated by philosopher John Stuart Mill. In his classic work On Liberty, Mill argued that “the only purpose for which power can be rightfully exercised over any member of a civilized community against his will, is to prevent harm to others. His own good, either physical or moral, is not a sufficient warrant.”

Most public health ethicists and lawyers argue that the use of quarantine must not only meet the harm principle; it must also be proportional to the danger faced and transparent in its application. For example, you wouldn’t quarantine someone with a disease that cannot be spread from person to person. Similarly, where a less restrictive measure would suffice, quarantine should be voluntary rather than compulsory. It is also expected that public health authorities clearly communicate the reason for the quarantine order and allow for a process of appeal.

This is why the expanded powers of the CDC, should they be enacted, raise considerable concern.

Currently, the CDC may detain only those who are entering the country or crossing state lines; to detain anyone else, the agency must get approval from local and state officials. The agency is also limited to quarantining people exposed to a handful of deadly and highly infectious diseases, such as cholera, tuberculosis, and plague.

Under the new regulations, however, CDC officials will be able to detain anyone in the country who is exhibiting signs of infection with a potentially dangerous disease, such as a high fever, headache, or cramps. Of course, these symptoms are nonspecific; a person with a fever of 104°F may be infected with Ebola, or they may just have a bad case of the flu. The proposed rules will also allow the CDC to detain someone for up to 72 hours before their case is subject to medical and legal review. That review will be conducted by CDC officials themselves, raising concerns about transparency, objectivity, and due process.

It should thus come as no surprise that many public health practitioners and health policy experts are concerned about these newly proposed quarantine regulations. Not only are they worried about the lack of legal safeguards and the potential for abuse by overzealous officials, but most also believe that expanding the CDC’s quarantine powers may actually elevate the threat of epidemic disease. People who are experiencing clinical signs of a dangerous illness, for example, may choose to hide their symptoms from public health authorities rather than run the risk of being detained.

Quarantine is a very potent weapon in the fight against infectious disease, but the decision to deploy this “nuclear option” must be made carefully and judiciously. Individual civil liberties must be protected even during a public health crisis. But as we saw during the 2014 Ebola outbreak, when public figures like Donald Trump and Chris Christie called for a blanket quarantine of those returning from West Africa despite the lack of an evidence-based reason to do so, government officials are far too quick to pull this trigger.

Giving the CDC greater authority and power to detain people on public health grounds will do little to prevent new outbreaks of infectious disease in the US, but it will further chip away at our already eroded civil liberties and rights.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on February 9, 2017, and is available on the WAMC website.]

Posted in Human Rights, Policy, Public Health | 1 Comment

A Public Cervix Announcement

On Monday, just days after millions of women (and their allies) marched in political demonstrations, researchers reported a disturbing new finding that could affect the health and wellbeing of these protestors. In a study published in this month’s issue of the journal Cancer, scientists found that a woman’s risk of dying from cervical cancer was much higher than originally suspected.

Cervical cancer is the fourth most common cancer in women worldwide. It also has the fourth highest mortality rate. Approximately 13,000 cases of invasive cervical cancer are diagnosed annually among American women. The number of women diagnosed with cervical cancer has decreased significantly over the past 40 years, largely due to the widespread use of the Pap test (or smear) to screen for the presence of precancerous lesions on the cervix, but over 4,000 women still succumb to the disease every year.

Previously, health experts had used those numbers to estimate that cervical cancer killed approximately 5.7 per 100,000 black women and 3.2 per 100,000 white women in the US. That racial disparity in death rates is pretty stark, particularly when you consider rates of cervical cancer incidence and mortality among other racial and ethnic groups. For example, Latinas have even higher incidence rates than black women — Hispanic women in the US are more likely to be diagnosed with cervical cancer — but in recent years the death rate in this group has fallen to the point where it is similar to that of white women.

What is particularly disturbing is that we now know those death rates are wrong and that the racial disparity is much worse than we assumed. Previous estimates of cervical cancer mortality failed to account for women who had undergone a hysterectomy to treat other conditions like endometriosis, fibroids, pelvic inflammatory disease, or ovarian and uterine cancer; almost one-third of American women will undergo a hysterectomy by the time they are 60. Because a hysterectomy almost always involves the removal of the cervix, which eliminates a woman’s cervical cancer risk, failing to exclude these women from the denominator of the rate calculation leads to a significant underestimate of the true rate of cervical cancer death.

When researchers adjusted these estimates to exclude women who had had hysterectomies, the results were startling: the actual cervical cancer death rate is 4.7 per 100,000 white women and 10.1 per 100,000 black women, rates that are 47% and 77% higher, respectively, than previously calculated. Moreover, even though Hispanic women were not included in this study, it is now clear that the racial disparity in cervical cancer death rates is much wider than originally thought. Black women in the US are dying of cervical cancer at the same rate as women living in less developed countries in Africa, Asia, and Latin America.
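To see why excluding these women changes the numbers so dramatically, it helps to run the arithmetic. What follows is a minimal Python sketch, offered purely as an illustration and not as the study’s actual methodology; the hysterectomy fractions in it are hypothetical values back-calculated to roughly reproduce the published figures, not numbers drawn from the paper.

```python
# Illustrative sketch only: how removing women without a cervix from the
# denominator inflates a crude death rate. The hysterectomy fractions are
# assumptions chosen to roughly reproduce the published figures; they are
# not taken from the study itself.

def corrected_rate(crude_rate_per_100k: float, hysterectomy_fraction: float) -> float:
    """Rescale a crude death rate after excluding women who, having had
    a hysterectomy, are no longer at risk of cervical cancer."""
    return crude_rate_per_100k / (1.0 - hysterectomy_fraction)

# Crude (uncorrected) rates per 100,000 women, as previously reported:
white_crude, black_crude = 3.2, 5.7

print(f"white women: {corrected_rate(white_crude, 0.32):.1f} per 100,000")   # -> 4.7
print(f"black women: {corrected_rate(black_crude, 0.436):.1f} per 100,000")  # -> 10.1
```

The correction is purely a denominator effect: the number of deaths doesn’t change, but the population genuinely at risk shrinks, so the true rate rises. It also shows why the racial gap widens, since black women undergo hysterectomies at higher rates, and so more of them were wrongly counted in the at-risk denominator.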

This is particularly troubling for a number of reasons, most notably because cervical cancer is largely preventable.

First, almost all cases of cervical cancer are caused by the human papillomavirus (HPV), a sexually transmitted virus believed to infect nearly 80% of adults at some point in their lives. Luckily, there are now vaccines on the market that protect against the types of HPV that cause 70% of cervical cancers in women. Studies suggest that these vaccines, if given before a woman becomes sexually active, are likely to be highly effective in preventing cervical cancer. Widespread vaccination of adolescents against HPV would largely eliminate this deadly disease.

Second, for those already infected with HPV, regular Pap smears (or newer, less invasive tests) can be used to identify precancerous cervical lesions early. The potentially cancerous cells can then be excised, frozen, or destroyed with a laser. Without treatment, these precancerous lesions will progress to invasive cancer in about 30 to 50% of cases; with treatment, fewer than 1% do.

So, the fact that so many women still die of cervical cancer, and the fact that black women die at significantly higher rates than white women, raises serious questions about their access to the HPV vaccine and to cervical cancer screening and treatment programs. For example, only 40% of American girls aged 13 to 17 have been fully vaccinated against HPV, and geographic, socioeconomic and racial disparities persist. Similarly, despite expert recommendations, only about half of American women undergo regular Pap smear screening. Oddly enough, rates of Pap smear testing are remarkably similar between black and white women, suggesting that the difference in death rates may be a result of disparate access to treatment and care rather than of disparities in the screening programs themselves.

Whatever the reason for the difference, these new findings – particularly the observation that black women in the US are dying from cervical cancer at the same rate as their counterparts in far less economically developed countries – suggest that our current prevention, screening, and treatment programs are insufficient.

Sadly, given the political winds currently blowing in Washington and various state capitals, we probably won’t see any improvements soon. When a prominent anti-vaxxer is tapped to chair an important vaccine panel, when a vocal opponent of Planned Parenthood is nominated to head the Department of Health and Human Services, and when the first goal of the new Administration is to repeal a program that provides preventative health care to millions, a new and concerted push to promote cervical cancer prevention and treatment programs is unlikely to occur.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 26, 2017, and is available on the WAMC website.]

Posted in Cancer, Clinical Care, disadvantaged, Health Care | Leave a comment

The Breast Intentions Are Fraught With Disappointment

About once a year on average, I seem to create a bit of a stir with a commentary on breast cancer and screening guidelines. In those commentaries, I sometimes question the message that is given to American women about the utility of breast cancer screening programs. In the weeks that follow, both my email and my answering machine tend to fill up with people suggesting that I am wrong, sharing personal tales of invasive cancers that were detected only because of screening, and sometimes (although very rarely) hoping that a relative of mine is stricken with the disease.

Given this, let me start by saying that I take the topic of breast cancer very seriously. Breast cancer is one of the leading causes of cancer death among American women, second only to lung cancer. Nearly 250,000 new cases are detected each year in the United States, and over 40,000 women die annually from the disease.

All told, nearly 1 in 8 American women will be diagnosed with breast cancer during their lifetime, most of whom have no familial history of or genetic predisposition to the disease. Few families thus remain untouched by breast cancer, including mine. My aunt Kathryn finally succumbed to the disease in 2015 after battling it for nearly two decades.

Breast cancer is a public health crisis and one that deserves a strong, concerted and well-reasoned response. The problem, however, is that current public health messages about breast cancer screening and treatment are disjointed at best and dangerous at worst. Currently, different professional organizations in the US offer differing and often contradictory advice about if, how and when women should be screened.

For instance, the American College of Radiology takes a very aggressive stance on screening and treatment, recommending that all women get annual mammograms starting at the age of 40. Private organizations like Susan G. Komen for the Cure similarly promote early and frequent screening.

By contrast, the US Preventive Services Task Force, an independent and non-partisan group of healthcare experts that looks at the risks and benefits of clinical screening and disease prevention programs, recommends that most women should delay getting regular mammograms until after they turn 50. The Task Force also recommends that screening be done every other year, not annually. Only women with a familial history of breast cancer should be screened earlier and more often.

Finally, groups like the American Cancer Society have staked out the middle ground. That organization, for example, recommends that women get annual mammograms from age 45 to 54, followed by screenings every other year once they turn 55.

This is all very confusing for most women, and it is about to be made even more so as a result of a Danish study published this week in the Annals of Internal Medicine. That study, which looked at nearly 100,000 women diagnosed with breast cancer between 1980 and 2010, found that as many as one-third of those women might have been over-diagnosed and over-treated.

By comparing the medical records of those who participated in a mammographic screening program with those who did not, the Danish researchers discovered that there was no significant difference in the number of invasive tumors detected in the two groups. There was also no significant difference in the number of lives lost to cancer. Moreover, the number of non-malignant or slow-growing tumors detected was much higher in women who underwent regular mammography. Over 30 percent of the women in this group were diagnosed with a condition known as ductal carcinoma in situ, or DCIS.

Experts disagree on whether or not DCIS should be treated. It isn’t immediately life-threatening, but some doctors still recommend treating it quickly and aggressively with chemotherapy, radiation or surgery in order to prevent it from becoming invasive. Others, however, argue that DCIS carries such a low risk of becoming invasive that it should only be monitored. The risks of treatment, these experts believe, outweigh the benefits. The Danish data, particularly the fact that the number of women with advanced tumors did not decrease despite an increase in the detection of ductal carcinoma in situ, would seem to support this view.

As I have suggested previously, the information being given to women and to their doctors about if, when and how often to get mammograms is becoming increasingly confusing and conflicted. Rather than rely on one-size-fits-all guidelines from the American Cancer Society, American College of Radiology or the US Preventive Services Task Force, women should instead decide what is best for them based on their own personal circumstances, medical histories and prevention goals.

Someone with a familial history of cancer, or someone who worries about their individual risk of cancer, might choose to undergo more frequent screening, so long as they understand the potential harms of over-diagnosis. Others might instead choose to undergo less frequent screening, concluding that the chance of detecting an invasive cancer early doesn’t outweigh the risks of over-treatment.

I’ve said it before and I’ll say it again: screening saves lives, but not everyone needs to be screened as early and as often as some experts suggest.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 12, 2017, and is available on the WAMC website.]

Posted in Cancer, Clinical Care, Women | Leave a comment

A New Hope for Mental Illness

Every year, my husband and I throw a big New Year’s Eve party. Most of the time, we celebrate the coming of a new year with food, champagne and the company of good friends. This weekend’s party will be particularly poignant for me. I will be toasting not to the coming of 2017 but, rather, to the end of 2016.

This year has been particularly tumultuous for me, characterized by significant professional challenges and two recent hospitalizations. This year was also capped off by the passing of my mother, who recently succumbed to the very health problem that I have been struggling with for the past three months. The only positive thing to say about 2016 is that I have a newfound appreciation for all that I have, and a plan to achieve better work-life balance in the coming year.

Of course, I am not the only person who has faced personal and professional challenges this year. In fact, my own struggles cannot compare to those whose lives have been irreparably changed by the war in Syria, the gun violence in Chicago, or the terror attacks in Belgium, Florida, France, Germany and elsewhere.

We’ve also lost what seems to be an extremely long list of political figures, sports legends, and celebrated entertainers in 2016, including Cuban revolutionary Fidel Castro, boxing champion and political activist Muhammad Ali, and award-winning artist Prince. In the past couple of days, we’ve even lost two of my teenage idols: musician George Michael and actress-writer Carrie Fisher [Author’s Note: After this commentary was written and recorded, it was announced that Carrie’s mother Debbie Reynolds also passed away unexpectedly]. While I don’t normally pay too much attention to the comings and goings of celebrities like George and Carrie, I think it is worth commenting on the tragic deaths of these two public figures.

Both Carrie Fisher and George Michael – just like far too many celebrities and average folk — struggled with addiction. Mr. Michael’s abuse of cocaine and heroin was largely secret, coming to light (and inappropriately so, I might add) as the result of post-mortem interviews and tweets by his so-called friends and colleagues.

By contrast, Ms. Fisher’s battle with alcohol and prescription drug abuse was well known, chronicled by the actress herself in the semi-autobiographical novel Postcards from the Edge. In going public about her struggles with sobriety, one of only a handful of celebrities to have done so willingly, Carrie helped to humanize the problem of addiction. One need only look to the flood of posts on social media following the news of Ms. Fisher’s death to understand just how much of an impact she had in inspiring others to come to terms with their own addiction. Moreover, it wasn’t just alcoholism that Carrie Fisher struggled with. She was also an outspoken advocate for those living with other mental illnesses, courageously sharing her own experience with bipolar disorder.

Sometimes called manic depression, bipolar disorder is a common illness that is characterized by sudden and extreme mood swings. As many as six million Americans — almost 3% of the adult population — suffer from the illness, experiencing dramatic shifts in mood and energy that range from euphoric highs (mania or hypomania) to crippling lows (depression). A related illness, major depressive disorder, affects approximately 15 million Americans; in any given year, nearly 7% of the adult population in the US will suffer from depression.

In most cases, depressive and bipolar disorders can be controlled with medication and psychological counseling. Because of the social stigma associated with mental illness, however, more than half of those living with these disorders go undiagnosed or untreated. Sadly, untreated depressive and bipolar disorders are a leading cause of suicide in the US, accounting for nearly two-thirds of the 30,000 suicides reported annually. Studies also suggest that 55% of those whose illness is untreated abuse illicit or prescription drugs while 45% abuse alcohol. Carrie Fisher herself believed that her years-long battle with addiction was a result of her undiagnosed bipolar disorder.

Along with other well-deserved epithets, 2016 will be remembered as the year that the mental health community lost a remarkable advocate. Despite Ms. Fisher’s untimely passing, however, there is still some hope (even “a new hope” should we want to play off of the subtitle to Star Wars Episode IV, the movie that made Carrie a household name). Shortly after her death last Tuesday, the hashtag #InHonorOfCarrie began to trend on Twitter. Within a couple of hours, nearly 200,000 people had used the hashtag to open up publicly about their own fight with mental illness.

Like her on-screen persona Leia Organa, Carrie Fisher seems to have inspired a revolt against the stigma of living with mental illness, inspiring a generation with her words. As she so famously stated in her 2008 novel Wishful Drinking, “being bipolar can be an all-consuming challenge, requiring a lot of stamina and even more courage, so if you’re living with this illness and functioning at all, it’s something to be proud of, not ashamed of. They should issue medals along with the steady stream of medication.”

Like so many who have left us this year, Carrie will be missed but the Rebellion that she once commanded will continue on.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 29, 2016, and is available on the WAMC website.]

Posted in Celebrities, Discrimination, Uncategorized | Leave a comment

Means to an End

My mother passed away last Wednesday. She was found unresponsive on the floor of her kitchen early Tuesday, in severe septic shock from untreated peritonitis and a perforated intestine. Although she was admitted to the intensive care unit and given aggressive medical treatment, she never regained consciousness. Because of her age and her poor health – exacerbated by the fact that she had ignored the signs and symptoms of sepsis for nearly a month – her body was simply not strong enough to fight the infection. Less than thirty-six hours after she was admitted to the hospital, we let her peacefully and painlessly slip away.

I’m telling you this story not to garner sympathy, but rather to share with you a lesson that I learned. Because my mother was unmarried and because she was unable to consent to treatment, according to the laws of the state in which she lived, I was the de facto decision maker about her medical care. This is quite common. Unless otherwise indicated, family members – usually the spouse, adult children, adult siblings, and parents, in that order – are assumed to be the surrogate decision makers for a patient who cannot provide consent.

The decisions that I had to make, most of which were made at 2:30 in the morning after chatting briefly with the clinical care team, included the decision to make my mother DNR (‘do not resuscitate’) after her heart stopped for the third time. I also made the difficult decision to stop aggressive medical treatment and to move my mother to comfort care after a neurologist concluded that she had suffered extensive and irreversible brain damage.

A surrogate like myself is supposed to make these decisions by using a concept that we ethicists call substituted judgment: they should try to make the choice that the patient would have made had they been able to make decisions on their own behalf. Hopefully, I made the same decisions about my mother’s care that she would have made had she been conscious and able to speak.

In reality, however, all of my decisions were made without any sense of what my mother would actually want. Although I am a bioethicist – part of my job involves teaching students about the importance of planning for situations like this – my own mother had not made any decisions about her end-of-life care. For example, she did not have an Advance Directive. Sometimes called a Living Will, an Advance Directive is a legal document that specifies the type of medical treatment a patient would want or not want should they be unable to make decisions for themselves.

My mother was not alone in lacking an Advance Directive. According to a recent survey of nearly 8,000 Americans, over two-thirds do not have an Advance Directive, Living Will, Health Care Proxy or similar document like the Physician Orders for Life-Sustaining Treatment (POLST) form. They don’t have these documents because they don’t know about them or because they assume their families already know their end-of-life wishes.

Unfortunately, the few studies that have looked at the accuracy of family decision-making have also found that most health care proxies, like myself, might as well just guess what their loved one wants. Surrogate accuracy is only slightly better than chance, with accuracy rates running about 50-65%. This is largely because too many people avoid conversations about end-of-life planning. Talking about death is difficult even under the best of circumstances, and talking about our own end-of-life wishes is harder still. Even when we do touch upon the issue, it’s usually some glib remark about “not wanting to be a ‘vegetable’”.

This was exactly the situation that I faced. Other than an off-the-cuff remark over a decade ago about the Terri Schiavo case, my mother had never spoken about her end-of-life wishes with me, my sister, or even her unmarried partner. Moreover, because my sister and I had a relationship with my mother that could be described as complex at best and tumultuous at worst, the likelihood that she would actually have been open to having this conversation was slightly less than zero. I knew she wouldn’t want to spend years in a persistent vegetative state, but I had no idea if she would want to be intubated, no idea if she would be willing to spend months in rehab, and no idea whether she would be content spending her remaining years in a long-term care facility.

I’m at peace with the choices that I made regarding my mother’s care. Given the severity of her situation, it was not a question of if she would pass away. It was a question of when, and the decisions that I made in consultation with the critical care resident ensured that her last hours were comfortable and pain-free.

I consider myself lucky in that regard. While I regret the fact that I will never again have the chance to address the unresolved issues that made my relationship with my mother so challenging, I wasn’t forced to make any decisions that could have resulted in weeks, months or years of a slow and lingering death. Sadly, far too many spouses, adult children, and other surrogate decision makers aren’t so blessed; they spend months or years as caregivers and health care surrogates, watching a person they love battle illness without ever knowing if they made the right decisions.

And this is why it is so important that we all talk about end-of-life decision making with our loved ones, no matter our age or current health status. We all expect to live for decades more. But life is unpredictable, and the only thing that is certain is that none of us get out of it alive. While it might be difficult to contemplate our own mortality, we owe it to those that we love to make sure that they know what we want when the inevitable comes.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 15, 2016, and is available on the WAMC website.]

Posted in Decision Making, End-of-Life, Health Care | 1 Comment

Leadership. Commitment. Hype.

Today is World AIDS Day. It is, in fact, the 29th annual World AIDS Day, which is held every year on December 1st to honor the 35 million people who have died from the disease and to support the 40 million who currently live with HIV/AIDS. The theme for this year’s event, at least according to the US federal government? “Leadership. Commitment. Impact.” You’ll have to excuse me if I scoff openly at the audacity of that motto.

Let’s consider the leadership and commitment of our politicians in fighting HIV/AIDS. When AIDS was first identified in 1981, it was seen as a disease that primarily affected socially marginalized populations, notably gay men, injection drug users and immigrants from poor Caribbean countries like Haiti. As long as it was confined to those ‘undesirable’ groups, there was no need for upstanding American citizens to pay it much heed. Following the lead of then-President Ronald Reagan – who didn’t even mention the word ‘AIDS’ publicly until 1985, and then only sparingly — politicians and other members of his conservative administration largely ignored the looming public health crisis.

American leadership failed when it was needed the most, by refusing to tackle the nascent AIDS crisis with measures like comprehensive education, blunt messaging, and the active promotion and widespread distribution of condoms. Had federal officials not been so afraid of ruffling conservative feathers, it is entirely possible that the HIV/AIDS epidemic might have been thwarted then and there.

This isn’t to say that there haven’t been a few times when our political leaders actually stepped up to the plate and contributed to the fight against HIV/AIDS. In 1988, despite opposition by the more conservative members of the Reagan Administration, then-Surgeon General Dr. C. Everett Koop mailed detailed information to every American household on the use of condoms to prevent the spread of HIV.

Similarly, in response to public pressure from celebrity advocates and radical activist groups, Reagan dramatically increased funding for AIDS research and established what would eventually be the first permanent advisory council on HIV/AIDS. Subsequent Administrations have increased funding and support for treatment and prevention efforts even more, including the establishment of the President’s Emergency Plan for AIDS Relief (PEPFAR), a global initiative spearheaded by George W. Bush that provides lifesaving antiretroviral treatment to millions of people living with HIV/AIDS in the hardest hit countries around the world.

That being said, political leaders at both the state and the federal level are more likely to stymie than promote efforts to prevent the spread of HIV/AIDS, most often for ideological reasons. Despite decades of research demonstrating that needle exchange programs greatly reduce rates of HIV transmission among injection drug users, for example, the use of federal funds to support these programs was largely banned until 2015. Opposition to federal support of needle exchange programs was largely based on the erroneous fear that they actively promote drug use among clients, in spite of evidence to the contrary. This decades-long ban likely led to the otherwise preventable infection of thousands of drug users and their partners.

Under the leadership of Vice President-elect Mike Pence, Indiana legislators eliminated all state funding for Planned Parenthood because of their ideological opposition to abortion. As many who have been following the political wrangling over Planned Parenthood know, however, legal termination of pregnancy accounts for less than 3% of the total number of medical procedures and services offered by that organization. Rather than a place to get abortions, for many poor Americans Planned Parenthood is the only source for a variety of desperately needed health care services, including: family planning; pregnancy testing and prenatal care; screening for breast, cervical and testicular cancer; testing and treatment for sexually-transmitted diseases; and HIV testing and education. When Pence and his colleagues cut funding for Planned Parenthood in Indiana, five clinics were shuttered, including one that was the sole provider of HIV counseling and testing in Scott County. Soon after, that rural community saw a 16-fold increase in the number of new HIV infections.

Similar increases in the spread of HIV are likely to be seen nationally should newly emboldened conservative politicians make good on many of their campaign promises. During the recent campaign, now President-elect Donald Trump called for the complete elimination of all public funding for organizations like Planned Parenthood. Congressman Tom Price, Trump’s nominee for Secretary of Health and Human Services, is an ardent pro-life advocate who has championed those efforts. Representative Price has also called for a rollback of Medicaid, including slashing funding of programs that provide HIV-positive patients with low-cost access to care and treatment, and is a strong supporter of faith-based abstinence-only education programs that are widely known to be ineffective in educating teens and young adults about HIV/AIDS.

Support for HIV prevention and treatment efforts has been slipping for years, in part because of public fatigue and in part because of hype suggesting that new drug regimens have turned HIV/AIDS from a once-deadly disease into a chronic condition that can be managed like diabetes or heart disease. AIDS is no longer seen as a serious public health crisis, which explains why both private and public funding for HIV/AIDS programs has been stagnant for almost a decade.

This problem is only going to get worse in the coming years, if the current political climate is any indication of where American priorities lie. Unless our newly-elected politicians show true leadership and commitment to the fight against HIV/AIDS – instead of their usual self-serving and ideologically-motivated efforts to promote themselves and enrich their donors — I fear that the hard-won gains that we have made since HIV was first discovered will soon be lost.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 1, 2016, and is available on the WAMC website.]

Posted in disadvantaged, Discrimination, HIV/AIDS, Policy, Politics, Uncategorized | Leave a comment

Under the Knife

I nearly died last month. This is not an exaggeration. What started out as a bad bout of influenza quickly developed into something more. After five days sick in bed, I was struck with stabbing abdominal pains, a fever that spiked over 105°F, and a severe case of sepsis. Had I not gotten myself to the emergency room, I might have ended up in a coma, or worse, as a result of the raging infection coursing through my bloodstream.

I spent a total of 16 days in the hospital, including an overnight stay in the intensive care unit (ICU), as a team of doctors and specialists furiously tried to bring my infection under control. I’m now convalescing at home, 23 pounds lighter and with 6 holes in my chest and abdomen.

For someone who considers himself to be healthier than most men his age, this was a terrifying experience. For a bioethicist who reads, writes and teaches about clinical care, this was also a very humbling experience. Other than a couple of out-patient procedures to fix orthopedic problems, this was the first (and longest) time I have ever spent being treated for a severe medical issue. I learned a lot about what it means to be a patient, lessons that will undoubtedly influence my own research and writing about modern medical policies and practices. In particular, there are five lessons that I want to share.

Lesson 1: Modern medicine is an inexact science. Over the course of eight days I underwent eight X-rays, three ultrasounds, two CT scans, two hepatobiliary (HIDA) scans, an MRI, and a sigmoidoscopy. I also had dozens of blood tests. These test results were inconclusive and confusing, leading one surgical resident to admit to me that clinicians often just make their best guess as to what’s wrong, treating the symptoms and letting the body heal itself. In my case, it was only after they opened me up in the operating room that the doctors realized that I had peritonitis, peri-appendicitis, and several perforations of my small and large intestine. They still don’t know the cause of my illness.

Lesson 2: Modern medicine is very expensive. I have already received eight bills for my care, totaling nearly $30,000. Still looming are the charges for the operation, all of the medical tests, and the night in the ICU. My total bill is likely to be over $100,000. Thankfully, I have medical insurance and my total out-of-pocket costs are capped at $3,500, an amount I can afford. By contrast, there are many who live paycheck to paycheck for whom even a few thousand dollars would be a financial hardship, and that doesn’t include the 10% of Americans who are uninsured and would likely be driven into bankruptcy if they had to deal with a $100,000 hospital bill.

Lesson 3: The looming “superbug” crisis is even more frightening than I thought. I have written in the past about one of the most deadly threats to human health since the bubonic plague: the coming epidemic of antibiotic-resistant bacteria. I’m even more worried now. It took the doctors 16 days, infusing me intravenously with some of the strongest antibiotics known, to bring my infection under control. As soon as bacteria resistant to those drugs emerge – a question of when rather than if – there will be nothing available to treat such severe infections in patients like myself. Unless we address this problem head on, in the coming years millions of patients will die as a result of untreatable infections.

Lesson 4: Nurses and medical technicians are the under-appreciated heroes of modern medical practice. This is not to say that the doctors didn’t give me great care, but during my 16 days in the hospital I rarely saw them. They would pop into my room at random hours, check my vitals and palpate my abdomen, and then go out into the hall to issue new orders to the nursing staff. The nurses and medical technicians on the ward were the ones who provided the front-line care that I needed. They treated my pain, drew my blood, gave me antibiotic infusions, managed my fever, bathed me, took me to the bathroom, sat with me, and provided me an unlimited supply of cold ginger ale to soothe my parched throat. They did so for me and all the other patients on the ward unflinchingly, despite the fact that many patients and family members (not me) often take out their anger, fear and pain by yelling at the nursing staff.

Lesson 5: Never underestimate the importance of friends and family in the treatment and recovery process. In some ways this is the most important lesson I learned. When I was at my lowest point – my abdomen distended, my legs swollen from the 15 liters of IV fluid pumped into me, and my pain controlled only by frequent injections of morphine – it was the visits from my friends and family that gave me the strength to soldier on. They didn’t need to talk (sometimes they didn’t, as I often was barely lucid from the painkillers). Rather, they sat with me and held my hand. Without them, particularly my husband, I’m not sure I would have survived. Remember that next time a friend, family member or even a distant acquaintance is in the hospital. Make sure you visit them, even if it is only for a few minutes. Your presence is the most powerful medicine there is.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on November 3, 2016, and is available on the WAMC website.]

Posted in Uncategorized | Leave a comment