Zeroing in on Zika

Every year I spend one to two weeks visiting the Caribbean island nation of Grenada. I don’t go for vacation, despite the allure of that country’s white sand beaches, but rather for work. I spend most of my time in windowless classrooms teaching clinical and research ethics to a number of graduate, medical and professional students from across the region.

One of the worries often voiced by family and friends when I travel to the tropics is about my health and safety. In recent years there have been a number of outbreaks of mosquito-borne diseases across Latin America and the Caribbean, including dengue, Chikungunya and (most recently) Zika. I myself caught Chikungunya during a visit to Grenada a year-and-a-half ago. Despite having a relatively mild case of what the locals call ‘Chick-V’, I still suffer from some lingering aftereffects, including intermittent arthritis-like joint pain in my right hand.

Despite all the hullabaloo about Chikungunya in past years, public concern about that disease has largely faded in both the US and the Caribbean. Most of the people I work with or teach in Grenada caught and recovered from ‘Chick-V.’ While the disease is now endemic in that part of the world, the number of new cases is relatively small since most people are now immune. Moreover, the long-term health impacts of Chikungunya are relatively mild.

Instead, and rightfully so, it is the rapid spread of the Zika virus across the Western hemisphere that is raising so many concerns. Zika, as you undoubtedly know, was first identified as a serious health threat during an outbreak of that virus in Brazil. Although most of the people who became sick with Zika developed only a mild illness – characterized by fever, headache and joint and muscle pain – Brazilian health authorities also noted a sudden increase in the number of children born with a rare birth defect known as microcephaly. Similar increases were also seen in other Zika hotspots, including El Salvador and Jamaica.

Microcephaly is a neurological condition in which an infant is born with a smaller-than-usual brain. Some children born with microcephaly develop normally, but most will experience lifelong symptoms that include developmental delays and disabilities, difficulties with coordination and movement, hyperactivity, and seizures.

So serious is the problem that some government officials in the region recommended that women avoid getting pregnant until the Zika outbreak is contained. Other (largely Catholic) countries in the region are reconsidering laws that currently outlaw abortion. Health authorities in the US and elsewhere are similarly recommending that pregnant women avoid traveling to Zika-affected areas. Some have even called for Brazil to cancel the 2016 Summer Olympic Games in Rio because of the potential threat that the virus poses to competitors and spectators.

Experts are also raising concerns about the possibility of an outbreak of Zika in the United States and Southern Europe. To date, over 500 cases have been reported in 35 states, including 48 cases involving pregnant women [Update: One day after writing this, the US Centers for Disease Control and Prevention increased these figures to include 279 pregnant women]. While none of these cases were locally acquired – all of these patients were infected while traveling in a Zika-afflicted region of the world – Aedes aegypti, the mosquito that most commonly transmits the virus, is abundant throughout the southern tier of the US. Nearly 700 cases of Zika have also been reported in the American territories of Puerto Rico, American Samoa and the US Virgin Islands; almost all of those cases were locally acquired.

So convinced are American public health experts that an outbreak of Zika in the US is imminent that doctors at the Children’s National Health System in DC, the Texas Children’s Hospital and the Baylor College of Medicine in Houston have established specialized programs for diagnosing and treating people with the virus. Similarly, the US Centers for Disease Control and Prevention and the National Institutes of Health have already invested nearly $600 million to study the disease and to develop a vaccine.

Despite this, and despite a $1.9 billion request by the White House to combat the Zika crisis, Congress has largely failed to act. While the Senate has authorized $1.1 billion in funding, the Republican-controlled House of Representatives has proposed spending a paltry $600 million. Most of those House-authorized funds would also come from existing public health programs, including $350 million that would be stripped from a program designed to develop a vaccine for Ebola. The only House Republican who supports the Obama Administration’s request for nearly $2 billion? That would be Congressman Vern Buchanan of Florida, whose state has already been (and will continue to be) the hardest hit by Zika.

During an election cycle where the most newsworthy candidates are decrying “politics as usual,” our leaders in Congress are nevertheless doing just that. They are risking the nation’s health because they don’t want to be seen as spendthrift politicians who spend taxpayers’ dollars willy-nilly. Only when we have a full-fledged outbreak, when our pregnant sisters and daughters are infected with a dangerous virus, when our children are born with a largely preventable birth defect, will they likely act. Unfortunately, as our response to AIDS, to Ebola, and to other public health crises has shown, by then it will be too little, too late.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on May 19, 2016, and is available on the WAMC website.]

Posted in Clinical Care, Health Care, Human Rights, Policy, Politics, Prenatal, Public Health | Leave a comment

Half-and-Half Wits

I lost a friend last week. I didn’t lose her in the physical sense. She didn’t pass away or move to the other side of the globe. Rather, after a disturbing online exchange, I made the decision to, in the words of Gwyneth Paltrow, ‘consciously uncouple’ myself from her.

What happened was this: on her Facebook page she posted a popular Internet meme that read, “If Caitlyn Jenner went missing, would her picture appear on the back of a carton of half-and-half?” While some people might find a celebrity-mocking joke like this funny, I found it in exceedingly bad taste.

I was also surprised that this joke was posted by someone who is herself a member of the LGBT (lesbian, gay, bisexual and transgender) community. I asked her to take the post down, explaining my concerns about the type of message that a joke like that sends. She refused and our online conversation quickly went downhill. I finally ‘de-friended’ her.

Now anyone who knows me well should know that I rarely take offense at jokes. I enjoy sarcastic, self-deprecating and (often) inappropriate humor, particularly of the type that skewers celebrities and politicians, or that calls out some of the absurdities of modern life. My snarky comment about Gwyneth Paltrow is proof of that. So what was it about this joke that got me so riled up?

The problem with a joke like the one about Caitlyn Jenner is that it perpetuates ugly stereotypes about the transgender community. It makes light of the real struggles of a highly marginalized and stigmatized segment of the population. It contributes to the continued victimization of a group of individuals whose only mistake was to be born into the wrong body.

Because of their fame, transgendered celebrities like Caitlyn Jenner, award-winning actress Laverne Cox, and musician Chaz Bono are easy targets for jokes like this. However, their wealth and prestige mean that they are largely insulated from the discrimination and harassment that other transgendered men and women deal with on a daily basis.

If Caitlyn Jenner heard us laugh at a joke that implies that she is somehow half a man or only half a woman, she would probably roll her eyes, climb into her limo, and head off to her next photo shoot or red carpet gala. But when a young man or woman who is struggling with their gender identity hears us laugh, it sends a very different and very powerful message. It reinforces the idea that they are somehow damaged or defective. It denies them their basic humanity and strips them of their dignity.

Add to this the fear-mongering rhetoric of politicians in states like North Carolina, Mississippi and Tennessee – where conservative lawmakers have promoted the idea that transgendered individuals are sexual predators in order to gain support for laws that deny them access to basic services, including the use of public restrooms – and it should come as no surprise that transgendered men and women have some of the highest rates of drug addiction, alcohol abuse, domestic violence and suicide.

For example, the National Transgender Discrimination Survey, a study of over 6,000 people conducted by the National Gay and Lesbian Task Force and the National Center for Transgender Equality, found that over 40% of transgendered men and women attempt to take their own lives. That is a rate of attempted suicide more than 10 times the national average.

That same 2011 study also found that transgendered individuals are very likely to be bullied in school (55 percent), to experience discrimination at work (59 percent), to be refused services by a health care provider (60 percent), to be harassed by law enforcement officers (61 percent), to be homeless (69 percent), and to be the victim of physical violence and sexual assault (78 percent). Most frightening is the fact that murders of transgendered women have hit an all-time high, with one transwoman killed every 29 hours. Few of these murders are solved; of all of the transgender murders that occurred from 2013 to 2015, not a single one was prosecuted or even reported as a hate crime.

All of this can be directly attributed to the rampant transphobia that permeates American society. While a single ill-conceived joke about Caitlyn Jenner might not seem like a big deal, when we suggest that transgender men and women are to be laughed at we ourselves contribute to the internalized self-loathing and externalized stigmatization that results in the tragic death of so many of our brothers, sisters, sons and daughters.

Suicide, murder, physical violence and sexual assault are no laughing matter. If you think that they are, then you’re a half-and-half wit.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on May 5, 2016, and is available on the WAMC website.]

Posted in Discrimination, Human Rights | 1 Comment

The Weight of the World

When I was a young child, I was a very picky eater. I would often refuse to eat the meals my parents put before me, even if it was something that I’d eaten and enjoyed before. Some kids are so-called “selective” eaters because of a medical problem like gastroesophageal reflux disease, gluten intolerance, or some other nutritional or sensory disorder, but my picky eating was a result of sheer stubbornness.

Like most mothers, mine resorted to all sorts of inducements, incentives and threats to get me to eat, including using the age-old remonstration about starving children in India or Ethiopia. My usual retort was to offer to pay the postage out of my allowance so that she could send the food there.

What a change 40 years can make. The rebellious five-year-old boy who would go a whole month eating nothing but buttered noodles is now a very adventurous eater. I have even sampled such exotic foods as snake, spiders, jellyfish and grasshoppers while traveling overseas. Even more surprising, although there is still a massive problem of hunger and malnutrition in India, Ethiopia and even in the United States – as many as 1 in 7 Americans go to bed hungry – there are now more people who are obese than who are malnourished globally.

According to a new study published in the medical journal The Lancet last week, the number of people in the world who are clinically obese has increased six-fold over the last four decades. Using a measure of body fat composition known as the body mass index (or BMI, which is calculated as an individual’s weight in kilograms divided by their height in meters squared), researchers compared historical rates of obesity among 20 million people from 186 different countries.

Clinically, a person is usually considered to be obese if they have a body mass index of 30 or higher. By this standard, an American man who is an average 5’10” in height and weighs 210 pounds would be obese. While the use of BMI as an individual measurement of body fat does have its flaws – it doesn’t distinguish between fat and lean muscle, for example, so an extreme body builder might also be classified as “clinically obese” despite a body fat level of less than 5% – the body mass index does work well when examining obesity at the population level.
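
To make that arithmetic concrete, here is a minimal sketch (in Python, written for this post rather than taken from the Lancet study); the unit conversions and the cut-off of 30 are the standard ones described above, and the function names are purely illustrative.

```python
# Minimal illustrative sketch: compute BMI from imperial units and apply the
# standard clinical cut-off of 30 discussed above. Names are hypothetical.

LBS_PER_KG = 2.20462      # pounds per kilogram
METERS_PER_INCH = 0.0254  # meters per inch


def bmi(weight_lbs: float, feet: int, inches: int) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    weight_kg = weight_lbs / LBS_PER_KG
    height_m = (feet * 12 + inches) * METERS_PER_INCH
    return weight_kg / height_m ** 2


def is_clinically_obese(bmi_value: float) -> bool:
    """A BMI of 30 or higher is the usual clinical threshold for obesity."""
    return bmi_value >= 30


if __name__ == "__main__":
    example = bmi(210, 5, 10)            # the 5'10", 210-lb man from the text
    print(round(example, 1))             # ~30.1
    print(is_clinically_obese(example))  # True
```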

What the study in The Lancet reported was this: the number of people worldwide who are clinically obese has increased more than six-fold during the past 40 years, from 100 million in 1975 to almost 650 million in 2014. Globally, ten percent of men and 15 percent of women are now considered to be obese, the bulk of whom (pun intended) live in industrialized countries like the United States, Great Britain and China. By contrast, only 450 million people worldwide are considered to be malnourished. Most of those individuals live in impoverished regions of the world.

Should this trend continue unabated, over one-fifth of all adults worldwide will be obese by 2025. Another two-fifths of the world’s adult population will be considered overweight. The public health and economic implications of this are staggering.

According to the World Health Organization (WHO), obesity is linked to as many as 60 life-threatening and costly illnesses, including heart disease, high blood pressure, stroke, cancer, and diabetes. Nearly 3 million people each year die as a result of preventable weight-related illnesses, making obesity directly responsible for about 5 percent of all deaths worldwide.

The human toll aside, obesity-related health care expenses total approximately $2 trillion annually. As a relatively fat nation, both in terms of our waistlines and our wallets, we Americans shoulder about one-tenth of these costs: about $200 billion a year in medical bills alone. Only war and smoking make bigger but equally preventable dents in the world economy.

Small wonder then that the WHO has set the ambitious goal of reversing rates of obesity by 2020. This is, however, a goal that the World Health Organization and other public health agencies will never be able to meet. This is because obesity is not a medical problem. Rather, it is a social problem with medical consequences.

It’s not that so many people worldwide are deliberately eating unhealthy foods. Rather, they increasingly lack access to healthier choices. For example, the urban poor have some of the highest rates of obesity globally. This is in part because they do not have the money or opportunity to buy healthy foods. In many urban communities, from New York’s Spanish Harlem to the Kibera slum in Nairobi, the only stores that sell food are often small corner bodegas that stock little in the way of fresh and affordable produce.

Similarly, our growing urban and suburban cityscapes are rarely designed to provide residents safe opportunities for exercising out-of-doors, whether by providing sidewalks on busy streets, by building walking and biking trails, or by creating and maintaining public facilities like parks and basketball courts.

Quite simply, the global obesity epidemic is a complex social problem with no single quick-fix solution. The various public health measures that have been proposed – free and healthy school lunches for all students, nutritional labels on restaurant menus, taxes designed to reduce the consumption of high-calorie foods and drinks, government-sponsored wellness programs, and educational campaigns – all have their merits. Taken alone, however, each of these efforts will be largely ineffectual in reversing current trends.

This isn’t to say that we shouldn’t try, but we need to do more than set lofty goals and promote quick fix policies. This crisis didn’t happen overnight, and it won’t be resolved overnight. We need to develop a holistic and concerted plan that addresses all of the factors contributing to the obesity epidemic, be they medical, psychological, social, political or financial. Only then can we hope to achieve a happier, healthier and lighter future.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on April 21, 2016, and is available on the WAMC website.]

Posted in Health Care, Obesity, Uncategorized | Leave a comment

In the Eyes of the Beholder?

A couple of weeks ago, I sparked a small firestorm on my social media feed. While ‘checking into’ my local gym on Facebook, I made the self-deprecating comment that I was “not yet beach worthy.” Several of my friends and colleagues quickly took me to task for that statement, accusing me of buying into socially constructed stereotypes of health and beauty.

Anyone who knows me well likely knows that my Facebook comment was meant to be tongue-in-cheek. For a middle-aged professor who spends most of his time in mental endeavors rather than physical pursuits, I am in pretty decent shape (my lack of functioning hair follicles notwithstanding). That is in part because of genetics: I am naturally rather slim although I can put on weight if I do not eat right or exercise regularly. This is also in part because I have the luxury of time and the financial means to purchase healthier foods, to go to the gym, and to hire a personal trainer.

That said, my colleagues do have a point. As Americans, we constantly judge ourselves and we constantly judge others according to largely unrealistic and entirely artificial expectations of physical perfection. But these expectations are not universally shared. They vary from culture to culture, from generation to generation, and from era to era. In one cross-societal study of “beauty,” for example, researchers found that a slim body was the feminine ideal in six societies while a plumper body was the ideal in thirteen societies. Similarly, in 17th century American society women were expected to have full hips and bust, a tiny and corseted waist, and a pale white complexion. By the late 20th century this American ideal had changed: women were now expected to be slim but muscular, and to have a dark tan.

According to feminist scholars like Naomi Wolf, author of the seminal text The Beauty Myth: How Images of Beauty Are Used Against Women, these cultural standards of beauty are constantly being reinforced by the images we see in magazines, movies, television, and on social media. Men and women alike – but more so women – are expected to adhere to these social standards of physical beauty, regardless of the personal or financial sacrifice required. Commercial companies like Revlon, Nutrisystem, Planet Fitness, and the Hair Club for Men all use this as a marketing tool, promising each and every one of us that we too can attain this physical ideal through diet, exercise, a judicious use of cosmetics and other beauty products, and even invasive plastic or bariatric surgery.

Moreover, if and when we fail to achieve this ideal (as most of us will), we become easy targets for criticism by ourselves and by others. Such criticism can have devastating consequences. Consider the case of actor Wentworth Miller, who became famous playing the buff and tattooed hero on the television show Prison Break. When that show ended in 2009, and he was no longer in the public eye, he fell into a depressive spiral. He also gained a lot of weight and, as a result of an unflattering photo taken by a Hollywood paparazzo, became the subject of an Internet meme that included such vicious taglines as “Fit to Flab” and “Hunk to Chunk.”

More recently, after being included in a special “plus size” issue of Glamour magazine alongside such stars as actress Melissa McCarthy and singer Adele, comedienne Amy Schumer was quick to take the editors of that magazine to task. Pointing out that she is a svelte 140 lbs. and usually wears a size 8 dress, the scathingly funny and stunningly beautiful Ms. Schumer noted that “young girls seeing my body type [are] thinking that is plus size.”

Amy Schumer has a point. Media studies have found that nearly three-quarters of high-school age girls report that magazine pictures and other media images shaped their concept of the perfect body. Of those girls, over half wanted or were trying to lose weight in order to achieve that largely unattainable physical ideal.

No wonder then that rates of potentially deadly eating disorders like anorexia nervosa and bulimia continue to rise in the US, particularly among adolescents. According to statistics from the National Association of Anorexia Nervosa and Associated Disorders, nearly 30 million Americans have some sort of eating disorder. Upwards of 5% of adolescent girls will suffer from anorexia nervosa or bulimia, and another 5% will experience some sort of binge-eating disorder. While eating disorders are less common among adolescent boys – only about 10-15% of those with anorexia nervosa or bulimia are men – young gay men are particularly susceptible; in one survey, nearly 15% of gay men were suffering from bulimia and over 20% were anorexic.

Despite what the images presented in magazines, movies and television try to tell us, health and beauty come in all shapes, sizes, and colors. The two-hundred-and-fifty pound woman whom you are mocking in the grocery store may actually be an elite triathlete who happens to suffer from lipedema. The one-hundred-and-ten pound runway model whom you idolize may achieve that weight only through the use of laxatives, purgatives, and stimulants.

Furthermore, and again in contrast to what we are told by the popular media, notions of self-worth and identity are not (and should not) be tied to how we look. Rather, they are defined by what we do, how we act, and whom we love.

Only when we realize that can we be truly “beach worthy.”

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on April 7, 2016, and is available on the WAMC website.]

Posted in Advertising, Discrimination, Education, Health Care, health literacy, Obesity, Public Health | Leave a comment

The Whiter the Bread …

As someone who grew up in Northern California during the 1970s, I was surrounded by all the fads of the New Age movement: past-life regression, crystals, channeling, EST (or Erhard Seminars Training), macramé, hot tubs, and the nascent organic food movement.

My mother willingly embraced many of these fads, particularly the organic movement. Our school lunch boxes were often filled with granola, yogurt and sandwiches on homemade whole grain bread the color and consistency of the macramé potholders that hung in our patio. It wasn’t until after my mother started working long hours as a real estate agent that we kids finally got the sorts of lunches we craved: PB&J or bologna sandwiches on Wonder® bread, with a Hostess Ho-Ho or Ding Dong as a treat.

If a new study out of the University of Texas MD Anderson Cancer Center is correct, that much-ballyhooed switch from homemade wheat bread to store-bought white bread may have been a bad move, at least with respect to my lung cancer risk. That study looked at the link between diet – specifically a diet rich in high glycemic foods like white bread – and rates of lung cancer among non-smokers.

Lung cancer is one of the most common (and the most preventable) forms of cancer in the US. According to the American Lung Association, about 225,000 Americans are diagnosed and 160,000 die of lung cancer annually. The majority of these cases are directly linked to smoking. An adult male who smokes is 25 times (or 2500%) more likely to get lung cancer than a non-smoker. In fact, nearly a quarter of heavy smokers (defined as smoking more than five cigarettes a day) will be diagnosed with the disease, as compared with less than one percent of people who have never smoked.

However, despite the clear link between smoking and lung cancer, approximately 25,000 non-smokers will be diagnosed with lung cancer in the US this year. Many of these cases can be linked to secondhand smoke, occupational hazards like asbestos or uranium, or exposure to radon (a colorless and odorless radioactive gas that can build up in basements and cellars). But many cases of lung cancer among non-smokers have no clear etiology. That, coupled with mounting evidence that dietary factors can influence an individual’s lifetime cancer risk, led the investigators at MD Anderson to conduct their study.

Nearly 4,500 non-smokers (1,900 lung cancer patients and 2,400 healthy controls) were asked about their dietary habits, including their consumption of sugary and starchy foods like white bread and white rice. The people who ate the most of these so-called high-glycemic foods – foods that quickly raise blood sugar levels following a meal – were 50% more likely to have lung cancer than those who ate the least.

So this must mean that the old adage “the whiter the bread the quicker you’re dead” is correct, right? In order to reduce my lifetime risk of developing lung cancer, a non-smoker like myself should eliminate all white bread, bagels, white rice, and potatoes from my diet, right?

Not so fast! Despite all of the television reports and newspaper headlines to the contrary, the results of this study are somewhat suspect. Retrospective studies of dietary habits are notoriously inaccurate; most people can’t even tell you what they had for breakfast that very morning, let alone give you an accurate description of the types of foods they’ve eaten over the past months and years.

In addition, the study failed to control for other lifestyle factors that may also influence cancer risk, including exercise, obesity and consumption of red meats and saturated fats. In fact, the greatest association between starchy foods and lung cancer was seen among patients with fewer than 12 years of formal education, which is often used as a proxy for socioeconomic status, health literacy, and dietary and exercise habits.

The biological mechanism by which starchy foods may increase a non-smoker’s risk of lung cancer is also unclear. One theory is that these high-glycemic foods stimulate the production of insulin, which in turn stimulates the growth of cells. Cancer is essentially the uncontrolled growth of cells, so this insulin-induced stimulation might be fueling the development of tiny tumors. But if that is true, then a diet with a high glycemic index should also be linked with a variety of other cancers. To date, however, studies looking at the consumption of starchy foods and colon, stomach, pancreatic, ovarian and prostate cancers have been inconclusive.

Finally, even if these results turn out to be true, this doesn’t mean that you should eschew your morning bagel. If you are a non-smoker, the likelihood that you will develop lung cancer is slim. Fewer than 1 in 500 people who have never smoked will be diagnosed with this disease, and most of those cases can be linked to non-dietary factors like secondhand smoke, asbestos and radon. A 50% increased lifetime risk is, in absolute terms, but a drop in the bucket. You’re more likely to die of some other dietary or lifestyle-related health issue, such as heart disease or type 2 diabetes.
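
To see why that 50% figure matters so little in absolute terms, here is a rough back-of-the-envelope sketch (in Python, written for this post and not drawn from the MD Anderson study itself); it uses the roughly 1-in-500 baseline lifetime risk for never-smokers cited above and treats the reported 50% as a simple relative increase.

```python
# Rough illustration: converting a relative risk increase into an absolute one.
# The 1-in-500 baseline and the 50% figure are the ones cited in the post.

baseline_risk = 1 / 500      # ~0.2% lifetime lung cancer risk for never-smokers
relative_increase = 0.50     # the reported ~50% higher risk for high-glycemic diets

elevated_risk = baseline_risk * (1 + relative_increase)
absolute_increase = elevated_risk - baseline_risk

print(f"baseline risk:     {baseline_risk:.2%}")      # 0.20%
print(f"elevated risk:     {elevated_risk:.2%}")      # 0.30%
print(f"absolute increase: {absolute_increase:.2%}")  # 0.10%, about one extra case per 1,000 people
```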

Eating a diet that limits the consumption of sugary and starchy foods is a good thing to do. Coupled with regular exercise and other healthy choices, it can help reduce your likelihood of developing all sorts of chronic diseases. But to recommend that we avoid eating that bowl of steamed rice because of a putative lung cancer risk? That’s just blowing smoke.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on March 10, 2016, and is available on the WAMC website.]

Posted in Cancer, Public Health, Research | Leave a comment

Toxic Turf

Last summer, to great fanfare, the high school that my husband teaches at unveiled its new artificial turf field. Installed at considerable cost, that field was lauded as yet one more way that the school was working to protect the safety of its students.

As compared with a natural grass field (with all of its inherent unevenness, divots, rocks and gopher holes), an artificial turf field greatly reduces the number of sports-related injuries. One study of high school football players, for example, found that the number of concussions, ligament tears, muscle strains and ankle sprains was cut in half when players practiced and competed on artificial turf.

So important are these safety concerns that many schools will no longer allow their players to compete on a natural grass field. Those school districts that lack artificial turf fields may be excluded from league play or may be forced to rent such facilities in order to host local competitions.

However, recent reports aired on ESPN, NBC, and other networks have raised questions about the long-term safety of playing on artificial turf. Specifically, an increasing number of players and coaches, along with consumer safety advocates and environmental activists, are worried that those fields may be associated with an increased risk of cancer.

This is because artificial turf fields aren’t just made of plastic blades of grass. While those plastic blades are important to provide a natural grass look and feel, it is the rubber infill that is the most important aspect of an artificial turf field. That infill, usually made of black crumb rubber, is what provides the cushioning and traction necessary to ensure that soccer balls bounce and football cleats grab just as they would on a natural grass field. That same cushioning is also what helps reduce the frequency and severity of sports-related injuries.

The black crumb rubber used as infill is often made from recycled car tires. Among the 250 different chemicals that make up car tires are a number of compounds (including arsenic, benzene, cadmium and nickel) that are considered to be carcinogenic by organizations like the US Environmental Protection Agency (EPA) and the International Agency for Cancer Research.

When a soccer goalie dives onto the artificial turf to make a save, they may absorb small amounts of these cancer-causing chemicals through their skin. When a defensive lineman practices on a hot summer day, they may inhale small quantities of these carcinogenic compounds as they are released as gases. This is where the concern lies. Despite the fact that the amount of arsenic, benzene, cadmium and nickel in black crumb rubber is extremely low – well below the levels considered dangerous by regulatory agencies like the Consumer Product Safety Commission and the EPA – little is yet known about the risks associated with long-term low-level exposure, particularly for kids.

In fact, many advocates and activists are convinced that exposure to black crumb rubber is associated with a supposed increase in the number of lymphomas among athletes in the US. University of Washington coach Amy Griffin, for example, has a list of over 150 former collegiate and professional soccer players who were diagnosed with cancer after years of practicing and competing on artificial fields.

But before we rush to ban synthetic turf and replace existing artificial fields with natural grass, it’s important to recognize that anecdotal data like Ms. Griffin’s does not prove that there is a link between black crumb rubber and cancer. Self-reported disease clusters like these can help identify unexpected exposures or risks, but more often than not they turn out to be red herrings. Of nearly 600 such cancer clusters identified in the past 25 years, only three turned out to be real upon further investigation. In less than 1% of such cases was a direct link demonstrated between the cancer of concern and the hypothesized environmental exposure.

To date, more than 50 studies have looked at the health risks of crumb rubber. Not a single study has found a link between exposure to black crumb rubber and cancer. That does not mean that such a link does not exist, but we need studies that are carefully designed to look at these risks. For example, we need to collect the data that will allow us to compare rates of cancer among people who play soccer versus those who do not. Similarly, we need to collect the data that will allow us to compare rates of cancer among those that play soccer on artificial fields versus those that do not.

In fact, the state of California is doing just that. In the largest study of its kind, the California Office of Environmental Health Hazard Assessment is analyzing the different types of chemicals released from new, uninstalled and in-use artificial fields, monitoring the air above such fields and playgrounds for specific chemicals that can be inhaled, and estimating oral and dermal exposures associated with playing on artificial turf. They will also be monitoring personal exposure to various carcinogens among athletes who play on synthetic turf. The results of that three-year, $2.9 million study should be available in 2018.

Until then, we really won’t know whether or not these fields are 100% safe. The bulk of current data suggest that they are, but that is little comfort for the millions of concerned parents whose children play on those fields every day. Will they continue to take the risk or, given the stakes, will they opt to protect their children despite the lack of definitive evidence?

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on February 25, 2016, and is available on the WAMC website.]

Posted in Athletics, Cancer, Public Health, Uncategorized | Leave a comment

Shutting Down the Pill Mills

Last Friday, for the first time ever, a physician was convicted of second-degree murder for recklessly prescribing pain-killing drugs to patients. Dr. Lisa Tseng was sentenced to 30 years to life in prison by a Los Angeles County Superior Court judge for her role in the overdose deaths of three young men, each of whom had been given prescriptions for large amounts of opiate-based painkillers and other potent narcotics despite having no medical need.

Although the defense tried to paint Dr. Tseng as a well-meaning but naïve physician who was taken advantage of by manipulative and drug-seeking patients, in fact she ran a lucrative but criminal “pill mill” out of a seedy Southern California strip mall. She earned over $4 million a year by writing prescriptions for addictive drugs like Oxycontin and hydrocodone without performing the necessary medical exams.

Dr. Tseng is not the first physician to be arrested for running a “pill mill”. In 2014, for example, nearly two dozen clinicians were arrested in New York for prescribing addictive narcotics to patients who didn’t need them. Those physicians alone had prescribed more than five million doses of oxycodone, a powerful and highly addictive narcotic.

Just last year, agents with the federal Drug Enforcement Administration (DEA) conducted the “largest pharmaceutical-related bust” in that agency’s history. Nearly 300 people, including 22 doctors and pharmacists, were arrested and charged with conspiracy as part of a multi-state scheme to illegally distribute painkillers and other addictive drugs.

Unfortunately, these arrests and convictions will probably do little to stem the disastrous flood of prescription drug abuse that is currently washing over communities across America.

While it is tempting to argue that corrupt doctors like Tseng are responsible for the current epidemic of prescription drug abuse, they are only one strand in an incredibly complex and tangled web of addiction that dates back decades. The problem itself actually started in 1995, when the US Food and Drug Administration first approved long-lasting opioid drugs like OxyContin. Those drugs revolutionized the way in which pain was clinically managed in the United States, allowing patients with severe and unremitting pain to get much needed relief.

Prior to OxyContin’s approval by the FDA, chronic pain – a highly subjective symptom that can vary substantially from one individual to the next – was grossly undertreated. Physicians were often reluctant to prescribe pain-killing drugs like Vicodin, which had to be taken several times a day, out of fear that their patients would quickly become addicted. Since OxyContin only needed to be taken once or twice a day, physicians were far more comfortable prescribing it. That, coupled with heavy promotion by pharmaceutical representatives, meant that sales of it and similar painkillers quickly skyrocketed. A few years later, so did rates of addiction and subsequent overdose deaths.

So serious is this epidemic – and the interrelated heroin epidemic – that drug overdose has become the leading cause of injury-related death in the US. This year, more people will die from a drug overdose (usually involving an opioid prescription painkiller like OxyContin) than will die in an automobile accident. Given the magnitude of this problem, it is not surprising that drug abuse was a hotly debated issue in last week’s Republican and Democratic Presidential debates. Nor is it surprising that the Obama Administration has proposed spending over $1 billion on new initiatives to combat the problem.

That said, we have already made some headway in tackling this problem. In recent years, for example, state and federal drug enforcement agencies have begun to crack down on the use of these pain medications. In 2012, the New York State legislature passed the Internet System for Tracking Over-Prescribing (ISTOP) Act. That Act required the New York State Commissioner of Health to create a drug database in order to crack down on the over-prescription and abuse of opioid painkillers like OxyContin. Nearly every US state now has a similar drug-monitoring program, which can be used to identify and suspend the prescribing rights of doctors believed to over-prescribe these highly addictive drugs. As a result, the rate of prescription drug abuse has finally stabilized and the number of overdoses has actually dropped slightly in recent years.

The conviction of Tseng and prosecution of others like her for second-degree murder will contribute little to these efforts. However, high-profile criminal cases like these may have a chilling and retrograde effect on the way in which we treat patients with chronic pain. Physicians, particularly those that specialize in pain management, are already under increased scrutiny as a result of programs like ISTOP. If they begin to worry that they could face criminal prosecution for prescribing drugs like OxyContin, they may start to limit the amount of powerful painkillers they give to patients, even to those who have a medical need for them.

We may soon find that the cautious steps we have made in recent years towards more effective treatment and management of chronic pain in this country were for naught. It is important to hold callous physicians like Tseng responsible for their actions, but we shouldn’t deceive ourselves into thinking that these trials and convictions will make even the smallest dent in the prescription drug abuse epidemic. While these trials bring justice to the families of those who were harmed by the doctors and pharmacists who run pill mills, they may come at great cost to those who suffer from chronic pain. We need to find a more appropriate balance by creating laws and policies that prevent addicts from gaining easy access to drugs like OxyContin while still ensuring that doctors feel secure enough to prescribe them to those in pain.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on February 11, 2016, and is available on the WAMC website.]

 

Posted in Crime, Drugs, FDA, Substance Abuse, War on Drugs | Leave a comment

The Growing Perils of Pauline’s Pregnancy

It’s not easy to be a mother these days. Despite all of the advances in gender equality, the rearing of children still remains by default “women’s work.” This is not to say that fathers are not increasingly involved in caring for their kids, but most studies have shown that women still do the bulk of the work. Not only do they have to put up with nine months of discomfort while pregnant, once the child comes mothers are more likely than fathers to be responsible for changing diapers, looking after a sick kid, arranging for daycare and play dates, and even cooking, cleaning, laundry and other household chores.

You can now add to this some new fears: post-partum depression and the Zika virus. Earlier this week, for example, the US Preventive Services Task Force – an independent and non-partisan group of healthcare experts – recommended that all pregnant women and new mothers be screened for clinical depression. Despite several decades’ worth of research showing that a significant percentage of pregnant women and new mothers (nearly 1 in 10) will experience a major depressive episode, the condition goes largely undiagnosed. Untreated depression is the leading cause of prenatal and maternal morbidity in the US, and is associated with an increased risk of substance abuse and suicidal ideation among new mothers.

Given the Preventive Services Task Force’s new recommendations on depression, it is likely that most health insurance companies will soon cover the costs of screening. It may also spur Congress to pass legislation, introduced last year, that would fully fund mental health screening and treatment for all pregnant women and new mothers. Treatment, however, will likely remain a contentious issue.

Even when pre- or post-partum depression is diagnosed, whether because of perceptive doctors, the concern of family members, or a known history of mental illness, many afflicted women still go untreated. This is in part because antidepressant drugs – particularly the family of medications known as selective serotonin reuptake inhibitors (SSRIs), a class of drugs that includes such popular medicines as Paxil, Prozac, and Zoloft – have been linked to miscarriage, premature birth, low birth weight, birth defects and, in one small and still controversial study, autism. Although the absolute risk of these adverse birth outcomes is small, it is not insignificant, leaving many mothers and their physicians with the difficult challenge of weighing the known risks of untreated depression against the potential harms to the unborn child.

If that wasn’t enough to make some think twice about having a child, pregnant women worldwide must now worry about a mosquito-borne virus known as Zika. Although the virus was first discovered nearly 70 years ago in African monkeys, there were almost no cases of human infection until recently. The first known outbreak occurred in Micronesia in 2007, followed by an outbreak in French Polynesia in 2013. However, these outbreaks didn’t raise any real concerns among doctors and public health officials, because the number of people infected was small and the symptoms relatively benign. In fact, over three-fourths of people infected with Zika don’t experience any clinical symptoms at all. Those who do become sick tend to develop a fever, headache and joint and muscle pain, but the symptoms are relatively mild and resolve within seven days.

So why is concern growing about the Zika virus now, especially for pregnant women? Starting in early 2015, public health officials in Brazil reported an outbreak of Zika in the northern part of that country. Within a short period of time, the virus spread to 21 other countries in the Americas. This includes the United States, where 20 cases of Zika have been reported among people who have traveled to Brazil or elsewhere in Central and South America.

Shortly thereafter, Brazilian authorities noted a sharp increase in the number of cases of a birth defect known as microcephaly, a rare neurological condition in which an infant is born with a smaller-than-usual brain. While some children born with microcephaly develop normally, many will experience lifelong symptoms including developmental delays and disabilities, difficulties with coordination and movement, hyperactivity, and seizures.

Since the outbreak began in Brazil, that country has reported nearly 4,000 suspected cases of microcephaly. In the previous year, the number of cases was less than 150. Although a causal link between Zika and microcephaly has not yet been proven, data now suggest that the risk of having an afflicted child increases 30-fold if a woman is infected with the virus while pregnant. So alarming are these figures that the US Centers for Disease Control and Prevention (CDC) has advised pregnant women to postpone travel to regions of the world where Zika outbreaks are actively occurring. Similarly, government officials in El Salvador, Colombia, Jamaica and Ecuador have taken the unprecedented step of recommending that women in those countries avoid getting pregnant until the Zika outbreak is contained.

Even if you are not planning to travel to Rio for the Olympic Games, however, you have reason to be concerned. The mosquito that transmits Zika is already present in the United States. Moreover, the warm and wet winter caused by this year’s El Niño event makes it likely that these mosquitos will thrive in the coming months. That, coupled with the likelihood of the virus continuing to enter the United States through ever-increasing trade and travel with Latin America, means that Zika will likely take root in our fertile American soil. All we can do now is wait, plan and hope that a safe and effective vaccine for Zika is developed soon.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 28, 2016, and is available on the WAMC website.]

Posted in Health Care, Public Health, Reproductive Rights, Vaccines | Leave a comment

Penning a Solution to the War on Drugs

After nearly six months on the run, Joaquin Guzman Loera — the Mexican drug lord known as “El Chapo” — was recaptured by Mexican authorities. He is now back in the prison from which he made his daring escape, awaiting extradition to the United States to face charges of drug trafficking and murder.

Interestingly enough, El Chapo’s apprehension was both aided and hindered by two celebrities, Oscar-winning American actor Sean Penn and Mexican telenovela starlet Kate del Castillo. In October, Mr. Penn travelled to Mexico for a secret meeting and interview with the infamous drug lord. At the time, Mexican law enforcement agents delayed a scheduled raid of El Chapo’s hideout in order to protect the safety of Mr. Penn and Ms. del Castillo, giving the drug lord time to escape. However, because of continued communication between El Chapo and the two actors – fueled by the drug lord’s narcissistic desire to have a movie made about his life – Mexican authorities were later able to track and capture him.

While Mr. Penn and Ms. del Castillo’s actions may not be illegal, in my opinion they certainly were unethical. More importantly, after reading Mr. Penn’s interview with El Chapo in Rolling Stone this morning, I am struck by how foolish and naïve the actor is. While paying lip service to the many law enforcement officers killed by El Chapo and other narcotics traffickers, Sean Penn nevertheless idolizes the drug lord, going so far as to justify his interview by saying, “I’m drawn to explore what may be inconsistent with the portrayals our government and media brand upon their declared enemies.”

Let us not forget that drug lords like El Chapo, both here in the US and overseas, are directly responsible for hundreds of murders and indirectly responsible for thousands of deaths associated with narcotics trafficking and the rampant use of illicit drugs. That said, I hope the controversy over the Rolling Stone interview and the pending trial of El Chapo leads us to consider a bigger issue: the failure of the so-called War on Drugs.

Since then-President Nixon first declared a war on drug use in 1971, federal, state and local law enforcement agencies have spent hundreds of billions of dollars combating drug use. But despite the money spent, true victories in the war are few and far between. Even when a large-scale drug lord like El Chapo is captured, he and his organized crime syndicate are quickly replaced by an even more ruthless gang of drug dealers. The killings continue unabated and there is but a short-lived dip in the amount of illicit narcotics flowing into the US.

Moreover, in addition to the billions of dollars spent hunting drug traffickers like El Chapo abroad, the government also spends billions of dollars annually imprisoning those who sell or use drugs here in the US. Because of the desire of our elected officials to appear tough on crime, those convicted of even minor drug-related offenses are sentenced to decades (or even life) behind bars. For example, almost half of all of the inmates in the federal prison system are there for nonviolent drug-related offenses, with the leading drug involved being marijuana. Since the start of the War on Drugs, the number of Americans in prison for selling or possessing narcotics has increased ten-fold yet our drug epidemic remains unchecked.

Not only are the economic and political costs astronomical, the social impact on many communities – particularly inner-city neighborhoods and communities of color — has been devastating. Much of the hostility of these communities towards law enforcement agencies (currently at a boiling point following police shootings of young African-American men like Michael Brown and Tamir Rice) can be traced back to the War on Drugs and the subsequent militarization of the police.

No wonder then that many top law enforcement officials, including the police chiefs of Los Angeles, Chicago, and Houston, are beginning to decry our current approach to dealing with illegal drug use. One such group of police chiefs, known as Law Enforcement Leaders to Reduce Crime and Incarceration, is calling for reforms in our current judicial approach. Specifically, they want to reclassify many drug-related crimes from felonies to misdemeanors as well as reduce or remove mandatory sentencing minimums; a 19-year-old arrested for possessing a small amount of heroin for personal use will no longer be sentenced to 3-to-5 years in a state penitentiary, but will instead be referred to a substance abuse program.

While this is a good start, I wonder if reforming our judicial system is enough. Perhaps it’s time to have a national dialogue about illegal drugs and their use, including considering proposals to legalize some recreational drug use (as has been done for marijuana in four states and the District of Columbia). Creating a legalized yet highly regulated market might not only address the problem of drug trafficking and violence, but the tax revenues could also provide a much-needed source of funding for substance abuse programs. In addition, legalizing some recreational drug use would put cartels run by vicious criminals like El Chapo out of business; one study found that key drug traffickers like Mexico’s notorious Sinaloa cartel would lose more than half of their annual income if the US simply legalized the use of relatively benign drugs like marijuana.

America has a serious drug problem, but it is clear after over four decades that our current approach has failed. The War on Drugs is simply bad policy, both economically and politically. As hard as it is for Americans to concede, it’s time for us to give up this losing strategy and look at other ways of combating the problem of drug use at home and abroad.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on January 14, 2016, and is available on the WAMC website.]

Posted in Crime, Drugs, Policy, Uncategorized | Leave a comment

Dreaming of a White Christmas

Most of my friends, upon learning that I was raised in sunny California, are shocked to find that winter is my favorite season. Since first moving to this area in the mid-90s, I’ve relished the fact that I now live in a place with seasons, a region of the country that enjoys subzero temperatures and frequent snow during the darkest months of the year. No wonder then that the Snow Miser from the classic cartoon ‘A Year Without a Santa Claus’ is my yuletide Facebook avatar.

Much to my dismay and my local friends’ glee, that has not been the case so far this year. In fact, until this past Tuesday when we received a dusting of snow hardened into place by relentless freezing rain, this region of the country has had abnormally warm temperatures. It was so warm on Christmas Eve, a stunning 72°F, that my husband and I drove out to my in-laws’ house with the convertible top down on my car. For the Northeast, this December will go on record as the warmest in over 200 years.

A lot of the warmth that we are experiencing, and the similarly unusual weather in other parts of the country – including unexpectedly low temperatures in the Southwest, snow and ice storms in the Southeast, tornadoes in Texas, and flooding in Missouri – can be chalked up to the cyclical weather pattern known as El Niño. But global climate change is also likely playing a major role in this year’s (and future year’s) extreme weather patterns.

El Niño, more appropriately called the warm phase of the El Niño Southern Oscillation (or ENSO), is a global pattern of climatic variation that occurs when an unusually warm band of seawater develops in the equatorial region of the Eastern Pacific Basin. It normally develops around December, thus giving this natural event its name; in Spanish, El Niño can mean ‘little boy’ but more often refers to the Christ Child.

When it occurs, El Niño creates increased rainfall across the east-central and eastern Pacific Ocean. While the effects of El Niño are more direct and stronger in South America, it is also associated with warmer weather in the western and northern US states, and heavy rainfalls in the south and southeast.

El Niño is associated not only with the natural disasters that we have seen on the news this past week — where dozens of people in the Midwest and South lost their lives and hundreds more lost their homes to raging floodwaters and swirling tornados — it is also linked to outbreaks of infectious diseases that threaten the health and lives of millions more.

Until 1991, for example, the entire Western hemisphere had been free of cholera for more than 100 years. When the disease re-emerged in Peru, later spreading throughout South and Central America, it coincided with an El Niño event that resulted in much warmer than normal coastal waters.

Among the many hypotheses about the re-emergence of cholera in this part of the world is that the bacterium that causes the disease, Vibrio cholerae, was able to proliferate in these unusually warm waters, which set the stage for increased exposure and transmission to humans. Cholera is now re-established in Central and South America, and it is only a matter of time before it re-emerges in the US. That may occur this year. The US National Oceanic and Atmospheric Administration (NOAA) predicts that this year’s event “could be among the strongest in the historical record,” and the average temperature of US coastal waters has similarly surpassed prior historical records.

Cholera is not the only disease that US public health officials are worried about with this year’s unusual weather. Cases of the disease Cryptosporidiosis can also be linked to unusual and increased rainfall patterns. The disease is caused by a chlorine-resistant parasite that normally infects cattle and waterfowl, but which can be transmitted to humans when our drinking water becomes contaminated with agricultural waste.

In the US, the largest outbreak of Cryptosporidiosis occurred in Milwaukee, Wisconsin, in 1993 when agricultural runoff from local pastures contaminated the water supplies of the Howard Avenue Water Purification Plant. Over 400,000 people contracted the illness and nearly 100 died, with one study suggesting that the outbreak cost nearly $100 million in medical treatments and lost productivity.

While the US has not had an outbreak on that scale since, sporadic epidemics often occur during times of flooding. I would not be surprised if there is a localized outbreak in Missouri in light of the record floods that El Niño has caused.

There are, in fact, a whole host of endemic, emerging and re-emerging diseases that we should worry about given this year’s unusual weather. These include not only diarrheal diseases like cholera and Cryptosporidiosis, but also mosquito-borne illnesses like dengue fever, chikungunya and malaria, as well as Lyme disease, rabies and spongiform encephalopathy.

However, while El Niño makes this situation bad, global climate change makes the problem even worse. During the last several decades, the number of El Niño events has increased, and studies of historical data suggest that the increased frequency and intensity of these events is linked to global climate change.

As the average temperature of our planet increases, so too will the likelihood of weather-triggered outbreaks of disease. Given this state of affairs, it is unfortunate then that most of our leaders in Washington, and the current slate of Presidential candidates, seem to be largely dismissive of the threat that climate change poses to the public health. Until the economic costs of disaster relief, medical treatment and lost productivity directly affect their bank accounts and those of their corporate backers, however, I fear that little will change.

[This blog entry was originally presented as an oral commentary on Northeast Public Radio on December 31, 2015, and is available on the WAMC website.]

 

Posted in Climate Change, Politics, Public Health | Leave a comment