Posts in Journalism
The Bacteria Babies Need

Originally published in the New York Times June 17, 2018

Image by Ariel Davis via the New York Times


We may be missing the key to one of the biggest boons to public health since the introduction of iodine into the food supply in 1924.

Scientists at the University of California, Davis, have found that a strain of bacteria called B. infantis that is thought to have been the dominant bacterium in the infant gut for all of human history is disappearing from the Western world. According to their research, this was probably caused by the rise in cesarean births, the overuse of antibiotics and the use of infant formula in place of breast milk.

Indeed, nine out of 10 American babies don’t harbor this bacterium in their gut, while researchers suspect that the majority of infants in less industrialized countries do.

Bruce German, a professor of food science and technology and one of the U.C. Davis researchers, says, “The central benefits of having a microbiota dominated by B. infantis is that it crowds all the other guys out” — especially pathogenic bacteria, which can cause both acute illnesses and chronic inflammation that leads to disease.


Studies suggest that by the time babies without B. infantis are children, they are more likely to have allergies and Type 1 diabetes and more likely to be overweight. This change to the infant gut may be at the root of the rising prevalence of diseases and ailments, from allergies to certain cancers.

Dr. German and his colleagues learned about the missing bacterium by studying breast milk. They found that the milk contains an abundance of oligosaccharides, carbohydrates that babies are incapable of digesting. Why would they be there if babies can’t digest them?

They realized that these carbohydrates weren’t feeding the baby — they were feeding B. infantis.

What can new mothers do to ensure that their babies have this beneficial bacterium? At the moment, nothing.

If you live in the industrialized world, you probably can’t pass B. infantis on to your baby. Not even if you give birth vaginally, breast-feed exclusively and eat well.

B. infantis is not the only endangered bacterium in the West, and babies aren’t the only ones affected. By studying mice, researchers at Stanford have found that a lack of dietary fiber — which is missing from most processed foods — results in the loss of important bacterial strains.

Once these strains are gone, the only way to get them back will be to deliberately reintroduce them.

In a study funded by a company that plans to do just that, Dr. German and colleagues fed B. infantis to breast-fed babies. They found that it took over the entire lower intestine, crowding out pathogenic bacteria.


Although it’s too early to know if these babies will turn out to be healthier than their peers, the hope is that the presence of B. infantis for the first year or two of life will help prevent colic, allergies, asthma, obesity, diabetes, heart disease and cancers later in life.

Dr. German envisions a future when it will be common for us to add the bacterium to some of our foods, much as we did with iodine.

But just inoculating babies with B. infantis won’t be enough. We should also give their mothers the opportunity to breast-feed.

The bacterium can’t survive without the carbohydrates it depends on. While companies are trying to figure out how to add oligosaccharides into infant formula, it will be very difficult to replicate the complexity and concentration of the carbohydrates that are naturally present in breast milk.

While the decision to breast-feed is often framed as a personal choice, most women have no choice. Only 15 percent of workers and 4 percent of the lowest-paid workers in the United States have access to paid family leave, which means they often can’t afford to stay home with a newborn.

Many other nations — like Austria, Bulgaria, the Czech Republic, Hungary, Japan, Latvia, Lithuania, Norway and Slovakia — manage to provide working parents with more than a year’s worth of paid family leave.

We should do the same. It’s not just about better personal health, but about better public health, which has been in decline in this country for decades.

We’d also be wise to heed these findings on the microbiota as a harbinger of what’s to come. The promotion of infant formula in place of breast milk, and our reliance on processed foods into adulthood, have had some unforeseen and frightening repercussions for our health. The industrialization of our food supply is changing us from the inside out.

No One Knows Exactly How Much Herbicide Is in Your Breakfast

Originally published on VICE May 11, 2016

Last week, lawyers in New York and California initiated a class-action lawsuit against Quaker Oats for selling oatmeal labeled "100% natural," even though it contains trace amounts of the not-so-natural chemical glyphosate, the active ingredient in the herbicide known as Roundup. Labeling aside, the suit brings up an even bigger question: How freaked out should we be about chemicals in our breakfast?

The International Agency for Research on Cancer (IARC), which is part of the World Health Organization (WHO), declared glyphosate "a probable human carcinogen" last year, heightening consumer concern about the use of the herbicide on our foods. Glyphosate is mixed with other chemical ingredients to make Roundup (which is manufactured by the biotech company Monsanto), and is widely used on food crops to kill unwanted weeds in agricultural production; it's also frequently used in home gardens. The IARC report pointed to an increased risk of non-Hodgkin's lymphoma in humans who are occupationally exposed to the herbicide, and noted the prevalence of rare liver and kidney tumors in animals exposed to glyphosate.

Glyphosate is the most commonly used broad-spectrum herbicide in the world. Its use rose globally from 112.6 million pounds in 1995 to 1.65 billion pounds in 2014. This spike coincides with the introduction of "Roundup Ready" GMO crops, which are genetically engineered to withstand the herbicide. In addition, even some non-GMO crops, including wheat, oats, barley, and beans, are sprayed with glyphosate in a practice called desiccation, which dries the crops and speeds ripening. This has prompted concern about increased residues—as has the fact that, in 2013, the EPA raised the allowable limit for glyphosate residue in food. This means there's a good chance glyphosate residue lurks in both GMO and non-GMO foods. (The use of glyphosate, as well as many other pesticides, herbicides, and fungicides, is prohibited in organic farming, so certified-organic foods are likely free of these residues.)

Testing done on a sample of Quaker Oats Quick 1-Minute Oats at an independent lab and paid for by the Richman Law Group, which is representing plaintiffs in the new lawsuit, found levels of glyphosate at 1.18 parts per million. The EPA currently allows up to 30 parts per million in cereal grains. A spokesperson for Quaker Oats wrote in an emailed statement to VICE, "Any levels of glyphosate that may remain are trace amounts and significantly below any limits which have been set by the EPA as safe for human consumption." Echoing that sentiment, an EPA spokesperson, also in an emailed statement to VICE, wrote, "In setting tolerances for pesticide residues on various foods, EPA ensures that there will be a reasonable certainty of no harm to people when they consume food containing residues resulting from use of the pesticide."

In other words, both Quaker Oats and the EPA take the position that you should not worry about glyphosate residue in your oatmeal or elsewhere because the levels are below the threshold the EPA has set for "no harm."

Researchers cannot ethically test the effects of glyphosate in a randomized controlled experiment on humans, so instead they have to rely on animal studies as well as large-scale observational studies, in which they make associations (for instance, between farm workers with occupational exposure to glyphosate and increases in lymphoma). And given the findings so far, scientists who study environmental chemicals strongly disagree with the idea that low levels of glyphosate are harmless. Fourteen of these experts recently published a consensus statement expressing concern that the herbicide may be an endocrine disrupting chemical (EDC), which means it has the potential to be biologically active even in extremely low doses. (Despite this, the EPA does not consider glyphosate to be an EDC.) Thousands of separate studies on EDCs have shown that low-level exposure could have detrimental health effects—including an increased risk for certain cancers, infertility, obesity, diabetes, and developmental problems. This suggests that even trace amounts of chemicals like glyphosate found in oats or other foods could be carcinogenic or disruptive to other important biological functions. "Hormones themselves are active at parts per trillion and parts per billion levels [in our bodies]," John Peterson Myers, chief scientist at the research and policy nonprofit Environmental Health Sciences, told VICE. "In the real world of biology, those levels have huge effects. Hormone-disrupting chemicals can also be biologically powerful at those doses."

Studies have shown that glyphosate may interfere with fetal development and cause birth defects, and while much is still unknown, emerging work in rodent models shows that it has effects on male reproductive development. The endocrine system is exquisitely sensitive to very low dosages of EDCs, Andrea Gore, professor and Vacek Chair of pharmacology at the University of Texas, told VICE, and this is especially true when it comes to developing fetuses, infants, and children. "Small fluctuations from the norm can change developmental processes and lead to a dysfunction at the time of exposure, or sometimes, many years after exposure," she said.

Given the research on endocrine disruption, the levels allowed by the EPA are too high, and have no basis in science, Bruce Blumberg, professor of developmental and cell biology and pharmaceutical sciences at UC Irvine, told VICE. "This is a political decision rather than one based on reasonable, peer-reviewed science." Blumberg is especially concerned about desiccation, which could mean there are potentially even greater amounts of glyphosate residue on our foods than previously accounted for. "Glyphosate and other herbicides were never intended to be used [as desiccants], and I am truly astonished that the EPA allows it absent a showing of how much glyphosate or other herbicides are present on the final product."

Another concern is that Roundup is actually a mixture of glyphosate and other potentially harmful chemicals—a combination that has never been tested. Tests are performed on glyphosate alone, a fact several scientists VICE spoke to flagged as a major and often overlooked concern.

"The actual product used is a mixture of chemicals, combined to increase the effectiveness of the active ingredient," Myers said. "The actual product mixture is never tested in regulatory testing. Never—even though that is what people are exposed to."

The widespread use of Roundup means there are potentially many food products—some carrying an "all-natural" label on their packaging—that also contain glyphosate residue. But consumers would have no way of knowing: Despite having a set limit for the herbicide residue in food, and despite the fact that it was introduced to our food system in 1974, the FDA has never monitored levels of glyphosate in food. 

But in February of this year, the agency announced that it would begin monitoring levels in soybeans, corn, milk, and eggs. Notably absent from this list is the food in question in the lawsuit: oats. The FDA would not provide any further information about this when contacted by VICE, but a spokesperson said the agency has recently developed "streamlined methods for testing glyphosate."

There doesn't seem to be any disagreement about the presence of glyphosate in our oatmeal; Quaker Oats and the EPA admit it's there. It's likely in myriad other food products as well. And that's the deeper relevance of this lawsuit: It points to the fact that so much is unknown or undisclosed about what actually ends up in our food. Where regulatory agencies have dragged their feet, and where food manufacturers continue to make dubious claims on labels, consumers are taking matters into their own hands with class-action lawsuits—Kim Richman of the Richman Law Group, for instance, has also filed class-action lawsuits over the presence of trans fats and GMOs in foods.

R. Thomas Zoeller, a biology professor at the University of Massachusetts who studies endocrine disruptors, said that there are several examples of how the government has failed to protect the consumer when it comes to environmental chemical exposure. He points to flame retardants and chemicals called PCBs. "We now know we exposed pregnant women and kids to these chemicals, which affected brain development—we have heard this story over and over again," Zoeller told VICE. "The government is using a strategy that hasn't protected people."

Everyone can agree that more research in this area is necessary; the EPA is currently reviewing last year's IARC findings. The real question is, will glyphosate prove to be another notorious environmental chemical that we'll later learn harms human health? And if that's even a possibility, how shall we hedge our bets in the meantime?


We’re About to Start Seeing More Early Deaths from Diabetes

Hip-hop pioneer Phife Dawg died this week at the age of 45 from complications of diabetes. His early death is a harbinger of tragedies to come.

Photo by Rodrigo Vaz via Getty

A Tribe Called Quest's Malik Taylor, aka Phife Dawg, died on Wednesday at age 45 from complications of diabetes. Phife was known for being a pioneer of hip-hop and, to a much lesser extent, for his sweet tooth. (A few bars into the 1991 track "Buggin' Out," he notes, "I drink a lot of soda so they call me Dr. Pepper.") Taylor was diagnosed with the disease in 1991, at the age of 20.

A 20-year-old diagnosed with diabetes was once exceedingly rare—the disease was called "adult-onset" diabetes for a reason. But children and young adults are now being diagnosed in alarming numbers. The rise was noted with concern back in 2000, when the American Diabetes Association published a consensus statement on the subject. A 2014 study found that the prevalence of type 2 diabetes among 10- to 19-year-olds rose 30 percent between 2001 and 2009. By 2012, fully half of the entire US adult population had either diabetes or pre-diabetes.

There's a common perception that people who have diabetes can just take meds and live a normal life. A growing industry normalizes the disease with lotions, supplements, medications, magazines, and food and drinks that cater to a diabetic population. But as Taylor's death illustrates, diabetes is not something to take lightly, and this is especially true for those diagnosed young, since living with the disease for longer can lead to worse outcomes. Complications include blindness, end-stage kidney failure, stroke, and numbness in the extremities—which means wounds go unnoticed, get infected, and can result in amputations. Taylor was so sick that in 2008 he required a kidney transplant from his wife, Deisha Taylor.

Read the full article

Bad (and Good) Eating Habits Start in the Womb

Last week my piece "Bad Eating Habits Start in the Womb" appeared in The New York Times and generated a lot of interest and commentary — it was the number one most emailed story on the entire site for over a day. I find this to be an encouraging sign that people are really concerned about the food they eat, and especially about the health of their babies and children. Click here to listen to an interview I did for New Hampshire Public Radio discussing my piece. Below is a description from NHPR's website.

You may be familiar with the ordeal of introducing children to broccoli and spinach. Two new studies suggest that finicky eaters might have picked up their discriminating habit in the womb. Forget genetics, personal responsibility, and discipline. Your taste for junk food and soda may have a lot to do with how your mother satisfied her cravings.

Kristin Wartman is a food, politics and health journalist. She recently wrote about the new science of food choices for the New York Times.

Photo: Rafael Viana Araujo via Flickr Creative Commons

Bad Eating Habits Start in the Womb

THE solution to one of America’s most vexing problems — our soaring rates of obesity and diet-related diseases — may have its roots in early childhood, and even in utero.

Researchers at the Monell Chemical Senses Center, a nonprofit research organization in Philadelphia, have found that babies born to mothers who eat a diverse and varied diet while pregnant and breast-feeding are more open to a wide range of flavors. They’ve also found that babies who follow that diet after weaning carry those preferences into childhood and adulthood. Researchers believe that the taste preferences that develop at crucial periods in infancy have lasting effects for life. In fact, changing food preferences beyond toddlerhood appears to be extremely difficult.

“What’s really interesting about children is, the preferences they form during the first years of life actually predict what they’ll eat later,” said Julie Mennella, a biopsychologist and researcher at the Monell Center. “Dietary patterns track from early to later childhood but once they are formed, once they get older, it’s really difficult to change — witness how hard it is to change the adult. You can, but it’s just harder. Where you start, is where you end up.”

This may have profound implications for the future health of Americans. With some 70 percent of the United States population now overweight or obese and chronic diseases skyrocketing, many parents who are eating a diet high in processed, refined foods are feeding their babies as they feed themselves, and could be setting their children up for a lifetime of preferences for a narrow range of flavors.

Read the full article

Pay People to Cook at Home

THE home-cooked family meal is often lauded as the solution for problems ranging from obesity to deteriorating health to a decline in civility and morals. Using whole foods to prepare meals without additives and chemicals is the holy grail for today’s advocates of better eating.

But how do we get there? For many of us, whether we are full-time workers or full-time parents, this home-cooked meal is a fantasy removed from the reality of everyday life. And so Americans continue to rely on highly processed and refined foods that are harmful to their health.

Those who argue that our salvation lies in meals cooked at home seem unable to answer two key questions: where can people find the money to buy fresh foods, and how can they find the time to cook them? The failure to answer these questions plays into the hands of the food industry, which exploits the healthy-food movement’s lack of connection to average Americans. It makes it easier for the industry to sell its products as real American food, with real American sensibilities — namely, affordability and convenience.

I believe the solution to getting people into the kitchen exists in a long-forgotten proposal. In the 1960s and ’70s, when American feminists were fighting to get women out of the house and into the workplace, there was another feminist arguing for something else. Selma James, a labor organizer from Brooklyn, pushed the idea of wages for housework. Ms. James, who worked in a factory as a young woman and later became a housewife and a mother, argued that household work was essential to the American economy and wondered why women weren’t being paid for it. As Ms. James and a colleague wrote in 1972, “Where women are concerned their labor appears to be a personal service outside of capital.”