Tuesday, February 26, 2013

Let's Talk About Gluten. Please. This Has Gotten A Little Out of Control

You certainly don't have to look very hard to find articles and blog posts on gluten and its purported association with a variety of health issues such as obesity, heart disease, arthritis, and non-celiac gluten sensitivity. While I don't really doubt that some people without celiac disease might legitimately be affected by gluten, I think the discussion around gluten and its non-celiac ill effects has now crossed the line into fad. Wheat Belly, a diet book by cardiologist William Davis that is currently the #2 best-selling health and fitness book on Amazon, advocates eliminating wheat entirely from our diets, whole grain or not, largely based on the premise that modern wheat is nothing like what our grandparents used to eat, and so it must be connected to these growing problems, not to mention the increased prevalence of celiac disease. Big Ag, essentially, has manipulated the genes of this staple beyond recognition, and the unintended consequences are vast and dire, amounting to "the most powerful disruptive factor in the health of modern humans than any other". While I haven't read the book, I'm familiar with a lot of the arguments it makes and how they are perceived by the general public, especially the common impression that Davis is referring to GMO wheat, which does not actually exist on the market. Unfortunately, it's pretty difficult to find a good, genuine science-based take on it. To paraphrase Keith Kloor, the majority of what you'll find only has the veneer of science.

Jesus F'ing Christ (Source)
When I was in high school, and as an undergrad in the humanities, writing a research paper meant I started with a thesis statement and found evidence to support whatever it was I wanted to advocate for. In science-based medicine, you test a hypothesis by conducting a randomized controlled trial if possible, and ultimately by finding all available published reports and presenting the entire story (a systematic review), or by pooling the statistical results of multiple studies into a single larger analysis (a meta-analysis). This is not a subtle difference. Wheat Belly is a prime example of the former approach, which is not necessarily a bad thing, per se. People can make a compelling argument without a systematic review, but it is not acceptable as the last word in medicine, health, or nutrition, period. While there may be evidence to support the idea, it's easy to minimize or even completely overlook evidence to the contrary, especially since you're not really making a point to look for it. It seems pretty obvious that the discussion around wheat could use a little objectivity.

Take a look at this pretty balanced article recently published by the New York Times on the increasing diagnoses of celiac disease.
BLAME for the increase of celiac disease sometimes falls on gluten-rich, modern wheat varietals; increased consumption of wheat, and the ubiquity of gluten in processed foods.

Yet the epidemiology of celiac disease doesn’t always support this idea. One comparative study involving some 5,500 subjects yielded a prevalence of roughly one in 100 among Finnish children, but using the same diagnostic methods, just one in 500 among their Russian counterparts.

Differing wheat consumption patterns can’t explain this disparity. If anything, Russians consume more wheat than Finns, and of similar varieties.

Neither can genetics. Although now bisected by the Finno-Russian border, Karelia, as the study region is known, was historically a single province. The two study populations are culturally, linguistically and genetically related. The predisposing gene variants are similarly prevalent in both groups.
The article goes on to suggest that exposure to different microbial environments is the biggest factor, but it's rather apparent that we can't just point to a simple answer. The world is a complex place, our bodies are complex, and nutrition and health are complex. This is pretty much what you'd expect, right?

Now take a look at some of the massive coverage of the recent randomized controlled trial showing significant cardiovascular benefits to the Mediterranean diet. Here's a good analysis from the Harvard School of Public Health. Participants in the Mediterranean diet arm of the study were encouraged to liberally use olive oil and to eat seafood, nuts, vegetables, and whole grains, including a specific recommendation that pasta could be dressed with sofrito (garlic, tomato, onions, and aromatic herbs). The control diet it ended up comparing favorably against was quite similar, but specifically geared toward being low-fat. Both groups were discouraged from eating red meat, high-fat dairy products like cream and butter, commercially produced baked goods, and carbonated beverages.

The largest differences between the two diets were that the control group was discouraged from using vegetable oils, including olive oil, and encouraged to eat three or more pasta or other starchy dishes per day. To me, this suggests that Wheat Belly lives in the sweet spot for widespread dissemination: it's easily actionable and has just enough supporting evidence to generate good anecdotes and positive results, but it's vastly oversimplified and neither suitable nor necessary for everyone. Remember, the Wheat Belly diet implicates even organic whole grains as irredeemably manipulated. It's a completely wheat-free diet, because modern wheat is supposedly the greatest negative factor in human health. Based on an actual experiment involving almost 7,500 people, we have strong evidence that it's the amount of wheat people eat that is problematic. You can eat some whole grains daily and still substantially decrease your risk of heart disease and obesity, as long as you aren't eating them three or more times a day.

The appendix to the NEJM study indicates that some of the patients on the control diet complained about bloating and fullness, but nothing similar was reported from the Mediterranean diet group. The implications seem fairly obvious: there is little basis for a draconian decision to completely eliminate something with proven health benefits, such as whole grains, from your diet unless you genuinely suffer from celiac disease. If you're interested in losing weight, think you might have gluten sensitivity, or just want to eat healthier, try something like this diet first. And definitely don't put your gluten-free diet pamphlets in my child's take-home folder at school. That wasn't cool.

Update: Take a look at this critical post about the RCT of the Mediterranean diet. It offers some perspective on the magnitude of the effect the trial found, and raises some compliance issues with the recommended diets that I'm not convinced are as damning as the author suggests. I also find this sentence a bit odd:
So while you might be less likely to have a heart attack or stroke, you're no less likely to die. This is why I'm so confused they ended the study early.
I don't know. I'd rather just not have a heart attack or stroke. Nevertheless, it's a thoughtful and overall very thorough take on the study.

Sunday, February 17, 2013

Moderate Drinking Isn't Totally Risk-Free? Crap

Let's be honest, it's pretty ridiculous that it's been three decades since anyone bothered to look at the association between alcohol consumption and risk of cancer mortality in the U.S. Even though there was basically no research to point to, few people would be totally surprised to be reminded that there may be other potentially fatal conditions caused by drinking apart from loss of liver function. And certainly, it's foolish to just assume that, as a moderate drinker, you'd enjoy only the purported benefits without any of the potential consequences. Bringing this discussion back to the table is a good thing, but it's important to do so responsibly and in the proper context.

I can buy that Malort gives you cancer at least (Source)
Earlier this week, researchers published a study aiming to do exactly that. An in-depth article on the research was featured here in the San Francisco Chronicle. The basic idea is that we know alcohol puts people at risk of developing certain types of cancer, including oral, esophageal, liver, and colon cancers, so the study used meta-analyses published since 2000 to calculate the effect alcohol has on the development of these cancers, controlling for confounding variables. The researchers then used data from health surveys and alcohol sales to estimate adult alcohol consumption, and combined that with mortality data from 2009 to estimate how many deaths might specifically be attributed to drinking, using formulas established in other countries for similar purposes. This came out to between 18,000 and 21,000 people, or about 3.5% of all cancer deaths. That's actually higher than the number of deaths from melanoma, and considering how aware people are of the risks of extended sun exposure without sunscreen, the risks of drinking alcohol may be unjustifiably underrated by comparison. The next step is to work out a dose-response curve showing how drinking more affects this relationship.
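For a rough sense of how an attributable-deaths estimate like this works, here's a minimal sketch of a population attributable fraction calculation. The prevalence, relative risk, and death counts below are made up purely for illustration; they are not the study's figures, and the formulas the researchers actually used are certainly more involved.

```python
# Minimal sketch of a population attributable fraction (PAF) calculation.
# All numbers below are invented for illustration only.

# Hypothetical fraction of adults at each drinking level, and the relative
# risk of a given alcohol-associated cancer at that level (non-drinkers = 1.0).
exposure = [
    {"level": "none",     "prevalence": 0.40, "rr": 1.0},
    {"level": "moderate", "prevalence": 0.45, "rr": 1.2},
    {"level": "heavy",    "prevalence": 0.15, "rr": 2.0},
]

# PAF = sum(p_i * (RR_i - 1)) / (1 + sum(p_i * (RR_i - 1)))
excess = sum(e["prevalence"] * (e["rr"] - 1) for e in exposure)
paf = excess / (1 + excess)

total_cancer_deaths = 50_000  # hypothetical deaths from this cancer type
attributable_deaths = paf * total_cancer_deaths

print(f"PAF: {paf:.1%}, attributable deaths: {attributable_deaths:,.0f}")
```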

Many of the stories about the study focus particularly on the quote that "there is no safe threshold" for alcohol consumption, and that roughly a third of these deaths represented individuals who consumed less than 1.5 drinks per day. Essentially, as many as 7,000 people in the U.S. who drank that amount die each year from cancers they developed because of that consumption. I'm not really interested in poring through the data to question the validity of this number. It's fair to be very skeptical of how granular you can get in determining risks for each individual based on averages obtained from surveys known to be quite limited, and from ecological data like sales. Ultimately, without longitudinal follow-up of drinkers or a case-control study, this represents a fairly low level of evidence in the grand scheme of things. That's not to say you should take the general conclusion with a grain of salt; quite the contrary, actually. It's just that we can't safely interpret exactly how strong (or weak) this effect really is at this point, and we need more robust study designs to get there.

Science-minded people like to blame the media for hyping up the conclusions of studies, but here you see the investigators explicitly saying that there is no safe amount of drinking. The abstract itself declares alcohol to be a "major contributor to cancer mortality." What message is a journalist supposed to take from that? The headlines are right there, laid out on a platter for them. I don't think the investigators necessarily overstated their conclusions egregiously, but the paper wasn't exactly brimming with context. I also don't expect moderate drinkers to alter their behavior based on this, but it sort of goes without saying that a properly hedged conclusion wouldn't grab as much attention, and you never know how things will be absorbed. So I'll try to lay one out myself:

Based upon this study, it appears that the risk of death due to drinking has been underestimated. Even moderate drinking, which has some potential health benefits, may contribute to mortality from one of seven types of cancer largely understood to be associated with alcohol consumption. This is the first look at such an association in the United States in over 30 years, and as such, it represents a building block from which to generate research that more effectively establishes the association and how different consumption patterns alter its effect.

Now if you'll excuse me, I'm going to the liquor store to buy some rye and a shaker. For real.

Monday, February 4, 2013

New Proof That Being a Vegetarian is Healthier?

According to this blog post from ABC News, the recently published (behind a paywall) results of a large prospective cohort study performed in the UK offer "further proof that eating meat can be hazardous to health." That strikes me as sort of an odd way of framing it, and I hope that after reading this you'll understand why. The proper conclusion really should be that we have more evidence that being a vegetarian is associated with a reduced risk of heart disease, but that it's still difficult to tell how much other lifestyle choices play into this. In other words, if you live a healthy lifestyle in general and eat meat a few times a week or less, it's hard to say whether you'd see a significant benefit from cutting that meat out of your diet.

The study began in 1993 and followed 44,561 men and women in England and Scotland, 34% of whom were vegetarians at enrollment. After an average follow-up of nearly 12 years, the incidence of ischemic heart disease (IHD) was compared between the vegetarian group and the meat-eating group. A subset of the study population was also used to measure risk factors related to heart disease, such as body-mass index (BMI), non-HDL cholesterol levels, and systolic blood pressure, and again the vegetarian group came out ahead. The investigators controlled for some obvious potential confounding variables such as age (the vegetarian group was 10 years younger on average), sex, and exercise habits. Overall, I agree with the physician quoted in the ABC post that it's a very good study, and it does a pretty good job of trying to isolate the effect of whether or not a person eats meat.
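For the curious, this is roughly the shape of a confounder-adjusted survival analysis like the one described above. It's only a sketch: the variables, data, and model specification below are stand-ins I made up, not the paper's, and it assumes the pandas and lifelines libraries are available.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # Cox proportional hazards model

rng = np.random.default_rng(42)
n = 5_000

# Randomly generated stand-in data; a real analysis would use the cohort itself.
df = pd.DataFrame({
    "vegetarian": rng.integers(0, 2, n),        # exposure: 0 = meat-eater, 1 = vegetarian
    "age": rng.normal(45, 12, n),               # confounder: age at enrollment
    "male": rng.integers(0, 2, n),              # confounder: sex
    "exercise": rng.integers(0, 3, n),          # confounder: activity level
    "years_followed": rng.uniform(1, 12, n),    # follow-up time
    "ihd": (rng.random(n) < 0.03).astype(int),  # outcome: developed IHD during follow-up
})

# Estimates a hazard ratio for vegetarianism while adjusting for the other columns.
cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="ihd")
cph.print_summary()
```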

This is essentially the type of design I was talking about being needed in the lead/crime post, and while it represents a pretty high level of evidence, there are still some very important limitations to be aware of. That's not to justify being skeptical of the entire claim that being a vegetarian is healthier for the heart than not, but it's not as simple as just saying "meat is hazardous to your health." The two groups being compared are not randomized, so it could very well be that the vegetarians had an overall healthier lifestyle, including not just exercise but also dietary factors beyond simply not eating meat. How would the results compare if you took a vegetarian who ate a lot of whole grains and vegetables vs. an omnivore who ate meat maybe once or twice a week but also ate a lot of whole grains and vegetables? My guess is that the risk of IHD for the latter individual wouldn't be much higher, if at all. The central issue here is called selection bias, and it refers to the possibility that the two populations are different enough that drawing a firm cause-and-effect relationship from one specific difference between the groups is suspect.
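To make that concern concrete, here's a toy simulation (all numbers invented) in which eating meat has no effect at all on IHD risk, yet the crude comparison still makes vegetarians look healthier, simply because an unmeasured "healthy lifestyle" factor drives both diet choice and risk.

```python
import random

random.seed(0)

# Toy model of the confounding described above. Meat-eating itself has NO
# effect on risk here; only the "healthy lifestyle" variable does.
def simulate_person():
    healthy_lifestyle = random.random() < 0.5
    # People with a healthy lifestyle are more likely to be vegetarian.
    vegetarian = random.random() < (0.6 if healthy_lifestyle else 0.2)
    # Risk of developing IHD depends only on lifestyle, not on diet.
    ihd = random.random() < (0.01 if healthy_lifestyle else 0.03)
    return vegetarian, ihd

people = [simulate_person() for _ in range(100_000)]
veg = [p for p in people if p[0]]
meat = [p for p in people if not p[0]]

def ihd_rate(group):
    return sum(ihd for _, ihd in group) / len(group)

# The crude comparison makes vegetarianism look protective even though it
# does nothing in this toy model.
print(f"IHD rate, vegetarians: {ihd_rate(veg):.2%}")
print(f"IHD rate, meat-eaters: {ihd_rate(meat):.2%}")
```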

The perils of reading too much into a cohort study are illustrated by the story of hormone-replacement therapy (HRT) for post-menopausal women. In 1991, a large cohort study based on a population of nurses came to the conclusion that HRT protected women from cardiovascular disease. This was one of a series of cohort studies of HRT that found a variety of purported benefits. Concerns about selection bias were raised immediately, but were largely dismissed. HRT became very common throughout the rest of the decade based on these well-designed cohort studies that seemed to provide really good evidence. Finally, in 2002, results from the first large double-blinded randomized controlled trial of HRT vs. placebo were reported. Both the estrogen-plus-progestin and estrogen-only arms of the study were eventually halted early because risks such as heart disease, stroke, and pulmonary embolism turned out to be higher than in the placebo groups. When the comparison groups were randomized, the result was essentially the opposite of what the cohort studies had found. Again, this isn't to say I dispute that completely eliminating meat is a healthy choice, or to claim that if an RCT were performed, the meat-eaters would turn out to be healthier. I'm just illustrating why you should always be aware of the possibility of selection bias in non-randomized studies.

Another issue to look out for is how reduced risks are presented in studies. There are essentially two ways to do it: by presenting the absolute difference in risk between one group and the other, or by presenting a risk ratio. Looking at the numbers, only 2.7% of the entire study population developed IHD, about 1.6% of meat-eaters vs. 1.1% of vegetarians. Another way of putting this is, "a vegetarian diet is associated with 0.5 percentage points less absolute risk of developing IHD." That doesn't sound quite as impressive, but it's completely accurate. It's a lot more eye-opening to divide the 1.1% by the 1.6% and get your 32% reduced risk. I'm not saying anybody did anything inappropriate, but the raw numbers have a way of putting things like this into the proper perspective.
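Here's the same arithmetic spelled out, using the percentages quoted above:

```python
# Two framings of the same result, using the percentages quoted above.
risk_meat = 0.016   # ~1.6% of meat-eaters developed IHD
risk_veg = 0.011    # ~1.1% of vegetarians developed IHD

risk_difference = risk_meat - risk_veg            # absolute risk difference
relative_reduction = 1 - (risk_veg / risk_meat)   # risk-ratio framing

print(f"Absolute risk difference: {risk_difference:.1%}")    # 0.5 percentage points
print(f"Relative risk reduction:  {relative_reduction:.0%}") # roughly 31-32%
```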

Ultimately, what does all this amount to? If I were at risk for IHD, I'd adopt a healthy lifestyle with a well-rounded diet including moderate to no meat, exercise regularly, and stay active throughout the day. Hardly an earth-shattering piece of advice, but that's what we've got. To put it bluntly, just eat like a person is supposed to eat, goddamnit. You know what I mean.