Researchers in Germany recently found that higher maternal levels of vitamin D during pregnancy translated to higher in utero levels of vitamin D for their babies. This, in turn, was associated with a greater likelihood of the child developing a food allergy by age two. This is interesting because vitamin D is one of the few supplements I advise pretty much everyone to take, at least during the winter months, as it’s highly unlikely that most of us are meeting the current recommendations through food alone.
Looking at the original journal article, I couldn’t find any listed source of funding; that’s always the first thing I check when assessing the legitimacy of research. Next I checked how many participants there were and how they were selected. There were 629 mother-child pairs, not too shabby. Participants were recruited from an existing cohort study called LINA (Lifestyle and environmental factors and their Influence on Newborns Allergy risk). I’m not sure how participants were drawn from that cohort, nor how participants for the original study were recruited. I can’t help but wonder whether, if they were volunteers, there might be some selection bias skewing the results. However, further along in the methods, the researchers state that vitamin D levels were only measured for 378 mother-child pairs. That’s still not a bad sample size, but not nearly as good as it initially appeared.
Getting into the statistical analysis makes me wish I had cared more about learning and retaining statistics when I was in school. It’s interesting to note that vitamin D levels were strongly correlated with season: maternal and cord blood levels of vitamin D peaked in August and were lowest in March. Also, only seven of the mothers took vitamin D supplements during pregnancy. 44% of the mothers were deficient in vitamin D, 26% were insufficient, and 30% had optimal levels. 50% of newborns had deficient levels of vitamin D.
Results were analysed by dividing the mother-child pairs into quartiles based on the mothers’ vitamin D levels. I find this a little suspicious. No details are provided as to how many mother-child pairs were in each category, nor what the cut-offs were for each quartile, and I’m not sure how fair it is to draw comparisons between the quartiles if the distribution (as was mentioned in the analysis) was not equal. The results do show significantly more diagnosed food allergies among children born to mothers in the 3rd and 4th quartiles (i.e. those with the highest blood levels of vitamin D during pregnancy). However, that still amounted to only six diagnosed allergies in the 4th quartile and five in the 3rd, compared to three in the 2nd and one in the 1st. It’s also important to note that each “diagnosed” food allergy was reported by the mother on a questionnaire; diagnoses were not actually verified by the researchers. This introduces an additional element of bias.
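For anyone curious what a quartile split actually involves, here’s a toy sketch in Python. The vitamin D values below are invented purely for illustration (none of them come from the study); the point is just that the cut-offs depend entirely on the sample, which is why it matters that the paper didn’t report them.

```python
import statistics

# Invented vitamin D levels (nmol/L) for a small hypothetical sample --
# NOT data from the study, just an illustration of quartile binning.
levels = [18, 22, 25, 31, 34, 40, 45, 52, 58, 63, 70, 82]

# The three cut points that split the sample into four equal-sized groups.
cuts = statistics.quantiles(levels, n=4)  # [Q1, median, Q3]

def quartile(x, cuts):
    """Assign a value to quartile 1-4 based on the sample cut points."""
    for i, c in enumerate(cuts, start=1):
        if x <= c:
            return i
    return 4

groups = [quartile(x, cuts) for x in levels]
```

With a sample this tidy, each quartile ends up with three pairs, but a real-world distribution can leave the groups unevenly populated around ties and skew, which is exactly why readers need the cut-offs and per-group counts to judge the comparison.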
In the discussion, the researchers acknowledge that there was a high level of participation from allergy sufferers, which may mean that the results can’t be generalised to the entire population.
Because vitamin D levels were so strongly correlated with the seasons I wonder if there is some other seasonal factor which may be causing the increased rate of food allergies among the children. There may also be some other commonality among the mothers with higher levels of vitamin D which is leading to increased rates of food allergies among their offspring.
I do think that this research is interesting and warrants further investigation. However, I worry that studies like this may actually cause more harm than good. Vitamin D supplements for infants are important for preventing rickets, and I wouldn’t want any prospective or new parents to interpret these findings as an indication that they shouldn’t be providing these supplements to their babies.
Thanks to one of my loyal readers for sharing the news article with me, and to one of my Twitter friends for hooking me up with the original journal article.