Whole Food-Based Supplementation vs. Fragmented Vitamin and Mineral Supplementation
Americans' interest in nutrition has grown in proportion to their waistlines over the last 30 years or so. Further, as healthcare in general improves and the population ages, problems associated with old age that were relatively rare in the past (because few people lived to see their 80th birthday) have become increasingly common. This, too, has contributed to the growing interest in nutrition among Americans. Unfortunately, this interest has most frequently taken the form of an act of desperation -- people look for a silver bullet to slay the monster of obesity or senescence (Tomlin, 2001). The silver bullet itself most often takes the form of a nutritional supplement or a fad diet. People seldom address the root cause of their obesity or medical problems, usually a lack of self-discipline at the dinner table and/or in the gym. Even when embracing a fad diet, people usually lack the self-discipline to stick with it long enough for it to truly fail.
This project aims to research and resolve at least one of the issues surrounding modern nutrition: whether whole food-based nutritional supplementation is better than fragmented vitamin and mineral supplementation at delivering bioavailable nutrients to the human body.
If a difference exists between the effectiveness of whole food-based nutritional supplementation and fragmented vitamin and mineral supplementation, it will be in the bioavailability of the necessary nutrients. According to Jackson (1997), bioavailability is "the fraction of the ingested nutrient that is utilized [by the human body] for normal physiological functions or storage." Essentially, this means that there are some forms of nutrients that the body finds easier to assimilate than others. Iron in leafy greens such as spinach or kale can be assimilated by the body, whereas the elemental iron in iron filings largely cannot.
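Jackson's definition amounts to a simple fraction: utilized (or stored) dose divided by ingested dose. A minimal sketch of the arithmetic, using entirely hypothetical figures:

```python
# Worked illustration of Jackson's (1997) definition of bioavailability:
# the fraction of an ingested nutrient that is utilized or stored.
# All figures below are hypothetical, for illustration only.

def bioavailability(ingested_mg: float, utilized_mg: float) -> float:
    """Return the utilized fraction of an ingested nutrient dose."""
    if ingested_mg <= 0:
        raise ValueError("ingested dose must be positive")
    return utilized_mg / ingested_mg

# e.g., 15 mg of iron ingested, of which 2.25 mg is utilized or stored
print(bioavailability(15.0, 2.25))  # 0.15, i.e., 15% bioavailable
```

The same dose of a nutrient can therefore score very differently depending on its chemical form and the other factors discussed below.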
A number of factors affect the bioavailability of any given nutrient. Quite beyond the chemical form of the nutrient itself, other factors are likely to influence the proportion of a nutrient absorbed from any source, whether whole foods or synthetic supplements:
1. The efficiency of digestion
2. The previous intake of the nutrient
3. The body "status" of the nutrient
4. Gut transit time
5. The presence of gastro-intestinal disorder or disease
6. Other products with which the foodstuff is consumed
7. The prior treatment (e.g., cooking or processing) of the product
Just as some of us are better at storing fat than others, some of us are better at digesting our meals than others; hence the efficiency-of-digestion factor. As for previous intake of the nutrient, people can build up tolerances: if massive doses of a nutrient are taken to counteract a deficiency (real or imagined) and the nutrient is then suddenly discontinued, the amount of the nutrient in the body will sometimes fall below the level that prompted taking the supplement in the first place.
Further, the "status" of the nutrient refers to the fact that some nutrients are necessary for healthy living in small amounts, but large doses do not necessarily provide an added benefit. Rather, in most cases it is simply a question of whether the tank is full or empty. If the nutrition tank is empty, then taking a supplement will do some good. If, however, the tank is already full, or nearly full, then taking a supplement at best only tops off the supply; more likely it is simply excreted without doing anyone any good. Gut transit time, obviously, relates to how long the nutrient stays inside the human digestive tract before it is excreted. The shorter the transit time, the greater the chance that the nutrient will pass through the body without being absorbed.
If a person has a gastric disorder that prevents the proper assimilation of food, then not only is it probable that he or she will suffer any number of deficiencies, but, as a rule, he or she will have to eat a good deal more than most people would just to break even.
Some consumables can positively or negatively affect the bioavailability of unrelated foods with which they are consumed. Vitamin C, for instance, enhances the body's ability to assimilate iron, just as vitamin D enhances its ability to assimilate calcium; calcium, by contrast, can inhibit iron absorption when the two are consumed together. Truswell (2003) found that grapefruit actually increased the potency of some supplements and even medicines, especially calcium-channel blockers such as those prescribed for hypertension.
Finally, in two cases reviewed for this study, beta carotene was made more bioavailable by first pureeing and then cooking a quantity of carrots, as compared with its availability in the raw, unprocessed vegetable (Livny et al., 2003) -- hence Jackson's "prior treatment" factor.
These findings are generally echoed by Castenmiller and West (1998), who found that a pureed and cooked carrot meal made beta carotene dramatically more available to research subjects' bodies than did a raw carrot meal.
The nutrients to which I am continually referring are technically called "micronutrients" -- those nutrients required in minute quantities for continued healthy living. In addition to minerals, they include vitamins and secondary plant products such as polyphenols, flavonoids, isoflavones, terpenes, and glucosinolates (Vuong et al., 2004).
It can be very difficult to know where to begin when confronted with an issue as vast as the whole food vs. nutritional supplement problem. In the United States alone, there are over thirty-six brands of iron supplements (Fairweather-Tait and Teucher, 2002), which are themselves a tiny fraction of the total volume of supplements "out there."
Another thing that compounds the problem is the variety of methods used to measure the bioavailability of nutrients (Fairweather-Tait, 1997). In the past, one method was simply to measure the amount of nutrient being ingested and then examine stool samples for nutrient content. By subtracting the nutrient content of the stools from the known amount ingested, an estimate of how much of the nutrient has been assimilated can be derived. However, this never worked for all nutrients, only for those that remain stable passing through the gut. Vitamins may break down in the gut without being absorbed at all, giving a false impression of how much of the nutrient has been absorbed. Minerals would seem to be stable, but the intestines actually excrete some minerals (copper, for instance), so measuring the content of these nutrients in stool samples is far from perfect (Yeum and Russell, 2002).
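The classical balance method described above reduces to subtraction: whatever was ingested but not recovered in the stool is assumed absorbed. A minimal sketch, with hypothetical figures, makes both the calculation and its weakness plain:

```python
# Sketch of the classical mass-balance estimate of nutrient absorption:
# apparent absorption = ingested - recovered in stool.
# All quantities are hypothetical illustrative figures in milligrams.

def balance_estimate(ingested_mg: float, fecal_mg: float) -> float:
    """Fraction of the dose apparently absorbed, by simple balance."""
    absorbed_mg = ingested_mg - fecal_mg
    return absorbed_mg / ingested_mg

# 10 mg of a mineral ingested, 7 mg recovered in stool -> 30% absorbed.
print(balance_estimate(10.0, 7.0))  # 0.3

# The estimate is biased in both directions: a vitamin that degrades in
# the gut inflates the figure (it is "missing" but was never absorbed),
# while a mineral such as copper that the intestine excretes back into
# the stool deflates it.
```

The method's assumption -- that everything not excreted was absorbed and used -- is exactly what fails for unstable vitamins and intestinally excreted minerals.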
Until recently, nutrients were tagged with radioactive markers, and each marker's distribution throughout the body could be measured with special instruments; the markers were treated as proxies for the actual nutrient. However, this method was not particularly accurate, and as we have become more knowledgeable about the effects of radioactivity, injecting the body with substances emitting ionizing radiation has fallen out of favor.
A recently pioneered method called accelerator mass spectrometry has proven safer and much more accurate than all other methods (Vuong et al., 2004). The problem with increasing the accuracy of our detection instruments is that it renders previous observations suspect, so comparisons among data spanning decades become problematic. For instance, it has been shown that the measured amount of cholesterol in chicken eggs has steadily fallen over the last twenty years or so. This is not because of any change farmers have made in chickens' diets or exercise regimes, but because of the increasing accuracy of our instruments; chicken eggs never had as much cholesterol as was originally measured.
Another increasingly pernicious problem is the fact that intensive farming techniques have steadily depleted farmlands of minerals. Quite apart from the familiar N, P, and K, which we all know are necessary for growing plants and which we continually replenish through commercial fertilizers, other minerals such as iron, copper, zinc, and magnesium exist in finite quantities in soil and have steadily been leached out by decades of over-farming. This condition has been exacerbated by the Green Revolution's development and dissemination of faster-growing crops; we can now get four crops a year from some fields instead of only two, thus depleting the soil of minerals at twice the old rate.
This means we cannot make any assumptions about the mineral content of the foods we eat. Spinach has a reputation for being rich in iron, but if it is grown in iron-poor soil, it will have little or no iron to offer the human body. Plants do not make the minerals we need; they get them from the soil. Researchers are aware of this fact and have recommended growing specially genetically modified seeds in nutrient-rich soils for export to places where the soils are poor; the idea is that a seed already packed with minerals will produce a crop that is also packed with minerals (Graham, Humphries and Kitchen, 2000). This is plausible, but it is certainly not sustainable in the long run.
This has implications for research that is conducted in different places…