Nutrition research commenced more than 200 years ago, at the dawn of the chemical
revolution. The “golden age of nutrition” began in the early 1910s and continued
into the 1940s, when the nutritional sciences focused primarily on diseases
associated with single-nutrient deficiencies. This work led to the formulation of the
Recommended Dietary Allowance (RDA) for each nutrient. After almost all of
the essential nutrients had been discovered, nutrition research focused on the
problem of multifactorial chronic diseases, many of which are caused not by nutritional
deficiency but by overnutrition. In the following years, revolutionary
progress in recombinant DNA technology and genomics culminated in 2001 with
the publication of a draft sequence of the human genome by the Human Genome
Project. As a result of these developments, genomics, transcriptomics,
proteomics, and metabolomics are increasingly being used in nutritional research.
Nutritional genomics, also called nutrigenomics, is an emerging field in the
life sciences and is considered one of the next frontiers of the postgenomic era.
Its fundamental concept is that a healthy phenotype can develop into a chronic
disease phenotype via alterations in gene expression or epigenetic phenomena
and that the diet contains substances with the potential to modify these processes.
Nutrigenomics focuses on the relationship between dietary nutrients
and gene expression using state-of-the-art technology.
The development of DNA microarrays and protein chips has made large-scale
genomic and proteomic investigations possible by allowing simultaneous,
high-throughput monitoring of the expression of thousands of genes in response to
diet. This emerging knowledge will aid in understanding how nutrients
modify the risk of cancer, other chronic diseases, and aging. It is widely recognized that
many chronic diseases are largely preventable through lifestyle changes. This places nutrigenomics
at the forefront of preventive medicine.