Risk factor: The concept, the evolution, and the practice

The concept of the risk factor, often taken for granted in modern medicine, was not used in clinical studies until 1950 but has made a tremendous contribution to health policy, the health sciences, and scientific inquiry. The roots of the risk factor can be found in the 18th and 19th centuries, when probability and statistics were adopted as methods of quantifying the risk of death and disease, according to M. Roy Wilson, MD, chancellor of the University of Colorado at Denver and Health Sciences Center.

Dr. Wilson presented the American Glaucoma Society subspecialty day lecture Saturday afternoon, discussing the concept of the risk factor as well as its evolution and its practice. Probability, one of the building blocks of the risk factor, is based on acceptance of uncertainty, an idea that caused it to be rejected until the 20th century because the notion of uncertainty clashed with the prevailing philosophy of determinism, Dr. Wilson said. Statistics, introduced in the mid-1700s, was less controversial because it was useful in quantification.

The general concept of the risk factor began to evolve with the growth of the life insurance industry and its need to identify factors influencing the risk of death as well as through acceptance of the idea of multifactorial disease etiology. The shift from infectious diseases to chronic diseases as major causes of death in the United States also played a role.

Once the concept of risk factors was accepted, it was first used in the Framingham Heart Study in 1950, and the term first appeared in a study publication in 1961. By 1970, the term was still used relatively infrequently, with only 24 references in the medical literature, most of them referring to the Framingham study, Dr. Wilson said. By 1980 there were 699 articles, and by 2006, with the concept solidly entrenched, there were almost 40,000.

Although the concept is now well established, its application in medicine is at different stages depending on the specialty, Dr. Wilson said. Global risk assessment has become quite sophisticated in cardiovascular disease, and similarities have been noted between the epidemiology of coronary heart disease and that of glaucoma. A number of risk factors for glaucoma have been identified, but the development of risk calculators is in the early stages.

Predicting the future of risk factors, Dr. Wilson said that the concept would be at the heart of transformative changes in medicine. When an individual's genetic fingerprint can be readily determined, predicting what diseases he or she might develop over time, "prospective medicine" could become routine. This would entail continuous individual risk assessment and planning, a major shift from today's emphasis on the occurrence of a molecular event, Dr. Wilson said.

Dr. Wilson is featured in Ophthalmology Times' AAO Podcast Series.
