Our Stolen Future: a book by Theo Colborn, Dianne Dumanoski, and John Peterson Myers
Clegg, LX, EJ Feuer, DN Midthune, MP Fay and BF Hankey. 2002. Impact of Reporting Delay and Reporting Error on Cancer Incidence Rates and Trends. Journal of the National Cancer Institute 94:1537–45.


Press coverage of this study

For years, optimistic messages coming from the National Cancer Institute and the American Cancer Society had implied that after decades and billions of dollars we were winning the war on cancer.

Mortality rates were on the way down because of better treatment, and even incidence rates appeared to be leveling off, meaning that fewer people were developing cancer than expected from past patterns. But this new analysis by Clegg et al. reveals that for several of the most common cancers the incidence statistics are far less rosy.

Incidence rates are still rising, not falling, and the trends for several cancers are alarming. Thus while fewer people are dying, more people are having their lives profoundly disrupted by cancer.

What did they do? Clegg et al. focused on the fact that it takes time for all cancer cases to be reported to the national registry of cancer, NCI's SEER (Surveillance, Epidemiology, and End Results), and for diagnosis corrections to be incorporated. Previous estimates of cancer incidence rates had not accounted for this delay adequately.

Some adjustment for delayed reporting was built explicitly into the system by simply delaying the first reports of a year's data for two years, waiting beyond the "deadline" (19 months after diagnosis) for cancer case reporting. But some cases take much longer to report, sometimes many years. Periodically the cancer incidence rates would be corrected to incorporate these delayed cases. Some of the corrections also involved removal of cases after corrected diagnosis.

Clegg et al. developed a computer model that allowed them, based on past experience, to predict the number of cases that are missed due to reporting delay or removed because of diagnosis correction. To build the model, they analyzed the history of reporting delays at nine cancer registries across the US, focusing on melanoma, prostate cancer, female breast cancers, colorectal cancer and lung/bronchus cancers. They then used the model to predict missed/removed cases in the most recent SEER cancer incidence report, 1998 (published in 2000), and compared rates reported there with rates calculated by summing known with predicted cases.
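At its core, the delay adjustment amounts to dividing the count reported so far by an estimated fraction of final cases already in hand. A minimal sketch of that step (the function name and completeness value are illustrative assumptions, not the authors' actual model, which estimates completeness from each registry's reporting history):

```python
# Sketch of the delay-adjustment step: divide the observed count by
# the estimated fraction of final cases reported so far. The
# completeness value used below is illustrative, not SEER's estimate.

def delay_adjusted_count(observed: float, completeness: float) -> float:
    """Estimate the final case count from a partially reported one."""
    if not 0 < completeness <= 1:
        raise ValueError("completeness must be in (0, 1]")
    return observed / completeness

# Example: 10,000 cases reported so far, with an assumed 96% of
# final cases already reported.
final_estimate = delay_adjusted_count(10_000, 0.96)
```

The real model predicts completeness separately by cancer site, registry, and years since diagnosis; the division above is the common final step.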

What did they find? Clegg et al. found that reporting delays for cancer cases can last years. They calculate that it "would take 4–17 years for 99% or more of the cancer cases to be reported," and that the numbers in hand at the end of the current reporting period, 2 years, amount to somewhere between 88% and 97% of final cases counted.

These delays, in turn, introduce a bias into comparisons of recent vs less-recent years. More cases were removed than added using the old approach, and while this led to a better understanding of what had happened in earlier years, it created an inappropriate basis for comparison between older and recent years: the old data set was complete, the more recent one was not. Any comparison of time trends was then biased toward finding lower rates in the most recent years, all other things being equal.

How big an effect did that bias create? The corrected estimate for female breast cancer in whites in 1998, for example, is 4% higher than the uncorrected calculation.

Cancer                            1998 Adjustment
Female breast cancer              +4%
Prostate cancer (white males)     +12%
Prostate cancer (black males)     +14%
Colorectal cancer                 +3%
Melanoma (whites)                 +14%
Lung cancer                       +4%
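These adjustment percentages follow directly from the completeness figures quoted earlier: if only a fraction p of final cases has been reported, dividing by p raises the rate by (1/p - 1). A quick arithmetic check (a rough sanity calculation, not the authors' computation) shows that the 88%-97% completeness range maps to corrections of roughly +3% to +14%, the same span the table shows:

```python
# Upward correction implied by a given reporting completeness:
# corrected = observed / completeness, so the percent increase
# is (1/completeness - 1) * 100.

def percent_adjustment(completeness: float) -> float:
    return (1.0 / completeness - 1.0) * 100.0

low = percent_adjustment(0.97)   # roughly +3%
high = percent_adjustment(0.88)  # roughly +14%
```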

These adjustments alter not only the assessment of cancer incidence in 1998; they also change the trend analysis. For example, before the adjustment, the trend analysis of female breast cancer for the years 1987 to 1998 was flat: the trend was not statistically distinguishable from no annual change. The adjustment, however, revealed a statistically significant 0.6% annual increase in breast cancer incidence during recent years.

Cancer                               Annual trend (before)   Annual trend (after)
Female breast cancer (white)         +0.4% ns                +0.6%
Prostate cancer (white males)        -0.1% ns                +2.2% ns
Colorectal cancer (white females)    +0.9% ns                +2.8% ns
Melanoma (white males)               -4.2% ns                +4.1%
Lung cancer (white females)          +0.5% ns                +1.2%

["ns" indicates that the trend is statistically indistinguishable from no annual change]

What does it mean? According to the authors, "our results suggest that ignoring reporting delay and reporting error may result in the false impression of a recent decline in cancer incidence when the apparent decline is, in fact, caused by delayed reporting of the most recently diagnosed cases."

This is important because the SEER data are key signposts used by the policy and health community to gauge how well we are faring in the war against cancer. As long as the incidence trends were perceived to be headed downward, people could argue that we are winning. This new analysis suggests that we continue to lose, and should heighten pressure to direct new resources toward understanding the causes of cancer and toward prevention.
