Most of the radiation doses received by members of the public and by radiation workers, both routinely and in accidents, are what are commonly referred to as "low doses." There is no precise definition of "low," but it would include doses below, for example, 10 mSv per year. As seen from Table 15-2, the average radiation doses received by people in the U.S. are in the "low dose" region. It is obviously important to determine the effects of low radiation doses, or, more precisely, the effects of small additions to the unavoidable natural background dose.
However, despite much study, these effects are not known, because they are too small to observe unambiguously. The most prominent assumption, accepted by most official bodies, is the so-called linearity hypothesis, according to which the cancer risk is directly proportional to the magnitude of the dose, down to zero dose. In applying this assumption, a consensus estimate is that the risk to a "typical" individual of an eventual fatal cancer is 0.00005 per mSv (or 0.05 per Sv). Thus, if 100,000 people each receive an added dose of 1 mSv, then 5 additional cancer deaths are to be expected. At the same time, while adopting the linearity hypothesis as a prudent working assumption, many of the leading studies have also indicated the possibility that small increases in radiation dose do not create any additional cancer risk. This reflects the considerable disagreement that exists within the scientific community as to the validity of the linearity hypothesis (see Appendix F).
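The arithmetic behind the linearity hypothesis can be sketched as follows. This is an illustrative calculation only, using the consensus risk coefficient quoted above (0.00005 fatal cancers per mSv); the function name and population figures are for the example, not part of any standard.

```python
# Illustrative sketch of the linearity (linear no-threshold) hypothesis:
# expected excess fatal cancers scale directly with collective dose.

RISK_PER_MSV = 0.00005  # consensus estimate: fatal-cancer risk per mSv

def expected_excess_deaths(population: int, added_dose_msv: float) -> float:
    """Expected additional fatal cancers under the linearity hypothesis,
    assuming risk is proportional to dose all the way down to zero dose."""
    return population * added_dose_msv * RISK_PER_MSV

# The example from the text: 100,000 people each receiving an added 1 mSv.
print(expected_excess_deaths(100_000, 1.0))  # about 5 additional deaths
```

Note that the same collective dose spread more thinly, say 1,000,000 people at 0.1 mSv each, yields the same expected total under this hypothesis; that property is precisely what critics of the linearity assumption question.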