How accurate is an optical age?

A common question from geoscientists and archaeologists is: how accurate are luminescence ages? Under ideal conditions (e.g., bright optical signals well suited to the measurement protocol used in the lab, and a well-constrained dose-rate history), the error on an optical age can be 10% of the age or less.
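As a rough sketch of where that figure comes from: an optical age is the equivalent dose (De) divided by the environmental dose rate, and its relative error follows from combining the relative errors of those two quantities in quadrature. The numbers in the example below are hypothetical, chosen only to illustrate the arithmetic.

```python
import math

def optical_age(de_gy, de_err_gy, dose_rate_gy_ka, dose_rate_err_gy_ka):
    """Return an optical age (ka) and its 1-sigma uncertainty.

    Age = equivalent dose (De) / environmental dose rate.
    Relative errors are combined in quadrature, the usual
    first-order propagation for a simple quotient.
    """
    age = de_gy / dose_rate_gy_ka
    rel_err = math.sqrt((de_err_gy / de_gy) ** 2 +
                        (dose_rate_err_gy_ka / dose_rate_gy_ka) ** 2)
    return age, age * rel_err

# Hypothetical values: De = 25.0 +/- 1.5 Gy, dose rate = 2.5 +/- 0.12 Gy/ka
age, age_err = optical_age(25.0, 1.5, 2.5, 0.12)
print(f"Age = {age:.1f} +/- {age_err:.1f} ka")  # ~10.0 +/- 0.8 ka (about 8%)
```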

Relative errors in optical ages, however, can be harder to minimize in young (e.g., <1000 years) sediments than in older (mid- to late-Pleistocene) samples. This is largely because younger samples (i) tend to have dim signals, (ii) can be more adversely affected by temporal and spatial variations in the dose-rate field during burial, (iii) are more sensitive to incomplete resetting of the optical signal during sediment transport and burial, and (iv) are particularly sensitive to the heating treatments applied in the laboratory during De measurement (Madsen & Murray, 2009).

Figure: Optical ages vs. their relative errors for young (<1000 yr) and older (mid- to late-Pleistocene) sediments. The highest relative errors are commonly associated with sediments ~100 years in age or less.

So, in the most favourable cases, late-Holocene optical ages may have errors of ±100-500 years, whereas mid-Pleistocene sediments ~200-300 ka in age may have errors of ±15-20 ka. If a sample was deposited within the last 500 years or so, absolute errors on the order of decades or less can be achieved, as the rough arithmetic below illustrates.
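For a sense of scale, this sketch converts hypothetical (but plausible) relative errors into absolute errors at a few ages; the exact percentages will vary from sample to sample.

```python
# Illustrative arithmetic only: how a roughly constant relative error
# translates into very different absolute errors across the dating range.
# The ages and percentages below are hypothetical round numbers.
for age_ka, rel_err in [(0.3, 0.10), (3.0, 0.08), (250.0, 0.07)]:
    abs_err_ka = age_ka * rel_err
    print(f"{age_ka:8.1f} ka  +/- {abs_err_ka:6.2f} ka  ({rel_err:.0%})")
# 0.3 ka at 10% -> +/- 0.03 ka (about 30 years);
# 250 ka at 7%  -> +/- 17.5 ka.
```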