Abstract Exposure to ultraviolet (UV) radiation is widely recognized as the most important environmental risk factor for melanoma. Measuring UV exposure in humans, however, has proved challenging. Despite general agreement that an objective metric of individual UV exposure is needed to properly assess melanoma risk, little attention has been paid to the accuracy of UV exposure measurement. The present study used a GIS-based historical UV exposure model, for which the accuracy of the exposure estimates is known, to examine in a case–control setting the importance of modeled UV exposure, relative to self-reported time spent outdoors, in melanoma risk. UV estimates were coupled with the residential histories of 820 representative melanoma cases among non-Hispanic white residents of Los Angeles County under 65 years of age and of 877 controls matched to cases on age, sex, race, and neighborhood of residence, to calculate cumulative lifetime UV exposure and average annual UV exposure. For these historical measures, we also calculated UV estimates for periods when participants resided outside the US. Whereas self-reported time spent outdoors was not associated with increased melanoma risk, both annual average UV exposure and cumulative UV exposure based on residential history were strongly associated with melanoma. Time spent in outdoor activities had no significant effect on melanoma risk in any age stratum; however, after adjustment for UV exposure based on residential history, time spent outdoors at young ages significantly increased melanoma risk. Although risk estimates were somewhat attenuated when we excluded data from periods of overseas residence (as all other studies we are aware of have done), this exclusion did not materially change the risk estimates for UV exposure and melanoma.