This report describes the construction and evaluation of a 35-item checklist used in performing peer review of ambulatory medical records. Scores obtained by using the checklist were evaluated for reproducibility. Ten reviewers, reviewing ten records on each of two occasions, judged the records consistently, item by item, 74 per cent of the time, 53 per cent greater than expected by chance (p less than 0.01). Pairs of reviewers, reviewing the same ten records, were consistent with one another, item by item, 72 per cent of the time, 35 per cent greater than expected by chance (p less than 0.05). Ten sick call patients were reexamined by a specially trained Reevaluation Physician, who evaluated the quality with which they had been managed at the time of sick call. The medical records of the same ten patients were then reviewed with the Peer Review Checklist. The correlations between the quality scores obtained by the two methods were 0.72 and 0.74 on two trials. A correlation coefficient of 0.44 was found between the two evaluation methods when 89 cases were reviewed by a Peer Review panel composed of 10 different physicians. Peer Review Checklist scores correlated positively with scores obtained by using a series of disease-specific protocols with explicit criteria; the correlations with six different disease-specific protocols ranged from 0.28 to 0.63.