Problem Statement. To determine whether a common peer assessment instrument can assess competencies across the internal medicine, pediatrics, and psychiatry specialties.

Method. A common 36-item peer survey assessed psychiatry (n = 101), pediatrics (n = 100), and internal medicine (n = 103) specialists. Cronbach's alpha and generalizability analysis were used to assess reliability, and factor analysis was used to address validity.

Results. A total of 2,306 surveys (94.8% response rate) were analyzed. The Cronbach's alpha coefficient was .98, and the generalizability analysis (mean of 7.6 raters) produced a coefficient of Ep² = .83. Four factors emerged, with a similar pattern of relative importance for pediatricians and internal medicine specialists, for whom the first factor was patient management; communication was the first factor for psychiatrists.

Conclusions. The reliability and generalizability coefficient data suggest that using the instrument across specialties is appropriate, and differences in factor structure confirm the instrument's ability to discriminate between specialties, providing evidence of validity.
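For readers unfamiliar with the internal-consistency statistic reported above, the following is a minimal sketch of how Cronbach's alpha is computed from a respondents-by-items score matrix. The 5x4 ratings matrix is purely illustrative and is not drawn from the study's data.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance of totals).

    scores: list of respondents, each a list of k item scores.
    """
    k = len(scores[0])  # number of items

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings: 5 respondents x 4 survey items on a 1-5 scale.
ratings = [
    [4, 5, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 4],
]
print(round(cronbach_alpha(ratings), 2))  # prints 0.97
```

Values near 1, such as the .98 reported in the Results, indicate that the survey items are highly consistent with one another.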