Event-related potentials (ERPs) were recorded from adults and 4-month-old infants while they watched pictures of faces that varied in emotional expression (happy or fearful) and in gaze direction (direct or averted). Results indicate that the processing of emotional expression is temporally independent of gaze direction processing at early stages, and that the two become integrated only at later latencies. Facial expression affected the face-sensitive ERP components in both adults (N170) and infants (N290 and P400), whereas gaze direction and the interaction between facial expression and gaze affected posterior channels in adults and frontocentral channels in infants. Specifically, in adults this interaction reflected greater responsiveness to fearful expressions with averted gaze (avoidance-oriented emotion) and to happy faces with direct gaze (approach-oriented emotion). In infants, a larger response to happy expressions was found at the frontocentral negative component (Nc), and planned comparisons showed that this effect was driven by the direct gaze condition. Taken together, these results support the shared signal hypothesis in adults, but only to a lesser extent in infants, suggesting that experience may play an important role.