In this article, we describe an enactive architecture that allows a humanoid robot to learn to compose simple actions into turn-taking behaviors while playing interaction games with a human partner. The robot's action choices are reinforced by social feedback from the human in the form of visual attention and measures of behavioral synchronization. We demonstrate that the system can acquire behaviors through interaction and switch between them based on social feedback from the human partner. The role of reinforcement based on a short-term memory of the interaction is investigated experimentally. Results indicate that feedback based only on the immediate state is insufficient for learning certain turn-taking behaviors; therefore, some history of the interaction must be considered in the acquisition of turn-taking, which can be handled efficiently through the use of short-term memory.