There are many ways to present information to visitors and users of 2D and 3D interface environments. In these virtual environments we can provide visitors with simulations of real environments, including simulations of presenters (a lecturer, a sales agent, a receptionist, a museum guide) and support for audience participation. Our research aims at generating presentations from available multimedia information. In particular, we are interested in the generation of presentations by embodied conversational agents that employ both verbal and nonverbal capabilities. In the past we have seen the introduction of embodied agents and robots that take the role of a museum guide, a news presenter, a teacher, a receptionist, or a sales agent for insurance, houses, or tickets. In all these cases the embodied agent needs to explain and describe. The fully automatic generation of presentations and presentation agents from information sources is still too ambitious a task. Therefore we approach the problem from the perspective of designing tools that can support presenters or provide natural access to presentations and lectures. Can we use a given collection of slides, and possibly other accessible media sources, to design, create, and generate an embodied presenter? Among other topics, we discuss the manual annotation of available information and the ways in which presenter agents can use it. Clearly, the development of such tools is a first step towards automating the generation of presentations and presentation agents.