Pigeons pecked keys on concurrent-chains schedules that provided a variable-interval (VI) 30-sec schedule in the initial links. One terminal link provided reinforcers in a fixed manner; the other provided reinforcers in a variable manner with the same arithmetic mean as the fixed alternative. In Experiment 1, the terminal links provided fixed- and variable-interval schedules. In Experiment 2, the terminal links delivered reinforcers after a fixed or a variable delay following the response that produced them. In Experiment 3, the terminal links provided reinforcers that were fixed or variable in size. Rate of reinforcement was varied by changing the scheduled interreinforcer interval in the terminal links from 5 to 225 sec. The subjects usually preferred the variable alternative in Experiments 1 and 2, but their preferences differed in Experiment 3. The preference for variability was usually stronger at lower rates of reinforcement (longer terminal links) than at higher rates (shorter terminal links). Preference did not change systematically with time in the session. Some aspects of these results are inconsistent with explanations of the preference for variability in terms of scaling factors, scalar expectancy theory, risk-sensitive models of optimal foraging theory, and habituation to the reinforcer. Initial-link response rates also changed within sessions when the schedules provided high, but not low, rates of reinforcement. These within-session changes in responding were similar for the two initial links. These similarities imply that habituation to the reinforcer should be represented in theories of choice differently from other variables related to reinforcement.
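The concurrent-chains procedure described above can be sketched as a small simulation. This is a minimal illustration, not the authors' apparatus or analysis code: it assumes constant-probability VI timing (exponential intervals), and it simplifies entry into a terminal link by letting whichever initial-link VI times out first win the cycle, whereas in the actual procedure entry also depends on the bird's pecking. The function name and parameters are hypothetical.

```python
import random


def simulate_concurrent_chains(mean_terminal=30.0, n_cycles=500, seed=0):
    """Sketch of one condition: two VI 30-sec initial links, one leading to a
    fixed terminal link and the other to a variable terminal link with the
    same arithmetic mean delay to reinforcement.

    Returns the number of entries into each terminal link and the total time
    spent in each, so mean obtained delays can be compared.
    """
    rng = random.Random(seed)
    entries = {"fixed": 0, "variable": 0}
    time_fixed = 0.0     # total delay accumulated in the fixed terminal link
    time_variable = 0.0  # total delay accumulated in the variable terminal link

    for _ in range(n_cycles):
        # Constant-probability VI: intervals drawn from an exponential
        # distribution with a 30-sec mean for each initial link.
        t_fixed_link = rng.expovariate(1 / 30.0)
        t_variable_link = rng.expovariate(1 / 30.0)

        # Simplifying assumption: the initial link whose interval elapses
        # first determines which terminal link is entered this cycle.
        if t_fixed_link <= t_variable_link:
            entries["fixed"] += 1
            time_fixed += mean_terminal  # fixed delay to the reinforcer
        else:
            entries["variable"] += 1
            # Variable delay with the same arithmetic mean as the fixed one.
            time_variable += rng.expovariate(1 / mean_terminal)

    return entries, time_fixed, time_variable
```

Because the two distributions share an arithmetic mean, the obtained mean delays converge over many cycles; under this simplified entry rule, any preference for variability would instead show up in the birds' relative initial-link response rates, which the simulation does not model.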