When a significant event occurs at a predictable time, an animal can learn to adjust
its behavior in an appropriate manner. An observer can then use the animal's behavior as a
clock to guess quite accurately the time of occurrence of the significant event. What
clock did the animal use to predict the event? No external clock was available, so the
fact of temporal discrimination suggests that the animal has some sort of internal clock
which it can read. Our general question is, what are the properties of this internal clock?
In this chapter I will briefly review the use of "time" as a stimulus in studies
of animal conditioning and learning, and describe several types of explanations of temporal
discrimination, emphasizing explanations which make use of the concept of an
"internal clock". I will then go on to describe some of the properties of the
internal clock of a rat, and the types of control that a rat has over its internal clock.
I. Time as a Stimulus
A. Studies of Temporal Discrimination
If we divide events into stimuli (S), responses (R), and outcomes (O), it is possible to
consider the following time intervals: the durations of S, R, and O; the intervals between
S-S, R-R, and O-O; the intervals between S-R, S-O, and R-O; and the intervals between R-S,
O-S, and O-R. Most of these intervals have recognizable names, and an extensive history. In
this section, I will briefly refer to some of the studies which suggest that animals may
use time as a stimulus.
1. The Duration of a Stimulus. The duration of a stimulus, like other quantitative
properties of a stimulus, can be discriminated by an animal and can influence its
behavior. For example, in one study a pigeon was trained to make one response if the
stimulus was shorter than a criterion, and a different response if the stimulus was longer
than a criterion (Stubbs, 1968). The results were plotted as a standard psychometric
function - percentage of responses of one kind as a function of stimulus duration. When
duration was scaled in logarithmic units, the psychometric function was approximately a
normal ogive.
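The symmetry of such a psychometric function on a logarithmic duration scale can be sketched numerically. The sketch below assumes a logistic ogive with a hypothetical criterion and slope chosen for illustration; these are not parameters from Stubbs (1968).

```python
import math

def p_long(duration, criterion=4.0, slope=3.0):
    """Proportion of "long" responses for a stimulus of a given duration (sec).

    Hypothetical logistic ogive in log-duration units: symmetric about the
    criterion when duration is scaled logarithmically. The criterion (4 s)
    and slope (3.0) are illustrative values only.
    """
    x = math.log(duration / criterion)  # 0 at the criterion duration
    return 1.0 / (1.0 + math.exp(-slope * x))

# Equal log-steps below and above the criterion give mirror-image probabilities:
# p_long(2) + p_long(8) = 1, since 2 s and 8 s are one log-unit (base 2) from 4 s.
for d in (1.0, 2.0, 4.0, 8.0, 16.0):
    print(f"{d:5.1f} s -> {p_long(d):.2f}")
```

Plotting these proportions against raw duration would give the skewed curve typical of duration discrimination; against log duration, the curve is symmetric.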
2. The Duration of a Response. The duration of a response, like other
quantitative properties of a response, can be changed by differential reinforcements. For
example, Platt, Kuch, and Bitgood (1973) reported the results of a study in which a rat
received food whenever it depressed a lever longer than a criterion duration. The median
response duration increased as a power function of the duration of the criterion.
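A power-function relationship of this kind is linear in log-log coordinates, which is how such fits are usually checked. The constants a and b below are hypothetical stand-ins, not estimates from Platt, Kuch, and Bitgood (1973).

```python
import math

def median_duration(criterion, a=1.2, b=0.9):
    """Median response duration as a power function of the criterion duration.

    median = a * criterion ** b, with hypothetical constants a and b
    chosen purely for illustration.
    """
    return a * criterion ** b

# In log-log coordinates the function is a straight line whose slope is b:
c1, c2 = 2.0, 8.0
slope = (math.log(median_duration(c2)) - math.log(median_duration(c1))) / \
        (math.log(c2) - math.log(c1))
print(round(slope, 3))  # prints 0.9, recovering the exponent b
```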
3. The Duration of an Outcome. The duration of an outcome, like other
quantitative properties of a reinforcer, affects the magnitude of reinforcement. For
example, in one study, if a pigeon pecked one key it occasionally had access to grain for k1
sec; if it pecked a second key it equally often had access to the grain for k2 sec.
Under such conditions, the relative rate of responding was approximately equal to the
relative duration of reinforcement (Catania, 1963).
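The matching relationship just described can be written as a one-line prediction: responses on key 1, as a fraction of all responses, approximately equal k1 / (k1 + k2). The function below is a sketch of that relationship, not a reproduction of Catania's (1963) analysis.

```python
def predicted_relative_rate(k1, k2):
    """Predicted relative response rate on key 1 under duration matching.

    k1, k2: durations of access to grain (sec) for pecks on keys 1 and 2.
    Returns responses on key 1 as a fraction of total responses, assuming
    relative rate matches relative reinforcer duration.
    """
    return k1 / (k1 + k2)

print(predicted_relative_rate(6.0, 2.0))  # prints 0.75: three quarters of pecks on key 1
```

With equal durations (k1 = k2) the prediction is indifference, 0.5 on each key.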
4. The Interval between Stimulus and Stimulus. Most of the time intervals
investigated by experimental psychologists have been on the order of a few seconds or
minutes, but rats discriminate temporal intervals much shorter than a second and much
longer than a few minutes. For example, various clicker frequencies serve as
discriminative stimuli, and there is reason to believe that the Kamin effect (Holloway
& Wansley, 1973) may be due primarily to generalization decrement, i.e., the level of
performance is highest when the test trials occur at the same time (relative to a 12- or
24-hr clock) as the training trials.
5. The Interval between Response and Response. If reinforcement is made
available only if the animal withholds responding longer than a criterion time, the
distribution of interresponse times is closely related to the minimum reinforced
interresponse time (e.g., Kramer & Rilling, 1970).
6. The Interval between Outcome and Outcome. For temporal conditioning, a
reinforcer (e.g., food powder) is delivered at regular time intervals (e.g., every 30
min). In one experiment (Pavlov, 1927), the rate of salivation of dogs regularly increased
as the time of the next reinforcer approached. An aversive event presented at regular
intervals can also lead to temporal conditioning (e.g., LaBarbera & Church, 1974).
Most recent studies of temporal conditioning have used general activity or a trained
instrumental act as the response, and either food or electric shock as the reinforcer.
Killeen (1975) has provided a good review of the facts, and some new interpretations.
7. The Interval between Stimulus and Outcome. When food is delivered at a fixed
time after signal onset (inhibition of delay), the response rate increases as the time to
the next reinforcement approaches (e.g., Pavlov, 1927). Many studies of the relationships
between response latency and the interval between warning signal and the shock (the CS-UCS
interval) in avoidance learning have verified this result and extended its generality.
8. The Interval between Response and Outcome. In Sidman avoidance a response
delays the onset of the next shock by a particular number of seconds. The response rate
increases as the time of the next shock approaches (e.g., Anger, 1963; Gibbon, 1972; Libby
& Church, 1974).
9. The Interval between Outcome and Response. In the case of fixed-interval
schedules of reinforcement (FI), if the interval of time between the previous
reinforcement and the response exceeds a criterion, the animal will be reinforced
again. In such situations, animals respond increasingly rapidly as the time of the next
reinforcement approaches (e.g., Dews, 1970).
This list of procedures is far from exhaustive, but it illustrates the fact that
animals learn to adjust their behavior in a manner appropriate to a time interval. The
time interval can begin with a stimulus, a response, or an outcome; it also can end with
any of these events. The duration of the interval being timed may be marked by some
distinctive cue (filled) or it may not (unfilled). Although the data in this chapter come
from only a few procedures, the conclusions about the nature of the internal clock may be
quite general. An animal may not be much affected by the nature of the signal that starts,
maintains, or stops its timing mechanism.