B.F. Skinner and radical behaviorism:

B.F. Skinner carried out experimental work mainly in comparative psychology from the 1930s to the 1950s, but remained behaviorism's best-known theorist and exponent virtually until his death in 1990. He developed a distinct kind of behaviorist philosophy, which came to be called radical behaviorism, and he also claimed to have founded a new version of psychological science, which he called behavior analysis or the experimental analysis of behavior.

Definition of radical behaviorism:

Skinner was influential in defining radical behaviorism, a philosophy codifying the basis of his school of research (named the Experimental Analysis of Behavior, or EAB). While EAB differs from other approaches to behavioral research on numerous methodological and theoretical points, radical behaviorism departs from methodological behaviorism most notably in accepting the treatment of feelings, states of mind, and introspection as existent and scientifically treatable. This is done by identifying them as something non-dualistic; here Skinner takes a divide-and-conquer approach, identifying some instances with bodily conditions or behavior and giving others a more extended 'analysis' in terms of behavior. However, radical behaviorism stops short of identifying feelings as causes of behavior. Among other points of difference were a rejection of the reflex as a model of all behavior and a defense of a science of behavior complementary to, but independent of, physiology.

Skinner's conceptual innovations:

This essentially philosophical position gained strength from the success of Skinner's early experimental work with rats and pigeons, summarised in his books The Behavior of Organisms (1938) and Schedules of Reinforcement (1957, with C. B. Ferster). Of particular importance was his concept of the operant response, of which the canonical example was the rat's lever-press. In contrast with the idea of a physiological or reflex response, an operant is a class of structurally distinct but functionally equivalent responses. For example, while a rat might press a lever with its left paw, its right paw, or its tail, all of these responses operate on the world in the same way and have a common consequence. Operants are often thought of as species of responses, where the individuals differ but the class coheres in its function: shared consequences in the case of operants, reproductive success in the case of species. This is a clear point of distinction between Skinner's theory and stimulus-response (S-R) theory.
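The grouping principle can be made concrete. As a minimal sketch (illustrative only; the Response type, its field names, and the data below are hypothetical, not drawn from Skinner's work), the following Python groups responses into an operant by their shared consequence rather than by their physical form:

```python
# Illustrative sketch: an operant as a class of structurally distinct but
# functionally equivalent responses, grouped by their common consequence.
from dataclasses import dataclass

@dataclass(frozen=True)
class Response:
    topography: str    # the physical form of the act (which paw, etc.)
    consequence: str   # what the act produces in the environment

def operant_class(responses, consequence):
    """Collect all responses that share the given consequence into one operant."""
    return {r for r in responses if r.consequence == consequence}

observed = [
    Response("left paw press", "lever depressed -> food"),
    Response("right paw press", "lever depressed -> food"),
    Response("tail press", "lever depressed -> food"),
    Response("grooming", "no programmed consequence"),
]

lever_press = operant_class(observed, "lever depressed -> food")
print(len(lever_press))  # 3: the presses differ in form but make up one operant; grooming does not
```

The point of the sketch is that membership in the operant is decided by function (the shared consequence), not by the topography of the movement.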

Another crucial contribution was his clarification of the key concept of reinforcement, which had been introduced by Thorndike and used extensively by Hull, but which seemed to be mired in issues of definitional circularity. Whereas Thorndike had tried to define reinforcement mentalistically, as a "satisfying state of affairs", and Hull had tried to define it physiologically, in terms of the reduction of a drive, Skinner defined it empirically: if an event made contingent on a response was experimentally observed to increase the rate of that response, it was then called a reinforcer for that particular animal at that time. Food, water, brain stimulation, sex, social contact, and reinforcing drugs are all reinforcers that have been used in operant research with animals. The issue of whether these stimuli were satisfying to the animal (Thorndike's definition) was thereby bypassed, and the issue of whether they involved the reduction of a drive was left open for empirical physiological investigation (and it was quickly realised that many do not).
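Because the definition is purely empirical, it amounts to a before/after test on observed behavior. A minimal sketch of that test in Python (the function names, timestamps, and ten-minute sessions are hypothetical illustrations, not data from the operant literature):

```python
# Illustrative sketch of Skinner's empirical definition of a reinforcer:
# an event counts as a reinforcer, for this animal at this time, only if
# making it contingent on a response raises the rate of that response.

def response_rate(press_times, duration_minutes):
    """Responses per minute over an observation period."""
    return len(press_times) / duration_minutes

def acted_as_reinforcer(baseline_times, contingent_times, duration_minutes):
    """True if the response rate rose once the event followed each response."""
    return response_rate(contingent_times, duration_minutes) > response_rate(baseline_times, duration_minutes)

# Hypothetical lever-press timestamps (in minutes) for two 10-minute sessions:
# a baseline with no programmed consequence, and a session in which each press produces food.
baseline  = [1.2, 3.5, 6.8, 9.1]
with_food = [0.4, 1.1, 1.9, 2.6, 3.2, 4.0, 4.7, 5.5, 6.3, 7.0, 7.8, 8.6, 9.4]

print(acted_as_reinforcer(baseline, with_food, duration_minutes=10))  # True: food raised the rate
```

Nothing in the test asks whether the food was satisfying or reduced a drive; the label 'reinforcer' is assigned solely from the observed change in response rate.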
