Mozgostroje

Reductionism for Psychological Science

04 May 2014


I plan to post about inference in psychology - how it is done and how it should be done. In this post I want to clear some ground by reviewing some historical ideas. These ideas should be familiar to most psychologists, though they may be new to researchers from other fields. I will use the terminology of Norbert Bischof (2008; Chap. 8-11). I suppose this terminology will be unfamiliar to anglophone readers. Still, I think introducing new terminology is easier than attempting to clarify and recycle the available ideas.

A key aspect of science is the mapping of observations onto laws, models and theories. This mapping usually omits some details about the observations and can therefore be labeled reduction. Bischof distinguishes two kinds of reduction: qualitative and quantitative reduction. A typical example of qualitative reduction is provided by the laws of physics. These tell us how one quality can be derived from other qualities. For instance, the velocity of a falling object can be derived from the duration of the fall and the gravitational acceleration as $v = g t$. The equation works regardless of the numbers you plug in. This is where the reduction and abstraction happens. The equation works and provides robust predictions regardless of whether the object has been falling for 5 or for 10 seconds. It works on Earth but also on the Moon, where $g$ is different. However, it outputs nonsense if you plug in values of a different quality. If you plug in a temperature for $t$ instead of a time, the equation will not return a velocity. It will return nonsense. In principle we could imagine equations where the quality of a variable can be exchanged. But as it happens, the laws of physics are mostly not of this sort.
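To make this concrete, here is a minimal sketch in Python (my own illustration, not from Bischof) of how the equation abstracts over values but not over qualities:

```python
G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2

def fall_velocity(g, t):
    """Velocity of a freely falling object after t seconds: v = g * t."""
    return g * t

# The equation abstracts over the values ...
print(fall_velocity(G_EARTH, 5))    # ~49 m/s after 5 s on Earth
print(fall_velocity(G_MOON, 10))    # ~16 m/s after 10 s on the Moon

# ... but not over the qualities: a temperature is not a duration.
# The call below still returns a number, but that number is not a velocity.
not_a_velocity = fall_velocity(G_EARTH, 21)  # 21 degrees Celsius, not 21 seconds
```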

The other kind of reduction goes the opposite way. It discards the qualitative aspect and gives a prescription for how to derive the quantities. This kind of reduction is mainly used in computer science, biology and engineering to describe events in complex systems. The content of the abstraction is described by a block diagram or an algorithm rather than by an equation. The block diagram specifies how the quantities interact. However, it does not need to specify the quality. That depends on how the system is implemented. In fact, a system can be described without any reference to the units (quality) of the signals running within it. To give examples, below I present the diagrams for a thermostat, the von Neumann computer architecture, a sketch of an evolutionary algorithm and the model of a cortical column from the Blue Brain Project.
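As a toy version of the first example, here is a minimal thermostat loop in Python (my own sketch, not one of Bischof's diagrams). The control logic only relates quantities to one another; the units of the sensor signal are whatever the implementation happens to use:

```python
def thermostat(read_sensor, heater_on, heater_off, setpoint, hysteresis=0.5):
    """One step of a bang-bang thermostat.

    The logic relates quantities (sensor reading vs. setpoint) and says
    nothing about their quality - the units are left to the implementation
    of read_sensor.
    """
    value = read_sensor()
    if value < setpoint - hysteresis:
        heater_on()
    elif value > setpoint + hysteresis:
        heater_off()

# Example implementation with a toy "room" measured in degrees Celsius:
room = {"temp": 18.0, "heating": False}
thermostat(lambda: room["temp"],
           lambda: room.update(heating=True),
           lambda: room.update(heating=False),
           setpoint=21.0)
print(room["heating"])  # True - the room is below the setpoint
```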

Nomological and Systemic reduction

Qualitative and quantitative reduction have different consequences for how the knowledge in a scientific field is structured and accumulated. Physics strives towards unification. Maxwell came up with a unified theory of electric and magnetic forces. Einstein's theory of relativity extended Newton's classical mechanics. Physicists hope they will one day have a theory of everything. This will presumably take the form of an equation that connects quantum theory (the standard model) with general relativity (assuming the standard model is correct). Bischof labels this process of abstraction nomological reduction.

Looking at quantitative reduction we find the opposite pattern of knowledge accumulation. The most general laws of how systems behave are already well known. These are given by the evolutionary algorithm, the foundations of information theory (derived mostly by Shannon more than half a century ago) and the foundations of theoretical computer science (automata, grammars, Turing machines, etc.). Our task now is to work towards a model of a particular system - in the context of psychology, the human mind. The simplest way to do this is to start with a subsystem and iteratively extend its scope until the model reasonably resembles human behavior. If multiple research groups work in parallel on different subsystems, they will eventually merge their models. There is also a certain sense of unification behind this process. However, systemic reduction (as Bischof labels this process) does not result in simpler models. Rather, the process leads to ever-growing block diagrams and ever more complex algorithms. Furthermore, as the model grows, the group of systems it describes - that is, the group of phenomena it models - becomes smaller.
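For readers who have not seen the evolutionary algorithm spelled out, here is a toy sketch in Python (my own illustration; selection and mutation only, no crossover):

```python
import random

def evolve(fitness, genome_length=10, pop_size=20, generations=50, mutation_sd=0.1):
    """Toy evolutionary algorithm: keep the fitter half, mutate their offspring."""
    population = [[random.random() for _ in range(genome_length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half of the population survives.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        # Reproduction with mutation: each survivor produces one noisy copy.
        offspring = [[g + random.gauss(0, mutation_sd) for g in genome]
                     for genome in survivors]
        population = survivors + offspring
    return max(population, key=fitness)

# Example: evolve a genome whose elements approach 1.
best = evolve(fitness=lambda genome: -sum((g - 1) ** 2 for g in genome))
print(best)
```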

Aesthetic and Functional heuristic

Different heuristics have proved efficient in the context of each type of reduction. Physicists will point you to the simplicity and beauty of their laws. Bischof cites some nice examples of this. For instance, Richard Feynman writes "You can recognize truth by its beauty and simplicity" and Newton famously wrote "Nature does nothing in vain, and more is in vain, when less will serve; for nature is pleased with simplicity, and affects not the pomp of superfluous causes". This heuristic is useful because, given two theoretical proposals that are both well supported by the evidence, it tells us to prefer the nicer and simpler one (and to test it). Bischof notes that the heuristic can well fail. He illustrates this with Kepler's attempt to derive a law that would explain the distances of the planets from the sun. Such a theory would indeed be beautiful. Unfortunately, the distances were determined by idiosyncratic processes that occurred when our solar system was in its infancy. There is no simple law.

In biology and engineering, beauty and simplicity are beside the point. Instead, functionality provides the (ultimate) perspective for deciding between theories and models. We identify the ecological niche of the organism. We then identify the problems the organism had to solve in order to survive. We can then confine our theorizing to models that are able to survive. Bischof used to illustrate the distinction between the two heuristics with the example of two keys. Consider the image below. If the right part of the key on the left is occluded, we can guess what it looks like with the help of the aesthetic heuristic. To guess what an occluded part of the key on the right looks like, we need to know the construction of the lock. Equipped with the functional heuristic, we can then guess at the shape of the key.

Program for Psychology

Bischof's recommendation for psychology is a program of quantitative, systemic reduction. This is also the direction in which most current research is heading (at least in principle). However, there have been many attempts to take a different direction. These were mainly motivated by the success of physics in the 19th and 20th centuries. Understandably, psychologists tried to replicate that success by copying the research strategies. As the discussion so far has illustrated, these research strategies are not appropriate if we wish to study complex systems.

The best-known case is behaviorism. Behaviorists were looking for laws that would relate stimulus to response. Skinner (1959) described his idea of reductionism with the graph below.

The three curves show the cumulative frequency of behavior (y axis) across time (x axis) relative to positive feedback (short vertical lines) for three different organisms. Skinner writes:

"Pigeon, rat, monkey, which is which? It doesn't matter. Of course, these three species have behavioral repertoires which are as different as their anatomies. But once you have allowed for differences in the ways in which they make contact with the environment, and in the ways in which they act upon the environment, what remains of their behavior shows astonishingly similar properties. Mice, cats, dogs, and human children could have added other curves to this figure."

Skinner doesn't care about the differences between the three systems. For him they differ only by a constant factor in a mathematical equation that describes how a reward schedule translates into a frequency of behavior. Other instances of attempts to adapt qualitative reduction to psychology include Lewin's $\mathrm{behavior}=f(\mathrm{personality}, \mathrm{environment})$ or Metzger's explorations of statistical mechanics. More recent instances are attempts to find special particles or events in the brain (e.g. quantum effects à la Eccles and Penrose) and to explain the functioning of the human mind based on these.

Needless to say, these programs had little success. With the cognitive revolution, the functionalist perspective (and quantitative reduction) ascended to become the dominant research strategy.

Mind as a crystal

Behaviorism went and cognitive science came. Physics held a strong allure for aspiring cognitive scientists, and many initial attempts to come up with theories and models utilized nomological reduction. The best examples are Chomsky's and Fodor's proposals for a language of thought and their attempts to frame the human mind in the language of automata theory. Dennett (1998, p. 276) describes the spirit behind their endeavor in his recollection of an AI conference which took place in the late 70s at Tufts University. Schank and Winograd defended the possibility of AI. Chomsky and Fodor were strongly opposed.

It began as a straightforward, first principles condemnation of conceptual error - Schank was on one fool's errand or another - but it ended with a striking concession from Chomsky: it just might turn out, as Schank thought, that the human capacity to comprehend conversation (and more generally, to think) was to be explained in terms of the interaction of hundreds or thousands of jerry-built gizmos - pseudo-representations, one might call them but that would be a shame, for then psychology would prove in the end not to be 'interesting.' There were only two interesting possibilities, in Chomsky's mind: psychology could turn out to be like physics - its regularities explainable as the consequences of a few deep, elegant, inexorable laws - or psychology could turn out to be utterly lacking in laws - in which case the only way to study or expound psychology would be the novelist's way.

The "interaction of thousands of jerry-built gizmos" is precisely what the final product of systemic reduction will look like. We also clearly see Chomsky's recourse to aesthetic heuristic here.

Like qualitative reduction, nomological reduction seems to be dead today, and if it survives, it does so only among Chomsky's acolytes. (If you have other good recent examples, please tell me in the discussion below.)

Rationality and Beauty

Even after abandoning nomological reduction, a researcher may still use either of the two heuristics. For instance, Rational Analysis (Oaksford & Chater, 1998) and its more recent Bayesian incarnation (Jones & Love, 2011) can be seen as a form of the aesthetic heuristic, with rational theory as the standard of beauty. This is at least the case when rationality is not tied to its functional value.

On the other hand, the functional heuristic itself is not without controversy. Phenomena such as sexual selection, genetic drift or epigenetic events put its validity into question. With these phenomena a researcher may wrongly propose a functional account of a trait that did not arise as an optimal solution to any problem.

Conclusion

I think that in its general outline the strategy of quantitative/systemic reduction accompanied by the functional heuristic is right. The functional heuristic will require some further qualifications, and I will return to this topic in future posts. There are also arguments against qualitative, nomological reduction coming from the embodied cogsci direction. I don't consider these serious, but I will discuss some of them in the next post. Finally, I think the really distinctive feature of the two reduction strategies is the role they assign to causality. I will return to this point later when I discuss Pearl's theory of causality. It will give us a much better grasp of what the supporters of qualitative/nomological reduction were up to.
