[1997 Wenner, A.M. The role of controversy in animal behavior. Pages 3-37 in Greenberg, C. and E. Tobach (eds). Comparative Psychology of Invertebrates: The Field and Laboratory Study of Insect Behavior. Garland Publishing, New York.]
What I saw when I looked at the famous duck-rabbit was either the duck or the rabbit, but not the lines on the page – at least not until after much conscious effort. The lines were not the facts of which the duck and rabbit were alternative interpretations.
(Thomas S. Kuhn, 1979, p. ix)
As a philosopher of science, T. C. Schneirla understood the important distinction between facts and their interpretation. He was clearly no stranger to controversy, as Piel pointed out in his introductory essays to a book honoring Schneirla (1970) and to the published version of the inaugural Schneirla conference (1984). His involvement in controversy – scientific, philosophical, cultural and political – is part of the reason that Schneirla “was not popular, not celebrated in the gatherings of psychology in his time. Schneirla was not in the mainstream” (Piel, 1984, p. 13). A pertinent example is the intellectual battle he waged with both the European ethologists and the American operant psychologists (Piel, 1970).
In retrospect one can well wonder why divergent views held by the ethology and the American psychology schools led to such intense controversy. One wonders because few realize that controversy is a recurring component of collective scientific research, albeit a terribly inefficient and quite unnecessary complication. We can appreciate this complication if we are willing, but another problem was recognized by Anderson (1988, p. 18): “To end controversies, scientists must first understand them, but scientists would rather do science than discuss it.” More optimistically, Latour (1987, p. 62) stated: “We have to understand first how many elements can be brought to bear on a controversy; once this is understood, the other problems will be easier to solve.”
To understand scientific controversy, we first have to understand how science operates, a topic normally given scant attention by scientists – just as Anderson emphasized. Schneirla clearly understood how science is actually more a process than a series of accomplishments. By contrast, biology textbooks extoll the accomplishments of scientists, largely ignore scientific process, and omit mention of controversies that may have preceded given accomplishments.
Psychologists, who receive extensive exposure to the history, methodology and philosophy of science, even as undergraduates, are usually surprised when they learn that these topics are nearly absent from the formal education of biology students in this country. Two decades ago I checked dozens of college catalogues across the country and found that courses in the above subjects were notably absent from undergraduate biology curricula, while nearly universally required for psychology undergraduates.
Physics is apparently in little better shape than biology, leading Theocaris and Psimopoulos to comment (1987, p. 597): “The hapless student is inevitably left to his or her own devices to pick up casually and randomly, from here and there, unorganized bits of the scientific method, as well as bits of unscientific methods.”
The problem is not a new one. Ludwik Fleck (1935/1979) recognized the presumed disparity of approach between those in the “hard sciences” and those in the “soft sciences” when he wrote (p. 47): “. . . thinkers trained in sociology and classics. . . commit a characteristic error. They exhibit an excessive respect, bordering on pious reverence, for scientific facts.” Neither did Fleck leave those in the “hard sciences” untouched; he wrote (p. 50) that the error of natural scientists consists of “an excessive respect for logic and in regarding logical conclusions with a kind of pious reverence.”
Unfortunately, most ignore the great amount of accumulated thought (wisdom) that has been published in past decades, including notions about scientific process. While others have treated parts of that process in depth, a brief review of the overall collective process can be included here, summarized from more complete accounts published earlier (e.g., Wenner, 1989, 1993; Wenner & Wells, 1990).
Unless science students are thoroughly inculcated with the discipline of correct scientific process, they are in serious danger of being damaged by the temptation to take the easy road to apparent success…. [They] should understand all the subtle ways in which they can delude themselves in the design of observations and the interpretation of data and statistics. (Branscomb, 1985, pp. 421, 422)
My interest in an analysis of this process began when Patrick Wells and I, along with our co-workers, became embroiled in a major scientific controversy in the mid to late 1960s (e.g., Wells & Wenner, 1973; Wenner & Wells, 1990). That controversy (the question of a “language” among honey bees) continues to this day.
We felt at the time that the bee language controversy should not have emerged from our test of that hypothesis. The relevant scientific community not only rejected our alternative interpretation but also ignored or summarily dismissed the anomalous results we had obtained, for the most part without even repeating the critical experiments that had yielded our results. Instead, Maier’s Law (Maier, 1960, p. 208) prevailed: “If facts do not conform to the theory, they must be disposed of.”
Consequently, we studied the history, philosophy, sociology, psychology and politics of science for two decades in an attempt to decipher what had transpired in the controversy that erupted as a result of our test of the language hypothesis (see Wenner & Wells, 1990). From that study, we recognized that scientific progress occurs (as indicated above) collectively and inefficiently by an unconscious group application of a definable method. We gradually formulated a diagram for our perception of this process (Figure 1).
The diagram is simple in principle. A new research trend begins (lower right-hand corner – exploration approach) when an individual recognizes (not merely observes) an important anomaly in nature while engaged in “normal science” (e.g., Kuhn, 1962/1970; Polanyi, 1958). Unconsciously, perhaps, that individual has moved from a state of “realism” (“knowing” what reality is) to “relativism” (an interpretation previously held to be “fact” is now suspect). The scientist then “creates an image” (Atkinson, 1985), forms an alternative explanation for evidence at hand and attempts to convert others to the same point of view.
Figure 1. The collective scientific process and how portions of it have been perceived through time, with each portion having had its advocates. For each sequence on any one collective research project, the complete process starts in the lower right-hand corner and progresses clockwise around the diagram, as the scientific community expands the scope of its inquiry (some steps may be omitted by practitioners). Movement around the diagram can stall (paradigm hold, see Figure 2), at which time progress plateaus. The numbers represent a chronology of contributions to formation of the diagram. See text for further explanation.
If others can be convinced of the new interpretation, the scientist’s view is reinforced (incipient “vanguard science” – Fleck’s term, see below), moves back to the “realism” mode and attempts to verify the results (verification approach), a necessary but not sufficient part of the overall scientific process. When others can verify the results, many scientists can become committed to the new interpretation. A new research emphasis and protocol may then arise in a broader portion of the scientific community (“vademecum science” – Fleck’s term, see below; also later termed “normal science” by Kuhn).
Unfortunately, much of animal behavior research during the past three decades has relied on verification alone (a partial view of the scientific process and only a portion of the “logical positivism” or “logical empiricism” school). That is, testing each hypothesis was not considered necessary in much of animal behavior research whenever a large body of evidence supported a given hypothesis (e.g., Wenner & Wells, 1990, pp. 204, 234). Animal behaviorists are not alone; scientists in general are reluctant to test their hypotheses (e.g., Mahoney, 1976). Such an attitude, of course, was responsible for “cold fusion” and other debacles in chemistry and physics (e.g., Asimov, 1989; Huizenga, 1992; Rousseau, 1992; Taubes, 1993).
During the 1960s and 1970s, researchers in ecology (e.g., critiques by Dayton, 1979; Loehle, 1987) and in psychology (e.g., critique by Mahoney, 1976) adopted another rather narrow approach (moving further clockwise around the diagram); they insisted that research be molded into an appropriate “null hypothesis” (falsification) protocol. The implicit rationale: If a premise cannot be proven false, then it is likely true or has some “probability” of being true (“realism” school).
In part, Thomas Kuhn’s influence (anomalies emerge and eventually hypotheses become rejected, even without application of the formal null hypothesis approach) gradually forced psychologists to abandon their former comfortable stance. However, Schneirla had earlier perceived the weakness of that “working hypothesis” (e.g., Chamberlin, 1890/1965) approach, as phrased by Tobach (1970, p. 239): “He rejected logical positivism and operationism as bases for scientific inquiry and opened the way to a dynamic, holistic approach based on process.” Ecologists continue to demand conformity to the null hypothesis approach (Dayton, 1979; Loehle, 1987).
In 1890, Thomas Chrowder Chamberlin (upper right corner of the diagram) recognized the weakness of an overreliance on verification (“ruling theory,” in his terms) and/or on falsification (attempting to falsify a “working hypothesis,” as phrased by Chamberlin). He advocated application instead of “The Method of Multiple Working Hypotheses” (inference approach), employing “crucial” experiments designed to provide mutually exclusive results. In that approach, scientists continually pit hypotheses against one another and attempt to falsify all of them during experimentation. After additional evidence is in, new alternative hypotheses are generated that might explain known facts and other pertinent information (e.g., Platt, 1964).
Only rarely does one find a scientist who can move from one approach to another with ease, as Claude Bernard, Louis Pasteur and Schneirla seem to have done, and as Feyerabend (1975) suggested in his famous phrase, “anything goes.” Duclaux (1896/1920), biographer of Pasteur, recognized another root problem with respect to “objectivity” and experimental design for those who attempt to use standard procedure when he wrote: “However broadminded one may be, he is always to some extent the slave of his education and of his past.” Four decades later, Fleck (1935/1979, p.20) formed much the same conclusion: “Furthermore, whether we like it or not, we can never sever our links with the past, complete with all its errors.”
Bernstein summarized succinctly the dichotomy between realism and relativism (1983, p. 8):
The relativist not only denies the positive claims of the [realist] but goes further. In its strongest form, relativism is the basic conviction that when we turn to the examination of those concepts that philosophers have taken to be the most fundamental … we are forced to recognize that in the final analysis all such concepts must be understood as relative to a specific conceptual scheme, theoretical framework, paradigm, form of life, society, or culture.
Ludwik Fleck, Overlooked Sage
Ludwik Fleck was a medical doctor in Poland in the 1930s and an expert on syphilis and typhus, expertise that kept him from being killed in concentration camps during WWII. While earlier studying the history of changes in attitude toward syphilis through time, he recognized the tentative nature of scientific “fact” and published a monograph in 1935, entitled Genesis and Development of a Scientific Fact.
Many of the points covered therein parallel notions advocated by those in the Schneirla school.
Thomas Kuhn wrote a foreword to the 1979 translation of Fleck’s book, in part to acknowledge his indebtedness to the work (having read it in German before publication of his own classic 1962 work) and in part to explain that the volume contained much that he had missed earlier. Kuhn wrote (1979, p. x): “Though much has occurred since its publication, it remains a brilliant and largely unexploited resource.” Kuhn also recognized that, rather than grasping the full implication of Fleck’s message during his early reading (relying on his “rusty German” as he put it), he had focused primarily on “… changes in the gestalts in which nature presented itself, and the resulting difficulties in rendering ‘fact’ independent of ‘point of view.'”
While writing our book, Patrick Wells and I did not know of Fleck’s perceptive analysis of scientific process, but our thoughts nevertheless had converged with his on many issues, particularly in his sections on epistemology.
Realism and Relativism Schools of Thought
Realism. The dubious notion that one can “know” reality was challenged repeatedly in Fleck’s treatise. He also recognized that scientists become too committed to hypotheses. Fleck (1935/1979) wrote (p. 84): “Observation and experiment are subject to a very popular myth. The knower is seen as a kind of conqueror, like Julius Caesar winning his battles according to the formula ‘I came, I saw, I conquered.'” And (p. 84): “Even research workers who have won many a scientific battle may believe this naive story when looking at their own work in retrospect.” Later Fleck commented (p. 125): “… the [generated] fact becomes incarnated as an immediately perceptible object of reality.”
The notion that “fact” has not necessarily been gained emerges from Fleck’s statement (p. 32): “The liveliest stage of tenacity in systems of opinion is creative fiction, constituting, as it were, the magical realization of ideas and the interpretation that individual expectations in science are actually fulfilled.”
Fleck’s awareness of the essence of Duclaux’s statement (above) is evident in his own statements (p. 27): “Once a structurally complete and closed system of opinions consisting of many details and relations has been formed, it offers enduring resistance to anything that contradicts it,” and (pp. 30, 31): “The very persistence with which observations contradicting a view are ‘explained’ and smoothed over by conciliators is most instructive. Such effort demonstrates that the aim is logical conformity within a system at any cost . . .” Neither was Fleck blind to social constraints in the conduct of research (p. 47): “. . . Social consolidation functions actively even in science. This is seen particularly clearly in the resistance which as a rule is encountered by new directions of thought.”
One of the more striking features of Fleck’s book is the notion of “thought collectives” (expanded upon below). Various interest groups exist within each scientific community, as exemplified today by units within the electronic mail system. He defined his use of the term “thought collective” as (p. 39): “a community of persons mutually exchanging ideas or maintaining intellectual interaction. . . .”
Any one person belongs to several thought collectives (very obvious in multiple enrollment in e-mail networks) and becomes molded into the thought patterns expected within each scientific community. Kuhn’s notion of “paradigm hold” was already known to Fleck (p. 28): “When a conception permeates a thought collective strongly enough, so that it penetrates as far as everyday life and idiom and has become a viewpoint in the literal sense of the word, any contradiction appears unthinkable and unimaginable.”
Relativism. We chose the term “relativism” for emphasis in our book, among other possible choices of words, to stress the relative nature of knowledge (see Excursus RE in Wenner & Wells, 1990), but Fleck had already perceived the same concept when he wrote (p. 50): “An empirical fact . . . is relative. . . . Both thinking and facts are changeable . . . Conversely, fundamentally new facts can be discovered only through new thinking.” In stronger words he wrote (p. 20): “. . . we would argue that there is probably no such thing as complete error or complete truth” and (p. 48): “. . . nobody has either a feeling for, or knowledge of, what physically is possible or impossible.”
Fleck again recognized the influence of social factors in science (p. 124): “If a fact is taken to mean something fixed and proven, it exists only in vademecum science,” and (p. 21): “At least three-quarters if not the entire content of science is conditioned by the history of ideas, psychology, and the sociology of ideas and is thus explicable in these terms.”
Finally, Fleck recognized the very tentative nature of scientific investigation (pp. 10, 11):
The acquisition of physical and psychological skills, the amassing of a certain number of observations and experiments, the ability to mold concepts, however, introduce all kinds of factors that cannot be regulated by formal logic. Indeed, such interactions . . . prohibit any systematic treatment of the cognitive process.
Figure 2. A diagram illustrating how individuals who rely too heavily on verification or falsification can become locked into a paradigm hold. Verificationists may accumulate support for a hypothesis but fail to test it. Those who attempt to falsify a null hypothesis may ignore anomalies and erroneously conclude that failure to falsify leads to truth. In either case, basic assumptions may then no longer be questioned.
Four Scientific Approaches: Fleck’s Comments
In our book, Patrick Wells and I recognized that scientists use one or more of four approaches in the collective scientific process (as described above). Fleck had clearly preempted us on this score, more so on the first two than on the last two.
Exploration. Atkinson (1985) coined the term “creation of an image” to describe what happens when one has a new perception about existing information (lower right-hand corner in Figure 1). However, he was clearly preempted by Fleck, as is evident by Fleck’s use of the word “genesis” in the title of his book, Genesis and Development of a Scientific Fact.
Fleck recognized the creative spark (p. 48): “. . . the ability to perceive scientifically is only slowly acquired and learned. Its prime manifestation is discovery. This occurs in a complex, socially conditioned way . . .” He also stressed that the creative individual should recognize the strictures of “reality” (p. 30): “Discovery is . . . inextricably interwoven with what is known as error. To recognize a certain relation, many another relation must be misunderstood, denied, or overlooked.” Fleck further noted the influence of social relationships (p. 123): “. . . the true creator of a new idea is not an individual but the thought collective . . . The collective remodeling of an idea has the effect that, after the change in thought style, the earlier problem is no longer completely comprehensible.”
If a novel idea does gain sufficient acceptance in the scientific community (rather readily if exotic, it seems), a new field of research may emerge, and many of those in “vademecum science” fall into line behind those in the “vanguard,” in Fleck’s terms. Soon assumptions become accepted as “facts” and become the basis for more “normal science” (Kuhn’s term).
Verification. Perhaps one of the most rigid notions (and roadblocks) in animal behavior studies is the concept that one can prove something true if one gathers sufficient “positive” evidence (e.g., Wilson, 1972, p. 6). Fleck saw through that line of reasoning and recognized the additional social element involved (p. 37): “. . . once a statement is published it constitutes part of the social forces which form concepts and create habits of thought.”
Garrett Hardin (1993, p. 225) expressed this same thought somewhat differently: “Often the creation of a noun (‘substantive’) seems to presume the presence of a substance, a physical thing.” As an example, we have seen in behavioral studies that various expressions (e.g., “innate releasing mechanism” and “fixed action potential”) come into vogue for a time and then disappear. Schneirla’s school perceived that same problem: the invention of such terms brings us no closer to understanding (e.g., Tobach & Aronson, 1970, p. xvi): “He was especially opposed to such ethological terms as innate releasing mechanism, vacuum and displacement reactions, fixed action potential, and action specific energy, which he considered to be reifications deduced from the basic assumption of the existence of instincts.”
The problem here is that members of the scientific community too readily accept notions that may not be backed by substantial evidence, as illustrated by Mark Twain’s comment in Life on the Mississippi: “There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.”
In Fleck’s words (p. 42): “Thoughts pass from one individual to another, each time a little transformed. . . . Whose thought is it that continues to circulate? It is one that obviously belongs not to any single individual but to the collective,” and (p. 28): “When a conception permeates a thought collective strongly enough . . ., any contradiction appears unthinkable and unimaginable.”
Eventually, the “thought collective” forms a consensus and holds to given positions, as outlined by Fleck (p. 27):
- A contradiction to the system appears unthinkable.
- What does not fit into the system remains unseen; alternatively, if it is noticed, either it is kept secret, or laborious efforts are made to explain an exception in terms that do not contradict the system.
- Despite the legitimate claims of contradictory views, one tends to see, describe, or even illustrate those circumstances that corroborate current views and thereby give them substance. In other words, “vademecum science.”
Falsification. Fleck clearly understood the importance of heeding evidence that did not fit prevailing thought. He clearly preempted Kuhn on the notion of “paradigm hold” when he wrote (p. 27): “Once a structurally complete and closed system of opinions consisting of many details and relations has been formed, it offers enduring resistance to anything that contradicts it.”
In the past few decades of animal behavior studies we have seen numerous examples of some rather exotic hypotheses being put forth by vanguard scientists and embraced by vademecum science participants. As examples we have had honey bee, dolphin and chimpanzee “languages.” In the “hard sciences” examples include “cold fusion,” “water with a memory,” and “polywater” (e.g., Rousseau, 1992).
Fleck’s attitude about scientific process is relevant to the fact that progress in animal behavior and ecology studies seems to be slow (e.g., Dayton, 1979). One might also find relevant statements about scientific protocol in Garrett Hardin’s book, Living Within Limits: Ecology, Economics, and Population Taboos (1993). Hardin (p. 41) wrote that if “each new proposal advanced” were to be “assumed to be true until it is proven false . . . the scientific community would soon be overwhelmed by unworkable proposals, and the advance of science would be greatly retarded.” But many do not seem to realize that something not yet proven false is not necessarily true.
Inference. There seems to be little in Fleck’s book that relates directly to Chamberlin’s concept of the multiple inference approach, but Fleck had a good grasp of the relative nature of science when he wrote (p. 20): “. . . we would argue that there is probably no such thing as complete error or complete truth.” One can only speculate about how Fleck might have benefited from Chamberlin’s thoughts (1890/1965), as resurrected by Platt (1964).
As Eugene Meyer phrased it (personal communication), Fleck was too “gloomy” on this point. The multiple inference approach has great promise for animal behavior studies, once that procedure becomes part of the research arsenal for those in that field. Whereas “truth” may always be elusive, diverse options can always be kept alive with this approach; furthermore, paradigm holds will likely be less severe. In our work the multiple inference/strong inference approach has been extremely valuable (e.g., Wenner et al., 1969; Wenner, 1972; Wenner & Harris, 1993).
For quite some time, individuals within the “hard sciences” have insisted that social factors do not influence the conduct of their science. In the Foreword to Fleck’s book, Kuhn wrote (1979, p. viii): “. . . in 1950 and for some years thereafter I knew of no one else who saw in the history of science what I was myself finding there . . . acquaintance with Fleck’s text helped me to realize that the problems which concerned me had a fundamentally sociological dimension.” More recently, Hardin (1993, p. 257) echoed that thought: “Part of the unofficial mythos that supports science is the belief that truth will prevail, no matter what. If you have a heretical idea, publish it, supporting it with data and arguments as needed, it will be noticed. If your theory is true it will soon be accepted by the establishment; heterodoxy will metamorphose into orthodoxy.”
Fleck’s use of the term “thought collective” can now be seen to be especially apt. Increasingly, those who study the sociology and psychology of science recognize the importance of social bonds among those who work within given broad research groups (in Fleck’s words, again, “thought collectives”). He lucidly described the social hierarchy we now recognize (p. 124): “Every discipline . . . has its own vanguard. . . . This is followed by the main body, the official community [vademecum scientists]. Then come the somewhat disorganized stragglers.”
Fleck elaborated upon that point (p. 41): “The individual within the collective is never, or hardly ever, conscious of the prevailing thought style, which almost always exerts an absolutely compulsive force upon his thinking and with which it is not possible to be at variance,” and (p. 82): “The more deeply one enters into a scientific field, the stronger will be the bond with the thought collective and the closer the contact with the scientist.” In 1841 Mackay was more pointed (in Taubes, 1993, p. 107): “Men, it has been well said, think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, and one by one.”
The Schneirla group had a similar perception of the problem (e.g., Lehrman, 1970, p. 21): “It is too easy to close one’s mind to an argument by simply deciding that the source of the argument is an outsider.”
Fleck used even more powerful language when he wrote (p. 141):
To the unsophisticated research worker limited by his own thought style, any alien thought style appears like a free flight of fancy, because he can see only that which is active and almost arbitrary about it. His own thought style, in contrast, . . . becomes natural and, like breathing, almost unconscious, as a result of education and training as well as through his participation in the communication of thoughts within his collective.
Hardin punctuated that idea in other words (1993, p. 43): “The scientific mind is not closed: it is merely well guarded by a conscientious and seldom sleeping gatekeeper. . . .”
Consider now the situation in which two thought collectives embrace different concepts of “reality” on a given topic. It may become impossible for them to communicate, one with another. Controversy can then erupt. Kuhn addressed that point in his foreword to Fleck’s book (p. ix): “. . . given my own special concerns, I am particularly excited by Fleck’s remarks on the difficulties of transmitting ideas between two ‘thought collectives,’ above all by the closing paragraph on the possibilities and limitations of participation in several ‘thought communities.'”
From that point one can see how someone in a thought collective who attends too closely to arguments of an opposing thought collective might go beyond the bounds of acceptable behavior (see Pauly, 1981, with reference to Jacques Loeb). We can see here a gradation along a continuum – loyal club member to skeptic (pessimist). One small step more, and one falls into the “whistle-blower” category (pointing out that the hypothesis fails too often). Hardin (p. 234) addressed that point: “Cronyism can be good, cronyism can be bad. ‘Whistle-blowers,’ who seek to serve the good of a larger group . . . [science or society as a whole instead of merely the thought collective] . . . are, more often than not, ostracized by their fellow workers.” Glazer and Glazer (1986) covered that theme in some detail.
Fleck became more profound when he considered the role of “thinking” in scientific research. He suggested that individuals were not normally capable of independent thinking and used a quotation from Gumplowicz to emphasize that point (p. 46): “The greatest error of individualistic psychology is the assumption that a person thinks . . .”; he further commented (p. 47): “What actually thinks within a person is not the individual himself but his social community,” and (p. 98): “. . . thinking [is] a supremely social activity which cannot by any means be completely localized within the confines of the individual.”
Fleck realized that he had been too negative on that point and later qualified that rather rigid stand by using a quotation from Jerusalem (p. 49): “Man acquires [the ability to think ‘purely theoretically’ and to state ‘given facts purely objectively’] only slowly and by degrees, to the extent that by conscious effort he overcomes the state of complete social bondage and thus develops into an independent and self-reliant personality.” Fleck thus felt it possible that individuals could become aware of themselves and of the role of social forces in the scientific process.
Reflections on Scientific Controversy
With the above comments by Fleck as background, we can consider further the role of controversy in scientific research, with some special attention to the question of “language” in honey bees. Although this section is written by an insider, it has the unique perspective of one who stepped out of a controversy for more than two decades.
The two-decade leave of absence mattered little. Even though my colleagues and I were not directly involved, the controversy continued unabated during those two decades. Honey bee “language” proponents repeatedly felt compelled to conduct “the definitive experiment,” one that could reinforce the prevailing viewpoint (“consensus”) of honey bee “language.” In a series of critiques spanning that same period, Rosin (e.g., 1978, 1980, 1988, 1992) exposed the flaws in those “definitive” experiments in terms of theory, design and execution, critiques that relied heavily on theoretical foundations laid by Maier, Morgan and Schneirla. For example, Rosin wrote (1988, p. 268):
Whereas Wenner & Wells (1987) explain that they came to oppose the “dance language” hypothesis as a direct result of their own research in honey bee behavior, I joined their opposition due to no particular interest in honey bees, but because I saw in the specific “dance language” controversy a major reflection of a much more generalized and much more important controversy over the whole field of animal behavior, between European Ethology . . . and Schneirla’s School. . . .
Note the close correspondence here between Rosin’s comment and Gerard Piel’s comment in the first paragraph of this contribution.
Rosin earlier had also written (1978, p. 589): “The controversy between the von Frisch group’s ‘language’ hypothesis . . . and Wenner’s group’s olfactory hypothesis . . . for the arrival of honey bee recruits at field sources, is essentially a controversy between a human-level hypothesis for an insect and an insect-level hypothesis for an insect.”
Even though Rosin’s arguments were largely informally dismissed by the bee language “thought collective” (essentially by near lack of citation), that same community eventually recognized that each of the various “definitive” experiments had been inadequate. Latour (1987, p. 43) addressed this type of issue: “If an article claims to finish the dispute once and for all it might be immediately dismembered, quoted for completely different reasons, adding one more empty claim to the turmoil” (Latour’s emphasis).
Fleck’s contribution about “thought collectives” helps place this last point in perspective. Just as it is not usually the individual but the thought collective that is creative, neither are controversies merely between individuals. They are instead most often controversies between thought collectives. Even though various sociopolitical maneuvers can result in denial of a platform to particular individuals, the thought collective to which those individuals belong doesn’t simply vanish. All that happens is that one or more key figures in one thought collective may lose a platform and become (hopefully, to those in an opposing thought collective) temporarily silent. Whereas members of the opposing thought collective can then convince themselves that an issue has been finally resolved in their favor, members of the original thought collective can continue to challenge any “elegant” experiments that form the basis for continued consensus.
Thus, it is usually not two individuals who are engaged in a nontrivial controversy. Rather, it is two thought collectives with two different views of “reality” that collide. Either that, or one thought collective with a fixed view of “reality” collides with another thought collective that may recognize that scientific accomplishments always remain relative.
To ascertain the degree to which the thinking of an individual may be unconsciously controlled by expectations of one or more thought collectives, one need only ask a generic question: Is it conceivable that your assumption (hypothesis) is not true? The answer to that question reveals much – an unquestioning member of a thought collective usually immediately answers in the negative (“no, that is not possible”). As Fleck put it (p. 107): “At a certain stage of development the habits and standards of thought will be felt to be the natural and the only possible ones. No further thinking about them is even possible.”
Thomas Kuhn termed the above fixation, “paradigm hold” (see Figure 2), but Fleck had earlier appreciated the same concept when he wrote (as quoted earlier) (p. 27): “Once a structurally complete and closed system of opinions consisting of many details and relations has been formed, it offers enduring resistance to anything that contradicts it.” Hardin (1993, p.4) used another interesting expression, “gatekeeper of the mind,” for much the same idea but also recognized just how insidious such fixation can become (p. 4): “An effective gatekeeper of the mind does not call attention to itself. It actuates a psychological mechanism called a taboo.”
Dewsbury (1993, p. 869) attempted to justify some censorship of novel perspective, viewing that practice as a necessary evil:
Both the creative innovator and the crackpot work at the fringes of the prevailing paradigm, and it often is difficult to distinguish one from the other in the early stages of development. The scientific establishment, therefore, must develop a commitment to scientific orthodoxy that makes it hostile to challenges to that orthodoxy. Limiting access to the publication outlets controlled by the scientific establishment is one way in which those who are part of a scientific in-group or who are working within the dominant perspective can help defend that perspective.
If challenges to established dogma are not permitted, however, science does not advance.
Consider now the attitudes of scientists toward controversy. If controversy erupts in some field of science other than our own, we can enjoy watching the antics of those committed to the dogma (and/or flawed protocol) of one thought collective or another. Biologists and psychologists, for example, may well relish discussions about controversies in the “hard sciences” mentioned earlier.
Consider further the concept of taboo (e.g., Hardin, 1993). If a nontrivial controversy is too close to home, “vademecum scientists” in one thought collective or another distance themselves from the emerging controversy. Those who become enmeshed in the controversy become suspect (have only themselves to blame – a self-destructive act) unless they are a part of “vanguard science,” those “elite” thought collective members who spearhead goals of their own thought collective. Eventually no real interchange occurs between or among members of opposing thought collectives. Such extracommunity interaction is part of the taboo. As Hardin saw it (p. 4): “Westerners, with their cherished tradition of free speech and open discussion . . . change the subject.”
Here we must differentiate between minor and major controversies. Members of a thought collective may good-heartedly engage in minor controversies (“playful” or trivial controversies) as long as they remain minor; that is, when none of the basic assumptions of the thought collective become threatened. Participants may even pride themselves in their tolerance of divergent opinions.
Different fields have different levels of tolerance for revolutionary ideas, as is evident in the speed with which controversies become resolved. Fast-moving fields (e.g., genetics, molecular biology, nuclear physics), ones that routinely employ the strong inference approach in research (e.g., Platt, 1964), are generally much more receptive to challenge of existing dogma – which is why they are fast moving. Even in fast-moving fields, though, true progress can sometimes be slow, as in the rate of adoption of the chemiosmotic coupling hypothesis proposed by Peter Mitchell (Gilbert & Mulkay, 1984).
Once a major controversy erupts, certain events are quite predictable. Resolve within each thought collective solidifies, and much private support is given to those in the front lines (“vanguard scientists”). Papers submitted for publication by vanguard scientists do not undergo the same scrutiny as those submitted by vademecum scientists or those outside the thought collective, a point thoroughly documented by Peters and Ceci (1982, as summarized in Wenner & Wells, 1990, p. 191). Neither does totality of evidence count for much. Participants select those bodies of evidence that reinforce their own position. An example of such resolve was documented by Taubes (1993, p. 270): “Cold fusion existed until proven otherwise. . . . The Electrochemical Society administrators wanted to avoid a repetition of the rampant negativity of the Baltimore American Physical Society meeting. Speakers would present ‘confirmation results’ only.”
Those in the Schneirla school understood this type of development, as concisely stated by Lehrman (1970, pp. 18, 19):
When opposing groups of intelligent, highly educated, competent scientists continue over many years to disagree, and even to wrangle bitterly about an issue which they regard as important, it must sooner or later become obvious that the disagreement is not a factual one, and that it cannot be resolved by calling to the attention of the members of one group (or even of the other!) the existence of new data which will make them see the light. Further, it becomes increasingly obvious that there are no possible crucial experiments that would cause one group of antagonists to abandon their point of view in favor of that of the other group.
My colleagues and I encountered that phenomenon after we tested the honey bee dance language hypothesis and found it wanting. In the following two decades, symposia on insect communication included only participants who could provide positive results in support of “language” among bees, in spite of an early comment (Wells & Wenner, 1973, p. 175):
Do honey bees have a language? That is a question which may never be answered with certainty. It may be more useful to examine assumptions critically, state hypotheses and their consequences with precision, review the evidence objectively and ask: Can we now believe that honey bees have a language?
Thus, it appears that the honey bee forager recruitment controversy is not about the nature of evidence but rather about the nature of hypotheses. It is not what investigators observe (the data) but what they believe (infer) that is at the heart of the controversy.
When one realizes the importance of thought collective control, it becomes more clear why journal referees come down strongly on one side or another during a controversy, just as do proposal reviewers, members of panels for granting agencies and even members of the media (e.g., Horgan, 1990, p. 29; Taubes, 1993, p. 263).
That is why new scientific journals have often been started after members of one thought collective have been excluded from existing platforms by members of other thought collectives (see Hull, 1988).
Fleck again preempted us all when he addressed that aspect of scientific controversy (p. 43): “Words which formerly were simple terms become slogans; sentences which once were simple statements become calls to battle.” Participants in a given controversy, it seems, suddenly fail to distinguish between ideas and personalities. Daniel Lehrman, of the Schneirla school, recognized this type of development (1970, p. 47): “We do not lightly give up ideas which seem central to us, and when they are attacked, we tend to mobilize defenses against the attacks.”
When controversies mature, positions almost imperceptibly change, a point clearly emphasized by some of those in the Schneirla school (e.g., Lehrman, 1970, p. 47):
We can restate the attacked ideas in such a form as to make them seem again convincing to an audience whose confidence in them might have been weakened by the criticism. But when we change the formulation of the ideas in such a situation, we may also be modifying the ideas themselves, in response to criticisms which really may have been leveled against weaknesses in the original formulations.
In time, modifications can have accumulated to such an extent that the original hypothesis becomes lost. This event seems to have happened with respect to the dance language hypothesis, leading us to comment (Wenner et al., 1991, p. 771): “. . . proponents of the dance language hypothesis today no longer seem to have a clear notion of what one should expect from that hypothesis.”
Controversies eventually become resolved, if not automatically or rapidly, by renewed attention to Nature, which cannot be fooled by rhetoric. In many cases, if not most, that resolution comes gradually (as in Lehrman’s comment, above). One of the prevailing hypotheses becomes obviously more useful than the other(s) for explaining all available evidence (e.g., Wenner & Wells, 1987). But no prizes will be forthcoming: Fleck wrote (p. 123): “. . . the collective remodeling of an idea has the effect that, after the change in thought style, the earlier problem is no longer completely comprehensible.” Hardin (p. 51) echoed that thought: “Only at the end of an era do surviving pessimists have a chance to be recognized by their fellow citizens as being (finally) right, but it is not likely that they will then be praised for their foresight.”
Neither do we find that textbooks or the popular literature treat the resolution of controversy adequately. Textbook writers (usually “vademecum” scientists, or “stragglers” in Fleck’s terms) and editors continue to select items from earlier texts without realizing that some “facts” have become discredited (e.g., Paul, 1987). Even in the late 1980s I found a clipping in the Santa Ynez Valley News (California) describing how some flies could travel 880 mph, even though that “fact” had been discredited in 1938 (see Wenner, 1989).
Today we realize that paradigm holds are essential in science; otherwise it would not be possible to design and conduct experiments. Nevertheless, we must make ourselves and others in our thought collectives more fully aware that paradigms can control our thinking and that we need to always reexamine assumptions (see Ten Principles, below).
The Complicated World of Animal Behavior Studies
The study of animal behavior is actually one of the more difficult tasks in science, occasioned in large part by the twin pitfalls of teleology (e.g., Bernatowicz, 1958) and anthropomorphism. The rationale among all too many participants: If a behavior is there, it must be good for something. Furthermore, its function must correspond with our current concept of reality (i.e., it has to be good for what we human beings think it should be good for).
The first of those pitfalls (functional approach) has evolved out of our Judeo-Christian heritage (God created all for a given purpose). In the fields of animal behavior and ecology, in recent years this attitude has shifted to a belief in “Nature’s purpose.” (See Excursus TEL in Wenner & Wells, 1990 for a more expanded treatment of this topic.)
The second pitfall (assigning human characteristics to our subjects) has been inculcated in all of us since childhood. The “Nature” programs on television promulgate that concept throughout society (the “Disneyfication of science” as zoologist Bill Tavolga once phrased it).
Schneirla was a leader in opposing those twin pitfalls (e.g., Tobach & Aronson, 1970, p. xvi): “He was guided overall by the law of parsimony, by Morgan’s canon, and above all, by the need to avoid the dangers and fallacies of anthropomorphism and zoomorphism.” Instead, Schneirla advocated use of the inductive method in animal behavior research. Unfortunately, we have recently been treated to a resurgence in the “animal thinking” concept (including “cognition”) in nonhuman species (e.g., Griffin, 1984), a practice that Schneirla decried (as in Gerard Piel, 1970, p. 3):
One who sets out to demonstrate that protozoan organisms or any others have the mental characteristics of man may convince himself at least, provided he singles out opportunely the brief episodes which seem describable as instances of perception of danger, of reasoning, or what not. By the same method, the absence of reasoning in man can be proved with ease.
Also, one need not watch many Nature programs on television before it becomes evident that “exotic” episodes carry the day. This emphasis on the exotic carries with it an implicit pressure on students of animal behavior; experimental subjects and results must be exciting. Indeed, so it is with all of scientific research.
Self and Mass Delusion
At times claims by those on the “forefront” of science may stretch the bounds of credibility (e.g., “cold fusion” research). Nevertheless, support may arise from varied quarters, and a vanguard scientist may then become ever more committed (as in Atkinson, 1985) to an exotic hypothesis. Optimism rules, in science as in other aspects of life, as expressed in another context by Hardin (1993, p. 50): “Perhaps for several decades the optimist will win out – getting richer, earning more prestige in the community, marrying better, and perhaps having more children than the pessimist . . . thus is the pessimist made to look foolish in the short run.”
The Nobel chemist, Irving Langmuir (in unpublished lecture notes, Taubes, 1993, pp. 342, 343) was apparently the first to apply the label, “pathological science,” to circumstances such as “supersonic flies” (e.g., Wenner, 1989), “polywater” (e.g., Rousseau, 1992), and “cold fusion” (e.g., Taubes, 1993). Langmuir defined pathological science in six points, as follows:
1. The maximum effect that is observed is produced by a causative agent of barely detectable intensity and the magnitude of the effect is substantially independent of the intensity of the cause.
2. The effect is of a magnitude that remains close to the limit of acceptability.
3. Claims of great accuracy.
4. Fantastic theories contrary to experience.
5. Criticisms are met by ad hoc excuses thought up on the spur of the moment.
6. Ratio of supporters to critics rises up to somewhere near 50 percent and then falls gradually to oblivion.
Langmuir added a comment: “The critics can’t reproduce the effects. Only the supporters could do that. In the end, nothing was salvaged. Why would there be? There isn’t anything there. There never was.” However, one can see, even here, a remarkable parallel between Langmuir’s comments and those of Fleck earlier in 1935, as outlined above in Fleck’s emphasis on the role of “consensus” within thought collectives.
Of course, there comes a time when the hard face of reality peers down, and the bubble may break. Three decades ago we were treated to supposed examples of learning by means of cannibalism. Likewise, the honey bee “dance language” hypothesis was with us for 20 years before it was truly tested by experiment, and Moore (1988) has now finally raised unsettling issues about the presumed “magnetic compass” orientation of pigeons, three decades after that “fact” was established. Who knows which of the other leading animal behavior hypotheses might fail critical tests once animal behavior studies evolve to the point where careful scrutiny of both evidence and basic assumptions becomes a part of the working protocol?
Unfortunately, the exotic sells (as indicated above), and those in vanguard science are not immune from self-delusion (the easiest one to fool is oneself). Fleck had an appropriate comment about that circumstance as well (p. 105): “The elite panders, as it were, to public opinion and strives to preserve the confidence of the masses. This is the situation in which the thought collective of science usually finds itself today [in 1935]. If the elite enjoys the stronger position, it will endeavor to maintain distance and to isolate itself from the crowd.”
Promotion in the Media
Scientists in today’s world race to the media with their most recent “finds,” as is evident from the activity present in the press rooms established at major scientific conventions. Furthermore, science reporters no longer “report” and evaluate the news they have been exposed to. They seek advice from “authorities” in the field and search for “consensus” among those experts, seemingly afraid to think for themselves. Little do they realize that their actions may well do no more than reinforce the views of one particular “thought collective” (as phrased by Fleck) over another, justified or not.
The curious notion of “polywater” is a good example. An obscure physicist invented the polywater hypothesis, an exotic notion with military implications that caught on all too rapidly and widely (e.g., Franks, 1981; Wenner & Wells, 1990, pp. 49-50; Rousseau, 1992). After a decade of intensive research devoted to verification of the hypothesis by many on several continents, accompanied by much media hype, Denis Rousseau (e.g., 1992) conducted a critical experiment, the results of which exposed earlier findings as artifacts. For a time members of the polywater thought collective were upset, and his results were ignored, but eventually Nature won out. “Cold fusion” ran much the same course in only a few months (Taubes, 1993).
Rousseau’s experience, coupled with more recent episodes of “water with a memory” and “cold fusion,” led him to the conclusion that all these events constitute “pathological science” in Langmuir’s sense, events that can be (and should have been) recognized early on – but only if one is aware of three characteristics that they have in common (Rousseau, 1992). The first two were as in points 1 and 2, as well as 4, of Langmuir (above), but Rousseau (1992, p. 54) added a third: “To avoid these pitfalls, scientists must conceive and carry out a critical series of experiments. . . . But the third identifying trait of pathological science is that the investigator finds it nearly impossible to do such experiments.”
I might add to that third point the fact that scientists locked into a paradigm simply can neither recognize nor accept the results of tests or critical experiments that counter the belief system of their thought collective. Neither can they bring themselves to repeat such negative experiments. All too often in animal behavior studies “critical experiments” have come to be identified only as those that support a prevailing hypothesis. Unfortunately, the media can keep a discredited hypothesis alive long after it is no longer useful or believed by the scientists themselves. An example: How long will honey bee “dance language” persist in school texts, magazine articles, video tapes, Nature programs and newspaper clips if (or after) the scientific community abandons that line of research?
One could add yet another point: The present unfortunate media and public focus on individual accomplishment in science rather than an emphasis on the collective nature of the scientific process. Prizes and awards seem to go most often to those who reinforce the belief system of one thought collective or another, rather than to those who may truly advance science by exploding a myth, as Rousseau did. Hardin addressed that point (p. 109): “Today the greatest honor is accorded to speakers who focus on individual interests to the exclusion of community interests.”
Ten Principles of Scientific Research
The bee . . . extracts matter from the flowers of the garden and the field, but works and fashions it by its own efforts. The true labor of philosophy resembles hers, for it neither relies entirely nor principally on the powers of the mind, nor yet lays up in the memory the matter afforded by the experiments of natural history and mechanics in its raw state, but changes and works it in the understanding. (Francis Bacon 1620/1952, p. 126)
We can look more deeply into history for guidelines that we can follow, and then instill in our students the importance and excitement of process, rather than the accomplishments of science. Throughout history any given person may have had a good grasp of at least a portion of the scientific process (Figure 1). Which of the portions was grasped, however, has varied considerably among individuals and reminds one of different views of the elephant held by several blind men, each of whom had touched a different part of its body (e.g., Atkinson, 1985).
A common misconception among scientists is that “objectivity” is possible – they forget that we are human beings first and scientists second. When I mentioned to a young faculty member, new at the University of California (not our campus), that his mentor in his former graduate program might have had a strong bias on one point, he countered: “No” and said that his former advisor “is totally objective and without bias.” Furthermore, the temptation to exaggerate findings, fudge results, omit unfavorable data, emphasize “positive” results, etc., may have become even greater under “big” science than it was before, especially under current intense pressures to publish and/or to procure grants.
We have to recognize that none of us can be truly objective. Regrettably, that new faculty member had not learned in his university education that each of us is biased to a greater or lesser degree on a great many issues. As scientists, we are no exception (e.g., Mahoney, 1976). Scientists and observers of science (e.g., sociologists, psychologists and philosophers), in fact, have spent considerable time wondering how it is that scientists succeed, given the very human nature of us all.
In 1991, students in a class I conducted (“The Nature of Biological Science”) in the College of Creative Studies on my home campus surveyed the literature and collectively formed a list of principles that scientists should be aware of as they conduct their research. Those points are presented below, accompanied by appropriate quotations and citations, in chronological order within each section.
1. Attend more to Nature than to theory
– Aristotle (330 B.C./1931, III, p. 760,b): “. . . we should trust more the observations than the theory, and we should hold good the latter only if facts support it.”
– Francis Bacon (1620/1952, p. 84): “. . . it is the greatest weakness to attribute infinite credit to particular authors . . . They who have presumed to dogmatize on nature . . . have inflicted the greatest injury on philosophy and learning. For they have tended to stifle and interrupt inquiry.”
– Louis Pasteur (in Rene Dubos, 1950, p. 376): Preconceived ideas “become a danger only if [an experimenter] transforms them into fixed ideas. . .The greatest derangement of the mind is to believe in something because one wishes it to be so.”
– Claude Bernard (1865/1957, p. 39): “In a word, we must alter theory to adapt it to nature, but not nature to adapt it to theory. . . . When we meet a fact which contradicts a prevailing theory, we must accept the fact and abandon the theory, even when the theory is supported by great names and generally accepted.”
– Evelyn Fox Keller (1983, p. 35): “. . . the necessary next step seems to be the re-incorporation of the naturalist’s approach – an approach that does not press nature with leading questions but dwells patiently in the variety and complexity of organisms.”
– Naomi Aronson (1986, p. 630): “The production of scientific knowledge is simultaneously the production of scientific error. . . .”
– David Bohm and David Peat (1987, p. 51): “. . . to cling rigidly to familiar ideas is in essence the same as blocking the mind from engaging in creative free play.”
2. Use the appropriate scientific approach
– Claude Bernard (1865/1957, p. 34): “The experimental method . . . cannot give new and fruitful ideas to men who have none; it can serve only to guide the ideas of men who have them, to direct their ideas and to develop them as to get the best possible results.”
– Louis Pasteur (in Emile Duclaux, 1896/1920, p. 97): “Repeat [the experiments] with the details which I give you and you will succeed just as I have done.”
– Thomas Chrowder Chamberlin (1890/1965, p. 756): “The effort is to bring up into view every rational explanation of new phenomena, and to develop every tenable hypothesis respecting their cause and history. The investigator thus becomes the parent of a family of hypotheses and, by his parental relation to all, he is forbidden to fasten his affections unduly upon any one.”
– John R. Platt (1964, p. 350): “When multiple hypotheses become coupled to strong inference, the scientific search becomes an emotional powerhouse as well as an intellectual one.”
– Max Silvernale (1965, p. 4): “There is nothing really mysterious about the scientific method; it is so simple that it can be understood by almost everyone. Yet the sad truth is that the majority of people today are not scientific, even though ours is a scientific age.”
– Belver Griffith and Nicholas Mullins (1972, p. 963): “In our examination of highly coherent groups, we see two factors [in addition to communication] as basic to science: first, the radical revision of scientific theory and method . . .; second the rarity of high levels of personal creativity.”
– Bernard Dixon (1973, p. 34): “. . . the top-class creative thinker designs a single, crucial experiment that decides absolutely between one hypothesis and another.”
– David Bohm and David Peat (1987, p. 100): “. . . science should be carried out in the manner of a creative dialogue in which several points of view can coexist, for a time, with equal intensity.”
3. Seek understanding, not “truth”
– Claude Bernard (1865/1957, p. 23): “. . . (An experimenter) must submit his idea to nature and be ready to abandon, to alter or to supplant it, in accordance with what he learns from observing the phenomena which he has induced.”
– Peter Medawar (in Bernard Dixon, 1973, p. 24): “Truth takes shape in the mind of the observer; it is his imaginative grasp of what might be true that provides the incentive for finding out, so far as he can, what is true” (emphasis Medawar’s).
– Alan Chalmers (1982, p. xvi): “There is just no method that enables scientific theories to be proven true or even probably true.”
– James Atkinson (1985, p. 734): Science is “a process whereby the human capacity for imagination creates and manipulates images in the mind. . . . This process produces “concepts, theories, and ideas which incorporate and tie together shared human sensory experience and which are assimilated into human culture through a similar act of re-creation.”
– Richard Feynman (in Gleick, 1992, p. 438): “I don’t have to know an answer. I don’t feel frightened by not knowing things, by being lost in a mysterious universe without any purpose, which is the way it really is as far as I can tell.”
4. Recognize limits of human perception
– Claude Bernard (1865/1957, p. 23): “It is impossible to devise an experiment without a preconceived idea.” (1865/1957, p. 38): “[Those] who have excessive faith in their theories or ideas are not only ill prepared for making discoveries; they also make very poor observations”; (1865/1957, p. 52): “The doubter is a true man of science; he doubts only himself and his interpretations, but he believes in science.”
– Jerome Bruner and Leo Postman (1949, p. 222): “When . . . expectations are violated by [Nature], the perceiver’s behavior [is one of] resistance to the recognition of the unexpected or incongruous.”
– Evelyn Fox Keller (1983, p. 145): “In practice, scientists combine the rules of scientific methodology with a generous admixture of intuition, aesthetics, and philosophical commitment.”
– Lewis M. Branscomb (1985, p. 422): “Nature does not ‘know’ what experiment a scientist is trying to do. ‘God loves the noise as much as the signal.'”
5. Be honest and accurate (careful)
– Peter Medawar (1979, p. 39): “A scientist who habitually deceives himself is well on the way toward deceiving others.”
– Lewis M. Branscomb (1985, p. 423): “. . . integrity is essential for the realization of the joy that exploring the world of science should bring to each of us.”
– Walter Stewart and Ned Feder (1987, p. 214): “Scientists have to an unusual degree been entrusted with the regulation of their own professional activities. Self-regulation is a privilege that must be exercised vigorously and wisely, or it may be lost.”
– Efraim Racker (1989, p. 91): “The spiritual damage caused by scientific fraud is irreversible, and those involved are and should be reported and prosecuted irrespective of whether financial losses are involved.”
6. Pursue reasons for anomalies
– Claude Bernard (1865/1957, p. 23): “An experimenter, who clings to his preconceived idea and notes the results of his experiment only from this point of view, falls inevitably into error, because he fails to note what he has not foreseen and so makes a partial observation.” “. . . [the experimenter] must never answer for [nature] nor listen partially to her answers by taking from the results of an experiment, only those which support or confirm his hypothesis. We shall see later that this is one of the great stumbling blocks of the experimental method”; (1865, p. 50): “Experimenters . . . always doubt even their starting point”; (1865, p. 56): “Some . . . fear and avoid counterproof. As soon as they make observations in the direction of their ideas, they refuse to look for contradictory facts, for fear of seeing their hypothesis vanish.”
– Thomas Kuhn (1962/1970, pp. 52, 53): “Discovery commences with the awareness of anomaly, i.e., with the recognition that nature has somehow violated the paradigm-induced expectations that govern normal science.”
– Evelyn Fox Keller (1983, p. 123): “When scientists set out to understand a new principle or order, one of the first things they do is look for events that disturb that order. Almost invariably it is in the exception that they discover the rule”; (1983, p. 179): “The challenge for investigators in every field is to break free of the hidden constraints of their tacit assumptions, so that they can allow the results of their experiments to speak for themselves.”
7. Heed results of earlier workers
– Louis Pasteur (in Rene Dubos, 1950, p. 376): “Preconceived ideas are like searchlights which illumine the path of the experimenter and serve him as a guide to interrogate nature.”
– Belver Griffith and Nicholas Mullins (1972, p. 961): “[A] general indifference to the work of other researchers can generate considerable antagonism.”
– Walter Stewart and Ned Feder (1987, p. 213): “Every scientist has, at a minimum, an obligation to ensure that what is published under his name is accurate.”
8. Focus on results, not on personalities
– Belver Griffith and Nicholas Mullins (1972, p.959): “Communication and some degree of voluntary association are intrinsic in science, and the important question therefore becomes not whether scientists organize, but rather how, why, and to what degree.”
– Bernard Dixon (1973, p. 171): “Very, very few scientists appear to be aware that all experience . . . is subjective.”
– Albert Rees (in the preface to the series [Medawar, 1979, p. xi]): Science “is an enterprise with its own rules and customs, but an understanding of that enterprise is accessible to any of us, for it is quintessentially human.”
– William Broad and Nicholas Wade (1982, p. 180): “Like any other profession, science is ridden with clannishness and clubbiness. This would be in no way surprising, except that scientists deny it to be the case. . . . In fact, researchers tend to organize themselves into clusters of overlapping clubs.”
9. Seek “how” and “what,” not why
– Claude Bernard (1865/1957, p. 80): “The nature of our mind leads us to seek the essence or the why of things . . . experience soon teaches us that we cannot get beyond the how, i.e., beyond the immediate cause or the necessary conditions of phenomena” (emphasis Bernard’s).
– John Steinbeck (1941/1962, p. 143): “But the greatest fallacy in, or rather the greatest objection to, teleological thinking is in connection with the emotional content, the belief. People get to believing and even to professing the apparent answers thus arrived at, suffering mental constrictions by emotionally closing their minds to any of the further and possibly opposite ‘answers’ which might otherwise be unearthed by honest effort.”
10. Encourage expression of opposing views
– Peter Medawar (1979, p. 39): “I cannot give any scientist of any age better advice than this: the intensity of the conviction that a hypothesis is true has no bearing on whether it is true or not” (emphasis Medawar’s).
– Bruno Latour (1987, p. 97): “As long as controversies are rife, Nature is never used as the final [arbitrator], since no one knows what she is and says. But once the controversy is settled, nature is the ultimate referee” (emphasis Latour’s).
– Adrian Wenner and Patrick Wells (1990, p. 268): “. . . we have to get across the point at all levels that we are scientists because it is fun, and that the interjection of humor and tolerance of disparate viewpoints in our scientific controversies is a part of the fun, and is a part of life itself.”
A clear message emerges from the above. Each generation, unaware of the above time-tested principles (through lack of appropriate education and/or requirements during scientific training), repeats the mistakes of earlier generations (the Santayana principle: “Those who cannot remember the past are condemned to repeat it.”). We can find innumerable examples in animal behavior research where scientists: (1) lock too much into theory and not enough into Nature, (2) fail to use all available scientific approaches (do not understand process), (3) seek “truth” as an end product, (4) do not realize that their past experiences can influence perception, (5) fudge results just a little or perhaps emphasize or select “positive” results, (6) ignore anomalies that arise, (7) seek fame and fail to acknowledge adequately those who went before, (8) elevate some individuals to “hero” status, (9) wallow in anthropomorphism and teleology, and (10) act to exclude (deny) a platform to those with opposing views. Any one or more of these mistakes can lead to controversy and most often do.
But it need not be so. Students of animal behavior can get into the spirit of true science by emphasizing interpersonal relations less and Nature more. In animal behavior studies, the future lies in a greater emphasis on the stimuli responsible for given acts, not on the presumed “function” of those acts – not “Why does a given behavior exist?” but, “What stimulus evokes the behavior?” – just as Schneirla did. Also, we can more often choose the appropriate animal for the question rather than focusing on the animal itself. Finally, we must not strive to obtain certain results in our experiments. If one hopes for a given result, all is lost. Nature has its own rules for us to find, not to dictate.
Our particular community interest in animal behavior studies, of course, is the progress of science and our understanding of Nature, including knowledge of what animals really do. When controversy does erupt – as it surely will, and repeatedly – scientists should not run and hide but rise to the occasion and exploit the spirit and challenge provided.
The University of California at Santa Barbara, especially the Department of Biological Sciences in the College of Letters and Science and the College of Creative Studies, provided a haven and opportunity for reflections on the scientific process. Students in my course, “The Nature of Biological Research,” also contributed much toward the thoughts contained in this contribution.
I thank Eugene Meyer at Loyola College in Baltimore; William Shurcliff, Emeritus Professor of Physics at Harvard; and Patrick Wells, Emeritus Professor of Biology at Occidental College in Los Angeles, for their valuable suggestions for improvement of the manuscript. Special thanks also go to Gary Greenberg and Ethel Tobach for their role in bringing together the participants in the Sixth T.C. Schneirla Conference.
Anderson, J. (1988). Controversies in science: When the experts disagree. MBL Science, 3, 18.
Aristotle. (1931). Historia animalium. Book 9.40; Vol. 3; Vol. 4. London: Oxford University Press. Original work, 330 B.C.
Aronson, N. (1986). The discovery of resistance: Historical accounts and scientific careers. Isis, 77, 630-646.
Asimov, I. (1989, June 9). Cold fusion: Science fiction or reality? Los Angeles Times.
Atkinson, J. W. (1985). Models and myths of science: Views of the elephant. American Zoologist, 25, 727-736.
Bacon, F. (1952). Novum organum. In R.M. Hutchins (ed.), Great books of the western world (pp. 103-195). Chicago: Encyclopedia Britannica, Inc. Original work, 1620.
Bernard, C. (1957). An introduction to the study of experimental medicine. New York: Dover. Original work, 1865.
Bernatowicz, A. J. (1958). Teleology in science teaching. Science, 128, 1402-1405.
Bernstein, R. J. (1983). Beyond objectivism and relativism: Science, hermeneutics, and praxis. Philadelphia: University of Pennsylvania Press.
Bohm, D., & Peat, F. D. (1987). Science, order, and creativity. New York: Bantam Books.
Branscomb, L. M. (1985). Integrity in science. American Scientist, 73, 421-423.
Broad, W., & Wade, N. (1982). Betrayers of the truth. New York: Simon & Schuster.
Bruner, J. S., & Postman, L. (1949). On the perception of incongruity: A paradigm. Journal of Personality, 18, 206-223.
Chalmers, A. F. (1978). What is this thing called science. Milton Keynes, England: Open University Press. Original work, 1976.
Chamberlin, T. C. (1965). The method of multiple working hypotheses. Science, 148, 754-759. Original work, 1890.
Dayton, P. K. (1979). Ecology: A science and a religion. In R. J. Livingston (ed.), Ecological processes in coastal and marine systems (pp. 3-18). New York: Plenum Press.
Dewsbury, D. A. (1993). On publishing controversy: Norman R. F. Maier and the genesis of seizures. American Psychologist, 48, 869-877.
Dixon, B. (1973). What is science for? London: Collins.
Dubos, R. J. (1950). Louis Pasteur: Free lance of science. Boston: Little, Brown & Co.
Duclaux, E. (1920). Pasteur: The history of a mind. Philadelphia: W. B. Saunders Co. Original work, 1896.
Feyerabend, P. K. (1975). Against method: Outline of an anarchistic theory of knowledge. London: New Left Books.
Fleck, L. (1979). Genesis and development of a scientific fact. Chicago: University of Chicago Press. Original work, 1935.
Franks, F. (1981). Polywater. Cambridge, Mass.: MIT Press.
Gilbert, G. N., & Mulkay, M. (1984). Opening Pandora’s box. New York: Cambridge University Press.
Glazer, M. P., & Glazer, P. M. (1986). Whistleblowing. Psychology Today, 20, 36-43.
Gleick, J. (1992). Genius: The life and science of Richard Feynman. New York: Pantheon Books.
Griffith, B. C., & Mullins, N. C. (1972). Coherent social groups in scientific change. Science, 177, 959-964.
Griffin, D. R. (1984). Animal thinking. Cambridge: Harvard University Press.
Hardin, G. (1993). Living within limits: Ecology, economics, and population taboos. New York: Oxford University Press.
Horgan, J. (1990). Stinging criticism. Scientific American, 26, 32.
Huizenga, J. R. (1992). Cold fusion: The scientific fiasco of the century. Rochester, N.Y.: University of Rochester Press.
Hull, D. L. (1988). Science as a process: An evolutionary account of the social and conceptual development of science. Chicago: University of Chicago Press.
Keller, E. F. (1983). A feeling for the organism: The life and work of Barbara McClintock. New York: Freeman.
Kuhn, T. S. (1970). The structure of scientific revolutions. 2nd ed. enlarged. Foundations of the Unity of Science. Vol. ii, No. 2. Chicago: University of Chicago Press. Original work, 1962.
Kuhn, T. S. (1979). Foreword. In Ludwik Fleck (1935/1979), Genesis and development of a scientific fact. Chicago: University of Chicago Press.
Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Milton Keynes, England: Open University Press.
Lehrman, D. S. (1970). Semantic and conceptual issues in the nature-nurture problem. In L. R. Aronson, E. Tobach, D. S. Lehrman, & J. S. Rosenblatt (eds.). Development and evolution of behavior: Essays in memory of T. C. Schneirla (pp. 17-52). San Francisco: W.H. Freeman.
Loehle, C. (1987). Hypothesis testing in ecology: Psychological aspects and the importance of theory maturation. Quarterly Review of Biology, 62, 397-409.
Mahoney, M. J. (1976). Scientist as subject: The psychological imperative. Cambridge, Mass.: Ballinger (Lippincott).
Maier, N. R. F. (1960). Maier’s law. American Psychologist, 15, 208-212.
Medawar, P. B. (1979). Advice to a young scientist. New York: Harper and Row.
Moore, B. R. (1988). Magnetic fields and orientation in homing pigeons: Experiments of the late W.T. Keeton. Proceedings of the National Academy of Sciences, 85, 4907-4909.
Paul, D. B. (1987). The nine lives of discredited data. The Sciences, May/June, 26-30.
Pauly, P. J. (1981). Jacques Loeb and the control of life: An experimental biologist in Germany and America, 1859-1924. Baltimore: Johns Hopkins University (Ph.D. Thesis, Univ. Microfilm #8106660).
Peters, D. P., & Ceci, S. J. (1982). Peer-review practices of psychological journals: The fate of published articles, submitted again. Behavioral and Brain Sciences, 5, 187-255.
Piel, G. (1970). The comparative psychology of T.C. Schneirla. In L.R. Aronson, E. Tobach, D. S. Lehrman, & J. S. Rosenblatt (eds.), Development and evolution of behavior: Essays in memory of T. C. Schneirla (pp. 1-13). San Francisco: W.H. Freeman.
Piel, G. (1984). T. C. Schneirla and the integrity of the behavioral sciences. In G. Greenberg, & E. Tobach (eds.). Behavioral evolution and integrative levels (pp. 9-14). Hillsdale, NJ: Erlbaum.
Platt, J. (1964). Strong inference. Science, 146, 347-353.
Polanyi, M. (1958). Personal knowledge: Towards a post- critical philosophy. Chicago: University of Chicago Press.
Racker, E. (1989). A view of misconduct in science. Nature, 339, 91-93.
Rosin, R. (1978). The honey bee “language” controversy. Journal of Theoretical Biology, 7, 489-602.
Rosin, R. (1980). Paradoxes of the honey-bee “dance language” hypothesis. Journal of Theoretical Biology, 84, 775-800.
Rosin, R. (1988). Do honey bees still have a “dance language”? American Bee Journal, 128, 267-268.
Rosin, R. (1992). A note on the decisive “proof” for use of “dance language” information. American Bee Journal, 132, 428.
Rousseau, D. L. (1992). Case studies in pathological science. American Scientist, 80, 54-63.
Silvernale, M. N. (1965). Zoology. New York: Macmillan Co.
Steinbeck, J. (1962). The log from the Sea of Cortez. New York: Viking. Original work, 1941.
Stewart, W., & Feder, N. (1987). The integrity of the scientific literature. Nature, 325, 207-214.
Taubes, G. (1993). Bad science: The short life and weird times of cold fusion. New York: Random House.
Theocaris, T., & Psimopoulos, M. (1987). Where science has gone wrong. Nature, 329, 595-598.
Tobach, E. (1970). Some guidelines to the study of the evolution and development of emotion. In L.R. Aronson, E. Tobach, D. S. Lehrman, & J. S. Rosenblatt (eds.), Development and evolution of behavior: Essays in memory of T. C. Schneirla (pp. 238-253). San Francisco: W. H. Freeman.
Tobach, E., & Aronson, L. (1970). T.C. Schneirla: A biographical note. In L.R. Aronson, E. Tobach, D. S. Lehrman, & J. S. Rosenblatt (eds.), Development and evolution of behavior: Essays in memory of T. C. Schneirla (pp. xi-xviii). San Francisco: W. H. Freeman.
Wells, P. H., & Wenner, A. M. (1973). Do honey bees have a language? Nature, 241, 171-175.
Wenner, A. M. (1972). Incremental color change in an anomuran decapod, Hippa pacifica Dana. Pacific Science, 26, 346-353.
Wenner, A. M. (1989). Concept-centered vs. organism-centered research. American Zoologist, 29, 1177-1197.
Wenner, A. M. (1993). Science as a process: The question of bee “language.” Bios, 64, 78-83.
Wenner, A. M., & Harris, A. M. (1993). Do California monarchs undergo long-distance directed migration? In S. G. Malcolm & M. P. Zalucki (eds.), Biology and conservation of the monarch butterfly (pp. 209-218). Los Angeles: Natural History Museum of Los Angeles County, Science Series contr. No. 38.
Wenner, A. M., Meade, D., & Friesen, L. J. (1991). Recruitment, search behavior, and flight ranges of honey bees. American Zoologist, 31, 768-782.
Wenner, A. M., & Wells, P. H. (1987). The honey bee dance language controversy: The search for “truth” vs. the search for useful information. American Bee Journal, 127, 130-131.
Wenner, A. M., & Wells, P. H. (1990). Anatomy of a controversy: The question of a “language” among bees. New York: Columbia University Press.
Wenner, A. M., Wells, P. H., & Johnson, D. L. (1969). Honey bee recruitment to food sources: Olfaction or language? Science, 164, 84-86.
Wilson, E. O. (1972). (Letter exchange). Scientific American, 227, 6.