
Thread: Genetics of Modern Human Origins and Differentiation

  1. #1

    Join Date
    Aug 2007
    Last Online
    Saturday, February 8th, 2020 @ 03:48 AM
    Single adult
    Thanks Thanks Given 
    Thanks Thanks Received 
    Thanked in
    8 Posts

    Post Genetics of Modern Human Origins and Differentiation

    The phylogeography of Y chromosome binary haplotypes and the origins of modern human populations


    Although molecular genetic evidence continues to accumulate that is consistent with a recent common African ancestry of modern humans, its ability to illuminate regional histories remains incomplete. A set of unique event polymorphisms associated with the non-recombining portion of the Y-chromosome (NRY) addresses this issue by providing evidence concerning successful migrations originating from Africa, which can be interpreted as subsequent colonizations, differentiations and migrations overlaid upon previous population ranges. A total of 205 markers identified by denaturing high performance liquid chromatography (DHPLC), together with 13 taken from the literature, were used to construct a parsimonious genealogy. Ancestral allelic states were deduced from orthologous great ape sequences. A total of 131 unique haplotypes were defined which trace the microevolutionary trajectory of global modern human genetic diversification. The genealogy provides a detailed phylogeographic portrait of contemporary global population structure that is emblematic of human origins, divergence and population history that is consistent with climatic, paleoanthropological and other genetic knowledge.
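The abstract's step of deducing ancestral allelic states from orthologous great-ape sequences is simple outgroup polarization. The sketch below is my own toy illustration of that logic, not the paper's pipeline; the marker names and alleles are invented.

```python
# Toy illustration of outgroup rooting for binary markers: the allele
# carried by the outgroup (e.g., chimpanzee) is taken as ancestral, the
# other observed allele as derived. Marker names/states are hypothetical.

def polarize(markers, outgroup):
    """markers: {name: set of two observed alleles};
    outgroup: {name: allele seen in the outgroup species}."""
    polarity = {}
    for name, alleles in markers.items():
        ancestral = outgroup[name]
        derived = next(a for a in alleles if a != ancestral)
        polarity[name] = {"ancestral": ancestral, "derived": derived}
    return polarity

markers = {"M1": {"A", "G"}, "M2": {"C", "T"}}   # invented biallelic markers
chimp = {"M1": "A", "M2": "T"}                   # invented outgroup alleles
print(polarize(markers, chimp))
```

With polarity established, haplotypes can be ordered by the accumulation of derived states, which is what makes a parsimonious genealogy possible.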

  2. #2


    Post Genetics of Modern Human Origins


    John H. Relethford

    Department of Anthropology, State University of New York, College at Oneonta, Oneonta, New York 13820; e-mail:


    A major and continuing debate in anthropology concerns the question of whether modern Homo sapiens emerged as a separate species roughly 200,000 years ago in Africa (recent African origin model) or as the consequence of evolution within a polytypic species spread across several regions of the Old World (multiregional model). Genetic data have been used to address this debate, focusing on the analysis of gene trees, genetic diversity within populations, and genetic differences between populations. Although the genetic data do provide support for the recent African origin model, they also are compatible with the multiregional model. The genetic evidence provides little direct inference regarding phylogeny, but it can tell us a great deal about ancient demography. Currently, neither model of modern human origins is unequivocally supported to the exclusion of the other.

  3. #3


    Post Human Origins And Differentiation


    Henry Harpending and Alan Rogers

    Department of Anthropology, University of Utah, Salt Lake City, Utah 84112


This is a review of genetic evidence about the ancient demography of the ancestors of our species and about the genesis of worldwide human diversity. The issue of whether or not a population size bottleneck occurred among our ancestors is under debate among geneticists as well as among anthropologists. The bottleneck, if it occurred, would confirm the Garden of Eden (GOE) model of the origin of modern humans. The competing model, multiregional evolution (MRE), posits that the number of human ancestors has been large, occupying much of the temperate Old World for the last two million years. While several classes of genetic marker seem to contain a strong signal of demographic recovery from a small number of ancestors, other nuclear loci show no such signal. The pattern at these loci is compatible with the existence of widespread balancing selection in humans. The study of human diversity at (putatively) neutral genetic marker loci has been hampered from the beginning by ascertainment bias, since the markers were discovered in Europeans. The high levels of polymorphism at microsatellite loci mean that they are free of this bias. Microsatellites exhibit a clear, almost linear diversity gradient away from Africa, so that New World populations are approximately 15% less diverse than African populations. This pattern is not compatible with a model of a single large population expansion and colonization of most of the Earth by our ancestors but suggests, instead, gradual loss of diversity in successive colonization bottlenecks as our species grew and spread.
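The serial-bottleneck loss of diversity that the abstract invokes follows from the standard drift result that expected heterozygosity decays by a factor of (1 - 1/2N) per generation at population size N. The sketch below is my own illustration with invented numbers, not the authors' analysis.

```python
# Minimal sketch of serial founder bottlenecks eroding heterozygosity:
# each founder episode of size N removes a fraction 1/(2N) of expected
# heterozygosity per generation spent at that size. Numbers are invented.

def heterozygosity_after(h0, bottlenecks):
    """bottlenecks: list of (N_founders, generations) episodes."""
    h = h0
    for n_founders, gens in bottlenecks:
        h *= (1 - 1 / (2 * n_founders)) ** gens
    return h

h_africa = 0.80                 # hypothetical starting diversity
route = [(250, 10)] * 5         # five successive founder events
h_new_world = heterozygosity_after(h_africa, route)
print(f"retained: {h_new_world:.3f}, "
      f"fraction lost: {1 - h_new_world / h_africa:.1%}")
```

Each additional colonization event compounds the loss, which is why a stepwise out-of-Africa expansion predicts the roughly linear diversity gradient described above, whereas a single large expansion does not.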

  4. #4

    Post Modern Human Origins: Genetic and cognitive aspects



    A Thesis
    Submitted to the Division of Social Sciences
    of New College of the University of South Florida
    in partial fulfillment of the requirements for the degree
    of Bachelor of Arts in Anthropology
    Under the sponsorship of Dr. Anthony P. Andrews
    Sarasota, Florida
    Spring 1996


    This thesis consists of two approaches to the Middle-to-Upper
    Pleistocene transition in human evolution, i.e., the shift from Homo
    erectus to Homo sapiens.

    The first section is a review and assessment of the
    ongoing "continuity vs. replacement" controversy surrounding the
    shift, focusing especially on the evolutionary theories behind the
    competing hypotheses. A mathematical demographic model by Weiss &
    Maruyama (1976), the implications of which I believe have been
    ignored, is given particular attention.

    The second section is a tentative, exploratory discussion of the
    specifically cognitive evolutionary novelties which may explain the
    transition. I argue that (1) a coherent and robust body of cognitive
    psychological theory will be necessary to interpret the
    paleontological and archaeological data of the period, and (2) the
    elements of such a theory already exist, but await an adequately
    established synthesis.

    Two million years ago the ancestors of our species lived only on one
    continent, unable to survive outside of tropical or sub-tropical
    habitat. They probably had only incomplete mastery over the use of
    fire. It is unlikely they could hunt large game.

    By 35,000 years ago our ancestors had shown every indication that
    they possessed all the potential we possess today: complex social
    organization, extensive strategic ability in every aspect of
    subsistence, and a rapidly diversifying technology. They had already
    spread across one continent a million years before (Eurasia), and
    were beginning the process of populating three more (Australia,
    North & South America).

    This thesis will discuss this evolutionary transition from two
    perspectives, attempting first to answer the questions of where,
    when, and how. Secondly, it will address the more speculative
    question of why the transition occurred when and where it did. Only
    when we can say why an evolutionary process occurred as it did and
    not otherwise can we claim to have explained it. Too many plausible
    evolutionary scenarios beg the question of why they did not occur
    earlier, or elsewhere, or more often.

    Section one of this thesis is an exercise in 'paleogenetics': the
historical reconstruction of evolutionary dynamics from present-day
    genetic data, using demographic models as an integrative framework.
    This approach thus goes beyond simply analyzing genetic relatedness
    among modern peoples to testing hypotheses about the processes that
    may have resulted in modern patterns. The difficulties which
    theories based solely on molecular genetic data are presently facing
    result from their lack of a broader approach encompassing the
    complexities of speciation, ecology, and demographics.

    The impulse to write this section of the thesis grew out of
    frustration at the lack of consensus in the scientific community
    about the interpretation of Middle to Upper Pleistocene hominid
    fossils and their phylogenetic relationships.

    The problem centers on disputes over the origin of anatomically
    modern humans, their exact phylogenetic relationship to the various
pre-existing Homo erectus populations, and where Neanderthal
    specimens fit into the picture.

    That achieving even rudimentary agreement has been difficult is not
    very surprising given the number and complexity of fields concerned
    with the problem. This situation has resulted in much theory having
    to be accepted uncritically from outside anthropology. The
    fundamental conflicts within paleoanthropology are between the kinds
    of theory and methodology intrinsic to those 'borrowed-from' fields
    rather than the evidence itself. Much of the difficulty thus seems
    to stem from conflicts between the several theoretical orientations
    within evolutionary biology. While it has been said that "nothing in
    biology makes sense except in light of evolution" (Dobzhansky 1973),
    it can as easily be said that evolution brings out all the
    difficulties and contradictions in biology as well.

    For example, one would think that given the evolutionary perspective
    within which all of biology now works that the fundamental process
    of evolution, speciation, would be fairly well understood; or,
    failing that, it would then naturally be the subject of a
    substantial proportion of the effort expended in the field.
    Surprisingly, neither is the case (Sokal 1974; Futuyma 1986; Moya

    However, not all of the theoretical difficulty concerning modern
    human origins comes from biology. There are problems within
    anthropology. The disintegration of the field of anthropology over
the last twenty years has been well noted: ... "where outright
    hostility does not prevail, there tends at best to be a cool and
    distant tolerance of each subfield by the other, each privately
    waiting, one suspects, for the other to either see things its way or
    else just go away" (Paul 1987). Paul goes on to observe that this is
    unlikely, and that both 'camps' need to recognize the value of the
    other; not just in general, but in direct reference to what they do
    themselves. Each has important pieces of the puzzle and each ignores
    the other at their own peril. Looking outside anthropology proper,
    several other disciplines have come to be essential to understanding
    human evolution. In particular, cognitive psychology and comparative
    cognition possess a burgeoning corpus of experimental and
ethological data which is relevant to model-building in human
    evolutionary study. This information is no longer just relevant to
    early hominid evolution, i.e., our lineage versus the ape stem. It
is plausibly applicable up to the most recent steps of human
evolution.

    The disintegration within anthropology has hardly made the smooth
integration of extra-anthropological viewpoints any easier. Beyond
    that, some might argue that the 'softer' subdisciplines of
    anthropology, cultural anthropology and linguistics, are less
    important to an understanding of human evolutionary history than the
    more 'scientific' subdisciplines: archaeology and biological
    anthropology. Thus, the reasoning might go, the split in
    anthropology is more of a blessing than a hindrance with respect to
    the study of human evolution. Both sections of this thesis will, I
    hope, illustrate the key role the so-called 'softer' disciplines can
    and should play in paleoanthropological theory-building. How
    relevant human evolutionary history is to the study of human
    behavior today is open to question. That socio-cultural behavior was
    important in human evolution, and is thus crucial to its
    explanation, is not.

    Section two asks a complementary set of questions about what
    precisely the last step of human evolution was. What cognitive
    advantages did Homo sapiens possess that Homo erectus did not? Did
    erectus use language and if so, what kind? And what was the adaptive
context in which the changes that resulted in modern humanity came
about?

    This section is an exercise in 'cognitive archaeology,' both in the
    sense of an attempt to dig up the cognitive past of the human
    species, and the need to attempt to derive a cognitive
    interpretation of archaeological evidence.
    Last edited by Frans_Jozef; Sunday, January 2nd, 2005 at 06:57 PM.

  5. #5

    Post Re: Modern Human Origins: Genetic and cognitive aspects

Current Theories: Multi-Regional Evolution and the Single-Source Theory
    Two main schools of thought have emerged regarding the origin of
    anatomically modern Homo sapiens. Each makes different assumptions
    about population genetics and each has different implications for
the time-depth of so-called 'racial' variations present in modern
    populations. A third alternative, with variants, has also been
    developed, and will be discussed, but has not been as widely debated
    as the primary two.

The first is known as the multi-regional evolution hypothesis, and is
    also known as the phyletic theory. It states that regional
    populations of Homo erectus evolved independently into modern
    populations of Homo sapiens, with archaic intermediate forms between
    the two grades.

    In Europe, Neanderthals and pre-Neanderthals would represent this
    archaic intermediate grade. Because of this the theory has also been
    known as the "Neanderthal phase" theory.

    This theory implies that so-called 'racial' characteristics are, or
    at least could be, very old, i.e., over 1,000,000 years. In other
    words, regional variations in human morphology are hold-overs from
    the previous Homo erectus populations.

    The evidence offered for the multi-regional theory stems primarily
    from fossil evidence. Phyletic theorists contend that there is a
    great deal of morphological continuity in regional populations of
    Homo erectus and Homo sapiens. For example, in the Far East, there
    are a number of morphological characteristics which modern humans
    and erectus have in common. These include cheek form: high,
    anteriorly placed cheekbones, resulting in a flatter face; dental
    characteristics: shovel-shaped incisors; and cranial traits. One of
    these cranial traits is known as an 'Inca bone', a feature resulting
    from an extra suture running across the occipital bone, which has
    its highest frequency (30%) in modern Eastern populations (and New
World migrants) and which was present in three out of four of the
    Peking erectus skulls (Krantz 1980:198). Thus, Chinese
    paleoanthropologists, at least, see a very clear transition from
    erectus to their modern populations (Nelson & Jurmain 1988:563;
    Wolpoff, Wu Xinzhi, and Thorne 1984; but see Young 1995).

    The chief proponents of the modern version of this theory have been
    Milford Wolpoff of the University of Michigan and Alan Thorne of the
    Australian National University (e.g., Wolpoff 1989, 1992; Thorne &
    Wolpoff 1992).

    The second theory is the single-source or Out-of-Africa hypothesis,
    also called the replacement hypothesis, which states that
    anatomically modern humans originated in one locale and subsequently
    displaced, i.e., replaced, archaic H. sapiens populations elsewhere
    in the world.

    This replacement is theorized to have occurred fairly recently, on
    the order of 50,000 years ago. Since so-called 'racial'
    characteristics are generally accepted to be primarily climatic
    adaptations, this theory implies that the geographic variation now
    present, at least in external morphology, has developed since the
    spread of modern humans.

    Evidence for the single-source theory lies primarily in molecular
    genetic information gleaned from analyses of modern human
    populations. What these analyses (of both mitochondrial DNA and
    nuclear DNA) show is that, genetically, all modern human populations
    are remarkably similar. They also purport to show that there is
    greater genetic diversity within native Africans, and argue that
    this diversity implies greater time depth there than elsewhere.

    Single-source theorists argue that mitochondrial DNA analyses show
    that all modern human populations are descended from an African
    ancestor who lived between 100,000 and 290,000 years ago (e.g.,
Cann, Stoneking, and Wilson 1987; Wilson et al. 1991; Stoneking 1993).

    Fossil support for the single-source theory comes from the relative
    timing of the appearance of anatomically modern Homo sapiens
    (beginning with African specimens, followed by their appearance
    elsewhere); and the apparent rapidity of the subsequent shift in
    European skeletal morphology from a Neanderthal pattern to a modern
    pattern, especially in Western Europe.

    Thus there were single-source theorists prior to the advent of
genetic analysis, e.g., Leakey (1963), but it was a rarely
argued position before genetic evidence was interpreted to support
the idea of recent speciation.

    The chief proponents of this theory have been Christopher Stringer
    of the Natural History Museum in London (e.g., Stringer 1989a,
    1989b,1990; Stringer and Andrews 1988) and Rebecca Cann, Mark
    Stoneking, and Alan Wilson (e.g., Cann, Stoneking, and Wilson 1987;
    Wilson et al. 1991; Stoneking 1993).

    These are extreme voicings of the two arguments. Each has less
    strict versions intended to sidestep conflicting evidence, or offer
    alternate explanations for patterns of evidence. Proponents of the
    multi-regional theory often mention the possible role of gene flow
    between regions 'maintaining species identity', in order to account
    for the complete interfertility between and relative genetic
    homogeneity of modern populations. This is also used to explain how
    different populations maintained parity with the others during the
    transition. But the main thrust of the position is that the
    advancement in evolutionary grade occurred throughout the species,
    in concert, due to selective pressures operating throughout its
    geographic range.

    Proponents of the single-origin theory might respond to evidence of
    local morphological continuity by mentioning possible interbreeding
    between displacing and indigenous populations, but would maintain
    that the overwhelming nature of such gene flow was such that the end
    result was the same as it would have been had the invaders simply
    replaced, i.e., annihilated, the native populations.

    While I have restated the extreme ends of the spectrum of views, and
    while the "hedged" versions of the two positions may not seem
    significantly different, the issue is not amenable to compromise
    without relinquishing the essence of each theory. The less extreme
    versions of the argument do not resolve the issue. In fact, they
    merely serve to obscure the fundamental incompatibility of the
    assumptions underlying the two positions. Since the basic reasoning
    behind each position is still different, they must remain
    irreconcilable no matter how similar they may seem on the surface.
    To my mind, the persistence of this dispute within the
    paleoanthropological community serves to demonstrate their

    Alternatives to the two primary theories have been offered by
several researchers. One, offered by Günter Bräuer (Bräuer
1984, 1989, 1992), is known as the (African) Hybridization and
    Replacement Theory. Another, known as the assimilation model, has
been offered by several scholars (Smith 1992; Smith and Trinkaus
1992; Smith, Falsetti and Donnelly 1989). These theories have not
    received as much press, popular or academic, as the other two. They
    will be addressed following the discussion of the first two
    theories, since they are each modified versions, to one degree or
    another, of the primary theories.

  6. #6

    Post Re: Modern Human Origins: Genetic and cognitive aspects

    Alleles in Evolution
    Having followed Wolpoff in his analysis of the paleoanthropological
    scene this far, I will now take the opposite tack and question his
characterization of the exact evolutionary nature of the Middle-Upper
    Pleistocene shift.

Wolpoff (1989, 1992) is effective in refuting the single-source
    recent replacement hypothesis. But the quality of what he holds
    forth as an alternative is not the equal of his criticisms of
    others. First, he reveals a belief that gene flow is necessary to
    prevent speciation: "Without gene flow, it is inevitable that there
    will be speciation" (Wolpoff 1989:87). Second, gene flow "must
    primarily be regarded to function in changing frequencies of
    existing alleles, and not usually in introducing new ones" (1989:87-
    88). He thus posits no necessary center of origin nor any
    significant discontinuity in the course of the transition.

    Both of these notions can be seriously questioned both from the
    standpoint of population biology in the present (Petry 1982; Gould
1977:115) and a more thorough analysis of the nature of the Middle-
Upper Pleistocene transition. The issue of 'inevitable' speciation
    is not crucial to the argument here, so it will not be discussed
    beyond the observation that many long-term geographical isolates
    remain biologically fertile, especially among large-bodied mammals.
    The issue of the nature of the role of gene flow, however, is
    crucial to the argument.

    In Wolpoff's (1989) theory the effect of gene flow is to produce
    changes in frequencies of alleles rather than to introduce new
    alleles. His reasoning is that the characteristics which appear to
    be involved in the transition are more often quantitative than
    qualitative. This is also to counter charges that even the amount of
    gene flow he alleges could provide for the introduction of only 3 or
    4 advantageous alleles in the time period necessary (Rouhani

    Wolpoff's idea, however, suffers from a flaw in logic. If gene
    frequencies alone are all that are involved in the transition,
    selection would produce evolution even without gene flow from
    outside the geographic region in question. If, however, selection is
    not at work in the local region, then gene flow from outside would
    not produce local evolutionary change. Either the resultant gene
    frequencies are adaptive or not. And while clines do not serve as
    barriers to advantageous genes (Weiss & Maruyama 1976) they do serve
    as barriers to neutral alleles (Livingstone 1992).
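The claim that gene flow changes the frequencies of existing alleles can be made concrete with the textbook continent-island recursion, p' = (1 - m)p + m·p_migrant. The sketch below is my own, with invented numbers, and is not drawn from Wolpoff's or Livingstone's models.

```python
# Standard continent-island model: each generation a fraction m of the
# local gene pool is replaced by migrants with allele frequency
# p_migrant, so p' = (1 - m) * p + m * p_migrant. Numbers are invented.

def allele_freq_trajectory(p0, p_migrant, m, generations):
    """Return the allele frequency each generation, starting at p0."""
    freqs = [p0]
    for _ in range(generations):
        freqs.append((1 - m) * freqs[-1] + m * p_migrant)
    return freqs

traj = allele_freq_trajectory(p0=0.10, p_migrant=0.60, m=0.05, generations=40)
print(f"start {traj[0]:.2f} -> after 40 generations {traj[-1]:.3f}")
```

The frequency converges geometrically toward the migrant value without any new mutation arising locally, which is exactly the sense in which Wolpoff treats gene flow as a frequency-changing rather than an allele-introducing force.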

    The dispute over the logic of gene flow between Wolpoff and Rouhani
    and others (Jones & Rouhani 1986) is over the behavior of variants
    of single genes (i.e., alleles) under particular mathematical
    conditions. This situation, unfortunately, may be tantamount to
    arguing over how many angels can dance on the head of a pin.

    Currently, there is very little detailed understanding of the
    mechanisms and possible higher*order structure involved in the
    developmental genetic process. Consequently, it is unknown to what
    extent mathematical models which assume single genes respond to
    selection as single independent entities are valid. Higher-order
    processes may very well be responsible for some of the kinds of
    change we are concerned with here.

    Chromosomal evolution, rather than changes in gene frequencies or
    novel alleles, is one example of a 'higher-order' phenomenon which
    might be relevant (White 1978). For example, chromosomal "banding
    studies indicate that at least 10 large inversions and
    translocations and one chromosomal fusion have occurred since" the
    human and chimpanzee lineages diverged (King and Wilson 1975:114).
    Obviously these changes have occurred by mutation and been
    subsequently inherited. Clearly they did not result in sterility or
    they would not have persisted. It is unclear what effects these
    changes have had and whether or not they are responsible for none,
    some, or all of the physiological differences between chimps and
    humans. Chromosomal mutations are not well understood and their
    suggested effects have ranged from uniformly deleterious to trivial.

    Moreover, the comparative study of karyotypes and interspecies
    chromosomal change has not, surprisingly, been integrated into
    general evolutionary research. The reasons for this are varied, but
    for the most part it is attributable to the fact that the scholars
    who pursue the work do so outside of the biological mainstream, and
    make little effort to make their work available and comprehensible
    to other biologists (Marks 1983). Moreover, this work examines
    phenomena which do not lend themselves to investigation within the
    currently dominant molecular biological paradigm. This has resulted
    in a potentially very informative line of questioning, i.e., what
    areas of the genome, in particular, may be involved in the
    differences between Homo and Pan/Gorilla, being unexplored.
    Surprisingly, there is even a fair amount of intra-species
    chromosomal variability within modern Homo sapiens. This hardly
    argues for a simple situation.

    So, contra Rouhani, heritable chromosomal mutations are not
    necessarily a "local phenomenon" in terms of their subsequent
    inheritance. It would appear that a chromosomal mutation, once
    having uniquely occurred, can be inherited like any other mutation.

    What is important for my argument is that as long as these higher-
    order mutations are heritable they may behave as alleles and thus
    conform to the demographic models discussed here. How they differ is
    that they cannot be expected to appear often or regularly in the
    evolution of a species. Indeed, while any individual nucleotide in
    DNA typically has a characteristic mutation rate which makes the
    occurrence of a mutation statistically certain to occur several or
    many times in the life of a species (Livingstone 1992), any given
    chromosomal mutation can be expected to be effectively unique (Marks
    1983). This is because their effects, if any, would be expected to
    depend on the context of the genetic makeup of the individual in
    which it occurs.
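The contrast drawn here, between the statistical certainty that a given point mutation recurs and the effective uniqueness of a chromosomal mutation, is just an expected-count calculation: roughly 2N gene copies, times a per-copy rate, times the number of generations. The figures below are my own back-of-envelope assumptions, not values from the text.

```python
# Back-of-envelope contrast (invented rates and sizes): expected number
# of independent occurrences of a specific mutation over a species'
# history is ~ 2N copies * per-copy rate * generations.

def expected_recurrences(rate_per_copy, pop_size, generations):
    return 2 * pop_size * rate_per_copy * generations

# A specific point mutation at a typical nucleotide rate:
point = expected_recurrences(1e-8, pop_size=100_000, generations=1_000_000)
# A specific rearrangement, assuming a vastly lower per-event rate:
chrom = expected_recurrences(1e-13, pop_size=100_000, generations=1_000_000)
print(f"point mutation: ~{point:.0f} recurrences; "
      f"specific rearrangement: ~{chrom:.2f}")
```

Under these assumptions the point mutation recurs thousands of times while the specific rearrangement is expected less than once, which is the quantitative sense in which any given chromosomal mutation is "effectively unique."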

    If Wolpoff's assessment of the function of gene flow in Pleistocene
    human populations were correct, then Livingstone's analysis of the
    likelihood of allele spread vs. in-place allele mutation effectively
    invalidates Wolpoff's phyletic theory!

    An example of fundamental genetic processes just beginning to be
    investigated is recent work on how DNA folds and unfolds to allow
    transcription in a functioning cell (Manuelidis 1990). What was
    previously thought to be "junk" DNA, the regions between coding gene
    sequences, now appears to be involved in several levels of higher
    order folding which govern the expression of genes. Chromosomes are
    now understood to be maintained as discrete, compact entities within
    the nucleus and regions of chromosomes can acquire unique three-
    dimensional positions in differentiated cell types (Manuelidis 1990:
1533). This research is noted to illustrate (1) the fairly recent
recognition of hitherto unexpected realms of genetic complexity in
general, and (2) the recognition of complex chromosomal structure and
function, which suggests that chromosomal mutation may be far from
trivial. Beyond this recognition of the potential role for
    chromosomal mutation, our ignorance of the details of developmental
    function makes other heretofore undetected, yet evolutionarily
    relevant phenomena not only possible, but likely.

    This issue, the refusal by many scholars (both in population biology
    and paleoanthropology) to recognize that vast areas of biological
    function remain to be explained, even in broad terms, has been the
    single most frustrating aspect of my research and thought about this
    subject. Clearly, scholars must be sure that their theories are
    consistent with what is currently understood about any natural
    phenomenon. They should likewise be cautious when making assertions
    based on still-speculative theories. With regard to this subject,
    however, the issue is one of making assertions when we are fully
    aware we have barely begun to understand the processes involved.

    Mosaic Evolution
    Wolpoff's phyletic theory implies a geographically generalized
    advance in grade across the human species territory. He has made
much of the need to distinguish grade characteristics, i.e., those
    which represent the evolution toward the modern condition, from
    clade characteristics, i.e., those which are locally evolved, and
    presumably neutral or at least irrelevant to the shift to modern
    humanity. However, Wolpoff's criticisms notwithstanding (1992: 45-
    47), very early dates for material from Klasies River Mouth, Border
    Cave, and Omo Kibish argue for the development of modern morphology
    in Africa prior to its appearance elsewhere. Moreover, recent
    archaeological evidence of fairly advanced bone technology at an
early date (>90 kya) in Africa lends the specimens exhibiting
advanced morphology a credibility as to their antiquity that they
might not otherwise have (Yellen 1995).

    Clearly the most troublesome fossil evidence for both major
    theories, though admittedly less so for the Single-Source theory, is
    the fossil evidence from the Levant. A series of dating reappraisals
    using ESR (Electron Spin Resonance) and Thermoluminescence (TL)
    dating have turned a previously-held chronology on its head.
    Neanderthals and early modern humans now appear to have coexisted
    over a long period and exchanged the same habitat back and forth
    with one another in the course of shifting population distributions.

    If the dates for these specimens offered as reliable are correct,
    they bring up a host of questions for both primary theories and
    the 'hybridization and replacement' theory. For the replacement
    theory: If the modern human species was so adaptively superior to
    archaic regional populations (like the Neanderthals), why were they
    not, at the least, permanently displaced when early moderns arrived
    in a region? With respect to the phyletic theory, their
contemporaneous coexistence raises the question of how such
different variants persisted so long without affecting each other's
morphology. The 'hybridization and replacement' theory would appear
    to at least allow for the complex process implied by the Levant
    fossil record, but fails to go very far toward explaining it.

    A perplexing aspect of much of the late Middle and early Upper
    Pleistocene evidence is the appearance of modern human anatomy in
    some areas without concurrent improvements in technology and
    subsistence. Indeed, this phenomenon is seen in the Levant with the
Skhul remains associated with Levalloiso-Mousterian implements (Day
    1986:114). This association makes more sense given the dating
    reappraisal. But again the evidence argues for a complex
    evolutionary dynamic rather than any straightforward process.

    This evidence supports neither the single-source replacement theory
    nor the phyletic theory. In the former case, modern behavior would
    be expected to accompany modern morphology. In the latter, modern
    behavior would necessarily predate modern morphology. The fact of
    this bizarre sequence of events, both morphological and
    technological, implies that current analysis of the possibilities is
    incomplete.

    Surprisingly, there is a great deal of agreement among all those
    involved in the study of this period about the causation of several
    morphological characteristics of anatomically modern Homo sapiens
    (Harrold 1992). Stated briefly, most of the morphological change
    involved in the transition falls under the rubric
    of 'gracilization', i.e., a general reduction in robusticity. These
    changes include reduced anterior dentition and reduced facial
    prognathism and massiveness, reduced nuchal (neck) musculature,
    reduced upper body muscular massiveness, and reduced lower limb
    robusticity. A secondary change in both the length of the legs and
    the relative length of the distal segments served to make long-
    distance walking more efficient.

    The reduction in these areas is generally seen to be the result of
    lessened selection for robust morphology. Since robust morphology is
    energetically expensive, any reduction in selection for it would
    result in its loss. Robust morphology can only be evolutionarily
    sustained by the necessity of extremely strenuous, sustained
    physical activity. By every estimation, Middle and Upper Pleistocene
    hominids led far more rigorously athletic lives than any extant
    people. Robust dentition and cranial massiveness imply habitual use
    of the teeth as tools and/or relatively unprocessed food. Upper body
    massiveness implies the habitual use of strength to accomplish
    manipulative tasks rather than efficient, leverage-improving tools.
    Lower limb robusticity implies habitual, sustained, high-speed
    locomotion.

    The subsequent shift in lower limb proportion is especially
    interesting since it implies the transition did not necessarily
    involve a reduction in the amount of distance typically covered,
    merely the speed with which it had to be traversed.

    This evolutionary causation of gracilization, and the archaeological
    evidence of the behavioral change it implies, will be more fully
    explored in Section II of this thesis. What is important here is
    that the morphological transition from archaic to anatomically
    modern Homo sapiens in several regions was not accompanied by an
    archaeologically detectable shift in behavior: a rather bizarre
    situation.

    This situation argues for the evolution of improved cognitive
    potential, with gracilization following from the increased
    efficiency that improved cognition made possible (both in a fairly
    localized, very probably African, population), and subsequently the
    uneven spread of evolved modern cognitive potential and evolved
    gracile morphology, not always together, throughout the rest of the
    species' range. An
    important factor to note here is that the adaptive characteristic
    driving the transition may very well be paleontologically invisible,
    since the morphological characteristics which mark modern humans are
    generally agreed to be effects rather than causes!

    The only other alternative is to argue that either 1) the selection
    responsible for gracilization was not, at root, attributable to
    ecological efficiency, or 2) the improvement in ecological
    efficiency (across the species range outside of Africa) was
    archaeologically invisible. Neither one of these alternatives is
    very attractive.

  7. #7

    Post Re: Modern Human Origins: Genetic and cognitive aspects

    The (African) Hybridization and Replacement Theory
    Gunter Brauer (1984,1989,1992) has offered one form of a compromise
    between the two primary theories of the evolutionary origin and
    spread of modern humans. It differs from the replacement theory in
    that it assumes biological interfertility between early moderns and
    regional populations of archaic sapiens. Thus he clearly denies that
    modern humans were, strictly speaking, a new species. He is happy to
    make distinctions in grade between specimens; he uses Stringer's
    (1984) three-grade system for Homo sapiens.

    It differs from Wolpoff's phyletic theory in two ways, one
    straightforward and the other quite subtle. Brauer accepts early,
    i.e., prior to 60,000 ya, dates for several modern or almost-modern
    specimens (Klasies River Mouth, Omo Kibish I, Border Cave, and
    others; Brauer 1989:126-128). Thus he sees Africa as the clear
    source of modern morphology.

    The subtle way in which Brauer differs from Wolpoff (and the
    explanation for the name of the theory) is that Brauer sees modern
    humans evolving only in Africa. Their subsequent spread throughout
    the old world he describes as a process of hybridization and
    replacement. He grants, however, that this process "was certainly
    very complex, multicausal, different in various regions, and hardly
    rapid or complete" (1989: 124). In characterizing it this way he
    appears to be making a distinction between the context in which
    characteristics become adaptive and the context in which adaptive
    characteristics spread.

    Dr. Brauer has argued (1992) that the version of the Single-Source
    Theory that Wolpoff has so vigorously challenged, i.e., one that
    requires complete replacement without admixture, was merely an
    extreme model put forth as a test subject for the various
    methodological approaches. In a similar vein, Stringer has argued
    that his positions have used a phylogenetic species concept (one
    based purely on fossil morphology) rather than a biological species
    concept (one which has implications for interfertility) (Stringer
    1992).

    Both of these responses are a bit disingenuous. The first amounts
    to pleading that a researcher didn't really mean what he said. As
    for the second case, all the scholars in the field are presumably
    arguing over what actually happened to real living organisms, each
    of which presumably belonged to a valid biological species. It
    should certainly be permissible to operate temporarily with a
    circumscribed set of concepts for the sake of methodological purity,
    but conclusions need to be quickly retranslated into real terms. An
    evolutionary species concept, (or a cladistic species concept,
    etc.), must eventually come into logical contact with the more
    general biological species concept.

    This necessity, that conclusions be put back into biological terms,
    is the basis of my criticism of Brauer's theory. While I clearly
    agree with his assessment of the processes that brought about the
    transition, I take issue with the terminology he uses to describe
    them. He contends that if the transition from archaic sapiens to
    a.m. sapiens resulted in a marked shift in cranial anatomy, with
    only a few minor regional characteristics demonstrating some gene
    flow between old and new forms, then this is best described by the
    terms 'hybridization and replacement.' Brauer appears to be basing
    his judgment of 'hybridization' on the same kind of phylogenetic or
    cladistic (i.e., based on morphology) species definition that
    Stringer (1992) has. However, I would contend that as long as we are
    assuming a single, albeit polytypic, species, that in strict
    biological terms we are witnessing gene flow. The distinction is not
    a trivial one, because it forces us to think about the adaptive
    context in which the evolutionary process is taking
    place. 'Hybridization' implies a less-adapted variant arising
    between two well-adapted genetic complexes. 'Evolution' elicits what
    I believe is the correct picture: the increase in frequency of
    adaptive characteristics in a species due to their (the characters')
    adaptive superiority.

    Moreover, Brauer's terminology emphasizes what he sees as a process
    dominated by demic radiation, i.e., 'territorial change of groups or
    tribes, primarily due to ecological factors' (Brauer 1992:95). In
    other words, the transition is more attributable to migration than
    evolution per se. The analysis of the demographics of gene flow
    reviewed here makes such an interpretation unnecessary. The
    chronology and characteristics of the fossil record are consistent
    with a quite local process of interdemic gene transfer without
    recourse to population migrations or replacements.
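    The claim above rests on how quickly a locally advantageous trait
    can diffuse through neighboring demes without any group relocating.
    As a rough illustration (my own sketch, not from the thesis), the
    following toy stepping-stone model tracks an advantageous allele
    spreading along a chain of demes through local gene exchange alone;
    the deme count, migration rate m, and selection coefficient s are
    arbitrary assumptions chosen only to make the dynamic visible.

    ```python
    def sweep_generations(n_demes=50, m=0.05, s=0.05, threshold=0.99):
        """Generations until an advantageous allele, arising at one end
        of a chain of demes, exceeds `threshold` frequency in the deme
        farthest from its origin. Purely illustrative parameter values."""
        p = [0.0] * n_demes
        p[0] = 0.01  # allele appears, rare, in one deme only
        gen = 0
        while p[-1] < threshold:
            # local gene flow: each deme swaps a fraction m of its gene
            # pool with its immediate neighbors (reflecting boundaries)
            q = p[:]
            for i in range(n_demes):
                left = p[i - 1] if i > 0 else p[i]
                right = p[i + 1] if i < n_demes - 1 else p[i]
                q[i] = (1 - m) * p[i] + 0.5 * m * (left + right)
            # deterministic haploid selection within each deme
            p = [pi * (1 + s) / (1 + pi * s) for pi in q]
            gen += 1
        return gen

    print(sweep_generations())
    ```

    The point is qualitative: interdemic gene transfer alone produces a
    moving wave of the allele across the species' range, even though no
    deme's population ever migrates or is replaced.
    
    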

    Up to this point in the thesis, I have discussed the evolutionary
    process at work in recent human evolution in the abstract, i.e.,
    without going into the selective causation of the transition. When
    making fine distinctions of the kind involved in my critique of
    Brauer's and Wolpoff's theories, the exact nature of the adaptations
    involved becomes unavoidable. Section Two of the thesis will address
    this question in depth.

  8. #8

    Post Re: Modern Human Origins: Genetic and cognitive aspects

    The Assimilation Theory
    Fred Smith, of Northern Illinois University, has offered a
    theory quite close to Wolpoff's multi-regional evolution, except
    that it accepts an African origin for modern humans, followed
    presumably by the spread of modern human morphology by gene flow to
    the rest of the species' range (Smith 1989).

    Leslie Aiello (1993), in her review of the various theories presented
    here, states that of the three non-replacement theories, the
    assimilation theory was the least likely. Her reason for this
    assessment is that

    "If gene flow were the main force for the spread of amHs, one would
    expect to find clearer examples of transitional specimens than
    currently recognized in the Levant or Europe."
    She also notes the degree of variability in Late Pleistocene climate
    and ecology and points out that this would be expected to result in
    large-scale population movements. She would argue that these kinds
    of movements are more consistent with the African replacement or
    African hybridization models. However, as reviewed here, the
    rapidity with which characteristics can potentially spread would
    argue that (relatively) rapid morphological change as seen in the
    Levant can be consistent with such a model. Moreover, her point
    about ecological changes only holds if one assumes that such changes
    would bring different populations into greater contact. If one
    visualizes the situation in terms of long-term shifts of
    geographical range, proceeding by demic expansion and contraction,
    then the pattern of fossil evidence in the Levant becomes even more
    plausible as a result of gene flow.
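    The rapidity argument can be given a back-of-the-envelope form.
    Under deterministic selection, the log-odds of an allele's frequency
    grow by ln(1 + s) each generation, so even a modest selection
    coefficient carries an allele from rare to nearly fixed within a few
    hundred generations. The sketch below is my own illustration, not
    from the thesis; the values of s, the starting and ending
    frequencies, and the 25-year generation length are arbitrary
    assumptions.

    ```python
    import math

    def generations_to_spread(p0=0.01, p1=0.99, s=0.02):
        """Deterministic haploid selection: generations for an allele's
        frequency to rise from p0 to p1 given selection coefficient s.
        Uses the fact that the frequency's odds grow by (1 + s) per
        generation, so time = log-odds gap / ln(1 + s)."""
        odds0 = p0 / (1 - p0)
        odds1 = p1 / (1 - p1)
        return math.log(odds1 / odds0) / math.log(1 + s)

    gens = generations_to_spread()
    print(f"about {gens:.0f} generations ({gens * 25:.0f} years at 25 yr/gen)")
    ```

    With the default s = 0.02 this comes to a few hundred generations,
    on the order of ten thousand years: effectively instantaneous
    against the Levantine fossil chronology, which is the sense in which
    gene flow can account for "rapid" morphological change there.
    
    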

    Smith does not, however, emphasize the problematic role of
    evolutionary semantics (see below) in the disputes. Nor does he
    focus on the likely cognitive basis for the evolutionary advantage
    modern humans possessed over their erectus cousins. Rather, Smith
    and his supporters emphasize the advantages modern human body form
    has over previous populations (Aiello 1993).

    It should be clear by now that I favor a description of the
    processes which led to modern humanity that is essentially the same
    as Smith's assimilation theory. The concerns I wish to insert into the
    discussion include (1) what I consider overconfidence regarding our
    understanding of the genetic basis of macroevolutionary change and
    (2) a lack of recognition that the fundamental unresolved issue is
    one of the nature of subspecific evolution and the nomenclature to
    deal with it. That much remains unresolved in this arena is reflected
    in the statement that there is currently no fully consistent and
    coherent theory of speciation in biology (Futuyma 1986); rather,
    there is a collection of partial theories which are still being
    tested. Thus this charge has been voiced by better-qualified
    geneticists than myself (which is to say I make no claim to be a
    geneticist at all).

    In the end, it should be clear that the greatest source of conflict
    within the continuity or replacement debate is an ill-defined idea
    of what we want in the way of an answer to the question "How can we
    best characterize, overall, the Middle-to-Upper Pleistocene
    transition?" Do we want to know simply where the genes behind modern
    human morphological changes came from? It would seem that the likely
    answer to that question, for the most part, is in Sub-Saharan Africa.

    Thus, the question comes down to what we mean when we say something
    evolved. Not all characteristics evolve in the same way, in spite of
    many evolutionary geneticists' standard operating assumption that
    shifting frequencies of single genes alone are behind the process.
    When novel characteristics and/or adaptive complexes arise in one
    portion of a species' geographic range and subsequently come to
    predominate throughout the species' range, where do we want to say
    the evolution really occurred? Do we want to know where the
    selective context for a given character's increase and fixation
    occurred? These are more formidable questions than those above, the
    answers to which depend on whether you consider the spread of a
    characteristic to be hybridization or gene flow. The distinction
    between hybridization and gene flow appears to me to be a false one
    for most of the hypothesized situations discussed here. If one tends
    to favor a (weakened) replacement theory, one would call it
    hybridization between separate species. If one favors a phyletic
    theory, one would call it gene flow between populations of a
    polytypic species. As long as interfertility is not questioned, the
    preferable characterization depends entirely on what one considers
    most important in the resultant species -- continuity or disjunction.

    At this point, I would like to explore one of the consequences of an
    acceptance of my interpretation of the evidence: a revised fossil
    taxonomy which ameliorates some of the inherent conflicts between
    the various schools of thought.

    There are two options. One is to consider recent human evolution to
    be an intraspecific process, and not consider modern humans worthy
    of a new specific name. In this case one is led to a nomenclature
    dominated by Homo erectus. Typical Homo erectus becomes Homo erectus
    erectus. Archaic Homo sapiens becomes perhaps Homo erectus
    neanderthalensis, along with regional variants with their own
    subspecific names. And Homo sapiens sapiens becomes Homo erectus
    sapiens. Alternatively, one could simply grant all erectus specimens
    Homo sapiens status and thus designate them Homo sapiens erectus and
    Homo sapiens neanderthalensis, leading to modern Homo sapiens
    sapiens. This scheme, however, violates normal conventions.

    The other option is to consider recent human evolution to be a case
    of phyletic speciation, and thus grant modern humans new specific
    status while avoiding implications of non-interfertility. This might
    lead to Archaic Homo sapiens being designated Homo erectus sapiens.

    Being something of an iconoclast, I prefer the first option, since
    it accentuates our relationship to our much maligned, beetle-browed
    ancestor who was the first to use fire and survive outside of our
    original tropical environment. Moreover, emphasizing that link more
    clearly bridges the gap between ourselves and the animal world.
    Perhaps thinking of ourselves as Homo erectus sapiens rather than
    Homo sapiens sapiens would mute what seems to be a natural tendency
    toward hubris.

  9. #9

    Post Re: Modern Human Origins: Genetic and cognitive aspects

    Section One of this thesis attempted to bring some badly needed
    order to understanding the Middle-Upper Pleistocene transition. The
    focus of that discussion was the plausibility of alternate accounts
    of the population genetics of the transition. That discussion could
    take place without reference to the particular adaptive character or
    characters which drove the shift. As long as it (or they) were
    heritable and contributed to fitness, the analysis remains valid.

    This section will concern the nature of the adaptation responsible
    for the shift. Put simply, what made modern Homo sapiens, as
    currently defined, better adapted than Middle Paleolithic archaic
    Homo sapiens and Homo erectus (again, as currently defined)?

    In contrast to the phylogenetic disputes, there is an emerging
    consensus regarding the general nature of recent human evolution. This
    consensus sees the modern human condition as the result of the
    evolution of specifically cognitive capacities not present in
    earlier hominids. Most agree that these capacities probably underlie
    language, social complexity, increased dependence on cultural means
    of adaptation, and as part of this, advances in technical ability
    (in the sense of both technology and technique). Most important is
    the recognition that no simple, straightforward causal chain was
    responsible for these changes, but rather a multifaceted process
    involving complex relationships between cognition, culture, and
    environment (Gibson 1991).

    Thus the process that brought about modern humans was neither a
    non-genetic development nor a straightforward anatomical or
    metabolic adaptation. In other words, modern humanity was not
    the end result of an invention, like writing or the bow, nor the
    result of a typical evolutionary innovation, e.g., the development
    of a shelled egg, as in the evolution of the reptiles out of
    amphibian stock. Quite the opposite, the gross anatomical changes
    which identify modern humans (gracile skeletons, higher crural and
    brachial indices, cranial features etc.) were most likely themselves
    the result of increased subsistence efficiency made possible by
    improved cognitive capacities.

    However, notwithstanding agreement over the general nature of recent
    human evolution, there are quite different ideas about the causal
    processes and relative importance of the various factors. As in the
    disputes over replacement vs. multi-regional evolution, this range
    of opinion seems to stem largely from the range of disciplines
    addressing the questions. Archaeologists, developmental
    psychologists accustomed to dealing with people, linguists, and
    cognitive psychologists (some dealing mostly with humans, others
    with a comparative evolutionary approach) all are currently at work
    on this period of prehistory. The room for disagreement also stems
    from the seemingly profound 'gap' between the abilities of our
    closest phylogenetic relatives and ourselves. (Noting this gap,
    however, should not imply a judgment about whether the differences
    are in kind or degree.) We should perhaps be thankful that our
    closest extant phylogenetic relative is not even more dissimilar to
    us than Pan sp.

    This section of the thesis will attempt to formulate a somewhat more
    synthetic and, I hope, plausible account of the cognitive
    capabilities involved in modern human evolution than the other
    accounts to be reviewed here.

    While it may seem extremely speculative to attempt an archaeological
    investigation of cognitive changes which took place 200,000 years
    ago, one might think about an aphorism penned by paleoanthropologist
    Glynn Isaac: If one is interested in ancient treetops but the forest
    has been cut down, leaving only the roots, "a mole may not be such
    an inappropriate consultant" (Isaac 1976:275).

    The interpretation of human cognitive evolution is made difficult
    less by a dearth of archaeological evidence (and specifically of
    archaeological theory to interpret it) than by the lack of a clearly
    applicable theory of mind with which to approach the products of
    archaeological investigation. Some scholars (e.g., Marvin Minsky,
    John Tooby, Leda Cosmides, Steven Pinker) hold that most, if not
    all, aspects of the modern human mind are the product of fairly
    narrow, domain-specific processing elements (i.e., "autonomous
    computational modules"; Pinker and Bloom 1992:451), all of which
    they contend are specifically evolved to solve a particular adaptive
    problem. Other scholars hold that a great many mental processes,
    e.g., Fodor's "central processes," have, so far, remained opaque to
    experimental investigation because of their broad, non-domain-
    specific, global character. These may very well be evolved
    capacities as well, but their nature makes linking them to adaptive
    contexts problematic. Beyond that, not even all the scholars who
    advocate 'modular mind' theories apply their theories within an
    explicitly evolutionary framework (e.g., Minsky, Fodor).

    That what I am attempting here is not premature is, I hope,
    demonstrated by the efforts of two scholars who have developed
    elaborate evolutionary scenarios of the origin and structure of the
    modern human mind. The first is Merlin Donald (1991), who I believe
    makes a good case for a relatively recent, two-step cognitive
    evolutionary process which includes a uniquely human pre- (or at
    least proto-) linguistic stage. The second is Derek Bickerton (e.g.,
    1981,1990) whose theories extend in greater detail (and I will
    argue, with greater accuracy) than Donald's into the probable
    characteristics of proto-language and proto-linguistic cognition.

    My analysis here will, for the most part, be a response to and
    extension of their theories, using theoretical tools supplied by
    Jerry Fodor (1983), Paul Rozin (1976), and others (e.g., Alan Leslie
    1987). At the same time, efforts will be made to amplify or
    illustrate points about cognitive function with small, but healthy,
    doses of archaeological theory and data.

    At the outset, though, a few points need to be addressed. First, a
    note must be made about independently operating arenas of
    evolutionary change. Clearly, anatomical change is evident in the
    human paleontological record. Throughout the later Pleistocene,
    there is an ongoing emergence of gracile postcranial anatomy, a
    thinner, high, rounded cranium, and reduced prognathism, along with
    a host of minor details. It is on the basis of this anatomical
    change that specimens have been designated as erectus, Archaic
    sapiens, Neanderthal, or anatomically modern sapiens.

    Next, cognitive change is evident from an analysis of the
    archaeological evidence. Strictly speaking, such change is of a type
    with the anatomical changes mentioned above, since both are the
    product of evolutionary processes, i.e., cognition is a result of
    neural anatomy and/or physiology. But methodologically they must be
    considered separately since, for the most part, the lines of
    evidence involved are so different. One learns about changes in limb
    proportions by studying bones, one learns about the capacity for
    long-term planning from the regional distribution of various kinds
    of tool debitage.

    Lastly, room must be left for the effects of social and cultural
    change, i.e., change not directly driven by evolutionary processes at all.
    Comparing early Upper Paleolithic material culture to modern
    material culture reveals the minimum range of cultural variation in
    modern humans. In their eager efforts to come up with answers, many
    scholars seem to have forgotten this factor, leading to instances
    where their arguments, if extended logically, would classify some
    materially simpler extant cultures as pre-human!

    In looking at and attempting to explain evidence, these three areas
    must be granted independent status. Not every specimen with modern
    anatomy necessarily possessed modern cognitive potential.
    Conversely, specimens lacking modern anatomy may very well have
    possessed modern cognitive potential, at least as indicated by
    technology. Furthermore, modern anatomy and cognitive potential do
    not guarantee the cultural or environmental adaptive context for
    complex lithic technology or subsistence techniques. While Western
    Australian aborigines use only three not-very-distinct tool types
    not significantly different from Middle Paleolithic technology, any
    of their children certainly have the potential to be NASA engineers.

    While recognizing these points may seem to merely provide a
    boundless space for speculation, unless we grant the problem its
    true complexity we can be doubly assured that no good answers will
    be found. Failing to grant independent status to these areas has
    resulted in ill-conceived explanations for human emergence.

    A second point to be made is that care must be taken to be sure that
    at each and every step along an evolutionary scenario, the
    characteristics of the organism form a coherent adaptive system. In
    their efforts to bridge from apes to modern humans, scholars
    sometimes forget that the species in between were long-lasting and
    extremely successful. At times the putative intermediate steps
    result in speculative hominids with characteristics which are only
    slightly plausible as a coherent adaptive complex.

    Setting the Stage
    The scenario to be presented here begins with several unargued
    positions which are nevertheless subject to ongoing dispute within
    cognitive science. Rather than attempting to trace those disputes, I
    will attempt merely to accurately place this discussion in a
    particular theoretical space. Thus it seems prudent to simply say
    that to whatever degree one disagrees with these premises, my
    arguments will lose their persuasiveness.

    This scenario falls soundly within a cognitivist, information-
    processing conception of psychology. Moreover, it assumes, to a very
    large degree, that the capacities to be explained are the result of
    evolution by natural selection. Those capacities not directly
    evolved are understood either as indirect consequences of such
    selection, the result of pre-existent constraints, or of
    straightforward physical necessity. At the same time, the argument
    to be presented is intended to stand as a cautionary tale about the
    care with which characteristics to be explained are defined. Not
    every phenomenon which we perceive and label as a unitary
    characteristic merits efforts at Darwinian explanation (Gould and
    Lewontin 1979). Moreover, extreme care must be taken to grant the
    entire problem its full difficulty. Those abilities which are
    subjectively the most effortless, e.g., perception of three-
    dimensional space, phonological analysis of auditory language input,
    etc., have already demonstrated that they are the product of
    devilishly complex cognitive processing.

    Beyond that, this scenario rejects the notion that human cognition
    is an unstructured collection of evolved cognitive modules. The
    overall function of any complex cognitive system, such as we are
    considering here, is better characterized by the nature of its
    integration at higher (i.e., more abstract) levels of representation
    than by the configuration of any particular content-
    specific 'module.' For example, Tooby and Cosmides' (1992:19-136)
    argument that human cognition is such a collection of evolved
    content-specific processing mechanisms is either quite misleading or
    overly ambitious. Granted, the long-standing idea that a completely
    domain-general learning faculty could account for all reasoning
    deserves the re-examination they urge; a completely content-
    independent problem-solving faculty would, as they argue, be good at
    no tasks by attempting to be appropriate for any task.

    But content-specificity, in the context of human cognition, is
    necessarily more a matter of degree than an either-or issue, since as
    information becomes conceptualized it ceases to belong to any
    particular sensory modality, and may eventually cease to belong to
    its original domain, however defined. There is a good deal more
    going on between perceptual input and motor output than is currently
    taken fully into account, namely, the middle part. Fodor's (1983)
    observation that little successful research has been accomplished on
    such 'central processes', e.g., analogical reasoning, is telling.
    Likewise, there is a great deal of interesting conceptual territory
    between the idea of a monolithic 'learning' faculty, i.e., general
    process learning theory, and an unstructured collection of
    informationally encapsulated, opaque modules. And cognitive
    psychology, particularly the evolutionary variety, should not frame
    its discussions as a choice between the two. These ideas will be
    taken up in context later.

    Merlin Donald & Origins of the Modern Mind
    This brief review is not intended to do justice to Donald's overall
    thesis. For the sake of brevity, only the points relevant to this
    thesis will be discussed.

    Donald (1991) traces a plausible path of cognitive development from
    a level equivalent to our closest extant ape relative to our current
    capabilities. Most importantly, he stresses the adaptive coherence
    of each step. He begins with ape cognition, which he sees as having
    a representational strategy based on 'episodic' memory. By episodic,
    he means that representations are limited to encoding concrete
    situations as perceived by the individual. Because apes are
    sensitive to the social aspects of events, these aspects are
    likewise encoded with episodic representations. Out of episodic
    cognition, Donald paints a picture of 'episodic culture,' i.e., that
    which characterizes apes.

    Donald's next stage is 'mimetic culture' which is identified with
    Homo erectus. Mimetic representation is the result of the
    combination of episodic representation plus a highly developed,
    extended representation of self. This 'mimetic controller' allows
    the episodic modeling of events but with the self included in the
    representation. He sees the addition of a self-image enabling a much
    more powerful and flexible system of representation with profound
    social consequences. It would presumably make possible a number of
    capacities, including voluntary, auto-cued rehearsal and a
    generative, recursive capacity for mime.

    It is important to note that for Donald, mimetic culture is possible
    entirely separate from any linguistic vocal communication at all. He
    posits an essentially apelike vocal communication system during the
    relevant period, although he entertains the notion that expanded
    voluntary control of facial musculature (as a result of mimesis)
    could have contributed to improved vocal control. He likewise
    discusses the possibility that aspects of modern human communication
    like rhythm and prosody may have their roots at the mimetic stage of
    hominid cognitive development.

    Donald's final stage is 'mythic culture,' that characterizing
    biologically modern humanity. Mythic representation is the result of
    the evolution of a 'linguistic controller' encompassing the various
    aspects of linguistic processing, from phonological processing to
    articulatory motor control. In his scheme, linguistic representation
    lies behind modern human reasoning ability, since semantic and
    propositional information is, he implies, encoded only linguistically.

    Integral to Donald's thesis is the case of Brother John (Lecours and
    Joanette 1980), one so rich with inferences that it is worth
    recounting in some detail. Brother John is a 50-year-old paroxysmal
    aphasic who works as an editor for his religious order. During one
    of his long attacks, he would progress from global aphasia (complete
    loss of language processing) through various stages of linguistic
    impairment until completely recovered. Throughout the process, he
    remains conscious, unconfused, able to perform various mundane
    tasks, able to remember his experiences, and able to systematically
    cope with his condition.

    During one particular episode, Brother John was traveling by train
    in a foreign country. He was able to calmly and systematically deal
    with his inability to speak or understand written or verbal
    language. Most importantly, he demonstrated that his complete array
    of practical knowledge was intact, all without the benefit of even
    inner speech. Because Brother John's episodes occur fairly regularly
    but are temporary, they have provided a rare opportunity to study the
    phenomenon in detail and to confirm anecdotal accounts of his
    abilities during an episode. Beyond what his case suggests about
    the 'oneness' of the bestiary of linguistic aphasias, his case
    appears to in fact allow us to study human cognition without
    language at all.

    Donald does in fact equate Brother John during one of his attacks to
    the human mimetic stage (1991:253). Donald's main contribution to
    this discussion is thus the compelling notion that the semiotic,
    mythic layer of modern human culture which is dependent on language
    is distinct from (and lies on top of) a fully-functioning, coherent
    cultural layer with many distinctly human characteristics. These
    characteristics include a wide range of species-wide social
    assumptions, schemas and modes of nonverbal communication. In other
    words, the human mind minus language does not simply leave an
    apelike mind. This idea, rather commonsensical when considered
    carefully, stands in opposition to a great many scholars who focus
    on human language as the evolutionary explanation for all uniquely
    human cognitive and cultural ability (e.g., Noble & Davidson 1991).

    A related contribution is his very well made point that a conception
    of language which omits extralinguistic reference is incomplete.
    What is interesting about language (and quite crucial) is that it is
    not simply a coherent system of phonological rules, syntactic rules,
    and semantic relationships between words. What is at issue is the
    relationship between that system and the representational models of
    the world which make the system meaningful. Some scholars argue that
    the distinction is false and that the linguistic system is the model
    (e.g., Donald 1991:253). Other scholars argue for a non-linguistic
    'mentalese' (Bickerton 1990; Fodor 1979, 1983; and this thesis),
    i.e., a representational system which underlies linguistic expression.

    Contra Donald: Antiquity of articulated speech
    One aspect of Donald's otherwise engaging theory (at least as it
    relates to the period discussed in this thesis) which deserves
    challenge is the notion that H. erectus lacked articulated, meaning-
    bearing language. He argues that erectus, in terms of communicative
    capacity, was limited to "rudimentary song," which would have been
    intentional, i.e., under voluntary, conscious control, but which
    would have been limited to emotional displays alone.

    However, several lines of evidence, some even offered by Donald in
    other contexts, argue that such a limited capacity is too little for
    erectus, though perhaps not for their immediate predecessor, Homo habilis.

    The first line of evidence is that of basicranial anatomy and the
    inferences that can be drawn from it concerning the range of
    vocalization and the degree of fine control over that vocalization.
    Analysis of Homo erectus' basicranial morphology has shown that
    these hominids had already diverged from the 'standard-plan'
    mammalian upper respiratory tract (Laitman 1984:307-8; Lieberman
    1991:74; Duchin 1990:694). This is very significant because as soon
    as the larynx descends into the neck at all, it places the entrance
    to the trachea below the back of the throat. Food can then
    inadvertently choke the individual if swallowing is not carefully
    orchestrated. In the 'standard-plan' mammalian tract the larynx is
    habitually kept locked into the nasal cavity, making choking
    virtually impossible. In fact, most animals can drink and breathe at
    the same time, as can human infants under 3 months of age (Lieberman
    1989:397). Not only are all extant primates completely 'standard-
    plan' mammals with respect to their upper respiratory tract, but all
    the Australopithecines had this configuration, even though they also
    were bipedal (this is mentioned to counter charges that the
    divergence occurred because of upright stance). The only plausible
    purpose a lowered larynx serves is to allow greater control over the
    volume of the pharyngeal cavity in order to permit voluntary
    articulated speech.

    This quite dangerous divergence from a very safe mammal-wide pattern
    indicates that articulated vocal communication was already a vital
    part of Homo erectus' adaptive complex. This suggests that this
    process probably began with Homo habilis, the logic being that it
    seems unlikely that a complex novel character could become a
    primary, crucial adaptive system in one step. So, if some wish to
    argue that language is a unitary phenomenon, then they should find
    themselves arguing that early erectus (e.g., KNM-ER 3733) possessed
    fully human language 1.5 million years ago. For a number of reasons,
    this is an unattractive option, since it would make the subsequent
    transition from H. erectus to H. sapiens more difficult to make
    sense of. Clearly some intermediate stage(s) appears reasonable.

    A confounding aspect to the dispute over the interpretation of
    basicranial evidence comes from the seeming insistence by some
    (e.g., Lieberman 1984) that only the full capacity for a modern
    range of vowel articulation should count as evidence for linguistic
    speech. This is contradicted by the fact that toddlers are able to
    talk understandably when their basicranial flexure is identical to
    that of Lieberman's mute Neanderthals! Further skepticism regarding
    his claim of limited Neanderthal speech has been generated by the
    discovery of a Neanderthal hyoid bone indicating a fully modern
    larynx (Arensburg 1991; but see Lieberman 1993). Beyond that, it
    seems far more crucial (as I have stated above) to focus on the
    point in time at which choking becomes possible rather than the
    point at which full vowel range becomes possible.

    The second line of evidence that argues against Donald's view of
    erectus' linguistic and cultural capacity is Brother John himself.
    As noted above, Donald emphasizes Brother John's (and others who
    have been denied any systematic language) ability to function at a
    fully human level sans language. His point in offering this evidence
    is to suggest that human cognition is not completely dependent on
    language and that a coherent, functional culture prior to the
    evolution of articulated language is plausible -- a point which is
    indeed well supported by that evidence.

    But here let me place a different emphasis on Brother John's (and
    others) completely human ability minus language: Without language
    they are indeed fully human, modern humans, in their cognitive
    capacities. They do not find themselves limited to a level of
    function equivalent to Donald's mimetic culture. They are not
    limited to constructing representations out of the product of
    episodic memory and self-image. They appear fully able to construct
    representations formally equivalent to Donald's mythic culture, even
    though they presumably lack any linguistic hooks on which to hang
    their concepts. In other words, they clearly do not lack the ability to
    conceptualize and freely (i.e., generatively) manipulate those
    concepts. Moreover, the very fact that they can do so without
    language, strongly argues for the idea (which will be expanded upon
    below) that those skills evolved independently, and possibly prior
    to, complex language.

    A second interpretation of the evidence of Brother John's
    performance is equally plausible, and has been argued by Donald's
    colleagues, most notably Derek Bickerton (Bickerton 1993, and see
    below). He questions the distinction Donald makes between symbolic
    thought (which Brother John retains), and linguistic competence
    (which Brother John loses). Bickerton asks why Donald presents
    Brother John in seizure as a "man without language" rather than
    merely "a man without full use of language." This objection begs the
    question of precisely the point Donald is attempting to make, namely
    the role language plays in human cognition.

    In any case, however, H. erectus' cognitive capacities were not
    equivalent to those of modern humans lacking speech. Rather, the
    archaeological evidence suggests that their deficits relative to
    modern humans were specifically cognitive rather than linguistic per
    se, i.e., their inferiority goes far beyond what might be attributed
    merely to impaired language-enabled social organization.
    Donald's insistence on articulated, complex language as a unitary
    phenomenon forces him to suspend the granting of that ability to any
    hominids prior to modern humans, since to grant it prior to then
    leaves the question open of why the hominids so endowed failed to
    exhibit achievements equivalent to ours. I believe that Donald is
    entirely correct about the existence of mimetic culture without
    articulated language. I merely disagree about the species to which
    he assigns that stage! I believe that Homo habilis or its immediate
    predecessor in the human lineage corresponds closely to Donald's
    mimetic cultural stage. One reason Donald felt compelled to assign
    that stage to erectus appears to be his acceptance of Lieberman's
    (1984) interpretation of shifts in basicranial flexure. Another is
    his acceptance of the notion that the evolution of language was
    largely an issue of the input and output algorithms rather than
    based on a purely abstract cognitive capacity. Both interpretations
    are justifiably in dispute.

    Derek Bickerton & Language and Species
    Again, as with Donald, this discussion will focus only on those
    aspects of Bickerton's work directly relevant to this thesis. Since
    his theories are more integral to the scenario being presented here,
    he will be discussed in greater detail.

    Bickerton makes a very strong case for the existence of two
    separable aspects to modern human language, a lexicon-based proto-
    language and a syntax-based true language; and goes on to attribute
    these two stages to Homo erectus and Homo sapiens respectively.

    Part of Bickerton's persuasive argument is his review of universal
    phrase structure, also known as X-bar theory (Chomsky; Bickerton
    1990; Jackendoff 1977; Stowell 1981) which, by way of showing how
    fiendishly complex syntactic systems are, demonstrates how
    structureless proto-linguistic utterances are. Protolanguage in this case
    includes ape language, under-two-years human language, pidgins, and
    to some degree the output of some Broca's aphasics.

    Bickerton's division of language capacity is supported by the
    existence of agrammatism, i.e., the separability of referential
    language and syntax in some aphasics. Individuals with particular
    kinds of brain damage can lose all syntactical structure to their
    speech, and suffer impairment of the ability to comprehend complex
    phrases, while their simple two-word associative linguistic ability
    remains intact. This phenomenon occurs similarly even in individuals
    whose language marks syntactic relationships differently, e.g., with
    morphemes (suffixes, prefixes, infixes, etc.) rather than word order.

    Separate specific pathologies (grammar vs. the rest of linguistic
    ability) strongly suggest functionally and evolutionarily distinct
    capacities, regardless of how integral they are in normal function.

    Even better evidence of this division is to be found by way of
    analysis of the language of Genie, the girl discovered in Los
    Angeles who was isolated from language until late childhood
    (Bickerton 1990). Though otherwise of normal intelligence, Genie,
    even after years of tutelage, never acquired spontaneous grammar.
    Although she eventually became quite communicative, her utterances
    typically lacked all inflection for tense, and she never utilized
    complex phrase structure. Thematic roles were never grammatically
    marked. She certainly understood tense, i.e., under direct pressure
    of inquiry she could make distinctions about when a given event
    occurred. But such distinctions were never spontaneously made in
    normal speech. Bickerton finds similar chronic deficits in adult-
    acquired pidgins.

    This evidence supports the view that the acquisition of complex
    syntax must occur during a classic 'critical period' during
    development. That Genie had less difficulty adopting and expanding
    her lexicon-based referential language use suggests to Bickerton
    that the cognitive skills which support that ability are
    evolutionarily more ancient than those which support high-speed
    syntactic processing. This conclusion is based on the assumption
    that evolutionarily more ancient cognitive capacities will be both
    more resistant to disruption by environmental factors and more
    forgiving with regard to critical periods of acquisition.

    Bickerton conceives of language (and protolanguage) as a Secondary
    Representational System (SRS). Primary Representational Systems are
    simply those which all animal species possess. They can be thought
    of as the way that a given species' sensory processing 'divides' the
    totality of the perceivable world. They include concepts like those
    demonstrated by pigeons (Herrnstein 1964). Their limitation lies in
    the fact that any manipulation to be performed on primary
    representations must deal with them in their full sensory complexity.

    Secondary representational systems transform the output of primary
    representation systems into 'models of the world' which can, to
    various degrees, themselves be the object of manipulation. The
    example Bickerton uses for such systems is money, which allows the
    manipulation of value without the burden of carrying all the
    merchandise it may represent. Elements of a secondary
    representational system constitute a sort of mental shorthand,
    making complex calculations easier.

    Bickerton is convinced that in the origin of both protolanguage
    (with erectus) and syntax (with sapiens), that the linguistic
    capacity itself functioned as a secondary representational system
    (SRS). In other words, he argues that without syntactic language,
    modern humans could not reason, plan, predict, entertain
    counterfactuals, or learn the way they do. He identifies language
    with conceptual reasoning. It certainly simplifies matters to
    theorize in this manner, but Brother John, deaf-mutes, and a number
    of other lines of evidence suggest matters are more complex. One
    example is that although Genie never uses tense spontaneously in
    linguistic utterances, she nevertheless, when pressed, is able to
    demonstrate that she clearly understands what tense means, i.e., she
    has a perfectly human conception of past, present, and future, and
    the flow of time.

    It appears more likely that a conceptual ability underlies the
    categorization and binary distinctions characteristic of syntactic
    processing, but that this ability is distinct from the use to which
    it is put by linguistic processing. This notion will be expanded
    upon below.

    A recurring problem in these discussions so far is one of
    determining what, exactly, we mean when we talk about 'language.'
    If, as Bickerton urges, we attribute symbolic and conceptual ability
    to 'linguistic' capacities, even when language (in terms of
    communicative ability or inner speech, e.g., Brother John or deaf-
    mutes) is absent, then much of the problem is indeed solved. This
    leaves only the slight semantic dilemma of making the distinction
    between this capacity and the communicative uses to which it can be
    put under the right circumstances.

    If, on the other hand, we accept Donald's idea of symbolic or
    conceptual thought in the absence of structured communication, then
    we have, strictly speaking, a non-linguistic foundation to human
    cognitive ability. To me it seems that deciding between the two ways
    of discussing the subject may hinge on the evolutionary history of
    the capacity. If, historically, symbolic/conceptual thought is
    coextensive with the history of linguistic communication, then
    Bickerton's description is unproblematic.

    Elements of a theoretical toolbox
    In an unassuming and brief paper, Paul Rozin (1976) adds a simple
    but powerful notion to a theoretical approach to the evolution of
    cognition. Beginning with the cognitive-adaptationist idea that many
    information-processing skills in the natural world are hard-wired,
    domain-specific, 'algorithms' evolved to solve a particular species-
    specific adaptive problem, Rozin goes on to suggest that many of the
    problem solving-abilities of more complex animals are the result of
    the evolution of wider 'access' to such pre-existent
    narrow 'algorithms' rather than novel evolved cognitive capacities.
    This point is particularly compelling since it mirrors similar
    principles in the theory of purely anatomical and physiological
    evolution, i.e., evolution is conservative, and is far more likely
    to use a more-or-less ill-equipped item which already exists than to
    develop anything de novo.

    Also quite interesting is his speculation that many of the classic
    learning paradigms, rather than reflecting generalized learning
    capacities, may themselves be previously inaccessible adaptive
    specializations (though evolutionarily very old). He notes that the
    phenomenon he is describing is similar to transfer except that,
    rather than the material being referenced having been learned from
    the outside, it is native to the mind. This idea informs my own
    conviction, stated at the outset of this section, that the
    dichotomization of general process learning theory and modular
    theories of cognition is unwise and unnecessary. Rather a
    hierarchical continuum from very domain-specific to quite broad
    learning paradigms is more accurate, with the concept of 'conscious
    access' playing the integrative role between levels.

    Alan Leslie (1987) adds a similarly simple but profound notion to
    the theoretical toolbox with his elaboration of a
    metarepresentational theory of pretense, e.g., pretend play in
    children. He begins by recognizing the problem of 'representational
    abuse.' Note that the kind of pretense Leslie is discussing is a
    particular kind: when a child pretends that one thing is something
    else (a banana is a telephone), pretends that something has
    properties it does not (an empty cup is full), or pretends that
    something exists that does not (imaginary objects). Representational
    abuse is simplest to describe using the first example, but applies
    to each kind of pretense. When a child substitutes one
    concept, 'banana,' for another, 'telephone,' unless both concepts
    are somehow protected their coherence as concepts will inevitably
    begin to disintegrate. In other words, the child will begin to lose
    the distinctions between each object and even become confused about
    what its concepts represent. That this does not occur argues that
    children do not use primary representations when engaging in such pretense.

    The key point to be drawn from Leslie is the idea that there appears
    to be a need for an 'algorithm' which serves to 'raise' concepts,
    i.e., put them in quotation marks, to protect them from
    representational abuse during subsequent manipulation. They are then
    metarepresentations: decoupled copies of a primary representation.
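    Leslie's decoupling step can be made concrete with a toy sketch (a
    minimal illustration, not from the thesis; the class and field names
    here are invented for the example): the primary representation is
    copied before any pretend-play substitution, so the original concept
    survives the 'abuse' intact.

    ```python
    import copy

    # Toy model of Leslie's 'decoupler': a primary representation is
    # copied ("raised" into quotation marks) before pretend-play
    # manipulation, so the original concept is never corrupted.
    # All names here are illustrative assumptions, not the thesis's terms.

    class Concept:
        def __init__(self, name, properties):
            self.name = name
            self.properties = dict(properties)

    def decouple(primary):
        """Return a metarepresentation: a decoupled copy of a primary concept."""
        meta = copy.deepcopy(primary)
        meta.pretend = True          # marks the copy as 'in quotation marks'
        return meta

    banana = Concept("banana", {"shape": "curved", "edible": True})

    # Pretend the banana is a telephone: mutate only the decoupled copy.
    pretend_banana = decouple(banana)
    pretend_banana.properties["function"] = "telephone"

    # The primary representation is untouched by the pretense.
    print("function" in banana.properties)          # False: primary intact
    print(pretend_banana.properties["function"])    # telephone
    ```

    The deep copy is the whole point: without it, the pretend 'telephone'
    property would leak back into the child's primary 'banana' concept.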

    Leslie goes beyond simply noting the necessity of such a process to
    argue that it is not incidental to normal development or simply an
    aspect of the development in the understanding of objects and
    events. He suggests that it is an early manifestation of theories of
    mind, i.e., the ability of a person to impute mental states to self
    and to others and to predict and explain behavior on the basis of
    such states. In other words, the ability to create models of other
    minds begins with the child's ability to create a metarepresentation
    of her own mind, and subsequently alter that metarepresentational
    model. This permits the child to recognize that the knowledge
    available to herself may be different from that available to another.

    The development of such ability is impossible to attribute to
    induction, i.e., perceptual evidence alone would hardly force one to
    imagine mental states in others. Nor is language a plausible
    mechanism for the ability (Leslie 1987:422). In any case, by four
    years of age a child is able to impute false belief in others, i.e.,
    recognize that someone else may be mistaken about a state of affairs
    that the child is correctly aware of, and accurately predict what
    others will do based on their false belief. Leslie attributes their
    ability to perform these calculations to the development of
    the 'decoupler mechanism' which operates on primary representations.

    Human, and to some extent non-human primate, social cognition lies in
    the capacity to form 'theories of mind.' The essence of this, as
    formulated by several researchers (Premack 1983; Premack and
    Woodruff 1978; Savage-Rumbaugh 1993) lies in the ability to
    create 'models' of other minds, i.e., their desires, fears,
    knowledge, temperament, predilections, etc., and 'run' those models
    under different conditions in order to predict outcomes.

    This capacity puts a premium on maintaining multiple abstract
    factors in mind and manipulating them simultaneously. A prerequisite
    for such processing lies in the existence of a Secondary
    Representational System (a la Bickerton). In other words, one cannot
    build models of other minds without first using a secondary
    representation of one's own mind as a template (Leslie 1987).

    Alexander (1989) elaborates a theory of modern human evolution in
    which the evolution of this ability is driven by "runaway" social
    competition. Alexander emphasizes the importance of play to the
    development and efficient use of 'scenario-building' ability, which
    accords well with Leslie's analysis of play-based pretense. That
    such cognitive skills are distinct and powerful is illustrated by
    tests in which various kinds of logical tests were presented to
    subjects (Tooby & Cosmides 1992; the Wason selection task). In
    particular, one problem tested the ability of the subjects to make
    logical distinctions of the 'If A, then B' variety. It was discovered that
    their ability to answer correctly could be made to hinge solely on
    whether or not the problem was presented as a social situation,
    namely 'checking for cheaters', or a non-social situation. What this
    means is that rather than being good at logic, humans are abysmally
    bad at logic, unless the problem can be 'analogized' to one we have
    evolved specific algorithms to solve. Or, placed in complementary
    terms, formal human abstract problem-solving is particularly biased
    toward the types of logical problems presented by social dilemmas.
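    The formal structure of the selection task can be sketched briefly
    (a minimal illustration using the task's classic vowel/even-number
    framing; the function name is an invention for this example): for a
    rule 'if P then Q', only cards showing P or showing not-Q can
    falsify the rule, yet subjects reliably find this easy only in the
    'cheater-detection' framing.

    ```python
    # Minimal sketch of the logic behind the Wason selection task.
    # Rule: "if a card shows a vowel (P), its other side shows an even
    # number (Q)." Only a visible P (hidden side might lack Q) or a
    # visible not-Q (hidden side might show P) can falsify the rule.

    def cards_to_turn(cards):
        """Cards whose hidden side could falsify 'if vowel then even'."""
        selected = []
        for c in cards:
            if c.isalpha() and c in "AEIOU":       # shows P
                selected.append(c)
            elif c.isdigit() and int(c) % 2 != 0:  # shows not-Q
                selected.append(c)
        return selected

    # The classic four-card layout: most subjects wrongly pick A and 4;
    # the logically required picks are A and 7.
    print(cards_to_turn(["A", "K", "4", "7"]))  # ['A', '7']
    ```

    The identical logic, reframed socially ('if drinking beer, then over
    the legal age: check the beer-drinker and the underage patron'), is
    solved effortlessly, which is the dissociation Tooby and Cosmides
    emphasize.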

    This accords well with Rozin's theory that for various kinds of
    problems, if we can access our inbuilt cognitive unconscious, we can
    perform more 'intelligently' than otherwise.

    Modular Hypotheses and Fodor's Central Processor
    Jerry Fodor (1979, 1983) has synthesized much of the theoretical work
    within the cognitivist psychology tradition. Rozin and Leslie's
    contributions accord well with Fodor's explorations of the
    characteristics of mental 'modules,' even though Fodor does not
    appear to typically frame his theories in evolutionary terms.

    Jackendoff (1987: 268-270), while discussing the relationship of his
    cognitive theories to Fodor's (e.g., 1983), points out some of the
    difficulties that arise in too-strictly modular hypotheses. For
    example, Fodor's conception of a language module leads one to ask
    how any particular language can be learned or, put another way, how
    natural languages can differ at all. It also conflicts with the
    evidence mentioned above regarding language acquisition in deaf
    children, who use none of the prepared phonological systems.
    Likewise, Jackendoff's analysis of musical cognition (1987:213-245;
    see Lerdahl and Jackendoff 1983) might lead one to ask why we have
    musical faculties. This difficulty becomes even more prominent when
    the investigation of cognitive function is placed in an evolutionary
    framework, e.g., what crucial evolutionary function could musical
    talent play?

    Finally, all these issues lose some of their relevance with the
    realization that any overlearned cognitive process (playing the
    bassoon, playing chess, typing, driving, reading, etc.) ends up
    sharing many characteristics of Fodor's modules. But it does not
    then follow that all these capacities are evolved. Rather, in each
    case, a 'capacity to form capacities' has been employed, and it is
    this ability which has presumably evolved. This does not necessarily
    leave us back with general process learning theory. Rather, the
    putative 'capacity to form capacities' is itself constrained in the
    kinds of tasks to which it can effectively be put. This hierarchical
    scheme is precisely the point, referred to above, that human
    cognition is neither entirely domain-general nor blindly domain-
    specific. Sorting out which capacities particular 'algorithms'
    evolved to facilitate from those that are merely side-effects is
    the fundamental challenge of an evolutionary cognitive psychology.

    In any case, it appears that it is difficult to account for modern
    human cognitive capacities purely by means of straightforward inter-
    domain access, i.e., an unstructured collection of
    evolved 'modules'. In other words, one can build extremely
    interesting capacities by linking domains, e.g., integrating a
    proprioreceptive self-image and visuo-spatial processing to allow
    sophisticated hand-body-eye coordination, but these kinds of
    developments show little promise of describing the more impressive
    aspects of human problem-solving ability. Some role for a global,
    abstract, unifying 'space' in which general problem solving occurs
    is indicated, e.g., Fodor's 'central processor'. But, as noted, this
    is a far cry from arguing that any and all problems can be solved
    equally easily. This is a crucial mistake that has been well-
    criticized by proponents of specific 'cognitive adaptations' (see
    Fodor 1983). However, I would argue that such critics, in their
    effort to 'build' more complex cognition out of simpler parts, are
    advocating an overly concrete conception of the role such building-
    blocks may play.

    Fodor argues that the nature and characteristics of global domain-
    insensitive, horizontal faculties are little understood because
    experimentation is so difficult. One characteristic of what we pre-
    theoretically call problem-solving and thought is that it is
    analogical. Saying this, Fodor goes on to note that there is little
    experimental work to define precisely what analogical reasoning is,
    but that this hardly makes it irrelevant.

    In response to these dilemmas, I suggest a return to Rozin's
    implicit notion of a 'cognitive consciousness', i.e., a global, non-
    domain specific, reasoning 'space,' which accords well with Fodor's
    and Leslie's explicitly discussed 'central processor.' Placed in
    this perspective, what becomes obvious about a putative 'central
    processor' is that, quite the opposite of being a 'general' problem-
    solver, it has only the 'toolbox' of that portion of the organism's
    various modules which are consciously, or at least implicitly,
    accessible (a la Rozin). This being the case, the central processor
    begins to appear domain specific in the sense that it is only able
    to effectively solve problems which can be approached analogically
    to one of its already accessible 'tricks.' Thus any 'intelligent'
    species has a potentially global problem-solving ability, but this
    ability is highly constrained by its inherited evolved modules!

    Strict cognitive-adaptationists make a mistake when they say that
    anything which we do easily and intuitively must have an evolved
    module behind it. To say this makes the mistake of treating all
    cognitive tools as if they were part of a Primary Representational
    System. As soon as a Secondary Representational System, i.e., a
    meta-representational system, is in place, the organism is able
    to 'extend' concepts from one realm into another.

    Pinker (1995) discusses the possibility of there being innate
    modules evolved to provide intuitions about biology because there
    are universals of plant and animal classification which reflect
    similar judgments of similarity and difference. If, as Bickerton
    argues, much of X-bar syntax relies on a mechanism of hierarchical
    classification, then why should not this mechanism be extensible to
    other realms? Note that the mechanism presumably evolved (in this
    example) to allow high-speed language encoding and decoding. That it
    may prove useful and adaptive for other cognitive purposes may be
    entirely incidental.

    If, as I argue elsewhere, there is a 'space' in which (more or less)
    domain-general reasoning occurs, and this faculty can access
    previously domain-specific modules and 'extend' their applicability,
    then much of human ability can be boiled down to a handful of likely
    evolved capacities, e.g., hierarchical classification, entity +
    characteristic types of conceptualization, meta-conceptualization
    (e.g., parent-child, friend-enemy), etc.
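    The 'extension' of hierarchical classification across realms can be
    illustrated with a toy sketch (the data and function are invented
    for this example): one and the same nested-tree mechanism serves a
    folk-biological taxonomy and an X-bar style phrase tree.

    ```python
    # Toy illustration of a single hierarchical-classification mechanism
    # applied across domains: the same (label, children) tree structure
    # and the same containment query work for a taxonomy and for a
    # phrase-structure tree alike. All data here is illustrative.

    def contains(tree, label):
        """True if `label` appears anywhere in a (label, children) tree."""
        node, children = tree
        return node == label or any(contains(c, label) for c in children)

    # Domain 1: a folk-biological taxonomy.
    taxonomy = ("animal", [("bird", [("robin", []), ("owl", [])]),
                           ("fish", [("trout", [])])])

    # Domain 2: a phrase-structure tree for a simple sentence.
    phrase = ("S", [("NP", [("Det", []), ("N", [])]),
                    ("VP", [("V", [])])])

    print(contains(taxonomy, "owl"))  # True
    print(contains(phrase, "VP"))     # True
    ```

    Nothing in the mechanism itself is specific to either domain, which
    is the sense in which a capacity evolved for one purpose can be
    cheaply re-deployed for another.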

    What this emphatically does not imply is that domain-general
    reasoning in any way constitutes a 'general learning ability.' It is
    actually rather limited to subject matter which can effectively
    be 'analogized' to one of the evolved human cognitive capacities.
    Indeed, the Wason tests clearly demonstrate that certain problems,
    which by formal measure should be as easy to solve as other simple
    problems, are instead wickedly difficult to answer correctly.
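    The formal point behind the Wason results can be made concrete with a
    toy sketch (my illustration, not part of the original argument): for a
    rule "if P then Q", the logically required checks are identical whether
    the content is abstract or social, yet only the social framing is easy
    for most subjects.

```python
# Toy illustration of the Wason selection task (not from the source text).
# For a rule "if P then Q", only the P card and the not-Q card can
# falsify it; the required checks are identical under both framings.

def cards_to_check(cards, p, not_q):
    """Cards that must be turned over to test 'if P then Q'."""
    return [card for card in cards if card in (p, not_q)]

# Abstract framing: "if a card shows a vowel, its other side is even."
abstract = cards_to_check(["A", "K", "4", "7"], p="A", not_q="7")

# Social-contract framing: "if drinking beer, the drinker is over 21."
social = cards_to_check(["beer", "coke", "25", "16"], p="beer", not_q="16")

print(abstract)  # ['A', '7']
print(social)    # ['beer', '16']
```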

    One way of illustrating this might be to imagine an organism that
    evolves a cognitive module to perform certain statistical
    measurements on particular types of behavior of its associates. This
    module is originally domain-specific, informationally encapsulated,
    automatic, and unconscious. In this case the module is specifically
    devoted to social cognition. The organism later (or previously, it
    is actually irrelevant) evolves a flexible, secondary
    representational system; a problem-solving 'space', perhaps
    originally limited to solving problems of a physical/situational
    nature, like how to stack objects to get to food previously out of
    reach. At this point, the possibility arises of the conscious,
    situation-modeling capacity gaining access to the 'statistics'
    module. The statistical concepts, by virtue of their extension from
    the behavior-quantifying realm, become abstract entities which may
    be manipulated in the previously quite literal, three-dimensional
    problem-solving 'space,' in precisely the same way in which the
    organism can visualize objects being stacked in order to see if they
    will be high enough. They then become 'objects' which may be
    modeled, and allowed to interact. More importantly, being abstract,
    the subject matter to which they may be applied becomes immensely
    widened to anything of which statistics can meaningfully be
    computed, e.g., success at finding food in a particular geographic
    area, the likelihood of capturing a particular kind of prey, etc.
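    The extension described above can be sketched in code (a hypothetical
    toy of my own; the function and domain names are illustrative, not the
    author's): the 'statistics module' amounts to a frequency computation,
    and once accessed abstractly it applies unchanged to any domain of
    countable outcomes.

```python
# Toy sketch (hypothetical names): a frequency-computing 'module'
# originally applied to associates' behavior, then 'extended' unchanged
# to a subsistence domain.
from collections import Counter

def outcome_rates(observations):
    """Relative frequency of each outcome, agnostic to domain."""
    counts = Counter(observations)
    total = sum(counts.values())
    return {outcome: n / total for outcome, n in counts.items()}

# Original social domain: does an associate share food when asked?
sharing = outcome_rates(["shared", "shared", "refused", "shared"])

# Extended domain: success at finding food in particular areas.
foraging = outcome_rates(["area_a", "area_a", "miss", "area_b"])

print(sharing["shared"])  # 0.75
```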

    From Homo erectus to Modern Humans
    Now that the basic theoretical building blocks have been presented,
    the scenario is as follows: The first step in the sequence begins
    with Homo erectus, which is the first fossil hominid to exhibit
    basicranial morphology associated with laryngeal descent. This
    implies that this was the first hominid to use articulated speech as
    a crucial part of its adaptive strategy. (This in turn implies that
    H. habilis was very probably well on its way to that point, but that
    is beyond the scope of this discussion.)

    Linguistic potential at this point need not have been any greater
    than that exhibited by extant primates. Indeed, any
    evolutionary "just so" story should, for the sake of plausibility,
    begin at such a point. What is important is that the use of that
    potential was normal rather than extraordinary. The possible
    environmental (ecological or social) stimuli for the original
    evolution of articulated (and presumably referential) speech are
    again beyond the scope of this discussion. It seems quite likely,
    though, that they are what explain the shift from the
    Australopithecines to Homo.

    However, once in place, I presume that simple, referential
    articulated vocal communication generated drastic effects on the
    complexity of the social environment; effects which would have
    been 'evolutionarily unexpected'. This increased social complexity
    during the Lower and Middle Paleolithic provided the context for the
    continued development and expansion of unconscious, rule-governed
    model-building ability; one which was used for interpersonal problem-
    solving in increasingly complex social contexts (i.e., Alexander's
    (1989) 'runaway social arms race'). This would have been a
    conceptual cognitive ability, not necessarily a linguistic ability
    per se. And since it presumably functioned at an implicit level, it
    could fairly be described as domain-specific and modular.

    Well-developed social cognition is clearly not an evolutionarily new
    ability. It is this implicit capability which underlies all the
    greater apes' ability to model and predict their conspecifics'
    actions and mental states. But the novel existence of a (proto-)
    linguistic social environment would have produced a new level of
    requirements for building models of others' minds; requirements
    driving the evolution of abilities quantitatively more sophisticated
    (if not qualitatively so) than those in other hominoids, probably
    involving increases in attention span and concentration.

    At the same time, this ability, by way of its effects on social
    organization and capacity for culture, would have provided for its
    own bioenergetic possibility. Cranial capacity during this period
    was undergoing slow but substantial change. That extra neural mass
    is extraordinarily costly in terms of caloric maintenance and
    parental investment. While strict ability-brain size correlations
    are not valid, some rough relationship between average size and
    ability must exist, given the expense of that tissue. Any account of
    human evolution must account for the point at which ecological
    energy-extraction efficiency ceased to be the primary determiner of
    neural mass. I suggest that the ecological repercussions of the
    social consequences of referential language account for this change.

    I suggest that the next step in the sequence, the Middle-to-Upper
    Paleolithic transition, consisted of the evolution of conscious
    access to this previously unconscious social model-building system,
    not (surprisingly) for its linguistic benefits, but for its value in
    subsistence planning and organization. If, as argued above,
    conscious access to the algorithms of cognitive modules permits
    their abstraction to new conceptual domains, then model-building
    could very profitably be applied to extractive activities.

    Thus the relevance of Leslie's (1987) analysis of pretense to the
    Middle-Upper Paleolithic transition is that building alternate
    models of the world can be adaptive in two ways:

    (1) building alternate systems of belief to accurately predict the
    actions of conspecifics with different experiences of the real
    world; and
    (2) building models of alternate subsistence strategies which can be
    compared along several axes of variation (effort, risks, rewards,
    reliability, etc.).

    The advantage to the second only accrues when the 'world is what you
    make of it' rather than there being a particular 'best' strategy in
    all situations. In other words, its usefulness depends upon the
    flexibility, both social and technological, to take advantage of a
    newly-realized opportunity for exploitation.

    Most importantly, both kinds of problems (social and extractive)
    encounter the same kind of difficulties with computational
    resources: the problem of combinatorial explosion. In social
    interactions, the formation of meta-level relationships such as
    parent-child, friend, enemy, sibling, etc. allow a great variety of
    dyadic relationships to be abstractly conceptualized. This process
    of conceptualization is inherently hierarchical and categorical. By
    enabling a kind of mental shorthand, the individual is freed from
    keeping in mind concrete examples of what is being considered. This
    ability, which allows large collections of entities to be
    considered, manipulated, and compared is plausibly powerful enough
    to underlie much of human ability in social cognition, subsistence
    strategy, syntactic processing, and perhaps a range of other
    uniquely human problem-solving capacities.
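    The computational point can be made concrete with a toy count (my
    arithmetic, not the author's): concrete dyadic relationships grow
    quadratically with group size, while the stock of abstract relationship
    categories stays fixed.

```python
# Toy sketch: dyadic relationships explode quadratically with group
# size, while abstract categories ('parent-child', 'friend', ...) do not.

def n_dyads(group_size):
    """Number of distinct pairs in a group: n * (n - 1) / 2."""
    return group_size * (group_size - 1) // 2

categories = {"parent-child", "sibling", "friend", "enemy"}

print(n_dyads(10))  # 45 concrete pairs to keep in mind
print(n_dyads(50))  # 1225 -- yet still only 4 categories
```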

    Although Homo erectus was able to live in environments previously
    uninhabitable by primates, i.e., high latitudes and sparse
    grasslands, it was far less ecologically efficient than modern
    human hunter-gatherers. At the erectus-sapiens boundary, the ability
    to carefully and consciously weigh multiple courses of subsistence-
    oriented action would have immediate and drastic effects on
    survivorship of any individual with the ability.

    One may reasonably ask why specifically linguistic consequences are
    not offered as the evolutionary driving force behind the shift. As
    Bickerton has noted (1990), any characteristic presumably responding
    to evolutionary pressure must be adaptive at every point in time.
    Linguistic capacities rely on a community of speakers for their
    expression and adaptiveness. Now some improvements in linguistic
    capacity have plausible incremental steps by which they can develop.
    But rule-governed grammar appears to be an all-or-nothing
    phenomenon, and thus its (initial) evolutionary explanation must be
    sought elsewhere.

    The idea of two levels of language advocated here seems to reflect a
    distinction between the specificity of perception and the
    specificity of the integration of those perceptions. Ample evidence
    demonstrates that there are a host of auditory processing mechanisms
    which focus attention on articulated speech and increase
    discriminative ability with regard to speech (Lieberman 1984). In
    humans, the important sounds of speech are wired to be given a
    greater degree of attention than other sounds in the child's
    environment: a highly specific form of perceptual tuning. They go on
    to integrate those sounds linguistically. Similar mechanisms can be
    described for the output side of the process. These certainly have
    an evolutionary basis; one which I believe is quite long. These
    are "prepared systems" in every sense of the term, 'hard-wired' into
    the low levels of perceptual processing. But their evolution would
    not necessarily have required language as we know it to drive their
    development.

    Yet deaf infants exposed to ASL "babble" and begin to "speak" with
    their hands on exactly the same developmental timetable as hearing
    infants (Petitto and Marenette 1991). They learn to integrate visual
    patterns linguistically as well, and do so with as much ease as
    hearing children. This suggests that full language acquisition is an
    aptitude which occurs at a very high (i.e., abstract) level of
    cognitive processing, i.e., it presumably can occur with input from
    any sensory modality. It is a purely abstract pattern-recognition
    ability rather than a "prepared system" in the narrow sense. At the
    level of linguistic processing, i.e., syntactic parsing, there is no
    domain-specific effect. Presumably, any sensory input with
    sufficient signal-carrying capacity could serve as well as the
    auditory channel.

    This evidence, to some extent, flies in the face of psycholinguistic
    evidence (see Fodor 1983 and Jackendoff 1987 for fairly brief
    reviews) which paints a picture of (mostly) informationally
    encapsulated and domain-specific acoustic and phonological levels of
    processing. The conflict here can be resolved if one considers the
    possibility of Bickerton's two-step evolutionary sequence for human
    language; the first being the evolution of articulated speech
    (probably with Homo habilis) and the second being the evolution of
    complex grammar and syntax (with modern Homo sapiens). This would
    account for the apparently ancient origin of acoustic and
    phonological processing as well as the lack of input domain-
    specificity indicating recent origins.

    Bickerton (1981, 1990) argues for just such a division of human
    linguistic potential into a fairly old 'proto-linguistic' capacity
    and a more recent 'syntactic' capacity. Even aside from any specific
    inferences about the chronological order of evolution, this evidence
    strongly supports the idea that the different levels of what are now
    seemingly integrated systems probably evolved at different times and
    possibly for different purposes. Indeed, just such a view could
    characterize almost all we know about the evolution in the more
    obvious realms of physiology and anatomy.

    This suggests that it is evolutionarily recent, even within the
    timespan of hominid evolution. This difference in level, between
    form and content, is a crucial hint to an understanding of the
    evolutionary basis for the existence of the two systems. From the
    perspective of the sensory system, the content is not arbitrary, it
    is constrained. From the perspective of the cognitive system, the
    content is arbitrary, but the form is constrained. Thus French,
    Chinese and ASL (American Sign Language) use the same cognitive
    linguistic system while having entirely different contents.

    I thus suggest that H. erectus possessed articulated speech and
    referential language, but lacked grammar. Such a language would be
    linguistically identical to the two-word stage of language
    acquisition in children. However, the power and sophistication of
    such a system in an adult with the non-linguistic conceptual
    abilities plausible for erectus would be impressive relative to
    previous developments.

    One could have said "dog brown", or "kitty gone" or "outside play";
    meaning "the dog is brown," "the kitty is gone," or "I want to go
    play outside"; however "the man whose daughter the boy chased was
    angry" would be impossible.

    There is little qualitative difference between this level of grammar
    and that which seems to be within the capabilities of chimpanzees
    and bonobos (at least after they have been intensively trained). The
    difference would be the ease with which the hominid could use this
    system; it clearly would have been spontaneous and easy to exercise
    for the hominid. It is neither spontaneous nor easy for
    our extant primate relatives. Just how close other extant primates
    are to this level of language competence is demonstrated by Kanzi, a
    bonobo who spontaneously uses referential symbols, and responds to
    spoken English, after having been exposed to them during his
    infancy.

    The theoretical relevance of spontaneously used symbols vs. those
    acquired by training is well emphasized by Savage-Rumbaugh
    (1990:604). The fact that Kanzi has been observed re-presenting
    symbols to himself, in private, is even more provocative (Savage-
    Rumbaugh et al. 1986:228).

    Obviously, there are important differences between symbol use in
    bonobos and humans: (1) bonobos in the wild do not spontaneously
    generate referential symbols, and (2) even under human tutelage,
    only bonobos exposed to language during development are able to
    demonstrate referential use of symbols (Kanzi's mother failed to
    adopt the desired skills). But this evidence shows that at least one
    extant ape has (potentially) all the major linguistic faculties
    which humans possess except grammar.

    One aspect of Jackendoff's (1987) theory may be useful here to make
    sense of the notion that so little, in terms of completely novel
    mental capacities, separates us from our closest relatives. In order
    to explain the observed interpenetration between bottom-up and top-
    down processes (e.g., in perception, where expectations and even
    conscious decisions can bias low-level decisions, as in the face-
    vase illusion), he argues that several levels of representation are
    maintained in 'registration' with one another in short-term memory
    (1989:258-9). In like manner, he supposes that linguistic
    representations can be held in registration with the conceptual
    structures which generated them (and vice versa). The interesting
    thing about this mechanism is that it does not result in a capacity
    which is an either-or question. In other words, a lack of
    registration would not interfere with conceptual structures or
    linguistic representations. However, the presence of registration,
    he reasons, would result in much more 'stability' in both forms of
    representation.

    Registration may be the phenomenon responsible for
    the 'explosiveness' of Upper Pleistocene human cognitive ability. It
    affects how easily processing can be carried out, rather than the
    either-or possibility of performing a given form of processing at
    all. Also, just such a picture of high-level human cognition may
    make sense of Goldstein's evidence (see below) of slight deficits in
    language-deprived individuals and Brother John's seemingly
    unhampered problem-solving ability during one of his attacks.

    These speculations strongly suggest that detailed research which can
    sort out the subtle effects of language deprivation (whether due to
    injury or deafness) is called for.

    The primacy of social cognition
    The role that social cognition plays in this scenario is a dominant
    one, and so it seems logical to justify that assignment,
    particularly when so many other driving forces have been offered for
    this period of human evolution, e.g., tool use, symbolism, etc.
    Likewise, I have argued that, in general, it was the accessing of
    social-cognitive algorithms that produced such dramatic effects.

    First, in general terms, primates are arguably the most (flexibly)
    social order, and thus the most socially sophisticated animals. They
    have been so for at least 20 million years. If for no other reason
    than this, it might be guessed that social-cognitive abilities may
    be the most 'advanced' we possess. Extensive tool use and (more)
    sophisticated subsistence activities, on the other hand, extend at
    most 5-7 million years (to the common ape-human ancestor) and may only
    be as old as the Homo line. Articulated language, as discussed above,
    extends backwards at most to Homo habilis.

    Second, if one accepts that the improved social organizational and
    communicative capabilities of Homo erectus enabled a greater degree
    of 'insulation' from the natural environment, it would follow that
    for erectus, even more than any other primate, the social
    environment becomes the most crucial adaptive environment.
    Intraspecific competition becomes more relevant than environmental
    interactions. Alexander (1989) argues for just such a social 'arms
    race' at this point in human evolution. He sees sociality as the
    initial driving force, with other selective arenas only "kicking in"
    later.

    Third, a number of characteristics of both human abstract thought
    (conceptual processing) and language per se share interesting
    characteristics with what we might identify as the hallmarks of
    social cognition: hierarchical structure, binary categorical
    distinctions, and the ability to manipulate multiple mental entities
    simultaneously.

    Bickerton's (1990) account of syntax as described by X-bar theory
    suggests that the subject-predicate relationship is the fundamental
    relationship of language. Bickerton argues that the reason for this
    is that the basic element of protolanguage was entity +
    characteristic i.e., the noun phrase. He suggests that the verb
    phrase simply represents an extension of noun phrase rules.

    Bickerton argues that the basic template being used here is 'animate
    individual + behavior.' If this is true it would be another reason
    why a primate (rather than some other organism) developed syntactic
    language; small surprise that the most socially attentive order
    produced a cognitive algorithm based on relating individual entities
    and behavioral characteristics. And small surprise that the primate
    facing the greatest degree of social complexity would develop an
    efficient, hierarchical, meta-representational system, since some
    form of abstraction is the only way to sidestep computational limits
    in information processing systems.

    The Enigma of Homo erectus
    The scenario presented above accounts for a number of aspects of the
    archaeological, anatomical, and evolutionary puzzles presented by
    the last 2 million years of human evolution. And if the signature of
    a good theory is that it explains things that perhaps we hadn't even
    noticed needed explaining, then recall that the adaptation of
    erectus to high latitudes should be a bit of an enigma, insofar as
    their adaptation proved inferior to that represented by the modern
    condition. In other words, while erectus was far more successful
    than any previous primate, it was ultimately inferior to modern
    humans (whether viewed as a species or as a phyletic suite of
    characteristics). In discussing erectus' cognitive capacities, most
    discussion focuses either on their superiority or their deficits;
    care must be taken to account simultaneously for their long success
    and subsequent failure.

    The theory presented here would attribute erectus' success as the
    first inter-continental hominid to its social precocity, and view
    its adaptation as a primarily social one. That this kind of
    adaptation did not include adequate technology to completely buffer
    the organism from the hostile environment is illustrated by the
    continuing robusticity of the erectus musculo-skeletal system. It
    then attributes H. sapiens' subsequent success (i.e., the success of
    sapiens genetic attributes) in the same environments to its
    technological precocity (technology in this case encompassing
    technique); and views its adaptation as a technological one. The
    technology in this case is that relying on characteristically human
    culture. This analysis accounts for erectus' apparent success,
    albeit accompanied by the persistence of its physical robusticity;
    as well as modern humans' rapid shift to skeletal gracility.

    It makes sense of some of the bizarre deficiencies of late Middle
    Paleolithic peoples, i.e., fairly sophisticated tools, but little
    long-term planning or large-scale social organization. (Tool-making
    is partially an unconscious, implicit procedure, but subsistence
    scheduling and decision-making is a necessarily explicit thought
    process.) It accounts for the presence of articulatory anatomy from
    2 million years ago but the lack of modern "culture as we know it"
    until 50 thousand years ago. It also would account for
    the 'explosiveness' of the Middle to Upper Paleolithic transition
    since this shift consists of the conscious integration of two well-
    developed, but previously unconnected cognitive domains (social
    cognition and referential language/articulated speech) rather than
    the fresh evolution of basic cognitive algorithms.

    Archaeological Perspectives on Cognitive evolution
    So how consistent is the evolutionary scenario presented above with
    the body of archaeological evidence from the Middle and Upper
    Paleolithic? Ironically, it seems that the constraints embodied in
    the linguistic and cognitive data serve better to eliminate
    alternate possibilities than any particular item of archaeological
    evidence. However, the archaeological data do serve well to provide
    plausible temporal milestones to which various steps in the
    cognitive theory can be assigned. Beyond that, they serve to broadly
    constrain the kinds of explanation that will remain consistent with
    what we know of Middle and Upper Pleistocene subsistence patterns.

    Interestingly enough, in the first section of this thesis I
    endeavored to minimize the distinctions made between the Middle and
    Upper Pleistocene under the belief that there is a phyletic
    relationship between Homo erectus and modern Homo sapiens. In this
    section, the theory presented here suggests that there should be a
    rather clear shift, reflected in evidence of cognitive capacity for
    complex predictive reasoning, both in the social and subsistence
    realms.

    Isaac (1972:54-56) describes the pattern of change between the
    Middle Pleistocene and the Upper Pleistocene as one of a shift from
    a low-density "walking pattern" to one of fine-grained
    differentiation with more evidence of complex structure in human
    subsistence activity. Arguments over the best way to characterize
    the comparison of Middle Paleolithic technology and subsistence to
    that in the Upper Paleolithic often hinge on the degree to which
    evidence from within the chronological units is 'averaged.' In other
    words, if one compares the Middle Paleolithic overall to the Upper
    Paleolithic overall, then there is indeed significant disparity in
    complexity and sophistication.

    On the other hand, several more chronologically restricted studies
    (e.g., Marks 1990:74 for the Near East and Nile valley, Reynolds
    1990:273 for Southwestern France) have shown that differences
    between late Middle Paleolithic sites and early Upper Paleolithic
    sites are much less striking. Reynolds' conclusion is especially
    interesting since Western Europe is typically offered as the most
    unambiguous example of Middle-Upper Paleolithic discontinuity.
    However, he attributes this impression more to the typologies
    traditionally used for the periods than to differences inherent to
    the industries: "The Middle-Upper Paleolithic transition as it is
    pictured through a 'before and after' approach highlights
    differences in the two typological systems in the treatment of
    compound forms and also the technological move towards increased use
    of regular blank forms with additional types being size-based. As
    such, this approach presents an image of increased typological
    complexity and innovation that is not, in fact, substantiable when
    the final Mousterian (MAT) and first Upper Paleolithic industries
    are taken alone" (Reynolds 1990:273). This has also been the
    conclusion of other more general reviews of Paleolithic change
    (Rolland 1990:377).

    Binford (1989), on the other hand, emphasizes the disjunctive nature
    of the transition when viewed in archaeological terms. Contrary to
    what had been thought before, Binford finds no evidence of
    significant time depth to planning of any activities by Middle
    Paleolithic (or earlier) peoples. Most significantly, Binford
    (1987:52) reassesses the evidence of large-scale game drives at
    Torralba. He concludes that the presence of language explains the
    abrupt transition while noting that Wolpoff (in conference)
    maintained that language and culture were present in the Lower
    Paleolithic.

    In archaeological terms a picture has begun to emerge of some
    gradual improvements of a panspecific adaptation throughout the
    Lower and Middle Paleolithic with an abrupt shift to longer-range
    planning, artifact curation, and cultural adaptation to
    environments. This way of looking at the problem, one where
    linguistic ability, social organization, and cognitive potential
    become relevant as more than simply advantageous alleles, is one
    which draws on a much wider range of fields than paleoanthropology
    has so far embraced.

    However, contrary to Binford, there is evidence for extensive
    foresight and planning in the Middle Pleistocene. Raw material for
    tools was often selected well prior to use. For example, raw
    material for tools at Terra Amata came from at least 30 miles from
    the site (de Lumley 1969). This indicates that Binford's simple
    division of cognitive potential into planning/non-planning is
    inadequate. Binford discusses the advantages of tactical thinking as
    if it involved only a simple unitary ability to 'think ahead'.
    Clearly what humans typically do is similar but by no means
    identical to what chimpanzees do when they select a hammerstone
    before they reach where they will use it. His conclusions elsewhere
    have simply been that, for instance,

    "It is most unlikely that we are seeing structured assemblages
    derived from organizationally integrated tool-assisted actions
    centered in camps. Early humans were probably not very much like us"
    (Binford 1987:29).

    Binford rightfully concludes that earlier humans "were ... not very
    much like us", but he then goes on to deprive them of far more than
    is justified by his analysis. At issue here is what Binford
    considers to be 'human', and all that 'humanness' connotes in terms
    of cognitive potential. The issue is not whether Middle Paleolithic
    hominids could 'plan ahead', it is how structured, and thus how
    powerful, that planning was. One way of approaching the question
    might be to ask how much planning is possible, or probable, by
    individuals solving problems largely on their own, and if greater
    amounts of planning involve more sophisticated forms of social
    organization and cultural codification for their persistence.

    Complicating the issue, of course, is the argument presented in this
    thesis that the cognitive capacity underlying social cognition,
    complex planning, and problem-solving, is the same capacity which
    underlies language.

    Ambrose and Lorenz 1990
    This paper is very important for this exploration for two reasons.
    First, at a very detailed level, its conclusions are significant for
    the assessment of MSA (Middle Stone Age) vs. LSA (Late Stone Age)
    technology and subsistence in Southern Africa. Second, and more
    important, it represents precisely the kind of approach needed to
    distinguish in the archaeological record between evolved biological
    potential and non-biological socio-cultural evolution. Ambrose and
    Lorenz (1990) find that at one particular time (between MSA 1 and
    MSA 3/4), approximately 70,000 years ago, the response of human
    hunter-gatherers to environmental change itself changed: ecological
    cycles encountered many times before over the previous two million
    years elicited a novel response. By asking certain questions from
    within a well-developed theoretical framework of resource structure
    and optimal subsistence strategies, they are able to address the
    precise causes of that change.

    The strength of their strategy comes from the breadth and generality
    of resource structure theory: it is a theory about human adaptation
    in general, not merely this or that group. As a means of assessing
    the organization of human adaptation in general, it is able to look
    past the presence, or absence, of technological complexity and
    address the cognitive capacity behind that technology. Put simply,
    MSA peoples were unable to adapt (as LSA peoples did) not because
    they lacked any particular technology, but because they lacked
    (presumably because of neurobiology) modern human social
    organizational potential, itself enabled by the requisite cognitive
    capacities. Once this is said, it is important, as they note, not to
    start seeing 'MSA peoples as unskilled mental midgets.' But it does
    force further analysis of 1) what underlies modern potential and 2)
    what form pre-modern potential took, and what changes to
    archaeological theory must be made to make it suitable for the
    analysis of pre-modern human technology.

    At the least this evidence suggests that the shift in underlying
    cognitive capacity, i.e., that involved in complex subsistence-
    oriented problem-solving, occurred (at least in Southern Africa)
    approximately 70,000 years ago. Most importantly, this shift is
    reflected in adaptive subsistence strategies rather than improved
    technology or obvious changes in social organization. This
    observation suggests that archaeological studies which focus
    specifically on technology or gross indicators of social
    organization, e.g., group size, are likely to miss the subtle
    structural changes which signal underlying cognitive developments.

    Thus, to a large degree, the utility of archaeological data to the
    study of cognitive evolution may depend on first mapping human
    resource structure theory onto particular cognitive capacities. Of
    course, this cannot occur until some unambiguous characteristics of
    the relevant cognitive capacities can be defined in accordance with
    experimental data. The theory presented here at least points to some
    aspects of those capacities: hierarchical structure, binary
    categorical distinctions, and the ability to manipulate multiple
    mental entities simultaneously; all of which bear suspicious
    resemblance to the characteristics of social cognition and syntactic
    linguistic processing.

    In the spirit of intellectual honesty, I shall spell out several
    outstanding dilemmas which remain in my mind concerning the theory
    presented here. The first is theoretical, the second evidentiary,
    the third a suggested subject for research.

    If, as I have argued, the syntactic parsing mechanism is the product
    of evolved access to a social modeling mechanism (by way of an
    explicitly conscious analytical capacity), then why is it not
    accessible to conscious awareness? Other than admitting it is a good
    question, I will offer two possibilities. One, that syntactic
    parsing was plausibly once a conscious type of problem solving
    which, because of computational strains placed upon it, became
    informationally re-encapsulated for the sake of speed. Another
    approach may be to point out that Rozin's notion of the cognitive
    conscious and unconscious is not necessarily the same as a
    distinction between availability and non-availability to
    introspection. In other words, not all of the 'cognitive
    consciousness' need actually be available to consciousness in the
    normal sense, i.e., cognitive abilities may result from access but
    still be implicit.

    Goldstein, an aphasiologist, found that linguistic aphasia is
    commonly accompanied by more subtle non-linguistic deficits, and
    that these deficits are best defined by a loss of what he called
    the "abstract attitude". The 'abstract' (also categorical or
    conceptual) attitude is basic for the following potentialities:

    "1. Assuming a mental set voluntarily, taking initiative, even
    beginning a performance on demand. 2. Shifting voluntarily from one
    aspect of a situation to another, making a choice. 3. Keeping in
    mind simultaneously various aspects of a situation; reacting to two
    stimuli which do not belong intrinsically together. 4. Grasping the
    essential of a given whole, breaking up a given whole into parts,
    isolating them voluntarily, and combining them in wholes. 5.
    Abstracting common properties, planning ahead ideationally, assuming
    an attitude toward the 'merely possible', and thinking or performing
    symbolically. 6. Detaching the ego from the outer world." (Goldstein)

    Lieberman (1991:92) notes that many recent studies have confirmed
    this view. "Although standard tests of intelligence may show no
    decrement in overall intelligence, more specialized tests
    demonstrate that aphasic patients have considerable difficulty in
    performing tasks that involve keeping track of and applying
    different abstract concepts, translating specific facts into
    appropriate action, handling simultaneous sources of information,
    relating isolated details, and failing to grasp the key element of a
    problem (Stuss and Benson, 1986:194-203)."

    This is interesting because it suggests that an extreme form of
    modularity (i.e., that implied by Brother John's ability to reason
    clearly and handle social situations appropriately during a period
    of global aphasia) may not be supported by a closer look at the
    cognitive concomitants of language. More to the point, it challenges
    (to some degree) my description of the separability of human
    analytic ability and human linguistic ability.

    One question which seems to divide the scholars at work on this
    subject is whether complex problem solving, e.g., the kind at which
    humans alone appear to excel, is specifically enabled by the
    presence of language. Brother John and deaf-mutes
    appear to argue against such an identification. Much other evidence
    and theory appears to support the notion, i.e., that such problem
    solving is language-based whether communicative language is present
    or not. While experimental research on symbolic problem-solving
    might be made rather difficult by a subject's inability to use
    language or writing, many of those problems have been overcome in
    non-human cognitive psychology. In any case, the question seems
    clearly open to elucidation by experiment.
