
Behavioral modernity

Behavioral modernity is a suite of behavioral and cognitive traits that distinguishes current Homo sapiens from other anatomically modern humans, hominins, and primates. Although often debated, most scholars agree that modern human behavior can be characterized by abstract thinking, planning depth, symbolic behavior (e.g., art, ornamentation), music and dance, exploitation of large game, and blade technology, among others. Underlying these behaviors and technological innovations are cognitive and cultural foundations that have been documented experimentally and ethnographically. Some of these human universal patterns are cumulative cultural adaptation, social norms, language, and extensive help and cooperation beyond close kin. It has been argued that the development of these modern behavioral traits, in combination with the climatic conditions of the Last Glacial Maximum causing genetic bottlenecks, was largely responsible for the human replacement of Neanderthals, Denisovans, and the other human species across the rest of the world.

Arising from differences in the archaeological record, a debate continues as to whether anatomically modern humans were behaviorally modern as well. There are many theories on the evolution of behavioral modernity, and they generally fall into two camps: gradualist and cognitive approaches. The Later Upper Paleolithic Model refers to the theory that modern human behavior arose through abrupt cognitive or genetic changes around 40,000–50,000 years ago. Other models focus on how modern human behavior may have arisen through gradual steps, with the archaeological signatures of such behavior appearing only through demographic or subsistence-based changes.

To classify what traits should be included in modern human behavior, it is necessary to define behaviors that are universal among living human groups. Some examples of these human universals are abstract thought, planning, trade, cooperative labor, body decoration, and the control and use of fire. Along with these traits, humans rely heavily on social learning. This cumulative cultural change, or cultural 'ratchet', separates human culture from social learning in animals. A reliance on social learning may also be responsible in part for humans' rapid adaptation to many environments outside of Africa. Since cultural universals are found in all cultures, including some of the most isolated indigenous groups, these traits must have evolved or been invented in Africa prior to the exodus.
Archaeologically, a number of empirical traits have been used as indicators of modern human behavior. While these are often debated, a few are generally agreed upon, and archaeological evidence of behavioral modernity rests on these indicators.

Several critiques have been leveled against the traditional concept of behavioral modernity, both methodological and philosophical. Shea (2011) outlines a variety of problems with the concept, arguing instead for 'behavioral variability', which, according to the author, better describes the archaeological record. The use of trait lists, according to Shea (2011), runs the risk of taphonomic bias, where some sites may yield more artifacts than others despite similar populations; trait lists can also be ambiguous in how behaviors are to be empirically recognized in the archaeological record. Shea (2011) cautions in particular that population pressure, cultural change, or optimality models, such as those in human behavioral ecology, might better predict changes in tool types or subsistence strategies than a shift from 'archaic' to 'modern' behavior. Some researchers argue that greater emphasis should be placed on identifying only those artifacts that are unquestionably, or purely, symbolic as a metric for modern human behavior.

The Later Upper Paleolithic Model, or Upper Paleolithic Revolution, refers to the idea that, although anatomically modern humans first appear around 150,000 years ago, they were not cognitively or behaviorally 'modern' until around 50,000 years ago, leading to their expansion into Europe and Asia. These authors note that the traits used as a metric for behavioral modernity do not appear as a package until around 40,000–50,000 years ago. Klein (1995) specifically describes evidence of fishing, bone shaped as a tool, hearths, significant artifact diversity, and elaborate graves as all absent before this point. Although assemblages before 50,000 years ago show some diversity, the only distinctly modern tool assemblages appear in Europe around 48,000 years ago. According to these authors, art only becomes common beyond this switching point, signifying a change from archaic to modern humans. Most of these researchers argue that a neurological or genetic change, perhaps one enabling complex language, such as FOXP2, caused this revolutionary change in our species.

Contrasted with this view of a spontaneous leap in cognition among ancient humans, some authors, like Alison S. Brooks, working primarily in African archaeology, point to the gradual accumulation of 'modern' behaviors, starting well before the 50,000-year benchmark of the Upper Paleolithic Revolution models. Howiesons Poort, Blombos, and other South African archaeological sites, for example, show evidence of marine resource acquisition, trade, the making of bone tools, blade and microlith technology, and abstract ornamentation by at least 80,000 years ago. Given evidence from Africa and the Middle East, a variety of hypotheses have been put forth to describe an earlier, more gradual transition from simple to more complex human behavior. Some authors push back the appearance of fully modern behavior to around 80,000 years ago in order to incorporate the South African data. Others focus on the slow accumulation of different technologies and behaviors across time. These researchers describe how anatomically modern humans could have been cognitively the same, with what we define as behavioral modernity simply the result of thousands of years of cultural adaptation and learning.
D'Errico and others have looked at Neanderthal culture, rather than early human behavior exclusively, for clues into behavioral modernity. Noting that Neanderthal assemblages often portray traits similar to those listed for modern human behavior, these researchers stress that the foundations of behavioral modernity may in fact lie deeper in our hominin ancestors. If both modern humans and Neanderthals express abstract art and complex tools, then 'modern human behavior' cannot be a derived trait for our species.

They argue that the original 'human revolution' theory reflects a profound Eurocentric bias. Recent archaeological evidence, they argue, proves that humans evolving in Africa some 300,000 or even 400,000 years ago were already becoming cognitively and behaviorally 'modern'. These features include blade and microlithic technology, bone tools, increased geographic range, specialized hunting, the use of aquatic resources, long-distance trade, systematic processing and use of pigment, and art and decoration. These items do not occur suddenly together as predicted by the 'human revolution' model, but at sites that are widely separated in space and time. This suggests a gradual assembling of the package of modern human behaviors in Africa, and its later export to other regions of the Old World.

Between these extremes is the view – currently supported by archaeologists Chris Henshilwood, Curtis Marean, Ian Watts and others – that there was indeed some kind of 'human revolution', but that it occurred in Africa and spanned tens of thousands of years. The term 'revolution' in this context means not a sudden mutation but a historical development along the lines of 'the industrial revolution' or 'the Neolithic revolution'. In other words, it was a relatively accelerated process, too rapid for ordinary Darwinian 'descent with modification' yet too gradual to be attributed to a single genetic or other sudden event. These archaeologists point in particular to the relatively explosive emergence of ochre crayons and shell necklaces apparently used for cosmetic purposes, and they see the symbolic organization of human social life as the key transition in modern human evolution. Pierced shells, pigments, and other striking signs of personal ornamentation recently discovered at sites such as Blombos Cave and Pinnacle Point, South Africa, have been dated to a time window of 70,000–160,000 years ago in the African Middle Stone Age, suggesting that the emergence of Homo sapiens coincided, after all, with the transition to modern cognition and behavior. While viewing the emergence of language as a 'revolutionary' development, this school of thought generally attributes it to cumulative social, cognitive, and cultural evolutionary processes rather than a single genetic mutation.

[ "Middle Stone Age" ]
Parent Topic
Child Topic
    No Parent Topic