This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.


Description

Stone-tipped weapons were a significant innovation for Middle Pleistocene hominins. Hafted hunting technology represents the development of new cognitive and social learning mechanisms within the genus Homo, and may have provided a foraging advantage over simpler forms of hunting technology, such as a sharpened wooden spear. However, the nature of this foraging advantage has not been confirmed. Experimental studies and ethnographic reports provide conflicting results regarding the relative importance of the functional, economic, and social roles of hafted hunting technology. The controlled experiment reported here was designed to test the functional hypothesis for stone-tipped weapons using spears and ballistics gelatin. It differs from previous investigations of this type because it includes a quantitative analysis of wound track profiles and focuses specifically on hand-delivered spear technology. Our results do not support the hypothesis that tipped spears penetrate deeper than untipped spears. However, tipped spears create a significantly larger inner wound cavity that widens distally. This inner wound cavity is analogous to the permanent wound cavity in ballistics research, which is considered the key variable affecting the relative ‘stopping power’ or ‘killing power’ of a penetrating weapon. Tipped spears conferred a functional advantage to Middle Pleistocene hominins, potentially affecting the frequency and regularity of hunting success with important implications for human adaptation and life history.

Contributors: Wilkins, Jayne (Author) / Schoville, Benjamin (Author) / Brown, Kyle S. (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2014-08-27
Description

Human societies are unique in the level of cooperation among non-kin. Evolutionary models explaining this behavior typically assume pure strategies of cooperation and defection. Behavioral experiments, however, demonstrate that humans are typically conditional co-operators who have other-regarding preferences. Building on existing models on the evolution of cooperation and costly punishment, we use a utilitarian formulation of agent decision making to explore conditions that support the emergence of cooperative behavior. Our results indicate that cooperation levels are significantly lower for larger groups in contrast to the original pure strategy model. Here, defection behavior not only diminishes the public good, but also affects the expectations of group members leading conditional co-operators to change their strategies. Hence defection has a more damaging effect when decisions are based on expectations and not only pure strategies.
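The unraveling dynamic described here — defection lowering the expectations of conditional co-operators, who then switch strategies — can be sketched in a toy public goods loop. This is a hedged illustration only, not the paper's model: the agent counts, the threshold rule, and the expectation update below are all hypothetical.

```python
def simulate(n_agents, n_defectors, rounds, threshold=0.8):
    """Toy public goods game with conditional co-operators.

    Conditional co-operators contribute only while the cooperation
    fraction they observed in the previous round meets their
    threshold; unconditional defectors never contribute.
    """
    expected = 1.0  # co-operators start out optimistic
    history = []
    for _ in range(rounds):
        cooperators = 0 if expected < threshold else n_agents - n_defectors
        frac = cooperators / n_agents
        history.append(frac)
        expected = frac  # expectations track observed behaviour
    return history

# Three defectors among ten agents depress expectations enough
# that cooperation unravels after the first round...
print(simulate(10, 3, 5))   # [0.7, 0.0, 0.0, 0.0, 0.0]
# ...while a single defector leaves cooperation stable.
print(simulate(10, 1, 5))   # [0.9, 0.9, 0.9, 0.9, 0.9]
```

In this sketch defection damages the group twice over — once through the lost contribution, and again through the expectations that keep conditional co-operators contributing.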

Created: 2014-07-01
Description

A major conundrum in evolution is that, despite natural selection, polymorphism is still omnipresent in nature: Numerous species exhibit multiple morphs, namely several abundant values of an important trait. Polymorphism is particularly prevalent in asymmetric traits, which are beneficial to their carrier in disruptive competitive interference but at the same time bear disadvantages in other aspects, such as greater mortality or lower fecundity. Here we focus on asymmetric traits in which a better competitor disperses fewer offspring in the absence of competition. We report a general pattern in which polymorphic populations emerge when disruptive selection increases: The stronger the selection, the greater the number of morphs that evolve. This pattern is general and is insensitive to the form of the fitness function. The pattern is somewhat counterintuitive since directional selection is expected to sharpen the trait distribution and thereby reduce its diversity (but note that similar patterns were suggested in studies that demonstrated increased biodiversity as local selection increases in ecological communities). We explain the underlying mechanism in which stronger selection drives the population towards more competitive values of the trait, which in turn reduces the population density, thereby enabling lesser competitors to stably persist with reduced need to directly compete. Thus, we believe that the pattern is more general and may apply to asymmetric traits more broadly. This robust pattern suggests a comparative, unified explanation for a variety of polymorphic traits in nature.

Created: 2016-02-04
Description

Background: Increasing our understanding of the factors affecting the severity of the 2009 A/H1N1 influenza pandemic in different regions of the world could lead to improved clinical practice and mitigation strategies for future influenza pandemics. Even though a number of studies have shed light on the risk factors associated with severe outcomes of 2009 A/H1N1 influenza infections in different populations (e.g., [1-5]), analyses of the determinants of mortality risk spanning multiple pandemic waves and geographic regions are scarce. Between-country differences in the mortality burden of the 2009 pandemic could be linked to differences in influenza case management, underlying population health, or intrinsic differences in disease transmission [6]. Additional studies elucidating the determinants of disease severity globally are warranted to guide prevention efforts in future influenza pandemics.

In Mexico, the 2009 A/H1N1 influenza pandemic was characterized by a three-wave pattern occurring in the spring, summer, and fall of 2009 with substantial geographical heterogeneity [7]. A recent study suggests that Mexico experienced high excess mortality burden during the 2009 A/H1N1 influenza pandemic relative to other countries [6]. However, an assessment of potential factors that contributed to the relatively high pandemic death toll in Mexico is lacking. Here, we fill this gap by analyzing a large series of laboratory-confirmed A/H1N1 influenza cases, hospitalizations, and deaths monitored by the Mexican Social Security medical system during April 1 through December 31, 2009 in Mexico. In particular, we quantify the association between disease severity, hospital admission delays, and neuraminidase inhibitor use by demographic characteristics, pandemic wave, and geographic regions of Mexico.

Methods: We analyzed a large series of laboratory-confirmed pandemic A/H1N1 influenza cases from a prospective surveillance system maintained by the Mexican Social Security system, April-December 2009. We considered a spectrum of disease severity encompassing outpatient visits, hospitalizations, and deaths, and recorded demographic and geographic information on individual patients. We assessed the impact of neuraminidase inhibitor treatment and hospital admission delay (≤2 vs. >2 days after disease onset) on the risk of death by multivariate logistic regression.

Results: Approximately 50% of all A/H1N1-positive patients received antiviral medication during the spring and summer 2009 pandemic waves in Mexico, while only 9% of A/H1N1 cases received antiviral medications during the fall wave (P < 0.0001). After adjustment for age, gender, and geography, antiviral treatment significantly reduced the risk of death (OR = 0.52; 95% CI: 0.30, 0.90), while longer hospital admission delays increased the risk of death 2.8-fold (95% CI: 2.25, 3.41).

Conclusions: Our findings underscore the potential impact of decreasing admission delays and increasing antiviral use to mitigate the mortality burden of future influenza pandemics.
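The mortality analysis above reports an adjusted odds ratio from a multivariate logistic regression. As a minimal sketch of the underlying quantity only, the crude (unadjusted) odds ratio and its Woolf-type log confidence interval can be computed from a 2×2 table; the counts below are hypothetical and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table with a Woolf-type 95% CI.

    Rows: treated (a deaths, b survivors) vs.
          untreated (c deaths, d survivors).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: deaths among treated vs. untreated cases
or_, lower, upper = odds_ratio_ci(20, 380, 90, 910)
print(f"OR = {or_:.2f} (95% CI: {lower:.2f}, {upper:.2f})")
```

Because the interval here excludes 1, treatment would be associated with lower mortality in this toy table; the study's reported OR additionally adjusts for age, gender, and geography via the regression.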

Created: 2012-04-20
Description

The Middle Stone Age (MSA) is associated with early evidence for symbolic material culture and complex technological innovations. However, one of the most visible aspects of MSA technologies is the unretouched triangular stone point, which appears in the archaeological record as early as 500,000 years ago in Africa and persists throughout the MSA. How these tools were being used and discarded across a changing Pleistocene landscape can provide insight into how MSA populations prioritized technological and foraging decisions. Creating inferential links between experimental and archaeological tool use helps to establish prehistoric tool function, but is complicated by the overlaying of post-depositional damage onto behaviorally worn tools. Taphonomic damage patterning can provide insight into site formation history, but may preclude behavioral interpretations of tool function. Here, multiple experiments that form edge damage on unretouched lithic points through taphonomic and behavioral processes are presented. These provide experimental distributions of wear on tool edges from known processes that are then quantitatively compared to the archaeological patterning of stone point edge damage from three MSA lithic assemblages—Kathu Pan 1, Pinnacle Point Cave 13B, and Die Kelders Cave 1. By using a model-fitting approach, the results presented here provide evidence for variable MSA behavioral strategies of stone point utilization on the landscape consistent with armature tips at KP1, and cutting tools at PP13B and DK1, as well as damage contributions from post-depositional sources across assemblages. This study provides a method with which landscape-scale questions of early modern human tool-use and site-use can be addressed.

Contributors: Schoville, Benjamin J. (Author) / Brown, Kyle S. (Author) / Harris, Jacob (Author) / Wilkins, Jayne (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2016-10-13
Description

Tree-like structures are ubiquitous in nature. In particular, neuronal axons and dendrites have tree-like geometries that mediate electrical signaling within and between cells. Electrical activity in neuronal trees is typically modeled using coupled cable equations on multi-compartment representations, where each compartment represents a small segment of the neuronal membrane. The geometry of each compartment is usually defined as a cylinder or, at best, a surface of revolution based on a linear approximation of the radial change in the neurite. The resulting geometry of the model neuron is coarse, with non-smooth or even discontinuous jumps at the boundaries between compartments. We propose a hyperbolic approximation to model the geometry of neurite compartments, a branched, multi-compartment extension, and a simple graphical approach to calculate steady-state solutions of an associated system of coupled cable equations. A simple case of transient solutions is also briefly discussed.
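For orientation, the simplest steady-state solution of the passive cable equation — a semi-infinite cylindrical compartment, i.e., the coarse baseline that the hyperbolic geometry refines — decays exponentially with the length constant. This is a textbook sketch, not the paper's method, and the parameter values below are hypothetical.

```python
import math

def length_constant(d, r_m, r_a):
    """Length constant lam = sqrt(r_m * d / (4 * r_a)) for a
    cylinder of diameter d, specific membrane resistance r_m
    (ohm * m^2), and axial resistivity r_a (ohm * m)."""
    return math.sqrt(r_m * d / (4.0 * r_a))

def steady_state_voltage(x, v0, lam):
    """Steady-state solution of lam^2 * V'' = V on a semi-infinite
    cable with V(0) = v0 and V -> 0 at infinity: exponential decay."""
    return v0 * math.exp(-x / lam)

# Hypothetical passive parameters for a 2 micrometre neurite
lam = length_constant(d=2e-6, r_m=1.0, r_a=1.0)
print(steady_state_voltage(0.0, 10.0, lam))   # 10.0 at the source
print(steady_state_voltage(lam, 10.0, lam))   # v0 / e one lambda away
```

Multi-compartment models stitch many such segments together and solve the coupled system; the smoother hyperbolic geometry proposed in the paper avoids the discontinuous jumps between cylindrical segments.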

Created: 2014-07-09
Description

We formulate an in silico model of a pathogen avoidance mechanism and investigate its impact on defensive behavioural measures (e.g., spontaneous social exclusion and distancing, crowd avoidance, and voluntary vaccination). In particular, we use an SIR(B)S model (susceptible-infected-recovered with an additional behavioural component) to investigate the impact of homo-psychologicus aspects of epidemics. We focus on reactionary behavioural changes, which apply to both social distancing and voluntary vaccination participation. Our analyses reveal complex relationships between spontaneous and uncoordinated behavioural changes, the emergence of their contagion properties, and the mitigation of infectious diseases. We find that the presence of effective behavioural changes can impede the persistence of disease. Furthermore, under perfectly effective behavioural change, we find three regions in the ρ–α behavioural space defined by the response factor ρ (imitation and/or reactionary) and the behavioural scale factor α (global/local): (1) disease remains endemic even in the presence of behavioural change, (2) behavioural-prevalence plasticity is observed and disease can sometimes be eradicated, and (3) endemic disease is eliminated under permanent behavioural change. These results suggest that preventive behavioural changes (e.g., non-pharmaceutical prophylactic measures, social distancing and exclusion, crowd avoidance) are influenced by individual differences in the perception of risks and are a salient feature of epidemics. Additionally, these findings indicate that care is needed when considering the effect of adaptive behavioural change in predicting the course of epidemics, as well as in the interpretation and development of public health measures that account for spontaneous behavioural changes.
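A minimal numerical sketch of an SIRS-type model with a crude behavioural feedback — prevalence-dependent suppression of the effective transmission rate, standing in for spontaneous social distancing — is given below. This is a hedged illustration only: the paper's SIR(B)S model and its ρ–α behavioural dynamics are more elaborate, and every parameter value here is hypothetical.

```python
def sirs_with_behaviour(beta=0.4, gamma=0.1, xi=0.05, alpha=5.0,
                        i0=0.01, steps=2000, dt=0.1):
    """Euler-stepped SIRS model with a behavioural feedback term:
    beta_eff = beta / (1 + alpha * I) lowers transmission as
    prevalence I rises; xi is the rate of waning immunity (R -> S).
    Returns the (S, I, R) fractions after `steps` time steps."""
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(steps):
        beta_eff = beta / (1.0 + alpha * i)
        new_inf = beta_eff * s * i * dt
        new_rec = gamma * i * dt
        new_sus = xi * r * dt
        s += new_sus - new_inf
        i += new_inf - new_rec
        r += new_rec - new_sus
    return s, i, r

# Behavioural feedback lowers the endemic prevalence...
_, i_behaviour, _ = sirs_with_behaviour()
# ...compared with the same model without it (alpha = 0).
_, i_plain, _ = sirs_with_behaviour(alpha=0.0)
print(i_behaviour < i_plain)   # True
```

In this toy version behavioural change reduces but does not eliminate the endemic level, loosely corresponding to region (1) above; eradication or elimination requires stronger or permanent behavioural responses.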

Created: 2015-10-14
Description

The neuropharmacological effects of psychedelics include profound cognitive, emotional, and social effects that inspired the development of cultures and religions worldwide. Findings that psychedelics objectively and reliably produce mystical experiences press the question of the neuropharmacological mechanisms by which these highly significant experiences are produced by exogenous neurotransmitter analogs. Humans have a long evolutionary relationship with psychedelics, a consequence of psychedelics' selective effects for human cognitive abilities, exemplified in the information-rich visionary experiences. Objective evidence that psychedelics produce classic mystical experiences, coupled with the finding that hallucinatory experiences can be induced by many non-drug mechanisms, illustrates the need for a common model of visionary effects. Several models implicate disturbances of normal regulatory processes in the brain as the underlying mechanisms responsible for the similarities of visionary experiences produced by psychedelic and other methods for altering consciousness. Similarities in psychedelic-induced visionary experiences and those produced by practices such as meditation and hypnosis and pathological conditions such as epilepsy indicate the need for a general model explaining visionary experiences. Common mechanisms underlying diverse alterations of consciousness involve the disruption of normal functions of the prefrontal cortex and default mode network (DMN). This interruption of ordinary control mechanisms allows for the release of thalamic and other lower brain discharges that stimulate a visual information representation system and release the effects of innate cognitive functions and operators.
Converging forms of evidence support the hypothesis that the source of psychedelic experiences involves the emergence of these innate cognitive processes of lower brain systems, with visionary experiences resulting from the activation of innate processes based in the mirror neuron system (MNS).

Created: 2017-09-28
Description

The coastal environments of South Africa’s Cape Floristic Region (CFR) provide some of the earliest and most abundant evidence for the emergence of cognitively modern humans. In particular, the south coast of the CFR provided a uniquely diverse resource base for hunter-gatherers, which included marine shellfish, game, and carbohydrate-bearing plants, especially those with Underground Storage Organs (USOs). It has been hypothesized that these resources underpinned the continuity of human occupation in the region since the Middle Pleistocene. Very little research has been conducted on the foraging potential of carbohydrate resources in the CFR. This study focuses on the seasonal availability of plants with edible carbohydrates at six-weekly intervals over a two-year period in four vegetation types on South Africa’s Cape south coast. Different plant species were considered available to foragers if the edible carbohydrate was directly (i.e. above-ground edible portions) or indirectly (above-ground indications to below-ground edible portions) visible to an expert botanist familiar with this landscape. A total of 52 edible plant species were recorded across all vegetation types. Of these, 33 species were geophytes with edible USOs and 21 species had aboveground edible carbohydrates. Limestone Fynbos had the richest flora, followed by Strandveld, Renosterveld and lastly, Sand Fynbos. The availability of plant species differed across vegetation types and between survey years. The number of available USO species was highest for a six-month period from winter to early summer (Jul–Dec) across all vegetation types. Months of lowest species’ availability were in mid-summer to early autumn (Jan–Apr); the early winter (May–Jun) values were variable, being highest in Limestone Fynbos. However, even during the late summer carbohydrate “crunch,” 25 carbohydrate-bearing species were visible across the four vegetation types.
Establishing a robust resource landscape will require additional spatial mapping of plant species abundances. Nonetheless, our results demonstrate that plant-based carbohydrate resources available to Stone Age foragers of the Cape south coast, especially USOs belonging to the Iridaceae family, are likely to have comprised a reliable and nutritious source of calories over most of the year.

Contributors: De Vynck, Jan C. (Author) / Cowling, Richard M. (Author) / Potts, Alastair J. (Author) / Marean, Curtis (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2016-02-18
Description

Dental microwear has been shown to reflect diet in a broad variety of fossil mammals. Recent studies have suggested that differences in microwear texture attributes between samples may also reflect environmental abrasive loads. Here, we examine dental microwear textures on the incisors of shrews, both to evaluate this idea and to expand the extant baseline to include Soricidae. Specimens were chosen to sample a broad range of environments, semi-desert to rainforest. Species examined were all largely insectivorous, but some are reported to supplement their diets with vertebrate tissues and others with plant matter. Results indicate subtle but significant differences between samples grouped by both diet independent of environment and environment independent of diet. Subtle diet differences were more evident in microwear texture variation considered by habitat (e.g., grassland). These results suggest that while environment does not swamp the diet signal in shrew incisor microwear, studies can benefit from control of habitat type.

Contributors: Withnell, Charles (Author) / Ungar, Peter S. (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2014-08-01