This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.

Description

Stone-tipped weapons were a significant innovation for Middle Pleistocene hominins. Hafted hunting technology represents the development of new cognitive and social learning mechanisms within the genus Homo, and may have provided a foraging advantage over simpler forms of hunting technology, such as a sharpened wooden spear. However, the nature of this foraging advantage has not been confirmed. Experimental studies and ethnographic reports provide conflicting results regarding the relative importance of the functional, economic, and social roles of hafted hunting technology. The controlled experiment reported here was designed to test the functional hypothesis for stone-tipped weapons using spears and ballistics gelatin. It differs from previous investigations of this type because it includes a quantitative analysis of wound track profiles and focuses specifically on hand-delivered spear technology. Our results do not support the hypothesis that tipped spears penetrate deeper than untipped spears. However, tipped spears create a significantly larger inner wound cavity that widens distally. This inner wound cavity is analogous to the permanent wound cavity in ballistics research, which is considered the key variable affecting the relative ‘stopping power’ or ‘killing power’ of a penetrating weapon. Tipped spears conferred a functional advantage to Middle Pleistocene hominins, potentially affecting the frequency and regularity of hunting success with important implications for human adaptation and life history.

Contributors: Wilkins, Jayne (Author) / Schoville, Benjamin (Author) / Brown, Kyle S. (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2014-08-27
Description

In spite of well-documented health benefits of vegetarian diets, less is known regarding the effects of these diets on athletic performance. In this cross-sectional study, we compared elite vegetarian and omnivore adult endurance athletes for maximal oxygen uptake (VO2 max) and strength. Twenty-seven vegetarian (VEG) and 43 omnivore (OMN) athletes were evaluated using VO2 max testing on the treadmill, and strength assessment using a dynamometer to determine peak torque for leg extensions. Dietary data were assessed using detailed seven-day food logs. Although total protein intake was lower among vegetarians in comparison to omnivores, protein intake as a function of body mass did not differ by group (1.2 ± 0.3 and 1.4 ± 0.5 g/kg body mass for VEG and OMN respectively, p = 0.220). VO2 max differed for females by diet group (53.0 ± 6.9 and 47.1 ± 8.6 mL/kg/min for VEG and OMN respectively, p < 0.05) but not for males (62.6 ± 15.4 and 55.7 ± 8.4 mL/kg/min respectively). Peak torque did not differ significantly between diet groups. Results from this study indicate that vegetarian endurance athletes’ cardiorespiratory fitness was greater than that for their omnivorous counterparts, but that peak torque did not differ between diet groups. These data suggest that vegetarian diets do not compromise performance outcomes and may facilitate aerobic capacity in athletes.

Contributors: Lynch, Heidi (Author) / Wharton, Christopher (Author) / Johnston, Carol (Author) / College of Health Solutions (Contributor)
Created: 2016-11-15
Description

The Middle Stone Age (MSA) is associated with early evidence for symbolic material culture and complex technological innovations. However, among the most visible aspects of MSA technologies are unretouched triangular stone points that appear in the archaeological record as early as 500,000 years ago in Africa and persist throughout the MSA. How these tools were being used and discarded across a changing Pleistocene landscape can provide insight into how MSA populations prioritized technological and foraging decisions. Creating inferential links between experimental and archaeological tool use helps to establish prehistoric tool function, but is complicated by the overlaying of post-depositional damage onto behaviorally worn tools. Taphonomic damage patterning can provide insight into site formation history, but may preclude behavioral interpretations of tool function. Here, multiple experimental processes that form edge damage on unretouched lithic points, both taphonomic and behavioral, are presented. These provide experimental distributions of wear on tool edges from known processes that are then quantitatively compared to the archaeological patterning of stone point edge damage from three MSA lithic assemblages—Kathu Pan 1, Pinnacle Point Cave 13B, and Die Kelders Cave 1. By using a model-fitting approach, the results presented here provide evidence for variable MSA behavioral strategies of stone point utilization on the landscape consistent with armature tips at KP1, and cutting tools at PP13B and DK1, as well as damage contributions from post-depositional sources across assemblages. This study provides a method with which landscape-scale questions of early modern human tool-use and site-use can be addressed.

Contributors: Schoville, Benjamin J. (Author) / Brown, Kyle S. (Author) / Harris, Jacob (Author) / Wilkins, Jayne (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2016-10-13
Description

We formulate an in silico model of a pathogen-avoidance mechanism and investigate its impact on defensive behavioural measures (e.g., spontaneous social exclusion and distancing, crowd avoidance, and voluntary vaccination uptake). In particular, we use an SIR(B)S model (i.e., susceptible-infected-recovered with an additional behavioural component) to investigate the homo psychologicus aspects of epidemics. We focus on reactionary behavioural changes, which apply to both social distancing and voluntary vaccination participation. Our analyses reveal complex relationships between spontaneous and uncoordinated behavioural changes, the emergence of their contagion properties, and the mitigation of infectious diseases. We find that the presence of effective behavioural changes can impede the persistence of disease. Furthermore, under perfectly effective behavioural change, three regions emerge in the ρ–α behavioural space defined by the response factor ρ (imitation and/or reaction) and the behavioural scale factor α (global/local): (1) disease remains endemic even in the presence of behavioural change, (2) behavioural-prevalence plasticity is observed and disease can sometimes be eradicated, and (3) endemic disease is eliminated under permanent behavioural change. These results suggest that preventive behavioural changes (e.g., non-pharmaceutical prophylactic measures, social distancing and exclusion, crowd avoidance) are influenced by individual differences in the perception of risks and are a salient feature of epidemics. Additionally, these findings indicate that care is needed when considering the effect of adaptive behavioural change in predicting the course of epidemics, as well as in the interpretation and development of public health measures that account for spontaneous behavioural changes.
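The coupled disease–behaviour dynamics this abstract describes can be illustrated with a minimal SIRS-type sketch in which transmission is damped by a prevalence-driven behavioural response. The feedback form, parameter names, and all values below are illustrative assumptions, not the authors' SIR(B)S formulation.

```python
# Minimal SIRS model with a prevalence-driven behavioural response.
# The damping term and all parameter values are assumptions for this
# sketch, not the article's exact SIR(B)S model.

def sirs_behaviour(beta=0.5, gamma=0.1, xi=0.05, rho=20.0,
                   days=365.0, dt=0.1):
    """Euler-integrate an SIRS model where transmission is reduced by
    avoidance behaviour that grows with current prevalence."""
    s, i, r = 0.999, 0.001, 0.0
    history = []
    for step in range(int(days / dt)):
        # More infection -> more avoidance -> lower effective transmission.
        behaviour = 1.0 / (1.0 + rho * i)
        new_inf = beta * behaviour * s * i
        ds = -new_inf + xi * r          # waning immunity returns R to S
        di = new_inf - gamma * i
        dr = gamma * i - xi * r
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
        history.append((step * dt, s, i, r))
    return history

traj = sirs_behaviour()
peak_prev = max(i for _, _, i, _ in traj)
```

Increasing the assumed response strength `rho` flattens the epidemic peak, which is one way a model of this family can exhibit the behaviour-dependent regimes the abstract discusses.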

Created: 2015-10-14
Description

The neuropharmacological actions of psychedelics have profound cognitive, emotional, and social effects that inspired the development of cultures and religions worldwide. Findings that psychedelics objectively and reliably produce mystical experiences press the question of the neuropharmacological mechanisms by which these highly significant experiences are produced by exogenous neurotransmitter analogs. Humans have a long evolutionary relationship with psychedelics, a consequence of psychedelics' selective effects for human cognitive abilities, exemplified in the information-rich visionary experiences. Objective evidence that psychedelics produce classic mystical experiences, coupled with the finding that hallucinatory experiences can be induced by many non-drug mechanisms, illustrates the need for a common model of visionary effects. Several models implicate disturbances of normal regulatory processes in the brain as the underlying mechanisms responsible for the similarities of visionary experiences produced by psychedelic and other methods for altering consciousness. Similarities in psychedelic-induced visionary experiences and those produced by practices such as meditation and hypnosis and pathological conditions such as epilepsy indicate the need for a general model explaining visionary experiences. Common mechanisms underlying diverse alterations of consciousness involve the disruption of normal functions of the prefrontal cortex and default mode network (DMN). This interruption of ordinary control mechanisms allows for the release of thalamic and other lower brain discharges that stimulate a visual information representation system and release the effects of innate cognitive functions and operators.
Converging forms of evidence support the hypothesis that the source of psychedelic experiences involves the emergence of these innate cognitive processes of lower brain systems, with visionary experiences resulting from the activation of innate processes based in the mirror neuron system (MNS).

Created: 2017-09-28
Description

The coastal environments of South Africa’s Cape Floristic Region (CFR) provide some of the earliest and most abundant evidence for the emergence of cognitively modern humans. In particular, the south coast of the CFR provided a uniquely diverse resource base for hunter-gatherers, which included marine shellfish, game, and carbohydrate-bearing plants, especially those with Underground Storage Organs (USOs). It has been hypothesized that these resources underpinned the continuity of human occupation in the region since the Middle Pleistocene. Very little research has been conducted on the foraging potential of carbohydrate resources in the CFR. This study focuses on the seasonal availability of plants with edible carbohydrates at six-weekly intervals over a two-year period in four vegetation types on South Africa’s Cape south coast. Different plant species were considered available to foragers if the edible carbohydrate was directly (i.e. above-ground edible portions) or indirectly (above-ground indications to below-ground edible portions) visible to an expert botanist familiar with this landscape. A total of 52 edible plant species were recorded across all vegetation types. Of these, 33 species were geophytes with edible USOs and 21 species had aboveground edible carbohydrates. Limestone Fynbos had the richest flora, followed by Strandveld, Renosterveld and lastly, Sand Fynbos. The availability of plant species differed across vegetation types and between survey years. The number of available USO species was highest for a six-month period from winter to early summer (Jul–Dec) across all vegetation types. Months of lowest species’ availability were in mid-summer to early autumn (Jan–Apr); the early winter (May–Jun) values were variable, being highest in Limestone Fynbos. However, even during the late summer carbohydrate “crunch,” 25 carbohydrate bearing species were visible across the four vegetation types. 
Establishing a robust resource landscape will require additional spatial mapping of plant species abundances. Nonetheless, our results demonstrate that plant-based carbohydrate resources available to Stone Age foragers of the Cape south coast, especially USOs belonging to the Iridaceae family, are likely to have comprised a reliable and nutritious source of calories over most of the year.

Contributors: De Vynck, Jan C. (Author) / Cowling, Richard M. (Author) / Potts, Alastair J. (Author) / Marean, Curtis (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2016-02-18
Description

Dental microwear has been shown to reflect diet in a broad variety of fossil mammals. Recent studies have suggested that differences in microwear texture attributes between samples may also reflect environmental abrasive loads. Here, we examine dental microwear textures on the incisors of shrews, both to evaluate this idea and to expand the extant baseline to include Soricidae. Specimens were chosen to sample a broad range of environments, from semi-desert to rainforest. Species examined were all largely insectivorous, but some are reported to supplement their diets with vertebrate tissues and others with plant matter. Results indicate subtle but significant differences between samples grouped by both diet independent of environment and environment independent of diet. Subtle dietary differences were more evident when microwear texture variation was considered by habitat type (e.g., grassland). These results suggest that while environment does not swamp the diet signal in shrew incisor microwear, studies can benefit from control of habitat type.

Contributors: Withnell, Charles (Author) / Ungar, Peter S. (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2014-08-01
Description

Background: Peanut consumption favorably influences satiety. This study examined the acute effect of peanut versus grain bar preloads on postmeal satiety and glycemia in healthy adults and the long-term effect of these meal preloads on body mass in healthy overweight adults.

Methods: In the acute crossover trial (n = 15; 28.4 ± 2.9 y; 23.1 ± 0.9 kg/m2), the preload (isoenergetic peanut or grain bar with water, or water alone) was followed after 60 min with ingestion of a standardized glycemic test meal. Satiety and blood glucose were assessed immediately prior to the preload and to the test meal, and for two hours postmeal at 30-min intervals. In the parallel-arm, randomized trial (n = 44; 40.5 ± 1.6 y, 31.8 ± 0.9 kg/m2), the peanut or grain bar preload was consumed one hour prior to the evening meal for eight weeks. Body mass was measured at 2-week intervals, and secondary endpoints included blood hemoglobin A1c and energy intake as assessed by 3-d diet records collected at pre-trial and trial weeks 1 and 8.

Results: Satiety was elevated in the postprandial period following grain bar ingestion in comparison to peanut or water ingestion (p = 0.001, repeated-measures ANOVA). Blood glucose was elevated one hour after ingestion of the grain bar as compared to the peanut or water treatments; yet, total glycemia did not vary between treatments in the two hour postprandial period. In the 8-week trial, body mass was reduced for the grain bar versus peanut groups after eight weeks (−1.3 ± 0.4 kg versus −0.2 ± 0.3 kg, p = 0.033, analysis of covariance). Energy intake was reduced by 458 kcal/d in the first week of the trial for the grain bar group as compared to the peanut group (p = 0.118). Hemoglobin A1c changed significantly between groups during the trial (−0.25 ± 0.07% and −0.18 ± 0.12% for the grain bar and peanut groups respectively, p = 0.001).

Conclusions: Compared to an isoenergetic peanut preload, consumption of a grain bar preload one hour prior to a standardized meal significantly raised postmeal satiety. Moreover, consumption of the grain bar prior to the evening meal was associated with significant weight loss over time suggesting that glycemic carbohydrate ingestion prior to meals may be a weight management strategy.

Contributors: Johnston, Carol (Author) / Trier, Catherine (Author) / Fleming, Katie (Author) / College of Health Solutions (Contributor)
Created: 2013-03-27
Description

Background: Height is an important health assessment measure with many applications. In medical practice and in research settings, height is typically measured with a stadiometer. Although lasers are commonly used by health professionals for measurements including facial imaging, corneal thickness, and limb length, they have not been utilized for measuring height. The purpose of this feasibility study was to examine the ease and accuracy of a laser device for measuring height in children and adults.

Findings: In immediate succession, participant height was measured in triplicate using a stadiometer followed by the laser device. Measurement error for the laser device was significantly higher than that for the stadiometer (0.35 and 0.20 cm respectively). However, the measurement techniques were highly correlated (r2 = 0.998 and 0.990 for the younger [<12 y, n = 25] and older [≥12 y, n = 100] participants respectively), and the estimated reliability between measurement techniques was 0.999 (ICC; 95% CI: 0.998, 1.000) and 0.995 (ICC; 95% CI: 0.993, 0.997) for the younger and older groups respectively. The average differences between the two techniques (i.e., stadiometer minus laser) were significantly different from zero: +0.93 and +0.45 cm for the younger and older groups respectively.

Conclusions: These data demonstrate that laser technology can be adapted to measure height in children and adults. Although refinement is needed, the laser device for measuring height merits further development.

Contributors: Mayol-Kreiser, Sandra (Author) / Garcia-Turner, Vanessa (Author) / Johnston, Carol (Author) / College of Health Solutions (Contributor)
Created: 2015-08-31
Description

Background: Highly refined surveillance data on the 2009 A/H1N1 influenza pandemic are crucial to quantify the spatial and temporal characteristics of the pandemic. There is little information about the spatial-temporal dynamics of pandemic influenza in South America. Here we provide a quantitative description of the age-specific pandemic morbidity patterns across administrative areas of Peru.

Methods: We used daily cases of influenza-like-illness, tests for A/H1N1 influenza virus infections, and laboratory-confirmed A/H1N1 influenza cases reported to the epidemiological surveillance system of Peru's Ministry of Health from May 1 to December 31, 2009. We analyzed the geographic spread of the pandemic waves and their association with the winter school vacation period, demographic factors, and absolute humidity. We also estimated the reproduction number and quantified the association between the winter school vacation period and the age distribution of cases.

Results: The national pandemic curve revealed a bimodal winter pandemic wave, with the first peak limited to school age children in the Lima metropolitan area, and the second peak more geographically widespread. The reproduction number was estimated at 1.6–2.2 for the Lima metropolitan area and 1.3–1.5 in the rest of Peru. We found a significant association between the timing of the school vacation period and changes in the age distribution of cases, while earlier pandemic onset was correlated with large population size. By contrast there was no association between pandemic dynamics and absolute humidity.

Conclusions: Our results indicate substantial spatial variation in pandemic patterns across Peru, with two pandemic waves of varying timing and impact by age and region. Moreover, the Peru data suggest a hierarchical transmission pattern of pandemic influenza A/H1N1 driven by large population centers. The higher reproduction number of the first pandemic wave could be explained by high contact rates among school-age children, the age group most affected during this early wave.
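One common textbook way to turn an early epidemic curve into a reproduction number, broadly of the kind estimated in this study though not necessarily the authors' procedure, is to fit an exponential growth rate to early case counts and combine it with an assumed mean generation interval. The case counts and the ~3-day generation interval below are illustrative assumptions.

```python
# Illustrative reproduction-number estimate from early exponential growth.
# Synthetic early-wave counts and an assumed generation interval; this is
# a textbook approximation, not the article's estimation method.
import math

daily_cases = [10, 12, 14, 17, 20, 24, 29, 35]  # synthetic counts
generation_interval = 3.0                        # days, assumed

# Least-squares slope of log(cases) vs. day gives the growth rate.
n = len(daily_cases)
xs = range(n)
ys = [math.log(c) for c in daily_cases]
x_mean = sum(xs) / n
y_mean = sum(ys) / n
growth = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
          / sum((x - x_mean) ** 2 for x in xs))

# Simple approximation: R ~ exp(growth rate * generation interval).
R = math.exp(growth * generation_interval)
```

With these synthetic counts the estimate lands in the 1.5–1.9 range, of the same order as the 1.3–2.2 values reported for Peru; real analyses additionally account for the generation-interval distribution and reporting noise.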

Created: 2011-06-21