This growing collection consists of scholarly works authored by ASU-affiliated faculty, staff, and community members, and it contains many open access articles. ASU-affiliated authors are encouraged to Share Your Work in KEEP.

Description

Stone-tipped weapons were a significant innovation for Middle Pleistocene hominins. Hafted hunting technology represents the development of new cognitive and social learning mechanisms within the genus Homo, and may have provided a foraging advantage over simpler forms of hunting technology, such as a sharpened wooden spear. However, the nature of this foraging advantage has not been confirmed. Experimental studies and ethnographic reports provide conflicting results regarding the relative importance of the functional, economic, and social roles of hafted hunting technology. The controlled experiment reported here was designed to test the functional hypothesis for stone-tipped weapons using spears and ballistics gelatin. It differs from previous investigations of this type because it includes a quantitative analysis of wound track profiles and focuses specifically on hand-delivered spear technology. Our results do not support the hypothesis that tipped spears penetrate deeper than untipped spears. However, tipped spears create a significantly larger inner wound cavity that widens distally. This inner wound cavity is analogous to the permanent wound cavity in ballistics research, which is considered the key variable affecting the relative ‘stopping power’ or ‘killing power’ of a penetrating weapon. Tipped spears conferred a functional advantage to Middle Pleistocene hominins, potentially affecting the frequency and regularity of hunting success with important implications for human adaptation and life history.
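
One way to see what a quantitative analysis of wound track profiles can involve is sketched below: each trial's cavity-width profile is reduced to a single summary (profile area) and the two spear types are compared statistically. All numbers and the measurement grid are synthetic placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical wound-track profiles: cavity width (mm) measured at fixed
# depth increments in ballistics gelatin. Values are illustrative only.
rng = np.random.default_rng(0)
depth = np.linspace(0, 200, 21)     # mm of penetration
n_trials = 12
# Assume tipped spears cut a cavity that widens distally, while untipped
# spears cut a narrower, roughly uniform one.
tipped = 18 + 0.04 * depth + rng.normal(0, 1.5, (n_trials, depth.size))
untipped = 15 + 0.00 * depth + rng.normal(0, 1.5, (n_trials, depth.size))

# Summarize each trial by the area of its inner-cavity profile
# (width summed over evenly spaced depth steps), then compare groups.
step = depth[1] - depth[0]
tipped_area = tipped.sum(axis=1) * step
untipped_area = untipped.sum(axis=1) * step
print(ttest_ind(tipped_area, untipped_area))
```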

Contributors: Wilkins, Jayne (Author) / Schoville, Benjamin (Author) / Brown, Kyle S. (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2014-08-27
Description

Background: Increasing our understanding of the factors affecting the severity of the 2009 A/H1N1 influenza pandemic in different regions of the world could lead to improved clinical practice and mitigation strategies for future influenza pandemics. Even though a number of studies have shed light on the risk factors associated with severe outcomes of 2009 A/H1N1 influenza infections in different populations (e.g., [1-5]), analyses of the determinants of mortality risk spanning multiple pandemic waves and geographic regions are scarce. Between-country differences in the mortality burden of the 2009 pandemic could be linked to differences in influenza case management, underlying population health, or intrinsic differences in disease transmission [6]. Additional studies elucidating the determinants of disease severity globally are warranted to guide prevention efforts in future influenza pandemics.

In Mexico, the 2009 A/H1N1 influenza pandemic was characterized by a three-wave pattern occurring in the spring, summer, and fall of 2009 with substantial geographical heterogeneity [7]. A recent study suggests that Mexico experienced high excess mortality burden during the 2009 A/H1N1 influenza pandemic relative to other countries [6]. However, an assessment of the potential factors that contributed to the relatively high pandemic death toll in Mexico is lacking. Here, we fill this gap by analyzing a large series of laboratory-confirmed A/H1N1 influenza cases, hospitalizations, and deaths monitored by the Mexican Social Security medical system in Mexico from April 1 through December 31, 2009. In particular, we quantify the association between disease severity, hospital admission delays, and neuraminidase inhibitor use by demographic characteristics, pandemic wave, and geographic regions of Mexico.

Methods: We analyzed a large series of laboratory-confirmed pandemic A/H1N1 influenza cases from a prospective surveillance system maintained by the Mexican Social Security system, April-December 2009. We considered a spectrum of disease severity encompassing outpatient visits, hospitalizations, and deaths, and recorded demographic and geographic information on individual patients. We assessed the impact of neuraminidase inhibitor treatment and hospital admission delay (≤2 vs. >2 days after disease onset) on the risk of death by multivariate logistic regression.
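
A minimal sketch of the kind of multivariate logistic regression described above, using statsmodels; the column names and simulated records are hypothetical stand-ins for the surveillance data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the case series; all columns are hypothetical.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "antiviral": rng.integers(0, 2, n),       # neuraminidase inhibitor given
    "late_admission": rng.integers(0, 2, n),  # >2 days from onset to admission
    "age_group": rng.choice(["0-14", "15-59", "60+"], n),
    "male": rng.integers(0, 2, n),
    "region": rng.choice(["north", "center", "south"], n),
})
true_logit = -4.0 - 0.65 * df["antiviral"] + 1.03 * df["late_admission"]
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

# Adjusted odds ratios for death, mirroring the covariates named above.
fit = smf.logit("died ~ antiviral + late_admission + C(age_group) + male"
                " + C(region)", data=df).fit(disp=False)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```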

Results: Approximately 50% of all A/H1N1-positive patients received antiviral medication during the spring and summer 2009 pandemic waves in Mexico, while only 9% of A/H1N1 cases received antiviral medications during the fall wave (P < 0.0001). After adjustment for age, gender, and geography, antiviral treatment significantly reduced the risk of death (OR = 0.52 (95% CI: 0.30, 0.90)), while longer hospital admission delays increased the risk of death by 2.8-fold (95% CI: 2.25, 3.41).

Conclusions: Our findings underscore the potential impact of decreasing admission delays and increasing antiviral use to mitigate the mortality burden of future influenza pandemics.

Created: 2012-04-20
Description

The Middle Stone Age (MSA) is associated with early evidence for symbolic material culture and complex technological innovations. However, among the most visible aspects of MSA technologies are unretouched triangular stone points, which appear in the archaeological record as early as 500,000 years ago in Africa and persist throughout the MSA. How these tools were used and discarded across a changing Pleistocene landscape can provide insight into how MSA populations prioritized technological and foraging decisions. Creating inferential links between experimental and archaeological tool use helps to establish prehistoric tool function, but is complicated by the overlaying of post-depositional damage onto behaviorally worn tools. Taphonomic damage patterning can provide insight into site formation history, but may preclude behavioral interpretations of tool function. Here, we present multiple experimental processes, both taphonomic and behavioral, that form edge damage on unretouched lithic points. These provide experimental distributions of wear on tool edges from known processes that are then quantitatively compared to the archaeological patterning of stone point edge damage from three MSA lithic assemblages—Kathu Pan 1, Pinnacle Point Cave 13B, and Die Kelders Cave 1. By using a model-fitting approach, the results presented here provide evidence for variable MSA behavioral strategies of stone point utilization on the landscape, consistent with armature tips at KP1 and cutting tools at PP13B and DK1, as well as damage contributions from post-depositional sources across assemblages. This study provides a method with which landscape-scale questions of early modern human tool use and site use can be addressed.
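
The model-fitting approach can be illustrated with a small sketch: given normalized edge-damage profiles for known experimental processes, non-negative least squares recovers the mixture of processes that best reproduces an archaeological pattern. The profiles below are invented for illustration and are not the study's data or method details.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical damage-frequency profiles over 10 edge segments (tip -> base),
# one per known process, each normalized to sum to 1.
segments = 10
tip_use = np.linspace(1.0, 0.1, segments)  # impact damage concentrated at tip
cutting = np.full(segments, 0.5)
cutting[0] = 0.1                           # cutting spares the very tip
trampling = np.ones(segments)              # diffuse post-depositional damage
A = np.column_stack([p / p.sum() for p in (tip_use, cutting, trampling)])

# Pretend archaeological pattern: mostly tip use plus taphonomic damage.
archaeological = 0.6 * A[:, 0] + 0.4 * A[:, 2]

# Find the non-negative mixture of known processes that best fits it.
weights, residual = nnls(A, archaeological)
print(dict(zip(["tip use", "cutting", "trampling"], weights.round(2))))
```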

Contributors: Schoville, Benjamin J. (Author) / Brown, Kyle S. (Author) / Harris, Jacob (Author) / Wilkins, Jayne (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2016-10-13
Description

We formulate an in silico model of a pathogen avoidance mechanism and investigate its impact on defensive behavioural measures (e.g., spontaneous social exclusion and distancing, crowd avoidance, and voluntary vaccination). In particular, we use an SIR(B)S model (susceptible-infected-recovered-susceptible with an additional behavioural component) to investigate the impact of homo psychologicus aspects of epidemics. We focus on reactionary behavioural changes, which apply to both social distancing and voluntary vaccination participation. Our analyses reveal complex relationships between spontaneous and uncoordinated behavioural changes, the emergence of their contagion properties, and the mitigation of infectious diseases. We find that the presence of effective behavioural changes can impede the persistence of disease. Furthermore, under perfectly effective behavioural change, we find three regions in the ρ–α behavioural space spanned by the response factor ρ (imitation and/or reaction) and the behavioural scale factor α (global vs. local): (1) the disease remains endemic even in the presence of behavioural change; (2) behavioural-prevalence plasticity is observed and the disease can sometimes be eradicated; and (3) the endemic disease is eliminated under permanent behavioural change. These results suggest that preventive behavioural changes (e.g., non-pharmaceutical prophylactic measures, social distancing and exclusion, crowd avoidance) are influenced by individual differences in the perception of risk and are a salient feature of epidemics. Additionally, these findings indicate that care must be taken when considering the effect of adaptive behavioural change in predicting the course of epidemics, as well as in interpreting and developing public health measures that account for spontaneous behavioural changes.
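
A minimal sketch of an SIRS-type model with a prevalence-dependent behavioural damping of transmission, in the spirit of the SIR(B)S model described above; the functional forms and parameter values are illustrative assumptions, not the published model.

```python
from scipy.integrate import solve_ivp

# Illustrative parameters (all assumed, not taken from the paper).
beta, gamma, xi = 0.30, 0.10, 0.02   # transmission, recovery, waning immunity
rho, alpha = 5.0, 0.8                # response factor, behavioural scale factor

def sirbs(t, y):
    s, i, r = y
    # Behavioural damping: transmission falls as perceived prevalence rises,
    # scaled by how strongly (rho) and how widely (alpha) people react.
    damping = rho * alpha * i / (1.0 + rho * alpha * i)
    new_infections = beta * (1.0 - damping) * s * i
    return [-new_infections + xi * r,
            new_infections - gamma * i,
            gamma * i - xi * r]

sol = solve_ivp(sirbs, (0.0, 1000.0), [0.99, 0.01, 0.0], max_step=1.0)
print(f"prevalence at t=1000: {sol.y[1, -1]:.4f}")  # endemic vs. eliminated
```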

Created: 2015-10-14
Description

Periodicities (repeating patterns) are observed in many human behaviors. Their strength may capture untapped patterns that incorporate sleep, sedentary, and active behaviors into a single metric indicative of better health. We present a framework to detect periodicities from longitudinal wrist-worn accelerometry data. GENEActiv accelerometer data were collected from 20 participants (17 men, 3 women, aged 35–65) continuously for 64.4±26.2 (range: 13.9 to 102.0) consecutive days. Cardiometabolic risk biomarkers and health-related quality of life metrics were assessed at baseline. Periodograms were constructed to determine patterns emergent from the accelerometer data. Periodicity strength was calculated using circular autocorrelations for time-lagged windows. The most notable periodicity was at 24 h, indicating a circadian rest-activity cycle; however, its strength varied significantly across participants. Periodicity strength was most consistently associated with LDL-cholesterol (r’s = 0.40–0.79, P’s < 0.05) and triglycerides (r’s = 0.68–0.86, P’s < 0.05), but it was also associated with hs-CRP and health-related quality of life, even after adjusting for demographics and self-rated physical activity and insomnia symptoms. Our framework demonstrates a new method for characterizing behavior patterns longitudinally, one that captures relationships between 24 h accelerometry data and health outcomes.
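
The two core computations, a periodogram to locate dominant periodicities and a circular autocorrelation to quantify their strength at a chosen lag, can be sketched as follows on synthetic minute-level counts; the real pipeline's preprocessing and time-lagged windowing are omitted.

```python
import numpy as np

def circular_autocorr(x, lag):
    """Circular autocorrelation of series x at a given lag (in samples)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x, np.roll(x, lag)) / np.dot(x, x))

# Synthetic minute-level activity counts with a 24 h rhythm plus noise.
rng = np.random.default_rng(0)
t = np.arange(14 * 1440)                  # 14 days of 1-min epochs
counts = 50 + 40 * np.sin(2 * np.pi * t / 1440) + rng.normal(0, 20, t.size)

# Periodogram: spectral power by frequency, to locate dominant periodicities.
power = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0)    # cycles per minute
peak = np.argmax(power[1:]) + 1           # skip the DC component
print(f"dominant period: {1 / freqs[peak] / 60:.1f} h")               # ~24 h
print(f"strength at 24 h lag: {circular_autocorr(counts, 1440):.2f}")
```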

Contributors: Buman, Matthew (Author) / Hu, Feiyan (Author) / Newman, Eamonn (Author) / Smeaton, Alan F. (Author) / Epstein, Dana R. (Author) / College of Health Solutions (Contributor)
Created: 2016-01-04
Description

Background: Falls are a major public health concern in older adults. Recent fall prevention guidelines recommend the use of multifactorial fall prevention programs (FPPs) that include exercise for community-dwelling older adults; however, the availability of sustainable, community-based FPPs is limited.

Methods: We conducted a 24-week quasi-experimental study to evaluate the efficacy of a community-based, multifactorial FPP [Stay in Balance (SIB)] on dynamic and functional balance and muscular strength. The SIB program was delivered by allied health students and included a health education program focused on fall risk factors and a progressive exercise program emphasizing lower-extremity strength and balance. All participants initially received the 12-week SIB program, and participants were non-randomly assigned at baseline to either continue the SIB exercise program at home or as a center-based program for an additional 12 weeks. Adults aged 60 and older (n = 69) who were at risk of falling (fall history or 2+ fall risk factors) were recruited to participate. Mixed-effects repeated-measures models (SAS PROC MIXED) were used to examine group, time, and group-by-time effects on dynamic balance (8-Foot Up and Go), functional balance (Berg Balance Scale), and muscular strength (30 s chair stands and 30 s arm curls). Non-normally distributed outcome variables were log-transformed.
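
A rough Python analogue of this repeated-measures analysis, fitting a random intercept per participant with statsmodels' MixedLM; the study used SAS PROC MIXED, and the synthetic data and variable names here are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the study data: one row per participant per
# assessment (T0, T1, T2); all values and names are invented.
rng = np.random.default_rng(0)
rows = []
for pid in range(69):
    group = rng.choice(["home", "center"])
    base = rng.normal(8.0, 1.5)          # baseline 8-Foot Up and Go (s)
    for t in (0, 1, 2):
        rows.append({"participant_id": pid, "time": t, "group": group,
                     "age": float(60 + rng.integers(0, 25)),
                     "gender": rng.choice(["F", "M"]),
                     "bmi": rng.normal(28.0, 4.0),
                     "up_and_go": max(base - 0.6 * t + rng.normal(0, 0.8), 3.0)})
df = pd.DataFrame(rows)
df["log_up_and_go"] = np.log(df["up_and_go"])  # non-normal outcomes were logged

# Group, time, and group-by-time effects with a per-participant random
# intercept (an approximation of the PROC MIXED specification).
fit = smf.mixedlm("log_up_and_go ~ C(time) * C(group) + age + gender + bmi",
                  data=df, groups=df["participant_id"]).fit()
print(fit.summary())
```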

Results: After adjusting for age, gender, and body mass index, 8-Foot Up and Go scores improved significantly over time [F(2,173) = 8.92, p = 0.0; T0 − T2 diff = 1.2 (1.0)]. Berg Balance Scores [F(2,173) = 29.0, p < 0.0001; T0 − T2 diff = 4.96 (0.72)], chair stands [F(2,171) = 10.17, p < 0.0001; T0 − T2 diff = 3.1 (0.7)], and arm curls [F(2,171) = 12.7, p < 0.02; T0 − T2 diff = 2.7 (0.6)] also all improved significantly over time. There were no significant group-by-time effects observed for any of the outcomes.

Conclusion: The SIB program improved dynamic and functional balance and muscular strength in older adults at risk of falling. Our findings indicate that continuing strength and balance exercises at home after completing a center-based FPP may be an effective and feasible way to maintain improvements in balance and strength.

Contributors: Der Ananian, Cheryl (Author) / Mitros, Melanie (Author) / Buman, Matthew (Author) / College of Health Solutions (Contributor)
Created: 2017-02-27
Description

The neuropharmacological effects of psychedelics include profound cognitive, emotional, and social changes that inspired the development of cultures and religions worldwide. Findings that psychedelics objectively and reliably produce mystical experiences raise the question of the neuropharmacological mechanisms by which these highly significant experiences are produced by exogenous neurotransmitter analogs. Humans have a long evolutionary relationship with psychedelics, a consequence of psychedelics' selective effects on human cognitive abilities, exemplified in information-rich visionary experiences. Objective evidence that psychedelics produce classic mystical experiences, coupled with the finding that hallucinatory experiences can be induced by many non-drug mechanisms, illustrates the need for a common model of visionary effects. Several models implicate disturbances of normal regulatory processes in the brain as the underlying mechanisms responsible for the similarities of visionary experiences produced by psychedelic and other methods for altering consciousness. Similarities between psychedelic-induced visionary experiences and those produced by practices such as meditation and hypnosis, and by pathological conditions such as epilepsy, indicate the need for a general model explaining visionary experiences. Common mechanisms underlying diverse alterations of consciousness involve the disruption of normal functions of the prefrontal cortex and default mode network (DMN). This interruption of ordinary control mechanisms allows the release of thalamic and other lower-brain discharges that stimulate a visual information representation system and release the effects of innate cognitive functions and operators. Converging forms of evidence support the hypothesis that the source of psychedelic experiences involves the emergence of these innate cognitive processes of lower brain systems, with visionary experiences resulting from the activation of innate processes based in the mirror neuron system (MNS).

Created: 2017-09-28
Description

The coastal environments of South Africa’s Cape Floristic Region (CFR) provide some of the earliest and most abundant evidence for the emergence of cognitively modern humans. In particular, the south coast of the CFR provided a uniquely diverse resource base for hunter-gatherers, which included marine shellfish, game, and carbohydrate-bearing plants, especially those with Underground Storage Organs (USOs). It has been hypothesized that these resources underpinned the continuity of human occupation in the region since the Middle Pleistocene. Very little research has been conducted on the foraging potential of carbohydrate resources in the CFR. This study focuses on the seasonal availability of plants with edible carbohydrates at six-weekly intervals over a two-year period in four vegetation types on South Africa’s Cape south coast. Different plant species were considered available to foragers if the edible carbohydrate was directly (i.e. above-ground edible portions) or indirectly (above-ground indications of below-ground edible portions) visible to an expert botanist familiar with this landscape. A total of 52 edible plant species were recorded across all vegetation types. Of these, 33 species were geophytes with edible USOs and 21 species had above-ground edible carbohydrates. Limestone Fynbos had the richest flora, followed by Strandveld, Renosterveld, and lastly Sand Fynbos. The availability of plant species differed across vegetation types and between survey years. The number of available USO species was highest for a six-month period from winter to early summer (Jul–Dec) across all vegetation types. Months of lowest species availability were in mid-summer to early autumn (Jan–Apr); the early winter (May–Jun) values were variable, being highest in Limestone Fynbos. However, even during the late summer carbohydrate “crunch,” 25 carbohydrate-bearing species were visible across the four vegetation types. Establishing a robust resource landscape will require additional spatial mapping of plant species abundances. Nonetheless, our results demonstrate that the plant-based carbohydrate resources available to Stone Age foragers of the Cape south coast, especially USOs belonging to the Iridaceae family, are likely to have comprised a reliable and nutritious source of calories over most of the year.

Contributors: De Vynck, Jan C. (Author) / Cowling, Richard M. (Author) / Potts, Alastair J. (Author) / Marean, Curtis (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2016-02-18
Description

Dental microwear has been shown to reflect diet in a broad variety of fossil mammals. Recent studies have suggested that differences in microwear texture attributes between samples may also reflect environmental abrasive loads. Here, we examine dental microwear textures on the incisors of shrews, both to evaluate this idea and to expand the extant baseline to include Soricidae. Specimens were chosen to sample a broad range of environments, from semi-desert to rainforest. The species examined were all largely insectivorous, but some are reported to supplement their diets with vertebrate tissues and others with plant matter. Results indicate subtle but significant differences between samples grouped by both diet independent of environment and environment independent of diet. Subtle dietary differences were more evident when microwear texture variation was considered within a habitat type (e.g., grassland). These results suggest that while environment does not swamp the diet signal in shrew incisor microwear, studies can benefit from controlling for habitat type.

Contributors: Withnell, Charles (Author) / Ungar, Peter S. (Author) / School of Human Evolution and Social Change (Contributor)
Created: 2014-08-01
Description

Background: Little research has explored who responds better to an automated vs. human advisor for health behaviors in general, and for physical activity (PA) promotion in particular. The purpose of this study was to explore baseline factors (i.e., demographics, motivation, interpersonal style, and external resources) that moderate the efficacy of interventions delivered by either a human or an automated advisor.

Methods: Data were from the CHAT Trial, a 12-month randomized controlled trial to increase PA among underactive older adults (full trial N = 218) via a human advisor or an automated interactive voice response advisor. Trial results indicated significant increases in PA in both interventions by 12 months that were maintained at 18 months. Regression analyses were used to explore moderators of the two interventions' effects.
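
Moderation of this kind is typically tested as a treatment-by-baseline-factor interaction in a regression adjusted for baseline PA; below is a minimal sketch with synthetic data and hypothetical column names, not the trial's actual analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the trial data; every column name is hypothetical.
rng = np.random.default_rng(1)
n = 218
arm = rng.choice(["human", "automated"], size=n)
amotivation = rng.normal(0.0, 1.0, size=n)
pa_baseline = rng.normal(90.0, 30.0, size=n)
pa_12mo = (pa_baseline + 60.0
           + 20.0 * (arm == "human") * amotivation   # built-in moderation
           + rng.normal(0.0, 25.0, size=n))
df = pd.DataFrame({"arm": arm, "amotivation": amotivation,
                   "pa_baseline": pa_baseline, "pa_12mo": pa_12mo})

# A significant arm-by-amotivation interaction indicates that baseline
# amotivation moderates the difference between advisor types.
fit = smf.ols("pa_12mo ~ pa_baseline + C(arm) * amotivation", data=df).fit()
print(fit.summary())  # inspect the C(arm)[T.human]:amotivation row
```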

Results: Results indicated amotivation (i.e., lack of intent in PA) moderated 12-month PA (d = 0.55, p < 0.01) and private self-consciousness (i.e., tendency to attune to one’s own inner thoughts and emotions) moderated 18-month PA (d = 0.34, p < 0.05) but a variety of other factors (e.g., demographics) did not (p > 0.12).

Conclusions: Results provide preliminary, hypothesis-generating evidence about pathways that may support later clinical decision-making on the use of human- vs. computer-delivered interventions for PA promotion.

Contributors: Hekler, Eric (Author) / Buman, Matthew (Author) / Otten, Jennifer (Author) / Castro, Cynthia (Author) / Grieco, Lauren (Author) / Marcus, Bess (Author) / Friedman, Robert H. (Author) / Napolitano, Melissa A. (Author) / King, Abby C. (Author) / College of Health Solutions (Contributor)
Created: 2013-09-22