
Description
Dietary protein is known to increase postprandial thermogenesis more so than carbohydrates or fats, probably related to the fact that amino acids have no immediate form of storage in the body and can become toxic if not readily incorporated into body tissues or excreted. It is also well documented that subjects report greater satiety on high- versus low-protein diets and that subject compliance tends to be greater on high-protein diets, thus contributing to their popularity. What is not as well known is how a high-protein diet affects resting metabolic rate over time, and what is even less well known is if resting metabolic rate changes significantly when a person consuming an omnivorous diet suddenly adopts a vegetarian one. This pilot study sought to determine whether subjects adopting a vegetarian diet would report decreased satiety or demonstrate a decreased metabolic rate due to a change in protein intake and possible increase in carbohydrates. Further, this study sought to validate a new device called the SenseWear Armband (SWA) to determine if it might be sensitive enough to detect subtle changes in metabolic rate related to diet. Subjects were tested twice on all variables, at baseline and post-test. Independent and related samples tests revealed no significant differences between or within groups for any variable at any time point in the study. The SWA had a strong positive correlation to the Oxycon Mobile metabolic cart but due to a lack of change in metabolic rate, its sensitivity was undetermined. These data do not support the theory that adopting a vegetarian diet results in a long-term change in metabolic rate.
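The validation step described above rests on a simple device-agreement statistic. As a minimal sketch of how such a correlation is computed, the following uses invented paired resting-metabolic-rate readings, not the study's data:

```python
# Hypothetical paired resting-metabolic-rate readings (kcal/day) from two
# devices, illustrating the kind of validation correlation reported above.
# The values are invented for this sketch.

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

armband = [1450, 1520, 1600, 1380, 1710, 1490]   # SenseWear Armband (hypothetical)
cart    = [1420, 1550, 1580, 1400, 1690, 1510]   # Oxycon Mobile cart (hypothetical)

r = pearson_r(armband, cart)
print(f"r = {r:.3f}")  # devices that track each other closely yield r near 1
```

A strong positive r only establishes agreement in trend; detecting subtle diet-related shifts, as the abstract notes, additionally requires the outcome itself to change.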
Contributors: Moore, Amy (Author) / Johnston, Carol (Thesis advisor) / Appel, Christy (Thesis advisor) / Gaesser, Glenn (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Cognitive function declines with normal age and disease states, such as Alzheimer's disease (AD). Loss of ovarian hormones at menopause has been shown to exacerbate age-related memory decline and may be related to the increased risk of AD in women versus men. Some studies show that hormone therapy (HT) can have beneficial effects on cognition in normal aging and AD, but increasing evidence suggests that the most commonly used HT formulation is not ideal. Work in this dissertation used the surgically menopausal rat to evaluate the cognitive effects and mechanisms of progestogens prescribed to women. I also translated these questions to the clinic, evaluating whether history of HT use impacts hippocampal and entorhinal cortex volumes assessed via imaging, and cognition, in menopausal women. Further, this dissertation investigates how sex impacts responsiveness to dietary interventions in a mouse model of AD. Results indicate that the most commonly used progestogen component of HT, medroxyprogesterone acetate (MPA), impairs cognition in the middle-aged and aged surgically menopausal rat. Further, MPA is the sole hormone component of the contraceptive Depo Provera, and my research indicates that MPA administered to young-adult rats leads to long-lasting cognitive impairments, evident at middle age. Natural progesterone has been gaining popularity as an alternative to MPA for HT; however, my findings suggest that progesterone also impairs cognition in the middle-aged and aged surgically menopausal rat, and that the mechanism may be through increased GABAergic activation. This dissertation identified two less commonly used progestogens, norethindrone acetate and levonorgestrel, as potential HTs that could improve cognition in the surgically menopausal rat. Parameters guiding divergent effects on cognition were discovered.
In women, prior HT use was associated with larger hippocampal and entorhinal cortex volumes, as well as a modest verbal memory enhancement. Finally, in a model of AD, sex impacts responsiveness to a dietary cognitive intervention, with benefits seen in male, but not female, transgenic mice. These findings have clinical implications, especially since women are at higher risk for AD diagnosis. Together, it is my hope that this information adds to the overarching goal of optimizing cognitive aging in women.
Contributors: Braden, Brittany Blair (Author) / Bimonte-Nelson, Heather A. (Thesis advisor) / Neisewander, Janet L. (Committee member) / Conrad, Cheryl D. (Committee member) / Baxter, Leslie C. (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Food system and health characteristics were evaluated across the last Waorani hunter-gatherer group in Amazonian Ecuador and a remote neighboring Kichwa indigenous subsistence agriculture community. Hunter-gatherer food systems like that of the Waorani foragers may be not only nutritionally but also pharmaceutically beneficial because of high dietary intake of varied plant phytochemical compounds. A modern diet that reduces these dietary plant defense phytochemicals below levels typical in human evolutionary history may leave humans vulnerable to diseases that were controlled through a foraging diet. Few studies consider the health impact of the recent drastic reduction of plant phytochemical content in the modern global food system, which has eliminated essential components of food because they are not considered "nutrients". The antimicrobial and anti-inflammatory nature of the food system may not only regulate infectious pathogens and inflammatory disease, but also support beneficial microbes in human hosts, reducing vulnerability to chronic diseases. Waorani foragers seem immune to certain infections and have very low rates of chronic disease. Does returning to certain characteristics of a foraging food system begin to restore the human body's microbe balance and inflammatory response to evolutionary norms, and if so, what implication does this have for the treatment of disease? Several years of data on dietary and health differences across the foragers and the farmers were gathered. There were major differences in health outcomes across the board. In the Waorani forager group there were no signs of infection in serious wounds such as third-degree burns and spear wounds. The foragers had body temperatures one degree lower than the farmers. The Waorani showed no signs of chronic disease, with vision and blood pressure that did not change markedly with age, while Kichwa farmers suffered from both chronic diseases and physiological indicators of aging.
In the Waorani forager population, there was an absence of many common regional infectious diseases, from helminths to staphylococcus. Study design helped control for confounders (exercise, environment, genetic factors, non-phytochemical dietary intake). This study provides evidence of the major role total phytochemical dietary intake plays in human health, a role often not considered by policymakers or nutritional and agricultural scientists.
Contributors: London, Douglas (Author) / Tsuda, Takeyuki (Thesis advisor) / Beezhold, Bonnie L. (Committee member) / Hruschka, Daniel (Committee member) / Eder, James (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
The use of bias indicators in psychological measurement has been contentious, with some researchers questioning whether they actually suppress or moderate the ability of substantive psychological indicators to discriminate (McGrath, Mitchell, Kim, & Hough, 2010). Bias indicators on the MMPI-2-RF (F-r, Fs, FBS-r, K-r, and L-r) were tested for suppression or moderation of the ability of the RC1 and NUC scales to discriminate between Epileptic Seizures (ES) and Non-epileptic Seizures (NES, a conversion disorder that is often misdiagnosed as ES). RC1 and NUC had previously been found to be the best scales on the MMPI-2-RF for differentiating between ES and NES, with optimal cut scores of 65 for RC1 (classification rate of 68%) and 85 for NUC (classification rate of 64%; Locke et al., 2010). The MMPI-2-RF was completed by 429 inpatients on the Epilepsy Monitoring Unit (EMU) at the Scottsdale Mayo Clinic Hospital, all of whom had confirmed diagnoses of ES or NES. Moderated logistic regression was used to test for moderation, and logistic regression was used to test for suppression. Classification rates of RC1 and NUC were calculated at different levels of the bias indicators to evaluate clinical utility for diagnosticians. No moderation was found. Suppression was found for F-r, Fs, K-r, and L-r with RC1, and for all variables with NUC. For F-r and Fs, the optimal RC1 and NUC cut scores increased at higher levels of bias, but tended to decrease at higher levels of K-r, L-r, and FBS-r. K-r provided the greatest suppression for RC1, as well as the greatest increases in classification rates at optimal cut scores, given different levels of bias. It was concluded that, consistent with expectations, taking account of bias indicator suppression on the MMPI-2-RF can improve discrimination of ES and NES.
At higher levels of negative impression management, higher cut scores on substantive scales are needed to attain optimal discrimination, whereas at higher levels of positive impression management and FBS-r, lower cut scores are needed. Using these new cut scores resulted in modest improvements in accuracy in discrimination. These findings are consistent with prior research in showing the efficacy of bias indicators, and extend the findings to a psycho-medical context.
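The classification rates reported above come from applying a fixed cut score to scale scores and counting correct diagnoses. As a minimal sketch of the mechanics, with invented scores and labels (the study's cut of 65 on RC1 is borrowed only for illustration):

```python
# Hypothetical RC1 scores and confirmed diagnoses; a score at or above the
# cut predicts NES. Data are invented for this sketch, not from the study.

def classification_rate(scores, labels, cut):
    """Fraction of cases correctly classified when score >= cut predicts NES."""
    correct = sum((s >= cut) == (lab == "NES") for s, lab in zip(scores, labels))
    return correct / len(scores)

rc1_scores = [72, 58, 80, 61, 69, 55, 77, 63]
diagnoses  = ["NES", "ES", "NES", "ES", "ES", "ES", "NES", "NES"]

rate = classification_rate(rc1_scores, diagnoses, cut=65)
print(f"classification rate at cut 65: {rate:.0%}")  # prints "75%" for this sample
```

The study's adjustment amounts to re-running this computation at different cut scores within strata defined by the bias-indicator levels, then reporting the cut that maximizes the rate in each stratum.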
Contributors: Wershba, Rebecca E. (Author) / Lanyon, Richard I. (Thesis advisor) / Barrera, Manuel (Committee member) / Karoly, Paul (Committee member) / Millsap, Roger E. (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Recommendations made by expert groups are pervasive throughout various life domains. Yet not all recommendations--or expert groups--are equally persuasive. This research aims to identify factors that influence the persuasiveness of recommendations. More specifically, this study examined the effects of decisional cohesion (the amount of agreement among the experts in support of the recommendation), framing (whether the message is framed as a loss or gain), and the domain of the recommendation (health vs. financial) on the persuasiveness of the recommendation. The participants consisted of 1,981 undergraduates from Arizona State University. The participants read a vignette including information about the expert group making a recommendation--which varied the amount of expert agreement for the recommendation--and the recommendation, which was framed as either a gain or loss. Participants then responded to questions about the persuasiveness of the recommendation. In this study, there was a linear main effect of decisional cohesion such that the greater the decisional cohesion of the expert group the more persuasive their recommendation. In addition, there was a main effect of domain such that the health recommendation was more persuasive than the financial recommendation. Contrary to predictions, there was no observed interaction between the amount of decisional cohesion and the framing of the recommendation, nor was there a main effect of framing. Further analyses show support for a mediation effect indicating that high levels of decisional cohesion increased the perceived entitativity of the expert group--the degree to which the group was perceived as a unified, cohesive group--which increased the recommendation's persuasiveness.
An implication of this research is that policy makers could increase the persuasiveness of their recommendations by promoting recommendations that are unanimously supported by their experts or at least show higher levels of decisional cohesion.
Contributors: Votruba, Ashley M. (Author) / Kwan, Virginia S.Y. (Thesis advisor) / Saks, Michael J. (Committee member) / Demaine, Linda (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Although aggression is sometimes thought to be maladaptive, evolutionary theories of resource control and dominance posit that aggression may be used to gain and maintain high social prominence within the peer group. The success of using aggression to increase social prominence may depend on the form of aggression used (relational versus physical), the gender of the aggressor, and the prominence of the victim. Thus, the current study examined the associations between aggression and victimization and social prominence. In addition, the current study extended previous research by examining multiple forms of aggression and victimization and conceptualizing and measuring social prominence using social network analysis. Participants were 339 6th grade students from ethnically diverse backgrounds (50.4% girls). Participants completed a peer nomination measure assessing relational and physical aggression and victimization. They also nominated friends within their grade, which were used to calculate three indices of social prominence, using social network analysis. As expected, results indicated that relational aggression was associated with higher social prominence, particularly for girls, whereas physical aggression was less robustly associated with social prominence. Results for victimization were less clear, but suggested that, for girls, those at mid-levels of social prominence were most highly victimized. For boys, results indicated that those both high and low in prominence were most highly relationally victimized, and those at mid-levels of prominence were most highly physically victimized. These findings help inform intervention work focused on decreasing overall levels of aggressive behavior.
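One common social-network index of prominence derivable from friendship nominations is in-degree: the number of peers who nominate a given student. A minimal sketch with hypothetical students and nominations (not the study's data or its specific indices):

```python
# In-degree prominence from peer friendship nominations. Names and
# nominations are hypothetical, invented for this sketch.
from collections import Counter

def in_degree(nominations):
    """Count nominations received by each student across all nominators."""
    counts = Counter()
    for nominator, friends in nominations.items():
        counts.update(friends)
    return counts

nominations = {
    "Ana":  ["Ben", "Cara"],
    "Ben":  ["Cara"],
    "Cara": ["Ana", "Ben"],
    "Dev":  ["Cara", "Ana"],
}

prominence = in_degree(nominations)
print(prominence.most_common())  # Cara receives the most nominations
```

Other centrality measures (e.g., betweenness or eigenvector centrality) weight not just how many nominations a student receives but from whom; the study's three indices are not specified in the abstract, so in-degree here stands in as the simplest case.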
Contributors: Andrews, Naomi C. Z. (Author) / Hanish, Laura D. (Thesis advisor) / Martin, Carol Lynn (Committee member) / Updegraff, Kimberly A. (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Approximately one-third of Iraq and Afghanistan veterans develop mental health problems, yet only 35-40% of those with mental disorders seek mental healthcare (Hoge et al., 2004; Vogt, 2011). Military spouses may be an important resource for facilitating treatment seeking (Warner et al., 2008), especially if service member mental health issues are impacting the marriage. Military spouses might be hesitant to encourage service member help-seeking, however, due to perceived threat of adverse military career consequences. For this study, 62 military wives completed an online survey. As part of the survey, participants were randomly assigned to one of four vignettes containing a description of a hypothetical military husband with mental health symptoms. Each vignette presented a different combination of marital conflict (high versus low) and service member concerns about adverse career consequences (high versus low). Wives rated on a five-point scale how likely they were to encourage the hypothetical military husband to seek help. It was hypothesized that spouses would be more willing to encourage help-seeking when concerns about adverse military career consequences were low and marital distress was high. No main effects or interaction were found for marital conflict and career concerns. Perceived stigma about seeking mental health treatment in the military, psychological identification as a military spouse, and experience and familiarity with military mental healthcare policies failed to moderate the relationship between marital conflict, career concerns, and encouragement of help-seeking. Correlational analyses revealed that (1) greater experience with military mental healthcare (first- or secondhand), and (2) greater perceptions of stigma regarding seeking mental healthcare in the military each were associated with decreased perceptions of military supportiveness of mental healthcare.
Therefore, although the experimental manipulation in this study did not lead to differences in military spouses' encouragement of a hypothetical military service member to seek mental health services, other findings based on participants' actual experiences suggest that experiences with military mental healthcare may generate or reinforce negative perceptions of military mental healthcare. Altering actual experiences with military mental healthcare, in addition to perceptions of stigma, may be a useful area of intervention for military service members and spouses.
Contributors: Hermosillo, Lori (Author) / Roberts, Nicole (Thesis advisor) / Burleson, Mary (Committee member) / Tinsley, Barbara (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Time adolescents spend in organized or informal skill-based activities after school is associated with a variety of positive developmental outcomes. Little is known about how siblings might shape adolescents' motivation to participate in after-school activities. The current study applied the expectancy value model and ecological theory to understand whether sibling behaviors were related to adolescents' after-school activities for 34 Mexican origin families. Qualitative and quantitative results suggested siblings engaged in five promoting behaviors (i.e., support, provider of information, role modeling, comparison, co-participation) and three inhibiting behaviors (i.e., babysitting, transportation, and negativity) regarding adolescent activity participation. Furthermore, sibling behaviors differed by adolescent characteristics (i.e., cultural orientation, familism, and neighborhood) and sibling characteristics (i.e., gender, age). The results provide evidence of the various promoting and inhibiting socialization behaviors siblings might use to influence adolescents' activity motivation.
Contributors: Price, Chara Dale (Author) / Simpkins, Sandra (Thesis advisor) / Updegraff, Kimberly (Committee member) / Menjivar, Cecilia (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
An understanding of diet habits is crucial in implementing proper management strategies for wildlife. Diet analysis, however, remains a challenge for ruminant species. Microhistological analysis, the method most often employed in herbivore diet studies, is tedious and time consuming. In addition, it requires considerable training and an extensive reference plant collection. The development of DNA barcoding (species identification using a standardized DNA sequence) and the availability of recent DNA sequencing techniques offer new possibilities in diet analysis for ungulates. Using fecal material collected from controlled feeding trials on pygmy goats (Capra hircus), novel DNA barcoding technology using the P6-loop of the chloroplast trnL (UAA) intron was compared with the traditional microhistological technique. At its current stage of technological development, this study demonstrated that DNA barcoding did not enhance the ability to detect plant species in herbivore diets. A higher mean species composition was reported with microhistological analysis (79%) as compared to DNA barcoding (50%). Microhistological analysis consistently reported a higher species presence by forage class. For positive species identification, microhistology estimated an average of 89% correct detection in control diets, while DNA barcoding estimated 50% correct detection of species. It was hypothesized that a number of factors, including variation in chloroplast content in feed species and the effect of rumen bacteria on DNA degradation, influenced the ability to detect plant species in herbivore diets. It was concluded that while DNA barcoding opens up new possibilities in the study of plant-herbivore interactions, further studies are needed to standardize techniques before DNA barcoding can be reliably applied in this context.
Contributors: Murphree, Julie Joan (Author) / Miller, William H. (Thesis advisor) / Steele, Kelly (Committee member) / Salywon, Andrew (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Objective: Vinegar consumption studies have demonstrated possible therapeutic effects in reducing HbA1c and postprandial glycemia. The purpose of the study was to closely examine the effects of a commercial vinegar drink on daily fluctuations in fasting glucose concentrations and postprandial glycemia, and on HbA1c, in individuals at risk for Type 2 Diabetes Mellitus (T2D). Design: Thirteen women and one man (21-62 y; mean, 46.0±3.9 y) participated in this 12-week parallel-arm trial. Participants were recruited from a campus community and were healthy and not diabetic by self-report. Participants were not prescribed oral hypoglycemic medications or insulin; other medications were allowed if use was stable for > 3 months. Subjects were randomized to one of two groups: VIN (8 ounces vinegar drink providing 1.5 g acetic acid) or CON (1 vinegar pill providing 0.04 g acetic acid). Treatments were taken twice daily immediately prior to the lunch and dinner meals. Venous blood samples were drawn at trial weeks 0 and 12 to measure insulin, fasting glucose, and HbA1c. Subjects recorded fasting glucose and 2-h postprandial glycemia concentrations daily using a glucometer. Results: The VIN group showed significant reductions in fasting capillary blood glucose concentrations (p=0.05) that were immediate and sustained throughout the study. The VIN group had reductions in 2-h postprandial glucose (mean change of −7.6±6.8 mg/dL over the 12-week trial), but this value was not significantly different from that for the CON group (mean change of 3.3±5.3 mg/dL over the 12-week trial, p=0.232). HbA1c did not significantly change (p=0.702), but the reduction in HbA1c in the VIN group, −0.14±0.1%, may have physiological relevance. Conclusions: Significant reductions in HbA1c were not observed after daily consumption of a vinegar drink containing 1.5 g acetic acid in non-diabetic individuals.
However, the vinegar drink did significantly reduce fasting capillary blood glucose concentrations in these individuals as compared to a vinegar pill containing 0.04 g acetic acid. These results support a therapeutic effect for vinegar in T2D prevention and progression, specifically in high-risk populations.
Contributors: Quagliano, Samantha (Author) / Johnston, Carol (Thesis advisor) / Appel, Christy (Committee member) / Dixon, Kathleen (Committee member) / Arizona State University (Publisher)
Created: 2013