Matching Items (206)
Description
Different logic-based knowledge representation formalisms have different limitations with respect to either expressivity or computational efficiency. First-order logic, which is the basis of Description Logics (DLs), is not suitable for defeasible reasoning due to its monotonic nature. The nonmonotonic formalisms that extend first-order logic, such as circumscription and default logic, are expressive but lack efficient implementations. The nonmonotonic formalisms based on the declarative logic programming approach, such as Answer Set Programming (ASP), have efficient implementations but are not expressive enough for representing and reasoning with open domains. This dissertation uses the first-order stable model semantics, which extends both first-order logic and ASP, to relate circumscription to ASP and to integrate DLs and ASP, thereby partially overcoming the limitations of these formalisms. By exploiting the relationship between circumscription and ASP, well-known action formalisms, such as the situation calculus, the event calculus, and Temporal Action Logics, are reformulated in ASP. The advantages of these reformulations are shown with respect to the generality of the reasoning tasks that can be handled and with respect to computational efficiency. The integration of DLs and ASP presented in this dissertation provides a framework for integrating rules and ontologies for the semantic web. This framework enables us to perform nonmonotonic reasoning with DL knowledge bases. Observing the need to integrate action theories and ontologies, the above results are used to reformulate the problem of integrating action theories and ontologies as a problem of integrating rules and ontologies, thus enabling us to use the computational tools developed for the latter on the former.
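To illustrate the style of defeasible reasoning that ASP supports (and that monotonic first-order logic cannot express directly), here is a minimal sketch using the clingo Python API; the bird/penguin rules and all names are illustrative textbook examples, not drawn from the dissertation.

```python
# A minimal sketch of defeasible (nonmonotonic) reasoning in ASP,
# run through the clingo Python API (assumes `pip install clingo`).
# The classic birds-fly default; all names are illustrative.
import clingo

PROGRAM = """
bird(tweety).  bird(opus).
penguin(opus).
% Default: birds fly unless known to be abnormal.
flies(X) :- bird(X), not abnormal(X).
% Penguins are abnormal with respect to flying.
abnormal(X) :- penguin(X).
"""

ctl = clingo.Control()
ctl.add("base", [], PROGRAM)
ctl.ground([("base", [])])
# Prints the single stable model: flies(tweety) holds, flies(opus) does not.
ctl.solve(on_model=lambda m: print("Stable model:", m))
```

Adding the fact penguin(tweety) would retract flies(tweety) from the stable model, which is exactly the nonmonotonic behavior that first-order logic cannot capture.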
Contributors: Palla, Ravi (Author) / Lee, Joohyung (Thesis advisor) / Baral, Chitta (Committee member) / Kambhampati, Subbarao (Committee member) / Lifschitz, Vladimir (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Profound alterations to instruments that take place over short periods of time are fascinating, and the changes undergone by the guitar during the late eighteenth century make for an intriguing transition in the instrument's history. The guitar that existed before 1750 is most commonly referred to as the 'Baroque guitar' and is vastly different from the guitar of today. It was considerably smaller than the guitars that followed, pitched higher, and used primarily for accompaniment through chord strumming. From roughly 1750 to 1800 the guitar underwent a transformation that eventually led to the design and performance practices that have continued through to this day: larger, with lower-pitched courses (and sometimes single stringing), and used increasingly in punteado (plucked) style. By defining the instrument as it existed prior to 1750, and the changes that it underwent after 1750, we can ensure that the instrument discussed is the one that directly led to the instrument we use today. Because instrument design and performance practice inevitably influence each other, a thorough examination of ornamentation practices from 1750-1800 can lead to a greater understanding of the instrument as it changed, and of the instrument it eventually became. Since the early nineteenth century was one of the more productive periods for the guitar, a better understanding of the ornamentation practices that preceded it may provide insight into how the players and composers of this fertile time (Sor, Aguado, Giuliani, etc.) approached their instrument. Although not much music was printed or copied for the guitar during the latter half of the eighteenth century, a substantial number of guitars were built, along with instruction manuals featuring the guitar. Instruction manuals were examined, along with works for solo guitar and for guitar in ensemble with other instruments, to explore ornamentation practices from 1750-1800. Through examination of the guitar instruction manuals of the late eighteenth century, an increased understanding is gained of the techniques that eventually became cornerstones of nineteenth-century guitar performance practice.
Contributors: Copeland, Jeffrey S. (Jeffrey Scott), 1953- (Author) / Koonce, Frank (Thesis advisor) / Aspnes, Lynne (Committee member) / Feisst, Sabine (Committee member) / Jiang, Danwen (Committee member) / Landschoot, Thomas (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
The solo repertoire from the Light Music Era serves as an important link between the Classical and Jazz soloist traditions. The characteristics of this link are best highlighted through an analysis of three solo transcriptions: Felix Arndt's Nola as performed by Al Gallodoro, Rudy Wiedoeft's Valse Vanité as performed by Freddy Gardener, and Jimmy Dorsey's Oodles of Noodles as performed by Al Gallodoro. The transcriptions, done by the author, are taken from primary source recordings, and the ensuing analysis serves to show the saxophone soloists of the Light Music Era as an amalgamation of classical and jazz saxophone. Many of the works performed during the Light Music Era are extant only in recorded form. Even so, these performances possess great historical significance for the saxophone's standing as an important solo instrument in the wider musical landscape. The saxophone solos from the Light Music Era distinguish themselves through the formal development and embellishment of standard "song forms" (such as ABA and AABA) and the use of improvisational techniques common to early Jazz; however, the analysis shows that these improvisational techniques were distinctly different in nature from Jazz solo improvisation. Although it has many characteristics in common with both "Classical Music" (used here as a generic term for the music of the Western European common practice period that is neither Pop music nor Jazz) and Jazz, the original research shows that the saxophone solo music of the Light Music Era is a distinctly original genre due to its amalgamation of seemingly disparate elements.
Contributors: Puccio, Dan (Author) / Mcallister, Timothy P (Thesis advisor) / Feisst, Sabine (Committee member) / Kocour, Michael (Committee member) / Pilafian, J. Samuel (Committee member) / Spring, Robert (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Ultra-concealable multi-threat body armor used by law enforcement is a multi-purpose armor that protects against attacks from knives, spikes, and small-caliber rounds. The design of this type of armor involves fiber-resin composite materials that are flexible and light, are not unduly affected by environmental conditions, and perform as required. The National Institute of Justice (NIJ) characterizes this type of armor as low-level protection armor. NIJ also specifies the geometry of the knife and spike, as well as the strike energy levels, required for this level of protection. The biggest challenge is to design thin, lightweight, ultra-concealable armor that can be worn under street clothes. In this study, several fundamental tasks involved in the design of such armor are addressed. First, the roles of design of experiments and regression analysis in experimental testing and finite element analysis are presented. Second, off-the-shelf materials available from international material manufacturers are characterized via laboratory experiments. Third, the calibration process required for a constitutive model is explained through the use of experimental data and computer software. Various material models in LS-DYNA for use in the finite element model are discussed. Numerical results are generated via finite element simulations and compared against experimental data, thus establishing the foundation for optimizing the design.
Contributors: Vokshi, Erblina (Author) / Rajan, Subramaniam D. (Thesis advisor) / Neithalath, Narayanan (Committee member) / Mobasher, Barzin (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
The current method of measuring thermal conductivity requires flat plates. For most common civil engineering materials, creating or extracting such samples is difficult. A prototype thermal conductivity experiment had been developed at Arizona State University (ASU) to test cylindrical specimens but proved difficult for repeated testing. In this study, enhancements to both testing methods were made. Additionally, test results from cylindrical testing were correlated with the results from identical materials tested by the Guarded Hot-Plate method, which uses flat plate specimens. In validating the enhancements made to the Guarded Hot-Plate and Cylindrical Specimen methods, 23 tests were run on five different materials. The percent difference shown for the Guarded Hot-Plate method was less than 1%. This gives strong evidence that the enhanced Guarded Hot-Plate apparatus is now more accurate for measuring thermal conductivity. The correlation between the thermal conductivity values of the Guarded Hot-Plate method and those of the enhanced Cylindrical Specimen method was excellent. The conventional concrete mixture, owing to much higher thermal conductivity values than the other mixtures, yielded a P-value of 0.600, which provided confidence in the performance of the enhanced Cylindrical Specimen Apparatus. Several recommendations were made for the future implementation of both test methods. The work in this study fulfills the research community's and industry's desire for a more streamlined and cost-effective means to determine the thermal conductivity of various civil engineering materials.
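For reference, both apparatus types reduce to Fourier's law of steady-state heat conduction; the sketch below shows the standard flat-plate and radial-cylinder forms of that relation, with all numerical inputs invented for illustration (none are figures from the study).

```python
import math

def k_flat_plate(q_watts, thickness_m, area_m2, delta_t_k):
    """Thermal conductivity from a guarded hot-plate test:
    k = Q * L / (A * dT), steady one-dimensional conduction."""
    return q_watts * thickness_m / (area_m2 * delta_t_k)

def k_cylinder(q_watts, r_inner_m, r_outer_m, length_m, delta_t_k):
    """Thermal conductivity from a cylindrical specimen:
    k = Q * ln(r2/r1) / (2 * pi * L * dT), steady radial conduction."""
    return q_watts * math.log(r_outer_m / r_inner_m) / (2 * math.pi * length_m * delta_t_k)

# Illustrative numbers only:
print(k_flat_plate(q_watts=15.0, thickness_m=0.05, area_m2=0.09, delta_t_k=10.0))              # ~0.83 W/m-K
print(k_cylinder(q_watts=15.0, r_inner_m=0.01, r_outer_m=0.05, length_m=0.2, delta_t_k=10.0))  # ~1.92 W/m-K
```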

Contributors: Morris, Derek (Author) / Kaloush, Kamil (Thesis advisor) / Mobasher, Barzin (Committee member) / Phelan, Patrick E (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
Automated planning problems classically involve finding a sequence of actions that transform an initial state into some state satisfying a conjunctive set of goals with no temporal constraints. But in many real-world problems, the best plan may involve satisfying only a subset of goals or missing defined goal deadlines. For example, this may be required when goals are logically conflicting, or when there are time or cost constraints such that achieving all goals on time would be too expensive. In this case, goals and deadlines must be declared as soft. I call these partial satisfaction planning (PSP) problems. In this work, I focus on particular types of PSP problems, where goals are given a quantitative value based on whether (or when) they are achieved. The objective is to find a plan with the best quality. A first challenge is in finding adequate goal representations that capture common types of goal achievement rewards and costs. One popular representation is to give a single reward on each goal of a planning problem. I expand on this approach by allowing users to directly introduce utility dependencies, so that the reward for achieving a goal can depend directly on which other goals a plan achieves. Afterward, I introduce time-dependent goal costs, where a plan incurs a penalty if it achieves a goal past a specified deadline. To solve PSP problems with goal utility dependencies, I look at using state-of-the-art methodologies currently employed for classical planning problems involving heuristic search. In doing so, one faces the challenge of simultaneously determining the best set of goals and the best plan to achieve them. This is complicated by the utility dependencies defined by a user and by cost dependencies within the plan. To address this, I introduce a set of heuristics based on combinations of relaxed plans and integer programming formulations. Further, I explore an approach to improve search through learning techniques, using automatically generated state features to find new states from which to search. Finally, the investigation into handling time-dependent goal costs leads to an improved search technique derived from observations made while solving discretized approximations of cost functions.
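To make the goal-selection problem concrete, the following minimal sketch (a brute-force enumeration, not the heuristics developed in the dissertation) picks the net-benefit-maximizing subset of soft goals under an invented pairwise utility dependency; all goal names and numbers are illustrative.

```python
from itertools import combinations

# Soft goals with individual rewards and (invented) achievement costs.
reward = {"g1": 10, "g2": 6, "g3": 4}
cost   = {"g1": 5,  "g2": 3, "g3": 7}
# A utility dependency: achieving g1 and g2 together earns a bonus.
dependency_bonus = {frozenset({"g1", "g2"}): 5}

def net_benefit(goals):
    """Total reward plus dependency bonuses, minus achievement cost."""
    value = sum(reward[g] - cost[g] for g in goals)
    for pair, bonus in dependency_bonus.items():
        if pair <= set(goals):
            value += bonus
    return value

# Exhaustively score every goal subset; heuristic search replaces this
# enumeration in practice, since it is exponential in the goal count.
best = max(
    (subset for r in range(len(reward) + 1)
            for subset in combinations(reward, r)),
    key=net_benefit,
)
print(best, net_benefit(best))  # ('g1', 'g2') with net benefit 13
```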
Contributors: Benton, J (Author) / Kambhampati, Subbarao (Thesis advisor) / Baral, Chitta (Committee member) / Do, Minh B. (Committee member) / Smith, David E. (Committee member) / Langley, Pat (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
While developing autonomous intelligent robots has been the goal of many research programs, a more practical application involving intelligent robots is the formation of teams consisting of both humans and robots. An example of such an application is search and rescue operations, where robots commanded by humans are sent to environments too dangerous for humans. For such human-robot interaction, natural language is considered a good communication medium, as it allows humans with little training in the robot's internal language to command and interact with the robot. However, any natural language communication from the human needs to be translated into a formal language that the robot can understand. Similarly, before the robot can communicate (in natural language) with the human, it needs to formulate its communiqué in some formal language, which then gets translated into natural language. In this thesis, I develop a high-level language for communication between humans and robots and demonstrate various aspects of it through a robotics simulation. These language constructs borrow some ideas from action execution languages and are grounded with respect to simulated human-robot interaction transcripts.
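As a toy illustration of such a translation layer, the sketch below maps a few controlled natural-language commands to formal action terms; the patterns and term syntax are hypothetical and are not the language developed in this thesis.

```python
import re

# Hypothetical patterns for a tiny controlled command language;
# each maps an English imperative to a formal action term.
PATTERNS = [
    (re.compile(r"go to the (\w+)"),            lambda m: f"goto({m.group(1)})"),
    (re.compile(r"pick up the (\w+)"),          lambda m: f"pickup({m.group(1)})"),
    (re.compile(r"put the (\w+) on the (\w+)"), lambda m: f"puton({m.group(1)},{m.group(2)})"),
]

def translate(utterance: str) -> str:
    """Translate one natural-language command into a formal term,
    or raise if the utterance falls outside the controlled fragment."""
    text = utterance.lower().strip(".! ")
    for pattern, build in PATTERNS:
        match = pattern.fullmatch(text)
        if match:
            return build(match)
    raise ValueError(f"cannot parse: {utterance!r}")

print(translate("Go to the kitchen"))          # goto(kitchen)
print(translate("Put the box on the table."))  # puton(box,table)
```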
Contributors: Lumpkin, Barry Thomas (Author) / Baral, Chitta (Thesis advisor) / Lee, Joohyung (Committee member) / Fainekos, Georgios (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
As we migrate into an era of personalized medicine, understanding how bio-molecules interact with one another to form cellular systems is one of the key focus areas of systems biology. Several challenges, such as the dynamic nature of cellular systems, uncertainty due to environmental influences, and the heterogeneity between individual patients, render this a difficult task. In the last decade, several algorithms have been proposed to elucidate cellular systems from data, resulting in numerous data-driven hypotheses. However, due to the large number of variables involved in the process, many of which are unknown or not measurable, such computational approaches often lead to a high proportion of false positives. This renders interpretation of the data-driven hypotheses extremely difficult. Consequently, only a small proportion of these hypotheses are subjected to further experimental validation, eventually limiting their potential to augment existing biological knowledge. This dissertation develops a framework of computational methods for the analysis of such data-driven hypotheses leveraging existing biological knowledge. Specifically, I show how biological knowledge can be mapped onto these hypotheses and subsequently augmented through novel hypotheses. Biological hypotheses are learned at three levels of abstraction: individual interactions, functional modules, and relationships between pathways, corresponding to three complementary aspects of biological systems. The computational methods developed in this dissertation are applied to high-throughput cancer data, resulting in novel hypotheses with potentially significant biological impact.
Contributors: Ramesh, Archana (Author) / Kim, Seungchan (Thesis advisor) / Langley, Patrick W (Committee member) / Baral, Chitta (Committee member) / Kiefer, Jeffrey (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Since the early 1990s, researchers have been looking at intersections between education and music. After a highly popular study correlating listening to Mozart with temporary increases in spatial reasoning, many other researchers tried to find a link between different musical genres and learning outcomes. Using three musical treatments (pop, classical, and silence), this study had subjects (N=34) complete a reading-based task, after which they were tested on their comprehension. Using a suite of sensors, data were collected to analyze the participants' emotions and affect while they read from an educational psychology textbook. The present study has two major focuses: (1) whether changes in musical condition affect learning outcomes and (2) whether they affect emotional outcomes. The popular conception that listening to classical music makes you smarter was proven false long ago, but there may actually be some merit to using music to assist one's studying. While there were no significant changes in test scores across musical conditions, frustration levels were significantly lower for those who listened to classical rather than pop music.
Contributors: Paley, Benjamin Henry (Author) / Atkinson, Robert (Thesis director) / Feisst, Sabine (Committee member) / Barrett, The Honors College (Contributor) / School of Music (Contributor) / T. Denny Sanford School of Social and Family Dynamics (Contributor)
Created: 2015-05
Description
Calcium hydroxide carbonation processes were studied to investigate the potential for abiotic soil improvement. Different mixtures of common soil constituents, such as sand, clay, and granite, were mixed with a calcium hydroxide slurry and carbonated at approximately 860 psi. While the carbonation was successful and calcite formation was strong on sample exteriors, a 4 mm passivating boundary layer effect was observed, impeding the carbonation process at the center. XRD analysis was used to characterize the extent of carbonation, indicating extremely poor carbonation, and therefore poor CO2 penetration, inside the visible boundary. The depth of the passivating layer was found to be independent of both time and choice of aggregate. Less than adequate strength developed in carbonated trials due to the formation of small, weakly connected crystals, as shown by SEM analysis. Additional research, especially in situ thermogravimetric analysis, would be useful to determine the cause of the poor carbonation performance. This technology has great potential to substitute for certain Portland cement applications if these issues can be addressed.
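For reference, the underlying reaction is Ca(OH)2 + CO2 -> CaCO3 + H2O; the short sketch below uses its 1:1 stoichiometry to estimate theoretical CO2 uptake and calcite yield (standard molar masses; the input mass is invented and no experimental figures from the study are used).

```python
# Stoichiometry of the carbonation reaction:
#   Ca(OH)2 + CO2 -> CaCO3 + H2O   (1:1 molar ratio)
M_CA_OH_2 = 74.09   # g/mol, calcium hydroxide
M_CO2     = 44.01   # g/mol, carbon dioxide
M_CACO3   = 100.09  # g/mol, calcite

def theoretical_uptake(grams_ca_oh_2: float) -> tuple[float, float]:
    """Max CO2 absorbed and calcite formed for complete carbonation."""
    moles = grams_ca_oh_2 / M_CA_OH_2
    return moles * M_CO2, moles * M_CACO3

co2_g, calcite_g = theoretical_uptake(100.0)
print(f"100 g Ca(OH)2 -> {co2_g:.1f} g CO2 absorbed, {calcite_g:.1f} g calcite")
# 100 g Ca(OH)2 -> 59.4 g CO2 absorbed, 135.1 g calcite
```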
Contributors: Hermens, Stephen Edward (Author) / Bearat, Hamdallah (Thesis director) / Dai, Lenore (Committee member) / Mobasher, Barzin (Committee member) / Barrett, The Honors College (Contributor) / Chemical Engineering Program (Contributor)
Created: 2015-05