Matching Items (23)

Description
Action language C+, which is based on nonmonotonic causal logic, is a formalism for describing properties of actions. The definite fragment of C+ is implemented in the Causal Calculator (CCalc), which relies on the reduction of nonmonotonic causal logic to propositional logic. This thesis describes the language of CCalc in terms of answer set programming (ASP), based on the translation of nonmonotonic causal logic to formulas under the stable model semantics. I designed a standard library that describes the constructs of the input language of CCalc in terms of ASP, allowing a simple, modular method to represent CCalc input programs in the language of ASP. Using system F2LP in combination with answer set solvers, this method achieves functionality close to that of CCalc while taking advantage of answer set solvers to yield computation that is orders of magnitude faster than CCalc on many benchmark examples. In support of this, I created an automated translation system, Cplus2ASP, that implements the translation and encoding method and automatically invokes the necessary software to solve the translated input programs.
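
To give a concrete flavor of the target formalism (a sketch of my own, not Cplus2ASP output or part of its standard library), the following minimal example uses the clingo Python API to solve a one-action domain with a commonsense inertia rule; the predicate names holds/occurs, the fluent p, and the action a are illustrative choices:

    import clingo

    # A toy action domain: fluent p is initially true, action a makes it false,
    # and unaffected fluents keep their values by inertia.
    program = """
    step(0..1).  fluent(p).
    holds(p,0).
    occurs(a,0).
    -holds(p,T+1) :- occurs(a,T), step(T+1).
    % commonsense inertia
    holds(F,T+1)  :- holds(F,T),  not -holds(F,T+1), fluent(F), step(T+1).
    -holds(F,T+1) :- -holds(F,T), not holds(F,T+1),  fluent(F), step(T+1).
    """

    ctl = clingo.Control(["0"])          # "0": enumerate all answer sets
    ctl.add("base", [], program)
    ctl.ground([("base", [])])
    ctl.solve(on_model=lambda m: print("Answer set:", m))

The single expected answer set contains holds(p,0) and -holds(p,1): executing a at step 0 makes p false at step 1, while inertia would otherwise have carried p forward.
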
ContributorsCasolary, Michael (Author) / Lee, Joohyung (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Baral, Chitta (Committee member) / Arizona State University (Publisher)
Created2011
Description
Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods, which originate from work on building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to its modeling language that enhance expressivity, such as incorporating aggregates and interfacing with ontologies. Also, to overcome the grounding bottleneck of computation in ASP, there is increasing interest in integrating ASP with other computing paradigms, such as Constraint Programming (CP) and Satisfiability Modulo Theories (SMT). Due to the non-monotonic nature of the ASP semantics, such enhancements turned out to be non-trivial, and the existing extensions are not fully satisfactory. We observe that one main reason for the difficulties is rooted in the propositional semantics of ASP, which is limited in handling first-order constructs (such as aggregates and ontologies) and functions (such as constraint variables in CP and SMT) in natural ways. This dissertation presents a unifying view on these extensions by viewing them as instances of formulas with generalized quantifiers and intensional functions. We extend the first-order stable model semantics by Ferraris, Lee, and Lifschitz to allow generalized quantifiers, which cover aggregates, DL-atoms, constraints, and SMT theory atoms as special cases. Using this unifying framework, we study and relate different extensions of ASP. We also present a tight integration of ASP with SMT, based on which we enhance action language C+ to handle reasoning about continuous changes. Our framework yields a systematic approach to studying and extending non-monotonic languages.
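
For orientation only (a sketch of my own, not text from the dissertation): the first-order stable model semantics of Ferraris, Lee, and Lifschitz mentioned above is, as I recall it, given by a second-order formula of roughly the form

    \mathrm{SM}_{\mathbf{p}}[F] \;=\; F \,\wedge\, \neg\exists\mathbf{u}\,\big((\mathbf{u} < \mathbf{p}) \wedge F^{*}(\mathbf{u})\big),

where \mathbf{p} is the list of intensional predicates, \mathbf{u} a matching list of predicate variables, and F^{*} a syntactic transformation of F; the dissertation extends this definition to formulas containing generalized quantifiers and intensional functions.
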
ContributorsMeng, Yunsong (Author) / Lee, Joohyung (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Baral, Chitta (Committee member) / Fainekos, Georgios (Committee member) / Lifschitz, Vladimir (Committee member) / Arizona State University (Publisher)
Created2013
Description
Biological organisms are made up of cells containing numerous interconnected biochemical processes. Diseases occur when the normal functionality of these processes is disrupted, manifesting as disease symptoms. Thus, understanding these biochemical processes and their interrelationships is a primary task in biomedical research and a prerequisite for activities such as diagnosing diseases and developing drugs. Scientists studying these interconnected processes have identified various pathways involved in drug metabolism, disease, signal transduction, and other phenomena. High-throughput technologies, new algorithms, and speed improvements over the last decade have resulted in deeper knowledge about biological systems, leading to more refined pathways. Such pathways tend to be large and complex, making it difficult for an individual to remember all of their aspects. Thus, computer models are needed to represent and analyze them. The refinement activity itself requires reasoning with a pathway model by posing queries against it and comparing the results against the real biological system. Many existing models focus on structural and/or factoid questions, relying on surface-level information. These are generally not the kind of questions that a biologist would ask to test someone's understanding of biological processes. Examples of questions requiring understanding of biological processes are available in introductory college-level biology textbooks. Such questions serve as a model for the question answering system developed in this thesis. Thus, the main goal of this thesis is to develop a system that allows the encoding of knowledge about biological pathways in order to answer questions demonstrating understanding of the pathways. To that end, a language is developed to specify a pathway and pose questions against it. Some existing tools are modified and used to accomplish this goal. The utility of the framework developed in this thesis is illustrated with applications in the biological domain. Finally, the question answering system is used in real-world applications by extracting pathway knowledge from text and answering questions related to drug development.
ContributorsAnwar, Saadat (Author) / Baral, Chitta (Thesis advisor) / Inoue, Katsumi (Committee member) / Chen, Yi (Committee member) / Davulcu, Hasan (Committee member) / Lee, Joohyung (Committee member) / Arizona State University (Publisher)
Created2014
Description
Different logic-based knowledge representation formalisms have different limitations, either with respect to expressivity or with respect to computational efficiency. First-order logic, which is the basis of Description Logics (DLs), is not suitable for defeasible reasoning due to its monotonic nature. The nonmonotonic formalisms that extend first-order logic, such as circumscription and default logic, are expressive but lack efficient implementations. The nonmonotonic formalisms based on the declarative logic programming approach, such as Answer Set Programming (ASP), have efficient implementations but are not expressive enough for representing and reasoning with open domains. This dissertation uses the first-order stable model semantics, which extends both first-order logic and ASP, to relate circumscription to ASP and to integrate DLs and ASP, thereby partially overcoming the limitations of these formalisms. By exploiting the relationship between circumscription and ASP, well-known action formalisms, such as the situation calculus, the event calculus, and Temporal Action Logics, are reformulated in ASP. The advantages of these reformulations are shown with respect to the generality of the reasoning tasks that can be handled and with respect to computational efficiency. The integration of DLs and ASP presented in this dissertation provides a framework for integrating rules and ontologies for the semantic web. This framework enables nonmonotonic reasoning with DL knowledge bases. Observing the need to integrate action theories and ontologies, this dissertation recasts that problem as one of integrating rules and ontologies, so that the computational tools developed for the latter can be used for the former.
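
As a point of reference (again a sketch of my own, not the dissertation's text): parallel circumscription of the predicates \mathbf{p} in a sentence F can be written as

    \mathrm{CIRC}[F;\,\mathbf{p}] \;=\; F \,\wedge\, \neg\exists\mathbf{u}\,\big((\mathbf{u} < \mathbf{p}) \wedge F(\mathbf{u})\big),

which differs from the stable model operator quoted earlier on this page only in using F(\mathbf{u}) in place of the transformed formula F^{*}(\mathbf{u}); this closeness is part of what allows circumscriptive action formalisms to be related to ASP.
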
ContributorsPalla, Ravi (Author) / Lee, Joohyung (Thesis advisor) / Baral, Chitta (Committee member) / Kambhampati, Subbarao (Committee member) / Lifschitz, Vladimir (Committee member) / Arizona State University (Publisher)
Created2012
Description

This project took a deep dive into AI and its business applications; my team and I then built an AI model to better understand shipping patterns and inefficiencies across different port regions.

ContributorsFreudenberger, Evan Martin (Author) / Wiedmer, Robert (Thesis director) / Duarte, Brett (Committee member) / Thunderbird School of Global Management (Contributor) / Department of Supply Chain Management (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created2021-05
Description
Natural Language Processing is a subject that combines computer science and linguistics, aiming to provide computers with the ability to understand natural language and to enable more intuitive human-computer interaction. The research community has developed ways to translate natural language into mathematical formalisms. It has not yet been shown, however, how to automatically translate different kinds of knowledge in English into distinct formal languages. Most of the recent work suffers from the problem that the translation method targets one specific formal language or is hard to generalize. In this research, I take a first step toward overcoming this difficulty and present two algorithms which take as input two lambda-calculus expressions G and H and compute a lambda-calculus expression F. The expression F returned by the first algorithm satisfies F@G=H; in the case of the second algorithm, we obtain G@F=H. The lambda expressions represent the meanings of words and sentences. For each formal language that one desires to use with the algorithms, the language must be defined in terms of lambda calculus, and some additional concepts must be included. After doing this, given a sentence, its representation, and the representations of several words in the sentence, the algorithms can be used to obtain the representations of the other words in that sentence. In this work, I define two languages and show examples of their use with the algorithms. The algorithms are presented along with soundness and completeness proofs, the latter with respect to typed lambda-calculus formulas up to the second order. These algorithms are a core part of a natural language semantics system that translates sentences from English into formulas in different formal languages.
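
A toy illustration of my own (not an example taken from the thesis): given the meaning G of one constituent and the meaning H of the whole phrase, the first algorithm looks for an F with F@G = H. For instance,

    G = \mathit{john}, \qquad H = \mathit{happy}(\mathit{john}), \qquad
    F = \lambda x.\,\mathit{happy}(x), \qquad
    F @ G = (\lambda x.\,\mathit{happy}(x)) @ \mathit{john} \;\to_{\beta}\; \mathit{happy}(\mathit{john}) = H.

In a semantic-parsing pipeline, this is how the representation of a missing word can be recovered from the known representations of a neighboring word and of the larger phrase.
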
ContributorsAlvarez Gonzalez, Marcos (Author) / Baral, Chitta (Thesis advisor) / Lee, Joohyung (Committee member) / Ye, Jieping (Committee member) / Arizona State University (Publisher)
Created2010
Description
The focus of this research paper is understanding the impact of human factors on technology innovations in automobiles and the direction our society is headed. There will be an assessment of our current state and of possible solutions to combat the issue of creating technology advancements for automobiles that cater to human factors. There will be an introduction to the history of the first automobile invented, to provide an understanding of what the first automobile consisted of, and the paper will continue by discussing the technological innovations that were implemented due to human factors. By diving into technological innovations such as the ignition system, the car radio, the power steering system, and self-driving, it will show the progression of the technological advancements that were implemented in relation to the human factors prominent in society. From there, it is important to understand what human factors and the concept of human factors engineering are. This will provide a better understanding of why humans have created technology in relation to human factors. Then, there will be an introduction to the history and timeline of the mobile phone industry as a comparison, to show the impact human factors have had on the development of technology in mobile phones and how heavily that technology catered to human factors. There will be a discussion of the three key human factors that the development and implementation of technology in automobiles have catered to: selecting the path that requires the least cognitive effort, overestimating the performance of technology, and paying reduced attention once an automated system is in place. Lastly, the paper argues that if we create or implement technology such as self-driving, it should not be solely for comfort and ease of use, but for the overall efficient use of transportation in the future. This way, humans would not rely too heavily on the technology, limiting the effect that human factors have on us.
ContributorsParham, Gi-onli (Author) / Keane, Katy (Thesis director) / Collins, Gregory (Committee member) / Department of Supply Chain Management (Contributor) / Barrett, The Honors College (Contributor)
Created2020-05
Description
One of the challenges in Artificial Intelligence (AI) is to integrate fast, automatic, and intuitive System-1 thinking with slow, deliberate, and logical System-2 thinking. While deep learning approaches excel at perception tasks for System-1, their reasoning capabilities for System-2 are limited. Besides, deep learning approaches are usually data-hungry, have difficulty making use of explicit knowledge, and struggle with interpretability and justification. This dissertation presents three neuro-symbolic AI approaches that integrate neural networks (NNs) with symbolic AI methods to address these issues. The first approach presented in this dissertation is NeurASP, which combines NNs with Answer Set Programming (ASP), a logic programming formalism. NeurASP provides an effective way to integrate sub-symbolic and symbolic computation by treating NN outputs as probability distributions over atomic facts in ASP. The explicit knowledge encoded in ASP corrects mistakes in NN outputs and allows for better training with less data. To avoid NeurASP's bottleneck in symbolic computation, this dissertation presents Constraint Loss via Straight-Through Estimators (CL-STE). CL-STE provides a systematic way to compile discrete logical constraints into a loss function over discretized NN outputs and scales significantly better than state-of-the-art neuro-symbolic methods. This dissertation also presents a finding from applying CL-STE to Transformers: Transformers can be extended with recurrence to enhance their power for multi-step reasoning, and such a Recurrent Transformer can be applied straightforwardly to visual constraint reasoning problems while successfully addressing the symbol grounding problem. Lastly, this dissertation addresses the limitation of pre-trained Large Language Models (LLMs) on multi-step logical reasoning problems with a dual-process neuro-symbolic reasoning system called LLM+ASP, in which an LLM (e.g., GPT-3) serves as a highly effective few-shot semantic parser that turns natural language sentences into a logical form that can be used as input to ASP. LLM+ASP achieves state-of-the-art performance on several textual reasoning benchmarks and can handle robot planning tasks that an LLM alone fails to solve.
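
To illustrate only the generic straight-through-estimator idea behind CL-STE (this is not the dissertation's code; the class name Binarize, the 0.5 threshold, and the toy constraint are my own choices), a minimal PyTorch sketch:

    import torch

    class Binarize(torch.autograd.Function):
        """Discretize in the forward pass; pass gradients straight through in the backward pass."""
        @staticmethod
        def forward(ctx, x):
            return (x > 0.5).float()

        @staticmethod
        def backward(ctx, grad_output):
            return grad_output          # straight-through: identity gradient

    logits = torch.randn(3, requires_grad=True)
    probs = torch.sigmoid(logits)       # NN outputs, read as probabilities of atomic facts
    hard = Binarize.apply(probs)        # discretized truth values
    loss = (hard.sum() - 1.0) ** 2      # toy logical constraint: exactly one atom should be true
    loss.backward()                     # gradients reach logits despite the discrete step

The point is that a loss defined over discretized outputs can still train the network, because the discretization is treated as the identity during backpropagation.
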
ContributorsYang, Zhun (Author) / Lee, Joohyung (Thesis advisor) / Baral, Chitta (Committee member) / Li, Baoxin (Committee member) / Yang, Yezhou (Committee member) / Arizona State University (Publisher)
Created2023
Description

This research investigates students' attitudes towards chatbots and the potential use of chatbots for finding career resources. Survey data from two sources were analyzed using descriptive statistics and correlation analysis. The first survey found that students had a neutral attitude towards chatbots, but that understanding of chatbots was a key factor in increasing their use. The survey data suggested that chatbots could provide quick and convenient access to information and personalized recommendations, but that their effectiveness for career resource searches may be limited. The second survey found that students who were more satisfied with the quality of resources from the career office were more likely to use chatbots. However, students who felt more prepared to explore their career options were less likely to use chatbots. These results suggest that the W. P. Carey Career Office could benefit from offering more and better resources to prepare students for exploring their career options, and could explore the use of chatbots to enhance the quality of its resources and increase student satisfaction. Further research is needed to confirm these findings and to explore other factors that may affect the use of chatbots and satisfaction with career office resources.

ContributorsHuang, Hai (Author) / Kappes, Janelle (Thesis director) / Eaton, John (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor) / Department of Supply Chain Management (Contributor)
Created2023-05
Description
In our society, technology has found itself at the root of a certain level of modernization. It wasn’t long ago that people depended heavily on bank tellers to complete cash transactions at a bank. Now, however, much of the bank teller’s job has been automated in the form of ATMs and electronic kiosks in drive-through lanes. Automation is the current trend, and more departments are going to experience it. To those wondering which area or department may be hit next by a wave of technological automation, the answer is quite simple: CRM. In its raw form, CRM, which stands for Customer Relationship Management, is a “system for managing your relationships with customers” (HubSpot). Essentially, it is software intended to help companies maintain strong relationships with their customers, customers being a critical part of the process. A good CRM system should benefit both the business and the customer. However, this is easier said than done, making the million-dollar question the following: how can CRM systems be improved to truly benefit both the business and the customer? This paper will demonstrate that the answer is quite simple: automation. Through secondary research, as well as interviews conducted with various business professionals, I will demonstrate that automation and integration can make the process much more efficient and can eliminate a lot of errors along the way. Automation is the future of business, and this fact is no less true in the CRM field.
ContributorsWarrier, Akshay (Author) / Riker, Elise (Thesis director) / Lee, Sanghak (Committee member) / Barrett, The Honors College (Contributor) / Department of Supply Chain Management (Contributor) / Department of Finance (Contributor) / Department of Marketing (Contributor)
Created2023-05