Matching Items (8)
Description

The basis of social power that has expanded most dangerously over the last few decades is information control, and how that control is used. Misinformation, and the intentional spread of misinformation known as disinformation, have become commonplace among various bodies of power seeking either to expand their own influence or to diminish opposing influence. The methods of disinformation used today in politics, the commercial marketplace, and the media are explored in depth to better contextualize and describe the problems that disinformation and its use pose in the world today.

Contributors: Doyle, Brenden C (Author) / Sturgess, Jessica (Thesis director) / Walker, Shawn (Committee member) / College of Health Solutions (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description

Disinformation has long been a tactic used by the Russian government to achieve its goals. Today, Vladimir Putin aims to achieve several things: weaken the United States' standing on the world stage, relieve Western sanctions on himself and his inner circle, and reassert dominant influence over Russia's near abroad (the Baltics, Ukraine, etc.). This research analyzed disinformation in English, Spanish, and Russian, noting the dominant narratives and the geopolitical goals Russia hoped to achieve by destabilizing democracy in each country or region.

Created: 2021-05
Description

This paper examines the issue of Russian disinformation in Estonia and how the country has built resilience against this threat. Drawing upon existing literature and a series of interviews conducted with Estonians of a variety of professional backgrounds, this work explores Estonia's whole-of-society approach to resilience and examines its incorporation of national security strategy, inter-institutional cooperation, and media literacy education. Ultimately, this paper argues that Estonia's efforts have been largely successful in enabling the country to strengthen its society against Russian disinformation and offers key takeaways for other countries such as the United States.

Contributors: Walsh, Sofia (Author) / Sivak, Henry (Thesis director) / Brown, Keith (Committee member) / Barrett, The Honors College (Contributor) / School of Politics and Global Studies (Contributor) / School of International Letters and Cultures (Contributor) / Historical, Philosophical & Religious Studies, Sch (Contributor)
Created: 2023-05
Description

The proliferation of fake news on social media has become a concern for many countries due to its adverse effects on areas such as the economy, politics, health, and society. In light of the growing use of social media in Saudi Arabia, numerous media outlets actively use social media platforms to collect and disseminate news and information. As a result, Saudi journalists have faced various challenges, including the spread of fake news. This study therefore explores how Saudi journalists define and verify fake news published on social media and the challenges they face. Furthermore, it explores journalists' perceptions of their role in society concerning the spread of fake news and how they can promote media literacy among the audience. The study employed in-depth qualitative interviews with 14 journalists from various Saudi print and online newspapers. A thematic analysis of the interviews showed that Saudi journalists define fake news in several ways, encompassing three essential elements: source, content, and timing. In addition, the study found that journalists rely primarily on traditional verification practices to verify fake news published on social media, followed by newer verification practices. The findings showed that Saudi journalists face challenges at every level of the hierarchy-of-influences model. Moreover, the findings identify three distinct roles journalists perceive for themselves in society regarding fake news on social media: disseminators, populist mobilizers, and interpreters. Lastly, the study found that journalists lack media literacy knowledge but are willing to cooperate with government institutions to promote and distribute media literacy among the public.
Contributors: Basfar, Majed (Author) / Thornton, Leslie-Jean (Thesis advisor) / Silcock, B. William (Committee member) / Roschke, Kristy (Committee member) / Kim, Jeongeun (Committee member) / Arizona State University (Publisher)
Created: 2023
Description

Social media has become an important means of user-centered information sharing and communication in a gamut of domains, including news consumption, entertainment, marketing, public relations, and many more. The low cost, easy access, and rapid dissemination of information on social media draw a large audience but also exacerbate the wide propagation of disinformation, including fake news, i.e., news with intentionally false information. Disinformation on social media is growing fast in volume and can have detrimental societal effects. Despite the importance of this problem, our understanding of disinformation on social media is still limited. Recent advancements in computational approaches to detecting disinformation and fake news have shown early promising results, but novel challenges remain abundant due to the complexity, diversity, dynamics, and multi-modality of disinformation and the cost of fact-checking or annotation.

Social media data opens the door to interdisciplinary research and allows one to collectively study large-scale human behaviors that would otherwise be impossible to observe. For example, user engagements with information such as news articles, including posting about, commenting on, or recommending the news on social media, contain abundant rich information. However, because social media data is big, incomplete, noisy, and unstructured, with abundant social relations, relying solely on user engagements is sensitive to noisy user feedback. To alleviate the problem of limited labeled data, it is important to combine content with this new (but weak) type of information as supervision signals, i.e., weak social supervision, to advance fake news detection.

The goal of this dissertation is to understand disinformation by proposing and exploiting weak social supervision for learning with little labeled data, and to detect disinformation effectively via innovative research and novel computational methods. In particular, I investigate learning with weak social supervision for understanding disinformation through the following computational tasks: bringing in heterogeneous social context as auxiliary information for effective fake news detection; discovering explanations of fake news from social media for explainable fake news detection; modeling multiple sources of weak social supervision for early fake news detection; and transferring knowledge across domains with adversarial machine learning for cross-domain fake news detection. The findings of the dissertation significantly expand the boundaries of disinformation research and establish a novel paradigm of learning with weak social supervision, with important implications for broad applications in social media.
Contributors: Shu, Kai (Author) / Liu, Huan (Thesis advisor) / Bernard, H. Russell (Committee member) / Maciejewski, Ross (Committee member) / Xue, Guoliang (Committee member) / Arizona State University (Publisher)
Created: 2020
Description

This thesis examines recent and historical examples of mis/disinformation and finds that many psychological factors contribute to why people have been fooled by deceptive media throughout history; in modern times, deception is amplified by social media, platforms designed to prioritize profit and user engagement over content moderation. The thesis then proposes a process flow for an app that teaches anyone how to evaluate news sources.
Contributors: Lee, Helen (Author) / Sopha, Matthew (Thesis director) / Roschke, Kristy (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor)
Created: 2022-05