Matching Items (5)
Description
This paper uses Facebook as a case study for technology and social media companies facing the pressures of the Digital Age. The company is analyzed along three pillars. First, the manipulation of Facebook users by Russian actors is examined. Next, the paper examines whether Facebook is promoting civic participation for good. Lastly, the rising trend of hate speech and extremists using the site is analyzed. This examination of Facebook then raises three questions about companies in the Digital Age as a whole. The first is, "What is the extent of Corporate Social Responsibility in the Digital Age?" The second is, "What special obligations do for-profit companies have when it comes to safeguarding the privacy of individuals, or at least ensuring that their stored information does not harm them?" The last is, "How can the profit motive and corporate morality coexist in the Digital Age?" The findings of this case study show that, because of factors particular to the Digital Age, the ideals of Corporate Social Responsibility, Privacy, and Corporate Morality may be even more challenging to uphold during this Age of Information. For that reason, companies such as Facebook have an even greater responsibility to abide by these ideals, since technological change carries an even larger potential for negative effects. Regardless of whether regulation comes from government, third-party organizations, or the companies themselves, Digital Age corporations have a duty to protect their users from harm and to uphold these three ideals.
ContributorsBrandt, Madeline (Author) / Zachary, Gregg (Thesis director) / Wetmore, Jameson (Committee member) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
Social media has become a direct and effective means of transmitting personal opinions into cyberspace. The use of certain keywords and their connotations in tweets conveys a meaning that goes beyond the screen and affects behavior. During terror attacks or worldwide crises, people turn to social media as a means of managing their anxiety, a mechanism of Terror Management Theory (TMT). These opinions have distinct impacts on the emotions that people express both online and offline, through both positive and negative sentiments. This paper applies sentiment analysis to Twitter hashtags during five major terrorist attacks that generated a significant response on social media, which collectively show the effect that 140-character tweets have on perceptions in social media. Analyzing the sentiments of tweets after terror attacks makes visible the effect of keywords and the possibility of manipulation through emotional contagion. Through sentiment analysis, positive, negative, and neutral emotions were identified in the tweets. The keywords detected also reveal characteristics of terror attacks that would allow for future analysis and prediction regarding the propagation of specific emotions on social media during future crises.
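As a rough illustration of the kind of hashtag sentiment scoring described above (a minimal sketch, not the author's actual pipeline), tweets collected under a crisis-related hashtag could be labeled positive, negative, or neutral with an off-the-shelf analyzer such as NLTK's VADER; the tweets below are hypothetical.

```python
# Minimal sketch of tweet sentiment labeling (illustrative only, not the
# author's actual pipeline). Requires NLTK with the VADER lexicon:
#   pip install nltk && python -m nltk.downloader vader_lexicon
from nltk.sentiment.vader import SentimentIntensityAnalyzer

# Hypothetical tweets gathered under a single crisis-related hashtag.
tweets = [
    "Praying for everyone affected tonight. Stay safe.",
    "This is terrifying. How does this keep happening?",
    "Authorities confirm the area has been secured.",
]

analyzer = SentimentIntensityAnalyzer()

for tweet in tweets:
    scores = analyzer.polarity_scores(tweet)  # keys: neg, neu, pos, compound
    # Common VADER convention: compound >= 0.05 is positive,
    # compound <= -0.05 is negative, anything in between is neutral.
    if scores["compound"] >= 0.05:
        label = "positive"
    elif scores["compound"] <= -0.05:
        label = "negative"
    else:
        label = "neutral"
    print(f"{label:8s} {scores['compound']:+.3f}  {tweet}")
```

Aggregating these labels by hashtag and by day would be one simple way to visualize how expressed sentiment shifts in the hours after an attack.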
ContributorsHarikumar, Swathikrishna (Author) / Davulcu, Hasan (Thesis director) / Bodford, Jessica (Committee member) / Computer Science and Engineering Program (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created2016-12
Description
This thesis looks at recent and historical examples of mis- and disinformation and finds that many psychological factors contribute to why people have been fooled by deceptive media throughout history. In modern times, deception is amplified by social media, a platform designed to prioritize profits and user engagement over content moderation. The thesis then proposes a process flow for an app that teaches anyone how to evaluate news sources.
ContributorsLee, Helen (Author) / Sopha, Matthew (Thesis director) / Roschke, Kristy (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor)
Created2022-05
Description

This thesis looks at recent and historical examples of mis- and disinformation and finds that many psychological factors contribute to why people have been fooled by deceptive media throughout history. In modern times, deception is amplified by social media, a platform designed to prioritize profits and user engagement over content moderation. The thesis then proposes a process flow for an app that teaches anyone how to evaluate news sources.

ContributorsLee, Helen (Author) / Sopha, Matthew (Thesis director) / Roschke, Kristy (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor)
Created2022-05
Description

This thesis looks at recent and historical examples of mis- and disinformation and finds that many psychological factors contribute to why people have been fooled by deceptive media throughout history. In modern times, deception is amplified by social media, a platform designed to prioritize profits and user engagement over content moderation. The thesis then proposes a process flow for an app that teaches anyone how to evaluate news sources.

ContributorsLee, Helen (Author) / Sopha, Matthew (Thesis director) / Roschke, Kristy (Committee member) / Barrett, The Honors College (Contributor) / Department of Information Systems (Contributor)
Created2022-05