Matching Items (14)
Description

This study addresses the question: is it possible for consumers to make informed decisions regarding their privacy, while using smartphones, in the face of the complex web of actors, incentives, and conveniences afforded by the technology? To address this question, the Social Construction of Technology (SCOT) model is used to analyze common situations consumers find themselves engaged in. Using the SCOT model, relevant actors are identified; their interpretations of various technologies are expressed; relative power is discussed; and possible directions for closure are examined. This analysis takes place by looking at three specific themes within privacy disputes in general: anonymity, confidentiality, and surveillance. These themes are compared and contrasted in regards to their impact on perception of privacy and implications for closure. Arguments are supported through evidence drawn from scholarship on the topic as well as industry and news media. Conclusions are supported through the framework of anticipatory governance.
Contributors: Kula, Shane (Author) / Hackett, Ed (Thesis director) / Sarewitz, Daniel (Committee member) / Wetmore, Jamey (Committee member) / Barrett, The Honors College (Contributor) / College of Letters and Sciences (Contributor) / School of Sustainability (Contributor) / School of Life Sciences (Contributor)
Created: 2015-05
Description

Data has quickly become a cornerstone of society. Across our daily lives, industry, policy, and more, we are experiencing what can only be called a “data revolution.” While data is gaining more and more importance, consumers do not fully understand the extent of its use and subsequent capitalization by companies. This paper explores the current climate of data security and data privacy, and it aims to start a conversation about the culture around the sharing and collection of data. We explore data privacy in four tiers. First, we examine the current cultural and social perception of data privacy, its relevance in our daily lives, and its importance in society’s dialogue. Next, we look at current policy and legislation, focusing primarily on Europe’s established GDPR and the incoming California Consumer Privacy Act, to see what measures are already in place and what measures need to be adopted to build a culture of transparency. Next, we analyze the power of regulators such as the FTC and SEC to see what tools they have at their disposal to ensure accountability in the tech industry when it comes to how our data is used. Lastly, we look at treating and viewing data as an asset, and the implications of doing so for possible valuation and depreciation techniques. The goal of this paper is to outline initial steps toward better understanding and regulating data privacy and collection practices. We aim to bring this issue to the forefront of conversation in society, taking the first step in the metaphorical marathon of data privacy, with the goal of establishing better data privacy controls and becoming a more data-conscious society.
Contributors: Anderson, Thomas C (Co-author) / Shafeeva, Zarina (Co-author) / Swiech, Jakub (Co-author) / Marchant, Gary (Thesis director) / Sopha, Matthew (Committee member) / WPC Graduate Programs (Contributor) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

This paper uses Facebook as a case study for other technological and social media companies, given factors presented by the Digital Age. Three different pillars are used to analyze the company. First, an examination of the manipulation of Facebook users by Russian actors is presented. Next, the paper examines whether Facebook is promoting civic participation for good. Lastly, an analysis of the rising trend of hate speech and extremists using the site is presented. This examination of Facebook then posed three questions regarding companies in the Digital Age as a whole. The first was, "What is the extent of Corporate Social Responsibility in the Digital Age?" The second was, "What special obligations do for-profit companies have when it comes to safeguarding the privacy of individuals, or at least ensuring that their stored information does not harm them?" The last was, "How can the profit motive and corporate morality co-exist in the Digital Age?" The findings of this case study showed that, due to different factors presented in the Digital Age, the ideals of Corporate Social Responsibility, privacy, and corporate morality may be even more challenging to uphold during this Age of Information. For that reason, companies such as Facebook have an even greater responsibility to abide by these ideals, because of an even larger potential for negative effects from technological change. Regardless of the possibility of regulation by government, third-party organizations, or the organizations themselves, Digital Age corporations have a duty to protect their users from harm and maintain these three ideals.
Contributors: Brandt, Madeline (Author) / Zachary, Gregg (Thesis director) / Wetmore, Jameson (Committee member) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description

Our lives are documented and facilitated by the internet. Given that an increasing proportion of time is being spent online, search and browsing history offers a unique frame of reference to conduct a qualitative study since it contains individual goals, day-to-day experiences, illicit thoughts, and questions, all while capturing sentiments rather than statistics. Seeing this recorded daily activity mapped out over the course of several years would hopefully provide a startling reminder of how life can be accurately and simply described as a series of constantly evolving interests and intentions, as well as give a sense of how exhaustively massive internet companies collect private information online. The search engine giant Google offers its users the transparency and freedom to export and download an archive of their web activity through a service known as Google Takeout. We propose using this service to empower ordinary individuals with Google accounts by developing a comprehensive and qualitative approach to understanding and gaining insights about their personal behavior online. In this paper, we first define and analyze the need for such a product. Then we conduct a variety of intent and interest-sensitive computational analysis methods on a sample browser history to explore and contextualize emergent trends, as a proof of concept. Finally, we create a blueprint for building an interactive application which uses our approach to generate dynamic dashboards and unique user profiles from search and browsing data.
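The abstract does not specify which "computational analysis methods" the proof of concept used. Purely as a hedged illustration of the kind of analysis a Google Takeout export makes possible, the following minimal Python sketch counts the most-visited domains in a browsing history; the record structure (a list of dicts with a "url" key) is an assumption, not taken from the thesis or from any fixed Takeout schema.

```python
# Hypothetical sketch: summarize a browsing-history export by domain.
# The input format here (a list of dicts with a "url" field) is an
# assumption standing in for a real Google Takeout export.
from collections import Counter
from urllib.parse import urlparse

def top_domains(history_entries, n=5):
    """Return the n most-visited domains in a list of history records."""
    counts = Counter(
        urlparse(entry["url"]).netloc
        for entry in history_entries
        if "url" in entry
    )
    return counts.most_common(n)

# Inline sample data standing in for a real export:
sample = [
    {"url": "https://news.example.com/a", "title": "Article A"},
    {"url": "https://news.example.com/b", "title": "Article B"},
    {"url": "https://mail.example.org/inbox", "title": "Inbox"},
]
print(top_domains(sample))
```

A real implementation along the thesis's blueprint would load the exported JSON file and feed its records through the same kind of aggregation before rendering a dashboard.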
Contributors: Li, Jason (Author) / Sopha, Matthew (Thesis director) / Shutters, Shade (Committee member) / Department of Information Systems (Contributor) / Dean, W.P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-12
Description

There are potential risks when individuals choose to share information on social media platforms such as Facebook. With over 2.20 billion monthly active users, Facebook has the largest collection of user information of any social media site. Because of this large collection of data, Facebook has constantly received criticism for its data privacy policies, and it has repeatedly changed those policies in an effort to protect both itself and end users. However, changes in privacy policy may not translate into users changing their privacy controls. The goal of Facebook's privacy controls is to let users be in charge of their own data privacy. The goal of this study was to determine whether a gap exists between users' perceived privacy and reality. Where such a gap existed, we investigated whether certain information about a user was related to their ability to implement their settings successfully. We gathered information from ASU college students, including gender, field of study, political affiliation, leadership involvement, privacy settings, and online behaviors. After collecting the data, we reviewed each participant's Facebook profile to examine the gap between their privacy settings and the information available to a stranger. We found that a difference between settings and reality did exist, and that it was not related to any of the users' background information.
Contributors: Pascua, Raphael Matthew Bustos (Author) / Bazzi, Rida (Thesis director) / Dasgupta, Partha (Committee member) / Computer Science and Engineering Program (Contributor) / W.P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description

Smartphone privacy is a growing concern around the world; smartphone applications routinely take personal information from our phones and monetize it for their own profit. Worse, they're doing it legally: the Terms of Service allow companies to use this information to market, promote, and sell personal data, and most users seem to be either unaware of it or unconcerned by it. This has negative implications for the future of privacy, particularly as the idea of smart home technology becomes a reality. If this is what privacy looks like now, with only one major type of smart device on the market, what will the future hold when smart home systems come into play? To examine this question, I investigated how much smartphone users of a specific demographic (millennials aged 18-25) knew about their smartphones' data and where it goes. I wanted three questions answered: For what purposes do millennials use their smartphones? What do they know about smartphone privacy and security? How will this affect the future of privacy? To accomplish this, I gathered information through a survey distributed to millennials attending Arizona State University. Using statistical analysis, I exposed trends for this demographic, discovering that there isn't a lack of knowledge among millennials; most are aware that smartphone apps can collect and share data, and many of the participants are not comfortable with the current state of smartphone privacy. However, more than half of the study participants indicated that they never read an app's Terms of Service. Due to the nature of the privacy vs. convenience argument, users willingly agree to let apps take their personal information, since they don't want to give up the convenience.
Contributors: Jones, Scott Spenser (Author) / Atkinson, Robert (Thesis director) / Chavez-Echeagaray, Maria Elena (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description

Currently, conventional Subtitle D landfills are the primary means of disposing of waste in the United States. While this method of waste disposal aims to protect the environment, it does so through liners and caps that effectively freeze the breakdown of waste. Because this method can keep landfills active, and thus a potential groundwater threat, for over a hundred years, I take an in-depth look at the ability of bioreactor landfills to quickly stabilize waste. In the thesis I detail the current state of bioreactor landfill technologies, assessing the pros and cons of anaerobic and aerobic bioreactor technologies. Finally, with an industrial perspective, I conclude that moving to bioreactor landfills as an alternative isn't as simple as it may first appear, and that it is a contextually specific solution that must be further refined before replacing current landfills.
Contributors: Whitten, George Avery (Author) / Kavazanjian, Edward (Thesis director) / Allenby, Braden (Committee member) / Houston, Sandra (Committee member) / Civil, Environmental and Sustainable Engineering Programs (Contributor) / Barrett, The Honors College (Contributor)
Created: 2013-05
Description

The development of computational systems known as brain-computer interfaces (BCIs) offers the possibility of allowing individuals disabled by neurological disorders such as Amyotrophic Lateral Sclerosis (ALS) and ischemic stroke the ability to perform relatively complex tasks such as communicating with others and walking. BCIs are closed-loop systems that record physiological signals from the brain and translate those signals into commands that control an external device such as a wheelchair or a robotic exoskeleton. Despite the potential for BCIs to vastly improve the lives of almost one billion people, one question arises: Just because we can use brain-computer interfaces, should we? The human brain is an embodiment of the mind, which is largely seen to determine a person's identity, so a number of ethical and philosophical concerns emerge over current and future uses of BCIs. These concerns include privacy, informed consent, autonomy, identity, enhancement, and justice. In this thesis, I focus on three of these issues: privacy, informed consent, and autonomy. The ultimate purpose of brain-computer interfaces is to provide patients with a greater degree of autonomy; thus, many of the ethical issues associated with BCIs are intertwined with autonomy. Currently, brain-computer interfaces exist mainly in the domain of medicine and medical research, but recently companies have started commercializing BCIs and providing them at affordable prices. These consumer-grade BCIs are primarily for non-medical purposes, and so they are beyond the scope of medicine. As BCIs become more widespread in the near future, it is crucial for interdisciplinary teams of ethicists, philosophers, engineers, and physicians to collaborate to address these ethical concerns now before BCIs become more commonplace.
Contributors: Chu, Kevin Michael (Author) / Ankeny, Casey (Thesis director) / Robert, Jason (Committee member) / Frow, Emma (Committee member) / Harrington Bioengineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor) / School for the Future of Innovation in Society (Contributor) / Lincoln Center for Applied Ethics (Contributor)
Created: 2016-05
Description

By evaluating recent anti-terror legislation, this project examines to what extent individual American rights and values are affected.

Contributors: Garrison, Stephen (Author) / DeCarolis, Claudine (Thesis director) / Gordon, Karen (Committee member) / School of Public Affairs (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description


This survey collects information on participants' beliefs about privacy and security, their general digital knowledge, their demographics, and their willingness-to-pay points for deleting information from their social media, in order to see how an information treatment affects those payment points. The information treatment is meant to make half of the participants think about the deeper ramifications of the information they reveal. The initial hypothesis is that this treatment will make people want to pay more to remove their information from the web, but the results show a surprising negative correlation with the treatment.

Contributors: Deitrick, Noah Sumner (Author) / Silverman, Daniel (Thesis director) / Kuminoff, Nicolai (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05