Description
This creative project is the first draft of a database of financial records from Arizona law enforcement's use of the state asset forfeiture program from fiscal years 2011-2015. Asset forfeiture is a program by which law enforcement can seize property suspected of having been used in a crime and can then use the property, cash, or proceeds from the property's auction for its own purposes, raising questions of conflicts of interest. The paper explains the methodology and goals for the database, while the database itself represents more than 11,000 pages of financial records and contains more than 70,300 cells.
Contributors: Mahoney, Emily Livingston (Author) / Doig, Steve (Thesis director) / Petchel, Jacqueline (Committee member) / Walter Cronkite School of Journalism and Mass Communication (Contributor) / School of Music (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Software systems can exacerbate and cause contemporary social inequities. As such, scholars and activists have scrutinized sociotechnical systems like those used in facial recognition technology or predictive policing using the frameworks of algorithmic bias and dataset bias. However, these conversations are incomplete without study of data models: the structural, epistemological, and technical frameworks that shape data. In Modeling Power: Data Models and the Production of Social Inequality, I elucidate the connections between relational data modeling techniques and manifestations of systems of power in the United States, specifically white supremacy and cisgender normativity. This project has three distinct parts. First, I historicize early publications by E. F. Codd, Peter Chen, Miles Smith & Diane Smith, and J. R. Abrial to demonstrate that now-taken-for-granted data modeling techniques were products of their social and technical moments and, as such, reinforced dominant systems of power. I further connect database reification techniques to contemporary racial analyses of reification via the work of Cheryl Harris. Second, I reverse engineer Android applications (with Jadx and apktool) to uncover the relational data models within. I analyze DAO annotations, create entity-relationship diagrams, and then examine those resultant models, again linking them back to systems of race and gender power. I craft a method for performing a reverse engineering investigation within a specific sociotechnical context -- a situated analysis of the contextual epistemological frames embedded within relational paradigms. Finally, I develop a relational data model that integrates insights from the project’s reverse and historical engineering phases. In my speculative engineering process, I suggest that the temporality of modern digital computing is incommensurate with the temporality of modern transgender lives. 
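The reverse-engineering phase described above recovers relational data models from decompiled Android sources by examining persistence-layer annotations. As a minimal illustrative sketch (not the author's actual tooling), the snippet below scans a fragment of decompiled Java, of the kind a tool like Jadx emits, for Room-style `@Entity` declarations; the class, table, and field names are assumptions for demonstration only.

```python
import re

# Hypothetical fragment of decompiled Java source, as a decompiler such
# as jadx might produce; names here are illustrative, not from any real app.
decompiled = """
@Entity(tableName = "users")
public class User {
    @PrimaryKey
    public int id;
    @ColumnInfo(name = "gender")
    public String gender;
}
"""

def find_entities(source):
    """Return the table names declared via Room-style @Entity annotations."""
    return re.findall(r'@Entity\(tableName\s*=\s*"([^"]+)"\)', source)

print(find_entities(decompiled))  # ['users']
```

A pass like this yields the entity names from which an entity-relationship diagram can then be drawn, making the embedded data model available for the kind of situated analysis the abstract describes.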
Following this, I speculate and build a trans-inclusive data model that demonstrates uses of reification to actively subvert systems of racialized and gendered power. By promoting aspects of social identity to first-order objects within a data model, I show that additional “intellectual manageability” is possible through reification. Through each part, I argue that contemporary approaches to the social impacts of software systems are incomplete without data models. Data models structure algorithmic opportunities. As algorithms continue to reinforce systems of inequality, data models provide opportunities for intervention and subversion.
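To make the idea of promoting social identity to a first-order object concrete, here is a minimal sketch of one possible reified schema, not the author's actual model: rather than storing gender as a fixed string column on a person row, gender identity becomes its own table with temporal validity, so changes over time are recorded rather than overwritten. All table and column names are assumptions for illustration.

```python
import sqlite3

# In-memory database; schema is an illustrative reification sketch only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE person (
    id INTEGER PRIMARY KEY,
    name TEXT
);
CREATE TABLE gender_identity (
    id INTEGER PRIMARY KEY,
    person_id INTEGER REFERENCES person(id),
    label TEXT,        -- self-described label, not an enumerated code
    valid_from TEXT    -- when this identity record took effect
);
""")
conn.execute("INSERT INTO person VALUES (1, 'A.')")
conn.execute("INSERT INTO gender_identity VALUES (1, 1, 'nonbinary', '2020-01-01')")
conn.execute("INSERT INTO gender_identity VALUES (2, 1, 'trans woman', '2022-06-01')")

# The full history survives as rows, instead of being destroyed by an UPDATE.
rows = conn.execute(
    "SELECT label FROM gender_identity WHERE person_id = 1 ORDER BY valid_from"
).fetchall()
print([r[0] for r in rows])  # ['nonbinary', 'trans woman']
```

The design choice the sketch highlights is the one the abstract argues for: because identity is a first-order entity rather than an attribute, the model can represent change over time, addressing the temporal incommensurability the author identifies.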
Contributors: Stevens, Nikki Lane (Author) / Wernimont, Jacqueline D (Thesis advisor) / Michael, Katina (Thesis advisor) / Richter, Jennifer (Committee member) / Duarte, Marisa E. (Committee member) / Arizona State University (Publisher)
Created: 2022