Matching Items (4)

Description
The foundations of legacy media, especially the news media, are not as strong as they once were. A digital revolution has changed the industry's operating models, and journalistic organizations are trying to find their place in the new market. This project analyzes the effects of new and emerging technologies on the journalism industry. Five categories of technology will be explored: the semantic web, automation software, data analysis and aggregators, virtual reality, and drone journalism. The potential of these technologies will be assessed according to four guidelines: ethical implications, effects on the reportorial process, business impacts, and changes to the consumer experience. Upon examination, it is apparent that no single technology will offer the journalism industry the remedy it has been searching for. Some combination of emerging technologies, however, may form the basis for the next generation of news. Findings are presented on a website that features video, visuals, linked content, and original graphics: http://www.explorenewstech.com/
Created: 2016-05
Description
Academic integrity policies written specifically for journalism schools or departments are devised to foster a realistic, informative learning environment. Plagiarism and fabrication are two of the most egregious errors of judgment a journalist can commit, and journalism schools and departments address these errors through their academic integrity policies. Some schools take a zero-tolerance approach, often expelling the student after the first or second violation, while other schools take a tolerant approach, in which a student is permitted at least three violations before suspension is considered. In a time when plagiarizing and fabricating stories have never been easier to commit, and never easier to catch, students must be prepared to understand plagiarism and fabrication involving multimedia elements such as video, audio, and photos. In this project, journalism academic integrity codes were gathered from across the U.S. and assigned to a zero-tolerance, semi-tolerant, or tolerant category designed by the researcher, in order to determine which approach best prepares students for the professional journalism world and to suggest how some policies could be improved.
Contributors: Roney, Claire Marie (Author) / McGuire, Tim (Thesis director) / Russomanno, Joseph (Committee member) / W. P. Carey School of Business (Contributor) / Walter Cronkite School of Journalism and Mass Communication (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description
Plagiarism is a serious problem in a learning environment. In programming classes especially, plagiarism can be hard to detect because the appearance of source code can easily be modified without changing its intent through simple formatting changes or refactoring. A number of plagiarism detection tools attempt to encode knowledge about the programming languages they support in order to better detect obscured duplicates. Many such tools do not support a large number of languages because doing so requires too much code and therefore too much maintenance. It is also difficult to add support for new languages because each language differs substantially in syntax. Tools that are more extensible often achieve this by reducing the number of language features they encode, and they end up closer to text-comparison tools than to structurally aware program-analysis tools.

Kitsune attempts to remedy these issues by tying itself to Antlr, a pre-existing language recognition tool with over 200 currently supported languages. In addition, it provides an interface through which generic manipulations can be applied to the parse tree generated by Antlr. Because Kitsune relies on language-agnostic structure modifications, it can be adapted with minimal effort to provide plagiarism detection for new languages. Kitsune has been evaluated successfully on 10 of the languages in the Antlr grammar repository and could easily be extended to support all of the grammars currently available for Antlr, as well as future grammars developed as new languages emerge.
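
To illustrate the structural-normalization idea, here is a minimal sketch. It is not Kitsune's code and does not use the Antlr API; Python's built-in ast module stands in for an Antlr parse tree, and the placeholder names (_ID_, _FN_) are illustrative assumptions. Canonicalizing identifiers makes two differently named but structurally identical functions produce the same fingerprint, which is the kind of language-agnostic tree manipulation the abstract describes:

```python
# Minimal sketch (not Kitsune's code, not the Antlr API): Python's built-in
# ast module stands in for an Antlr parse tree to illustrate language-agnostic
# structural normalization. Renaming identifiers to canonical placeholders
# means refactored or reformatted copies of the same code fingerprint equally.
import ast


class Normalizer(ast.NodeTransformer):
    """Replace every identifier with a canonical placeholder."""

    def visit_Name(self, node):
        return ast.copy_location(ast.Name(id="_ID_", ctx=node.ctx), node)

    def visit_arg(self, node):
        node.arg = "_ID_"
        return node

    def visit_FunctionDef(self, node):
        self.generic_visit(node)
        node.name = "_FN_"
        return node


def fingerprint(source: str) -> str:
    """Parse, normalize, and dump the tree; equal dumps suggest duplicates."""
    return ast.dump(Normalizer().visit(ast.parse(source)))


original = "def total(xs):\n    s = 0\n    for x in xs:\n        s += x\n    return s\n"
renamed = "def accumulate(vals):\n    acc = 0\n    for v in vals:\n        acc += v\n    return acc\n"
print(fingerprint(original) == fingerprint(renamed))  # True: same structure, different names
```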
Contributors: Monroe, Zachary Lynn (Author) / Bansal, Ajay (Thesis advisor) / Lindquist, Timothy (Committee member) / Acuna, Ruben (Committee member) / Arizona State University (Publisher)
Created: 2020
Description
There has been substantial development in the field of data transmission in the last two decades. One no longer has to wait long for a high-definition video to load. Data compression is one of the most important technologies that helped achieve this seamless data transmission experience. It helps to store or send more data using less memory or fewer network resources. However, there appears to be a limit on the amount of compression that can be achieved with existing lossless data compression techniques because they rely on the frequency of characters or sets of characters in the data. This thesis proposes a lossless data compression technique in which the data is compressed by representing it as a set of parameters that reproduce the original data without any loss when substituted into the corresponding mathematical equation. The mathematical equation used in the thesis is the sum of the first N terms of a geometric series. Various changes are made to this equation so that any given data can be compressed and decompressed. In the proposed technique, the whole data is taken as a single decimal number and replaced with one of the terms of the equation; all the other terms are computed and stored as the compressed file. The performance of the developed technique is evaluated in terms of compression ratio, compression time, and decompression time, and these metrics are compared with those of other existing techniques in the same domain.
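
For reference, the equation the abstract points to is the standard closed form for the sum of the first N terms of a geometric series with first term a and common ratio r; the thesis's specific modifications to it are not detailed in this abstract:

```latex
% Standard closed form referenced by the abstract (the thesis modifies it further):
S_N = a + ar + ar^2 + \dots + ar^{N-1} = a\,\frac{r^{N} - 1}{r - 1}, \qquad r \neq 1
```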
Contributors: Grewal, Karandeep Singh (Author) / Gonzalez Sanchez, Javier (Thesis advisor) / Bansal, Ajay (Committee member) / Findler, Michael (Committee member) / Arizona State University (Publisher)
Created: 2021