This was achieved by first using Offline Explorer, an application that downloads entire websites, to gather job postings from Dice.com matching a predefined list of technical skills. The downloaded postings were then parsed to extract and clean the required data, which was loaded into a database. Each company was matched to its industry using its NAICS (North American Industry Classification System) code. The job descriptions were then analyzed, and a set of soft skills was selected based on the results of Word2Vec, a family of models for creating word embeddings. All of the tables in the database were then combined into a master table, which was filtered to exclude postings that required too much experience. Lastly, a web app was built with Node.js as the back-end, allowing users to choose their desired criteria and browse the postings that meet them.
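The experience-filtering step described above could be sketched as follows. This is a minimal illustration, not the thesis's actual code: the function names, the regular expression, and the three-year cutoff are all hypothetical.

```python
import re

# Hypothetical cutoff: drop postings demanding more experience than a
# new graduate is likely to have (the thesis does not state its threshold).
MAX_YEARS = 3

# Matches phrases such as "5+ years" or "2 years of experience".
YEARS_RE = re.compile(r"(\d+)\s*\+?\s*years?", re.IGNORECASE)

def max_years_required(description: str) -> int:
    """Return the largest year count mentioned in a posting, or 0 if none."""
    matches = [int(m.group(1)) for m in YEARS_RE.finditer(description)]
    return max(matches, default=0)

def filter_postings(postings: list[dict], max_years: int = MAX_YEARS) -> list[dict]:
    """Keep postings requiring at most max_years of experience."""
    return [p for p in postings if max_years_required(p["description"]) <= max_years]

postings = [
    {"title": "Junior Python Developer", "description": "1-2 years of experience with SQL."},
    {"title": "Senior Architect", "description": "10+ years designing distributed systems."},
]
print([p["title"] for p in filter_postings(postings)])  # → ['Junior Python Developer']
```

A real pipeline would also need to handle spelled-out numbers ("five years") and ranges, but a simple numeric regex like this covers the common cases.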
Creation of a database and Python API to clean, organize, and streamline data collection from an updated Qualtrics survey used to capture applicant information for the Fleischer Scholars Program run by the W. P. Carey UG Admissions Office.
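A cleaning-and-loading step of the kind described above might look like the sketch below. The table layout and column names are illustrative, not the project's actual schema; the only Qualtrics-specific detail assumed is that CSV exports typically carry two extra metadata rows (question text and import IDs) directly under the column names.

```python
import csv
import io
import sqlite3

def load_responses(csv_text: str, conn: sqlite3.Connection) -> int:
    """Clean a Qualtrics-style CSV export and load it into SQLite.

    Hypothetical schema: three applicant fields. Returns rows inserted.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[3:]  # skip the two Qualtrics metadata rows
    conn.execute(
        "CREATE TABLE IF NOT EXISTS applicants (first_name TEXT, last_name TEXT, email TEXT)"
    )
    cleaned = [
        tuple(field.strip() for field in row)   # basic whitespace cleanup
        for row in data
        if any(field.strip() for field in row)  # drop blank rows
    ]
    placeholders = ",".join("?" * len(header))
    conn.executemany(f"INSERT INTO applicants VALUES ({placeholders})", cleaned)
    conn.commit()
    return len(cleaned)

sample = (
    "first_name,last_name,email\n"
    "First Name,Last Name,Email\n"
    '"{""ImportId"":""QID1""}","{""ImportId"":""QID2""}","{""ImportId"":""QID3""}"\n'
    "Ada,Lovelace,ada@example.edu\n"
)
conn = sqlite3.connect(":memory:")
print(load_responses(sample, conn))  # → 1
```

In practice the API would likely pull responses through the Qualtrics export endpoint rather than a hand-pasted CSV, but the cleaning and insertion logic would follow the same shape.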
This thesis contains all of the information, processes, and scripts used to create the final SQL database and website for University Housing at Arizona State University. The project aims to resolve problems that University Housing's Community Assistants currently face with resource distribution and related processes.