Description
Glioblastoma multiforme (GBM) is a malignant, aggressive, and infiltrative cancer of the central nervous system with a median survival of 14.6 months under standard care. Diagnosis of GBM is made using medical imaging such as magnetic resonance imaging (MRI) or computed tomography (CT). Treatment is informed by medical images and includes chemotherapy, radiation therapy, and surgical removal if the tumor is surgically accessible. Treatment seldom results in a significant increase in longevity, partly due to the lack of precise information regarding tumor size and location. This lack of information arises from the physical limitations of MR and CT imaging coupled with the diffusive nature of glioblastoma tumors. GBM tumor cells can migrate far beyond the visible boundaries of the tumor and will result in a recurring tumor if not killed or removed. Since medical images are the only readily available information about the tumor, we aim to improve mathematical models of tumor growth to better estimate the missing information. In particular, we investigate the effect of random variation in tumor cell behavior (anisotropy) using stochastic parameterizations of an established proliferation-diffusion model of tumor growth. To evaluate the performance of our mathematical model, we use MR images from an animal model consisting of murine GL261 tumors implanted in immunocompetent mice, which provides consistency in tumor initiation and location, immune response, genetic variation, and treatment. Compared to non-stochastic simulations, stochastic simulations showed improved volume accuracy when proliferation variability was high, but diffusion variability was found to only marginally affect tumor volume estimates. Neither proliferation nor diffusion variability significantly affected the spatial distribution accuracy of the simulations. While certain cases of stochastic parameterizations improved volume accuracy, they failed to significantly improve simulation accuracy overall.
Both the non-stochastic and stochastic simulations failed to achieve over 75% spatial distribution accuracy, suggesting that the underlying structure of the model fails to capture one or more biological processes that affect tumor growth. Two biological features that are candidates for further investigation are angiogenesis and anisotropy resulting from differences between white and gray matter. Time-dependent proliferation and diffusion terms could be introduced to model angiogenesis, and diffusion tensor imaging (DTI) could be used to differentiate between white and gray matter, which might allow for improved estimates of brain anisotropy.
ContributorsAnderies, Barrett James (Author) / Kostelich, Eric (Thesis director) / Kuang, Yang (Committee member) / Stepien, Tracy (Committee member) / Harrington Bioengineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Aberrant glycosylation has been shown to be linked to specific cancers, and using this idea, it was proposed that the levels of glycans in the blood could predict stage I adenocarcinoma. To track this glycosylation, glycans were broken down into glycan nodes via methylation analysis. This analysis utilized information from N-, O-, and lipid-linked glycans detected by gas chromatography-mass spectrometry. The resulting glycan node ratios represent the initial quantitative data that were used in this experiment.
For this experiment, two sets of 50 µL blood plasma samples were provided by NYU Medical School. These samples were analyzed by Dr. Borges's lab to obtain normalized biomarker levels from patients with stage I adenocarcinoma and from control patients matched for age, smoking status, and gender. ROC curves were constructed for individual and paired biomarkers, and the AUC was calculated in Wolfram Mathematica 10.2. Methods such as increasing the size of the training set, using hard versus soft margins, and processing biomarkers together and individually were used to increase the AUC. Using a soft margin proved most useful for this particular data set, raising the AUC from 0.6013 (with the initial hard margin) to 0.6585. Of the biomarkers examined, the paired 6-Glc/6-Man and 3,6-Gal glycan node ratios performed best, with an AUC of 0.7687, a sensitivity of 0.7684, and a specificity of 0.6051. While this is not enough accuracy to become a primary diagnostic tool for diagnosing stage I adenocarcinoma, the methods examined in this paper should be evaluated further. By comparison, the current standard clinical blood test for prostate cancer has an AUC of only 0.67.
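The AUC figures above can be computed directly from classifier scores via the rank-sum (Mann-Whitney) equivalence: the AUC equals the probability that a randomly chosen positive case scores above a randomly chosen negative case. A minimal pure-Python sketch, with hypothetical labels and scores rather than the study's data:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney rank-sum formulation, with average
    ranks assigned to tied scores. labels are 0 (control) or 1 (case)."""
    pairs = sorted(zip(scores, labels))
    rank_sum_pos = 0.0
    i = 0
    while i < len(pairs):
        j = i
        while j < len(pairs) and pairs[j][0] == pairs[i][0]:
            j += 1                      # group of tied scores: indices i..j-1
        avg_rank = (i + 1 + j) / 2.0    # average of 1-based ranks i+1..j
        for k in range(i, j):
            if pairs[k][1] == 1:
                rank_sum_pos += avg_rank
        i = j
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    return (rank_sum_pos - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)

# Hypothetical biomarker levels: higher scores should indicate disease.
auc = roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # 0.75 on this toy data
```

An AUC of 0.5 corresponds to chance; values like the 0.7687 reported above indicate moderate discriminative power.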
ContributorsDe Jesus, Celine Spicer (Author) / Taylor, Thomas (Thesis director) / Borges, Chad (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of Molecular Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
The detection and characterization of transients in signals is important in many wide-ranging applications, from computer vision to audio processing. Edge detection on images is typically realized using small, local, discrete convolution kernels, but this is not possible when samples are measured directly in the frequency domain. The concentration factor edge detection method was therefore developed to realize an edge detector directly from spectral data. This thesis explores the possibilities of detecting edges from the phase of the spectral data, that is, without the magnitude of the sampled spectral data. Prior work has demonstrated that the spectral phase contains particularly important information about underlying features in a signal. Furthermore, the concentration factor method yields some insight into the detection of edges in spectral phase data. An iterative design approach was taken to realize an edge detector using only the spectral phase data, also allowing for the design of an edge detector when phase data are intermittent or corrupted. Problem formulations showing the power of the design approach are given throughout. A post-processing scheme relying on the difference of multiple edge approximations yields a strong edge detector which is shown to be resilient under noisy, intermittent phase data. Lastly, a thresholding technique is applied to give an explicit enhanced edge detector ready to be used. Examples are demonstrated throughout on both signals and images.
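The classical concentration factor method that this thesis builds on forms the sum S[f](x) = i Σ sgn(k) σ(|k|/N) f̂(k) e^{ikx}, whose magnitude peaks at jump discontinuities. A rough sketch using the trigonometric factor σ(s) = sin(πs) on a square wave; the thesis's phase-only variant is not reproduced here, and the helper names are illustrative:

```python
import cmath
import math

def fourier_coeffs(f, N, M=512):
    """Approximate Fourier coefficients f_hat(k), |k| <= N, of a
    2*pi-periodic function f by a discrete Riemann sum."""
    xs = [2 * math.pi * j / M for j in range(M)]
    return {k: sum(f(x) * cmath.exp(-1j * k * x) for x in xs) / M
            for k in range(-N, N + 1)}

def jump_approx(fhat, N, x):
    """Concentration-factor sum: its magnitude concentrates at jumps."""
    sigma = lambda s: math.sin(math.pi * s)  # trigonometric concentration factor
    total = 0j
    for k, c in fhat.items():
        if k != 0:
            total += 1j * math.copysign(1, k) * sigma(abs(k) / N) * c \
                     * cmath.exp(1j * k * x)
    return total

# Square wave with jumps at x = 0 and x = pi.
f = lambda x: 1.0 if (x % (2 * math.pi)) < math.pi else -1.0
N = 32
fhat = fourier_coeffs(f, N)
grid = [2 * math.pi * j / 400 for j in range(400)]
peak = max(grid, key=lambda x: abs(jump_approx(fhat, N, x)))  # lands near a jump
```

Thresholding |S[f]| as in the final chapter described above then turns this jump approximation into an explicit edge detector.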
ContributorsReynolds, Alexander Bryce (Author) / Gelb, Anne (Thesis director) / Cochran, Douglas (Committee member) / Viswanathan, Adityavikram (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Company X is one of the world's largest manufacturers of semiconductors. The company relies on various suppliers in the U.S. and around the globe for its manufacturing process. The financial health of these suppliers is vital to the continuation of Company X's business without any material interruption. Therefore, it is in Company X's interest to monitor its suppliers' financial performance. Company X has a supplier financial health model currently in use. Having been developed prior to watershed events like the Great Recession, the current model may not reflect the significant changes in the economic environment due to these events. Company X wants to know if there is a more accurate model for evaluating supplier health that better indicates business risk. The scope of this project will be limited to a sample of 24 suppliers, representative of Company X's supplier base, that are public companies. While Company X's suppliers consist of both private and public companies, the use of exclusively public companies ensures that we will have sufficient and appropriate data for the necessary analysis. The goal of this project is to discover if there is a more accurate model for evaluating the financial health of publicly traded suppliers that better indicates business risk. Analyzing this problem will require a comprehensive understanding of the various financial health models available and their components. The team will study industry best practices and the academic literature. This comprehension will allow us to customize a model by incorporating metrics that allow greater accuracy in evaluating supplier financial health in accordance with Company X's values.
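One widely studied financial health model of the kind the team could benchmark against is Altman's original (1968) Z-score for publicly traded manufacturing firms. A minimal sketch with the standard published coefficients; the input figures in the comments are hypothetical:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_equity, sales, total_assets, total_liabilities):
    """Original Altman Z-score for public manufacturing firms.
    Conventionally, Z > 2.99 suggests a 'safe' zone and Z < 1.81
    suggests financial distress."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Hypothetical healthy supplier (figures in $M): Z lands well above 2.99.
healthy = altman_z(30, 40, 20, 120, 100, 100, 50)
# Hypothetical distressed supplier: negative earnings pull Z below 1.81.
distressed = altman_z(-10, -20, -5, 10, 50, 100, 90)
```

A customized model of the kind the project proposes would reweight or swap these ratios to fit Company X's supplier base.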
ContributorsLi, Tong (Co-author) / Gonzalez, Alexandra (Co-author) / Park, Zoon Beom (Co-author) / Vogelsang, Meridith (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Mike (Committee member) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / School of Accountancy (Contributor) / WPC Graduate Programs (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
For our collaborative thesis we explored the U.S. electric utility market and how the Internet of Things (IoT) technology movement could enable an advancement of the existing grid. The objective of this project was to understand the market trends in the utility space and identify where a semiconductor manufacturing company, with a focus on IoT technology, could penetrate the market using its products. Our research methodology was to conduct industry interviews to identify common trends in the utility and industrial hardware manufacturer industries. From there, we composed various strategies that The Company should explore. These strategies were supported by qualitative reasoning and by forecasted discounted cash flow and net present value analysis. We confirmed that The Company should use specific silicon microprocessors and microcontrollers that match the analytics demands of each of the four devices analyzed. Along with a silicon strategy, our group believes that there is a strong argument for a data analytics software package, formed through strategic partnerships in this space.
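The discounted cash flow and net present value analysis mentioned above reduces to discounting each forecast period's cash flow back to today. A minimal sketch with hypothetical figures (not the project's actual forecasts):

```python
def npv(rate, cashflows):
    """Net present value: cashflows[0] occurs today (t = 0) and
    cashflows[t] occurs t periods from now, discounted at `rate`."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical market-entry strategy (figures in $M): $10M upfront,
# then five years of $3M cash flows, discounted at 10%.
project = npv(0.10, [-10.0, 3.0, 3.0, 3.0, 3.0, 3.0])  # ~1.37, so NPV-positive
```

Competing strategies are then ranked by NPV, with the positive-NPV option preferred, as in the strategy comparison the abstract describes.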
ContributorsLlazani, Loris (Co-author) / Ruland, Matthew (Co-author) / Medl, Jordan (Co-author) / Crowe, David (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Mike (Committee member) / Department of Economics (Contributor) / Department of Finance (Contributor) / Department of Supply Chain Management (Contributor) / Department of Information Systems (Contributor) / Hugh Downs School of Human Communication (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Chebfun is a collection of algorithms and an open-source software system in object-oriented Matlab that extends familiar powerful methods of numerical computation involving numbers to continuous or piecewise-continuous functions. The success of this strategy is based on the mathematical fact that smooth functions can be represented very efficiently by polynomial interpolation at Chebyshev points or, for periodic functions, by trigonometric interpolation at equispaced points. More recently, the system has been extended to handle bivariate functions and vector fields. These two new classes of objects are called Chebfun2 and Chebfun2v, respectively. We will show that Chebfun2 and Chebfun2v can be used to accurately and efficiently perform various computations on parametric surfaces in two or three dimensions, including path trajectories and mean and Gaussian curvatures. More advanced surface computations, such as mean curvature flows, are also explored. This is also the first work to use the newly implemented trigonometric representation, namely Trigfun, for computations on surfaces.
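The Chebyshev-interpolation fact underlying Chebfun can be illustrated outside Matlab. A minimal sketch of barycentric interpolation at Chebyshev points of the second kind, applied to an arbitrary smooth test function; the helper names are illustrative, not Chebfun's API:

```python
import math

def cheb_points(n):
    """Chebyshev points of the second kind on [-1, 1]."""
    return [math.cos(math.pi * j / n) for j in range(n + 1)]

def cheb_interp(fvals, xs, x):
    """Barycentric interpolation at Chebyshev points; the weights are
    (-1)^j, halved at the two endpoints."""
    num = den = 0.0
    for j, (xj, fj) in enumerate(zip(xs, fvals)):
        if x == xj:
            return fj
        w = (-1) ** j * (0.5 if j in (0, len(xs) - 1) else 1.0)
        num += w * fj / (x - xj)
        den += w / (x - xj)
    return num / den

# A smooth function is captured to near machine precision by a modest degree.
n = 30
xs = cheb_points(n)
f = lambda x: math.exp(x) * math.sin(5 * x)
p = lambda x: cheb_interp([f(xj) for xj in xs], xs, x)
```

This spectral accuracy for smooth functions is exactly why Chebfun's "compute with functions as if they were numbers" strategy works.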
ContributorsPage-Bottorff, Courtney Michelle (Author) / Platte, Rodrigo (Thesis director) / Kostelich, Eric (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
As the IoT (Internet of Things) market continues to grow, Company X needs to find a way to penetrate the market and establish larger market share. The problem with Company X's current strategy and cost structure lies in the fact that the fastest growing portion of the IoT market is microcontrollers (MCUs). As Company X currently focuses on manufacturing microprocessors (MPUs), its current manufacturing strategy is not optimal for entering competitively into the MCU space. Within the MCU space, the companies that compete best do not utilize such high-end manufacturing processes, because these low-cost products do not demand them. Given that the MCU market is largely untested by Company X and its products would need to be manufactured at increasingly lower costs, it runs the risk of overproducing and holding obsolete inventory that is either scrapped or sold at or below cost. In order to eliminate that risk, we explore alternative manufacturing strategies for Company X's MCU products specifically, which will allow for a more optimal cost structure and ultimately a more profitable Internet of Things Group (IoTG). The IoT MCU ecosystem does not require the high-powered technology Company X is currently manufacturing, and therefore Company X loses large margins to its unnecessarily leading-edge technology. Since cash is king, pursuing a fully external model for MCU design and manufacturing will generate the highest NPV for Company X. It will also increase Company X's market share, which is extremely important given that every tech company in the world is trying to get its hands into the IoT market. It is possible that ten to thirty years down the road Company X can manufacture enough units to bring its products in-house, but this is not feasible in the foreseeable future.
For now, Company X should focus on the low-cost MCU market by driving its prices down while keeping COGS and R&D low under our fully external strategy.
ContributorsKadi, Bengimen (Co-author) / Peterson, Tyler (Co-author) / Langmack, Haley (Co-author) / Quintana, Vince (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Department of Supply Chain Management (Contributor) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / Department of Marketing (Contributor) / School of Accountancy (Contributor) / W. P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Honey bees (Apis mellifera) are responsible for pollinating nearly 80% of all pollinated plants, meaning humans depend on honey bees to pollinate many staple crops. The success or failure of a colony is vital to global food production. There are various complex factors that can contribute to a colony's failure, including pesticides. Neonicotinoids are a class of pesticides that has seen widespread recent use. In this study we concern ourselves with pesticides and their impact on honey bee colonies. Previous investigations from which we draw significant inspiration include Khoury et al.'s "A Quantitative Model of Honey Bee Colony Population Dynamics," Henry et al.'s "A Common Pesticide Decreases Foraging Success and Survival in Honey Bees," and Brown's "Mathematical Models of Honey Bee Populations: Rapid Population Decline." In this project we extend a mathematical model to investigate the impact of pesticides on a honey bee colony, with birth and death rates dependent on pesticide exposure, and we examine how these rates influence the growth of a colony. Our analysis finds an equilibrium point that depends on pesticide exposure. Trace amounts of pesticide are detrimental, as they affect not only death rates but birth rates as well.
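As a toy stand-in for the extended model (not the thesis's actual equations, which build on Khoury et al.), a logistic colony model with pesticide-dependent birth and death rates shows how the equilibrium population shrinks with exposure; all parameter values and rate forms here are illustrative assumptions:

```python
def colony(p, b0=0.5, d0=0.05, kb=0.3, kd=0.2, K=10000.0, n0=500.0,
           dt=0.1, steps=5000):
    """Euler-integrate dN/dt = b(p)*N*(1 - N/K) - d(p)*N, where the
    birth rate falls and the death rate rises with pesticide exposure
    p in [0, 1]. Returns the population after `steps` time steps."""
    b = b0 * (1.0 - kb * p)   # pesticide suppresses brood production
    d = d0 * (1.0 + kd * p)   # pesticide raises forager mortality
    n = n0
    for _ in range(steps):
        n += dt * (b * n * (1.0 - n / K) - d * n)
        n = max(n, 0.0)
    return n

# The nonzero equilibrium is N* = K * (1 - d(p)/b(p)), so even trace
# exposure lowers the steady-state colony size through both rates.
clean, exposed = colony(0.0), colony(1.0)
```

Sweeping `p` and locating where N* hits zero gives the pesticide-dependent collapse threshold analogous to the equilibrium analysis described above.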
ContributorsSalinas, Armando (Author) / Vaz, Paul (Thesis director) / Jones, Donald (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Business students are trained to be professional problem solvers. To improve students' ability to solve real-life problems, more and more business schools are encouraging students to attend case competitions and complete internships before graduation. In the curriculum, students are required to work on business cases and projects in teams. However, due to limited exposure to real-life business scenarios, most undergraduate students feel unprepared when faced with business problems in course projects, case competitions, and internships. Therefore, the goal of this Honors Creative Project is to provide students with an interactive resource for succeeding in course projects, case competitions, and even internship projects. By introducing resources focused on analysis approaches and project management, students can learn from successful experiences and become more competitive in the job market. After competing at four case competitions against talented students from across the nation, we accumulated precious experience in case analysis and teamwork development within a high-pressure environment. In addition, our experiences with internships, consulting, and course projects have also aided our development in professionalism and quantitative analytics. Reflecting on what we have learned from these experiences, we strongly believe that the insights gained are not only a treasure for us individually, but also a great resource for our colleagues. We hope to transfer our knowledge to others for their own success, so that "best practices" can be learned.
ContributorsXiahou, Xiaonan (Co-author) / Thoi, Kenson (Co-author) / Printezis, Antonios (Thesis director) / Arrfelt, Mathias (Committee member) / Department of Supply Chain Management (Contributor) / Department of Economics (Contributor) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Financial statements are one of the most important, if not the most important, documents for investors. These statements are prepared quarterly and yearly by the company's accounting department, and are then audited in detail by a large external accounting firm. Investors use these documents to determine the value of the company, and trust that the company was truthful in its statements and that the auditing firm correctly audited the company's financial statements for any mistakes in its books and balances. Mistakes on a company's financial statements can be costly. However, financial fraud on the statements can be outright disastrous. Penalties for accounting fraud can include individual lifetime prison sentences, as well as company fines of billions of dollars. As students in the accounting major, it is our responsibility to ensure that financial statements are accurate and truthful to protect ourselves, other stakeholders, and the companies we work for. This ethics game takes the stories of Enron, WorldCom, and Lehman Brothers and uses them to help students identify financial fraud and how it can be prevented, as well as the consequences of unethical decisions in financial reporting. The Enron scandal involved CEO Kenneth Lay and his successor Jeffrey Skilling hiding losses in their financial statements with the help of their auditing firm, Arthur Andersen. Enron collapsed into bankruptcy in late 2001; Lay was convicted in 2006 but died before sentencing, and his conspirator Skilling was sentenced to 24 years in prison. In the WorldCom scandal, CEO Bernard "Bernie" Ebbers booked line costs as capital expenses (overstating WorldCom's assets) and created fraudulent accounts to inflate revenue and WorldCom's profit. Ebbers was sentenced to 25 years in prison and lost his title as WorldCom's Chief Executive Officer. Lehman Brothers took advantage of an accounting loophole known as Repo 105 that let the firm temporarily move roughly $50 billion in debt off its balance sheet.
No one at Lehman Brothers was sentenced to jail since the transaction was technically considered legal, but Lehman was the largest investment bank to fail and the only large financial institution that was not bailed out by the U.S. government.
ContributorsPanikkar, Manoj Madhuraj (Author) / Samuelson, Melissa (Thesis director) / Ahmad, Altaf (Committee member) / Department of Information Systems (Contributor) / School of Accountancy (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05