Stellar mass loss strongly influences the overall evolution of a star: the amount of mass lost during a star's lifetime determines which remnant is left behind and how the circumstellar environment is affected. Several mass loss rate prescriptions have been proposed for use in stellar evolution codes, and codes adopting different prescriptions yield discrepant results. In this paper, I compare the effect of varying the mass loss rate on the initial-final mass relation in the stellar evolution code TYCHO. I computed four sets of models with varying mass loss rates and metallicities. Because a large number of models reached the luminous blue variable stage, only the two lower-metallicity groups were considered. Their mass loss was analyzed using Python; luminosity, temperature, and radius were also compared. The initial-final mass relation plots showed that, in the 1/10 solar metallicity case, reducing the mass loss rate tended to increase the dependence of final mass on initial mass. The limited nature of these results implies a need for further study of the effects of using different mass loss rates in TYCHO.
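The kind of initial-final mass comparison described above can be sketched in Python. This is a minimal illustration, not the author's actual analysis: the model masses below are hypothetical, and the "dependence of final mass on initial mass" is quantified here simply as the slope of a least-squares linear fit.

```python
import numpy as np

def ifmr_slope(initial_masses, final_masses):
    """Slope of a linear fit of final mass vs. initial mass.

    A steeper slope indicates a stronger dependence of final mass
    on initial mass, as reported for reduced mass loss rates at
    1/10 solar metallicity.
    """
    slope, _intercept = np.polyfit(initial_masses, final_masses, 1)
    return slope

# Hypothetical model grids (solar masses): the same initial masses
# evolved under two mass loss rate choices.
m_init = np.array([15.0, 20.0, 25.0, 30.0])
m_final_standard = np.array([12.0, 14.0, 15.5, 16.5])  # stronger mass loss
m_final_reduced = np.array([13.5, 17.0, 20.5, 24.0])   # reduced mass loss

print(ifmr_slope(m_init, m_final_standard))  # shallower relation
print(ifmr_slope(m_init, m_final_reduced))   # steeper relation
```

With these invented numbers the reduced-mass-loss grid yields the steeper slope, mirroring the trend the abstract reports; the real comparison would use final masses read from the TYCHO model outputs.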