Browsing Cand.merc.MAT (Erhvervsøkonomi & Matematik / MSc in Mathematics) by Year Published
Now showing items 1-20 of 88
Abstract: This thesis shows how investors can profit from adding inflation-linked derivatives to their portfolios and how these securities can be used to hedge against inflation. In times of financial crisis, inflation is an important factor to consider when investing. The thesis therefore aims to provide a general understanding of how these securities can be used in risk management. URI: http://hdl.handle.net/10417/540 Files in this item: 1
heidi_modvig_nielsen.pdf (943.8Kb) 
Udledning, egenskaber og praktisk anvendelse. Bech Rasmussen, Martin (Frederiksberg, 2009)
Abstract: The main subject of this master thesis is to derive and analyse a closed-form solution for the stochastic volatility model developed by Stephen L. Heston in 1993. The main properties of the model are illustrated, and the practical use of the model is tested in various situations. The motivation for developing a stochastic volatility model is that the basic option pricing model developed by Black & Scholes [1973] is not consistent with observed market prices. One of the main characteristics of the Black-Scholes model is that the underlying asset follows a geometric Brownian motion with constant volatility. However, empirical studies show that the volatility is not constant over time, so Heston [1993] suggested an extension to the Black-Scholes model in which the volatility follows a stochastic process and thus varies over time. The technique used by Heston to derive the closed-form solution for option prices under stochastic volatility is based on solving two partial differential equations for the characteristic functions associated with the probabilities. The connection to the probabilities in the model is then calculated using an inverse Fourier transformation. The solution technique is presented in detail in chapter 3 of this thesis, and it can be used to derive closed-form solutions to many different problems. The derivation of the partial differential equations for the characteristic functions is shown in appendix B. Another version of the Heston model, known as the Displaced Heston model, is also introduced. The main difference between the two models is that the process for the underlying asset is defined differently in the Displaced Heston model, which gives the model the advantage of allowing the correlation between the underlying asset and the variance to be zero without losing one of the important properties of the stochastic volatility model. In some applications, this proves to be an advantage.
The main effect that causes the Heston model to differ from the Black-Scholes model is its ability to generate skewness and kurtosis in the probability density function. In the Heston model, skewness is generated by the correlation parameter, and kurtosis is generated by the volatility-of-volatility parameter. In the Displaced Heston model the correlation was set to zero, but this model still has the ability to generate skewness through the displacement parameter. The effects on the probability density function lead to another important property of the model: its ability to generate a volatility smile similar to the volatility smiles observed in the market. In the volatility smile, the skew is generated by the correlation parameter, and the smile is generated by the volatility-of-volatility parameter. The effects of all the other model parameters on the volatility smile are analysed in chapter 4 of this thesis. A practical use is introduced in chapter 5, where the two stochastic volatility models are used to price the interest-rate derivative known as a cap. The model parameters are calibrated to observed market prices by minimising the sum of the squared pricing errors over all maturities and exercise prices. Various tests show that the Displaced Heston model is the best model to use on the data in this thesis, and that it performs very well in describing the market data. Another type of practical use is introduced in chapter 6, where the Heston model's simulation abilities are tested in a Monte Carlo setup. A simple Euler scheme for simulating the two processes is described, and the pricing ability of the method is compared to the analytical Heston formula. It is shown that when enough simulations are used and the discrete time steps in each simulation are small, the simulated price is very close to the analytical price.
Finally, the simulation setup is used in a real-life example where a structured stock obligation is priced using the Euler scheme and a calibration setup similar to the one used in chapter 5. The two models and the calculations made throughout the thesis are implemented in the mathematical programming language Matlab, and the source code for the programs is shown in appendix C. URI: http://hdl.handle.net/10417/749 Files in this item: 1
martin_bech_rasmussen.pdf (910.6Kb) 
Udvidelser af de klassiske Value at Risk og Expected Shortfall modeller. Lerche Iversen, Mads; Larsen Laugesen, Jørgen (Frederiksberg, 2009)
Abstract: The purpose of this thesis is to investigate various methods for the estimation of market risk. Our investigation is based on a comparative study of two parametric, a semi-parametric and a non-parametric model. The aim of the models is to consistently estimate the actual market risk through periods of financial distress. We use two well-known risk measures, namely Value at Risk (VaR) and Expected Shortfall (ES), and provide results and key findings. Models are estimated and tested for ten selected stocks from the Danish OMXC20 index. At first we define a variance model. The past year has shown a significant increase in market volatility due to the financial turmoil and has stressed the need for a valid variance model. We test various models to assess which of these provides the best fit. We find that the GARCH(1,2) model leads to the best test results, and this model is used throughout the thesis as the model for variance. The parametric models consist of a class of generalized hyperbolic (GH) distributions. These include the normal distribution as well as symmetric and asymmetric distributions. Of several GH distribution families considered, the most successful is the skewed-t distribution (sometimes referred to as the skewed Student-t distribution). We compare VaR and ES estimates for the skewed-t and the normal distribution, as the latter is the most commonly used in practice. Furthermore, we introduce a non-parametric and a semi-parametric model which are not restricted by the same distributional assumptions as the fully parametric models. The first model is the well-known Historical Simulation (HS) approach, which is commonly used in financial institutions. The second model is Filtered Historical Simulation (FHS), which originates from the HS approach but incorporates a model for the variance. Both use historical simulation schemes.
VaR and ES estimates are provided for both models and it is shown that the HS model overestimates risk in periods of low volatility and vice versa. Furthermore, the backtesting procedure rejects the HS model whereas it accepts the FHS model. The general conclusion is that a market risk measure must essentially incorporate a model for the variance as an indicator for changes in market volatility. In addition, results indicate that the introduction of more complex distributions (GH) leads to a more consistent measure of risk. URI: http://hdl.handle.net/10417/753 Files in this item: 1

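The central conclusion, that a market risk measure must incorporate a variance model, can be illustrated with a minimal sketch. The thesis uses a GARCH(1,2) model and generalized hyperbolic distributions; the sketch below substitutes the simpler GARCH(1,1) recursion and a Gaussian return distribution, and the parameter values and simulated return series are assumptions.

```python
import numpy as np
from statistics import NormalDist

def garch_filter(returns, omega, alpha, beta):
    """GARCH(1,1) recursion: sigma2[t+1] = omega + alpha*r_t^2 + beta*sigma2[t]."""
    sigma2 = np.empty(len(returns) + 1)
    sigma2[0] = np.var(returns)              # initialize at the sample variance
    for t, r in enumerate(returns):
        sigma2[t + 1] = omega + alpha * r ** 2 + beta * sigma2[t]
    return sigma2                            # last entry: one-day-ahead forecast

def gaussian_var_es(sigma, level=0.99):
    """One-day VaR and ES (as positive loss fractions) for N(0, sigma^2) returns."""
    z = NormalDist().inv_cdf(level)
    var = sigma * z
    es = sigma * NormalDist().pdf(z) / (1.0 - level)  # closed form for the normal
    return var, es

rng = np.random.default_rng(0)
rets = 0.01 * rng.standard_normal(500)       # placeholder daily return series
sigma2 = garch_filter(rets, omega=1e-6, alpha=0.08, beta=0.90)
var99, es99 = gaussian_var_es(float(np.sqrt(sigma2[-1])))
```

Because the forecast variance reacts to recent squared returns, the resulting VaR and ES rise and fall with market volatility, which is exactly the property the plain HS approach lacks.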
Herunder asymmetrisk information. Ducic, Mirela (Frederiksberg, 2009)
Abstract: The purpose of this paper is to give an insight into the healthcare systems in Denmark and the US, which have attracted growing interest over the past few years due to their organizational differences and the immense criticism they have received. The healthcare systems in these two countries are characterized as complete opposites of each other in terms of organization and political goals. The Danish government aims to provide universal coverage, which can be characterized as free and equal healthcare access for the entire population. The US government, however, has based its political goals on freedom of choice, where approximately 40 million Americans (free riders) have no insurance and thereby minimal access to healthcare. The Danish healthcare system is rigid in the sense that supply and demand for healthcare are controlled by the government, with small private healthcare suppliers. The US healthcare system is characterized by the opposite, where the majority of the healthcare market is represented by private suppliers that can be categorized as not-for-profit (NFP) and for-profit (FP) organizations. The government's own healthcare programs are Medicaid and Medicare, and the part of the population covered by these programs comprises poor people and pensioners/disabled people. The insurance plans can be split into the HMO, POS and PPO plans, categorized as managed care plans, and the traditional FFS plan. Under managed care plans the provider and the insurance companies are organized jointly, with the goal of minimizing healthcare consumption and maximizing profits. The reimbursement model in managed care plans is based on a fixed rate per patient, where more volume means less profit, so suppliers are mindful of cost containment. Under the FFS plan the supplier is reimbursed for the total amount and therefore has little incentive for cost containment.
Even though the healthcare systems in these two countries are different, they battle the same problems in their contract design relating to asymmetric information in the form of moral hazard and adverse selection. These types of market failure lead to inefficiency, primarily through rising competition in the supplier market, even though competition can help improve welfare. The disadvantages come in the form of neglected patient care and suppliers repelling sick patients who increase costs, or seeking to treat only healthy patients, which is known as adverse selection. Moral hazard relates to overconsumption of healthcare by healthy people, which also deteriorates welfare in society due to the waste of resources. Neither of the healthcare systems is perfect, and adjustments can be made in both countries: in the US by introducing universal healthcare, which can eliminate the free riders, and in Denmark by introducing a patient fee to minimize overconsumption. URI: http://hdl.handle.net/10417/762 Files in this item: 1
mirela_ducic.pdf (1.049Mb) 
Kjeldgaard, Christian Philip; Holst, Aage Møller (Frederiksberg, 2009)
Abstract: The famous Black-Scholes model assumes that the underlying asset follows a geometric Brownian motion with constant drift and volatility. This assumption implies that returns are lognormally distributed and that the volatility is constant, although it is very easy to show that this is not the case for empirical return series. In this thesis we examine the effect of changing the Black-Scholes assumptions in such a way that we address these two implications. First, we implement a Normal Inverse Gaussian (NIG) option pricing model that addresses the marginal distribution of the log-returns. Second, we implement S. Heston's stochastic volatility model, which addresses the assumption of constant volatility and thereby also changes the marginal distribution of the log-returns. To ensure that we use liquid options in our calibration, we examine Volume and Open Interest and find a suitable slice of strikes. We calibrate the parameters of our three models daily to liquid S&P 500 plain vanilla options under a risk-neutral probability measure Q. We use several 'goodness of fit' criteria to ensure that our comparison of the three models does not depend on a single criterion. We see that the Heston model fits observed option prices reasonably well but that the NIG model displays systematic errors, which suggests that it is not a very good model. In order to decide whether one model outperforms the others, we use the calibrated parameters to perform three different types of delta hedges. The main result of the paper is that the Heston model outperforms the two other models, especially if we aggregate the P&L of the individual options into a portfolio. Although the Heston model is superior to the Black-Scholes model, it should be noted that the Black-Scholes model is much easier to implement, performs quite well and outperforms the NIG model. The NIG model turns out to be the worst model on all accounts.
URI: http://hdl.handle.net/10417/768 Files in this item: 1

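A delta-hedge comparison of the kind described above rests on the mechanics sketched below: sell an option at the model price, hold the model delta in the underlying, rebalance each step, and record the terminal P&L. This sketch hedges under plain Black-Scholes dynamics only (the thesis also hedges with NIG and Heston deltas), and all parameter values are illustrative assumptions.

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, tau):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * math.sqrt(tau))
    return S * norm_cdf(d1) - K * math.exp(-r * tau) * norm_cdf(d1 - sigma * math.sqrt(tau))

def bs_delta(S, K, r, sigma, tau):
    return norm_cdf((math.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * math.sqrt(tau)))

def delta_hedge_pnl(rng, S0=100.0, K=100.0, r=0.0, sigma=0.2, T=1.0, steps=252):
    """Sell one call at the model price, delta-hedge to expiry, return terminal P&L."""
    dt = T / steps
    S, delta = S0, bs_delta(S0, K, r, sigma, T)
    cash = bs_call(S0, K, r, sigma, T) - delta * S0   # premium received, hedge bought
    for i in range(1, steps + 1):
        S *= math.exp((r - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        cash *= math.exp(r * dt)
        tau = T - i * dt
        if tau > 0:                                    # rebalance to the new delta
            new_delta = bs_delta(S, K, r, sigma, tau)
            cash -= (new_delta - delta) * S
            delta = new_delta
    return cash + delta * S - max(S - K, 0.0)          # ~0 up to discretization error

pnl = delta_hedge_pnl(random.Random(7))
```

Averaged over many paths, the P&L of a correctly specified hedge is close to zero; the dispersion of the per-path P&L is what distinguishes one hedging model from another.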
En analyse af risikopræmien på det danske aktiemarked. Bech, Louise Wellner (Frederiksberg, 2009)
Abstract: This thesis presents an attempt to resolve the problems emerging from the assumptions concerning the risk premium on stock markets. The main concern is the calculation of the historical risk premium, which until now has been based on the assumption that the risk premium can be assumed constant if the sample period is long enough. Since the risk premium for a specific stock market differs in size depending on the period, the risk premium might not be constant after all. This thesis presents calculations of a time-variant risk premium and discusses the relevance for investors. The thesis first reviews the original risk premium theory from 1985, presenting the definition of the risk premium, the calculation methods and the main assumptions. Following this, the next part of the thesis presents the appropriate series to represent both the Danish stock market and a Danish risk-free asset, with a prior discussion of the theoretical relevance of the alternatives. The thesis proceeds to review cointegration theory. The variation of the VAR representation of multiple series is presented, with the main focus on the effects of cointegration between the series. Especially the split into the long-run parameters α, β and the short-run parameters Γi is relevant, because it shows the different information within the series. Afterwards, both the univariate and the multivariate cointegration theory are described, with the centre of attention on the consequences of the parameter choices indicated by test results. The empirical part of the thesis shows how the theoretical arguments are put to work on the Danish stock market. Extensive attention is paid to the specification of the unrestricted VAR model to ensure that the assumption of multivariate normality of the residuals is satisfied.
The final restricted model consists of one cointegration relation that describes the long-run dynamics, together with an AR(2) process for the risk-free asset and an AR(3) process for the stock market to describe the short-run dynamics. It is shown that the time-variant risk premium can be calculated from the estimated parameters by both a statistical method and a financial method, but both methods are in some areas inconsistent with the known financial forces and the statistical definitions concerning the series. The risk premium is also calculated under the assumption of constancy, so that the result can be used as a benchmark. Both time-variant calculation methods and the constancy method show that an investor can expect an annual risk premium of about 5% on a long investment. But the assumption of constancy during the period is shown to be unacceptable, since the spread of the annual time-variant risk premium is 28% to 38%. The timing of a shorter investment is therefore crucial for the size of the risk premium. URI: http://hdl.handle.net/10417/778 Files in this item: 2
Data og Beregninger.xls (5.609Mb); louise_wellner_bech.pdf (684.4Kb)
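The idea of a long-run (cointegrating) relation plus short-run dynamics can be made concrete with the simplest possible sketch: an Engle-Granger-style two-step regression on simulated data. The thesis itself works in the multivariate Johansen framework with α, β and Γi; this univariate OLS illustration, with invented data, is only a stand-in for intuition.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
trend = np.cumsum(rng.standard_normal(n))              # common random-walk component
x = trend + 0.3 * rng.standard_normal(n)
y = 5.0 + 2.0 * trend + 0.3 * rng.standard_normal(n)   # true long-run beta = 2

# Step 1: estimate the cointegrating relation y = a + b*x by OLS on the levels
X = np.column_stack([np.ones(n), x])
a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - a_hat - b_hat * x    # stationary if the series are cointegrated

# Step 2 (not shown): the lagged residual enters the short-run model as an
# error-correction term, alongside AR terms like those in the abstract
```

Because both series wander with the common trend, OLS on the levels recovers the long-run coefficient very precisely (super-consistency), while the residual stays bounded even though each series individually does not.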
Linde, Erik; Aasted Sørensen, Thomas (Frederiksberg, 2009)
Abstract: The aim of this thesis is to investigate the theoretical basis of three models used for pricing credit risk. We examine whether the models are easy to use in practice, and how well they replicate the spreads observed in the CDS market. Our focus is on the classic Merton model, which was the pioneer within this area, and the extension of this model derived by Black and Cox. These two models are characterized as structural models and are compared to a market model derived by Hull and White. For the purpose of testing the models on real data, we have chosen two companies from the auto industry. We use these companies' accounting figures as input to the structural models, and bond prices observed in the market as input to the market model. We find from our analysis that the estimation of the input data for the structural models is difficult and subject to great uncertainty. The application of the Hull-White model in practice is easier because of the low requirements regarding the input data. The theoretical spreads calculated from the models are not able to replicate the observed spreads in the CDS market. The Hull-White model tends to overestimate the spread, while there is no general conclusion for the structural models. Based on the whole analysis, Hull-White achieves the highest correlation with the observed spreads, but there are cases where the structural models have higher correlation performance than Hull-White. In general we must conclude that it is not possible to replicate the observed spreads with any of these models. Regarding the practical application, we recommend the Hull-White model because of the low requirements on the input data. URI: http://hdl.handle.net/10417/832 Files in this item: 1

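The structural (Merton) approach discussed above can be sketched in a few lines: equity is a European call on firm value, risky debt is firm value minus equity, and the credit spread is the promised debt yield minus the risk-free rate. The firm-value inputs below are invented for illustration, not the thesis's estimates for the two auto companies.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def merton_spread(V, F, r, sigma_V, T):
    """Credit spread (continuously compounded) on zero-coupon debt with face F,
    given firm value V and asset volatility sigma_V, in the Merton model."""
    d1 = (math.log(V / F) + (r + 0.5 * sigma_V ** 2) * T) / (sigma_V * math.sqrt(T))
    d2 = d1 - sigma_V * math.sqrt(T)
    equity = V * norm_cdf(d1) - F * math.exp(-r * T) * norm_cdf(d2)  # call on V
    debt = V - equity                      # risky debt is the residual claim
    y = math.log(F / debt) / T             # promised yield on the risky debt
    return y - r

s = merton_spread(V=100.0, F=80.0, r=0.03, sigma_V=0.25, T=5.0)
```

The practical difficulty the abstract highlights is visible here: V and sigma_V are not observable and must be backed out from equity data, which is where the estimation uncertainty enters.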
Anvendelse af ekstremværditeori og copulaer i estimation af Value-at-Risk. Hal, Jonas (Frederiksberg, 2004)
Abstract: Value at Risk (VaR) is one of the most used risk measures today. The validity of a VaR measure is closely connected with how well the statistical model underlying VaR captures the structure in the empirical data. In practical applications, it is very common to base VaR on the assumption that all the risk factor changes in a given portfolio are Gaussian and that their joint distribution function is Gaussian too. The Gaussian assumption makes the calculation easy, and it is often possible to find an analytical solution which makes the VaR calculation straightforward. But empirical findings suggest that this is not a good approximation. The reason for this is that financial data are characterized as being leptokurtic, skewed and heteroskedastic. One often observes more extreme observations in financial data than can be explained by the Gaussian distribution, and VaR estimates are thereby underestimated; that is, the capital reserve based on VaR will be too small. Another point is that the Gaussian distribution is strictly symmetrical, making it unsuitable for skewed empirical distributions. Chapter one is an introduction to VaR and how it is calculated. In order to calculate VaR you need a joint probability function for the risk factors of your portfolio as well as a specification of the loss operator. A loss operator is a function that maps changes in risk factors to losses of the portfolio. While the selection and estimation of the statistical model underlying VaR is the focus of chapters 3 and 4, chapter 2 looks at how to transform the joint probability distribution for the risk factors into a probability function for portfolio losses. I show that in certain special cases it is possible to get an analytical expression for the loss distribution, but most often, given the special characteristics of financial data, one has to use Monte Carlo simulation in order to get a VaR estimate. This chapter also discusses VaR as a relevant risk measure.
The statistical model underlying VaR will be based on a copula approach. A copula is a function which couples given marginal distributions to form a joint distribution function. Chapter 4 illustrates the dependence structure of three different copula functions, namely the Gaussian, t and Gumbel copulas. Besides introducing a whole new array of dependence structures, the interesting aspect of a copula is that it makes it possible to separate the marginal distributions from the dependence structure, thus splitting the model specification in two. From a mathematical viewpoint the copula approach introduces nothing new. From a statistical viewpoint, however, the selection and estimation process is eased, since we can pick the marginal distribution functions independently from each other and from the dependence structure. I show how the dependence structure can be estimated even without knowledge of the parametric form of the marginal distribution functions. Of course we still have to select a copula function, and in practical applications herein lies the challenge. I also show, via a small Monte Carlo simulation study, that one could choose the copula with the lowest AIC from a list of copulas. The use of copula theory allows us to look at the marginal distributions individually. With respect to VaR calculation, we are mainly interested in the tails of the marginal distributions. But since most observations lie in the centre (around the mean), the estimation is of course most accurate there. This inherent problem can be overcome by looking at extreme value theory. More specifically, a theory known as Peaks over Threshold (POT) allows us to estimate the tail distribution alone. In chapter 3 I show that POT is based on classical extreme value theory, and under some weak assumptions the theory tells us that the tail can be approximated by a distribution known as the Generalized Pareto Distribution (GPD). The approximation gets better the further out in the tail one is looking.
Of course, the further out in the tail, the scarcer the observations. One must address this dilemma when choosing the starting point for the tail distribution, and this issue is also dealt with in chapter 3. The main contribution of chapter 3 is that it is possible, under some very general assumptions, to use the GPD as an approximation for the tail distribution. The copula/POT approach is not just another assumption on the parametric form of the underlying model but a new, more flexible framework which can incorporate the special characteristics of financial assets. To illustrate this approach, I use empirical data to estimate a daily VaR based on a copula approach. The marginal distribution is modelled by a mixed distribution where the tail is modelled by a GPD, and for the rest I used a Gaussian distribution. I compared VaR based on this model with VaR based on the traditional variance-covariance approach. The findings suggest that the copula/POT approach gives a more accurate VaR, even though the variance-covariance method wasn't that far off in the backtesting period. Other findings suggest that the specification of the copula is important, especially for the accuracy of the model. URI: http://hdl.handle.net/10417/956 Files in this item: 1
jonas_hal.pdf (1.009Mb) 
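The POT step described above can be sketched as follows: keep only the exceedances over a high threshold u, fit a GPD to them, and read a far-out quantile off the fitted tail. The method-of-moments fit and the simulated Student-t losses below are simplifying assumptions; a real application, like the thesis, would typically use maximum likelihood on empirical data.

```python
import numpy as np

def pot_var(losses, u, q):
    """Tail quantile VaR_q from a GPD fitted to exceedances of threshold u."""
    exc = losses[losses > u] - u
    m, v = exc.mean(), exc.var()
    xi = 0.5 * (1.0 - m * m / v)            # method-of-moments GPD shape
    beta = 0.5 * m * (m * m / v + 1.0)      # method-of-moments GPD scale
    n_u, n = len(exc), len(losses)
    # Standard GPD tail-quantile formula:
    # VaR_q = u + (beta/xi) * (((n/n_u) * (1 - q))**(-xi) - 1)
    return u + beta / xi * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

rng = np.random.default_rng(3)
losses = rng.standard_t(df=4, size=50_000)  # heavy-tailed placeholder losses
u = np.quantile(losses, 0.95)               # threshold at the 95% quantile
var99 = pot_var(losses, u, 0.99)
```

The trade-off from the abstract is explicit in the code: a higher u makes the GPD approximation better but leaves fewer exceedances to estimate xi and beta from.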
Brandt Andersen, Jonas; Grønbeck Andersen, Adam (Frederiksberg, 2009)
Abstract: In 2007 new Danish legislation was ratified to incorporate CRD-compliant covered bonds in Denmark. From this ratification, different kinds of new risks have arisen compared to the previous well renowned Danish mortgage bonds. Furthermore, the CRD-compliant bonds have been blamed for having a negative effect on the financial stability in Denmark. This thesis identifies and analyzes the new risks by investigating the background and general structure of the legislation, and especially the risks following the event of an issuer default. Furthermore, the international rating agencies' models for rating covered bonds are evaluated and compared. Finally, the influence of Danish covered bonds on the financial stability in Denmark during the recent economic crisis is analyzed. The CRD-compliant bonds have brought new general risks in the form of maturity mismatch and currency and interest-rate risk, which have a negative effect on the transparency of the product. Furthermore, some risk factors remain unclear in the event of an issuer default, but we conclude that the implementation of CRD-compliant covered bonds was necessary in order to maintain the competitiveness of Danish financial institutions. Despite the risk factors, the CRD-compliant bonds did help Danish banks raise funds through repo activities in a frozen liquidity market during the economic crisis. We find those activities financially stabilizing. We conclude that the core of the financial instability is to be found in the huge growth in priority loans in Denmark and not in the CRD-compliant covered bonds. URI: http://hdl.handle.net/10417/1030 Files in this item: 1

Anvendelse af metoden samt introduktion af de variansreducerende metoder og Heston-modellen. Aamand Nielsen, Thomas (Frederiksberg, 2010)
Abstract: There are three main subjects in this master thesis, all of which involve the simulation techniques of the Monte Carlo method. The first subject is to introduce the idea of the most basic variation of the Monte Carlo method. This introduction later makes it possible to build more sophisticated Monte Carlo models, known as Antithetic Variates and Control Variates, which form the second main subject. These models are known as variance reduction methods, because the main idea is to make the estimate from simulation more precise by minimizing the standard deviation of the estimated price. The last subject is to take a closer look at how the Monte Carlo techniques can be used to price options with stochastic volatility. This is done by introducing the Heston model, which simulates both the variance and the asset at the same time. The first model introduced in this thesis is the most basic Monte Carlo model, known as the standard model, and its ability to price options is shown in examples where it prices European call options. This kind of option is chosen because the estimated price can be evaluated against the analytical solution from the Black-Scholes formula, so it is possible to test the speed at which the estimated price converges to the analytical solution. The drawback of the standard model is that it takes a lot of simulations to make a precise estimate of the price, and this is the main reason why the variance reduction techniques are introduced. Antithetic Variates is one of the simplest techniques for minimizing the standard deviation of the estimate, and the basic idea behind the model is to simulate paths of the asset by using both the random number Z and -Z. The model is simple and takes only a few minutes to add to the original standard model. A more sophisticated model is therefore introduced, called Control Variates.
The idea behind this model is to simulate something whose value is known in the first place, using the same random numbers as are used to simulate the price of the option. By doing this, the model makes it possible to adjust the estimated price of the option according to the estimation error of the known value. Both these models are tested to see which one is best at pricing options with different characteristics. A model outside the Black-Scholes world is the Heston model, which adds realism to the simulation by introducing stochastic volatility. The variance reduction techniques are also tested together with the Heston model, but the main subject is to investigate some of the problems that arise when simulating the Heston model with the Euler scheme. This analysis is based on an article by Leif Andersen, and the solution is to test a more robust scheme that handles stressed scenarios in a better way. URI: http://hdl.handle.net/10417/1038 Files in this item: 1
thomas_aamand_nielsen.pdf (4.139Mb) 
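The Antithetic Variates idea described in the abstract (price each path with the random number Z and its mirror -Z, then average the pair) fits in a few lines. For a European call under Black-Scholes, simulating only the terminal value suffices; the parameter values below are illustrative assumptions, and the benchmark is the Black-Scholes formula.

```python
import math
import numpy as np

def mc_call_antithetic(S0, K, r, sigma, T, n, seed=0):
    """Antithetic-variates Monte Carlo price of a European call, with a standard
    error computed from the n pair-averaged discounted payoffs."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    drift = (r - 0.5 * sigma ** 2) * T
    s_up = S0 * np.exp(drift + sigma * math.sqrt(T) * z)
    s_dn = S0 * np.exp(drift - sigma * math.sqrt(T) * z)   # antithetic path (-Z)
    pay = 0.5 * (np.maximum(s_up - K, 0.0) + np.maximum(s_dn - K, 0.0))
    disc = math.exp(-r * T) * pay
    return disc.mean(), disc.std(ddof=1) / math.sqrt(n)

price, se = mc_call_antithetic(S0=100.0, K=100.0, r=0.02, sigma=0.2, T=1.0,
                               n=200_000)
```

Because the call payoff is monotone in the terminal value, the Z and -Z payoffs are negatively correlated, so the pair average has a smaller variance than two independent draws would give.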
Rosenkvist, Lars (Frederiksberg, 2010)
Abstract: When reading articles on EBIT models, a common assumption is that the model's drift depends on the coupon paid out to the debt holders. Another assumption in the same articles is that the value of the firm does not depend on its capital structure, understood as the split between debt and equity. Since drift and firm value are very closely connected, these two assumptions seem to contradict each other. In this thesis I remove the first of the two assumptions, since it is the one that does not appear meaningful. The consequence is that the model's drift is no longer endogenously given but must instead be specified exogenously. Since the drift must now be specified exogenously, it is interesting to consider empirical data and attempt to estimate drift and volatility from accounting data. I have chosen to consider three large Danish companies: Vestas, which is included in the C20 index; B&O, an old and well-regarded company; and Parken Sport og Entertainment (PSE), which has featured heavily in the media since its stock exchange listing and has recently been strongly criticized for being far too highly geared. I will see whether I can either reject or confirm this. URI: http://hdl.handle.net/10417/1587 Files in this item: 2
lars_rosenkvist_speciale.xlsm (5.436Mb)lars_rosenkvist.pdf (753.4Kb) 
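Estimating drift and volatility exogenously from observed data, as the abstract proposes, can be sketched with simple moment estimators on log returns. The simulated price path below stands in for real data; in the thesis the estimates come from accounting and market figures for Vestas, B&O and PSE, and those inputs are not reproduced here.

```python
import numpy as np

def annualized_drift_vol(prices, periods_per_year=252):
    """Moment estimates of GBM drift mu and volatility sigma from a price series."""
    logret = np.diff(np.log(prices))
    vol = logret.std(ddof=1) * np.sqrt(periods_per_year)
    # GBM drift: mean log return scaled up, plus the 0.5*sigma^2 Ito correction
    mu = logret.mean() * periods_per_year + 0.5 * vol ** 2
    return mu, vol

# Simulated 20-year daily GBM path as placeholder data (true mu=0.08, sigma=0.30)
rng = np.random.default_rng(1)
dt = 1.0 / 252
steps = rng.normal((0.08 - 0.5 * 0.30 ** 2) * dt, 0.30 * np.sqrt(dt), 252 * 20)
prices = 100.0 * np.exp(np.concatenate([[0.0], np.cumsum(steps)]))
mu_hat, vol_hat = annualized_drift_vol(prices)
```

Even with 20 years of daily data, the volatility estimate is far more precise than the drift estimate, which is one reason specifying the drift exogenously is a delicate step.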
Interessekonflikter, ratings shopping og rating inflation. Wex, Katrine Cederstrøm (Frederiksberg, 2010)
Abstract: In the recent financial meltdown there have been countless discussions on where to place the blame, and many agree that the credit rating industry, and in particular the credit rating agencies, played a significant role. Credit rating agencies are accused of deliberately inflating ratings and generally being too lax regarding the ratings of some structured products. One reason why attention turns to the rating agencies, and why people argue that ratings are inflated, is the industry's extensive problem with conflicts of interest due to the current business model, in which the issuers who are to be rated also pay the fees for the rating. This, combined with issuers' option to shop for the most favorable rating, also called ratings shopping, induces a moral hazard problem by giving the agencies an incentive to inflate ratings rather than report truthfully. The consequence would then be a general rating inflation. In response to the numerous accusations, the rating agencies argue that such behavior would be very damaging for them, since their reputation is at stake. They emphasize this argument by stating that reputational capital is of key importance in the industry. It has therefore been discussed whether reputational concerns are powerful enough to discipline the rating agencies. Furthermore, some economists argue that although the business model has been in use since the 1970s, the problem with ratings shopping did not arise until recent years, when the market for complex structured financial products grew rapidly. This could indicate that the complexity of the new products is also a significant factor in the financial turmoil. Another point is the fact that competition is somewhat limited in the rating market due to the SEC's designation of NRSROs (Nationally Recognized Statistical Rating Organizations). The NRSRO label recognizes the ratings of chosen rating agencies as being of significant value in investment decisions.
This created a barrier to entry in the credit rating market, and many have therefore argued that inducing competition could reduce the problems in the industry. The objective of this thesis is to analyze rating inflation and thus the relevance of the above-mentioned points: whether reputational concerns are enough to discipline the agencies; whether or not the complexity of the product is of significance in discussions of rating inflation; and finally whether encouraging competition will in fact reduce conflicts of interest or instead increase ratings shopping. Through the analysis, we find that when large fractions of the rating agencies' revenues come from rating complex products, reputational concerns are not powerful enough to discipline the rating agencies. Introducing competition, given that a certain share of the investors are naïve, will only increase issuers' options for ratings shopping. Therefore, encouraging competition will probably not be a preferable solution to the rating industry's problems. Furthermore, we find that when the rated products are sufficiently complex, the issuer's incentive to shop for ratings is strong. Therefore an increase in complexity could induce a systematic bias in disclosed ratings, even when the agencies report truthfully. Combining the analysis and the results, this leads to one clear conclusion: the credit rating industry needs regulation. We therefore discuss different possible policy recommendations, finding that a platform-pays model might be a sustainable solution. URI: http://hdl.handle.net/10417/1632 Files in this item: 1
katrine_cederstroem_wex.pdf (992.2Kb) 
Thostrup Andersen, Troels; Frederiksen, Thomas Chr. (Frederiksberg, 2010)
Abstract: The thesis finds evidence supporting that the Basel II benchmark model provides poor estimates of the market risk VaR, both over a 10-day and over a one-day horizon. This is particularly due to the homoskedastic estimates provided by this model, but also because the model assumes returns to be Gaussian and independently and identically distributed. Evidence further suggests that it is naïve to assume that one model sufficiently describes the risk in a market which at times seems fairly stable, while at others becomes quite volatile. Analysis of the historical returns shows that, when divided into a normal and an extreme market, the empirical distribution of both return series has higher kurtosis than assumed by the Gaussian distribution. However, only the extreme market seems negatively skewed. Because of this asymmetry, each side of the distribution of the extreme market should be estimated individually. For this reason, and because of other differences in market characteristics, two individual models are proposed. For the normal market a GARCH(1,1) model is proposed. The advantage of such a model is its ability to incorporate historical error terms in order to deliver heteroskedastic volatility estimates. Because of the prevailing autocorrelation, such a model seems particularly efficient. In order to give accurate VaR estimates in an extreme market setting, a conditional peaks-over-threshold model building on extreme value theory is proposed. This model combines standard POT modelling with GARCH modelling in order to deliver heteroskedastic volatility estimates. The advantage of applying such a model is its ability to give accurate estimates of the individual tails, and thereby conditional volatility estimates, when observations are unevenly distributed. URI: http://hdl.handle.net/10417/1655 Files in this item: 1
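A minimal sketch of the GARCH(1,1) idea referred to above: the variance recursion reacts to past squared returns, so the resulting one-day Gaussian VaR is heteroskedastic, unlike the Basel II benchmark. The parameter values and the return series below are illustrative, not the thesis's estimates.

```python
import math

def garch_vol_path(returns, omega, alpha, beta):
    """One-step-ahead GARCH(1,1) volatility forecasts:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    vols = [math.sqrt(sigma2)]
    for r in returns[:-1]:
        sigma2 = omega + alpha * r * r + beta * sigma2
        vols.append(math.sqrt(sigma2))
    return vols

def gaussian_var(sigma, level=0.99):
    """One-day Gaussian VaR as a positive loss number."""
    z = {0.95: 1.645, 0.99: 2.326}[level]
    return z * sigma

returns = [0.001, -0.03, 0.02, -0.01, 0.005]
vols = garch_vol_path(returns, omega=1e-6, alpha=0.1, beta=0.85)
var99 = [gaussian_var(s) for s in vols]
```

After the large negative return the forecast volatility, and hence the VaR, jumps up, which is exactly the responsiveness the homoskedastic benchmark model lacks.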

Søndergaard, Mads Berendt (Frederiksberg, 2010)
Abstract: The main objective of this thesis is to analyse the use of inflation-linked products such as index-linked bonds and inflation derivatives in asset and risk management. To get a thorough understanding of what inflation means, the relationship between the nominal interest rate, the real interest rate and inflation is established using the Fisher relation, which says that the nominal interest rate is the sum of the real interest rate and the inflation rate. This leads to the definition of the break-even inflation: the inflation rate that makes the investor indifferent between investing in an inflation-indexed bond and a nominal bond with the same characteristics. To get a better understanding of the reasons to issue or invest in index-linked bonds, these motives have been analysed. One of the primary reasons to issue inflation-linked bonds is having revenues that depend on price movements; another is wanting to reduce the risk premium associated with the bonds. One of the primary reasons to invest in inflation-linked bonds is having liabilities that depend on inflation. Furthermore, these bonds have nice properties regarding diversification: low correlation with other assets and low volatility. These properties will be analysed in the last part of the thesis. With the basic knowledge of inflation and the market in place, the different types of inflation-linked securities will be introduced. There are several different types of inflation-linked bonds, each having a different cash flow profile, which will all be analysed. Along with the market for inflation-linked bonds, a market for inflation derivatives has been growing, providing a great deal of flexibility, especially for hedging purposes. This thesis focuses on two types of inflation swaps: the Zero-Coupon Swap and the Year-on-Year Swap. There is still a lack of pricing models designed to price inflation-linked securities, but one approach that is used is the Heath-Jarrow-Morton framework. 
The two main advantages of this approach are shown: fast evaluation of the model and few parameters to estimate, which is due to the fact that the dynamics are described as shifts away from the forward curve. With a pricing model defined, the focus shifts to the main purpose: inflation-linked securities in risk and asset management. Normally, duration is a well-known risk measure for bonds, but it needs to be decomposed into both real rate duration and inflation duration when used for index-linked bonds. When hedging liabilities with inflation-linked securities, one needs to distinguish between bonds and swaps, with inflation swaps giving a much better match of the liabilities. This is the last focus of the thesis, along with a short discussion of tactical asset allocation including these securities. URI: http://hdl.handle.net/10417/1651 Files in this item: 1
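The break-even inflation and the zero-coupon inflation swap mentioned above can be sketched as follows. The numbers are illustrative; the exact, multiplicative form of the Fisher relation is used rather than the additive approximation.

```python
def breakeven_inflation(nominal_rate, real_rate):
    """Exact Fisher relation: (1 + n) = (1 + r)(1 + i), solved for the
    break-even inflation i that equates nominal and real investments."""
    return (1.0 + nominal_rate) / (1.0 + real_rate) - 1.0

def zc_inflation_swap_payoff(notional, fixed_rate, index_start, index_end, years):
    """Zero-coupon inflation swap: a single exchange at maturity where the
    inflation receiver gets realised index growth and pays the compounded
    fixed rate."""
    floating = notional * (index_end / index_start - 1.0)
    fixed = notional * ((1.0 + fixed_rate) ** years - 1.0)
    return floating - fixed

# 4% nominal vs 1.5% real gives a break-even inflation just under 2.5%.
bei = breakeven_inflation(0.04, 0.015)

# Index grows 100 -> 110 over 5 years (~1.9% p.a.), below the 2% fixed leg,
# so the inflation receiver loses on this swap.
payoff = zc_inflation_swap_payoff(1_000_000, 0.02, 100.0, 110.0, 5)
```

The sign of the payoff shows directly how such a swap hedges inflation-dependent liabilities: the receiver gains exactly when realised inflation exceeds the fixed rate.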
mads_berendt_soendergaard.pdf (928.7Kb) 
Khan, Sonia (Frederiksberg, 2010)
Abstract: The aim of this thesis is to price and construct the optimal debt contract between two parties, the owner of the company and its creditor. Within the subject of credit risk this is a topic that has received a lot of attention, since it is important for almost every firm, and especially for financial firms, whose whole existence relies upon giving loans to other companies. By introducing the traditional model of Merton, the aspect of pricing is touched upon. But since Merton's model has its limitations, for instance its assumption of a perfect loan market, models by Leland are introduced to move slightly closer to the reality of how firms operate. Leland's models introduce default costs, tax benefits and a capital structure which can change after its initial settlement. Here debt is priced after the optimal capital structure has been settled, and endogenous default boundaries are defined. Last, an optimal risk strategy is defined, as it is shown that the timing of the strategy affects the payoffs both parties receive. To move yet another step closer to reality, renegotiation is introduced. The element of renegotiation is important, since it is rarely seen that the creditor lets the firm default without having tried to give it a helping hand, for instance through letting it pay no or smaller monthly installments for a period of time. The creditor will only agree to renegotiate if his gains from accepting the deal cover the losses he will incur for a period of time. Control allocation is then implemented in the initial debt contract, as the owner and the creditor may in the future have their disagreements about what decisions to make regarding the firm's operations. The aim is to construct a (Pareto) optimal contract which protects both parties from the other party's opportunistic behavior. 
The aim of the paper is not to define a single optimal contract, as the definition of optimal highly depends on the prevailing parameters and circumstances. However, optimal debt contracts have been derived under certain specified circumstances, and if a single parameter changes, so will the optimal debt contract. URI: http://hdl.handle.net/10417/1729 Files in this item: 1
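A minimal sketch of the Merton model referred to above: equity is a European call on the firm's assets with strike equal to the face value of debt, and debt is the residual claim. The inputs below are illustrative.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def merton_debt_value(V, F, r, sigma, T):
    """Merton (1974): equity = Black-Scholes call on asset value V with
    strike F (face value of debt) at maturity T; debt = V - equity."""
    d1 = (math.log(V / F) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    equity = V * norm_cdf(d1) - F * math.exp(-r * T) * norm_cdf(d2)
    return V - equity

debt = merton_debt_value(V=100.0, F=70.0, r=0.03, sigma=0.25, T=5.0)
riskless = 70.0 * math.exp(-0.03 * 5.0)            # value of default-free debt
credit_spread = -math.log(debt / 70.0) / 5.0 - 0.03  # implied yield spread
```

Risky debt is worth less than the equivalent riskless bond, and the gap translates into a positive credit spread, which is the quantity the more refined Leland-style models then endogenise.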
sonia_khan.pdf (1.100Mb) 
Schipper Jespersen, Nicolai (Frederiksberg, 2010)
Abstract: In the first part of the paper we present some classical actuarial models (the collective and individual risk model) and the probability theory behind them. A discussion of the pros and cons of each approach leads to an alternative approach where the losses on each policy are modelled by an individual compound Poisson process. We estimate this model using generalized linear models (GLM). In the second part we introduce a framework for incorporating empirical claim severity inflation in the severity models. This gives a method for automatic updating of the insurance tariff. The framework is a generalization of a commonly used method of discounting, modelling and inflating (which we denote the DMI framework). A possible modification to the DMI framework is proposed, which makes it applicable to frequency models too. Some methods to compare risk models, especially with respect to their performance over time, are suggested. Finally the methods are applied to a real-life motor insurance dataset, and we find that the models under the DMI framework are superior to traditional models without inflation adjustments. The reader is expected to have a background in probability theory and experience with GLM modelling. URI: http://hdl.handle.net/10417/1727 Files in this item: 1
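A small sketch of the per-policy compound Poisson idea and the inflation step of the DMI framework. The parameters are hypothetical, severities are taken exponential purely for illustration, and the GLM estimation step is omitted.

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's multiplication method for a Poisson(lam) sample."""
    limit = math.exp(-lam)
    n, p = 0, rng.random()
    while p > limit:
        n += 1
        p *= rng.random()
    return n

def policy_loss(lam, sev_mean, rng):
    """Compound Poisson loss on one policy: N ~ Poisson(lam) claims,
    each with an exponential severity of mean sev_mean."""
    n = poisson_draw(lam, rng)
    return sum(rng.expovariate(1.0 / sev_mean) for _ in range(n))

def inflate(severity, years, claim_inflation):
    """The DMI idea in one line: severities discounted to a base date are
    modelled, then inflated forward to the tariff period."""
    return severity * (1.0 + claim_inflation) ** years

rng = random.Random(42)
# Expected loss per policy = lam * sev_mean = 0.1 * 5000 = 500.
losses = [policy_loss(lam=0.1, sev_mean=5000.0, rng=rng) for _ in range(10_000)]
avg = sum(losses) / len(losses)
```

Averaging over many simulated policies recovers the frequency-times-severity expectation, which is the quantity the GLM tariff models decompose by risk characteristics.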
nicolai_schipper_jespersen.pdf (3.292Mb) 
Ingerslev, Theis (Frederiksberg, 2010)
Abstract: The financial crisis that arose in 2007 clearly emphasized the importance of the money markets and how essential they are for the rest of the financial markets to operate orderly. The price of obtaining liquidity in the unsecured money markets developed explosively compared to the price of liquidity in the secured markets. So far, analyses have been limited to plotting price spreads and qualitative attempts to measure banks' risk aversion. This thesis aims to take the analysis an academic step further. The study presents a liquidity model which enables banks to lend unsecured liquidity to one another. The model depends on conditions regarding consumers' utility, the stochastic individual liquidity needs of the banks, the distribution of asset returns, and the banks' gain on their customers. The model is not directly related to previous studies, but it benefits greatly from the results found in earlier articles. The liquidity model creates a market for unsecured liquidity lending without asymmetric information and financial contagion. A pattern of liquidity shocks corresponding to those used in most recent studies is incorporated; the pattern combines both an aggregated and an individual (idiosyncratic) shock. It is examined and quantified how banks trade liquidity in the model. It is found that their trade can be limited by lack of surplus liquidity or by lack of profit on trading liquidity. The market is under full competition, which determines the price of unsecured liquidity and drives the expected profit of the most risk-prone liquidity lender to zero. It is shown how the banks' allocation of liquidity should be regulated in the liquidity model to achieve the highest average utility for the consumers. Overall, the thesis succeeds in establishing mathematical expressions specifying the price of liquidity given the model's dependencies. The European money markets are examined. 
The framework is described, from obtaining liquidity in the European Central Bank to liquidity trading and transformation in the secondary market between private banks, and the main money market indices are presented. Furthermore, the assumptions of the liquidity model are related to the real European money market. The assumptions are found to be quite realistic, but the exact precision of the regulation is questioned. The liquidity model is applied to the European money market, and it is estimated how well it reflects the recent developments within the market. The return on the assets seems to be of great importance. The first rise in the price level of unsecured liquidity in 2007 can be reproduced by a falling average return on the assets, though this fails to justify the most drastic price rises. The most elevated price levels are reproduced in the liquidity model by a shift in the volatility of the asset returns or in the consumers' discounting of future consumption. If the explosive price development from 2008 is attributed to the consumers' discounting, it clearly requires poor regulation of the banks' liquidity allocation. The rest of the dependencies in the model are found too weak to fully explain the development experienced within the money markets. URI: http://hdl.handle.net/10417/1726 Files in this item: 1
theis_ingerslev.pdf (1.803Mb) 
Helverskov, Mille Lykke; Lund, Thomas (Frederiksberg, 2010)
Abstract: In this master thesis a pricing model for Danish fixed-rate callable mortgage bonds is developed. The Danish market for mortgage loans is one of the largest and most liquid in the world. High transparency and legal restrictions are the main factors that secure the investor, and Danish mortgage bonds are regarded as safe investments. A callable bond has an embedded option which gives the borrower the right to repay his loan at par at specified dates prior to maturity of the loan. The borrowers do not always act entirely rationally, and the gain from exercising the option is not the only factor that decides whether a borrower exercises at a given time. This complicates the pricing procedure, and therefore a prepayment model which takes other factors into account has to be developed. First, a term structure model is estimated with market data as input; both the current zero coupon rates and the implied volatility of swaptions are matched. This model is assumed to describe the possible future development of the short rate. The term structure is used in the pricing model for discounting cash flows and calculating refinancing incentives. Then a prepayment model is estimated on the basis of historical prepayments and borrower compositions. This model describes the conditional prepayment rate as a function of the gain from exercising, the relative time to maturity and the pool factor. The pricing model is developed on the basis of the term structure model and the prepayment model. It is assumed that the value of a callable bond can be divided in two: the value if every borrower prepays their loan and the value if no borrower prepays. Then the expected future cash flows can be found in every state and discounted back to the value date. Generally the pricing model does a good job, and the model prices relatively close to the market price. URI: http://hdl.handle.net/10417/1722 Files in this item: 1
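The role of the conditional prepayment rate in the expected cash flows can be sketched as follows. A flat discount rate and a constant CPR stand in for the thesis's estimated term structure and prepayment function; all numbers are illustrative.

```python
def callable_bond_price(face, coupon, rate, periods, cpr):
    """Discount the expected cash flows of a callable bullet bond when, in
    each period, a fraction `cpr` of the surviving pool prepays at par;
    the remaining balance is repaid at maturity."""
    balance, price = face, 0.0
    for t in range(1, periods + 1):
        interest = coupon * balance
        principal = balance if t == periods else cpr * balance
        price += (interest + principal) / (1.0 + rate) ** t
        balance -= principal
    return price

# A premium bond (coupon above the discount rate): higher prepayment
# shortens the expected life and pulls the price back towards par.
no_prepay = callable_bond_price(100.0, 0.06, 0.03, 10, 0.0)
fast_prepay = callable_bond_price(100.0, 0.06, 0.03, 10, 0.3)
```

This captures, in miniature, why the prepayment option caps the upside of a Danish callable mortgage bond: the more the pool prepays, the closer the price stays to par.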

Clement Andersen, Casper; Musajev, Michael Murtaz (Frederiksberg, 2010)
Abstract: The main objective of this thesis is to assess the performance of modern portfolio choice models in comparison with 1/N on the Danish stock market in the period from 1980 to 2010. We search for a strategy with the ability to consistently outperform the benchmark. By applying Fama-French theory on a large sample of Danish equities, we form six Size and Book-to-Market portfolios, as well as six Size and Momentum portfolios. These datasets are analyzed separately and jointly, and we find strong evidence in support of the sample-based asset allocation models. This is explained by the modest N and the fact that the constructed Fama-French portfolios by nature produce both excellent and devastating developments. We develop the means to assess optimal asset allocation with time dependency in N, and address extreme cases of more than 100 shares. In general, large values of N imply a well-diversified 1/N strategy, as our results also emphasize: none of the sample-based models showed even a single instance of outperforming the benchmark in either of the two datasets considered with a substantially large N. By simulating various datasets, calibrated to reflect both stocks and portfolios, performance is evaluated for extreme values of M and T. We find that the sample-based strategies have a better chance of outperforming 1/N on stocks than on portfolios. Inspired by Brandt, Santa-Clara and Valkanov (2007), we suggest a fairly simple approach to allocating wealth based on the firm characteristic Price-Earnings ratio, and assess whether such a model brings anything new into the field of asset allocation. In alignment with DeMiguel et al. (2007), our study suggests a tendency of the cross-sectional firm characteristic model to enhance performance, although it is not consistently statistically superior to the sample-based asset allocation models. Putting things into perspective, we compare our results on the Danish stock market with a similar study performed on the U.S. 
market (DeMiguel et al. 2007). Overall, our findings on the Danish market are in alignment with the U.S. results, in the sense that none of the considered strategies were proven to be consistently superior to 1/N. Despite our findings, we present empirical evidence of equity funds that have, at least to some extent, implemented strategies similar to the ones examined in this thesis. The Swiss asset management company Unigestion follows a long-only Minimum-Variance strategy, which presents a convincing performance compared to the DJ Europe Stoxx 600, i.e. considerably lower yearly volatility as well as higher returns. As for asset allocation with respect to Price-Earnings ratios, we examine the performance of the Indian Tata Equity P/E fund since its inception, and compare it to the BSE Sensex 30 index. Developments in normalized total return indices as well as yearly volatilities indicate that the P/E fund outperforms the Indian market. URI: http://hdl.handle.net/10417/1731 Files in this item: 1
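The 1/N-versus-optimization comparison can be illustrated in the two-asset case, where the global minimum-variance weights have a closed form. The covariance numbers below are illustrative, not the Danish data.

```python
def min_variance_weights(var1, var2, cov):
    """Closed-form global minimum-variance weights for two assets:
    w1 = (var2 - cov) / (var1 + var2 - 2*cov), w2 = 1 - w1."""
    w1 = (var2 - cov) / (var1 + var2 - 2.0 * cov)
    return w1, 1.0 - w1

def portfolio_variance(w1, w2, var1, var2, cov):
    """Variance of a two-asset portfolio with weights (w1, w2)."""
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2.0 * w1 * w2 * cov

# Illustrative annualised variances (20% and 30% vol) and a 20% correlation.
var1, var2, cov = 0.04, 0.09, 0.012
w1, w2 = min_variance_weights(var1, var2, cov)
mv = portfolio_variance(w1, w2, var1, var2, cov)
naive = portfolio_variance(0.5, 0.5, var1, var2, cov)
```

In-sample the optimized portfolio always has (weakly) lower variance than 1/N; the thesis's point is that out of sample, with estimated rather than known moments, this advantage often evaporates.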

Knudsen, Kenneth; Tahir, Mamona (Frederiksberg, 2011)
Abstract: The Vehicle Routing Problem (VRP) is one of the most important and challenging problems in the field of Operations Research. It was first presented by Dantzig and Ramser in 1959 as the problem of designing a least-cost set of routes for vehicles such that all customers are visited exactly once and vehicle capacities are adhered to. The importance of the VRP is due to the fact that it yields significant economic benefits, which is why researchers have given more and more attention to the various extensions of the VRP that arise from real-life applications. Typically such extensions consider additional and more complicated constraints, such as time windows. In this thesis we have chosen to focus on approximate solution techniques, which provide high-quality solutions within an acceptable computational time. Such methods are especially necessary in large VRP instances with hundreds or even thousands of customers and constraints. Classical heuristics form a category of methods which converge to a solution relatively quickly, but they suffer from the fact that the search often ends in local minima. This is why the field of metaheuristics currently dominates the heuristics arena: metaheuristics accept inferior solutions as access points to more promising regions of the solution space. At the time of this writing, the literature on metaheuristics for most of the VRP extensions is rather limited. We have nevertheless presented some of the currently best metaheuristics for several of these extensions. At the end of this paper, we apply metaheuristics to an atypical real-life instance of vehicle routing, based on the district heating network in Copenhagen. A solution is presented, and even though several of the special characteristics of the problem have been simplified and reduced, there is a large similarity between this solution and the existing network. 
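As a concrete contrast to the metaheuristics discussed, a classical constructive heuristic can be sketched in a few lines: nearest-neighbour route building for a single vehicle (capacity constraints omitted; the coordinates are made up). It is fast but easily trapped in the local minima that metaheuristics are designed to escape.

```python
import math

def route_length(points, order):
    """Total length of a depot -> customers -> depot tour (depot = points[0])."""
    tour = [0] + list(order) + [0]
    return sum(math.dist(points[a], points[b]) for a, b in zip(tour, tour[1:]))

def nearest_neighbour(points):
    """Classical constructive heuristic: always visit the closest
    unvisited customer next, starting from the depot."""
    unvisited = set(range(1, len(points)))
    order, current = [], 0
    while unvisited:
        nxt = min(unvisited, key=lambda j: math.dist(points[current], points[j]))
        order.append(nxt)
        unvisited.remove(nxt)
        current = nxt
    return order

# Depot at (0, 0) and four customers.
points = [(0, 0), (1, 0), (2, 0), (2, 1), (0, 1.5)]
order = nearest_neighbour(points)
length = route_length(points, order)
```

A metaheuristic such as tabu search or simulated annealing would take a tour like this as its starting point and then accept occasional worsening moves to escape the local minimum the greedy construction ends in.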
URI: http://hdl.handle.net/10417/2668 Files in this item: 1
kenneth_knudsen_og_mamona_tahir.pdf (2.185Mb)