Digital Opportunities for Traditional Lenders through PSD2 Access to Bank Account Data

Master Thesis "Executive MBA", Cologne-Rotterdam (2018)


The financial industry was an early adopter of computer technology, but traditional players are currently struggling with the challenge of digitalizing not only their processes but also their business models. The gap is being filled by an increasing number of financial technology (FinTech) startups pushing into all business areas of banks with innovative customer services. In addition to FinTechs, tech giants such as Facebook and Amazon have recently started offering financial services. As a consequence, the financial industry is facing a phase of intense competition. In the European Union, competition is further intensified by two important regulatory changes becoming effective in 2018: the General Data Protection Regulation (GDPR) and the revised Payment Services Directive (PSD2). The GDPR gives consumers control over their personal data, which includes the right to share their customer data stored by corporations with other parties. In the financial industry, the PSD2 goes further and requires banks to implement application programming interfaces (APIs) for non-discriminatory third-party access to bank accounts. Third-party providers can now build their innovative services on bank account data and the infrastructure provided by banks. From a bank's perspective, the new regulation significantly increases the risk of customer disintermediation and commoditization. While the new regulation increases competition in the financial industry, it also creates a wide range of business opportunities for banks, especially in lending, which is traditionally data-driven. This work demonstrates that banks could implement a fully automated lending process based on PSD2 access to accounts that could significantly improve the customer experience and increase operational efficiency. With regard to the implementation of the new process, it is shown that banks and FinTech competitors have developed complementary resources and both could benefit from partnerships.
The analysis also shows that banks could further exploit the new digital capabilities in lending in a new business model as data identity providers, generating new revenue streams in a growth market. Here, banks could develop a competitive advantage based on strong consumer trust and on the high level of assurance of banks' data processing capabilities that their regulated position requires. Finally, this work shows that banks need to adapt their communication strategies for engaging with customers in these new delivery and business models. Research shows that customers are willing to share their data for more purposes than companies believe, but expect transparency in return. This point is emphasized by the importance of customer consent under the new regulation. However, many banks currently approach data protection from a compliance perspective, relying on information pull-strategies. Here, banks could potentially even strengthen their trust position by switching to a push-strategy in which customers are actively informed about the use of their personal data.


Record statistics of fractal stochastic processes and intraday stock prices

Master Thesis in "Mathematical Finance", Oxford (2018)


Extreme events, and records in particular, play an important role in finance as well as in the study of stochastic processes in general. Extreme value statistics is the study of the occurrence and statistical properties of rare, extreme events; in record statistics, one is interested in events that exceed (or fall below) all previous events in a given series. In recent years, there has been considerable research on the statistics of record-breaking events in various areas of life and science and, particularly, also in finance.

Here, we study record events in two types of fractal stochastic processes and compare our findings to financial data. First, the record statistics of fractional Brownian motion are analyzed and discussed in detail. Then records in (rough) fractal stochastic volatility models are studied. In both models, the occurrence and number of record events of the processes and their increments as well as, importantly, record-record correlations are considered. The theoretical and numerical results developed in this context are then applied and compared to the record statistics of intraday stock data of numerous stocks listed at the NASDAQ stock exchange with a time resolution of one minute. In previous studies, only daily stock data was considered. The record statistics of intraday data exhibits interesting deviations from the behavior found within the daily data. In contrast to the daily data, when effects emerging from the discreteness of the stock prices are taken into account, the occurrence of records in the intraday data is consistent with an unbiased symmetric random walk.

We furthermore argue that it is worthwhile to study record-record correlations since they help to understand the properties of the stochastic process describing the intraday stock prices. The record-record correlations in intraday stock returns are, up to a certain degree, consistent with rough fractal stochastic volatility models with a small Hurst exponent H between 0.1 and 0.3. In this context, our analysis confirms the relevance and applicability of these models in finance.
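As an illustration of the kind of record analysis performed here, the following Python sketch simulates fractional Brownian motion via a Cholesky factorization of its exact covariance and counts upper records along the path. All parameter values are illustrative, and the O(n^3) Cholesky approach only suits short paths; the thesis does not prescribe this particular implementation (faster methods such as Davies-Harte exist).

```python
import numpy as np

def fbm_cholesky(n, H, T=1.0, rng=None):
    """Simulate one fractional Brownian motion path on n time steps via
    Cholesky factorization of the exact fBm covariance matrix."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.linspace(T / n, T, n)
    # Cov(B_H(t_i), B_H(t_j)) = 0.5 * (t_i^2H + t_j^2H - |t_i - t_j|^2H)
    cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * H))
    L = np.linalg.cholesky(cov)
    return L @ rng.standard_normal(n)

def count_records(path):
    """Number of upper records: entries exceeding all previous entries."""
    return int(np.sum(path == np.maximum.accumulate(path)))

path = fbm_cholesky(500, H=0.7, rng=np.random.default_rng(0))
n_records = count_records(path)
```

Averaging `n_records` over many paths for different Hurst exponents H reproduces the growth of the mean record number that the analytical results in this work describe.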


Time Series Analysis and Supervised Learning Classification of Pneumonitis Datasets

Master Thesis in "Mathematical Finance", Oxford (2017)


The goal of this thesis is to apply time series analysis and supervised learning classifiers to pneumonitis datasets. Pneumonitis is a general term for inflammation of lung tissue. In the first part of the thesis, we apply a seasonal ARIMA model, a Dynamic Harmonic Regression model using Fourier terms with ARIMA errors, and an artificial neural network to a dataset of three years of daily discharges in a hospital association, and produce forecasts up to 14 days ahead. We evaluate the in-sample and out-of-sample performance using a rolling forecast origin (cross-validation). We compare this to the performance of simple benchmarks: the seasonal naive benchmark and simple exponential smoothing. The results show that the artificial neural network outperforms all other models. In the second part, we present applications of the Logistic Regression classifier, the k-Nearest Neighbours (k-NN) classifier, the Random Forest classifier and the Gradient Boosting Machine (GBM) to a medical dataset for the classification of pneumonitis survival. The GBM classifier achieves the best area under the ROC curve among all developed models.
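A hedged illustration of the classification part: the clinical dataset is not public, so the sketch below trains a gradient boosting machine on synthetic data and evaluates the area under the ROC curve with scikit-learn. All dataset parameters are made up for demonstration and do not reflect the thesis data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# synthetic stand-in for the clinical dataset (shape and labels are made up)
X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])  # area under ROC curve
```

The same three lines of fit/predict/score apply to the other classifiers compared in the thesis by swapping in `LogisticRegression`, `KNeighborsClassifier` or `RandomForestClassifier`.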


Temperature-based weather derivatives: models, pricing and hedging

Master Thesis in "Mathematical Finance", Oxford (2017)


Weather derivatives have been one of the hot topics in Mathematical Finance since the inception of a structured market in the late 1990s. While a multitude of approaches for the valuation of these exotic derivatives has been proposed since, the challenge of market incompleteness due to non-tradable underlying assets has not yet been overcome. No generally accepted market-wide pricing formula is available, comparable to the no-arbitrage prices of stock options in the Black-Scholes setting, which creates barriers to broader market entry. Nevertheless, the dissertation at hand presents recent modelling and valuation concepts for the most prominent type of underlying, namely temperature, with a focus on modelling the daily average temperature by mean-reverting Ornstein-Uhlenbeck processes. Moreover, in the course of this work we provide a market update and point out the most common issues to be addressed when modelling temperature for application in the weather derivatives market.
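A minimal sketch of the temperature model referred to above: a deterministic seasonal mean plus a mean-reverting Ornstein-Uhlenbeck deviation, discretized with an Euler step of one day. The parameter values (seasonal level, amplitude, kappa, sigma) are purely illustrative and not fitted to any weather data; the heating-degree-day (HDD) sum at the end shows the kind of index on which temperature derivatives are written.

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(3 * 365)
# deterministic seasonal mean in degrees Celsius (illustrative, not fitted)
season = 10.0 + 8.0 * np.sin(2.0 * np.pi * (days - 100) / 365.0)
kappa, sigma = 0.25, 2.0        # mean-reversion speed and volatility (assumed)
X = np.zeros(len(days))         # Ornstein-Uhlenbeck deviation from the mean
for i in range(1, len(days)):   # Euler discretization with dt = 1 day
    X[i] = X[i - 1] - kappa * X[i - 1] + sigma * rng.standard_normal()
temp = season + X
# heating degree days over the period, a typical payoff index of contracts
hdd = np.maximum(18.0 - temp, 0.0).sum()
```

Monte Carlo averages of such HDD (or CDD) sums, taken under a suitable pricing measure, are the starting point for the valuation approaches discussed in the dissertation.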



Analysis of volatility clustering in selected empirical data and in a simple limit order book model

Master Thesis in "Mathematical Finance", Oxford (2017)


Comparison between Risk Weighted Assets under Basel III and Financial assets measurements under IFRS

Master Thesis in "Risk Management and Regulation", Frankfurt School of Finance & Management (2016)


Strategies to Manage the FX Exposure of Commodity Transactions

Master Thesis in "Executive MBA", University of Durham/EBS (2016)

This thesis investigates the advantages of integrated risk management for commodity price risks and exchange rate risks. It discusses whether commodity price risks and exchange rate risks should be managed separately or controlled simultaneously. Different strategies to manage exchange rate risks stemming from commodity transactions are presented and compared. The focus lies on the currency exchange rate risk of a company that has the obligation to buy a commodity traded in a foreign currency at a future date. The company does not want to be exposed to currency exchange rate risk and wants to be exposed only to the commodity price risk. Because forecasts of the future commodity price are uncertain, the appropriate FX hedging strategy is not obvious: the required future amount of the foreign currency depends on the unknown future commodity price and therefore cannot be hedged accurately. It is discussed whether the simultaneous management of the commodity exposure and the exchange rate exposure is advantageous. The investigation starts with an analysis of relevant historic market data. Subsequently, different hedging strategies for the management of FX exposure stemming from commodity transactions with unknown future commodity prices are presented. These strategies range from a fully hedged FX position, where the FX exposure is continuously monitored and closed, to a strategy without any FX hedge transactions. Correlations between commodity prices and currency exchange rates are included in the calculations and their influence on the total risk position is discussed. A sensitivity analysis is performed and the impact of changing market conditions on the riskiness of the different FX hedging strategies is presented.
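The effect of correlation on the hedging decision can be illustrated with a small Monte Carlo sketch. All distributions and parameter values below are assumptions for demonstration, not taken from the thesis: a company facing a USD commodity purchase either leaves the FX position open or forward-hedges the expected USD amount, leaving only the forecast error unhedged.

```python
import numpy as np

def cost_std(rho, n=200_000, seed=2):
    """Std. dev. of the EUR cost of a future USD commodity purchase,
    unhedged vs. forward-hedging only the *expected* USD amount."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    c = 100.0 * np.exp(0.20 * z[:, 0] - 0.5 * 0.20 ** 2)  # USD commodity price
    f = 0.91 * np.exp(0.08 * z[:, 1] - 0.5 * 0.08 ** 2)   # EUR per USD rate
    unhedged = c * f                      # buy all USD at the future spot rate
    # lock in today's rate for the expected amount; the residual stays open
    hedged = c.mean() * 0.91 + (c - c.mean()) * f
    return unhedged.std(), hedged.std()
```

For uncorrelated risks the partial forward hedge reduces the cost variance, but with sufficiently negative correlation between commodity price and FX rate the unhedged position can become the less risky one, which is exactly why the joint management of both exposures matters.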


Performance Measurement and Management of Interest Rate Risk incorporating Accounting Effects

Master Thesis in "Executive MBA", University of Durham/EBS (2016)

Schierenbeck invented a method known in the literature as the current interest rate method (Marktzinsmethode). This method has been widely used as a performance measure in the financial industry. In the context of the HGB (German Commercial Code) this method offers meaningful results, but this is not the case under IFRS, since IFRS-specific P&L effects, such as agios, disagios, fees, and hedge and basis adjustments, are not incorporated in the current method. Within this dissertation the attempt will be made to generalize the current method in a way that considers IFRS-specific effects. The generalization will use synthetic derivatives in accordance with the concept of asset swap pricing to determine the so-called generalized margin. This rather theoretical concept will deliver an idea of the perfect hedging instrument as a side product. The concept will offer a generalized definition of the net interest margin (conditions margin) and the profit/loss from the mismatch spread (structural margin). In addition, the thesis will offer a framework for the management of interest rate risk. The scope will be limited to financial investments with a fixed rate, floating rate or zero coupon structure. At a later stage the concept might be generalized to derivatives as well. The data used will be taken from common financial market data providers, such as Reuters.
The thesis starts with a literature review of existing concepts and methodologies, continued by the definition of the generalized margin and a framework for performance measurement incorporating IFRS. These concepts are then applied in a simulation study, which serves as a proof of concept.

Sponsoring Big Sports. Is It Worth It? A Stock Price Perspective

Master Thesis in "Executive MBA", University of Durham/EBS (2015)

Big sporting events like the Olympic Games or the FIFA Football World Cups attract a huge global audience. It is, therefore, not surprising that companies want to utilise these events to increase their brands’ value. Event sponsorship is a popular, yet expensive, means to achieve this.
This paper provides an analysis of whether such a sponsoring endeavour is beneficial for the sponsor. The outcome is measured by the sponsor’s stock performance during the event, by means of an event study. A literature review on the event study methodology and on the research surrounding sport sponsorship is provided.
The analysis focusses on Adidas as a major sponsor of big sporting events. Adidas' main competitors, Puma and Nike, are also included in the research; as these two companies do not act as event sponsors, they offer a useful comparison.
The study uses an approach for a single firm / single event study proposed in the literature. The approach is extended with an original bootstrapping method that allows the calculation of abnormal return over the period of the studied event.
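A minimal sketch of such a residual-bootstrap event study under the market model follows; the function name, data and all parameters are illustrative stand-ins, not the thesis' exact procedure.

```python
import numpy as np

def bootstrap_car_pvalue(r_stock, r_market, event_len, n_boot=5000, seed=0):
    """Single-firm event study: market-model abnormal returns plus a
    residual-bootstrap null distribution for the cumulative abnormal
    return (CAR) over the last `event_len` observations."""
    rng = np.random.default_rng(seed)
    est_s, est_m = r_stock[:-event_len], r_market[:-event_len]
    beta, alpha = np.polyfit(est_m, est_s, 1)        # market-model fit
    resid = est_s - (alpha + beta * est_m)
    car = np.sum(r_stock[-event_len:] - (alpha + beta * r_market[-event_len:]))
    # under the null, the CAR behaves like a sum of event_len resampled residuals
    boot = rng.choice(resid, size=(n_boot, event_len), replace=True).sum(axis=1)
    return car, float(np.mean(np.abs(boot) >= abs(car)))  # two-sided p-value

# toy data with an injected positive effect in the last five observations
rng = np.random.default_rng(1)
rm = rng.normal(0.0005, 0.01, 250)
rs = 0.0002 + 1.2 * rm + rng.normal(0.0, 0.01, 250)
rs[-5:] += 0.05
car, p = bootstrap_car_pvalue(rs, rm, event_len=5)
```

The bootstrap avoids distributional assumptions on the residuals, which is what makes the single-firm / single-event setting tractable.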
The results show on the one hand that companies cannot rely on achieving significant positive abnormal returns from event sponsorship activities. On the other hand, significant and systematic negative abnormal returns could not be observed either. In addition, the results indicate that sponsoring individuals – as opposed to events – may yield significant positive effects.
In summary – from Adidas’ perspective – sponsoring the events can be considered a reasonable endeavour. The lack of significant stock price changes shows that the net value of the sponsorship action is zero, which means that the investors regard the costs and the benefits of Adidas’ activity as balanced. However, as costs are constantly increasing, event sponsorship may become less attractive in the future.

The Challenge of Collateral Management - ISDA-CSA, CCPs and new Margining Rules for OTC-Derivatives

Master Thesis in "Risk Management and Regulation", Frankfurt School of Finance & Management (2015)

In this thesis we discuss various legal and operational challenges with respect to collateral management. We discuss the role of collateral as a risk mitigant, legal challenges, and standardised legal contracts such as the ISDA Credit Support Annex or the “Besicherungsanhang zum DRV”. Moreover, we look at the most recent regulatory developments in detail. We focus on the margin requirements under EMIR for non-centrally cleared transactions and on the clearing obligation. This includes frontloading, initial and variation margin including minimum standards for models set by the regulator, account setup and segregation models, and requirements on eligible collateral.

Using Semantic Web Technologies in Financial Institutions

Master Thesis in "Risk Management and Regulation", Frankfurt School of Finance & Management (2015)

This thesis is structured as follows. We will begin with an introduction to semantic technologies and the Web Ontology Language (OWL). We will also discuss the different language profiles and their implications for the practical implementation. Next, we will have a look at related technologies established in the banking sector. Here, XBRL plays a prominent role due to its wide and accepted use in regulatory reporting. However, we will also be able to see how XBRL relates to OWL and what their main differences are. In the next chapter, we will discuss FIBO, an emerging ontology standard for financial institutions. All of these chapters include a section called "Use Case", which discusses some applications of the currently discussed topics. Having reviewed the technical issues, we will turn our attention to the regulatory aspects of semantic technologies in the form of BCBS 239. Finally, the concluding chapter will review whether semantic technologies in general and FIBO in particular are a viable option for a bank's risk IT.


Relationship between IFRS and bank regulation -
A critical appraisal of the prudent valuation approach 

MBA Thesis in "General Management", Leipzig Graduate School of Management (2015)

This master thesis presents the accounting rules for financial instruments according to IFRS 9 and the fair value measurement as set out in IFRS 13. Also, the prudent valuation requirements as demanded by the CRR are explained. Special emphasis is devoted to the ensuing additional valuation adjustments specified by the EBA in terms of the final exposure draft of their RTS on prudent valuation. The relationship between financial reporting standards and valuation approaches pursuant to prudent valuation is discussed. This discussion is based on the IFRS conceptual framework.


Default contagion in interbank networks: a simulation based analysis 

Master Thesis in "Executive MBA", University of Durham/EBS (2015)

The interest of researchers, policy makers and central banks in interbank behaviour has grown since Lehman Brothers, one of the largest investment banks, collapsed in 2008. The default of Lehman has become a symbol of the financial crisis, and further defaults of banks considered "too-big-to-fail" could only be prevented by interventions of European and US governments.
In this thesis, a simulation-based framework to analyse systemic risk in the interbank network is presented. Systemic risk in the financial network is studied as the risk of cascading defaults caused by the default of a single bank or a group of banks. The simulation based approach is used to identify systemically important banks based on real data. Here, a bank is considered to be systemically important if its default would cause a domino effect of bank defaults.
The presented framework is based on the credit relationships between connected financial institutions and on a model of the financial robustness of these institutions. Two different models have been developed. The first is a static model that analyses the default cascade at a given point in time and uses a sequential algorithm to perform simulations. The second is a dynamic model that simulates the financial robustness over time and includes stochastic external market effects in the simulation.
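The static, sequential cascade model can be sketched as follows: a minimal Furfine-type algorithm on a toy exposure matrix, where the thesis' actual model and data are of course richer. All numbers here are invented for illustration.

```python
import numpy as np

def default_cascade(exposures, capital, initial_default, lgd=1.0):
    """Sequential (Furfine-type) default cascade: when a bank defaults,
    each creditor writes off lgd * exposure against its capital; any bank
    whose capital turns negative defaults in the next round.
    exposures[i, j] = claim of bank i on bank j."""
    n = len(capital)
    capital = capital.astype(float).copy()
    defaulted = np.zeros(n, dtype=bool)
    defaulted[initial_default] = True
    frontier = [initial_default]
    while frontier:
        new = []
        for j in frontier:
            capital -= lgd * exposures[:, j]   # creditors absorb the losses
        for i in range(n):
            if not defaulted[i] and capital[i] < 0:
                defaulted[i] = True
                new.append(i)
        frontier = new
    return defaulted
```

A bank is then classified as systemically important if triggering its default leads to a cascade affecting a large fraction of the network.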
The two modelling approaches have been applied to real-life data describing the global interbank market. So far, default cascade algorithms have only been applied to data of domestic interbank networks. In this thesis, global data on the interbank network published by the Bank for International Settlements (BIS) are used. Although these data are only available in aggregated form at the country level, a sampling technique is developed to derive individual credit relationships between approx. 100 banks. A comparison of data from the end of 2009 and the end of 2013 demonstrates that the global interbank market is more stable today than it was five years ago. Based on the simulation results, a group of 28 financial institutions is considered systemically relevant in 2013, whereas 57 banks belong to that group in 2009.
The results of this paper may be of interest to central banks and policy makers. The identified systemically relevant banks might be charged with higher core capital requirements to avoid further governmental interventions.

Cross-Correlation of FX Rates from a Microstructure Perspective

Master Thesis in "Mathematical Finance", Oxford (2014)

In this master thesis we analyze lead/lag effects between FX rates in market tick time. We show that the cross-correlations between the currency pairs EUR/USD, EUR/JPY and USD/JPY are statistically significant and change throughout the day as trading activity moves between different trading zones. These so-called triangular pairs are predominantly traded on EBS, and we use tick-by-tick order book data from EBS. [...]


Influence of the long-range order flow correlations on the limit order book dynamics

Master Thesis in "Mathematical Finance", Oxford (2015)

Technological progress of recent decades has had a significant impact on how stock market trading is organized. Floor trading has to a large extent lost its significance and has been replaced by modern electronic markets, which use limit order books (LOB) as their price formation mechanism. Understanding LOB dynamics is not only an interesting academic topic, but also has many practical applications, e.g. developing optimal order execution algorithms, minimizing market impact, and designing electronic trading algorithms.

Although many different models have been developed to describe LOB properties and dynamics, none of them is considered to be “the standard model”. In the economics literature, theories have been proposed which try to explain the complex LOB dynamics by the interaction of fully rational agents. In the physics literature, on the other hand, the opposite approach has been taken: order flows are modelled as random processes, and the rationality of the agents is completely ignored. These so-called “zero-intelligence” models inspired by statistical physics are quite successful in describing many empirically observed properties of real order books (e.g. the concave market impact).

In this thesis, one of the most popular “zero-intelligence” LOB models, that of Smith et al., is studied using numerical simulations. Since the original results for this model were mostly obtained for the unrealistic case of an infinitesimal price tick dp = 0, one of the goals of this thesis is to assess how the model properties are affected if a realistic finite tick size dp > 0 is used. Our study confirms most of the conclusions from the original paper and points out the small corrections introduced by using dp > 0.

One of the major drawbacks of the Smith et al. model is that it does not take into account the empirical fact that the time series of market order signs exhibits long-range correlations. Another drawback is its unrealistic subdiffusive price dynamics, which contradicts market efficiency. Following Toth et al., long-range correlations of the order flow are explicitly included as an extension to the basic Smith model and their influence on the model properties is studied in detail.
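A compressed sketch of a Smith-et-al-style zero-intelligence order book may make the model concrete: unit-size limit orders arrive uniformly on a finite price grid, market orders hit the best quote, and resting orders cancel independently. All rates and grid sizes are illustrative, and several simplifications are made relative to the original model.

```python
import numpy as np

def simulate_zi_lob(steps=20000, n_prices=200, alpha=1.0, mu=10.0,
                    delta=0.2, seed=3):
    """Minimal zero-intelligence LOB in the spirit of Smith et al.:
    limit buys arrive below the ask, limit sells above the bid, market
    orders hit the best quote, cancellations are uniform over resting
    orders. Returns the mid-price path."""
    rng = np.random.default_rng(seed)
    bids = np.zeros(n_prices, int)   # resting buy volume per price level
    asks = np.zeros(n_prices, int)
    bids[n_prices // 2 - 1] = asks[n_prices // 2] = 5   # seed the book
    mids = []
    for _ in range(steps):
        best_bid = np.max(np.nonzero(bids)[0]) if bids.any() else 0
        best_ask = np.min(np.nonzero(asks)[0]) if asks.any() else n_prices - 1
        rates = np.array([alpha * best_ask, alpha * (n_prices - best_bid - 1),
                          mu / 2, mu / 2, delta * bids.sum(), delta * asks.sum()])
        ev = rng.choice(6, p=rates / rates.sum())
        if ev == 0:                       # limit buy, uniform below the ask
            bids[rng.integers(0, best_ask)] += 1
        elif ev == 1:                     # limit sell, uniform above the bid
            asks[rng.integers(best_bid + 1, n_prices)] += 1
        elif ev == 2 and asks.any():      # market buy lifts the best ask
            asks[best_ask] -= 1
        elif ev == 3 and bids.any():      # market sell hits the best bid
            bids[best_bid] -= 1
        elif ev == 4 and bids.any():      # cancel a uniformly chosen bid
            bids[rng.choice(n_prices, p=bids / bids.sum())] -= 1
        elif ev == 5 and asks.any():      # cancel a uniformly chosen ask
            asks[rng.choice(n_prices, p=asks / asks.sum())] -= 1
        mids.append((best_bid + best_ask) / 2)
    return np.array(mids)
```

The long-range-correlated extension studied in the thesis replaces the i.i.d. market order signs (events 2 and 3) by a persistent sign process, which is precisely the mechanism whose influence on spread and diffusivity is analyzed.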


Hedge Accounting According to IFRS: Valuation of Interest Rate Risk or: Do Hedge Accounting Projects deliver?

Master Thesis in "Executive MBA", University of Durham/EBS (2015)

The collapse of Lehman Brothers and the financial crisis during 2007/2008 caused an elevated perceived default-risk for banks. As a consequence, a basis-spread between the EURIBOR/LIBOR-Indices and the corresponding Overnight-Rates became observable in the interbank-markets.

As the basis-spread was largely absent in the pre-crisis world, there was no difference between the discounting- and the forward-yield curves used to determine the present value of interbank-derivatives. The appearance of the basis-spread led to new multi-curve approaches, which have become established in the after-crisis world. In these multi-curve approaches, the Overnight-Indices are used for determining the discount-factors for the valuation of collateralized derivatives, while the bootstrapping-procedures for the forward-curves have been extended such that they rely on both the Overnight-Index and the Swap-Rates used in the pre-crisis world.
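The dual-curve logic can be made concrete with a toy calculation using flat, made-up zero rates and continuous compounding for simplicity: forwards are read off the projection curve while cash flows are discounted on the OIS curve.

```python
import numpy as np

# illustrative flat zero rates (not market data): OIS for discounting,
# a separate projection curve for the 6M forwards; tau is the accrual period
ois_rate, proj_rate, tau = 0.010, 0.014, 0.5

def df(rate, t):
    """Continuously compounded discount factor."""
    return np.exp(-rate * t)

def fwd(t):
    """Simply compounded forward for [t, t + tau] off the projection curve."""
    return (df(proj_rate, t) / df(proj_rate, t + tau) - 1.0) / tau

# PV of the floating leg of a 1Y swap on unit notional: forwards come from
# the projection curve, cash flows are discounted on the OIS curve
pv_float = sum(tau * fwd(t) * df(ois_rate, t + tau) for t in (0.0, 0.5))
```

In the single-curve pre-crisis world the two rates coincide; the gap between `ois_rate` and `proj_rate` is exactly the basis-spread that gives rise to the separate spread-risk discussed below.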

This approach, involving more curves than in the pre-crisis world, has introduced a new risk-factor. While in the pre-crisis world there was a single interest-rate risk, the risk now has to be divided into an interest-rate risk (based e.g. on Overnight-Indices) and a spread-risk, based on the yield-difference between the Overnight-Rate and the corresponding EURIBOR/LIBOR-Tenor.

The appearance of this new risk-factor has induced the “reinvention” of several areas of bank-controlling, especially market-risk and Hedge Accounting. In this dissertation the focus is set on the latter. The questions arising from the new circumstances are: Is the old approach still effective? If not, is it worthwhile for a bank to start a project in order to implement a multi-curve setup in an IFRS-compliant way? Is the fair value option an alternative to Hedge Accounting?

In this thesis we focus on interest-rate risks in the EUR-zone. To this end, we construct a valuation-programme simulating hedge-relationships based on real market data. Within this framework, we test different hedge fair value approaches and compare the results to the alternative of a fair value option. Furthermore, we include the project-costs of implementing a new Hedge Accounting method in order to give an indication of whether such projects are worth being conducted.


Fair Value Measurement of Loans

Master Thesis in "Risk Management and Regulation", Frankfurt School of Finance & Management (2013)

This thesis provides an insight into the current accounting treatment of loans and the need for fair value measurement. In addition, based on banks' notes to the financial statements and personal exchange, current fair value approaches applied by banks are briefly discussed. In the last section the paper's objective and restrictions are summarized.


Stakeholder management in turnaround situations - with the focus on German Landesbanks

Master Thesis in "Executive MBA", University of Durham/EBS (2013)

In this dissertation the key success factors for stakeholder management in a turnaround situation are analyzed, with a focus on German Landesbanks. The special case in which the state is one of the stakeholders has not been considered so far, although it became a hot topic during the financial crisis and in its aftermath. German Landesbanks provide a good opportunity to investigate the key success factors for stakeholder management in a turnaround situation, since the state remained the owner of the Landesbanks during the financial crisis. Key success factors can thus be derived by analyzing the actions of the CEOs and their management teams, the effect their actions had on the relevant stakeholders, and the impact the relevant stakeholders in turn had on the turnaround result.


The optimal Market Share of a Tourism Company

Master Thesis in "Executive MBA", University of Durham/EBS (2013)

The present work investigates how the market share of a tourism company might be optimized. An essential part of the work is to derive a model for the profit of the company that has a structure similar to that of a portfolio of financial assets. This allows drawing on results of portfolio theory in order to optimize the market share. Tourism demand is an essential factor for the company and is modelled by an autoregressive process that takes irregularities into account. Autoregressive processes and irregularities have already been used in the literature on forecasting tourism demand. Within the framework of a sample calculation it is outlined how the theoretical results could be used for planning and decision making.

Credit Exposure Netting with Central Clearing Counterparties 

Master Thesis in "Mathematical Finance", Oxford (2011)

Derivative contracts such as CDS are typically collateralized. Offsetting positions are netted, so only the remaining exposure results in collateral costs. Clearing is done either bilaterally or via a central clearing counterparty. The manner in which netting is organized in a market with many participants influences the total collateral demand. Here, the total collateral demand of a purely bilateral netting market is compared to that of a combination of bilateral and centralized clearing. This corresponds to the situation of part of a market moving from OTC to central clearing (e.g. consider the current intention of centralizing CDS clearing). The model predicts that a reduction of total collateral is only achievable if the market share moved to centralized clearing has a minimum size and number of participants. Refining the model, it is found that the exact efficiency threshold depends strongly on the actual exposure distributions and their correlations, indicating that, from a purely netting-efficiency-driven point of view, the decision to switch from bilateral to centralized netting demands careful further study.
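The basic netting comparison can be sketched as follows, using random antisymmetric exposures and an idealized CCP that margins each participant's multilaterally netted position; the model in the thesis is more refined.

```python
import numpy as np

def collateral_demand(exposures):
    """exposures[i, j] = value of i's position against j (antisymmetric).
    Bilateral: each pair collateralizes its own net exposure.
    Central: each participant posts only its multilaterally netted exposure."""
    bilateral = np.abs(np.triu(exposures, k=1)).sum()
    central = np.maximum(exposures.sum(axis=1), 0.0).sum()
    return bilateral, central

rng = np.random.default_rng(4)
n = 20
raw = np.triu(rng.normal(size=(n, n)), k=1)
expo = raw - raw.T                   # antisymmetric exposure matrix
bilateral, central = collateral_demand(expo)
```

With a single CCP clearing everything, multilateral netting can never require more collateral than pairwise bilateral netting. The comparison becomes non-trivial precisely when only part of the market (or a subset of products) moves to the CCP and the netting sets are split, which is the situation analyzed in this work.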

Network models for financial systems

Master Thesis in "Quantitative Finance", Frankfurt School of Finance & Management (2011)

Währungsrisiken in Investitionsvehikeln für Mikrokredite (Currency Risks in Investment Vehicles for Microloans)

Master Thesis in "Quantitative Finance", Frankfurt School of Finance & Management (2010)