BA Working Paper Series
- Competing for contracts with buyer uncertainty: Choosing price and quality variables ( 343.3 KB)
Edward Anderson Cheng Qian
Abstract: We model a situation in which a single firm evaluates competing suppliers and
selects just one. Suppliers submit bids involving both price and quality variables. The
buyer makes a choice which, from the suppliers' perspective, appears to contain a
stochastic element - for example, the buyer may have information that is not
shared with the suppliers and that gives one supplier an advantage in the final
choice. We use a discrete choice model of buyer choice (e.g. multinomial logit). Our
main result is that the supplier's choice of the quality variables is not affected by the
competitive environment. Thus the suppliers compete only on price. We compare this
with a second model in which the buyer's weighting on different quality variables is
uncertain at the time bids are made.
Keywords: Supplier choice, Quality variables, Nash equilibrium, Types of uncertainty
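The multinomial logit model mentioned in the abstract assigns each supplier a choice probability proportional to the exponential of its utility. A minimal sketch, with entirely hypothetical utilities built from price and quality (the 0.8 price weight is illustrative, not from the paper):

```python
import math

def mnl_choice_probabilities(utilities):
    """Multinomial logit: P(i) = exp(u_i) / sum_j exp(u_j)."""
    m = max(utilities)  # subtract the max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical bids: (price, quality); utility u_i = quality - 0.8 * price
bids = [(10.0, 7.0), (9.0, 6.5), (11.0, 8.2)]
utilities = [q - 0.8 * p for p, q in bids]
probs = mnl_choice_probabilities(utilities)
```

The buyer's private information enters as the stochastic element of this choice rule: from the suppliers' side, only the probabilities are known.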
- Practical use of sensitivity in econometrics with an illustration to forecast combinations ( 244.2 KB)
Jan R. Magnus Andrey L. Vasnev
Abstract: Sensitivity analysis is important for its own sake and also in combination with
diagnostic testing. We consider the question of how to use sensitivity statistics in
practice, in particular how to judge whether sensitivity is large or small. For this
purpose we distinguish between absolute and relative sensitivity and highlight the
context-dependent nature of any sensitivity analysis. Relative sensitivity is then
applied in the context of forecast combination and sensitivity-based weights are
introduced. All concepts are illustrated through the European yield curve. In this
context it is natural to look at sensitivity to autocorrelation and normality assumptions.
Different forecasting models are combined with equal, fit-based and sensitivity-based
weights, and compared with the multivariate and random walk benchmarks. We show
that the fit-based weights and the sensitivity-based weights are complementary. For
long-term maturities the sensitivity-based weights perform better than other weights.
Keywords: Sensitivity analysis, Forecast combination, Yield curve prediction
- Forecast combination for U.S. recessions with real-time data ( 244.6 KB)
Laurent L. Pauwels Andrey Vasnev
Abstract: This paper proposes the use of forecast combination to improve predictive accuracy
in forecasting the U.S. business cycle index, as published by the Business Cycle
Dating Committee of the NBER. It focuses on one-step-ahead out-of-sample monthly
forecasts utilising the well-established coincident indicators and yield curve models,
allowing for dynamics and real-time data revisions. Forecast combinations use log-score
and quadratic-score based weights, which change over time. This paper finds
that forecast accuracy improves when combining the probability forecasts of both the
coincident indicators model and the yield curve model, compared to each model's
own forecasting performance.
Keywords: U.S. business cycle, Forecast combination, Density forecast, Probit models, Yield curve, Coincident indicators.
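Score-based combination weights of the kind this paper uses can be sketched as follows. The softmax mapping from average log-scores to weights is one simple possibility and an assumption on our part, not necessarily the paper's exact scheme; all forecasts and outcomes below are made up for illustration:

```python
import math

def log_score(p, y):
    # log predictive score for a binary outcome y in {0, 1}
    return math.log(p if y == 1 else 1.0 - p)

def score_based_weights(scores):
    """Map average scores to combination weights (softmax-style sketch)."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical probability forecasts from two models over five months,
# with realised recession indicators y
model_a = [0.2, 0.6, 0.7, 0.3, 0.8]
model_b = [0.4, 0.5, 0.5, 0.5, 0.6]
y       = [0,   1,   1,   0,   1]

avg_scores = [
    sum(log_score(p, yi) for p, yi in zip(model, y)) / len(y)
    for model in (model_a, model_b)
]
w = score_based_weights(avg_scores)
combined = [w[0] * pa + w[1] * pb for pa, pb in zip(model_a, model_b)]
```

Recomputing the weights each period over an expanding window gives the time-varying weights the abstract describes.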
- Practical considerations for optimal weights in density forecast combination ( 288.9 KB)
Andrey L. Vasnev Laurent L. Pauwels
Abstract: The problem of finding appropriate weights to combine several density forecasts
is an important issue currently debated in the forecast combination literature.
Recently, a paper by Hall and Mitchell (IJF, 2007) proposes to combine density
forecasts with optimal weights obtained from solving an optimization problem.
This paper studies the properties of this optimization problem when the number
of forecasting periods is relatively small and finds that it often produces corner
solutions by allocating all the weight to one density forecast only. This paper's
practical recommendation is to have an additional training sample period for the
optimal weights. While reserving a portion of the data for parameter estimation
and making pseudo-out-of-sample forecasts are common practices in the empirical
literature, employing a separate training sample for the optimal weights is novel,
and it is suggested because it decreases the chances of corner solutions. Alternative
log-score or quadratic-score weighting schemes do not have this training sample requirement.
Keywords: Forecast combination; Density forecast; Optimization; Optimal weight; Discrete choice models
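The optimization problem discussed above chooses the weight on each density forecast to maximise the average log score of the combination. A minimal sketch with two hypothetical density forecasts shows how, in a short sample where one model dominates period by period, the optimum lands at a corner:

```python
import math

# Hypothetical density forecasts evaluated at the realised outcomes:
# p1[t] and p2[t] are the predictive densities each model assigned to y_t
p1 = [0.40, 0.35, 0.50, 0.45]
p2 = [0.20, 0.30, 0.25, 0.15]

def avg_log_score(w):
    """Average log score of the combined density w*p1 + (1-w)*p2."""
    return sum(math.log(w * a + (1 - w) * b) for a, b in zip(p1, p2)) / len(p1)

# Grid search over the weight simplex [0, 1]
grid = [i / 1000 for i in range(1001)]
w_opt = max(grid, key=avg_log_score)  # here p1 dominates, so w_opt hits the corner
```

With only four periods and model 1 better in every one of them, all weight goes to model 1, which is exactly the corner-solution behaviour the paper documents in small samples.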
- Multiple Event Incidence and Duration Analysis for Credit Data Incorporating Non-Stochastic Loan Maturity ( 476.0 KB)
John G. T. Watkins Andrey L. Vasnev Richard Gerlach
Abstract: Applications of duration analysis in Economics and Finance exclusively employ
methods for events of stochastic duration. In application to credit data, previous
research incorrectly treats the time to pre-determined maturity events as censored
stochastic event times. The medical literature has binary parametric 'cure rate'
models that deal with populations that never experienced the modelled event. We
propose and develop a Multinomial parametric incidence and duration model,
incorporating such populations. In the class of cure rate models, this is the first fully
parametric multinomial model and is the first framework to accommodate an event
with pre-determined duration. The methodology is applied to unsecured personal
loan credit data provided by one of Australia's largest financial services
organizations. This framework is shown to be more flexible and predictive through a
simulation and empirical study that reveals: simulation results of estimated
parameters with a large reduction in bias; superior forecasting of duration;
explanatory variables can act in different directions upon incidence and duration;
and, variables exist that are statistically significant in explaining only incidence or duration.
Keywords: Loan default, prepayment, maturity, dependent competing risks, duration analysis
- Bayesian Semi-parametric Expected Shortfall Forecasting in Financial Markets ( 344.3 KB)
Richard H. Gerlach Cathy W.S. Chen Liou-Yan Lin
Abstract: Bayesian semi-parametric estimation has proven effective for quantile estimation in general and specifically in financial Value at Risk forecasting. Expected shortfall is a competing tail risk measure, involving a conditional expectation beyond a quantile, that has recently been semi-parametrically estimated via asymmetric least squares and so-called expectiles. An asymmetric Gaussian density is proposed, allowing a likelihood to be developed that leads to Bayesian semi-parametric estimation and forecasts of expectiles and expected shortfall. Further, the conditional autoregressive expectile class of model is generalised to two fully nonlinear families. Adaptive Markov chain Monte Carlo sampling schemes are employed for estimation in these families. The proposed models are clearly favoured in an empirical study forecasting eleven financial return series: clear evidence of more accurate expected shortfall forecasting, compared to a range of competing methods, is found. Further, the most favoured models are those estimated by Bayesian methods.
Keywords: CARE model; Nonlinear; Asymmetric Gaussian distribution; Expected shortfall; semi-parametric.
- Does the Box-Cox transformation help in forecasting macroeconomic time series? ( 219.2 KB)
Tommaso Proietti Helmut Lütkepohl
Abstract: The paper investigates whether transforming a time series leads to an improvement in forecasting accuracy. The class of transformations that is considered is the Box-Cox power transformation, which applies to series measured on a ratio scale. We propose a nonparametric approach for estimating the optimal transformation parameter based on the frequency domain estimation of the prediction error variance, and also conduct an extensive recursive forecast experiment on a large set of seasonal monthly macroeconomic time series related to industrial production and retail turnover. In about one fifth of the series considered, the Box-Cox transformation produces forecasts significantly better than the untransformed data at the one-step-ahead horizon; in most of the cases the logarithmic transformation is the relevant one. As the forecast horizon increases, the evidence in favour of a transformation becomes less strong. Typically, the naïve predictor that just reverses the transformation leads to a lower mean square error than the optimal predictor at short forecast leads. We also discuss whether the preliminary in-sample frequency domain assessment provides reliable guidance on which series should be transformed to significantly improve predictive performance.
Keywords: Forecast comparisons. Multi-step forecasting. Rolling forecasts. Nonparametric estimation of prediction error variance.
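The Box-Cox transformation and the naïve predictor that simply reverses it can be sketched as follows (the bias correction needed for an optimal predictor is deliberately omitted, matching the naïve case the paper discusses):

```python
import math

def box_cox(y, lam):
    """Box-Cox power transform for y > 0: log(y) if lam == 0, else (y^lam - 1)/lam."""
    return math.log(y) if lam == 0 else (y ** lam - 1) / lam

def inv_box_cox(z, lam):
    """Naive inverse: just reverses the transformation, with no bias correction."""
    return math.exp(z) if lam == 0 else (lam * z + 1) ** (1 / lam)

# Round trip: a forecast made on the transformed scale, then naively reversed
y = 120.0
for lam in (0.0, 0.5, 1.0):  # lam = 0 is the logarithmic case the paper finds most relevant
    z = box_cox(y, lam)
    assert abs(inv_box_cox(z, lam) - y) < 1e-9
```

In the forecast experiment, `z` would be a model's prediction on the transformed scale, and the naïve predictor reports `inv_box_cox(z, lam)` on the original scale.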
- The Multistep Beveridge-Nelson Decomposition ( 188.3 KB)
Abstract: The Beveridge-Nelson decomposition defines the trend component in terms of the eventual forecast function, as the value the series would take if it were on its long-run path. The paper introduces the multistep Beveridge-Nelson decomposition, which arises when the forecast function is obtained by the direct autoregressive approach, which optimizes the predictive ability of the AR model at forecast horizons greater than one. We compare our proposal with the standard Beveridge-Nelson decomposition, for which the forecast function is obtained by iterating the one-step-ahead predictions via the chain rule. We illustrate that the multistep Beveridge-Nelson trend is more efficient than the standard one in the presence of model misspecification and we subsequently assess the predictive validity of the extracted transitory component with respect to future growth.
Keywords: Trend and Cycle. Forecasting. Filtering. Misspecification
- Do External Political Pressures Affect the Renminbi Exchange Rate? ( 219.2 KB)
Laurent Pauwels Li-Gang Liu
Abstract: This paper investigates whether external political pressure for faster renminbi (RMB)
appreciation affects both the daily returns and the conditional volatility of the RMB
central parity rate. We construct several political pressure indicators pertaining to the
RMB exchange rate, with a special emphasis on the US pressure, to test the
hypothesis. After controlling for Chinese macroeconomic surprise news, we find that
US and non-US political pressure does not have a significant influence on RMB's
daily returns. However, evidence suggests that political pressures, and especially
those from the US, have statistically significant impacts on the conditional volatility of
the RMB. Furthermore, we conduct the same exercise on the 12-month RMB non-deliverable
forward rate (NDF). We find that the NDF market is highly responsive to
macroeconomic surprise news and there is some evidence that Sino-US bilateral
meetings affect the conditional volatility of the RMB NDF.
Keywords: Renminbi exchange rate, Event studies, Political pressures, Non-deliverable forward, Macroeconomic news
- Stochastic trends and seasonality in economic time series: new evidence from Bayesian stochastic model specification search ( 216.7 KB)
Tommaso Proietti Stefano Grassi
Abstract: An important issue in modelling economic time series is whether key unobserved components representing trends, seasonality and calendar components are deterministic or evolutive. We address it by applying a recently proposed Bayesian variable selection methodology to an encompassing linear mixed model that features, along with deterministic effects, additional random explanatory variables that account for the evolution of the underlying level, slope, seasonality and trading days. Variable selection is performed by estimating the posterior model probabilities using a suitable Gibbs sampling scheme. The paper conducts an extensive empirical application on a large and representative set of monthly time series concerning industrial production and retail turnover. We find strong support for the presence of stochastic trends in the series, either in the form of a time-varying level, or, less frequently, of a stochastic slope, or both. Seasonality is a more stable component: in only 70% of the cases were we able to select at least one stochastic trigonometric cycle out of the six possible cycles. Most frequently the time variation is found in correspondence with the fundamental and the first harmonic cycles. An interesting and intuitively plausible finding is that the probability of estimating time-varying components increases with the sample size available. However, even for very large sample sizes we were unable to find stochastically varying calendar effects.
Keywords: Nonstationarity. Variable selection. Linear Mixed Models.
- Ranking games and gambling: When to quit when you're ahead ( 294.9 KB)
Abstract: It is common for rewards to be given on the basis of a rank ordering, so that relative
performance amongst a cohort is the criterion. In this paper we formulate an
equilibrium model in which an agent makes successive decisions on whether or not
to gamble and is rewarded on the basis of a rank ordering of final wealth. This is a
model of the behaviour of mutual fund managers who are paid depending on funds
under management which in turn are largely determined by annual or quarterly rank
orderings. In this model fund managers can elect either to pick stocks or to use a
market tracking strategy. In equilibrium the final distribution of rewards will have a
negative skew. We explore how this distribution depends on the number of players,
the probability of success when gambling, the structure of the rewards, and on
information regarding the other players' performance.
- Convergent learning algorithms for potential games with unknown noisy rewards ( 471.6 KB)
Archie C. Chapman David S. Leslie Alex Rogers Nicholas R. Jennings
Abstract: In this paper, we address the problem of convergence to Nash equilibria in games with rewards that are initially unknown and which must be estimated over time from noisy observations. These games arise in many real-world applications, whenever rewards for actions cannot be prespecified and must be learned on-line. Standard results in game theory, however, do not consider such settings. Specifically, using results from stochastic approximation and differential inclusions, we prove the convergence of variants of fictitious play and adaptive play to Nash equilibria in potential games and weakly acyclic games, respectively. These variants all use a multi-agent version of Q-learning to estimate the reward functions and a novel form of the ε-greedy decision rule to select an action. Furthermore, we derive ε-greedy decision rules that exploit the sparse interaction structure encoded in two compact graphical representations of games, known as graphical and hypergraphical normal form, to improve the convergence rate of the learning algorithms. The structure captured in these representations naturally occurs in many distributed optimisation and control applications. Finally, we demonstrate the efficacy of the algorithms in a simulated ad hoc wireless sensor network management problem.
Keywords: Game theory, distributed optimisation, learning in games.
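The ε-greedy decision rule at the heart of these algorithms can be sketched in a few lines; the Q-values below are hypothetical reward estimates, and this sketch omits the paper's exploitation of graphical game structure:

```python
import random

def epsilon_greedy(q_values, epsilon=0.1):
    """ε-greedy action selection: with probability ε explore uniformly at
    random, otherwise pick the action with the highest estimated reward."""
    if random.random() < epsilon:
        return random.randrange(len(q_values))
    return max(range(len(q_values)), key=lambda a: q_values[a])

# Hypothetical Q-learning estimates for three actions
action = epsilon_greedy([0.1, 0.9, 0.3], epsilon=0.1)
```

In the paper's setting, the Q-values themselves are learned on-line from noisy reward observations, and ε is typically decayed over time so that play converges.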
- Forecast combination for discrete choice models: predicting FOMC monetary policy decisions ( 243.0 KB)
Laurent Pauwels Andrey Vasnev
Abstract: This paper provides a methodology for combining forecasts based on several
discrete choice models. This is achieved primarily by combining the one-step-ahead
probability forecasts associated with each model. The paper applies well-established
scoring rules for qualitative response models in the context of forecast combination.
Log-scores and quadratic-scores are both used to evaluate the forecasting accuracy
of each model and to combine the probability forecasts. In addition to producing point
forecasts, the effect of sampling variation is also assessed. This methodology is
applied to forecast the US Federal Open Market Committee (FOMC) decisions in
changing the federal funds target rate. Several of the economic fundamentals
influencing the FOMC decisions are nonstationary over time and are modelled in a
similar fashion to Hu and Phillips (2004a, JoE). The empirical results show that
combining forecasted probabilities using scores mostly outperforms both equal
weight combination and forecasts based on multivariate models.
Keywords: Forecast combination, Probability forecast, Discrete choice models, Monetary policy decisions
- Supply Function Equilibria Always Exist ( 238.4 KB)
Abstract: Supply function equilibria are used in the analysis of divisible good auctions with a large number of identical objects to be sold or bought. An important example occurs in wholesale electricity markets. Despite the substantial literature on supply function equilibria, the existence of a pure strategy Nash equilibrium for a uniform price auction in asymmetric cases has not been established in a general setting. In this paper we prove the existence of a supply function equilibrium for a duopoly with asymmetric firms having convex costs, with decreasing concave demand subject to an additive demand shock, provided the second derivative of the demand function is small enough. The proof is constructive and also gives insight into the structure of the equilibrium solutions.
Keywords: Wholesale electricity markets; divisible good auctions; supply functions; existence of equilibria.
- Bayesian Forecasting for Financial Risk Management, Pre and Post the Global Financial Crisis ( 867.0 KB)
Richard Gerlach Cathy WS Chen Edward MH Lin Wcw Lee
Abstract: Value-at-Risk (VaR) forecasting via a computational Bayesian framework is
considered. A range of parametric models are compared, including standard,
threshold nonlinear and Markov switching GARCH specifications, plus standard and
nonlinear stochastic volatility models, most considering four error probability
distributions: Gaussian, Student-t, skewed-t and generalized error distribution.
Adaptive Markov chain Monte Carlo methods are employed in estimation and
forecasting. A portfolio of four Asia-Pacific stock markets is considered. Two
forecasting periods are evaluated in light of the recent global financial crisis. Results
reveal that: (i) GARCH models out-performed stochastic volatility models in almost all
cases; (ii) asymmetric volatility models were clearly favoured pre-crisis; while at the
1% level during and post-crisis, for a 1 day horizon, models with skewed-t errors
ranked best, while IGARCH models were favoured at the 5% level; (iii) all models
forecasted VaR less accurately and anti-conservatively post-crisis.
Keywords: EGARCH model; generalized error distribution; Markov chain Monte Carlo method; Value-at-Risk; Skewed Student-t; market risk charge; global financial crisis.
- The Two-sided Weibull Distribution and Forecasting Financial Tail Risk ( 489.7 KB)
Richard Gerlach Qian Chen
Abstract: A two-sided Weibull is developed to model the conditional financial return distribution, for the purpose of forecasting Value at Risk (VaR) and conditional VaR. A range of conditional return distributions are combined with four volatility specifications to forecast tail risk in four international markets, two exchange rates and one individual asset series, over a four year forecast period that includes the recent global financial crisis. The two-sided Weibull performs at least as well as other distributions for VaR forecasting, but performs most favourably for conditional Value at Risk forecasting, prior to as well as during and after the recent crisis.
Keywords: Two-sided Weibull, Value-at-Risk, Expected shortfall, Back-testing, global financial crisis, volatility.
- Margining Option Portfolios by Network Flows ( 396.4 KB)
D. Matsypura V.G. Timkovsky
Abstract: As shown in [Rudd and Schroeder, 1982], the problem of margining option portfolios where option spreads with two legs are used for offsetting can be solved in polynomial time by network flow algorithms. However, spreads with only two legs do not provide sufficient accuracy in measuring risk. Therefore, margining practice also employs spreads with three and four legs. A polynomial time solution to the extension of the problem where option spreads with three and four legs are also used for offsetting is not known. In this paper we propose a heuristic network flow algorithm for this extension and present a computational study that demonstrates the high efficiency of this algorithm in margining practice.
- Combinatorics of Option Spreads: The Margining Aspect ( 408.5 KB)
D. Matsypura V.G. Timkovsky
Abstract: In December 2005, the U.S. Securities and Exchange Commission approved margin
rules for complex option spreads with 5, 6, 7, 8, 9, 10 and 12 legs. Only option
spreads with 2, 3 or 4 legs were recognized before. Taking advantage of option
spreads with a large number of legs substantially reduces margin requirements and,
at the same time, adequately estimates risk for margin accounts with positions in
options. In this paper we present combinatorial models for known and newly
discovered option spreads with up to 134 legs. We propose their full characterization
in terms of matchings, alternating cycles and chains in graphs with bicolored edges.
We show that the combinatorial analysis of option spreads reveals powerful hedging
mechanisms in the structure of margin accounts, and that the problem of minimizing
the margin requirement for a portfolio of option spreads can be solved in polynomial
time using network flow algorithms. We also give recommendations on how to create
more efficient margin rules for options.
- Portfolio Margining: Strategy vs Risk ( 449.2 KB)
E.G. Coffman, JR D. Matsypura V.G. Timkovsky
Abstract: This paper presents the results of a novel mathematical and experimental analysis of two approaches to margining customer accounts, strategy-based and risk-based. Building combinatorial models of the hedging mechanisms of these approaches, we show that the strategy-based approach is, at this point, the most appropriate one for margining security portfolios in customer margin accounts, while the risk-based approach can work efficiently only for margining index portfolios in customer margin accounts and inventory portfolios of brokers. We also show that applying the risk-based approach to security portfolios in customer margin accounts is very risky and can result in a pyramid of debt in a bullish market and a pyramid of loss in a bearish market. The results of this paper support the thesis that the use of the risk-based approach to margining customer accounts with positions in stocks and stock options since April 2007 influenced and triggered the U.S. stock market crash in October 2008. We also provide recommendations on ways to set appropriate margin requirements to help avoid such failures in the future.
- Estimating Value At Risk ( 343.9 KB)
Zudi Lu Hai Huang Richard Gerlach
Abstract: Significantly driven by JP Morgan's RiskMetrics system with its EWMA (exponentially weighted moving average) forecasting technique, value-at-risk (VaR) has become a popular measure of the degree of various risks in financial risk management. In this paper we propose a new approach, termed skewed-EWMA, to forecast changing volatility and formulate an adaptively efficient procedure to estimate VaR. Unlike JP Morgan's standard-EWMA, which is derived from a Gaussian distribution, and Guermat and Harris's (2001) robust-EWMA, derived from a Laplace distribution, we motivate and derive our skewed-EWMA procedure from an asymmetric Laplace distribution, which accounts for both skewness and heavy tails in the return distribution and their time-varying nature in practice. An EWMA-based procedure that adaptively adjusts the shape parameter controlling the skewness and kurtosis of the distribution is suggested. Backtesting results show that our proposed skewed-EWMA method offers a viable improvement in forecasting VaR.
Keywords: Asymmetric Laplace distribution, Exponentially weighted moving average (EWMA), forecasting, Skewed EWMA, Skewness and heavy tails, Time-varying shape parameter, Value-at-risk (VaR).
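The standard-EWMA recursion underlying RiskMetrics, which the paper's skewed-EWMA generalises, can be sketched as follows. The Gaussian 1% quantile used here (z ≈ -2.326) corresponds to the standard-EWMA case, not the asymmetric Laplace quantile the paper derives, and the returns are made-up numbers:

```python
import math

def ewma_variance(returns, lam=0.94):
    """RiskMetrics-style recursion: sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_t^2."""
    var = returns[0] ** 2  # simple initialisation from the first observation
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return var

# Hypothetical daily returns
returns = [0.001, -0.012, 0.004, -0.007, 0.009, -0.015]
sigma = math.sqrt(ewma_variance(returns))
# One-step-ahead 1% VaR under a Gaussian assumption, reported as a positive loss
var_1pct = 2.326 * sigma
```

The skewed-EWMA approach keeps this recursive structure but replaces the Gaussian quantile with one from an asymmetric Laplace distribution whose shape parameter is itself updated adaptively.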
- Mixed strategies in discriminatory divisible-good auctions (UPDATED) ( 578.9 KB)
E.J. Anderson P.Holmberg A.B. Philpott
Abstract: Using the concept of market-distribution functions, we derive general optimality conditions for discriminatory divisible-good auctions, which are also applicable to Bertrand games and non-linear pricing. We introduce the concept of offer distribution function to analyze randomized offer curves, and characterize mixed-strategy Nash equilibria for pay-as-bid auctions where demand is uncertain and costs are common knowledge; a setting for which pure-strategy supply function equilibria typically do not exist. We generalize previous results on mixtures over horizontal offers as in Bertrand-Edgeworth games, but more importantly we characterize novel mixtures over partly increasing supply functions.
Keywords: Pay-as-bid auction, divisible good auction, mixed strategy equilibria, wholesale electricity markets
- Survival Analysis for Credit Scoring: Incidence and Latency ( 950.9 KB)
John Watkins Andrey Vasnev Richard Gerlach
Abstract: Duration analysis is an analytical tool for time-to-event data that has been borrowed from medicine and engineering and applied by econometricians to investigate typical economic and finance problems. In applications to credit data, times to pre-determined maturity events have been treated as censored observations for the events with stochastic latency. A methodology, motivated by the cure rate model framework, is developed in this paper to appropriately analyse a set of mutually exclusive terminal events where at least one event may have a pre-determined latency. The methodology is applied to a set of personal loan data provided by one of Australia's largest financial services institutions. This is the first framework to simultaneously model prepayment, write-off and maturity events for loans. Furthermore, in the class of cure rate models it is the first fully parametric multinomial model and the first to accommodate an event with pre-determined latency. The simulation study found this model performed better than the two most common applications of survival analysis to credit data. In addition, the result of the application to personal loans data reveals that particular explanatory variables can act in different directions upon incidence and latency of an event, and that variables exist that may be statistically significant in explaining only incidence or latency.
- Bayesian time-varying quantile forecasting for Value-at-Risk in financial markets ( 322.9 KB)
Richard Gerlach; Cathy W.S. Chen; Nancy Y. C. Chan
Abstract: Recently, Bayesian solutions to the quantile regression problem, via the likelihood of a Skewed-Laplace distribution, have been proposed. These approaches are extended and applied to a family of dynamic conditional autoregressive quantile models. Popular Value at Risk models, used for risk management in finance, are extended to this fully nonlinear family. An adaptive Markov chain Monte Carlo sampling scheme is adapted for estimation and inference. Simulation studies illustrate favourable performance, compared to the standard numerical optimization of the usual nonparametric quantile criterion function, in finite samples. An empirical study generating Value at Risk forecasts for ten major financial stock indices finds significant nonlinearity in dynamic quantiles and evidence favouring the proposed model family, for lower level quantiles, compared to a range of standard parametric volatility models, a semi-parametric smoothly mixing regression and some nonparametric risk measures in the literature.
Keywords: CAViaR model; Asymmetric; Skew-Laplace distribution; Value-at-Risk; GARCH; Regression quantile.