
2013 Seminars

8th Feb 2013 - 11:00 am

Speaker:

Associate Professor Richard Gerlach,

Affiliation:

Discipline of Business Analytics, University of Sydney

Venue:

Room 498, Merewether Building (H04)

Title:

Bayesian Semi-parametric Expected Shortfall Forecasting in Financial Markets.

Abstract:

Bayesian methods have proven effective for quantile estimation, including for financial Value at Risk forecasting. Expected shortfall is a competing tail risk measure, now favoured by the Basel Committee, that involves a conditional expectation and has recently been semi-parametrically estimated via asymmetric least squares. An asymmetric Gaussian density is proposed, allowing a likelihood to be developed that leads to Bayesian semi-parametric estimation and forecasts of expectiles and expected shortfall. Further, the conditional autoregressive expectile (CARE) class of model is generalised to two fully nonlinear families. Adaptive Markov chain Monte Carlo sampling schemes are employed for estimation. The proposed models are favoured in an empirical study forecasting eight financial return series: evidence of more accurate expected shortfall forecasting, compared to a range of competing methods, is found, while Bayesian-estimated models tend to be more accurate. However, during the recent financial crisis period, most models perform badly, while two existing methods perform best.
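As a rough illustration of the expectile idea behind this talk, here is a minimal sketch of computing a single unconditional expectile by asymmetric least squares; it is a toy fixed-point calculation in numpy, not the Bayesian semi-parametric procedure described above, and the tau level and simulated returns are assumptions.

```python
import numpy as np

def expectile(returns, tau=0.05, tol=1e-10, max_iter=1000):
    """Unconditional tau-expectile via iteratively reweighted least squares.

    The tau-expectile m minimises E[|tau - 1{r < m}| * (r - m)^2],
    the asymmetric least squares analogue of a quantile.
    """
    m = returns.mean()                                # start at the mean (tau = 0.5 expectile)
    for _ in range(max_iter):
        w = np.where(returns < m, 1.0 - tau, tau)     # asymmetric weights
        m_new = np.sum(w * returns) / np.sum(w)       # weighted-mean fixed point
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    r = rng.standard_t(df=5, size=10_000) * 0.01      # toy daily returns
    print("5% expectile:", expectile(r, 0.05))
```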

27th Feb 2013 - 11:00 am

Speaker:

Professor Helmut Lütkepohl,

Affiliation:

Department of Economics, Freie Universität Berlin and DIW Berlin

Venue:

Room 498, Merewether Building (H04)

Title:

Identifying Structural Vector Autoregressions via Changes in Volatility.

Abstract:

Identification of shocks of interest is a central problem in structural vector autoregressive (SVAR) modelling. Identification is often achieved by imposing restrictions on the impact or long-run effects of shocks or by considering sign restrictions for the impulse responses. In a number of articles, changes in the volatility of the shocks have also been used for identification. The present study focuses on the latter device. Some possible setups for identification via heteroskedasticity are reviewed, and their potential and limitations are discussed. Two detailed examples are considered to illustrate the approach.
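A small numerical sketch of the core identification-by-heteroskedasticity step in the two-regime case: the impact matrix is recovered (up to column sign and ordering) by jointly diagonalising the two reduced-form covariance matrices. The two-regime setup and all numbers are illustrative assumptions, not the paper's exact framework.

```python
import numpy as np
from scipy.linalg import eigh

def identify_by_volatility(Sigma1, Sigma2):
    """Recover B with Sigma1 = B B' and Sigma2 = B diag(lam) B'.

    Textbook two-regime identification by heteroskedasticity: B is unique up
    to column sign and ordering provided the lam_i are distinct.
    """
    lam, V = eigh(Sigma2, Sigma1)      # generalized eigenproblem, V' Sigma1 V = I
    B = np.linalg.inv(V.T)
    return B, lam

if __name__ == "__main__":
    B_true = np.array([[1.0, 0.0], [0.5, 1.0]])
    lam_true = np.array([1.0, 4.0])                # relative variance change in regime 2
    Sigma1 = B_true @ B_true.T
    Sigma2 = B_true @ np.diag(lam_true) @ B_true.T
    B_hat, lam_hat = identify_by_volatility(Sigma1, Sigma2)
    print(lam_hat)     # recovers {1, 4}
    print(B_hat)       # B_true up to column sign and ordering
```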

1st Mar 2013 - 11:00 am

Speaker:

Professor Masayuki Hirukawa,

Affiliation:

Faculty of Economics; Setsunan University

Venue:

Room 498

Title:

Family of Generalized Gamma Kernels: A Unified Approach to the Asymptotics on Asymmetric Kernels

Abstract:

Unlike for symmetric kernels, the asymptotics of asymmetric kernels have so far relied on kernel-specific arguments. Toward a unified approach to their asymptotics, this paper proposes a generic form of asymmetric kernels that satisfies a set of common conditions. The generic kernel, called the family of Generalized Gamma kernels, is built on the Generalized Gamma density function and incorporates the Modified Gamma kernel as a special case. As other special cases, two new kernels, namely the Weibull and Nakagami-m kernels, are also proposed. The density estimator using Generalized Gamma kernels is shown to preserve the appealing properties that the Gamma and Modified Gamma kernels possess. Furthermore, this paper investigates three extensions of the density estimation, including multiplicative bias correction.
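To fix ideas on asymmetric kernels, a minimal sketch of a Gamma-type kernel density estimator on the nonnegative half-line (in the spirit of Chen's Gamma kernel, which the Generalized Gamma family nests); the bandwidth and data are placeholder assumptions.

```python
import numpy as np
from scipy.stats import gamma

def gamma_kde(x_grid, data, b):
    """Gamma kernel density estimate on [0, inf).

    For each evaluation point x the kernel is a Gamma density in the data with
    shape x/b + 1 and scale b, so the kernel adapts its shape near the boundary
    instead of spilling mass onto the negative axis.
    """
    x_grid = np.asarray(x_grid, dtype=float)
    est = np.empty_like(x_grid)
    for i, x in enumerate(x_grid):
        est[i] = gamma.pdf(data, a=x / b + 1.0, scale=b).mean()
    return est

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    data = rng.exponential(scale=1.0, size=2000)
    xs = np.linspace(0.0, 5.0, 6)
    print(gamma_kde(xs, data, b=0.1))   # compare with exp(-x)
```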

4th Mar 2013 - 10:30 am

Speaker:

Professor Vinod Singhal,

Affiliation:

Brady Family Professor of Operations Management; Scheller College of Business, Georgia Institute of Technology

Venue:

Room 498 Merewether Building H04

Title:

Supply Chain Risks and Financial Performance: Evidence from Demand-Supply Mismatches

Abstract:

This talk will present empirical evidence on the effect of supply chain risks on financial performance. Financial performance is measured using metrics related to shareholder value, share price volatility, and profitability. It will compare and contrast the corporate performance effects of three different types of supply chain risks: supply chain disruptions, product introduction delays, and excess inventory. The implications of these results for making the business case for supply chain initiatives and for justifying investments in technologies and solutions that mitigate supply chain risks will be discussed.

8th Mar 2013 - 11:00 am

Speaker:

Assistant Professor Saraswata Chaudhuri,

Affiliation:

Department of Economics, University of North Carolina at Chapel Hill (authors: Saraswata Chaudhuri and David Guilkey)

Venue:

Room 498

Title:

GMM with Multiple Missing Variables

Abstract:

We consider estimation of a finite dimensional unknown parameter value defined by a set of overidentifying unconditional moment restrictions. Random variables forming the different elements of the moment vector can be missing at random -- jointly or individually -- for some sample units, thus rendering the corresponding elements of the moment vector infeasible. We obtain the semiparametric efficiency bound under this setup. We recommend semiparametric estimators that utilize the available information optimally and hence have asymptotic variances equal to the efficiency bound. A small-scale Monte Carlo experiment provides evidence that these semiparametric estimators perform better than the existing estimators even in relatively small samples. An empirical example studying the relationship between a child's years of schooling and number of siblings, based on data from Indonesia, is provided for the purpose of illustration.

15th Mar 2013 - 11:00 am

Speaker:

Associate Professor William McCausland,

Affiliation:

Université de Montréal

Venue:

Room 498 Merewether Building H04

Title:

Prior Distributions for Random Choice Structures - William J. McCausland and A. A. J. Marley

Abstract:

We study various axioms of discrete probabilistic choice, measuring how restrictive they are, both alone and in the presence of other axioms. We do this by formulating a class of prior distributions over the set of random choice structures and using Monte Carlo simulation to compute, for a range of prior distributions, probabilities that various simple and compound axioms hold. For example, the probability of the triangle inequality is usually many orders of magnitude higher than the probability of random utility. For most pairs of axioms we study, the probability that both hold is consistently greater than the product of their marginal probabilities. When one of the two axioms is Sattath and Tversky's (1976) multiplicative inequality and the other is the triangle inequality or one of the variants of stochastic transitivity, the joint probability is consistently less than the product of the marginals. In this sense, the multiplicative inequality is complementary to these other axioms. The reciprocal of the prior probability that an axiom holds is an upper bound on the Bayes factor in favour of a restricted model, in which the axiom holds, against an unrestricted model. The high prior probability of the triangle inequality limits the degree of support data from a single decision maker can provide in its favour.
Keywords: Bayesian inference, choice axioms, discrete choice, probabilistic choice, Luce's challenge, random utility.
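A minimal sketch of the kind of Monte Carlo calculation described in the abstract, using a uniform prior over binary choice probabilities on three alternatives and one standard statement of the triangle inequality, P(x,z) <= P(x,y) + P(y,z); the prior and the check are illustrative assumptions, not the authors' class of priors.

```python
import numpy as np
from itertools import permutations

def triangle_inequality_prob(n_draws=100_000, seed=3):
    """Prior probability that the triangle inequality holds for 3 alternatives.

    Binary choice probabilities p_ab, p_bc, p_ac are drawn i.i.d. Uniform(0,1)
    (with p_ba = 1 - p_ab, etc.), and P(x,z) <= P(x,y) + P(y,z) is checked for
    every ordered triple of distinct alternatives.
    """
    rng = np.random.default_rng(seed)
    draws = rng.uniform(size=(n_draws, 3))
    p = {}
    for (a, b), col in zip([("a", "b"), ("b", "c"), ("a", "c")], draws.T):
        p[(a, b)] = col
        p[(b, a)] = 1.0 - col
    ok = np.ones(n_draws, dtype=bool)
    for x, y, z in permutations("abc", 3):
        ok &= p[(x, z)] <= p[(x, y)] + p[(y, z)]
    return ok.mean()

if __name__ == "__main__":
    print("Prior probability the triangle inequality holds:", triangle_inequality_prob())
```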

12th Apr 2013 - 11:00 am

Speaker:

Professor George Steiner,

Affiliation:

McMaster University

Venue:

Room 498 Merewether Building H04

Title:

Revised delivery-time quotation in scheduling with outsourcing and tardiness penalties

Abstract:

There are many situations in supply chain scheduling when the supplier finds it impossible to meet the promised due dates for some orders. We present a model for the rescheduling of orders with simultaneous assignment of attainable revised due dates to minimize due date escalation and tardiness penalties for the supplier. The model can also be used to determine which orders should be outsourced if this option is available. We show that the problem is equivalent to minimizing the total tardiness with rejection with respect to the original due dates. We prove that the problem is NP-hard and present a pseudopolynomial algorithm for it. We also present a fully polynomial time approximation scheme for the problem.

19th Apr 2013 - 11:00 am

Speaker:

Prof. John Geweke,

Affiliation:

UTS Business School

Venue:

Room 498 Merewether Building H04

Title:

Adaptive Sequential Posterior Simulators for Massively Parallel Computing Environments

Abstract:

Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. Sequential Monte Carlo comes very close to this ideal, whereas other approaches like Markov chain Monte Carlo do not. This paper presents a sequential posterior simulator well suited to this computing environment. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and that works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm is robust to pathological posterior distributions, generates accurate marginal likelihood approximations, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
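For orientation, a stripped-down data-tempered sequential Monte Carlo posterior simulator for a toy normal-mean model, showing the reweight / resample / Metropolis-move cycle whose particle operations parallelise naturally; it is a generic sketch under simple assumptions, not the adaptive algorithm of the paper.

```python
import numpy as np

def smc_normal_mean(y, n_particles=2000, batch=20, seed=4):
    """Data-tempered SMC for y_i ~ N(theta, 1) with prior theta ~ N(0, 10^2).

    Particles are reweighted batch by batch (correction), resampled, and
    refreshed with a random-walk Metropolis move targeting the current partial
    posterior.  Every particle operation is embarrassingly parallel.
    """
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 10.0, size=n_particles)            # draws from the prior
    logw = np.zeros(n_particles)

    def loglik(th, data):
        return -0.5 * ((data[:, None] - th[None, :]) ** 2).sum(axis=0)

    seen = np.empty(0)
    for start in range(0, len(y), batch):
        new = y[start:start + batch]
        logw += loglik(theta, new)                              # C: reweight by new data
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)    # S: multinomial resampling
        theta, logw = theta[idx], np.zeros(n_particles)
        seen = np.concatenate([seen, new])

        def logpost(th):                                        # partial posterior so far
            return -0.5 * (th / 10.0) ** 2 + loglik(th, seen)

        step = theta.std() + 1e-8
        prop = theta + rng.normal(0.0, step, size=n_particles)  # M: RW Metropolis refresh
        accept = np.log(rng.uniform(size=n_particles)) < logpost(prop) - logpost(theta)
        theta = np.where(accept, prop, theta)
    return theta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.normal(1.5, 1.0, size=200)
    draws = smc_normal_mean(y)
    print("posterior mean of theta is approximately", draws.mean())
```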

3rd May 2013 - 03:00 pm

Speaker:

Professor Rodney Strachan,

Affiliation:

Australian National University

Venue:

Room 498 Merewether Building H04

Title:

Invariant Inference and Efficient Computation in the Static Factor Model

Abstract:

Factor models are used in a wide range of areas. Two issues with Bayesian versions of these models are a lack of invariance to the ordering of the variables and computational inefficiency. This paper develops invariant and efficient Bayesian methods for estimating static factor models. This approach leads to inference on the number of factors that does not depend upon the ordering of the variables, and we provide arguments to explain this invariance. Beginning from identified parameters in which no ordering is imposed, we use parameter expansions to obtain a specification with standard conditional posteriors. Identifying restrictions that are commonly employed result in interpretable factors or loadings and, using our approach, these can be imposed ex post. This allows us to investigate several alternative identifying schemes without the need to respecify and resample the model. We show significant gains in computational efficiency. We apply our methods to a simple example using a macroeconomic dataset.

17th May 2013 - 11:00 am

Speaker:

Dr Laurent Pauwels,

Affiliation:

Discipline of Business Analytics, University of Sydney Business School

Venue:

Room 498 Merewether Building H04

Title:

Stock market margin requirements and the 2008 financial crisis

Abstract:

This paper presents a case study on how a substantial decrease in margin requirements for equities, equity options, unlisted derivatives and NBI futures made it easier for investors to borrow and take a leveraged position in equities. From 2005 until 2007, the Securities and Exchange Commission embarked on a policy change which may have triggered a speculative bubble in the process. This paper hypothesises that the change in margin requirements led to the formation of a speculative bubble in margin debt across the U.S. stock market in the lead-up to the 2008 financial crisis. In order to investigate such a bubble, this paper employs recursive test procedures for detecting explosive behaviour. The recursive methodology relies on modifications of unit-root and structural change tests commonly used in the empirical economics and finance literature. Furthermore, the tests can also identify the origination and collapse dates of the bubble. There is evidence of a bubble in margin debt forming in the second half of 2006 and collapsing after September 2008 with the global financial meltdown. Our findings imply that the new margin requirements led to the formation of a speculative bubble and will continue doing so unless adjusted appropriately.
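A minimal sketch of a forward-recursive (sup-type) ADF calculation of the kind the recursive methodology builds on, using statsmodels on simulated data; the window length, lag choice and simulated series are assumptions, and right-tail critical values are not computed here.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def sup_adf(series, min_window=40):
    """Forward-recursive ADF statistics on expanding windows.

    Explosive behaviour is flagged when the supremum of the right-tailed ADF
    statistics exceeds a right-tail critical value (obtained by simulation in
    the literature); the arg-max window end gives a rough origination date.
    """
    stats = []
    for end in range(min_window, len(series) + 1):
        stat = adfuller(series[:end], regression="c", autolag=None, maxlag=1)[0]
        stats.append(stat)
    stats = np.array(stats)
    return stats.max(), min_window + int(stats.argmax()) - 1

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    rw = np.cumsum(rng.normal(size=150))                    # unit-root segment
    bubble = [rw[-1]]
    for _ in range(50):
        bubble.append(1.03 * bubble[-1] + rng.normal())     # mildly explosive segment
    x = np.concatenate([rw, bubble[1:]])
    print(sup_adf(x))
```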

24th May 2013 - 11:00 am

Speaker:

Associate Professor Oleg Prokopyev,

Affiliation:

University of Pittsburgh

Venue:

Room 498 Merewether Building H04

Title:

Optimal Implantable Cardioverter Defibrillator (ICD) Generator Replacement

Abstract:

Implantable Cardioverter Defibrillators (ICDs) include small, battery-powered generators, the longevity of which depends on a patient's rate of consumption. Generator replacement, however, involves risks. Hence, a trade-off exists between prematurely exposing the patient to these risks and allowing for the possibility that the device is unable to deliver therapy when needed. Currently, replacements are performed using a one-size-fits-all approach. Here, we develop a Markov decision process model to determine patient-specific optimal replacement policies as a function of patient age and the remaining battery capacity. We analytically establish that the optimal policy is of threshold type in the remaining capacity, but not necessarily in patient age. We conduct a large computational study that suggests that under the optimal policy, patients see a considerable decrease in the total expected number of replacements, while achieving the same or greater expected lifetime.
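As a toy illustration of the replace-or-wait trade-off an MDP formulation captures, a small value-iteration sketch over remaining battery capacity; every state, probability and cost below is invented for illustration and has no connection to the clinical model in the talk.

```python
import numpy as np

def replacement_policy(max_cap=10, demand_probs=(0.90, 0.08, 0.02),
                       failure_penalty=100.0, replace_cost=7.0, gamma=0.97):
    """Value iteration for a stylised replace-or-wait battery problem.

    State = remaining capacity.  Each period a therapy demand of 0, 1 or 2
    units occurs with the given probabilities; a demand exceeding the
    remaining capacity incurs failure_penalty.  Replacing restores full
    capacity at replace_cost.  All numbers are invented for illustration.
    """
    n = max_cap + 1
    V = np.zeros(n)

    def q_values(V, c):
        replace = replace_cost + gamma * V[max_cap]
        wait = 0.0
        for d, p in enumerate(demand_probs):
            if d > c:
                wait += p * (failure_penalty + gamma * V[0])
            else:
                wait += p * gamma * V[c - d]
        return replace, wait

    for _ in range(1000):
        V_new = np.array([min(q_values(V, c)) for c in range(n)])
        if np.max(np.abs(V_new - V)) < 1e-9:
            break
        V = V_new

    return ["replace" if q_values(V, c)[0] < q_values(V, c)[1] else "wait"
            for c in range(n)]

if __name__ == "__main__":
    # Threshold structure in remaining capacity: replace at low capacities only.
    print(replacement_policy())
```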

7th Jun 2013 - 03:30 pm

Speaker:

Professor Eric Renault,

Affiliation:

Brown University

Venue:

Room 498 Merewether Building H04

Title:

Affine Option Pricing Model in Discrete Time

Abstract:

We propose an extension with leverage effect of the discrete time stochastic volatility model of Darolles et al. (2006). This extension is shown to be the natural discrete time analog of the Heston (1993) option pricing model. It shares with Heston (1993) the advantage of a structure-preserving change of measure: with an exponentially affine stochastic discount factor, the historical and the risk neutral models belong to the same family of joint probability distributions for the return and volatility processes. This allows computing option prices in semi-closed form through the Fourier transform. The discrete time approach has several advantages. First, it allows relaxing the constraints on higher order moments implied by the specification of a diffusion process. Second, it makes more transparent the role of various parameters: leverage versus volatility feedback effect, connection with daily realized volatility measures based on high-frequency intraday returns, closed-form formulas for affine dynamics of the first two moments of return and volatility that are robust to temporal aggregation, impact of leverage on the volatility smile, etc. This sheds some new light on the identification of the various risk premium parameters. An empirical illustration is provided.

Keywords: stochastic volatility; leverage; option pricing; equity risk premium; volatility risk premium


2nd Aug 2013 - 11:00 am

Speaker:

Professor Jiti Gao,

Affiliation:

Department of Econometrics and Business Statistics; Monash University

Venue:

Room 498 Merewether Bldg H04

Title:

Estimation and Specification in Nonstationary Time Series with Endogeneity

Abstract:

This presentation gives a survey of some recent developments on estimation and model specification for nonlinear and nonstationary time series. The talk focuses on the use of non-parametric and semi-parametric models to deal with a class of time series models that allow for nonlinearity, nonstationarity and endogeneity. Applications in economics and finance are discussed.

9th Aug 2013 - 11:00 am

Speaker:

Dr. Quan Gan,

Affiliation:

Discipline of Finance; The University of Sydney

Venue:

Room 498, Merewether Bldg H04

Title:

Portfolio Selection with Skew Normal Asset Returns

Abstract:

This paper examines the portfolio selection problem with skew normal asset returns. By exploring an alternative parameterization of the multivariate skew normal distribution of Azzalini & Dalla Valle (1996), I show that the multivariate skew normal distribution is a special case of Simaan (1993)'s three-parameter model. All of Simaan (1993)'s results are applicable to skew normal asset returns. The three-parameter efficient frontier is spanned by three funds, which include two funds from mean-variance portfolio selection. Combining skew normal asset returns with CARA utility, I obtain the closed-form certainty equivalent and skewness premium. I show that the skewness premium is positive (negative) when asset returns have negative (positive) skewness. The magnitude of the skewness premium is increasing in market risk aversion. I use the skew normal certainty equivalent to evaluate the economic value of incorporating higher moments in portfolio selection. I find that when investors face broad investment opportunities, the economic value of considering higher moments is negligible under realistic margin requirements.


Keywords: Portfolio selection, Skew normal, Certainty equivalent, Skewness premium
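As a small numerical companion to the certainty-equivalent discussion, a Monte Carlo sketch that draws skew normal returns with scipy and evaluates the CARA certainty equivalent CE = -(1/gamma) log E[exp(-gamma R)]; the parameters are placeholders and the paper's closed-form expressions are not reproduced.

```python
import numpy as np
from scipy.stats import skewnorm

def cara_certainty_equivalent(alpha=-4.0, loc=0.08, scale=0.2,
                              risk_aversion=3.0, n=200_000, seed=6):
    """Monte Carlo CARA certainty equivalents for skew normal vs normal returns.

    alpha < 0 gives negatively skewed returns; the normal benchmark matches the
    simulated mean and variance, so the CE gap is a crude skewness-premium proxy.
    """
    rng = np.random.default_rng(seed)
    r_skew = skewnorm.rvs(alpha, loc=loc, scale=scale, size=n, random_state=rng)
    r_norm = rng.normal(r_skew.mean(), r_skew.std(), size=n)

    def ce(r):
        return -np.log(np.exp(-risk_aversion * r).mean()) / risk_aversion

    return ce(r_skew), ce(r_norm)

if __name__ == "__main__":
    ce_skew, ce_norm = cara_certainty_equivalent()
    print("CE skew normal:", ce_skew, " CE normal benchmark:", ce_norm)
    print("skewness premium proxy:", ce_norm - ce_skew)   # positive for negative skew
```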

16th Aug 2013 - 11:00 am

Speaker:

Dr Vasilis Sarafidis,

Affiliation:

Department of Econometrics; Monash University

Venue:

Room 498 Merewether Building H04

Title:

Estimation of Correlated Random Coefficient Models for Short Panels with a Multi-Factor Structure

Abstract:

In this paper we develop a methodology that provides a consistent estimator of the average effect in a correlated random coefficient panel data model with cross-sectional dependence when the time dimension is fixed. The problem of identification and estimation is studied without imposing the restriction that T is larger than the number of regressors.  We put forward a pooled GMM estimation approach, which allows certain forms of weak exogeneity or endogeneity.  Finite sample evidence shows that the estimator performs well.

23rd Aug 2013 - 11:00 am

Speaker:

Professor Dvir Shabtay,

Affiliation:

Ben-Gurion University of the Negev, Israel

Venue:

Room 498 Merewether Building H04

Title:

The Resource Dependent Assignment Problem and Its Applications in Scheduling

Abstract:

We extend the classical linear assignment problem to the case where the cost of assigning agent j to task i is the product of task i's cost parameter and a cost function of agent j. The cost function of agent j is either a linear or a convex function of the amount of resource allocated to the agent. A solution to our assignment problem is defined by the assignment of agents to tasks and by a resource allocation to each agent. The quality of a solution is measured by two criteria. The first criterion is the total assignment cost and the second is the total weighted resource consumption. We address these criteria via four different problem variations. For both assignment cost functions (linear and convex) we obtain similar results, although the analysis is completely different. For both functions, we prove that (i) our assignment problem is NP-hard for three of the four variations, even if all the resource consumption weights are equal, and that (ii) the fourth variation is solvable in polynomial time. For the linear assignment cost function, we also provide a pseudo-polynomial time algorithm to solve the NP-hard variations. In addition, we find that our assignment problem is equivalent to a large set of important scheduling problems whose complexity has heretofore been an open question for three of the four variations.
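For the assignment side of the problem, a sketch of the special case in which the resource allocation is already fixed: the costs become c_ij = (task i's cost parameter) x f_j(resource_j), and the assignment itself reduces to a classical linear assignment problem solvable by the Hungarian algorithm (via scipy). The cost function f_j and all numbers are assumptions, and the bicriteria and resource-allocation aspects of the talk are not addressed.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_given_resources(task_cost, base_cost, resource, convex_power=2.0):
    """Optimal agent-task assignment for a fixed resource allocation.

    cost[i, j] = task_cost[i] * f_j(resource[j]) with an illustrative convex
    decreasing cost function f_j(u) = base_cost[j] / u**convex_power.
    """
    f = base_cost / resource ** convex_power
    cost = np.outer(task_cost, f)                 # cost[i, j]
    rows, cols = linear_sum_assignment(cost)      # Hungarian algorithm
    return list(zip(rows, cols)), cost[rows, cols].sum()

if __name__ == "__main__":
    task_cost = np.array([3.0, 1.0, 2.0])
    base_cost = np.array([2.0, 2.5, 1.5])
    resource = np.array([1.0, 2.0, 1.5])          # fixed resource per agent
    print(assign_given_resources(task_cost, base_cost, resource))
```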

30th Aug 2013 - 11:00 am

Speaker:

Dr Yong Song,

Affiliation:

Business School; University of Technology Sydney

Venue:

Room 498 Merewether Building H04

Title:

Infinite Hidden Markov Models with Application to Speculative Bubble Detection

Abstract:

This paper proposes an infinite hidden Markov model (iHMM) to detect, date-stamp, and estimate speculative bubbles. Three features make this new approach attractive to practitioners. First, the iHMM is capable of capturing the nonlinear dynamics of heterogeneous bubble behaviors, as it allows an infinite number of regimes. Second, the implementation of this procedure is straightforward, as the detection, dating, and estimation of bubbles are done simultaneously in a coherent Bayesian framework. Third, the iHMM, by assuming hierarchical structures, is parsimonious and superior in out-of-sample forecasting. This model and its extensions are applied to the price-dividend ratio of the NASDAQ Composite Index from 1973M02 to 2013M01. The in-sample posterior analysis and out-of-sample prediction find evidence of explosive dynamics during the dot-com bubble period. Model comparison shows that the iHMM is strongly supported by the data against the finite hidden Markov model.

11th Sep 2013 - 11:30 am

Speaker:

Professor Robert Kohn,

Affiliation:

Australian School of Business; The University of New South Wales

Venue:

Room 498 Merewether Bldg H04

Title:

Bayesian Inference Using an Unbiased Estimate of the Likelihood

Abstract:

We consider Bayesian inference by importance sampling or by Markov chain Monte Carlo when the likelihood is analytically intractable but can be unbiasedly estimated. When the inference is by importance sampling, we refer to this procedure as importance sampling squared (IS-squared), as we can often estimate the likelihood itself by importance sampling. We provide a formal justification for such inference when working with an estimate of the likelihood and study its convergence properties. We analyse the effect of estimating the likelihood on the resulting inference and provide guidelines on how to set the precision of the likelihood estimate in order to obtain an optimal tradeoff between the computational cost of estimating the likelihood and the accuracy of posterior inference on the model parameters. We illustrate the methodology in empirical applications to stochastic volatility models, nonlinear DSGE models and multinomial panel data models.
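A compact sketch of the mechanism at the heart of this line of work: a Metropolis-Hastings chain that plugs an unbiased, noisy likelihood estimate into the acceptance ratio and still targets the exact posterior. The toy random-effects model, the importance-sampling estimator and the tuning constants are all assumptions for illustration.

```python
import numpy as np

def loglik_hat(mu, y, n_particles, rng):
    """Log of an importance-sampling likelihood estimate for y_i ~ N(z_i, 1),
    z_i ~ N(mu, 1); the estimate itself (product of particle means) is unbiased."""
    z = rng.normal(mu, 1.0, size=(n_particles, y.size))        # draws from p(z | mu)
    like = np.exp(-0.5 * (y[None, :] - z) ** 2) / np.sqrt(2 * np.pi)
    return np.sum(np.log(like.mean(axis=0)))

def pseudo_marginal_mh(y, n_iter=5000, n_particles=50, step=0.2, seed=7):
    rng = np.random.default_rng(seed)
    mu, ll = 0.0, loglik_hat(0.0, y, n_particles, rng)
    draws = np.empty(n_iter)
    for t in range(n_iter):
        prop = mu + step * rng.normal()
        ll_prop = loglik_hat(prop, y, n_particles, rng)
        # flat prior on mu; accept with the estimated-likelihood ratio
        if np.log(rng.uniform()) < ll_prop - ll:
            mu, ll = prop, ll_prop            # carry the estimate with the state
        draws[t] = mu
    return draws

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.normal(1.0, np.sqrt(2.0), size=100)   # marginally y ~ N(mu, 2)
    d = pseudo_marginal_mh(y)
    print("posterior mean of mu is approximately", d[1000:].mean())
```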

20th Sep 2013 - 11:00 am

Speaker:

Associate Professor Nektarios Aslanidis,

Affiliation:

Departament d'Economia; Universitat Rovira i Virgili and CREIP; Spain

Venue:

Room 498 Merewether Bldg

Title:

Quantiles of the Realized Stock-Bond Correlation and Links to the Macroeconomy

Abstract:

This paper adopts quantile regressions to scrutinize the realized stock-bond correlation based upon high frequency returns. The paper provides in-sample and out-of-sample analysis and considers a large number of macro-finance predictors well known from the return predictability literature. Strong in-sample predictability is obtained from quantile models with factor-augmented predictors, particularly at the lower to median quantiles. Out-of-sample, the quantile factor model works best at the median to upper quantiles. Investor sentiment generally does not significantly affect the quantiles of the realized stock-bond correlation.

Keywords: Realized stock-bond correlation; Quantile regressions; Macro-finance variables; Factor analysis; Investor sentiment.

JEL Classifications: C22; G11; G12
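A minimal quantile-regression sketch with statsmodels of the kind of specification used in this line of work, regressing a (simulated) realized correlation on one predictor at several quantile levels; the variable names and data are placeholders, not the paper's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the realized stock-bond correlation and one macro-finance predictor.
rng = np.random.default_rng(8)
n = 500
predictor = rng.normal(size=n)
corr = -0.2 + 0.3 * predictor + rng.standard_t(df=4, size=n) * 0.2
df = pd.DataFrame({"rc": np.clip(corr, -1, 1), "x": predictor})

# Fit the conditional quantiles of the realized correlation at several levels.
for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("rc ~ x", df).fit(q=q)
    print(f"q={q}: intercept={fit.params['Intercept']:.3f}, slope={fit.params['x']:.3f}")
```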

10th Oct 2013 - 11:30 am

Speaker:

Professor Rob Hyndman,

Affiliation:

Department of Econometrics and Business Statistics; Monash University

Venue:

Room 498 Merewether Bldg H04

Title:

Forecasting Hierarchical Time Series

Abstract:

Hierarchical time series occur when there are multiple time series that are hierarchically organized and can be aggregated at several different levels in groups based on dimensions such as product, geography, or other features. A common application occurs in manufacturing, where forecasts of sales need to be made for a range of different products in different locations. The forecasts need to add up appropriately across the levels of the hierarchy.

Historically, forecasting of hierarchical time series has been done using either the "bottom-up" method, various "top-down" methods, or some combination of the two known as "middle-out" approaches.

I will describe a framework for studying such methods which leads naturally to an optimal combination approach based on a large ill-conditioned regression model.

While the model leads to optimal forecasts, the ill-conditioning and size of the model make computation difficult or impossible. I will describe a solution to this problem that makes the forecasts fast to compute, even for problems involving hundreds of thousands of time series.
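A small numerical sketch of the combination idea for a two-level hierarchy (total = A + B): stack base forecasts for every series and project them onto the column space of the summing matrix by least squares to obtain coherent forecasts. The numbers are placeholders, and the fast large-scale algorithms mentioned in the talk are not shown.

```python
import numpy as np

# Hierarchy: y_total = y_A + y_B.  The summing matrix maps bottom series to all series.
S = np.array([[1, 1],    # total
              [1, 0],    # A
              [0, 1]])   # B

# Independent "base" forecasts for total, A and B -- typically not coherent.
y_hat = np.array([105.0, 40.0, 58.0])             # 40 + 58 != 105

# Optimal-combination (OLS) reconciliation: regress the base forecasts on S,
# then aggregate the fitted bottom-level forecasts back up the hierarchy.
beta = np.linalg.lstsq(S, y_hat, rcond=None)[0]   # reconciled bottom-level forecasts
y_tilde = S @ beta                                # coherent forecasts at all levels

print("bottom level:", beta)      # adds up by construction
print("all levels:  ", y_tilde)   # y_tilde[0] == y_tilde[1] + y_tilde[2]
```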

18th Oct 2013 - 11:00 am

Speaker:

Dr Nuttanan Wichitaksorn,

Affiliation:

Department of Mathematics and Statistics; University of Canterbury, NZ

Venue:

Room 498 Merewether Bldg H04

Title:

The Bayesian Parallel Computation for Intractable Likelihood

Abstract:

Parallel computation is a fast growing computing environment in many areas, including computational Bayesian statistics. However, most Bayesian parallel computing has been implemented through the sequential Monte Carlo method, where model parameters are updated sequentially, which is suitable for some large-scale problems. This talk is the first to revive the use of the adaptive griddy Gibbs (AGG) algorithm under the Markov chain Monte Carlo framework and to show how to implement the AGG using parallel computation. The parallel AGG is suitable when (i) the problem is of small to medium scale, so that the dimension of the model parameter space is not very high, (ii) some or all model parameters are defined on a specific interval, and (iii) the model likelihood is intractable. In addition, the parallel AGG is relatively easy to implement and code. Since the AGG is a Gibbs algorithm in which each model parameter is drawn directly from its conditional posterior density, the model marginal likelihood can be conveniently computed and provided immediately at the end of the posterior simulation. Three examples, including a linear regression model with Student-t errors, a nonlinear regression model, and a financial time series model (GARCH), will be used to illustrate the applicability of the AGG to the parallel computing environment.
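A minimal sketch of a single griddy Gibbs update, the building block behind the AGG algorithm described above: the full conditional is evaluated on a grid (a loop that parallelises trivially), normalised, and sampled by inverse CDF. The toy conditional and grid are assumptions; the adaptive and parallel implementation details of the talk are omitted.

```python
import numpy as np

def griddy_gibbs_step(log_conditional, grid, rng):
    """Draw one value from an (unnormalised) conditional posterior on a grid.

    log_conditional : callable returning the log conditional density at a point
                      (the grid evaluations are independent, hence parallelisable)
    grid            : 1-D array of support points for this parameter
    """
    logp = np.array([log_conditional(g) for g in grid])
    p = np.exp(logp - logp.max())
    cdf = np.cumsum(p) / p.sum()
    return grid[np.searchsorted(cdf, rng.uniform())]

if __name__ == "__main__":
    # Toy conditional: posterior of a mean with a N(0,1) prior and 20 observations.
    rng = np.random.default_rng(9)
    y = rng.normal(0.7, 1.0, size=20)
    log_cond = lambda m: -0.5 * m**2 - 0.5 * np.sum((y - m) ** 2)
    grid = np.linspace(-2, 3, 401)
    draws = [griddy_gibbs_step(log_cond, grid, rng) for _ in range(2000)]
    print("posterior mean is approximately", np.mean(draws))
```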

1st Nov 2013 - 11:00 am

Speaker:

Dr Vitali Alexeev,

Affiliation:

University of Tasmania

Venue:

Room 498 Merewether Building H04

Title:

Equity portfolio diversification with high frequency data

Abstract:

Investors wishing to achieve a particular level of diversification may be misled on how many stocks to hold in a portfolio by assessing the portfolio risk at different data frequencies. High frequency intradaily data provide better estimates of volatility, which translate to more accurate assessment of portfolio risk. Using 5-minute, daily and weekly data on S&P 500 constituents for the period from 2003 to 2011, we find that for an average investor wishing to diversify away 85% (90%) of the risk, equally weighted portfolios of 7 (10) stocks will suffice, irrespective of the data frequency used or the time period considered. However, to assure investors of a desired level of diversification 90% of the time, instead of on average, using low frequency data results in an exaggerated number of stocks in a portfolio when compared with the recommendation based on 5-minute data. This difference is magnified during periods when financial markets are in distress, as much as doubling during the 2007-2009 financial crisis.

Keywords: Portfolio diversification, high frequency, realized variance, realized correlation.
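As a sketch of the underlying computations, the realized covariance from intradaily returns and the equally weighted portfolio variance as the number of stocks grows; the one-factor simulated returns below are placeholders rather than S&P 500 constituent data.

```python
import numpy as np

def realized_cov(intraday_returns):
    """Realized covariance matrix: sum of outer products of intraday returns.

    intraday_returns has shape (n_intervals, n_stocks), e.g. 5-minute returns
    over one day; summing r_t r_t' over the day estimates that day's covariance.
    """
    return intraday_returns.T @ intraday_returns

def equal_weight_risk(cov, ks):
    """Equally weighted portfolio variance using the first k stocks, for each k."""
    out = {}
    for k in ks:
        w = np.full(k, 1.0 / k)
        out[k] = float(w @ cov[:k, :k] @ w)
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    n_int, n_stocks = 78, 30                      # 78 five-minute intervals per day
    common = rng.normal(scale=0.001, size=(n_int, 1))
    r = common + rng.normal(scale=0.002, size=(n_int, n_stocks))   # one common factor
    rc = realized_cov(r)
    print(equal_weight_risk(rc, ks=[1, 5, 10, 20, 30]))   # variance falls, then flattens
```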

5th Dec 2013 - 11:00 am

Speaker:

Dr Georgios Tsiotas,

Affiliation:

University of Crete; Greece

Venue:

Room 498 Merewether Bldg H04

Title:

Loss Functions in Value-at-Risk Estimation

Abstract:

The Value at Risk (VaR) is a risk measure that is widely used by financial institutions to allocate risk. VaR forecast estimation involves the evaluation of conditional quantiles based on the currently available information. Recent advances in VaR evaluation incorporate conditional variance into the quantile estimation, which yields the Conditional Autoregressive VaR (CAViaR) models.

Optimal VaR estimates are typically generated using the so-called "check" loss function.
However, issues such as bias in VaR estimation and asymmetric financial decision making, based on the sign of the forecast error, can give grounds for the use of asymmetric loss functions in forecasting VaR. In this study, we introduce a combination of loss functions and investigate its effect on forecasting conditional VaR. We illustrate this method using simulated and daily financial return series.
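For reference, a small sketch of the standard check (tick) loss used to evaluate VaR forecasts, alongside one simple asymmetric quadratic alternative; the specific combination of loss functions proposed in the talk is not reproduced, and the data are simulated placeholders.

```python
import numpy as np

def check_loss(returns, var_forecast, alpha=0.05):
    """Average check (tick) loss for a VaR forecast at level alpha.

    With u = r - VaR, the loss is u * (alpha - 1{u < 0}), the quantile loss
    minimised by the true alpha-quantile of the return distribution.
    """
    u = returns - var_forecast
    return np.mean(u * (alpha - (u < 0).astype(float)))

def asymmetric_squared_loss(returns, var_forecast, alpha=0.05):
    """Asymmetric quadratic loss: violations (r < VaR) are penalised more heavily."""
    u = returns - var_forecast
    w = np.where(u < 0, 1.0 - alpha, alpha)
    return np.mean(w * u ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    r = rng.standard_t(df=5, size=2500) * 0.01
    var_5 = np.quantile(r, 0.05)                  # in-sample 5% VaR as a toy forecast
    print(check_loss(r, var_5), asymmetric_squared_loss(r, var_5))
```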