
2010 Seminars

26th Feb 2010 - 11:00 am

Speaker:

Dr James Symons,

Affiliation:

University College London

Venue:

Room 498 Merewether Building

Title:

The Housing Market and the U.S. Business Cycle

Abstract:

We develop a forecasting model of U.S. GDP growth, 1956-2009. Perhaps counter-intuitively, we conclude that changes in the real value of tangible assets, composed mostly of real estate, do not have an important role in forecasting future growth once one controls for the conventional macroeconomic variables.

19th Mar 2010 - 11:00 am

Speaker:

Professor Richard Paap,

Affiliation:

Econometric Institute, Erasmus University Rotterdam, The Netherlands

Venue:

Room 498 Merewether Building

Title:

Real-Time Inflation Forecasting in a Changing World

Abstract:

This paper revisits inflation forecasting using reduced-form Phillips curve forecasts, i.e., inflation forecasts using activity and expectations variables. We propose a Phillips curve-type model that results from averaging across different regression specifications selected from a set of potential predictors. The set of predictors includes lagged values of inflation, a host of real activity data, term structure data, nominal data and surveys. In each of the individual specifications we allow for stochastic breaks in regression parameters, where the breaks are described as occasional shocks of random magnitude. As such, our framework simultaneously addresses the structural change and model uncertainty that unavoidably affect Phillips curve forecasts. We use this framework to describe PCE deflator and GDP deflator inflation rates for the United States across the post-WWII period. Over the full 1960-2008 sample the framework indicates several structural breaks across different combinations of activity measures. These breaks often coincide with, amongst others, policy regime changes and oil price shocks. In contrast to many previous studies, we find less evidence for autonomous variance breaks and inflation gap persistence. Through a real-time out-of-sample forecasting exercise we show that our model specification generally provides superior one-quarter and one-year ahead forecasts for quarterly inflation relative to a whole range of forecasting models that are typically used in the literature.

This is joint work with Jan J.J. Groen and Francesco Ravazzolo.

26th Mar 2010 - 11:00 am

Speaker:

Professor William McCausland,

Affiliation:

Economics Department, The University of Montreal

Venue:

Room 498 Merewether Building

Title:

The HESSIAN Method: Highly Efficient Simulation Smoothing, In A Nutshell

Abstract:

State space models, which govern the interaction of observed data and unobserved states, are very useful in capturing dynamic relationships, especially where there are changing, but latent, economic conditions: the states may be state variables in macroeconomic models, volatility in asset markets or time-varying model parameters. In this paper, I describe the HESSIAN method for non-linear non-Gaussian state space models. It involves an approximation of the conditional density of states given data that can be evaluated and simulated exactly. The approximation can be used as a proposal density, for Bayesian inference using Markov chain Monte Carlo (MCMC) methods; or as an importance density, useful for approximating the likelihood function through simulations. Because the approximation is so close, fast MCMC and importance sampling are feasible and highly numerically efficient for problems where alternatives are inefficient or intractable. I compare the performance of the HESSIAN method with methods currently used in the literature.

9th Apr 2010 - 11:00 am

Speaker:

Professor Dirk Kroese,

Affiliation:

The University of Queensland

Venue:

Room 498 Merewether Building

Title:

Efficient Estimation via Generalised Splitting

Abstract:

Adaptive importance sampling techniques such as the cross-entropy method have proved to be very useful in rare-event probability estimation. However, for high-dimensional problems the importance sampling estimator can become unreliable, due to the degeneration of the likelihood ratio.  We propose a new rare-event estimation method based on the classical splitting idea of Kahn and Harris. The new method does not use importance sampling, is non-parametric, and remains stable in higher dimensions. We demonstrate its effectiveness by estimating the number of solutions in the satisfiability (SAT) problem.
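For intuition only, here is a minimal multilevel-splitting sketch in Python for a Gaussian rare-event probability. It is not Professor Kroese's generalised splitting algorithm; the fixed levels, particle count and Metropolis proposal scale are all illustrative assumptions. Each factor in the product estimates a conditional level-crossing probability, which is the basic splitting idea the talk builds on.

```python
import numpy as np

rng = np.random.default_rng(0)

def splitting_estimate(d=10, levels=(4.0, 8.0, 12.0), n=5000, mcmc_steps=20, step=0.5):
    """Crude multilevel-splitting estimate of P(sum(X) > levels[-1]), X ~ N(0, I_d).

    At each level the survivors are replicated and moved with a Metropolis
    kernel targeting N(0, I_d) truncated to {x : sum(x) > level}."""
    x = rng.standard_normal((n, d))
    estimate = 1.0
    for level in levels:
        hits = x[x.sum(axis=1) > level]
        estimate *= len(hits) / len(x)
        if len(hits) == 0:
            return 0.0
        # Replicate survivors back up to n particles.
        x = hits[rng.integers(0, len(hits), size=n)].copy()
        # Metropolis moves within the current level set.
        for _ in range(mcmc_steps):
            prop = x + step * rng.standard_normal((n, d))
            log_ratio = 0.5 * (np.sum(x**2, axis=1) - np.sum(prop**2, axis=1))
            accept = (prop.sum(axis=1) > level) & (np.log(rng.random(n)) < log_ratio)
            x[accept] = prop[accept]
    return estimate

print(splitting_estimate())   # compare with the exact tail probability of N(0, d)
```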

12th Apr 2010 - 11:00 am

Speaker:

Professor Jan Magnus,

Affiliation:

Tilburg University, The Netherlands

Venue:

Room 498 Merewether Building

Title:

WALS model averaging

Abstract:

Parameter estimation under model uncertainty is a difficult and fundamental issue in econometrics. One way to approach this problem, and hence to avoid the harmful pretest effect, is via model averaging. In model averaging all potential models play a role, but in varying degrees of importance. We shall introduce a new method, called weighted average least squares (WALS), and compare its performance with the dominant method, Bayesian model averaging (BMA). Our proposed method has two major advantages over BMA: its computational burden is trivial and it is based on a transparent definition of prior ignorance.

30th Apr 2010 - 11:00 am

Speaker:

Professor Boaz Golany,

Affiliation:

Technion in Haifa, Israel

Venue:

Room 498 Merewether Building

Title:

Resource Allocation in a Tactical Arms Race with Temporary Advantages

Abstract:

We consider an arms race between two opponents (e.g., government forces vs. insurgents) where each advantage that is achieved by one of the opponents is limited in time and expires when the other opponent develops a new weapon or counter-measure (in contrast with the "winner-takes-all" situation that characterizes much of the literature on investments in competitive business environments). We first consider a variety of models that apply to a one-sided situation, where the defender has to determine how much to invest in developing counter-measures to a weapon employed by the attacker. The decision problems are expressed as (convex) nonlinear optimization problems. We present an example that provides some operational insights regarding optimal resource allocation. We also consider a two-sided situation and develop a Nash equilibrium solution that sets investment values so that both parties have no incentive to change.

21st May 2010 - 11:00 am

Speaker:

Associate Professor Shelton Peiris,

Affiliation:

School of Mathematics and Statistics, The University of Sydney

Venue:

Room 498 Merewether Building

Title:

Estimation of Autoregressive Conditional Duration (ACD) Models using Estimating Functions and QMLE Methods

Abstract:

We consider the class of Autoregressive Conditional Duration (ACD) models to analyze the time gaps (durations) between consecutive transactions. Since the popular maximum likelihood (MLE) procedure is difficult to implement in practice, we use an alternative estimation method based on estimating functions (EF). Using a large-scale simulation study, we show that this EF method is easier to apply than MLE in practice and produces similar results. We also discuss the class of log ACD models and use QMLE methods for parameter estimation following Allen et al. (2008). If time permits, we consider some examples from real data sets to illustrate these estimation methods.
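For readers new to the model class, a minimal ACD(1,1) simulation is sketched below; the parameter values and exponential innovations are illustrative assumptions, and none of the EF or QMLE estimation machinery of the talk is reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_acd(n=1000, omega=0.1, alpha=0.2, beta=0.7):
    """Simulate durations x_i = psi_i * eps_i with conditional mean
    psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1} and eps_i ~ Exp(1)."""
    x = np.empty(n)
    psi = np.empty(n)
    psi[0] = omega / (1.0 - alpha - beta)   # unconditional mean duration
    x[0] = psi[0] * rng.exponential()
    for i in range(1, n):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
        x[i] = psi[i] * rng.exponential()
    return x, psi

durations, cond_mean = simulate_acd()
print(durations[:5], cond_mean[:5])
```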

18th Jun 2010 - 11:00 am

Speaker:

Professor Paul Fischer,

Affiliation:

Smeal College of Business Administration, The Pennsylvania State University

Venue:

Room 498 Merewether Building

Title:

When Does Soft Talk Matter?: Evidence from Officer Quotations in Earnings Press Releases

Abstract:

We employ a sample of quotations from earnings press releases to ascertain factors influencing the information content of qualitative disclosures by management. We find that, after controlling for quantitative aspects of the earnings release disclosure, the degree of optimism conveys significant incremental information content when quantitative earnings guidance is not provided in the release. The information content of the degree of optimism is reduced when quantitative guidance is provided, which is consistent with quantitative guidance crowding out information transmission via qualitative disclosure. In addition, we provide some evidence that the information content of qualitative disclosure is enhanced if management perceives greater litigation risk and is reduced when investors have less favourable perceptions of management.

22nd Jun 2010 - 11:00 am

Speaker:

Associate Professor H. K. Tony Ng,

Affiliation:

Dept. of Statistical Science, Southern Methodist University, USA

Venue:

Room 498 Merewether Building

Title:

Optimal Sample Size Allocation for Multi-Stress Tests using Extreme Value Regression

Abstract:

In this talk, I will discuss the optimal sample size allocation in a multi-group life-testing experiment for complete and Type-II censored samples. The extreme value regression model is commonly used for the statistical analysis of data arising from such multi-stress experiments; see, for example, the books by Nelson (1982) and Meeker and Escobar (1998). For this setting, we will derive the maximum likelihood estimators (MLEs), the expected Fisher information and the asymptotic variance-covariance matrix of the MLEs. Three optimality criteria will be introduced and the optimal allocation of units for two- and k-stress level situations will then be determined. I will then demonstrate the efficiency of this optimal allocation rule using the real experimental situation considered earlier by Nelson and Meeker (1978). Finally, I will present some Monte Carlo simulations to show that the optimality results hold for small sample sizes as well.

30th Jul 2010 - 11:00 am

Speaker:

Dr Laurent Pauwels,

Affiliation:

The Discipline of Operations Management and Econometrics, The University of Sydney

Venue:

Room 498, Merewether Building

Title:

Forecast combination in discrete choice models: predicting FOMC monetary policy decisions

Abstract:

This paper extends the discrete choice model of Hu and Phillips (2004), which allows for nonstationary dependent and explanatory variables. It provides a new methodology to combine in- and out-of-sample forecasts based on a mixture of discrete choice models. This is achieved primarily by combining the probabilities associated with each model. The methodology is not limited to point forecasts and can be used to predict the whole density of the multiple-choice model. Scoring functions, such as the log score and the quadratic score, are used to evaluate the forecasting performance of the various models. We apply this methodology to forecast the outcomes of Federal Reserve Board meetings on changes to the federal funds target rate. The original and extended Hu and Phillips (2004) data set and model are employed as a starting point for the empirical studies. The paper also investigates the use of real-time data, which contain the information actually available when the Federal Reserve Board makes its decisions rather than revised and updated data series. Furthermore, models are constructed with a mixture of data frequencies.
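As a small illustration of the scoring functions named in the abstract, the snippet below evaluates log and quadratic scores for probability forecasts of a three-category outcome and forms a naive equal-weight probability combination. The toy forecasts, outcomes and equal weights are assumptions, not the paper's procedure.

```python
import numpy as np

# Hypothetical predicted probabilities from two models over three policy
# outcomes (cut, hold, raise) for four meetings, plus realised outcomes.
p_model1 = np.array([[0.2, 0.6, 0.2],
                     [0.1, 0.7, 0.2],
                     [0.5, 0.4, 0.1],
                     [0.3, 0.5, 0.2]])
p_model2 = np.array([[0.3, 0.5, 0.2],
                     [0.2, 0.6, 0.2],
                     [0.4, 0.4, 0.2],
                     [0.2, 0.6, 0.2]])
outcomes = np.array([1, 1, 0, 2])   # index of the realised category per meeting

def log_score(p, y):
    """Average log score of the realised categories (higher is better)."""
    return np.mean(np.log(p[np.arange(len(y)), y]))

def quadratic_score(p, y):
    """Average quadratic score 2*p_y - sum_k p_k^2 (higher is better)."""
    return np.mean(2 * p[np.arange(len(y)), y] - np.sum(p**2, axis=1))

# Combine the two models by averaging their predicted probabilities
# (equal weights are an assumption; the paper studies richer combinations).
p_comb = 0.5 * (p_model1 + p_model2)
for name, p in [("model 1", p_model1), ("model 2", p_model2), ("combination", p_comb)]:
    print(name, round(log_score(p, outcomes), 3), round(quadratic_score(p, outcomes), 3))
```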

 

This paper is work in progress, and only preliminary results and ideas will be presented. It is joint work of Laurent Pauwels and Andrey Vasnev.

6th Aug 2010 - 11:00 am

Speaker:

Dr Takashi Yamagata,

Affiliation:

University of York

Venue:

Room 498, Merewether Building

Title:

Instrumental Variable Estimation of Dynamic Linear Panel Data Models with Defactored Regressors under Cross-sectional Dependence

Abstract:

This paper develops an instrumental variable (IV) estimator for consistent estimation of dynamic panel data models with error cross-sectional dependence when both N and T, the cross-section and time series dimensions respectively, are large. Our approach asymptotically projects out the common factors from the regressors using principal components analysis and then uses the defactored regressors as instruments to estimate the model in a standard way. Therefore, the proposed estimator is computationally very attractive. Furthermore, our procedure requires estimating solely the common factors included in the regressors, leaving those that influence only the dependent variable in the errors. Hence, aside from computational simplicity, the resulting approach allows parsimonious estimation of the model. The finite-sample performance of the IV estimator and the associated t-test is investigated using simulated data. The results show that the bias of the estimator is very small and the size of the t-test is correct even when (T, N) is as small as (10, 50). The performance of an overidentifying restrictions test is also explored.
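A loose sketch of the defactoring step alone (not the full IV estimator, its moment conditions or its asymptotics): estimate the common factors of a regressor panel by principal components and project them out, leaving the defactored variation that can serve as instruments. The toy dimensions, two-factor structure and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N, m = 50, 30, 2                      # time periods, units, number of factors

# Toy panel of one regressor with a two-factor structure plus noise.
F = rng.standard_normal((T, m))          # common factors
Lam = rng.standard_normal((N, m))        # factor loadings
X = F @ Lam.T + rng.standard_normal((T, N))

# Estimate the factor space by principal components of the T x N panel:
# eigenvectors of X X' for the m largest eigenvalues span the estimated space.
eigval, eigvec = np.linalg.eigh(X @ X.T)
F_hat = eigvec[:, -m:]                   # T x m estimated factors

# Defactor: project the regressor onto the orthogonal complement of F_hat.
M = np.eye(T) - F_hat @ np.linalg.inv(F_hat.T @ F_hat) @ F_hat.T
X_defactored = M @ X                     # candidate instruments, unit by unit

# Correlation with the true factor is typically small if the factors are well estimated.
print(np.abs(np.corrcoef(F[:, 0], X_defactored[:, 0])[0, 1]))
```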

 

This is joint work with Dr Vasilis Sarafidis.

13th Aug 2010 - 11:00 am

Speaker:

Dr Vasilis Sarafidis,

Affiliation:

The Discipline of Operations Management and Econometrics, The University of Sydney

Venue:

Room 498, Merewether Building

Title:

IV Estimation of Factor Residuals

Abstract:

This paper considers panel data regression models with residuals generated by a multi-factor error structure and regressors that are not necessarily strongly exogenous. In such cases, the standard dynamic panel estimators fail to provide consistent estimates of the parameters. We propose a new estimation procedure, based on instrumental variables, which retains the traditional attractive features of method of moments estimators. The novelty of our approach is that we introduce new parameters to represent the unobserved covariances between the instruments and the factor component of the residual; these parameters are typically estimable when N is large. Some important estimation and identification issues are studied in detail. Our estimator permits unit roots and is robust to cases where the variance of the factor loadings is large. In the fixed-effects case, we show that modified versions of our estimator are asymptotically equivalent to the popular Arellano-Bond, Ahn-Schmidt and system GMM estimators. Therefore, our approach provides a unifying treatment of existing panel estimators. The finite-sample performance of our estimator is investigated using simulated data. The results show that the proposed method produces reliable estimates of the parameters over various parametrizations.

27th Aug 2010 - 11:00 am

Speaker:

Professor Zinoviy Landsman,

Affiliation:

University of Haifa

Venue:

Room 498, Merewether Building

Title:

Translation-invariant and positive homogeneous risk measures and optimal portfolio management

Abstract:

The problem of risk portfolio optimization with translation-invariant and positive-homogeneous risk measures, important representatives of which are value-at-risk (VaR) and tail conditional expectation (TCE), leads, in the case of elliptical multivariate underlying distributions, to the problem of minimizing a combination of a linear functional and the square root of a quadratic functional. In this paper we provide a simple and feasible condition under which the optimal solution exists and, under this condition, give the explicit closed-form solution of this minimization problem. The results are illustrated using data on 10 stocks from NASDAQ/Computers. The closeness between the VaR and TCE optimal portfolios is investigated.
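In symbols, the minimization problem referred to above can be written roughly as follows (our notation, not necessarily the authors'):

```latex
\[
\min_{\pi \in \mathbb{R}^n} \; -\,\mu^{\top}\pi \;+\; q\,\sqrt{\pi^{\top}\Sigma\,\pi}
\qquad \text{subject to } \mathbf{1}^{\top}\pi = 1,
\]
where $\mu$ and $\Sigma$ are the mean vector and dispersion matrix of the asset returns,
$\pi$ is the vector of portfolio weights, and the constant $q > 0$ depends on the chosen
risk measure (e.g.\ the VaR or TCE multiplier implied by the elliptical family).
```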

3rd Sep 2010 - 11:00 am

Speaker:

Professor Joachim Inkmann,

Affiliation:

The University of Melbourne

Venue:

Room 498, Merewether Building

Title:

Can the Life Insurance Market Provide Evidence for a Bequest Motive?

Abstract:

Using U.K. microeconomic data, we analyze the empirical determinants of participation in the life insurance market. We find that term insurance demand is positively correlated with measures of bequest motives like being married, having children and/or subjective measures of strong bequest motives. We then show that a life-cycle model of life insurance demand, saving and portfolio choice can rationalize quantitatively the data in the presence of a bequest motive. These findings provide evidence supporting the presence of a bequest motive.

24th Sep 2010 - 11:00 am

Speaker:

Professor Udi Makov,

Affiliation:

University of Haifa, Israel

Venue:

Room 498, Merewether Building (H04)

Title:

Extensions of the Lee-Carter model for mortality projections

Abstract:

The literature on mortality projections was dominated in the 1990s by the Lee-Carter model, which assumes that the central death rate for a specific age follows a log-bilinear form, allowing for variations in the level of mortality over time. This model, with its inherent homoscedastic structure, was later extended by a Poisson model governed by a similar log-bilinear force of mortality (a schematic form of both is given after the list below). The paper will discuss potential extensions to the Lee-Carter model along the following lines:

  • Presentation of the L-C model as a state-space model.
  • Bayesian implementation of the L-C model with a broad family of embedded ARIMA models.
  • Bayesian model choice considerations.
  • Adaptation of the L-C model for simultaneous projection of several populations.
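For reference, a standard textbook statement of the log-bilinear Lee-Carter model and its Poisson extension mentioned above is (our notation; the speaker's exact formulation may differ):

```latex
\[
\log m_{x,t} = a_x + b_x k_t + \varepsilon_{x,t},
\qquad
k_t = k_{t-1} + d + \omega_t ,
\]
\[
D_{x,t} \sim \mathrm{Poisson}\!\left(E_{x,t}\, m_{x,t}\right),
\qquad
m_{x,t} = \exp\!\left(a_x + b_x k_t\right),
\]
where $m_{x,t}$ is the central death rate at age $x$ in year $t$, $a_x$ and $b_x$ are
age-specific parameters, $k_t$ is the period mortality index (often a random walk with
drift $d$), $D_{x,t}$ is the death count and $E_{x,t}$ the exposure to risk.
```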

21st Oct 2010 - 11:00 am

Speaker:

Professor Susan Xu,

Affiliation:

Penn State University, Smeal College of Business

Venue:

Room 498, Merewether Building

Title:

Joint Dynamic Pricing of Multiple Perishable Products Under Consumer Choice

Abstract:

In response to competitive pressures, firms are increasingly adopting revenue management opportunities afforded by advances in information and communication technologies. Motivated by applications in industry, we consider a dynamic pricing problem facing a firm that sells given initial inventories of multiple substitutable and perishable products over a finite selling horizon. In these applications, since individual product demands are linked through consumer choices, the seller must formulate a joint dynamic pricing strategy while explicitly incorporating consumer behaviour. For a general model of consumer choice, we model this multi-product dynamic pricing problem as a stochastic dynamic program and characterize its optimal prices. In addition, since consumer behaviour depends on the nature of product differentiation, we specialise the general choice model to capture vertical and horizontal differentiation. When products are vertically differentiated, our results show monotonic properties of the optimal prices and reveal that the optimal prices can be determined by considering aggregate inventories of products rather than their individual inventory levels. Accordingly, we develop a polynomial-time exact algorithm for determining the optimal prices. When products are horizontally differentiated, we find that analogous structural properties do not hold and the behaviour of optimal prices is substantially different. To solve this problem, we develop a variant of the backward induction algorithm that uses cubic spline interpolation to construct the value functions at each stage. We demonstrate that this interpolation-based algorithm has low memory requirements and is very effective in generating near-optimal solutions that result in an average error of less than 0.15%.
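As a highly simplified illustration of the dynamic-programming formulation, the sketch below solves a single-product, finite-horizon pricing problem by backward induction over a price grid. The demand function, price grid and horizon are assumptions; the paper's multi-product consumer-choice setting and its spline-interpolation algorithm are not reproduced.

```python
import numpy as np

prices = np.array([4.0, 6.0, 8.0, 10.0])   # admissible price grid (assumed)

def buy_prob(p):
    """Assumed per-period purchase probability as a function of price."""
    return np.exp(-0.3 * p)

def optimal_prices(T=20, n_max=10):
    """Backward induction for one perishable product:
    V[t, n] = max_p  buy_prob(p) * (p + V[t+1, n-1]) + (1 - buy_prob(p)) * V[t+1, n].
    Unsold units are worthless at the end of the horizon."""
    V = np.zeros((T + 1, n_max + 1))
    policy = np.zeros((T, n_max + 1))
    for t in range(T - 1, -1, -1):
        for n in range(1, n_max + 1):
            values = buy_prob(prices) * (prices + V[t + 1, n - 1]) \
                     + (1 - buy_prob(prices)) * V[t + 1, n]
            policy[t, n] = prices[np.argmax(values)]
            V[t, n] = values.max()
    return V, policy

V, policy = optimal_prices()
print(policy[0])   # optimal first-period price for each starting inventory level
```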

22nd Oct 2010 - 11:00 am

Speaker:

Professor Gael Martin,

Affiliation:

Department of Econometrics and Business Statistics, Monash University

Venue:

Room 498, Merewether Building

Title:

Probabilistic Forecasts of Volatility and its Risk Premia.

Abstract:

The object of this paper is to produce distributional forecasts of physical volatility and its associated risk premia using a non-Gaussian state space approach. Option and spot market information on the unobserved variance process is captured via non-parametric, `model-free' measures of option-implied and spot price-based variance, with the two measures used to define a bivariate observation equation in the state space model. The premium for diffusive variance risk is linear in the latent variance (in the usual fashion) whilst the premium for jump variance risk is specified as a conditionally deterministic dynamic process, driven by a past function of the measurements. Linking the risk premia to the risk aversion parameter in a particular form of representative agent model, we also produce probabilistic forecasts of the relative risk aversion of a representative investor. The inferential approach adopted is Bayesian, implemented via a Markov chain Monte Carlo (MCMC) algorithm that caters for the non-linearities in the model and for the multi-move sampling of the latent variances. The simulation output is used to estimate predictive distributions for all latent and observed variables of interest. The method is applied to empirical spot and option price data for the S&P500 index over the 2000 to 2007 period, with conclusions drawn about investors' required compensation for variance risk during the recent financial turmoil. Bayesian methods are used to demonstrate the accuracy of the probabilistic forecasts of the observable variance measures, compared with those yielded by more standard time series models.

29th Oct 2010 - 11:00 am

Speaker:

Dr Fernando Jose Garrigos Simon,

Affiliation:

Technical University of Valencia, Spain

Venue:

Room 498, Merewether Building

Title:

Seasonality, quality and short strategies of prices. The case of the Alicante-London Market

Abstract:

This paper focuses on airline prices in the Alicante-London market. It introduces a model of prices and analyses price evolution over short periods to observe the incidence of seasonality, the types of firms involved, timetabling, types of airport, competitiveness, and variables such as the price of jet fuel and the rate of exchange used by airlines to establish prices.

The paper analyses these variables in three seasons to compare the strategies of the companies. The results show the relative incidence of all the variables analysed, and stress the relevance of seasonality and competitiveness in the price strategies followed by the different types of company.

1st Nov 2010 -

Speaker:

Associate Professor Gary Tian,

Affiliation:

University of Wollongong

Venue:

Room 498, Merewether Building

Title:

Disproportional ownership structure and pay-performance relationship: evidence from China's listed firms

Abstract:

This paper examines the impact that ownership structure has on the pay-performance relationship in China's listed firms. We find that the cash flow rights of the ultimate controlling shareholder have a positive effect on this relationship, while a divergence between control rights and cash flow rights has a significantly negative effect. By dividing our sample into state-owned enterprises (SOE), state assets management bureaus (SAMB) and privately controlled firms, we find that cash flow rights in SOE-controlled firms have a significant impact on the accounting-based pay-performance relationship, and cash flow rights in privately controlled firms also affect the market-performance-based relationship; however, CEO pay in SAMB-controlled firms bears no relationship with either accounting- or market-based performance. We therefore argue that CEO pay is inefficient in firms where the state is the controlling shareholder because it is insensitive to market-based performance but consistent with the efforts of controlling shareholders to maximize their profits.

1st Nov 2010 - 10:00 am

Speaker:

Dr Tommaso Proietti,

Affiliation:

University of Rome Tor Vergata, Italy

Venue:

Room 498, Merewether Building

Title:

Hyper-spherical and Elliptical Stochastic Cycles

Abstract:

A univariate first order stochastic cycle can be represented as an element of a bivariate first order vector autoregressive process, or VAR(1), where the transition matrix is associated with a Givens rotation. From the geometrical viewpoint, the kernel of the cyclical dynamics is described by a clockwise rotation along a circle in the plane. The reduced form of the cycle is either ARMA(2,1), with complex roots, or AR(1), when the rotation angle equals 2kπ or (2k + 1)π, k = 0, 1, ...
This paper generalizes this representation in two directions. According to the first, the cyclical dynamics originate from the motion of a point along an ellipse. The reduced form is also ARMA(2,1), but the model can account for certain types of asymmetries. The second deals with the multivariate case: the cyclical dynamics result from the projection along one of the coordinate axes of a point moving in R^n along a hyper-sphere. This is described by a VAR(1) process whose transition matrix is obtained by a sequence of n-dimensional Givens rotations. The reduced form of an element of the system is shown to be ARMA(n, n - 1). The properties of the resulting models are analyzed in the frequency domain, and we show that this generalization can account for a multimodal spectral density.
The illustrations show that the proposed generalizations can be fitted successfully to some well-known case studies of the econometric and time series literature. For instance, the elliptical model provides a parsimonious but effective representation of the mink-muskrat interaction. The hyper-spherical model provides an interesting re-interpretation of the cycle in US Gross Domestic Product quarterly growth and the cycle in the Fortaleza rainfall series.
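A minimal simulation of the circular first-order case described in the abstract, a bivariate VAR(1) whose transition matrix is a damped Givens rotation; the damping factor, rotation angle and noise scale are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_cycle(T=300, rho=0.97, lam=2 * np.pi / 20, sigma=1.0):
    """psi_t = rho * R(lam) * psi_{t-1} + kappa_t, with R(lam) a Givens rotation.
    The observed cycle is the first element of psi_t; the implied period is
    2*pi/lam observations (20 here)."""
    R = np.array([[np.cos(lam), np.sin(lam)],
                  [-np.sin(lam), np.cos(lam)]])
    psi = np.zeros((T, 2))
    for t in range(1, T):
        psi[t] = rho * R @ psi[t - 1] + sigma * rng.standard_normal(2)
    return psi[:, 0]

cycle = simulate_cycle()
print(cycle[:10])
```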
 

1st Nov 2010 - 12:00 pm

Speaker:

Dr Thomas A. Weber,

Affiliation:

Stanford University

Venue:

Room 498, Merewether Building

Title:

Models and Decisions: A Robust Approach to Finding Both

Abstract:

Separating the identification problem from the problem of finding solutions to a decision problem described by an uncertain model has the generic drawback that the error norm used for fitting the model is not related to the expected loss from ex-post model mismatch. Furthermore, sample data from the model can often be complemented by insights about admissible model behavior. In this talk, I will present a general approach that merges the identification and robust optimization problems, subject to structural constraints. Approximation errors and optimal decisions are determined simultaneously, together with an ex-ante distribution of the corresponding payoffs. I will also discuss the related problem of data acquisition and provide perspectives on how the approach can be generalized to games. The robust approximation method is illustrated for the problem of optimal debt settlement in the credit-card industry.

Joint work with Naveed Chehrazi.

12th Nov 2010 - 11:00 am

Speaker:

Professor Sophia P. Dimelis,

Affiliation:

Athens University of Economics and Business

Venue:

Room 498, Merewether Building

Title:

A Stochastic Production Frontier Model

Abstract:

In this paper we explore the idea that Information and Communications Technologies (ICT) may contribute to reducing productive inefficiencies. ICT is treated as a special type of technology and knowledge capital, whose impact on production should be evaluated through the channel of technical efficiency. To implement this, we adopt the stochastic production frontier methodology for panel data. We follow the one-step procedure, as suggested in the recent literature, in which the technology parameters are estimated simultaneously with the parameters of the inefficiency equations, avoiding the serious econometric problems involved with the two-step procedure initially employed (Schmidt and Sickles, 1984; Battese and Coelli, 1995; Coelli et al., 1999; Wang and Schmidt, 2002).
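Schematically, a one-step panel frontier specification of the kind cited above (e.g. Battese and Coelli, 1995) takes the following form; the notation is ours and the speaker's exact specification may differ:

```latex
\[
y_{it} = \mathbf{x}_{it}'\boldsymbol{\beta} + v_{it} - u_{it},
\qquad
v_{it} \sim N(0,\sigma_v^2),
\qquad
u_{it} \sim N^{+}\!\left(\mathbf{z}_{it}'\boldsymbol{\delta},\, \sigma_u^2\right),
\]
where $y_{it}$ is log output, $\mathbf{x}_{it}$ are the production inputs, $u_{it} \ge 0$
is the inefficiency term whose mean depends on covariates $\mathbf{z}_{it}$ (here including
ICT capital), and all parameters are estimated jointly in one step.
```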
The model is implemented using a panel of 42 developed and developing countries over the period 1993-2001. The analysis is also performed focusing only on the OECD countries, for which a longer data set was available (1990-2005). Strong evidence is provided for a significant impact of ICT in reducing country inefficiencies. Further evidence indicates a significantly positive ICT impact on labor productivity, while a substitute relationship appears to exist between ICT and non-ICT capital.
Based on the model's estimates, the most efficient countries in the OECD group are the USA, Belgium and the Netherlands, while India and Argentina achieved the highest efficiency levels among the developing countries. Overall, developed countries operate closer to the world frontier. Several southern European countries are less efficient and have not yet converged to the efficiency levels of the most developed OECD countries.

19th Nov 2010 - 11:00 am

Speaker:

Dr Mohamad Khaled,

Affiliation:

Discipline of Operations Management and Econometrics, The University of Sydney

Venue:

Room 498, Merewether Building

Title:

Estimation of copula models with discrete margins

Abstract:

Estimation of copula models with discrete margins is known to be difficult beyond the bivariate case. We show how this can be achieved by augmenting the likelihood with uniform latent variables and computing inference using the resulting augmented posterior. To evaluate this posterior we propose two efficient Markov chain Monte Carlo sampling schemes. One generates the latent variables as a block using a Metropolis-Hastings step with a proposal that is close to its target distribution. Our method applies to all parametric copulas for which the conditional copula functions can be evaluated, not just the elliptical copulas considered in previous Bayesian work. Moreover, the copula parameters can be estimated jointly with any marginal parameters. We establish the effectiveness of the estimation method by modelling consumer behaviour in online retail using Archimedean and Gaussian copulas and by estimating 16-dimensional D-vine copulas for a longitudinal model of usage of a bicycle path in the city of Melbourne, Australia. Finally, we extend our results and method to the case where some margins are discrete and others continuous.
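To fix ideas, the snippet below simulates from a bivariate Gaussian copula with Poisson margins and shows, for each observed count, the interval of latent uniforms that an augmentation scheme would sample from. The dimension, correlation and margins are assumptions, and none of the paper's MCMC schemes is reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulate from a 2-dimensional Gaussian copula with Poisson(3) margins.
rho = 0.6
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((1000, 2)) @ L.T
u = stats.norm.cdf(z)                        # copula-distributed uniforms
y = stats.poisson.ppf(u, mu=3).astype(int)   # discrete (Poisson) margins

# For an observed count y, the latent uniform is only known to lie in
# [F(y - 1), F(y)); data augmentation samples it from this interval.
lower = stats.poisson.cdf(y - 1, mu=3)
upper = stats.poisson.cdf(y, mu=3)
print(y[0], lower[0], upper[0])
```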

This paper is joint work with Professor Michael S. Smith.