
2016 Seminars

26th Feb 2016 - 11:00 am

Venue: Room 5040 Business School Abercrombie Bldg H70

Speaker: Professor Eddie Anderson, Discipline of Business Analytics; Business School; The University of Sydney

Title: Releasing Buyer Information in Procurement Auctions with Price and Quality Variables

We model a situation in which a single firm evaluates competing suppliers and selects just one. Suppliers submit bids involving both price and quality variables. The buyer makes a choice determined by a weighted scoring system, but the suppliers do not have full information about this: for example, the supplier's bid may be scored on “reliability”, but the supplier is unable in advance to be sure of the exact score that will be awarded. We consider a model in which there is supplier uncertainty about the scores that will be given, but the weights that the buyer assigns to different variables are known. We investigate the amount of information that the buyer should release prior to the auction, and demonstrate that it is beneficial not to reveal all information, so that the suppliers continue to be uncertain about their final scores.
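As a toy illustration of such a weighted scoring rule (the weights, bids and noise level below are hypothetical, not taken from the paper), the supplier-side uncertainty can be sketched as a noisy quality assessment:

```python
import random

def score_bid(price, quality, w_price=0.6, w_quality=0.4, noise_sd=0.5, rng=None):
    """Hypothetical scoring rule: lower price and higher quality raise the
    score; the quality assessment carries buyer-side noise that the supplier
    cannot observe in advance."""
    rng = rng or random.Random(0)
    assessed_quality = quality + rng.gauss(0.0, noise_sd)
    return -w_price * price + w_quality * assessed_quality

# The buyer awards the contract to the highest-scoring supplier.
bids = {"A": (100.0, 8.0), "B": (95.0, 6.0)}  # supplier: (price, quality)
winner = max(bids, key=lambda s: score_bid(*bids[s], rng=random.Random(42)))
```

Here supplier B wins on price despite lower quality; varying `noise_sd` mimics the buyer releasing more or less information about how scores will be assigned.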

11th Mar 2016 - 11:00 am

Venue: Room 2150 Business School Abercrombie Bldg H70

Speaker: Dr Erick Li, Discipline of Business Analytics; Business School; The University of Sydney

Title: Robust Retail Supply Chain Management

Brightstar is the largest distributor of telecommunication devices in Australia. Like many other firms, Brightstar faces some challenges in allocating inventory to each store in its supply chain and designing compensation contracts for each store. The obstacles include stock-out substitution and incomplete information regarding the demand distributions.


In the first part of the presentation, I propose a multi-product inventory model with a max-min objective and stock-out substitution. There are many practical obstacles that limit the distributor’s ability to exactly characterize the demand distributions. With limited information regarding the demand distributions, the distributor may apply the max-min decision rule that maximises the worst-case expected profit. A max-min decision rule is also appealing when firms start to embrace an economy that is slowing down. On the other hand, stock-out substitution occurs when a customer finds his/her first choice runs out of stock and seeks a second choice as substitute. I formulate a two-stage optimisation model to identify the optimal max-min inventory levels. In addition, I also propose a heuristic and investigate its effectiveness.
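A minimal numerical sketch of the max-min logic (a single-product newsvendor with two illustrative demand scenarios; all numbers are hypothetical, and stock-out substitution is omitted for brevity):

```python
# Candidate demand distributions: the true pmf is unknown, only a set of
# plausible scenarios is available (values illustrative).
scenarios = [
    {0: 0.1, 10: 0.5, 20: 0.4},   # optimistic scenario
    {0: 0.4, 10: 0.4, 20: 0.2},   # pessimistic scenario
]
price, cost = 5.0, 2.0

def expected_profit(q, pmf):
    # Newsvendor profit: sell min(demand, q) at `price`, pay `cost` per unit stocked.
    return sum(p * (price * min(d, q) - cost * q) for d, p in pmf.items())

# Max-min rule: choose the stock level with the best worst-case expected profit.
best_q = max(range(21), key=lambda q: min(expected_profit(q, s) for s in scenarios))
```

The robust order quantity here is 10 units, driven by the pessimistic scenario; the paper's two-stage formulation generalises this to many products with substitution between them.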


In the second part of the presentation, I investigate how to design the sales force compensation contract and determine the inventory level in a single-product model. The distributor does not possess complete information regarding the promotion methods that the store can undertake to influence the demand but wishes to maximize the worst-case expected profit. I demonstrate that a linear compensation contract is optimal in this circumstance. After that, I describe how to jointly optimise the linear commission rate and inventory level. 

18th Mar 2016 - 11:00 am

Venue: Room 2150, Abercrombie Business School Bldg H70

Speaker: Prof Anton Ovchinnikov, Queen's School of Business; Queen's University; Ontario, Canada

Title: Behavioral Anomalies in Consumer Wait-or-Buy Decisions and Their Implications for Markdown Management

A decision to buy an item at a regular price or wait for a possible markdown involves a multi-dimensional trade-off between the value of the item, the delay in getting it, the likelihood of getting it and the magnitude of the price discount. Such trade-offs are prone to behavioral anomalies by which human decision makers deviate from the discounted expected utility model. We build an axiomatic preference model that accounts for three well-known anomalies, and produces a parsimonious generalization of discounted expected utility. We then plug this behavioral model into a Stackelberg-Nash game between a firm that decides the price discount and a continuum of consumers who decide to wait or buy, anticipating other consumers' decisions and the resultant likelihood of product availability. We solve the markdown management problem and contrast the results of our model with those under discounted expected utility. We analytically show that accounting for the behavioral anomalies can result in larger markdowns and higher revenues. Finally, we calibrate our model via a laboratory experiment, and validate its predictions out-of-sample.


*joint paper with Manel Baucells (University of Virginia, USA) and Nikolay Osadchiy (Queen's University, Canada)

1st Apr 2016 - 11:00 am

Venue: Room 5040, Abercrombie Business School Bldg H70

Speaker: Dr Richard Philip, Discipline of Finance; The University of Sydney

Title: Implementing the Bivariate Cumulant-Generating Function to Estimate Linear Models with Errors in Variables with Applications to Measuring Beta

We propose a bivariate cumulant generating function (CGF) method to obtain consistent estimates for linear models with errors in the variables. By assuming that the explanatory variable follows a flexible double gamma distribution, we obtain closed-form solutions for the analytical characteristic function of the data. Via simulation, we show that our estimators outperform ordinary least squares (OLS) estimators and existing characteristic function techniques in terms of mean square error. Lastly, we provide guidance on implementing CGF methods and apply the technique to estimate Betas based on the Capital Asset Pricing Model (CAPM).


*joint work with H. Malloch and S. Satchell
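The attenuation bias that motivates errors-in-variables corrections such as the CGF approach is easy to reproduce by simulation (a generic illustration, not the authors' estimator; all parameter values below are made up):

```python
import random

random.seed(0)
n, beta = 20000, 2.0
x_true = [random.gauss(0, 1) for _ in range(n)]
y = [beta * x + random.gauss(0, 1) for x in x_true]
# The observed regressor is contaminated with measurement error.
x_obs = [x + random.gauss(0, 1) for x in x_true]

def ols_slope(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs)
    return num / den

# Attenuation: with var(x) = var(error) = 1, the OLS slope converges to
# beta * var(x) / (var(x) + var(error)) = 1.0, not the true 2.0.
slope = ols_slope(x_obs, y)
```

A consistent errors-in-variables estimator (such as a characteristic-function or CGF method) is designed to recover the true slope of 2.0 in this setting, where OLS cannot.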


29th Apr 2016 - 11:00 am

Venue: Room 2150, Abercrombie Business School Bldg H70

Speaker: Prof Nicholas Hall, Fisher College of Business; Ohio State University

Title: Project Management: A Research Agenda and Maximizing Risk-Adjusted Net Present Value

Project management has recently experienced remarkable growth in business interest, as shown by a 1000% increase in membership in the Project Management Institute since 1996. Today, one-fifth of the world’s economic activity, with an annual value of $12T, is organized as a project. This growth is largely attributable to new applications, for example IT, R&D, new product and service development and corporate change management, with different characteristics and requiring new methodologies. Yet projects still routinely fail to deliver on time, on cost and within scope. This talk outlines an agenda for project management research, including a detailed discussion of open issues that are both practically important and theoretically interesting. Within this agenda, one important topic is calculation of the performance measure net present value (NPV), which is used for project evaluation and selection. It is widely recognized that, during project execution, the level of risk diminishes. In principle, this should be represented in the discount rate that is used in the calculation of NPV. This paper represents what is apparently the first attempt to accomplish this. We model decreasing project risk by removing from the discount rate the risk factor associated with each task when it is completed. We formulate and solve the resulting nonlinear, nonconcave problem of maximizing the risk-adjusted NPV of a project. Our work enables better scheduling of projects, which increases their NPV. Also, our estimation of maximum risk-adjusted NPV enables a company to make better informed project selection decisions. We validate the contribution of our work using two computational studies: the first demonstrates a significant increase in maximum project NPV from improved project scheduling; the second demonstrates a significant increase in total project portfolio value from improved project selection. Our work also motivates companies to develop more precise information about their project risks.
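The idea of stripping completed tasks' risk premia out of the discount rate can be sketched numerically (the cash flows and premia below are invented for illustration; the paper solves the much harder scheduling problem on top of this):

```python
# Illustrative project: an outlay followed by three inflows.
cash_flows = [-100.0, 40.0, 50.0, 60.0]
risk_free = 0.05
# Hypothetical per-period risk premia: risk resolves as tasks complete,
# so later cash flows are discounted at progressively lower rates.
premia = [0.10, 0.06, 0.03, 0.0]

def risk_adjusted_npv(cfs, rf, prem):
    npv, discount = 0.0, 1.0
    for cf, p in zip(cfs, prem):
        npv += cf * discount
        discount /= (1.0 + rf + p)  # rate applied to the next period's flow
    return npv

npv = risk_adjusted_npv(cash_flows, risk_free, premia)
```

With a flat 15% rate these same flows would be worth less; removing resolved risk from the rate raises the computed NPV, which can change which projects a company selects.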


6th May 2016 - 11:00 am

Venue: Room 2150, Abercrombie Business School H70

Speaker: Dr Vitali Alexeev, School of Business and Economics; University of Tasmania

Title: Asymmetric Jump Beta Estimation with Implications for Portfolio Risk Management

In this paper, we contribute to the literature on portfolio diversification by evaluating the impact of extreme market shifts on equity portfolios. An important feature explored in our study is the asymmetry in portfolios’ behaviour during extreme negative market downturns versus extreme positive upswings. In studying the jump dependence of two processes, we use high-frequency observations, focusing on segments of data on the fringes of return distributions. Thus, we only consider a few outlying observations that, at the same time, are informative for the jump inference. In particular, we study the relationship between jumps of a process for a portfolio of assets and an aggregate market factor, and we analyse the co-movement of the jumps in these two processes. Given the predominance of factor models in asset pricing applications, we focus on a linear relationship between the jumps in a portfolio of assets and jumps in the market, and we assess the portfolio's sensitivity to the latter. We find that ignoring the asymmetry in sensitivities to negative versus positive market jumps results in under-diversification of portfolios and increases exposure to extreme negative market shifts. We show that investors care differently about extreme downside losses as opposed to extreme upside gains, demanding additional compensation for holding stocks with high sensitivities to these movements.


*joint work with Wenying Yao, University of Tasmania; Giovanni Urga, University of Bergamo (Italy)
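The asymmetry in question can be illustrated with a stylised split-sample beta (toy returns, not the paper's high-frequency jump estimator; all numbers are invented):

```python
# Split market moves by sign and estimate a separate beta on each side.
market = [-3.0, -2.0, -1.0, 1.0, 2.0, 3.0]
# This toy portfolio reacts more strongly to negative market moves.
portfolio = [-4.5, -3.0, -1.5, 0.8, 1.6, 2.4]

def beta(xs, ys):
    # Slope through the origin: sum(x*y) / sum(x^2).
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

neg = [(m, p) for m, p in zip(market, portfolio) if m < 0]
pos = [(m, p) for m, p in zip(market, portfolio) if m > 0]
beta_neg = beta([m for m, _ in neg], [p for _, p in neg])
beta_pos = beta([m for m, _ in pos], [p for _, p in pos])
```

Here beta_neg = 1.5 while beta_pos = 0.8: a portfolio built on a single symmetric beta would understate its exposure to downside moves, which is the under-diversification effect the paper documents for jumps.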


20th May 2016 - 11:00 am

Venue: Room 2150 Abercrombie Bldg H70

Speaker: A/Prof Artem Prokhorov, Discipline of Business Analytics; University of Sydney

Title: Recent Results on Using Copulas in Finance and in Productivity Analysis

In this talk I will review a few of my recent contributions to the copula-based analysis of issues of interest to finance and business. The talk will be based on two (maybe three) papers, one of particular interest to finance, one of interest to productivity and efficiency analysis. The finance paper is about situations when the conventional benefits of diversification are reversed; the productivity paper is about estimating firms' efficiency when inputs and environmental variables are endogenous. In both papers, copulas arise naturally but lead to, perhaps, surprising results. One such result is that for a fairly large class of dependence structures, diversification increases the riskiness of portfolios whose components have extremely fat-tailed distributions. Another is that accounting for dependence between a firm's efficiency and the unobserved and observed factors influencing the firm's production permits a much more precise estimation of efficiency scores. I will talk about theory and provide some empirical examples.
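The diversification reversal mentioned above can be checked by simulation for one concrete case (independent Pareto tails with index alpha = 0.5, i.e. infinite mean; a textbook-style illustration, not the talk's general dependence result):

```python
import random

alpha = 0.5  # tail index < 1: so fat-tailed that the mean is infinite

def pareto_draw(rng):
    # Inverse-CDF draw from P(X > x) = x**(-alpha) for x >= 1.
    return rng.random() ** (-1.0 / alpha)

rng = random.Random(1)
n = 200_000
singles = sorted(pareto_draw(rng) for _ in range(n))
pairs = sorted((pareto_draw(rng) + pareto_draw(rng)) / 2 for _ in range(n))

# 99% quantile (a VaR-style tail risk measure): the two-asset "diversified"
# portfolio has a *larger* tail quantile than a single asset.
q_single = singles[int(0.99 * n)]
q_avg = pairs[int(0.99 * n)]
```

For tail indices above 1 the comparison flips back and diversification lowers the tail quantile, which is the conventional case.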






27th May 2016 - 11:00 am

Venue: Room 2150 Abercrombie Bldg H70

Speaker: Dr Chris Oates, ACEMS Research Fellow; University of Technology Sydney

Title: Stein Operators on Hilbert Spaces

In 1972, Charles Stein published a central limit theorem for correlated variables. The mathematical approach used in the proof has since become known as Stein's Method. This talk provides an introduction to Stein's Method and describes a formal generalisation, based on Stein Operators. A characterisation of the action of Stein Operators on Hilbert spaces offers considerable potential for applications in kernel methods. One such application is presented, in the context of numerical integration for Bayesian posterior computation.



Oates, Girolami, Chopin (2017) Control Functionals for Monte Carlo Integration. J. R. Statist. Soc. B., 79(3), 1-24.

Oates, Cockayne, Briol, Girolami (2016) Convergence Rates for a Class of Estimators Based on Stein's Identity. arxiv:1603.03220.
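Stein's Method rests on identities such as E[f'(X) - X f(X)] = 0 for standard normal X, which is the starting point for the Stein Operators discussed in the talk. A quick Monte Carlo check with f(x) = sin(x) (a generic illustration, not the papers' control-functional construction):

```python
import math
import random

random.seed(0)
# Stein's identity for the standard normal: E[f'(X) - X f(X)] = 0
# for suitably smooth f. Checked by Monte Carlo with f(x) = sin(x).
n = 100_000
draws = [random.gauss(0.0, 1.0) for _ in range(n)]
est = sum(math.cos(x) - x * math.sin(x) for x in draws) / n
```

The estimate is close to zero, as the identity requires; control functionals exploit exactly this structure to build zero-mean correction terms for numerical integration.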


10th Jun 2016 - 11:00 am

Venue: Room 2150 Abercrombie Bldg H70

Speaker: A/Prof Tomohiro Ando, Melbourne Business School; University of Melbourne

Title: Clustering Huge Number of Financial Time Series: A Panel Data Approach with High-Dimensional Predictors and Factor Structures

This paper introduces a new procedure for clustering a large number of financial time series based on high-dimensional panel data with grouped factor structures. The proposed method attempts to capture the level of similarity of each of the time series based on sensitivity to observable factors as well as to the unobservable factor structure. The proposed method allows for correlations between observable and unobservable factors and also allows for cross-sectional and serial dependence and heteroskedasticities in the error structure, which are common in financial markets. In addition, theoretical properties are established for the procedure.


We apply the method to analyse the returns for over 6,000 international stocks from over 100 financial markets. The empirical analysis quantifies the extent to which the U.S. subprime crisis spilled over to the global financial markets. Furthermore, we find that nominal classifications based on either listed market, industry, country or region are insufficient to characterize the heterogeneity of the global financial markets.


17th Jun 2016 -

Venue: Room 2150, Abercrombie Business School Building, H70

Speaker: Dr Jasper Veldman, Faculty of Economics and Business, University of Groningen

Title: Competition and cooperation in supplier development

We know very little about the interaction between horizontal buyer competition and cooperation when it comes to intervening in the supply market. Adding this “coopetition” perspective is important, as even the most vigorous rivals engage in cooperation. Specifically, the notion that buyers can cooperatively invest in a supplier to gain advantages in terms of wholesale pricing could be further advanced in the literature, especially since cooperation between competitors to cut costs or to develop new technologies is an increasingly observed practice in industries such as automobiles and aerospace.
This paper analyses the interplay of competition and cooperation in a supply chain with two buyers and a shared supplier. By making either cooperative or competitive cost-reducing R&D investments in the supplier, the buyers may influence their rivalry in the product market. This is because of the effect of supplier investments on the wholesale prices charged by the supplier. We endogenize the buyer’s decision to let his investment spill over to the rival by making an explicit distinction between relation-specific investments and investments that directly benefit a buyer’s rival as well.
We analyse the mechanisms of a phenomenon frequently observed in practice: buyer interventions in supply markets. Specifically, we address the importance of a new phenomenon of buyers cooperating to improve supplier performance, and the benefits that may result from it. Our results contrast with those of recently published studies.

* Joint work with Niels Pulles (University of Twente), Gerard Gaalman (University of Groningen), Eddie Anderson (University of Sydney)

1st Jul 2016 - 11:00 am

Venue: Room 5050 H70

Speaker: A/Prof Leon Zolotoy, Melbourne Business School; University of Melbourne

Title: Managerial Mood and Corporate Earnings Management Practices

Prior research shows that weather-induced mood impacts individuals’ judgments and decisions. Using a localized sunshine-based indicator of mood, we examine whether mood influences the corporate earnings management practices of U.S. firms. We find that firms engage in more aggressive earnings management in years with relatively more sunny weather around their headquarters. We also find that the documented effect of local sunshine on earnings management is mitigated in firms with stronger corporate monitoring and firms where executives have longer tenure. Our findings are robust and economically meaningful. Collectively, our findings are consistent with sunshine inducing a positive mood in executives, which in turn influences their subjective assessments of the benefits and risks associated with earnings management.

12th Jul 2016 - 11:00 am

Venue: Room 5050 H70

Speaker: Dr Wendun Wang, Department of Econometrics; Erasmus University; Rotterdam NL

Title: To Pool or Not to Pool: What is a Good Strategy?

This paper considers estimating the slope parameters in heterogeneous panel data regressions. We propose a novel optimal pooling averaging estimator that makes an explicit trade-off between efficiency gains from pooling and bias due to heterogeneity. By theoretically and numerically comparing various estimators, we find that a uniformly best estimator does not exist and that our new estimator is superior in non-extreme cases. To decide which estimator to use, practical guidance is provided depending on features of the data and models. As an illustration, we study cross-country sovereign credit risk and shed new light on the determinants of sovereign credit default swap spreads.
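The pooling trade-off can be captured in a two-estimator toy version (a standard bias-variance calculation, not the paper's optimal pooling averaging estimator; all numbers are invented):

```python
def optimal_weight(var_individual, bias_pooled):
    # MSE of w*pooled + (1-w)*individual, when the individual estimate is
    # unbiased with variance v and the pooled estimate has bias b and
    # negligible variance, is w^2 * b^2 + (1-w)^2 * v; minimised at
    # w = v / (v + b^2).
    v, b2 = var_individual, bias_pooled ** 2
    return v / (v + b2)

# Unit-specific estimate 1.4 (noisy, unbiased), pooled estimate 1.0 (biased).
w = optimal_weight(0.04, 0.2)          # v = 0.04, bias = 0.2  ->  w = 0.5
estimate = w * 1.0 + (1 - w) * 1.4
```

More heterogeneity (a larger pooling bias) pushes the weight toward the unit-specific estimate; noisier unit-level data pushes it toward pooling, which is the trade-off the paper formalises.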


5th Aug 2016 - 11:00 am

Venue: Room 2150 H70

Speaker: Dr Christopher Gibbs, School of Economics; University of New South Wales

Title: Overcoming the Forecast Combination Puzzle: Lessons from the Time-Varying Efficiency of Phillips Curve Forecasts of U.S. Inflation

This paper proposes a new dynamic forecast combination strategy for forecasting inflation. The procedure draws on explanations of why the forecast combination puzzle exists and the stylized fact that Phillips curve forecasts of inflation exhibit significant time-variation in forecast accuracy. The forecast combination puzzle is the empirical observation that a simple average of point forecasts is often the best forecasting strategy. The puzzle persists because many dynamic weighting strategies tend to shift weight toward Phillips curve forecasts after they exhibit a significant period of relative forecast improvement, which is often when their forecast accuracy begins to deteriorate. The strategy proposed in this paper weights forecasts according to their expected performance, rather than their past performance, to anticipate these changes in forecast accuracy. The forward-looking approach is shown to robustly beat equally weighted combination forecasts and benchmark univariate forecasts of inflation in real-time out-of-sample exercises on U.S. and New Zealand inflation data.
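The backward-looking weighting that the puzzle literature studies can be sketched in a few lines (toy numbers; the paper's contribution is to replace the past-MSE weights below with expected-performance weights):

```python
# Two forecasters and the realised series (all values invented).
actual = [2.0, 2.2, 2.1, 2.4]
f1 = [1.8, 2.0, 2.3, 2.6]   # e.g. a Phillips-curve forecast
f2 = [2.1, 2.3, 2.0, 2.3]   # e.g. a univariate benchmark

def mse(f, y):
    return sum((a - b) ** 2 for a, b in zip(f, y)) / len(y)

# Inverse-MSE weights computed on *past* performance.
m1, m2 = mse(f1, actual), mse(f2, actual)
w1 = (1 / m1) / (1 / m1 + 1 / m2)

# Next-period point forecasts of 2.5 and 2.4, combined two ways.
combined_next = w1 * 2.5 + (1 - w1) * 2.4
equal_next = 0.5 * 2.5 + 0.5 * 2.4
```

When one model's accuracy is about to deteriorate, inverse-past-MSE weights chase it; the simple average avoids that, which is the essence of the puzzle the paper addresses.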


12th Aug 2016 - 11:00 am

Venue: Room 2150 H70

Speaker: A/Prof Ivan Medovikov, Department of Economics; Brock University; ON, Canada

Title: Can Analysts Predict Volatility?

We assess the investment value of sell-side analyst recommendations from the standpoint of portfolio risk. We match I/B/E/S consensus recommendations issued for U.S.-listed equities during January 2015 with the realized volatility of security returns up to one year from recommendation issue and, using a flexible semiparametric copula model, find recommendation levels to be associated with subsequent changes in return volatility, suggesting that recommendations can help manage portfolio risk. Further, we find this relationship to be asymmetric and most pronounced among best-rated securities. Specifically, best-rated stocks appear to experience the largest volatility declines after recommendation issue. These effects are conditional on recommendation changes.

31st Aug 2016 - 11:00 am

Venue: Room 5050 Abercrombie Business School Bldg H70

Speaker: Professor Bala Rajaratnam, Discipline of Business Analytics; University of Sydney

Title: MCMC-Based Inference in the Era of Big Data: A Fundamental Analysis of the Convergence Complexity of High-Dimensional Chains

Markov chain Monte Carlo (MCMC) lies at the core of modern Bayesian methodology, much of which would be impossible without it. Thus, the convergence properties of Markov chains relevant to MCMC have received significant attention, and in particular, proving (geometric) ergodicity is of critical interest. Nevertheless, current methods do not yield convergence rates sharp enough to permit a meaningful analysis in terms of the dimension of the parameter p and sample size n.  Thus, a clear theoretical characterization of the behavior of modern Markov chains in high dimensions is not available.

In this paper, we first demonstrate that contemporary methods for establishing Markov chain convergence behavior have serious limitations when the dimension grows, such as in the so-called "Big Data" setting. We then employ novel theoretical approaches to rigorously establish the convergence behavior of Markov chains typical of high-dimensional MCMC. Unlike many comparable results in the literature, we obtain exact convergence rates in total variation distance by establishing upper and lower bounds that share the same rate constant in n and p. We are thus able to overcome some of the stated challenges in contemporary research on convergence of MCMC. We also show a universality result for the convergence rate across an entire spectrum of models. We then demonstrate the precise nature and severity of convergence problems that can occur in some important models when implemented in high dimensions, including phase transitions in the convergence rates in various n and p regimes. These convergence problems effectively eliminate the apparent safeguard of geometric ergodicity. We then demonstrate theoretical principles by which Markov chains can be analyzed to yield bounded geometric convergence rates (essentially recovering geometric ergodicity) even as the dimension p grows without bound. Additionally, we propose a diagnostic tool for establishing convergence (or the lack thereof) in high-dimensional MCMC.
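For intuition on exact convergence rates, the two-state chain is the rare case where the total-variation rate is available in closed form (a standard textbook fact, far simpler than the high-dimensional chains analysed in the paper):

```python
# For a two-state chain with transition probabilities a (0 -> 1) and
# b (1 -> 0), the distance to stationarity contracts by exactly
# |1 - a - b| each step, so the convergence rate is known in closed form.
def tv_rate(a, b):
    return abs(1.0 - a - b)

def tv_distance(p0_state0, a, b, t):
    # Total-variation distance of the time-t distribution from the
    # stationary distribution pi = (b, a) / (a + b), starting from
    # P(state 0) = p0_state0.
    pi0 = b / (a + b)
    return abs(p0_state0 - pi0) * tv_rate(a, b) ** t
```

For example, tv_distance(1.0, 0.3, 0.2, t) shrinks geometrically at rate 0.5; the paper's results aim for the same sharp upper-and-lower-bound characterisation when the kernel depends on n and p.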

2nd Sep 2016 - 11:00 am

Venue: Room 2150 H70

Speaker: A/Prof Felix Chan, School of Economics and Finance; Curtin University

Title: The Conditional Duration Models

This seminar presents an overview of modelling the duration between changes in asset price using ultra-high-frequency data. It first focuses on the conventional approach by proposing a Generalized Logarithmic Autoregressive Conditional Duration (GLACD) model to examine the interaction between duration and variation in asset prices. This provides a convenient framework to test statistically the existence of such a relationship. The model is flexible and contains various well-known models as special cases, including the Exponential Generalised Autoregressive Conditional Heteroskedasticity (EGARCH) model of Nelson (1991) and the Logarithmic Autoregressive Conditional Duration (Log-ACD) model of Bauwens and Giot (2000). The paper also obtains theoretical results for the Quasi-Maximum Likelihood Estimator (QMLE) for the proposed model. Specifically, sufficient conditions for consistency and asymptotic normality are derived under mild assumptions. Monte Carlo experiments also provide further support for the theoretical results and demonstrate that the QMLE has reasonably good finite sample performance.


The paper then applies the model to nine different assets from three different asset classes, namely two exchange rates, two commodities and five stocks. The two exchange rates are the Australian Dollar/US Dollar and British Pound/US Dollar; the two commodities are gold and silver; and the five stocks are BHP, Rio Tinto, CBA, ANZ and Apple. The sample spans from 1 March 2010 to 31 May 2010, with the number of observations ranging from 44,178 to 1,109,897. The results show that there is a strong relationship between duration and variation in price changes. The forecast performance of the GLACD model is also compared with the Log-ACD model, and the results show that the proposed model performs better. The attempt to establish the multivariate extension of the proposed model reveals its limitations. The paper will also propose an alternative approach to modelling multivariate conditional duration by extending the Hawkes process.


9th Sep 2016 - 11:00 am

Venue: Room 2150 Abercrombie Bldg H70

Speaker: A/Prof Joshua Chan, Research School of Economics; Australian National University; ACT

Title: Large Bayesian VARs: A Flexible Kronecker Error Covariance Structure

We introduce a class of large Bayesian vector autoregressions (BVARs) that allows for non-Gaussian, heteroscedastic and serially dependent innovations. To make estimation computationally tractable, we exploit a certain Kronecker structure of the likelihood implied by this class of models. We propose a unified approach for estimating these models using Markov chain Monte Carlo (MCMC) methods. In an application that involves 20 macroeconomic variables, we find that these BVARs with more flexible covariance structures outperform the standard variant with independent, homoscedastic Gaussian innovations in both in-sample model fit and out-of-sample forecast performance.
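The computational payoff of a Kronecker-structured likelihood comes from identities like (A ⊗ B) vec(X) = vec(B X Aᵀ), which avoids ever forming the large Kronecker product (a generic linear-algebra illustration, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((4, 4))
X = rng.standard_normal((4, 3))

# Naive: build the 12x12 Kronecker product explicitly.
naive = np.kron(A, B) @ X.reshape(-1, order="F")
# Kronecker identity: (A kron B) vec(X) = vec(B X A^T) -- no big matrix.
fast = (B @ X @ A.T).reshape(-1, order="F")
```

For a BVAR with many variables and lags the explicit Kronecker matrix would be enormous, so identities of this kind are what keep MCMC estimation tractable.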

14th Oct 2016 - 11:00 am

Venue: Rm 2150 Abercrombie Bldg H70

Speaker: Prof Richard Gerlach, Discipline of Business Analytics; University of Sydney

Title: Bayesian Semi-parametric Realized-CARE Models for Tail Risk Forecasting Incorporating Range and Realized Measures

A new framework called Realized Conditional Autoregressive Expectile (Realized-CARE) is proposed, through incorporating a measurement equation into the conventional CARE model, in a manner analogous to the Realized-GARCH model. Realized measures (e.g. Realized Variance and Realized Range) are employed as the dependent variable in the measurement equation and to drive expectile dynamics. The measurement equation here models the contemporaneous dependence between the realized measure and the latent conditional expectile. For this model, the usual search procedure and optimisation to estimate the expectile level proves challenging with respect to convergence. We incorporate a fast MCMC approach, combined with a targeted search based on a quadratic approximation, to allow reasonably fast and accurate estimation of this level parameter. The usual Asymmetric Least Squares method also proves inaccurate and difficult to make converge; thus Bayesian adaptive Markov chain Monte Carlo methods are proposed for estimation. Furthermore, the methods of sub-sampling and scaling are applied to the realized measures, to help deal with the inherent microstructure noise of the realized volatility measures. In a real forecasting study applied to 7 market indices and 2 individual assets, one-day-ahead Value-at-Risk and Expected Shortfall forecasting results favor the proposed Realized-CARE model, especially the version incorporating the Realized Range and the sub-sampled Realized Range, over a range of competing tail risk models and methods.
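For readers unfamiliar with expectiles: the tau-expectile solves an asymmetrically weighted least-squares problem and can be computed by iterative reweighting (a standard sample version, not the model's latent conditional expectile):

```python
def expectile(xs, tau, iters=100):
    # Fixed-point iteration for the tau-expectile: the value m solving
    # tau * sum_{x > m} (x - m) = (1 - tau) * sum_{x <= m} (m - x).
    m = sum(xs) / len(xs)
    for _ in range(iters):
        w = [tau if x > m else 1.0 - tau for x in xs]
        m = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return m

data = [1.0, 2.0, 3.0, 10.0]
mid = expectile(data, 0.5)    # the 0.5-expectile is the sample mean
```

Here the 0.5-expectile is the mean (4.0), while tau = 0.9 gives 8.0, pulled toward the right tail; this tail sensitivity is what makes expectiles useful for Value-at-Risk and Expected Shortfall.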

28th Oct 2016 - 11:00 am

Venue: Rm 2150 Abercrombie Bldg H70

Speaker: Dr Laurent Pauwels, Discipline of Business Analytics; The University of Sydney

Title: Fundamental Moments

Global trade can give rise to global hubs, which find themselves at the center of the global economy. A hub is a center of activity, geographic or sectoral, whose influence on the global economy is large enough that local disturbances have observable consequences in the aggregate. This paper investigates the existence, nature, and rise of such hubs through a model of worldwide input-output linkages. The model is fitted to the World Input-Output Database (WIOD) to evaluate the importance of vertical trade in creating global hubs with significant effects on the volatility of countries' cycles, and their international correlation. Our results suggest that worldwide granularity has increased sizeably since 1995, with significant consequences on GDP volatility and co-movements, especially in developed countries. The increase in granularity, in turn, is well explained by international trade.


*joint work with Jean Imbs, Paris School of Economics (CNRS)
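The notion of a hub can be made concrete with a tiny Leontief-style input-output example (a stylised 3-sector economy with invented coefficients, not the WIOD calibration):

```python
import numpy as np

# Toy 3-sector input-output matrix: entry A[i, j] is the amount of sector
# i's output used to produce one unit of sector j's output (illustrative).
A = np.array([
    [0.1, 0.3, 0.3],   # sector 0 supplies everyone: a candidate "hub"
    [0.1, 0.1, 0.0],
    [0.0, 0.1, 0.1],
])
# Leontief inverse: total (direct + indirect) requirements per unit of demand.
L = np.linalg.inv(np.eye(3) - A)
# A hub's row sum is large: its output is drawn on throughout the economy,
# so local disturbances there propagate to the aggregate.
influence = L.sum(axis=1)
hub = int(np.argmax(influence))
```

Sector 0's large total-requirements row sum marks it as the hub: shocks to it do not wash out in the aggregate, which is the granularity effect the paper measures at world scale.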


4th Nov 2016 - 11:00 am

Venue: Rm 2150 Abercrombie Bldg H70

Speaker: Prof Gael Martin, Department of Econometrics; Monash University

Title: Asymptotic Properties of Approximate Bayesian Computation

Approximate Bayesian computation (ABC) is becoming an accepted tool for statistical analysis in models with intractable likelihoods. With the initial focus being primarily on the practical import of ABC, exploration of its formal statistical properties has begun to attract more attention. In this paper we consider the asymptotic behaviour of the posterior obtained from ABC and the ensuing posterior mean. We give general results on: (i) the rate of concentration of the ABC posterior on sets containing the true parameter (vector); (ii) the limiting shape of the posterior; and (iii) the asymptotic distribution of the ABC posterior mean. These results hold under given rates for the tolerance used within ABC, mild regularity conditions on the summary statistics, and a condition linked to identification of the true parameters. Important implications of the theoretical results for practitioners of ABC are highlighted.

*joint work with David Frazier, Monash University; Judith Rousseau, CREST, University of Paris Dauphine; Christian Robert, CREST, University of Paris Dauphine and University of Warwick
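The basic rejection-ABC algorithm whose asymptotics the paper studies fits in a few lines (a toy Gaussian-mean example with invented data; the tolerance, prior and summary statistic are all illustrative):

```python
import random

random.seed(0)
# Rejection ABC for the mean of a N(theta, 1) model; summary = sample mean.
obs = [1.2, 0.8, 1.1, 0.9, 1.0]
s_obs = sum(obs) / len(obs)     # observed summary statistic
tol = 0.05                      # ABC tolerance (the epsilon in the theory)

accepted = []
for _ in range(20000):
    theta = random.uniform(-5.0, 5.0)              # draw from the prior
    sim = [random.gauss(theta, 1.0) for _ in obs]  # simulate a data set
    if abs(sum(sim) / len(sim) - s_obs) < tol:     # keep if summaries match
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)
```

The tolerance plays the role of the rate-controlled epsilon in the paper: as it shrinks (with enough simulations), the ABC posterior concentrates on the true parameter, at rates the paper characterises.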

11th Nov 2016 - 11:00 am

Venue: Room 4202 Abercrombie Bldg H70

Speaker: Prof Michael Smith, Melbourne Business School; University of Melbourne

Title: From Amazon to Apple: Modelling Online Retail Sales, Purchase Incidence and Visit Behavior

In this study, we propose a multivariate stochastic model for Web site visit duration, page views, purchase incidence, and the sale amount for online retailers. The model is constructed by composition from carefully selected distributions and involves copula components. It allows for the strong nonlinear relationships between the sales and visit variables to be explored in detail, and can be used to construct sales predictions.

The model is readily estimated using maximum likelihood, making it an attractive choice in practice given the large sample sizes that are commonplace in online retail studies. We examine a number of top-ranked U.S. online retailers, and find that the visit duration and the number of pages viewed are both related to sales, but in very different ways for different products. Using Bayesian methodology, we show how the model can be extended to a finite mixture model to account for consumer heterogeneity via latent household segmentation. The model can also be adjusted to accommodate a more accurate analysis of online retailers, such as Apple, that sell products at a very limited number of price points. In a validation study across a range of different Web sites, we find that the purchase incidence and sales amount are both forecast more accurately using our model, when compared to regression, probit regression, a popular data-mining method, and a survival model employed previously in an online retail study.


2nd Dec 2016 - 11:00 am

Venue: Rm 5070 Abercrombie Bldg H70

Speaker: Prof Chia-Lin Chang, Professor of Economics; National Chung Hsing University; Taiwan

Title: International Technology Diffusion of Joint and Cross-Border Patents

With the advent of globalization, economic and financial interactions among countries have become widespread. Given technological advancements, the factors of production can no longer be considered to be just labor and capital. In the pursuit of economic growth, every country has sensibly invested in international cooperation, learning, innovation, technology diffusion and knowledge. In this paper, we use a panel data set of 40 countries from 1981 to 2008 and a negative binomial model, using a novel set of cross-border patents and joint patents as proxy variables for technology diffusion, in order to investigate such diffusion. The empirical results suggest that, if it is desired to shift from foreign to domestic technology, it is necessary to increase expenditure on R&D for business enterprises and higher education, exports and technology. If the focus is on increasing bilateral technology diffusion, it is necessary to increase expenditure on R&D for higher education and technology.