
Peter Radchenko

PhD Yale
Associate Professor

Rm 4158
H70 - Abercrombie Building
The University of Sydney
NSW 2006 Australia

Telephone +61 2 9351 3835
peter.radchenko@sydney.edu.au

Bio

Peter Radchenko is an Associate Professor of Business Analytics at the University of Sydney Business School. Prior to joining the University of Sydney in 2017, he held academic positions at the University of Chicago and at the Marshall School of Business at the University of Southern California. Peter has a PhD in Statistics from Yale University and an undergraduate degree in Mathematics and Applied Mathematics from Lomonosov Moscow State University.

Peter Radchenko's primary research focus is on developing new methodology for dealing with massive and complex modern data. Such large-scale problems fall under the general framework of high-dimensional statistics and statistical machine learning, which are the main areas of Peter's research. In particular, Peter has done extensive work in high-dimensional regression, where the number of predictors is large relative to the number of observations. He has also worked on problems in large-scale cluster analysis, including estimating the number of clusters and feature screening. Another area of Peter's research is functional data analysis, in which the measurements of a function or curve are treated as a single observation of the function as a whole. Peter's research papers have been published in the Journal of the Royal Statistical Society, the Annals of Statistics, the Journal of the American Statistical Association, Biometrika, and the Annals of Applied Statistics.

Research Interests

Peter Radchenko’s research focuses on developing and analysing novel methodology for dealing with massive and complex modern data. Fields ranging from finance, marketing and economics to image analysis, signal processing, data compression and computational biology now share the common challenge of extracting information from vast, noisy data sets. The age of Big Data has created an abundance of interesting problems, posing new challenges not present in conventional data analysis. Such large-scale problems fall under the general framework of high-dimensional statistics and statistical machine learning, the primary areas of Peter Radchenko’s research. His main focus has been on problems in high-dimensional regression, convex clustering and functional data analysis.

Peter Radchenko’s work on high-dimensional regression involves fitting models and performing variable selection in settings where the number of predictors is large relative to the number of observations. His methodological work covers a wide range of topics, including linear and nonlinear additive models, nonlinear interaction models, generalized linear models, and single index models.
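The kind of variable-selection problem described above can be illustrated with the standard lasso (this is a generic sketch using scikit-learn, not Professor Radchenko's own methodology; the dimensions, signal strength and penalty level are all hypothetical):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 200                 # many more predictors than observations
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0                 # only the first five predictors matter
y = X @ beta + 0.5 * rng.standard_normal(n)

# the L1 penalty shrinks most coefficients exactly to zero,
# so fitting the model performs variable selection at the same time
model = Lasso(alpha=0.3).fit(X, y)
selected = np.flatnonzero(model.coef_)
```

Even though ordinary least squares is not well defined here (p > n), the penalized fit returns a small set of selected predictors.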

One serious limitation of traditional clustering methods, such as k-means, is the non-convexity of the corresponding optimization problems. Peter Radchenko has worked on developing and analysing highly scalable convex clustering approaches that can handle massive amounts of data. His recent papers focus on estimating the number of clusters and on feature screening in large-scale cluster analysis.
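A toy sketch of the convex clustering idea: each observation gets its own fitted centroid, and an L1 fusion penalty pulls centroids together so that fused centroids define the clusters. This one-dimensional example with a generic solver is only illustrative of the convex objective, not of the scalable algorithms in the papers; the data and penalty level are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# toy one-dimensional data: two well-separated groups of three points
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
lam = 0.4

def objective(u):
    # convex clustering objective: squared-error fit term plus an
    # L1 fusion penalty over all pairs of fitted centroids
    fit = 0.5 * np.sum((u - x) ** 2)
    fusion = sum(abs(u[i] - u[j])
                 for i in range(len(u)) for j in range(i + 1, len(u)))
    return fit + lam * fusion

res = minimize(objective, x.copy(), method="Powell")
u = res.x  # centroids within each group are pulled toward a common value
```

Because the objective is convex, any local solver finds the global solution, in contrast to k-means, which can get trapped in poor local optima.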

The key principle of the Functional Data Analysis field is to treat the measurements of a function or curve not as multiple data points, but as a single observation of the function as a whole. This approach allows one to more fully exploit the structure of the data. The infinite dimensional nature of functional data makes it critical to reduce the dimension of the predictor data before fitting a regression model. Most existing methods utilize an unsupervised approach, such as functional principal component analysis. The novel methodology developed by Peter Radchenko and his collaborators performs the dimension reduction in a supervised fashion, taking the response information into account.
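As context for the supervised approach described above, the conventional unsupervised pipeline it improves on can be sketched as follows: extract functional principal component scores from the curves, then regress the response on those scores. All curves, bases and noise levels here are synthetic and illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
grid = np.linspace(0, 1, 50)          # each curve observed on a common grid
scores = rng.standard_normal((n, 2))  # true latent scores driving the curves
basis = np.vstack([np.sin(2 * np.pi * grid), np.cos(2 * np.pi * grid)])
curves = scores @ basis + 0.05 * rng.standard_normal((n, 50))
y = 2 * scores[:, 0] - scores[:, 1] + 0.1 * rng.standard_normal(n)

# unsupervised dimension reduction: principal components of the curves
centered = curves - curves.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
pc_scores = centered @ Vt[:2].T       # first two functional PC scores

# regress the response on the PC scores
A = np.column_stack([np.ones(n), pc_scores])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

The key point is that the principal components are chosen without looking at the response; a supervised method instead picks directions that are predictive of y, which matters when the dominant modes of variation in the curves are not the ones related to the response.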

A recent direction of Peter Radchenko’s research takes advantage of impressive advances in mixed integer optimisation and modern optimisation techniques to solve certain classes of discrete problems arising in statistics. Together with his collaborators, he has developed novel mixed integer optimisation based approaches for fitting sparse high-dimensional linear and nonlinear additive models. He has also worked on the problem of best subset selection in low-signal high-dimensional regimes. His areas of interest for current and future research include high-dimensional nonlinear regression models with shape constraints and sparse generalized linear models.
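Best subset selection asks for the size-k subset of predictors with the smallest residual sum of squares; the mixed integer optimisation work cited above makes this tractable at scale with solvers such as Gurobi. A brute-force version on a tiny hypothetical problem shows what is being computed (illustrative only, and exponentially slow beyond small p):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n, p, k = 40, 8, 2
X = rng.standard_normal((n, p))
# only columns 0 and 1 carry signal in this toy example
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.standard_normal(n)

best_rss, best_support = np.inf, None
for support in combinations(range(p), k):
    Xs = X[:, support]
    beta_hat, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta_hat
    rss = resid @ resid
    if rss < best_rss:
        best_rss, best_support = rss, support

# with this strong, low-noise signal the search recovers columns 0 and 1
```

Mixed integer formulations solve the same combinatorial problem without enumerating all subsets, by encoding the sparsity constraint with binary variables and letting the solver prune the search tree.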

2017

Journal Articles

Radchenko P and Mukherjee G 2017 'Convex Clustering via L-1 Fusion Penalization', Journal of the Royal Statistical Society, Series B (Statistical Methodology), vol.79:5, pp. 1527-46

Mazumder R and Radchenko P 2017 'The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization', IEEE Transactions on Information Theory, vol.63:5, pp. 3053-75

Banerjee T, Mukherjee G and Radchenko P 2017 'Feature screening in large scale cluster analysis', Journal of Multivariate Analysis, vol.161, pp. 191-212

2015

Journal Articles

Fan Y, James G and Radchenko P 2015 'Functional Additive Regression', Annals of Statistics, vol.43:5, pp. 2296-325

Radchenko P 2015 'High Dimensional Single Index Models', Journal of Multivariate Analysis, vol.139, pp. 266-82

Radchenko P, Qiao X and James G 2015 'Index Models for Sparsely Sampled Functional Data', Journal of the American Statistical Association, vol.110:510, pp. 824-36

2011

Journal Article

Radchenko P and James G 2011 'Improved Variable Selection with Forward-Lasso Adaptive Shrinkage', Annals of Applied Statistics, vol.5:1, pp. 427-48

2010

Journal Article

Radchenko P and James G 2010 'Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions', Journal of the American Statistical Association, vol.105:492, pp. 1541-53

2009

Journal Articles

James G and Radchenko P 2009 'A Generalized Dantzig Selector with Shrinkage Tuning', Biometrika, vol.96:2, pp. 323-37

James G, Radchenko P and Lv J 2009 'DASSO: Connections between the Dantzig Selector and Lasso', Journal of the Royal Statistical Society, Series B (Statistical Methodology), vol.71:1, pp. 127-42

2008

Journal Articles

James G and Radchenko P 2008 'Invited discussion of "Sure Independence Screening for Ultrahigh Dimensional Feature Space"', Journal of the Royal Statistical Society, Series B (Statistical Methodology), vol.70:5, pp. 895-6

Radchenko P 2008 'Mixed-rates Asymptotics', Annals of Statistics, vol.36:1, pp. 287-309

Radchenko P and James G 2008 'Variable Inclusion and Shrinkage Algorithms', Journal of the American Statistical Association, vol.103:483, pp. 1304-15

2006

Journal Article

Pollard D and Radchenko P 2006 'Nonlinear Least-Squares Estimation', Journal of Multivariate Analysis, vol.97:2, pp. 548-62

2005

Conference Proceeding

Radchenko P 2005 'Reweighting the Lasso', Proceedings of the American Statistical Association

1999

Book Section

Afanasyeva L and Radchenko P 1999 'On Homogeneity of Two Semi-Markov Samples' in Semi-Markov Models and Applications, ed. J Janssen & N Limnios, Kluwer, Dordrecht, Netherlands, pp. 187-99