My research in Statistics revolves around several topics in Dimension Reduction (the topics are listed in no particular order; references can be found in my Publications list):

  1. SVM-based Sufficient Dimension Reduction: I have been working on a number of topics in this area since my Ph.D. years, when, together with my supervisor (Prof. Bing Li) and Dr. Lexin Li, I introduced what I now call SVM-based SDR methodology (SVM stands for Support Vector Machines). We introduced a unified framework for linear and nonlinear feature extraction (see Li, Artemiou and Li (2011) for all the details). This work was extended in papers such as Artemiou and Dong (2016) and Shin and Artemiou (2017). Nowadays, I am working on a number of other projects:
    • Robustifying SVM-based SDR algorithms. There are many ideas that can be used to robustify SVM-based SDR methodology; many of them come from the classification framework in which this methodology was originally introduced. I have one paper accepted (Artemiou 2019, Statistics), one paper currently submitted, and another two in progress. This is also the topic Prof. Alexandros Karagrigoriou and I will focus on in supervising Mr. Kimon Ntotsis's Ph.D. dissertation.
    • Introducing real-time SVM-based SDR algorithms. This is a topic on which I have a couple of papers in progress with my collaborators Dr. Y. Dong (Temple University) and Dr. S. J. Shin (Korea University). The basic idea is to find a way to update our estimate of the reduced subspace in real time. This is essential, as computing power nowadays allows for the continuous collection of data. The first paper on this was accepted in November 2020 by Pattern Recognition; see Artemiou, Dong and Shin (2022+). With Dr. Shin we have submitted another paper, and I also have an MPhil student, Matthew Hoare, working on this topic.
    • Using SVM-based SDR methodology in very high-dimensional problems. Motivated by observations of the problematic behaviour of SVM in very high dimensions, we develop algorithms which help improve the performance of SVM-based SDR in very high-dimensional settings. We have an accepted paper with my Ph.D. student Hayley Randall (see Randall, Artemiou and Qiao (2021)) and another work in progress.
    • Post-dimension-reduction inference: With my collaborator Prof. Gerda Claeskens (KU Leuven), I am looking to address the problem of performing valid inference after dimension reduction.
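The core idea underlying the SVM-based SDR projects above can be sketched in a few lines: slice the response at several cut-offs, fit a linear SVM separating "low" from "high" responses at each cut-off, and extract the leading eigenvectors of the aggregated SVM normal vectors. This is only a simplified toy illustration (standardized predictors, arbitrary quantile cut-offs), not the published estimator of Li, Artemiou and Li (2011).

```python
import numpy as np
from sklearn.svm import LinearSVC

def psvm_directions(X, y, quantiles=(0.25, 0.5, 0.75), d=1):
    """Sketch of SVM-based SDR: for each response cut-off, fit a linear
    SVM separating low/high responses; the SVM normal vectors span an
    estimate of the dimension reduction subspace."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize predictors
    M = np.zeros((X.shape[1], X.shape[1]))
    for q in quantiles:
        labels = (y > np.quantile(y, q)).astype(int)
        w = LinearSVC(C=1.0, max_iter=20000).fit(Z, labels).coef_.ravel()
        M += np.outer(w, w)                       # aggregate normal vectors
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, -d:]                           # leading eigenvector(s)

# toy single-index model: y depends on X only through X @ beta
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = X @ beta + 0.2 * rng.normal(size=500)
b_hat = psvm_directions(X, y).ravel()
print(abs(b_hat @ beta))  # close to 1 when the direction is recovered
```

The aggregation step is what distinguishes SDR from plain classification: each SVM sees only one dichotomized version of the response, but their normal vectors jointly span the reduced subspace.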
  2. Inverse-moment-based Sufficient Dimension Reduction: Inverse-moment-based methodology is the oldest class of algorithms in Sufficient Dimension Reduction (SDR), and these methods are still considered the first go-to choice when developing new methodology, mainly due to their theoretical and computational simplicity. I am working on a number of topics in this framework, which can be grouped as follows:
    • Robustifying inverse-moment-based SDR algorithms. We are working on different ideas for robustifying inverse-moment-based SDR algorithms. We currently have a paper with my former CUROP student Stephen Babos (Babos and Artemiou, 2020) and a paper in progress.
    • Sufficient dimension reduction in the presence of categorical predictors. A number of methods have been developed in the literature on Sufficient Dimension Reduction in the presence of categorical predictors. With my Ph.D. student Ben Jones, we are looking to expand these ideas in a number of different directions.
    • Sufficient dimension reduction for time series. We are trying to extend recent work on applying dimension reduction techniques to time series data. We have a paper in progress with my former BSc student Mr. Hector Haffenden.
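As a concrete example of the inverse-moment idea, here is a minimal numpy sketch of Sliced Inverse Regression (SIR), the classic inverse-moment-based SDR method: the inverse regression curve E[X | y] is approximated by within-slice means of the standardized predictors, and the principal directions of those means estimate the reduced subspace. This is a textbook illustration, not code from any of the papers mentioned above.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Sliced Inverse Regression: eigen-decompose the (weighted)
    covariance of slice means of the whitened predictors."""
    n, p = X.shape
    # whiten X with the sample covariance
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(np.linalg.inv(Sigma))
    Z = (X - mu) @ L                       # cov(Z) is the identity
    # slice on the order statistics of y and average Z within slices
    order = np.argsort(y)
    M = np.zeros((p, p))
    for chunk in np.array_split(order, n_slices):
        m = Z[chunk].mean(axis=0)
        M += (len(chunk) / n) * np.outer(m, m)
    vals, vecs = np.linalg.eigh(M)
    B = L @ vecs[:, -d:]                   # back to the original X scale
    return B / np.linalg.norm(B, axis=0)

# toy model: y depends on X only through a single index X @ beta
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4))
beta = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
y = np.exp(X @ beta) + 0.1 * rng.normal(size=1000)
b_hat = sir_directions(X, y).ravel()
print(abs(b_hat @ beta))  # near 1: SIR recovers the single index
```

The simplicity is visible here: one pass over the data, slice means, and a p x p eigen-decomposition, which is why inverse-moment methods remain the natural baseline when developing new SDR methodology.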
  3. Methods that apply to both the inverse-moment-based and the SVM-based classes of SDR algorithms.
    • Flexible Sufficient Dimension Reduction: Flexible sufficient dimension reduction is essentially an algorithmic framework that lets you choose the most suitable class of dimension reduction algorithms for a given problem. This is currently work in progress.
    • HDLSS problems: High-dimension, low-sample-size (large p, small n) problems pose a great challenge for SDR algorithms. With my collaborator Eugen Pircalabelu (UCLouvain), we are currently working on a couple of ideas to address this issue in different SDR algorithms. We have submitted two papers on this topic; one is already accepted (see Pircalabelu and Artemiou (2022+)).
  4. Dimension Reduction for Exponential Family Principal Component Analysis (PCA). With partial funding from the University Health Board, my Ph.D. student Luke Smallman found that only a limited number of methods exist to perform dimension reduction on text data. Text data cannot be assumed Gaussian and are more appropriately modelled using the Poisson distribution. They are also very high-dimensional (if each word is treated as a separate variable), so sparse dimension reduction methods are needed. We have extended the previous literature using appropriate penalties to induce sparsity. One method was published in Pattern Recognition (Smallman, Artemiou, Morgan, 2018) and one in Computational Statistics (Smallman, Underwood, Artemiou, 2019). We also have an accepted literature review (Smallman and Artemiou, 2022). Finally, we are looking into creating a unified framework for exponential family PCA algorithms using different ideas.
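The Poisson modelling point can be illustrated with a small sketch: factorizing a count matrix by fitting a Poisson likelihood, which amounts to matrix factorization under the generalized KL divergence rather than least squares. This toy version uses classic multiplicative updates and omits the sparsity penalties developed in the papers above; it only illustrates why a Gaussian/least-squares PCA objective is replaced by an exponential-family one for count data.

```python
import numpy as np

def poisson_factorize(X, d=2, n_iter=200, eps=1e-9):
    """Rank-d factorization X ~ W @ H fitted under the Poisson likelihood
    (generalized KL divergence), via multiplicative updates.
    Suitable for non-negative count data such as word counts."""
    rng = np.random.default_rng(0)
    n, p = X.shape
    W = rng.uniform(0.1, 1.0, size=(n, d))
    H = rng.uniform(0.1, 1.0, size=(d, p))
    for _ in range(n_iter):
        R = X / (W @ H + eps)                    # data-to-fit ratio
        W *= (R @ H.T) / (H.sum(axis=1) + eps)
        R = X / (W @ H + eps)
        H *= (W.T @ R) / (W.sum(axis=0)[:, None] + eps)
    return W, H

def kl_loss(X, W, H, eps=1e-9):
    F = W @ H + eps
    return np.sum(F - X * np.log(F))             # Poisson deviance up to a constant

# toy "document-term" count matrix generated from two latent topics
rng = np.random.default_rng(2)
X = rng.poisson(rng.gamma(2.0, size=(60, 2)) @ rng.gamma(2.0, size=(2, 30)))
W, H = poisson_factorize(X)
print(kl_loss(X, W, H) < kl_loss(X, *poisson_factorize(X, n_iter=1)))  # True
```

The sparse exponential-family PCA methods in the papers above add penalties on the loadings to this kind of likelihood objective; the sketch shows only the unpenalized count-data backbone.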
  5. Dimension Reduction for Tensors. With my collaborator Dr. Joni Virta (Turku University), we are working on a number of different ideas for tensors, in both unsupervised and supervised settings. We have submitted a paper on PCA for discrete count data.
  6. Envelope models. With my Ph.D. student Alya Alzahrani, we are considering a number of topics to work on in this area. We have developed an envelope-based SVM approach, and we have a couple of papers in progress.
  7. Philosophical research on PCA. Following three papers with my Ph.D. supervisor Prof. Li (Artemiou and Li, 2009-2013) and my current Ph.D. student Ben Jones (Jones and Artemiou, 2018+) on the appropriateness of PCA as a dimension reduction tool in linear regression, nonlinear regression and functional predictor settings, we have recently further explored the appropriateness of kernel PCA as a nonlinear feature extraction tool in a regression setting (see Jones, Artemiou and Li (2020) and Jones and Artemiou (2021)). We have another paper in progress with Ben. Moreover, we are trying to see whether a different measure of association (other than correlation) can be used in this framework (see Artemiou (2021)).
  8. Classification algorithms. With a number of students (mainly MSc and BSc project students), I am trying to propose new classification algorithms that improve on the existing classification literature. Currently we have work in progress with my former BSc student Michalis Panayides.
  9. Multicollinearity in regression. With my collaborator Alex Karagrigoriou and the Ph.D. student we are co-supervising, Kimon Ntotsis, we have published a paper proposing a criterion to identify multicollinearity between predictors in regression (see Ntotsis, Karagrigoriou and Artemiou (2021)).
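For background, the classical diagnostic in this area is the variance inflation factor (VIF); the criterion proposed in Ntotsis, Karagrigoriou and Artemiou (2021) is a different measure, but a VIF sketch shows the kind of quantity such criteria target. This is a generic illustration, not the paper's method.

```python
import numpy as np

def vif(X):
    """Variance inflation factors: VIF_j = 1 / (1 - R_j^2), where R_j^2
    comes from regressing column j on all other columns (plus intercept)."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ coef
        r2 = 1 - resid.var() / X[:, j].var()
        out[j] = 1.0 / (1.0 - r2)
    return out

rng = np.random.default_rng(3)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = x1 + x2 + 0.05 * rng.normal(size=200)   # nearly collinear with x1, x2
X = np.column_stack([x1, x2, x3])
print(vif(X))  # all three VIFs are large, flagging the collinearity
```

A common rule of thumb flags VIF values above 5 or 10; criteria such as the one in the paper aim to detect the same phenomenon from a different angle.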