My research in Statistics revolves around various topics in Dimension Reduction (the topics below are in no particular order; references can be found in my Publications list):
- SVM-based Sufficient Dimension Reduction: I have been working on a number of topics in this area since my Ph.D. years, when, together with my supervisor (Prof Bing Li) and Dr Lexin Li, I introduced what I now call the SVM-based SDR methodology (SVM stands for Support Vector Machines). We proposed a unified framework for linear and nonlinear feature extraction (see Li, Artemiou and Li (2011) for the details); a schematic sketch of the core algorithm appears after this list. This work was extended in papers such as Artemiou and Dong (2016) and Shin and Artemiou (2017). Nowadays, I am working on a number of other projects:
  - Robustifying SVM-based SDR algorithms. Many ideas can be used to robustify the SVM-based SDR methodology; a lot of them come from the classification framework in which SVMs were originally introduced. I have one paper accepted (Artemiou (2019), Statistics).
  - Introducing real-time SVM-based SDR algorithms. This is a topic on which I have a couple of papers in progress with my collaborators Dr Y. Dong (Temple University) and Dr S. J. Shin (Korea University). The basic idea is to update the estimate of the reduced subspace in real time as new observations arrive, which is essential now that computing power allows for the continuous collection of data (an illustrative streaming sketch also appears after this list). The first paper on this was published (see Artemiou, Dong and Shin (2022), Pattern Recognition) and a follow-up paper was published in 2023 (see Jang, Shin and Artemiou (2023), Computational Statistics and Data Analysis). My Ph.D. student Aeron Hoare is currently working on this topic.
  - Using SVM-based SDR methodology in very high-dimensional problems. Motivated by the problematic behaviour of SVMs in very high dimensions, we develop algorithms that improve the performance of SVM-based SDR in such settings. We have a paper accepted with my former MPhil student Hayley Randall (see Randall, Artemiou and Qiao (2021), Scandinavian Journal of Statistics) and another work in progress.
  - Post-dimension-reduction inference. With my collaborator Prof Gerda Claeskens (KU Leuven), I am looking to address the problem of performing valid inference after dimension reduction.
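To make the framework concrete, here is a minimal Python sketch of a linear principal-SVM-type estimator in the spirit of Li, Artemiou and Li (2011): the response is sliced at sample quantiles, one soft-margin linear SVM is fitted per dividing point, and the leading eigenvectors of the aggregated normal vectors estimate the reduced subspace. The function name, slicing scheme and tuning defaults are illustrative assumptions, not the published implementation.

```python
import numpy as np
from sklearn.svm import LinearSVC

def psvm_directions(X, y, n_slices=5, d=1, C=1.0):
    """Illustrative linear SVM-based SDR (in the spirit of Li, Artemiou
    and Li (2011)); a sketch, not the published algorithm."""
    n, p = X.shape
    # Standardize the predictors, as is usual in SDR.
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    A = np.linalg.inv(np.linalg.cholesky(Sigma)).T   # A @ A.T = Sigma^{-1}
    Z = (X - mu) @ A
    M = np.zeros((p, p))
    # Fit one binary SVM at each dividing point of the response.
    for q in np.quantile(y, np.linspace(0, 1, n_slices + 1)[1:-1]):
        labels = np.where(y <= q, -1, 1)
        w = LinearSVC(C=C, loss="hinge", max_iter=10000).fit(Z, labels).coef_.ravel()
        M += np.outer(w, w)                          # aggregate the normal vectors
    _, vecs = np.linalg.eigh(M)                      # eigenvalues in ascending order
    return A @ vecs[:, -d:]                          # top-d directions on the original scale
```

The nonlinear version of the framework replaces the linear SVM with a kernel SVM, so that one extracts functions of the predictors rather than linear directions.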
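The real-time project mentioned above can be illustrated with a deliberately simplified streaming variant: a running candidate matrix is accumulated batch by batch, so the subspace estimate can be refreshed without revisiting old data. This is only a sketch of the kind of update being studied; it is not the algorithm of Artemiou, Dong and Shin (2022), and the class and method names below are hypothetical.

```python
import numpy as np
from sklearn.svm import LinearSVC

class StreamingSDR:
    """Hypothetical streaming variant of SVM-based SDR, for illustration only."""

    def __init__(self, p, d=1, C=1.0):
        self.M = np.zeros((p, p))    # running candidate matrix
        self.d, self.C = d, C

    def partial_fit(self, X_batch, y_batch):
        # One SVM direction per incoming batch, split at the batch median
        # (assumes y_batch is not constant within the batch).
        labels = np.where(y_batch <= np.median(y_batch), -1, 1)
        w = LinearSVC(C=self.C, loss="hinge", max_iter=10000).fit(X_batch, labels).coef_.ravel()
        self.M += np.outer(w, w)     # update without revisiting old data
        return self

    def directions(self):
        _, vecs = np.linalg.eigh(self.M)
        return vecs[:, -self.d:]     # current estimate of the reduced subspace
```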
- Inverse-moment-based Sufficient Dimension Reduction: Inverse-moment-based methodology is the oldest class of algorithms in the SDR literature, and these methods are still the first go-to choices when developing new methodology, mainly due to their theoretical and computational simplicity (a sketch of the prototypical method, sliced inverse regression, appears after this list). I am working on a number of topics in this framework, which can be grouped as follows:
  - Robustifying inverse-moment-based SDR algorithms. We are exploring different ideas for robustifying these algorithms. We currently have two papers with my former CUROP student Stephen Babos (Babos and Artemiou (2020), Statistical Methods and Applications; and (2022), Stats).
  - Sufficient dimension reduction in the presence of categorical predictors. A number of methods have been developed in the literature for SDR in the presence of categorical predictors. We have one paper submitted with my former Ph.D. student Ben Jones.
  - Sufficient dimension reduction for time series. We are trying to extend recent work on applying dimension reduction techniques to time series data. We have a paper published with my former BSc student Hector Haffenden (Stat (2024)), and my Ph.D. student Amal Alqarni is currently working on extending these results.
  - Sufficient dimension reduction for text data. After working extensively on PCA methods for text data, we also worked on SDR methods for text data. One method appeared in Luke Smallman's thesis, and another was developed with my student Amarjit Gaba, with whom I have a paper published (Statistics and Its Interface (2025)).
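As a reference point for this class, below is a minimal sketch of sliced inverse regression (Li, 1991), the prototypical inverse-moment method: slice the observations on the response, average the standardized predictors within each slice, and take the leading eigenvectors of the weighted covariance of the slice means. The function name and defaults are illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Sliced inverse regression: eigen-analysis of Cov(E[Z|Y]) for
    standardized predictors Z (a textbook sketch, illustrative defaults)."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    A = np.linalg.inv(np.linalg.cholesky(Sigma)).T   # A @ A.T = Sigma^{-1}
    Z = (X - mu) @ A
    M = np.zeros((p, p))
    # Slice on the order of the response and average Z within each slice.
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)         # weighted slice means
    _, vecs = np.linalg.eigh(M)
    return A @ vecs[:, -d:]                          # back on the original X scale
```

The computational appeal is visible in the sketch: only slice means and a single eigendecomposition are needed, which is why these methods remain the natural starting point for new SDR proposals.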
- Methods that apply to both of the large classes above, the inverse-moment-based and the SVM-based SDR algorithms:
  - Flexible Sufficient Dimension Reduction: The flexible SDR idea is essentially an algorithm that lets you choose the most appropriate framework from which to select a dimension reduction method. This is currently work in progress.
  - HDLSS problems: High-dimension, low-sample-size (large p, small n) problems pose a great challenge for SDR algorithms. With my collaborator Eugen Pircalabelu (UCLouvain), I am currently working on a couple of ideas to address this issue in different SDR algorithms. We have two papers on this topic (see Pircalabelu and Artemiou (2021), Computational Statistics and Data Analysis; and (2022), Electronic Journal of Statistics). I also work on this with my collaborator Dr Eftychia Solea.
- Dimension Reduction via Exponential Family Principal Component Analysis (PCA). With partial funding from the University Health Board, my Ph.D. student Luke Smallman identified that only a limited number of methods exist for dimension reduction on text data. Text data cannot be assumed Gaussian and are more appropriately modelled using the Poisson distribution. They are also very high-dimensional (if each word is treated as a separate variable), so sparse dimension reduction methods are needed. We extended the previous literature using appropriate penalties to induce sparsity; a generic sketch of this kind of penalized model appears below. One method was published in Pattern Recognition (Smallman, Artemiou and Morgan (2018)) and another in Computational Statistics (Smallman, Underwood and Artemiou (2019)). We also have a literature review accepted (Smallman and Artemiou (2022), Journal of Statistical Theory and Practice). Finally, we are looking into creating a unified framework for exponential family PCA algorithms using different ideas.
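To give a generic picture of the kind of model involved, here is a heavily simplified sketch of penalized Poisson PCA: counts are modelled as X_ij ~ Poisson(exp(theta_ij)) with a low-rank natural parameter theta = A B^T, fitted by proximal gradient descent, with an L1 penalty soft-thresholding the loadings to induce sparsity. This is an illustration under standard exponential-family PCA assumptions, not the algorithms of the papers cited above; the function name and defaults are made up.

```python
import numpy as np

def sparse_poisson_pca(X, d=2, lam=0.1, lr=1e-3, n_iter=500, seed=0):
    """Generic sparse Poisson PCA sketch: low-rank natural parameter
    theta = A @ B.T, Poisson likelihood, L1 penalty on the loadings B."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    A = rng.normal(scale=0.1, size=(n, d))   # scores
    B = rng.normal(scale=0.1, size=(p, d))   # loadings (sparsified below)
    for _ in range(n_iter):
        theta = A @ B.T
        grad = np.exp(theta) - X             # gradient of the Poisson NLL in theta
        A_new = A - lr * (grad @ B)          # gradient step for the scores
        B_new = B - lr * (grad.T @ A)        # gradient step for the loadings
        B = np.sign(B_new) * np.maximum(np.abs(B_new) - lr * lam, 0.0)  # L1 proximal step
        A = A_new
    return A, B                              # keep lr small: exp(theta) can overflow
```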
- Dimension Reduction for Tensors. With my collaborator Dr Joni Virta (University of Turku), I am working on a number of different ideas for tensors, in both unsupervised and supervised settings. We have two papers published (Virta and Artemiou (2023), Pattern Recognition; and (2024), IEEE Transactions on Signal Processing) and are working on further ideas.
- Envelope models. With my former Ph.D. student Alya Alzahrani, I am exploring a number of topics in this area. We have developed an envelope-based SVM method and have a couple of papers in progress.
- Philosophical research on PCA. Following three papers with my Ph.D. supervisor Prof Li (Artemiou and Li (2009, 2013)) and my former Ph.D. student Ben Jones (Jones and Artemiou (2018+)) on the appropriateness of PCA as a dimension reduction tool in linear regression, nonlinear regression and functional predictor settings, we have recently explored the appropriateness of kernel PCA as a nonlinear feature extraction tool in a regression setting (see Jones, Artemiou and Li (2020) and Jones and Artemiou (2021)). We have another paper in progress with Ben. Moreover, we are investigating whether a measure of association other than correlation can be used in this framework (see Artemiou (2021)).
- Classification algorithms. With a number of students (mainly MSc and BSc project students), I am proposing new classification algorithms that improve on the existing literature. We currently have a paper published with my former BSc student Michalis Panayides (Computers (2024)) and a number of other papers in progress with other students.
- Multicollinearity in regression. With my collaborator Alex Karagrigoriou and my former Ph.D. student Kimon Ntotsis, I have published a paper proposing a criterion to identify multicollinearity between predictors in regression (see Ntotsis, Karagrigoriou and Artemiou (2021)).
- Nonlinear association coefficient. With my collaborator Alex Karagrigoriou and my former Ph.D. student Kimon Ntotsis, we have published two papers on the Kernel Association Coefficient, a coefficient that can be used to measure nonlinear relationships between predictors and response variables in regression settings.