Former B.Sc. students

Here is a brief description of the project that each of these students completed while working under my supervision:

  1. James Howson (BSc project) 2020-2021: James worked on a project combining existing machine learning algorithms to create a new algorithm.
  2. Nikolaos Chatzis (BSc project) 2020-2021: Nikolaos compared SVM-based classification algorithms with Fisher’s LDA. His work could be published if polished.
  3. Kirill Shvidler (BSc project) 2020-2021: Kirill worked on a project where he tried to incorporate sufficient dimension reduction into support vector regression.
  4. Stephen Babos (MMORS project) 2019-2020: Stephen continued the work from his CUROP project on robust methods for sufficient dimension reduction. His work was diverse and could lead to several different publications. So far we have published some of the findings in:
    • Babos and Artemiou (2021) Stats
  5. Charles Worsford (B.Sc. project) 2019-2020: Charles worked on developing a new SVM algorithm which can be shown to be equivalent to Fisher’s LDA.
  6. Hector Haffenden (B.Sc. project) 2018-2019: Hector took my MA2501 Programming and Statistics module and really enjoyed it. As a result, he asked me for a project around financial data, so I asked him to develop a new algorithm for sufficient dimension reduction for time series. We are now working to publish his work.
  7. Stephen Babos (CUROP) Summer 2018: Steve worked over the summer of 2018 on a CUROP project proposing a new dimension reduction method robust to outliers. This is an extension of the work in Artemiou and Tian (2015), and we have a paper in Statistical Methods and Applications (Jan 2020 – online).
  8. Michael Panayides (B.Sc. Project) 2017-2018: Michael worked with me for his final year project and proposed a new computationally fast method to perform SVM-type classification. We are currently finalizing the details (running some nonlinear experiments) in order to submit a paper.
  9. Harry Chant (B.Sc. Project) 2017-2018: Harry worked on clustering/classification algorithms for a financial dataset that we found in the UC Irvine Machine Learning Repository.
  10. William Underwood (visiting from Oxford) Summer 2017: Will contacted the school before the summer of 2017 expressing his interest in a summer project, and somehow this request ended up on my desk (thanks to Prof. Paul Harper, in a sense). Will worked on a sparse Poisson PCA algorithm under the joint supervision of myself and my Ph.D. student Luke Smallman. A paper containing his work from that summer was published in Computational Statistics:
    • Smallman L., Underwood W. and Artemiou A. (2020) – Computational Statistics
  11. Sophie Shapcott (CUROP) Summer 2017: Sophie worked at the end of her first year on a CUROP project that I co-supervised with Dr. Dimitris Potoglou in Geography and Planning.
  12. Michael Clayton-Rose (B.Sc. Project) 2016-2017: Michael did his final year project on creating an R package for multivariate hypothesis testing. He did a lot of work and programmed many different tests that, at that point, were spread across different packages. We have not published the package yet.
  13. Sarah Medland (B.Sc. Project) 2016-2017: Sarah worked on a bat dataset that was given for a data competition by Veronica Zamora-Gutierez in the British Classification Society in November 2013. Sarah applied some interesting algorithms and reached very high classification accuracy across the different species.
  14. Ben Byrne (B.Sc. Project) 2015-2016: Ben worked with me on the analysis of an eye movement dataset which was given as a competition by the RSS before the 2015 annual meeting in Exeter. During his project he did some basic analysis using classic statistical methodology, like t-tests, ANOVA, nonparametric tests, etc. He later continued this during his MSc, looking for classification techniques which could separate the eye movements based on the different conditions.
  15. Stefan Andjelkovic (MMATH Project) 2015-2016: Stefan worked with me on a theoretical topic in Principal Component Analysis. We did a major review of the literature on the theory behind Principal Component Analysis, and he tried to extend some results to a different framework.
  16. Rishan Shah (MURBS) Summer 2015: Rishan was the last student who worked with me under MURBS funding (the school of Mathematics undergraduate research bursary scheme). Rishan worked on dimension reduction algorithms for when the response is missing at random.
  17. Holly Tibble (B.Sc. Project) 2014-2015: Holly was the first final year project student I supervised. Her project was based on a discussion I had with Bing (my Ph.D. supervisor) about Jared Diamond’s book “Guns, Germs and Steel”. Among other things, the author suggests that moving in an East-West direction was easier for early humans due to the similar climate one would have encountered (the major differences arise when moving in the North-South direction). We tried to look at whether countries that are more elongated in the East-West direction have higher similarity indexes (like the Gini Index). We had some interesting results, but nothing exciting enough to publish, although I always feel we did not exhaustively study all our options.
  18. Alex Carney (MURBS) Summer 2014 – Summer 2015: Alex was funded through MURBS (the school of Mathematics undergraduate research bursary scheme). His project was about text mining software, and he was primarily supervised by Dr. Jennifer Morgan (at that point appointed at the University Health Board and the School of Mathematics). Some of his findings over the two summers were published in:
    • Morgan J et al (2019) Journal of the OR Society
  19. Laura Dimond (MURBS Summer 2014 and B.Sc. Project 2015-2016): Laura worked with me twice. First, she was funded in the Summer of 2014 through MURBS (the school of Mathematics undergraduate research bursary scheme); her project was on a robust algorithm for SVM-based sufficient dimension reduction. She then worked with me on her final year project during the academic year 2015-2016 on a different SVM-based robust sufficient dimension reduction algorithm. Both pieces of work are of publishable quality if I ever find the time to finalize some theoretical details.
  20. Luke Smallman (LMS URB) Summer 2014: Luke and I applied for LMS URB funding in my first year at Cardiff University; I knew him from my MA2002 Matrix Algebra module. We were successful, and he studied a number of reweighted methods to handle class imbalance in SVM-based dimension reduction methods. After an impressive eight weeks, he decided to stay in Cardiff as a Ph.D. student under my supervision. His LMS work was published in:
    • Smallman and Artemiou (2017) Communications in Statistics, Theory and Methods