Paul-Olivier Dehaye: From Cambridge Analytica to the future of online services: a personal journey (School Seminar)

Event details

  • When: 19th March 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Abstract:

2018 was a crazy year for privacy. The General Data Protection Regulation came into force in May, and new revelations about the personal data ecosystem made headlines on a weekly basis. I will give a behind-the-scenes account of many of these events, question why they didn’t happen earlier, and offer some thoughts on the necessary future of online services. This will include a brief discussion of topics such as semantic alignment, interpretable machine learning, and new privacy-preserving data processing techniques.

Speaker Bio:

Paul-Olivier Dehaye is a mathematician by training. Affiliated with the University of Zurich as an SNSF Assistant Professor until 2016, his career then took a turn towards data protection activism and social entrepreneurship. He was the researcher behind several news articles that have reached millions of readers (including many with Carole Cadwalladr), and has testified before the UK and EU Parliaments on multiple occasions. He is on the board of MyData Global, founded the NGO PersonalData.IO, and started the project MyData Geneva.

Lauren Roberts & Peter Michalák (Newcastle): Automating the Placement of Time Series Models for IoT Healthcare Applications (School Seminar)

Event details

  • When: 26th February 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Abstract:

There has been a dramatic growth in the number and range of Internet of Things (IoT) sensors that generate healthcare data. These sensors stream high-dimensional time series data that must be analysed in order to provide the insights into medical conditions that can improve patient healthcare. This raises both statistical and computational challenges, including where to deploy the streaming data analytics, given that a typical healthcare IoT system will combine a highly diverse set of components with very varied computational characteristics, e.g. sensors, mobile phones and clouds. Different partitionings of the analytics across these components can dramatically affect key factors such as the battery life of the sensors, and the overall performance. In this work we describe a method for automatically partitioning stream processing across a set of components in order to optimise for a range of factors including sensor battery life and communications bandwidth. We illustrate this using our implementation of a statistical model predicting the glucose levels of type II diabetes patients in order to reduce the risk of hyperglycaemia.
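The core idea of the abstract can be illustrated with a toy optimiser. The sketch below is purely illustrative and not the speakers' system: the pipeline stages, tier names and cost weights are all invented. It enumerates the ways a linear stream-processing pipeline can be split across the sensor, phone and cloud tiers, and picks the split that minimises a simple energy-plus-bandwidth cost model.

```python
# Hypothetical sketch of automatic stream-processing partitioning:
# enumerate split points of a linear pipeline across three tiers
# (sensor -> phone -> cloud) and choose the cheapest under a toy
# cost model. Stage names and cost values are invented.

STAGES = ["filter", "feature_extract", "model_update", "predict"]

# Invented per-tier cost (energy units) of running one stage, plus the
# bandwidth cost of shipping intermediate data up one tier.
COMPUTE_COST = {"sensor": 5.0, "phone": 2.0, "cloud": 0.5}
TRANSFER_COST = {"sensor": 4.0, "phone": 1.0}  # cost to send data upward

def plan_cost(splits):
    """splits = (i, j): stages[:i] run on the sensor, stages[i:j] on the
    phone, and the remainder in the cloud."""
    i, j = splits
    cost = i * COMPUTE_COST["sensor"]
    cost += (j - i) * COMPUTE_COST["phone"]
    cost += (len(STAGES) - j) * COMPUTE_COST["cloud"]
    if i < len(STAGES):           # some data must leave the sensor
        cost += TRANSFER_COST["sensor"]
    if j < len(STAGES):           # some data must leave the phone
        cost += TRANSFER_COST["phone"]
    return cost

def best_partition():
    """Exhaustively search all valid (i, j) split points."""
    candidates = [(i, j)
                  for i in range(len(STAGES) + 1)
                  for j in range(i, len(STAGES) + 1)]
    return min(candidates, key=plan_cost)
```

With these particular weights the cheap cloud compute dominates, so the search pushes everything off the sensor; a real system such as the one described in the talk would weight sensor battery drain and link bandwidth from measured profiles rather than constants.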

Speaker Bios:

Lauren and Peter are final-year PhD students at the CDT in Cloud Computing for Big Data at Newcastle University. Peter has a background in Computer Engineering from the University of Žilina, Slovakia, and a double degree in Computer Software Engineering from JAMK University of Applied Sciences, Jyväskylä, Finland. His research interests are in distributed event processing, edge computing and the Internet of Things, with a special focus on energy and bandwidth constraints. Lauren has an MMath degree from Newcastle University and her research interests lie in statistical modelling of time series data.

Quintin Cutts (Glasgow): Re-imagining software engineering education through the apprenticeship lens (School Seminar)

Event details

  • When: 19th February 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Abstract:

Apprenticeship degrees have sprung up so fast that there has been little time for us all to reflect on how this form of education, apparently new to universities at least, could significantly affect our educational offerings. The University of Glasgow has been undertaking preparatory work for Skills Development Scotland prior to running its apprenticeship degree in software engineering. This has afforded us some time to see what others have been doing nationally and internationally, to consider relevant aspects of the literature, and to consult with industry. One view we are developing is that these degrees are a true evolution of typical, largely campus-based software engineering degrees towards a full-blown professional degree, such as in medicine, where universities and hospitals are in real partnership over the training of doctors. In this talk, I will outline our thinking and raise a number of issues for discussion. In suggesting a closer relationship with industry in a talk at St Andrews, I do not, of course, miss the irony that industry accreditation was never (I believe) something St Andrews was particularly bothered about – noting that my BSc (Hons) 1988 is not accredited!

Lewis McMillan (St Andrews): Parallel Computer Simulations of Light-Tissue Interactions for Applications in Medicine, Cosmetics Industry and Biophotonics Research (School Seminar)

Event details

  • When: 5th March 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Abstract:

Tissue ablation is widely used in both the cosmetic and medical sectors, to treat various diseases or to improve cosmetic outcomes. We present our tissue ablation model, which can predict the depth of ablation and the surrounding thermal damage caused by the laser during ablation.

“Non-diffracting” beams have a multitude of uses in physics, from optical manipulation to improved microscopy light sources. For the first time, we show that these beams can be modelled using the Monte Carlo radiation transport method, allowing better insight into how they propagate in a turbid medium.

Both of these projects use the Monte Carlo radiation transport (MCRT) method to simulate light transport. The MCRT method is a powerful numerical technique that can solve light transport through heavily scattering and absorbing media, such as biological tissue. The method is extremely flexible and can model arbitrary geometries and light sources. MCRT can also model the various micro-physics of the simulated medium, such as polarisation, fluorescence, and Raman scattering. This talk will give an overview of our group’s work, with particular focus on simulating tissue ablation and modelling “non-diffracting” beams.
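To give a flavour of the MCRT method the abstract describes, here is a minimal sketch (not the speaker's code): photons random-walk through a uniform slab, with exponentially sampled free paths, absorption decided by the single-scattering albedo, and isotropic redirection at each scatter. The optical coefficients and slab thickness are illustrative values only.

```python
import math
import random

def transmit_fraction(n_photons=20000, mus=10.0, mua=1.0,
                      thickness=1.0, seed=1):
    """Estimate the fraction of photons transmitted through a slab.

    mus / mua are illustrative scattering / absorption coefficients
    (per unit length); this toy model uses isotropic scattering.
    """
    random.seed(seed)
    mut = mus + mua            # total interaction coefficient
    albedo = mus / mut         # probability a collision is a scatter
    transmitted = 0
    for _ in range(n_photons):
        z, uz = 0.0, 1.0       # start at the surface, heading inward
        while True:
            # Sample a free path length from the exponential law;
            # 1 - random() avoids log(0).
            step = -math.log(1.0 - random.random()) / mut
            z += uz * step
            if z >= thickness:
                transmitted += 1   # photon exits the far side
                break
            if z < 0:
                break              # escaped back out of the entry surface
            if random.random() > albedo:
                break              # absorbed
            uz = 2.0 * random.random() - 1.0  # isotropic new direction
    return transmitted / n_photons
```

A real biophotonics code adds anisotropic (e.g. Henyey-Greenstein) phase functions, layered or voxelised geometry, and energy deposition tracking for the thermal-damage modelling mentioned above; the random-walk core, however, looks much like this.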

Speaker Bio:

Lewis McMillan is a final-year physics PhD student at the University of St Andrews. His research interests are in using the Monte Carlo radiation transport method for various applications within medicine and biophotonics.

Ian Gent (St Andrews): The Winnability of Klondike and Many Other Single-Player Card Games (School Seminar)

Event details

  • When: 5th February 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

This is joint work with Charlie Blake.

Abstract:

The most famous single-player card game is ‘Klondike’, but our ignorance of its winnability percentage has been called “one of the embarrassments of applied mathematics”. Klondike is just one of many single-player card games, generically called ‘solitaire’ or ‘patience’ games, for which players have long wanted to know how likely a particular game is to be winnable for a random deal. A number of different games have been studied empirically in the academic literature and by non-academic enthusiasts.

Here we show that a single general purpose Artificial Intelligence program, called “Solvitaire”, can be used to determine the winnability percentage of approximately 30 different single-player card games with a 95% confidence interval of ± 0.1% or better. For example, we report the winnability of Klondike to within 0.10% (in the ‘thoughtful’ variant where the player knows the location of all cards). This is a 30-fold reduction in confidence interval, and almost all our results are either entirely new or represent significant improvements on previous knowledge.
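The quoted ±0.1% interval implies a very large number of sampled deals. As a back-of-the-envelope sketch (not taken from the paper), the normal-approximation confidence interval for a win-rate estimate shows how many random deals are needed for a given precision:

```python
import math

Z95 = 1.96  # z-score for a 95% confidence level

def ci_halfwidth(p_hat, n):
    """Half-width of the 95% normal-approximation CI for a win-rate
    estimate p_hat from n sampled deals."""
    return Z95 * math.sqrt(p_hat * (1.0 - p_hat) / n)

def deals_needed(p_hat, halfwidth):
    """Deals required so the 95% CI half-width is at most `halfwidth`."""
    return math.ceil((Z95 / halfwidth) ** 2 * p_hat * (1.0 - p_hat))
```

In the worst case of a win rate near 50%, roughly 960,000 solved deals are needed for a ±0.1% half-width, which gives a sense of the scale of computation behind results like those reported for Klondike.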

Speaker Bio:

Ian Gent is Professor of Computer Science at the University of St Andrews. His mother taught him to play patience and herself showed endless patience when he “helped” her by taking complete control of the game. A program to play a patience game was one of the programs he wrote on his 1982 Sinclair Spectrum, now on the wall outside his office.

Emanuele Trucco (Dundee): Retinal image analysis and beyond in Scotland: the VAMPIRE project (School Seminar)

Event details

  • When: 29th January 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Abstract:

This talk is an overview of the VAMPIRE (Vessel Assessment and Measurement Platform for Images of the REtina) project, an international and interdisciplinary research initiative created and led by the Universities of Dundee and Edinburgh in Scotland, UK, since the early 2000s. VAMPIRE research focuses on the eye as a source of biomarkers for systemic diseases (e.g. cardiovascular disease, diabetes, dementia) and cognitive decline, as well as on eye-specific diseases. VAMPIRE is highly interdisciplinary, bringing together medical image analysis, machine learning and data analysis, medical research, and data governance and management at scale. The talk concisely introduces the aims, structure and current results of VAMPIRE, the current vision for effective translation to society, and the several non-technical factors, complementing technical research, needed to achieve effective translation.

Speaker Bio:

Emanuele (Manuel) Trucco, MSc, PhD, FRSA, FIAPR, is the NRP Chair of Computational Vision in Computing, School of Science and Engineering, at the University of Dundee, and an Honorary Clinical Researcher of NHS Tayside. He has been active since 1984 in computer vision, and since 2002 in medical image analysis, publishing more than 270 refereed papers and 2 textbooks, and serving on the organizing or program committee of major international and UK conferences. Manuel is co-director of VAMPIRE (Vessel Assessment and Measurement Platform for Images of the Retina), an international research initiative led by the Universities of Dundee and Edinburgh (co-director Dr Tom MacGillivray), and part of the UK Biobank Eye and Vision Consortium. VAMPIRE develops software tools for efficient data and image analysis with a focus on multi-modal retinal images. VAMPIRE has been used in UK and international biomarker studies on cardiovascular risk, stroke, dementia, diabetes and complications, cognitive performance, neurodegenerative diseases, and genetics.

SRG Seminar: “Large-Scale Hierarchical k-means for Heterogeneous Many-Core Supercomputers” by Teng Yu

Event details

  • When: 1st November 2018 13:00 - 14:00
  • Where: Cole 1.33b
  • Series: Systems Seminars Series
  • Format: Seminar, Talk

Abstract:

We present a novel design and implementation of the k-means clustering algorithm targeting supercomputers with heterogeneous many-core processors. This work introduces a multi-level parallel partition approach that not only partitions by dataflow and centroid, but also by dimension. Our multi-level (nkd) approach unlocks the potential of the hierarchical parallelism in the SW26010 heterogeneous many-core processor and the system architecture of the supercomputer.
Our design is able to process large-scale clustering problems with up to 196,608 dimensions and over 160,000 targeting centroids, while maintaining high performance and high scalability, significantly improving the capability of k-means over previous approaches. The evaluation shows our implementation achieves performance of less than 18 seconds per iteration for a large-scale clustering case with 196,608 data dimensions and 2,000 centroids by applying 4,096 nodes (1,064,496 cores) in parallel, making k-means a more feasible solution for complex scenarios.
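The multi-level partition idea can be sketched in a few lines. In the toy version below (my illustration, not the authors' code), the point-to-centroid squared-distance matrix is accumulated block by block, so blocks of data points, centroids, and dimensions can each be assigned to different workers; here the "workers" are just loop iterations, whereas on the real machine they would map onto the SW26010 core groups.

```python
import numpy as np

def assign_blocked(X, C, n_blk=2, k_blk=2, d_blk=2):
    """Assign each of the n rows of X to its nearest of the k rows of C,
    computing squared distances as a sum of partial distances over
    blocks of points (n), centroids (k), and dimensions (d)."""
    n, d = X.shape
    k = C.shape[0]
    dist2 = np.zeros((n, k))
    for ds in range(0, d, d_blk):                  # dimension-level split
        Xd, Cd = X[:, ds:ds + d_blk], C[:, ds:ds + d_blk]
        for ns in range(0, n, n_blk):              # dataflow-level split
            for ks in range(0, k, k_blk):          # centroid-level split
                diff = (Xd[ns:ns + n_blk, None, :]
                        - Cd[None, ks:ks + k_blk, :])
                dist2[ns:ns + n_blk, ks:ks + k_blk] += (diff ** 2).sum(-1)
    return dist2.argmin(axis=1)
```

Because squared Euclidean distance is additive over dimensions, the per-block partial sums can be computed independently and reduced afterwards, which is what makes the dimension-level split in the paper possible at 196,608 dimensions.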
This work is to be presented at the International Conference for High Performance Computing, Networking, Storage, and Analysis (SC18).

SRG Seminar: “Using Metric Space Indexing for Complete and Efficient Record Linkage” by Özgür Akgün

Event details

  • When: 18th October 2018 13:00 - 14:00
  • Where: Cole 1.33b
  • Series: Systems Seminars Series
  • Format: Seminar

Abstract:

Record linkage is the process of identifying records that refer to the same real-world entities, in situations where entity identifiers are unavailable. Records are linked on the basis of similarity between common attributes, with every pair being classified as a link or non-link depending on their degree of similarity. Record linkage is usually performed in a three-step process: first, groups of similar candidate records are identified using indexing; pairs within the same group are then compared in more detail, and finally classified. Even state-of-the-art indexing techniques, such as Locality Sensitive Hashing, have potential drawbacks. They may fail to group together some true matching records with high similarity. Conversely, they may group records with low similarity, leading to high computational overhead. We propose using metric space indexing to perform complete record linkage, which results in a parameter-free record linkage process combining indexing, comparison and classification into a single step, delivering complete and efficient record linkage. Our experimental evaluation on real-world datasets from several domains shows that linkage using metric space indexing can yield better quality than current indexing techniques, with similar execution cost, without the need for domain knowledge or trial and error to configure the process.
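The key property the abstract relies on is that many string similarity measures, such as edit distance, are metrics, so the triangle inequality can prune comparisons without the false negatives that hash-based blocking can introduce. A minimal single-pivot sketch of this idea (the records, threshold, and pivot below are invented for illustration, and real metric indexes use many pivots or tree structures):

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance (a metric)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def link(records, query, threshold, pivot):
    """Return all records within `threshold` of `query`, using one
    pivot and the triangle inequality to skip hopeless candidates."""
    dq = edit_distance(query, pivot)
    links = []
    for r in records:
        dr = edit_distance(r, pivot)   # precomputable once per record
        # Triangle inequality: d(query, r) >= |d(query, p) - d(r, p)|,
        # so if that lower bound exceeds the threshold, r cannot match.
        if abs(dq - dr) > threshold:
            continue
        if edit_distance(query, r) <= threshold:
            links.append(r)
    return links
```

Because the pruning bound is a true lower bound on the distance, no genuine link within the threshold is ever discarded, which is the "complete" guarantee the abstract contrasts with Locality Sensitive Hashing.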