PGR Seminar with Carla Davesa Sureda

The next PGR seminar is taking place this Friday 22nd November at 2PM in JC 1.33a

Below are the title and abstract for Carla’s talk – please do come along if you are able.

Title:

Towards High-Level Modelling in Automated Planning

Abstract:

Planning is a fundamental activity, arising frequently in many contexts, from daily tasks to industrial processes. The planning task consists of selecting a sequence of actions to achieve a specified goal from specified initial conditions. The Planning Domain Definition Language (PDDL) is the leading language used in the field of automated planning to model planning problems. Previous work has highlighted the limitations of PDDL, particularly in terms of its expressivity. Our interest lies in facilitating the handling of complex problems and enhancing the overall capability of automated planning systems. Unified-Planning (UP) is a Python library offering a high-level API to specify planning problems and to invoke automated planners. In this paper, we present an extension of the UP library aimed at enhancing its expressivity for high-level problem modelling. In particular, we have added an array type, an expression for counting booleans, and support for integer parameters in actions. We show how these facilities enable natural high-level models of three classical planning problems.
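For context, the standard (pre-extension) UP API already lets a planning problem be specified entirely in Python. Below is a minimal, illustrative sketch of that style of modelling; the robot/corridor domain and all names in it are my own, not taken from the talk, and it assumes the unified-planning package plus at least one compatible planning engine are installed.

```python
# Minimal sketch of modelling and solving a planning problem with the
# Unified-Planning (UP) library's standard high-level API.
from unified_planning.shortcuts import *

Location = UserType("Location")

# Boolean fluents describing the world state.
robot_at = Fluent("robot_at", BoolType(), loc=Location)
connected = Fluent("connected", BoolType(), l_from=Location, l_to=Location)

# One action: move the robot between connected locations.
move = InstantaneousAction("move", l_from=Location, l_to=Location)
l_from = move.parameter("l_from")
l_to = move.parameter("l_to")
move.add_precondition(connected(l_from, l_to))
move.add_precondition(robot_at(l_from))
move.add_effect(robot_at(l_from), False)
move.add_effect(robot_at(l_to), True)

problem = Problem("robot_corridor")
problem.add_fluent(robot_at, default_initial_value=False)
problem.add_fluent(connected, default_initial_value=False)
problem.add_action(move)

# A corridor of five locations; the robot starts at one end.
locations = [Object(f"l{i}", Location) for i in range(5)]
problem.add_objects(locations)
problem.set_initial_value(robot_at(locations[0]), True)
for a, b in zip(locations, locations[1:]):
    problem.set_initial_value(connected(a, b), True)
problem.add_goal(robot_at(locations[-1]))

# Ask any installed planner that supports this problem kind for a plan.
with OneshotPlanner(problem_kind=problem.kind) as planner:
    result = planner.solve(problem)
    print(result.plan)
```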

Doughnuts will be available! 🍩

AI Seminar Tuesday 19th November – Francesco Leofante

The School is hosting an AI seminar on Tuesday 19th November at 11am in JCB1.33A/B

Our speaker is Francesco Leofante from Imperial College London.

Title:

Robustness issues in algorithmic recourse.

Abstract:

Counterfactual explanations (CEs) are advocated as being ideally suited to providing algorithmic recourse for subjects affected by the predictions of machine learning models. While CEs can be beneficial to affected individuals, recent work has exposed severe issues related to the robustness of state-of-the-art methods for obtaining CEs. Since a lack of robustness may compromise the validity of CEs, techniques to mitigate this risk are in order. In this talk we will introduce the problem of (lack of) robustness, discuss its implications, and present some recent solutions we have developed to compute CEs with robustness guarantees.
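As background for the talk: a counterfactual explanation for an input x is a nearby point x' on which the model's prediction flips (e.g. a rejected loan applicant is shown a small change that would have led to approval). The toy sketch below is illustrative only and is not the speaker's method, nor does it provide any robustness guarantees; it simply finds such a point for a linear scikit-learn classifier by stepping along the model's weight vector until the predicted class changes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy two-feature dataset (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
model = LogisticRegression().fit(X, y)

def counterfactual(x, model, step=0.05, max_steps=200):
    """Naive CE search: nudge x along the direction that increases the
    score of the opposite class until the prediction flips."""
    target = 1 - model.predict(x.reshape(1, -1))[0]
    w = model.coef_[0]
    direction = w if target == 1 else -w
    direction = direction / np.linalg.norm(direction)
    x_cf = x.copy()
    for _ in range(max_steps):
        if model.predict(x_cf.reshape(1, -1))[0] == target:
            return x_cf
        x_cf = x_cf + step * direction
    return None  # no counterfactual found within the step budget

x = np.array([-0.8, -0.4])  # an input currently classified as class 0
x_cf = counterfactual(x, model)
print("original:", x, "->", model.predict(x.reshape(1, -1))[0])
print("counterfactual:", x_cf, "->", model.predict(x_cf.reshape(1, -1))[0])
```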

Bio:

Francesco is an Imperial College Research Fellow affiliated with the Centre for Explainable Artificial Intelligence at Imperial College London. His research focuses on safe and explainable AI, with special emphasis on counterfactual explanations and their robustness. Since 2022, he has led the project “ConTrust: Robust Contrastive Explanations for Deep Neural Networks”, a four-year effort devoted to the formal study of robustness issues arising in XAI. More details about Francesco and his research can be found at https://fraleo.github.io/.

PGR Seminar with Daniel Wyeth and Ferdia McKeogh

The next PGR seminar is taking place this Friday 15th November at 2PM in JC 1.33a

Below are the titles and abstracts for Daniel’s and Ferdia’s talks – please do come along if you are able.

Daniel:

Deep Priors: Integrating Domain Knowledge into Deep Neural Networks

Deep neural networks represent the state of the art for learning complex functions purely from data. There are, however, problems, such as medical imaging, where data is limited and effective training of such networks is difficult. Moreover, this requirement for large datasets represents a deficiency compared to human learning, which is able to harness prior understanding to acquire new concepts from very few examples. My work looks at methods for integrating domain knowledge into deep neural networks to guide training so that fewer examples are required. In particular, I explore probabilistic atlases and probabilistic graphical models as representations for this prior information, architectures which enable networks to use these, and the application of these to problems in medical image understanding.

Ferdia:

Lessons Learned From Emulating Architectures

Automatically generating fast emulators from formal architecture specifications avoids the error-prone and time-consuming effort of manually implementing an emulator. The key challenge is achieving high performance from correctness-focused specifications: extracting the relevant functional semantics and performing aggressive optimisations. In this talk I will present my work thus far, and reflect on some of the unsuccessful paths of research.

Doughnuts will be available! 🍩

PGR Seminar with Ariane Hine

The PGR seminars for this academic year are beginning this Friday 8th November at 2PM in JC 1.33A/B

Below are the title and abstract for Ariane’s talk – please do come along if you are able.

Title: Enhancing and Personalising Endometriosis Care with Causal Machine Learning

Abstract: Endometriosis poses significant challenges in diagnosis and management due to its wide range of symptoms and systemic implications. Integrating machine learning into healthcare screening processes can significantly improve resource allocation and diagnostic efficiency, and facilitate more tailored, personalised treatment plans. This talk will discuss the potential of leveraging patient-reported symptom data through causal machine learning to advance endometriosis care and reduce the lengthy diagnostic delays associated with this condition.

The goal is to propose a novel, personalised, non-invasive diagnostic approach that captures the underlying causes of patient symptoms and combines health records and other factors to enhance prediction accuracy, so that the approach can be used globally.

Fudge donuts will be available! 🍩

AI Seminar Friday 18th October – Leonardo Bezerra

The School is hosting an AI seminar on Friday 18th October at 11.30am in JCB1.33A!

Our speaker is Leonardo Bezerra from the University of Stirling.

FAIRTECH by design: assessing and addressing the social impacts of artificial intelligence systems

In a decade, social media and big data have transformed society and enabled groundbreaking artificial intelligence (AI) technologies like deep learning and generative AI. Applications like ChatGPT have impacted the world and outpaced regulatory agencies, which have been rushed from a data-centred to an AI-centred concern. Recent developments in both the United Kingdom (UK) and the United States (US) originated in the executive branch, and the most advanced Western binding legislation is the European Union (EU) AI Act, expected to be implemented over the next three years. In the meantime, the United Nations (UN) has proposed an AI advisory body similar to the Intergovernmental Panel on Climate Change (IPCC), and countries from the Global South like Brazil are following Western proposals. In turn, AI companies have been proactive in the regulation debate, aiming at a scenario of improved accountability and reduced liability. In this talk, we will briefly overview efforts and challenges regarding AI regulation and how major AI players are addressing it. The goal of the talk is to stimulate future project collaborations from a multidisciplinary perspective, to promote a culture where the development and adoption of AI systems is fair, accountable, inclusive, responsible, transparent, ethical, carbon-efficient, and human-centred (FAIRTECH) by design.

Speaker bio: Leonardo Bezerra joined the University of Stirling as a Lecturer in Artificial Intelligence (AI)/Data Science in 2023, after seven years as a Lecturer in Brazil. He received his Ph.D. from Université Libre de Bruxelles (Belgium) in 2016, having defended a thesis on the automated design of multi-objective evolutionary algorithms. His research experience ranges from applied data science projects with public and private institutions to supervising theses on automated and deep machine learning. Recently, his research has concentrated on the social impact of AI applications, as part of the Participatory Harm Auditing Workbenches and Methodologies project funded by Responsible AI UK.

2024-2025 CS-EDI Poster Competition

The school EDI committee would like to run an EDI-themed poster competition during the academic year 2024-2025. The goal is to enhance EDI awareness and foster conversations around EDI topics. Posters are a way to voice what really matters to you and to celebrate the value of our diverse backgrounds and experiences. The selected posters will be displayed on the school walls and screens.

We welcome entries from all students in Computer Science. Participants can be teams or individuals. Prizes will be awarded to the winning teams. If you want to register interest in the competition or have any questions, please contact edi-cs@st-andrews.ac.uk. More details can be found at https://tinyurl.com/yx63a6sa. The deadline for expressing interest is 11 Oct 2024. We look forward to hearing from you.

Week 1 Social Events

The following social events are being held in the School this week in the Jack Cole coffee area:

  • Monday 16 September – 17:00 – 18:00 – MSc and MSci welcome reception (TODAY)
  • Wednesday 18 September – 16:00 – 17:00 – Honours (Junior & Senior) welcome reception
  • Friday 20 September – 17:00 – 18:00 – Sub-honours (students in the first and second years of CS programmes) social event

Drinks 🍷 and snacks 🍰 available at all events, so please come along and join us!

STACS Welcome BBQ 🍔

If you are a new Undergraduate or Postgraduate Taught student in the School of Computer Science, you are invited to the STACS Welcome BBQ outside the Jack Cole Coffee Area on Friday 13th September, 5.30pm-7.30pm.

The usual BBQ favourites will be available from the grill and refreshments will be provided. We look forward to seeing you there!