Max L. Wilson (University of Nottingham): Brain-based HCI – What could brain data tell us about HCI?

Please note non-standard date and time for this talk

Abstract:

This talk will describe a range of our projects utilising functional Near Infrared Spectroscopy (fNIRS) in HCI. As a portable alternative that is more tolerant of motion artefacts than EEG, fNIRS measures the amount of oxygen in the brain, which rises as, for example, mental workload creates demand. As opposed to BCI (trying to control systems with our brain), we focus on brain-based HCI, asking what brain data can tell us about our software, our work, our habits, and ourselves. In particular, we are driven by the idea that brain data can become personal data in the future.

Speaker Bio:

Dr Max L. Wilson is an Associate Professor in the Mixed Reality Lab in Computer Science at the University of Nottingham. His research focus is on evaluating Mental Workload in HCI contexts – as real-world as possible – primarily using functional Near Infrared Spectroscopy (fNIRS). As a highly tolerant form of brain sensor, fNIRS is suitable for use in HCI research into user interface design, work tasks, and everyday experiences. This work emerged from his prior research into the design and evaluation of complex user interfaces for information interfaces. Across these two research areas, Max has over 120 publications, including an Honourable Mention CHI 2019 paper on a Brain-Controlled Movie – The MOMENT.

Event details

  • When: 25th October 2019 14:00 - 15:00
  • Where: Cole 1.33b
  • Series: School Seminar Series
  • Format: Seminar

DLS: Multimodal human-computer interaction: past, present and future

Speaker: Stephen Brewster (University of Glasgow)
Venue: The Byre Theatre

Timetable:

9:30 Lecture 1: The past: what is multimodal interaction?
10:30 Coffee break
11:15 Lecture 2: The present: does it work in practice?
12:15 Lunch (not provided)
14:15 Lecture 3: The future: where next for multimodal interaction?

Speaker Bio:

Professor Brewster is a Professor of Human-Computer Interaction in the Department of Computing Science at the University of Glasgow, UK. His main research interest is in Multimodal Human-Computer Interaction, spanning sound, haptics and gestures. He has done a lot of research into Earcons, a particular form of non-speech sound.

He did his degree in Computer Science at the University of Hertfordshire in the UK. After a period in industry he did his PhD in the Human-Computer Interaction Group at the University of York in the UK with Dr Alistair Edwards. The title of his thesis was “Providing a structured method for integrating non-speech audio into human-computer interfaces”. That is where he developed his interests in Earcons and non-speech sound.

After finishing his PhD he worked as a research fellow for the European Union as part of the European Research Consortium for Informatics and Mathematics (ERCIM). From September 1994 to March 1995 he worked at VTT Information Technology in Helsinki, Finland. He then worked at SINTEF DELAB in Trondheim, Norway.

Event details

  • When: 8th October 2019 09:30 - 15:15
  • Where: Byre Theatre
  • Series: Distinguished Lectures Series
  • Format: Distinguished lecture

Daniel S. Katz (University of Illinois): Parsl: Pervasive Parallel Programming in Python

Please note non-standard date and time for this talk

Abstract: High-level programming languages such as Python are increasingly used to provide intuitive interfaces to libraries written in lower-level languages and for assembling applications from various components. This migration towards orchestration rather than implementation, coupled with the growing need for parallel computing (e.g., due to big data and the end of Moore’s law), necessitates rethinking how parallelism is expressed in programs.

Here, we present Parsl, a parallel scripting library that augments Python with simple, scalable, and flexible constructs for encoding parallelism. These constructs allow Parsl to construct a dynamic dependency graph of components from a Python program enhanced with a small number of decorators that define the components to be executed asynchronously and in parallel, and then execute it efficiently on one or many processors. Parsl is designed for scalability, with an extensible set of executors tailored to different use cases, such as low-latency, high-throughput, or extreme-scale execution. We show, via experiments on the Blue Waters supercomputer, that Parsl executors can allow Python scripts to execute components with as little as 5 ms of overhead, scale to more than 250000 workers across more than 8000 nodes, and process upward of 1200 tasks per second.

Other Parsl features simplify the construction and execution of composite programs by supporting elastic provisioning and scaling of infrastructure, fault-tolerant execution, and integrated wide-area data management. We show that these capabilities satisfy the needs of many-task, interactive, online, and machine learning applications in fields such as biology, cosmology, and materials science.
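
To make the decorator-based model described above concrete, here is a minimal sketch using Parsl's python_app decorator and its local-threads configuration; the double function and its inputs are purely illustrative.

import parsl
from parsl import python_app
from parsl.configs.local_threads import config

parsl.load(config)  # start Parsl with a simple local executor

@python_app
def double(x):
    # Decorated calls return futures; Parsl runs the body asynchronously.
    return x * 2

futures = [double(i) for i in range(10)]  # calls build the dynamic task graph
print([f.result() for f in futures])      # result() blocks until each task completes

Scaling up is then largely a matter of swapping the configuration for one of the executors mentioned above (for example, high-throughput execution on a cluster) without changing the decorated code.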

Slides: see here.

Speaker Bio: Daniel S. Katz is Assistant Director for Scientific Software and Applications at the National Center for Supercomputing Applications (NCSA), and Research Associate Professor in Computer Science; Electrical & Computer Engineering; and the School of Information Sciences at the University of Illinois Urbana-Champaign. For further details, please see his website here.

Event details

  • When: 18th October 2019 13:00 - 14:00
  • Where: Cole 1.33b
  • Series: School Seminar Series
  • Format: Seminar

Ankush Jhalani (Bloomberg): Building Near Real-Time News Search

Abstract:

This talk provides insight into the challenges involved in providing near real-time news search to Bloomberg customers. It starts with a picture of what is involved in building such a backend, then delves into what makes up a search engine. Finally, we discuss the challenges of scaling up for low latency and high load, and how we tackle them.

Speaker Bio:

Ankush leads the News Search infrastructure team at the Bloomberg Engineering office in London. After completing his Master's in Computer Science, he joined Bloomberg at their New York office in 2009. Later, working from Washington DC, he led a team to build a web application leveraging Lucene/Elasticsearch for businesses to discover government contracting opportunities. In London, his team focuses on search infrastructure and services allowing clients to search news events from all over the globe with near real-time access and sub-second latencies.


Event details

  • When: 15th October 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

MIP Modelling Made Manageable

Can a user write a good MIP model without understanding linearization? Modelling languages such as AMPL and AIMMS are being extended to support more features, with the goal of making MIP modelling easier. A big step is the incorporation of predicates, such as “cycle”, which encapsulate MIP sub-models. This talk explores the impact of such predicates in the MiniZinc modelling language when it is used as a MIP front-end. It reports on the performance of the resulting models, and the features of MiniZinc that make this possible.
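
As a flavour of the kind of encoding such predicates spare the modeller, consider the textbook big-M linearization of the implication “if x > 0 then δ = 1”, where δ is a binary variable and M is an assumed upper bound on x:

x ≤ M·δ,  δ ∈ {0, 1}

Predicates like “cycle” encapsulate much larger blocks of this kind of MIP sub-model, so the modeller can state the combinatorial condition directly and leave the translation to the MiniZinc front-end.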

Professor Mark Wallace is Professor of Data Science & AI at Monash University, Australia. We gratefully acknowledge support from a SICSA Distinguished Visiting Fellowship which helped finance his visit.

Professor Wallace graduated from Oxford University in Mathematics and Philosophy. He worked for the UK computer company ICL for 21 years while completing a Master's degree in Artificial Intelligence at the University of London and a PhD sponsored by ICL at Southampton University. For his PhD, Professor Wallace designed a natural language processing system which ICL turned into a product. He moved to Imperial College in 2002, taking a Chair at Monash University in 2004.

His research interests span different techniques and algorithms for optimisation and their integration and application to solving complex resource planning and scheduling problems. He was a co-founder of the hybrid algorithms research area and is a leader in the research areas of Constraint Programming (CP) and hybrid techniques (CPAIOR). The outcomes of his research in these areas include practical applications in transport optimisation.

He is passionate about modelling and optimisation and the benefits they bring. His focus in both industry and academia has been on application-driven research and development, where industry funding is essential both to ensure research impact and to support sufficient research effort to build software systems that are robust enough for application developers to use.

He led the team that developed the ECLiPSe constraint programming platform, which was bought by Cisco Systems in 2004. Moving to Australia, he worked on a novel hybrid optimisation software platform called G12, and founded the company Opturion to commercialise it. He also established the Monash-CTI Centre for optimisation in travel, transport and logistics. He has developed solutions for major companies such as BA, RAC, CFA, and Qantas. He is currently involved in the Alertness CRC, plant design for Woodside, planning optimisation for Melbourne Water, and work allocation for the Alfred hospital.

Event details

  • When: 19th June 2019 11:00 - 12:00
  • Where: Cole 1.33a
  • Series: AI Seminar Series
  • Format: Lecture, Seminar

Distinguished Lecture Series: Formal Approaches to Quantitative Evaluation

Biography:
Jane Hillston was appointed Professor of Quantitative Modelling in the School of Informatics at the University of Edinburgh in 2006, having joined the University as a Lecturer in Computer Science in 1995. She is currently Head of the School of Informatics. She is a Fellow of the Royal Society of Edinburgh and Member of Academia Europaea. She currently chairs the Executive Committee of the UK Computing Research Committee.
Jane Hillston’s research is concerned with formal approaches to modelling dynamic behaviour, particularly the use of stochastic process algebras for performance modelling and stochastic verification. The applications of her modelling techniques have ranged from computer systems to biological processes and transport systems. Her PhD dissertation was awarded the BCS/CPHC Distinguished Dissertation award in 1995 and she was the first recipient of the Roger Needham Award in 2005. She has published over 100 journal and conference papers and has held several Research Council and European Commission grants.
She has a strong interest in promoting equality and diversity within Computer Science; she is a member of the Women’s Committee of the BCS Academy of Computing and chaired the Women in Informatics Research and Education working group of Informatics Europe from 2016 to 2018, during which time she instigated the Minerva Informatics Equality Award.

Formal Approaches to Quantitative Evaluation
Qualitative evaluation of computer systems seeks to ensure that the system does not exhibit bad behaviour and is in some sense “correct”. Whilst this is important, it is also often useful to be able to reason not just about what will happen in the system, but also about the dynamics of that behaviour: how long will it take, what are the probabilities of alternative outcomes, how much resource is used? Such questions can be answered by quantitative analysis when information about timing and probability is incorporated into models of system behaviour.

In this short series of lectures I will talk about how we can extend formal methods to support quantitative evaluation as well as qualitative evaluation of systems. The first lecture will focus on computer systems and a basic approach based on the stochastic process algebra PEPA. In the second lecture I will introduce the language CARMA, which is designed to support the analysis of collective adaptive systems, in which the structure of the system may change over time. In the third lecture I will consider systems where the exact details of behaviour may not be known and present the process algebra ProPPA, which combines aspects of machine learning and inference with formal quantitative models.
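
As a small illustration of the notation the first lecture builds on, here is a PEPA-style client–server sketch; the component names, activities and rates r1–r4 are illustrative assumptions rather than an example taken from the lectures:

Client = (request, r1).(think, r2).Client
Server = (request, r3).(serve, r4).Server
System = Client <request> Server

The two components cooperate on the shared activity request, and the exponential rates attached to activities are what allow quantitative questions, such as expected response time or throughput, to be answered from the model.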

Timetable:
Lecture 1: 9:30 – 10:30 – Performance Evaluation Process Algebra (PEPA)
Coffee break: 10:30 – 11:15
Lecture 2: 11:15 – 12:15 – Collective Adaptive Resource-sharing Markovian Agents (CARMA)
Lecture 3: 14:15 – 15:15 – Probabilistic Programming for Stochastic Dynamical Systems (ProPPA)


Venue: Upper and Lower College Halls

Event details

  • When: 8th April 2019 09:30 - 15:30
  • Where: Lower College Hall
  • Series: Distinguished Lectures Series
  • Format: Distinguished lecture

Hugh Leather (Edinburgh): Deep Learning for Compilers (School Seminar)

Abstract:

Writing optimising compilers is difficult. The range of programs that may be presented to the compiler is huge, and the systems on which they run are complex, heterogeneous, non-deterministic, and constantly changing. Machine learning has been shown to make writing compiler heuristics easier, but many issues remain.

In this talk I will discuss recent advances in using deep learning to solve compiler issues: learning heuristics and testing compiler correctness.

Speaker Bio:

Hugh is a reader (associate professor) at the University of Edinburgh. His research involves all elements of compilers and operating systems, usually targeting performance and energy optimisation, often with a focus on using machine learning for those tasks. After his PhD, also at Edinburgh, he held a Fellowship from the Royal Academy of Engineering. Before returning to academia, he was an engineer at Microsoft and an architect and team leader at Trilogy, delivering multi-million dollar projects to Fortune 500 companies.

Event details

  • When: 9th April 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Paul-Olivier Dehaye: From Cambridge Analytica to the future of online services: a personal journey (School Seminar)

Abstract:

2018 was a crazy year for privacy. The General Data Protection Regulation came into force in May, and new revelations about the personal data ecosystem were making headlines on a weekly basis. I will give the behind-the-scenes view of many of these events, question why they didn’t happen earlier, and offer some thoughts on the necessary future of online services. This will include a brief discussion of topics such as semantic alignment, interpretable machine learning, and new privacy-preserving data processing techniques.

Speaker Bio:

Paul-Olivier Dehaye is a mathematician by training. He was affiliated with the University of Zurich as an SNSF Assistant Professor until 2016, when his career took a turn towards data protection activism and social entrepreneurship. He was the researcher behind several news articles that have reached millions of readers (including many with Carole Cadwalladr), and he has testified in front of the UK and EU Parliaments on multiple occasions. He is on the board of MyData Global, and has founded the NGO PersonalData.IO and the project MyData Geneva.

Event details

  • When: 19th March 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Rachel Menzies (Dundee): Unlocking Accessible Escape Rooms: Is Technology the Key? (School Seminar)

Abstract:

Escape rooms are popular recreational activities whereby players are locked in a room and must solve a series of puzzles in order to ‘escape’. Recent years have seen a large expansion in the technology being used in these rooms in order to provide ever-changing and increasingly immersive experiences. This technology could be used to minimise accessibility issues for users, e.g. those with hearing or visual impairments, so that they can engage in the same way as their peers without disabilities. Escape room designers and players completed an online questionnaire exploring the use of technology and the accessibility of escape rooms. Results show that accessibility remains a key challenge in the design and implementation of escape rooms, despite the inclusion of technology that could be used to improve the experience of users with disabilities. This presentation will explore the lack of accessibility within escape rooms and the potential for technology to bridge this gap.

Speaker Bio:

Dr Rachel Menzies is the Head of Undergraduate Studies for Computing at the University of Dundee and is the current SICSA Director of Education (https://www.sicsa.ac.uk/education/). She co-directs the UX’d research group (https://www.ux-d.co.uk/) and her research interests include user-centred design with marginalised user groups, such as users with disabilities, as well as exploring novel interfaces, data visualisation and CS education. Her most recent work focusses on accessibility in escape rooms, in particular how users with varied disabilities can access and enjoy the experience alongside typical users.

Event details

  • When: 2nd April 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Marina Romanchikova (NPL): How good are our data? Measuring data quality at the National Physical Laboratory (School Seminar)

Abstract:

From mapping the spread of disease to monitoring climate change, data holds the key to solving some of the world’s biggest challenges. Dependable decisions rely on understanding the provenance and reliability of data. Historically, only a small fraction of the generated data was shared and re-used, while the majority of data were used once and then erased or archived. At NPL Data Science we are defining best practice in measurement data reuse and traceability by developing metadata standards and data storage structures to locate and interpret datasets and make them available for sharing, publication and data mining.

The talk will shed light on the most pressing issues in scientific data management, illustrated with examples from industrial and academic practice. It will present several NPL Data Science projects that focus on delivering confidence in data obtained from life science imaging, medicine, geosciences and fundamental physics.

Speaker Bio:

Dr Marina Romanchikova joined the NPL Data Science team in 2017 to work on data quality and metadata standards. She obtained an MSc in Medical Informatics at the University of Heidelberg, Germany, where she specialised in medical image processing and in the management of hospital information systems. In 2010 she received a PhD on Monte Carlo dosimetry for targeted radionuclide therapy at the Institute of Cancer Research in Sutton, UK. Marina worked for six years as a radiotherapy research physicist at Cambridge University Hospitals, where she developed methods for the curation and analysis of medical images.

Current interests

– Quantitative quality assessment of medical images and medical image segmentation
– Harmonisation of medical and healthcare data from heterogeneous sources
– Applications of machine learning in healthcare
– Automated data quality assurance

Event details

  • When: 12th March 2019 14:00 - 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar