UCAS applicants are welcomed to the School of Computer Science for an Undergraduate visiting day.
Event details
- When: 10th March 2012 11:00 - 21:00
- Format: Visiting Day
The next SCONE meeting will be held in the School of Computer Science
in St Andrews on Friday 24 February. We will start with lunch at 1200,
and the main event from 1300-1700 before adjourning to a pub.
To register, please e-mail Tristan Henderson so that we can organise numbers for
catering.
The format for this meeting will be a very small number of talks, and
a PhD poster session. If you are a PhD student, then please consider
bringing a poster to advertise your work and elicit feedback. If you
are a PhD supervisor, then please encourage your students to present a
poster.
If you are not a PhD student and would like to give a talk, then
please also get in touch.
The programme will eventually become available here.
A talk on “Proactive contextual information retrieval” by Samuel Kaski of Aalto University and University of Helsinki, Finland.
Abstract:
In proactive information retrieval the ultimate goal is to seamlessly access relevant multimodal information in a context-sensitive way. Explicit queries are usually unavailable or insufficient, and the alternative is to try to infer users’ interests from implicit feedback signals, such as clickstreams or eye tracking. We have studied how to infer the relevance of texts and images to the user from eye movement patterns. The interests, formulated as an implicit query, can then be used in further searches. I will discuss our new machine-learning results in this field, including a data-glasses-based augmented reality interface to contextual information, and timeline browsers for life logs.
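The pipeline the abstract describes (implicit signals become an implicit query, which drives further searches) can be made concrete with a toy example. The Haskell sketch below is purely illustrative and is my own, not the speaker's method or code: it weights the terms of previously viewed texts by a relevance signal (standing in here for fixation time) and ranks new texts against the resulting implicit query.

    import qualified Data.Map.Strict as M
    import Data.List (sortBy)
    import Data.Ord (comparing, Down (..))

    type TermWeights = M.Map String Double

    -- Bag-of-words term counts for a text.
    counts :: String -> TermWeights
    counts = M.fromListWith (+) . map (\w -> (w, 1)) . words

    -- An implicit query: terms of viewed texts, each text weighted
    -- by an implicit relevance signal (e.g. total fixation time).
    implicitQuery :: [(String, Double)] -> TermWeights
    implicitQuery viewed =
      M.unionsWith (+) [ M.map (* w) (counts t) | (t, w) <- viewed ]

    -- Score a candidate against the query (a simple dot product).
    score :: TermWeights -> String -> Double
    score q t = sum [ w * M.findWithDefault 0 term q
                    | (term, w) <- M.toList (counts t) ]

    -- Rank candidate texts, most relevant first.
    rank :: TermWeights -> [String] -> [String]
    rank q = sortBy (comparing (Down . score q))

    main :: IO ()
    main = do
      let q = implicitQuery
                [ ("augmented reality glasses", 2.5)   -- long fixation
                , ("train times helsinki",      0.3) ] -- brief glance
      mapM_ putStrLn (rank q [ "glasses for augmented reality"
                             , "helsinki train times today" ])

In the research described, the weights would themselves be inferred from eye-movement patterns by machine learning rather than supplied by hand.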
An overview of ALife in general; some of the research (including neuroscience, genetic algorithms, information theory, and animal cognition) leading to my incremental, evolved approach to AI; and the work I (and others) have done in this area.
Venue: UCH (Upper College Hall)
The School is hosting a St Andrews Day Graduation reception on Wednesday 30th November from 14:00.
Professor Simon Dobson, School of Computer Science, will deliver his Inaugural Lecture “The computer is the new microscope” in the Lecture Theatre, Medical and Biological Sciences Building, on Wednesday 7 December 2011 at 5.15 p.m. PLEASE NOTE CHANGE OF VENUE.
The Principal will take the Chair and the Dean of Science will give the vote of thanks.
The School will host a reception in the coffee area (near the foyer) of the Jack Cole Building.
Abstract: Modern biological research hinges on technologies that are able to generate very large and complex datasets. For example, recent advances in DNA sequencing technologies have led to global collections in the multi-petabyte range that are doubling every five months. These data require organising in a form that allows interpretation by a very large and diverse user community, interested in everything from human health and disease, through crop and animal breeding, to the understanding of ecosystems. In this talk I will first give an overview of core molecular biology concepts and some of the different types of data that are currently collected. I will then focus on work from my group on the visualisation and analysis of sequence alignment data, before turning to examples of predicting properties and features from biological data.
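As background for "analysis of sequence alignment data": aligning two sequences means lining them up, gaps allowed, so as to maximise a similarity score. The following Haskell sketch is my own toy illustration (the scoring scheme and example sequences are assumptions, not taken from the talk) of the classic Needleman-Wunsch global alignment score.

    import Data.Array

    -- Needleman-Wunsch global alignment score:
    -- match +1, mismatch -1, gap -1.
    nwScore :: String -> String -> Int
    nwScore xs ys = table ! (n, m)
      where
        n  = length xs
        m  = length ys
        ax = listArray (1, n) xs
        ay = listArray (1, m) ys
        -- Lazily self-referential dynamic-programming table.
        table = listArray ((0, 0), (n, m))
                  [ cell i j | i <- [0 .. n], j <- [0 .. m] ]
        cell 0 j = -j                     -- align a prefix of ys against gaps
        cell i 0 = -i                     -- align a prefix of xs against gaps
        cell i j = maximum
          [ table ! (i - 1, j - 1) + (if ax ! i == ay ! j then 1 else -1)
          , table ! (i - 1, j) - 1        -- gap in ys
          , table ! (i, j - 1) - 1 ]      -- gap in xs

    main :: IO ()
    main = print (nwScore "GATTACA" "GCATGCU") -- prints 0

Real tools work at vastly larger scale and visualise the resulting alignments; this only shows the core recurrence.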
Title: Multimodal mobile interaction – making the most of our users’ capabilities
Mobile user interfaces are commonly based on techniques developed for desktop computers in the 1970s, often including buttons, sliders, windows and progress bars. These can be hard to use on the move, which limits the way we use our devices and the applications on them. This talk will look at the possibility of moving away from these kinds of interactions to ones more suited to mobile devices and their dynamic contexts of use, where users need to be able to look where they are going, carry shopping bags and hold on to children. Multimodal (gestural, audio and haptic) interactions provide us with new ways to use our devices that can be eyes-free and hands-free, and allow users to interact in a ‘head up’ way. These new interactions will facilitate new services, applications and devices that fit better into our daily lives and allow us to do a whole host of new things.
I will discuss some of the work we are doing on input using gestures made with the fingers, wrist and head, along with work on output using non-speech audio, 3D sound and tactile displays, in mobile applications such as text entry, camera phone user interfaces and navigation. I will also discuss some of the issues of social acceptability of these new interfaces; we have to be careful that the new ways we want people to use devices are socially appropriate and don’t make us feel embarrassed or awkward.
Biography: Stephen is a Professor of Human-Computer Interaction in the Department of Computing Science at the University of Glasgow, UK. His main research interest is in Multimodal Human-Computer Interaction: sound, haptics and gestures. He has done a lot of research into Earcons, a particular form of non-speech sound. He completed his degree in Computer Science at the University of Hertfordshire in the UK. After a period in industry he did his PhD in the Human-Computer Interaction Group at the University of York in the UK with Dr Alistair Edwards. The title of his thesis is “Providing a structured method for integrating non-speech audio into human-computer interfaces”; that is where he developed his interests in earcons and non-speech sound. After finishing his PhD he worked as a research fellow for the European Union as part of the European Research Consortium for Informatics and Mathematics (ERCIM). From September 1994 to March 1995 he worked at VTT Information Technology in Helsinki, Finland, and he then worked at SINTEF DELAB in Trondheim, Norway.
Systems software, such as an operating system or a network stack, underlies everything we do on a computer, whether that computer is a desktop machine, a server, a mobile phone, or any embedded device. It is therefore vital that such software operates correctly in all situations. In recent years, dependent types have emerged as a promising approach to ensuring program correctness, using languages and verification tools such as Agda and Coq. However, these tools operate at a high level of abstraction, and it can be difficult to map the verified programs to efficient low-level code that works with bit-level operations and interacts directly with system services.
In this talk I will describe Idris, a dependently typed programming language implemented with systems programming in mind. I will show how it may be used to implement programs which interact safely with the operating system, in particular how to give precise APIs for verifiable systems programming with external C libraries.
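For readers new to dependent types: they allow types to mention values, so whole classes of errors become compile-time type errors. Idris code is not reproduced here; as a rough flavour only, this is my own sketch approximating the idea in Haskell with GADTs rather than in Idris itself, using a vector type indexed by its length.

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    -- Natural numbers, promoted to the type level via DataKinds.
    data Nat = Z | S Nat

    -- A vector whose length appears in its type.
    data Vec (n :: Nat) a where
      Nil  :: Vec 'Z a
      Cons :: a -> Vec n a -> Vec ('S n) a

    -- Safe head: the type only admits non-empty vectors, so
    -- "head of an empty list" is rejected at compile time.
    vhead :: Vec ('S n) a -> a
    vhead (Cons x _) = x

    -- Zipping demands two vectors of exactly the same length;
    -- the compiler, not the programmer, checks the invariant.
    vzip :: Vec n a -> Vec n b -> Vec n (a, b)
    vzip Nil         Nil         = Nil
    vzip (Cons x xs) (Cons y ys) = Cons (x, y) (vzip xs ys)

    main :: IO ()
    main = print (vhead (Cons (1 :: Int) (Cons 2 Nil)))
    -- vhead Nil would not compile.

Idris makes such types first class, and, as the abstract notes, extends the approach down to precise typed APIs over external C libraries.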
Bio: Edwin Brady is a SICSA Advanced Research Fellow at the University of St Andrews (http://www.cs.st-andrews.ac.uk/~eb).