Sinhalese Text Entry Research by Shyam Reyal, University of St Andrews


The Sinhalese language, a member of the Indo-Aryan family, is spoken, read and written by over 22 million people worldwide, including almost all citizens of Sri Lanka. The language is rich and complex: it has over 60 base characters, each with 13 vowel variations, and its contextual phrases and idioms are far more diverse than those of Western languages. Nevertheless, very little work has been done on creating efficient, user-friendly text entry mechanisms for Sinhalese on either computers or mobile devices. At present, despite attempts to standardize input methods, no single mainstream method of text entry has emerged.
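
As an aside (not part of the original post), the combinatorial size of the script is easy to see in Unicode, where a syllable is typically encoded as a base consonant followed by a dependent vowel sign. The minimal Python sketch below composes one such syllable; the code points are standard Unicode Sinhala, but the example itself is purely illustrative.

```python
# Illustrative sketch only: composing a Sinhala syllable from a base
# consonant plus a dependent vowel sign. The multiplicative pairing of
# consonants and vowel signs is what makes the user-perceived character
# set, and hence keyboard layout design, so large.
import unicodedata

ka = "\u0D9A"          # SINHALA LETTER ALPAPRAANA KAYANNA ("ka")
aela_pilla = "\u0DCF"  # SINHALA VOWEL SIGN AELA-PILLA (long "aa")

syllable = ka + aela_pilla  # renders as a single user-perceived character, "kaa"
print(syllable, [unicodedata.name(c) for c in syllable])
```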

Event details

  • When: 5th February 2013 13:00 - 14:00
  • Where: Cole 1.33a
  • Format: Seminar

School Seminar by Eoin Woods

The Role of the Software Architect in Industry

Eoin Woods is a professional software architect and amateur software architecture researcher, having spent over 20 years in software engineering practice and contributed a number of papers and a co-authored book to the research literature on software architecture. In this talk, he will discuss how the two worlds relate to each other, the context for software architecture provided by enterprise software development and what software architects actually spend their days doing. The aim of the talk is to provide an honest insight into the day-to-day work of an industrial software architect, while still inspiring people to become one!

Event details

  • When: 8th May 2012 15:00 - 16:00
  • Where: Phys Theatre C
  • Series: CS Colloquia Series
  • Format: Colloquium

Seminar: An Overview of the AspeKT Project – Turning Academic Excellence into Gold, by Colin Adams

An Overview of the AspeKT Project – Turning Academic Excellence into Gold


Abstract

The talk will give an overview of the major elements of the AspeKT project, a three-year program funded by Scottish Enterprise and the Scottish Funding Council and dedicated to improving the flow of ideas between local industry and the research excellence and talent pool produced by SICSA. It will go through the elements of the program designed to stimulate industrial innovation and a greater flow of start-ups from that research base.


Bio

Dr Colin Adams is the Director of Commercialisation at the School of Informatics, University of Edinburgh, and the Director of the AspeKT program, the knowledge transfer program for the SICSA research pool. Colin started as an academic in the 1970s before moving to Digital Equipment Corporation, where he managed the development of the VAX/VMS operating system before running the office automation business and the All-In-1 product line. He then moved into electronic design automation and silicon, founding European Silicon Structures, US Silicon Structures and EuCAD. He sold EuCAD to Cadence Design Systems, managed various Cadence businesses, and finally ran the TALITY management buy-out. After a brief attempt at retiring he returned to the School of Informatics at the University of Edinburgh to run the ProspeKT program, focusing on generating start-ups from the talent pool there.
He also chairs two local start-ups, ATEEDA and Coriolis Media, and is a non-executive director of ScotlandIS. He has a BSc in Computer Science and Mathematics and a PhD in Computer Science, both from the University of Edinburgh.

Event details

  • When: 23rd April 2012 14:00 - 15:00
  • Where: Phys Theatre C
  • Series: CS Colloquia Series
  • Format: Colloquium

Autonomy handover and rich interaction on mobile devices by Simon Rogers

Abstract: In this talk I will present some of the work being done in the new Inference, Dynamics and Interaction group at the University of Glasgow. In particular, we are interested in using probabilistic inference to improve interaction technology on handheld devices (particularly those with touch screens).

I will show how we are using sequential Monte Carlo techniques to infer distributions over user inputs, which can be (1) combined with applications to provide a smooth handover of control between the human and the device, and (2) used to extract additional information about touch interactions and subsequently improve touch accuracy.
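
For illustration only, here is a minimal sketch of the kind of sequential Monte Carlo (particle) filtering the abstract refers to, tracking a distribution over a user's intended touch point from noisy samples. The function name, screen dimensions and noise parameters are assumptions made for the example, not details of the speaker's system.

```python
# Minimal sketch (an assumption, not the speaker's implementation): a sequential
# Monte Carlo (particle) filter maintaining a distribution over the user's
# intended touch point given noisy touch-screen samples.
import math
import random

def touch_posterior(observations, n_particles=2000, motion_noise=2.0, sensor_noise=15.0):
    # Initialise particles uniformly over a hypothetical 1080x1920 screen.
    particles = [(random.uniform(0, 1080), random.uniform(0, 1920))
                 for _ in range(n_particles)]
    for ox, oy in observations:
        # Predict: the intended target drifts only slightly between samples.
        particles = [(x + random.gauss(0, motion_noise), y + random.gauss(0, motion_noise))
                     for x, y in particles]
        # Weight: Gaussian likelihood of the observed touch under each particle.
        weights = [math.exp(-((x - ox) ** 2 + (y - oy) ** 2) / (2 * sensor_noise ** 2))
                   for x, y in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Resample in proportion to the weights (a simple multinomial resample).
        particles = random.choices(particles, weights=weights, k=n_particles)
    # The particle cloud approximates the posterior over the intended touch point.
    mean_x = sum(x for x, _ in particles) / n_particles
    mean_y = sum(y for _, y in particles) / n_particles
    return (mean_x, mean_y), particles

# A short sequence of jittery samples around an intended target near (300, 500).
estimate, cloud = touch_posterior([(305, 492), (298, 507), (301, 499)])
print(estimate)
```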

There is a short bio on my webpage:
http://www.dcs.gla.ac.uk/~srogers

Event details

  • When: 19th March 2012 14:00 - 15:00
  • Where: Phys Theatre C
  • Series: CS Colloquia Series
  • Format: Colloquium, Seminar

A large-scale study of information needs by Karen Church

In recent years, mobile phones have evolved from simple communication devices to sophisticated personal computers enabling anytime, anywhere access to a wealth of information. Understanding the types of information needs that occur while mobile, and how these needs are addressed, is crucial in order to design and develop novel services that are tailored to mobile users.

To date, studies exploring information needs, in particular mobile needs, have been relatively small in terms of scope, scale and duration. The goal of this work is to investigate information needs on a much larger scale and to explore, through quantitative analysis, how those needs are addressed. To this end, we conducted one of the most comprehensive studies of information needs to date, spanning a 3-month period and involving over 100 users. The study employed an intelligent experience sampling algorithm, an online diary and SMS technology to gather insights into the types of needs that occur from day to day.

Our results not only complement earlier studies but also shed new light on the differences between mobile and non-mobile information needs, as well as the impact that demographics such as gender have on the types of needs that arise and on the means chosen to satisfy those needs. Finally, based on our findings, we point to a number of design implications for enriching the future experiences of mobile users.


Event details

  • When: 5th March 2012 14:00 - 15:00
  • Where: Phys Theatre C
  • Series: CS Colloquia Series
  • Format: Colloquium, Seminar

Distinguished Lecture Series: Artificial Life as an approach to Artificial Intelligence, by Professor Larry Yaeger

Programme: dls_sem2 12 Yaeger

An overview of ALife in general; some of the research (including neuroscience, genetic algorithms, information theory, and animal cognition) leading to my incremental, evolved approach to AI; and the work I (and others) have done in this area.


Venue: UCH (Upper College Hall)

Event details

  • When: 12th March 2012
  • Series: Distinguished Lectures Series
  • Format: Seminar

Inaugural Lecture: The computer is the new microscope by Professor Simon Dobson

Professor Simon Dobson, School of Computer Science, will deliver his Inaugural Lecture “The computer is the new microscope” in the Lecture Theatre, Medical and Biological Sciences Building, on Wednesday 7 December 2011 at 5.15 p.m.  PLEASE NOTE CHANGE OF VENUE.

The Principal will take the Chair and the Dean of Science will give the vote of thanks.

The School will host a reception in the coffee area (near the foyer) of the Jack Cole Building.

Event details

  • When: 7th December 2011 17:15 - 18:15
  • Format: Lecture

Multimodal mobile interaction – making the most of our users’ capabilities by Stephen Brewster, University of Glasgow

Title: Multimodal mobile interaction – making the most of our users’ capabilities


Mobile user interfaces are commonly based on techniques developed for desktop computers in the 1970s, often including buttons, sliders, windows and progress bars. These can be hard to use on the move, which then limits the way we use our devices and the applications on them. This talk will look at the possibility of moving away from these kinds of interactions to ones more suited to mobile devices and their dynamic contexts of use, where users need to be able to look where they are going, carry shopping bags and hold on to children. Multimodal (gestural, audio and haptic) interactions provide us with new ways to use our devices that can be eyes and hands free, and allow users to interact in a ‘head up’ way. These new interactions will facilitate new services, applications and devices that fit better into our daily lives and allow us to do a whole host of new things.


I will discuss some of the work we are doing on input using gestures made with the fingers, wrist and head, along with work on output using non-speech audio, 3D sound and tactile displays, in mobile applications such as text entry, camera phone user interfaces and navigation. I will also discuss some of the issues of social acceptability of these new interfaces; we have to be careful that the new ways we want people to use devices are socially appropriate and don’t make us feel embarrassed or awkward.


Biography: Stephen is a Professor of Human-Computer Interaction in the Department of Computing Science at the University of Glasgow, UK. His main research interest is in multimodal human-computer interaction, including sound, haptics and gestures. He has done extensive research into earcons, a particular form of non-speech sound. He completed his degree in Computer Science at the University of Hertfordshire in the UK. After a period in industry he did his PhD in the Human-Computer Interaction Group at the University of York in the UK with Dr Alistair Edwards. The title of his thesis was “Providing a structured method for integrating non-speech audio into human-computer interfaces”. That is where he developed his interests in earcons and non-speech sound. After finishing his PhD he worked as a research fellow for the European Union as part of the European Research Consortium for Informatics and Mathematics (ERCIM). From September 1994 to March 1995 he worked at VTT Information Technology in Helsinki, Finland. He then worked at SINTEF DELAB in Trondheim, Norway.

Event details

  • When: 20th February 2012 14:00 - 15:00
  • Where: Phys Theatre C
  • Series: CS Colloquia Series
  • Format: Colloquium