Dr Adam Barker Awarded Royal Society Fellowship

Dr Adam Barker has been awarded a prestigious Royal Society Industry Fellowship. The scheme aims to enhance knowledge transfer in science and technology in the UK, and provides an outstanding opportunity to demonstrate how industry and academia can work together effectively to drive innovation.

Adam will spend 50% of his time over two years on a collaborative project at Cloudsoft, based in TechCube, a world-class startup space in Edinburgh. He will work on multi-cloud application management with Dr Alex Heneveld, Co-Founder and Chief Technology Officer (CTO), and his team, contributing to Brooklyn, an open-source, policy-driven control plane for distributed applications, and to the OASIS Cloud Application Management for Platforms (CAMP) standard.

MSc Poster Demo Session 2013

After a summer of hard work, our MSc students submitted their final dissertations last week. Earlier today they had an opportunity to present their posters and demonstrate their project artefacts.

MSc students complete with poster.

With prizes for the top 3 posters and cakes for all, the session was very busy and provided the perfect occasion for students to reflect on their own dissertation journeys and appreciate the projects completed by their peers. Congratulations to Ilya Lvov, Oleg Iliev and Olalekan Baruwa, who received the coveted Amazon vouchers for best poster.

Main image from left to right: Ilya Lvov, Oleg Iliev and Olalekan Baruwa complete with Amazon vouchers


Poster Titles

  • Ilya Lvov, Data Journalism: Tools and Practices
  • Oleg Iliev, Exploration of QoS and QoE using radar charts
  • Olalekan Baruwa, IVF-predict and MyOvaries: An exploration, implementation and deployment of Bio-Medical Mobile Software Applications.

We wish them every success as they approach graduation and look forward to seeing them again in November!

Dr Per Ola Kristensson tipped to change the world

Dr Per Ola Kristensson is one of 35 top young innovators named today by the prestigious MIT Technology Review.

For over a decade, the global media company has recognised a list of exceptionally talented technologists whose work has great potential to “transform the world.”

Dr Kristensson (34) joins a stellar list of technological talent. Previous winners include Larry Page and Sergey Brin, the cofounders of Google; Mark Zuckerberg, the cofounder of Facebook; Jonathan Ive, the chief designer of Apple; and David Karp, the creator of Tumblr.

The award recognises Per Ola’s work at the intersection of artificial intelligence and human-computer interaction. He builds intelligent interactive systems that enable people to be more creative, expressive and satisfied in their daily lives, focusing on text entry interfaces and other interaction techniques.

One example is the gesture keyboard, which enables users to quickly and accurately write text on mobile devices by sliding a finger across a touchscreen keyboard. To write “the”, the user touches the T key, slides to the H key, then to the E key, and then lifts the finger. The result is a shorthand gesture for the word “the”, which a recognition algorithm can identify as the user’s intended word. Today, gesture keyboards are found in products such as ShapeWriter, Swype and T9 Trace, and come pre-installed on Android phones. Per Ola’s own ShapeWriter, Inc. iPhone app, ranked the 8th best app by Time Magazine in 2008, had a million downloads in the first few months.

Two factors explain the success of the gesture keyboard: speed and ease of adoption. Gesture keyboards are faster than regular touchscreen keyboards because expert users can quickly gesture a word by direct recall from motor memory. They are easy to adopt because they let users smoothly and unconsciously transition from slow visual tracing to this fast recall from motor memory. Novice users spell out words by sliding their finger from letter to letter using visually guided movements. With repetition, the gesture gradually builds up in the user’s motor memory until it can be quickly recalled.

A gesture keyboard works by matching the gesture made on the keyboard to a set of possible words, and then deciding which word is intended by looking at both the gesture and the contents of the sentence being entered. Doing this can require checking as many as 60,000 possible words; doing it quickly on a mobile phone required developing new techniques for searching, indexing and caching.
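
To give a flavour of how such a decoder can work, here is a small sketch in Python that scores each candidate word by combining a shape match between the gesture and the word’s ideal trace with a language-model probability, and picks the highest-scoring word. The keyboard grid, lexicon and probabilities are made-up assumptions for illustration, not the algorithm used in ShapeWriter or any other product.

```python
# Minimal sketch of gesture-keyboard decoding. The keyboard grid, lexicon,
# language-model probabilities and weighting below are illustrative
# assumptions only, not the algorithm behind any shipping keyboard.
import math

# Approximate key centres on a QWERTY grid (x = column, y = row).
KEY_POS = {c: (float(x), float(y))
           for y, row in enumerate(["qwertyuiop", "asdfghjkl", "zxcvbnm"])
           for x, c in enumerate(row)}

def resample(points, n=32):
    """Resample a polyline to n equidistant points so two shapes can be compared."""
    if len(points) == 1:
        return points * n
    dists = [0.0]                                   # cumulative arc length
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = i * total / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        seg = (dists[j + 1] - dists[j]) or 1.0
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def shape_distance(gesture, word):
    """Mean point-to-point distance between the gesture and the word's ideal trace."""
    template = resample([KEY_POS[c] for c in word])
    sampled = resample(gesture)
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(sampled, template)) / len(sampled)

def decode(gesture, lexicon, lm_prob, alpha=1.0):
    """Pick the word whose combined shape score and language-model score is highest."""
    def score(word):
        return -alpha * shape_distance(gesture, word) + math.log(lm_prob.get(word, 1e-9))
    return max(lexicon, key=score)

# Toy usage: a slightly sloppy trace near T -> H -> E against a tiny lexicon.
lexicon = ["the", "tie", "toe", "thee"]
lm_prob = {"the": 0.6, "tie": 0.05, "toe": 0.05, "thee": 0.01}
gesture = [(4.1, 0.0), (5.3, 0.9), (2.2, 0.1)]   # roughly t -> h -> e on the grid
print(decode(gesture, lexicon, lm_prob))         # expected output: the
```

In a real decoder the lexicon would also be indexed and cached so that only a small fraction of the tens of thousands of candidate words ever needs to be scored.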

An example of a gesture recognition algorithm is available here as an interactive Java demo: http://pokristensson.com/increc.html

There are many ways to improve gesture keyboard technology. One way to improve recognition accuracy is to use more sophisticated gesture recognition algorithms to compute the likelihood that a user’s gesture matches the shape of a word; many researchers work on this problem. Another way is to use better language models. These models can be dramatically improved by identifying large bodies of text similar to what users want to write, often by mining the web. Language models can also be improved with better estimation algorithms. For example, smoothing is the process of assigning some of the probability mass of the language model to word sequences the estimation algorithm has never seen, which tends to improve the model’s ability to predict words accurately.
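
As a concrete illustration of the smoothing idea, the sketch below (a toy example, not any particular keyboard’s language model) builds a tiny bigram model with add-one (Laplace) smoothing, so that word pairs the model has never seen still receive a small, non-zero probability.

```python
# Minimal sketch of language-model smoothing: a bigram model with add-one
# (Laplace) smoothing. The tiny corpus and smoothing constant are toy
# assumptions; real keyboards use far larger corpora and more refined schemes.
from collections import Counter

corpus = "the cat sat on the mat the cat slept".split()
vocab = set(corpus)

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(prev, word, k=1.0):
    """P(word | prev) with add-k smoothing: unseen bigrams still get probability mass."""
    return (bigrams[(prev, word)] + k) / (unigrams[prev] + k * len(vocab))

print(bigram_prob("the", "cat"))    # seen bigram: relatively high probability
print(bigram_prob("the", "slept"))  # unseen bigram: small but non-zero probability
```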

An interesting point about gesture keyboards is how they may disrupt other areas of computer input. Recently we have developed a system that enables a user to enter text via speech recognition, a gesture keyboard, or a combination of both. Users can fix speech recognition errors by simply gesturing the intended word. The system automatically realises there is a speech recognition error, locates it, and replaces the erroneous word with the result provided by the gesture keyboard. This is possible by fusing the probabilistic information provided by the speech recogniser and the gesture keyboard.
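
In heavily simplified form, the fusion can be pictured as in the sketch below: hypothetical per-word probabilities from a speech recogniser and from a gesture keyboard are multiplied together, and the word slot where the two sources agree most strongly is taken as the error to correct. The words and numbers are invented for illustration; the published system (see Kristensson and Vertanen, 2011, below) is considerably more elaborate.

```python
# Minimal sketch of fusing speech and gesture-keyboard probabilities to correct
# a misrecognised word. All candidate lists and probabilities are invented for
# illustration; the published system is considerably more sophisticated.

# Speech recogniser output: one distribution over candidate words per word slot.
speech = [
    {"the": 0.90, "a": 0.10},
    {"cast": 0.55, "cat": 0.40, "cart": 0.05},   # misrecognised slot
    {"sat": 0.80, "sad": 0.20},
]

# Gesture-keyboard output for the correction gesture the user just made.
gesture = {"cat": 0.7, "cart": 0.2, "cot": 0.1}

def best_correction(speech_slots, gesture_dist):
    """Return (score, slot, word) where the two evidence sources agree most strongly."""
    best = (0.0, None, None)
    for i, slot in enumerate(speech_slots):
        for word, g_p in gesture_dist.items():
            fused = g_p * slot.get(word, 1e-6)   # multiply the independent evidence
            if fused > best[0]:
                best = (fused, i, word)
    return best

score, slot, word = best_correction(speech, gesture)
print(f"replace word {slot} with '{word}' (fused score {score:.3f})")
# Expected: the gesture for 'cat' corrects slot 1 from 'cast' to 'cat'.
```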

Per Ola also works in the areas of multi-display systems, eye-tracking systems, and crowdsourcing and human computation. He takes on undergraduate and postgraduate project students and PhD students. If you are interested in working with him, you are encouraged to read http://pokristensson.com/phdposition.html

References:

Kristensson, P.O. and Zhai, S. 2004. SHARK2: a large vocabulary shorthand writing system for pen-based computers. In Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology (UIST 2004). ACM Press: 43-52.

(http://dx.doi.org/10.1145/1029632.1029640)

Kristensson, P.O. and Vertanen, K. 2011. Asynchronous multimodal text entry using speech and gesture keyboards. In Proceedings of the 12th Annual Conference of the International Speech Communication Association (Interspeech 2011). ISCA: 581-584.

(http://www.isca-speech.org/archive/interspeech_2011/i11_0581.html)

Full Press Release

Ambitious, entrepreneurial, innovative, employable and highflying…

Words we use to describe our alumni, who work in New York, Switzerland, London and Edinburgh amongst other places.

Whether working for established companies such as Adobe and Google or in their own business start-ups such as AetherWorks LLC. and PlanForCloud (formerly ShopForCloud), our graduates continue to flourish. And rumour has it more of our talented CS graduates will be joining some of them shortly. The suspense! They are all exemplars of why St Andrews is the only Scottish university to feature in High Fliers’ 2013 report on the graduate market.


Clockwise from top left:

Rob, Angus and Greg from AetherWorks LLC., who took time out to capture a photo of themselves outside their offices in New York.

Ali at graduation, sporting a colour-co-ordinated Google Glass (who knew!). Listen to Ali discuss his career at the SICSA PhD conference careers panel.

We caught up with Adam, Andrew and James earlier this year when they represented Google at the Tech Talk by Google engineers.

Neil (complete with sunglasses) visited the school last week, on an unusually sunny day, with colleagues from Adobe.

Thanks to:
AetherWorks LLC.: Robert MacInnis, Angus MacDonald, Allan Boyd and Greg Bigwood.
PlanForCloud: Ali Khajeh-Hosseini and Alistair Scott.
Adobe: Neil Moore.
Google: James Smith, Adam Copp and Andrew McCarthy.
Editorial Support: Anne Campbell

The 11th International Conference on Finite-State Methods and Natural Language Processing (FSMNLP 2013)

The 11th International Conference on Finite-State Methods and Natural Language Processing (FSMNLP 2013) was held in the Gateway in St Andrews on July 15-17, 2013. Seventeen peer-reviewed papers were presented on natural language processing applications, language resources, and theoretical and implementational issues relevant to finite-state methods. In addition, there were two keynote lectures, by Alexander Clark (King’s College London) and Bill Byrne (University of Cambridge), and three tutorials, by Ruth Hoffmann (University of St Andrews), Bevan Keeley Jones (University of Edinburgh) and Kousha Etessami (University of Edinburgh).

The conference was attended by 34 researchers and students from three continents. It also hosted a business meeting of SIGFSM (ACL Special Interest Group on Finite-State Methods). The social programme included a reception on July 14th, and a guided walk, a conference dinner in Lower College Hall and a concert in St Salvator’s Chapel on July 16th.

Accommodation in Agnes Blackadder Hall was arranged for non-local delegates, and lunches were served in the Gateway. Coffee breaks could be used for informal demos in the smaller seminar rooms of the Gateway.

Sponsored student places were available thanks to support from SICSA. Further support was received from VisitScotland and the University of St Andrews.

The full programme, with links to the proceedings, can be found on the conference website: http://fsmnlp2013.cs.st-andrews.ac.uk/

Images and text courtesy of Mark-Jan Nederhof (conference chair), Anssi Yli-Jyrä and Shyam Reyal.

Summer Days in Computer Science

Students and staff took advantage of the Scottish weather on Friday and held a BBQ to mark the anniversaries of these events:

  • The Great Fire of Rome
  • Birth of Computer Scientist Mark Crispin
  • The Opening of the StACS Garden


Organised by Jan de Muijnck-Hughes and David Letham. Cooked by Jan de Muijnck-Hughes and Masih Hajiarabderkani. Salad ingredients from the StACS Garden. Enjoyed by all (including Pippa the dog).

Services to the Cloud

On June 27th Gordon Baxter and Derek Wang gave a presentation about their work on the SFC-funded project “Creating High Value Cloud Services” at the Edinburgh Chamber of Commerce’s Business Growth Club.

Gordon talked about the lessons that have been learned so far from working closely with several Scottish SMEs who are adopting the cloud. Derek then gave a short demonstration of the web-based toolkit he has developed to analyse the potential costs and revenues associated with delivering a product or service through the cloud.

Find out more about the project on the Services to the Cloud page and The Cloudscape blog.

MSc in Human Computer Interaction starting in September 2013

We have added more details of our new MSc in Human Computer Interaction, which starts in September 2013. This is an intensive one-year programme designed to provide a solid theoretical and practical foundation in HCI, enabling students from a variety of backgrounds to become HCI practitioners in roles such as UX designer, visual analyst, interaction designer and interaction architect. The MSc will also help prepare students for a PhD programme in HCI. In semester 1 students take Human Computer Interaction Principles and Human Computer Interaction Practice, followed by User-Centred Interaction Design and Evaluation Methods in Human Computer Interaction in semester 2. Other modules can be selected from the general MSc portfolio.

You can find more details here on the MSc in Human Computer Interaction.
