School Seminar – Barry Brown

Mobility in vivo

Barry Brown, Co-director, Mobile Life, Stockholm University

barbro.tumblr.com
The Mobile VINN Excellence Centre

Abstract
Despite the widespread use of mobile devices, details of mobile technology use ‘in the wild’ have proven difficult to collect. In this study we use video data to gain new insights into the use of mobile computing devices. Screen-captures of smartphone use, combined with video recordings from wearable cameras, allow for detailed analysis of device use across a variety of activities and settings. We use this data to describe how mobile device use is threaded into other co-present activities, focusing on the use of maps and internet searches to support users on a day-trip. Close analysis of the video data reveals novel aspects of how gestures are used on touch screens, in that they form a resource for the ongoing coordination of joint action. We go on to describe how the local environment and information in the environment are combined to guide and support action. In conclusion, we argue that the mobility of mobile devices is as much about this interweaving of activity and device use as it is about physical portability.

Event details

  • When: 1st October 2012 15:00 - 16:00
  • Where: Phys Theatre C
  • Format: Seminar

Postgraduate Computer Science BBQ

After a busy week of welcome talks and induction, orientation week drew to a close with the postgraduate BBQ.

MSc and PhD students had an opportunity to meet each other, discuss their diverse backgrounds and previous studies, eat burgers and Twiglets, and sample the local delicacy Irn Bru.

Images Courtesy of Anne Campbell

Orientation Week BBQ

It was great to see so many undergraduate computer science students at the Orientation Barbecue yesterday. New and returning students had the opportunity to discuss the merits of studying computer science, eat burgers and consume the traditional Irn Bru in a friendly setting.

The Gaming/Programming Competition winners also received their prizes in the form of Amazon vouchers. Congratulations to Maciej, Simon and Daniel.

PhD Reading Party 2012

The PhD Reading Party was held at the Burn House, just outside Edzell in the
North East of Scotland.

It was an opportunity for the research students to give a talk about their
research interests in a relaxed atmosphere. It also allowed for some
socialising while wandering through the nearby woods and along the rivers.

Each student gave a 20-minute talk, including time for questions and discussion.

In the free time, some went off to explore the nearby forest and salmon-rich
river, while others decided to take a trip to Montrose and test the North Sea.

Text and Images Courtesy of Ruth Hoffmann


St Andrews Algorithmic Programming Competition

When: Wednesday 12th of September 9:30am – 5pm (with a 1 hour break for lunch)
Where: Sub-honours lab in Jack Cole building (0.35)

As part of this competition, you may be offered an opportunity to participate in a Human-Computer Interaction study on subtle interaction. Participation in this study is completely voluntary.

There will be two competitive categories:

HCI study participants:
  • 1st prize: 7” Samsung Galaxy Tab 2
  • 2nd prize: £50 Amazon voucher
  • 3rd prize: £20 Amazon voucher

Everyone:
  • 1st prize: £50 Amazon voucher
  • 2nd prize: £20 Amazon voucher
  • 3rd prize: £10 Amazon voucher

We will try to include as many programming languages as is reasonable, so if you have any special requests, let us know.
If you have one, bring a laptop in case we run out of lab computers!
If you have any questions, please email Jakub on jd67@st-andrews.ac.uk

Event details

  • When: 12th September 2012 09:30 - 17:00
  • Where: Cole 0.35 - Subhons Lab

Facing Healthcare’s Future: Designing Facial Expressivity for Robotic Patient Mannequins

Speaker: Laurel Riek, University of Notre Dame
Title: Facing Healthcare’s Future: Designing Facial Expressivity for Robotic Patient Mannequins

Abstract:

In the United States, an estimated 98,000 people are killed and $17.1 billion lost each year due to medical errors. One way to prevent these errors is to have clinical students engage in simulation-based medical education, to help move the learning curve away from the patient. This training often takes place on human-sized android robots, called high-fidelity patient simulators (HFPS), which are capable of conveying human-like physiological cues (e.g., respiration, heart rate). Training with them can include anything from diagnostic skills (e.g., recognizing sepsis, a failure that recently killed 12-year-old Rory Staunton) to procedural skills (e.g., IV insertion) to communication skills (e.g., breaking bad news). HFPS systems allow students a chance to safely make mistakes within a simulation context without harming real patients, with the goal that these skills will ultimately be transferable to real patients.

While simulator use is a step in the right direction toward safer healthcare, one major challenge and critical technology gap is that none of the commercially available HFPS systems exhibit facial expressions, gaze, or realistic mouth movements, despite the vital importance of these cues in helping providers assess and treat patients. This is a critical omission, because almost all areas of health care involve face-to-face interaction, and there is overwhelming evidence that providers who are skilled at decoding communication cues are better healthcare providers – they have improved outcomes, higher compliance, greater safety, higher satisfaction, and they experience fewer malpractice lawsuits. In fact, communication errors are the leading cause of avoidable patient harm in the US: they are the root cause of 70% of sentinel events, 75% of which lead to a patient dying.

In the Robotics, Health, and Communication (RHC) Lab at the University of Notre Dame, we are addressing this problem by leveraging our expertise in android robotics and social signal processing to design and build a new, facially expressive, interactive HFPS system. In this talk, I will discuss our efforts to date, including: in situ observational studies exploring how individuals, teams, and operators interact with existing HFPS technology; design-focused interviews with simulation center directors and educators in which future HFPS systems are envisioned; and initial software prototyping efforts incorporating novel facial expression synthesis techniques.

Biography:

Dr. Laurel Riek is the Clare Boothe Luce Assistant Professor of Computer Science and Engineering at the University of Notre Dame. She directs the RHC Lab, and leads research on human-robot interaction, social signal processing, facial expression synthesis, and clinical communication. She received her PhD at the University of Cambridge Computer Laboratory, and prior to that worked for eight years as a Senior Artificial Intelligence Engineer and Roboticist at MITRE.

Event details

  • When: 4th September 2012 13:00 - 14:00
  • Where: Cole 1.33a
  • Format: Seminar

Forthcoming talk by SICSA Distinguished Visitor

Room 1.33a at 2:00 pm on Friday 7th September 2012

  • Introduction to Grammatical Formalisms for Natural Language Parsing
  • Giorgio Satta, Department of Information Engineering, University of Padua, Italy

Abstract:
In the field of natural language parsing, the syntax of natural languages is
modeled by means of formal grammars and automata. Sometimes these formalisms
are borrowed from the field of formal language theory and are adapted to the
task at hand, as in the case of context-free grammars and their lexicalized
versions, where each individual rule is specialized for one or more lexical
items. Sometimes these formalisms are newly developed, as in the case of
dependency grammars and tree adjoining grammars. In this talk, I will
briefly overview several of these models, discussing their mathematical
properties and their use in parsing of natural language.
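As a flavour of what the talk covers, context-free grammars admit simple polynomial-time recognition algorithms such as CYK. The sketch below is purely illustrative (the toy grammar and all names are our own, not from the talk): a tiny grammar in Chomsky normal form and a CYK-style recognizer in Python.

```python
from itertools import product

# Toy grammar in Chomsky normal form (illustrative only):
#   S -> NP VP,  VP -> V NP,  NP -> 'she' | 'fish',  V -> 'eats'
BINARY_RULES = {("NP", "VP"): "S", ("V", "NP"): "VP"}
LEXICAL_RULES = {"she": {"NP"}, "fish": {"NP"}, "eats": {"V"}}

def cyk_recognize(words):
    """Return True if the word sequence is derivable from S."""
    n = len(words)
    # chart[i][j] holds the nonterminals that span words[i..j] inclusive
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        chart[i][i] = set(LEXICAL_RULES.get(w, ()))
    for span in range(2, n + 1):          # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):         # try every split point
                for left, right in product(chart[i][k], chart[k + 1][j]):
                    if (left, right) in BINARY_RULES:
                        chart[i][j].add(BINARY_RULES[(left, right)])
    return "S" in chart[0][n - 1]

print(cyk_recognize(["she", "eats", "fish"]))  # True
print(cyk_recognize(["fish", "eats"]))         # False
```

Lexicalized grammars, as mentioned in the abstract, refine such rules so that each one is anchored to particular words, at the cost of a much larger rule set.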

Event details

  • When: 7th September 2012 14:00 - 15:00
  • Where: Cole 1.33a
  • Format: Seminar, Talk

MSc Poster Demo Session 2012

After a summer of hard work the MSc student poster presentations and project demos took place earlier today. Dissertations were submitted on Monday. We wish them every success as they approach graduation and look forward to seeing them again in November!



Soundcomber: A Stealthy and Context-Aware Sound Trojan for Smartphones

Seminar by Dr Apu Kapadia, Indiana University

We introduce Soundcomber, a “sensory malware” for smartphones that
uses the microphone to steal private information from phone
conversations. Soundcomber is lightweight and stealthy. It uses
targeted profiles to locally analyze portions of speech likely to
contain information such as credit card numbers. It evades known
defenses by transferring small amounts of private data to the malware
server utilizing smartphone-specific covert channels. Additionally, we
present a general defensive architecture that prevents such sensory
malware attacks.

Event details

  • When: 9th August 2012 14:00 - 15:00
  • Where: Cole 1.33
  • Format: Seminar

Mobile user study masterclass

SICSA Masterclass
Mobile user studies and experience sampling

Dr Apu Kapadia is a Distinguished SICSA Visitor in August 2012. As
part of his visit we are organising a pair of masterclasses in running
mobile user studies. These masterclasses are open to all SICSA PhD
students. Students will need to be available to attend both
masterclasses:
– Thursday 2 August, University of Glasgow
– Thursday 9 August, University of St Andrews

The classes will cover how to design and run a mobile user study using
smartphones, and in particular cover the use of the experience
sampling method (ESM), a currently popular methodology for collecting
rich data from real-world participants. In the first class, attendees
will learn about the methodology and be given a smartphone. Attendees
will then carry the smartphone and participate in a small study, and
we will cover data analysis in the second class in St Andrews. The
organisers have experience in running ESM studies which have looked at
mobility, social networking, security and privacy, but the methodology
should be of interest to PhD students in both the NGI and MMI themes.
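A core design decision in an ESM study is when to prompt participants. The sketch below (our own illustration, not from the masterclass; the function name and parameters are hypothetical) shows one common signal-contingent approach: drawing random prompt times within a waking window while enforcing a minimum gap between prompts.

```python
import random
from datetime import datetime, timedelta

def esm_schedule(day, n_prompts=6, start_hour=9, end_hour=21, min_gap_minutes=45):
    """Draw n_prompts random prompt times in the waking window, with at least
    min_gap_minutes between consecutive prompts (parameters are illustrative)."""
    window = (end_hour - start_hour) * 60  # window length in minutes
    while True:
        # rejection sampling: redraw until all gaps are large enough
        offsets = sorted(random.sample(range(window), n_prompts))
        gaps = [b - a for a, b in zip(offsets, offsets[1:])]
        if all(g >= min_gap_minutes for g in gaps):
            break
    base = day.replace(hour=start_hour, minute=0, second=0, microsecond=0)
    return [base + timedelta(minutes=m) for m in offsets]

for t in esm_schedule(datetime(2012, 8, 2)):
    print(t.strftime("%H:%M"))
```

The minimum gap prevents clustered prompts from burdening participants, while the randomness avoids the anticipation effects that fixed schedules can introduce.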

Biography of Dr Apu Kapadia:

Apu Kapadia is an Assistant Professor of Computer Science and
Informatics at the School of Informatics and Computing, Indiana
University. He received his Ph.D. in Computer Science from the
University of Illinois at Urbana-Champaign in October 2005.

Dr Kapadia has published over thirty peer-reviewed conference papers
and journal articles focused on privacy, with several of these at
top-tier venues such as ACM TISSEC, IEEE TDSC, PMC, CCS, NDSS,
Pervasive, and SOUPS. For his work on accountable anonymity, two of
his papers were named as “Runners-up for PET Award 2009: Outstanding
Research in Privacy Enhancing Technologies”, a prestigious award in
the privacy community. His work on usable metaphors for controlling
privacy was given the “Honorable Mention Award (Runner-up for Best
Paper)” at Pervasive. Dr Kapadia’s recent work on smartphone
“sensory” malware that makes use of onboard sensors was published at
NDSS and received widespread media coverage. His work on analyzing
privacy leaks on Twitter also received media attention, being named
one of the “7 Must-Read Twitter Studies from 2011” and one of “The
10 Most Interesting Social Media Studies of 2011”.

Dr Kapadia is interested in topics related to systems security and
privacy. He is particularly interested in security and privacy issues
related to mobile sensing, privacy-enhancing technologies to
facilitate anonymous access to services with some degree of
accountability, usable mechanisms to improve security and privacy, and
security in decentralized and mobile environments.

Event details

  • When: 9th August 2012 11:00 - 17:00
  • Where: Cole 1.33