Workshop on Considering Technology through a Philosophical Lens

Event details

  • When: 18th May 2017 10:00 - 13:00
  • Where: Cole 1.33a
  • Format: Workshop

Technology fundamentally shapes our communication, relationships, and access to information. It also evolves through our interaction with it. Dialoguing across disciplines can facilitate an understanding of these complex and reciprocal relationships and fuel reflection and innovation.

This hands-on, participant-driven and experimental workshop will start a discussion of what can come from considering technology through a philosophical lens. Through discussions and hands-on design activities, it will provide an introduction to and reflection on questions at the intersection of computer science and philosophy, such as:

  • How have philosophy and technology shaped each other in the past?
  • How can philosophical ideas and methods guide research in Computer Science?
  • How can thinking through technology help Humanities researchers discover relevance and articulate impact in their research?

Engaging with these questions can provide participants with an entry point into exploring these themes in the context of their own research.

This workshop is aimed at researchers from computer science who are curious about philosophy and how to leverage it to inform technically oriented research questions and designing for innovation. It is also aimed at researchers in the arts & humanities, social sciences, and philosophy who are curious about current research questions and approaches in computer science and how questions of technology can stimulate philosophical thought and research.

Attending the workshop is free but please register by emailing Nick Daly: nd40[at]st-andrews.ac.uk

Organisers: Nick Daly (School of Modern Languages) and Uta Hinrichs (School of Computer Science)

 

ACM SIGCHI: Communication Ambassador & Turing Award Celebration News

Congratulations to Hui-Shyong Yeo, who has been selected as both an ACM SIGCHI communication ambassador and to represent SIGCHI at the ACM 50 Years of the A.M. Turing Award Celebration.

Yeo is a second-year PhD student who is particularly interested in exploring and developing novel interaction techniques. Since joining us in SACHI, he has had work accepted at ACM CHI 2016 and 2017, ACM MobileHCI 2016 and 2017, and ACM UIST 2016. His work has been featured at Google I/O 2016 and locally on STV news, and he gave a talk about his research at Google UK in 2016. His work has also been covered in the media, including in Gizmodo, The Verge, Engadget and TechCrunch; see his personal website for more details.

Wrist Worn Haptic Feedback Device

One of our PhD students, Esma Mansouri Benssassi, and her supervisor Dr Erica Ye defined a requirement for a wrist-worn device grouping a number of haptic feedback elements for an experiment they wished to carry out. The on-board haptic elements are two eccentric rotating mass micro motors and a linear resonant actuator. Initial circuit schematics and printed circuit board designs were created in KiCad, an open-source electronic design automation suite. The resulting printed circuit board (PCB) was manufactured on the CS CNC router, which produces the PCB by engraving the copper-clad fibreglass-epoxy board with a vee cutter.

Bare circular engraved PCB

The case for the PCB was created in Autodesk Inventor and was 3D printed using the CS Makerbot 2X 3D printer.

Blank PCB and 3D Printed Case

Haptic Wristband and Haptic Transducers

The wrist-worn haptic feedback device will be connected via an umbilical cable to the main control board: a breadboard assembly holding a Feather M0 embedded ARM microcontroller and the haptic driver. The Feather M0 is an ARM microcontroller with a Wi-Fi module that can be programmed using the Arduino IDE. Code for the ARM processor will enable stored and custom waveforms to be played on the haptic devices on the wrist.
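To give a flavour of how firmware might queue stored waveforms, here is a minimal C++ sketch of the register writes a DRV2605-style haptic driver expects (sequencer slots followed by a GO bit). The register addresses follow the TI DRV2605 datasheet, but the SACHI firmware itself is not published, so the driver choice and all names below are assumptions for illustration only.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// One I2C register write to the haptic driver (hypothetical abstraction).
struct RegWrite {
    uint8_t reg;
    uint8_t value;
};

// Build the write sequence that loads up to 8 stored effect IDs into a
// DRV2605-style waveform sequencer and then sets the GO bit to play them.
std::vector<RegWrite> queueWaveforms(const std::vector<uint8_t>& effects) {
    std::vector<RegWrite> writes;
    const uint8_t kSeqBase = 0x04;  // first waveform sequencer slot
    const std::size_t kMaxSlots = 8;
    std::size_t n = std::min(effects.size(), kMaxSlots);
    for (std::size_t i = 0; i < n; ++i)
        writes.push_back({static_cast<uint8_t>(kSeqBase + i), effects[i]});
    if (n < kMaxSlots)  // a zero entry terminates the sequence early
        writes.push_back({static_cast<uint8_t>(kSeqBase + n), 0});
    writes.push_back({0x0C, 1});  // GO register: start playback
    return writes;
}
```

In a real sketch each `RegWrite` would be sent over I2C (for example via the Arduino `Wire` library) to whichever of the ERM motors or the LRA is selected.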

Haptic Feedback Breadboard Assembly

Success in the Laidlaw Undergraduate Internship Programme in Research and Leadership

Congratulations to Patrick Schrempf and Billy Brown who have been successful in their applications for a Laidlaw Undergraduate Internship in Research and Leadership for 2017. You can read further details about Billy and Patrick below.

Billy Brown:

I’m a fourth year Computer Science student from Belgium with too much interest in the subject. I play and referee korfball for the university, and I am fascinated by Old English and Norse history and mythology. I plan on using the Laidlaw Internship programme to get into the field of Computer Science research.

Project summary:

The Essence Domain Inference project aims to improve automated decision making by optimising the understanding of the statements used to define a problem specification. As part of the compilation of the high level Essence specification language, this project would tighten the domains to which a specified problem applies, with a domain inference algorithm.

The work is very much in the context of the recently announced EPSRC grant on automated constraint modelling, which aims to advance the state of the art in solving complex combinatorial search problems. The modelling pipeline is akin to a compiler in that we refine a specification in the Essence language Billy mentions down to a number of powerful solving formalisms. Billy plans to improve the refinement process and therefore the performance of the solvers, leading to higher-quality solutions more quickly.
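To give a flavour of what tightening a domain means, here is a toy C++ sketch of bounds propagation on integer intervals: given a constraint x + y = total, the values of x inconsistent with every possible y can be pruned. The real Essence refinement rules are far more general, and every name below is illustrative rather than part of the actual pipeline.

```cpp
#include <algorithm>
#include <utility>

// An inclusive integer interval [lo, hi] standing in for a decision
// variable's domain (hypothetical representation for illustration).
using Domain = std::pair<int, int>;

// Tighten x's domain under the constraint x + y == total:
// x must equal total - y for some y in y's domain.
Domain tightenSumOperand(Domain x, Domain y, int total) {
    int lo = std::max(x.first, total - y.second);
    int hi = std::min(x.second, total - y.first);
    return {lo, hi};
}
```

For example, with x in [0, 20], y in [3, 8] and x + y = 10, the inferred domain of x shrinks to [2, 7], so a downstream solver never explores the pruned values.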

Patrick Schrempf:
I am currently a third year Computer Science student from Vienna. After enjoying doing research with the St Andrews Computer Human Interaction (SACHI) group last year, I am looking forward to the Laidlaw Internship Programme. Apart from research and studying, I enjoy training and competing with the Triathlon Club and the Pool Society.

RadarCat presented at UIST2016

SACHI research project RadarCat (Radar Categorization for Input & Interaction), highlighted earlier this year in the University news, the Courier and Gizmodo and in a Google I/O ATAP 2016 session, will be presented at UIST2016 this week.

RadarCat is a small, versatile radar-based system for material and object classification which enables new forms of everyday proximate interaction with digital devices. SACHI’s contribution to Project Soli featured in a previous blog post SACHI contribute to Google’s Project Soli, in May. Read more about RadarCat for object recognition on the SACHI blog.

Google's Project Soli workshop in March 2016


Aaron Quigley appointed as ACM SIGCHI Vice President for Conferences

Congratulations to Professor Aaron Quigley, who has been appointed to the ACM SIGCHI Executive Committee to serve as the Vice President for Conferences. The ACM Special Interest Group on Computer-Human Interaction (SIGCHI) is the premier international society for professionals, academics and students interested in human-technology and human-computer interaction. SIGCHI sponsors or co-sponsors 24 conferences, in addition to providing in-cooperation support for over 40 others. This family of HCI conferences is held across the year and around the world.

As Vice-President for conferences, Aaron will be responsible for strategic planning for SIGCHI-sponsored conferences, overseeing all aspects of SIGCHI-sponsored conferences, chairing various boards and committees and working with other SIGCHI vice-presidents and the SIGCHI executive committee on policies affecting SIGCHI sponsored, co-sponsored, and in-cooperation conferences.


PhD Viva Success: Michael Mauderer

Belated congratulations to Michael Mauderer, who successfully defended his thesis earlier this month. Michael’s thesis, “Augmenting Visual Perception with Gaze-Contingent Displays”, was supervised by Dr Miguel Nacenta. Professor Aaron Quigley acted as internal examiner and Professor Hans Gellersen, from Lancaster University, acted as external examiner.


Deepview Project: Innovative GAZER Software

Congratulations to Dr Miguel Nacenta and Michael Mauderer on the success of Deepview and its resulting application Gazer, an open-source tool for showcasing light field images using gaze-contingent focus. The software, developed by SACHI, works in conjunction with eye-tracking devices, allowing photographers who use light field cameras to explore images that refocus automatically on the objects they look at.
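The core idea of gaze-contingent focus can be sketched simply: look up the scene depth under the user's gaze point, then display the refocused rendering whose focal depth is closest. This C++ sketch is a hypothetical illustration under that assumption, not Gazer's actual API or data layout.

```cpp
#include <cstddef>
#include <vector>

// Per-pixel scene depth, row-major (illustrative representation).
struct DepthMap {
    std::size_t width, height;
    std::vector<float> depth;
};

// Given the gaze position (gx, gy) and the focal depths of a stack of
// pre-rendered refocused images, return the index of the slice whose
// focal depth best matches the depth under the gaze.
std::size_t sliceForGaze(const DepthMap& map,
                         std::size_t gx, std::size_t gy,
                         const std::vector<float>& sliceDepths) {
    float d = map.depth[gy * map.width + gx];  // depth under the gaze point
    std::size_t best = 0;
    float bestErr = -1.0f;
    for (std::size_t i = 0; i < sliceDepths.size(); ++i) {
        float err = sliceDepths[i] > d ? sliceDepths[i] - d : d - sliceDepths[i];
        if (bestErr < 0.0f || err < bestErr) {
            bestErr = err;
            best = i;
        }
    }
    return best;  // index of the refocused slice to display
}
```

An eye tracker would feed (gx, gy) into this selection continuously, so the displayed image always appears focused where the viewer is looking.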


Results from the project have been widely disseminated in the media and featured in an episode of BBC Click (at 20:56).

For more information and downloads, visit the Gazer project section or the GitHub repository.

The project was funded through the European Union’s Marie Curie Program (CIG – 303780).