Masterclass with Elizabeth Churchill, Mohamed Bin Zayed University of Artificial Intelligence

We have the privilege of hosting a session with Professor Elizabeth Churchill, one of the world's leading scientists in human-centred digital technology, with experience from some of the largest technology companies. She is currently Department Chair of Human Computer Interaction at MBZUAI, the Mohamed Bin Zayed University of Artificial Intelligence, in Abu Dhabi, UAE.

Time: Wednesday July 23, 14.00-16.00

Place: Waverley 129 (City Campus) – please ask at reception for access

Details:


Dr Elizabeth F. Churchill is Professor and founding Department Chair of Human Computer Interaction at MBZUAI (the Mohamed Bin Zayed University of Artificial Intelligence) in Abu Dhabi, UAE. She was formerly a Senior Director of User Experience at Google. She is an Association for Computing Machinery (ACM) Fellow, a member of the ACM’s CHI Academy, and an ACM Distinguished Speaker. She has built research teams at Google, eBay, Yahoo, PARC and Fuji Xerox.

Professor Churchill is an applied social scientist, interactive technology designer and social communications researcher. She has a background in psychology (neuro, experimental, cognitive and social), AI, and cognitive science. For the past 20+ years, she has drawn on social, computer, engineering and data sciences to create innovative end-user applications and services. For the past few years, she has been most active in the areas of ubiquitous and mobile computing, social media, computer mediated communication, locative media and internet/web sciences. During this time, Professor Churchill designed and evaluated enterprise and consumer-facing information/communication applications and services for desktop, mobile, tablet and large screen devices.

https://elizabethchurchill.com

https://mbzuai.ac.ae/study/faculty/elizabeth-churchill/

https://www.linkedin.com/in/elizabethfchurchill/

At this Masterclass, we will have a discussion, led by Professor Lars Erik Holmquist, about Dr. Churchill’s experience as a researcher crossing industry and academia, and working at the frontline of many of the world’s top tech companies. The session will be very interactive, and we invite everyone, in particular PhD students and early career researchers, to take part!

Hope to see you there!

Design Studios and Cosy Games at ARPPID 2025

CXL have two presentations at ARPPID 2025, a new Working Conference on Academic Research and Professional Practice in Interaction Design taking place in London on July 10-11.

Lars Erik Holmquist and Sam Nemeth will present the Case Study The Design Studio of the Future: Insights from A Practical Experiment in Remote Collaboration. In this project, which was partly carried out during the Covid pandemic, we aimed to recreate the positive features of a design studio in an educational and workplace setting using distance-spanning digital technology. Inspired by Conjoint Control, we constructed realistic solutions using off-the-shelf hardware and software that afforded novel and efficient ways of blending co-located and distance collaboration. From this we learned that although much of the required hardware already exists, it is not generally being used to its full potential, and that standard meeting software is often not fit for this kind of creative collaboration setting.

Also at ARPPID, Anh Pham and Lars Erik Holmquist will present a poster titled From Cosy Games to Metaverse: Deriving Positive Interaction Qualities through Netnography. To derive design guidelines for a more welcoming Metaverse experience, we examined cosy games, which offer a non-violent and relaxing experience. Through netnography and thematic analysis we identified two main aspects of the positive user experience: positive emotional experiences and anticipation experiences. The findings will serve as a foundation for exploring the future potential of, and creating design frameworks for, more inclusive Metaverse applications.

Conjoint Control paper at HCI International 2025

Sam Nemeth and Lars Erik Holmquist will present the paper Conjoint Control: A Practical Approach to Implementing Physical Interfaces in Real-World Settings at HCI International 2025 in Gothenburg, Sweden (open access version here).

The study concerns how to implement physical interfaces, often called tangible user interfaces (TUI), in real-world settings. Such interfaces are in many ways more difficult to implement and maintain than purely software-based graphical interfaces, and this is one of the reasons that they are almost exclusively present in lab settings, despite having documented advantages compared to screen-based interfaces. To support the development of physical interfaces we created a set of guidelines we call Conjoint Control. The approach stresses the use of communities and off-the-shelf hardware and software, as well as an interplay between the designer and the intended users.

The guidelines were applied in a real-world use case where we created and deployed a physical interface in a real-world office setting. We were given the task by the management of a building to design a system to manage a reception desk from a remote location – a “telepresence receptionist”. By following the process of Conjoint Control, we were able to design, implement and deploy a system that was used by the staff for over a year. 

The pictures above show the components of the system from the visitor and remote receptionist side, and the video below shows it in action at the PROTO Emerging Technologies Centre in Gateshead.



Conversational Prototyping video

In this video we explain the project Conversational Prototyping: Leveraging generative AI to support iterative device production and testing, which is funded by the pro² network+, a UKRI (United Kingdom Research and Innovation) funded network that aims to democratise digital device production.

The visuals are from our amazing collaboration partner Electric Circus in Amsterdam!



Presentations on AI and XR at CHI 2025 in Yokohama

Lars Erik Holmquist and Sam Nemeth will be presenting the Late Breaking Work “Don’t believe anything I tell you, it’s all lies!”: A Synthetic Ethnography on Untruth in Large Language Models at CHI 2025, the ACM SIGCHI conference on Human Factors in Computing Systems. CHI is the premier international conference of Human-Computer Interaction.

The short paper (open access version) applies the new method of synthetic ethnography to present three case studies of how Large Language Models (LLMs) present untruth. We observed the behaviour of different LLMs talking about verifiable facts, and found that when they reached the edge of their knowledge base, they just went on talking (what Hicks et al. called “bullshitting”). We believe that although it is problematic, this capacity to “fill in” is also part of what makes LLMs and other forms of generative AI so useful. However, current interfaces fail to give the user an understanding of the model’s limitations, and therefore we argue that LLMs should be designed so that they can “lie gracefully” – continuing to do what they do best, while making it possible for the user to work with them to avoid dangerous untruths.

Also at CHI, Lars Erik Holmquist will present the paper Liberated Pixels and Virtual Production: A Framework and a Test Case for XR Beyond Glasses at the workshop Beyond Glasses: Future Directions for XR Interactions within the Physical World. The paper argues that virtual production (such as the VP studio at NTU's new DaDA building) presents a perfect environment for developing novel extended reality (XR) applications, and proposes a new framework for XR beyond glasses, called Liberated Pixels. 

CHI 2025 takes place in Yokohama, Japan, at the PACIFICO Yokohama from 26 April to 1 May 2025, while also supporting remote attendance. Hope to see you there!

Seminar with Ollie Hanton: 3D Printing Interactive Devices

Time: April 10, 14.00-16.00
Place: Waverley 129


What shape is a screen, and why do flat, rectangular interfaces limit the way we interact with digital information? Free-form interactive devices such as custom-fitted wearables, bespoke controllers and integrated interactive surfaces hold the potential to revolutionise environments including education, vehicular travel, workplaces and homes. However, manufacturing is limited to high-quantity runs of identical devices, and component-based fabrication is limited in form and quality. By harnessing 1) state-of-the-art active materials that can enable input and output and 2) additive manufacturing methods that enable automated deposition, the next step in manifesting interfaces is decentralised production, where makers can design and produce physical devices on demand.

Ollie Hanton is a lecturer (assistant professor) at the University of Bath. Before joining the University of Bath, he attained his Ph.D. from the University of Bristol, supervised by Professor Anne Roudaut and Professor Mike Fraser. There he focused his research on the personal fabrication of interactive devices through spraying and 3D printing. Ollie publishes work at top HCI conferences (CHI, TEI), where he has received multiple awards for his research.

Seminar with Max Wilson: Brain Data as Cognitive Personal Informatics

Time: March 27, 14.00-16.00
Place: Waverley 129


Classifying the cognitive states of people using physiological data is basically a machine learning problem now, whether it’s brain data, heart/breathing data, or off-body camera data. So what is the HCI question? For us, the question is how this will become mixed up in wearable tech and personal informatics. Especially since you can already buy consumer home neurotech for £200-1000 (!), but also because our watches and apps like Welltory are similarly trying to help manage our cognitive effort. Further, because we still don’t really know what goals people should have for e.g. their mental workload levels throughout the day, nor what is healthy. In this talk I will describe several of our research projects that are building our understanding of a Future Living with Consumer Neurotechnology.

Max L. Wilson is an Associate Professor of Human-Computer Interaction, in the Mixed Reality Lab, and Director of Student Experience in the School of Computer Science. His EPSRC, European, and Google funded research is focused on the use of fNIRS brain data, about mental workload and other cognitive activity, as a form of personal data, that can be used to evaluate technology and work tasks. This work has emerged from his earlier research on the evaluation of user interfaces for interacting with information. Max is on the steering committees of both ACM CHI and ACM CHIIR conferences, as well as a member of the SIGCHI Conferences Working Group, and a Deputy Editor at the International Journal of Human-Computer Studies.