Past Projects

 

CSPE (Centre for the Study of Perceptual Experience)

The Centre for the Study of Perceptual Experience (CSPE) facilitates analytical philosophical and empirical research into the nature of perceptual experience, drawing on philosophy, psychology, neuroscience, psychiatry, human–computer interaction, and artistic practice.

We consist of a core group of faculty, postdocs, and graduate students at the University of Glasgow, together with a wider group of researchers from around the world. 

Find out below about our research, personnel, impact, events, opportunities for visiting and studying at the Centre, as well as current and previous research projects and collaborations.

Find out more: CSPE

Edify

Learn without limits

The Edify teaching platform is an output of the Innovate UK-funded Project Mobius.

Developed with and for academics, it allows ordinary users with no 3D expertise to teach using the power of VR.

Those with access to sufficient VR headsets can use the software at scale, but those without can use a single headset to broadcast the VR experience across video-conferencing tools such as Zoom or Teams.

The University of Glasgow has partnered with Edify to validate the platform and to act as an exemplar of at-scale VR deployment in Higher Education.

Find out more: Edify

Exploring motion sickness mitigations for mixed reality (Collaborative Project)

Glasgow-Sydney Partnership Collaboration Award: Exploring Motion Sickness Mitigations for Mixed Reality 

(McGill, Pollick from University of Glasgow, Verstraten from University of Sydney)

This project works in conjunction with the University of Sydney to explore novel motion sickness mitigations, exploiting visual cues that can discreetly convey vehicle motion in XR.

Glasgow Life Sciences

Building 3D models, animations and VR games using data derived from laser scanning microscopes

Dr Craig Daly - School of Life Sciences

Confocal Laser Scanning Microscopes produce 3D scans of thick biological samples. 

Craig Daly has been collecting this type of data for almost 30 years and has built up a large archive of digital 3D models of the vascular wall. More recently he has gathered a collection of data from around the MVLS College and mounted the best of these data sets on a Sketchfab page.

The collection displays 43 multichannel 3D models from 13 different research groups and comprises virus particles, cells, tumours, fly guts and more. Sketchfab models can all be viewed in VR from a suitable web browser.

The best of these 3D models were then selected for inclusion in a Virtual Gallery, which was used in Level 3 physiology teaching. A video of the whole process, from microscope to VR teaching, can be viewed on Craig’s YouTube channel.
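
As a rough illustration of the kind of microscope-to-VR pipeline described above, the sketch below converts a confocal z-stack into a surface mesh in a format Sketchfab accepts. It is not the project's actual workflow; the file name, voxel spacing and thresholding choice are placeholders.

```python
# Illustrative sketch: convert a confocal z-stack (multi-page TIFF) into a 3D
# surface mesh that could be uploaded to Sketchfab and viewed in VR.
# File name, voxel spacing and threshold are placeholders, not values from the project.
import numpy as np
import tifffile
from skimage import filters, measure
import trimesh

# Load the z-stack as a (z, y, x) intensity volume.
volume = tifffile.imread("vascular_wall_stack.tif").astype(np.float32)

# Segment the structure of interest with a simple global threshold
# (real confocal data usually needs channel selection and denoising first).
threshold = filters.threshold_otsu(volume)

# Extract an isosurface; spacing encodes the physical voxel size in microns
# so the mesh keeps the sample's true proportions.
verts, faces, normals, _ = measure.marching_cubes(
    volume, level=threshold, spacing=(1.0, 0.3, 0.3)
)

# Build a mesh and export to glTF/GLB, a format Sketchfab accepts directly.
mesh = trimesh.Trimesh(vertices=verts, faces=faces, vertex_normals=normals)
mesh.export("vascular_wall.glb")
print(f"Exported mesh with {len(mesh.vertices)} vertices and {len(mesh.faces)} faces")
```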

More recently, Craig has been working with multimedia designer Angela Douglass on a VR Cell Physiology game. It is due out in summer 2021.

Increasing Digital Engagement and Access to The Hunterian’s collections

Funding from Museums Galleries Scotland. PI: Prof. Maria Economou

Start date: 01/10/2021

End date: 28/02/2023

 

The project builds on our experience designing digital interpretation for the Antonine Wall collection and takes this learning forward to develop capacity across Hunterian staff. This will enhance our skills in sustainably developing and embedding digital resources in future practice.

Our popular Antonine Wall display showcases the largest collection from this UNESCO World Heritage site. The EU-funded, award-winning EMOTIVE project (2016–2019) created digital storytelling prototypes for this display to encourage emotional engagement with heritage. Plans to develop these prototypes into resources for diverse audiences were completely stalled by the pandemic.

This grant will help:

a) enlist external developers to convert these prototypes and adapt our plans to meet post-Covid demands for remote and safe onsite engagement, demands which have reinforced the importance of EMOTIVE’s emphasis on building empathy and human connection.

b) employ a Digital Engagement Officer to promote this work across Hunterian teams, encouraging future projects and embedding ways of diversifying our interpretation that appeal to varied learning styles and sensory experiences. The Antonine Wall digital interpretation case study will thus be used to build our staff’s digital literacy, leading to long-term impact. This will help meet our digital engagement strategic objectives and share our experience across the sector.

 

Novel Interactions for Mixed Reality (School of Computing Science)

EPSRC IAA: Novel Interactions for Mixed Reality

(McGill and Freeman from University of Glasgow, Dr Aiden Kehoe from Logitech)

This project is part of Glasgow's collaboration with Logitech on the future of XR-enabled productivity. 

This will look at novel interaction designs for augmented peripherals (peripherals whose position can be tracked in real-time, enabling augmentation of their inputs and outputs using an XR headset) such as the Logitech MR keyboard and Logitech Ink.
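
To illustrate the general idea of an augmented peripheral, the hypothetical sketch below anchors a virtual overlay to a tracked device pose each frame; none of the classes or functions correspond to the project's code or to any Logitech SDK.

```python
# Hypothetical sketch of the "augmented peripheral" idea: a physical device whose
# pose is tracked in real time so an XR headset can anchor virtual input/output to it.
# None of these names come from the project or from Logitech's SDKs.
from dataclasses import dataclass
import numpy as np


@dataclass
class Pose:
    position: np.ndarray      # (x, y, z) in metres, world coordinates
    rotation: np.ndarray      # 3x3 rotation matrix


@dataclass
class TrackedPeripheral:
    name: str
    pose: Pose

    def anchor_point(self, local_offset: np.ndarray) -> np.ndarray:
        """Convert an offset defined relative to the device into world coordinates,
        e.g. the spot just above a keyboard where a virtual text preview could float."""
        return self.pose.position + self.pose.rotation @ local_offset


def render_frame(peripheral: TrackedPeripheral) -> None:
    # A real system would query the tracker each frame; here the pose is assumed given.
    label_pos = peripheral.anchor_point(np.array([0.0, 0.05, -0.02]))
    print(f"Draw overlay for {peripheral.name} at {np.round(label_pos, 3)}")


if __name__ == "__main__":
    keyboard = TrackedPeripheral(
        name="tracked keyboard",
        pose=Pose(position=np.array([0.1, 0.8, -0.4]), rotation=np.eye(3)),
    )
    render_frame(keyboard)
```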

This is a follow-up to a previous IAA award, 6dofpen, for 2D and 3D drawing in collaboration with Logitech.

Our heritage, our stories

Lorna Hughes MAE leads ‘Our heritage, our stories’, a £3.6M UKRI Towards a National Collection award (https://www.ukri.org/blog/five-projects-join-the-towards-a-national-collection-community/) that will build automated tools to assess community generated digital content (CGDC) in the National Archives. This is the only non-London-led project of the five funded, but it involves partners from across the UK under University of Glasgow consortium leadership.

PriXR

University of Glasgow researchers Dr Mark McGill (Computing Science) and Dr Mohamed Khamis (Computing Science) have co-produced the ARC XR project, 'PriXR: Protecting Extended Reality (XR) User and Bystander Privacy by Supporting Legibility of XR Sensing and Processing'. Funding (£80,000) from the National Research Centre on Privacy, Harm Reduction and Adversarial Influence Online (REPHRAIN) will support the project from July 2021 until August 2022.

Objectives of the project

State-of-the-art Extended Reality (XR) headsets now incorporate wide-angle depth/LiDAR-type sensing, enabling the sensing of our environment, bodies and actions, and the presence and actions of others. XR technology has the capacity to revolutionize personal computing, heralding new capabilities in augmented intelligence and perception (AIP), telepresence, productivity, accessibility and entertainment – capable of fundamentally altering, augmenting, or supplanting reality in the process.

Every major technology innovator is actively developing their own XR headsets and platforms, vying for control of this future. However, mass ubiquitous adoption of XR headsets will introduce significant privacy and security risks. Perhaps the foremost risk will be to the anonymity and privacy of both users and bystanders.

XR sensing will give malicious actors super-sensory capabilities, harming the security and privacy of bystanders. Platforms, companies and governments will have access to unprecedented capabilities for distributed real-time surveillance, enabling the ‘worldscraping’ of behaviours, environments and actions.

The impact

PriXR intends to harden XR technology against violations of privacy and anonymity, crucially exploring XR not in terms of its benefits to society, but in how society can safely unlock those benefits through: 

  • Supporting resistance against surveillance and misuse – We will explore novel XR sensing API architectures that facilitate both enhanced data access protections and increased user awareness regarding how, when, and to what purpose personal sensing is being used – bringing transparency and accountability to the use of XR sensing, and making it harder for applications to abuse XR’s capacity for surveillance without users’ knowledge (a hypothetical sketch of such a gated sensing layer follows this list).
  • Facilitating bystander awareness and consent – Where XR sensing is used with the user’s knowledge and permission to surveil, capture, or augment reality, we will also examine how we can facilitate bystander awareness and comprehension of this activity, and actively include mechanisms for bystanders to grant or deny consent to said activity.  
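
As a purely illustrative sketch of the kind of mediation layer described in the first point, and not the PriXR architecture itself, the following hypothetical gateway grants per-app sensor permissions and records every access attempt in a user-visible log.

```python
# Hypothetical sketch of a permission-gated XR sensing API with a transparency log.
# This is not the PriXR design; it only illustrates the kind of mediation layer
# described above, with every name invented for the example.
import time
from dataclasses import dataclass, field


@dataclass
class SensingGateway:
    granted: dict = field(default_factory=dict)    # app -> set of permitted sensors
    audit_log: list = field(default_factory=list)  # user-visible record of access

    def grant(self, app: str, sensor: str) -> None:
        self.granted.setdefault(app, set()).add(sensor)

    def request(self, app: str, sensor: str, purpose: str):
        allowed = sensor in self.granted.get(app, set())
        # Every access attempt is logged so the user can inspect how, when and why
        # sensing was used, the "legibility" goal described above.
        self.audit_log.append(
            {"time": time.time(), "app": app, "sensor": sensor,
             "purpose": purpose, "allowed": allowed}
        )
        if not allowed:
            raise PermissionError(f"{app} may not read {sensor}")
        return f"<{sensor} frame>"  # placeholder for real sensor data


if __name__ == "__main__":
    gateway = SensingGateway()
    gateway.grant("measuring_app", "depth_camera")
    gateway.request("measuring_app", "depth_camera", purpose="room measurement")
    try:
        gateway.request("ad_sdk", "eye_tracker", purpose="attention profiling")
    except PermissionError as err:
        print(err)
    for entry in gateway.audit_log:
        print(entry)
```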

Project Mobius (Centre for the Study of Perceptual Experience)

Innovate UK Funded – (Macpherson, McDonnell)

Project Mobius is a £977,000 collaboration between Sublime Digital and the University of Glasgow, running over three years from 2018 to 2021.

The aims of the project were to:

  • develop specialist VR apps for teaching in Higher Education
  • develop an analytics framework to gain insight into VR learning
  • establish the physical infrastructure required for VR teaching at scale within Higher Education.

Find out more: Project Mobius and CSPE (Centre for the Study of Perceptual Experience)

Scottish Heritage Partnership: Immersive Experiences

Lead Research Organisation: College of Arts & Humanities, University of Glasgow

Core research question

How successful have current approaches to immersive technologies at major heritage sites in Scotland been, both in terms of outcomes against business plan expectations and in terms of visitor response, and what kinds of future development are supported by the evidence?

Research methods

The research methods proposed for this initial pilot phase will lay the groundwork for exploring the effectiveness and potential of immersive technology addressed by the core research question.

Under the guidance of the PI and research team, the pilot project RA will set up a questionnaire to test visitor response to the immersive dimensions of the Culloden, Robert Burns Museum and Bannockburn sites, as well as at the Riverside Museum in Glasgow (which has secured one of the highest, if not the highest, non-traditional museum audiences in the UK) and the National Library of Scotland at Kelvin Hall.

In parallel, they will set up observations and a focus group around the proposed collections and policy developments at Newhailes by the National Trust for Scotland.

These approaches will follow the methodology used by the PI's CDA to evaluate audience response among the 60,000 visitors to the 'When Glasgow Flourished' exhibition in 2014, by the PI's Beyond Text RA to evaluate responses to the material Burns January exhibition in the Mitchell Library, Glasgow, in 2010 and 2011, and by the CI Economou's RA in three different immersive exhibitions in Rome, Athens and Ename as part of the Marie Curie CHIRON project between 2005 and 2008 (Economou & Pujol Tost 2011).

The project team will identify audience focus groups from the existing visitor and client contact base of the partner organizations, and will explore their visitor experience while also exposing them to new developments in Immersive Experience technology. Consideration will be given to the development of future 'Smart' response evaluations, such as Fitbit and smartphone visitor response monitoring.

Immersive experiences are means of 'composing' memory (that is, creating the conditions in which the memories which are publicly expressed are those formulated within a range of socially acceptable contexts). In the motorized era, trails have fulfilled the same function of embedding preferred memory narratives, while immersive experiences, delivered in part or whole through the medium of technology, strive to present a fusion of memory, place and performance to create a close and lasting relationship between visitor memory and the experience purchased by the visit.

Immersive technologies arguably have effects on the composure of memory similar to those of electronic mass media (although research on this is not yet developed, and its development is a key component of the proposed partnership), but these effects are possibly stronger and more lasting.

We will also work with Soluis as our digital partner, to create a decision-making model for policy and audience development.

Research context

The research context is the recent rapid growth of the heritage sector and, within that, the centrality of cutting-edge immersive experiences for tourism, the heritage industry and audience development.

The development of immersive experiences at 'fantasy' venues such as the London, York, Blackpool and Edinburgh 'Dungeons' from Merlin Entertainments is a connected activity. Some of these visitor experiences are relatively recent, and audience feedback is at an early stage: however, there is some evidence that fully or predominantly CGI immersive experiences such as Bannockburn are less appealing and effective to a comprehensive audience demographic than they are to particular groups.

Research outputs

Outputs will include a website, a policy paper, a risk assessment, a visualization decision-making tool, presentations at the AHRC Showcase and connected events (e.g. presentations at DH conferences), and a media/social media strategy.

Find out more

Simmons Lab (School of Psychology)

Simmons Lab is run by Dr David Simmons and his ESRC-funded PhD students.

Every year we have postgraduate masters and undergraduate project students contributing to ongoing projects as well.

The main focus of our research is Autism and the perceptual differences associated with it. We are also working on adapting new Virtual Reality technology for our experiments.

Our Virtual Reality work is carried out in a lab in the School of Humanities (Philosophy), which is managed by Dr Neil McDonnell. We work closely with the Immersive Experiences Lab, the Glasgow Autism Research Group, the Glasgow School of Art, and our industrial partner Edify.

We also help with Project Mobius and the user testing for Edify.

Find out more: Simmons Lab

Current projects


Using Virtual Reality to Understand the Inner Perceptual World of Autism

Researcher: Sarune Savickaite (PhD candidate)

Autism, a common neuro-developmental condition, affects at least 1% of the UK population. Autism is partly characterized by sensory difficulties, such as over- or under-responsiveness to certain types of lighting and everyday noises, and an almost obsessive desire for particular types of sensory stimulation, known as “sensory seeking” behaviour. To date, most research on sensory aspects of autism has used parent/caregiver reports, combined with a smaller amount of self-report data from those able to speak for themselves and further data from lab-based experiments. Despite the fascinating insights these data provide, we have yet to fully appreciate precisely what is going on in the “inner perceptual world” of autism, although it is clear that it is qualitatively different from what typical individuals experience.

In this project we propose the use of Virtual Reality (VR) technology to explore this inner perceptual world. VR technology has become much less bulky and much more affordable in recent years, and the availability of software has burgeoned. In our experiments we aim to explore perceptual worlds by asking people to illustrate their experiences using the powerful and compelling creative tools now available in VR environments, such as Tiltbrush. We will combine quantitative analysis of participants’ responses to questionnaires with qualitative analysis of both their verbal descriptions (if available) and their audio-visual creations to further understand the nature of their inner perceptual worlds. Furthermore, we will use our experience in objective behavioural experimentation to embed game-like tasks into the created environments to explore our participants’ perceptual limits more objectively.

This collaborative project will further our understanding of the inner perceptual world of autism and result in the development of a suite of versatile VR software tools, together with new techniques of creative expression for those with communication difficulties.


Using Virtual Reality Technology to Examine the Relationship between Sensory Sensitivities and Anxiety

Researcher: Elliot Millington (PhD candidate)

Anxiety is one of the most common psychiatric conditions in the western world, with lifetime incidence estimates of up to 25%. Anxiety acts as a barrier to educational, vocational and societal engagement and is estimated to cost the US economy $38 billion annually. One of the most consistent causal factors of anxiety is sensory sensitivities, usually over- and/or under-responsivity to everyday sensory stimulation such as noises, lights and smells. Research into sensory sensitivities and anxiety originated in the autism literature, but evidence, partly from our laboratory, has emerged that sensory sensitivities strongly affect typically developing individuals as well. Whilst there is a robust empirical and theoretical literature looking at the interactions between autism, sensory sensitivities and anxiety, it is largely questionnaire-based and/or correlational.

The aim of this project is to use the methodological advantages of Virtual Reality (VR) technologies to experimentally test the key relationships which form the basis of these theoretical frameworks, and then to apply these findings in the form of environmental accommodations. In previous research, immersive environments have been used both to treat and to induce anxiety, so VR can clearly be used to manipulate the anxiety levels of participants in a safe and controlled manner. The first step in this project will be to construct a virtual environment in which the participant’s anxiety level can be closely controlled, whilst also developing competencies with equipment, such as wearable technology, to objectively measure real-time anxiety. The second stage will be to expand the methodology to measure real-time sensory sensitivity. Finally, this environment will be used to empirically probe the parameter space of sensory sensitivity and anxiety whilst accounting for individual differences in susceptibility.

The industrial partner specializes in content creation for virtual environments and is specifically targeting workplace training and healthcare applications: perfectly suited to this project.
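
As a rough, hypothetical sketch of the kind of closed loop this methodology implies, the code below adjusts a stimulus parameter from a simulated wearable heart-rate reading; the sensor stand-in, target value and step size are all invented for illustration and do not come from the project.

```python
# Hypothetical sketch of a closed loop between a wearable signal and a VR
# environment parameter, illustrating the kind of real-time measurement and
# control described above. Values, names and thresholds are all invented.
import random


def read_heart_rate() -> float:
    """Stand-in for a wearable sensor reading (beats per minute)."""
    return random.gauss(75, 8)


def adjust_stimulus_intensity(current: float, heart_rate: float,
                              target_hr: float = 80.0) -> float:
    """Nudge the intensity of the sensory stimulation up when the participant is
    below the target arousal level and down when above, keeping it in [0, 1]."""
    step = 0.05 if heart_rate < target_hr else -0.05
    return min(1.0, max(0.0, current + step))


if __name__ == "__main__":
    intensity = 0.5
    for second in range(10):           # ten one-second ticks of the loop
        hr = read_heart_rate()
        intensity = adjust_stimulus_intensity(intensity, hr)
        print(f"t={second}s  HR={hr:5.1f} bpm  stimulus intensity={intensity:.2f}")
```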


Mitigating Virtual Reality Sickness During Immersive Cognitive Therapies

University of Glasgow researcher Dr Gang Li (Psychology) partnered with Dr Theodore Zanto (Director of the Neuroscience Division) at the Neuroscape Center, Department of Neurology, University of California, San Francisco (UCSF) to co-produce the project ‘Mitigating Virtual Reality Sickness Using Neurostimulation in Users During Immersive Cognitive Therapies’. The UofG-UCSF partnership was awarded £6,000 under the Royal Society of Edinburgh international joint project scheme to support the work from January 2022 to November 2022.

Objective of the project

The rise of consumer-friendly virtual reality (VR) systems has led research bodies, notably the University of California San Francisco (UCSF) Neuroscape Center, to develop and deliver gamified cognitive and rehabilitation therapies in VR. However, some participants withdraw from the therapy due to VR-induced motion sickness, which results in general discomfort or nausea. Moreover, approximately 60% of the general population experience some level of VR sickness, which may limit therapeutic efficacy even in those who do not drop out. Therefore, a collaboration between UCSF and University of Glasgow researchers proposed to explore a new method to alleviate VR sickness and to assess the effects of VR sickness on cognitive control. The success of this project will build a foundation for the development of “side-effect-free” VR therapies.

The impact

Benefits to Scotland

“The use of virtual reality technology to help people with neurological conditions is a great example of innovation in health and social care,” said former Public Health and Wellbeing Minister Joe FitzPatrick, who tried out a VR therapy in the Scottish Parliament in 2018.

By providing a “side-effect-free” VR experience, we have the potential to improve the quality of life of tens of thousands of Scots who would benefit from this technology. Among these populations are those living with conditions characterized by cognitive decline, such as Alzheimer's disease. As such, this research will directly support Scotland’s National Dementia Strategy for care and support and the Neurological conditions national action plan 2019-2024.

Benefits to overseas countries

The Neuroscape Center at UCSF will immediately benefit from this research through the technique of VR sickness mitigation. Neuroscape is a translational research center with a focus on using modern technology such as VR to develop novel therapeutics for clinical populations.

Unfortunately, the use of VR as a therapeutic is currently limited to those who do not experience sickness. Therefore, this research will enable access for a broader patient population who may benefit from VR therapeutics. As many of the VR interventions may be applied remotely (via telemedicine), these benefits extend to patients worldwide.

Find out more