For the past year we have had the pleasure of working with our visiting scholar Dr. Xunbing Shen. Dr. Shen has now returned to his position at Jiangxi University of Traditional Chinese Medicine in Nanchang, China. During his time in our lab, Dr. Shen designed a number of experiments examining the relationship between microexpressions and movement parameters, and those experiments are still ongoing.
With the acquisition of the HTC Vive virtual reality system, the SPOCC lab has begun to plan new ways in which human behavior can be studied. Building research paradigms that involve virtual reality requires skills that psychologists are not often taught. Computer programming and troubleshooting have become the most relevant tasks in preparing our research projects. The current learning goals for students who wish to do VR research in the SPOCC lab are:
To program simple stimuli to be presented through the VR Head-Mounted-Display.
To sync presented stimuli with data recorded and calculated from a user’s own motion.
As the research questions become more complex here in the SPOCC lab, the skill sets of students who do that research become more multi-faceted. We are truly training resourceful researchers.
Anthony Drew, M.S.
One of the great things about studying perception and action is that it allows you to engage in interesting research and address issues from many different fields.
Recently we were lucky enough to work on a collaborative project with Dr. Harvey Thurmer (Dept. of Music) and Dr. Bill Berg (Kinesiology and Health), along with former graduate students Rachelle Wolfe, M.M. and Henry Cook, Ph.D.
The project involved a very timely question – how can we reduce repetitive strain injuries in violinists and violists (and really everyone)?
The study was unique in that it used assessment techniques from all three disciplines: performance training and evaluation using the Alexander Technique (body mindfulness), muscle activity evaluation using electromyography (EMG), and kinematic (movement) evaluation using motion capture (magnetic motion tracking).
It also gave our students a chance to learn how collaboration can work and how different types of researchers approach questions.
The results of the initial study were published in June in Medical Problems of Performing Artists.
Here is a Miami newsletter article about the project.
The best part of this project, however, was being able to both learn and teach with colleagues and students!
This year we had a small but dedicated group of students in our lab course (PSY 375).
The students were interested in discovering how different types of constraints influence performance on a task. For some in the group, this was their first crack at the research process!
To figure out how to do this we again did some brainstorming and organizing of ideas…
and designed a study to examine this question using a perceptual adaptation procedure that included using motion capture and video games…
Finally, we presented our results at this year's Undergraduate Research Forum!
As always, we are proud of the work that our students do!
An upcoming project will examine the viability of blood pressure cuffs as an experimental tool for altering sensory information. If effective, this will allow us to experimentally induce symptoms of different disorders (such as Anorexia Nervosa) in healthy participants, which will help us better understand the mechanisms by which these disorders work.
What is the best way to approach a complex task?
How might this approach be modified based on changing constraints?
These are questions that Alex Feltz, B.A., Sarah Laane, B.A., and Valencia Brown, B.A. sought to answer using a virtual task last semester.
They noticed that employees are often asked to follow a certain set of work instructions (procedures) that specify how they are to conduct their job. In many cases they are also given time constraints to ensure that the necessary amount of a given product is produced. Working under these constraints can be stressful, particularly when the constraints are greater than what can be done safely and accurately. Their research was conducted to determine how time constraints could affect the performance of complex tasks.
They tested this by looking at how participants performed in a virtual building task when given a specific set of instructions and varying time constraints.
They found that placing unrealistic time constraints on the task resulted in participants deviating from the instructions by either skipping (combining) or omitting (forgetting) steps. However, performance improved over trials, and analysis of their physical movement suggests that participants were getting better even if they couldn't complete the task in the manner specified.
So time constraints can alter the way in which a task is completed, but may not always be detrimental.
-AF, SL, VB
Dr. Xunbing Shen will be working with us this year. He is currently affiliated with Jiangxi University of Traditional Chinese Medicine in Nanchang, China. His recent research focuses on the recognition of facial micro-expressions of emotions. Dr. Shen mentions that in his master’s thesis he explored the effects of perception conflict on pointing and grasping, and that he is currently doing research on patients with traumatic brain injury, including rehabilitation of their gait. This overlaps with our research interests in perception and action, and we look forward to a mutually beneficial collaboration.
New: Thanks to your support of our Hawksnest fundraiser and the Department of Psychology, we were able to purchase an HTC Vive system and upgrade our data collection machine!
Virtual reality technology is becoming a large market force. Facebook recently purchased the Oculus Rift VR project for $2 billion, and the HTC Vive sold 100,000 units in the first three months following its release. With the reemergence of VR technologies has also come an awareness of certain compatibility issues between the technology and the humans who use it. A substantial number of VR users experience headache, nausea, or a host of other symptoms collectively dubbed simulator sickness or visually induced motion sickness (VIMS). Techniques for combating VIMS have been explored, including restricting visual information and electrophysiological interventions that simulate vestibular motion, but these have drawbacks that may reduce the quality of the VR experience for users. The SPoCC lab has explored this issue by attempting to predict the onset of sickness symptoms using data collected in real time from a human user.
In the past semester the SPoCC lab conducted an experiment to test an algorithm designed to predict the incidence of motion sickness while using a virtual reality headset. We brought participants into the Visualization Lab in Armstrong Interactive Media Studies and asked them to view stimuli through an Oculus Rift VR headset. During the experiment we measured participants' postural motion using the position-tracking capabilities built into the Oculus Development Kit 2. By measuring this movement data in real time, we hoped to predict whether a participant was going to become motion sick before the participant reported symptoms.
The experiment used a simulated star field as a stimulus. Participants completed one 10-minute baseline trial during which the star field remained stationary. After the baseline, participants were exposed to one of two conditions. In the first condition the star field swayed at a constant rate of 0.2 Hz and was not in any way coupled to the participant's own motion. In the second condition the star field swayed at a rate that was directly matched to the participant's own motion but reversed in direction. If a participant swayed forward, the star field swayed backward, thereby reversing the natural coupling between visual stimuli and postural sway.
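The two conditions can be sketched as a single per-frame update rule. This is a minimal illustration of the logic described above, not the lab's actual stimulus code; the amplitude, units, and function name are assumptions made for the example.

```python
import math

def star_field_offset(t, condition, participant_sway=0.0, amplitude=0.05):
    """Fore-aft offset applied to the star field on a given frame.

    t                -- elapsed time in seconds
    condition        -- "constant" or "coupled"
    participant_sway -- participant's current fore-aft head position
    amplitude        -- sway amplitude (illustrative value, not the real one)
    """
    if condition == "constant":
        # Sinusoidal sway at 0.2 Hz, independent of the participant.
        return amplitude * math.sin(2 * math.pi * 0.2 * t)
    if condition == "coupled":
        # Mirror the participant's own sway, reversed in sign, which
        # inverts the natural coupling between vision and posture.
        return -participant_sway
    raise ValueError(f"unknown condition: {condition}")
```

Calling this once per rendered frame with the tracker's latest head position is enough to produce either condition.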
Our prediction algorithm correctly predicted sickness in 90% of the original participants (whose data the algorithm was based on) and 57% of new participants. Predicting sickness in individuals at a rate above chance is a major step toward developing a dynamic intervention for users of VR technology. We are working to improve this accuracy by weighting outcome measures and norming parameters to larger samples.
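The algorithm itself is not detailed here, so the following is only a sketch of the general idea of baseline-normed, real-time sway monitoring: flag a participant when the variability of recent head motion greatly exceeds what they showed during the stationary baseline. The window length, threshold multiplier, and statistic are all illustrative assumptions, not the lab's actual parameters.

```python
from collections import deque
from statistics import pstdev

class SwayMonitor:
    """Flag possible sickness onset from a stream of head-position samples.

    Compares the standard deviation of a sliding window of recent samples
    against the standard deviation recorded during the quiet baseline trial.
    All parameter values are illustrative assumptions.
    """

    def __init__(self, baseline_samples, window=10, multiplier=3.0):
        self.baseline_sd = pstdev(baseline_samples)  # baseline sway variability
        self.window = deque(maxlen=window)           # most recent samples only
        self.multiplier = multiplier

    def update(self, sample):
        """Add one sample; return True if onset is predicted."""
        self.window.append(sample)
        if len(self.window) < self.window.maxlen:
            return False  # not enough data yet
        return pstdev(self.window) > self.multiplier * self.baseline_sd
```

Feeding each new tracker sample to `update()` gives a per-frame yes/no prediction, which is what a dynamic intervention would hook into.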
-Anthony Drew, M.S.
This semester we “rebooted” a laboratory class in Perception, Action, and Cognition (PSY 375). It is meant to serve as a way for students to get “experiential learning” in our major without having to join an actual lab. The course will rotate among faculty so students can learn a number of techniques and methodologies.
Since I facilitated the course this semester, we focused on the behavioral techniques (motion capture) that we use in our lab and, because many in the class were new to the research process, on how to generate ideas. We started by jotting down ideas on a giant "post-it" sheet on the wall, each class adding a little more detail and a little more structure, and we watched as two very solid research questions emerged (very different questions from what I imagined, bridging perception/action and social perception research).
Once we had questions, we began to work on methodology and procedure, still using the "post-it" method. Our wall was now becoming external memory, helping us remember not only where we were but where we started.
We then started working on creating our materials and piloting, followed by a bit of data collection (not as much as we would like but that is also part of the research process)…
And finally presenting our initial results…
While there were challenges, this was a great experience for all, and to me it cemented the idea that active collaboration and design can yield novel and exciting ideas and knowledge! Looking forward to doing this again next spring!