PsyIntEC

Master's project ideas

In the list below you will find a number of suggestions for master's thesis topics.

NOTE: this page is updated regularly, so please check back later if you are still looking for a project.

Some projects do not appear when viewed using Internet Explorer!

Are you curious about what former master students have written their theses about?
Then follow the link to: Completed Masters Projects




Cognitive and Neural Engineering research group

Psychophysiological data generation and analysis of game/simulation interaction


Supervisor: Dr. Charlotte Sennersten

Humans have five senses: hearing, sight, smell, touch, and taste. In game/simulation environments mostly hearing, sight and touch (via devices) are used and designed for. Games are usually intended for entertainment purposes of different kinds, and so-called serious games are often used in training settings where real physical tasks can be practised in safe, controlled and cost-effective environments.

While we are interacting with a 3D game we are responding to task-generated design, where the challenges make us react not only physically (keystrokes or movements) but also physiologically (eye movements, pulse, sweating, brain activation). These physiological reactions/responses reflect psychological processes, which can already be used as a tool during design.

The logging and steering in a game/simulation engine generate a lot of data that can be analysed and can show statistical significance when several players or users are logged. To obtain statistical significance in your thesis you need at least 20 participants playing/interacting while being logged. The psychological part is captured via questionnaires, interviews, or both.
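As an illustration of the analysis step, the sketch below compares a logged performance measure between two play conditions with a paired t-test. The CSV file name and the participant_id, condition and completion_time_s columns are purely hypothetical; the actual logged variables depend on the game/simulation engine used.

```python
# Minimal sketch: comparing a logged performance metric between two
# game conditions for ~20 participants (hypothetical CSV layout).
import pandas as pd
from scipy import stats

# Assumed columns: participant_id, condition ("A"/"B"), completion_time_s
log = pd.read_csv("session_log.csv")

# Mean completion time per participant in each condition
# (assumes every participant played both conditions)
a = log[log.condition == "A"].groupby("participant_id").completion_time_s.mean()
b = log[log.condition == "B"].groupby("participant_id").completion_time_s.mean()

# Paired t-test across participants
t, p = stats.ttest_rel(a, b)
print(f"t = {t:.2f}, p = {p:.4f}")
```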

The scientific framework for this project is cognitive and neuroscience, psychology and game design theories.

Note that development time has to be kept very limited in order to complete a study like this within the time available for a thesis project.

3D Scene Modelling and Object-of-Gaze Resolution in Human-Robot Interaction


Supervisor: Prof. Craig Lindley

An eye gaze tracking system detects the direction in which a user is looking. Gaze tracking is used for many purposes, including studying visual attention (e.g. on web pages and printed material) and providing hands-free control to allow those who have limited use of their hands to use word processors, email and the web.

This project involves developing a system for reconstruction of 3D scene models from actual physical objects, based upon the use of environmental structure and augmented reality tagging. Eye gaze tracking data will then be processed to determine what objects within a scene a human is looking at. Required hardware and software systems will be provided.
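As a rough illustration of the gaze-resolution step, the sketch below casts a measured gaze ray against bounding spheres of objects in a reconstructed scene model. The object names and geometry are invented for the example; a real system would use the reconstructed scene and augmented reality tag poses instead.

```python
# Minimal sketch: resolving the object-of-gaze by casting the measured
# gaze ray against bounding spheres of reconstructed scene objects.
# Object names and sphere parameters here are purely illustrative.
import numpy as np

scene = {
    "cup":  {"centre": np.array([0.3, 0.1, 1.2]), "radius": 0.06},
    "book": {"centre": np.array([-0.2, 0.0, 1.0]), "radius": 0.12},
}

def object_of_gaze(origin, direction, scene):
    """Return the nearest object whose bounding sphere the gaze ray hits."""
    direction = direction / np.linalg.norm(direction)
    best, best_t = None, np.inf
    for name, obj in scene.items():
        oc = obj["centre"] - origin
        t = np.dot(oc, direction)            # closest approach along the ray
        if t < 0:
            continue                         # object is behind the viewer
        d2 = np.dot(oc, oc) - t * t          # squared ray-to-centre distance
        if d2 <= obj["radius"] ** 2 and t < best_t:
            best, best_t = name, t
    return best

print(object_of_gaze(np.zeros(3), np.array([0.28, 0.09, 1.0]), scene))  # -> "cup"
```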

Core skills developed in the project: understanding of eye gaze tracking, augmented reality, human-robot interaction. Prior experience with image processing and/or machine vision is strongly recommended.

Control Algorithms for a 6-Legged Walking Robot


Supervisor: Prof. Craig Lindley

CH3-R hexapod

The CH3-R is a 6-legged walking robot, with motion provided by 3 servo motors in each leg. Control electronics include an ABB-01 mini Atom Bot Board, a BA-02 BASIC Atom 28, and an SSC-32 servo controller. A legged robot has a strong capacity for navigating over rough ground.

This project involves developing control algorithms for the robot for a variety of competencies, such as moving to a required target point, while avoiding obstacles.
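To give a feel for the low-level interface, the sketch below shows how grouped position commands could be sent to the SSC-32 servo controller over a serial link for one half-cycle of a tripod gait. The serial port, baud rate, channel numbers and pulse widths are placeholders, since the real mapping depends on how the robot's servos are wired and calibrated.

```python
# Minimal sketch: grouped SSC-32 position commands for half a tripod-gait
# cycle. Port, baud rate, channels and pulse widths are placeholders.
import serial
import time

ssc32 = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)  # assumed port/baud

def move(servos, duration_ms):
    """servos: {channel: pulse_width_us}; move all channels together."""
    cmd = "".join(f"#{ch}P{pw}" for ch, pw in servos.items())
    ssc32.write(f"{cmd}T{duration_ms}\r".encode())

# Very rough half-cycle: lift one tripod of legs, swing it forward,
# put it down, while the other tripod stays planted.
LIFT, DOWN, FWD = 1700, 1500, 1600
tripod_a_lift  = {0: LIFT, 6: LIFT, 12: LIFT}   # assumed lift servos, tripod A
tripod_a_swing = {1: FWD, 7: FWD, 13: FWD}      # assumed swing servos, tripod A

move(tripod_a_lift, 300); time.sleep(0.3)
move(tripod_a_swing, 300); time.sleep(0.3)
move({ch: DOWN for ch in tripod_a_lift}, 300); time.sleep(0.3)
```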

Core skills developed in the project: understanding of real-time embedded systems, ability to develop robot control algorithms.

Eye Gaze and/or Brain Controlled Human-Robot Interaction


Supervisor: Prof. Craig Lindley

An electroencephalograph (EEG) measures, in real time, the electrical activity generated by the brain as recorded at the scalp. Interpretation of EEG data provides information about mental states, states of consciousness, attention, etc. EEG data can be used directly to control software systems. Users of an EEG system can also learn to generate control actions for operating software and for controlling hardware systems, such as mobile and articulated robots (e.g. for wheelchair control, exploration and object manipulation), for those with physical disabilities.
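As a minimal illustration of how EEG data might be turned into a control signal, the sketch below thresholds alpha-band (8-12 Hz) power computed from one raw channel. The sampling rate, window length and threshold are placeholder values, and the acquisition API of the actual EEG hardware is not shown; the function simply operates on an array of samples.

```python
# Minimal sketch: turning one raw EEG channel into a binary control signal
# by thresholding alpha-band (8-12 Hz) power. Sampling rate, window length
# and threshold are placeholder values.
import numpy as np
from scipy.signal import welch

FS = 256          # assumed sampling rate (Hz)
THRESHOLD = 5.0   # assumed power threshold, tuned per user in practice

def alpha_command(window):
    """window: 1-D array of recent EEG samples; returns True to trigger."""
    freqs, psd = welch(window, fs=FS, nperseg=min(len(window), FS))
    band = (freqs >= 8) & (freqs <= 12)
    alpha_power = np.trapz(psd[band], freqs[band])
    return alpha_power > THRESHOLD

# Example on synthetic data: 2 s of noise plus a strong 10 Hz oscillation
t = np.arange(2 * FS) / FS
signal = np.random.randn(len(t)) + 5 * np.sin(2 * np.pi * 10 * t)
print(alpha_command(signal))  # expected: True
```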

An eye gaze tracking system detects the direction in which a user is looking. Gaze tracking is used for many purposes, including studying visual attention (e.g. on web pages and printed material) and providing hands-free control to allow those who have limited use of their hands to use word processors, email and the web. In previous work we have developed a control system for the Spinosaurus mobile robot based upon eye tracking.

This project involves using EEG and/or eye tracking as interaction technology to develop a control system for:

- a wheeled mobile robot

- an articulated robot arm

- a 6-legged walking robot

- mobile robots with an articulated arm and grasper attached to the robot

User interface and application code will also need to be developed or modified from the current system. Required robots and eye tracking systems will be provided. A rough sketch of how gaze coordinates might be mapped to drive commands is given below.
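The following sketch assumes a normalised on-screen gaze point and a wheeled (differential-drive) robot; the gaze source and the way drive commands are transmitted are placeholders for whatever eye tracker SDK and robot interface the project actually uses.

```python
# Minimal sketch: mapping a normalised on-screen gaze position to
# differential-drive wheel speeds. Dead zone and speed limits are
# placeholder values.

DEAD_ZONE = 0.15   # ignore small deviations around the screen centre
MAX_SPEED = 0.5    # assumed maximum wheel speed (arbitrary units)

def gaze_to_drive(gx, gy):
    """gx, gy in [0, 1]: gaze point on screen; returns (left, right) speeds."""
    # Re-centre so (0, 0) is the middle of the screen
    x = (gx - 0.5) * 2
    y = (0.5 - gy) * 2            # looking up = drive forward
    if abs(x) < DEAD_ZONE and abs(y) < DEAD_ZONE:
        return 0.0, 0.0           # resting gaze near the centre: stop
    forward = y * MAX_SPEED
    turn = x * MAX_SPEED
    # Left wheel, right wheel: gaze right of centre -> left wheel faster -> turn right
    return forward + turn, forward - turn

# Example: gaze slightly up and to the right -> advance while turning right
print(gaze_to_drive(0.7, 0.3))
```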

Core skills developed in the project: understanding of eye tracking and/or EEG, data acquisition, data reduction, robot control and user interface development, software integration.

References:

Andreassi, J. L., Psychophysiology: Human Behavior and Physiological Response, Lawrence Erlbaum Associates (2006).

Metrics for Human-Robot Interaction


Supervisor: Prof. Craig Lindley

This project involves modelling human-robot collaboration and developing an associated metrics system for measuring and evaluating collaborative action. Modes of collaboration will be defined for specific application and human-robot interaction scenarios. Metrics will be task-oriented (e.g. subtask timing and accuracy, repeatability), will also include additional human factors (physiological and psychological), and will be extended to include newly defined factors relating uniquely to joint human-robot (inter)action.
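As a simple illustration of what such task-oriented metrics could look like, the sketch below computes mean subtask time, repeatability (standard deviation of times) and success rate from a trial log. The trial structure and field names are illustrative only.

```python
# Minimal sketch: simple task-oriented metrics (subtask timing, accuracy,
# repeatability) from a log of human-robot collaboration trials.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class SubtaskTrial:
    subtask: str
    duration_s: float
    succeeded: bool

def metrics(trials):
    durations = [t.duration_s for t in trials]
    return {
        "mean_time_s": mean(durations),
        "repeatability_sd_s": stdev(durations) if len(durations) > 1 else 0.0,
        "accuracy": sum(t.succeeded for t in trials) / len(trials),
    }

# Hypothetical log of one repeated hand-over subtask
log = [
    SubtaskTrial("hand-over part", 12.4, True),
    SubtaskTrial("hand-over part", 10.9, True),
    SubtaskTrial("hand-over part", 15.2, False),
]
print(metrics(log))
```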

Application scenarios include i) a mobile ground robot (a moving platform with a real-time video camera and manipulator), or ii) a work cell for the assembly of prototype devices, including two articulated robot arms. The metrics framework should be demonstrated using one of these application scenarios. Other forms of robots could also be considered, such as robots that fly, move underwater, or have a humanoid or biomimetic form; however, in these cases demonstration must still be based upon robots of type i) or ii) above.

Core skills developed in the project: understanding of human-robot interaction, task analysis, metrics development and system performance evaluation.

 

Computer Graphics research group

Visualisation of the Polhem Dry Dock


Supervisor: Dr. Veronica Sundstedt

The aim of this project is to use digital 3D technology to bring to life and make available a high-tech industrial project from the 1700s at the naval dockyard in Karlskrona. The work will focus on highlighting how the significant technical problems during construction of the Polhem dry dock were solved. Special focus will be on the technology used to keep the dock dry as well as the systems used to empty water from the dock. A 3D model of the dry dock will be created using Autodesk Maya. Experiments will be run to evaluate certain aspects of the visualisation.

Visualisation and Evaluation of the BTH Campus Building


Supervisor: Dr. Veronica Sundstedt

Parts of Blekinge Institute of Technology (BTH) were recently moved to new premises in Karlskrona. In this process new buildings were merged with older parts of the campus. As a result of this change, finding a specific room in the current buildings is sometimes difficult. This project will investigate various visualisation techniques for helping a person find their way around the campus. Experiments will be designed to explore which visualisation technique is most efficient. The proposed solution should also be compared with the existing system for navigation at BTH (maps, signs etc.). Possible new implementation techniques could explore 2D/3D visualisations.

EYELEARN Gaze Controlled Digital Storytelling


Supervisor: Dr. Veronica Sundstedt

Eye tracking is a process that allows us to determine where an observer is focusing at a given time. The gaze direction indicates where people focus their attention. Recent innovations in the video game industry include alternative input modalities for games to provide an enhanced, more immersive user experience. Gaze control has recently been explored as an input modality in games. In previous work we have developed games which can be controlled by gaze and voice recognition alone. Alternative means of interaction in games are especially important for disabled users for whom traditional techniques, such as mouse and keyboard, may not be feasible. This project involves creating a novel interactive entertainment application in the form of a digital storybook. The storybook will be enhanced by the use of animation, sound, and visual feedback to produce an enjoyable experience for children. A user evaluation could also be part of the project. The storybook needs to be integrated with an eye tracking system. An eye tracking system and the SDK to obtain information about eye movements will be provided.
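As a small illustration of one possible interaction mechanism, the sketch below implements dwell-time selection for turning a page when the reader's gaze rests on a "next page" hotspot. The get_gaze_point() call, the hotspot rectangle and the dwell threshold are placeholders for whatever the provided eye tracking SDK and storybook application actually offer.

```python
# Minimal sketch: dwell-time "page turn" for a gaze-controlled storybook.
# get_gaze_point() stands in for the eye tracking SDK; the hotspot
# rectangle and dwell threshold are placeholder values.
import time

DWELL_S = 1.0                       # how long gaze must rest on the hotspot
HOTSPOT = (0.85, 0.4, 1.0, 0.6)     # next-page region (x0, y0, x1, y1), normalised

def inside(point, rect):
    x, y = point
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1

def watch_for_page_turn(get_gaze_point, turn_page):
    """Poll gaze; call turn_page() after a sustained dwell on the hotspot."""
    dwell_start = None
    while True:
        if inside(get_gaze_point(), HOTSPOT):
            dwell_start = dwell_start or time.time()
            if time.time() - dwell_start >= DWELL_S:
                turn_page()
                dwell_start = None
        else:
            dwell_start = None
        time.sleep(0.02)            # poll at roughly 50 Hz
```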

Core skills developed in the project: understanding of eye tracking and eye movements, entertainment technology development, software integration.

References:

Isokoski P., Joos M., Spakov O., Martin B., Gaze Controlled Games, Universal Access in the Information Society, Volume 8, Number 4, 323-337 (2009).

Sundstedt V. Gazing at Games: Using Eye Tracking to Control Virtual Characters, ACM SIGGRAPH Course (2010).

 

Design and Experience research group
