Detjon holds a B.Sc. in Electronic Engineering and an M.Sc. in Mechatronic Engineering from the Polytechnic University of Turin. He then worked as an assistant professor at his university. After his M.Sc., he took part in an Erasmus programme in Portugal, where he worked on wearable devices for medical diagnosis at the Institute of Systems and Robotics.
Multimodal Haptic Surfaces
Detjon's project is part of multiTOUCH and relates to WP2.
His project has two aims:
- to provide the project partners with hardware and software for psychophysical experiments that synchronize three feedback modalities: tactile, visual, and auditory
- to introduce relevant sensors and provide a wide range of haptic sensations, either through new devices or by optimizing existing ones.
Giulia obtained a First Class BSc in Biological Sciences (Genetics and Molecular Biology) from Kingston University London in 2018. She then completed an MSc in Neuroscience at the University of Sussex (Brighton, UK), achieving a Distinction. After her MSc, she worked as a full-time research assistant at the Lab of Action and Body, Royal Holloway, University of London.
Psychophysical and electrophysiological (EEG) study of the cortical processes underlying tactile-auditory-visual integration in conditions of active touch
The aim of Giulia's project is to investigate how stimuli from different sensory modalities (vision, audition) are integrated with tactile feedback under conditions of active, dynamic touch. To investigate the neural underpinnings of multisensory integration during haptic exploration, she will combine psychophysical and electroencephalography (EEG) experiments, including oddball paradigms and techniques such as mismatch negativity and EEG frequency tagging.
ESR 2: Milad Jamalzadeh
He is working on the project "Multimodal Haptic Framework in Virtual Reality" in Lille, in the group of Prof. Laurent Grisoni.
Milad completed a Master's in Biomedical Sciences and Engineering at Koç University, Istanbul, where he studied human perception of tactile stimuli created by electrovibration. He also simulated biomedical problems such as a soft-finger rehabilitation robot driven by fiber-reinforced actuators and an aortic aneurysm.
Haptic Multimodal interaction in Virtual Reality
Human senses are key to perceiving information from the surrounding world. In everyday life, we perceive a real-world artefact not through one but through several sensory inputs; such multisensoriality allows our brain to consolidate perception, for example through correlation or enrichment between sensory channels. However, this multisensory scheme of human perception is still poorly exploited by technology for providing better access to information. Through this PhD, Milad and his team propose to explore to what extent haptic technologies can be combined with visual and auditory feedback technologies for virtual reality applications.
ESR 5 : María Casado Palacios
She is working on the project "Cross-Sensory Perception and Deprivation" in Genova, in the group of Dr. Monica Gori.
She graduated in Psychology from the University of Seville and holds a Master's degree in Neuroscience from the Complutense University of Madrid. From 2018 to 2020 she participated in the project "The social origin of human language: the gaze and the pupil size of the speaker as modulators of the semantic and syntactic understanding of language", supervised by Dr. Manuel Martin-Loeches at the UCM-ISCIII Center for Human Evolution and Behavior. Specifically, her research examined the modulating role of these variables in the syntactic comprehension of language, using the N400 and P600 EEG components as the main measures, together with eye tracking.
Cross-Sensory perception and deprivation
Her research project will focus on experiments aimed at producing new models of sensory substitution and reinforcement in individuals with sensory disabilities (hearing- and vision-impaired individuals) as well as in individuals with brain damage, using EEG, psychophysics, and Bayesian probabilistic models.
Born and raised in New Delhi, Iqra enrolled in the BS-MS dual degree program as an INSPIRE Fellow at IISER Kolkata. She continued her training with internships in molecular biophysics at the University of Delhi and the Indian Institute of Science, Bangalore, in electrophysiology at Forschungszentrum CAESAR, and in cognitive science at Yale University. She graduated with a dual Bachelor's and Master's degree in Biological Sciences.
Psychophysical and neuroimaging study of visual-tactile motion integration
The broad objective of Iqra's PhD project is to study how the brain integrates visual and tactile motion input into a coherent multisensory experience while a person touches a tactile display. The aim is to determine whether directional motion on the tactile display is coded in somatotopic coordinates or in an external (visual) frame of reference.
ESR 6 : Mihail Terenti
He is working on the project "Designing and Engineering Multimodal Feedback to Augment the User Experience of Touch Input" in Suceava, in the group of Prof. Radu-Daniel Vatavu.
He holds a Master's degree (2016) in Information Technology in Education and a Bachelor's degree (2014) in Physics & Computer Science from Tiraspol State University, Chisinau, Republic of Moldova. Since graduating, he has worked as a web developer, extending his technical knowledge of application development and software technology, especially for the web environment.
Designing and Engineering Multimodal Feedback to Augment the User Experience of Touch Input
Mihail's main goal in this project is to take part in technical and scientific activities on the design, engineering, and evaluation of multimodal feedback for touch user interfaces (UIs), with a focus on haptic feedback and on understanding the user experience (UX). As part of this position, and of the corresponding Ph.D. studies planned for the project, he will develop application prototypes to understand user performance with multimodal UIs. Examples of applications include assistive technology for users with visual impairments, eyes-free UIs, and interaction in Virtual Reality.