ORGANISATION/COMPANY: Université de Rouen Normandie
RESEARCH FIELD: Computer science › Systems design
RESEARCHER PROFILE: First Stage Researcher (R1)
APPLICATION DEADLINE: 04/04/2018 21:00 - Europe/Athens
LOCATION: France › Saint Etienne du Rouvray
TYPE OF CONTRACT: Temporary
HOURS PER WEEK: up to 35h/week
OFFER STARTING DATE: 09/04/2018
IS THE JOB RELATED TO STAFF POSITION WITHIN A RESEARCH INFRASTRUCTURE? Yes
TITLE: ACCESSPACE: design and evaluation of an indoor and outdoor tactile “GPS” system for the visually impaired.
Keywords: Computer Vision, Sensor Fusion, SLAM, Semantic segmentation, Android, Augmented topographic maps, Sensory Substitution
Human mobility is a complex research topic that remains relatively little investigated. Acquired from birth through unknown and unconscious cognitive mechanisms, mobility is a fundamental issue for Visually Impaired People (VIP), but also for people operating in perceptually complex or ambiguous situations (such as firefighters or surgeons), and for the development of autonomous mobile intelligent systems (such as humanoid robots or autonomous vehicles).
Our work at LITIS (affiliated with Rouen-Normandy University and CNRS FR 3638), carried out through both international (e.g. ACCESSPACE) and national (e.g. TETMOST – CNRS “AUTON” challenge) transdisciplinary research projects, aims to understand and model the cognitive mechanisms underlying mobility, and ultimately to use those models to design more effective assistive mobility devices and algorithms. Thanks to state-of-the-art ICT (Computer Vision, Signal Processing, Computer Science, …) and Cognitive Neuroscience research, these electronic devices aim to assist the mobility (and thus the autonomy) of VIP and improve their overall quality of life.
This research internship aims to develop a wearable assistive device for VIP, based on sensory substitution and spatial cognition principles. The device will act as a “specialized GPS”, using various sensors (visual, GPS and IMU) to provide an augmented map to the user through haptic feedback, via a vibrating belt controlled by an Arduino and a Raspberry Pi.
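As a first intuition of the belt-driving logic, the sketch below maps the bearing towards the destination onto one of several vibration motors spaced evenly around the waist. The 8-motor layout, the function name and the sectoring scheme are illustrative assumptions, not the project's actual design.

```python
# Hypothetical sketch: map a destination bearing (degrees, 0 = straight
# ahead, increasing clockwise) onto one of n_motors vibration motors
# spaced evenly around a waist belt. The 8-motor layout is an assumption.

def bearing_to_motor(bearing_deg, n_motors=8):
    """Return the index of the motor closest to the given bearing."""
    sector = 360.0 / n_motors
    # Shift by half a sector so each motor covers a centred arc.
    return int(((bearing_deg % 360.0) + sector / 2) // sector) % n_motors

# Straight ahead (0°) fires motor 0; due right (90°) fires motor 2 of 8.
front = bearing_to_motor(0)
right = bearing_to_motor(90)
```

In a real device, the chosen motor index would be sent over serial to the Arduino driving the belt; the mapping above only illustrates the direction-to-vibration encoding principle.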
The device will allow VIP to perceive their environment by recoding their surroundings into vibration patterns, letting them walk autonomously towards a set destination, avoiding obstacles along the way while maintaining their orientation in space. The device's relevance for guiding the visually impaired will first be tested in a simulated environment, and later through real-world experiments on a “perception-mobility” platform.
Your goal during this internship (4 to 6 months) will be to implement and test the computer vision part of the device: visual odometry (through multi-sensor data fusion), obstacle detection and tracking, POI localisation and matching, SLAM in unknown environments, etc.
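To give a flavour of the sensor-fusion component, here is a minimal one-dimensional Kalman filter that fuses IMU dead reckoning (predict step) with noisy GPS fixes (update step). This is an illustrative sketch under simplifying assumptions: the real device fuses visual, GPS and IMU data in 2-D/3-D, and all names and noise values below are hypothetical.

```python
# Minimal 1-D Kalman filter: IMU-derived velocity drives the prediction,
# an occasional GPS position fix corrects it. All parameters are
# illustrative, not the project's actual tuning.

def kalman_step(x, p, v, dt, q, z=None, r=None):
    """One predict(+update) cycle.
    x, p : position estimate and its variance
    v, dt: IMU-derived velocity and time step (prediction)
    q    : process noise variance added per step
    z, r : optional GPS position fix and its variance (correction)
    """
    # Predict: advance position with IMU velocity, inflate uncertainty.
    x = x + v * dt
    p = p + q
    if z is not None:
        # Update: blend in the GPS fix, weighted by relative confidence.
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
    return x, p

# Walk east at 1 m/s for 5 s; a GPS fix (variance 4) arrives each second.
x, p = 0.0, 1.0
for t in range(5):
    x, p = kalman_step(x, p, v=1.0, dt=1.0, q=0.1, z=float(t + 1), r=4.0)
```

Because the simulated IMU and GPS agree exactly here, the estimate converges on the true position while the variance shrinks; with real, noisy sensors the same recursion trades off the two sources according to their variances.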
- Keen interest in research and fundamental work, and openness to other research fields;
- Good communication skills.
The Master 2 degree may be completed during 2018 (by June).
Please send your résumé, motivation letter (subject: Stage M2 ACCESSPACE), your current and previous year's grades, and the contact details of your supervisor(s) to:
- (Prof.) Edwige PISSALOUX : email@example.com
- (Post-doc) Simon GAY : firstname.lastname@example.org
- (PhD) Marc-Aurèle RIVIERE : email@example.com
- Application selection by an international committee;
- Skype interview for short-listed candidates.
- Remuneration according to the French standard internship pay: 3.75 €/h, i.e. 577.50 €/month.
- Research team and internship location: STI team of the LITIS lab (http://litislab.eu/), affiliated with Rouen-Normandy University (Rouen, Saint-Étienne-du-Rouvray) and CNRS FR 3638;
- The internship's start date will be set with the supervisors (as early as April 2nd).
- Velazquez, R., Fontaine, E., & Pissaloux, E. (2006). Coding the Environment in Tactile Maps for Real-Time Guidance of the Visually Impaired. IEEE Int. Symp. on Micromechatronics and Human Science, Nov. 6-8, 2006, Nagoya, Japan, 1-6.
- Lenay, C., Gapenne, O., Hanneton, S., Marque, C., & Genouëlle, C. (2003). Sensory Substitution: Limits and Perspectives. Touching for Knowing, 275-292.
- Pissaloux, E., Velazquez, R., & Maingreaud, F. (2017). A New Framework for Cognitive Mobility of Visually Impaired Users and Associated Tactile Device. IEEE Trans. Human-Machine Systems, 47(6), 1-12.
- Pissaloux, E., & Velazquez, R. (Eds.) (2017). Mobility in Visually Impaired People – Fundamentals and ICT Assistive Technology. Springer.
- Li, B., Munoz, J. P., Rong, X., Xiao, J., Tian, Y., & Arditi, A. (2016). ISANA: Wearable Context-Aware Indoor Assistive Navigation with Obstacle Avoidance for the Blind. European Conference on Computer Vision, 448-462. Springer.
REQUIRED EDUCATION LEVEL: Engineering: Master Degree or equivalent
REQUIRED LANGUAGES: ENGLISH: Good; FRENCH: Basic
- Computer vision (OpenCV, SLAM, sensor fusion) and OO programming (Java /
EURAXESS offer ID: 285234