European Research Council

Post-doctoral research position in computational and cognitive neuroscience

    Université de Paris
    Recognised Researcher (R2)
    Application deadline: 30/09/2021 23:00 (Europe/Athens)
    France › Paris
    H2020 / ERC


In the context of an ERC Starting Grant, the Dugué Lab (www.duguelab.com) is looking for a post-doctoral researcher with expertise in cognitive and computational neuroscience. Using neuroimaging data from iEEG, MEG, EEG and fMRI, the successful candidate will study oscillatory travelling waves and their impact on perception and attention, working under the supervision of Dr. Laura Dugué. Candidates with experience in iEEG, MEG, EEG and/or computational modelling are preferred; previous experience with fMRI will be appreciated.

The postdoctoral candidate will be part of a growing research group focusing on the neural mechanisms underlying attentional rhythms using a multimodal approach including behavioral measures, neuroimaging (EEG, MEG, TMS, fMRI) and computational modelling. As part of the Vision group of the Integrative Neuroscience and Cognition Center (INCC) of Université de Paris and CNRS, they will have access to numerous training opportunities.

This is a two-year posting based on a one-year, renewable contract.

The Integrative Neuroscience and Cognition Center

The Dugué Lab is housed in the Vision Group of the Integrative Neuroscience and Cognition Center (INCC) (www.incc-paris.fr). Located in the centre of Paris at the Centre Universitaire des Saints Pères in Saint-Germain-des-Prés, the INCC is a research department affiliated with the CNRS and Université de Paris. The INCC breaks barriers between disciplines and addresses the complex functional and neurophysiological aspects of behaviour and brain function, using multimodal approaches including behaviour, neuroimaging and computational modelling. INCC researchers come from disciplines as diverse as cognitive science, computational neuroscience, movement science, medical science, engineering, physics, neurophysiology and biology.

Research conducted in the Vision Group in particular is aimed at better understanding the mechanisms underlying perception, attention, consciousness and the links between perception and action. Our interests include the properties of visual attention and of spatial maps, visual perception during and across eye and head movements and visual motion perception. We also perform research on hearing and touch, especially their interactions with vision. We deploy multiple techniques including behavioral methods such as psychophysics and eye tracking, computational modelling and brain imaging techniques such as fMRI, EEG, MEG and TMS. We use decoding methods to better understand brain processes and for designing the online control of robotic upper limbs.

Université de Paris

With its exact and experimental sciences, broad and well-established human and social sciences, and a strong tradition of work at the interface of disciplines, Université de Paris (www.u-paris.fr) is fully multidisciplinary, in terms of both training and research. It is a unique university community based on strong values: freedom of thought in study, teaching and research; service to society and the general interest; openness to the world, the city and the immediate environment; respect for and promotion of the well-being of everyone; and scientific integrity. The excellence of its 142 laboratories, associated with French research organizations, makes it a major player in international research. Université de Paris has more than 30 international labs in the fields of health, science and technology, humanities and social sciences, and languages.

In March 2018, an international jury appointed by the French government awarded Université de Paris the "Initiative d'excellence" label. This label, which aims to create 5 to 10 world-class universities in France, positions Université de Paris as a world-class research university and one of the leading actors in the evolving landscape of higher education and research in France. Université de Paris will account for no less than one in ten articles published in France, and 5% of the country's PhD students will come from the university.

More Information


The gross monthly salary is typically between 2235€ and 2766€ depending on research experience. After standard deductions, the corresponding net salary is between 1800€ and 2257€. The benefit package includes unemployment insurance, work-site insurance, health insurance, a pension plan, and 50 vacation days per year (in addition to official holidays).

Selection process

Applications should be sent directly to Dr. Dugué (laura.dugue@u-paris.fr) and must include a cover letter, a CV, and two letters of recommendation. Review of applications will begin at the end of the application period and continue until the position is filled. Candidates will be shortlisted on the basis of their CV and reference letters; interviews will be conducted via videoconference.

Offer Requirements

    ENGLISH: Excellent


The candidate should be highly motivated, have excellent communication and organizational skills, and be able to work independently and as part of a team.

Candidates must have previous experience in the field of computational and/or cognitive neuroscience. Strong preference will be given to candidates with demonstrated expertise in computational modelling and/or neuroimaging methods, including iEEG, MEG and/or EEG. Previous experience with fMRI will be appreciated.

Candidates must be fluent in programming with MATLAB and/or Python.

Map Information

Work location(s)
1 position(s) available at
Université de Paris
rue des Saints Pères

EURAXESS offer ID: 670918

