PhD position on adapting language models to new languages and domains
This PhD will be carried out within the framework of the DeepR3 project, which aims to investigate new methods for pre-training and fine-tuning language models under constrained resources, thereby reducing the overall carbon footprint of training and making it more environmentally friendly. In particular, the PhD will investigate ways to adapt large pre-trained language models to new languages and domains using parameter-efficient techniques that reduce the carbon footprint of building these models.
The candidate should preferably have a BSc degree in computer science, telecommunications engineering, mathematics, or physics, and an MSc in language technologies and/or machine learning. We are looking for individuals who are passionate about natural language processing and have a strong background in computer science and related fields. Our ideal candidate has experience in machine learning, deep learning, and statistical analysis, as well as strong proficiency in programming languages such as Python. We welcome applicants from all backgrounds and are committed to creating an inclusive and supportive workplace.
The position is fully funded. The student will have all tuition fees covered and will receive a gross salary of €17,221 (1st year), €17,823 (2nd year), and €19,765 (3rd year), which is sufficient to cover living expenses in the area (including housing in a shared apartment).
The advisors will be Aitor Soroa and German Rigau. If you have any questions, please do not hesitate to contact us at firstname.lastname@example.org. Please include the job ID when contacting us.
To submit your application, please follow this link.