Courses and seminars

 

2026 edition

Deep Learning for NLP (code: DL4NLP)
Instructor: Eneko Agirre
This course introduces in detail the machinery that makes Deep Learning work for NLP, including the latest transformers and large language models such as GPT, BERT and T5. Attendees will be able to understand, modify and apply current and future Deep Learning models. They will learn the inner workings of the models and implement them in Keras.
Student profile: professionals, researchers and students with basic programming and Python experience. Basic math skills (algebra or pre-calculus) are also needed. Although not strictly necessary, we recommend subscribing to Colab Pro to get more out of the GPUs.
Large Language Models (code: LLM)
Instructor: Oier Lopez de Lacalle
The course will introduce large language models, with special emphasis on adaptation techniques (e.g. in-context learning, few-shot learning, instruction learning) and ways to align them with human preferences. In addition, advanced training techniques such as parallelism, selective architectures and scaling laws will be presented.
Participants, in addition to understanding the fundamentals of LLMs and learning advanced training techniques, will gain hands-on experience in applying and working with these models, while addressing biases and ethical concerns.
Student profile: professionals, researchers and students with basic programming and Python experience. Basic math skills (algebra or pre-calculus) are also needed. Although not strictly necessary, we recommend subscribing to Colab Pro to get more out of the GPUs.
Generative Playground: LLMs made easy (code: GPLLMME)
Instructor: Ander Barrena
The aim of this course is to understand and deploy large language models (LLMs) from a practical perspective, enabling students to gain hands-on experience with these models without coding. Particular emphasis is placed on ethical considerations, including addressing bias in language, responsibly handling sensitive information, and evaluating the deployed models.
Participants will learn how to use proprietary models like GPT-4o and open-source models like Llama 3 for prompt engineering, creating agents, chatbots, Retrieval Augmented Generation (RAG) systems, and other NLP applications.
Student profile: graduate students and professionals from various disciplines (linguistics, journalism, computer science, sociology, etc.) who need to understand and deploy LLMs easily. No coding skills are necessary for the practical content. Although not strictly necessary, the OpenAI ChatGPT Plus subscription plan is advisable to complete some of the labs.
Deep Learning for Speech Processing (code: DL4SP)
Instructor:
This course introduces the main Deep Learning techniques used in state-of-the-art Speech Processing. Participants will learn the fundamental approaches behind key tasks such as automatic speech recognition, speaker recognition, language identification and speaker diarization.
The course will present the main neural network architectures used for speech, including convolutional, recurrent and transformer-based models, as well as common speech representations and training strategies. Through practical examples, attendees will learn how current systems are built and how to apply existing models and toolkits to real-world speech processing tasks.

Student profile: professionals, researchers and students with programming and Python experience. Math and signal processing knowledge (at the level of a BSc in Sciences or Engineering) is also recommended. Although not strictly necessary, we recommend subscribing to Colab Pro for better GPU availability.

More info here

Past courses

 

2025

Deep Learning for NLP (code: DL4NLP)
September 15th to 19th, 20 hours, 5 afternoons. 14th edition.

Large Language Models (code: LLM)
September 29th to October 3rd, 20 hours, 5 afternoons. 2nd edition.

Introduction to LT Applications (code: ILTAPP)
October 13th to 17th, 20 hours, 5 afternoons. 8th edition.

Generative Playground: LLMs made easy (code: GPLLMME)
October 27th to 31st, 20 hours, 5 afternoons. 2nd edition.
 

More info about the 2025 edition here.

2024

Generative AI, Deep Learning and Language Technology

Specialization courses by HiTZ Chair of Artificial Intelligence and Language Technology
Next edition: September-October, 2024, fully online

This series of specialization courses offers a complete immersion in the fields of deep learning, large language models (LLMs) and their impactful applications. These courses cover a spectrum ranging from fundamental principles to the most advanced methodologies. We offer a comprehensive learning pathway with hands-on practical experience, as each course includes practical exercises and real-life case studies. The series is aimed at professionals, researchers and students who wish to understand and apply the latest techniques in Artificial Intelligence.

+ More info

2023

  • 4Gune (2023/02/16)

The aim of the course is to provide an overview of Artificial Intelligence (AI) and Big Data at the present time, with special attention to methods, areas and applications. Participants will acquire a global vision of the application areas of Artificial Intelligence, especially in their own areas of interest; identify the types of problems that Artificial Intelligence and Big Data address; and learn about some generic methods to solve them. After taking this course, attendees are expected to be able to assess possible applications in their business environment.

  • Short programme AI (2023/03/21)

This short talk is an introduction to AI and NLP. We will briefly describe the history of AI and NLP, including the latest and remarkable advances that deep learning has brought to the field. We will then briefly describe neural language models, discussing how they have revolutionized NLP tasks such as machine translation, text generation and question answering. Additionally, we will touch upon new learning paradigms such as zero-shot and in-context learning, which require very few training examples, if any, to adapt language models to new tasks.

  • Deep Learning for Natural Language Processing (DL4NLP)

January 2023, UPV/EHU San Sebastian. Winter version, 3 weeks, 35 hours, labs in Tensorflow.
  • Introduction to Language Technology Applications

February 2023, UPV/EHU Donostia-San Sebastian. Winter version, 5 weeks, 22.5 hours, labs with Flair, Spacy and Transformers

2022

  • Eustat (2022/11/24-25)

More info here [es]

  • Deep Learning for Natural Language Processing (DL4NLP)

July 2022, UPV/EHU San Sebastian. Summer version, 5 days, 20 hours, labs in Keras.

January 2022, UPV/EHU San Sebastian. Winter version, 3 weeks, 35 hours, labs in Tensorflow.

  • Introduction to Language Technology Applications

July 2022, UPV/EHU Donostia-San Sebastian. Summer version, 5 days, 20 hours, labs with Scikit-learn, Flair, Spacy and Transformers

Winter 2022, UPV/EHU Donostia-San Sebastian. Winter version, 9 days, 22.5 hours, labs with Scikit-learn, Flair and Spacy

2021

  • Deep Learning for Natural Language Processing (DL4NLP)

July 2021, UPV/EHU San Sebastian. Summer version, 5 days, 20 hours, labs in Keras.

January 2021, UPV/EHU San Sebastian. Winter version, 3 weeks, 35 hours, labs in Tensorflow.

  • Introduction to Language Technology Applications

July 2021, UPV/EHU Donostia-San Sebastian. Summer version, 5 days, 20 hours, labs with Scikit-learn, Flair and Spacy

2020

  • Deep Learning for Natural Language Processing (DL4NLP)

July 2020, UPV/EHU San Sebastian. Summer version, 3 days, 20 hours, labs in Keras.

January 2020, UPV/EHU San Sebastian. Winter version, 3 weeks, 35 hours, labs in Tensorflow.

2019

  • Deep Learning for Natural Language Processing (DL4NLP)

July 2019, UPV/EHU San Sebastian. Summer version, 3 days, 20 hours, labs in Keras.

January 2019, UPV/EHU San Sebastian. Winter version, 3 weeks, 30 hours, labs in Tensorflow.

2018

  • Deep Learning for Natural Language Processing (DL4NLP)

July 2018, UC3M Madrid. Summer version, 3 days, 20 hours, labs in Keras.

January 2018, UPV/EHU San Sebastian. Winter version, 3 weeks, 30 hours, labs in Tensorflow.

 
