Software
- A toolkit for measuring context-mixing in text and speech Transformers [code]
- A disentanglement framework to separate pre-trained speech representations into textual and task-relevant acoustic features [code]
Datasets
- A French homophone dataset for context-aware ASR evaluation [data]
Tutorials
- “Interpretability Techniques for Speech Models” at the Interspeech 2025 conference in Rotterdam. [materials]
- Hands-on tutorial on “NLP & LLMs” for Dutch school students participating in the International Olympiad in Artificial Intelligence (IOAI 2025).
- A series of short educational videos on opening the black box of Large Language Models:
- Why is it crucial to track how Transformers mix information? [YouTube]
- How to best measure context-mixing in Transformers? [YouTube]
- “Transformer-specific Interpretability” at the EACL 2024 conference in Malta. [materials]