Modular Deep Learning: An overview of modular deep learning across four dimensions (computation function, routing function, aggregation function, and training setting).
[cross-lingual] The State of Multilingual AI: This post takes a closer look at the state of multilingual AI. How multilingual are current models in NLP, computer vision, and speech? What are the main recent contributions in this area? What challenges remain, and how can we address them?
[events] ACL 2022 Highlights: This post discusses my highlights of ACL 2022, including language diversity and multimodality, prompting, the next big ideas and keynotes, my favorite papers, and the hybrid conference experience.
ML and NLP Research Highlights of 2021: This post summarizes progress across multiple impactful areas in ML and NLP in 2021.
Multi-domain Multilingual Question Answering: This post expands on the EMNLP 2021 tutorial on Multi-domain Multilingual Question Answering and highlights key insights and takeaways.
[natural language processing] Challenges and Opportunities in NLP Benchmarking: Over the last few years, models in NLP have become much more powerful, driven by advances in transfer learning. A consequence of this drastic increase in performance is that existing benchmarks have been left behind. Recent models "have outpaced the benchmarks to test for them" (AI Index Report 2021).
ACL 2021 Highlights: This post discusses my highlights of ACL 2021, including challenges in benchmarking, machine translation, model understanding, and multilingual NLP.
[language models] Recent Advances in Language Model Fine-tuning: This article provides an overview of recent methods to fine-tune large pre-trained language models.
[transfer learning] ML and NLP Research Highlights of 2020: This post summarizes progress in 10 exciting and impactful directions in ML and NLP in 2020.
[cross-lingual] Why You Should Do NLP Beyond English: 7,000+ languages are spoken around the world, but NLP research has mostly focused on English. This post outlines why you should work on languages other than English.
[advice] 10 Tips for Research and a PhD: This post outlines 10 things that I did during my PhD and found particularly helpful in the long run.
[natural language processing] 10 ML & NLP Research Highlights of 2019: This post gathers ten ML and NLP research directions that I found exciting and impactful in 2019.
[cross-lingual] Unsupervised Cross-lingual Representation Learning: This post expands on the ACL 2019 tutorial on Unsupervised Cross-lingual Representation Learning. It highlights key insights and takeaways and provides updates based on recent work, particularly unsupervised deep multilingual models.
[transfer learning] The State of Transfer Learning in NLP: This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. It highlights key insights and takeaways and provides updates based on recent work.
[events] EurNLP: The first European NLP Summit (EurNLP) will take place in London on October 11, 2019. It is an opportunity to foster discussion and collaboration between researchers in and around Europe.
[events] NAACL 2019 Highlights: This post discusses highlights of NAACL 2019. It covers transfer learning, common sense reasoning, natural language generation, bias, non-English languages, and diversity and inclusion.
[transfer learning] Neural Transfer Learning for Natural Language Processing (PhD thesis): This post discusses my PhD thesis, Neural Transfer Learning for Natural Language Processing, and some new material presented in it.
[events] AAAI 2019 Highlights: Dialogue, reproducibility, and more: This post discusses highlights of AAAI 2019. It covers dialogue, reproducibility, question answering, the Oxford-style debate, invited talks, and a diverse set of research papers.
[natural language processing] The 4 Biggest Open Problems in NLP: This is the second post based on the Frontiers of NLP session at the Deep Learning Indaba 2018. It discusses 4 major open problems in NLP.
[transfer learning] 10 Exciting Ideas of 2018 in NLP: This post gathers 10 ideas that I found exciting and impactful this year and that we'll likely see more of in the future. For each idea, it highlights 1-2 papers that execute it well.
[events] EMNLP 2018 Highlights: Inductive bias, cross-lingual learning, and more: This post discusses highlights of EMNLP 2018. It focuses on talks and papers dealing with inductive bias, cross-lingual learning, word embeddings, latent variable models, language models, and datasets.
[natural language processing] HackerNoon Interview: This post is an interview of me by fast.ai fellow Sanyam Bhutani. It covers my background, advice on getting started with NLP, writing technical articles, and more.
[language models] A Review of the Neural History of Natural Language Processing: This post expands on the Frontiers of Natural Language Processing session organized at the Deep Learning Indaba 2018. It discusses major recent advances in NLP, focusing on neural network-based methods.
[natural language processing] ACL 2018 Highlights: Understanding Representations and Evaluation in More Challenging Settings: This post discusses highlights of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018). It focuses on understanding representations and evaluating in more challenging scenarios.