Challenges and Opportunities in NLP Benchmarking (natural language processing): Over the last few years, models in NLP have become much more powerful, driven by advances in transfer learning. A consequence of this drastic increase in performance is that existing benchmarks have been left behind. Recent models "have outpaced the benchmarks to test for them" (AI Index Report 2021), quickly reaching…

Recent Advances in Language Model Fine-tuning (language models): This article provides an overview of recent methods to fine-tune large pre-trained language models.

ML and NLP Research Highlights of 2020 (transfer learning): This post summarizes progress in 10 exciting and impactful directions in ML and NLP in 2020.

10 ML & NLP Research Highlights of 2019 (natural language processing): This post gathers ten ML and NLP research directions that I found exciting and impactful in 2019.

Unsupervised Cross-lingual Representation Learning (cross-lingual): This post expands on the ACL 2019 tutorial on Unsupervised Cross-lingual Representation Learning. It highlights key insights and takeaways and provides updates based on recent work, particularly on unsupervised deep multilingual models.

The State of Transfer Learning in NLP (transfer learning): This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. It highlights key insights and takeaways and provides updates based on recent work.

NAACL 2019 Highlights (events): This post discusses highlights of NAACL 2019. It covers transfer learning, common sense reasoning, natural language generation, bias, non-English languages, and diversity and inclusion.

Neural Transfer Learning for Natural Language Processing (PhD thesis) (transfer learning): This post discusses my PhD thesis, Neural Transfer Learning for Natural Language Processing, and some new material presented in it.

AAAI 2019 Highlights: Dialogue, reproducibility, and more (events): This post discusses highlights of AAAI 2019. It covers dialogue, reproducibility, question answering, the Oxford-style debate, invited talks, and a diverse set of research papers.

10 Exciting Ideas of 2018 in NLP (transfer learning): This post gathers 10 ideas that I found exciting and impactful this year, and that we'll likely see more of in the future. For each idea, it highlights 1-2 papers that execute them well.

EMNLP 2018 Highlights: Inductive bias, cross-lingual learning, and more (events): This post discusses highlights of EMNLP 2018. It focuses on talks and papers dealing with inductive bias, cross-lingual learning, word embeddings, latent variable models, language models, and datasets.

A Review of the Neural History of Natural Language Processing (language models): This post expands on the Frontiers of Natural Language Processing session organized at the Deep Learning Indaba 2018. It discusses major recent advances in NLP, focusing on neural network-based methods.

ACL 2018 Highlights: Understanding Representations and Evaluation in More Challenging Settings (natural language processing): This post discusses highlights of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018). It focuses on understanding representations and evaluating models in more challenging scenarios.

NLP's ImageNet moment has arrived (natural language processing): Big changes are underway in the world of NLP. The long reign of word vectors as NLP's core representation technique has seen an exciting new line of challengers emerge. These approaches demonstrated that pretrained language models can achieve state-of-the-art results and herald a watershed moment.

Highlights of NAACL-HLT 2018: Generalization, Test-of-time, and Dialogue Systems (natural language processing): This post discusses highlights of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2018). It focuses on generalization, the Test-of-Time awards, and dialogue systems.

An overview of proxy-label approaches for semi-supervised learning (semi-supervised learning): While unsupervised learning is still elusive, researchers have made a lot of progress in semi-supervised learning. This post focuses on a particularly promising category of semi-supervised learning methods that assign proxy labels to unlabelled data, which are then used as targets for learning.

Requests for Research (transfer learning): It can be hard to find compelling topics to work on and to know what questions to ask when you are just starting out as a researcher. This post aims to provide inspiration and research ideas to junior researchers and those trying to get into research.

Learning to select data for transfer learning (domain adaptation): Domain adaptation methods typically seek to identify features that are shared between the domains or to learn representations that are general enough to be useful for both domains. This post discusses a complementary approach to domain adaptation that selects data that is useful for training the model.

An Overview of Multi-Task Learning in Deep Neural Networks (multi-task learning): Multi-task learning is becoming more and more popular. This post gives a general overview of the current state of multi-task learning. In particular, it provides context for current neural network-based methods by discussing the extensive multi-task learning literature.

Transfer Learning - Machine Learning's Next Frontier (transfer learning): Deep learning models excel at learning from a large number of labeled examples, but typically do not generalize to conditions not seen during training. This post gives an overview of transfer learning, motivates why it warrants our attention, and discusses practical applications and methods.