XLNet

XLNet: Generalized Autoregressive Pretraining for Language Understanding by Yang et al. was published in June 2019. The paper claims that XLNet overcomes shortcomings of BERT and achieves state-of-the-art (SOTA) results on many NLP tasks.

In this article I explain XLNet and walk through the code of a binary classification example on the IMDB dataset. I compare the two models, as I did the same classification with BERT (see here). For the complete code, see my github (here).
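
As a taste of what the post covers, here is a minimal sketch of fine-tuning XLNet for binary sentiment classification with the Hugging Face transformers library; the model name, hyperparameters, and the tiny in-memory batch are stand-ins for the real IMDB pipeline in the linked notebook.

# Minimal sketch: XLNet binary classification with Hugging Face transformers.
# Assumes `pip install transformers torch`; the toy sentences stand in for IMDB reviews.
import torch
from transformers import XLNetTokenizer, XLNetForSequenceClassification

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
model = XLNetForSequenceClassification.from_pretrained("xlnet-base-cased", num_labels=2)

texts = ["A gripping, beautifully shot film.", "Two hours of my life I will never get back."]
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

# Tokenize and pad the batch.
inputs = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")

# One fine-tuning step: forward pass, loss, backward pass, optimizer update.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()

# Inference: predicted class per review.
model.eval()
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1))  # tensor of 0/1 predictions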

Continue reading “XLNet”

Let’s have a committed relationship … with git

Git is the most widely used version control system in the world. I have to use it at work, and as it is not difficult to use (as long as you don't have conflicts), I didn't think much about it until recently. However, once I started to read more about it, I quickly realised that its elegant way of dealing with versions and integrity control is worth more than an article! We have probably all taken this amazing tool for granted from time to time, underestimating all the trouble it saves us from. So let's appreciate the power and elegance of git, and let's finally get into a committed relationship with it!

Continue reading “Let’s have a committed relationship … with git”

BERT: Bidirectional Transformers for Language Understanding

One of the major advances in deep learning in 2018 was the development of effective NLP transfer learning methods, such as ULMFiT, ELMo and BERT. Bidirectional Encoder Representations from Transformers, aka BERT, has shown strong empirical performance, so it will certainly continue to be a core method in NLP for years to come.

Continue reading “BERT: Bidirectional Transformers for Language Understanding”
