In this blog we will cover research articles in detail. This is the first post in that series.

Chinese researchers have a strong tradition of writing detailed surveys that list the important works and key breakthrough ideas within a specific subfield of machine learning, such as Natural Language Processing.

Accordingly, a recently published review article surveys the different types of Transformers with a focus on Natural Language Processing (NLP). Anyone interested in getting into the NLP world and into Transformers should read it. The article discusses the basic principles of self-attention and details modern Transformer variants, covering architectural modifications, pre-training methods, and various applications.
Research Paper Link : https://arxiv.org/abs/2106.04554
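To make the self-attention principle mentioned above concrete, here is a minimal sketch of scaled dot-product self-attention in NumPy. The matrix names (`Wq`, `Wk`, `Wv`) and dimensions are illustrative choices, not taken from the survey itself:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project the input sequence into queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V

# Toy example: a sequence of 4 tokens with model dimension 8
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per token
```

Each output row is a weighted mixture of the value vectors of all tokens, with the weights determined by query-key similarity; this is the core mechanism that the surveyed Transformer variants modify for efficiency or expressiveness.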

