Maybe BERT is all you need?

Paul Peyramaure

Feb. 04, 2021

Zoom

BERT, which stands for Bidirectional Encoder Representations from Transformers, was published by a Google AI team in 2018 and presented as a new cutting-edge model for Natural Language Processing (NLP). Built on the Transformer architecture, it is designed to learn bidirectional representations by jointly conditioning on both the left and right context in all of its layers. While initially introduced for NLP tasks, it has recently been applied to other domains, such as action recognition.
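
To make the idea of bidirectional context concrete, here is a minimal sketch using the Hugging Face transformers library (an assumption for illustration; the post itself does not mention it). It shows that BERT gives the same word different embeddings depending on the words on both sides of it, which is exactly what "conditioning on left and right context" buys you over a left-to-right language model.

```python
# Minimal sketch (assumes the Hugging Face `transformers` library, not
# mentioned in the original post): BERT produces context-dependent token
# embeddings, so the same word gets different vectors in different sentences.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    """Return the last-layer hidden state of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

# "bank" receives two different embeddings because BERT attends to the
# whole sentence (both directions), not just the preceding words.
river = embedding_of("He sat on the bank of the river.", "bank")
money = embedding_of("She deposited cash at the bank today.", "bank")
similarity = torch.cosine_similarity(river, money, dim=0)
print(f"cosine similarity between the two 'bank' vectors: {similarity.item():.3f}")
```

A cosine similarity well below 1.0 here is the point: a static word-embedding model such as word2vec would return the identical vector for both occurrences of "bank".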


Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.

M. Esat Kalfaoglu, Sinan Kalkan, and A. Aydin Alatan. 2020. Late Temporal Modeling in 3D CNN Architectures with BERT for Action Recognition.