
BERT

Core Technologies
Letter: B

Bidirectional Encoder Representations from Transformers - a pre-trained language model.

Detailed Definition

BERT (Bidirectional Encoder Representations from Transformers) is a language model developed by Google that marked a turning point in natural language processing. Unlike earlier models that read text in a single direction, BERT's encoder attends over the entire sequence at once, so the representation of each word draws on context from both its left and its right. BERT is pre-trained on large text corpora using objectives such as masked language modeling, in which randomly selected words are hidden and the model learns to predict them from the surrounding context. This pre-training produces rich representations that can be fine-tuned with relatively little labeled data for specific tasks such as sentiment analysis, question answering, and text classification. BERT's success demonstrated the power of transfer learning in NLP and paved the way for many subsequent transformer-based models.
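The masked-language-modeling objective described above can be tried directly with a pre-trained BERT checkpoint. The following is a minimal sketch, assuming the Hugging Face transformers library and PyTorch are installed; the bert-base-uncased checkpoint and the example sentence are illustrative choices, not part of the original text.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

# Load a pre-trained BERT checkpoint and its tokenizer (illustrative choice).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# A sentence with one word hidden, mirroring the masked-language-modeling setup.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence_length, vocab_size)

# Find the position of the [MASK] token and take the highest-scoring prediction.
mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))  # typically predicts "paris"
```

Fine-tuning for a downstream task follows the same transfer-learning pattern: a task-specific head (for example, BertForSequenceClassification for sentiment analysis) is placed on top of the same pre-trained encoder and trained on labeled examples.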