To fully understand how BERT works, it is important to think about its acronym.

BERT is the acronym for Bidirectional Encoder Representations from Transformers, and we will pay special attention to Bidirectional and Transformers:

BERT is fully bidirectional, meaning it analyzes text both left-to-right and right-to-left. This gives it a deeper sense of context than unidirectional language models.

Let's take the following sentence as an example: “I'd rather go to a cool place. I don't like high temperatures.”

In the first sentence, “cool place” can refer to both a cold place and a trendy place.

The second sentence, "I don't like high temperatures," provides the necessary context for the algorithm to easily know that the sentence is talking about temperature and not trends or fashion.

BERT's bidirectional nature allows it to read the sentence in both directions and relate the entire content, that is, to obtain the context necessary to understand its meaning.
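To make this concrete, here is a minimal sketch of the idea using the Hugging Face transformers library and the bert-base-uncased checkpoint (both are assumptions for illustration; the post does not name a specific library or model). We mask the ambiguous word and let BERT predict it from the context on both sides:

from transformers import pipeline

# "bert-base-uncased" is the original pretrained English BERT checkpoint.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the words on BOTH sides of [MASK] before predicting it.
# The second sentence ("high temperatures") sits to the right of the mask,
# so only a bidirectional model can use it when choosing the word.
predictions = fill_mask(
    "I'd rather go to a [MASK] place. I don't like high temperatures."
)

# Print the top three candidate words and their scores.
for result in predictions[:3]:
    print(f"{result['token_str']!r}  score={result['score']:.3f}")

A left-to-right model would have to commit to a word for the mask before ever seeing the sentence about temperatures; BERT conditions on both sides at once.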


Transformers: this is a neural network architecture that was initially used to improve machine translation.

In the case of BERT, Transformers focus their attention on small words that are decisive for the context, such as connectors or pronouns.

In this way, for example, the model can resolve references to pronouns and even to direct and indirect objects.

For example: "My sister's car is broken. I should pick her up."

In this case, BERT understands that "her" refers to "my sister" because it focuses on the pronoun and looks for the relationship between the two sentences.
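As a rough illustration (again assuming bert-base-uncased via the Hugging Face transformers library, and keeping in mind that attention weights are only a loose proxy for this kind of linking), we can inspect which tokens the model attends to from "her":

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("My sister's car is broken. I should pick her up.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
her_idx = tokens.index("her")

# Average the attention paid FROM "her" to every other token across all
# heads of the last layer. A noticeable weight on "sister" is the kind of
# cross-sentence link described above. Note that special tokens such as
# [CLS] and [SEP] often soak up attention, so this is illustrative only.
attn = outputs.attentions[-1][0].mean(dim=0)[her_idx]
for tok, weight in sorted(zip(tokens, attn.tolist()), key=lambda x: -x[1])[:5]:
    print(f"{tok:10s} {weight:.3f}")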

Together, these two elements, bidirectionality and Transformers, give BERT the ability to understand the context of a sentence very accurately, handle longer sentences, and even relate two sentences to each other.

What does this change in the algorithm mean?
This update to Google's algorithm makes search results much more precise, especially for long-tail keywords.

In addition, its impact on interpreting search intent, understanding ambiguous queries, and handling voice search is notable.

Greater accuracy in results: BERT better understands the user's search intent
A clear example of this type of improvement in search intent is the one detailed by Pandu Nayak in the article "Understanding searches better than ever".

Image: Google BERT example

In the case of the search "2019 brazil traveler to usa need a visa," we see a significant change before and after BERT.

Before BERT, Google displayed a result from the Washington Post aimed at American travelers looking to travel to Brazil.

The algorithm completely ignored the "to" connector, which is precisely what indicates the direction of the trip: a traveler from Brazil going to the USA, not the other way around.