Google announced on Friday (25) a change to its search engine algorithm. From now on, it will better understand the context of the terms entered in searches in order to present more relevant results.
To do this, Google adopted BERT (Bidirectional Encoder Representations from Transformers), a neural network technique that analyzes what comes before and after each keyword to better understand the intent behind the search.
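The core idea can be sketched in a toy example (this is an illustration of bidirectional context in general, not Google's implementation): a traditional left-to-right model only sees the words before a given position, while a bidirectional model like BERT conditions on the words on both sides at once.

```python
# Toy sketch: contrast the context visible to a unidirectional model
# with the context visible to a bidirectional one. Illustrative only.

def left_context(tokens, i):
    """Words a left-to-right (unidirectional) model can see at position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """Words a bidirectional model can see: everything except the word itself."""
    return tokens[:i] + tokens[i + 1:]

# Google's own demo query, where the small word "to" carries the meaning.
query = "2019 brazil traveler to usa need a visa".split()
i = query.index("to")

print(left_context(query, i))
# ['2019', 'brazil', 'traveler']
print(bidirectional_context(query, i))
# ['2019', 'brazil', 'traveler', 'usa', 'need', 'a', 'visa']
```

With only the left context, "to" is ambiguous; with both sides visible, the direction of travel (Brazil to the USA) becomes clear, which is exactly the kind of case the examples below describe.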
It should be most useful in longer, multi-word searches phrased as questions. Google says users will be able to search in more natural language, since the algorithm will understand the context of the words.
In its demonstration, the company used the example of a search asking whether Brazilians need a visa to travel to the United States. Previously, the algorithm would have given little weight to the preposition "to" and returned results about trips from the USA to Brazil.
Another example involves a search by someone who wants to know whether they can pick up prescription medicine for another person. Without BERT, the algorithm would not focus on "for someone" and would show general results about filling prescriptions.
In announcing the news, Google called the change "the biggest leap in the last five years" and one of the biggest in the history of Search. The company said the new way of interpreting queries will mean an improvement for the billions of searches made daily.
BERT is initially being used for searches in English, but will be extended to other languages in the future. According to Google, it is already being used in Portuguese searches to improve featured snippets, the boxes with relevant information that appear in some searches.