New Ideas in Marketing
Essential news for marketers, summarised by YouGov

The model is being applied to 1 in 10 English-language searches in the US and will let users enter queries more naturally.

Google is rolling out a new technology called BERT (Bidirectional Encoder Representations from Transformers) to better understand search queries. BERT models interpret the meaning of a word in the context of both the words that precede it and the words that follow it.
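As an illustration (not from the article), this bidirectional behaviour can be seen with the open-source Hugging Face `transformers` library, which hosts publicly released BERT checkpoints. The sketch below is a minimal demo, assuming a Python environment with `transformers` and a PyTorch backend installed; Google's production Search system is not public, so the model name here is just a standard demo checkpoint.

```python
# Illustrative sketch only: BERT's masked-word prediction, using the
# open-source Hugging Face "transformers" library (not Google's
# production Search stack, which is not publicly available).
from transformers import pipeline

# The "fill-mask" task asks BERT to predict a hidden word using BOTH
# the words before it and the words after it.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The same masked slot resolves differently depending on the
# surrounding context, which is the "bidirectional" part of BERT.
for sentence in [
    "She withdrew cash from the [MASK].",
    "They had a picnic on the [MASK] of the river.",
]:
    top = fill_mask(sentence)[0]  # highest-probability prediction
    print(f"{sentence} -> {top['token_str']} (p={top['score']:.2f})")
```

Earlier search models read queries largely word by word; a masked-language model like this scores each word against its full sentence context, which is why reordering or adding small words (such as "to" or "for") can change the prediction.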

Given the complexity of BERT, Google will, for now, limit its use to 1 in 10 English-language searches in the US. BERT helps the search giant's algorithms understand nuances in queries and connections between words that they previously missed.

This change is expected not only to improve the interpretation of search queries but also to surface more accurate results. The technology will allow users to enter queries in a “more natural way”.

Read the original article 

[4 minute read] 
