The new large-scale sparse model aims to make search more accurate.
Combined with Bing’s existing Transformer models, the “Make Every Feature Binary” (MEB) model makes search results on Bing more relevant. MEB currently runs in production for all Bing searches, across all languages and regions.
The MEB model was trained on more than 500 billion query/document pairs drawn from three years of Bing searches. With an input space of more than 200 billion binary features, MEB aims to go beyond semantic relationships and learn hidden intents connecting queries and documents.
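To make the idea of “binary features” concrete, here is a minimal sketch of one common way such features are built: every query-term/document-term pair is hashed into a huge sparse space, where each feature is simply present (1) or absent (0), and a score is the sum of learned weights over the active features. This is an illustration of sparse binary feature models in general, not Microsoft’s actual MEB implementation; the feature template, hashing scheme, and toy weights below are all assumptions.

```python
import hashlib

# Illustrative feature-space size; MEB's real input space exceeds 200 billion.
NUM_FEATURES = 2**20

def hash_feature(name: str) -> int:
    """Map a feature string to a bucket index (feature hashing)."""
    digest = hashlib.md5(name.encode()).hexdigest()
    return int(digest, 16) % NUM_FEATURES

def binary_features(query: str, doc_title: str) -> set[int]:
    """Build sparse binary features: each query-term/doc-term pair
    becomes one feature that is either present (1) or absent (0)."""
    feats = set()
    for q in query.lower().split():
        for d in doc_title.lower().split():
            feats.add(hash_feature(f"q={q}|d={d}"))
    return feats

def score(features: set[int], weights: dict[int, float]) -> float:
    """Sparse linear scorer: sum the learned weight of each active feature."""
    return sum(weights.get(f, 0.0) for f in features)

# Toy weights standing in for parameters learned from click data.
feats = binary_features("hotmail login", "Microsoft Outlook sign in")
weights = {f: 0.1 for f in feats}
print(score(feats, weights))
```

Because each feature encodes one specific term pair rather than a dense embedding, the model can memorize individual facts, such as a strong weight on the pair (“hotmail”, “outlook”), which is how hidden intents like the one described below can be captured.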
For instance, MEB identified a hidden intent linking “Hotmail” and “Microsoft Outlook”. Similarly, the model learned that “baseball” and “hockey” are negatively related. Microsoft says that MEB has increased the clickthrough rate on the top search results by almost 2%, while manual query reformulations and clicks on pagination have dropped by more than 1% and 1.5%, respectively.