Karlston
Posted October 25, 2019

Google is improving 10 percent of searches by understanding language context

Say hello to BERT

Google is currently rolling out a change to its core search algorithm that it says could change the rankings of results for as many as one in ten queries. It's based on cutting-edge natural language processing (NLP) techniques developed by Google researchers and applied to its search product over the course of the past 10 months.

In essence, Google is claiming that it is improving results by having a better understanding of how words relate to each other in a sentence. In one example Google discussed at a briefing with journalists yesterday, its search algorithm was able to parse the meaning of the following phrase: "Can you get medicine for someone pharmacy?"

The old Google search algorithm treated that sentence as a "bag of words," according to Pandu Nayak, Google fellow and VP of search. So it looked at the important words, medicine and pharmacy, and simply returned local results. The new algorithm was able to understand the context of the words "for someone," realize it was a question about whether you could pick up somebody else's prescription, and return the right results.

The tweaked algorithm is based on BERT, which stands for "Bidirectional Encoder Representations from Transformers." Every word of that acronym is a term of art in NLP, but the gist is that instead of treating a sentence like a bag of words, BERT looks at all the words in the sentence as a whole. Doing so allows it to realize that the words "for someone" shouldn't be thrown away; rather, they are essential to the meaning of the sentence.

The way BERT learns to pay attention to those words is essentially by self-training on a titanic game of Mad Libs. Google takes a corpus of English sentences and randomly removes 15 percent of the words, then BERT is set to the task of figuring out what those words ought to be. Over time, that kind of training turns out to be remarkably effective at making an NLP model "understand" context, according to Jeff Dean, Google senior fellow and SVP of research.

Another example Google cited was "parking on a hill with no curb." The word "no" is essential to this query, and prior to implementing BERT in search, Google's algorithms missed that.

Source: Google is improving 10 percent of searches by understanding language context (The Verge)
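For anyone curious what that "Mad Libs" training objective looks like in practice, below is a minimal sketch of masked-word prediction using the open-source Hugging Face transformers library and a publicly released BERT checkpoint. To be clear, this is an illustration of the masked language modeling idea the article describes, not Google's production search stack; the library, model name, and example sentence are my own assumptions, not from the article.

    # Minimal sketch of BERT-style masked language modeling ("Mad Libs").
    # Assumes the open-source Hugging Face `transformers` package (plus a
    # backend such as PyTorch) is installed; this illustrates the training
    # objective only, not Google's production search system.
    from transformers import pipeline

    # A fill-mask pipeline asks BERT to predict the hidden [MASK] token
    # from the words on both sides of it (the "bidirectional" part).
    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    # During pre-training, roughly 15 percent of words are hidden like this,
    # and the model learns to guess them from the surrounding context.
    sentence = "Can you pick up a [MASK] for someone else at the pharmacy?"

    # Print the model's top three guesses for the blank, with scores.
    for prediction in fill_mask(sentence, top_k=3):
        print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")

Because the model reads the whole sentence at once, words like "for someone else" shift what it predicts for the blank, which is exactly the kind of context signal the article says the old bag-of-words approach threw away.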
zigzag
Posted October 25, 2019

Search engines shouldn't police the web; they already filter so much content. In the end, Google becomes useless. I switched search engines a long time ago.
dfortunsan
Posted October 25, 2019

Hmmm, that's strange and interesting, because search engines really can't police the web.
vitorio
Posted October 25, 2019

This matters especially when the person writing the search isn't a native speaker and is searching in a second language. The word order changes, and so does the search.