
Ever had to search for something on Google, but you’re not exactly sure what it is, so you just use some language that vaguely implies it? Google’s about to make that a whole lot easier.

Google announced today it’s rolling out a new machine learning-based language understanding technique called Bidirectional Encoder Representations from Transformers, or BERT. BERT helps decipher your search queries based on the context of the language used, rather than individual words. According to Google, “when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English.”

Most of us know that Google usually responds to words, rather than to phrases — and Google’s aware of it, too. In the announcement, Pandu Nayak, Google’s VP of search, called this kind of searching “keyword-ese,” or “typing strings of words that they think we’ll understand, but aren’t actually how they’d naturally ask a question.” It’s amusing to see these kinds of searches — heck, Wired has made a whole cottage industry out of celebrities reacting to these keyword-ese queries in its “Autocomplete” video series — but Nayak’s correct that this is not how most of us would naturally ask a question.

As you might expect, this subtle change might make some pretty big waves for potential searchers. Nayak said this “[represents] the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.” Google offered several examples of this in action, such as “Do estheticians stand a lot at work,” which apparently returned far more accurate search results.

I’m not sure if this is something most of us will notice — heck, I probably wouldn’t have noticed if I hadn’t read Google’s announcement, but it’ll sure make our lives a bit easier. The only reason I can see it not having a huge impact at first is that we’re now so used to keyword-ese, which is in some cases more economical to type. For example, I can search “What movie did William Powell and Jean Harlow star in together?” and get the correct result (Libeled Lady; not sure if that’s BERT’s doing or not), but I can also search “William Powell Jean Harlow movie” and get the exact same result.

BERT will initially only be applied to English-language searches in the US, but Google is apparently hoping to roll it out to more countries soon.

[Source: This article was published in thenextweb.com By RACHEL KASER - Uploaded by the Association Member: Dorothy Allen]


The new language model can think in both directions, fingers crossed

Google has updated its search algorithms to tap into an AI language model that is better at understanding netizens' queries than previous systems.

Pandu Nayak, a Google fellow and vice president of search, announced this month that the Chocolate Factory has rolled out BERT, short for Bidirectional Encoder Representations from Transformers, for its most fundamental product: Google Search.

To pull all of this off, researchers at Google AI built a neural network known as a transformer. The architecture is suited to processing sequences of data, making it ideal for handling language. To understand a sentence, you must consider all of its words in order. Unlike previous transformer models that only consider words in one direction – left to right – BERT can look both ways to take in the overall context of a sentence.

“BERT models can, therefore, consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries,” Nayak said.
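Google hasn’t published how BERT is wired into Search, but the bidirectional idea itself is easy to demo with the open-source Hugging Face transformers library and a pretrained BERT checkpoint. The following is a minimal sketch, not Google’s production code:

```python
# Minimal demo of bidirectional context with a pretrained BERT model,
# via the open-source Hugging Face `transformers` library
# (pip install transformers torch). Illustrative only, not Google Search code.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the masked word from BOTH sides: the left context
# ("brazilian traveler") and the right context ("a visa to enter the usa").
# A strictly left-to-right model could not use the words after the blank.
for prediction in fill("a brazilian traveler [MASK] a visa to enter the usa."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

In a quick test you would expect words like “needs” near the top of the list; change the right-hand context and the predictions shift, which is exactly the point of reading in both directions.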

For example, here is what the previous Google Search and the new BERT-powered search look like when you query: “2019 brazil traveler to usa need a visa.”


Left: The result returned for the old Google Search that incorrectly understands the query as a US traveler heading to Brazil. Right: The result returned for the new Google Search using BERT, which correctly identifies the search is for a Brazilian traveler going to the US. Image credit: Google.

BERT has a better grasp of the significance behind the word "to" in the new search. The old model returns results that show information for US citizens traveling to Brazil, instead of the other way around. It looks like BERT is a bit patchy, however, as a Google Search today still appears to give results as if it's American travelers looking to go to Brazil:


Current search result for the query “2019 brazil traveler to USA need a visa.” It still interprets the query as a US traveler going to Brazil.

The Register asked Google about this, and a spokesperson told us... the screenshots were just a demo. Your mileage may vary.

"In terms of not seeing those exact examples, the side-by-sides we showed were from our evaluation process, and might not 100 percent mirror what you see live in Search," the PR team told us. "These were side-by-side examples from our evaluation process where we identified particular types of language understanding challenges where BERT was able to figure out the query better - they were largely illustrative.

"Search is dynamic, content on the web changes. So it's not necessarily going to have a predictable set of results for any query at any point in time. The web is constantly changing and we make a lot of updates to our algorithms throughout the year as well."

Nayak claimed BERT would improve 10 percent of all its searches. The biggest changes will be for longer queries, apparently, where sentences are peppered with prepositions like “for” or “to.”

“BERT will help Search better understand one in 10 searches in the US in English, and we’ll bring this to more languages and locales over time,” he said.

Google will run BERT on its custom Cloud TPU chips; it declined to disclose how many would be needed to power the model. The most powerful Cloud TPU option currently is the Cloud TPU v3 Pod, which contains 64 ASICs, each delivering 420 teraflops of performance and carrying 128GB of high-bandwidth memory.

At the moment, BERT will work best for queries made in English. Google said it also works in two dozen countries in other languages, such as Korean, Hindi, and Portuguese, for “featured snippets” of text. ®

[Source: This article was published in theregister.co.uk By Katyanna Quach - Uploaded by the Association Member: Anthony Frank]


Google confirmed an update affecting local search results has now fully rolled out, a process that began in early November.


In what’s been called the November 2019 Local Search Update, Google is now applying neural matching to local search results. To explain neural matching, Google points to a tweet published earlier this year that describes it as a super-synonym system.

That means neural matching allows Google to better understand the meaning behind queries and match them to the most relevant local businesses – even if the keywords in the query are not specifically included in the business name and description.

“The use of neural matching means that Google can do a better job going beyond the exact words in business name or description to understand conceptually how it might be related to the words searchers use and their intents.”

In other words, some business listings might now be surfaced for queries they wouldn’t have shown up for prior to this update. Hopefully, that proves to be a good thing.
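Google hasn’t published the production system, but the “super-synonym” idea can be sketched with off-the-shelf sentence embeddings. The business listings below are invented for illustration:

```python
# Toy illustration of matching a query to local businesses by meaning rather
# than keyword overlap, using the open-source sentence-transformers library
# (pip install sentence-transformers). The data here is invented for the demo.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

businesses = [
    "Joe's Auto Repair: brakes, oil changes, and engine diagnostics",
    "Sunrise Bakery: fresh bread, cakes, and pastries daily",
    "QuickFix Phone Clinic: cracked screens and battery replacement",
]

# The query shares no keywords with the best match, but the embeddings
# place "rattling noise" near "auto repair" in vector space.
query = "my car is making a weird rattling noise"

scores = util.cos_sim(model.encode(query), model.encode(businesses))[0]
ranked = sorted(zip(businesses, scores), key=lambda p: float(p[1]), reverse=True)
for business, score in ranked:
    print(f"{float(score):.2f}  {business}")
```

The auto shop should rank first despite sharing no words with the query, which is the gist of going “beyond the exact words in business name or description.”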

Google notes that, although the update has finished rolling out, local search results as they are displayed now are not set in stone by any means. Like regular web searches, results can change over time.

Google has not stated to what extent local search results will be impacted by this update, though it was confirmed this is a global launch across all countries and languages.

[Source: This article was published in searchenginejournal.com By Matt Southern - Uploaded by the Association Member: Jasper Solander]


John Mueller from Google gave one of the clearest and easiest-to-understand explanations of how Google uses machine learning in web search. He basically said Google uses it for "specific problems" where automation and machine learning can help improve the outcome. The example he gave was canonicalization, and it clears things up.

This is from the Google webmaster hangout starting at the 37:47 mark; the key example, using machine learning to set canonicalization signal weights, is quoted in full in the transcript below.

This was the first part of the answer around how Google debugs its search algorithm.

Here is the full transcript of this part.

The question:

Machine learning has been a part of Google search algorithm and I can imagine it's getting smarter every day. Do you as an employee with access to the secret files know the exact reason why pages rank better than others or is the algorithm now making decisions and evolving in a way that makes it impossible for humans to understand?

John's full answer:

We get this question every now and then and we're not allowed to provide an answer because the machines are telling us not to talk about this topic. So I really can't answer. No, just kidding.

It's something where we use machine learning in lots of ways to help us understand things better. But machine learning isn't just this one black box that does everything for you, like you feed the internet in on one side and search results come out the other side. It's a tool for us. It's essentially a way of testing things out a lot faster, trying things out, and figuring out what the right solution is.

So, for example, we use machine learning for canonicalization. So what that kind of means is: we have all of those factors that we talked about before, and we give them individual weights. That's kind of the traditional way to do it. And we say, well, rel canonical has this much weight, and redirect has this much weight, and internal linking has this much weight. And the traditional approach would be to say, well, we will just make up those weights, those numbers, and see if it works out. And if we see that things don't work out, we will tweak those numbers a little bit. And with machine learning, what we can essentially do is say, well, this is the outcome that we want to have achieved, and the machine learning algorithms should figure out these weights on their own.

So it's not so much that machine learning does everything with canonicalization on its own, but rather it has this well-defined problem: it's working out what numbers we should have there as weights, and kind of repeatedly trying to relearn that system and understand, like, on the web this is how people do it, and this is where things go wrong, and that's why we should choose these numbers.

So when it comes to debugging that: we still have those numbers, we still have those weights there. It's just that they're determined by machine learning algorithms. And if we see that things go wrong, then we need to find a way, like, how could we tell the machine learning algorithm that in this case we should have taken into account, I don't know, phone numbers on a page more, rather than just the pure content, to kind of separate local versions, for example. And that's something that we can do when we kind of train these algorithms.

So with all of these machine learning things, it's not that there's one black box that just does everything and nobody knows why it does things. Rather, we try to apply it to specific problems where it makes sense to automate things a little bit, in a way that saves us time and helps to pull out patterns that maybe we wouldn't have recognized manually if we looked at it.
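A rough sketch of what John is describing: instead of hand-picking weights for canonicalization signals, you fit them from labeled outcomes. The signals and training data below are invented; Google's actual features and labels aren't public.

```python
# Learning signal weights from examples instead of hand-tuning them.
# Features and labels are made up purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is a candidate URL: [has rel=canonical pointing here,
# is a redirect target, share of internal links it receives].
X = np.array([
    [1, 1, 0.9],
    [1, 0, 0.6],
    [0, 0, 0.7],
    [0, 1, 0.4],
    [1, 0, 0.1],
    [0, 0, 0.2],
])
# Label: 1 if this URL was judged the correct canonical choice.
y = np.array([1, 1, 1, 0, 0, 0])

model = LogisticRegression().fit(X, y)

# The fitted coefficients play the role of the hand-tuned weights John
# mentions; retraining on new examples adjusts them automatically.
for name, weight in zip(["rel_canonical", "redirect", "internal_links"],
                        model.coef_[0]):
    print(f"{name}: {weight:+.2f}")
```

When the learned weights produce a wrong choice, engineers can inspect them and retrain with additional signals (John's phone-number example) rather than guessing at new constants by hand.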


Here is how Glenn Gabe summed it up on Twitter:

Glenn Gabe @glenngabe

More from @johnmu: Machine learning helps us pull out patterns we might have missed. And for debugging, Google can see those weights which are determined by ML algos. If there is something that needs to be improved, Google can work to train the algorithms: https://www.youtube.com/watch?v=5QxYWMEZT3A&t=38m53s 

[Source: This article was published in seroundtable.com By Barry Schwartz - Uploaded by the Association Member: Robert Hensonw]


Google has seemingly put the final nail in the coffin for Adobe Flash, the once-popular video and animation player that's become less relevant as newer web standards like HTML5 have taken over.

The company announced on Monday that its search engine will stop supporting Flash later this year, and that it will ignore Flash content in websites that contain it. The search engine will also stop indexing SWF files, the file format for media played through the Flash Player. Google noted that most users and websites won't see any impact from this change. 

The move has been a long time coming for Flash. Adobe announced in 2017 that it was planning to end-of-life Flash by ceasing to update and distribute it at the end of 2020, and Flash is already disabled in Chrome by default. When it made the announcement, Adobe said it was working with partners like Apple, Microsoft, Facebook, Google, and Mozilla to smoothly phase out Flash.

Flash was once a critical technology that enabled content creators to easily embed media, animations, and games in their websites during the earlier days of the web. If you frequently played online games in your web browser in the early 2000s, you'll probably remember that the Flash plugin was a necessity.

But as new web standards like HTML5 and WebGL rose in popularity, there was less and less need for Flash. Plus, as time went on, Flash became more prone to security problems — including a vulnerability, highlighted last year by the security blog Naked Security, that would have made it possible for hackers to execute malicious code via a Flash file.

[Source: This article was published in businessinsider.com By Lisa Eadicicco - Uploaded by the Association Member: David J. Redcliff] 


As always, when Google releases a new update to its search algorithm, it’s an exciting (and potentially scary) time for SEO. Google’s latest update, BERT, represents the biggest alteration to its search algorithm in the last five years.

So, what does BERT do?

Google says the BERT update means its search algorithm will have an easier time comprehending conversational nuances in a user’s query.

The best example of this is queries where prepositional words such as ‘to’ and ‘for’ inform the intent of the query.

BERT stands for Bidirectional Encoder Representations from Transformers, which is a language processing technique based on neural networking principles.
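One way to see why those little words matter: a bag-of-words model treats two queries containing the same words as identical, while a contextual model does not. A small sketch using an open-source embedding model (not Google's deployment), with invented example queries:

```python
# Word order and prepositions: bag-of-words vs contextual embeddings.
# Uses the open-source sentence-transformers library, not Google's BERT
# deployment; the queries are invented examples.
from collections import Counter
from sentence_transformers import SentenceTransformer, util

q1 = "flights from new york to london"
q2 = "flights from london to new york"

# Bag-of-words: identical word counts, so the two queries look the same.
print(Counter(q1.split()) == Counter(q2.split()))  # True

# Contextual embeddings: the swapped prepositional phrases shift the
# vectors, so the model can tell departure from destination.
model = SentenceTransformer("all-MiniLM-L6-v2")
emb = model.encode([q1, q2])
print(float(util.cos_sim(emb[0], emb[1])[0][0]))  # high, but below 1.0
```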

Google estimates the update will impact about 10% of United States-based queries and has revealed BERT can already be seen in action on featured snippets around the world.

How does Google BERT affect on-page SEO?

SEO practitioners can breathe a collective sigh of relief, because the Google BERT update is not designed to penalise websites; rather, it only improves the way the search engine understands and interprets search queries.

However, because the search algorithm is better at understanding nuances in language, it means websites with higher-quality written content are going to be more discoverable.

Websites that have a lot of detailed ‘how-to’ guides and other in-depth content designed to benefit users are going to get the most from Google BERT. This means businesses that aren’t implementing a thorough content strategy are likely to fall behind the curve.

Basically, the BERT update follows Google’s long-running trend of trying to improve the ability of its search algorithm to accurately serve conversational search queries.

The ultimate result of this trend is users being able to perform detailed search queries with the Google voice assistant as if they were speaking to a real person.

Previous algorithm updates

While BERT may be the first major change to Google search in five years, it’s not the biggest shakeup in its history.

The prior Google PANDA and Google PENGUIN updates were both significant, and each caused a large number of websites to be penalised for using SEO strategies that were considered ‘spammy’ or unfriendly to users.

PANDA

Google PANDA was developed in response to user complaints about ‘content farms’.

Basically, Google’s algorithm was rewarding quantity over quality, meaning there was a business incentive for websites to pump out lots of cheaply acquired content for the purposes of serving ads next to or even within them.

The PANDA update most noticeably affected link building or ‘article marketing’ strategies where low-quality content was published to content farms with a link to a business’ website attached to a keyword repeated throughout the article.

It meant that there was a significant push towards more ethical content marketing strategies, such as guest posting.

PENGUIN

Google PENGUIN is commonly seen as a follow up to the work started by PANDA, targeting spammy link-building practices and ‘black-hat’ SEO techniques.

This update was focused primarily on the way the algorithm evaluates the authority of links as well as the sincerity of their implementation in website content. Spammy or manipulative links now carried less weight. 

However, this meant that if another website posted a link to yours in a spammy or manipulative way, it would negatively affect your search rankings.

This meant that webmasters and SEO-focused businesses needed to use the disavow tool to tell Google which inbound links they approved of and which they didn’t.

[Source: This article was published in smartcompany.com.au By LUCAS BIKOWSKI - Uploaded by the Association Member: Bridget Miller]


Search-engine giant says one in 10 queries (and some advertisements) will see improved results from algorithm change

MOUNTAIN VIEW, Calif.—Google rarely talks about its secretive search algorithm. This week, the tech giant took a stab at transparency, unveiling changes that it says will surface more accurate and intelligent responses to hundreds of millions of queries each day.

Top Google executives, in a media briefing Thursday, said they had harnessed advanced machine learning and mathematical modeling to produce better answers for complex search entries that often confound its current algorithm. They characterized the changes—under a...


[Source: This article was published in wsj.com By Rob Copeland - Uploaded by the Association Member: Jasper Solander] 

 

Don't try to optimize for BERT, try to optimize your content for humans.

Google introduced the BERT update to its Search ranking system last week. The addition of this new algorithm, designed to better understand what’s important in natural language queries, is a significant change. Google said it impacts 1 in 10 queries. Yet, many SEOs and many of the tracking tools did not notice massive changes in the Google search results while this algorithm rolled out in Search over the last week.

The question is, Why?

The short answer. This BERT update really was about understanding “longer, more conversational queries,” Google wrote in its blog post. The tracking tools, such as Mozcast and others, primarily track shorter queries. That means BERT’s impact is less likely to be visible to these tools.

And for site owners, when you look at your rankings, you’re likely not tracking a lot of long-tail queries. You track queries that send higher volumes of traffic to your website, and those tend to be short-tail queries.

Moz on BERT. Pete Meyers of Moz said the MozCast tool tracks shorter head terms and not the types of phrases that are likely to require the natural language processing (NLP) of BERT.


RankRanger on BERT. The folks at RankRanger, another toolset provider, told me something similar. “Overall, we have not seen a real ‘impact’ — just a few days of slightly increased rank fluctuations,” the company said. Again, this is likely due to the dataset these companies track — short-tail keywords over long-tail keywords.

Overall tracking tools on BERT. If you look at the tracking tools, virtually all of them showed a smaller level of fluctuation on the days BERT was rolling out compared to what they have shown for past Google algorithm updates such as core search algorithm updates, or the Panda and Penguin updates.

Here are screenshots of the tools over the past week. Again, you would expect to see significant spikes in changes if BERT had shaken up rankings, but these tools do not show that:

[Screenshots: weekly rank-volatility charts from Mozcast, SERPmetrics, Algoroo, Advanced Web Ranking, AccuRanker, RankRanger and SEMrush]

SEO community on BERT. When it comes to individuals picking up on changes to their rankings in Google search, the reaction also was not as large as with a Google core update. We did notice chatter throughout the week, but that chatter within the SEO community was not as loud as is typical with other Google updates.

Why we care. We are seeing a lot of folks asking about how they can improve their sites now that BERT is out in the wild. That’s not the way to think about BERT. Google has already stated there is no real way to optimize for it. Its function is to help Google better understand searchers’ intent when they search in natural language. The upside for SEOs and content creators is they can be less concerned about “writing for the machines.” Focus on writing great content — for real people.

Danny Sullivan from Google said again that you cannot really optimize for BERT.


Continue with your strategy to write the best content for your users. Don’t do anything special for BERT, but rather, be special for your users. If you are writing for people, you are already “optimizing” for Google’s BERT algorithm.

[Source: This article was published in searchengineland.com By Barry Schwartz - Uploaded by the Association Member: Joshua Simon]


Google said it is making the biggest change to its search algorithm in the past five years, a change that, if successful, users might not even be able to detect.

The search giant on Friday announced a tweak to the software underlying its vaunted search engine that is meant to better interpret queries when written in sentence form. Whereas prior versions of the search engine may have overlooked words such as “can” and “to,” the new software is able to help evaluate whether those change the intent of a search, Google has said. Put a bit more simply, it is a way of understanding search terms in relation to each other and it looks at them as an entire phrase, rather than as just a bucket of words, the company said. Google is calling the new software BERT, after a research paper published last year by Google researchers describing a form of language processing known as Bidirectional Encoder Representations from Transformers.

While Google is constantly tweaking its algorithm, BERT could affect as many as 10 percent of English language searches, said Pandu Nayak, vice president of search, at a media event. Understanding queries correctly so Google returns the best result on the first try is essential to Google’s transformation from a list of links to determining the right answer without having to even click through to another site. The challenge will increase as queries increasingly move from text to voice-controlled technology.

But even big changes aren’t likely to register with the masses, he conceded.

“Most ranking changes the average person does not notice, other than the sneaking feeling that their searches were better,” said Nayak.

“You don’t have the comparison of what didn’t work yesterday and what does work today,” said Ben Gomes, senior vice president of search.

BERT, said Nayak, may be able to determine that a phrase such as “math practice books for adults” likely means the user wants to find math books that adults can use, because of the importance of the word “for.” A prior version of the search engine displayed a book result targeted for “young adults,” according to a demonstration he gave.

Google is rolling out the new algorithm to U.S. users in the coming weeks, the company said. It will later offer it to other countries, though it didn’t offer specifics on timing.

The changes suggest that even after 20 years of data collection and Google’s dominance of search — with about 90 percent market share — Web searches may best be thought of as equal parts art and science. Nayak pointed to examples like searches for how to park a car on a hill with no curb or whether a Brazilian needs a visa to travel to the United States as yielding less than satisfactory results without the aid of the BERT software.

To test BERT, Google turned to its thousands of contract workers known as “raters,” Nayak said, who compared results from search queries with and without the software. Over time, the software learns when it needs to read entire phrases versus just keywords. About 15 percent of the billions of searches conducted each day are new, Google said.

Google said it also considers other input, such as whether a user tries rephrasing a search term rather than initially clicking on one of the first couple of links.

Nayak and Gomes said they didn’t know whether BERT would be used to improve advertising sales that are related to search terms. Advertising accounts for the vast majority of Google’s revenue.

[Source: This article was published in unionleader.com By Greg Bensinger - Uploaded by the Association Member: Jeremy Frink]


Friends, you're going to wish you were still making the scene with a magazine after reading this sentence: Google's web trackers are all up in your fap time and there's pretty much nothing (except maybe using a more secure browser like Firefox, reading up on cybersecurity tips from the EFF, refusing to sign into a Google account and never going online without the protection of a VPN) that anyone can do about it.

From The Verge:

Visitors to porn sites have a “fundamentally misleading sense of privacy,” warn the authors of a new study that examines how tracking software made by tech companies like Google and Facebook is deployed on adult websites.

The authors of the study analyzed 22,484 porn sites and found that 93 percent of them leak data to third parties, including when accessed via a browser’s “incognito” mode. This data presents a “unique and elevated risk,” warn the authors, as 45 percent of porn site URLs indicate the nature of the content, potentially revealing someone’s sexual preferences.

According to the study, trackers baked up by Google and its creepy always-watching-you subsidiaries were found on over 74% of the porn sites that researchers checked out... for purely scientific reasons, of course. And the fun doesn't stop there! Facebook's trackers appeared on 10% of the websites and, for the discerning surveillance aficionado, 24% of the sites the researchers checked in on were being stalked by Oracle. According to The Verge, "...the type of data collected by trackers varies... Sometimes this information seems anonymous, like the type of web browser you’re using, or your operating system, or screen resolution. But this data can be correlated to create a unique profile for an individual, a process known as “fingerprinting.” Other times the information being collected is more obviously revealing, like a user’s IP address or their phone’s mobile identification number."
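To make "fingerprinting" concrete, here's a toy sketch of how individually bland browser attributes combine into a near-unique identifier. The attribute values are invented, and real trackers use many more signals:

```python
# Toy browser fingerprint: hash a handful of attributes that are each
# common on their own but rare in combination. Values are invented.
import hashlib

attributes = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/70.0",
    "screen": "2560x1440",
    "timezone": "America/New_York",
    "language": "en-US",
    "fonts": "Arial,Helvetica,DejaVu Sans,Noto Color Emoji",
}

# Serialize deterministically and hash; the digest acts as a stable
# pseudonymous ID that follows the browser across sites, cookies or not.
blob = "|".join(f"{key}={value}" for key, value in sorted(attributes.items()))
fingerprint = hashlib.sha256(blob.encode()).hexdigest()[:16]
print(fingerprint)
```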

It's enough to give someone performance anxiety.

[Source: This article was published in boingboing.net By SEAMUS BELLAMY - Uploaded by the Association Member: Jay Harris]

