At the end of 2016, YouTube suddenly changed the algorithm it uses to rank and recommend videos to viewers, leading many popular creators to protest the change and label it as damaging to YouTubers everywhere.

YouTube’s new algorithm is just a sign of changing viewing habits and YouTube’s plan to reinvent itself. Contrary to the majority opinion, the algorithm change was necessary and is beneficial to small YouTubers.

This algorithm is responsible for which videos show up in the Suggested tab beside the video a user is currently watching and which appear on the Trending tab. In short, it determines which videos get surfaced to a viewer instead of others.

Many YouTubers, such as PewDiePie and Jacksepticeye, have said the new algorithm is killing their channels. They claim their videos are not being viewed as much as they used to be and that people are being randomly unsubscribed from their channels.

The algorithm does have major problems. Watch time is now the primary factor in deciding which videos are displayed to viewers. Longer videos now tend to do better on YouTube than shorter ones, though short videos can still get views. Watch time alone isn’t a reliable basis for promoting certain videos.

Another issue is the Trending tab, which appears to be broken following the update. Whereas it previously showcased recent viral videos and up-and-coming videos, it now shows many videos from popular TV shows like NBC’s Today. These videos often have fewer views than new videos that aren’t on the Trending tab.

Despite the hate this new algorithm receives, it’s actually a good tool that smaller, unrecognized YouTubers can use to their advantage.

It all comes down to metadata, the behind-the-scenes information an uploader has to provide YouTube with when they upload their videos. This includes a video’s title, description, tags, thumbnail and playlists.

The platform is known for clickbait. That method used to work, but the algorithm now works differently.

Large, established YouTube channels have fallen into the habit of promoting their new videos with clickbait and flashy thumbnails that don’t really have to do with the majority of that video’s content. They rely on their subscriber base to have notifications turned on or to arrive on their video watch page through a link on social media. That isn’t how it works anymore.

Large channels might be losing subscribers and getting fewer views because they aren’t adapting to the metadata system.

Using relevant tags and titles will allow YouTube to learn what a video is about. YouTube can then share the video as a recommendation to those looking for similar videos. Tagging videos with good search terms helps to get a video displayed higher on search results, which can lead to more views.
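
As a rough illustration, this is what that metadata looks like when uploading programmatically through the YouTube Data API v3. The field names below follow that API’s `videos.insert` payload; the title, description and tag values are hypothetical examples:

```python
# Build the "snippet" metadata payload used by the YouTube Data API v3
# videos.insert call. Field names follow the public API; values are examples.
def build_video_metadata(title, description, tags, category_id="27"):
    return {
        "snippet": {
            "title": title,               # shown and indexed in search results
            "description": description,   # also indexed for search
            "tags": tags,                 # helps YouTube classify the video
            "categoryId": category_id,    # "27" is the Education category
        },
        "status": {"privacyStatus": "public"},
    }

metadata = build_video_metadata(
    title="How the 2017 YouTube Algorithm Works",
    description="A breakdown of watch time, tags and search ranking.",
    tags=["youtube algorithm", "watch time", "video seo"],
)
```

With the official Google API client, a dictionary like this would be passed as the `body` argument of `youtube.videos().insert(part="snippet,status", ...)`; the point is simply that the tags and title the article discusses are explicit fields the uploader controls.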

YouTuber Roberto Blake made a video detailing how creators can use good tags to get more views, even with the new algorithm.

“If you don’t know how to properly tag YouTube videos for search and discovery, then YouTube will have a harder time promoting your videos to new viewers and even to your subscribers based on what else they’ve watched,” Blake said in the video’s description.

YouTube is a search engine. Creators who understand this will see their videos rank higher and get more views if a particular topic is being searched frequently. Making videos about trending topics will generally earn more views than videos about the uploader’s life.

Just because a majority of YouTubers are calling out the new algorithm and complaining about losing views and subscribers doesn’t mean the algorithm is bad. It’s a flawed system that needs to be changed, but when it’s used correctly, success can still be found.

As long as these big YouTubers continue to blame the platform for their channels’ shortcomings, small YouTubers can grow by using good tags and understanding how to use the system.

Author : Chase Charaba

Source : http://www.puyalluppost.com/youtube-algorithm.htm/

Categorized in Social

If you’re reading this right now, it means you’re invested in search engine optimization.

Maybe you’ve been doing it for nearly two decades, since Google was nascent. Or maybe you’re stumbling into SEJ for the first time as someone who’s brand new to SEO.

Either way, you’re here to learn how to use search engines to drive targeted traffic to your website and convert visitors into new customers, clients, patients, readers, or loyal fans.

One of the hardest aspects of SEO is staying on top of all the updates Google announces, and especially the ones it doesn’t.



If you didn’t take a good hard look at your content before Panda came out, you might have felt the pain of a Panda slap and a big rankings drop.

If you didn’t clean up your trashy backlink profile before Penguin was released, that Penguin slap and rankings loss probably didn’t do you any favors.

You get it. A huge part of optimizing your website correctly for search engines is staying on top of algorithm changes in real-time.

But with so many resources to choose from in 2017, which bring you the most accurate information? And more importantly, which can you really trust?

Why Does Google Make Changes So Often?

First things first – why does Google make so many changes to its search algorithm?

In 2012 alone, Google launched 665 improvements to search. That number was probably even higher in 2013, 2014, 2015, and 2016.

Google’s mission with search is pretty straightforward: to give users the most valuable solution to their query. It sounds simple, but with almost five billion web pages on the web, Google search is a massive undertaking. The constant changes to the search algorithm attempt to improve results for 40,000 searches every second and 1.2 trillion every year (not to mention ensure AdWords is improving Google’s bottom line).
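
Those two figures are at least internally consistent, as a quick back-of-the-envelope check in Python shows:

```python
# 40,000 searches per second, extrapolated over a year,
# lands close to the 1.2 trillion annual figure cited above.
searches_per_second = 40_000
seconds_per_year = 60 * 60 * 24 * 365   # 31,536,000
annual_searches = searches_per_second * seconds_per_year
print(f"{annual_searches / 1e12:.2f} trillion searches per year")  # → 1.26 trillion searches per year
```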

How Do I Stay Updated?

While Google makes hundreds of updates to search every year, knowing them all isn’t necessary for even the most skilled SEO. Remember that it’s much more important to stay up-to-date with the biggest updates and the most well-known changes in search results.

The most important updates are:

  1. Major algorithm updates like Panda, Penguin, Hummingbird, etc.
  2. Major click-biasing changes like the Knowledge Graph, Google Answer Box, image carousel, etc.
  3. Major user behavior changes like mobile search, load speed expectations, CTR curve changes, etc.

These are the biggest updates that will affect the visibility of your website in search engine results. Here are the resources that will help you track these changes effectively.

1. Google Webmaster Central Blog

The best place to start when it comes to Google updates is the source. While there are many trusted resources out there that report on Google search changes, it’s best to hear about major updates straight from the horse’s mouth.

The Google Webmaster Central Blog offers official news on crawling and indexing sites for the Google index.

Want to learn about the new mobile-friendly test API? Here’s a good place to start.

Google Webmaster Central Blog

The Google Webmasters YouTube Channel is another official Google source. This is a good option if you like to digest information via video and can be a nice change-of-pace from the nonstop onslaught of articles on new Google updates. 

Webmasters record videos from around the world, so the channel has a real international flavor. Subscribe to the channel and you’ll receive video updates via emails every once in a while from YouTube. Set it and forget it!

2. Search Engine Roundtable

Search Engine Roundtable is probably the best source outside of Google for finding out about the most recent Google updates. They exclusively cover Google and publish five to ten articles or updates daily. They also send out a daily recap so you can get all the news in your inbox without having to go searching.

Search Engine Roundtable

3. Moz

Moz is one of the companies at the forefront of SEO and inbound marketing, as it has been for years now. Their approach to making SEO accessible for everyone is their hallmark, and it makes them a must-bookmark source for SEO knowledge.

The Moz Blog is a great place not only to learn about Google’s most recent updates but to really dive deep into what they mean for you and your website. Knowing when and how updates affect websites in general is one thing, but learning exactly how they work and what you need to do to continue optimizing correctly and avoid penalties is going to be the difference-maker.

Join the email newsletter as well as the Moz Top 10 Newsletter to get updates right in your inbox.

Mozcast is another helpful Moz product. It’s a weather report showing turbulence in Google’s search algorithm for any given day. The stormier and hotter the weather, the more Google’s rankings changed.

Mozcast 100

4. Search Engine Journal

That’s right. There’s a reason most posts on SEJ receive hundreds or thousands of reads and shares – you’ve already arrived at a premier destination for SEO.

While most posts here go into real depth about a specific topic related to search engine marketing, Search Engine Journal offers a lot with regard to Google’s algorithm updates.

Matt Southern is the Lead Newswriter here and regularly publishes shorter articles that will give you the rundown on something happening right now.

For example, if you wanted to know which 200 sites were recently banned from Google search results for promoting fake news, Matt’s your guy.

5. Search Engine Land

Search Engine Land is another daily publication that covers all aspects of the search marketing industry. They have a section specifically to help you track Google’s updates, and you can of course subscribe to said updates via email.

6. Twitter

There are a few key figures in the SEO community you should follow. You can even activate mobile notifications so you’re the first one to see their tweets and learn about Google’s most recent updates.

The Real Trick

Now that you have all these great resources to stay up-to-date on Google’s changes, you need to set up a system that helps you keep track. Going to each of these websites every day isn’t sustainable, so let’s get you set up with a better system.

  1. Subscribe to each of the above periodicals via email. That way, all your updates will flow directly into your inbox.
  2. Sign up for unroll.me and get all those updates in one, clean email every morning, afternoon or evening.

Now you’ll receive a daily update about Google algorithm changes without leaving your computer. Magic!
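
For readers who prefer a do-it-yourself alternative to unroll.me, the same digest idea can be sketched in a few lines of Python. The feed below is a stubbed example standing in for the RSS feed any of these sites publishes; real use would fetch each feed over HTTP first:

```python
import xml.etree.ElementTree as ET

# Stub RSS document standing in for a fetched feed
# (e.g. from Search Engine Roundtable or the Moz Blog).
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Search Engine Roundtable</title>
  <item><title>Possible Google Update on Feb. 7</title><link>https://example.com/a</link></item>
  <item><title>Penguin 4.0 Now Real-Time</title><link>https://example.com/b</link></item>
</channel></rss>"""

def digest(rss_xml):
    """Collect (source, headline, link) tuples from one RSS feed."""
    root = ET.fromstring(rss_xml)
    source = root.findtext("channel/title")
    return [(source, item.findtext("title"), item.findtext("link"))
            for item in root.iterfind("channel/item")]

for source, headline, link in digest(SAMPLE_RSS):
    print(f"[{source}] {headline} ({link})")
```

Run this once a day over each subscribed feed, concatenate the results, and you have the same one-email digest without a third-party service.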

Knowing about the changes Google makes to its algorithm is essential to keeping your website’s content visible in search results. Staying vigilant, continuing to learn and setting up a system that will autonomously deliver you news are the keys towards maintaining an SEO-friendly website.

Author : Joe Howard

Source : https://www.searchenginejournal.com/communication-overload-keeping-google-searchs-constant-updates/185168/

Categorized in Search Engine

Did your rankings in Google get better or worse over the past week? Many webmasters and SEOs are noticing some significant changes in Google's search rankings algorithm.

Last Tuesday, Feb. 7, there seems to have been a Google algorithm change that adjusted how many sites rank — both for good and bad. I’ve been tracking the update since Feb. 8, and over time, more and more webmasters and SEOs have been taking notice of the ranking changes at Google.

This seems to be unrelated to the unconfirmed link algorithm change from earlier in February. This new update seems to be more related to Panda, based on such things as content and site quality, versus link factors.

Google has not confirmed the update and would not comment on what webmasters and SEOs have been noticing over the past week in the search results. So we cannot confirm if this was a content quality shift, link quality change or something else. But what we can say is that webmasters and SEOs are very busy noticing these ranking changes, through looking at ranking reports or their traffic from Google in their analytics, or using tracking tools that track visibility and other means.

The automated tracking tools from Mozcast, RankRanger, Accuranker and others also all showed evidence of an algorithm update on Feb. 7.

This update seems to have been somewhat significant, which is why we reached out to Google for a comment. If we hear more from Google, we will update you. But for now, this is all based on the conversation and chatter that I track closely within the industry.

Author : Barry Schwartz

Source : http://searchengineland.com/new-unconfirmed-google-algorithm-update-touched-february-7th-269338

Categorized in Search Engine
If you Google “Was the Holocaust real?” right now, seven out of the top 10 results will be Holocaust denial sites. If you Google “Was Hitler bad?,” one of the top results is an article titled, “10 Reasons Why Hitler Was One Of The Good Guys.”
In December, responding to weeks of criticism, Google said that it tweaked its algorithm to push down Holocaust denial and anti-Semitic sites. But now, just a month later, its fix clearly hasn’t worked.
In addition to hateful search results, Google has had a similar problem with its “autocompletes” — when Google anticipates the rest of a query from its first word or two. Google autocompletes have often embodied racist and sexist stereotypes. Google image search has also generated biased results, absurdly tagging some photos of black people as “gorillas.”
The result of these horrific search results can be deadly. Google search results reportedly helped shape the racism of Dylann Roof, who murdered nine people in a historically black South Carolina church in 2015. Roof said that when he Googled “black on white crime, the first website I came to was the Council of Conservative Citizens,” which is a white supremacist organization. “I have never been the same since that day,” he said. And of course, in December, a Facebook-fueled fake news story about Hillary Clinton prompted a man to shoot up a pizza parlor in Washington D.C. The fake story reportedly originated in a white supremacist’s tweet.
These terrifying acts of violence and hate are likely to continue if action isn’t taken. Without a transparent curation process, the public has a hard time judging the legitimacy of online sources. In response, a growing movement of academics, journalists and technologists is calling for more algorithmic accountability from Silicon Valley giants. As algorithms take on more importance in all walks of life, they are increasingly a concern of lawmakers. Here are some steps Silicon Valley companies and legislators should take to move toward more transparency and accountability:

1. Obscure content that’s damaging and not of public interest.

When it comes to search results about an individual person’s name, many countries have aggressively forced Google to be more careful in how it provides information. Thanks to the Court of Justice of the European Union, Europeans can now request the removal of certain search results revealing information that is “inadequate, irrelevant, no longer relevant or excessive,” unless there is a greater public interest in being able to find the information via a search on the name of the data subject.
Such removals are a middle ground between information anarchy and censorship. They neither erase information from the internet (it can still be found at the original source) nor allow it to dominate the impression of the aggrieved individual. They are a kind of obscurity that lets ordinary individuals avoid having a single incident indefinitely dominate search results on their name. For example, a woman in Spain whose husband was murdered 20 years ago successfully forced Google Spain to take news of the murder off search results on her name.

Such removals are a middle ground between information anarchy and censorship.

2. Label, monitor and explain hate-driven search results.

In 2004, anti-Semites boosted a Holocaust-denial site called “Jewwatch” into the top 10 results for the query “Jew.” Ironically, some of those horrified by the site may have helped by linking to it in order to criticize it. The more a site is linked to, the more prominence Google’s algorithm gives it in search results.
Google responded to complaints by adding a headline at the top of the page entitled “An explanation of our search results.” A web page linked to the headline explained why the offensive site appeared so high in the relevant rankings, thereby distancing Google from the results. The label, however, no longer appears. In Europe and many other countries, lawmakers should consider requiring such labeling in the case of obvious hate speech. To avoid mainstreaming extremism, labels may link to accounts of the history and purpose of groups with innocuous names like “Council of Conservative Citizens.”
In the U.S., this type of regulation may be considered a form of “compelled speech,” barred by the First Amendment. Nevertheless, better labeling practices for food and drugs have escaped First Amendment scrutiny in the U.S., and why should information itself be different? As law professor Mark Patterson has demonstrated, many of our most important sites of commerce are markets for information: search engines are not offering products and services themselves but information about products and services, which may well be decisive in determining which firms and groups fail and which succeed. If they go unregulated, easily manipulated by whoever can afford the best search engine optimization, people may be left at the mercy of unreliable and biased sources.

Better labeling practices for food and drugs have escaped First Amendment scrutiny in the U.S. Why should information itself be different?

3. Audit logs of the data fed into algorithmic systems.

We also need to get to the bottom of how some racist or anti-Semitic groups and individuals are manipulating search. We should require immutable audit logs of the data fed into algorithmic systems. Machine-learning, predictive analytics or algorithms may be too complex for a person to understand, but the data records are not.
A relatively simple set of reforms could vastly increase the ability of entities outside Google and Facebook to determine whether and how the firms’ results and news feeds are being manipulated. There is rarely adequate profit motive for firms themselves to do this — but motivated non-governmental organizations can help them be better guardians of the public sphere.
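
One common way to make such logs tamper-evident is hash chaining: each record stores the hash of the record before it, so any retroactive edit breaks every hash that follows. A minimal sketch, with a hypothetical record format, might look like this:

```python
import hashlib
import json

def append_entry(log, record):
    """Append a record, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev_hash": prev_hash, "hash": entry_hash})
    return log

def verify_chain(log):
    """Recompute every hash; any tampered record invalidates the chain."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev_hash"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"query": "example query", "top_result": "example.org"})
append_entry(log, {"query": "another query", "top_result": "example.net"})
print(verify_chain(log))   # chain is intact
log[0]["record"]["top_result"] = "tampered.example"
print(verify_chain(log))   # the edit broke the chain
```

An auditor holding such a log could detect after-the-fact rewriting of what data was fed to a ranking system, even without understanding the ranking model itself.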

4. Possibly ban certain content.

In cases where computational reasoning behind search results really is too complex to be understood in conventional narratives or equations intelligible to humans, there is another regulatory approach available: to limit the types of information that can be provided.
Though such an approach would raise constitutional objections in the U.S., nations like France and Germany have outright banned certain Nazi sites and memorabilia. Policymakers should also closely study laws regarding “incitement to genocide” to develop guidelines for censoring hate speech with a clear and present danger of causing systematic slaughter or violence against vulnerable groups.

It’s a small price to pay for a public sphere less warped by hatred.

5. Permit limited outside annotations to defamatory posts and hire more humans to judge complaints.

In the U.S. and elsewhere, limited annotations ― “rights of reply” ― could be permitted in certain instances of defamation of individuals or groups. Google continues to maintain that it doesn’t want human judgment blurring the autonomy of its algorithms. But even spelling suggestions depend on human judgment, and in fact, Google developed that feature not only by means of algorithms but also through a painstaking, iterative interplay between computer science experts and human beta testers who report on their satisfaction with various results configurations.
It’s true that the policy for alternative spellings can be applied generally and automatically once the testing is over, while racist and anti-Semitic sites might require fresh and independent judgment after each complaint. But that is a small price to pay for a public sphere less warped by hatred.
We should commit to educating users about the nature of search and other automated content curation and creation. Search engine users need media literacy to understand just how unreliable Google can be. But we also need vigilant regulators to protect the vulnerable and police the worst abuses. Truly accountable algorithms will only result from a team effort by theorists and practitioners, lawyers, social scientists, journalists and others. This is an urgent, global cause with committed and mobilized experts ready to help. Let’s hope that both digital behemoths and their regulators are listening.
EDITOR’S NOTE: The WorldPost reached out to Google for comment and received the following from a Google spokesperson.
Search ranking:
Google was built on providing people with high-quality and authoritative results for their search queries. We strive to give users a breadth of content from a variety of sources, and we’re committed to the principle of a free and open web. Understanding which pages on the web best answer a query is a challenging problem, and we don’t always get it right. When non-authoritative information ranks too high in our search results, we develop scalable, automated approaches to fix the problems, rather than manually removing these one-by-one. We are working on improvements to our algorithm that will help surface more high quality, credible content on the web, and we’ll continue to improve our algorithms over time in order to tackle these challenges.
We’ve received a lot of questions about Autocomplete, and we want to help people understand how it works: Autocomplete predictions are algorithmically generated based on users’ search activity and interests. Users search for a wide range of material on the web ― 15 percent of searches we see every day are new. Because of this, terms that appear in Autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we don’t always get it right. Autocomplete isn’t an exact science, and we’re always working to improve our algorithms.
Image search:
Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online. This means that sometimes unpleasant portrayals of subject matter online can affect what image search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs.
Author : Frank Pasquale
Categorized in Search Engine

Did Yandex's new algorithm Palekh just go head to head with Google's RankBrain?

Yandex announced on their Russian blog that they have launched a new algorithm aimed at improving how they handle long-tail queries. The new algorithm is named Palekh, which is the name of a world-famous Russian city that has a firebird on its coat of arms.

The firebird has a long tail, and Yandex, the largest Russian search engine, used that as the code name for long-tail queries. Long-tail queries are queries of several words entered into the search box, more often seen these days in voice search. Yandex says about 100 million queries per day fall under the “long-tail” classification within their search engine.

The Palekh algorithm allows Yandex to understand the meaning behind every query rather than just look for similar words, which is reminiscent of Google RankBrain. I asked Yandex if it is similar to Google’s RankBrain, and they said they “don’t know exactly what’s the technology behind Google’s RankBrain, although these technologies do look quite similar.”

Yandex’s Palekh algorithm has started to use neural networks as one of 1,500 factors of ranking. A Yandex spokesperson told us they have “managed to teach our neural networks to see the connections between a query and a document even if they don’t contain common words.” They did this by “converting the words from billions of search queries into numbers (with groups of 300 each) and putting them in 300-dimensional space — now every document has its own vector in that space,” they told us. “If the numbers of a query and numbers of a document are near each other in that space, then the result is relevant,” they added.
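
The idea Yandex describes, mapping queries and documents into a shared vector space and scoring relevance by how close their vectors are, can be sketched with cosine similarity. The vectors below are tiny toy values, not real 300-dimensional embeddings learned from query logs:

```python
import math

def cosine_similarity(a, b):
    """Proximity of two vectors in a shared embedding space."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings: in Palekh these would be 300-dimensional vectors
# derived from billions of search queries.
query_vec = [0.9, 0.1, 0.3]
doc_close = [0.8, 0.2, 0.25]   # no words in common, but nearby in space
doc_far = [0.05, 0.9, 0.1]

# The nearby document scores as more relevant than the distant one.
print(cosine_similarity(query_vec, doc_close) > cosine_similarity(query_vec, doc_far))  # → True
```

This is how a document can match a query even when they share no common words: what matters is the distance between their vectors, not term overlap.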

When I asked if they are using machine learning, Yandex confirmed that they do, explaining that teaching their neural network based on these queries will lead to some advancements in answering conversational queries in the future. They added that they “also have many targets (long click prediction, CTR, “click or not click” models and so on) that are teaching our neural network — our research has showed that using more targets is more effective.”

Author : Barry Schwartz

Source : http://searchengineland.com/yandex-launches-new-algorithm-named-palekh-improve-search-results-long-tail-queries-262334

Categorized in Search Engine

Search giant was under fire for search results dominated by neo-Nazi white supremacist sites
Online search giant Google has tweaked an algorithm to ensure that Holocaust denial sites are not the top results when someone asks whether the Holocaust occurred, Digital Trends reported on Sunday.

“We recently made improvements to our algorithm that will help surface more high quality, credible content on the web,” a Google spokesperson told Digital Trends when asked about the topic. “We’ll continue to change our algorithms over time in order to tackle these challenges.”

Google recently faced criticism after journalists noted that the top results for queries about the Holocaust's legitimacy were all links to anti-Semitic, neo-Nazi sites.

Experts have recently said that far-right groups have been using methods to manipulate search algorithms and push their propaganda higher up Google's search rankings.

While Holocaust denial and neo-Nazi sites have not been banned from Google, they are now far less prominent in search results. A link from the neo-Nazi site Stormfront was previously consistently the top result, but now appears far down the first page of results or on the second page – if browsing in "incognito mode", which does not take a user's search history into account when weighing results.

Anti-hate groups have warned of a rise in online incitement this year, with the Anti-Defamation League (ADL) telling the Israeli parliament that there has been an "explosion of hate online."

"Online hate is particularly disturbing because of the ubiquity of social media and its deep penetration into our daily lives, plus the anonymity offered by certain platforms which facilitates this phenomenon," ADL CEO Jonathan A. Greenblatt said.

Earlier this year, Google removed an extension on its Chrome browser which allowed users to identify and track suspected Jewish members of the media and entertainment industries.

While it was active, Coincidence Detector identified suspected or confirmed Jews by adding triple parentheses to their names wherever they were referenced online. According to Mic, the extension had been downloaded more than 2,700 times, had a database of 8,700 people and had a five-star rating at the time it was removed.

Source : http://www.i24news.tv/en/news/technology/133654-161227-google-reportedly-tweaks-algorithm-over-holocaust-denying-search-results

Categorized in Search Engine

Between the long-awaited rollout of Penguin 4.0, a strengthening of Google’s mobile-friendly ranking signal and the ‘Possum’ algorithm update impacting local search, 2016 was an interesting year for Google algorithm changes.

And with an upcoming move to a mobile-first search index already on the cards, as well as a penalty for intrusive mobile interstitials coming into effect on the 10th, 2017 promises to be just as eventful.

Looking back at 2016, which algorithm changes were the most impactful for marketers in the industry? And how can brands best prepare themselves for what might be around the corner? I spoke to Sastry Rachakonda and Ajay Rama of digital marketing agency iQuanti, along with Search Engine Watch’s regular mobile columnist Andy Favell, to get their thoughts on what’s to come in the search industry.

The most impactful algorithm updates of 2016

“Mobile-first indexing is probably the most significant change that happened this year,” said Rachakonda, who is the CEO of iQuanti, a data-driven digital marketing agency, “since companies were creating unique mobile content that was not the same as their desktop content. They did that for user experience. There were smaller snippets that were design friendly, but weren’t relevant and optimal for the search query.”

But while Google’s shift to emphasise mobile search even more heavily – which has included a much fuller rollout of Accelerated Mobile Pages into organic search results – was probably its most noteworthy update overall, Rachakonda believes that a different update was actually more impactful from a brand perspective: Possum.

‘Possum’ is the name given to a major update to local search on Google which came into effect on 1st September 2016, and which is thought to be the most significant algorithm update to local search since Pigeon in 2014. The name was coined by Phil Rozek of Local Visibility System, who thought it was fitting as after the update, many business owners thought that their Google My Business listings were gone, when in fact they were only filtered – hence, ‘playing possum’.


The apparently ‘dead’ Google My Business listings gave the Possum algorithm update its name. Image via Wikimedia Commons

The update seemed mostly aimed at improving the quality of the local search results and removing spammy listings, which meant that some businesses who had engaged in less-than-kosher practices in order to rank found themselves demoted.

“Possum has been the most impactful update for brands by far,” said Rachakonda. “One of our Fortune 500 clients in the insurance industry saw a 7% drop in keyword rankings, which resulted in a 13% loss of month-on-month traffic. We believe this was due to some outdated tactics their previous agency used to get them ranked, which clearly Google wasn’t fond of.”

While some businesses saw their traffic drop off as a result of Possum, others were seeing a remarkable recovery thanks to Penguin 4.0, which deployed after much anticipation in late September. The original Penguin update in 2012 targeted and devalued inorganic links, such as links which had been bought or placed solely to improve rankings, which led to significant losses in traffic for businesses who had engaged in those practices.

As Chuck Price explained in an article for Search Engine Watch in December 2015, “After Penguin, bad links became ‘toxic’, requiring a link audit and removal or disavow of spammy links. Even then, a Penguin refresh was usually required before one could see any signs of recovery.”

But thanks to the Penguin 4.0 update in 2016, these refreshes now take place in real-time, leading to significant recovery for brands who had already taken action to remove and disavow the bad links. Marcela De Vivo took a look at how this recovery works in practice, and what site owners can do to improve their situation if they haven’t already done so. 

What’s on the cards for 2017?

As I mentioned in my introduction, at least two updates in 2017 are already certain, both of them relevant to mobile search. One, Google’s penalty for mobile sites with annoying interstitials, is due to go live tomorrow, and our search news roundup last Friday featured some new clarifications from Google about what kind of interstitials will be affected by the penalty.

The other is Google’s move to a mobile-first search index, a major shift which reflects the fact that the majority of Google search queries are now coming from mobile devices. While we don’t yet have a date for this change, Google confirmed in October that the change would take place within the next few months, which means that Google’s primary index could switch to mobile any day now, and brands would do well to prepare themselves. I asked Andy Favell, Search Engine Watch’s resident mobile specialist, what advice he would give to brands who want to be prepared.

“Google has done an excellent job of focusing companies’ minds on the importance of having a mobile-friendly website. The stick approach – the fear of harming the search ranking – has worked wonders for driving adoption of mobile or mobile-friendly sites.

“However, companies should have been focusing on the carrot – building websites that would appeal to some of the billions of mobile web users out there. The beauty of mobile first is that a mobile-friendly site is often a much better desktop site. That is still true today.

“Rather than worrying about trying to make Google happy, brands should concentrate on the mobile users, consider who they are, their context, and what they want, and provide that the best possible way – i.e. intuitive, fast-loading, good UX and usability. Businesses that do this will get more traffic, more happy users and more conversions.

“That’s not just good for business, it’s good for your search ranking also. Because Google wants what’s best for the search user.”

Brands will need to prepare themselves for mobile search becoming Google’s primary index some time soon in 2017.

Those are the changes we know about so far. But what do those in the industry think is coming for search in 2017? Ajay Rama, Senior Vice President of Product at iQuanti, believes that the mobile-first index will take up most of SEO mindshare over the coming year, but he also has a number of predictions for how voice search – which has become a huge part of the search landscape since 2015 – may evolve and change things.

“As voice search starts becoming mainstream, we might see the beginning of a SERPless search – search without a SERP page,” predicts Rama. “We could see early tests in this space where we will see Google Assistant and search being seamlessly integrated into an interactive search experience. Assistant interacts with the user to ask the right questions and take him to the target page or a desired action, instead of showing a SERP page with various options. In this new experience, ads would have to be reinvented all over.”

Given that Google’s innovations of the past few years, from semantic search to Quick Answers, have increasingly been geared towards understanding users’ exact intentions with the aim of finding a single, ideal result or piece of information to satisfy their query, it’s not hard to imagine this happening. Rama also foresees a much more extensive rollout of Google’s voice-controlled Assistant to go along with this.

“Google Assistant will become part of Android, and will be available on all Android devices. Talking to the device in local languages becomes mainstream, and Google Assistant will lead this space. Their machines will learn all accents and all languages, and will soon become a leader in the voice devices, especially in non-English speaking nations.”

Can you imagine Google search without the SERP? With the expansion of voice search, it could become a reality.

While it’s hard to imagine that all of these developments will take place in 2017 alone, there’s definitely a possibility that we’ll see them begin. Google Assistant is already reported to be learning Hindi as a second language, and more languages could well follow if the uptake of Hindi is a success. However, Google Assistant is fairly late to the game compared to established voice assistants like Siri and Cortana, which have been around much longer and have had more time to refine their technology. So is it still possible for Google to pull ahead in this race?

With this change in the way we search comes a change in the way we market, as well; and if the search results page is to disappear one day, advertising will have no choice but to colonise whatever takes its place. We’re already seeing a shift towards an ‘always-on’, always-connected culture with devices like the Amazon Echo constantly listening out for voice commands. Rama believes that Internet of Things-connected devices could easily start to ‘spy’ on their owners, collecting data for the purposes of marketing – “Advertisers would love to get into living room and dinner table discussions.”

This might seem like sobering food for thought, or a whole new world of possibilities, depending on your perspective. Either way, it will be extremely interesting to see whether search continues to develop along the path it seems to be taking now – or whether it veers off in other, even more unexpected directions.

Author : Rebecca Sentance

Source : https://searchenginewatch.com/2017/01/09/which-google-algorithm-changes-impacted-marketers-most-in-2016-and-what-can-we-expect-from-2017/

Categorized in Search Engine

Google says it is “thinking deeply” about improving its search results after learning that Holocaust deniers and others were successful in making their links rise to the top.

The company, a subsidiary of Alphabet, told the BBC that it is thinking about ways to improve its search results.

The focus on skewing Google’s algorithm comes as there’s more pressure on internet firms to do more to combat fake news and conspiracy theory sites. Given that many people get news and information from Google, the company’s algorithm is of particular focus. Last week, Facebook announced it was partnering with third-party fact checkers, including news organizations, to fight fake news.

In the U.S. and the United Kingdom, those searching for “Did the Holocaust happen?” received a top result linking to a website with the headline, “Top Ten Reasons why the Holocaust didn’t happen.” The site is run by Stormfront, a neo-Nazi white supremacist group.

Google changed the ranking for U.S. users. Now the link to the Holocaust denier is the second result, after an ad and three news stories about Google’s struggles with the issue.

But in the United Kingdom, the Holocaust denier’s site is still the top result, the BBC said.

The internet giant has struggled to square its position that its algorithm surfaces the best content on the web with the fact that some are able to skew that algorithm. In a statement, Google said:

This is a really challenging problem, and something we’re thinking deeply about in terms of how we can do a better job….Search is a reflection of the content that exists on the web….The fact that hate sites may appear in search results in no way means that Google endorses these views.

Danny Sullivan of Search Engine Land wrote that it’s likely that Holocaust deniers and others are figuring out how to game Google’s system to bring their own results to the top of the search results.

The challenge for Google, he said, is to find a systemic fix, rather than just respond to one-off discoveries of misinformation:

It’s very easy to take a search here and there and demand Google change something and then the next day you find a different search and say, ‘why didn’t you fix that?’

The BBC also found that so-called “snippets,” short summaries of information that appear at the top of search results, also can be gamed.

Above: Google search page.

Source : http://www.siliconbeat.com/2016/12/20/google-moves-past-denial-comes-skew-algorithm/

Google's search algorithm has been changed over the last year to increasingly reward search results based on how likely you are to click on them, multiple sources tell Business Insider.

As a result, fake news now often outranks accurate reports on higher quality websites.

The problem is so acute that Google's autocomplete suggestions now actually predict that you are searching for fake news even when you might not be, as Business Insider noted on December 5.

There is a common misconception that the proliferation of fake news is all Facebook's fault. Although Facebook does have a fake news problem, Google's ranking algorithm does not take cues from social shares, likes, or comments when it is determining which result is the most relevant, search experts tell Business Insider. The changes at Google took place separately from the fake news problem occurring on Facebook, experts say.

The changes to the algorithm now move links up Google's search results page if Google detects that more people are clicking on them, search experts tell Business Insider.

Joost De Valk, founder of Yoast, a search consultancy that has worked for The Guardian, told Business Insider: "All SEOs [search engine optimisation experts] agree that they include relative click-through rate (CTR) from the search results in their ranking patterns. For a given 10 results page, they would expect a certain CTR for position five, for instance. If you get more clicks than they’d expect, thus a higher CTR, they’ll usually give you a higher ranking and see if you still outperform the other ones."

Search marketing consultant Rishi Lakhani said: "Though Google doesn't like to admit it, it does use CTR (click through rate) as a factor. Various tests I and my contemporaries have run indicate that. The hotter the subject line the better the clicks, right?"
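The relative-CTR mechanism these consultants describe could be sketched roughly as follows. The expected per-position CTR values and the adjustment formula below are illustrative assumptions for the sake of the example, not Google's actual figures or code:

```python
# Hypothetical sketch of a relative-CTR ranking signal.
# The expected CTR per position and the boost logic are invented
# for illustration; Google does not publish these values.

EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.02}

def ctr_adjustment(position: int, impressions: int, clicks: int) -> float:
    """Return a multiplicative ranking adjustment: a value above 1
    means the result outperformed the CTR expected at its position."""
    if impressions == 0:
        return 1.0  # no data, no adjustment
    observed = clicks / impressions
    expected = EXPECTED_CTR.get(position, 0.02)
    return observed / expected
```

Under this sketch, a result sitting at position five that is expected to draw roughly 5% of clicks but actually draws 10% would earn a 2x adjustment and be tested at a higher position, which is exactly the behaviour De Valk describes.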

It is well known that Google includes user-behaviour signals to evaluate its ranking algorithms. Google has an obvious interest in whether users like its search results. Its ranking engineers look at live traffic frequently to experiment with different algorithms. User behavior signals have the added advantage of being difficult to model, or reproduce, by unscrupulous web publishers who want to game the algorithm. 

The unfortunate side effect is that user-behaviour signals also reward fake news. Previously, Google's ranking relied more heavily on how authoritative a page is, and how authoritative the incoming links to that page are. If a page at Oxford University links to an article published by Harvard, Google would rank that information highly in its display of search results.
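The authority model the article contrasts this with is essentially PageRank: a page linked to by authoritative pages becomes authoritative itself. A minimal sketch of that idea (the graph, domain names and damping factor are invented for illustration):

```python
# Minimal PageRank-style sketch of the link-authority model:
# rank flows along links, so pages cited by well-ranked pages
# rise, regardless of how often any page is clicked.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling page distributes nothing
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

graph = {
    "oxford.example": ["harvard-article.example"],
    "harvard-article.example": ["oxford.example"],
    "fake-news.example": [],  # no authoritative links in or out
}
ranks = pagerank(graph)
# The mutually linked pages settle at a far higher rank than the
# unlinked page, however popular the latter is with clickers.
```

The contrast with the CTR signal is the point: in a pure link-authority model, a fake-news page with no reputable inbound links stays at the bottom no matter how engaging it is.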

Now, the ranking of a page can also be boosted by how often it's clicked on even if it does not have incoming links from authoritative sources, according to Larry Kim, founder and chief technology officer of WordStream, a search marketing consultancy.

The result of all this is that "user engagement" has become a valuable currency in the world of fake news. And people who believe in conspiracy theories — the kind of person who spends hours searching for "proof" that Hillary Clinton is a child abuser, for instance — are likely to be highly engaged with the fake content they are clicking on.

Thus even months after a popular fake news story has been proven to be fake, it will still rank higher than the most relevant result showing that it's false, if a large enough volume of people are clicking on it and continuing to send engagement signals back to Google's algorithm.

Here are some examples.

President Obama has never signed an order banning the US national anthem, and yet ...

A Google search results page for "Obama signs a nationwide order".

And Hillary Clinton has never sold weapons to ISIS, but ...

A Google search results page for "Hillary Clinton sold weapons to ISIS".

De Valk says: "I think the reason fake news ranks is the same reason why it shows up in Google’s autocomplete: they’ve been taking more and more user signals into their algorithm. If something resonates with users, it’s more likely to get clicks. If you’re the number three result for a query, but you’re getting the most clicks, Google is quick to promote you to #1 ... Nothing in their algorithm checks facts."

Google never explains in full how its algorithm works but a spokesperson for the company said:

"When someone conducts a search, they want an answer, not trillions of webpages to scroll through. Algorithms are computer programs that look for clues to give you back exactly what you want. Depending on the query, there are thousands, if not millions, of web pages with helpful information. These algorithms are the computer processes and formulas that take your questions and turn them into answers. Today Google’s algorithms rely on more than 200 unique signals or 'clues' that make it possible to show what you might really be looking for and surface the best answers the web can offer."

Larry Kim of WordStream traces the changes to October 2015, when Google added a machine-learning program called "RankBrain" to the list of variables that control its central search algorithm. Google deployed the system to all search results in June 2016.

Is this machine learning's fault? "I’m certain of this," Kim said. "This is the only thing that changed in the algorithm over the last year."

RankBrain is now the third most important variable in Google's search algorithm, according to Greg Corrado, a senior research scientist at Google.

The change was intended to help Google make more intelligent guesses about the 15% of new daily search queries that Google has never encountered before. RankBrain considers previous human behaviour, such as the historical popularity of similar searches, in its attempt to get the right answer.

Kim told us: "The reason why they did this was not to create fake news. They created this because links can be vulnerable, links can be gamed. In fact it is so valuable to have these number one listings for commercial listings that there’s a lot of link fraud. User-engagement [looking at how popular a search result is] sees through attempts to inflate the value of content." 

Kim's opinion is disputed by his peers. Lakhani doubts that RankBrain is the sole cause of the proliferation of fake news. "It's been tested on and off for a while," he says.

De Valk is not so sure either. "I'm not sure it's related to that. It might be, but I'm not sure. Google does hundreds of updates every year," he told Business Insider.

Naturally, the type of content that is more likely to get clicked on is also more likely to get shared, commented on, and liked on Facebook. And Facebook and Google both reward engagement (or popularity, which gives off similar signals). That pushes an item higher on the newsfeed in Facebook's case, and on the search results page in Google's case. The performance of fake news on Google is correlated to its performance on Facebook because they both deal in the same currency — user engagement. So what does well on Facebook often does well on Google. 

This is correlation, not causation.

But the fact that the two of them are occurring at the same time exacerbates the high-level presence of fake news generally. Google and Facebook dominate the amount of time people spend online and inside apps.

The changes at Google help explain why fake news has suddenly gone from circulating in a tiny corner of the internet to something that outperforms real news, influences Google's predictive search, and has real-world consequences — such as the Comet Ping Pong shooting, carried out by a man who was convinced by his internet searches that Clinton was using the restaurant as a front for a child abuse ring. Nearly 50% of Trump voters believe the same thing, according to research by Public Policy Polling.

Author:  Hannah Roberts

Source:  http://www.businessinsider.com/

It’s a zoo out there. For marketers and advertisers alike, it’s hard to control your organic search rankings with all the Google updates to keep track of: Pandas, Penguins, Pigeons and Hummingbirds. We know it’s important, but why?

In a 2015 study by Eli Swartz, Google dominated other search engines like Yahoo, Bing and DuckDuckGo by a landslide. Seventy-five percent of respondents stated that Google was their primary search engine, and in 2016 that percentage is only rising.

So what does that mean for local and organic search, or search in general? For one, it means that SEO practitioners have to continue to abide by the rules each algorithm update puts forth. It also means search engine marketers have to take into account Google’s new local algorithm update, Possum. Here’s why.

About Possum

You may not have seen a drastic change in your local organic listings in September, but a recent study shows that Google’s Possum Algorithm changed sixty-four percent of local SERPs. The debut of Possum in September impacted website rankings in the local 3-pack and Local Finder. The biggest impact Possum currently has on search results is filtering your business out if it has a duplicate, similar or second listing. This update runs separately from organic SERPs, and affects the following types of businesses:

  1. Businesses outside of city limits.
  2. Separate business locations at the same address as a similar business.
  3. Two or more businesses owned by the same company.

Playing Possum Outside of City Limits

One of the biggest and most beneficial changes from the new Possum update is improved rankings for businesses outside of their own city limits. With the algorithm in place, businesses attempting to rank in the local 3-pack or Local Finder for the next town over are having an easier time doing so, and may have already seen their rankings in those areas increase drastically. With Possum in place, there is more need than ever for search engine marketers to go local with their SEO campaigns. Going local with your SEO will help increase traffic and revenue to your site, as well as improve local rankings and authority in the SERPs.

Separate Business Locations at the Same Address as a Similar Business

This is where Possum comes into full effect. Businesses in similar industries located in the same building will begin to be filtered out and ultimately won’t show up for the same search. Keyword variation plays a more important role as this algorithm continues to make its way across search. That means that attorneys, lawyers, dentists and chiropractors located in the same building will rank locally for different keywords than similar or competing businesses at the same location.
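The filtering behaviour described here can be thought of as deduplication over shared address and category. A rough sketch of that idea follows; the field names, data and the tie-break rule (keep the listing with more reviews) are assumptions for illustration, not Google's actual criteria:

```python
# Illustrative sketch of a Possum-style local filter: when two
# listings share an address and a business category, only one
# survives in the local results. The tie-break used here
# (review count) is an invented stand-in for Google's signals.

def filter_local_listings(listings):
    best = {}
    for biz in listings:
        key = (biz["address"], biz["category"])
        if key not in best or biz["reviews"] > best[key]["reviews"]:
            best[key] = biz
    return list(best.values())

listings = [
    {"name": "Smith & Co Attorneys", "address": "1 Main St",
     "category": "attorney", "reviews": 120},
    {"name": "Jones Law", "address": "1 Main St",
     "category": "attorney", "reviews": 45},
    {"name": "Main St Dental", "address": "1 Main St",
     "category": "dentist", "reviews": 80},
]
kept = filter_local_listings(listings)
# Only one attorney at 1 Main St survives the filter;
# the dentist at the same address is unaffected.
```

This is why keyword variation matters under Possum: two attorneys in the same building collide on the same (address, category) bucket, while businesses in different categories at that address do not.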

One Algorithm, One Parent Company and Two Businesses

Although not as apparent as the other two listings, Possum’s update has also affected separate businesses owned by one company. One business remains filtered out in all searches for certain keyword terms while the other business continues to show up in the search results. While there isn’t a way around the filter (yet), we’re trusting Google to test, tweak and update their newest algorithm to differentiate the two businesses as separate rooftops, even if they have the same parent company.

Streams Kick Start Step: If you haven’t already invested in link building, you should start. Google still views quality links that point to your website as a vote of confidence. The more local, quality and authoritative links you have pointing back to your site, the more likely you are to rank in the local SERPs.

Taking Action

While Google seems to still be working out and testing their newest algorithm, it’s always a good idea to stay ahead of the next update. In order to take action, one thing that is extremely beneficial for your local SEO strategy is to start incorporating local content and putting a heavy focus on off-page SEO if you haven’t already. When incorporating local and off-page SEO, you have a recipe for success to maintain and improve your local rankings.

Take SEO a step further. Download this FREE SEO Checklist to learn 17 ways to improve your organic search results, where to start with a link building strategy and steps to integrating content into your SEO campaign.

Source : http://www.business2community.com/

Author : Keisha James

