
Updates to operating systems (OS), iOS apps and Android apps are very common. Consumers expect them, anticipate them and often demand them. This is a good thing: in a world where needs and wants are constantly evolving, the ability to improve and make changes on the fly is essential, and with more products living online or connected to the internet, updates are required. There is one company, however, whose updates can put businesses on edge and greatly impact websites. Google algorithm changes can seemingly come out of nowhere, and recovery times for penalized sites can be long.

A Google algorithm change could be happening right now, yet be so subtle that the average web surfer never notices it. However, when big updates happen, like Google Panda, Google Hummingbird or Google Penguin, businesses should check that their websites aren’t affected. Moz details the larger algorithm updates in depth if you would like to dig deeper into Google algorithm changes.

Google Penguin 4.0 and Impact

Recently Google released a new update to Google Penguin. Google Penguin 4.0 has the same core function as the original Penguin algorithm, which was rolled out to address the growing practice of generating links unnaturally. These links often came from spammy sites that Google is actively working to eliminate.

 

Google values user experience above all and will penalize websites that are in violation. Google penalties are legendary and tend to have a lasting impact: they result in a loss of keyword rankings on search engine results pages. This is a very big deal, as an important keyword your company once ranked for, one that drove traffic and/or sales, could be lost. By ‘lost’, we mean that Google could greatly hinder your chances of ranking for a keyword like “big screen tv” or “consulting service” if you are penalized.

SEM Rush explains that, historically, once a site was penalized by Google it could take months or even years to recover. Google Penguin had a lasting impact, as some websites waited years for a Penguin recovery after making site changes. This was because larger Google algorithm changes and updates were infrequent, and websites could only recover after a new update.

This delay could mean missing out on valuable prospects as your site fell in the rankings. Google Penguin 4.0 now focuses on maintaining the integrity of websites while also increasing the speed at which a website can recover from any potential penalty.

Penalties From Google

Previously, when a site received a Google penalty it affected the entire website. Google Penguin 4.0 makes penalties much more targeted: instead of the entire site being impacted, the penalty may affect just a specific page. It is therefore more important than ever to regularly check subdomains and individual web pages to ensure that nothing drastic has happened. This isn’t uncommon, as you aren’t notified when you are impacted – you just have to look for the signs.

Signs may vary, but the most significant are sudden drops in organic keywords and/or keywords dropping in position. If you notice either of these scenarios, chances are your website or web page has been hit by Google Penguin. These checks can be performed using tools like Google Analytics and SEM Rush. Once you identify bad links, you want to remove or disavow them. If you can’t remove the links, Google explains the disavow process here. Once these changes are made, your Google Penguin recovery will be much quicker.
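As a rough illustration of that kind of check, here is a minimal Python sketch. It assumes a CSV export from your rank-tracking tool with hypothetical columns named keyword, previous_position and current_position; adjust the names to whatever your tool actually produces.

```python
import csv

# Flag keywords whose ranking position dropped sharply between two exports.
# Column names are hypothetical; adjust them to match your tool's export.
DROP_THRESHOLD = 10  # positions lost before we call it a significant drop


def find_ranking_drops(path, threshold=DROP_THRESHOLD):
    drops = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            previous = int(row["previous_position"])
            current = int(row["current_position"])
            if current - previous >= threshold:  # larger number = lower ranking
                drops.append((row["keyword"], previous, current))
    return drops


if __name__ == "__main__":
    for keyword, before, after in find_ranking_drops("keyword_positions.csv"):
        print(f"{keyword}: dropped from #{before} to #{after}")
```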

 

The biggest benefit of this Google algorithm change is the speed at which a penalty can be resolved or even imposed. Previously, when Google Penguin penalized a website for spammy links, it could take a while before Google ran the algorithm again to see if the problem had been corrected. Now, due to the real-time nature of the algorithm, a Google Penguin recovery can occur fairly quickly.

The opposite is also true: a violation can be discovered and a web page can be impacted just as quickly.

The easiest way to protect yourself is to perform regular site audits to make sure that every backlink you receive is credible. While Google rewards quality links, a single bad link can undermine all of the hard work you put into building your online presence.
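One way to make that audit routine is to compare a backlink export against a list of referring domains you have already judged to be spammy. The sketch below is purely illustrative: the file name and the spam list are placeholders, not anything Google or SEM Rush provides.

```python
from urllib.parse import urlparse

# Hypothetical list of referring domains you have already judged to be spammy.
KNOWN_SPAMMY_DOMAINS = {"spam-directory.example", "link-farm.example"}


def disavow_candidates(backlink_urls):
    """Return backlink URLs whose referring domain is on the spam list."""
    candidates = []
    for url in backlink_urls:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        if domain in KNOWN_SPAMMY_DOMAINS:
            candidates.append(url)
    return candidates


if __name__ == "__main__":
    # backlinks.txt is a placeholder: one referring URL per line,
    # e.g. exported from your backlink tool of choice.
    with open("backlinks.txt") as f:
        links = [line.strip() for line in f if line.strip()]
    for url in disavow_candidates(links):
        print("Review and consider disavowing:", url)
```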

Things to Keep in Mind

In conclusion, here are a few points to remember when thinking about Google Penguin.

  • The goal is to ensure a good user experience, so you should always write for your web visitors. If you are doing this and making sure your links are credible, you greatly minimize the chance of getting hit by Google Penguin 4.0.
  • Penguin is in real time, meaning it is vitally important to keep track of your backlinks and monitor your web pages. Previously the whole site would be affected by Penguin, but that is no longer the case. While it’s great to make changes and see Google quickly stop penalizing your web page, you must remember the inverse: if spammy sites begin linking to your site, you could be penalized.
  • Use tools such as Google’s Disavow Tool or SEM Rush’s Backlink Audit to eliminate any spammy backlinks. You can also consider speaking to a professional about performing a site audit. The important thing is to regularly check site traffic for any strange fluctuations.

Author:  Ray Bennett

Source:  http://www.business2community.com/


What should SEOs do to make the best of the new Penguin update? Perhaps not much. Columnist Dave Davies notes that while Penguin 4.0 was indeed significant, things ultimately haven't changed that much.

For the last four-plus years now, we’ve heard a lot about Penguin. Initially announced in April 2012, we were told that this algorithm update, designed to combat web spam, would impact three percent of queries.

More recently, we’ve witnessed frustration on the part of penalized website owners at having to wait over a year for an update, after Google specifically noted one was coming “soon” in October of 2015.

In all the years of discussion around Penguin, however, I don’t believe any update has been more fraught with confusing statements and misinformation than Penguin 4.0, the most recent update. The biggest culprit here is Google itself, which has not been consistent in its messaging.

And this is the subject of this article: the peeling away of some of the recent misstated or just misunderstood aspects of this update, and more importantly, what it means for website owners and their SEOs.

So, let’s begin.

What is Penguin?

Note: We’re going to keep this section short and sweet — if you want something more in-depth, you should begin by reading Danny Sullivan’s article on the initial release of Penguin, “Google Launches ‘Penguin Update’ Targeting Webspam In Search Results.” You can also browse Search Engine Land’s Penguin Update section for all the articles written here on the topic.

The Penguin algorithm update was first announced on April 24, 2012, and the official explanation was that the algorithm targeted web spam in general. However, since the biggest losses were incurred by those engaged in manipulative link schemes, the algorithm itself was viewed as being designed to punish sites with bad link profiles.

I’ll leave it at that, with the assumption that I shouldn’t bore you with additional details on what the algorithm was designed to do. Let’s move now to the confusion.

Where’s the confusion?

Until Penguin 4.0 rolled out on September 23, 2016, there really wasn’t a lot of confusion around the algorithm. The entire SEO community — and even many outside it — knew that the Penguin update demoted sites with bad links, and it wasn’t until it was next updated that an affected site could expect some semblance of recovery.

The path was clear: a site would get hit with a penalty, the website owner would send out requests to have offending links removed, those that couldn’t be removed would be added to a disavow list and submitted, and then one would simply wait.

 

However, things got more complicated with this most recent update — not because the algorithm itself got any more difficult to understand, but rather because the folks at Google did.

In essence, there were only a couple of major changes with this update:

  1. Penguin now runs in real time. Webmasters impacted by Penguin will no longer have to wait for the next update to see the results of their improvement efforts — now, changes will be evident much more quickly, generally not long after a page is recrawled and reindexed.
  2. Penguin 4.0 is “more granular,” meaning that it can now impact individual pages or sections of a site in addition to entire domains; previously, it would act as a site-wide penalty, impacting rankings for an entire site.

At first glance, it would seem that there isn’t a lot of room for confusion here. However, when the folks at Google started adding details and giving advice, that ended up causing a bit of confusion. So let’s look at those statements to get a better understanding of what we’re expected to do.

Disavow files

Rumor had it, based on statements by Google’s Gary Illyes, that a disavow file is no longer necessary to deal with Penguin-related ranking issues.

This is due to a change in how Penguin 4.0 deals with bad links: it now devalues the links themselves rather than demoting the site they point to.

Now, that seems pretty clear. If you read Illyes’ statements in the article linked above, there are a few takeaways:

  1. Spam is devalued, rather than sites being demoted.
  2. There’s less need to use a disavow file for Penguin-related ranking penalties.
  3. Using the disavow file for Penguin-related issues can help Google help you, but it is more specifically useful for sites under manual review. 

So now we have a “yes, you should use it for Penguin” and a “no, you don’t need it for Penguin.” But wait, it gets more fun. On October 4, 2016, Google Webmaster Trends Analyst John Mueller stated the following in an Office Hours Hangout:

[I]f these are problematic links that are affected [by Penguin], and you use a disavow file, then that’s a good way for us to pick that up and recognize, “Okay, this link is something you don’t want to have associated with this site.” So when we recrawl that page that is linking to you, we can drop that link from our link graph.

With regards to devaluing these low quality links instead of punishing you, in general we try to figure out what kind of spammy tactics are happening here and we say, “Well, we’ll just try to ignore this part with regards to your website.”

So… clear as mud?

The disavow takeaway

The takeaway here is that the more things change, the more they stay the same. There is no change. If you’ve used unethical link-building strategies in the past and are considering submitting a disavow file — good, you should do that. If you haven’t used such strategies, then you shouldn’t need to; if Google finds bad links to your site, they’ll simply devalue them.

 

Of course, it was once also claimed that negative SEO doesn’t work, meaning a disavow wasn’t necessary for bad links you didn’t build. This was obviously not the case, and negative SEO did work (and may well still), so you should be continuing to monitor your links for bad ones and adding them to your disavow file periodically. After all, if bad links couldn’t negatively impact your site, there would be no need for a disavow at all.

And so, the more things change, the more they stay the same. Keep doing what you’ve been doing.

The source site?

In a recent podcast over on Marketing Land, Gary Illyes explains that under Penguin, it’s not the target site of the link that matters, it’s the source. This doesn’t just include links themselves, but other signals a page sends to indicate that it’s likely spam.

So, what we were just told is that the value of a link comes from the site/page it’s on and not from where it’s pointing. In other words, when you’re judging your inbound links, be sure to look at the source page and domain of those links.

The more things change, the more they stay the same.

Your links are labeled

In the same podcast on Penguin, it came to light that Google places links on a page into categories, including things like:

  • footer;
  • Penguin-impacted; and
  • disavowed.

It was suggested that there are other categories, but they weren’t named. So, what does this really mean?

It means what we all pretty well knew was going on for about a decade. We now have a term to use to describe it (“labels”) rather than simply understanding that a page is divided into sections, and the sections that are the most visible and more likely to be engaged with hold the highest value (with regard to both content and links).

Additionally, we already knew that links that were disavowed were flagged as such.
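Google has not published how these labels are stored internally, but purely as an illustration of the idea, a link carrying several labels at once might be modelled something like this (the label names echo the podcast; the “counts toward ranking” logic is invented for the example and is not Google’s implementation):

```python
from dataclasses import dataclass, field

# Illustrative only: a way to picture a link carrying several labels at once.
# The label names echo the podcast; the weighting decision below is invented.


@dataclass
class Link:
    source_url: str
    target_url: str
    labels: set = field(default_factory=set)

    def counts_toward_ranking(self):
        # A disavowed or Penguin-affected link is ignored rather than punished.
        return not ({"disavow", "penguin-rt"} & self.labels)


link = Link(
    source_url="http://referring-site.example/footer",
    target_url="http://yoursite.example/",
    labels={"footer", "penguin-rt", "disavow"},
)
print(link.counts_toward_ranking())  # False: the link is simply not counted
```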

 

There is one new side

The only really new piece of information here is that either Google has replaced a previous link weighting system (which was based on something like visibility) with a labeling system, or they have added to it. Essentially, it appears that where previously, content as a whole may have been categorized and links included in that categorization, now a link is given one or possibly multiple labels.

So, this is a new system and a new piece of information, which brings us to…

The link labeling takeaway

Knowing whether the link is being labeled or simply judged by its position on the page — and whether it’s been disavowed or not — isn’t particularly actionable. It’s academically interesting, to be sure, and I’m certain it took Google engineers many days or months to get it figured out (maybe that’s what they’ve been working on since last October). But from an SEO’s perspective, we have to ask ourselves, ”What really changed?”

Nothing. You will still be working to develop highly visible links, placed contextually where possible and on related sites. If this strays far from what you were doing, you likely weren’t doing your link building correctly to begin with. I repeat: the more things change, the more they stay the same.

But not Penguin penalties, right? Or… ?

It turns out that Penguin penalties are treated very differently in 4.0 from the way they were previously. In a discussion with Google’s Gary Illyes, he revealed that there is no sandbox for a site penalized by Penguin. 

So essentially, if you get hit with a Penguin penalty, there is no trust delay in recovery — once you fix the problem and your site is recrawled, you’d bounce back.

That said, there’s something ominous about Illyes’ final tweet above. So Penguin does not require or impose a sandbox or trust-based delay… but that’s not to say there aren’t other functions in Google’s algorithm that do.

 

So, what are we to conclude? Avoid penalties — and while not Penguin-related, there may or may not be delays in recovering from one. Sound familiar? That’s because (surely you can say it with me by now)…

The more things change, the more they stay the same

While this was a major update with a couple of significant changes, what it ultimately means is that our SEO process hasn’t really changed at all. Our links will get picked up faster (both the good and the bad), and penalties will likely be doled out and rolled back much more reliably; however, the links we need to build and how they’re being weighted remain pretty much the same (if not identical). The use of the disavow file is unchanged, and you should still (in my opinion) watch for negative SEO.

The biggest variable here comes in their statement that Penguin is not impacted by machine learning.

I have no doubt that this is currently true. However, now that Penguin is part of the core algorithm — and as machine learning takes on a greater role in how search engines rank pages — it’s likely that it will eventually begin to control some aspects of what are traditionally Penguin algorithms.

But when that day comes, the machines will be looking for relevancy and maximized user experience and link quality signals. So the more you continue to stay focused on what you should be doing… the more it’ll stay the same.

Source : searchengineland


Google's manual action team has access to look at a page's link labels and may use that to dig deeper into the site's activities.

In the “A conversation with Google’s Gary Illyes (part 1)” podcast at Marketing Land, our sister site, we learned that Google adds labels to your links. These labels can add classifications or attributes to a link, including whether the link is a footer link, whether it’s impacted by the latest Penguin update, whether it’s disavowed, or other categorizations. A link can have multiple labels, which together make up the value and meaning of that link and ultimately help Google determine how to rank the related documents on the web.

Google’s manual actions team may look at these labels to determine whether they should dig deeper into a site’s links and apply a manual action penalty to the site. Illyes added, though, that he doesn’t do much work with that specific team, so he isn’t too aware of their daily processes.

In addition, Illyes listed three types of labels one might find on a link: “Penguin real time,” which would be the new Penguin algorithm; “footer” links, which would help Google determine how important the link is — i.e., it being in the footer versus the main content; and “disavow” — so if a link is in the disavow file, it will also be labeled as such for that site. There are many more link labels, but he only shared these three.

 

Barry Schwartz: Is there some sort of flag that happens automatically, so that the manual action team is notified that, hey, there is a devalue going on here by Penguin?

Gary Illyes: I don’t work much with the manual actions team, but to the best of my knowledge, there is no flag like that; they can look at the labels on the links a site gets. Basically, we have tons of link labels; for example, it’s a footer link, basically, that has a lot lower value than an in-content link. Then another label would be a Penguin real-time label. If they see that most of the links are Penguin real-time labeled, then they might actually take a deeper look and see what the content owner is trying to do.

Danny Sullivan: You were talking about how Penguin is looking and identifying things and to think of it as part of like a link label. So, like Google is looking at things like, OK, I know this is a footer link or this is a content link or this is a Penguin link. So tell us more about that.

Gary Illyes: So, if you think about it, there are tons of different kinds of links on the internet. There are footer links, for example. There are Penguinized links, and all of these kinds of links have certain labels internally attached to them, basically for our own information. And if the manual actions team is reviewing a site for whatever reason, and they see that most of links are labeled as Penguin real-time affected, then they might decide to take a much deeper look on the site and see what’s up with those links and what could be the reason those links exist — and then maybe apply a manual action on the site because of the links.

Barry Schwartz: Which is why Google still recommends you disavow links, because I guess if a manual action team sees all these labels on it, and they look at the disavow file, and they say, hey this SEO or this webmaster [is] aware of these links and they’re going ahead and trying to take no responsibility… Would these labels show up if the links are disavowed, or they probably wouldn’t?

 

Gary Illyes: So disavow is again, basically, just a label internally. It’s applied on the links and anchors. And then you can see that, as well. Basically, you could have like a link from, I don’t know, WhiteHouse.gov, and it has labels Penguin RT, footer and disavow. And then they would see that — they would know that someone or the webmaster or content owner is actively tackling those links.

You can listen to part one of the interview at Marketing Land.

Source : searchengineland


Google is often criticized for how it handles spammy links, but columnist Ian Bowden believes this criticism may be unfair. Here, he takes a look at the challenges Google might face in tackling the ongoing issue of paid links.

Prior to the recent arrival of Penguin 4.0, it had been nearly two years since Penguin was last updated. It was expected to roll out at the end of 2015, which then became early 2016. By the summer, some in the industry had given up on Google ever releasing Penguin 4.0. But why did it take so long?

I’d argue that criticism directed at Google is in many cases unjustified, as people often take too simplistic a view of the task at hand for the search engine.

Detecting and dealing with paid links is a lot harder than many people think, and there are likely good reasons why Google took longer than hoped to release the next iteration of Penguin.

Here are some of the challenges Google may have faced in pushing out the most recent Penguin update:

1. It has to be effective at detecting paid links

To run and deploy an effective Penguin update, Google has to have the ability to (algorithmically and at scale) determine which links violate guidelines. It’s not clear the extent to which Google is capable of this; there are plenty of case studies which show that links violating the guidelines continue to work.

However, not all paid links are created equal.

Some paid links are obviously paid for. For instance, they may have certain types of markup around them, or they may be featured within an article clearly denoted as an advertorial.

On the other hand, some links may have no telltale signs on the page that they are paid for, so determining whether or not they are paid links comes through observing patterns.

The reality is that advanced paid linking strategies will be challenging for Google to either devalue or penalize.

Penguin has historically targeted very low-quality web spam, as it is easier to distinguish and qualify; paid links a level above that remain harder to catch. Google has to have confidence in its capability before applying a filter, due to the severity of the outcome.

 

2. Google is still dependent on links for the best quality search results

Maybe, just maybe, Google is actually capable of detecting paid links but chooses not to devalue all of them.

Most people will be familiar with third-party tools that perform link analyses to assess which links are “toxic” and will potentially be harming search performance. Users know that sometimes these tools get it wrong, but generally they’re pretty good.

I think it is fair to assume that Google has a lot more resources available to do this, so in theory they should be better than third-party tools at detecting paid links.

Google has experimented with removing links from their index with negative consequences for the quality of search results. It would be interesting to see the quality of search results when they vary the spammy link threshold of Penguin.

It’s possible that even though certain links are not compliant with webmaster guidelines, they still assist Google in their number one goal of returning users the best quality search results. For the time being, they might still be of use to Google.

3. Negative SEO remains a reality

Even if Google is sure that a link has been orchestrated, it is very difficult for the search engine to also be sure whether it was placed by the webmaster or by someone else executing a negative SEO campaign.

If a penalty or visibility drop were that easy to incur from a handful of paid links, then in theory, it would be pretty straightforward to perform negative SEO on competitors. The barriers to doing this are quite low, and furthermore, the footprint is minimal.

Google has tried to negate this problem with the introduction of the disavow tool, but it is not realistic to think all webmasters will know of this, let alone use the tool correctly. This is a challenge for Google in tackling paid links.

 

4. It provides a PR backlash and unwanted attention

When rolling out large algorithm updates, it’s inevitable that there will be false positives or severe punishments for small offenses. After any rollout, there will be a number of “adjustments” as Google measures the impact of the update and attempts to tweak it.

Despite that, a large number of businesses will suffer as a result of these updates. Those who regularly join Google Webmaster Hangouts will be used to business owners, almost in tears, discussing the devastating impact of a recent update and pleading for more information.

While the vast majority of Google users will most likely never be aware of or care about the fallout of algorithm updates, these situations do provide Google with some degree of negative PR. Any noise that points toward Google yielding too much power is unwanted attention.

On a related note, sometimes penalties are just not viable for Google. When someone walks down Main Street, they expect to see certain retailers. It’s exactly the same with search results. Users going to Google expect to see the top brands. The user doesn’t really care if a brand is not appearing because of a penalty. Users will hold it as a reflection on the quality of Google rather than the brand’s non-compliance with guidelines.

To be clear, that’s not to say that Google never penalizes big brands: JCPenney, Sprint, the BBC and plenty of other large brands have all received high-profile manual penalties in the past. But Google does have to consider the impact on the user experience when choosing how to weight different types of links. If users don’t see the websites they expect in search results, the result could be switching to another search engine.

This is how Google deals with the problem

The above four points highlight some of the challenges Google faces. Few things are more important to Google than its objective of returning the most useful results to its users, so it has a massive interest in dealing with paid links.

Here are some ways Google could address the challenges it faces:

 

1. Prefer to devalue links and issue fewer penalties

Penalties act as a deterrent for violating guidelines, and they serve to improve the quality of search results by demoting results that were artificially boosted. A lot of the risk of “getting it wrong” can simply be mitigated through devaluing links algorithmically, rather than imposing manual penalties.

In the instance of a negative SEO attack, the spammy links, instead of causing a penalty for a website, could simply not be counted. In theory, this is the purpose of a disavow file. Penalties could be saved for only the most egregious offenders.

The fact that Penguin now runs in real time as part of the core ranking algorithm suggests that this is the direction they are heading in: favoring the devaluation of spammy links through “algorithmic” penalties (which websites can now recover from more quickly), and manual penalties only being applied for serious offenses.

2. Do a slow rollout combined with other updates

Slowly rolling out the Penguin 4.0 update gives Google two advantages. First, it softens the blow of the update: there is no single week in which some high-profile brands suddenly drop in visibility, drawing attention to the update.

Second, it allows Google to test the impact of the update and adjust over time. If the update is too harsh, they can adjust the parameters. Penguin 4.0 may take several weeks to roll out.

To add to the confusion and make it more difficult to understand the impact of Penguin 4.0, it is probable Google will roll out some other updates at the same time.

If you cast your memory back two years to the introduction of Panda 4.1 and Penguin 3.0, they were rolled out almost in conjunction. This made it more difficult to understand what their impacts were.

There was a lot of SERP fluctuation this September. It is possible that part of this fluctuation can be attributed to Penguin 4.0 testing, but there is no certainty because of the number of other updates occurring at the same time (such as the local update dubbed “Possum”).

 

3. Encourage a culture of fear

Even if the risk of receiving a penalty is the same now as it was five years ago, the anxiety and fear of receiving one is much greater among brands. High-profile penalties have not only served their function of punishing the offending brand, but they also have provided a great deterrent to anyone else considering such a strategy.

The transition to content marketing and SEO becoming less of a black box assisted in this, but this culture of fear has been a large driver in the reduction of paid link activity.

Final thoughts

Google is often criticized for not doing more to tackle paid links, but I think that criticism is unfair. When one considers the challenges search engines face when tackling paid links, one can be more forgiving.

Now that Google has incorporated Penguin into the core algorithm, webmasters may have an easier time recovering from ranking issues that arise from spammy or paid links, as they will not have to wait until “the next update” (sometimes years) to recover from an algorithmic devaluation.

However, the fact that Penguin now operates in real time will make it more difficult for webmasters to know whether a loss in rankings is due to spammy links or something else — so webmasters will need to be vigilant about monitoring the health of their backlink profiles.

I suspect that Google will continue to make tweaks and adjustments to Penguin after the rollout is complete, and I expect to see a continued shift from penalties to devaluing links over time.

Source: Search Engine Land


Google has rolled out its new algorithm Google Penguin 4.0. Michael Jenkins explains how these changes will affect your website and what you can do to avoid being penalised.

It’s been a two-year wait for SEO tragics – Google’s anticipated Penguin 4.0 started rolling out over the weekend, and while it’s too soon to see the full impact, here’s what you need to know.

What is Google Penguin?

In a nutshell, Penguin is the name of a Google algorithm designed to catch websites that are spamming search results. The first iteration launched in 2012, the last update to the algorithm came in 2014, and now Penguin 4.0 has landed.

Tell me more about Penguin 4.0

Penguin 4.0, dubbed the ‘real time update’, targets over-optimised sites. Over-optimisation is two-fold: the first issue is the overuse of unnaturally placed keywords; the second is the over-optimisation of link profiles. If you have too many links from external sites pointing to the same keywords on your page, it’s time to update your disavow file before you get penalised.

Moving forward, there’s one thing that’s for certain – use keywords to write for your audience, not for search engine rankings, as you will get found out quicker than ever!

How exactly will sites be affected?

The two key changes are:

  1. You will start to see considerable fluctuations in Google rankings, as rankings now update in real time whenever changes are made to a site.
  2. Penguin 4.0 is more granular and penalises specific pages on a site. In the past it was a domain-wide penalty.

Pros

  • Penguin is real-time! When a webmaster updates or optimises their website, Google will now recognise these changes quickly and rankings will change accordingly – no more lag time!
  • Penguin 4.0 penalises competitors’ sites that aren’t doing things by the book and are taking short cuts for short-term rankings. If you have been doing things well and building genuine authority in your marketplace online, you are likely to see a positive effect on rankings.

Cons

  • Penguin is real-time. I hear you – I’ve named it as a ‘pro’ but it is also a watch out. You need to ensure your site is being optimised and updated correctly, as Google will now notice errors faster than ever, and those errors can quickly alter your ranking.
  • SEO is becoming much more sophisticated over time and Google is getting faster at spotting unnatural tactics. Regularly updating your SEO strategy and keeping a constant watch on your website’s backlinks is essential to remain compliant with Penguin 4.0.

How can I make the most out of Penguin 4.0?

Marketers should always keep an eye on backlinks and perform regular checks using the Google disavow tool. The difference between good and bad backlinks largely comes down to the quality of the website they sit on. Bad backlinks will see your site penalised.

If you have noticed fluctuations in rankings there are a few steps you can take to help:

  • Clean out hazardous links
  • Review keyword density on site. Is the keyword repetition natural? (A quick way to check is sketched after this list.)
  • Create some branded links. The fastest way to do this is through citation building.
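For the keyword-density check above, a quick sketch is enough. This is purely illustrative: the file name is a placeholder, it only handles single-word keywords, and Google publishes no ‘safe’ density figure.

```python
import re


def keyword_density(text, keyword):
    """Return the keyword's share of all words on the page, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)


# page.txt is a placeholder for your page copy; "penguin" is the target keyword.
with open("page.txt") as f:
    page_text = f.read()
print(f"Density: {keyword_density(page_text, 'penguin'):.1f}%")
```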

 

Watch this space.

Penguin 4.0 has literally just landed, so we’re bound to learn more in the coming weeks as it rolls out. Keep your eye out for more insights.

Source : https://mumbrella.com.au


Dive Brief

  • Google has updated four-year-old Penguin, which penalizes sites involved in artificially boosting search rankings via poor-quality links, and made it a part of the search engine's core algorithm, the company said in a blog post.
  • The key changes, which are among the top requests from website developers, include making Penguin real-time, meaning any changes in rankings will be visible more quickly.
  • Penguin is also more granular, adjusting rankings based on spam signals rather than affecting the ranking of the entire site.

Dive Insight:

As the leading search engine, one of Google’s goals is to ensure strong user experiences. Penguin, which was first introduced in 2012 and last updated in 2014, is the company’s way of weeding out site pages filled with links to unrelated content in an attempt to boost search rankings.

While paid search is Google’s biggest source of revenue, search engine optimization, which Penguin addresses, is important for brands and marketers. With content marketing gaining steam as more consumers spend time online researching and reading about topics of interest, a strong SEO strategy is one of the ways that marketers can drive success for these programs.

 

Over the past few years, Google has been testing and developing Penguin and now feels it is ready to be part of its core algorithm. In the past, the list of sites affected by Penguin was periodically refreshed. As a result, when sites were improved with an eye toward removing bad links, website developers had to wait until the next refresh before any changes were taken into account by Google’s web crawlers.

Source : http://www.marketingdive.com/


Google has officially confirmed on the Google Webmaster blog that it has begun rolling out the Penguin 4.0 real-time algorithm. It has been just under two years since the last confirmed Penguin update, which we named Penguin 3.0 in October 2014, and this will be the last time Google confirms a Penguin update. We saw signals yesterday that Google had begun testing Penguin 4.0; Google wouldn't confirm whether those signals were related to this morning's Penguin 4.0 launch announcement, but nevertheless, we are now live with Penguin 4.0.

No Future Penguin Confirmations

Google said because this is a real-time algorithm "we're not going to comment on future refreshes." By real time, Google means that as soon as Google recrawls and reindexes your pages, those signals will be immediately used in the new Penguin algorithm.

Google did this also with Panda, when it became part of the core algorithm. Google said no more confirmations for Panda, either.

Penguin 4.0 Is Real Time & More Granular

Google again said this is now rolling out, so you may not yet see the full impact until it fully rolls out. I don't expect the roll out to take long. Google wrote:

  • Penguin is now real-time. Historically, the list of sites affected by Penguin was periodically refreshed at the same time. Once a webmaster considerably improved their site and its presence on the internet, many of Google's algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed. With this change, Penguin's data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page. It also means we're not going to comment on future refreshes.
  • Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.

The real-time part we understand, it means when Google indexes the page, it will immediately recalculate the signals around Penguin.

Penguin being more "granular" is a bit confusing. I suspect it means that Penguin can now impact sites on a page-by-page basis, as opposed to how it worked in the past, where it impacted the whole site. So really spammy pages or spammy sections of your site can be impacted by Penguin on their own now, as opposed to your whole web site. That is my guess; I am trying to get clarification on this.

 

Google Penguin Timeline:

Was My Site Impacted By Google Penguin 4.0?

If your site was hit by Penguin 3.0 and you don't see a recovery, even a small one, by now, then that probably means you are still impacted. I'd give this a couple of weeks to fully roll out and check your analytics to see if you have a nice recovery. Again, I still think specific sections and pages will be impacted, which will make it harder to know whether you are affected by this update or not.

The nice thing is that you can use the disavow file on links you think are hurting you, and you should know pretty quickly (I suspect days) whether that helped, as opposed to waiting two years for Google to refresh the algorithm. At the same time, you can be hit by Penguin much faster now.

Source : https://www.seroundtable.com


I am reluctant to write something about this because the chatter in the SEO community is all over the place. It seems like many are saying that Google is testing the new Penguin 4.0 update in the wild. Some are saying they see rankings for Penguin impacted sites jump up and down over the course of the same day.

It is possible that Google might be testing Penguin 4.0 on some users but I am not sure if that is how this algorithm release can work. I suspect it can, but again, it is hard to tell.

The Black Hat Forums thread on page 14 started to spike up again with people saying their sites were going up and down over the course of 24 hours. There is also some chatter about this in the ongoing WebmasterWorld thread.

The tools are hit or miss; it depends on when they run and whether they catch the Google Penguin test (if there is a test). For example, Mozcast has been on fire all week.


But some of the other tools are steadily high or just all over the place.

Here are some quotes from the threads:

Right now it's just dancing around bit, but in all likelihood these small patch updates are warning sings of an impending penguin 4.0
It is my opinion that Google is probably testing a filter on those weird days, and some people are winners and other are loosers. Since our company is expecting a Penguin recovery (due to recent NSEO disavowels), and since Penguin is imminent, and since we have good days when others report bad days, this behavior is most likely Penguin testing. About a year ago, Google said Penguin was coming soon. That's about the time "Zombies" started getting reported.

I am seeing many tweets from folks and I have tons and tons of emails from people asking me what is up.

Again, it is too early to tell but it wouldn't surprise me if Google is testing a Penguin refresh.

Forum discussion at Black Hat Forums and WebmasterWorld.

Update: As the day goes on, it seems more and more like Google is indeed testing the new Penguin update. No confirmation from Google on this.

Update #2: Friday at 8am ET, Google confirmed the roll out of Google Penguin 4.0, the real time version.

 

 

Source : https://www.seroundtable.com


 

This week in search, we saw even more signals of a massive Google update, though they do not seem to be related to Penguin. Google does, however, have a date in mind for launching Penguin 4.0. Google seems to have dropped how often it shows the image search box in the search results. Google will be updating its JavaScript recommendations in the upcoming weeks. Google now shows the reviewer's name next to review snippets. Google added new schema and structured markup for courses. Google is showing image thumbnails in the mobile search results. Google is testing a whiter desktop search look. Google added helpful buttons to local reviews. Google sometimes hides the full address in the maps results. Google may show a local map pack at the bottom of the search results. The Google AdWords keyword planner tool is changing your keywords. Google had another big bug with the keyword planner tool this week; they fixed it days later. Google AdWords added a new way to access multiple accounts. Google AdWords extended the expanded text ads deadline. Google is dropping support for the campaign experiments feature in AdWords. That was this week in search at the Search Engine Roundtable.


 

 

Source : https://www.seroundtable.com/video-09-16-2016-22709.html

 


Google has multiple named parts of the algorithm that influence search rankings. Google Panda is part of the algo that is specific to the quality of content, Penguin is specific to the quality of links, and Hummingbird is Google’s part of the algo for handling conversational search queries accurately.

Google Panda

Google Panda takes the quality of a site’s content into account when ranking sites in the search results. Sites with lower-quality content are likely to find themselves negatively impacted by Panda. As a result, higher-quality content surfaces higher in the search results: quality content is often rewarded with higher rankings, while low-quality content drops.


When Panda originally launched, many saw it as a way for Google to target content farms specifically, which were becoming a major problem in the search results: their extremely low-quality content tended to rank through sheer volume. These sites were publishing a fantastic amount of low-quality content very quickly, on topics about which they had very little knowledge and did little research, and it was very obvious to a searcher who landed on one of those pages.

Google has now evolved Panda to be part of the core algorithm. Previously, we had a known Panda update date, making it easier to identify when a site was hit or had recovered from Panda. Now it is part of a slow rolling update, lasting months per cycle. As a result, it is hard to know whether a site is negatively impacted by Panda or not, other than doing a content audit and identifying factors that sites hit by Panda tend to have.

User Generated Content

It is important to note that Panda does not target user-generated content specifically, something that many webmasters are surprised to learn. But while Panda can target user-generated content, it tends to impact those sites that are producing very low-quality content – such as spammy guest posts or forums filled with spam.

Do not remove your user-generated content, whether it is forums, blog comments or article contributions, simply because you heard it is “bad” or marketed as a “Panda proof” solution. Look at it from a quality perspective instead. There are many highly ranking sites with user-generated content, such as Stack Overflow, and many sites would lose significant traffic and rankings simply because they removed that type of content. Even comments made on a blog post can cause it to rank and even get a featured snippet.

Word Count

Word count is another aspect of Panda that is often misunderstood by SEOs. Many sites make the mistake of refusing to publish any content unless it is above a certain word count, with 250 words and 350 words often cited as minimums. Instead, Google recommends you think about how many words the content needs to be successful for the user.

For example, there are many pages out there with very little main content, yet Google thinks the page is quality enough that it has earned the featured snippet for the query. In one case, the main content was a mere 63 words, and many would have been hard pressed to write about the topic in a non-spammy way that was 350+ words in length. So you only need enough words to answer the query.

Content Matches the Query

Ensuring your content matches the query is also important. If you see Google is sending traffic to your page for specific queries, ensure that your page is answering the question searchers are looking for when they land there. If it is not, it is often as simple as adding an extra paragraph or two to ensure that this is happening.

As a bonus, these are the types of pages – ones that answer a question or implied question – that Google is not only looking to rank well but is also awarding the featured snippet for the query to.
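One rough way to spot pages that may not be answering their queries is to compare Search Console query data against the page copy. The sketch below assumes a performance export with hypothetical columns named page and query, plus a dictionary of page text you have fetched separately.

```python
import csv
import re


def word_set(text):
    return set(re.findall(r"[a-z0-9']+", text.lower()))


def queries_not_covered(export_path, page_text_by_url):
    """Flag (page, query) pairs where no word from the query appears on the page."""
    flagged = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            page, query = row["page"], row["query"]
            body = page_text_by_url.get(page, "")
            if body and not (word_set(query) & word_set(body)):
                flagged.append((page, query))
    return flagged


# Placeholders: the export file and a dict of page text you fetched yourself.
pages = {"http://yoursite.example/tvs": "Our guide to choosing a big screen tv ..."}
for page, query in queries_not_covered("search_console_queries.csv", pages):
    print(f"{page} gets impressions for '{query}' but never mentions it")
```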

 

 

Technical SEO

Technical SEO also does not play any role in Panda. Panda looks just at the content, not things like whether you are using H1 tags or how quickly your page loads for users. That said, technical SEO can be a very important part of SEO and ranking in general, so it should not be ignored. But it does not have any direct impact on Panda specifically.

Determining Quality

If you are struggling to determine whether a particular piece of content is considered quality or not, there is one surefire way to confirm it. Look in Search Analytics or your site’s analytics program, such as Google Analytics, and look at the individual page. If Google is ranking a page and sending it traffic, then clearly it views the page as quality enough to show it high in the search results, high enough that people are landing there from Google’s search results.

However, if a page is not getting traffic from Google, it does not automatically mean it is bad, but the content is worth looking at closer. Is it simply newer and has not received enough ranking signals to rank yet? Do you see areas of improvement you can make by adding a paragraph or two, or changing the title to match the content better? Or is it truly a garbage piece of content that could be dragging the site down the Panda hole?
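If you want to pull that list of quiet pages programmatically, a minimal sketch follows. It assumes an analytics export with hypothetical columns named page and organic_sessions; your tool’s real column names will differ.

```python
import csv


def quiet_pages(path, min_sessions=1):
    """List pages whose organic sessions fall below a threshold."""
    pages = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["organic_sessions"]) < min_sessions:
                pages.append(row["page"])
    return pages


# organic_landing_pages.csv is a placeholder export from your analytics tool.
for page in quiet_pages("organic_landing_pages.csv"):
    print("Worth a closer look:", page)
```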

Also, do not forget that there is traffic outside of Google. You may question a page because Google is not sending it traffic, but perhaps it does amazingly well in Bing, Baidu, or one of the other search engines instead. Diversity in traffic is always a good thing, and if you have pages that Google might not be sending traffic to, but is getting traffic from other search engines or other sites or through social media shares, then removing that content would be the wrong decision to make.

Panda Prevention

How to prevent Google Panda from negatively impacting your site is pretty simple. Create high-quality, unique content that answers the question searchers are asking.

Reading content out loud is a great way to tell whether it is high quality or not. When content is read aloud, things like overuse of repetitive keywords, grammatical errors, and other signals that the content is less than quality suddenly stand out. Read it out yourself and edit as you go, or ask someone else to read it so you can flag what should be changed.

Google Penguin


The second major Google algorithm is Penguin. Penguin deals solely with link quality and nothing else. Sites that have purchased links or have acquired low-quality links through places such as low-quality directories, blog spam, or link badges and infographics could find their sites no longer ranking for search terms.

Who Should Worry about Penguin?

Most sites do not need to worry about Penguin unless they have done some sketchy link building in the past or have hired an SEO who might have engaged in those tactics. Even if the site owner was not aware of what an SEO was doing, the owner is still ultimately responsible for those links. That is why site owners should always research an SEO or SEO agency before hiring.

If you have done link building in the past while tactics were accepted, but which are now against Google’s webmaster guidelines, you could be impacted by Penguin. For example, guest blogging was fine years ago, but is not a great way to build links now unless you are choosing your sites well. Likewise, asking site visitors or members to post badges that linked to your site was also fine previously, but will now definitely result in Penguin or a link manual action.

Algorithmic Penguin and Link Manual Actions

Penguin is strictly algorithmic in nature. It cannot be lifted by Google manually, regardless of the reason why those links might be pointing to a website.

Confusing the issue slightly is that there is a separate manual action for low-quality links and that one can be lifted by Google once the links have been cleaned up. This is done with a reconsideration request in Google Search Console. And sites can be impacted by both a linking manual action and Penguin at the same time.

Incoming Links Only

Penguin only deals with a site’s incoming links. Google only looks at the links pointing to the site in question and does not look at the outgoing links at all from that site. It is important to note that there is also a Google manual action related directly to a site’s outgoing links (which is different from the regular linking manual action), so the pages and sites you link to could result in a manual action and the deindexing of a site until those links are cleaned up.

Finding Your Backlinks

If you suspect your site has been negatively impacted by Penguin, you need to do a link audit and remove or disavow the low quality or spammy links. Google Search Console includes a list of backlinks for site owners, but be aware that it also includes links that are already nofollowed. If the link is nofollowed, it will not have any impact on your site, but keep in mind, the site could remove that nofollow in the future without warning.
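If you want to spot-check whether a particular backlink is nofollowed, a minimal sketch using only the Python standard library follows. The URLs are placeholders, and a parser like this will not see links that are injected with JavaScript.

```python
from html.parser import HTMLParser
from urllib.request import urlopen


class BacklinkCheck(HTMLParser):
    """Collect (href, rel) pairs for anchors pointing at a target domain."""

    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.target_domain in href:
            self.links.append((href, attrs.get("rel") or ""))


# Placeholders: the page that links to you, and your own domain.
html = urlopen("http://referring-site.example/post").read().decode("utf-8", "ignore")
checker = BacklinkCheck("yoursite.example")
checker.feed(html)
for href, rel in checker.links:
    print(href, "is nofollowed" if "nofollow" in rel else "is followed")
```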

There are also many third-party tools that will show links to your site, but because some websites block those third-party bots from crawling their site, it will not be able to show you every link pointing at your site. And while some of the sites blocking these bots are high-quality well-known sites not wanting to waste the bandwidth on those bots, it is also being used by some spammy sites to hide their low-quality links from being reported.

Assessing Link Quality

When it comes to assessing the links, this is where many have trouble. Do not assume that because a link comes from an .edu site that it is high-quality. There are plenty of students who sell links from their personal websites on those .edu domains which are extremely spammy and should be disavowed. Likewise, there are plenty of hacked sites within .edu domains that have low-quality links.

 

 

Do not make judgments strictly based on the type of domain. While you can’t make automatic assumptions about .edu domains, the same applies to all TLDs and ccTLDs: Google has confirmed that merely being on a specific TLD does not help or hurt search rankings. You need to make individual assessments. There is a long-running joke about how there has never been a quality page on a .info domain because so many spammers were using them, but in fact, there are some great quality links coming from that TLD, which shows why individual assessment of links is so important.

Beware of Links from Presumed High-Quality Sites


Do not look at the list of links and automatically consider links from specific websites as being a great quality link, unless you know that very specific link is high quality. Just because you have a link from a major website such as Huffington Post or the BBC does not make that an automatic high-quality link in the eyes of Google – if anything, you should question it more.

Many of those sites are also selling links, albeit some disguised as advertising or placed by a rogue contributor selling links within their articles. That links from high-quality sites can in fact be low quality has been confirmed by many SEOs who have received link manual actions that include links from these sites among Google’s examples. And yes, such links could well be contributing to a Penguin issue.

As advertorial content increases, we are going to see more and more links like these get flagged as low-quality. Always investigate links, especially if you are considering not removing any of them simply based on the site the link is from.

Promotional Links

As with advertorials, you need to think about any links that sites may have pointed to you that could be considered promotional links. Paid links do not always mean money is exchanged for the links.

Examples of promotional links that are technically paid links in Google’s eyes are any links given in exchange for a free product for review or a discount on products. While these types of links were fine years ago, they now need to be nofollowed. You will still get value from the link, but it will come through brand awareness and traffic rather than a rankings boost. You may have links out there from a promotional campaign done years ago that are now negatively impacting a site.

For all these reasons, it is vitally important to individually assess every link. You want to remove the poor-quality links because they are hurting you through Penguin or could cause a future manual action. But you do not want to remove the good links, because those are the links that are helping your rankings in the search results.

Promotional links that are not nofollowed can also trigger the manual action for outgoing links on the site that placed those links.

Editor Note: Removing links and submitting a disavow request is also covered in more detail in the ‘What to Do When Things Go Wrong‘ section of our SEO Guide.

Link Removals

Once you have gone through your backlinks and determined that there are some that should be removed or disavowed, you will need to get these links removed. You should first approach site owners and ask them to remove the links pointing to your site. If removals are unsuccessful, add those URLs to a disavow file, one you will submit to Google.

There are tools that will automate the link removal requests and agencies that will handle the requests as well, but do not feel it is necessary to do this. Many webmasters find contact forms or emails and will do it themselves.

Some site owners will demand a fee to remove a link from their site, but Google recommends not paying for link removals. Just include those links in your disavow file instead and move on to the next link removal. Some site owners are using link removals to generate revenue, so the practice is becoming more common.

Creating and Submitting a Disavow File

The next step in cleaning up Penguin issues is to submit a disavow file. The disavow file is a file you submit to Google that tells them to ignore all the links included in the file so that they will not have any impact on your site. The result is that the negative links will no longer cause negative ranking issues with your site, such as with Penguin, but it does also mean that if you erroneously included high-quality links in your disavow file, those links will no longer help your ranking. This is another reason why it is so crucial to check your backlinks well before deciding to remove them.

If you have previously submitted a disavow file to Google, they will replace that file with your new one, not add to it. So it is important to make sure that if you have previously disavowed links, you still include those links in your new disavow file. You can always download a copy of the current disavow file in Google Search Console.
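Because the new upload replaces the old file outright, a simple safeguard is to merge the previous file with your new entries before uploading. A minimal sketch, with placeholder file names:

```python
def merge_disavow_files(existing_path, new_path, output_path):
    """Combine the previously uploaded disavow file with new entries, no duplicates."""
    seen, entries = set(), []
    for path in (existing_path, new_path):
        with open(path) as f:
            for line in f:
                entry = line.strip()
                if not entry or entry.startswith("#"):
                    continue  # drop blanks and old comments
                if entry.lower() not in seen:
                    seen.add(entry.lower())
                    entries.append(entry)
    with open(output_path, "w") as out:
        out.write("\n".join(sorted(entries)) + "\n")


# File names are placeholders: the file currently in Search Console,
# your newly gathered links, and the combined file you will upload.
merge_disavow_files("disavow_current.txt", "disavow_new.txt", "disavow_upload.txt")
```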

 

 

Disavowing Individual Links Versus Domains

It is recommended that you choose to disavow links on a domain level instead of disavowing the individual links. There will be some cases where you will want to disavow individually specific links, such as on a major site that has a mix of quality versus paid links. But for the majority of links, you can do a domain based disavow. Then, Google only needs to crawl one page on that site for that link to be discounted on your site.

Doing domain-based disavows also means that you do not have to worry about whether those links are indexed as www or non-www, as the domain-based disavow takes this into account.
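
As a sketch of the format (the domains and URL below are placeholders), a domain-level entry uses the domain: prefix, while an individual link is listed as a full URL on its own line:

    domain:spammy-directory.example.com
    domain:article-network.example.net
    http://www.major-site.example.org/paid-review-page.html

The first two lines discount every link from those domains; the last line discounts only that single URL.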

What to Include in a Disavow File

You do not need to include any notes in your disavow file, unless they are strictly for your reference. It is fine just to include the links and nothing else. Google does not read any of the notations you have made in your disavow file, as they process it automatically without a human ever reading it. Some find it useful to add internal notations, such as the date a group of URLs was added to the disavow file or comments about their attempts to reach the webmaster about getting a link removed.
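
For reference-only notes, lines beginning with # are treated as comments and ignored by Google. A minimal sketch, with placeholder domains and dates:

    # Added 2016-10-01 after manual backlink audit
    # Owner did not respond to two removal requests
    domain:spammy-directory.example.com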

Once you have uploaded your disavow file, Google will send you a confirmation. But while Google will process it immediately, it will not immediately discount those links. So you will not instantly recover from submitting the disavow alone. Google still needs to go out and crawl those individual links you included in the disavow file, but unfortunately the disavow file itself will not prompt Google to crawl those pages specifically.

It can take six or more months for all those individual links to be crawled and disavowed. And no, there is no way to determine which links have been discounted and which ones have not been, as Google will still include both in your linking report in Google Search Console.

Speeding Up the Disavow Process

There are ways you can speed up the disavow process. The first is to use domain-based disavows instead of individual links. The second is not to waste time including lengthy notations for Google’s benefit, so that you can submit your disavow faster. Because reconsideration requests require you to submit more details, some people misunderstand and believe the disavow file needs more detail too.

Lastly, if you have undergone any changes to your domain, such as switching to https or moving to a new domain, you need to remember to upload your disavow file to the new domain property in Google Search Console. This is one step that many forget, and they can be impacted by Penguin or a linking manual action again, even though they cleaned things up previously.

Recovery from Penguin

When you recover from Penguin, do not expect your rankings to go back to where they used to be before Penguin, nor for the return to be immediate. Far too many site owners are under the impression that they will immediately begin ranking at the top for their top search queries once Penguin is lifted.

First, some of the links that you disavowed were likely contributing to an artificially high ranking, so you cannot expect those rankings to be as high as they were before. Second, because many site owners have trouble assessing the quality of the links, some high-quality links inevitably get disavowed in the process, links that were contributing to the higher rankings.

Add to the mix the fact that Google is constantly changing its ranking algorithm, so factors that benefited you previously might not have as big an impact now, and vice versa.

Compensated Links via Badges and More

Also be aware of any link building campaigns you are running, or legacy ones that could come back to impact your site. This includes things like badges you have given to other site owners to place on their sites, or requiring someone to link to your site in order to get a directory listing or access to something. In simple terms, if the link was placed in exchange for anything, it either needs to be nofollowed or disavowed.
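
As a sketch (the badge image, URL, and alt text are placeholders), the embed code you hand out for a badge would carry rel="nofollow" on the link back to your site:

    <a href="https://example.com" rel="nofollow">
      <img src="https://example.com/badges/certified-partner.png" alt="Example Certified Partner badge">
    </a>

If older badge code without the nofollow is still live on other sites, those are the legacy links to clean up or disavow.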

When it comes to the disavow files that people are using to clean up poor quality links, there is a concern that a site could be hurt by competitors placing its URLs into a disavow file uploaded to Google. But Google has confirmed that they do not use the URLs contained within a disavow file for ranking, so even if your site appears in thousands of disavow files, it will not hurt you. That said, if your site legitimately appears in thousands of disavow files, then it probably has a quality issue you should fix.

Negative SEO

There is also the negative SEO aspect of linking, where some site owners worry that a competitor could buy spammy links and point them to their site. And many use negative SEO as an excuse when their site gets caught by Google for low-quality links.

If you are worried about this, you can proactively disavow the links as you notice them. But Google has said they are pretty good about recognizing this when it happens, so it is not something most website owners need to worry about.

Real Time Penguin

Google is expected to release a new version of Penguin soon, which will have one very notable change. Instead of site owners needing to wait for a Penguin update or refresh, the new Penguin will run in real time. This is a huge change for those dealing with the impact of spammy links and the long waits many have had to endure after cleaning up.

Hummingbird

Google Hummingbird is part of the main Google search algorithm and was the first major change to that algorithm since 2001. What is different about Hummingbird is that it is not specifically a spam-targeting algorithm, but instead an algorithm designed to ensure Google serves the best results for specific queries. Hummingbird is more about understanding search queries better, particularly with the rise of conversational search.

Hummingbird is believed to positively impact sites that provide high-quality content that reads well to the searcher and answers the question the searcher is asking, whether that question is explicit or implied.

Hummingbird also impacts long-tail search queries, similar to how RankBrain helps those types of queries. Google wants to ensure that it can provide high-quality results for longer queries. For example, instead of sending a specific question about a company to the company’s homepage, Google will try to serve an internal page on the site about that specific topic or issue instead.

You cannot optimize for Hummingbird directly, other than by optimizing for the rise of conversational search. Conversational search covers longer queries, such as those used in voice search, and the types of queries searchers tend to make on mobile. Optimizing for conversational search is easier than it sounds: make sure your content is highly readable and can answer those longer tail queries as well as shorter ones.

Like RankBrain, Hummingbird had been live for a period before it was announced, and SEOs did not notice anything particularly different in the rankings. It is not known how often Hummingbird is updated or changed by Google.

Source: https://www.searchenginejournal.com/seo-guide-panda-penguin-hummingbird/169167/
