
Updates to operating systems (OS), iOS apps and Android apps are very common. Consumers expect them, anticipate them and often demand them. This is great, because in a world where needs and wants are constantly evolving, the ability to improve and make changes on the fly is essential. With more products living online or connected to the internet, updates are a given. But there is one company whose updates can put businesses on edge and greatly impact websites: Google. Google algorithm changes can seemingly come out of nowhere, and recovery from a penalty can take a while.

A Google algorithm change could be happening right now and, because of its subtlety, go unnoticed by the average web surfer. However, when big updates happen, like Google Panda, Google Hummingbird or Google Penguin, businesses should check whether their website has been affected. Moz details the larger algorithm updates in depth if you would like to dig deeper into Google algorithm changes.

Google Penguin 4.0 and Impact

Recently, Google released a new update to the Penguin algorithm. Google Penguin 4.0 has the same core function as the original Penguin algorithm, which was rolled out to address the growing practice of generating links unnaturally. These links often came from spammy sites that Google is actively working to eliminate.

Google values user experience above all and will penalize websites that are in violation. Google penalties are legendary and tend to have a lasting impact. They result in a loss of keyword rankings on search engine results pages. This is a very big deal, as an important keyword your company once ranked on, one that drove traffic and/or sales, could be lost. By ‘lost’, we mean that Google could greatly hinder your chances of ranking on a keyword like “big screen tv” or “consulting service” if you are penalized.

SEM Rush explains that, historically, once a site was penalized by Google, it could take months or even years to recover. Google Penguin had a lasting impact, as some websites waited years for a Google Penguin recovery after making site changes. This was because larger Google algorithm changes and updates were rather infrequent, and websites could only recover after a new update.

This delay could mean missing out on valuable prospects while your site falls in the rankings. Google Penguin 4.0 focuses on maintaining the integrity of websites while also increasing the speed at which a website can recover from any potential penalty.

Penalties From Google

Previously, when a site received a Google penalty, it impacted the entire website. Google Penguin 4.0 makes penalties much more targeted: as opposed to the entire site being impacted, it could be just a specific page. It is therefore more important than ever to regularly check subdomains and individual web pages to ensure that no drastic changes have occurred. Missing a penalty isn’t uncommon, as you aren’t notified when you are impacted – you just have to look for the signs.

Signs vary, but the common ones are significant, sudden drops in organic keywords and/or keywords dropping in position. If you notice either scenario, chances are your website or web page has been hit by Google Penguin. These checks can be performed using tools like Google Analytics and SEM Rush. Once you identify bad links, you want to remove or disavow them. If you can’t remove the links, Google explains the disavow process here. Once these changes are made, your Google Penguin recovery will be much quicker.
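For reference, a disavow file is just a plain-text list uploaded through Google’s Disavow Tool: one entry per line, with lines beginning with # treated as comments and a domain: prefix covering an entire site rather than a single URL. Here is a minimal sketch (the domains are placeholders for illustration, not real offenders):

    # spammy forum profiles pointing at our site
    domain:spammy-links-example.com
    domain:cheap-directory-example.net
    # one bad page on an otherwise fine site
    http://blog-example.org/posts/buy-links-here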


The biggest benefit of this Google algorithm change is the speed at which a penalty can be resolved – or applied. Previously, when Google Penguin penalized a website for spammy links, it could take a while before Google ran the algorithm again to see whether the problem had been corrected. Now, due to the real-time nature of the algorithm, a Google Penguin recovery can occur fairly quickly.

In addition, the opposite is true: a violation can be discovered and a web page impacted just as quickly.

The easiest way to protect yourself is to perform regular site audits to make sure that every backlink you receive is credible. While Google rewards quality links, a bad link can undo all of the hard work you put into building your online presence.
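To make the idea of a backlink audit concrete, here is a minimal Python sketch that flags links worth a manual review from an exported CSV. The column names and the spam heuristics are assumptions for illustration only, not Google’s actual signals, and any serious audit tool will be far more sophisticated:

    import csv

    # Naive hints that a linking page deserves a manual look. These are
    # illustrative assumptions, not Google's actual spam signals.
    SUSPICIOUS_HINTS = ("/member.php", "forum/profile", "?ref=", "casino", "payday")

    def flag_backlinks(path):
        """Read a backlink export (assumed columns: source_url, anchor_text)
        and return the rows whose source URL looks suspicious."""
        flagged = []
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                if any(hint in row["source_url"].lower() for hint in SUSPICIOUS_HINTS):
                    flagged.append(row)
        return flagged

    if __name__ == "__main__":
        for link in flag_backlinks("backlinks.csv"):  # hypothetical export file
            print(link["source_url"], "->", link["anchor_text"])

Anything the script flags still needs a human decision: remove the link if you can, and disavow it if you can’t.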

Things to Keep in Mind

In conclusion, here are a few points to remember when thinking about Google Penguin.

  • The goal is to ensure a good user experience, so you should always write for your web visitors. If you are doing this and making sure your links are credible, you greatly minimize the chance of getting hit by Google Penguin 4.0.
  • Penguin runs in real time, meaning it is vitally important to keep track of your backlinks and monitor your web pages. Previously the whole site would be affected by Penguin, but that is no longer the case. While it’s great to make changes and see Google quickly stop penalizing your web page, remember the inverse: if spammy sites begin linking to your site, you could be penalized.
  • Use tools such as Google’s Disavow Tool or SEM Rush’s Backlink Audit to eliminate any spammy backlinks. You can also consider speaking to a professional about performing a site audit. The important thing is to regularly check site traffic for any strange fluctuations.

Author:  Ray Bennett

Source: http://www.business2community.com/

What should SEOs do to make the best of the new Penguin update? Perhaps not much. Columnist Dave Davies notes that while Penguin 4.0 was indeed significant, things ultimately haven't changed that much.

For the last four-plus years, we’ve heard a lot about Penguin. When it was first announced in April 2012, we were told that this algorithm update, designed to combat web spam, would impact three percent of queries.

More recently, we’ve witnessed frustration on the part of penalized website owners at having to wait over a year for an update, after Google specifically noted one was coming “soon” in October of 2015.

In all the years of discussion around Penguin, however, I don’t believe any update has been more fraught with confusing statements and misinformation than Penguin 4.0, the most recent update. The biggest culprit here is Google itself, which has not been consistent in its messaging.

And this is the subject of this article: the peeling away of some of the recent misstated or just misunderstood aspects of this update, and more importantly, what it means for website owners and their SEOs.

So, let’s begin.

What is Penguin?

Note: We’re going to keep this section short and sweet — if you want something more in-depth, you should begin by reading Danny Sullivan’s article on the initial release of Penguin, “Google Launches ‘Penguin Update’ Targeting Webspam In Search Results.” You can also browse Search Engine Land’s Penguin Update section for all the articles written here on the topic.

The Penguin algorithm update was first announced on April 24, 2012, and the official explanation was that the algorithm targeted web spam in general. However, since the biggest losses were incurred by those engaged in manipulative link schemes, the algorithm itself was viewed as being designed to punish sites with bad link profiles.

I’ll leave it at that, with the assumption that I shouldn’t bore you with additional details on what the algorithm was designed to do. Let’s move now to the confusion.

Where’s the confusion?

Until Penguin 4.0 rolled out on September 23, 2016, there really wasn’t a lot of confusion around the algorithm. The entire SEO community — and even many outside it — knew that the Penguin update demoted sites with bad links, and it wasn’t until it was next updated that an affected site could expect some semblance of recovery.

The path was clear: a site would get hit with a penalty, the website owner would send out requests to have offending links removed, those that couldn’t be removed would be added to a disavow list and submitted, and then one would simply wait.


However, things got more complicated with this most recent update — not because the algorithm itself got any more difficult to understand, but rather because the folks at Google did.

In essence, there were only a couple of major changes with this update:

  1. Penguin now runs in real time. Webmasters impacted by Penguin will no longer have to wait for the next update to see the results of their improvement efforts — now, changes will be evident much more quickly, generally not long after a page is recrawled and reindexed.
  2. Penguin 4.0 is “more granular,” meaning that it can now impact individual pages or sections of a site in addition to entire domains; previously, it would act as a site-wide penalty, impacting rankings for an entire site.

At first glance, it would seem that there isn’t a lot of room for confusion here. However, when the folks at Google started adding details and giving advice, that ended up causing a bit of confusion. So let’s look at those details to get a better understanding of what we’re expected to do.

Disavow files

Rumor had it, based on statements by Google’s Gary Illyes, that a disavow file is no longer necessary to deal with Penguin-related ranking issues.

This is due to a change in how Penguin 4.0 deals with bad links: it now devalues the links themselves rather than demoting the site they’re linking to.

Now, that seems pretty clear. If you read Illyes’ statements in the article linked above, there are a few takeaways:

  1. Spam is devalued, rather than sites being demoted.
  2. There’s less need to use a disavow file for Penguin-related ranking penalties.
  3. Using the disavow file for Penguin-related issues can help Google help you, but it is more specifically useful for sites under manual review. 

So now we have a “yes, you should use it for Penguin” and a “no, you don’t need it for Penguin.” But wait, it gets more fun. On October 4, 2016, Google Webmaster Trends Analyst John Mueller stated the following in an Office Hours Hangout:

[I]f these are problematic links that are affected [by Penguin], and you use a disavow file, then that’s a good way for us to pick that up and recognize, “Okay, this link is something you don’t want to have associated with this site.” So when we recrawl that page that is linking to you, we can drop that link from our link graph.

With regards to devaluing these low quality links instead of punishing you, in general we try to figure out what kind of spammy tactics are happening here and we say, “Well, we’ll just try to ignore this part with regards to your website.”

So… clear as mud?

The disavow takeaway

The takeaway here is that the more things change, the more they stay the same. There is no change. If you’ve used unethical link-building strategies in the past and are considering submitting a disavow file — good, you should do that. If you haven’t used such strategies, then you shouldn’t need to; if Google finds bad links to your site, they’ll simply devalue them.


Of course, it was once also claimed that negative SEO doesn’t work, meaning a disavow wasn’t necessary for bad links you didn’t build. This was obviously not the case, and negative SEO did work (and may well still), so you should be continuing to monitor your links for bad ones and adding them to your disavow file periodically. After all, if bad links couldn’t negatively impact your site, there would be no need for a disavow at all.

And so, the more things change, the more they stay the same. Keep doing what you’ve been doing.

The source site?

In a recent podcast over on Marketing Land, Gary Illyes explains that under Penguin, it’s not the target site of the link that matters, it’s the source. This doesn’t just include links themselves, but other signals a page sends to indicate that it’s likely spam.

So, what we’ve just been told is that the value of a link comes from the site/page it’s on and not from where it’s pointing. In other words, when you’re judging your inbound links, be sure to look at the source page and domain of those links.

The more things change, the more they stay the same.

Your links are labeled

In the same podcast on Penguin, it came to light that Google places links on a page into categories, including things like:

  • footer;
  • Penguin-impacted; and
  • disavowed.

It was suggested that there are other categories, but they weren’t named. So, what does this really mean?

It means what we all pretty well knew was going on for about a decade. We now have a term to use to describe it (“labels”) rather than simply understanding that a page is divided into sections, and the sections that are the most visible and more likely to be engaged with hold the highest value (with regard to both content and links).

Additionally, we already knew that links that were disavowed were flagged as such.
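To make the labeling idea concrete, picture each edge in the link graph carrying a set of labels that downstream systems can consult. The Python sketch below is purely a mental model built from the three labels named above; the weighting values are invented for illustration and have nothing to do with Google’s real numbers:

    from dataclasses import dataclass, field

    @dataclass
    class Link:
        """One edge in a link graph. A link can carry several labels at
        once (e.g., a footer link that has also been disavowed)."""
        source_url: str
        target_url: str
        labels: set = field(default_factory=set)

    link = Link(
        source_url="https://example-blog.com/about",   # placeholder URLs
        target_url="https://your-site.example/",
        labels={"footer", "penguin-rt", "disavowed"},  # labels named in the podcast
    )

    # A downstream scorer might then discount the link by its labels --
    # invented weights, purely to illustrate label-based valuation.
    if "disavowed" in link.labels or "penguin-rt" in link.labels:
        weight = 0.0
    elif "footer" in link.labels:
        weight = 0.1
    else:
        weight = 1.0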


There is one new side

The only really new piece of information here is that either Google has replaced a previous link weighting system (which was based on something like visibility) with a labeling system, or they have added to it. Essentially, it appears that where previously content as a whole may have been categorized, with links included in that categorization, now a link is given one or possibly multiple labels.

So, this is a new system and a new piece of information, which brings us to…

The link labeling takeaway

Knowing whether the link is being labeled or simply judged by its position on the page — and whether it’s been disavowed or not — isn’t particularly actionable. It’s academically interesting, to be sure, and I’m certain it took Google engineers many days or months to figure out (maybe that’s what they’ve been working on since last October). But from an SEO’s perspective, we have to ask ourselves, “What really changed?”

Nothing. You will still be working to develop highly visible links, placed contextually where possible and on related sites. If this strays far from what you were doing, you likely weren’t doing your link building correctly to begin with. I repeat: the more things change, the more they stay the same.

But not Penguin penalties, right? Or… ?

It turns out that Penguin penalties are treated very differently in 4.0 from the way they were previously. In a discussion, Google’s Gary Illyes revealed that there is no sandbox for a site penalized by Penguin.

So essentially, if you get hit with a Penguin penalty, there is no trust delay in recovery — once you fix the problem and your site is recrawled, you’d bounce back.

That said, there’s something ominous about Illyes’ final tweet above. So Penguin does not require or impose a sandbox or trust-based delay… but that’s not to say there aren’t other functions in Google’s algorithm that do.


So, what are we to conclude? Avoid penalties — and be aware that, while they are not Penguin-related, there may or may not be delays in recovering from one. Sound familiar? That’s because (surely you can say it with me by now)…

The more things change, the more they stay the same

While this was a major update with a couple of significant changes, what it ultimately means is that our SEO process hasn’t really changed at all. Our links will get picked up faster (both the good and the bad), and penalties will likely be doled out and rolled back much more reliably; however, the links we need to build and how they’re being weighted remain pretty much the same (if not identical). The use of the disavow file is unchanged, and you should still (in my opinion) watch for negative SEO.

The biggest variable here comes in Google’s statement that Penguin is not impacted by machine learning.

I have no doubt that this is currently true. However, now that Penguin is part of the core algorithm — and as machine learning takes on a greater role in how search engines rank pages — it’s likely that machine learning will eventually begin to control some aspects of what is traditionally Penguin’s domain.

But when that day comes, the machines will be looking for relevancy and maximized user experience and link quality signals. So the more you continue to stay focused on what you should be doing… the more it’ll stay the same.

Source: Search Engine Land

When we asked Google's Gary Illyes about Penguin, he said SEOs should focus on where their links come from for the most part, but they have less to worry about now that Penguin devalues those links, as opposed to demoting the site.

Here’s another nugget of information learned from the “A conversation with Google’s Gary Illyes (part 1)” podcast at Marketing Land, our sister site: Penguin is billed as a “web spam” algorithm, but it indeed focuses mostly on “link spam.” Google has continually told webmasters that this is a web spam algorithm, yet every webmaster and SEO focuses mostly on links. Google’s Gary Illyes said their focus is right: they should be mostly concerned with links when tackling Penguin issues.

Gary Illyes made a point to clarify that it isn’t just the link, but rather the “source site” the link is coming from. Google said Penguin is based on “the source site, not on the target site.” You want your links to come from quality sources, as opposed to a low-quality source.

One example Gary revealed involved a negative SEO case submitted to him; he said the majority of the links were on “empty profile pages, forum profile pages.” When he looked at those links, the new Penguin algorithm was already “discounting,” or devaluing, them.

“The good thing is that it is discounting the links, basically ignoring the links instead of the demoting,” Gary Illyes added. 

Barry Schwartz: You also talked about web spam versus link spam and Penguin. I know John Mueller specifically called it out again, in the original Penguin blog post that you had posted, that you said this is specifically a web spam algorithm. But every SEO that I know focuses just on link spam regarding Penguin. And I know when you initially started talking about this on our podcast just now, you said it’s mostly around really really bad links. Is that accurate to say when you talk about Penguin, [that] typically it’s around really, really bad links and not other types of web spam?

Gary Illyes: It’s not just links. It looks at a bunch of different things related to the source site. Links is just the most visible thing and the one that we decided to talk most about, because we already talked about links in general.


But it looks at different things on the source site, not on the target site, and then makes decisions based on those special signals.

I don’t actually want to reveal more of those spam signals because I think they would be pretty, I wouldn’t say easy to spam, but they would be easy to mess with. And I really don’t want that.

But there are quite a few hints in the original, the old Penguin article.

Barry Schwartz: Can you mention one of those hints that is in the article?

Gary Illyes: I would rather not. I know that you can make pretty good assumptions. So I would just let you make assumptions.

Danny Sullivan: If you were making assumptions, how would you make those assumptions?

Gary Illyes: I try not to make assumptions. I try to make decisions based on data.

Barry Schwartz: Should we be focusing on a link spam aspect of it for Penguin? Obviously, focus on all the “make best quality sites,” yada-yada-yada, but we talk about Penguin as reporters, and we’re telling people that SEOs are like Penguin specialists or something like that — they only focus on link spam — Is that wrong? I mean should they?

Gary Illyes: I think that’s the main thing that they should focus on.

See where it is coming from, and then make a decision based on the source site — whether they want that link or not.

Well, for example, I was looking at a negative SEO case just yesterday or two days ago. And basically, the content owner placed hundreds of links on empty profile pages, forum profile pages. Those links, with the new Penguin, were discounted. But if you looked at the page, it was pretty obvious that the links were placed there for a very specific reason, and that’s to game the ranking algorithms. And not just Google’s, but any other ranking algorithm that uses links. If you look at a page, you can make a pretty easy decision on whether to disavow or remove that link or not. And that’s what Penguin is doing. It’s looking at signals on the source page. Basically, what kind of page it is, what could be the purpose of that link, and then it makes a decision based on that, whether to discount those things or not.


The good thing is that it is discounting the links, basically ignoring the links instead of the demoting.

So in general, unless people are overdoing it, it’s unlikely that they will actually feel any sort of effect by placing those. But again, if they are overdoing it, then the manual actions team might take a deeper look.

You can listen to part one of the interview at Marketing Land.

Source: Search Engine Land

Google’s manual actions team can look at a page’s link labels and may use them to dig deeper into the site’s activities.

In the A conversation with Google’s Gary Illyes (part 1) podcast at Marketing Land, our sister site, we learned that Google adds labels to your links. These labels can add classifications or attributes to the link, including whether the link is a footer link, whether it’s impacted by the latest Penguin update, whether it’s disavowed or other categorizations. A link can have multiple labels that make up the value and meaning of that link, which ultimately helps Google determine how to rank the related documents on the web.

Google’s manual actions team may look at these labels to determine whether they should dig deeper into the site’s links and apply a manual action penalty to the site. Illyes added, though, that he doesn’t do much work with that specific team, so he isn’t too aware of their daily processes.

In addition, Illyes listed three types of labels one might find on a link: “Penguin real time,” which would be the new Penguin algorithm; “footer” links, which would help Google determine how important the link is — i.e., it being in the footer versus the main content; and “disavow” — so if a link is in the disavow file, it will also be labeled as such for that site. There are many more link labels, but he only shared these three.


Barry Schwartz: Is there some sort of flag that happens automatically, so that the manual actions team is notified that, hey, there is a devaluation going on here by Penguin?

Gary Illyes: I don’t work much with the manual actions team, but to the best of my knowledge, there is no flag like that; they can look at the labels on the links a site gets. Basically, we have tons of link labels; for example, it’s a footer link, basically, that has a lot lower value than an in-content link. Then another label would be a Penguin real-time label. If they see that most of the links are Penguin real-time labeled, then they might actually take a deeper look and see what the content owner is trying to do.

Danny Sullivan: You were talking about how Penguin is looking and identifying things and to think of it as part of like a link label. So, like Google is looking at things like, OK, I know this is a footer link or this is a content link or this is a Penguin link. So tell us more about that.

Gary Illyes: So, if you think about it, there are tons of different kinds of links on the internet. There are footer links, for example. There are Penguinized links, and all of these kinds of links have certain labels internally attached to them, basically for our own information. And if the manual actions team is reviewing a site for whatever reason, and they see that most of the links are labeled as Penguin real-time affected, then they might decide to take a much deeper look at the site and see what’s up with those links and what could be the reason those links exist — and then maybe apply a manual action on the site because of the links.

Barry Schwartz: Which is why Google still recommends you disavow links, because I guess if a manual action team sees all these labels on it, and they look at the disavow file, and they say, hey this SEO or this webmaster [is] aware of these links and they’re going ahead and trying to take no responsibility… Would these labels show up if the links are disavowed, or they probably wouldn’t?


Gary Illyes: So disavow is again, basically, just a label internally. It’s applied on the links and anchors. And then you can see that, as well. Basically, you could have like a link from, I don’t know, WhiteHouse.gov, and it has labels Penguin RT, footer and disavow. And then they would see that — they would know that someone or the webmaster or content owner is actively tackling those links.

You can listen to part one of the interview at Marketing Land.

Source: Search Engine Land

Google is often criticized for how it handles spammy links, but columnist Ian Bowden believes this criticism may be unfair. Here, he takes a look at the challenges Google might face in tackling the ongoing issue of paid links.

Prior to the recent arrival of Penguin 4.0, it had been nearly two years since Penguin was last updated. It was expected to roll out at the end of 2015, which then became early 2016. By the summer, some in the industry had given up on Google ever releasing Penguin 4.0. But why did it take so long?

I’d argue that criticism directed at Google is in many cases unjustified, as people often take too simplistic a view of the task at hand for the search engine.

Detecting and dealing with paid links is a lot harder than many people think, and there are likely good reasons why Google took longer than hoped to release the next iteration of Penguin.

Here are some of the challenges Google may have faced in pushing out the most recent Penguin update:

1. It has to be effective at detecting paid links

To run and deploy an effective Penguin update, Google has to have the ability to (algorithmically and at scale) determine which links violate guidelines. It’s not clear the extent to which Google is capable of this; there are plenty of case studies which show that links violating the guidelines continue to work.

However, not all paid links are created equal.

Some paid links are obviously paid for. For instance, they may have certain types of markup around them, or they may be featured within an article clearly denoted as an advertorial.

On the other hand, some links may have no telltale signs on the page that they are paid for, so determining whether or not they are paid links comes through observing patterns.

The reality is that advanced paid linking strategies will be challenging for Google to either devalue or penalize.

Penguin has historically targeted very low-quality web spam, as it is easier to distinguish and qualify; anything a level above this remains an opportunity for spammers. Google has to have confidence in its capability before applying a filter, due to the severity of the outcome.


2. Google is still dependent on links for the best quality search results

Maybe, just maybe, Google is actually capable of detecting paid links but chooses not to devalue all of them.

Most people will be familiar with third-party tools that perform link analyses to assess which links are “toxic” and will potentially be harming search performance. Users know that sometimes these tools get it wrong, but generally they’re pretty good.

I think it is fair to assume that Google has a lot more resources available to do this, so in theory they should be better than third-party tools at detecting paid links.

Google has experimented with removing links from their index with negative consequences for the quality of search results. It would be interesting to see the quality of search results when they vary the spammy link threshold of Penguin.

It’s possible that even though certain links are not compliant with webmaster guidelines, they still assist Google in their number one goal of returning users the best quality search results. For the time being, they might still be of use to Google.

3. Negative SEO remains a reality

If Google is sure that a link has been orchestrated, it is very difficult for the search engine to also be sure whether it was done by the webmaster or by someone else executing a negative SEO campaign.

If a penalty or visibility drop were easy to incur from a handful of paid links, then, in theory, it would be pretty straightforward to perform negative SEO on competitors. The barriers to doing this are quite low, and furthermore, the footprint is minimal.

Google has tried to negate this problem with the introduction of the disavow tool, but it is not realistic to think all webmasters will know of this, let alone use the tool correctly. This is a challenge for Google in tackling paid links.


4. It can provoke a PR backlash and unwanted attention

When rolling out large algorithm updates, it’s inevitable that there will be false positives or severe punishments for small offenses. After any rollout, there will be a number of “adjustments” as Google measures the impact of the update and attempts to tweak it.

Despite that, a large number of businesses will suffer as a result of these updates. Those who regularly join Google Webmaster Hangouts will be used to business owners, almost in tears, discussing the devastating impact of a recent update and pleading for more information.

While the vast majority of Google users will most likely never be aware of or care about the fallout of algorithm updates, these situations do provide Google with some degree of negative PR. Any noise that points toward Google yielding too much power is unwanted attention.

On a related note, sometimes penalties are just not viable for Google. When someone walks down Main Street, they expect to see certain retailers. It’s exactly the same with search results. Users going to Google expect to see the top brands. The user doesn’t really care if a brand is not appearing because of a penalty. Users will hold it as a reflection on the quality of Google rather than the brand’s non-compliance with guidelines.

To be clear, that’s not to say that Google never penalizes big brands — JCPenney, Sprint, the BBC and plenty of other large brands have all received high-profile manual penalties in the past. But Google does have to consider the impact on the user experience when choosing how to weight different types of links. If users don’t see the websites they expect in search results, the result could be switching to another search engine.

This is how Google deals with the problem

The above four points highlight some of the challenges Google faces. Fewer things are more important than meeting its objective of returning the most useful results to its users, so it has a massive interest in dealing with paid links.

Here are some ways Google could address the challenges it faces:


1. Prefer to devalue links and issue fewer penalties

Penalties act as a deterrent for violating guidelines, and they serve to improve the quality of search results by demoting results that were artificially boosted. A lot of the risk of “getting it wrong” can simply be mitigated through devaluing links algorithmically, rather than imposing manual penalties.

In the instance of a negative SEO attack, the spammy links, instead of causing a penalty for a website, could simply not be counted. In theory, this is the purpose of a disavow file. Penalties could be saved for only the most egregious offenders.

The fact that Penguin now runs in real time as part of the core ranking algorithm suggests that this is the direction they are heading in: favoring the devaluation of spammy links through “algorithmic” penalties (which websites can now recover from more quickly), and manual penalties only being applied for serious offenses.

2. Do a slow rollout combined with other updates

Slowly rolling out the Penguin 4.0 update gives Google two advantages. First, it softens the blow of the update: there is no single week in which some high-profile brands suddenly drop in visibility, drawing attention to the update.

Second, it allows Google to test the impact of the update and adjust over time. If the update is too harsh, they can adjust the parameters. Penguin 4.0 may take several weeks to roll out.

To add to the confusion and make it more difficult to understand the impact of Penguin 4.0, it is probable Google will roll out some other updates at the same time.

If you cast your memory back two years to the introduction of Panda 4.1 and Penguin 3.0, they were rolled out almost in conjunction. This made it more difficult to understand what their impacts were.

There was a lot of SERP fluctuation this September. It is possible that part of this fluctuation can be attributed to Penguin 4.0 testing, but there is no certainty because of the number of other updates occurring at the same time (such as the local update dubbed “Possum”).


3. Encourage a culture of fear

Even if the risk of receiving a penalty is the same now as it was five years ago, the anxiety and fear of receiving one is much greater among brands. High-profile penalties have not only served their function of punishing the offending brand, but they also have provided a great deterrent to anyone else considering such a strategy.

The transition to content marketing and SEO becoming less of a black box assisted in this, but this culture of fear has been a large driver in the reduction of paid link activity.

Final thoughts

Google is often criticized for not doing more to tackle paid links, but I think that criticism is unfair. When one considers the challenges search engines face when tackling paid links, one can be more forgiving.

Now that Google has incorporated Penguin into the core algorithm, webmasters may have an easier time recovering from ranking issues that arise from spammy or paid links, as they will not have to wait until “the next update” (sometimes years) to recover from an algorithmic devaluation.

However, the fact that Penguin now operates in real time will make it more difficult for webmasters to know whether a loss in rankings is due to spammy links or something else — so webmasters will need to be vigilant about monitoring the health of their backlink profiles.

I suspect that Google will continue to make tweaks and adjustments to Penguin after the rollout is complete, and I expect to see a continued shift from penalties to devaluing links over time.

Source: Search Engine Land

This is the wrap-up of the most popular posts and announcements on SEJ over the previous week. Newsletter subscribers are the first to receive this and other updates.

Penguin is Now Part of Google’s Core Algorithm

Penguin is now running in real time as a part of Google’s core algorithm. The update is already in effect in all languages. Learn what else has changed; the changes are based on some of the top requests from webmasters.

Everything You Need to Know About On-Page SEO

How are you optimizing your online presence to make your voice heard? It starts with ensuring you are up to date on on-page SEO basics to provide peak performance for your website and visibility for your target audience.

Popular Search Marketing Posts

Here is a rundown of the most popular posts on SEJ from last week:

  1. Penguin is Now a Real-Time Component of Google’s Core Algorithm, by Matt Southern
  2. Everything You Need to Know About On-Page SEO, by Ryan Clutter
  3. The Complete Guide to Mastering E-Commerce Product Page, by Stoney G deGeyter
  4. 10 Reasons Why Your E-Commerce SEO Campaign is Failing, by James Brockbank
  5. Google AdWords Introduces Cross-Device Remarketing, by Matt Southern
  6. The Difference Between Accelerated Mobile Pages (AMP) and Mobile-Friendly Pages, by Bharati Ahuja
  7. Managing Your Website’s SEO Health, by Melih Oztalay
  8. Google Displaying Vacation Prices on Front Page of Search Results, by Matt Southern
  9. Google Allo Keeps Messages Indefinitely, Raising Privacy Concerns, by Matt Southern
  10. Google Testing New Schema Markup for ‘Science Datasets’, by Matt Southern


Download This Week’s Episode of Marketing Nerds

In this Marketing Nerds episode, SEJ Chief Social Media Strategist, Brent Csutoras, was joined by Tom Anthony, Head of Research & Development at Distilled, to talk about the future of search, other technology trends, and how to put it all together to understand the main trajectories in the industry. Listen to the full episode, or download the MP3 here.

Original source of this article is Search Engine Journal.

Google has rolled out its new algorithm Google Penguin 4.0. Michael Jenkins explains how these changes will affect your website and what you can do to avoid being penalised.

It’s been a two-year wait for SEO tragics. Google’s anticipated Penguin 4.0 started rolling out over the weekend, and while it’s too soon to see the full impact, here’s what you need to know.

What is Google Penguin?

In a nutshell, Penguin is the name for a Google algorithm designed to catch websites that are spamming search results. The first iteration launched in 2012, the last update to the algorithm came in 2014, and now Penguin 4.0 has landed over the weekend.

Tell me more about Penguin 4.0

Penguin 4.0, dubbed the ‘real-time update’, targets over-optimised sites. Over-optimisation is two-fold: the first kind is the overuse of unnaturally placed keywords; the second is the over-optimisation of link profiles. If you have too many links from external sites pointing to the same keywords on your page, it’s time to update your disavow file before you get penalised.
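One way to check the link-profile side of this is to tally the anchor text of your inbound links and see whether a single exact-match phrase dominates. A rough Python sketch follows; the sample data and the 30 percent cutoff are assumptions for illustration, not a published Google threshold:

    from collections import Counter

    def anchor_distribution(anchors):
        """Return each anchor text's share of the inbound-link profile,
        highest share first."""
        counts = Counter(a.strip().lower() for a in anchors)
        total = sum(counts.values())
        return [(anchor, n / total) for anchor, n in counts.most_common()]

    # Example anchors, as pulled from a backlink export (hypothetical data).
    anchors = ["cheap widgets"] * 60 + ["Acme Widgets"] * 25 + ["acme.example"] * 15
    for anchor, share in anchor_distribution(anchors):
        marker = "  <-- possibly over-optimised" if share > 0.30 else ""
        print(f"{share:5.1%}  {anchor}{marker}")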

Moving forward, one thing is certain: use keywords to write for your audience, not for search engine rankings, as you will be found out quicker than ever!

How exactly will sites be affected?

The two key changes are:

  1. You will start to see considerable fluctuations in Google rankings from now on, as real-time updates occur whenever changes are made to a site.
  2. Penguin 4.0 is more granular and penalises specific pages on a site. In the past it was a domain-wide penalty.

Pros

  • Penguin is real-time! When a webmaster updates or optimises their website, Google will now recognise these changes quickly, and rankings will change accordingly – no more lag time!
  • Penguin 4.0 penalises competitors’ sites that aren’t doing things by the book and are taking shortcuts for short-term rankings. If you have been doing things well and building genuine authority in your marketplace online, then you are likely to see a positive effect on rankings.

Cons

  • Penguin is real-time. I hear you – I’ve named it as a ‘pro’ but it is also a watch-out. You need to ensure your site is being optimised and updated correctly, as Google will now notice errors faster than ever, and they can quickly alter your ranking.
  • SEO is becoming much more sophisticated over time, and Google is getting faster at spotting unnatural tactics. Regularly updating your SEO strategy and keeping a constant watch on your website’s backlinks is essential to remain compliant with Penguin 4.0.

How can I make the most out of Penguin 4.0?

Marketers should always keep an eye on backlinks and perform regular checks using the Google disavow tool. The main difference between good and bad backlinks comes down to the quality of the website they sit on. Bad backlinks will see your site penalised.

If you have noticed fluctuations in rankings there are a few steps you can take to help:

  • Clean out hazardous links
  • Review keyword density on site. Is the keyword repetition natural? (A quick way to check is sketched below.)
  • Create some branded links. The fastest way to do this is through citation building
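For the keyword-density check mentioned above, a quick script is usually enough to sanity-check a page. The Python sketch below is a rough illustration: the tokenisation is simplistic, and since no official ‘natural’ density figure exists, any threshold you apply is judgement rather than rule:

    import re

    def keyword_density(text, phrase):
        """Share of the page's words taken up by occurrences of `phrase`."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        target = phrase.lower().split()
        n = len(target)
        hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
        return (hits * n) / len(words) if words else 0.0

    page = ("Buy big screen TVs here. Our big screen TV range "
            "has the best big screen prices around.")
    print(f"{keyword_density(page, 'big screen'):.1%}")
    # ~35% of the words belong to one phrase -- far from natural repetition.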


Watch this space.

Penguin 4.0 has literally just landed, and we’re bound to learn more in the coming weeks as it rolls out. Keep an eye out for more insights.

Source: https://mumbrella.com.au

Google has officially confirmed on the Google Webmaster Blog that they’ve begun rolling out the Penguin 4.0 real-time algorithm. It has been just under two years since we had a confirmed Penguin update, which we named Penguin 3.0 in October 2014, and this will be the last time Google confirms a Penguin update. We saw signals yesterday that Google had begun testing Penguin 4.0; Google wouldn’t confirm whether those signals were related to this morning’s Penguin 4.0 launch announcement, but nevertheless, we are live now with Penguin 4.0.

No Future Penguin Confirmations

Google said because this is a real-time algorithm "we're not going to comment on future refreshes." By real time, Google means that as soon as Google recrawls and reindexes your pages, those signals will be immediately used in the new Penguin algorithm.

Google did this with Panda as well, when it became part of the core algorithm. Google said there would be no more confirmations for Panda.

Penguin 4.0 Is Real Time & More Granular

Google again said this is now rolling out, so you may not see the full impact until it has fully rolled out. I don't expect the rollout to take long. Google wrote:

  • Penguin is now real-time. Historically, the list of sites affected by Penguin was periodically refreshed at the same time. Once a webmaster considerably improved their site and its presence on the internet, many of Google's algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed. With this change, Penguin's data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page. It also means we're not going to comment on future refreshes.
  • Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.

The real-time part we understand: it means that when Google indexes a page, it will immediately recalculate the signals around Penguin.

Penguin being more "granular" is a bit confusing. I suspect it means that Penguin can now impact sites on a page-by-page basis, as opposed to how it worked in the past, where it impacted the whole site. So really spammy pages or spammy sections of your site can be impacted by Penguin on their own now, as opposed to your whole website. That is my guess; I am trying to get clarification on this.


Google Penguin Timeline:

  • Penguin 1.0 – April 24, 2012
  • Penguin 1.1 – May 2012
  • Penguin 1.2 – October 2012
  • Penguin 2.0 – May 22, 2013
  • Penguin 2.1 – October 4, 2013
  • Penguin 3.0 – October 17, 2014
  • Penguin 4.0 – September 23, 2016

Was My Site Impacted By Google Penguin 4.0?

If your site was hit by Penguin 3.0 and you don't see a recovery by now, even a small one, then that probably means you are still impacted. I'd give this a couple of weeks to fully roll out, and check your analytics to see if you have a nice recovery. Again, I still think specific sections and pages will be impacted, which will make it harder to know whether you were affected by this update or not.

The nice thing is that you can use the disavow file on links you think are hurting you, and you should know pretty quickly (I suspect within days) whether that helped, as opposed to waiting two years for Google to refresh the algorithm. At the same time, you can be hit by Penguin much faster now.
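If you want to catch that kind of fast movement, even a simple day-over-day check on organic sessions will surface it. Here is a minimal Python sketch, where the data source and the 30 percent drop threshold are assumptions for illustration:

    def flag_sudden_drops(daily_sessions, window=7, threshold=0.30):
        """Compare each day's organic sessions with the average of the
        preceding `window` days and flag drops larger than `threshold`."""
        alerts = []
        for i in range(window, len(daily_sessions)):
            baseline = sum(daily_sessions[i - window:i]) / window
            if baseline and (baseline - daily_sessions[i]) / baseline > threshold:
                alerts.append((i, daily_sessions[i], baseline))
        return alerts

    # Example: daily organic sessions exported from an analytics tool.
    sessions = [1200, 1180, 1250, 1220, 1190, 1230, 1210, 640]
    for day, value, baseline in flag_sudden_drops(sessions):
        print(f"day {day}: {value} sessions vs ~{baseline:.0f} expected -- investigate links")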

Source: https://www.seroundtable.com

I am reluctant to write something about this because the chatter in the SEO community is all over the place. It seems like many are saying that Google is testing the new Penguin 4.0 update in the wild. Some are saying they see rankings for Penguin impacted sites jump up and down over the course of the same day.

It is possible that Google might be testing Penguin 4.0 on some users but I am not sure if that is how this algorithm release can work. I suspect it can, but again, it is hard to tell.

The Black Hat Forums thread on page 14 started to spike up again with people saying their sites were going up and down over the course of 24 hours. There is also some chatter about this in the ongoing WebmasterWorld thread.

The tools are hit or miss; it depends on when they run and whether they hit the Google Penguin test (if there is a test). For example, Mozcast has been on fire all week.

But some of the other tools are steadily high or just all over the place.

Here are some quotes from the threads:

Right now it's just dancing around bit, but in all likelihood these small patch updates are warning sings of an impending penguin 4.0
It is my opinion that Google is probably testing a filter on those weird days, and some people are winners and other are loosers. Since our company is expecting a Penguin recovery (due to recent NSEO disavowels), and since Penguin is imminent, and since we have good days when others report bad days, this behavior is most likely Penguin testing. About a year ago, Google said Penguin was coming soon. That's about the time "Zombies" started getting reported.

I am seeing many tweets from folks and I have tons and tons of emails from people asking me what is up.

Again, it is too early to tell but it wouldn't surprise me if Google is testing a Penguin refresh.

Forum discussion at Black Hat Forums and WebmasterWorld.

Update: As the day goes on, it seems more and more like Google is indeed testing the new Penguin update. No confirmation from Google on this.

Update #2: Friday at 8am ET, Google confirmed the roll out of Google Penguin 4.0, the real time version.

Source: https://www.seroundtable.com

Was there a major Google algorithm change this week? Many webmasters believe so.


Earlier this month, we reported on significant chatter around a Google algorithm update. Well, it looks like we have another update to report to you this week.

On Tuesday of this week, there were some early signals of a Google update. Those signals intensified Thursday and seem to just be getting stronger day by day.

In short, the webmaster and SEO community is confident that there was an algorithm change in the Google organic search results this week. Not only are the SEO forums and communities discussing it, but the tracking tools from Mozcast, Accuranker, RankRanger and others have also shown significant fluctuations in Google’s organic rankings.

Google’s PR team wouldn’t directly comment. Instead, they pointed to a tweet by John Mueller of Google: “nothing specific, sorry — we’re always working to improve things!” This was in response to questions about an algorithm update. John also said on Twitter this morning that these are normal fluctuations.

In any event, it seems this is not directly related to the Google Penguin update we are all anxiously awaiting.

Source: http://searchengineland.com/google-downplays-google-algorithm-ranking-update-week-normal-fluctuations-258923
