First there were Panda and Penguin. Now Google will release a mobile-focused update on April 21, one that promises to be even wider-reaching than either of the animal-named updates that rewarded high-quality content.

Writing For Google's Biggest Algorithm Update Yet

Understanding the Scope of Google’s “Mobilegeddon” Update

Google’s new update promises to be a game changer. The algorithm will rank mobile-friendly sites higher than non-mobile-friendly ones. Many webmasters from around the world are (rightfully) anxious about its release since it could significantly impact traffic.

From a writer’s perspective, the update gives us something to think about as well. Does this mean we need to learn a whole new way to create web content?

There is no getting around the fact that your website must be mobile-friendly.

Before Panda and Penguin made their debuts, it was fairly easy to rank a website at the top of the search results by indiscriminately stuffing it with a particular keyword. These updates crippled a number of websites that depended on that tactic for traffic.

Mobilegeddon promises to do the same to webmasters who have neglected optimization for mobile browsers, which could be devastating to some corners of the internet. Google has already stated that there will be no middle ground: your site will either be mobile friendly or it won’t. That could mean reworking an entire site’s architecture and the content it contains, which makes this update of utmost importance to us as webmasters, writers, and marketers.

Content Production for the Mobilegeddon

Get ahead of this potentially game-changing update. Although it isn’t in effect yet, consider how writing for a mobile site differs from writing for desktop. Content producers should heed a series of changes if they intend to keep producing high-quality, compelling content after the update rolls out. Read this Search Engine Land post that offers three actions to prepare your website for the impending update.

From what we know about the update, it’s likely that we will have to make changes to our content production habits. Here are a few tactics that will help:

1. Curtail Headline Length

User experience on a mobile device is different from that on a desktop browser. One of the most obvious differences is screen size (and the amount of usable real estate). Currently, a headline can stretch across the full width of a desktop browser, but mobile screens change the game when it comes to headline width.


What this Means for Us: Create shorter headlines. Twitter users already practice working within a 140-character limit; for those of us who don’t use that particular network, now is a good time to start. We need to learn how to condense page-width headlines into bite-sized chunks without sacrificing their impact.
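As a rough illustration of condensing headlines, here is a small sketch that trims a draft headline to a character budget at a word boundary. The 70-character limit is an arbitrary assumption for the example, not a Google rule:

```python
def shorten_headline(headline: str, limit: int = 70) -> str:
    """Trim a headline to at most `limit` characters, cutting at a word boundary."""
    if len(headline) <= limit:
        return headline
    # Reserve one character for the ellipsis, then drop the trailing partial word.
    trimmed = headline[: limit - 1].rsplit(" ", 1)[0]
    return trimmed.rstrip(" ,;:") + "…"

print(shorten_headline("Short and punchy"))
print(shorten_headline(
    "Everything You Ever Wanted to Know About Writing Headlines "
    "That Perform Well on Both Desktop and Mobile Devices"))
```

A helper like this only enforces length; rewriting for punch is still the writer’s job.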

2. Make Shorter Paragraphs

“Snackable content” is something content producers are all too aware of, but it is especially important for mobile optimization. Create content that the user can consume in one sitting. The format in which we present this content should be as bite-sized as the content itself: given short attention spans and an aversion to “walls of text,” mobile users are likely to balk at paragraphs that fill their entire screen.

What this Means for Us: Learn to summarize your ideas. Keep to the point and make your copy more targeted. In some cases, such as home pages, reduce the amount of copy altogether; dense copy makes for difficult reading, especially on a tiny display. Get your message across in short bursts.
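To make “short bursts” measurable, here is a small sketch that flags paragraphs exceeding a word budget. The 60-word threshold is an illustrative assumption, not an official guideline:

```python
def long_paragraphs(text: str, max_words: int = 60):
    """Return (paragraph_index, word_count) for paragraphs over the budget."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    return [
        (i, len(p.split()))
        for i, p in enumerate(paragraphs)
        if len(p.split()) > max_words
    ]

# A two-paragraph draft: a short intro followed by an 80-word wall of text.
draft = "Short intro.\n\n" + ("word " * 80).strip()
print(long_paragraphs(draft))  # → [(1, 80)]
```

Running a check like this over a draft quickly surfaces the paragraphs worth splitting.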

3. Fewer Words, More Action

In 1984, Orwell invented “Newspeak,” a pared-down form of English that combined words and eliminated any that didn’t serve a purpose. This mobile update is likely to push content producers in the same direction: paring content down to be less wordy while interspersing calls to action. Condensing content will require us to consider what we write and distill the message into as few words as possible.

What this Means for Us: Rethink the methodology for creating content. In addition to making content compelling and benefit-focused, we must now also watch the number of words we use and how often we include a call to action. It could mean a change in the basic tenets of web writing.

The exception is blog content – posts will always rank and read better in long form – but for your home and main pages, less content means a better mobile experience and happier readers.


Preparing for the Mobilegeddon Now

Luckily, this change does not require us to find a fallout shelter to survive. Writing habits just need to be carefully considered.

You may need to review your web writing and revamp some marketing approaches accordingly to align with what is expected from mobile-friendly sites.

Source: Search Engine Journal


Google recently came out with a good news, bad news announcement for small businesses whose websites offer a poor mobile experience. I’m a bad-news-first kind of guy, so I’ll start there:

Google is going to update their algorithm again, and if you’re on the wrong side of the law you might lose a lot of traffic.

The good news is that, unlike with other algorithm updates, they’re actually telling you about this one before they roll it out. Time is ticking down on the actual update going live, but you still have some time to get your mobile house in order.

So What Exactly is Happening?

In a nutshell, Google announced on their Webmaster Central blog that “Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal” and also announced that they would be using more indexed content from apps in search results (which they’ve already started doing).

There’s a lot more information out there on the details of the update, and if you’re looking to get a deeper understanding of what’s happening, there are already some great resources available.

But if you own a small business you’re probably not interested in becoming a mobile SEO expert. The key questions you’re likely asking yourself are:

  1. How will this impact my business?
  2. What should I actually do?
  3. Do I need to get everything done by April 21st?
  4. What is all this going to cost me?

In this post, I’ll unpack these questions to help you evaluate the potential impact of the update for your business and give you a framework to evaluate what changes you should be making to your site and when, based on actual costs and business impact.

Does Your Site Even Have Major Issues to Fix?

The first step in this process is obviously to determine if your site is going to actually be at risk for the April 21 update. In addition to the resources outlined above, Google has offered webmasters a number of tools to help determine how “mobile friendly” their sites are:

  • The mobile friendliness tool, which gives your site a mobile-friendly thumbs up or thumbs down, a quick screenshot of your site as Google sees it, and specific recommendations for areas to fix if you’re “mobile unfriendly”.
  • Google’s PageSpeed Insights tool, which gives you a sense of your site’s mobile page speed and also offers recommendations on things to fix. Note that the two tools report differently: the mobile friendliness tool looks at the page the way Googlebot would, while the PageSpeed Insights tool fetches a page as a user would.
  • When logged into Webmaster Tools, you can navigate to the Mobile Usability Report and see the specific issues WMT flags on specific pages. Google is also sending notifications through Webmaster Tools to sites that won’t meet the mobile friendliness requirements; if your site isn’t mobile friendly and you haven’t already seen the notification, check under your messages in Webmaster Tools.

So if your site is mobile friendly and loads really quickly on mobile devices, you can probably head out and get some other work done and skip the rest of this post.

But if your site is showing as not mobile friendly, or is posting extremely low mobile page speed scores, you’ll want to answer the question: what, specifically, should you do next?

Mobile is Important, but its Impact Isn’t Equal for Everyone

Mobile traffic is growing, mobile search is growing, and this is going to be a major update, so this isn’t something you can ignore. If your site has a poor mobile experience (on smartphones and/or tablets), you could be throwing away conversions, and fixing it could be a massive opportunity for your business.

If your choice is “should my site be mobile friendly right away or mobile unfriendly indefinitely” my expert opinion is that you should make it mobile-friendly right away.

But if you’re a small business, that’s likely not the choice you’re looking at. You’re looking at prioritizing dev resources against other dev initiatives and against other company initiatives. It’s easy for me to tell you that it’s urgent to make your mobile site responsive because more people are searching and visiting your site via mobile devices, but what if your choice is to make your site mobile friendly or hire one all-star employee?

What if you’re weighing a mobile site overhaul against doubling down on a promotion that you know has been profitable for you in the past, but that you don’t quite have enough cash available to push harder on if you are revamping your website?

That all-star employee or profitable promotion might be the difference between you keeping the lights on and not, while a mobile site overhaul might just mean maintaining your status quo or an incremental short-term improvement.

You need to gather some information, and make your own business decision.

The first step is identifying the actual likely impact of the update on your current traffic.

How Much Mobile Traffic Do You Actually Get from Search?

I find that a lot of site owners – particularly busy small business owners – don’t know the answer to this question, but it’s a crucial starting point for understanding how urgent an update like this is and the degree of impact it will have on your business.

Within Google Analytics, you can quickly get a sense of the overall mobile traffic to your site by navigating to Audience > Mobile > Overview and looking at the breakdown of desktop / mobile / tablet:

An image of how you drill down to get a breakdown of mobile traffic from Google Analytics

You can quickly see that for this site, mobile traffic alone makes up over 30% of all traffic. You’ll also want to look at goal numbers and values, which should be present in the right-hand columns of the same report.

Your site’s overall mobile and tablet traffic and goals should absolutely factor into your overall business decision, so note those numbers. But, for the change on the 21st, we’re particularly interested in search traffic, so we’ll need to use an advanced segment to look only at organic traffic.

To do this, click on “All Sessions” from the advanced segments drop down, and click “Organic Traffic” to look at the breakdown (and raw impact) for your organic traffic:

Screenshot of how to drill down to organic traffic only


You’ll see the same breakdown as above, but this time applied to organic traffic only.

Now you have a high-level understanding of how much of your organic traffic is coming from mobile search. Next, you want to understand where your mobile traffic is coming from. The mobile update will have a very different impact on your site if most of your mobile traffic is branded traffic coming to your homepage, versus location-specific listings that drive a lot of great traffic to individual landing pages that are poorly optimized for mobile.
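If you export the numbers from this breakdown, the same share can be computed in a few lines. This is a minimal sketch with made-up session counts, not real Analytics output:

```python
# Hypothetical export of the Audience > Mobile > Overview report,
# restricted to the Organic Traffic segment (sample numbers only).
organic_sessions = {"desktop": 6_800, "mobile": 2_900, "tablet": 600}

total = sum(organic_sessions.values())
for device, sessions in organic_sessions.items():
    # Print each device's session count and its share of organic traffic.
    print(f"{device:8s}{sessions:6d} sessions ({sessions / total:.1%})")

mobile_share = organic_sessions["mobile"] / total
print(f"Mobile share of organic traffic: {mobile_share:.1%}")
```

The same arithmetic applies to goal completions or revenue if those columns are in your export.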

You can get a more granular idea of what traffic is actually at risk by drilling down to look at the mobile organic traffic coming to your site by page.

You can do this by building a custom report. Click “Customization” in the top nav in Google Analytics, then click “New Custom Report” to reach an interface where you can fill in your metrics and dimensions and save the report. For metrics, select Users > Users and Goal Conversions > Goal Completions, plus any other key metrics for your site (pageviews if you monetize through ad revenue; revenue or goal values if you have that data in Analytics; and so on). For the dimension, choose Behavior > Landing Page:

Screenshot of setting up a Google Analytics custom report for landing pages.


From there you want to again check off the advanced segment for organic traffic, and then add a second segment for Device Category by clicking Users > Device Category:

Screenshot of how to drill down to device category within a custom report in Google Analytics

Finally, click on the blue “advanced” link just above the traffic and goal grid and create a filter to show only mobile traffic:

Screenshot of how to configure a mobile only device category filter in a custom landing page report in Google Analytics


And you have a report for the pages that are driving your organic traffic from mobile only, as well as all of your key metrics:

Screenshot of mobile traffic by page breakdown in Google Analytics

If you want more information about the specific searches driving traffic to each of these pages, look in Webmaster Tools under Search Traffic > Search Queries > Top Pages, then click Filter and filter for mobile traffic:

How to view only mobile pages and search queries in Google Webmaster Tools

From there, Webmaster Tools will show you the top pages for mobile traffic (you can sort by impressions or clicks), and you can click into each page to see the actual search terms it is showing up for. Note that this report doesn’t show conversion metrics, so these pages may be driving impressions and clicks without driving real revenue (which could be an opportunity, but might also mean less short-term revenue risk for those pages). All of this data is a rough estimate and can be subject to lags or outages, but it should give you a better picture of the types of terms searchers use to find various pages on your site.

You can also use a tool like SEMrush to get more information about what those pages are ranking for (sadly, Google Analytics keyword data isn’t likely to be a lot of help in this case).

So now, you have a pretty strong sense of how much mobile traffic is coming to your site through search. You also know what shape that traffic is taking (is it branded? Where is it coming in through your site? What are the actual queries people are landing on your site with?).

You can make a decision on whether updating your site’s mobile experience by the 21st is a priority or not. If you’re getting 5-10% of a limited amount of organic traffic from mobile search and most of that traffic is branded and directed at your home page, you might want to put a mobile site overhaul on the “nice to have” long term project list.

If you’re getting 30% of your site’s traffic from mobile, it’s driving to various pages on your site, and organic traffic is a key revenue driver for your business you may need to move this up on the company to-do list.
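The triage described in the last two paragraphs can be sketched as a simple function. The thresholds below are the illustrative ones from this post, not anything Google has published:

```python
def mobile_fix_priority(mobile_share: float, branded_homepage_traffic: bool) -> str:
    """Rough triage heuristic based on the share of organic traffic from mobile."""
    if mobile_share < 0.10 and branded_homepage_traffic:
        return "nice-to-have: schedule a mobile overhaul as a long-term project"
    if mobile_share >= 0.30:
        return "urgent: move the mobile overhaul up the company to-do list"
    return "moderate: weigh the fix against your other initiatives"

print(mobile_fix_priority(0.05, True))
print(mobile_fix_priority(0.35, False))
```

Real decisions should, of course, also weigh revenue per visit and the cost of the fix, as discussed above.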

My Site is at Risk: What Do I (and What Don’t I) Need to Fix Right Now?

Once you determine that mobile search traffic is significant enough to warrant addressing mobile issues before the April 21 update, it’s time to figure out which items – specifically – you need to update.

If you have a great developer, shoot them a note and let them know what your constraints and goals are (having checked your traffic, you should have a better sense of the overall business impact of the update and can better spec out a budget for mobile enhancements) to find out what they can complete in the designated time frame and whether that will address the issues with your current mobile performance.

If you’re like a lot of small businesses, you may not have a dedicated dev resource. For those folks, I reached out to my colleague Jamie Mazur, the co-founder and CTO at digital marketing agency All Points Digital. Jamie is currently in the process of helping several small to mid-sized businesses determine how to best handle the mobile update from a development perspective, and I asked him:

How should a small business with limited resources think about triaging development in the run-up to April 21st?

He answered:

The “Mobilegeddon” meme is dripping with doom-and-gloom. Before panicking over the Android/iPhone zombie apocalypse, business owners and IT managers need to take a step back and evaluate whether urgency around the April 21st deadline is warranted based on the realities of their business.

Look at the analytics and review the traffic coming from mobile devices. Where possible, analyze what revenue or soft return is attached to those visitors. Armed with this information, make an informed decision on whether a potentially expensive “rush job” or throw-away short-term fix is worth the cost.

Your choices are:

  • Ignore the deadline, and focus on creating a quality mobile-friendly user experience on your own timeline.
  • Enlist whoever you need, regardless of time or cost, to beat the clock while implementing a lasting solution.
  • Hack in quick fixes to pass the “mobile-friendly” test to avoid the wrath of The Google.


For those looking for the quick fix, there are a few basic strategies that we recommend:

  • Attempt a Plug & Play Solution: If your site is built using a basic WordPress theme, check for an update that may make it responsive. If one is not available, take a look at plugins like WPtouch.
  • Force Your Page to Fit: Use the <meta name="viewport" content="width=device-width, initial-scale=1"> tag appropriately. If you don’t know what this is, Google it or ask your developer.
  • Add Rudimentary Responsive Design to Trouble Spots: Audit your page styling (CSS and HTML) for fixed positioning, large fixed-size images, and font sizes. Use media queries to remove or restyle elements that are too large or small for a mobile device.
  • Let Google Be Your Guide: Use the mobile friendly test and treat the results as your to-do list.

For those for whom it matters, there is still time to escape Mobilegeddon. Don’t try to boil the ocean – just grab the low-hanging fruit. It may not be perfect, but you will end up with a better mobile experience than you have today, and you will likely retain your search positioning.

If mobile traffic just does not mean that much to you today, take a few deep breaths. Stop circling April 21st on your calendar, but plan to act soon. You won’t want to be left on the outside looking in at the hordes of web traffickers happily engaging through their devices.

So to recap, if you’re a small business worried about the impending mobile friendliness update, there are some straightforward next steps you can take to determine how to prepare:

  • Look at your analytics to get an idea of the actual business impact mobile traffic has on your business currently
  • Use the tools listed above to determine if your site has mobile usability issues.
  • Contact your current development resource (or a new one), and find out what you can get fixed when, and for how much.
  • Take all of that information and determine the best course of action for your business as a whole as you prepare for Google’s mobile friendliness algorithm update.

Regardless of what you decide to do in the next few weeks, keep in mind that your percentage of mobile visitors is likely to rise (possibly very aggressively) in the coming months and years, and you should have a long-term plan for creating a great mobile experience for your prospects – even if getting it perfectly in place by the 21st isn’t possible, or pragmatic.

Source: Search Engine Journal


Data collected by Glenn Gabe at G-Squared Interactive shows strong evidence that a Google algorithm update occurred this month. Throughout the month of June, Gabe says he has seen major volatility in search rankings, with some websites soaring and others falling.

“It’s pretty clear that Google rolled out a major algo update in June,” says Gabe, pointing to the specific dates where he has seen the most movement. June 1, 8, 21, and 26 saw the most volatility, suggesting that the update rolled out at the beginning of the month with “tremors” occurring through the remainder of it.

Gabe suggests that the update could be Panda-related, or it could be one of Google’s quality updates. Whatever the case, the update appears to be targeting websites with obvious quality issues, including poor content quality, user experience problems, an overabundance of advertising, and so on. Conversely, it appears to be rewarding sites that have worked hard to clean up the quality of their content.

These are the top issues Gabe found when analyzing sites that dropped in rankings as a result of this update:

  • Burying content below sponsored posts: A site that included a number of link thumbnails to sponsored content at the top of articles ended up getting hammered in the search results.

  • More ads than content: A site that featured two-thirds ads and one-third content ended up seeing a significant drop in search results.
  • Thin Content: A site focused on Q&A ended up seeing a drop in search results, most likely because it contained many pages with thin or irrelevant content.
  • Generic Content: A site that provided very generic content for the topic it covered ended up seeing a drop in search results. Content consisted mostly of rewrites of information that could be found elsewhere on the web. It probably didn’t help that the content was filled with ads as well.
  • Indexing Issues: One site that saw major fluctuations in search results had indexing issues stemming from its robots.txt file, which could have prevented Google from properly crawling the site’s content.
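The robots.txt scenario in the last bullet is easy to test for yourself. As a minimal sketch (using a made-up robots.txt rather than any real site’s), Python’s standard-library parser reports whether a given crawler may fetch a URL:

```python
from urllib import robotparser

# Hypothetical robots.txt that accidentally blocks CSS and JS directories,
# which can keep Google from rendering pages properly.
robots_txt = """\
User-agent: *
Disallow: /css/
Disallow: /js/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/css/site.css"))   # False
print(rp.can_fetch("Googlebot", "https://example.com/articles/post"))  # True
```

Checking a handful of key URLs this way is a quick sanity test before (or after) a suspected crawling problem.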

Interesting Issue

Gabe ran into an unusual issue where a site ended up climbing in rankings after enhancing the user experience of its mobile website. By Gabe’s estimation the site provided a horrible experience on desktop, but received a boost in rankings after redesigning the mobile version.


A Panda-associated algorithm update has not been seen since March. Given the time frame and the evidence presented by Glenn Gabe, there’s reason to believe what we’re seeing is another Panda update. At the very least, it’s likely one of Google’s “quality updates”.

Source: Search Engine Journal


In the future, we may own much less and share much more. And if we do, it will all be down to big data.

In the last century, owning things was the marker of the middle class. Those who had more money could own more things. But as manufacturing became less expensive, the barrier to owning a great deal of stuff was lowered. Today, many people living at or below the poverty level own plenty of things, so ownership is no longer a good indicator of relative wealth.



In fact, as millennials enter adulthood and the middle class, the trend seems to be for them to own less stuff.  Not only is there a thriving “minimalist” movement, but the advent of the digital and sharing economies have made this much easier.

Where Baby Boomers and Gen Xers might have had shelves and shelves dedicated to books, magazines and music in their homes, today we can fit the same amount of media and more onto the pocket-sized computers we anachronistically still call phones.

Whereas being a “two-car family” (or even three or four cars) was once a mark of status, today many millennials see more status in being a one-car or even zero-car family and making use of services like Uber, Lyft, CarGo, and others to use cars only when they need one.

Ridesharing, apartment/home lending, peer-to-peer lending, reselling, coworking, talent-sharing… The sharing economy, sometimes also called the collaboration economy, is taking off in all sorts of niches. 

Beyond a disillusionment with consumerism, what’s driving this trend is data.  Most — if not all — of these upstarts would not be viable businesses, certainly not on a large scale, without leveraging a platform and a foundation of big data.

These companies don’t just represent a new way of thinking or new services, but a new way to use data effectively to provide services to people when and where they want them.

The obvious examples here are Uber and Airbnb, both of which developed their own platforms to allow service providers and users to connect to the benefit of both. But there are other interesting examples of companies using data and developing platforms to join this new economy:

  • Freelancing — Sites like TaskRabbit, Care.com, and Upwork have taken the freelance market to a new level. Upwork specializes in helping more traditional freelancers (writers, graphic designers, coders, etc.) connect with business owners looking to hire, while TaskRabbit does the same for handymen, dog walkers, personal assistants, and so on. Care.com specializes in caregivers for both children and the elderly. The platforms each of these sites have built make it possible to connect those offering services with those seeking them.
  • Coworking — WeWork is only one of many companies providing coworking spaces in big cities around the world. Freelancers, entrepreneurs, and telecommuters can rent a desk or an office without the overhead and cost of renting an entire building or suite. Prices are low enough to use the space as needed, and it offers some of the benefits of an office, including meeting rooms, phone lines, internet, and often free coffee – sometimes even free beer and wine.
  • Car sharing — Services like Lyft and Uber allowed individual drivers to operate like a taxi service by providing them a safe way to find clients and get paid. Zipcar allowed people to borrow cars for very short periods of time, like the length of a big shopping trip. And now, services like Getaround enable individuals to share their cars with neighbors and get paid for it by connecting the users on the Getaround platform, automating payments, and even insuring the cars for up to $1 million. Liquid provides the same service for renting bicycles!
  • Peer-to-Peer lending — Lending Club and sites like it allow people to lend one another money, with much lower interest rates and fees than traditional credit cards or bank loans. Investors earn solid returns and borrowers get more competitive rates, all facilitated by the platform.
  • Fashion — Sites like Poshmark and thredUP allow individuals to sell their gently used clothing, while services like Le Tote offer subscribers the ability to borrow clothes and return them – a Netflix subscription for your closet. Rent the Runway allows women to rent designer gowns for a special event at a fraction of the price of buying one.
  • Sharing resources — Neighborgoods and similar sites allow people to borrow resources — like tools and kitchen appliances — directly from their neighbors. Rather than buying a specialized tool for a single project, people can connect with and borrow from their neighbors, facilitated by the platform.

None of these services would be possible without the big data and algorithms that drive their individual platforms. Without a sophisticated app to match a driver with a rider, Uber wouldn’t be competitive with taxi drivers who cruise around all day looking for fares – and the same is true of each of these services. What’s fascinating is that the company is rarely the actual service provider; instead, it acts as a facilitator, making the transaction possible, easy, and safe for both the provider and the user.

They break down the barriers that otherwise exist to starting a business or a “side hustle” and make it both easy and lucrative to participate in this collaborative economy. But none of it would be possible without data and the algorithms to use it.

Bernard Marr is a best-selling author and keynote speaker. His new book is 'Big Data in Practice: How 45 Successful Companies Used Big Data Analytics to Deliver Extraordinary Results'.

Source: Forbes


In the second installment of his two-part series on Google's quality updates (aka Phantom updates), columnist Glenn Gabe explains the connection between user experience and organic search performance.

In part one of this series, I covered a number of important points about Google’s quality updates (aka Phantom). I tried to provide a solid foundation for understanding what Phantom is and how it works, explaining its history, its path since 2015 and how it rolls out.

In part two, I’m going to dive deeper by providing examples of “low-quality user engagement” that I’ve seen on sites affected by Phantom. I’ve analyzed and helped many companies that have been impacted by Google’s quality updates since May of 2015, and I’ve seen many types of quality problems during my travels. Today, I’ll try and explain more about those problems and provide some recommendations for sites that have been impacted.

So fire up your proton packs. There’s ectoplasm ahead.

Back to school, back to algorithm updates

First, a quick note about what we’ve been seeing this fall so far. September of 2016 was one of the most volatile months in a long time from an algorithm update standpoint. We’ve witnessed a serious combination of updates since August 31, 2016.

First, Joy Hawkins picked up a major local algorithm change that many in the local space are calling Possum. That was combined with what looked like another quality update starting on August 31 (I’m covering Google’s quality updates in this series). And to add to the already volatile September, Penguin 4 was announced on September 23 (it started rolling out before the announcement date).

And if you’re thinking, “Would Google really roll out multiple algorithm updates at one time?,” the answer is YES. They have done this several times before. My favorite was the algorithm sandwich in April of 2012, when Google rolled out Panda, then Penguin 1.0, and then Panda again, all within 10 days.

But again, I saw many sites impacted by the August 31 update that were also impacted by previous quality updates. Here are two sites that were impacted by the early September update with a clear connection to previous quality updates. And I’m not referring to Penguin impact, which seems to have started mid-month. I’m referring to sites without Penguin issues that have been impacted by previous quality updates:




I’m planning on covering more about this soon, but I wanted to touch on the September volatility before we hop into this post. Now, back to our regularly scheduled program.

Phantom and low-quality user engagement: hell hath no fury like a user scorned

After analyzing many sites that either dropped or surged during Google quality updates, I saw many examples of what I’m calling “low-quality user experience” (which could include low-quality content).

I’ve helped a lot of companies with Panda since 2011, and there is definitely overlap with some of the problems I surfaced with Phantom. That said, quality updates seem more focused on user experience than content quality. Below, I’ll cover some specific low-quality situations I’ve uncovered while analyzing sites impacted by Phantom.

It’s also important to note that there’s usually not just one problem that’s causing a drop from a quality update. There’s typically a combination of problems riddling a website. I can’t remember analyzing a single site that got hit hard that had just one smoking gun. It’s usually a battery of smoking guns working together. That’s important to understand.

Quick disclaimer: The list below does not cover every situation I’ve come across. It’s meant to give you a view of several core quality problems I’ve seen while analyzing sites impacted by Google’s quality updates. I cannot tell you for sure which problems are direct factors, which ones indirectly impact a site, and which ones aren’t even taken into account. And again, it’s often a combination of problems that leads to negative impact.

Personally, I believe Google can identify barriers to usability and the negative impact those barriers have on user engagement. And as I’ve always said, user happiness typically wins. Don’t frustrate users, force them a certain way, push ads on them, make them jump through hoops to accomplish a task and so forth.

One thing is for sure: I came across many, many sites during my Phantom travels that had serious quality problems (which included a number of the items I’m listing below).

Let’s get started.

Clumsy and frustrating user experience

I’ve seen many sites that were impacted negatively that employed a clumsy and frustrating user experience (UX). For example, I’ve looked at websites where the main content and supplementary content were completely disorganized, making it extremely hard to traverse pages within the site. Mind you, I’m using two large monitors working in tandem — imagine what a typical user would experience on a laptop, tablet, or even their phone!

I have also come across a number of situations where the user interface (UI) was breaking, the content was being blocked by user interface modules, and more. This can be extremely frustrating for users, as they can’t actually read all of the content on the page, can’t use the navigation in some areas and so on.


Taking a frustrating experience to another level, I’ve also analyzed situations where the site would block the user experience. For example, blocking the use of the back button in the browser, or identifying when users were moving to that area of the browser and then presenting a popup with the goal of getting those users to remain on the site.

To me, you’re adding fuel to a fire when you take control of someone’s browser (or inhibit the use of their browser’s functionality). And you also increase the level of creepiness when showing that you are tracking their mouse movements! I highly recommend not doing this.


Aggressive advertising

Then there were times ads were so prominent and aggressive that the main content was pushed down the page or squeezed into small areas of the page. And, in worst-case scenarios, I actually had a hard time finding the main content at all. Again, think about the horrible signals users could be sending Google after experiencing something like that.


Content rendering problems 

There were some sites I analyzed that were unknowingly having major problems with how Google rendered their content. For example, everything looked fine for users, but Google’s “fetch and render” tool revealed dangerous problems. That included giant popups in the render of every page of the site (hiding the main content).

I’ve also seen important pieces of content not rendering on desktop or mobile — and in worst-case scenarios, no content rendered at all. Needless to say, Google is going to have a hard time understanding your content if it can’t see it.


Note: John Mueller also addressed this situation in a recent Webmaster Hangout and explained that if Google can’t render the content properly, then it could have a hard time indexing and ranking that content. Again, beware.


Deception (ads, affiliate links, etc.)

One of the worst problems I’ve come across when analyzing both Panda and Phantom hits relates to ad deception. That’s when ads are woven into the main content and look like the main content. So users believe they are clicking on links that will take them to additional content on the same site, but instead, they are whisked off the site downstream to an advertiser.

That could be a shockingly bad user experience for visitors, but the ad deception ride typically doesn’t stop there. Some of those downstream, third-party sites are extremely aggressive content-wise, contain malware, force downloads, and more. So combine deception with danger, and you’ve got a horrible recipe for Mr. Phantom. Again, hell hath no fury like a user scorned.


Quick case study: I remember asking a business owner who was negatively impacted by a quality update why an ad-filled content module in the middle of their pages wasn’t marked as sponsored (the ads were deceptive). They said that the click-through rate was so good, they didn’t want to impact their numbers. So basically, they kept deceiving users to maintain ad revenue — not good, to say the least. It’s a prime example of what not to do.

In addition, you have to look no further than Google’s Quality Rater Guidelines (QRG) to find specific callouts of ad deception, and how pages that employ that type of deception should be considered low-quality. I find many people still haven’t read Google’s QRG. I highly recommend doing so. You will find many examples of what I’m seeing while analyzing sites impacted by Google’s quality updates.
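To make the labeling point concrete, here is a minimal sketch of how a sponsored module could be marked up so users (and search engines) can tell ads from editorial content. The class names and URLs are hypothetical; the key elements are the visible “Sponsored” label and the rel="nofollow" attribute that Google’s guidelines called for on paid links at the time:

```html
<!-- Hypothetical sponsored-content module: visibly labeled, with paid links nofollowed -->
<aside class="sponsored-module">
  <p class="sponsored-label">Sponsored</p>
  <ul>
    <!-- rel="nofollow" tells Google not to pass ranking credit through a paid link -->
    <li><a href="https://advertiser.example.com/offer" rel="nofollow">Advertiser headline</a></li>
    <li><a href="https://another-advertiser.example.com/" rel="nofollow">Another advertiser</a></li>
  </ul>
</aside>
```

The visual label addresses the user-deception problem; the nofollow attribute addresses the link-scheme problem. Both matter.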


More excessive monetization

I also saw additional examples of aggressive monetization during my Phantom travels. For example, forcing users to view 38 pages of pagination in order to read an article (with only one paragraph per page). And of course, the site is feeding several ads per page to increase ad impressions with every click to a new page. Nice.

That worked until the sites were algorithmically punched in the face. You can read my post about Panda and aggressive advertising to learn more about that situation. Phantom has a one-two punch as well. Beware.

Excessive Pagination and Google Phantom

Aggressive popups and interstitials

First, yes, I know Google has not officially rolled out the mobile popup algorithm yet. But that will be a direct factor when it rolls out. For now, what I’m saying is that users could be so thrown off by aggressive popups and interstitials that they jump back to the search results quickly (resulting in extremely low dwell time). And as I’ve said a thousand times before, extremely low dwell time (in aggregate) is a strong signal to Google that users did not find what they wanted (or did not enjoy their experience on the site).

While analyzing sites negatively impacted by Google’s quality updates, I have seen many examples of sites presenting aggressive popups and other interstitials that completely inhibit the user experience (on both desktop and mobile). For some of the sites, the annoyance levels were through the roof.

Now take that frustration and extrapolate it over many users hitting the site, over an extended period of time, and you can see how bad a situation it could be, quality-wise. Then combine this with other low-quality problems, and the ectoplasm levels increase. Not good, to say the least.

It’s also important to note that Googlebot could end up seeing the popup as your main content. Yes, that means the amazing content you created will be sitting below the popup, and Googlebot could see that popup as your main content. Here’s a video of Google’s John Mueller explaining that.


The core point here is to not inhibit the user experience in any way. And aggressive popups and interstitials can absolutely do that. Beware.


Lackluster content not meeting user expectations

From a content standpoint, there were many examples of sites dropping off a cliff that contained content that simply couldn’t live up to user expectations. For example, I reviewed sites that were ranking for competitive queries, but the content did not delve deep enough to rank for those queries. (This often involved thin content that had no shot at providing what the user needed.)

Does this remind you of another algorithm Google has employed? (Cough, Panda.) Like I said, I’ve seen overlap with Panda during my Phantom travels.

As you can guess, many users were probably not happy with the content and jumped back to the search results. As I mentioned earlier, very low dwell time (in aggregate) is a powerful signal to Google that users were not happy with the result. It’s also a giant invitation to Panda and/or Phantom.


By the way, dwell time is not the same as bounce rate in Google Analytics. Very low dwell time is someone searching, visiting a page, and then returning to the search results very quickly.

I believe Google can understand this by niche — so a coupon site would be much different from a news site, which would be different from an e-commerce site. There are flaws with bounce rate in Google Analytics (and other analytics packages), but that’s for another post.

Low-quality supplementary content

I also saw many examples of horrible and unhelpful supplementary content. For example, if there’s an article on a specific subject, the supplementary content should be relevant and helpful. It shouldn’t contain a boatload of links to non-relevant content, contain deceptive and cloaked ads and so on.

It’s okay to have advertising, but make sure it can easily be identified as such. And help your users find more of your great content. Make sure the links you provide are relevant, helpful, and enhance the user experience.

Side note: Phantom eats rich snippets — scary, but true

One interesting side note (which I’ve mentioned in my previous posts about Phantom) is that when quality updates have rolled out, sites have gained or lost rich snippets. So it seems the algorithm has something built in that can strip or provide rich snippets. Either that, or Google is updating the rich snippets algorithm at the same time (tied to the quality threshold for displaying rich snippets that Google has mentioned several times).

So, not only can quality updates impact rankings, but it seems they can impact SERP treatment, too.

Phantom eats rich snippets.


Again, I chose to list several examples of what I’ve seen during my Phantom travels in this post, but I did not cover all of the low-quality situations I’ve seen. There are many!

Basically, the combination of strong user experience and high-quality content wins. Always try to exceed user expectations (from both a content standpoint and UX standpoint). Don’t deceive users, and don’t fall into the excessive monetization trap. That includes both desktop and mobile.

Remember, hell hath no fury like a user scorned. From what I’ve seen, users have a (new) voice now, and it’s called Phantom. Don’t make it angry. It probably won’t work out very well for you.

Exorcising the Phantom: what webmasters can do now

If you’ve been impacted by a quality update, or several, what can you do now? Below is a list of important action items that you can (and should) start on today. The first is key.

  • If you’ve been impacted negatively, don’t wait. Even Google’s John Mueller has said not to. Check 30:14 in the video.
  • Objectively measure the quality of your site. Go through your site like a typical user would. And if you have a hard time doing this objectively, have others go through your site and provide feedback. You might be surprised by what you surface.
  • Perform a thorough crawl analysis and audit of your site through a quality lens. This is always a great approach that can help surface quality problems. I use both DeepCrawl and Screaming Frog extensively. (Note: I’m on the customer advisory board for DeepCrawl and have been a huge fan of using it for enterprise crawls for years.)
  • Understand the landing pages that have dropped, the queries those pages ranked for, and put yourself in the position of a user arriving from Google after searching for those queries. Does your content meet or exceed expectations?
  • Understand the “annoyance factor.” Popups, excessive pagination, deception, UIs breaking, thin content, user unhappiness and so on. Based upon my analyses, all of these are dangerous elements when it comes to Phantom.
  • Use “fetch and render” in Google Search Console to ensure you aren’t presenting problems to Googlebot. I’ve seen this a number of times with Phantom hits. If Google can’t see your content, it will have a hard time indexing and ranking that content.
  • Fix everything as quickly as you can, check all changes in staging, and then again right after you roll them out to production. Don’t make the situation worse by injecting more problems into your site. I’ve seen this happen, and it’s not pretty.
  • Similar to what I’ve said to Panda victims I have helped over the years, drive forward like you’re not being negatively impacted. Keep building high-quality content, meeting and exceeding user expectations, building new links naturally, building your brand, and more. Fight your way out of the ectoplasm. Remember, Google’s quality updates seem to require a refresh. Don’t give up.

Summary: Panda is in stealth mode, Phantom is front and center

With Panda now in stealth mode, there’s another algorithm that SEOs need to be aware of. We’ve seen a number of Google Quality Updates (aka Phantom) since May of 2015, and in my opinion, they heavily focus on “low-quality user experience.” I highly recommend reviewing your analytics reporting, understanding if you’ve been impacted, and then forming a plan of attack for maintaining the highest level of quality possible on your website.

Good luck, and be nice to Phantoms this Halloween!

Source: Search Engine Land

Categorized in Search Engine

Google is often criticized for how it handles spammy links, but columnist Ian Bowden believes this criticism may be unfair. Here, he takes a look at the challenges Google might face in tackling the ongoing issue of paid links.

Prior to the recent arrival of Penguin 4.0, it had been nearly two years since Penguin was last updated. It was expected to roll out at the end of 2015, which then became early 2016. By the summer, some in the industry had given up on Google ever releasing Penguin 4.0. But why did it take so long?

I’d argue that criticism directed at Google is in many cases unjustified, as people often take too simplistic a view of the task at hand for the search engine.

Detecting and dealing with paid links is a lot harder than many people think, and there are likely good reasons why Google took longer than hoped to release the next iteration of Penguin.

Here are some of the challenges Google may have faced in pushing out the most recent Penguin update:

1. It has to be effective at detecting paid links

To run and deploy an effective Penguin update, Google has to have the ability to (algorithmically and at scale) determine which links violate guidelines. It’s not clear the extent to which Google is capable of this; there are plenty of case studies which show that links violating the guidelines continue to work.

However, not all paid links are created equal.

Some paid links are obviously paid for. For instance, they may have certain types of markup around them, or they may be featured within an article clearly denoted as an advertorial.

On the other hand, some links may have no telltale signs on the page that they are paid for, so determining whether or not they are paid links comes through observing patterns.

The reality is that advanced paid linking strategies will be challenging for Google to either devalue or penalize.

Penguin has historically targeted very low-quality web spam, as it is easier to distinguish and qualify, but a level above this is an opportunity. Google has to have confidence in its capability before applying a filter, due to the severity of the outcome.

2. Google is still dependent on links for the best quality search results

Maybe, just maybe, Google is actually capable of detecting paid links but chooses not to devalue all of them.

Most people will be familiar with third-party tools that perform link analyses to assess which links are “toxic” and will potentially be harming search performance. Users know that sometimes these tools get it wrong, but generally they’re pretty good.

I think it is fair to assume that Google has a lot more resources available to do this, so in theory they should be better than third-party tools at detecting paid links.

Google has experimented with removing links from their index with negative consequences for the quality of search results. It would be interesting to see the quality of search results when they vary the spammy link threshold of Penguin.

It’s possible that even though certain links are not compliant with webmaster guidelines, they still assist Google in their number one goal of returning users the best quality search results. For the time being, they might still be of use to Google.

3. Negative SEO remains a reality

If Google is sure that a link has been orchestrated, it is very difficult for the search engine to also be sure whether it was done by the webmaster or by someone else executing a negative SEO campaign.

If a penalty or visibility drop were as easy to incur from a handful of paid links, then in theory, it would be pretty straightforward to perform negative SEO on competitors. The barriers to doing this are quite low, and furthermore, the footprint is minimal.

Google has tried to negate this problem with the introduction of the disavow tool, but it is not realistic to think all webmasters will know of this, let alone use the tool correctly. This is a challenge for Google in tackling paid links.
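For context, the disavow tool accepts a plain-text file uploaded through Search Console, with one entry per line: full URLs for individual links, or a `domain:` prefix to disavow every link from a domain, with `#` lines as comments. A minimal sketch (the domains and URLs below are hypothetical):

```text
# Links from this directory were part of a link network we could not get removed
domain:spammy-directory.example.com

# A single paid link we were unable to have taken down
http://blog.example.net/paid-post-about-us.html
```

Note that disavowing asks Google to ignore the links when assessing your site; it does not remove them from the web.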

4. It provides a PR backlash and unwanted attention

When rolling out large algorithm updates, it’s inevitable that there will be false positives or severe punishments for small offenses. After any rollout, there will be a number of “adjustments” as Google measures the impact of the update and attempts to tweak it.

Despite that, a large number of businesses will suffer as a result of these updates. Those who regularly join Google Webmaster Hangouts will be used to business owners, almost in tears, discussing the devastating impact of a recent update and pleading for more information.

While the vast majority of Google users will most likely never be aware of or care about the fallout of algorithm updates, these situations do provide Google with some degree of negative PR. Any noise that points toward Google yielding too much power is unwanted attention.

On a related note, sometimes penalties are just not viable for Google. When someone walks down Main Street, they expect to see certain retailers. It’s exactly the same with search results. Users going to Google expect to see the top brands. The user doesn’t really care if a brand is not appearing because of a penalty. Users will hold it as a reflection on the quality of Google rather than the brand’s non-compliance with guidelines.

To be clear, that’s not to say that Google never penalizes big brands — JCPenney, Sprint, the BBC, and plenty of other large brands have all received high-profile manual penalties in the past. But Google does have to consider the impact on the user experience when choosing how to weight different types of links. If users don’t see the websites they expect in search results, the result could be switching to another search engine.

This is how Google deals with the problem

The above four points highlight some of the challenges Google faces. Few things are more important than meeting its objective of returning the most useful results to its users, so it has a massive interest in dealing with paid links.

Here are some ways Google could address the challenges it faces:

1. Prefer to devalue links and issue fewer penalties

Penalties act as a deterrent for violating guidelines, and they serve to improve the quality of search results by demoting results that were artificially boosted. A lot of the risk of “getting it wrong” can simply be mitigated through devaluing links algorithmically, rather than imposing manual penalties.

In the instance of a negative SEO attack, the spammy links, instead of causing a penalty for a website, could simply not be counted. In theory, this is the purpose of a disavow file. Penalties could be saved for only the most egregious offenders.

The fact that Penguin now runs in real time as part of the core ranking algorithm suggests that this is the direction they are heading in: favoring the devaluation of spammy links through “algorithmic” penalties (which websites can now recover from more quickly), and manual penalties only being applied for serious offenses.

2. Do a slow rollout combined with other updates

Slowly rolling out the Penguin 4.0 update provides Google two advantages. First, it softens the blow of the update. There is not one week when suddenly some high-profile brands drop in visibility, drawing attention to the update.

Second, it allows Google to test the impact of the update and adjust over time. If the update is too harsh, they can adjust the parameters. Penguin 4.0 may take several weeks to roll out.

To add to the confusion and make it more difficult to understand the impact of Penguin 4.0, it is probable Google will roll out some other updates at the same time.

If you cast your memory back two years to the introduction of Panda 4.1 and Penguin 3.0, they were rolled out almost in conjunction. This made it more difficult to understand what their impacts were.

There was a lot of SERP fluctuation this September. It is possible part of this fluctuation can be attributed to Penguin 4.0 testing, but there is no certainty because of the number of other updates occurring (such as the local update dubbed “Possum”).

3. Encourage a culture of fear

Even if the risk of receiving a penalty is the same now as it was five years ago, the anxiety and fear of receiving one is much greater among brands. High-profile penalties have not only served their function of punishing the offending brand, but they also have provided a great deterrent to anyone else considering such a strategy.

The transition to content marketing and SEO becoming less of a black box assisted in this, but this culture of fear has been a large driver in the reduction of paid link activity.

Final thoughts

Google is often criticized for not doing more to tackle paid links, but I think that criticism is unfair. When one considers the challenges search engines face when tackling paid links, one can be more forgiving.

Now that Google has incorporated Penguin into the core algorithm, webmasters may have an easier time recovering from ranking issues that arise from spammy or paid links, as they will not have to wait until “the next update” (sometimes years) to recover from an algorithmic devaluation.

However, the fact that Penguin now operates in real time will make it more difficult for webmasters to know when a loss in rankings is due to spammy links or something else — so webmasters will need to be vigilant about monitoring the health of their backlink profiles.

I suspect that Google will continue to make tweaks and adjustments to Penguin after the rollout is complete, and I expect to see a continued shift from penalties to devaluing links over time.

Source: Search Engine Land


Columnist Adam Dorfman explains how Google's recent local search algorithm update, "Possum," has impacted brick-and-mortar businesses.

For brick-and-mortar businesses, proximity to the searcher’s location has become even more important as a ranking signal thanks to a Google algorithm update nicknamed Possum. With the Possum algorithm change, Google is continuing down a path it has been traveling for quite some time, which is the merging of local and organic ranking signals.

Google is now applying filters to reward certain businesses that are not only physically closest to searchers but that also are optimizing their location data and content for search far better than anyone else. To understand the impact of Possum crawling into our lives, let’s look at the following scenario:

  • Before Possum: Let’s say Jim, a resident of San Mateo, California, requires orthopedic surgery and is doing a search for orthopedic specialists in the area. An area hospital, Hospital A, that publishes location pages for dozens of orthopedic surgeons might dominate the local pack results — not necessarily because Hospital A optimizes its content better than anyone else, but because it is the largest hospital in the area and has enough domain strength to make those pages relevant from an algorithmic standpoint.
  • After Possum: Jim conducts the same search for orthopedic specialists. Instead of a single hospital dominating search results, Google allocates more real estate to other hospitals nearby based on their location and the usual ranking signals — unless Hospital A’s content and data are so well optimized for search that they outperform other hospitals by a wide margin.

What Google is doing here is not new to search. For some time, Google has been making it harder for monster brands such as Amazon to dominate search results for product searches simply because of their size and prominence.

The Amazons and Walmarts of the world no longer dominate the top search results like they once did unless their search signals outperform competitors’ content by a wide margin. With Possum, Google is applying to local search a similar filter it has been using for organic search more generally.

Greater competition with your neighbors

Possum also affects local results in a more arcane, but important, way. As Joy Hawkins discussed in a recent Search Engine Land column, the algorithm is affecting search results for similar businesses that are clustered closely together, examples being:

  • Two or more retailers, such as mattress stores or restaurants, located across the street from each other or in the same strip mall.
  • Professionals, such as attorneys, insurance agents or accountants, who might share the same office space.

Before Possum, an unbranded search for, say, Greek restaurants in Chicago might yield the names of several Greek dining establishments clustered closely together in Chicago’s Greektown area. Such a result would make perfect sense if the person doing the search were located a block away from Greektown, which is located on the city’s near west side.

But what if the searcher were located in the north or south suburbs and wanted to find Greek restaurants in Chicago? Getting the names of a bunch of Greektown restaurants might not be a very good user experience if the searcher wanted to find locations closer to their physical location.

Possum has made it less likely that similar businesses clustered together will dominate location-based searches unless, as noted, the searcher is conducting the search close to the actual location of those businesses.

One implication of this, as local search expert Andrew Shotland uncovered, is that national-to-local brands may see positive shifts in rankings due to brand authority being turned up as a signal in the ranking algorithm.

Possum has a number of other implications for businesses, as Hawkins details in her article. But for brick-and-mortar businesses that rely on local foot traffic, the impacts I have described are especially important. As Hawkins wrote, “The physical location of the searcher is more important than it was before.”

How to beat the competition

If you are a business that operates brick-and-mortar locations, you should first check to see if your rankings for local search have been affected. You might not have been affected — or you might be seeing better results, not necessarily a drop in rankings.

Regardless of whether you’ve been affected, now is the time to get more rigorous about how you manage your data and content as assets to make your brand more visible where people conduct near-me searches. Ask questions like:

  • Is my data accurate and shared properly with the publishers and aggregators that distribute my data?
  • Are my data and content differentiated to make my brand stand out? Am I listing data attributes, such as the availability of free parking, that might differentiate me when near-me searches occur? Is my deep content, such as a long-form description of my business or visual imagery, optimized properly for search?
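One concrete way to expose differentiating attributes like free parking is structured data. Here is a minimal sketch using schema.org’s LocalBusiness vocabulary in JSON-LD; the business name, URL, and address are hypothetical:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Mattress Store",
  "url": "https://www.example.com/chicago",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 W Example St",
    "addressLocality": "Chicago",
    "addressRegion": "IL",
    "postalCode": "60607"
  },
  "amenityFeature": {
    "@type": "LocationFeatureSpecification",
    "name": "Free parking",
    "value": true
  }
}
```

Markup like this helps search engines associate the attribute with the specific location, rather than leaving it buried in free-form page copy.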

Now, more than ever, it’s time to boost your signal for local search to be found. Don’t let your business play possum with local search.

Source: Search Engine Land


TechCrunch reports that a week or so ago, Apple updated their App Store search algorithm, and the rankings of many apps for many keywords have changed.

Sarah Perez said, “[A]round a week ago, it appears that Apple yet again tweaked the way its rankings worked, but this time around, the changes have only impacted a subset of iPad app developers in the US App Store.”

My company has several apps, and I noticed ranking changes, some positive and some negative, for our key apps in both iPhone and iPad App Store searches.

The TechCrunch story shared several charts and examples of tools showing ranking changes across apps as large as Facebook, and also to other smaller apps.

Here is one example from the story:

Facebook’s iPad app offers a good example of the change, as its app moved from a #2 position in “Social Networking” and a #7 ranking “Overall” the day before, down to #4 and #24, respectively, on Friday, and then it crashed to #38 in “Social Networking” and a practically invisible #858 “Overall” by Monday.

The app’s download ranking has since begun climbing back up, reaching again #2 in “Social Networking” and #9 “Overall” by mid-week.


You can see other examples and more details at TechCrunch.


Source: http://searchengineland.com/

Categorized in Science & Tech

Google has officially confirmed on the Google webmaster blog that they've begun rolling out the Penguin 4.0 real-time algorithm. It has been just under two years since we had a confirmed Penguin update, which we named Penguin 3.0 in October 2014, and this will be the last time Google confirms a Penguin update. We saw signals yesterday that Google began testing Penguin 4.0; Google wouldn't confirm whether those signals were related to this morning's Penguin 4.0 launch announcement, but nevertheless, we are live now with Penguin 4.0.

No Future Penguin Confirmations

Google said because this is a real-time algorithm "we're not going to comment on future refreshes." By real time, Google means that as soon as Google recrawls and reindexes your pages, those signals will be immediately used in the new Penguin algorithm.

Google did this also with Panda, when it became part of the core algorithm. Google said there would be no more confirmations for Panda.

Penguin 4.0 Is Real Time & More Granular

Google again said this is now rolling out, so you may not see the full impact until it fully rolls out. I don't expect the rollout to take long. Google wrote:

  • Penguin is now real-time. Historically, the list of sites affected by Penguin was periodically refreshed at the same time. Once a webmaster considerably improved their site and its presence on the internet, many of Google's algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed. With this change, Penguin's data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page. It also means we're not going to comment on future refreshes.
  • Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.

The real-time part we understand: it means that when Google indexes a page, it will immediately recalculate the signals around Penguin.

Penguin being more "granular" is a bit confusing. I suspect it means that Penguin can now impact sites on a page-by-page basis, as opposed to how it worked in the past, where it impacted the whole site. So really spammy pages or spammy sections of your site can be impacted by Penguin on their own now, as opposed to your whole website. That is my guess; I am trying to get clarification on this.


Was My Site Impacted By Google Penguin 4.0?

If your site was hit by Penguin 3.0 and you don't see a recovery, even a small one, by now, that probably means you are still impacted. I'd give this a couple of weeks to fully roll out, then check your analytics to see if you have a recovery. Again, I still think specific sections and pages will be impacted, which will make it harder to know whether you were affected by this update.

The nice thing is that you can use the disavow file on links you think are hurting you, and you should know pretty quickly (I suspect within days) whether it helped, as opposed to waiting two years for Google to refresh the algorithm. At the same time, you can be hit by Penguin much faster now.
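For reference, a disavow file is a plain UTF-8 text file uploaded through Search Console's Disavow Links tool, with one URL or one domain: directive per line and # for comments. A minimal sketch (the domains below are hypothetical examples, not real spam sources):

```text
# Links from pages we contacted but could not get removed
http://spam.example.com/links/page1.html
http://spam.example.com/links/page2.html

# Disavow every link from an entire domain
domain:shadyseo.example.net
```

With real-time Penguin, the effect of a disavow like this should show up shortly after Google recrawls the linking pages, rather than at the next manual refresh.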

Source : https://www.seroundtable.com

Categorized in Search Engine

Something is going on, and it is getting bigger as time goes on. On September 2nd we reported significant changes in the Google search results; it seems Google made an unconfirmed algorithmic tweak, though they said it was not Penguin. Then on Tuesday we reported another shift in chatter, which has escalated over the past 24 hours.

Let me first quote John Mueller of Google, who said this morning that it wasn't Penguin; he said on Twitter that this is nothing specific:

But an update it does seem to be; if not Penguin, then maybe something related to September 2nd, or something else entirely.

There is a lot of ongoing chatter at Black Hat World forums and WebmasterWorld. Here are some recent quotes:

I am seeing extremely diverse datasets rolling through SERPs.

It's affecting our referrals in a big way, depending on what set is dominating. I can make myself see patterns as early as last Wednesday (7th Sept), but it's been clear since Monday (12th) and on steroids today.

We generally do not see swings like this unless a chunky general update rolls through - and we have never had any impact either way during the Panda/Penguin updates.

Hope this all settles soon. We've seen a sizeable rankings boost across our 1500 tracked keywords in Australia from the 2nd of Sep update. It's been very turbulent ever since.
I have noticed that after an initial jump in rankings at the start of this "update" or whatever it is / will be, but I can see that the rankings are drifting back down. I also didn't notice any real traffic gains from the rises in ranking but this may be more due to most reaching the top of page 2 and not hitting page 1. Anyone noticing similar jumps up then drift back down?
Pretty sure this is Penguin. If not it has to be something completely new.

This can't just be a "core" update.

I guess we just gotta wait for the official announcement, which should be coming soon since it seems to be rolling out in the US now.

Penguin maybe ... it's a google dance now. One of my client websites yesterday was reach top position in first page and then ....boom didn't see it in first 10 pages of google and i think is nowere because i can found it. And then it show up again, but now i do a search and is nowere. I only used web2.0 and natural backlinks...
Complete BS.. rankings are just constantly going up & down. NEVER seen this before..





@vladrpt has been posting a ton of this on Twitter as well.

The comments on Tuesday's post here and the September 2nd post here total close to 400 together, so this is a hot topic.

Mozcast showed really hot weather:


Accuranker spiked a bit on Tuesday:


RankRanger also showed a spike:


So what is it? Maybe Google is testing Penguin, but I believe these are just tweaks to the September 2nd update.

Forum discussion at Twitter, Black Hat World, and WebmasterWorld.


Source : https://www.seroundtable.com/google-search-algorithm-update-22701.html

