
[Source: This article was originally published on moz.com]

As we mentioned in Chapter 1, search engines are answer machines. They exist to discover, understand, and organize the internet's content in order to offer the most relevant results to the questions searchers are asking.

In order to show up in search results, your content needs to first be visible to search engines. It's arguably the most important piece of the SEO puzzle: If your site can't be found, there's no way you'll ever show up in the SERPs (Search Engine Results Page).

How do search engines work?

Search engines have three primary functions:

  1. Crawl: Scour the Internet for content, looking over the code/content for each URL they find.
  2. Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
  3. Rank: Provide the pieces of content that will best answer a searcher's query, which means that results are ordered by most relevant to least relevant.

What is search engine crawling?

Crawling is the discovery process in which search engines send out a team of robots (known as crawlers or spiders) to find new and updated content. Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered by links.

What's that word mean?

Having trouble with any of the definitions in this section? Our SEO glossary has chapter-specific definitions to help you stay up-to-speed.

Googlebot starts out by fetching a few web pages, and then follows the links on those pages to find new URLs. By hopping along this path of links, the crawler is able to find new content and add it to Google's index, called Caffeine — a massive database of discovered URLs — to later be retrieved when a searcher is seeking information that the content on that URL is a good match for.

What is a search engine index?

Search engines process and store information they find in an index, a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.

Search engine ranking

When someone performs a search, search engines scour their index for highly relevant content and then order that content in the hopes of solving the searcher's query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.

It’s possible to block search engine crawlers from part or all of your site, or instruct search engines to avoid storing certain pages in their index. While there can be reasons for doing this, if you want your content found by searchers, you have to first make sure it’s accessible to crawlers and is indexable. Otherwise, it’s as good as invisible.

By the end of this chapter, you’ll have the context you need to work with the search engine, rather than against it!

In SEO, not all search engines are equal

Many beginners wonder about the relative importance of particular search engines. Most people know that Google has the largest market share, but how important is it to optimize for Bing, Yahoo, and others? The truth is that despite the existence of more than 30 major web search engines, the SEO community really only pays attention to Google. Why? The short answer is that Google is where the vast majority of people search the web. If we include Google Images, Google Maps, and YouTube (a Google property), more than 90% of web searches happen on Google — that's nearly 20 times Bing and Yahoo combined.

Crawling: Can search engines find your pages?

As you've just learned, making sure your site gets crawled and indexed is a prerequisite to showing up in the SERPs. If you already have a website, it might be a good idea to start off by seeing how many of your pages are in the index. This will yield some great insights into whether Google is crawling and finding all the pages you want it to, and none that you don’t.

One way to check your indexed pages is "site:yourdomain.com", an advanced search operator. Head to Google and type "site:yourdomain.com" into the search bar. This will return results Google has in its index for the site specified:

A screenshot of a site:moz.com search in Google, showing the number of results below the search box.

The number of results Google displays (see “About XX results” above) isn't exact, but it does give you a solid idea of which pages are indexed on your site and how they are currently showing up in search results.

For more accurate results, monitor and use the Index Coverage report in Google Search Console. You can sign up for a free Google Search Console account if you don't currently have one. With this tool, you can submit sitemaps for your site and monitor how many submitted pages have actually been added to Google's index, among other things.

If you're not showing up anywhere in the search results, there are a few possible reasons why:

  • Your site is brand new and hasn't been crawled yet.
  • Your site isn't linked to from any external websites.
  • Your site's navigation makes it hard for a robot to crawl it effectively.
  • Your site contains some basic code called crawler directives that is blocking search engines.
  • Your site has been penalized by Google for spammy tactics.

Tell search engines how to crawl your site

If you used Google Search Console or the “site:domain.com” advanced search operator and found that some of your important pages are missing from the index and/or some of your unimportant pages have been mistakenly indexed, there are some optimizations you can implement to better direct Googlebot how you want your web content crawled. Telling search engines how to crawl your site can give you better control of what ends up in the index.

Most people think about making sure Google can find their important pages, but it’s easy to forget that there are likely pages you don’t want Googlebot to find. These might include things like old URLs that have thin content, duplicate URLs (such as sort-and-filter parameters for e-commerce), special promo code pages, staging or test pages, and so on.

To direct Googlebot away from certain pages and sections of your site, use robots.txt.

Robots.txt

Robots.txt files are located in the root directory of websites (ex. yourdomain.com/robots.txt) and suggest which parts of your site search engines should and shouldn't crawl, as well as the speed at which they crawl your site, via specific robots.txt directives.
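For illustration, here is a minimal sketch of what a robots.txt file might look like; the disallowed paths are hypothetical examples (a staging area and promo-code pages), not directives your site necessarily needs:

  # A hypothetical robots.txt, served at yourdomain.com/robots.txt
  User-agent: *
  Disallow: /staging/
  Disallow: /promo-codes/

  Sitemap: https://yourdomain.com/sitemap.xml

The Sitemap line is optional but lets crawlers find your sitemap without any other hints.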

How Googlebot treats robots.txt files

  • If Googlebot can't find a robots.txt file for a site, it proceeds to crawl the site.
  • If Googlebot finds a robots.txt file for a site, it will usually abide by the suggestions and proceed to crawl the site.
  • If Googlebot encounters an error while trying to access a site’s robots.txt file and can't determine if one exists or not, it won't crawl the site.

Optimize for crawl budget!

Crawl budget is the average number of URLs Googlebot will crawl on your site before leaving, so crawl budget optimization ensures that Googlebot isn’t wasting time crawling through your unimportant pages at risk of ignoring your important pages. Crawl budget is most important on very large sites with tens of thousands of URLs, but it’s never a bad idea to block crawlers from accessing the content you definitely don’t care about. Just make sure not to block a crawler’s access to pages you’ve added other directives on, such as canonical or noindex tags. If Googlebot is blocked from a page, it won’t be able to see the instructions on that page.

Not all web robots follow robots.txt. People with bad intentions (e.g., e-mail address scrapers) build bots that don't follow this protocol. In fact, some bad actors use robots.txt files to find where you’ve located your private content. Although it might seem logical to block crawlers from private pages such as login and administration pages so that they don’t show up in the index, placing the location of those URLs in a publicly accessible robots.txt file also means that people with malicious intent can more easily find them. It’s better to NoIndex these pages and gate them behind a login form rather than place them in your robots.txt file.

You can read more details about this in the robots.txt portion of our Learning Center.

Defining URL parameters in GSC

Some sites (most common with e-commerce) make the same content available on multiple different URLs by appending certain parameters to URLs. If you’ve ever shopped online, you’ve likely narrowed down your search via filters. For example, you may search for “shoes” on Amazon, and then refine your search by size, color, and style. Each time you refine, the URL changes slightly:

https://www.example.com/products/women/dresses/green.htm

https://www.example.com/products/women?category=dresses&color=green

https://example.com/shopindex.php?product_id=32&highlight=green+dress&cat_id=1&sessionid=123$affid=43

How does Google know which version of the URL to serve to searchers? Google does a pretty good job at figuring out the representative URL on its own, but you can use the URL Parameters feature in Google Search Console to tell Google exactly how you want them to treat your pages. If you use this feature to tell Googlebot “crawl no URLs with ____ parameter,” then you’re essentially asking to hide this content from Googlebot, which could result in the removal of those pages from search results. That’s what you want if those parameters create duplicate pages, but not ideal if you want those pages to be indexed.

Can crawlers find all your important content?

Now that you know some tactics for ensuring search engine crawlers stay away from your unimportant content, let’s learn about the optimizations that can help Googlebot find your important pages.

Sometimes a search engine will be able to find parts of your site by crawling, but other pages or sections might be obscured for one reason or another. It's important to make sure that search engines are able to discover all the content you want indexed, and not just your homepage.

Ask yourself this: Can the bot crawl through your website, and not just to it?

A boarded-up door, representing a site that can be crawled to but not crawled through.

Is your content hidden behind login forms?

If you require users to log in, fill out forms, or answer surveys before accessing certain content, search engines won't see those protected pages. A crawler is definitely not going to log in.

Are you relying on search forms?

Robots cannot use search forms. Some people believe that if they place a search box on their site, search engines will be able to find everything that their visitors search for, but crawlers can't type queries into a search box, so content reachable only through on-site search stays hidden from them.

Is text hidden within non-text content?

Non-text media forms (images, video, GIFs, etc.) should not be used to display text that you wish to be indexed. While search engines are getting better at recognizing images, there's no guarantee they will be able to read and understand that text just yet. It's always best to add text within the markup of your webpage.
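As a quick sketch of what that means, the file name and copy below are hypothetical; the point is that the important words live in the HTML rather than only inside the image:

  <!-- The sale text lives in the HTML, where crawlers can read it -->
  <img src="summer-sale-banner.jpg" alt="Summer sale banner">
  <p>Summer sale: 20% off all running shoes through August 31.</p>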

Can search engines follow your site navigation?

Just as a crawler needs to discover your site via links from other sites, it needs a path of links on your own site to guide it from page to page. If you’ve got a page you want search engines to find but it isn’t linked to from any other pages, it’s as good as invisible. Many sites make the critical mistake of structuring their navigation in ways that are inaccessible to search engines, hindering their ability to get listed in search results.

A depiction of how pages that are linked to can be found by crawlers, whereas a page not linked to in your site navigation exists as an island, undiscoverable.

Common navigation mistakes that can keep crawlers from seeing all of your site:

  • Having a mobile navigation that shows different results than your desktop navigation
  • Any type of navigation where the menu items are not in the HTML, such as JavaScript-enabled navigations. Google has gotten much better at crawling and understanding JavaScript, but it's still not a perfect process. The more surefire way to ensure something gets found, understood, and indexed by Google is by putting it in the HTML (see the sketch after this list).
  • Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler
  • Forgetting to link to a primary page on your website through your navigation — remember, links are the paths crawlers follow to new pages!
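Here is a hedged sketch of the difference; the URLs and labels are hypothetical:

  <!-- A plain HTML link that crawlers can reliably discover and follow -->
  <nav>
    <a href="/products/">Products</a>
  </nav>

  <!-- A menu item rendered only through JavaScript, which is riskier for discovery -->
  <span onclick="window.location='/products/'">Products</span>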

This is why it's essential that your website has clear navigation and helpful URL folder structures.

Do you have clean information architecture?

Information architecture is the practice of organizing and labeling content on a website to improve efficiency and findability for users. The best information architecture is intuitive, meaning that users shouldn't have to think very hard to flow through your website or to find something.

Are you utilizing sitemaps?

A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. One of the easiest ways to ensure Google is finding your highest priority pages is to create a file that meets Google's standards and submit it through Google Search Console. While submitting a sitemap doesn’t replace the need for good site navigation, it can certainly help crawlers follow a path to all of your important pages.

Ensure that you've only included URLs that you want indexed by search engines, and be sure to give crawlers consistent directions. For example, don't include a URL in your sitemap if you've blocked it via robots.txt, and don't include URLs that are duplicates rather than the preferred, canonical version (we'll provide more information on canonicalization in Chapter 5!).
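For reference, a minimal sketch of an XML sitemap following the sitemaps.org protocol; the URL and date are hypothetical:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://yourdomain.com/important-page/</loc>
      <lastmod>2019-06-04</lastmod>
    </url>
  </urlset>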

Learn more about XML sitemaps 
If your site doesn't have any other sites linking to it, you still might be able to get it indexed by submitting your XML sitemap in Google Search Console. There's no guarantee they'll include a submitted URL in their index, but it's worth a try!

Are crawlers getting errors when they try to access your URLs?

In the process of crawling the URLs on your site, a crawler may encounter errors. You can go to Google Search Console’s “Crawl Errors” report to detect URLs on which this might be happening - this report will show you server errors and not found errors. Server log files can also show you this, as well as a treasure trove of other information such as crawl frequency, but because accessing and dissecting server log files is a more advanced tactic, we won’t discuss it at length in the Beginner’s Guide, although you can learn more about it here.

Before you can do anything meaningful with the crawl error report, it’s important to understand server errors and "not found" errors.

4xx Codes: When search engine crawlers can’t access your content due to a client error

4xx errors are client errors, meaning the requested URL contains bad syntax or cannot be fulfilled. One of the most common 4xx errors is the “404 – not found” error. These might occur because of a URL typo, deleted page, or broken redirect, just to name a few examples. When search engines hit a 404, they can’t access the URL. When users hit a 404, they can get frustrated and leave.

5xx Codes: When search engine crawlers can’t access your content due to a server error

5xx errors are server errors, meaning the server the web page is located on failed to fulfill the searcher or search engine’s request to access the page. In Google Search Console’s “Crawl Error” report, there is a tab dedicated to these errors. These typically happen because the request for the URL timed out, so Googlebot abandoned the request. View Google’s documentation to learn more about fixing server connectivity issues. 

Thankfully, there is a way to tell both searchers and search engines that your page has moved — the 301 (permanent) redirect.

Create custom 404 pages!

Customize your 404 pages by adding in links to important pages on your site, a site search feature, and even contact information. This should make it less likely that visitors will bounce off your site when they hit a 404.

Say you move a page from example.com/young-dogs/ to example.com/puppies/. Search engines and users need a bridge to cross from the old URL to the new. That bridge is a 301 redirect.
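On an Apache server, for example, a 301 can be set up with a single line in the site's configuration or .htaccess file; this is a sketch using the hypothetical paths above, and other servers and CMSs have equivalent settings:

  Redirect 301 /young-dogs/ /puppies/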

When you do implement a 301:

  • Link equity: Transfers link equity from the page's old location to the new URL.
  • Indexing: Helps Google find and index the new version of the page.
  • User experience: Ensures users find the page they're looking for.

When you don't implement a 301:

  • Link equity: The authority from the previous URL is not passed on to the new version of the URL.
  • Indexing: The presence of 404 errors on your site alone doesn't harm search performance, but letting ranking / trafficked pages 404 can result in them falling out of the index, with rankings and traffic going with them — yikes!
  • User experience: Allowing your visitors to click on dead links will take them to error pages instead of the intended page, which can be frustrating.

The 301 status code itself means that the page has permanently moved to a new location, so avoid redirecting URLs to irrelevant pages — URLs where the old URL’s content doesn’t actually live. If a page is ranking for a query and you 301 it to a URL with different content, it might drop in rank position because the content that made it relevant to that particular query isn't there anymore. 301s are powerful — move URLs responsibly!

You also have the option of 302 redirecting a page, but this should be reserved for temporary moves and in cases where passing link equity isn’t as big of a concern. 302s are kind of like a road detour. You're temporarily siphoning traffic through a certain route, but it won't be like that forever.

Watch out for redirect chains!

It can be difficult for Googlebot to reach your page if it has to go through multiple redirects. Google calls these “redirect chains” and they recommend limiting them as much as possible. If you redirect example.com/1 to example.com/2, then later decide to redirect it to example.com/3, it’s best to eliminate the middleman and simply redirect example.com/1 to example.com/3.
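In server configuration terms, collapsing a chain might look like this sketch, again assuming Apache and the hypothetical paths above:

  # Before: a redirect chain (/1 hops to /2, then /2 hops to /3)
  Redirect 301 /1 /2
  Redirect 301 /2 /3

  # After: point the original URL straight at the final destination
  Redirect 301 /1 /3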

Once you’ve ensured your site is optimized for crawlability, the next order of business is to make sure it can be indexed.

Indexing: How do search engines interpret and store your pages?

That's right: just because your site can be discovered and crawled by a search engine doesn't necessarily mean that it will be stored in its index. In the previous section on crawling, we discussed how search engines discover your web pages. The index is where your discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page's contents. All of that information is stored in its index.

A robot storing a book in a library.

Read on to learn about how indexing works and how you can make sure your site makes it into this all-important database.

Can I see how a Googlebot crawler sees my pages?

Yes, the cached version of your page will reflect a snapshot of the last time Googlebot crawled it.

Google crawls and caches web pages at different frequencies. More established, well-known sites that post frequently like https://www.nytimes.com will be crawled more frequently than the much-less-famous website for Roger the Mozbot’s side hustle, http://www.rogerlovescupcakes.com (if only it were real…)

You can view what your cached version of a page looks like by clicking the drop-down arrow next to the URL in the SERP and choosing "Cached":

A screenshot of where to see cached results in the SERPs.

You can also view the text-only version of your site to determine if your important content is being crawled and cached effectively.

Are pages ever removed from the index?

Yes, pages can be removed from the index! Some of the main reasons why a URL might be removed include:

  • The URL is returning a "not found" error (4XX) or server error (5XX) – This could be accidental (the page was moved and a 301 redirect was not set up) or intentional (the page was deleted and 404ed in order to get it removed from the index)
  • The URL had a noindex meta tag added – This tag can be added by site owners to instruct the search engine to omit the page from its index.
  • The URL has been manually penalized for violating the search engine’s Webmaster Guidelines and, as a result, was removed from the index.
  • The URL has been blocked from crawling with the addition of a password required before visitors can access the page.

If you believe that a page on your website that was previously in Google’s index is no longer showing up, you can use the URL Inspection tool to learn the status of the page, or use Fetch as Google which has a "Request Indexing" feature to submit individual URLs to the index. (Bonus: GSC’s “fetch” tool also has a “render” option that allows you to see if there are any issues with how Google is interpreting your page).

Tell search engines how to index your site

Robots meta directives

Meta directives (or "meta tags") are instructions you can give to search engines regarding how you want your web page to be treated.

You can tell search engine crawlers things like "do not index this page in search results" or "don’t pass any link equity to any on-page links". These instructions are executed via robots meta tags in the <head> of your HTML pages (most commonly used) or via the X-Robots-Tag in the HTTP header.

Robots meta tag

The robots meta tag can be used within the <head> of the HTML of your webpage. It can exclude all or specific search engines. The following are the most common meta directives, along with the situations in which you might apply them.

index/noindex tells the engines whether the page should be crawled and kept in a search engine's index for retrieval. If you opt to use "noindex," you’re communicating to crawlers that you want the page excluded from search results. By default, search engines assume they can index all pages, so using the "index" value is unnecessary.

  • When you might use: You might opt to mark a page as "noindex" if you’re trying to trim thin pages from Google’s index of your site (ex: user generated profile pages) but you still want them accessible to visitors.

follow/nofollow tells search engines whether links on the page should be followed or nofollowed. “Follow” results in bots following the links on your page and passing link equity through to those URLs. Or, if you elect to employ "nofollow," the search engines will not follow or pass any link equity through to the links on the page. By default, all pages are assumed to have the "follow" attribute.

  • When you might use: nofollow is often used together with noindex when you’re trying to prevent a page from being indexed as well as prevent the crawler from following links on the page.

noarchive is used to restrict search engines from saving a cached copy of the page. By default, the engines will maintain visible copies of all pages they have indexed, accessible to searchers through the cached link in the search results.

  • When you might use: If you run an e-commerce site and your prices change regularly, you might consider the noarchive tag to prevent searchers from seeing outdated pricing.

Here’s an example of a meta robots noindex, nofollow tag:

 <meta name="robots" content="noindex, nofollow" />

This example excludes all search engines from indexing the page and from following any on-page links. If you want to exclude multiple crawlers, like googlebot and bing for example, it’s okay to use multiple robot exclusion tags.
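For instance, a hedged sketch using crawler-specific meta tags; the crawler names shown here are illustrative, so check each engine's documentation for the names it honors:

  <meta name="googlebot" content="noindex, nofollow" />
  <meta name="bingbot" content="noindex, nofollow" />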

Meta directives affect indexing, not crawling

Googlebot needs to crawl your page in order to see its meta directives, so if you’re trying to prevent crawlers from accessing certain pages, meta directives are not the way to do it. Robots tags must be crawled to be respected.

X-Robots-Tag

The X-Robots-Tag is used within the HTTP header of your URL, providing more flexibility and functionality than meta tags if you want to block search engines at scale, because you can use regular expressions, block non-HTML files, and apply sitewide noindex tags.

For example, on an Apache server you could exclude entire folders (like moz.com/no-bake/old-recipes-to-noindex):

 <Files ~ "\/?no\-bake\/.*"> Header set X-Robots-Tag "noindex, nofollow" </Files>

Or specific file types (like PDFs):

 <Files ~ "\.pdf$"> Header set X-Robots-Tag "noindex, nofollow" </Files>

The directives used in a robots meta tag can also be used in an X-Robots-Tag.

For more information on Meta Robot Tags, explore Google’s Robots Meta Tag Specifications.

WordPress tip:

In Dashboard > Settings > Reading, make sure the "Search Engine Visibility" box is not checked. If that box is checked, it blocks search engines from coming to your site via your robots.txt file!

Understanding the different ways you can influence crawling and indexing will help you avoid the common pitfalls that can prevent your important pages from getting found.

Ranking: How do search engines rank URLs?

How do search engines ensure that when someone types a query into the search bar, they get relevant results in return? That process is known as ranking, or the ordering of search results by most relevant to least relevant to a particular query.

An artistic interpretation of ranking, with three dogs sitting pretty on first, second, and third-place pedestals.

To determine relevance, search engines use algorithms, a process or formula by which stored information is retrieved and ordered in meaningful ways. These algorithms have gone through many changes over the years in order to improve the quality of search results. Google, for example, makes algorithm adjustments every day — some of these updates are minor quality tweaks, whereas others are core/broad algorithm updates deployed to tackle a specific issue, like Penguin to tackle link spam. Check out our Google Algorithm Change History for a list of both confirmed and unconfirmed Google updates going back to the year 2000.

Why does the algorithm change so often? Is Google just trying to keep us on our toes? While Google doesn’t always reveal specifics as to why they do what they do, we do know that Google’s aim when making algorithm adjustments is to improve overall search quality. That’s why, in response to algorithm update questions, Google will answer with something along the lines of: "We’re making quality updates all the time." This means that, if your site suffered after an algorithm adjustment, you should compare it against Google’s Quality Guidelines or Search Quality Rater Guidelines; both are very telling in terms of what search engines want.

What do search engines want?

Search engines have always wanted the same thing: to provide useful answers to searchers' questions in the most helpful formats. If that’s true, then why does it appear that SEO is different now than in years past?

Think about it in terms of someone learning a new language.

At first, their understanding of the language is very rudimentary — “See Spot Run.” Over time, their understanding starts to deepen, and they learn semantics — the meaning behind language and the relationship between words and phrases. Eventually, with enough practice, the student knows the language well enough to even understand nuance, and is able to provide answers to even vague or incomplete questions.

When search engines were just beginning to learn our language, it was much easier to game the system by using tricks and tactics that actually go against quality guidelines. Take keyword stuffing, for example. If you wanted to rank for a particular keyword like “funny jokes,” you might add the words “funny jokes” a bunch of times onto your page, and make it bold, in hopes of boosting your ranking for that term:

Welcome to funny jokes! We tell the funniest jokes in the world. Funny jokes are fun and crazy. Your funny joke awaits. Sit back and read funny jokes because funny jokes can make you happy and funnier. Some funny favorite funny jokes.

This tactic made for terrible user experiences, and instead of laughing at funny jokes, people were bombarded by annoying, hard-to-read text. It may have worked in the past, but this is never what search engines wanted.

The role links play in SEO

When we talk about links, we could mean two things. Backlinks or "inbound links" are links from other websites that point to your website, while internal links are links on your own site that point to your other pages (on the same site).

A depiction of how inbound links and internal links work.

Links have historically played a big role in SEO. Very early on, search engines needed help figuring out which URLs were more trustworthy than others to help them determine how to rank search results. Calculating the number of links pointing to any given site helped them do this.

Backlinks work very similarly to real-life WoM (Word-of-Mouth) referrals. Let’s take a hypothetical coffee shop, Jenny’s Coffee, as an example:

  • Referrals from others = good sign of authority
    • Example: Many different people have all told you that Jenny’s Coffee is the best in town
  • Referrals from yourself = biased, so not a good sign of authority
    • Example: Jenny claims that Jenny’s Coffee is the best in town
  • Referrals from irrelevant or low-quality sources = not a good sign of authority and could even get you flagged for spam
    • Example: Jenny paid to have people who have never visited her coffee shop tell others how good it is.
  • No referrals = unclear authority
    • Example: Jenny’s Coffee might be good, but you’ve been unable to find anyone who has an opinion so you can’t be sure.

This is why PageRank was created. PageRank (part of Google's core algorithm) is a link analysis algorithm named after one of Google's founders, Larry Page. PageRank estimates the importance of a web page by measuring the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will have earned.

The more natural backlinks you have from high-authority (trusted) websites, the better your odds are to rank higher within search results.

The role content plays in SEO

There would be no point to links if they didn’t direct searchers to something. That something is content! Content is more than just words; it’s anything meant to be consumed by searchers — there’s video content, image content, and of course, text. If search engines are answer machines, content is the means by which the engines deliver those answers.

Any time someone performs a search, there are thousands of possible results, so how do search engines decide which pages the searcher is going to find valuable? A big part of determining where your page will rank for a given query is how well the content on your page matches the query’s intent. In other words, does this page match the words that were searched and help fulfill the task the searcher was trying to accomplish?

Because of this focus on user satisfaction and task accomplishment, there are no strict benchmarks on how long your content should be, how many times it should contain a keyword, or what you put in your header tags. All of those can play a role in how well a page performs in search, but the focus should be on the users who will be reading the content.

Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher’s intent), and RankBrain.

What is RankBrain?

RankBrain is the machine learning component of Google’s core algorithm. Machine learning is a computer program that continues to improve its predictions over time through new observations and training data. In other words, it’s always learning, and because it’s always learning, search results should be constantly improving.

For example, if RankBrain notices a lower-ranking URL providing a better result to users than the higher-ranking URLs, you can bet that RankBrain will adjust those results, moving the more relevant result higher and demoting the less relevant pages as a byproduct.

An image showing how results can change and are volatile enough to show different rankings even hours later.

Like most things with the search engine, we don’t know exactly what comprises RankBrain, but apparently, neither do the folks at Google.

What does this mean for SEOs?

Because Google will continue leveraging RankBrain to promote the most relevant, helpful content, we need to focus on fulfilling searcher intent more than ever before. Provide the best possible information and experience for searchers who might land on your page, and you’ve taken a big first step to performing well in a RankBrain world.

Engagement metrics: correlation, causation, or both?

With Google rankings, engagement metrics are most likely part correlation and part causation.

When we say engagement metrics, we mean data that represents how searchers interact with your site from search results. This includes things like:

  • Clicks (visits from search)
  • Time on page (amount of time the visitor spent on a page before leaving it)
  • Bounce rate (the percentage of all website sessions where users viewed only one page)
  • Pogo-sticking (clicking on an organic result and then quickly returning to the SERP to choose another result)

Many tests, including Moz’s own ranking factor survey, have indicated that engagement metrics correlate with higher ranking, but causation has been hotly debated. Are good engagement metrics just indicative of highly ranked sites? Or are sites ranked highly because they possess good engagement metrics?

What Google has said

While they’ve never used the term “direct ranking signal,” Google has been clear that they absolutely use click data to modify the SERP for particular queries.

According to Google’s former Chief of Search Quality, Udi Manber:

“The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we’ll switch it.”

Another comment from former Google engineer Edmond Lau corroborates this:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. The actual mechanics of how click data is used is often proprietary, but Google makes it obvious that it uses click data with its patents on systems like rank-adjusted content items.”

Because Google needs to maintain and improve search quality, it seems inevitable that engagement metrics are more than correlation, but it would appear that Google falls short of calling engagement metrics a “ranking signal” because those metrics are used to improve search quality, and the rank of individual URLs is just a byproduct of that.

What tests have confirmed

Various tests have confirmed that Google will adjust SERP order in response to searcher engagement:

  • Rand Fishkin’s 2014 test resulted in a #7 result moving up to the #1 spot after getting around 200 people to click on the URL from the SERP. Interestingly, ranking improvement seemed to be isolated to the location of the people who visited the link. The rank position spiked in the US, where many participants were located, whereas it remained lower on the page in Google Canada, Google Australia, etc.
  • Larry Kim’s comparison of top pages and their average dwell time pre- and post-RankBrain seemed to indicate that the machine-learning component of Google’s algorithm demotes the rank position of pages that people don’t spend as much time on.
  • Darren Shaw’s testing has shown user behavior’s impact on local search and map pack results as well.

Since user engagement metrics are clearly used to adjust the SERPs for quality, and rank position changes as a byproduct, it’s safe to say that SEOs should optimize for engagement. Engagement doesn’t change the objective quality of your web page, but rather your value to searchers relative to other results for that query. That’s why, after no changes to your page or its backlinks, it could decline in rankings if searchers' behavior indicates they like other pages better.

In terms of ranking web pages, engagement metrics act like a fact-checker. Objective factors such as links and content first rank the page, then engagement metrics help Google adjust if they didn’t get it right.

The evolution of search results

Back when search engines lacked a lot of the sophistication they have today, the term “10 blue links” was coined to describe the flat structure of the SERP. Any time a search was performed, Google would return a page with 10 organic results, each in the same format.

A screenshot of what a 10-blue-links SERP looks like.

In this search landscape, holding the #1 spot was the holy grail of SEO. But then something happened. Google began adding results in new formats on its search result pages, called SERP features. Some of these SERP features include:

  • Paid advertisements
  • Featured snippets
  • People Also Ask boxes
  • Local (map) pack
  • Knowledge panel
  • Sitelinks

And Google is adding new ones all the time. They even experimented with “zero-result SERPs,” a phenomenon where only one result from the Knowledge Graph was displayed on the SERP with no results below it except for an option to “view more results.”

The addition of these features caused some initial panic for two main reasons. For one, many of these features caused organic results to be pushed down further on the SERP. Another byproduct is that fewer searchers are clicking on the organic results since more queries are being answered on the SERP itself.

So why would Google do this? It all goes back to the search experience. User behavior indicates that some queries are better satisfied by different content formats. Notice how the different types of SERP features match the different types of query intents.

Query intent: Possible SERP feature triggered

  • Informational: Featured snippet
  • Informational with one answer: Knowledge Graph / instant answer
  • Local: Map pack
  • Transactional: Shopping

We’ll talk more about intent in Chapter 3, but for now, it’s important to know that answers can be delivered to searchers in a wide array of formats, and how you structure your content can impact the format in which it appears in search.

Localized search

A search engine like Google has its own proprietary index of local business listings, from which it creates local search results.

If you are performing local SEO work for a business that has a physical location customers can visit (ex: dentist) or for a business that travels to visit their customers (ex: plumber), make sure that you claim, verify, and optimize a free Google My Business Listing.

When it comes to localized search results, Google uses three main factors to determine the ranking:

  1. Relevance
  2. Distance
  3. Prominence

Relevance

Relevance is how well a local business matches what the searcher is looking for. To ensure that the business is doing everything it can to be relevant to searchers, make sure the business’ information is thoroughly and accurately filled out.

Distance

Google uses your geo-location to better serve your local results. Local search results are extremely sensitive to proximity, which refers to the location of the searcher and/or the location specified in the query (if the searcher included one).

Organic search results are sensitive to a searcher's location, though seldom as pronounced as in local pack results.

Prominence

With prominence as a factor, Google is looking to reward businesses that are well-known in the real world. In addition to a business’ offline prominence, Google also looks to some online factors to determine the local ranking, such as:

Reviews

The number of Google reviews a local business receives, and the sentiment of those reviews, have a notable impact on their ability to rank in local results.

Citations

A "business citation" or "business listing" is a web-based reference to a local business' "NAP" (name, address, phone number) on a localized platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).

Local rankings are influenced by the number and consistency of local business citations. Google pulls data from a wide variety of sources to continuously build its local business index. When Google finds multiple consistent references to a business's name, location, and phone number, it strengthens Google's "trust" in the validity of that data. This then leads to Google being able to show the business with a higher degree of confidence. Google also uses information from other sources on the web, such as links and articles.

Organic ranking

SEO best practices also apply to local SEO, since Google also considers a website’s position in organic search results when determining local ranking.

In the next chapter, you’ll learn on-page best practices that will help Google and users better understand your content.

[Bonus!] Local engagement

Although not listed by Google as a local ranking factor, the role of engagement is only going to increase as time goes on. Google continues to enrich local results by incorporating real-world data like popular times to visit and average length of visits, and even provides searchers with the ability to ask the business questions!

A screenshot of the Questions & Answers result in local search.

Curious about a certain local business' citation accuracy? Moz has a free tool that can help out, aptly named Check Listing.

Undoubtedly, now more than ever before, local results are being influenced by real-world data. This interactivity reflects how searchers interact with and respond to local businesses, rather than relying on purely static (and game-able) information like links and citations.

Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real-time engagement metrics to determine quality and relevance.

You don’t have to know the ins and outs of Google's algorithm (that remains a mystery!), but by now you should have a great baseline knowledge of how the search engine finds, interprets, stores, and ranks content. Armed with that knowledge, let's learn about choosing the keywords your content will target in Chapter 3 (Keyword Research)!

 


Now more than ever, marketing experts are improving their marketing strategy with fewer resources, and they are shifting marketing budgets from traditional to digital tactics like search engine optimization and social media. Too often, companies omit social media from their SEO strategy, which is a grave mistake. A study conducted by Ascend2 indicates that companies that integrate social media most strongly into their SEO now produce the best results, and vice versa. Companies that consider themselves "very successful" at search engine optimization are integrating social media into their strategy, whereas companies that are "not successful" at search engine optimization are not.

See the graph below:

A graph of SEO and social media integration from the Ascend2 study.

In the above graph, companies with successful SEO are in blue, while those with an inferior SEO strategy are in amber. You can see that 38% of those doing very well with search engine optimization were also extensively integrating social media. A full 50% of those doing poorly at search engine optimization were not integrating social media at all in their strategy. This graph shows that companies that are succeeding in search engine optimization today are including social in their strategy.

SEO is much more than just ranking high in Google. It is a multi-disciplinary, comprehensive approach to website optimization that ensures potential customers who come to your website will have an excellent experience, easily find what they are looking for, and have an easy time sharing your high-quality content. The combination of SEO and social media platforms such as YouTube, Facebook, Google+, Twitter, LinkedIn, and Pinterest can be overwhelming for big as well as small business marketers. Until recently, search engine optimization and social media marketing were thought of as two very different things, but they are actually two sides of the same coin. Consider the social network growth statistics below:

  • YouTube hosts nearly 14 billion videos. Source: comScore
  • Google sites handle about 100 billion searches each month. Source: SEL
  • Facebook now has over 1 billion users. Source: Mark Zuckerberg
  • Twitter has over 550 million accounts. Source: Statistics Brain
  • Google+ has over 500 million users. Source: Google
  • LinkedIn is at 225 million users. Source: LinkedIn
  • Pinterest grew 4,377% in 2012 and continues to expand, reaching 25 million users. Source: TechCrunch

The following statistics show how helpful social media is for effective search engine optimization:

  • 94% increase in CTR (click-through rate) when search and social media are used together. Source: eMarketer
  • 50% of consumers use a combination of search and social media to make purchase decisions. Source: Inc
  • Consumers who use social media (vs. people who don’t) are 50% more likely to use search. Source: Scribd
  • Websites with a Google+ business page yield a 15% rise in search rank. Source: Open Forum

With these statistics in mind, we can say that social media can be a primary engine for promoting new content and can take your website from zero visibility to a strong performing position almost overnight. Two factors play a vital role in enhancing SEO through social media platforms: social signals and natural link building. I have explained these two factors in detail below:

What’s Your Social Signal?

Social signals are signals to search engines that your content or information is valuable. Every time someone likes, shares, tweets, or +1's content about your brand, especially a link, they are sending a social signal, and the more social signals you earn, the better your chances of ranking high on search engine results pages. Many researchers have found that social shares are quite valuable when it comes to building your website's authority. Here is the latest research from Searchmetrics, highlighting which social signals correlate to rankings on Google:

A Searchmetrics chart of the social signals that correlate with Google rankings.

Note that 7 out of the top 9 factors are social signals. It's clear that social signals can have a huge impact on your search rankings, especially social signals from Google+. If you do not have time to leverage all of the social networking sites, make sure that Google+ is one of the few you do use, because it will play the biggest part in increasing your rankings on search engines. The top social signals that Google is tracking on your website are listed below:

Google+

Google+ is a fledgling community compared to social networking giants like Facebook and Twitter, but its social signals have the most impact on search ranking results. Some factors that you should look at are:

Amount of +1s: Start distinguishing between +1s to your website in general and +1s to each piece of your content. You should increase +1s to your brand and your authorship profile. This also applies to +1s on Local+ pages.

Authority of +1s: If your profile or brand gets more +1s, you will rank higher and more easily for the future content you produce.

Growth rate of +1s: Plan a strategy that will increase your +1s steadily over an extended period of time.

Amount of adds and shares: How many people are following and sharing your content says a lot about how authoritative you are.

Authority of adds and shares: Who is following you is also important. A network of people with strong profiles helps you establish a voice.

Facebook

The king of social networking sites, Facebook has an active community of over 900 million users, which makes it a perfect platform for generating social signals. Various studies have shown that Facebook influences search rankings more than Google+ or Twitter. Some factors that you should look at are:

Amount of shares and likes: Remember that "shares" carry more weight than "likes".

Amount of comments: The collective amount of likes, shares, and comments correlates most closely with search ranking.

Twitter

Twitter is second only to Facebook and boasts 500 million users who are constantly "tweeting" status updates and events in real time. Twitter users, known as "tweeps", put more of a premium on a tweet's authority than on sheer volume, though the overall social signal it generates lags just a little behind Facebook. On Twitter, you should look at factors like:

  • Authority of followers, mentions, and retweets
  • Number of followers, mentions, and retweets
  • Speed and intensity of tweets and RT over time

Other social websites like Pinterest, Reddit, Digg, StumbleUpon, and FourSquare

The big three, i.e. Facebook, Twitter, and Google+, play a very important role when it comes to social ranking factors, but you should not ignore the potential of other user-driven social websites like Pinterest, Reddit, Digg, StumbleUpon, and FourSquare. On these social networking sites, you should look at the following factors:

  • Amount of Pins and re-pins on Pinterest
  • Comments on Pinterest
  • Growth rate of Pins and Re-pins
  • Check-ins on Foursquare
  • Spread rate of check-ins at FourSquare
  • Upvotes on Reddit, Digg, StumbleUpon
  • Comments on Reddit, Digg, StumbleUpon

Link Development through Social Media

The traditional ways of link building, like en-masse link directories, spammy comments, forum posts made for the sake of links, and anchor text sculpting, are over now. In the modern era, the most powerful way to build links is an effective content marketing strategy. People love informative, quality content, and they love sharing it. Social media sites are among the best platforms for content marketing, which makes them quite important for natural link development.

How to build natural and quality links through Social Media Platforms

Two tactics will help you immensely in earning quality, natural links through social media platforms:

Link-building through interaction and community engagement

If you’re link building but never building relationships or interacting with people, you’re not really link building: you are spamming. If you interact with people who might care about your brand, you can gain an edge over your competitors. Meaningful interactions with the audience in your niche prove your credibility and will lead to more authoritative links.

You can also earn links by interacting with popular sites or brands when they post to their Facebook page, make a Google+ post, launch a new blog post, or put up a new video on YouTube. In this case, I recommend interacting early and often: being among the first five or ten comments, interactions, or engagements really helps you get seen by the editors, who are almost always watching. When you interact, make sure you are adding value; that is what makes you stand out in the comments. You can add value by doing a bit of detailed research and by making the conversation more interesting. By posting great comments, you will create interest among target customers, and they will often click through to your profile, which can quietly earn you some links. In addition, you can offer help to other people, even without being asked. This is a great way to drive links back to your own site, and you can do it not just on blog posts, but on Google+ posts, Facebook pages, and YouTube comments.

Link building through quality content

In addition to gaining links from popular sites, you can also earn links by posting quality, linkable content on social media platforms. If you create content that people find valuable and informative, they are more likely to want to share it. What people find valuable can vary, but high-quality blog posts and infographics that provide well-researched information, statistics, and new angles on a subject are all good starting points. A good, informative video that attracts viewers' attention is eminently shareable, which is one reason nearly 87% of agency and brand marketers are now creating video for content marketing. When someone reads your quality, informative content on social media sites and finds it valuable, it is more likely that they will want to link to it.


In order to give your informative content the best chance of reaching a wide audience, you should identify the key influencers or target audience in your field. In this way, you will be able to target your efforts effectively. Facebook and Twitter are the two go-to social media platforms for most people but you should also seek out targets on other platforms such as Pinterest, YouTube, and Tumblr. In addition to this, if you are marketing within specific regions, you might want to channel your efforts to the most popular websites in each market. For example, VK is the preferred social media website in Russia, while Orkut can help extend your reach within Brazil and India.

You can also use various tools and services to help you find the best targets. For example, Followerwonk offers a Twitter analytics service that can help you compare and sort followers by looking at data such as social authority scores and the percentage of URLs. You can also gauge reactions to your own tweets by monitoring your activity alongside current follower numbers. Fresh Web Explorer is another handy tool: it searches for mentions of your brand, company, or other keywords and automatically matches them with 'feed authority'. In this way, you can sort key influencers from those with less perceived authority, which will allow you to target your efforts more effectively.

Now, it is clear that social media is an essential part of search engine optimization. The following diagram shows a blueprint of how social media supports SEO:

A diagram of how social media supports SEO.

Quality Content Gets Published: One of the best ways to increase quality traffic to your website is to publish shareable, useful, and relevant content on social media sites.

Content Gets Shares, Links, & Likes: As you start publishing your company’s blog posts or research work on a regular basis and spreading it across the social networking sites, your content will start generating shares, links, and "likes".

Sites Gain Subscriptions while Social Profiles Get Fans & Followers: As a result, your site’s blog will gain more subscribers and your social media channels will gain more followers, fans, and connections.

Thriving Community Supporting the Website & Social Networks Grows: A community of people who are interested in your user-focused content develops and starts to thrive.

Reputation Reinforced through Social Media & SEO as Authoritative Brand for the Niche: Signals are sent to search engines about your activity on social media platforms and your keyword-rich, informative content. Your website starts being viewed as reputable, relevant, and authoritative.

Sites Gain Authority in Search Engines: As a result, your website and its informative, quality content start appearing higher and more frequently in the top rankings and listings of search engines for your keyword phrases and targeted keywords.

Sustainable Stream of Users Discover the Site Organically: A consistently growing stream of users will begin discovering the website via social media sites, search engines, and your email marketing efforts.

I have explained how aligning SEO and social media efforts can really enhance your SEO performance. To execute this effectively, you might even want to hire experienced SEO experts. Either way, make sure that your social media and SEO teams are working together to create a unified digital marketing strategy.

Source: This article was originally published on problogger.com by a guest blogger.

Categorized in How to

Got great video content that no one's seeing? Columnist Tony Edward provides a comprehensive guide to optimizing your YouTube presence.

YouTube is arguably the second largest search engine on the Web. It is the third most visited site on the Web, according to Alexa and SimilarWeb. Recent information released by Google has shown that more and more users are using YouTube as a search engine. Searches related to “how to” on YouTube are growing 70% year over year. It is also no secret that video content is more engaging than a page of text and can be much more informative. YouTube’s popularity and reach are also expanded by its inclusion in both Google Web and Video search.

YouTube videos in Google Web Search and Google Video Search

Google weeded out the video competition in Web search by predominantly displaying only video-rich snippets for YouTube videos back in 2014. Here is a graph outlining the percentage share of video-rich snippets in Google.

Wistia Video Snippet Share Google

Source: Wistia

Google also made a surprising update to Google Trends recently by including YouTube trending topics in the tool. This shows that YouTube search traffic is significant enough that Google needed to incorporate it to paint a better picture of trending topics and stories across the Web.

YouTube trending topics in Google Trends

So it is very important to have a presence on YouTube to expand your marketing reach, build your brand and drive traffic back to your website. Not only can you rank in YouTube search, but also in Google’s Web and video search.

YouTube Search Result Page Overview

The YouTube search result page is very similar to Google’s SERPs, with paid ads at the top and organic results below.

YouTube Search Result Page

Videos can also be ranked in the related video sections of specific video pages.

YouTube Video Rankings Related Videos

Establishing A Presence On YouTube

Before creating a YouTube channel or videos, you must have a strategy! This strategy is, of course, heavily based upon a solid video content calendar that aligns with company goals. Once your strategy has been developed, you can launch a new channel or optimize an existing channel. Here are the steps you need to take to optimize your YouTube channel.

1. Channel Name, Branding & Vanity URL

It goes without saying that your YouTube channel should be well branded. Your channel name, icon, banner (aka “Art”) and vanity URL should reflect your brand. The channel icon and banner images should be high quality to avoid pixelation.

How to Optimize YouTube Channel

Example of a vanity URL: https://www.youtube.com/user/EliteSEMInc

YouTube recently changed its policy for claiming a vanity URL for your channel. You must now meet the following qualifications to claim a vanity URL:

  • 500 or more subscribers
  • Channel is at least 30 days old
  • Channel has uploaded a photo for the channel icon
  • Channel has uploaded channel art

A channel without a vanity URL will receive an unoptimized URL that is not user-friendly or memorable, so it’s very important to work toward getting a vanity URL. I recently created a YouTube channel for Elite SEM’s CEO Ben Kirshner. His channel does not yet qualify for the vanity URL, and here is what it looks like:

(See Google’s YouTube Help documentation for more details on vanity URL qualifications.)

2. Channel Keywords & Targeted Country

Placing the appropriate keywords in the channel keywords element can help your channel rank higher in YouTube search. Be sure to select keywords that are related to your business and that have search volume. Leverage the Google AdWords keyword planner tool for search volume data. Be sure to select the targeted country you want to rank in.

YouTube Channel Keywords

3. Associated Website

Leverage the associated website feature in your channel settings. Linking your website will help establish your brand authority in the YouTube search results.

YouTube Associated Website

4. Channel Description & Links

A big opportunity to improve the rankability of your channel is to place branded and keyword-targeted content in the description section of your YouTube channel. The more content the better. Be sure to also include links to your website and social profiles to help users easily navigate to your website.

YouTube Channel Description Optimization

5. Channel Homepage

For your YouTube channel homepage you should utilize the featured video feature. This will allow you to highlight a specific video that will automatically play when someone visits the channel homepage. This will help boost engagement and can help you highlight specific information about your business. You should also highlight video playlists on the homepage to help users discover the different video content that you have uploaded. The more video playlists the better.

YouTube Channel Homepage Optimization

6. Links To Your Channel

Place links to your YouTube channel on your website and in your social profiles and emails. This will help increase channel exposure, visits and authority.

YouTube Video Ranking Factors

Video ranking factors are pretty straightforward. You can go about boosting rank in much the same way you would optimize a Web page. Here is a breakdown of the ranking factors:

  • Meta Data. Video titles, descriptions and tags are the core ranking factors. Keyword insertion is very important in all three elements. Similar to Web page title optimization, you should place the primary keywords at the forefront of the video titles. Be sure to include links to your website and social profiles in video descriptions to help users easily navigate to your website.

YouTube video optimization meta data

  • Video Quality. HD videos will rank higher than low-quality videos. YouTube highlights HD videos in search results. HD is a user experience element. Poor quality videos will annoy users, and you will not only lose views and subscribers, you’ll also get dislikes.

YouTube Video Quality

  • Views, Likes, Shares & Links. YouTube video rankings are affected by the number of views, likes (thumbs up) on YouTube, social shares and inbound links. When a video is published on your channel, you should begin distribution to help gain views, likes, shares and links. Here are some ways to distribute your video content:
    • Share on all your social profiles
    • Include in email updates/newsletters to your customers
    • Embed on your website, use as a topic for a blog post, or place in an existing video section of your website
    • Share on social bookmarking sites like Reddit or StumbleUpon
    • If the video is relevant and significant enough, you can conduct outreach to targeted sites, blogs, etc., to gain exposure
    Here is an example of one of Elite SEM’s video link and social share metrics. As you can see, the video has a page authority of 49/100, which is pretty good.

YouTube Video Link and Social Metrics

  • Thumbnail Optimization & Annotations. Utilizing the custom thumbnail feature for videos and annotations can help increase video CTR, views and shares. For each video, you have the option to upload a custom thumbnail. The image should be high quality (640 x 360 pixels minimum, 16:9 aspect ratio), vibrant and eye-catching. Visually compelling imagery will help get your video more clicks and views (a quick size check is sketched after this list). Annotations allow you to highlight text in a video. This text can be additional video notes, calls to action and links to other related videos. You can use this feature to ask users to like and share your video. YouTube recently expanded annotations by adding the Cards feature.

YouTube Video Annotation Optimization

  • Subtitles & Closed Captions. YouTube allows you to add closed captions for videos that have spoken-word content. This feature opens up your content to a larger audience, including deaf or hard of hearing viewers and those who speak languages other than the one spoken in your video. The captions are crawlable by the search engines! This takes your video to the next level from a ranking perspective. By enabling the closed captions feature, you will increase the video’s rankability. Note that the YouTube automatic captions feature is not perfect, and you will have to make corrections. You also have the option to upload a closed caption file; see YouTube’s Help documentation for details.

YouTube Closed Caption Feature

  • Branding. While branding your videos does not directly affect video rankings, it does help increase brand authority and engagement, which can lead to more subscribers, shares and views. Be sure to include a branded intro and outro to your videos. YouTube also offers a watermark feature that allows you to brand watermark all your videos.

YouTube Video Branding Watermark
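The thumbnail dimensions quoted above (at least 640 x 360 pixels, 16:9 aspect ratio) are easy to verify before uploading. Here is a minimal sketch using the Pillow imaging library; the file name is just a placeholder:

from PIL import Image  # pip install Pillow

MIN_WIDTH, MIN_HEIGHT = 640, 360   # minimum size quoted above
TARGET_RATIO = 16 / 9              # YouTube player aspect ratio

def check_thumbnail(path):
    """Return a list of problems found with a candidate thumbnail image."""
    problems = []
    with Image.open(path) as img:
        width, height = img.size
        if width < MIN_WIDTH or height < MIN_HEIGHT:
            problems.append(f"too small: {width}x{height}")
        if abs(width / height - TARGET_RATIO) > 0.01:
            problems.append(f"aspect ratio {width / height:.2f} is not 16:9")
    return problems

# Placeholder file name for illustration.
print(check_thumbnail("video_thumbnail.jpg") or "thumbnail looks fine")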

At the end of the day, it is very important to have quality video content that adds value, solves a problem, engages and meets user needs. Great content will naturally get shared and get links, which will help increase rankings. Spend time working on your video content calendar, and employ the above optimizations to have a successful YouTube presence.

Author : Tony Edward

Source : Searchengineland.com

Categorized in Search Engine

Whitespark, a team of SEO experts focused on local search, has noticed something big: proximity to the searcher is now the #1 ranking factor for local search pack results.

 

What defines a local search? Think of local searches as searches on a search engine for something you’d traditionally look up in the yellow pages, usually for a place you’d like to physically go (e.g. a hotel, an ATM nearby, or a dry cleaner).

A ‘pack’ – usually a 3-pack – is the Google configuration of results that appears at the top of the page for some searches.

According to Whitespark’s tests, proximity in local searches now outweighs Google reviews, linking domains, whether businesses have claimed their Google listings – even whether or not they have a website!

How Can We Tell?

Here’s what one search for a local plumber turned up, along with the local ranking factors for those plumbers.

These are the factors that are normally some of the most important in determining search rank, yet in Whitespark’s data most of the plumbers listed rank pretty poorly for every factor except location.

As Darren Shaw, CEO of Whitespark, points out: “Surely, Google, there are more prominent businesses in [my city] that deserve to rank for this term?”

There probably are.

However, when it comes to local searches, it looks like proximity is now the truest measure of worth from Google’s perspective.

Here’s the evidence from Whitespark, based on 9 different local searches by 4 people spread out across a city.

Browsing in incognito mode in Chrome from 4 dispersed locations around the city, they get 4 very different sets of results for the same search term.
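Whitespark's observation is easy to reason about: if proximity dominates, then simply ordering candidate businesses by straight-line distance from the searcher largely predicts the pack. The sketch below (the coordinates, names and review counts are invented) ranks listings by haversine distance from a searcher's location:

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Invented searcher location and plumber listings.
searcher = (53.5461, -113.4938)
plumbers = [
    {"name": "Plumber A", "lat": 53.550, "lon": -113.490, "reviews": 3},
    {"name": "Plumber B", "lat": 53.600, "lon": -113.600, "reviews": 120},
    {"name": "Plumber C", "lat": 53.540, "lon": -113.500, "reviews": 0},
]

# If proximity dominates, the local pack simply mirrors this ordering,
# regardless of how many reviews each business has.
for p in sorted(plumbers, key=lambda p: haversine_km(*searcher, p["lat"], p["lon"])):
    distance = haversine_km(*searcher, p["lat"], p["lon"])
    print(f"{p['name']}: {distance:.1f} km away, {p['reviews']} reviews")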

The Good News

It’s not all about proximity, though! Local organic listings – the ones below the 3-pack – are relatively unaffected by proximity.

“Generally, localized organic results are consistent no matter where you’re located in a city — which is a strong indication of traditional ranking signals (links, reviews, citations, content, etc) that outweigh proximity when it comes to local organic results,” writes Shaw.

This is good news for hotels, especially those with restaurants, spas and other services that serve the local community as well as guests.

What Should Hotels Do?

Hotels should make their Google My Business profiles as strong as possible, and hotels should be on other sites that provide recommendations (like TripAdvisor).

This proximity factor won’t matter in many cases – most people searching for hotels aren’t looking for hotels in their city, after all. They’re doing research on hotels where they’d like to go. However, there are two important cases where proximity matters:

  • As mentioned above, for your hotel’s restaurant, spa, or other business that serves the local area as well as your guests
  • For last-minute bookers searching for a hotel as they drive into town

For example, when we search for ‘hotels’ on an incognito browser from our office at Net Affinity, this is what we get:

As you can see, those hotel results are pretty much triangulated around our location.

To prepare for those last-minute bookers and those on the hunt for your restaurant, you should make your Google profiles as strong as possible. While proximity is the dominating factor for the 3-pack, you can still aim for that coveted top organic listing.

You should also keep in mind that a lot of people won’t like these results. This move from Google likely won’t be too popular. Often, you’re not searching for the closest restaurant in town, you’re searching for the best. If all Google is showing you is the fast food joint up the street, you’re more likely to turn to sites like TripAdvisor, Yelp, and others.

So, another important thing to do is to diversify and make sure your hotel, restaurant, and spa are on those sites, and that you’re putting time into optimizing those profiles. Don’t force yourself to rely completely on Google.

Conclusion

This is a big deal, but, managed properly, it won’t affect your hotel too much. Local SEO has become more competitive with this change, and you’ll need to compensate by optimizing everything you can. Talk to your digital marketing team if your hotel is likely to be affected!

Additionally, diversify your efforts to TripAdvisor, Yelp, and other popular review and ranking sites in your area.

Author  :  Taylor Smariga

Source : http://www.4hoteliers.com/features/article/10305

Categorized in Search Engine

Google claimed 5 spots in the list of top 10 downloaded apps in 2016, dominating the list which also contains apps from Facebook, Amazon, and Apple.

However, Google didn’t claim the coveted top 1 and 2 spots in the list, which go to the flagship Facebook app and Messenger app respectively. The list follows with YouTube and Google Maps beating out the Google Search app.

Google Play managed to come out ahead of Gmail, while Instagram, Apple Music, and Amazon rounded out the rest of the top 10 in that order.

top apps 2016

Google is the unquestionable leader in search when it comes to smartphone apps. Bing has smartphone apps as well, but as you can see none of them managed to crack the top 10. 

Smartphone adoption is currently at 88% of US mobile subscribers, with the majority (53%) using the Android operating system. A close 45% of smartphone users use iOS, followed by Windows phone at a distant third with 2% market share, and Blackberry coming in last with 1% market share.

top apps of 2016

Data from this report is based on Nielsen’s monthly survey of 30,000-plus mobile subscribers aged 13 and up in the U.S.

Author : Matt Southern

Source : https://www.searchenginejournal.com/google-5-top-10-apps-2016-according-nielsen-rankings-report/182449/

This holiday season, when we Google for the most trending gifts, compare different items on Amazon or take a break to watch a holiday movie on Netflix, we are making use of what might be called “the three R’s” of the Internet Age: rating, ranking and recommending.

Much like the traditional “three R’s” of education – “reading, ’riting and ’rithmetic” – no modern education is complete without understanding how websites’ algorithms combine, process and synthesize information before presenting it to us.

As we explore in our new book, “The Power of Networks: Six Principles that Connect Our Lives,” the three tasks of rating, ranking and recommending are interdependent, though it may not be initially obvious. Before we can rank a set of items, we need some measure by which they can be ordered. This is really a rating of each item’s quality according to some criterion.

With ranked lists in hand, we may turn around and make recommendations about specific items to people who may be interested in purchasing them. This interrelationship highlights the importance of how the quality and attractiveness of an item is quantified into a rating in the first place.

Ranking

What consumers and internet users often call “rating,” tech companies may call “scoring.” This is key to, for example, how Google’s search engine returns high-quality links at the top of its search results, with the most relevant information usually contained in the first page of responses. When a person enters a search query, Google assigns two main scores to each page in its database of trillions, and uses these to generate the order for its results.

The first of these scores is a “relevance score,” a combination of dozens of factors that measure how closely related the page and its content are to the query. For example, it takes into account how prominently placed search keywords are on the result page. The second is an “importance score,” which captures how the network of webpages is connected via hyperlinks to quantify how important each page is.

The combination of these two scores, along with other information, gives a rating for each page, quantifying how useful it might be to the end user. Higher ratings will be placed toward the top of the search results. These are the pages Google is implicitly recommending that the user visit.
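Google does not publish these formulas, but the general idea of blending a query-dependent relevance score with a query-independent, link-based importance score can be illustrated with a toy example. Everything below (the three-page link graph, the relevance numbers and the 50/50 weighting) is invented for illustration:

links = {                      # page -> pages it links to (toy graph)
    "home": ["about", "guide"],
    "about": ["home"],
    "guide": ["home", "about"],
}

def importance_scores(links, damping=0.85, iterations=50):
    """PageRank-style power iteration over the toy link graph."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

relevance = {"home": 0.2, "about": 0.1, "guide": 0.9}   # pretend query-match scores

importance = importance_scores(links)
combined = {p: 0.5 * relevance[p] + 0.5 * importance[p] for p in links}

for page, score in sorted(combined.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{page}: combined={score:.3f} "
          f"(relevance={relevance[page]}, importance={importance[page]:.3f})")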

Rating

The three Rs also pervade online retail. Amazon and other e-commerce sites allow customers to enter reviews for products they have purchased. The star ratings contained in these reviews are usually aggregated into a single number representing customers’ overall opinion. The principle behind this is called “the wisdom of crowds,” the assumption that combining many independent opinions will be more reflective of reality than any single individual’s evaluation.

Key to the wisdom of crowds is that the reviews accurately reflect customers’ experiences, and are not biased or influenced by, say, the manufacturer adding a series of positive assessments to its own items. Amazon has mechanisms in place to screen out these sorts of reviews – for example, by requiring a purchase to have been made from a given account before it can submit a review. Amazon then averages the star ratings for the reviews that remain.

Averaging ratings is fairly straightforward. But it’s more complicated to figure out how to effectively rank products based on those ratings. For example, is an item that has 4.0 stars based on 200 reviews better than one that has 4.5 stars but only from 20 reviews? Both the average rating and sample size need to be accounted for in the ranking score.
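One common way to balance average rating against sample size is a damped (Bayesian-style) average, in which each product's reviews are blended with a catalogue-wide prior. This is not Amazon's published formula; the prior values below are invented purely to show the effect:

def damped_average(avg_rating, num_reviews, prior_mean=3.0, prior_weight=25):
    """Blend a product's average rating with a catalogue-wide prior.

    prior_weight behaves like a number of 'phantom' reviews at prior_mean,
    so products with few reviews get pulled toward the overall mean.
    """
    return (avg_rating * num_reviews + prior_mean * prior_weight) / (num_reviews + prior_weight)

item_a = damped_average(4.0, 200)   # 4.0 stars from 200 reviews
item_b = damped_average(4.5, 20)    # 4.5 stars from only 20 reviews

print(f"item A score: {item_a:.3f}")   # about 3.89
print(f"item B score: {item_b:.3f}")   # about 3.67
# With this particular prior, the item with 200 reviews comes out ahead:
# its larger sample counts for more than the other item's higher raw average.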

There are even more factors that may be taken into consideration, such as reviewer reputation (ratings based on reviewers with higher reputations may be trusted more) and rating disparity (products with widely varying ratings may be demoted in the ordering). Amazon may also present products to different users in varying orders based on their browsing history and records of previous purchases on the site.

Recommending

The prime example of recommendation systems is Netflix’s method for determining which movies a user will enjoy. Algorithms predict how each specific user would rate different movies she has not yet seen by looking at the past history of her own ratings and comparing them with those of similar users. The movies with the highest predictions are those that will then make the final cut for a particular user.

The quality of these recommendations depends heavily on the algorithm’s accuracy and its use of machine learning, data mining and the data itself. The more ratings we start with for each user and each movie, the better we can expect the predictions to be.

A simple rating predictor might assign one parameter to each user that captures how lenient or harsh a critic she tends to be. Another parameter might be assigned to each movie, capturing how well-received the movie is relative to others. More sophisticated models will identify similarities among users and movies – so if people who like the kinds of movies you like have given a high rating to a movie you haven’t seen, the system might suggest you’ll like it too.
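A minimal sketch of that baseline idea, using invented ratings: the prediction for an unseen (user, movie) pair is simply the global mean plus the user's bias plus the movie's bias:

from collections import defaultdict

# Invented ratings: (user, movie) -> stars
ratings = {
    ("alice", "Movie X"): 5, ("alice", "Movie Y"): 4,
    ("bob", "Movie X"): 2, ("bob", "Movie Z"): 3,
    ("carol", "Movie Y"): 4, ("carol", "Movie Z"): 5,
}

global_mean = sum(ratings.values()) / len(ratings)

def mean_bias(key_index):
    """Average deviation from the global mean, per user (0) or per movie (1)."""
    sums, counts = defaultdict(float), defaultdict(int)
    for key, stars in ratings.items():
        sums[key[key_index]] += stars - global_mean
        counts[key[key_index]] += 1
    return {k: sums[k] / counts[k] for k in sums}

user_bias, movie_bias = mean_bias(0), mean_bias(1)

def predict(user, movie):
    return global_mean + user_bias.get(user, 0) + movie_bias.get(movie, 0)

# Bob has not rated Movie Y; predict how he would rate it.
print(f"predicted rating for (bob, Movie Y): {predict('bob', 'Movie Y'):.2f}")

Real systems such as the Netflix Prize entries layer latent-factor models on top of this kind of baseline, but the baseline alone already captures the "harsh critic" and "well-received movie" parameters described above.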

This can involve hidden dimensions that underlie user preferences and movie characteristics. It can also involve measuring how the ratings for any given movie have changed over time. If a previously unknown film becomes a cult classic, it might start appearing more in people’s recommendation lists. A key aspect of dealing with several models is combining and tuning them effectively: The algorithm that won the Netflix Prize competition of predicting movie ratings in 2009, for example, was a blend of hundreds of individual algorithms.

This combination of rating, ranking and recommendation algorithms has transformed our daily online activities, far beyond shopping, searching and entertainment. Their interconnection brings us clearer – and sometimes unexpected – insights into what we want and how we get it.

Source : http://theconversation.com/rating-ranking-and-recommending-three-rs-for-the-internet-age-70512

Categorized in Future Trends

Searchmetrics’ annual study of top Google ranking factors undergoes radical format shift to match industry-specific needs and results.

SAN MATEO, Calif., December 13, 2016 ‒ Today’s search results are shifting dramatically to match answers to the perceived intent of a search, as Google and other search engines increasingly employ deep learning techniques to understand the motivation behind a query, according to key findings in a new Searchmetrics study.

Findings from the latest Searchmetrics Ranking Factors study, “Rebooting for Relevance,” suggest marketers face new challenges as Google deemphasizes traditional ranking factors such as collecting more backlinks and employing enough focus keywords in text. “As technical SEO factors become table stakes in online content strategies, marketers in various industries will be forced to adopt new techniques to succeed,” said Marcus Tober, Searchmetrics founder and CTO.

“Google revealed last year that it is turning to sophisticated AI and machine-learning techniques such as RankBrain to help it better understand intent behind the words searchers enter, and to make its results more relevant,” Tober says. “User signals such as how often certain results are clicked and how long people spend on a page help the search engine get a sense of how well searchers’ questions are answered. That allows it to continually refine and improve relevance.”

The findings come from Searchmetrics’ annual study of Google ranking factors, which analysed the top 20 search results for 10,000 keywords on Google.com. The aim of the analysis (carried out every year since 2012) is to identify the key factors that high ranking web pages have in common, providing generalized insights and benchmarks to help marketers, SEO professionals and webmasters.

“The most relevant content ultimately ranks by trying to match user intent - whether a searcher is looking to answer a question quickly, shopping or researching,” Tober says.

“Someone who types ‘who won Superbowl 50?’ wants a single piece of information, while a query like ‘halloween costume ideas’ is probably best served by a series of images,” Tober explains. “A query on ‘how to tie a Windsor knot’ might be best served with video content. Our research suggests Google is getting better at interpreting user intent to show the most relevant content.”

Here are five indications from this year’s study that suggest Google is getting better at showing the most relevant results:

1. High ranking pages are significantly more relevant than those that appear lower

Higher ranking search results are significantly more relevant to the search query than those lower down, according to the study, an indication that Google recognises when content is more relevant, and then gives it a rankings boost. It’s also clear it is not simply based on a crude analysis of the number of times web pages mention keywords that match those entered in the search box.

In this year’s study, Searchmetrics has used Big Data techniques to calculate a Content Relevance score[1], a new factor that assesses the semantic relationship between the words entered in search queries and the content shown in results; in effect, it measures how closely they are related. To make Content Relevance more meaningful, its calculation actually excludes instances of simple keyword matches between search queries and search results.

In general the Content Relevance scores of results positioned near the top are higher, suggesting that Google knows when content is more relevant and then places it more prominently. This rule does not apply to results found in positions 1 and 2, which tend to be reserved for top brand websites - presumably because Google considers content from more recognisable and trusted brands will better serve searchers’ needs than non-brand pages that might have slightly more relevant content. Results with the highest Content Relevance scores appear in results found in positions 3 to 6.

2. Word count is increasing on pages that rank higher, while keyword mentions fall

The number of words on higher ranked pages has been increasing for several years now, and this trend is continuing. According to Searchmetrics, this is because top performing results are more detailed, more holistic (cover more of the important aspects of a topic) and are hence better able to answer search queries.

But interestingly, even as text grows longer, the number of keywords (words that match the search query) on higher ranked pages is not increasing. This is because Google is no longer just trying to reward pages that use more matching keywords with higher rankings; it is trying to interpret the search intention and boosting the content that is most relevant to the query.

In fact, the top 20 results include 20% fewer matching keywords (on average) in the copy than in 2015. Also in 2016, just 53% of the top 20 results have the keyword in the title (compared with 75% in 2015). Less than 40% now have the matching keyword in H1 title tag (usually used in the HTML of web pages to tell search engines what the page is about).

On average, pages appearing in desktop results are a third longer than those appearing in mobile search results.

3. User signals suggest Google increasingly guides searchers to exactly the right result

If Google were presenting precisely the right results to answer searchers’ queries, then more of them would visit those pages, take in what is there and leave without having to look elsewhere (having found exactly what they were looking for).

That is just what seems to be happening. Searchmetrics’ analysis of user signals indicates that bounce rates (when a searcher visits a page and leaves without clicking more pages on the same site) have risen for all positions in the top 20 search results, and for position 1 have gone up to 46% (from 37% in 2014). This is not because more people are bouncing away from pages immediately after finding that the content does not answer their question: time-on-site has also increased significantly over previous years, with people spending around 3 minutes 10 seconds on average when they visit pages listed in the top 20 results.

4. Backlinks: The rise of mobile search is making them less important

The number of backlinks coming into a page from other sites has always been an important common factor among high ranking pages. It still has a strong correlation with pages that rank well. However, it is on a downward trend as other factors such as those related to the content on the page become more important.

As well as the growing importance of content related factors, backlinks are becoming less important because of the rise of mobile search queries: pages viewed on mobile devices are often ‘liked’ or shared but seldom actively linked to.

5. Google shows longer URLs to answer search queries, not just optimised short-URL home and landing pages

Until now, marketers and SEO professionals have been able to use optimization techniques to help their site’s homepage or favored landing pages rank higher. But the study shows that the URL addresses for pages that feature in the top 20 search results are around 15% longer on average than in 2015. Searchmetrics’ hypothesis is that instead of the highly optimised home and landing pages that marketers might prefer to appear in searches (and which tend to have short, tidy URLs), Google is better able to identify and display the precise pages that answer the search intention; these pages are more likely to have longer URLs because they possibly lie buried deeper within websites.

Other important findings include:

  • Technical factors such as loading time, file size, page architecture and mobile friendliness are a prerequisite for good rankings, as these factors help to make web pages easily accessible and easy to consume for both humans and search engines. They lay the foundation for breaking into the top 20 search results with the quality and relevance of the content enabling further rankings.
  • There are significant differences between high ranking content on desktop devices and that which appears on mobile devices. For example, high ranking mobile results tend to have faster page load speeds, smaller file sizes, shorter wordcounts and fewer interactive elements such as menus and buttons.

For marketers and SEO professionals the advice from the study is clear, Tober says:
“Since Google is becoming much more sophisticated about how it interprets search intent and relevance, you also need to work harder and be smarter at understanding and delivering on these areas in the content you put on your websites. You need to use data-driven insights to analyze exactly what searchers are looking for when they type specific queries in the search box, and make sure your content answers all their questions clearly and comprehensively in the most straightforward way – and you need to do it better than your competitors.”

Google’s application of machine learning to evaluate search queries and web content means that the factors it uses to determine search rankings are constantly changing. They are becoming fluid and vary according to the context of the search (is it a travel search? An online shopping search? etc.), and according to the intention behind each individual query. Because of this, Searchmetrics will in future no longer be conducting a single generalized, universally applicable ranking factors study. Instead, in the coming months, it will be publishing a series of industry-specific ranking factors studies focused on verticals such as ecommerce, travel, finance and more.

To download the new Searchmetrics Ranking Factors whitepaper, please visit: 
http://www.searchmetrics.com/knowledge-base/ranking-factors/

[1]Content Relevance is based on measurement methods that use linguistic corpora and the conceptualisation of semantic relationships between words as distances in the form of vectors. For the semantic evaluation of a text, this makes it possible to analyse the keyword and the content separately from one another. We can calculate a content relevance score for a complete text on a certain keyword or topic. The higher the relevance score, the more relevant the content of the analysed landing page for the given search query.
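The core of such a vector-based relevance measure is a similarity calculation between a query vector and a page vector. The sketch below uses tiny hand-made three-dimensional vectors; real systems derive high-dimensional embeddings from large corpora, and this is not Searchmetrics' actual scoring method:

from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hand-made 'embeddings' for a query and two candidate pages.
query_vec = [0.9, 0.1, 0.3]
page_on_topic = [0.8, 0.2, 0.4]    # page that covers the topic well
page_off_topic = [0.1, 0.9, 0.2]   # page about something else

print(f"relevance of on-topic page:  {cosine_similarity(query_vec, page_on_topic):.3f}")
print(f"relevance of off-topic page: {cosine_similarity(query_vec, page_off_topic):.3f}")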

About the study

As in previous years, the study analysed Google US (Google.com) search results for 10,000 keywords and 300,000 websites, featuring in the top 30 positions. For some factors, a more in-depth analysis required the definition of specially-defined keyword sets. The correlations between different factors and the Google search rankings were calculated using Spearman's rank correlation coefficient. To provide maximum context this year’s desktop search analysis has been compared either with the equivalent desktop data from previous years or with the mobile data from 2016.
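Spearman's rank correlation coefficient, mentioned above, can be computed directly with SciPy. The positions and word counts below are invented solely to show the call:

from scipy.stats import spearmanr  # pip install scipy

# Invented data: ranking position (1 = best) and word count for ten result pages.
positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
word_counts = [1800, 1750, 1600, 1500, 1400, 1100, 1200, 900, 950, 700]

rho, p_value = spearmanr(positions, word_counts)
print(f"Spearman correlation: {rho:.2f} (p = {p_value:.4f})")
# A strongly negative rho here would mean longer pages tend to occupy
# better (numerically lower) positions.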

About Searchmetrics

Changing search technology has forced SEO platform providers to up their game. These changes have created an entirely new search paradigm − search and content optimization. And since search engines have put a fence around a lot of their data, SEO platforms need to bring their own rich data to the party − and powerful tools to analyze it.

There’s only one search platform that owns its data: Searchmetrics, the world’s #1 SEO and content performance platform. We don’t rely on data from third parties. Our historical database spans nine years and contains over 250 billion pieces of information, such as keyword rankings, search terms, social links and backlinks. It includes global, mobile and local data covering organic and paid search, as well as social media. We have the largest global reach of any SEO platform, crawling the Web every day in more than 130 countries.

Searchmetrics monitors and reveals the full business available to you online. We provide our customers with a competitive advantage and help them identify new business opportunities by exposing the content consumers are engaging with on industry and competitors’ sites. Our Visibility Score − trusted by reputable media sources such as The New York Times, Bloomberg and The Guardian − reliably indicates your online presence.

We provide the insights our customers need to deliver results. Searchmetrics guides SEOs and content marketers with suggestions for creating content that improves relevance and boosts conversions. It shows the connection between social media links and overall engagement. And its analytics make clear which content performs the best and how an organization’s content performs against its competitors.

With Marcus Tober, one of the top 10 SEO minds in the world, leading Searchmetrics’ product development, we have over 100,000 users worldwide, many of whom are respected brands such as T-Mobile, eBay, Siemens and Symantec. They depend on Searchmetrics and our 12 years of product innovation to maximize their online performance.

More information: www.searchmetrics.com.

Media Contacts:
Cliff Edwards
Searchmetrics Inc.
San Mateo, Calif.
650.730.7091

Uday Radia
CloudNine PR Agency
+44 (0)7940 584161

Source : http://www.realwire.com/releases/Searchmetrics-Ranking-Factors-Rethinking-Search-Results

Categorized in Search Engine

For many websites and businesses Google continues to provide more traffic to their website than any other channel: more organic visitors than paid search, social and even direct traffic.

With that in mind it’s certainly worth keeping up-to-date with what works, what doesn’t work, what Google has said and how to avoid a dreaded penalty.

This article looks at the ranking factors Google has confirmed in the hope of helping to increase your business’ prominence on the web.

Positive Factors:

No one will be surprised to see links make it on the list. Link building has been a big business for many a year now.

Earlier this year, in a Q&A session, Google Search Quality Senior Strategist Andrey Lipattsev confirmed that two of the top three ranking factors were ’content’ and ’links’:

“I can tell you what they are. It is content. And it’s links pointing to your site.”

So what specifically is it about content and links that helps rank your website in Google’s Search Engine Results Pages (SERPs)?

Links

Quality

The quality of your inbound links is a huge factor with Google. A company or website with a small backlink profile can see a huge boost from just one link from an authoritative page on an authoritative website.

Google PageRank declined in importance some years ago. These days I use Majestic’s Trust Flow score, as it gives a better idea of which sites are genuinely authoritative, and of the difference between regular sites and the low-quality, spammy ones.

Quantity

It’s all well and good getting a link from one authoritative site but the more you get the better your site will rank.

This is not to suggest building links for the sake of numbers – I certainly wouldn’t recommend directory links or junk comments just to increase the link count. However, if an authoritative site such as a news outlet linked to you in a recent story, that story may naturally be picked up by other sites, and it also presents you with an opportunity to get links from smaller news sites, publications and blogs covering the same story.

And just because you got a story in a certain publication once doesn’t mean you can’t or shouldn’t go back to them in the future.

Anchor Text

Anchor text used to be the be-all-and-end-all of ranking a website many years ago, as stated in Google’s original algorithm:

“First, anchors often provide more accurate descriptions of web pages than the pages themselves.”

Whilst anchor text isn’t as strong a factor as it was a few years ago, it does still play a big role in ranking websites and webpages. One expects this to lessen over time but for now it’s still important to get some anchor text links to your website.

Don’t overdo it though – a natural backlink profile is made up with a majority of brand name and URL anchor text links.
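One way to sanity-check your own profile is to bucket the anchor texts from a backlink export into brand, URL and keyword anchors and look at the proportions. The brand name, domain and anchor list below are hypothetical, and the matching rules are deliberately crude:

from collections import Counter

BRAND = "acme plumbing"            # hypothetical brand name
DOMAIN = "acmeplumbing.example"    # hypothetical domain

anchors = [
    "Acme Plumbing", "acmeplumbing.example", "https://acmeplumbing.example/",
    "emergency plumber london", "Acme Plumbing", "click here",
]

def bucket(anchor):
    text = anchor.lower()
    if DOMAIN in text:
        return "url"
    if BRAND in text:
        return "brand"
    return "keyword/other"

counts = Counter(bucket(a) for a in anchors)
total = sum(counts.values())
for kind, n in counts.most_common():
    print(f"{kind:>14}: {n} ({n / total:.0%})")
# A natural-looking profile is dominated by the brand and URL buckets.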

Here’s an example of a keyword jumping from page 10 to page 2 off the back of one anchor text link in September 2016:

Screenshot of Authority Labs data showing a keyword jumping from position 99 to 18 and maintaining that ranking

Internal Links

Internal links also play a big role in your rankings. If you have lots of good links pointing to a specific page, or a number of pages, you should consider passing this on to your product pages if they need a boost in the SERPs. Though please do make it user-friendly.
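To see which pages a strong post already links to internally, you can parse it and keep only same-domain links. A minimal sketch using requests and BeautifulSoup; the URL is a placeholder:

from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

PAGE = "https://www.example.com/popular-blog-post/"  # placeholder URL

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
site = urlparse(PAGE).netloc

# Resolve relative links and keep only those pointing at the same domain.
internal_links = sorted({
    urljoin(PAGE, a["href"])
    for a in soup.find_all("a", href=True)
    if urlparse(urljoin(PAGE, a["href"])).netloc == site
})

for link in internal_links:
    print(link)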

Last month we shot up from page 4 to position 7 (page 1) for our own website by adding a couple of internal links from popular, relevant blog posts on our website. Though it has since dropped down to page two:

clip_image004
(Looks like we need some external links pointing to this keyword!)

Content

Andrey Lipattsev also mentioned content as one of the top three factors when Google comes to ranking a site.

Getting the content right on your page can certainly play a big role in where you rank in Google’s SERPs so here are a few things to bear in mind when it comes to content as a ranking factor:

Title Tag

It’s still a very important ranking factor to have your keyword(s) in the title tag. Not only because it weighs heavily when Google is determining where to rank your site for specific searches, but also because it helps attract click-throughs to your website. Someone who has searched your target keyword and sees it in your page title, particularly at the beginning, is more likely to visit your website than if it wasn’t there at all.

We switched focus on our target keywords last month and here is one of the results we achieved just by changing the page title:

 
Screenshot of Authority Labs data showing a keyword jumping from position 54 to 5 after changing page title

Heading Tags

Heading tags also have some weight when it comes to ranking your website and it’s important to get your keyword(s) in the H1 where possible on your pages.

It is recommended that you only use one H1 per page though there is no harm in using H2s, H3s, etc. as well.
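Both of the above checks (is the target keyword in the title, ideally near the front, and is there exactly one H1?) are easy to automate across a list of pages. A minimal sketch with BeautifulSoup, using an inline HTML snippet as a stand-in for a fetched page:

from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Stand-in for a fetched page; in practice you would download the HTML.
html = """
<html><head><title>FX Trading Guide for Beginners | Example Broker</title></head>
<body><h1>FX Trading Guide</h1><h2>Getting started</h2></body></html>
"""
keyword = "fx trading"

soup = BeautifulSoup(html, "html.parser")
title = (soup.title.string or "").strip()
h1_tags = soup.find_all("h1")

offset = title.lower().find(keyword)
print(f"keyword in title: {offset != -1} (character offset {offset})")
print(f"number of H1 tags: {len(h1_tags)}")  # the advice above is exactly one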

Content Length

Since the Google Panda algorithm update back in February, 2011 it became very noticeable how seriously the search engine takes the content on a given page.

Those of you who were working in SEO six or more years ago may remember how you could rank websites and webpages with thin content thanks to their backlinks. It’s not the case these days.

It’s more and more the case now that the top results in Google are in-depth articles. An interesting result I came across for ‘fx trading’ recently is that most of the top organic results are information pages, not the homepage or target page that fx companies would want first-time users to land on:

Google search results for ‘fx trading’

This is a keyword that Google suggests bidding £38.98 ($65 CAD) in AdWords!

Google AdWords suggested bid for ‘fx trading’

URLs

Making sure to include your keyword in the URL slug of your page also helps with ranking. If your keyword is already in the page title as advised then there’s no reason why you won’t have it in your page URL too.

Google continues to bold keywords within your URL that match search queries, which helps your listing stand out:

Bolded keyword matches in URLs in Google’s search results

RankBrain

Completing the top three ranking factors in Google SERPs, a news piece published by Bloomberg last October quoted a Google senior research scientist, Greg Corrado, confirming RankBrain’s importance:

“RankBrain is one of the ’hundreds’ of signals that go into an algorithm that determines what results appear on a Google search page and where they are ranked,” Corrado said. “In the few months it has been deployed, RankBrain has become the third-most important signal contributing to the result of a search query.”

RankBrain is Google’s AI system that helps it process search results to provide more relevant results for its users. It works by guessing which words might have a similar meaning so it can filter the results accordingly.

Other Positive Ranking Factors

Google has confirmed the following ranking factors. Although they are not in the top three we certainly believe these have a big influence on your rankings and should be taken into consideration when optimising your website:

Page Loading Speed

All the way back in April, 2010 Google confirmed that page loading speed was one of their search ranking factors; how quickly a website responds to web requests.

This isn’t just useful for helping your site rank better but for the user experience too. Ever been frustrated by the time it takes a website or webpage to load? Imagine your website loading slowly for first-time users and how that could put them off making a purchase or enquiry to your business.
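Server response time is only one part of what users experience, but it is trivial to spot-check. The sketch below times the HTTP response for a placeholder URL; for full page-load metrics you would turn to a tool such as Google's PageSpeed Insights:

import requests

URL = "https://www.example.com/"  # placeholder URL

response = requests.get(URL, timeout=10)
# .elapsed covers the time from sending the request until the response
# headers arrive, so it approximates server response time rather than
# the full render time a visitor perceives.
print(f"{URL} answered in {response.elapsed.total_seconds():.2f}s "
      f"({len(response.content) / 1024:.0f} KB)")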

Secure Website

In August, 2014 Google confirmed it was starting to use HTTPS as a ranking signal. Since then the web has filled up with https:// websites. How significant a ranking signal it is remains anyone’s guess, but Google claimed two years ago that they were seeing positive results following a test.

A secure website is also a trust signal to users. They are more likely to place an order through your website if you have a secure site than with a non-secure site.
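It is also worth confirming that the plain http:// version of your site redirects to HTTPS. A minimal sketch; the domain is a placeholder:

import requests

URL = "http://www.example.com/"  # placeholder: the non-secure address

response = requests.get(URL, timeout=10, allow_redirects=True)

print(f"final URL: {response.url}")
print(f"served over HTTPS: {response.url.startswith('https://')}")
if response.history:
    print("redirect chain:", " -> ".join(r.url for r in response.history))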

Negative Ranking Factors

As well as confirming factors that help a website rank highly in Google, the search engine giant has also confirmed other factors that may have a negative effect on your positions within the results:

Manual Penalty

Manual penalties are Google’s way of removing or demoting your website or individual webpages. Those who have suffered a manual penalty from Google are notified via a message in Search Console if they have it set up.

These penalties are not related to algorithm additions such as Penguin or Panda but are Google manually judging and punishing websites themselves. This is usually a result of underhand behaviour such as trying to manipulate Google’s SERPs.

In September, Search Engine Watch published a list of 12 well-known businesses that have been hit with a manual penalty from Google over the years. When the Washington Post, WordPress and BBC are being hit by penalties from Google, no website is safe!


Penguin Penalty

Google first introduced the Penguin algorithm update in April, 2012 to devalue the impact of low-quality backlinks. This resulted in some websites and webpages being demoted in Google’s SERPs and some even being kicked out altogether.

For the nearly four-and-a-half years of its existence, websites could only recover from Penguin after both cleaning up their backlink profile and waiting for Google to manually refresh their results. As of 23 September, 2016 Penguin is now real-time meaning you can be hit or recover within days.

It has been questioned whether sites ever fully recover from a Penguin penalty, so it’s certainly worth avoiding any underhand activity that would put you at risk – no one wants their website or business kicked out of Google.

Panda Penalty

The Google Panda update was released in February, 2011 with the aim of hitting sites with low-quality or thin content. This was the start of a big shift with Google providing higher-quality results and not just those with a large number of links.

Shortly after the algorithm was rolled out Google received lots of questions on their support forum, which may have resulted in them releasing a 23-bullet point guide on building high-quality sites.

The quality of content and the length of content for sites ranking at the top of the SERPs for popular keywords has noticeably increased over the past couple of years following the original Panda rollout.

Buying Or Selling Links

Google lists “Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link” on its Link Schemes page.

I’ve certainly heard of more and more websites getting messages from Google about unnatural links within their content. Here’s an example of the message some webmasters have received in their Search Console accounts:

Unnatural links message shown in a Search Console account

Reciprocal Links

Google has stated that “Excessive link exchanges ("Link to me and I'll link to you") or partner pages exclusively for the sake of cross-linking” are to be avoided.

You’re taking a big risk by having a ‘links page’ on your website these days: something that was common more than five years ago was for sites to exchange links with each other.

Article Marketing

Also on the Google Link Schemes page is: “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links”.

At the start of 2014, then head of the Google web spam team Matt Cutts published a blog post on his website announcing the decline in the benefit from guest posting. In his article, Mr Cutts went on to explain how guest posting had turned from something respectable into outright spam, done solely with the intention of increasing a website’s rankings within Google.

Press Releases

Press releases aren’t a no-no, but Google has stated that you should avoid any optimised anchor text links within them.

Submitting a release to the wire with optimised anchor text links is straightforward for Google to pick up.

Directories And Bookmarks

The search engine giant disapproves of ’Low-quality directory and bookmark site links’.

These link building methods remain popular though, perhaps because they’re cheap and used to work. I certainly wouldn’t advise this approach in the year 2016 or beyond.

Widgets

Optimised anchor text links on a widget should also be avoided if you want to avoid the wrath of Google.

Footer Links

Google doesn’t react too kindly to “Widely distributed links in the footers or templates of various sites” either.

Forum Comments

Forum comments used to be a popular link building tactic, but Google now acts negatively on “Forum comments with optimised links in the post or signature.” Presumably the same can be said of optimised anchor text left in comments on websites too.

Interstitials Or Distracting Ads

As of 10 January 2017, Google will start to demote pages that display intrusive interstitials and annoying ads on mobile devices. The theory is that they provide a poorer experience to users.

Whilst this is initially rolling out on mobile you can expect it to negatively affect desktop sites as well later in the year.

DMCA Complaints

Since August, 2012 Google has been lowering websites in their search results that they have received valid copyright removal notices for. Their official word back then was “Sites with high numbers of removal notices may appear lower in our results” but Google now removes pages entirely, though does still display a message to notify the search user:

Copyright removal notice displayed in Google’s search results

Duplicate Content

Google states that: “In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved.” Basically, your site won’t appear in their results if they believe you were intentionally duplicating content to rank.

Google provides plenty of useful documents containing advice and support to give you an idea of how to drive organic traffic to your website, and to help you avoid making bad choices when attempting to rank your website within its search engine.

The Webmaster Support pages also have plenty of useful advice and a link to Google’s popular support forum where webmasters and such try to offer advice and help of their own.

Author:  Barrie Smith

Source:  http://www.searchenginepeople.com/

Categorized in News & Politics

Search, more so than any other tool, plays a pivotal role when it comes to travel bookings. Travellers use Google at almost every stage of travel planning, from destination selection to booking a local taxi. It stands to reason then that Google pays special attention to the travel industry and just last month launched its own travel dashboard, sharing consumer search trends publicly for the first time.

But how can travel agents make the most of the tools made available by Google (some specifically available for the travel industry only), as well as other factors such as seasonality and unplanned world events, to enhance their position on Google and ensure they are standing out from their competitors?

Extensions

One of the most interesting things about this industry is that it is fantastic for showcasing all possible pay-per-click extensions (when a search engine allows businesses to buy their listings in search results). These range from sitelinks that are perfect for highlighting destinations to price extensions that can be used to promote special offers.

Keyword Planner

The keyword planner that Google offers will suggest bids for a range of generic holiday search terms, which indicates a highly competitive market. But it’s worth noting, these estimates can be conservative at best, so you should expect actual cost-per-clicks to be higher in reality, especially around the key booking periods of the year.

This makes it imperative to ensure that pay-per-click accounts are well optimised and bids managed on a regular and seasonal basis to ensure efficiency. It also makes “niche” spotting or exploiting your unique selling points imperative; long-tail expansions (low-volume, obscure, infrequently searched-for keywords) are vital to make campaigns commercially viable.

Search Engine Results Pages

The travel industry is unique when it comes to pay-per-click because there are ad formats available in this vertical that aren’t open to other markets. One that has rolled out to most countries now is the “3 pack” ad, which will suggest three different channels to book a room when a potential customer is searching for a hotel on Google. The “Book a Room” ads are shown below the main hotel information; clicks on these ads direct you to the advertiser’s landing page. Previously these ads were shown in the knowledge card on the right of Google, however after various tests (and probably falling revenue) Google has decided to move these into the heart of the page.

The “3 pack” is only showing on hotel results at the moment. These ads are powered by the Hotel Ads API; they need to be feed-based and work dynamically in order to serve accurate availability and pricing.

Seasonality

Marketers need to be on top of the seasonal trends that are particular to their brand to ensure they drive maximum volume during the periods of highest interest. Forecasting against trends and seasonality for your brand and allocating budgets to capitalise on surges in interest or changes in the weather are an important consideration when working to maximise cost efficiency and return on investment. Seasonal copy is also imperative within the travel industry. In a highly competitive market it is crucial to stand out and the cheapest way to do this is with quirky and seasonal copy.

Be agile

Agility is crucial when it comes to preventing wastage and protecting brand reputation. Global events have the ability to scupper the travel industry in an instant. Natural disasters, airplane crashes, terrorism threats and health crises all have a dramatic effect on searches and bookings. For example, searches around “Turkey” in July would have been driven more by the coup rather than people looking for holidays. In this instance it would have been crucial to remove Turkey-based copy and ads promoting holiday deals to the country or at the very least to have added negative terms such as “coup” or “uprising”.

Source : tornosnews

Categorized in Search Engine

Google has rolled out its new algorithm Google Penguin 4.0. Michael Jenkins explains how these changes will affect your website and what you can do to avoid being penalised.

It’s been a two-year wait for SEO tragics – Google’s anticipated Penguin 4.0 started rolling out over the weekend and while it’s too soon to see the full impact, here’s what you need to know.

What is Google Penguin?

In a nutshell Penguin is the name for a Google algorithm designed to catch websites that are spamming search results. The first iteration launched in 2012, the last update to the algorithm was in 2014 and now Penguin 4.0 landed on the weekend.

Tell me more about Penguin 4.0

Penguin 4.0, dubbed the ‘real time update’, has targeted over-optimised sites. Over-optimisation is two-fold: the first issue is the overuse of keywords that are unnaturally placed; the second is the over-optimisation of link profiles. If you have too many links from external sites pointing to the same keywords on your page, it’s time to update your disavow file before you get penalised.

Moving forward, there’s one thing that’s for certain – use keywords to write for your audience, not for search engine rankings, as you will get found out quicker than ever!

How exactly will sites be affected?

The two key changes are:

  1. You will start to see considerable fluctuations in Google rankings from now on, as real-time updates will occur whenever changes are made to a site.
  2. Penguin 4.0 is more granular and penalises specific pages on a site. In the past it was a domain wide penalty.

Pros

  • Penguin is real-time! When a webmaster updates or optimises their website Google will now recognize these changes quickly; and rankings will change accordingly – no more lag time!
  • Penguin 4.0 penalises competitors’ sites that aren’t doing things by the book and are taking short cuts for short-term rankings. If you have been doing things well and building genuine authority in your marketplace online, then you’re likely to see a positive effect on rankings.

Cons

  • Penguin is real-time. I hear you – I’ve named it as a ‘pro’ but it is also a watch out. You need to ensure your site is being optimized and updated correctly – Google will now notice errors faster than ever that can quickly alter your ranking.
  • SEO is becoming much more sophisticated over time and Google is getting faster at spotting unnatural tactics. Regularly updating your SEO strategy and keeping a constant watch on your website’s backlinks is essential to remain compliant with Penguin 4.0.

How can I make the most out of Penguin 4.0?

Marketers should always keep an eye on backlinks and perform regular checks using the Google disavow tool. The main difference between good and bad backlinks comes down to the quality of the website they are on. Bad backlinks will see your site penalised.
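If a review does turn up hazardous links, the disavow tool accepts a plain-text file in which lines starting with # are comments, a bare URL disavows a single page and a domain: line disavows an entire domain. The domains and URL below are made up; the sketch simply writes such a file:

# Hypothetical domains and URLs judged to be low quality after a backlink review.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

lines = ["# Disavow file generated after backlink review"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow entire domains
lines += bad_urls                               # disavow individual pages

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))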

If you have noticed fluctuations in rankings there are a few steps you can take to help:

  • Clean out hazardous links
  • Review keyword density on site. Is the keyword repetition natural? (A rough density check is sketched after this list.)
  • Create some branded links. The fastest way to do this is through citation building
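As promised in the list above, here is a rough keyword-density check. The page copy and the 5% threshold are invented; the goal is only to flag obviously unnatural repetition, not to chase an exact percentage:

import re

def keyword_density(text, keyword):
    """Share of the page's words accounted for by the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = text.lower().count(keyword.lower())
    return hits * len(keyword.split()) / max(len(words), 1)

# Invented page copy and target keyword.
page_text = """Our emergency plumber service covers the whole city.
Call our emergency plumber team any time; an emergency plumber will
be with you within the hour."""

density = keyword_density(page_text, "emergency plumber")
print(f"keyword density: {density:.1%}")
if density > 0.05:   # arbitrary illustrative threshold
    print("repetition looks unnatural; consider rewriting")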

Watch this space.

Penguin 4.0 has literally just landed, so we’re bound to learn more in the coming weeks as it rolls out. Keep your eye out for more insights.

Source : https://mumbrella.com.au

Categorized in Science & Tech
