
[This article was originally published on searchengineland.com, written by Stephan Spencer - Uploaded by AIRS Member: Anthony Frank]

There’s no perfect method to snagging the top overall search result for every relevant query, but columnist Stephan Spencer believes understanding each element of Google's search listings can give you the best chance for success.

Are you looking to dominate in Google search results?

Your strategy needs to involve more than keyword research and a savvy AdWords campaign. In order to make the most of your Google presence, you need to craft a search result that entices users to click through to your web page. This is a crucial yet often-ignored aspect of SEO.

Believe it or not, small changes to your Google listing can make a big difference when it comes to click-through rate. Here is a detailed guide to better understanding a basic Google search listing.

Title

It’s no secret that page titles can heavily influence user behavior. But did you know that Google doesn’t always show a web page’s title tag? The title that appears in search results can be influenced by several factors. Google looks for titles that are short, descriptive and relevant to the search query. Though it most commonly uses a page’s title tag, Google can also pull the title from page content or from links pointing to the page. Keep your title tag short and relevant to the page so that Google is more likely to display it as-is.

Is your title being cut off in the Google search results? You might need to shorten it. The maximum display width for a title is 600 pixels, which is about 70 characters (78 for mobile); Google truncates anything longer, which is indicated by an ellipsis.

[Image: Anatomy of a Google Search Listing 1]

URL

You may have noticed that Google often omits parts of a URL. Google truncates URLs by removing their middle sections, even when the URL is only one line. Use short but meaningful URLs whenever possible to maximize their impact in the Google SERPs (search engine results pages).

[Image: Anatomy of a Google Search Listing 2]

The URL is often displayed as clickable breadcrumb links. In these instances, Google displays the site’s internal hierarchy, taken from the on-page breadcrumb navigation, when those breadcrumbs are marked up with semantic breadcrumb markup.

[Image: Anatomy of a Google Search Listing 3]
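
As a rough sketch, breadcrumb markup in schema.org JSON-LD looks something like this (the page names and URLs below are placeholders, not from any real site):

    <!-- Illustrative only: names and URLs are placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "SEO", "item": "https://www.example.com/seo/" },
        { "@type": "ListItem", "position": 2, "name": "Google", "item": "https://www.example.com/seo/google/" }
      ]
    }
    </script>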

Google search listings may also include time stamps under their URL. This is a common practice for news publishers, blogs and other sites that wish to bring attention to the freshness of their content and provide the date of publication or date of last update.

[Image: Anatomy of a Google Search Listing 4]

To integrate this, you need to add a time stamp into your page copy. You can provide Google with specific times by adding comment tags through the W3 Total Cache plugin for WordPress, which will appear something like this: Served from user @ 2017-03-03 17:15:25.

You can also manually add a time tag to a page or blog post using structured data markup. Otherwise, Google will use the publication date, which is easy for Google to determine with WordPress blogs.

Here is an example of the structured data HTML markup:

[Image: Anatomy of a Google Search Listing 5]
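
A minimal sketch of such date markup, using schema.org’s Article type in JSON-LD (the headline and dates are placeholder values):

    <!-- Illustrative only: headline and dates are placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Anatomy of a Google Search Listing",
      "datePublished": "2017-03-03T17:15:25+00:00",
      "dateModified": "2017-03-03T17:15:25+00:00"
    }
    </script>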

Cached link

The cached link is a fail-safe in case your website is unavailable; it is a snapshot that Google takes of each page and adds to its cache. It also serves as a backup in case a page is deleted, temporarily down or failing to load.

Google has changed the location of the cached link in recent years. It is now found behind the green down arrow next to the URL.

[Image: Anatomy of a Google Search Listing 6]

The cached link will be missing for sites that have not been indexed, as well as for sites whose owners have requested that Google not cache their content. Owners can block their page from being cached by using a meta-robots “noarchive” tag.
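
For reference, that tag is a single line placed in the page’s <head>. A minimal sketch:

    <!-- Blocks search engines from storing a cached copy of this page -->
    <meta name="robots" content="noarchive">
    <!-- Or, to target only Google's crawler: -->
    <meta name="googlebot" content="noarchive">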

What’s the benefit of doing this? For one thing, it can prevent users from copying your content for redistribution; people can still copy and paste content from a cached page even if you’ve blocked these functions on your site. Sites with paid content often block cached pages to prevent their content from being seen for free. Fortunately for them, whether or not Google caches a page has no bearing on its ranking.

Snippet

A snippet is a description for the page that appears underneath the title. Google can obtain the snippet from either the page’s meta description tag or contextual information on the page. Like titles, the search snippet is based on the query and can be altered by different keyword searches.

For example, in a search for “meta description,” the snippet below is returned for the yoast.com search result.

[Image: Anatomy of a Google Search Listing 7]

Searching for “160 character snippet” in Google returns a very different snippet for a search result for the same page as above.

[Image: Anatomy of a Google Search Listing 8]

Keyword bolding (known by us information retrieval geeks as “Keywords in Context,” or KWIC) is also query-based and will often appear in the snippet, depending on the search term.

[Image: Anatomy of a Google Search Listing 9]

Google currently limits a snippet to around 156 characters per search result (or 141 with a date stamp). The actual limit, in terms of a total pixel width, is 928 pixels (based on 13px Arial). Snippets will be truncated and end with ellipses when they run over this limit.

[Image: Anatomy of a Google Search Listing 10]

Often, Google will choose not to use a meta description in favor of a more relevant snippet. The snippet can come from anywhere on your page (including disparate parts of the page), so it’s important to pay close attention to your content — especially around common keywords.

It’s still worth it to carefully craft a meta description. In many cases, Google will still show a quality meta description for popular searches. What makes it a quality meta description? It’s well-written, includes popular search terms and avoids redundant information, such as repetition of the title tag. Since the snippet is query-based, you need to incorporate popular, relevant search terms into both your meta description and your on-page content.

There are also times when a snippet does not appear. Why does this happen? It’s because that URL is blocked using a disallow in the site’s robots.txt file. In such cases, Google will display a message in the snippet’s place stating, “A description for this result is not available because of this site’s robots.txt.”

[Image: Anatomy of a Google Search Listing 11]

You can prevent this by using noindex instead of a disallow. That way, Google can still crawl the page, but it will not add the page to its search index or display it in the SERPs.
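
To make the difference concrete, here is a quick sketch (the /members/ path is hypothetical). A robots.txt Disallow blocks crawling entirely, which is what produces the message above, while a noindex meta tag lets Google crawl the page but keeps it out of the index:

    # robots.txt: blocks crawling, so Google never sees the page content or a snippet
    User-agent: *
    Disallow: /members/

    <!-- noindex: Google may crawl this page but will not show it in the SERPs -->
    <meta name="robots" content="noindex">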

Conversely, you can opt out of snippets by adding the <meta name="googlebot" content="nosnippet"> tag to your page.

Sitelinks

Google sitelinks are additional sub-listings that appear underneath the first search result. For instance, if a user were to search for “Search engine land,” this is what they would see:

[Image: Anatomy of a Google Search Listing 12]

Sitelinks are intended to help users navigate around websites. In this instance, the user might want to jump to the latest industry news rather than navigating through Search Engine Land’s home page.

You might have noticed a “more results” feature in the above screenshot. This restricts results to indexed pages from that specific site. In this example, the More results from searchengineland.com >> link leads to a refined search of just searchengineland.com pages for the query “Search engine land.” This is accomplished using Google’s site: search operator.

[Image: Anatomy of a Google Search Listing 13]
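
You can reproduce that refined search yourself by combining the site: operator with the query:

    site:searchengineland.com search engine land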

Google allows up to six automated sitelinks, but they are far from guaranteed for poorly optimized sites. Websites with a clear hierarchy and structure and a unique brand name are more likely to have sitelinks. As a result, you’re more likely to see sitelinks appear in search results after typing in a specific brand.

[Image: Anatomy of a Google Search Listing 14]

You’ll notice that in this instance, a search for The New York Times renders both sitelinks and a search box. If you wish to include a search box, you can do so by embedding structured data on your website.
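
The sitelinks search box is driven by schema.org WebSite markup with a SearchAction. Here is a minimal sketch, assuming a site search endpoint at /search?q= (replace the domain and target URL with your own):

    <!-- Illustrative only: the domain and search URL are placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "WebSite",
      "url": "https://www.example.com/",
      "potentialAction": {
        "@type": "SearchAction",
        "target": "https://www.example.com/search?q={search_term_string}",
        "query-input": "required name=search_term_string"
      }
    }
    </script>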

Though the system is automated, the best way to get sitelinks is to reach the top overall position for your website name. A downside to using different domains (or subdomains) in your web strategy is that they won’t be included in the sitelinks. Still, the impact of sitelinks is undeniable. AdWords advertisers with sitelinks see a 20-50 percent boost in click-through rate when the search is a branded term.

Final thoughts

Small changes to a search result can have a big impact on a site’s traffic. Google search is an ever-evolving science, so rules that exist today might not exist tomorrow. For the time being, you can follow this guide to help improve your presence in the Google SERPs.

4 Steps to Make Your Organic Listings More Effective

You’re a savvy digital marketer. You follow Google best practices and read “all the SEO blogs.” You sound like a zookeeper with your extensive knowledge of Pandas, Penguins, Possums and Pigeons. You’re always looking for ways to improve organic search rankings. Instead of investing your time researching some of those gray (or even black) hat tactics that are oh so tempting, I suggest you take a step back and look at the basics of your organic SERP listing.

An area that often gets overlooked by digital marketers is engagement and the click-through rate (CTR) associated with their organic listings. No matter how much you improve your ranking, if your listing itself is not compelling, it’s all for nothing!

Google has not confirmed that CTR is a direct ranking factor, but this slide from a Google engineer at SMX West in March 2016 suggests that click-through rate plays a significant role.

[Image: serp-queries]

Regardless of Google’s ranking algorithm, all digital marketers strive to make organic listings compelling to searchers and enticing to prospects. These recommendations will help you improve organic search results and drive additional qualified traffic.

Step 1: Identify pages with a relatively low click-through rate

In Google Analytics, navigate to Acquisition > Search Console > Landing Pages and export the data into a CSV or Excel document. Identify pages with high Impressions but a relatively low CTR for their Average Position.

AdvancedWebRanking.com has a great study on average CTR by position that you can use as a guide. This analysis will help you create a list of prioritized landing pages to be improved.

Step 2: Find opportunities to expand title tags

One of the best things you can do to increase the CTR for a listing is improve the effectiveness of the Page Title. Back in 2014, Google changed the Title Tag limit to be based on pixel length (estimated to be 512px), which resulted in a significant reduction in organic Title Tag width. In May of this year, SEOs everywhere rejoiced as Google expanded this limit to 600px, a 17 percent increase!

Take advantage of this increased space and the opportunity to include more high-priority keywords (if you haven’t already). An easy way to view your current Meta Tags is to download them from the free Screaming Frog SEO Spider Tool.

The challenge is that the new pixel-based limit is harder to adhere to and more difficult to visualize than a simple character count. For example, a “W” takes up more space than an “l.” It’s all about size now, not number of characters. As you’re improving and expanding your Meta Tags, I recommend using a SERP Preview Tool. This will help you visualize how your listing (URL, Title and Description) will appear on a Google SERP.

After the Google SERP update in May, we noticed that popular SEO tools had not been updated to reflect the new guidelines, so we created our own Google SERP Tool to help SEO experts visualize the new, expanded pixel limits.

Step 3: Make your meta tags more compelling

The best Page Titles are often written like a newspaper headline. They are intriguing, interesting, descriptive, and often evoke emotion. Here is an example of two boring headlines and one compelling/engaging headline that really stands out.

[Image: killer-serp]

Title Tag tips

It’s still crucial to have target keywords in your Title Tag, but don’t ignore the importance of engaging prospects. Optimize for user intent first, and SEO keywords second. Here are a few proven tips for Title Tags:

  • If your web page provides a list of some sort, state the number of items. For example: 17 Delicious Broccoli Recipes Your Kids Will Love
  • Mention if the page includes a video or a presentation. For instance: 10 Reasons Why The New Macbook Stinks w/Video Review
  • Special characters stand out, but don’t go overboard.
  • Mention pricing or sales numbers.
  • Timely/relevant content is key. Provide a date. Example: The 12 Lightest Laptops Available in November 2017
  • Use a free headline analyzer such as: http://coschedule.com/headline-analyzer

Description tips

Don’t forget to have a compelling and descriptive Meta Description as well. Use your Meta Description to complement and expand upon your Title Tag statement. Be persuasive; encourage an action.

Since Meta Descriptions have no explicit SEO value (other than CTR), don’t be obsessive about forcing keywords into your Description unless they fit naturally. Most of all, inspire curiosity and entice searchers to click.
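
Putting the two tags together, here is a sketch using the broccoli example from the Title Tag tips above (the description text is invented for illustration):

    <title>17 Delicious Broccoli Recipes Your Kids Will Love</title>
    <meta name="description" content="Quick, kid-approved broccoli recipes, most ready in under 30 minutes. Pick a favorite, grab the printable shopping list and get cooking tonight.">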

Step 4: Make your SERP jump off the page with rich snippets

The buzz for structured markup has quieted in the last few years, but this is a powerful strategy that should not be ignored. Rich snippets can really make your SERP jump off the page, increasing your CTR and stealing clicks right out of the hands of your competition.

Using structured markup properly can really make your products stand out. The example below shows powerful information such as star rating, number of reviews, price and whether the product is in stock. That’s a lot of valuable information in the search engine results!

[Image: macbook-snippit]
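
A minimal sketch of the kind of Product markup that sits behind a result like that, in schema.org JSON-LD (every value below is a placeholder, not taken from the screenshot):

    <!-- Illustrative only: all values are placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example 13-inch Laptop",
      "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "128" },
      "offers": {
        "@type": "Offer",
        "price": "1299.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>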

Using the Recipe structured markup can also be really powerful. In the snapshot below, you can see a large photo and most of the ingredients needed for a recipe. It really jumps off the page as the first result. For the second result, you notice the star rating, number of reviews, time to cook and, of course, a picture! Wow, that’s powerful.

Recipe reviews are so popular that if you’re not using them, you may not make the first page of Google. The good news is that there are a variety of WordPress plugins and free tools to help make implementation very simple.

[Image: recipe-snippet]
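
A comparable sketch for the Recipe type (again, all values are placeholders):

    <!-- Illustrative only: all values are placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Recipe",
      "name": "Classic Chocolate Chip Cookies",
      "image": "https://www.example.com/images/cookies.jpg",
      "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.8", "reviewCount": "312" },
      "totalTime": "PT30M",
      "recipeIngredient": [ "2 1/4 cups flour", "1 cup butter", "2 cups chocolate chips" ]
    }
    </script>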

Some other powerful rich snippets are breadcrumbs, music, (notable) people, video content and events. You can find a rich snippet to improve click-through rate for almost every page imaginable. Google has a great Guide to Structured Markup, a Testing Tool, and even a Data Highlighter to use structured markup from the Google Search Console without having to implement any code. There are no excuses for not using these free features!

Getting back to basics

Once you expand and enhance your Meta Tags, track progress in Google Search Console. Continue to test and improve your organic listing over time.

You might be shocked by the dramatic increase in organic traffic delivered simply by getting back to the basics of writing a unique, compelling and relevant Page Title and Description. Remember, your Meta Tag is the only thing standing between a search result and a visitor!

Author: Jason Decker
Source: http://searchengineland.com/4-steps-make-organic-listings-effective-260192

Have an issue with your listings in Google? Getting an official answer might be tough. And when that happens to a Google competitor, as it did with ProtonMail, it can come back to harm Google’s defense against antitrust charges.

Did Google deliberately try to reduce the rankings of ProtonMail, a tiny rival to Google’s own Gmail service? Almost certainly not. Even Proton doesn’t seem to believe that. But the case highlights how Google’s problems with publisher, business and webmaster communication can hurt it as it faces challenges on antitrust grounds.

What happened with Proton

Proton Technologies is a Swiss-based company offering a secure, encrypted email service called ProtonMail. It might be an attractive alternative for those who worry a service like Gmail isn’t private enough, either from government requests or Google’s own ad uses.

Last November, Proton noticed a drop in daily signups for ProtonMail. Wondering why, the company started looking into its rankings on Google and determined there was a problem. In particular, ProtonMail wasn’t showing in the top results for “secure email” or “encrypted email,” as it assumed had been the case in the past.

Proton then ran into a problem that is hardly unique among businesses and publishers: it had no guaranteed way to get an official answer from Google if there was a problem.

Google offers a wide-ranging toolset called Google Search Console that tells businesses if they have problems with their sites. Proton told Search Engine Land it even made use of the toolset. The problem is that the system doesn’t allow site publishers to contact Google if they suspect something is wrong on Google’s end. There’s no way to ask for help, unless you have received what’s called a “manual action,” a penalty placed on your site by a human being. Proton had no manual actions, it told us.

Without such an option, Proton ended up using Google’s spam reporting tool earlier this year. There was no indication that Proton had been spamming Google. But it appears Proton hoped that by using the form, it might trigger a review by Google which, in turn, would uncover what the real problem was.

That didn’t solve the issue. Finally, ProtonMail tweeted out for help in August to Google and to Google’s former head of web spam, Matt Cutts, who’s on leave from the company and hasn’t been involved with it for over two years. Moreover, a new head of web spam was named ages ago.

Still, reaching out to a semi-former Googler seems to have done the trick. Within about a week, the problem was resolved. Exactly what happened was never explained.

Enter the antitrust concerns

Last week, this all drew attention it hadn’t really received before, because Proton published a blog post about it, one that raised the specter that the problem was perhaps related to competitive issues.

This incident however highlights a previously unrecognized danger that we are now calling Search Risk. The danger is that any service such as ProtonMail can easily be suppressed by either search companies, or the governments that control those search companies.

The only reason we survived to tell this story is because the majority of ProtonMail’s growth comes from word of mouth, and our community is too loud to be ignored. Many other companies won’t be so fortunate. This episode illustrates that Search Risk is serious, which is why we now agree with the European Commission that given Google’s dominant position in search, more transparency and oversight is critical.

Could that have really been the situation here?

Unlikely competitive reasons were to blame

It’s unlikely. Google has over one billion monthly active Gmail users. ProtonMail has just over a million, according to its recent post. It shows no growth trajectory that’s going to cause it to rival Google even in years to come.

Given all this, would Google really have actively worked to suppress it while not bothering to do the same for real email rivals? For example, Outlook ranks in the top results on Google for a popular term like email.

It doesn’t make sense. Even Proton isn’t saying the issue was due to competitive reasons, with cofounder Andy Yen telling Search Engine Land via email:

From the data we have, it is impossible to draw a concrete conclusion. We are willing to give Google the benefit of the doubt here and in our blog post, we aren’t drawing any conclusions in this regard.

We are grateful to the individual Googlers who stepped in to fix the issue, but overall this was a very difficult and costly situation for us. We are software developers ourselves, so we know that software bugs do happen, and Google isn’t infallible either, but when Google isn’t behaving correctly, the stakes can be very high.

At the end of the day, we hope that by sharing our experience, more people will become aware of Search Risk, as it is a challenge the internet community is going to have to confront.

“Search Risk?” I’ll get back to that. But even with Proton thinking it could be a completely innocent technical glitch, this case will probably come back to haunt Google. In particular, it’s harmful as the European Union continues its antitrust review and actions with the company.

Indeed, years ago, Google almost certainly wasn’t trying to act anti-competitively against tiny UK-based shopping search engine Foundem. But spam actions against that company were the seed for other complaints and concerns to grow. Last year’s antitrust charges levied against Google by the EU grew directly out of that.

In short, Google can’t really afford to be making mistakes with anyone who can be deemed a competitor, because such companies have a big club to swing that other publishers don’t get: the claim that Google is acting anti-competitively. And even if Proton doesn’t swing that club, others may take its situation as an example to challenge Google.

Despite glitch, Google did still send Proton traffic

It’s a bit of a side-issue among the bigger issues here, but it’s worth addressing. Proton said it had a growth rate drop of 25 percent because of the Google change. However, it really has no idea how much the Google drop harmed it. This is because, as it turns out, Proton had no idea how much traffic Google was sending it before, after or even now.

Proton is focused on the fact that it didn’t rank well for a period of time for the two keywords mentioned above. Unfortunately, rank checking is a terrible way to assess how well you’re doing with Google or search engines in general. Sites are typically found for many different terms. Focusing on only a few of them is far from the full picture.

These terms might have been traffic drivers for ProtonMail or not. Proton doesn’t know directly, because the company told Search Engine Land that it’s not running any type of analytics that would show how much traffic it gets from Google or other sources.

The company said it doesn’t use Google Analytics specifically because of privacy worries. It could, of course, find this type of data without using Google Analytics, such as by processing its own server logs directly. That’s much more complicated and time-consuming, but it’s an option.

So where does that 25-percent drop in growth come from? Proton emailed us this:

We saw a noticeable drop in the number of daily sign ups with everything else held equal. More strikingly, after Google fixed the problem, we saw a >25% increase overnight (we changed nothing on our side). For us, this 25% was the difference between bleeding money each month and being able to break even.

Keep in mind, it’s not that Google wasn’t sending Proton traffic. It’s just that the loss of ranking well for those terms, and perhaps others, caused it to get less traffic than before. That drop in traffic pushed the company from breaking even into losing money. That leads to the whole “search risk” issue.

Everyone has “search risk”

The bottom line is that any business or publisher is at “search risk,” as Proton dubbed it in its post, where losing search visibility could jeopardize your business. It’s not a new risk. It’s one that literally goes back over 20 years, to the days when Yahoo was deemed the internet “gatekeeper” that could make or break businesses.

Don’t depend on search engines, Google or otherwise. For that matter, don’t build any business on the idea that you’re going to somehow get free traffic from a source, such as Facebook, Pinterest or whatever. That should be common sense. If you’re not paying for something, you’re not guaranteed to get anything.

Wise search marketers know this. Smart SEOs know you don’t want to have an overdependence on Google. Algorithms change all the time. But apparently in 2016, it’s still a lesson that people need to learn.

Google needs to improve communication

That said, Google could and should do a better job with communication. Something was wrong with the ProtonMail site, in terms of how Google was processing it. We know that, because something was fixed. Google just won’t say what. All it will say is the statement it sent us below:

Google’s algorithms rely on hundreds of unique signals or “clues” that make it possible to surface the results we think will be most relevant to users.

While we understand that situations like this may raise questions, we typically don’t comment on how specific algorithms impact specific websites. We’re continually refining these algorithms and appreciate hearing from users and webmasters.

While in many cases search ranking changes reflect algorithmic criteria working as intended, in some cases we’re able to identify unique features that lead to varied results.

We’re sorry that it took so long to connect in this case and are glad the issue is resolved. For webmasters who have questions about their own sites, our Webmaster team provides support through the Webmaster Forums and office hours.

It shouldn’t have taken so long for the problem to be fixed. Google itself shouldn’t want it to take so long. The company needs to find a better way for publishers to report potential errors and get resolutions. I wished for that back in 2006 and again in 2011, as part of revisiting my “25 Things I Hate About Google” post:

Sure, a paid support option might put you under fire that you might be making algorithm updates like Farmer/Panda just to generate support revenue. But others might appreciate a guaranteed route.

If not paid, maybe you could give anyone who registers with Google [Search Console] one or two free guaranteed express support tickets, so that we don’t have bloggers talking about getting in contact with Google being a “crap shoot” and diminishing the huge amount of resources you do put in to support through Google Webmaster Central.

Now it’s been 10 years since I first had that wish, and it still hasn’t been solved. Yes, there would be time and cost involved. But that might be well worth it, versus adding more ammunition for those who might use glitches to attack Google on antitrust grounds.

Google’s main advice for those with problems is to use its Google Webmaster Forums. Proton was even going to try that next, it told us, if its tweets didn’t help. Personally, I’d never want someone to go there because:

  1. while Googlers are there, they’re not guaranteed to review your problem;
  2. some problems (as with Proton’s) can only be diagnosed by Googlers;
  3. most people who answer in the forums are not Googlers; and
  4. non-Googlers might not even give the right answer.

For instance, here’s a non-Googler telling someone they have a manual action against them because the site can’t be found in Google’s search results and the Google URL shortener doesn’t work for it. Maybe. Probably, even. But that person is guessing. The only way to actually know if there’s a manual action is by going into Google Search Console as the publisher and checking. A third-party person can’t tell you.

Still, that’s the option you have, unless you catch the attention of someone at Google another way, as ProtonMail did. Or as MetaFilter did in 2014.

In both of those cases, Google took a public relations blow. Improve the communications, and everyone wins.

Author:  Danny Sullivan

Source:  http://searchengineland.com/
