
How has the search landscape changed over the last decade? Columnist Jayson DeMers explores the biggest shake-ups over the last 10 years and their impact on search engine optimization (SEO).

Few marketing channels have evolved as quickly or as dramatically as search engine optimization (SEO). In its infancy, SEO was the shady practice of stuffing keywords, tweaking back-end code and spamming links until you started ranking well for the keywords you wanted. Thankfully, Google stamped out those practices pretty quickly, and its search algorithm has never really stopped evolving.

Much of Google’s foundation was in place by the mid-2000s, but how has its algorithm — and as a result, our approach to SEO — changed in the past 10 years?

1. The rise of content

First, there’s the rise of content marketing as part of a successful SEO strategy. Google has steadily refined what it considers to be “good” content over the years, but it was the Panda update in 2011 that served as the death blow to spammy content and keyword stuffing.

After Panda, it was virtually impossible to get away with any gimmicky content-based tactics, such as favoring a high quantity of content while forgoing quality and substance. Instead, the search engine winners were ones who produced the best, most valuable content, spawning the adoption of content marketing among SEOs — and content is still king today.

2. The death of link schemes

Google has provided its own definition of what a “link scheme” actually is, along with some examples. Many find the guidelines here somewhat ambiguous, but the simplest explanation is this: Any attempt to deliberately influence your ranking with links could qualify as a scheme.

By the late 2000s, Google had worked hard to stamp out most black-hat and spam-based link-building practices, penalizing participants in link wheels, link exchanges and paid link schemes. But it was in 2012, with the Penguin update, that link building really became what it is today. Now, only natural link attraction and valuable link building with guest posts will earn you the authority you need to rank higher.

3. The reshaping of local

Compared to 2006, local SEO today is a totally different animal. There have been dozens of small iterations and changes to the layout (such as the local carousel, and today’s modern “3-pack” layout), but the biggest recent change to ranking factors was in 2014, with the Pigeon update.

 

With this update, Google more heavily incorporated traditional web ranking signals into its ranking algorithm, giving well-optimized websites a major edge in local search. Google also boosted the visibility of high-authority directory websites in its search results.

More generally, local searches have become more common — and more location-specific — over the last few years, thanks to mobile devices.

4. SERP overhauls

I can’t tell you how many times the search engine results pages (SERPs) have changed, and not many people could; some of these changes are so small, it’s debatable whether to even count them. But take a look at a SERP screen shot from 2006 and compare it to today, and you’ll see how different your considerations must be.

[Screenshot: Google search results page from 2006]

5. The rise of the Knowledge Graph

Another major influencer in modern SEO has been Google’s Knowledge Graph, which first emerged on the scene in 2012. The Knowledge Graph attempts to give users direct, concise answers to their queries, often presenting them with a box of information about a general subject or a succinct answer to a straightforward query. This is great for the user but often takes precedence over organic search results.

Accordingly, optimizers have had to compensate for this, either by avoiding generally answerable keyword targets altogether or by using Schema.org microformatting to make their on-site content more easily deliverable to the system.

6. Mobile prioritization

Mobile devices have exploded in popularity since the iPhone first emerged back in 2007, and Google has done everything it can to emphasize the importance of optimizing websites for those mobile users. Indeed, in 2015, mobile queries officially surpassed desktop queries in Google search.

Optimizing for mobile has become not only common, but downright required these days, in no small part due to Google’s continuing and escalating insistence. Its mobile-friendly update, which occurred in two separate phases, has been a major enforcer of this new standard.

7. The soft death of keywords

Panda and Penguin killed off the practice of keyword stuffing, but a smaller, more curious update in 2013 spelled the “soft” death of keyword optimization altogether. Hummingbird is the name of the update that introduced semantic search, Google’s way of deciphering user intent rather than mapping out individual keywords and phrases.

Today, Google attempts to understand meaning rather than matching keywords, so keyword-centric optimization doesn’t work the same way. However, keyword research is still relevant, as it can help guide your strategic focus and provide you with ranking opportunities.

 

8. Update pacing and impact

It’s also worth noting that for a time — in the few years following Panda — Google stressed out search optimizers by releasing seemingly random, major updates to its search algorithm that fundamentally changed how rankings were calculated. However, now that the search engine has reached a strong foundation, the significance and pacing of these updates have declined. Today, updates are smaller, less noticeable, and roll out gradually, giving them a much less dramatic impact on the industry.

Final thoughts

Understanding where SEO has come from and where it stands today will help you become a better online marketer. Hopefully, you long ago eliminated any black-hat techniques from your strategy.

Google — and we, as marketers alongside it — are constantly pushing this now-fundamental element of our lives forward, so if you want to stay relevant, you’ll need to keep focused on the next 10 years of search engine updates.

Source: Search Engine Land


In search engine optimization, sometimes even small errors can have a large and costly impact. Columnist Patrick Stox shares his SEO horror stories so that you can be spared this fate.

We’ve all had those moments of absolute terror where we just want to crawl into the fetal position, cry and pretend the problem doesn’t exist. Unfortunately, as SEOs, we can’t stay this way for long. Instead, we have to suck it up and quickly resolve whatever went terribly wrong.

There are moments you know you messed up, and there are times a problem can linger for far too long without your knowledge. Either way, the situation is scary — and you have to work hard and fast to fix whatever happened.

Things Google tells you not to do

There are many things Google warns about in its Webmaster Guidelines:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content
  • Cloaking
  • Sneaky redirects
  • Hidden text or links
  • Doorway pages
  • Scraped content
  • Participating in affiliate programs without adding sufficient value
  • Loading pages with irrelevant keywords
  • Creating pages with malicious behavior, such as phishing or installing viruses, trojans or other badware
  • Abusing rich snippets markup
  • Sending automated queries to Google

 

Unfortunately, people can convince themselves that many of these things are okay. They think spinning text to avoid a duplicate content penalty that doesn’t exist is the best option. They hear that “links are good,” and suddenly they’re trying to trade links with others. They see review stars in the search results and fake them with markup just so they can stand out in the SERPs.

None of the above are good ideas, but that won’t stop people from trying to get away with something or simply misunderstanding what others have said.

Crawl and indexation issues

User-agent: *
Disallow: /

That’s all it takes — two simple lines in the robots.txt file to completely block crawlers from your website. Usually, it’s a mistake from a dev environment, but when you see it, you’ll feel the horror in the pit of your stomach. Along with this, if your website was already indexed, you’ll typically see a note in the SERPs that no description is available for your pages because of robots.txt.
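You can also catch this mistake programmatically. Here’s a minimal sketch using Python’s standard-library robots.txt parser; the domain is a placeholder for your own:

  # Minimal sketch: warn if robots.txt blocks all crawlers site-wide.
  # "https://www.example.com" is a placeholder domain.
  from urllib.robotparser import RobotFileParser

  parser = RobotFileParser("https://www.example.com/robots.txt")
  parser.read()

  if not parser.can_fetch("*", "https://www.example.com/"):
      print("Warning: robots.txt is blocking all crawlers from this site!")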

Then there’s the noindex meta tag, which can prevent a page you specify from being indexed. Unfortunately, many times this can be enabled for your entire website with a simple tick of a button. It’s an easy enough mistake to make and painful to overlook.
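Because that switch is so easy to flip by accident, it’s worth periodically scanning your key pages for the tag. Here’s a rough sketch using the requests and BeautifulSoup libraries; the URLs are placeholders:

  # Rough sketch: flag pages that carry a robots noindex meta tag.
  # The URLs below are placeholders.
  import requests
  from bs4 import BeautifulSoup

  urls = ["https://www.example.com/", "https://www.example.com/about/"]

  for url in urls:
      soup = BeautifulSoup(requests.get(url).text, "html.parser")
      robots_tag = soup.find("meta", attrs={"name": "robots"})
      if robots_tag and "noindex" in robots_tag.get("content", "").lower():
          print(url + " is set to noindex!")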

Even more fun is a UTF-8 BOM. Glenn Gabe had a great article on this where he explained it like this:

BOM stands for byte order mark and it’s used to indicate the byte order for a text stream. It’s an invisible character that’s located at the start of a file (and it’s essentially meaningless from an SEO perspective). Some programs will add the BOM to a text file, which … can remain invisible to the person creating the text file. And the BOM can cause serious problems when Google tries to read the file. …

[W]hen your robots.txt file contains the UTF-8 BOM, Google can choke on the file. And that means the first line (often user-agent), will be ignored. And when there’s no user-agent, all the other lines will return as errors (all of your directives). And when they are seen as errors, Google will ignore them. And if you’re trying to disallow key areas of your site, then that could end up as a huge SEO problem.
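Checking for a BOM takes only a few lines, since a UTF-8 BOM is simply the byte sequence EF BB BF at the very start of the file. A minimal sketch, with a placeholder URL:

  # Minimal sketch: detect a UTF-8 BOM at the start of a robots.txt file.
  # The URL is a placeholder.
  import codecs
  import requests

  raw = requests.get("https://www.example.com/robots.txt").content
  if raw.startswith(codecs.BOM_UTF8):  # codecs.BOM_UTF8 == b'\xef\xbb\xbf'
      print("robots.txt starts with a UTF-8 BOM; re-save the file without one.")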

Also of note: Just because a large portion of your traffic comes from the same IP addresses doesn’t mean it’s a bad thing. A friend of mine found this out the hard way after he ended up blocking some of the IP addresses Googlebot uses while being convinced those IPs were up to no good.

 

Another horrific situation I’ve run into was when someone had the bright idea to block crawlers to get pages out of the index after a subdomain migration. This is never a good idea, as crawlers need to be able to access the old versions and follow the redirects to the new versions. It was made worse by the fact that the robots.txt file was actually shared between both subdomains, and crawlers couldn’t see either the old or the new pages because of the block.

Manual penalties

Just hearing the word “penalty” is scary. It means you or someone associated with the website did something wrong — very wrong! Google maintains a list of common manual actions:

  • Hacked site
  • User-generated spam
  • Spammy freehosts
  • Spammy structured markup
  • Unnatural links to your site
  • Thin content with little or no added value
  • Cloaking and/or sneaky redirects
  • Cloaking: First Click Free violation
  • Unnatural links from your site
  • Pure spam
  • Cloaked images
  • Hidden text and/or keyword stuffing

Many of these penalties are well-deserved, where someone tried to take a shortcut to benefit themselves. With Penguin now operating in real time, I expect a wave of manual penalties very soon.

A recent scary situation was a new one to me. A company had decided to rebrand and migrate to a new website, but it turned out the new website had a pure spam penalty.

Unfortunately, because Google Search Console wasn’t set up in advance of the move, the penalty was only discovered after the migration had happened.

 

Oops, I broke the website!

One character is all it takes to break a website. One bad piece of code, one bad setting in the configuration, one bad redirect or plugin.

I know I’ve broken many websites over the years, which is why it’s important to have a backup before you make any changes. Or better yet, set up a staging environment for testing and deployment.

Rebuilding a website

With any new website, there are many ways for things to go horribly wrong. I’m always scared when someone tells me they just got a new website, especially when they tell me after it’s already launched. I get this feeling in the pit of my stomach that something terrible just happened, and usually I’m right.

The most common issue is redirects not being done at all, or developers arguing that redirects aren’t necessary or that too many redirects will slow down the website. Another common mistake I see is killing off good content; sometimes it’s city pages or pages about services, and sometimes an entire domain’s worth of information gets redirected to a single page.

Issues can range from very old issues that still exist — like putting all text in images — to more modern problems like “We just rebuilt our website in Angular” when there was no reason for them to ever use Angular.

Overwrote the file

This scares me the most with disavow files, especially when no copy has been made and the default action happens to be overwriting, or with an .htaccess file, where redirects can easily be lost. I’ve even had shared hosts overwrite .htaccess files, and of course, no notification is ever sent about the changes.

I don’t even know

In my years, I’ve seen some really random and terrible things happen.

I’ve seen people lose their domain because it expired or because they unknowingly signed a contract that said they didn’t own the domain. I’ve seen second and even third websites created by other marketing companies.

There are times when canonical tags are used incorrectly or just changed randomly. I’ve seen all pages canonicalized to the home page or pages with a canonical set to a different website.

 

I’ve seen simple instructions that sounded like a good idea, like “make all links relative paths,” end in disaster when canonical URLs were made relative too, along with alternate versions of the website, such as m. subdomains and hreflang alternate tags.

SEO is scary

It’s amazing how one little thing or one bad decision can be so costly and scary. Remember to follow the rules, plan, execute and QA your work to prevent nightmares. Share your own tales of horror with me on Twitter @patrickstox.

Source: Search Engine Land


Think you don't have the resources for an effective SEO program? Think again! Columnist Dianna Huff shares a case study detailing how a small business was able to make big gains with a limited budget. 

The small manufacturers who are thriving in the face of global competition and other challenges have spent the last five to seven years improving productivity and process efficiencies. This focus has often meant that marketing activity was next to nonexistent — with much new business coming from word-of-mouth.

Once a small manufacturer has their process down, however, they’re ready to begin a marketing program that includes SEO. The problem is, where to start? With so much information and so many moving parts, a small business owner can be easily overwhelmed. It’s much easier to simply focus on running the business.

Such was the case with one of our clients, a small manufacturing firm of about 30 people. The owner and his team had done a SWOT analysis and were ready to embark on a marketing program that included SEO.

The challenges, however, were pretty daunting: zero historical data, few backlinks, and building content and brand awareness on a limited budget.

 

Challenge #1: Zero historical data

When my company first began working with the small manufacturer in November 2015, we noted right away that the client’s website had a huge error with regard to the Google Analytics tracking code, which had been added to the website home page only. The low number of visitor sessions was a dead giveaway.

[Screenshot: Google Analytics showing very low visitor sessions]

With our smaller clients, we see this type of UA code/analytics error on a regular basis, as well as others, such as the wrong UA code inserted into the HTML code or the client not having Admin access to Google Analytics. And then we learn that the person who did have access has fallen off the planet. When this happens, we often have to start fresh with a new Google Analytics account.

The first step in creating the client’s SEO program, therefore, was to ensure Google Analytics was properly tracking all web pages. An easy fix, but one that left us with zero data on which to base recommendations for moving forward.
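One way to catch this class of error early is to crawl a sample of pages and confirm the tracking ID actually appears on each one. Here’s a rough sketch; the tracking ID and URLs are placeholders:

  # Rough sketch: verify a Google Analytics tracking ID appears on every page.
  # "UA-12345678-1" and the URLs are placeholders.
  import requests

  tracking_id = "UA-12345678-1"
  pages = [
      "https://www.example.com/",
      "https://www.example.com/products/",
      "https://www.example.com/contact/",
  ]

  for page in pages:
      html = requests.get(page).text
      status = "OK" if tracking_id in html else "MISSING tracking code"
      print(page + ": " + status)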

Months of keyword guessing

Not having any Analytics or Search Console data meant we didn’t know the types of search queries people were using. And since the company hadn’t done much marketing in the past and had relatively low traffic volume, it would take months before we had any data that could tell us anything.

The client wanted to appear in Google for a few specific keywords pertaining to the services his company provided. However, the Keyword Planner showed few searchers were using these keywords in their searches.

Because we’ve worked with many small manufacturers and their esoteric products and services, we’ve learned the Keyword Planner isn’t always accurate, so we went ahead and optimized the website around iterations of these keywords plus others.

After a couple of months, it became apparent that those weren’t the right keywords based on traffic and other data.

We ended up making a new list and then carefully analyzing the SERPs for each keyword. We wanted to see how Google viewed the intent of each query and then choose the more transactional keywords — i.e., the keywords people would use when looking for the particular products and services the client provided.

In addition, we employed standard SEO tactics: ensuring images had descriptive alt tags using keywords whenever possible, creating internal links to key pages and writing descriptive title/meta description tags for all pages of the website.

 

Challenge #2: Few backlinks

For smaller manufacturers, the backlink profile is often limited, and budget and personnel constraints mean the company simply can’t take advantage of a full-fledged content and social media marketing program.

However, even on a budget, some things can be done which are easy and cost-effective: One of our first steps was to create a Google My Business page, get the company listed in the YP.com directory and create a LinkedIn corporate profile page.

To begin creating a few high-quality links that would also start building awareness (Challenge #3), the marketing plan for the year included sending out two press releases, as well as pitching three article ideas to trade publications (and writing the articles should the pitch be accepted).

The first press release and pitch resulted in two publications running a case study and an application note respectively. The case study appeared online; the application note appeared in the publication’s print version and online as well — a huge win for any company, but especially nice for a smaller firm.

In addition, we continued to add content to the Resources section of the website. For small companies on a tight budget, creating a Resources section is a cost-effective way to create content. This content can then be posted on the corporate or personal LinkedIn profile, added to e-newsletters and most important, optimized to attract search traffic and links.

For our client, we created application notes, FAQs and other types of information of interest to the target audience. As a side note, one of the application notes was repurposed for the industry print publication article — a good example of how small companies can get maximum bang for the marketing buck.

Challenge #3: Creating content and building awareness

One of the tactics the client had wanted to implement from the beginning was a monthly e-newsletter. The client already had an internal list, so we created a new account in MailChimp, imported the list and developed a template.

We created a new topic for each month, but midway through the campaign, the client suggested a topic we could break down into multiple articles — and which would be of high interest to the target audience. That’s when we hit paydirt.

 

Although e-newsletters generally don’t fall under the purview of SEO, they do play a role in that they assist in conversions and inquiries over time.

According to Gardner Business Media’s 2015 Media Usage in Manufacturing Report, 68 percent of survey respondents view e-newsletters as an effective method for finding solutions-based info, application stories and information on new products and processes.

And 93 percent of respondents indicated they click on companies whose name they recognize in the search results — making e-newsletters an effective way to reinforce brand awareness over time (even if subscribers delete the email after quickly skimming it or don’t read it at all some months).

Based on Analytics data we’ve seen with other small manufacturing clients, e-newsletters often play a role in assisting conversions over time and are one of several channels searchers use in their path to conversion.

This is why we like to focus on new and returning visitors to the website, conversions and conversion paths rather than open rates.

One trick we used, which helped indirectly with SEO, was to repurpose each newsletter article for the website. Then, in each newsletter we added links to this material — which drove people back to the website and gave us more content we could optimize.

 

Results: Slow but steady traffic growth and conversions

The chart below shows the All Channels traffic data (adjusted for referral spam) for January through September. Of this, organic accounts for 69 percent, direct for 22 percent and referral for 3 percent.

[Chart: All Channels traffic, January through September]

More importantly, however, the work we’ve been doing is resulting in conversions. The chart below shows the goal completions for new and returning users for the Q3 period for the website form only; the client has also been getting email and phone inquiries, which his team tracks in-house.

[Chart: goal completions for new and returning users, Q3]

What I find exciting is that while the website content we’ve been creating is being found by searchers, the e-newsletter also is driving new and return users to the website — and recently, a few conversions as well.

 

Although the numbers are small, we now have data we can use to create a more finely tuned marketing and measurement plan for year two — a plan which can now include SEO KPIs and targets. The data also gives us a baseline for a discussion on whether to budget for an AdWords campaign in order to determine which keywords drive clicks and inquiries and increase traffic.

In conclusion

Starting an SEO program from scratch for a smaller company on a budget can be a little daunting, as the expectation for fast results lurks in the background (especially given all the hype and misinformation regarding SEO).

The key to success is to set realistic expectations and have patience: for smaller companies on a tight budget and/or limited resources, it can take up to a year to see results from SEO and content marketing.

I would also add two other success tips. The first tip is to be consistent. Regularly create pieces of content for the website and optimize it, publish the e-newsletter each month, post to social platforms even if only one platform is being used and so on. Over time, these efforts create momentum which begins to snowball.

The second tip is to employ a little ingenuity; make things do double and triple duty so that you can leverage multiple channels without a whole lot of additional effort.

Source: Search Engine Land


Search engine optimization (SEO) can be divided into two main components: on-site and off-site. While it’s practically impossible to categorize one as more important than the other, on-site optimization serves a foundational role—it’s the anchor point of your SEO strategy.

With on-site SEO, most of the changes and additions you make will remain static, as opposed to off-site optimization, which demands ongoing work.


The problem is, you’ve probably heard a number of lies regarding on-site optimization that could compromise how effective your strategy is.

Why “Lies” Are Common

First off, why are these “lies” common in the first place? Well, they aren’t always lies—at least, not exactly. Instead, they’re usually misconceptions or misunderstandings that arise naturally because of the nature of SEO:

  • Changing standards. Google is constantly releasing new updates, so it’s sometimes hard to tell which practices are still modern and which ones have become obsolete. If you don’t keep up with the latest information, you could easily bear a misconception forward.
  • Anecdotal evidence. If you make a change on your own site and see a marked increase in your results, you might assume this change was the one that did it. Unfortunately, anecdotal evidence isn’t always reliable when it comes to SEO.
  • Misinterpretation and inexperience. It could also be that the person or agency telling you these bits of information has misinterpreted a piece of truth; it’s easy to do when the wording of a best practice can fundamentally alter its meaning.

 

On-site Optimization Lies 

From those root causes, these are six of the most common on-site optimization misconceptions I’ve encountered: 

1. It’s too complicated for non-experts to understand.

It’s true that some elements of SEO are technically complex, and require digging into the back end of your site to alter code. However, you don’t necessarily have to be a professional web developer to understand and make these changes. Some of them can be done literally by copying and pasting certain snippets of code, such as placing a Google Analytics tracking script or making adjustments to your robots.txt file. Tactics here sport a range of difficulties, but none of them are unlearnable by non-experts.

2. Duplicate content will kill your rankings.

The phrase “duplicate content” is enough to make most SEO professionals cringe. It’s true that duplicate content is almost never a good thing, but most duplicate content arises due to canonicalization errors—essentially, Google sees one page as two because of how it interprets your page structure. Duplicate content errors are relatively easy to fix with rel=canonical tags, but don’t let anyone tell you these mistakes will kill your strategy—the difference is marginal at best. Unless you’re purposefully plagiarizing content and trying to pass it off as your own, you don’t need to worry as much about duplicate content as some would have you think.
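If you want to see how a page presents itself to crawlers, you can inspect its rel=canonical tag directly. A rough sketch using requests and BeautifulSoup, with placeholder URLs:

  # Rough sketch: report each page's rel=canonical target, if any.
  # The URLs are placeholders.
  import requests
  from bs4 import BeautifulSoup

  for url in ["https://www.example.com/page", "https://www.example.com/page?sort=price"]:
      soup = BeautifulSoup(requests.get(url).text, "html.parser")
      canonical = soup.find("link", attrs={"rel": "canonical"})
      print(url, "->", canonical.get("href") if canonical else "no canonical tag")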

3. 404 errors should always be avoided.

You probably know first-hand the feeling of encountering a 404 error page when trying to access a piece of content, so it’s no wonder that so many agencies and professionals recommend fixing 404 errors with 301 redirects or content restoration at all costs. However, there are some functional uses for 404 pages, and they won’t always damage your rankings (in fact, they can help your rankings in certain cases), so don’t feel the need to track down and redirect every single 404 page on your site.

 

4. Every title and description should have a keyword.

Your page’s title tag and meta description are what appear as its entry in search engine results pages (SERPs), and they give Google meaningful information about what your page covers. Some SEO practitioners will tell you to include at least one keyword in every title and description on your site, but this isn’t strictly necessary—especially now that Google indexes and provides content using semantic search. Instead, it’s better to focus on describing your page accurately and trying to entice more click-throughs by using titles and descriptions that appeal to real people. There’s evidence that the higher your click-through rate (CTR) in search engine results pages, the higher your search rankings will be.
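A quick way to review what’s actually in your markup is to pull each page’s title and description and check their lengths, since overly long snippets get truncated in the SERPs. A small sketch, with a placeholder URL:

  # Small sketch: extract a page's title and meta description and report lengths.
  # The URL is a placeholder.
  import requests
  from bs4 import BeautifulSoup

  soup = BeautifulSoup(requests.get("https://www.example.com/").text, "html.parser")

  title = soup.title.string.strip() if soup.title and soup.title.string else ""
  desc_tag = soup.find("meta", attrs={"name": "description"})
  description = desc_tag.get("content", "").strip() if desc_tag else ""

  print("Title (" + str(len(title)) + " chars): " + title)
  print("Description (" + str(len(description)) + " chars): " + description)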

5. More content is better.

It’s a best practice to include at least a few hundred words of high-quality content on every page of your site. It’s also necessary to have a strong, ongoing content marketing strategy to add new pages to your site and provide more value to readers. However, don’t mistake these best practices as a recommendation to produce as much content as possible; more content won’t necessarily help you. Instead, focus on producing the best content you can.

6. Creating a page for each of your target keywords is ideal.

This was an older on-site SEO practice, back when keyword optimization was taken more literally. The idea was to create a specific, designated page of your site for every keyword you wanted to target in order to boost your relevance for those keywords. This strategy may be marginally effective today, but in general, it’s better to focus on creating pages that fit a certain theme or topic, in which you can include all the keywords that fit within that topical theme. Focus on creating “ultimate guides” to certain topics that are, above all, valuable for your readers. Keyword rankings will come only if and when those pages are shared, engaged with, and linked to by your readers.

Once you get past these myths and misconceptions, you’ll better understand the nature of on-site optimization and what you actually need to do to get your site ready for your long-term strategy. Keep in mind that on-site changes are only a small part of SEO; you’ll still need a kick-ass content strategy and an off-site strategy to support your efforts, but with a good on-site foundation, you’ll be well poised to earn the rankings you want.

Source: Forbes


Seven Reasons LinkedIn Is More Than A Digital Resume by Crystal Lee Butler

Advisor Perspectives welcomes guest contributions. The views presented here do not necessarily represent those of Advisor Perspectives.

LinkedIn is a business-oriented social networking service that was launched in 2003. The main reason a person joins LinkedIn is for professional networking.

It is an online resume database, only better!

If you are looking for a place to network with other professionals and cut out all of the “fluff” that other social networking sites bring, LinkedIn is an excellent choice.

Here’s why I highly recommend LinkedIn to be the first social media account you use:

1-Easily develop a personal brand

An optimized LinkedIn profile is a surefire way to solidify your personal and professional brand. It will immediately pay dividends in better search engine rankings for your and your firm’s name. Don’t forget a professional headshot; it makes you 14 times more likely to be found on LinkedIn.

2-Go where your prospective clients are


LinkedIn has over 128 million users in the United States alone. It is particularly popular among college graduates, those in higher-income households and the currently employed. It is the perfect gathering space for the types of clients you are seeking.

 

3-Accelerate connections

LinkedIn is the largest Rolodex in the world. Imagine the ability to tap into the power of nearly 433 million people worldwide. With a bit of ongoing effort to grow your network, you’ll notice connections sprout from within connections. LinkedIn is a networking superpower because it is not as private as Facebook’s personal profiles, so it is easier to connect with people who are not readily accessible.

4-Create opportunities

By consistently contributing valuable content on a particular subject, advisors will lead the conversation rather than follow. If you can post a few bits of useful information on a regular basis, you will get recognized. There is no other tool that will do this for you.

5-Improve your SEO

A well-thought-out LinkedIn profile increases the chance you’ll be found because of search engine optimization (SEO). Successful SEO makes it easier for people to find financial advisors such as yourself; without it, the likelihood that you will be found significantly decreases.

6-Build rapport

Anyone can search for a person on LinkedIn to see their job history and a myriad of other information. You can give and receive recommendations and endorsements from past colleagues. While financial advisors may not be able to take advantage of this particular LinkedIn function, it’s something you can do for your centers-of-influence to build a strong rapport.

7-Reinforce your professionalism

Another good thing about LinkedIn is that you are not bombarded with annoying game requests, cat videos and drama. You can simply browse your connections and their content. With that said, be sure to keep your own content professional. Do not post anything that does not belong.

 

Every second, two new members join LinkedIn. As an industry professional, this is great news. After you put in the initial effort to set up a strong profile, it takes minimal time to make those connections, post useful content and watch the benefits accrue. It is like the phone book of the 21st century. Don’t miss this opportunity by not creating a profile on LinkedIn.

Crystal Lee Butler is a creative marketer and results-driven business consultant with over a decade of experience collaborating with independent advisors. At Crystal Marketing Solutions she provides marketing services for financial professionals by communicating a cohesive brand and building business relationships to help them grow and thrive. Connect with Crystal on LinkedIn at crystallbutler.

Source: Advisor Perspectives

Google Search Console can help you determine which of your website pages are indexed, but what about identifying the ones that aren't? Columnist Paul Shapiro has a Python script that does just that. 

There are three main components to organic search: crawling, indexing and ranking. When a search engine like Google arrives at your website, it crawls all of the links it finds. Information about what it finds is then entered into the search engine’s index, where different factors are used to determine which pages to fetch, and in what order, for a particular search query.

As SEOs, we tend to focus our efforts on the ranking component, but if a search engine isn’t able to crawl and index the pages on your site, you’re not going to receive any traffic from Google. Clearly, ensuring your site is properly crawled and indexed by search engines is an important part of SEO.

But how can you tell if your site is indexed properly?

If you have access to Google Search Console, it tells you how many pages are contained in your XML sitemap and how many of them are indexed. Unfortunately, it doesn’t go as far as to tell you which pages aren’t indexed.

[Screenshot: Google Search Console index status report]

 This can leave you with a lot of guesswork or manual checking. It’s like looking for a needle in a haystack. No good! Let’s solve this problem with a little technical ingenuity and another free SEO tool of mine.

 

Determining if a single URL has been indexed by Google

To determine if an individual URL has been indexed by Google, we can use the “info:” search operator, like so:

info:http://searchengineland.com/google-downplays-google-algorithm-ranking-update-week-normal-fluctuations-258923

If the URL is indexed, a result will show for that URL:

[Screenshot: Google returns a result for the indexed URL]

However, if the URL is not indexed, Google will return an error saying there is no information available for that URL:

[Screenshot: Google returns no information for the unindexed URL]

Using Python to bulk-check index status of URLs

Now that we know how to check if a single URL has been indexed, you might be wondering how you can do this en masse. You could have 1,000 little workers check each one — or, if you prefer, you could use my Python solution:

  # Google says don't use this script: https://twitter.com/methode/status/783733724655517696
  # This script is a violation of Google's Terms of Service. Don't use it.

  import requests
  import csv
  import os
  import time
  from bs4 import BeautifulSoup
  from urllib.parse import urlencode

  seconds = input('Enter number of seconds to wait between URL checks: ')
  output = os.path.join(os.path.dirname(__file__), input('Enter a filename (minus file extension): ') + '.csv')
  urlinput = os.path.join(os.path.dirname(__file__), input('Enter input text file: '))
  urls = open(urlinput, "r")

  # Route every request through the local HTTP proxy (Polipo in front of Tor; see setup below).
  proxies = {
      'http': 'http://localhost:8123',
      'https': 'http://localhost:8123'
  }

  user_agent = 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.116 Safari/537.36'
  headers = {'User-Agent': user_agent}

  f = csv.writer(open(output, "w+", newline="\n", encoding="utf-8"))
  f.writerow(["URL", "Indexed"])

  for line in iter(urls):
      line = line.strip()  # drop the trailing newline so it doesn't pollute the query
      query = {'q': 'info:' + line}
      google = "https://www.google.com/search?" + urlencode(query)
      data = requests.get(google, headers=headers, proxies=proxies)
      data.encoding = 'ISO-8859-1'
      soup = BeautifulSoup(str(data.content), "html.parser")
      try:
          # If an organic result block is present, the URL is indexed.
          check = soup.find(id="rso").find("div").find("div").find("h3").find("a")
          href = check['href']
          f.writerow([line, "True"])
          print(line + " is indexed!")
      except AttributeError:
          f.writerow([line, "False"])
          print(line + " is NOT indexed!")
      print("Waiting " + str(seconds) + " seconds until checking next URL.\n")
      time.sleep(float(seconds))

  urls.close()

To use the Python script above, make sure you have Python 3 installed. You will also have to install the BeautifulSoup library. To do this, open up a terminal or command prompt and execute:

pip install beautifulsoup4

You can then download the script to your computer. In the same folder as the script, create a text file with a list of URLs, listing each URL on a separate line.


Now that your script is ready, we need to set up Tor to run as our free proxy. On Windows, download the Tor Expert Bundle. Extract the zip folder to a local directory and run tor.exe. Feel free to minimize the window.


Next, we have to install Polipo, which lets us use Tor as an HTTP proxy. Download the latest Windows binary (it will be named “polipo-1.x.x.x-win32.zip”) and unzip it to a folder.

In your Polipo folder, create a text file (ex: config.txt) with the following contents:

socksParentProxy = "localhost:9050"
socksProxyType = socks5
diskCacheRoot = ""
disableLocalInterface = true

Open a command prompt and navigate to your Polipo directory.

Run the following command:

polipo.exe -c config.txt


At this point, we’re ready to run our actual Python script:

python indexchecker.py


The script will prompt you to specify the number of seconds to wait between checking each URL.

It will also prompt you to enter a filename (without the file extension) to output the results to a CSV.

Finally, it will ask for the filename of the text file that contains the list of URLs to check.

 

Enter this information and let the script run.

The end result will be a CSV file, which can easily be opened in Excel, specifying TRUE if a page is indexed or FALSE if it isn’t.


In the event that the script seems to not be working, Google has probably blocked Tor. Feel free to use your own proxy service in this case, by modifying the following lines of the script:

proxies = {
    'http': 'http://localhost:8123',
    'https': 'http://localhost:8123'
}

Conclusion

Knowing which pages are indexed by Google is critical to SEO success. You can’t get traffic from Google if your web pages aren’t in Google’s database!

Unfortunately, Google doesn’t make it easy to determine which URLs on a website are indexed. But with a little elbow grease and the above Python script, we are able to solve this problem.

Source: Search Engine Land


Your website is your company’s virtual storefront. At its best, it presents your business to current and potential customers, showcases your products and services, and influences – or even facilitates – purchasing decisions. But poor information architecture and content strategy can turn clients away. Here are five mistakes you might be making on your website, and how to fix them.

1. Burying key information

A restaurant’s mission statement and glamour shots of the food might seem like the top priority. But most of the time, people are landing on such sites to find out one of four things: hours, menu, location and contact information. At best, it’s annoying to customers to have to click through several pages to find out if you serve lunch or take reservations. At worst, it might make them give up and go elsewhere.

Fix it: When planning what goes where on your website, put yourself in your customers’ shoes and imagine the top things they’ll be looking for. This information should go in a prominent place: the home page or even a footer that appears everywhere on your site. (The CN Tower, for instance, has a very basic landing page that gives the majority of visitors exactly what they’re looking for – and lets those who want more click through and explore.) And don’t forget a place to post updates such as holiday hours or seasonal promotions, even if it’s simply a Twitter feed embedded into your site.

 

2. Forgetting SEO

Search engine optimization, or SEO, isn’t some dark-magic alchemy that requires a highly paid specialist. At its most basic, it’s simply a matter of having a website that loads quickly and includes all keywords relevant to your business. If you don’t come up near the top in Google when people search for your company name (and city, if the name isn’t unique enough), you have a problem.

Fix it: There are two key things to think about here: first, ensuring people can find you when they’re specifically searching for you; and second, working toward your site coming up in search results related to your business. Make two corresponding lists of all keyword phrases someone might type into Google to find your business. For instance, if you’re a florist in Vancouver’s Mount Pleasant neighbourhood, the first list would include your business name together with words such as “hours” and “address,” and the second might include phrases like “Mount Pleasant florist delivery.” Ensure these keywords are included in your site copy in an organic way – i.e., not so it feels like a bot wrote them.

3. Letting information go stale

If you used to be open seven days a week and you’re now closed on Mondays, you’ll have some angry customers venting on social media if they show up to a locked door. And if your site updates are so infrequent that Halloween lasts until the end of November, visitors might wonder if you take your business seriously. As for broken links, they can make for a frustrating online experience.

Fix it: Ideally, have your site built with an easy-to-use content management system, so that a designated member of your staff can quickly update text and photos without having to depend on tech support. And whether you keep yourself organized with a paper calendar or a Web-based task management system, set regular reminders to review the site to ensure that it’s up to date.

4. Favouring flash over function

Auto-play music, animated splash pages and intro videos might have been cute in 2001. But this far into the Internet age, they’re just a distraction keeping people from efficiently finding the information they’re looking for. As for layout, we’re long past the time when the goal was to be mobile-friendly. Nowadays, it’s mobile-first, and if, say, your e-commerce site is clunky on iPhones, you’re likely to be missing out on sales.

 

Fix it: You don’t have to kill that fancy video – just don’t put it at centre stage, and make sure it works on mobile. Similarly, ensure that your whole site is at least readable on smartphones, and ideally uses a responsive design that sizes itself to browser windows automatically. (In most cases, a separate mobile site with limited information is a bad idea.) And kill anything that auto-plays. No one needs surprise audio blasting from their cubicle at an inopportune time.

5. Being uninformative

A bare-bones site is fine for launch, but at some point, it needs to be filled out with content to draw in customers and help them assess your business. After all, not everyone will take the extra step to contact you with questions. And for companies that might want media exposure, excluding facts such as the city in which they’re based and full names and bios of founders – not to mention, for some kinds of businesses, easy-to-download photos, logos and other resources – might mean getting left out of stories.

Fix it: Build up your site with informative content, be it FAQ pages, behind-the-scenes stories and photos, or profiles of staff. Give potential customers multiple reasons to do business with you, and encourage current customers to stay engaged with meaningful articles that help them feel part of your community.

Kat Tancock is co-founder of Tavanberg, a Toronto-based marketing agency that helps businesses plan and execute successful content strategies.

Source: The Globe and Mail

A practical local SEO guide for business owners. 

Have you ever used Google to find something nearby? Like searching for “sushi”, “locksmith” or “nightclub”? If your business has a physical address such as a storefront, you should consider using local SEO to get new customers.

Local SEO helps you get customers who use a location keyword in their search (such as “Irving Park Plumber”) or who simply search from a device with geolocation enabled, such as a smartphone. Local SEO also helps you rank higher on Google Maps.

For example, The Art Studio NY, the top-rated painting school in New York, ranks #1 for local search term “painting classes nyc”:

[Screenshot: “painting classes nyc” search results with The Art Studio NY at #1]

The local listing of The Art Studio NY includes NAP (name, address, phone number), Google Map, Google Reviews, hours of operation, website, and directions. This local listing format is very helpful in driving business.

So how can we achieve good local search rankings?

According to Moz.com, the major local search ranking factors are:

[Chart: Moz local search ranking factors]

According to Moz, on-page signals such as NAP (name, address, phone), optimized meta tags and titles, and domain authority are the most important ranking factors (20.3%) for local SEO.

Here are a few best practices for local SEO:

#1: Verify your Google My Business listing

Google My Business connects your business with customers. Go to Google My Business and claim your page, if you haven’t already. Google will send a verification code to your address, and you simply enter that code into Google My Business.

The verification process may take 1-2 weeks.

Once you’ve verified your account, make sure that your NAP (name, address, phone) is correct, choose the right categories for your business, and provide a unique, engaging description. Upload some high-res images, add your hours of operation, and most importantly, ask your customers to write reviews for your business.

Google will display your local business information on the right column as below:

[Screenshot: Google My Business panel in the right column of search results]

 

#2: Use consistent contact information across your online profiles

Make sure the business name, address, and phone number of each of your staffed locations (aka NAP – Name, Address, Phone) is consistent throughout the site (homepage, contact us page, footer, etc) and on other websites like Google My Business, Yelp and Facebook.

For example, The Beauty Institute’s Allentown location (http://allentown.thebeautyinstituteskp.edu/) sets its NAP consistently in multiple places: in the sitewide header (with the address and ZIP code added), on a homepage map and in the sitewide footer.
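A simple way to audit this on your own site is to fetch each key page and confirm that the exact same NAP string appears. Here’s a rough sketch; the NAP value and URLs are hypothetical placeholders:

  # Rough sketch: confirm the same NAP string appears on key pages.
  # The NAP value and URLs are hypothetical placeholders.
  import requests

  nap = "Acme Beauty School, 123 Main St, Allentown, PA 18101, (555) 555-0123"
  pages = ["https://www.example.com/", "https://www.example.com/contact/"]

  for page in pages:
      html = requests.get(page).text
      print(page + ": " + ("NAP found" if nap in html else "NAP missing or inconsistent"))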

 

#3: Embed a Google Map in your website

Embed a Google Map on your website; the Contact page or the footer are natural places for it. But do not just embed a map that points to your address. Point it to your actual Google Plus local listing instead.

#4: Include geo tags to show your location to search engines

Include geo tags if your business is location-specific. These tags can be generated via many online tools, such as http://www.geo-tag.de/generator/en.html, and are placed on every page of the website. They let the search engines know where you’re based and improve your rankings for local search terms.

 

<meta name="geo.position" content="latitude; longitude">
<meta name="geo.placename" content="Place Name">
<meta name="geo.region" content="Country Subdivision Code">
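Since all three tags share the same underlying values, a tiny helper can keep them consistent across pages. A sketch with hypothetical New York coordinates:

  # Sketch: generate the three geo meta tags from one set of values.
  # The coordinates and names are hypothetical.
  def geo_tags(lat, lng, place, region):
      return "\n".join([
          '<meta name="geo.position" content="%s; %s">' % (lat, lng),
          '<meta name="geo.placename" content="%s">' % place,
          '<meta name="geo.region" content="%s">' % region,
      ])

  print(geo_tags("40.7128", "-74.0060", "New York", "US-NY"))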

#5: Apply business-related rich snippets

You can add schema markup for NAP (name, address, phone), geo coordinates and specific business-related snippets, like event snippets, school snippets and more.

These rich snippets help search engines understand your site content better, and show your local listings to more relevant local searches.

You can look up the schema format at https://schema.org/docs/schemas.html. The schema generator http://www.microdatagenerator.com/ is helpful to quickly generate schema.

To verify that the schema and rich snippets are correctly applied, check with Google’s testing tool: https://search.google.com/structured-data/testing-tool.
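As an illustration, JSON-LD is one common way to express these schema.org types, and it can be generated from plain data. Here’s a sketch; every business detail below is a hypothetical placeholder, and the output should be run through the testing tool above:

  # Sketch: build a basic schema.org LocalBusiness JSON-LD block.
  # All business details are hypothetical placeholders.
  import json

  local_business = {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Acme Art School",
      "telephone": "+1-555-555-0123",
      "address": {
          "@type": "PostalAddress",
          "streetAddress": "123 Main St",
          "addressLocality": "New York",
          "addressRegion": "NY",
          "postalCode": "10001",
      },
      "geo": {"@type": "GeoCoordinates", "latitude": 40.7128, "longitude": -74.0060},
      "openingHours": "Mo-Fr 09:00-18:00",
  }

  print('<script type="application/ld+json">')
  print(json.dumps(local_business, indent=2))
  print('</script>')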

For painting school The Art Studio NY, the relevant snippets include:

Local Business snippet: school name, phone, location and hours of operation.

School snippet: Shows information about the school.

Event snippets: Display the class schedule. Each event/class is displayed on the SERP (search engine results page).

Person snippet: Applied to instructor pages to display instructor information.

#6: Optimize meta tags and page content for local keywords

Meta title & description tags: Include your city and state in your title tag and your meta description tag. This can boost clickthrough rates for local search results.

<title>New York Art Studio – Art Studio NYC – The Art Studio NY</title>

<meta name="description" content="The Art Studio NY provides a variety of art classes and event planning services. From New York art camps to bachelorette parties, we cater to all needs." />

Heading tag: You should include your city in your heading H1 tag.

For example: http://allentown.thebeautyinstituteskp.edu/


Page content: It’s also important, but often overlooked, to include your location within your page’s content. Make sure your website shows your location in as many places as it makes sense.

 

Logo: The logo of your site should be optimized with local keywords.

For example: With The Beauty Institute, the logo of the main site and logos of each campus site are optimized:

  • Edited logo file name to add local keyword
  • Edited alt tag of logo to add local keyword

<img src="http://thebeautyinstituteskp.edu/wp-content/themes/toniandguy/images/logo.png" alt="Beauty School | Cosmetology School | The Beauty Institute – Schwarzkopf Professional">


Image Alt tags: Your alt text (the text that describes your images) should also include your city.

<img class="alignnone wp-image-2157" src="http://tghairacademy.edu/wp-content/uploads/2015/11/banner2.jpg" alt="The Beauty Institute | Schwarzkopf Professional – Top beauty school in Pennsylvania" width="100%" />

 

#7: Create a separate web page for each location

If your business has multiple locations, it’s almost always better to build separate location pages with strong content for each location. For example, The Beauty Institute uses multi-site WordPress to create separate subdomains for each of its schools in different locations: Ambler, Philly, Allentown, and Stroudsburg.

It is important to make the content for each location unique, instead of reusing the same content template and only replacing location data. Here are a few ideas to make local content unique:

  • Add testimonials from customers in each city you service.
  • Differentiate what you do in one location vs. another. Offer city-specific, service-specific or product-specific specials, schedules and calendars.
  • Participate in or sponsor local events, and write about them to create unique content.
  • Interview experts inside or outside of your company to get city-specific or product-specific content.
  • Build a location-based blog for each location to keep your content fresh. Include your city in the alt tags for images and videos, and consider writing out transcriptions.

#8: Submit your site to local directories to build citations

A citation is an online reference to your business’s name, address and phone number (NAP). Local directories are a useful resource for building citations. These citations are valuable even if they aren’t linked, as long as they’re displaying your NAP consistently.

For multi-location or multi-practitioner businesses, point the link on all citations to the correct corresponding landing page on your website. For example, you should point all Philadelphia citations to your Philadelphia landing page on your site.

You can use tools such as Bright Local or Yext to find any existing citations you have, and then update them all at once to make them consistent. You can also use the tool to check out your competitors’ citations.

Bright Local citation tracker for The Art Studio NY:

[Screenshot: Bright Local citation tracker results]

Also, remember to set up alerts through social listening tools like Mention or Google Alerts to track new mentions of your competitors’ NAP listings.

#9: Ask your customers to write (a lot of) reviews

Customer reviews, especially from credible sources like Google My Business, Yelp and Facebook, have a solid impact on Google local rankings. Just don’t try to get too many reviews at once, because Google might find your activity suspicious.

 

#10: Put locations in Social Media profiles

Facebook, Twitter, Google Plus and Pinterest are the most popular social media sites where you should appear to promote your local business. Always include your contact information where you can, and make sure it is consistent with your website. (Twitter doesn’t allow addresses or phone numbers, though.)

Active social channels can be a strong local ranking factor, indicating that you have a credible and healthy business.

Summary

People are searching for lawyers, schools, restaurants and local shops online, especially via mobile devices. If you run a local business, follow the above tactics to optimize your website for local search and earn your share of new business!

Source: Search Engine Watch


How should brands with multiple brick-and-mortar locations structure their web properties? Columnist Andrew Beckman weighs in. 

Enterprise brands with a major brick-and-mortar presence have a unique challenge in digital marketing: connecting with consumers online, with the intent of ultimately encouraging them to visit a physical business location to make a purchase.

By creating a more user-focused experience that includes individual location landing pages for physical business locations, franchise systems and multi-location brands can turn user queries into business visits.

Developing location pages that are an extension of the primary brand domain allows brands to capture valuable real estate on search engine results pages (SERPs) and rank more prominently on hyper-local search terms above online directories like Yelp, Insider Pages and more.

Often, during the initial research phase of the customer journey, a consumer is looking for a product or service but is brand-agnostic. Leveraging this type of local SEO strategy can help drive in-store sales from these brand-agnostic consumers by tapping into coveted geo-specific, non-branded search terms and phrases.

However, many brand teams and franchisors have adopted a policy of allowing their franchisees and location owners to create their own landing pages and website domains instead of creating location pages on the primary brand domain. This type of independently executed approach can be found across a variety of industry verticals, and it can lead to the creation of domains like this: 

[Screenshot: a branded domain with a geo-modified URL]

As you can see, the domain is branded, but with a geo-modified URL.

Another version of this singular approach can involve using a non-branded, geo-modified domain, like this one for a men’s salon in Glendale, Colorado:

[Screenshot: a non-branded, geo-modified domain for a men’s salon in Glendale, Colorado]

There has been a debate going on for quite some time as to whether brands should take an approach that manages local SEO from the top down or allows individual locations to manage SEO on their own — a “centralized strategy” vs. a “decentralized strategy.” Let’s take a look at both approaches and evaluate the pros and cons.

Decentralized strategy

Establishing a decentralized strategy essentially involves allowing your individual franchisees to run their own digital marketing programs by themselves, with no guidelines, management or oversight by the brand or corporate teams. This approach can include both paid and organic media strategies and is often summed up in a fashion that resembles a “wild west” scenario with each franchisee responsible for its own local digital marketing.

When it comes to SEO specifically, the use of many domains — such as xyzdenver.com and xyzdallas.com — creates the challenge of having to manage each domain separately, costing the brand the opportunity to build valuable ranking authority around one primary domain. Each property also ends up needing its own web analytics setup, content strategy and more. Multiply that by thousands of locations, and you’re looking at a scenario that requires a massive amount of resources to manage.


Furthermore, if these sites are managed by a third-party vendor who decides to delete those previously indexed URLs when your relationship ends, you could find yourself in serious trouble trying to gain back the SERP equity you’ve lost.

The "holy grail" of search engine marketing is driving incremental visits from consumers who are not yet familiar with your brand and thus tend to find you when searching with geo-modified and/or non-branded terms, such as "hardware store near me." Trying to compete for these desired phrases across thousands of different domains makes your SEO practice vastly more complex, and it necessitates vast amounts of content production to populate and maintain multiple sites.

Additionally, this strategy means that valuable links to your brand will be spread across many local domains instead of being concentrated on one central source. This creates an environment that ultimately doesn’t build a tremendous amount of ranking authority because the search engine signals are being spread too thinly across multiple domains.

Note: Some multi-location brands have employed a tactic that leverages subdomains for location pages, where the local property shares a root domain with the main brand website but is sitting on an entirely separate IP block. There is some debate over whether Google treats these subdomains as separate websites versus a single website, and it seems to depend somewhat on how the subdomains are set up. Use caution if employing this strategy.

Centralized strategy

Establishing a centralized strategy involves ensuring the franchisor or brand management team is in ultimate control over the decision-making on key aspects of both national and hyper-local strategies. These key aspects can include the brand position, messaging, important seasonal and direct marketing initiatives and more.

When a single primary domain continues to build authority over time, you afford yourself a much better opportunity for link acquisition, one of the main signals that drives up your rankings and allows you to appear higher for non-branded, geo-modified queries.

From a local search standpoint, it's also important to create metro and location pages so that your brand appears in SERPs for non-branded and geo-modified phrases. Those pages can also be associated with your local business listings (e.g., on Google Maps and Apple Maps) by using the appropriate location URLs. It's important to ensure that those pages also have the appropriate content, metadata and structured data in place to appear in organic local search queries.
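
As a concrete illustration, the structured data on a location page might look like the sketch below, which uses Python to emit a schema.org LocalBusiness JSON-LD block. The business details are hypothetical placeholders, and the URL follows the example pattern discussed below:

    # Sketch: emit schema.org LocalBusiness JSON-LD for a location page.
    # All business details are hypothetical placeholders.
    import json

    location = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "YourBrand - Midtown East",
        "url": "https://www.yourbrand.com/new-york-city-ny/325-manhattan-midtown-east",
        "telephone": "+1-212-555-0125",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "325 E 52nd St",
            "addressLocality": "New York",
            "addressRegion": "NY",
            "postalCode": "10022",
            "addressCountry": "US",
        },
    }

    # The JSON output belongs in the page's <head> inside a
    # <script type="application/ld+json"> tag.
    print(json.dumps(location, indent=2))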

When using the centralized approach, an ideal location URL would look something like this: https://www.yourbrand.com/new-york-city-ny/325-manhattan-midtown-east.

Notice the location page is set up in a subdirectory of the root domain (not a subdomain) so that the root domain can assist this page with internal linking strategies to drive more ranking authority. If your brand is selling products in a physical store, creating pages under the primary location page to show things such as updated inventory by store location can help give the consumer more useful information as they decide where to make a purchase.
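
As a rough sketch of how URLs in this subdirectory pattern might be generated programmatically (the slugging rules here are an illustrative assumption, not a prescribed standard):

    # Sketch: build subdirectory location URLs matching the pattern above.
    # The slug rules are an assumption for illustration, not a standard.
    import re

    ROOT = "https://www.yourbrand.com"

    def slugify(text):
        # Lowercase, then collapse non-alphanumeric runs into hyphens.
        return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

    def location_url(city, state, store_id, neighborhood):
        city_slug = slugify(f"{city} {state}")
        store_slug = slugify(f"{store_id} {neighborhood}")
        return f"{ROOT}/{city_slug}/{store_slug}"

    print(location_url("New York City", "NY", "325", "Manhattan Midtown East"))
    # -> https://www.yourbrand.com/new-york-city-ny/325-manhattan-midtown-east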


Furthermore, creating metro pages within the main brand domain also allows a brand or franchisor to go after larger geographical and regional phrases, such as “Brooklyn athletic club,” by populating several locations under one domain and structuring the content to focus on those larger regional phrases.

An example of this type of metro page structure looks something like this: https://www.yourbrand.com/metro/new-york-city-ny.

In doing so, your brand now has hyper-local, regional and state-level structure in place to compete on all geographical levels.
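
Extending the same hypothetical sketch, metro pages can be generated by grouping child locations under their metro area, so that internal links flow from the metro page down to each location page:

    # Sketch: group hypothetical locations by metro to build metro pages
    # that link internally to their child location pages.
    from collections import defaultdict

    locations = [
        {"metro": "new-york-city-ny", "page": "325-manhattan-midtown-east"},
        {"metro": "new-york-city-ny", "page": "410-brooklyn-park-slope"},
        {"metro": "denver-co",        "page": "118-denver-lodo"},
    ]

    by_metro = defaultdict(list)
    for loc in locations:
        by_metro[loc["metro"]].append(loc["page"])

    for metro, pages in by_metro.items():
        print(f"https://www.yourbrand.com/metro/{metro}")
        for page in pages:
            # Each child link passes internal ranking signals down the hierarchy.
            print(f"  -> https://www.yourbrand.com/{metro}/{page}")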

In conclusion

Local landing page subdirectories can be very beneficial for the overall health of your local SEO strategy, particularly if your brand has a multitude of brick-and-mortar locations. As Google continues to evaluate and leverage different ranking factors, centralizing your efforts with a focus on one primary domain will benefit lower-level location pages, giving the physical business locations a sound SEO foundation that is set up to allow for more prominent organic rankings.

As search engines continue to refine their ranking algorithms, the battle to drive traffic from the SERPs only becomes more critical. Having a solid, centralized foundation built on individual location pages can give your brand the edge in capturing the attention of brand-agnostic consumers and turning them into customers.

Source: Search Engine Land


This is the wrap-up of the most popular posts and announcements on SEJ over the previous week. Newsletter subscribers are the first to receive this and other updates.

Penguin is Now Part of Google’s Core Algorithm

Penguin is now running in real time as a part of Google's core algorithm. The update is already in effect in all languages. Learn what else has changed based on some of the top requests from webmasters.

Everything You Need to Know About On-Page SEO

How are you optimizing your online presence to make your voice heard? It starts with ensuring you are up to date on on-page SEO basics to provide peak performance for your website and visibility for your target audience.

Popular Search Marketing Posts

Here is a rundown of the most popular posts on SEJ from last week:

  1. Penguin is Now a Real-Time Component of Google’s Core Algorithm, by Matt Southern
  2. Everything You Need to Know About On-Page SEO, by Ryan Clutter
  3. The Complete Guide to Mastering E-Commerce Product Page, by Stoney G deGeyter
  4. 10 Reasons Why Your E-Commerce SEO Campaign is Failing, by James Brockbank
  5. Google AdWords Introduces Cross-Device Remarketing, by Matt Southern
  6. The Difference Between Accelerated Mobile Pages (AMP) and Mobile-Friendly Pages, by Bharati Ahuja
  7. Managing Your Website’s SEO Health, by Melih Oztalay
  8. Google Displaying Vacation Prices on Front Page of Search Results, by Matt Southern
  9. Google Allo Keeps Messages Indefinitely, Raising Privacy Concerns, by Matt Southern
  10. Google Testing New Schema Markup for ‘Science Datasets’, by Matt Southern


Download This Week’s Episode of Marketing Nerds

In this Marketing Nerds episode, SEJ Chief Social Media Strategist Brent Csutoras was joined by Tom Anthony, Head of Research & Development at Distilled, to talk about the future of search, other technology trends, and how to put it all together to understand the main trajectories in the industry. Listen to the full episode, or download the MP3 here.

Source: Search Engine Journal

