
Ever had to search for something on Google, but you’re not exactly sure what it is, so you just use some language that vaguely implies it? Google’s about to make that a whole lot easier.

Google announced today it’s rolling out a new machine learning-based language understanding technique called Bidirectional Encoder Representations from Transformers, or BERT. BERT helps decipher your search queries based on the context of the language used, rather than individual words. According to Google, “when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English.”

Most of us know that Google usually responds to words rather than to phrases — and Google’s aware of it, too. In the announcement, Pandu Nayak, Google’s VP of search, called this kind of searching “keyword-ese”: “typing strings of words that they think we’ll understand, but aren’t actually how they’d naturally ask a question.” It’s amusing to see these kinds of searches — heck, Wired has made a whole cottage industry out of celebrities reacting to keyword-ese queries in its “Autocomplete” video series — but Nayak’s correct that this is not how most of us would naturally ask a question.

As you might expect, this subtle change might make some pretty big waves for potential searchers. Nayak said this “[represents] the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.” Google offered several examples of this in action, such as “Do estheticians stand a lot at work,” which apparently returned far more accurate search results.

I’m not sure if this is something most of us will notice (heck, I probably wouldn’t have noticed if I hadn’t read Google’s announcement), but it’ll sure make our lives a bit easier. The only reason I can see it not having a huge impact at first is that we’re now so used to keyword-ese, which is in some cases more economical to type. For example, I can search “What movie did William Powell and Jean Harlow star in together?” and get the correct result (Libeled Lady; not sure if that’s BERT’s doing or not), but I can also search “William Powell Jean Harlow movie” and get the exact same result.

BERT will only be applied to English-language searches in the U.S. for now, but Google is apparently hoping to roll it out to more countries soon.

[Source: This article was published in thenextweb.com by Rachel Kaser - Uploaded by the Association Member: Dorothy Allen]

Categorized in Search Engine

[Source: This article was published in seroundtable.com By Barry Schwartz - Uploaded by the Association Member: Bridget Miller]

Google's John Mueller said it again, do not worry about words or keywords in the URLs. John responded to a recent question on Twitter saying "I wouldn't worry about keywords or words in a URL. In many cases, URLs aren't seen by users anyway."


The tweet references a video from Matt Cutts back in 2009, which says keywords in URLs play a small role in rankings, though a really small one.

John Mueller called keywords in URLs overrated in 2017, and said they were a small ranking factor back in 2016.

Forum discussion at Twitter.

Categorized in Search Engine

Source: This article was published in searchengineland.com by Ginny Marvin - Contributed by Member: Jeremy Frink

Here's what some marketers are saying about the move to include same meaning queries in exact match close variants.

Marketer reactions to the news that Google is yet again degrading the original intent (much less meaning) of exact match to include “same meaning” close variants range from pessimism to ho-hum to optimism.

Expected impact on performance

“The impact of this will probably be most felt by accounts where exact match has historically been successful and where an exact match of a query made a difference in conversions — hence the reason you’d use exact in the first place,” said Julie Friedman Bacchini, digital consultant and president of Neptune Moon.

Friedman Bacchini said the loss of control with exact match defeats the match type’s purpose. Many marketers use exact match to be explicit — exacting — in their targeting and expect a match type called “exact” to be just that.

Brad Geddes, the co-founder of ad testing platform AdAlysis and head of consultancy Certified Knowledge, said one problem with expanding the queries that can trigger an exact match keyword is that past changes have shown it can affect the overall performance of exact match. “The last change meant that our ‘variation matches’ had worse conversion rates than our exact match and that we lowered bids on most exact match terms. This change might just drive us from using it completely, or really hitting the negative keywords.”

Like Geddes, Andy Taylor, associate director of research at performance agency Merkle, also said they saw an increase in traffic assigned as exact match close variants with the last change, “and those close variants generally convert at a lower rate than true exact matches.”

Yet, others who participated in the test see the loosening of the reins as a positive action.

One of the beta testers for this change was ExtraSpace Storage, a self-storage company in the U.S. with locations in more than 40 states. The company says it saw positive results from the test.

“The search queries were relevant to our industry and almost all of our primary KPIs saw an overall improvement,” said Steph Christensen, senior analyst for paid search at ExtraSpace.

Christensen said that during the test they did not do any keyword management, letting it run in a “normal environment to give it the best chance to provide the truest results.” She says they will continue to watch performance and make adjustments as needed after it’s fully rolled out by the end of October.

Advertisers as machine learning beneficiaries or guinea pigs

A big driver of these changes, of course, is machine learning. The machine learning/artificial intelligence race is on among Google and the other big tech companies.

Google says its machine learning is now good enough to determine when a query has the same intent as a keyword with a high enough rate of success that advertisers will see an overall performance lift.

Another way to look at the move, though, is that by opening up exact match to include same meaning queries, Google gets the benefit of having marketers train its algorithms by taking action on query reports.

Or as Geddes put it: “Advertisers are basically paying the fee for Google to try and learn intent.”

Geddes’ point is that this change will help Google’s machine learning algorithms improve understanding of intent across millions of queries through advertiser actions and budgets.

“The fact that Google doesn’t understand user intent coupled with how poor their machine learning has been at times, means we might just move completely away from exact match,” says Geddes.

Of the example Google highlighted in its announcement, Geddes says, “If I search for Yosemite camping; I might want a blog article, stories, social media, or a campground. If I search for a campground — I want a campground.” (As an aside, from what I’ve found it appears Google doesn’t even monetize “Yosemite camping” or “Yosemite campground” results pages that it used as examples.)

Expected workflow changes

One big thing Google has emphasized is that these close variants changes allow advertisers to focus on things other than building out giant keyword lists to get their ads to show for relevant queries. Rather than doing a lot of upfront keyword research before launching, the idea is that the management will happen after the campaign runs and accumulates data. Marketers will add negatives and new keywords as appropriate. But this reframing of the management process and what amounts to a new definition of exact match has marketers thinking anew about all match types.

“The further un-exacting of exact match has me looking at phrase match again,” says Friedman Bacchini. “I definitely see it impacting use of negatives and the time involved to review SQRs and apply negatives properly and exhaustively.”

Taylor agrees. “This change places more importance on regularly checking for negatives, but that has already been engrained in our management processes for years and won’t be anything new.”

Geddes said that advertisers might come up against negative keyword limits, which he has seen happen on occasion. Rather than relying heavily on adding negatives, he says they may consider only using phrase match going forward.

In addition to having ads trigger for queries that aren’t relevant or don’t convert well, there’s the matter of having the right ad trigger for a query when you have close variants in an account already.

Matt van Wagner, president and founder of search marketing firm Find Me Faster, says the agency will be monitoring the impact before assessing workflow adjustments, but is not anticipating performance lifts.

“We’ll watch search queries and how, or if, traffic shifts from other ad groups as well as CPC levels. We expect this to have neutral impact at best,” says van Wagner, “since we believe we have our keywords set to trigger on searches with other match types.”

Along those lines, Geddes says it will be critical to watch for duplicate queries triggering keywords across an account to make sure the right ad displays. It puts new focus on negative keyword strategies, says Geddes:

Google will show the most specific matching keyword within a campaign, but won’t do it across the account. So if I have both terms in my account as exact match (“Yosemite camping” and “Yosemite campground”), with one a much higher bid than the other, my higher bid keyword will usually show over my actual exact match word in a different campaign. That means that I now need to also copy my exact match keywords from one campaign and make them exact match negatives in other campaigns that are already using exact match just to control ad serving and bidding. I should never have to do that.
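Geddes’ description amounts to a small selection rule, and a toy model makes the problem concrete. Everything below is hypothetical: the campaign names, the bids, and the one-entry synonym table standing in for Google’s “same meaning” matching. Real Ads serving is far more complex.

```python
# Toy model of the serving behavior Geddes describes: eligible keywords are
# gathered across campaigns, an exact-match negative removes a campaign's
# candidates, and across campaigns the highest bid wins rather than the
# most specific match. All names, bids, and synonyms here are made up.

SYNONYMS = {"campground": "camping"}  # stand-in for "same meaning" matching

def normalize(text):
    return frozenset(SYNONYMS.get(w, w) for w in text.lower().split())

def serve(query, campaigns):
    """Return the (campaign, keyword, bid) that would serve for `query`."""
    candidates = []
    for name, campaign in campaigns.items():
        if query in campaign["negatives"]:
            continue  # exact-match negative blocks this whole campaign
        for keyword, bid in campaign["keywords"].items():
            if normalize(keyword) == normalize(query):
                candidates.append((name, keyword, bid))
    # Across campaigns, the highest bid wins -- not the most specific match.
    return max(candidates, key=lambda c: c[2], default=None)

campaigns = {
    "brand":   {"keywords": {"yosemite camping": 1.00},    "negatives": set()},
    "generic": {"keywords": {"yosemite campground": 2.50}, "negatives": set()},
}

# The higher-bid close variant outbids the true exact match...
print(serve("yosemite camping", campaigns))
# ...until the query is added as a negative in the other campaign.
campaigns["generic"]["negatives"].add("yosemite camping")
print(serve("yosemite camping", campaigns))
```

Running the sketch shows the higher-bid close variant in “generic” serving first; only after the query is copied in as a negative does the true exact-match keyword in “brand” serve. That cross-campaign bookkeeping is exactly the extra work Geddes objects to.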

Measuring impact can be challenging

The effects of the change will take some time to unfold. Taylor says it took several months to see the impact of the last change to exact match close variants.

It’s difficult to calculate the incremental effect of these changes to close variants, in part, says Taylor, because some close variant traffic comes from keywords — close variants or other match types — that are already elsewhere in the account.

“Google gives a nod to this in its recent announcement, saying that ‘Early tests show that advertisers using mostly exact match keywords see 3 percent more exact match clicks and conversions on average, with most coming from queries they aren’t reaching today,’” Taylor points out.

Another complicating factor, particularly for agencies, is that the effects of these changes don’t play out uniformly across accounts. Taylor shares an example:

An advertiser saw traffic on one of its key brand keywords shift to a different brand keyword several months after the close variants change last year.

“The normal reaction might be to use negatives to get that traffic back over to the correct keyword, but we were getting a better CPC and still getting the same traffic volume with the new variation.

“It didn’t make much sense, especially given Google’s continued assertion even in the current announcement that ‘Google Ads will still prefer to use keywords identical to the search query,’ but if the clicks are cheaper, the clicks are cheaper. This also speaks to how there’s not really a universal response to deploy for changes in close variants, aside from being mindful of what queries are coming in and how they’re performing.”

Looking ahead

Performance advertisers go where they get the best results.

“At the end of the day, the question is if poorer converting close variant queries might pull keyword performance down enough to force advertisers to pull back on bids and reduce overall investment,” said Taylor. “Generally speaking, giving sophisticated advertisers greater control to set the appropriate bids for each query (or any other segment) allows for more efficient allocation of spend, which should maximize overall investment in paid search.”

Geddes says their “priority is to make sure our Bing Ads budgets are maxed and that we’re not leaving anything on the table there. If our [Google] results get worse, we’ll also move some budgets to other places. But this might be one where we really have to do another account organization just to get around Google’s decisions.”

After the change has fully rolled out and they have enough data to act on, ExtraSpace’s Christensen said they will evaluate again. “Since we have such a large [account] build, when we do decide to make any changes we will have to show how we can do this at scale and maintain performance.”

Bacchini calls attention to the current misnomer of exact match and says Google should get rid of exact match altogether if it’s going to take away the original control of exact match. “It is particularly sneaky when you think of this move in terms of less sophisticated advertisers,” said Bacchini. “If they did not click on the ‘Learn More’ link below the formatting for entering in match types for keywords, how exactly would they know that Google Ads does not really mean exact?”

Categorized in Search Engine

Source: This article was published in irishtechnews.ie by Sujain Thomas - Contributed by Member: Carol R. Venuti

Well, Google does not know you personally, so there is no reason for it to hate you. If you are writing and still not getting that first-page ranking on the search engine, it means something is not right on your side. First of all, let’s get some ideas straight. How do you think search engines rank a web page? A few lines of code alone will not determine whether a page is capable enough to be placed on the first page of the search engine. Search engines are always on the lookout for signals to rank any page. So, it is easier for you to tweak an article to send those signals to search engines and enjoy a huge round of traffic.

Starting with the primary point:

To reach that huge audience, you need to start with keyword research. It is a topic every blogger has to cover at least once, and something they need to work on from the very first day of their blogging life. Nearly every SEO blogger has used Google Keyword Planner at some point. If you haven’t heard of it, you are missing out on a lot for your business’s growth.

More on Google Keyword Planner:

There are many keyword research tools available in the market, but Google Keyword Planner is at the top of the list, and it is one of the most frequently mentioned keyword research tools you will come across. Google Keyword Planner is an official tool from Google, offering you a traffic estimation for targeted keywords. It also helps users find related and relevant keywords matching your niche. There are some important points you need to know about Google Keyword Planner before you can actually start using it.

  • To use the Google Keyword Planner tool, you need to register with Google and have an AdWords account. The tool is free of cost, and you don’t have to spend a single penny to use it. You can create an AdWords account in a few simple steps and start using the tool immediately.
  • If you want, you can search for current Google AdWords coupons, which will help you create a free account for your own use. That lets you start using the Google Keyword Planner tool right away.
  • The tool is aimed mainly at AdWords advertisers. Even so, it provides a great deal of useful information when it is time to find the right keywords for your blog and for articles relevant to your business.

Log in online and get a clear idea of what the homepage of this tool looks like. You just have to enter the target keyword in the given search bar and you will see search results almost immediately. Later, you can add filters if you want to.

Categorized in Online Research

Online research involves collecting information from the internet. It saves cost, is impactful and offers ease of access, which makes it valuable for gathering information. Tools such as questionnaires, online surveys, polls and focus groups aid market research. You can conduct market research with little or no investment for e-commerce development.

Search Engine Optimization makes sure that your research is discoverable. If your research is highly ranked, more people will find, read and cite it.

Steps to improve the visibility of your research include:

  1. The title gives the reader a clear idea of what the research is about. The title is the first thing a reader sees. Make your research title relevant and consistent. Use a search engine friendly title. Make sure your title provides a solution.
  2. Keywords are key concepts in your research output. They index your article and make sure your research is found quickly. Use keywords that are relevant and common to your research field. Places to use relevant keywords include title, heading, description tags, abstract, graphics, main body text and file name of the document.
  3. The abstract convinces readers to read an article and helps your paper surface in a search.
  4. When others cite your research your visibility and reputation will increase. Citing your earlier works will also improve how search engines rank your research.
  5. External links from your research to blogs, personal webpage, and social networking sites will make your research more visible.
  6. The type of graphics you use affects your ranking. Use vectors such as .svg, .eps, .as and .ps. Vectors improve your research optimization.
  7. Make sure you are consistent with your name across all publications. Be distinguishable from others.
  8. Use social media sites such as Facebook, Twitter, and Instagram to publicize your research. Inform everyone. Share your links everywhere.
  9. Make sure your research is on a platform indexed properly by search engines.

Online research is developing and can take place in email, chat rooms, instant messaging and web pages.  Online research is done for customer satisfaction, product testing, audience targeting and database mining.

Ethical dilemmas in online research include:

  1. How to get informed consent from the participants being researched?
  2. What constitutes privacy in online research?
  3. How can researchers prove the real identity of participants?
  4. When is covert observation justifiable?

Knowing how to choose resources when doing online research can help you avoid wasted time.

WAYS TO MAKE ONLINE RESEARCH EASY AND EFFECTIVE

  1. Ask: Find out which resources knowledgeable people recommend for your research. An expert can point you to valuable online journals or websites.
  2. Fact from fiction: Know the sites that are best for your research topic. Make sure the websites you have chosen are valuable and up to date. Sites with .edu and .gov are usually safe. If you use a .org website, make sure it is proper, reliable and credible. If you use a .com site, check whether the site advertises; bias is a possibility.

Social media sites, blogs, and personal websites will give you personal opinions and not facts.

  3. Search Smartly: Use established search engines. Use specific terms. Try alternative searches. Use search operators or advanced search. Know the best sites.
  4. Focus: Do not be distracted when conducting online research. Stay focused and away from social media sites.
  5. Cite Properly: Cite the source properly. Do not just copy and paste, for plagiarism can affect your work.
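The domain rules of thumb above can be sketched as a crude first-pass filter. The hint table is an assumption for illustration only; a .com can be excellent and a .org can be biased, so this never substitutes for actually evaluating the source.

```python
# Crude first-pass reliability hint based on a URL's top-level domain,
# following the rules of thumb above. Illustrative only -- always verify
# the actual source.
from urllib.parse import urlparse

TRUST_HINTS = {
    "edu": "usually safe",
    "gov": "usually safe",
    "org": "check that it is reliable and credible",
    "com": "check for advertising; bias is possible",
}

def tld_hint(url):
    host = urlparse(url).hostname or ""
    tld = host.rsplit(".", 1)[-1]  # last dot-separated label of the host
    return TRUST_HINTS.get(tld, "unknown; verify manually")

print(tld_hint("https://www.loc.gov/collections/"))
```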

When conducting research, use legitimate and trustworthy resources. Sites to help you find articles and journals that are reliable include:

  1. BioMedCentral
  2. Artcyclopedia
  3. FindArticles.com
  4. Digital History
  5. Infomine
  6. Internet Public Library
  7. Internet History Sourcebooks
  8. Librarians Internet Index
  9. Intute
  10. Library of Congress
  11. Project Gutenberg
  12. Perseus Digital Library
  13. Research Guide for Students.

No matter what you are researching the internet is a valuable tool. Use sites wisely and you will get all the information you need.

ONLINE RESEARCH METHODS

  1. Online focus group: This is used for business-to-business service research, consumer research and political research. Pre-selected participants who represent specific interests are invited as part of the focus group.
  2. Online interview: This is done using computer-mediated communication (CMC) such as SMS or email. Online interviews are synchronous or asynchronous. In synchronous interviews, responses are received in real time, for example online chat interviews. In asynchronous interviews, responses are not in real time, as with email interviews. Online interviews use feedback on topics to get insight into the participants’ attitudes, experiences or ideas.
  3. Online qualitative research: This includes blogs, communities and mobile diaries. It saves cost and time and is convenient. Respondents for online qualitative research can be recruited from surveys, databases or panels.
  4. Social network analysis: This has gained acceptance. With social network analysis, researchers can measure the relationships between people, groups, organizations, URLs and so on.

Other methods of online research include cyber-ethnography, online content analysis, and Web-based experiments.

TYPES OF ONLINE RESEARCH

  1. Customer satisfaction research: This occurs through phone calls or emails. Customers are asked to give feedback on their experience with a product, service or an organization.
  2. New product research: This is carried out by testing a new product with a group of selected individuals and immediately collecting feedback.
  3. Brand loyalty: This research seeks to find out what attracts customers to a brand. The research is to maintain or improve a brand.
  4. Employee satisfaction research: With this research, you can learn what employees think about working for your organization. Your organization’s morale can contribute to its productivity.

When conducting online research, ask open-ended questions and show urgency, but be tolerant.

Written by Junaid Ali Qureshi, a digital marketing specialist who has helped several businesses gain traffic, outperform the competition and generate profitable leads. His current ventures include Progostech, Magentodevelopers.online, eLabelz, Smart Leads.ae, Progos Tech and eCig.

Categorized in Online Research

Source: This article was published in business.com by Katharine Paljug - Contributed by Member: Grace Irwin

Good content marketing, which makes use of long-tail keywords, can be key to making sure your small business ranks well on Google.

As the internet continues to change consumer behavior, more marketers are turning to content marketing to reach customers. But the rules for this new form of consumer outreach are different than those of traditional ads. Rather than creating a slogan or image to catch customers' attention, content marketing requires the careful use of long-tail keywords.

What are long-tail keywords?

Trying to figure out long-tail keywords can feel overwhelming, especially if you aren’t a marketing professional. For instance, a simple Google search for the phrase returns more than 77 million results. At its core, a long-tail keyword is a phrase of several words that indicates precisely what a user has typed into Google. If you tailor your SEO properly, you will rank high in the search results for the phrase that directly corresponds to what your customers are searching for online as it relates to your business.

For example, say your Atlanta-based company makes doodads that are only meant for use within restaurants and bars. Someone looking to buy those doodads might search for “where to find doodads for restaurants in Atlanta.” And if you’re positioned well in search results (because you’ve made effective use of that long-tail keyword phrase on your website), you may show up on the first or second page of search results.

To use long-tail keywords, you don't need to know everything about them. You just need to understand six things about the changing world of marketing, how long-tail keywords fit in that picture and where you can find them. The answer, generally speaking, is content marketing.

Content marketing has a low cost and high ROI.

Though you can still purchase ads online, one of the most cost-effective and valuable ways to reach customers is through content marketing. That involves creating online material, such as blog posts, website pages, videos or social media posts, that do not explicitly promote your brand. Instead, the messaging stimulates interest in your business and products by appealing to the needs and interests of your target customers. 

Content marketing is a form of inbound marketing, bringing consumers to you and gaining their trust and loyalty. It generates more than three times as many leads as traditional outbound marketing while costing about 62 percent less. 

However, blogging and other forms of content marketing aren't effective unless you make effective use of keywords, particularly long-tail keywords.

Long-tail keywords are essential to content marketing.

When creating online content, you want customers to be able to find it. The most common way that customers find content online is through search engines. The average business website receives more than three-quarters of its traffic from search, but that level of traffic is impossible without using keywords. 

When you incorporate relevant keywords in your content, you optimize your website for search, making it more likely that customers searching for the keywords you have used will find your business. This search engine optimization, or SEO, increases your web traffic and exposes new audiences to your brand. 

Just using keywords isn't enough. To create effective content that makes it to the top of a search engine results page, you need to use a specific type of keyword known as long-tail keywords.

Long-tail keywords attract customers who are ready to buy.

Long-tail keywords are phrases of three or more words, but their length isn’t where the name comes from. “Long tail” describes the portion of the search-demand curve where these keywords live.

In statistics, the long tail is the portion of a distribution graph that tapers off gradually rather than ending sharply. This tail usually has many small values and goes on for a long time. 

When it comes to online marketing, a small number of simple keywords are searched for very frequently, while keywords that fall into the long-tail are searched for more sporadically. For example, a simple keyword that is searched for hundreds of thousands of times would be "fitness." A long-tail keyword would be "dance fitness class in Boston." Because the tail is so long and there are so many of them, these keywords account for about 70 percent of all online searches, even though the individual keywords themselves are not searched as often. 
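The arithmetic behind that kind of split can be sketched with a quick simulation. The Zipf-style distribution, the exponent and the cutoffs below are illustrative assumptions rather than Google data, so the exact percentage will vary; the point is that the many rarely-searched phrases collectively outweigh the few popular head terms.

```python
# Simulate a search-demand curve: the i-th most popular query gets volume
# proportional to 1/i**s (a Zipf-like long tail), then measure what share
# of all searches falls outside the top "head" queries. The exponent and
# cutoffs are illustrative assumptions, not Google data.

def zipf_volumes(n_queries, s=0.8):
    return [1 / i**s for i in range(1, n_queries + 1)]

def tail_share(volumes, head_size):
    """Fraction of total search volume outside the `head_size` top queries."""
    total = sum(volumes)
    return 1 - sum(volumes[:head_size]) / total

volumes = zipf_volumes(1_000_000)
print(f"Share of searches in the tail: {tail_share(volumes, 1_000):.0%}")
```

With these made-up parameters, the million minus one thousand tail queries carry well over half of all simulated search volume, which is the shape of the claim above.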

Long-tail keywords are not searched for as frequently as simple keywords like "hotel" or "socks," because they don't apply to everyone. They're what a customer plugs into a search engine when they know exactly what they want and need an online search to help them find it. These search terms communicate a consumer's intent – especially their intent to buy – rather than their general interest. 

This means that when you use the right long-tail keywords, you appeal directly to customers who are looking for what you are selling. You want to determine what your audience might be searching and then work those phrases into your content marketing.

Look for high search volume and low competition.

Because long-tail keywords are so niche, there is much less competition for them. If your long-tail keyword is "dance fitness class in Boston," you aren't competing for search traffic with every dance class out there or even every gym in Boston. You are only competing with Boston studios that offer dance fitness classes. That is a much smaller field. 

However, you still need enough people searching for your keywords to make your investment in content marketing worthwhile. The best long-tail keywords are low in competition but relatively high in search volume. High volume in this context doesn’t mean thousands of searches every day; several dozen to a couple hundred searches shows that many of your potential customers are actively searching for that keyword.
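Filtering a keyword list on those two criteria is a few lines of code. The rows and thresholds below are hypothetical; real numbers would come from a keyword research tool.

```python
# Hypothetical keyword-tool output: (keyword, monthly searches, competition
# score from 0 to 1). The data and thresholds are illustrative; tune the
# cutoffs to your own niche.
CANDIDATES = [
    ("fitness",                                  250_000, 0.95),
    ("dance fitness class in boston",                140, 0.20),
    ("zumba shoes",                                9_000, 0.80),
    ("beginner dance fitness class near fenway",       8, 0.05),
]

def promising(rows, min_searches=50, max_competition=0.40):
    """Keep keywords with enough volume and little enough competition."""
    return [kw for kw, searches, comp in rows
            if searches >= min_searches and comp <= max_competition]

print(promising(CANDIDATES))
```

Here only the Boston class clears both bars: the head term “fitness” is too competitive, and the hyper-specific Fenway phrase has too little volume to justify content.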

There are many tools to help you find long-tail keywords.

The best way to find low-competition, high-volume long-tail keywords is with a keyword tool. These tools allow you to plug in a seed keyword related to your business or audience, and they will return relevant long-tail keywords. 

Keyword planners, such as Answer the Public and Keywords Everywhere, are free, though the number of keywords and the information they provide about them is limited. You can also plug seed keywords into a Google search and use the auto-complete and related search term features to find new long-tail keywords. 

Paid keyword research tools, such as LongTailPro or Ahrefs Keyword Explorer, return not only thousands of relevant long-tail keywords but also statistics on the number of monthly searches and the level of competition for those keywords. They also include tools for project planning, search filters, and additional traffic stats. However, these tools can be expensive, costing several hundred dollars to use. 

The type of tool you select depends on your budget and the scope of your content marketing, and the keywords that get you the best results depend on your business and your customers.

Long-tail keywords tell you what content to create.

If you know who your target customer is, you can use their interests and concerns as seed keywords to find related long-tail keywords. For example, if you know that your customers are interested in travel, you can search for those words to find related keywords such as “which travel insurance is best” or “tax deductible travel expenses.”

Once you have a list of these high-volume, low-competition keywords, they provide you with ideas for blog posts, social media, video content, web pages and more. You can create a series of blog posts comparing kinds of travel insurance. You can make an infographic about tax-deductible travel expenses. Rather than wondering what content to create, the long-tail keywords themselves can serve as your topics. 

Creating content around these relevant keywords automatically optimizes your web platforms for search. And since your initial seed keywords were based on what you know about your target customer, you are designing content that directly appeals to the people searching for a business like yours. Using long-tail keywords effectively works with search engines to bring customers directly to your website, rather than hoping that they see an ad and decide your business is worth visiting.

Categorized in Online Research

Source: This article was published searchenginejournal.com By Matt Southern - Contributed by Member: Corey Parker

Google’s John Mueller revealed that the search engine’s algorithms do not punish keyword stuffing too harshly.

In fact, keyword stuffing may be ignored altogether if the content is found to otherwise have value to searchers.

This information was provided on Twitter in response to users inquiring about keyword stuffing. More specifically, a user was concerned about a page ranking well in search results despite obvious signs of keyword repetition.

Prefacing his statement with the suggestion to focus on one’s own content rather than someone else’s, Mueller goes on to say that there are over 200 factors used to rank pages and “the nice part is that you don’t have to get them all perfect.”

When the excessive keyword repetition was further criticized by another user, Mueller said this practice shouldn’t result in a page being removed from search results, and “boring keyword stuffing” may be ignored altogether.


“Yeah, but if we can ignore boring keyword stuffing (this was popular in the 90’s; search engines have a lot of practice here), there’s sometimes still enough value to be found elsewhere. I don’t know the page, but IMO keyword stuffing shouldn’t result in removal from the index.”

There are several takeaways from this exchange:

  • An SEO’s time is better spent improving their own content, rather than trying to figure out why other content is ranking higher.
  • Excessive keyword stuffing will not result in a page being removed from indexing.
  • Google may overlook keyword stuffing if the content has value otherwise.
  • Use of keywords is only one of over 200 ranking factors.

Overall, it’s probably not a good idea to overuse keywords, because it arguably makes the content less enjoyable to read. However, by Mueller’s account, keyword repetition on its own will not prevent a piece of content from ranking in search results.


 Source: This article was published forbes.com By Jayson DeMers - Contributed by Member: William A. Woods

Some search optimizers like to complain that “Google is always changing things.” In reality, that’s only a half-truth; Google is always coming out with new updates to improve its search results, but the fundamentals of SEO have remained the same for more than 15 years. Only some of those updates have truly “changed the game,” and for the most part, those updates are positive (even though they cause some major short-term headaches for optimizers).

Today, I’ll turn my attention to semantic search, a search engine improvement that came along in 2013 in the form of the Hummingbird update. At the time, it sent the SERPs into a somewhat chaotic frenzy of changes but introduced semantic search, which transformed SEO for the better—both for users and for marketers.

What Is Semantic Search?

I’ll start with a briefer on what semantic search actually is, in case you aren’t familiar. The so-called Hummingbird update came out back in 2013 and introduced a new way for Google to consider user-submitted queries. Up until that point, the search engine was built heavily on keyword interpretation; Google would look at specific sequences of words in a user’s query, then find matches for those keyword sequences in pages on the internet.

Search optimizers built their strategies around this tendency by targeting specific keyword sequences, and using them, verbatim, on as many pages as possible (while trying to seem relevant in accordance with Panda’s content requirements).

Hummingbird changed this. Now, instead of finding exact matches for keywords, Google looks at the language used by a searcher and analyzes the searcher’s intent. It then uses that intent to find the most relevant search results. It’s a subtle distinction, but one that demanded a new approach to SEO; rather than focusing on specific, exact-match keywords, you had to start creating content that addressed a user’s needs, using more semantic phrases and synonyms for your primary targets.
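To make the distinction concrete, here is a toy sketch (in Python, and in no way Google’s actual algorithm) of the difference between literal keyword matching and a synonym-aware match. The synonym table is invented purely for illustration.

```python
# Toy illustration: exact-match retrieval vs. a synonym-aware match.
# The synonym table is invented for this example.

SYNONYMS = {
    "film": {"movie", "picture"},
    "movie": {"film", "picture"},
    "buy": {"purchase"},
    "purchase": {"buy"},
}

def expand(words):
    """Expand each word with its known synonyms."""
    expanded = set(words)
    for w in words:
        expanded |= SYNONYMS.get(w, set())
    return expanded

def exact_match(query, page):
    """Pre-Hummingbird style: require literal word overlap."""
    return bool(set(query.split()) & set(page.split()))

def semantic_match(query, page):
    """Hummingbird-style sketch: synonyms count as matches too."""
    return bool(expand(query.split()) & expand(page.split()))

print(exact_match("buy a film", "purchase this movie"))    # no literal overlap
print(semantic_match("buy a film", "purchase this movie")) # synonyms connect them
```

Real semantic search is vastly more sophisticated, but the shift is the same: the second function matches on meaning where the first demands identical strings.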

Voice Search and Ongoing Improvements

Of course, since then, there’s been an explosion in voice search—driven by Google’s improved ability to recognize spoken words, its improved search results, and the increased need for voice searches with mobile devices. That, in turn, has fueled even more advances in semantic search sophistication.

One of the biggest advancements, an update called RankBrain, utilizes an artificial intelligence (AI) algorithm to better understand the complex queries that everyday searchers use, and provide more helpful search results.

Why It's Better for Searchers

So why is this approach better for searchers?

  • Intuitiveness. Most of us have already taken for granted how intuitive searching is these days; if you ask a question, Google will have an answer for you—and probably an accurate one, even if your question doesn’t use the right terminology, isn’t spelled correctly, or dances around the main thing you’re trying to ask. A decade ago, effective search required you to carefully calculate which search terms to use, and even then, you might not find what you were looking for.
  • High-quality results. SERPs are now loaded with high-quality content related to your original query—and oftentimes, a direct answer to your question. Rich answers are growing in frequency, in part to meet the rising utility of semantic search, and it’s giving users faster, more relevant answers (which encourages even more search use on a daily basis).
  • Content encouragement. The nature of semantic search forces search optimizers and webmasters to spend more time researching topics to write about and developing high-quality content that’s going to serve search users’ needs. That means there’s a bigger pool of content developers than ever before, and they’re working harder to churn out readable, practical, and in-demand content for public consumption.

Why It's Better for Optimizers

The benefits aren’t just for searchers, though—I’d argue there are just as many benefits for those of us in the SEO community (even if it was an annoying update to adjust to at first):

  • Less pressure on keywords. Keyword research has been one of the most important parts of the SEO process since search first became popular, and it’s still important to gauge the popularity of various search queries—but it isn’t as make-or-break as it used to be. You no longer have to ensure you have exact-match keywords at exactly the right ratio in exactly the right number of pages (an outdated concept known as keyword density); in many cases, merely writing about the general topic is incidentally enough to make your page relevant for your target.
  • Value Optimization. Search optimizers now get to spend more time optimizing their content for user value, rather than keyword targeting. Semantic search makes it harder to accurately predict and track how keywords are specifically searched for (and ranked for), so we can, instead, spend that effort on making things better for our core users.
  • Wiggle room. Semantic search considers synonyms and alternative wordings just as much as it considers exact match text, which means we have far more flexibility in our content. We might even end up optimizing for long-tail phrases we hadn’t considered before.

The SEO community is better off focusing on semantic search optimization, rather than keyword-specific optimization. It’s forcing content producers to produce better, more user-serving content, and relieving some of the pressure of keyword research (which at times is downright annoying).

Take this time to revisit your keyword selection and content strategies, and see if you can’t capitalize on these contextual queries even further within your content marketing strategy.


 Source: This article was published searchengineland.com By R Oakes - Contributed by Member: Deborah Tannen

Ever wondered how the results of some popular keyword research tools stack up against the information Google Search Console provides? This article compares data from Google Search Console (GSC) search analytics against notable keyword research tools, along with what you can extract from Google itself.

As a bonus, you can get related searches and people also search data results from Google search results by using the code at the end of this article.

This article is not meant to be a scientific analysis, as it only includes data from seven websites. To keep the data reasonably representative, we selected websites from the US and the UK across different verticals.

Procedure

1. Started by defining industries with respect to various website verticals

We used SimilarWeb’s top categories to define the groupings and selected the following categories:

  • Arts and entertainment.
  • Autos and vehicles.
  • Business and industry.
  • Home and garden.
  • Recreation and hobbies.
  • Shopping.
  • Reference.

We pulled anonymized data from a sample of our websites and were able to obtain unseen data from search engine optimization specialists (SEOs) Aaron Dicks and Daniel Dzhenev. Since this initial exploratory analysis involved quantitative and qualitative components, we wanted to spend time understanding the process and nuance rather than making the concessions required in scaling up an analysis. We do think this analysis can lead to a rough methodology for in-house SEOs to make a more informed decision on which tool may better fit their respective vertical.

2. Acquired GSC data from websites in each niche

Data was acquired from Google Search Console programmatically, using a Jupyter notebook.

Jupyter Notebook is an open-source web application that allows you to create and share documents containing live code, equations, visualizations and narrative text. We used it to extract website-level data from the Search Analytics API daily, providing much greater granularity than is currently available in Google’s web interface.
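As a rough illustration, the daily pull might be structured like the following sketch. The request body follows the Search Analytics API’s query format; the authorized service object would come from the google-api-python-client library and is only hinted at in a comment, since the exact setup depends on your credentials.

```python
# Sketch of a daily Search Analytics pull. Only the payload builder runs here;
# the actual API call (commented out) assumes an authorized service object
# built with google-api-python-client.

def build_query_body(day, row_limit=25000):
    """Build a one-day Search Analytics request, split out by query and page."""
    return {
        "startDate": day,
        "endDate": day,          # one day per request gives daily granularity
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,
    }

body = build_query_body("2018-06-01")

# With an authorized service, the call would look roughly like:
# rows = service.searchanalytics().query(
#     siteUrl="https://example.com/", body=body
# ).execute().get("rows", [])
```

Looping over a date range and issuing one such request per day is what yields the finer granularity mentioned above.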

3. Gathered ranking keywords of a single internal page for each website

Since home pages tend to gather many keywords that may or may not be topically relevant to the actual content of the page, we selected an established and performing internal page so the rankings are more likely to be relevant to the content of the page. This is also more realistic since users tend to do keyword research in the context of specific content ideas.

The image above is an example of the home page ranking for a variety of queries related to the business but not directly related to the content and intent of the page.

We removed brand terms and restricted the Google Search Console queries to first-page results.

Finally, we selected a head term for each page. The phrase “head term” is generally used to denote a popular keyword with high search volume. We chose terms with relatively high search volume, though not the absolute highest. Of the queries with the most impressions, we selected the one that best represented the page.
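A minimal sketch of that selection step, using invented sample rows in GSC’s query/impressions shape (in practice we then picked the best-representing term by hand, not strictly the top one):

```python
# Hypothetical head-term selection: from first-page, non-brand GSC rows,
# take the query with the most impressions as a starting candidate.
# The rows below are invented sample data.

rows = [
    {"query": "cast iron grill", "impressions": 5400},
    {"query": "best cast iron grill", "impressions": 1900},
    {"query": "cast iron grill care", "impressions": 700},
]

head_term = max(rows, key=lambda r: r["impressions"])["query"]
print(head_term)
```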

4. Did keyword research in various keyword tools and looked for the head term

We then used the head term selected in the previous step to perform keyword research in three major tools: Ahrefs, Moz, and SEMrush.

The “search suggestions” or “related searches” options were used, and all queries returned were kept, regardless of whether or not the tool specified a metric of how related the suggestions were to the head term.

Below we listed the number of results from each tool. In addition, we extracted the “people also search for” and “related searches” from Google searches for each head term (respective to country) and added the number of results to give a baseline of what Google gives for free.

**This result returned more than 5,000 results! It was truncated to 1,001, which is the maximum we could work with, sorted by descending volume.

We compiled the average number of keywords returned per tool:

5.  Processed the data

We then processed the queries for each source and website by using some language processing techniques to transform the words into their root forms (e.g., “running” to “run”), removed common words such as  “a,” “the” and “and,” expanded contractions and then sorted the words.

For example, this process would transform “SEO agencies in Raleigh” to “agency Raleigh SEO.”  This generally keeps the important words and puts them in order so that we can compare and remove similar queries.
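A toy version of this normalization step might look like the following. The real analysis used proper language-processing tooling; this sketch only lowercases, drops a few stopwords, applies one naive stemming rule, and sorts, which is enough to reproduce the example above.

```python
# Rough sketch of the query-normalization step. The stopword list and the
# single "-ies" -> "-y" stemming rule are stand-ins for real NLP tooling.

STOPWORDS = {"a", "the", "and", "in", "for", "of"}

def stem(word):
    # Extremely naive stand-in for a real stemmer (e.g. "agencies" -> "agency").
    if word.endswith("ies"):
        return word[:-3] + "y"
    return word

def normalize(query):
    words = [stem(w) for w in query.lower().split() if w not in STOPWORDS]
    return " ".join(sorted(words))

print(normalize("SEO agencies in Raleigh"))  # -> "agency raleigh seo"
```

Because the output is sorted and stemmed, two queries that differ only in word order or inflection collapse to the same string, which is what makes the deduplication possible.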

We then created a percentage by dividing the number of unique terms by the total number of terms returned by the tool. This should tell us how much redundancy there is in each tool’s results.
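Computed directly, the uniqueness percentage is simply the following (the sample list is invented):

```python
# Uniqueness: unique normalized queries divided by the total returned.

def uniqueness(queries):
    return len(set(queries)) / len(queries)

tool_results = ["best grill", "grill best", "cast iron grill", "best grill"]
print(round(uniqueness(tool_results) * 100))  # percent unique
```

Note that in the actual analysis the queries fed into this ratio had already been normalized, so reordered duplicates like the first two entries would also have collapsed together.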

Unfortunately, it does not account for misspellings, which can also be problematic in keyword research tools because they add extra cruft (unnecessary, unwanted queries) to the results. Many years ago, it was possible to target common misspellings of terms on website pages. Today, search engines do a really good job of understanding what you typed, even if it’s misspelled.

In the table below, SEMrush had the highest percentage of unique queries in their search suggestions.

This is important because, if 1,000 keywords are only 70 percent unique, that means 300 keywords basically have no unique value for the task you are performing.

Next, we wanted to see how well the various tools found queries used to find these performing pages. We took the previously unique, normalized query phrases and looked at the percentage of GSC queries the tools had in their results.

In the chart below, note the average GSC coverage for each tool and that Moz is higher here, most likely because it returned 1,000 results for most head terms. All tools performed better than related queries scraped from Google (Use the code at the end of the article to do the same).
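The coverage metric behind the chart can be sketched as follows, with invented sample sets standing in for the normalized queries:

```python
# Coverage: the share of (normalized) GSC queries that a tool's
# suggestions also contain. The sets below are invented sample data.

def gsc_coverage(gsc_queries, tool_queries):
    gsc = set(gsc_queries)
    return len(gsc & set(tool_queries)) / len(gsc)

gsc = {"cast iron grill", "grill care", "grill cover"}
tool = {"cast iron grill", "grill cover", "grill recipes"}
print(gsc_coverage(gsc, tool))  # fraction of real user queries the tool found
```

Dividing by the GSC set (rather than the tool's) is deliberate: the question is how much of the real user demand each tool surfaces, not how much of the tool's output is real.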

Getting into the vector space

After performing the previous analysis, we decided to convert the normalized query phrases into vector space to visually explore the variations in various tools.

Assigning to vector space uses pre-trained word vectors that are reduced to two dimensions (x and y coordinates) using t-distributed Stochastic Neighbor Embedding (t-SNE), a dimensionality-reduction technique available in Python libraries such as scikit-learn. Don’t worry if you are unfamiliar with this; generally, word vectors are words converted into numbers in such a way that the numbers represent the inherent semantics of the keywords.

Converting the words to numbers helps us process, analyze and plot the words. When the semantic values are plotted on a coordinate plane, we get a clear understanding of how the various keywords are related. Points grouped together will be more semantically related, while points distant from one another will be less related.
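As a toy illustration of the idea, geometric closeness between vectors tracks semantic closeness. The three-dimensional vectors below are hand-invented for the example; the actual analysis used pre-trained vectors reduced to two dimensions with t-SNE.

```python
# Toy word vectors: semantically close words get geometrically close vectors.
# These 3-d vectors are invented; real pre-trained vectors have hundreds
# of dimensions.

import math

VECTORS = {
    "grill":     [0.9, 0.1, 0.0],
    "barbecue":  [0.8, 0.2, 0.1],
    "insurance": [0.0, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, near 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

print(cosine(VECTORS["grill"], VECTORS["barbecue"]))   # close in meaning
print(cosine(VECTORS["grill"], VECTORS["insurance"]))  # far apart
```

t-SNE takes this same notion of closeness and squeezes it into two dimensions so the keywords can be plotted, which is what produces the groupings in the charts below.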

Shopping

This is an example where Moz returns 1,000 results, yet the search volume and searcher keyword variation are very low. This is likely caused by Moz semantically matching particular words instead of matching the meaning of the phrase as a whole. We asked Moz’s Russ Jones to better understand how Moz finds related phrases:

“Moz uses many different methods to find related terms. We use one algorithm that finds keywords with similar pages ranking for them, we use another ML algorithm that breaks up the phrase into constituent words and finds combinations of related words producing related phrases, etc. Each of these can be useful for different purposes, depending on whether you want very close or tangential topics. Are you looking to improve your rankings for a keyword or find sufficiently distinct keywords to write about that are still related? The results returned by Moz Explorer is our attempt to strike that balance.”

Moz does include a nice relevancy measure, as well as a filter for fine-tuning the keyword matches. For this analysis, we just used the default settings:

In the image below, the plot of the queries shows what is returned by each keyword vendor converted into the coordinate plane. The position and groupings impart some understanding of how keywords are related.

In this example, Moz (orange) produces a significant volume of various keywords, while other tools picked far fewer (Ahrefs in green) but more related to the initial topic:

Autos and vehicles

This is a fun one. You can see that Moz and Ahrefs had pretty good coverage of this high-volume term. Moz won by matching 34 percent of the actual terms from Google Search Console. Moz had double the number of results (almost by default) that Ahrefs had.

SEMrush lagged here with 35 queries for a topic with a broad amount of useful variety.

The larger gray points represent more “ground truth” queries from Google Search Console. Other colors are the various tools used. Gray points with no overlaid color are queries that various tools did not match.

Internet and telecom

This plot is interesting in that SEMrush jumped to nearly 5,000 results, from the 50-200 range in other results. You can also see (toward the bottom) that there were many terms outside of what this page tended to rank for or that were superfluous to what would be needed to understand user queries for a new page:

Most tools grouped somewhat close to the head term, while you can see that SEMrush (in purplish-pink) produced a large number of potentially more unrelated points, even though Google “People also search for” queries were found in certain groupings.

General merchandise   

Here is an example of a keyword tool finding an interesting grouping of terms (groupings indicated by black circles) that the page currently doesn’t rank for. In reviewing the data, we found the grouping to the right makes sense for this page:

The two black circles help to visualize the ability to find groupings of related queries when plotting the text in this manner.

Analysis

Search engine optimization specialists with experience in keyword research know there is no one tool to rule them all.  Depending on the data you need, you may need to consult a few tools to get what you are after.

Below are my general impressions from each tool after reviewing, qualitatively:

  • The query data and numbers from our analysis of the uniqueness of results.
  • The likelihood of finding terms that real users use to find performing pages.

Moz     

Moz seems to have impressive numbers in terms of raw results, but we found that the overall quality and relevance of results was lacking in several cases.

Even when playing with the relevancy scores, it quickly went off on tangents, providing queries that were in no way related to my head term (see Moz suggestions for “Nacho Libre” in the image above).

With that said, Moz is very useful due to its comprehensive coverage, especially for SEOs working in smaller or newer verticals. In many cases, it is exceedingly difficult to find keywords for newer trending topics, so more keywords are definitely better here.

An average of 64 percent coverage of real user data from GSC for selected domains was very impressive. This also tells you that while Moz’s results can tend to go down rabbit holes, they tend to get a lot right as well. They have traded off some fidelity for comprehensiveness.

Ahrefs

Ahrefs was my favorite in terms of quality due to their nice marriage of comprehensive results with the minimal amount of clearly unrelated queries.

It had the lowest number of average reported keyword results per vendor, but this is actually misleading due to the large outlier from SEMrush. Across the various searches, it tended to return a nice array of terms without a lot of clutter to wade through.

Most impressive to me was a specific type of niche grill that shared a name with a popular location. The results from Ahrefs stayed right on point, while SEMrush returned nothing, and Moz went off on tangents with many keywords related to the popular location.

A representative of Ahrefs clarified that their tool’s “search suggestions” feature uses data from Google Autosuggest. They currently do not have a true recommendation engine the way Moz does. Using “Also ranks for” and “Having same terms” data from Ahrefs would put them more on par with the number of keywords returned by other tools.

 SEMrush   

SEMrush overall offered great quality, with 90 percent of the keywords being unique. It was also on par with Ahrefs in terms of matching queries from GSC.

It was, however, the most inconsistent in terms of the number of results returned. It yielded 1,000+ keywords (actually 5,000) for Internet and Telecom > Telecommunications yet only covered 22 percent of the queries in GSC. For another result, it was the only one not to return related keywords. This is a very small dataset, so there is clearly an argument that these were anomalies.

Google: People Also Search For/Related Searches 

These results were extremely interesting because they tended to more closely match the types of searches users would make while in a particular buying state, as opposed to those specifically related to a particular phrase. 

For example, looking up “[term] shower curtains” returned “[term] toilet seats.”

These are unrelated from a semantic standpoint, but they are both relevant for someone redoing their bathroom, suggesting the similarities are based on user intent and not necessarily the keywords themselves.

Also, since data from “people also search” are tied to the individual results in Google search engine result pages (SERPs), it is hard to say whether the terms are related to the search query or operate more like site links, which are more relevant to the individual page.

Code used

When entered into the Javascript Console of Google Chrome on a Google search results page, the following will output the “People also search for” and “Related searches” data in the page, if they exist.

var data = {};
var out = [];

// “Related searches” entries at the bottom of the results page
data.relatedsearches = [].map.call(document.querySelectorAll(".brs_col p"), e => ({ query: e.textContent }));

// “People also search for” entries attached to individual results
data.peoplesearchfor = [].map.call(document.querySelectorAll(".rc > div:nth-child(3) > div > div > div:not([class])"), e => {
  if (e && !e.className) {
    return { query: e.textContent };
  }
});

// Flatten both lists into a single newline-separated output
for (var d in data) {
  for (var i in data[d]) {
    out.push(data[d][i]['query']);
  }
}
console.log(out.join('\n'));

In addition, there is a Chrome add-on called Keywords Everywhere which will expose these terms in search results, as shown in several SERP screenshots throughout the article. 

Conclusion

Especially for in-house marketers, it is important to understand which tools tend to have data most aligned to your vertical. In this analysis, we showed some benefits and drawbacks of a few popular tools across a small sample of topics. We hoped to provide an approach that could form the underpinnings of your own analysis or for further improvement and to give SEOs a more practical way of choosing a research tool.

Keyword research tools are constantly evolving and adding newly found queries through the use of clickstream data and other data sources. The utility in these tools rests squarely on their ability to help us understand more succinctly how to better position our content to fit real user interest and not on the raw number of keywords returned. Don’t just use what has always been used. Test various tools and gauge their usefulness for yourself.


 Source: This article was published themarketingagents.com By Rich Brooks - Contributed by Member: Robert Hensonw

A keyword analysis (or keyword research) is the art and science of uncovering which keyword phrases your prospects are likely to use at Google or other search engines. 

Why is this important?

Because search engines are looking to return relevant results when someone performs a search. The closer the words on your web page, blog post or online video are to the search that was just performed, the more likely you are to rank higher for that search. 

Higher rankings mean more qualified traffic. In fact, a recent study showed that the number one result averaged a 36.4% click-through rate (CTR). The second-place result only managed a 12.1% CTR, and the CTR declined with every subsequent result.

Although using the right keywords isn’t the only reason why your page ranks well or poorly (the quality and quantity of inbound links matter, too), it’s one of the easiest variables for you to affect.

How do you perform a keyword analysis?

Keyword research is a three-step process:

  1. Brainstorm: Whether by yourself, with team members, or trusted customers and clients, you should start by brainstorming a list of your best keywords. These would be the words you think your ideal customer would use when searching for a product or service like yours, or phrases you’d like to rank well for. Anything from “Boston tax accountant” to “how do I write off a business expense?” I talk about using five perspectives to generate the best keyword phrases.
  2. Test: After you generate your keywords, you’ll want to determine if they actually will bring you enough traffic. Often, we’ve been in our industry for so long we use jargon that our prospects don’t use. Or, we are missing out on new, related phrases that could attract new customers. Using a tool like Google Adwords Keyword Tool will help you determine which words and phrases are most likely to attract the most qualified traffic. By entering your phrases into this free online tool, you can discover how much competition you would have from other sites to rank well for a phrase, as well as how many people are actually searching for that phrase. In addition, GAKT will provide a lot of related phrases that may perform better than your original list.
  3. Rewrite: Once you have your list of your best keywords, get to work putting them in strategic places on each page of your site, including the page title, any headers or subheaders, early and often in the body copy, as well as in the intrasite links from one page to another.

How do you know if it’s working?

Improved search engine visibility rarely happens overnight. Continually adding new, keyword-rich content to your blog or website over time will improve your search engine ranking and attract more qualified traffic to your site.

Two reports in Google Analytics can help determine if this is happening. The first can be found at Traffic Sources > Search Engine Optimization > Queries. This report shows your site’s average ranking for any keyword that “resulted in impressions, clicks, and click-throughs.” You can see if you’re moving up or down over time.

The second report can be found at Traffic Sources > Search Engine Optimization > Landing Pages. This will show you how your individual pages are faring from a search engine standpoint.

Finally, take a look at your overall search traffic and the number of leads you’re generating from your website. If the number of leads you’re getting a month is increasing, your work is making a difference.

Categorized in Online Research
