Barbara Larson

Local search marketing can be especially challenging for businesses with multiple locations. Contributor Jason Decker explains how to do it right. 

Search marketing for a local business is tough — but those of us helping businesses with hundreds, sometimes thousands, of locations know it is a completely different ballgame. It’s a challenge to get one business in one location to rank, much less hundreds of locations spread across thousands of miles.

This article is meant to serve as a guide for those performing enterprise local search. While many of the tips herein relate to any multi-location business, these tips are specifically tailored for enterprise-sized businesses and corporations with 50-1,000+ locations.

1. Create Unique Local Landing Pages

When it comes to enterprise local search, it is critical to have a well-designed local landing page for every single location. The following best practices will help you dominate local search results.

  1. Title & Meta Tags. Be sure to use keyword-optimized title tags that are unique to every single location. Include city, state and ZIP code in your metadata. Give every page a unique meta description.
  2. NAP. Display your business's Name, Address and Phone number (NAP) in a highly visible part of the page, and ensure your NAP is consistent across the entire website (this includes details such as “St.” vs. “Street” and “CO” vs. “Colorado”). Total consistency matters.
  3. Structured Data. Use structured data (Schema.org markup) to mark up each page with the appropriate type of business (see the sketch after this list).
  4. Google Local Map. Embed your business’s Google Local Map and not just a generic Google Map with your address.
  5. Photos. Use photos to differentiate your listing, increase engagement, and encourage interaction. Local landing pages with no photos generally have a much higher bounce rate. Use a storefront photo as well as interior photos if possible. For more professional businesses, show photos of the employees with a short biography. Videos are even better!
  6. Unique Local Content. A major challenge for many multi-location businesses is creating unique content for each and every local landing page. You often see identical content with only a few small changes from page to page. This is spammy, bad for the search engines, and bad for your potential customers. You must create unique, highly localized content for every single page.
    • Be sure to list all products or services as well as hours of operation.
    • Describe your city as well as your neighborhood. Mentioning nearby landmarks, parks, festivals or anything that makes your location unique is great! Be as specific as you can about your location. Mention your cross streets.
    • Describe any public transportation nearby such as subways, bus routes, etc.
    • Mention any awards or local sponsorships. Use trust-building badges such as BBB accreditation or industry-specific badges.
  7. Optimized URL. Create optimized local landing page URLs that are easy for humans and search engines to understand. A good formula to follow is www.website.com/city-state-zip. If you have multiple locations in the same ZIP code, add a differentiator such as a neighborhood, landmark or even a store number: www.website.com/city-state-zip-differentiator.

If you create a unique page for every location and follow these best practices for local page content, you will have the foundation for local search success.
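To make the structured data recommendation in point 3 concrete, here is a minimal sketch of LocalBusiness markup generated with Python. Every value shown (business type, name, URL, phone, address, coordinates, hours) is a placeholder and should be replaced with the real details for each location; the exact Schema.org type and properties you use depend on your business.

    import json

    # Illustrative LocalBusiness JSON-LD for one hypothetical location.
    # Keep these NAP fields identical to what appears in the visible page content.
    location = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",              # or a more specific type, e.g. "Dentist"
        "name": "Example Widgets - Denver",
        "url": "https://www.website.com/denver-co-80202",
        "telephone": "+1-303-555-0100",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Main Street",  # match "Street" vs. "St." site-wide
            "addressLocality": "Denver",
            "addressRegion": "CO",
            "postalCode": "80202",
            "addressCountry": "US",
        },
        "geo": {"@type": "GeoCoordinates", "latitude": 39.7525, "longitude": -104.9995},
        "openingHours": "Mo-Fr 09:00-18:00",
    }

    # Emit the script tag to embed in the local landing page template.
    print(f'<script type="application/ld+json">\n{json.dumps(location, indent=2)}\n</script>')

Generating the JSON-LD from the same data source that renders the visible NAP is an easy way to keep the two in sync across hundreds of pages.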

2. Build Citations 

In search engine optimization terms, a citation is a mention of your business name, address and phone number (NAP) on another webpage, even if there is no link to your website.


An important factor in local search ranking algorithms is the number of citations, the accuracy and consistency of these listings, and how authoritative the website providing the citation is. Submitting your site to local business directories such as Google+ Local, Yelp, Foursquare, etc., is therefore critical to building your local search presence.

When creating your plan to build citations, I recommend that you start with data aggregators such as Acxiom, Localeze, Infogroup & Factual. Then focus on major local directories such as Google, Bing, Yahoo, Foursquare & Yelp.

Moz also offers some great tools to determine the most important local business listings based on your industry or city. Once you have your list ranked in order of importance, it’s time to get started.

Data aggregators can be expensive for marketers with many locations, but they are very important for local search. They are the fastest way to distribute local information for a large number of locations, as many local directory services get information directly from these sources (see the chart below).

Use as many of these services as you can, as it will save you the trouble of having to submit individually to hundreds of local directories.

[Chart: the local search ecosystem]

If you are new to enterprise-level local marketing, you will be saddened to learn that many local directories do not allow for bulk uploading of data for multiple locations.

Yes, that’s correct: when creating listings for your business locations on directory sites, you will often have to repeat a process hundreds of times simply because there is no other option. (And some services, like Yelp, only allow for multi-location bulk uploading when you pay to advertise with them.)

That said, the following major local directory services do support bulk uploads for businesses with many locations, at no cost. Make sure you take advantage of this!

  1. Google My Business. Google gives the ability to upload a pretty much unlimited number of locations, so this is a great place to start and a very impactful first step. (Sadly, the system is still very glitchy, so be sure to factor in some time for fixing errors.)
  2. Bing Local. Bing’s bulk upload process is much lengthier than Google’s, but it is a must, as well. Expect at least a month for the whole process to take place.
  3. Foursquare. The process is simple, and Foursquare has really stepped up its features lately. Foursquare has truly become a trusted source of local data.

When you submit your business information to local directories and data aggregators, be sure that the business name, address and phone number (NAP) for each location is consistent with the way it’s presented on the site and on any local directories you may have already submitted to.

As mentioned above, NAP consistency is incredibly important for local search visibility. It enables search engines to identify each individual listing scattered across the web as referring to a single business location, thus strengthening your citation count for any given location.
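One practical way to audit NAP consistency at scale is to normalize each listing before comparing it to your master record. The helper below is a rough sketch of that idea; the abbreviation list and phone handling are simplified assumptions, not a complete standard:

    import re

    # Common abbreviation expansions; extend this for the variants in your own data.
    ABBREVIATIONS = {"st": "street", "ave": "avenue", "rd": "road", "co": "colorado"}

    def normalize_nap(name, address, phone):
        """Return a normalized (name, address, phone) tuple for comparison."""
        def clean(text):
            words = re.sub(r"[^\w\s]", "", text.lower()).split()
            return " ".join(ABBREVIATIONS.get(w, w) for w in words)
        digits = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 digits only
        return clean(name), clean(address), digits

    # Two listings that should be recognized as the same location:
    a = normalize_nap("Acme Widgets", "123 Main St.", "(303) 555-0100")
    b = normalize_nap("ACME Widgets", "123 Main Street", "303-555-0100")
    print(a == b)  # True -> consistent; False -> flag the listing for cleanup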

The importance of NAP consistency across the web leads me to the final step:

3. Clean Up Existing Citations

This is an area that many businesses (especially large, multi-location businesses) ignore because it’s very time consuming and quite tedious work.

For example, many local marketers review each directory listing individually to verify content, add new listings, clean up duplicates, etc. If you can afford it, there are some great tools that can help identify existing citations and clean up duplicates for you.

These services are usually pretty expensive and may be a more cost-effective option for single-location businesses, as the cost really starts to add up with multiple locations. A few of these services are Whitespark, Yext and a brand-new tool from Moz called Moz Local.

First, you should assess your current listings situation. There are a few free tools that will scan the web for your business and return information such as Name, Address, Phone Number inaccuracies, duplicates and more for a number of important listings. Moz Local offers a free tool for this, as does Yext.

These tools are great time savers and a good place to get an idea of how much work needs to be done for each location. More than likely, you are going to find some inaccuracies that need to be fixed.

For those doing it manually, search for each business location under each local directory. Where applicable, be sure to search a few different ways: by name (with variations), by phone number and ZIP. This way, you can determine if you are listed in a directory and find any duplicate listings at the same time.

If you find duplicate listings for a single business, keep the most accurate listing and report the others as duplicates.

Be sure to claim all local listings so you are the one managing them.

Continue to build and clean up as many citations as possible for every location. While this is a very tedious and time-consuming part of the process, it cannot be ignored.

It’s important to have as many local business listings as possible, and they absolutely need to be 100% accurate. Be sure to check back on any duplicates that you have found. Sometimes you will have to request deletion several times before they are removed.

Dominate The Local Search Results

We’ve covered the basics of building a winning local search strategy for enterprise businesses with multiple locations. To recap:

Start by creating a unique page for each individual business; utilize bulk uploads; distribute business information via data aggregators; optimize each business page using local search best practices; and build your citations.

This proven approach will help you dominate local search results, build your brand online, drive local website traffic, and generate local business results.


Source : http://searchengineland.com/

Google wants to seriously step up its Amazon Prime competitor, Express.

The speedy online delivery service plans to spread its coverage from roughly 20 states and regions to the entire country by the end of the year, general manager Brian Elliott tells Business Insider.

To make that possible, the group made the "hard decision" to kill part of its grocery business and stop selling perishables, like fresh fruit and vegetables, a pilot it started this February in parts of San Francisco and Los Angeles.

"We've been testing a lot of different things, figuring what works, and how it works," Elliott says. "And now we're really, dramatically scaling the business."

Nationwide coverage and new partnerships

Google launched its Shopping Express delivery service back in 2013, in part to avert Amazon’s encroachment on its product search territory. More and more, people were beginning their searches for water bottles, flat screen TVs, banana slicers, you name it, on Amazon instead of Google, which threatened the search engine’s ads business. Express was a way for Google to reinstate itself as the go-to choice for product searches and to make it easier for people to actually buy the goods they found.

In the three years since, though, it's had some struggles, including having to make tweaks to its delivery model and some executive turnover. The division lost its founding boss, Tom Fallows, to Uber in late 2014 and had several temporary execs until Elliott took over almost exactly a year ago. Now, he says, Express has finally figured out its business strategy to the point where it's ready to start rapidly scaling up.

[Photo: a Google Express order from Whole Foods that hit ~$34 (not including the olive oils in the back). Business Insider]

Instead of a one-size-fits-all-regions delivery style, Express will offer a combination of same-day, overnight, and two-day delivery options. Users can either pay a $4.99 minimum delivery fee for each store order, or a yearly $95 membership fee that eliminates the per-store fee. Shoppers can also test the service through a $10-a-month plan.
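A quick back-of-the-envelope check on those published prices (using only the figures quoted above) shows roughly where the membership starts to pay off:

    # Rough break-even between per-order fees and the memberships quoted above.
    per_order_fee = 4.99                 # dollars per store order without a membership
    annual_membership = 95.00            # dollars per year
    monthly_plan_per_year = 10.00 * 12   # $120 if the monthly plan is kept all year

    print(annual_membership / per_order_fee)          # ~19, so about 20 store orders a year
    print(monthly_plan_per_year > annual_membership)  # True: monthly costs more over a full year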

As of now, Express has signed on 50 total merchants, including Target, Walgreens, Petsmart, REI and, as of Tuesday, the home-goods shop Wayfair and 99 Ranch Market, an Asian grocery store.

It's also expanding nationwide service for two partners: Whole Foods and Costco.

People who don't have Costco memberships will be able to shop there, albeit at prices that are a bit higher than what members would get.

Expanding the delivery blanket of those two stores is instrumental to Express's focus on groceries, even though shoppers won't be able to order anything that needs temperature controls.

"We made the hard decision to focus on dry good groceries and having that be national," Elliott says. "For now, we're not gonna sweat the milk, and eggs, and ice cream side of it."

[Photo: See ya later, Google Express delivery vans. Google]

That separates it from Amazon's grocery service, Prime Fresh, as well as other instant delivery startups like Instacart or Postmates, which can manage full, diverse orders.

Not delivering perishables makes delivery much easier, though. Google plans to partner with different third-party delivery services like OnTrac and Dynamex, so you'll be seeing fewer of its Google-branded delivery vans. Getting rid of its own fleet will help Google cut costs as it makes money by taking a small commission of total purchase prices.

"In a year from now, what we want is for this not only to be nationally present, but something that people see as a big part of their everyday lives," Elliott says. "We want to go from 50 merchants to hundreds of merchants."

Not an "everything store"


If you look at Google Express and Amazon’s shopping service, Prime, head-to-head, the appeal of "The Everything Store" is apparent. Even if Express does hit hundreds of stores, Amazon will still have far more products to choose from and an almost-identical yearly fee ($95 for Express vs $99 for Amazon) that gets shoppers a bunch of other perks, like free movies and music.

But Elliott argues that people who like Google’s service come looking for specific goods from specific stores. You don’t want just any fleece jacket — you want that one from Target. He sees Express as a way to buy from a bunch of retailers easily, in one place, not as a way to buy anything you want.

"We have this range of merchants that people really love, so they don't think about this as 'online shopping,'" Elliott says. "They just think of this as shopping. We want to bring a lot of the stores that people need to shop at on a regular basis into one easy-to-use service that makes it worthwhile."

Meanwhile, retailers hesitant to sell their goods on Amazon see Google as more of an ally than a competitor, since Express ensures them fast delivery while preserving much of their brand experience.

"You may live 300 or 400 miles from a Whole Foods — or even further if you live in a state where we don't have a store — and this is really an opportunity to bring the store to you," Jeff Jenkins, head of digital strategy at Whole Foods, says. "It really allows us to expand into places where we don't have brick-and-mortar stores."

A future focused on what Google's good at

Since the beginning, Express was meant to make Google's product search ads more useful. But, even three years later, the connection between search ads and Express isn't really there.


Elliott, who worked on Google Shopping and its Buy Button before coming to Express, says that now that the infrastructure is in place, that's about to change.

"Because [Express] hasn't had a national presence, it's been hard to think about how to integrate this into other Google products and services," Elliott said. "But you can imagine that we'd want to do it so that when you searched for these products on Google, you'd actually find them in [Express]."

He also wants to find ways to help retailers give a heads up on their own websites that they offer their products through Express.

If Google rolls out a free way to feature Express merchants on product search pages, more retailers will likely want to be included. Though whether people will actually use Express as it spreads its availability through the US remains to be seen. Analysts estimate that Amazon has up to 69 million Prime subscribers. Elliott declined to comment on Express’s revenue numbers, users, or its budget (the latter of which Recode reported as $500 million per year in 2014).

"It’s a very small initiative and opportunity for Google relative to what they primarily do — selling advertising," says Pivotal Research analyst Brian Wieser. "I would argue that, in general, Google would benefit from prioritizing its focus on endeavors that directly relate to advertising or otherwise reinforce its position in the space. However, it’s also entirely unlikely that will happen."

Here's what Google's delivery list looks like now:

[Image: Google Express delivery list. Google]

 

Source : http://www.businessinsider.com/google-express-delivery-cuts-perishables-plans-to-expand-throughout-us-2016-9

This election cycle, more users than ever before are turning to social media - and as a result, it can be difficult to separate reported fact from rampant speculation. The BBC's Charlie Northcott discovers how to spot a conspiracy theory.

Conspiracy theories, a common feature in many elections, have run amok in a new climate where scuttlebutt travels round the world on social media before the mainstream media can get its boots on.

On Twitter, rumours that Hillary Clinton uses a body double to hide her health problems run alongside mainstream media reports about her contracting pneumonia.

Donald Trump hasn't been spared either.

Some say he is a Russian spy, while others suggest he is really plotting to win the election for his rival, Hillary Clinton.

The BBC has spoken with four fact-checking and lie-detection experts. This is their guide to spotting conspiracy theories in the news.

[Photo: Dick Cheney at a 9/11 memorial]

Secrecy

What do the claims that Hillary Clinton is scheming to ban Christian churches and that Donald Trump is conspiring to destroy the Republican party have in common? Both involve a secret plot.

Hatching nefarious schemes behind closed doors is often a tell-tale sign of a conspiracy theory.

"If someone asserts something is going on in secret, alarm bells should be going off in your head," says Joseph Uscinski, associate professor of political science at the University of Miami and author of American Conspiracy Theories.

"People acting in secrecy is a necessary feature of a conspiracy theory, as it cannot be disproved."

Claiming a heinous plot is underway gives cover to even the most outlandish claims.

"The idea that 1% of America is secretly controlling the rest. Or that George Bush secretly plotted to blow up the Twin Towers. These ideas persist, with no evidence, because their 'secrecy' makes them impossible to verify."

The truth is, it’s very difficult to keep a secret plot secret, which ties into the next tell-tale sign of a conspiracy.

 

 

[Photo: Sandy Hook families hold pictures of their loved ones]

 

Scale

American politicians have engaged in serious cases of subterfuge and real conspiracy historically, from Bill Clinton's misrepresentation of his affair with Monica Lewinsky to Richard Nixon's deceit during the Watergate scandal.

"It is reasonable to suspect government," says Michael Shermer, publisher of Skeptic Magazine. "Governments do lie."

But distinguishing between true conspiracies and false theoretical ones is difficult.

One way to go about it is to think about scale.

"The more people allegedly involved in a conspiracy, the less likely it is to be true," Mr Shermer says.

"People are bad at keeping secrets. People make mistakes. The chances of hundreds of people successfully controlling something, while keeping it secret are slim.

"The bigger the claim, the less likely it is to be true."

[Photo: Hillary Clinton]

 

Source

Billions of blogs, articles and social media comments are posted on the internet every day. As many as eight billion videos are viewed on Facebook alone.

When digesting this information, readers should ask themselves a key question: who is the source?

"The first thing we do when we hear a television advert, a political statement or read an article is track down the source of that information," says Eugene Kiely, Director of FactCheck.org.

"Is a source cited? If so, how reputable is that individual or organisation?"

In the case of a fake Hillary Clinton medical record that recently circulated online, Mr Kiely's team had to track down her doctor's real name and contact her presidential campaign team to discredit the forgery.

But in many cases, basic source analysis can be done through search engines like Google, for gathering background details, and through cross-referencing stories with credible media outlets, which aspire to avoid publishing information that has not been verified.

Readers should be aware of satirical news websites which publish intentionally funny and false news. The Onion is the most notable example, but there are several other sites, including ones that are often more subtle and harder to spot.

Readers should also be wary of fake news websites which pose as organisations like the BBC.

When in doubt, click around to see what else that site is covering, or see what turns up when it's plugged into a search engine.

[Photo: George Bush and John Kerry]

Patterns

Conspiracy theories often contain similar themes - like a desire for global domination, power or wealth. They also often contain similar narratives.

George Bush was accused of being fed answers during a 2004 presidential debate; in 2016 Mrs Clinton's detractors said she had received illicit information via a hidden earpiece.

Stories which touch on these issues should be read with discerning eyes.

"A rumour that the government is planning to dramatically raise taxes occurs every year in America," says Angie Holan, editor-in-chief of PolitiFact, a US media outlet that fact-checks politics.

"A lot of conspiracy theories have an alarmist tone. If it sounds too awful to be true, you should view the story with scepticism."

After all, in this election cycle, there is enough real news - strange, awful, and otherwise - without bothering yourself with what’s made up.

Source : http://www.bbc.com/news/world-us-canada-37344846

Thursday, 15 September 2016 19:17

Bing gains 20% share of UK search

When it comes to search, it can often seem that Google is the only show in town, so it’s interesting to note that Microsoft’s Bing search engine is now powering as many as 20% of the UK’s online searches.

Comscore reports that Bing has surpassed 20 percent market share in the UK. So what’s driving this growth? Reports suggest that strategic syndication partnerships with firms like Gumtree and new services such as Bing Shopping and Bing Ads Editor for Mac have helped, too.

Raven Beeston of Bing UK says: “These figures clearly demonstrate Bing’s growing presence within the search market. We’ve already seen great success, and our journey is only just getting started. Over the past year, we’ve announced syndication partnerships, and new product launches, showcasing Microsoft’s ongoing commitment to search. Bing understands the conversations that consumers are having, and empowers marketers to harness these insights to make meaningful connections.”

Do you use Bing as a searcher or an advertiser to drive your sales?

Source : http://tamebay.com/2016/09/bing-gains-20-share-of-uk-search.html

A decade ago Google was just a search engine and advertiser, but now it’s the driving force behind the largest computing platform in the world: Android. Even the slow-to-start Chrome OS has been picking up steam in recent years, dominating the budget laptop market. Both these products are based in part on Linux, but Google is working on something completely new, and you can take a peek at it on Github. It’s an operating system called Fuchsia, which could run on just about anything.

 

Google and many other companies make use of the Linux kernel for a variety of reasons. The robustness of features is certainly part of it, but it’s also freely available under the GPL license. Anyone can use the Linux kernel in a project, provided they make the open source components available to end users and developers. So, what about Fuchsia? According to the GitHub page, “Pink + Purple == Fuchsia (a new Operating System).” Google’s mysterious new Fuchsia OS is based on a completely different kernel known as Magenta. This is a microkernel, which itself is based on a different project called LittleKernel.

 

The intended use for Magenta was as part of an embedded system like you might see on routers or set-top boxes. It seems that Google wants to use it for more than that now. Magenta is designed to be lightweight, but it can scale up to be the basis for more powerful systems. Google’s Fuchsia page notes that the project is targeted at “modern phones and modern personal computers” that have fast processors and lots of RAM.

 

 

Building something open from scratch gives Google much more freedom to make exactly what it needs. The Linux kernel has been around for about 25 years and is used in all manner of applications. Many developers have contributed code over that time, and as a result it’s a little ungainly. Many of the security exploits found in Android these days are actually faults in the Linux kernel. Google is testing Fuchsia on a variety of devices like Intel NUCs and Acer laptops. There is also support for the Raspberry Pi 3 on the way. Google is currently using a system called Flutter for the interface and Dart as the programming language.

 

But what’s Google going to do with Fuchsia? It’s possible Google management isn’t even sure. This could just become another abandoned project before it has a chance to replace anything. Still, some have speculated that Google could see Fuchsia as the next step for Android, Chrome OS, or both. Migrating to a new platform probably means breaking compatibility with existing software (or emulating it in some way), so this is not something to be done lightly. Perhaps Fuchsia is something completely new for Google — a robust full desktop OS alternative to Chrome OS. Whatever Google has planned for Fuchsia, nothing is changing at the moment.

 

Source : http://www.extremetech.com/computing/233699-google-is-working-on-a-mysterious-new-os-called-fuchsia

Microsoft announced last week that the default search engine for the Microsoft Edge browser in Windows 10 in China will be Baidu, not Bing.

The announcement read:

Together, we will make it easy for Baidu customers to upgrade to Windows 10 and we will deliver a custom experience for customers in China, providing local browsing and search experiences. Baidu.com will become the default homepage and search for the Microsoft Edge browser in Windows 10. Baidu’s new Windows 10 distribution channel, Baidu “Windows 10 Express” will make it easy for Chinese Internet users to download an official Windows 10 experience. Additionally, Baidu will deliver Universal Windows Applications for Search, Video, Cloud and Maps for Windows 10.

We remain deeply committed to delivering Bing around the world and we’re also committed to offering locally relevant experiences - like Baidu in China - to provide great Windows 10 experiences.

This is a pretty big deal for Microsoft and honestly makes a statement.

The obvious point was made by engine in WebmasterWorld: "there's an interesting fact there that is worth highlighting - Microsoft drops Bing as default search for Baidu in China."

Source : https://www.seroundtable.com/baidu-default-edge-browser-20952.html 

New Google My Business Insights Show How You’re Being Found

Google has rolled out enhanced insights for Google My Business pages. When logged into GMB, you’ll now be able to see the total views to your GMB page, where visitors are coming from, and how they found your page.

Google Search vs. Google Maps

Where are the visitors to your GMB page coming from? Google Search and Google Maps both send traffic to GMB pages, and now you’ll be able to see a breakdown of how many visitors are coming from which source.

Direct vs. Discovery

How are people finding your GMB page? At times people will find it by directly typing your business or brand name in the search bar. At other times, people may find it by searching for a related keyword. Now Google will show you a comparison between who found your page by searching for your name directly, and who discovered it by searching for a related keyword. Unfortunately, when it comes to the actual keywords used to find your GMB page, those are ‘not provided’.

With the addition of these new insights Google finds it no longer necessary to include Google+ statistics in the GMB dashboard, so those have been removed going forward. The company expects to introduce even more new insights to GMB pages in the near future.

Source : https://www.searchenginejournal.com/new-google-business-insights-show-youre-found/170448/

Google has multiple named parts of the algorithm that influence search rankings. Google Panda is the part of the algorithm specific to the quality of content, Penguin is specific to the quality of links, and Hummingbird is the part that handles conversational search queries accurately.

Google Panda

Google Panda takes the quality of a site’s content into account when ranking sites in the search results. Sites with lower-quality content are likely to find themselves negatively impacted by Panda. As a result, higher-quality content surfaces higher in the search results: quality content is often rewarded with higher rankings, while low-quality content drops.


When Panda originally launched, many saw it as a way for Google to target content farms specifically, which were becoming a major problem in the search results with their extremely low-quality content that tended to rank due to sheer volume. These sites were publishing a fantastic amount of low-quality content very quickly on topics with very little knowledge or research, and it was very obvious to a searcher who landed on one of those pages.

Google has now evolved Panda to be part of the core algorithm. Previously, we had a known Panda update date, making it easier to identify when a site was hit or had recovered from Panda. Now it is part of a slow rolling update, lasting months per cycle. As a result, it is hard to know whether a site is negatively impacted by Panda or not, other than doing a content audit and identifying factors that sites hit by Panda tend to have.
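Since there is no longer a dated Panda refresh to line your traffic up against, that content audit is the practical alternative. The sketch below illustrates one simple audit check, flagging pages whose wording is nearly identical; the word-set comparison and the 0.8 threshold are deliberate simplifications for illustration:

    import re
    from itertools import combinations

    def words(text):
        """Lowercased set of words from a page's main content."""
        return set(re.findall(r"[a-z']+", text.lower()))

    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0

    # Map of URL -> extracted main content, however you collect it (crawl, CMS export, etc.).
    pages = {
        "/denver-co-80202": "We repair widgets in Denver near Union Station ...",
        "/boulder-co-80301": "We repair widgets in Boulder near Pearl Street ...",
    }

    # Flag page pairs that share almost all of their wording.
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        score = jaccard(words(text_a), words(text_b))
        if score > 0.8:  # arbitrary threshold; tune it for your site
            print(f"Possible duplicate content: {url_a} vs {url_b} ({score:.2f})")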

User Generated Content

It is important to note that Panda does not target user-generated content specifically, something that many webmasters are surprised to learn. While Panda can affect user-generated content, it tends to impact sites that are producing very low-quality content – such as spammy guest posts or forums filled with spam.

Do not remove your user-generated content, whether it is forums, blog comments or article contributions, simply because you heard it is “bad” or marketed as a “Panda proof” solution. Look at it from a quality perspective instead. There are many highly ranking sites with user-generated content, such as Stack Overflow, and many sites would lose significant traffic and rankings simply because they removed that type of content. Even comments made on a blog post can cause it to rank and even get a featured snippet.

Word Count

Word count is another aspect of Panda that is often misunderstood by SEOs. Many sites make the mistake of refusing to publish any content unless it is above a certain word count, with 250 words and 350 words often cited. Instead, Google recommends you think about how many words the content needs to be successful for the user.

For example, there are many pages out there with very little main content, yet Google thinks the page is quality enough that it has earned the featured snippet for the query. In one case, the main content was a mere 63 words, and many would have been hard pressed to write about the topic in a non-spammy way that was 350+ words in length. So you only need enough words to answer the query.

Content Matches the Query

Ensuring your content matches the query is also important. If you see Google is sending traffic to your page for specific queries, ensure that your page is answering the question searchers are looking for when they land there. If it is not, it is often as simple as adding an extra paragraph or two to ensure that this is happening.

As a bonus, these are the types of pages – ones that answer a question or implied question – that Google not only looks to rank well but also awards the featured snippet for the query.

 

Technical SEO

Technical SEO also does not play any role in Panda. Panda looks just at the content, not things like whether you are using H1 tags or how quickly your page loads for users. That said, technical SEO can be a very important part of SEO and ranking in general, so it should not be ignored. But it does not have any direct impact on Panda specifically.

Determining Quality

If you are struggling to determine whether a particular piece of content is considered quality or not, there is one surefire way to check. Look in Search Analytics or your site’s analytics program, such as Google Analytics, and look at the individual page. If Google is ranking a page and sending it traffic, then clearly it views the page as high enough quality to show prominently in the search results, since people are landing there from Google’s search results.
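If you would rather pull that page-level data programmatically than click through the Search Analytics report, something along these lines can work against the Search Console API. This is a rough sketch only; it assumes the google-api-python-client and google-auth packages are installed and that an authorized OAuth token for the verified property has already been saved to a file (the filename here is made up):

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    # Assumes an OAuth token for the verified Search Console property was saved earlier.
    creds = Credentials.from_authorized_user_file("authorized_user.json")
    service = build("webmasters", "v3", credentials=creds)

    request = {
        "startDate": "2016-06-01",
        "endDate": "2016-08-31",
        "dimensions": ["page"],   # one row per landing page
        "rowLimit": 1000,
    }
    response = service.searchanalytics().query(
        siteUrl="https://www.website.com/", body=request
    ).execute()

    # Pages missing from these rows are getting no Google search traffic and deserve a closer look.
    for row in response.get("rows", []):
        page, clicks, impressions = row["keys"][0], row["clicks"], row["impressions"]
        print(f"{page}\tclicks={clicks}\timpressions={impressions}")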

However, if a page is not getting traffic from Google, it does not automatically mean it is bad, but the content is worth looking at closer. Is it simply newer and has not received enough ranking signals to rank yet? Do you see areas of improvement you can make by adding a paragraph or two, or changing the title to match the content better? Or is it truly a garbage piece of content that could be dragging the site down the Panda hole?

Also, do not forget that there is traffic outside of Google. You may question a page because Google is not sending it traffic, but perhaps it does amazingly well in Bing, Baidu, or one of the other search engines instead. Diversity in traffic is always a good thing, and if you have pages that Google might not be sending traffic to, but is getting traffic from other search engines or other sites or through social media shares, then removing that content would be the wrong decision to make.

Panda Prevention

How to prevent Google Panda from negatively impacting your site is pretty simple. Create high-quality, unique content that answers the question searchers are asking.

Reading content out loud is a great way to tell if content is high-quality or not. When content is read aloud, things like overuse of repetitive keywords, grammatical errors, and other signals that the content is less than quality suddenly stand out. Read it out yourself and edit as you go, or ask someone else to read it so you can flag what should be changed.

Google Penguin


The second major Google algorithm is Penguin. Penguin deals solely with link quality and nothing else. Sites that have purchased links or have acquired low-quality links through places such as low-quality directories, blog spam, or link badges and infographics could find their sites no longer ranking for search terms.

Who Should Worry about Penguin?

Most sites do not need to worry about Penguin unless they have done some sketchy link building in the past or have hired an SEO who might have engaged in those tactics. Even if the site owner was not aware of what an SEO was doing, the owner is still ultimately responsible for those links. That is why site owners should always research an SEO or SEO agency before hiring.

If you have done link building in the past using tactics that were accepted at the time but are now against Google’s webmaster guidelines, you could be impacted by Penguin. For example, guest blogging was fine years ago, but is not a great way to build links now unless you are choosing your sites well. Likewise, asking site visitors or members to post badges that linked to your site was also fine previously, but will now definitely result in Penguin or a link manual action.

Algorithmic Penguin and Link Manual Actions

Penguin is strictly algorithmic in nature. It cannot be lifted by Google manually, regardless of the reason why those links might be pointing to a website.

Confusing the issue slightly is that there is a separate manual action for low-quality links and that one can be lifted by Google once the links have been cleaned up. This is done with a reconsideration request in Google Search Console. And sites can be impacted by both a linking manual action and Penguin at the same time.

Incoming Links Only

Penguin only deals with a site’s incoming links. Google only looks at the links pointing to the site in question and does not look at the outgoing links at all from that site. It is important to note that there is also a Google manual action related directly to a site’s outgoing links (which is different from the regular linking manual action), so the pages and sites you link to could result in a manual action and the deindexing of a site until those links are cleaned up.

Finding Your Backlinks

If you suspect your site has been negatively impacted by Penguin, you need to do a link audit and remove or disavow the low quality or spammy links. Google Search Console includes a list of backlinks for site owners, but be aware that it also includes links that are already nofollowed. If the link is nofollowed, it will not have any impact on your site, but keep in mind, the site could remove that nofollow in the future without warning.
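For a quick spot check of whether a particular backlink is currently nofollowed, a few lines of scripting are enough. This is an illustrative sketch rather than a crawler; it assumes the requests and beautifulsoup4 packages, and the URLs are placeholders:

    import requests
    from bs4 import BeautifulSoup

    def links_to_site(page_url, your_domain):
        """Return (href, is_nofollow) for every link on page_url pointing at your_domain."""
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        results = []
        for a in soup.find_all("a", href=True):
            if your_domain in a["href"]:
                rel = a.get("rel") or []  # bs4 returns rel as a list of tokens
                results.append((a["href"], "nofollow" in rel))
        return results

    for href, nofollow in links_to_site("http://example-blog.com/some-post", "website.com"):
        print(href, "nofollow" if nofollow else "followed")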

There are also many third-party tools that will show links to your site, but because some websites block those third-party bots from crawling, no single tool will be able to show you every link pointing at your site. And while some of the sites blocking these bots are high-quality, well-known sites that do not want to waste bandwidth on those bots, the tactic is also used by some spammy sites to hide their low-quality links from being reported.

Assessing Link Quality

When it comes to assessing the links, this is where many have trouble. Do not assume that because a link comes from an .edu site that it is high-quality. There are plenty of students who sell links from their personal websites on those .edu domains which are extremely spammy and should be disavowed. Likewise, there are plenty of hacked sites within .edu domains that have low-quality links.

 

Do not make judgments strictly based on the type of domain. Just as you can’t make automatic assumptions about .edu domains, the same applies to all TLDs and ccTLDs. Google has confirmed that being on a specific TLD does not, by itself, help or hurt search rankings. But you do need to make individual assessments. There is a long-running joke about how there has never been a quality page on a .info domain because so many spammers were using them, but in fact, there are some great quality links coming from that TLD, which shows why individual assessment of links is so important.

Beware of Links from Presumed High-Quality Sites


Do not look at the list of links and automatically consider links from specific websites as being a great quality link, unless you know that very specific link is high quality. Just because you have a link from a major website such as Huffington Post or the BBC does not make that an automatic high-quality link in the eyes of Google – if anything, you should question it more.

Many of those sites are also selling links, albeit some disguised as advertising or done by a rogue contributor selling links within their articles. These types of links from high-quality sites actually being low-quality has been confirmed by many SEOs who have received link manual actions that include links from these sites in Google’s examples. And yes, they could likely be contributing to a Penguin issue.

As advertorial content increases, we are going to see more and more links like these get flagged as low-quality. Always investigate links, especially if you are considering not removing any of them simply based on the site the link is from.

Promotional Links

As with advertorials, you need to think about any links that sites may have pointed to you that could be considered promotional links. Paid links do not always mean money is exchanged for the links.

Examples of promotional links that are technically paid links in Google’s eyes are any links given in exchange for a free product for review or a discount on products. While these types of links were fine years ago, they now need to be nofollowed. You will still get value from the link, but it will come through brand awareness and traffic rather than a rankings boost. You may have links out there from a promotional campaign done years ago that are now negatively impacting a site.

For all these reasons, it is vitally important to individually assess every link. You want to remove the poor-quality links because they are hurting you via Penguin or could cause a future manual action. But you do not want to remove the good links, because those are the links that are helping your rankings in the search results.

Promotional links that are not nofollowed can also trigger the manual action for outgoing links on the site that placed those links.

Editor Note: Removing links and submitting a disavow request is also covered in more detail in the ‘What to Do When Things Go Wrong‘ section of our SEO Guide.

Link Removals

Once you have gone through your backlinks and determined that there are some that should be removed or disavowed, you will need to get these links removed. You should first approach site owners and ask them to remove the links pointing to your site. If removals are unsuccessful, add those URLs to a disavow file, one you will submit to Google.

There are tools that will automate the link removal requests and agencies that will handle the requests as well, but do not feel it is necessary to do this. Many webmasters find contact forms or emails and will do it themselves.

Some site owners will demand a fee to remove a link from a site, but Google recommends not paying for link removals. Just include them in your disavow file instead and move on to the next link removal. Some site owners are using link removals to generate revenue, so the practice is becoming more common.

Creating and Submitting a Disavow File

The next step in cleaning up Penguin issues is to submit a disavow file. The disavow file is a file you submit to Google that tells them to ignore all the links included in the file so that they will not have any impact on your site. The result is that the negative links will no longer cause negative ranking issues with your site, such as with Penguin, but it does also mean that if you erroneously included high-quality links in your disavow file, those links will no longer help your ranking. This is another reason why it is so crucial to check your backlinks well before deciding to remove them.

If you have previously submitted a disavow file to Google, they will replace that file with your new one, not add to it. So it is important to make sure that if you have previously disavowed links, you still include those links in your new disavow file. You can always download a copy of the current disavow file in Google Search Console.

 

Disavowing Individual Links Versus Domains

It is recommended that you disavow links at the domain level instead of disavowing individual links. There will be some cases where you will want to disavow specific links individually, such as on a major site that has a mix of quality and paid links. But for the majority of links, you can do a domain-based disavow. Then, Google only needs to crawl one page on that site for that link to be discounted on your site.

Doing domain-based disavows also means that you do not have to worry about whether those links come from the www or non-www version of the site, as the domain-based disavow takes this into account.

What to Include in a Disavow File

You do not need to include any notes in your disavow file, unless they are strictly for your reference. It is fine just to include the links and nothing else. Google does not read any of the notations you have made in your disavow file, as they process it automatically without a human ever reading it. Some find it useful to add internal notations, such as the date a group of URLs was added to the disavow file or comments about their attempts to reach the webmaster about getting a link removed.
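To make the file format concrete, here is a minimal sketch that writes a disavow file mixing domain-level entries with a couple of individual URLs. The domains and URLs are placeholders; the # comment line is ignored by Google and exists only for your own records, as described above:

    # Write a disavow.txt in the format Google Search Console accepts:
    #   "domain:example.com" disavows every link from that domain,
    #   while a bare URL disavows only that specific page.
    domains_to_disavow = ["spammy-directory.example", "link-badge-network.example"]
    urls_to_disavow = ["http://bigsite.example/sponsored-post.html"]

    lines = ["# Updated 2016-09-15 - the upload replaces any earlier file, so keep old entries here"]
    lines += [f"domain:{d}" for d in domains_to_disavow]
    lines += urls_to_disavow

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")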

Once you have uploaded your disavow file, Google will send you a confirmation. But while Google will process it immediately, it will not immediately discount those links. So you will not instantly recover from submitting the disavow alone. Google still needs to go out and crawl those individual links you included in the disavow file, but unfortunately the disavow file itself will not prompt Google to crawl those pages specifically.

It can take six or more months for all those individual links to be crawled and disavowed. And no, there is no way to determine which links have been discounted and which ones have not been, as Google will still include both in your linking report in Google Search Console.

Speeding Up the Disavow Process

There are ways you can speed up the disavow process. The first is using domain-based disavows instead of individual links. The second is to not waste time including lengthy notations for Google’s benefit, so that you can submit your disavow faster. Because reconsideration requests require you to submit more details, some misunderstand and believe the disavow needs more details, too.

 

Lastly, if you have undergone any changes in your domain, such as switching to https or switching to a new domain, you need to remember to upload that disavow file to the new domain property in Google Search Console. This is one step that many forget to do, and they can be impacted by Penguin or the linking manual action again, even though they have cleaned it up previously.

Recovery from Penguin

When you recover from Penguin, do not expect your rankings to go back to where they used to be before Penguin, nor for the return to be immediate. Far too many site owners are under the impression that they will immediately begin ranking at the top for their top search queries once Penguin is lifted.

First, some of the links that you disavowed were likely contributing to an artificially high ranking, so you cannot expect those rankings to be as high as they were before. Second, because many site owners have trouble assessing the quality of the links, some high-quality links inevitably get disavowed in the process, links that were contributing to the higher rankings.

Add to the mix the fact that Google is constantly changing its ranking algorithm, so factors that benefited you previously might not have as big an impact now, and vice versa.

Compensated Links via Badges and More

Also be aware of any link building campaigns you are doing, or legacy ones that could come back to impact your site. This would include things like badges you have given to other site owners to place on their sites or the requirement that someone includes a link to your site to get a directory listing or access something. In simple terms, if the link was placed in exchange for anything, it either needs to be nofollowed or disavowed.

When it comes to disavowing files that people are using to clean up poor quality links, there is a concern that a site could be hurt by competitors placing their URLs into a disavow file uploaded to Google. But Google has confirmed that they do not use the URLs contained within a disavow file for ranking, so even if your site appears in thousands of disavows, it will not hurt. That said, if you are concerned your site is legitimately appearing in thousands of disavows, then your site probably has a quality issue you should fix.

Negative SEO

There is also the negative SEO aspect of linking, where some site owners worry that a competitor could buy spammy links and point them to their site. And many use negative SEO as an excuse when their site gets caught by Google for low-quality links.

If you are worried about this, you can proactively disavow the links as you notice them. But Google has said they are pretty good about recognizing this when it happens, so it is not something most website owners need to worry about.

Real Time Penguin

Google is expected to release a new version of Penguin soon, which will have one very notable change. Instead of site owners needing to wait for a Penguin update or refresh, the new Penguin will be real-time. This is a huge change for those dealing with the impact of spammy links and the long waits many have had to endure after cleaning up.

Hummingbird


Google Hummingbird is part of the main Google search algorithm and was the first major change to their algorithm since 2001. But what is different about Hummingbird is that this one is not specifically a spam targeting algorithm, but instead an algorithm to ensure they are serving the best results for specific queries. Hummingbird is more about being able to understand search queries better, particularly with the rise of conversational search.

 

It is believed that Hummingbird is positively impacting the types of sites that are providing high-quality content that reads well to the searcher and is providing answers to the question the searcher is asking, whether it is implied or not.

Hummingbird also impacts long-tailed search queries, similarly to how Rank Brain is also helping those types of queries. Google wants to ensure that they can provide high-quality results for the longer queries. For example, instead of sending a specific question related to a company to the company’s homepage, Google will try to serve an internal page on the site about that specific topic or issue instead.

Hummingbird cannot be optimized for, outside of optimizing for the rise of conversational search. Longer search queries, such as those we see with voice search and the types of queries searchers tend to make on mobile, are often conversational in nature. And optimizing for conversational search is easier than it sounds: make sure your content is highly readable and can answer longer-tail queries as well as shorter ones.

Like Rank Brain, Hummingbird had been released for a period before it was announced, and SEOs did not particularly notice anything different regarding the rankings. It is not known how often Hummingbird is updated or changed by Google.

Source : https://www.searchenginejournal.com/seo-guide-panda-penguin-hummingbird/169167/

Google has made another small acquisition to help it continue building out its latest efforts in social apps. The search and Android giant has hired the team behind Kifi, a startup that was building extensions to collect and search links shared in social apps, as well as provide recommendations for further links — such as this tool, Kifi for Twitter. Terms of the deal are not being disclosed, but, according to Google engineering director Eddie Kessler, the app’s team will be joining the company to work on Spaces, Google’s group chat app.

Google tells me it is not commenting on the exact number of people joining.

It looks like Spaces could use the help. The app launched earlier this year and has had a very lukewarm run in the market so far, currently lingering around 577 in the U.S. iOS App Store and 284 in the U.S. Android store, according to stats from App Annie.

This is essentially an acqui-hire. In a Medium post earlier today, Kifi noted that the app is not coming to Google. It will only remain alive for another few weeks, after which point it will stick around for a few weeks more for data exports only.

While the app is not living on, it sounds like the kind of tech that Kifi’s team — co-founded by Dan Blumenfeld and Eishay Smith (although Blumenfeld left the company some time ago) — will continue to work on. Considering Spaces’ current focus on group chat, it sounds like this means they could tweak Kifi’s link-sharing and link-recommendation technology to use it in that context, and to collate those links with links from other applications and platforms.

This seems to be what Kessler says will be the intention, too, in his own short Google+ post: “Delighted the Kifi team, with their great expertise in organizing shared content and conversations, is joining the Spaces team to build features that improve group sharing.”

Google has disclosed nearly 200 acquisitions to date. Among them, other recent M&A moves that point to Google building up its talent in areas like social and apps include Pie (a Slack-like app) in Singapore and Moodstocks in Paris (to improve image recognition in apps).

Kifi had raised just over $11 million in funding from Don Katz, Oren Zeev, SGVC and Wicklow Capital.

Source : https://techcrunch.com/2016/07/12/google-acquires-deep-search-engine-kifi-to-enhance-its-spaces-group-chat-app/

 

Imagine a criminal breaks into your home but doesn't steal anything or cause any damage. Instead, they photograph your personal belongings and valuables and later that day hand-deliver a letter with those pictures and a message: "Pay me a large sum of cash now, and I will tell you how I got in."

Cybercriminals are doing the equivalent of just that: Hacking into corporations to shake down businesses for upward of $30,000 when they find vulnerabilities, a new report from IBM Security revealed.

The firm has traced more than 30 cases over the past year across all industries, and at least one company has paid up. One case involved a large retailer with an e-commerce presence, said John Kuhn, senior threat researcher at IBM Security.

 

Though some companies operate bug bounty programs — rewarding hackers for revealing vulnerabilities — in these cases, the victims had no such program.

"This activity is all being done under the disguise of pretending to be a 'good guy' when in reality, it is pure extortion," said Kuhn.

Researchers have dubbed the practice "bug poaching."

Here's how it typically works. The attacker finds and exploits web vulnerabilities on an organization's website. The main method of attack — known as SQL injection — involves the hacker injecting code into the website which allows them to download the database, said Kuhn.

 

Once the attacker has obtained sensitive data or personally identifiable information, they pull it down and store it, then place it in a cloud storage service. They then send an email to the victim with links to the stolen information — proof they have it — and demand cash to disclose the vulnerability or "bug."

Though the attacker does not always make explicit threats to expose the data or attack the organization directly, there is no doubt of the threatening nature of the emails. Hackers often include statements along the lines of, "Please rest assured that the data is safe with me. It was extracted for proof only. Honestly, I do this job for living, not for fun," said the report.

"This does not negate the fact that the attacker stole the organization's data and placed it online, where others could potentially find it, or where it can be released," said Kuhn.

Trusting unknown parties to secure sensitive corporate data — particularly those who breached a company's security systems without permission — is inadvisable, said Kuhn. And, of course, there are no guarantees when dealing with these criminals, so even when companies pay up, there is still a chance the attacker will just release the data.

 

Organizations that fall victim to this type of attack should gather all relevant information from emails and servers and then contact law enforcement, said Kuhn.

Here are some measures companies can take to avoid becoming a victim, according to IBM Security:

  1. Run regular vulnerability scans on all websites and systems.
  2. Do penetration testing to help find vulnerabilities before criminals do.
  3. Use intrusion prevention systems and web application firewalls.
  4. Test and audit all web application code before deploying it.
  5. Use technology to monitor data and detect anomalies.
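On point 4 in particular, the SQL injection technique described earlier typically works because user input is concatenated straight into a query string. The contrast below is a generic illustration using Python's built-in sqlite3 module, not a reconstruction of any particular victim's stack:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice@example.com')")

    user_input = "1 OR 1=1"  # attacker-controlled value

    # Vulnerable: the input is pasted into the SQL, so "OR 1=1" matches every row.
    vulnerable = conn.execute("SELECT * FROM users WHERE id = " + user_input).fetchall()

    # Safer: a parameterized query treats the whole input as a single value, not as SQL.
    safe = conn.execute("SELECT * FROM users WHERE id = ?", (user_input,)).fetchall()

    print(vulnerable)  # [(1, 'alice@example.com')] - the entire table comes back
    print(safe)        # [] - no row has an id equal to the literal string '1 OR 1=1'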

Source:  http://www.cnbc.com/2016/05/27/the-disturbing-new-way-hackers-are-shaking-down-big-business.html

 

 

 

 
