Google nailed e-mail with the 2004 introduction of Gmail. Now it’s the No. 1 form of electronic correspondence in the United States.

But as traditional e-mail falls out of favour with a growing sliver of the population, Google has struggled to release newer messaging tools that resonate widely.

Now Google is trying again with a new video chat application called Duo. The app works with mobile devices running Google’s Android operating system and Apple Inc.’s iOS. It runs on Wi-Fi and cellular networks, automatically switching between different types and speeds of connection and adjusting video quality.

Duo uses phone numbers, rather than a Google account, making it easier to call friends, family and other people already stored on smartphone contact lists. The company’s existing video calling and messaging app, Hangouts, requires a Google account. That limited adoption, especially in emerging markets. Facebook Inc.’s WhatsApp and Messenger, Skype – now owned by Microsoft Corp. – and Apple’s FaceTime used phone numbers to grow faster.

A confusing array of communication options has held Google back. It has two e-mail services (Gmail – the top e-mail service in the United States based on unique visitors, according to ComScore – and Inbox); three text offerings (Hangouts, Messenger and the upcoming Allo); and now two video chat services (Duo and Hangouts, which offers both texting and video calls).

This scattershot approach, and Google’s late start, is becoming more costly for the Alphabet Inc. division as messaging evolves from a simple way to communicate quickly into one of the next big technology platforms supporting digital commerce, advertising and new services powered by artificial intelligence.

“Google missed it because of the requirement that you needed a Google ID to communicate with others,” said Ankit Jain, a former Googler and executive at SimilarWeb Inc., which measures website and mobile app usage.

Hangouts ranked 84th among Android apps in the United States in July, based on installs and usage, according to SimilarWeb. That lagged behind Facebook Messenger, WhatsApp and Snapchat.

Nick Fox, a 13-year Google veteran, was tasked by Google Chief Executive Officer Sundar Pichai 18 months ago with fixing the sprawl. Soon after, his new team formulated a strategy and started building Duo and Allo.

“Google sees communication as this essential human need, whether that’s through text, a picture, calling someone or doing a video call,” Mr. Fox said in a recent interview.

This insight is a decade old and has guided Facebook’s strategy since its creation in 2004. Asian messaging services such as Tencent Holdings Ltd.’s WeChat and Line have grown into tech powerhouses by connecting people through communication apps and offering related services on top of their networks. Skype, founded in 2003, became a leading video chat app on a similar foundation.

So how is Mr. Fox going to catch up? Job number one is clearing up the bloated smorgasbord of Google communications services.

Hangouts will be a workplace service, offering group video conferencing mostly via desktop computers and office laptops, Fox said. It will be integrated more with Google’s work software, such as Docs, Sheets and Slides, which will be easier to share.

Duo is a mobile app and only allows one-to-one video calling, limiting it as a consumer offering. Allo, a messaging service coming out later this year, will also target consumers, Fox said. Google’s Messenger is a basic text system, part of a group of services provided to wireless carriers that work closely with Android.

The second tactic: Bringing what Mr. Fox says is better technology to the new services to catch up with rivals.

Duo constantly performs “bandwidth estimation” to understand how much video can be delivered. If Wi-Fi weakens, it switches to a phone’s cellular network. If a cellular signal drops as low as 2G, Duo will automatically cut video and maintain audio.
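The article does not describe Duo’s internals, but the behaviour described above (continuous bandwidth estimation, falling back from Wi-Fi to cellular, and dropping video before audio) can be pictured as a simple threshold policy. The following Python sketch is purely illustrative; the thresholds and names are assumptions, not Duo’s actual logic.

    # Illustrative only: a threshold policy mirroring the behaviour described above.
    # The cut-off values and names are assumptions, not figures from Google.
    def pick_call_mode(estimated_kbps, wifi_usable):
        """Choose a transport and which media streams to keep."""
        transport = "wifi" if wifi_usable else "cellular"
        if estimated_kbps < 50:        # roughly 2G territory: drop video, keep audio
            return transport, {"video": False, "audio": True}
        if estimated_kbps < 300:       # weak connection: keep video at reduced quality
            return transport, {"video": True, "audio": True, "quality": "low"}
        return transport, {"video": True, "audio": True, "quality": "high"}

    print(pick_call_mode(estimated_kbps=40, wifi_usable=False))
    # -> ('cellular', {'video': False, 'audio': True})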

Allo will use Google’s expertise in AI to automatically understand texts and provide useful suggestions. Google will also let third-party developers create chatbots that will interact with Allo users through messages. That’s already being tried by other companies such as Facebook and Microsoft, but Google has been working hard on AI for about a decade, so it may be more advanced.

“First build a great product,” Mr. Fox said, repeating a common Google mantra. “Once you get people to love it, they will share it with friends and co-workers and it grows.”

Google was late in other technology and caught up, Fox noted. Gmail started in 2004, more than six years after Yahoo Mail, but Google’s offer of mountains of free storage won over hundreds of millions of users. Google’s Chrome emerged in 2008 – over a decade after Microsoft’s Internet Explorer – and is now the most popular web browser partly because of speed and frequent updates.

Better technology may not be enough to catch up, Mr. Jain said. WhatsApp and Snapchat offered something useful enough to persuade many people to switch away from their existing communication service where all their friends already were.

Duo’s promise of video calling for everyone on Android and iOS is something that Hangouts already offers, but that didn’t move the needle enough, he noted.

“It’s worth another shot, but having better tech can’t be the only thing,” Mr. Jain said.

Source : http://www.theglobeandmail.com/technology/knock-knock-google-duo-video-call-is-here/article31426625/


It’s almost impossible to see any meaningful search engine optimization (SEO) results without spending some time building and honing your inbound link profile. Of the two main deciding factors for site rankings (relevance and authority), one (authority) is largely dependent on the quantity and quality of links pointing to a given page or domain.

As most people know, Google’s undergone some major overhauls in the past decade, changing its SERP layout, offering advanced voice-search functionality and significantly revising its ranking processes. But even though its evaluation of link quality has changed, links have been the main point of authority determination for most of Google’s existence.

Why is Google so dependent on link metrics for its ranking calculations, and how much longer will links be so important?

The concept of PageRank

To understand the motivation here, we have to look back at the first iteration of PageRank, the signature algorithm of Google Search, named after co-founder Larry Page. It uses the presence and quality of links pointing to a site to gauge that site’s authoritativeness.

Let’s say there are 10 sites, labeled A through J. Every site links to site A, and most sites link to site B, but the other sites don’t have any links pointing to them. In this simple model, site A would be far likelier to rank for a relevant query than any other site, with site B as a runner-up.


But let’s say there are two more sites that enter the fray, sites K and L. Site L is linked to from sites C, D and E, which don’t have much authority, but site K is linked to from site A, which has lots of authority. Even though site K has fewer links, the higher authority link matters more — and might propel site K to a similar position as site A or B.
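To make the idea concrete, here is a minimal Python sketch of the original PageRank iteration run on a toy graph like the one described above: every other site links to A, sites C through G also link to B, sites C, D and E link to L, and A links to K. The graph is made up for illustration, the 0.85 damping factor is the value from the original PageRank paper, and pages with no outgoing links simply leak rank in this simplified version.

    # Minimal PageRank sketch on a toy graph (illustrative data only).
    def pagerank(links, damping=0.85, iterations=50):
        pages = set(links) | {t for targets in links.values() for t in targets}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, targets in links.items():
                for t in targets:
                    new_rank[t] += damping * rank[page] / len(targets)
            rank = new_rank
        return rank

    links = {
        "A": ["K"],
        "B": ["A"], "C": ["A", "B", "L"], "D": ["A", "B", "L"], "E": ["A", "B", "L"],
        "F": ["A", "B"], "G": ["A", "B"], "H": ["A"], "I": ["A"], "J": ["A"],
    }
    ranks = pagerank(links)
    print(sorted(ranks, key=ranks.get, reverse=True))
    # A comes out on top, and K, with a single link from high-authority A,
    # ends up close behind, ahead of B and far ahead of L.

In this toy run, authority flows through links exactly as the example describes: one link from a strong page can count for more than several links from weak ones.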


The big flaw

PageRank was designed to be a natural way to gauge authority based on what neutral third parties think of various sites; over time, in a closed system, the most authoritative and trustworthy sites would rise to the top.

The big flaw is that this isn’t a closed system; as soon as webmasters learned about PageRank, they began cooking up schemes to manipulate their own site authority, such as creating link wheels and developing software that could automatically acquire links on hundreds or thousands of unsuspecting websites at the push of a button. This undermined Google’s intentions and forced them to develop a series of checks and balances.

Increasing phases of sophistication

Over the years, Google has cracked down hard on such rank manipulators, first punishing the most egregious offenders by blacklisting or penalizing anyone participating in a known link scheme. From there, they moved on to more subtle developments that simply refined the processes Google used to evaluate link-based authority in the first place.

One of the most significant developments was Google Penguin, which overhauled the quality standards Google set for links. Using more advanced judgments, Google could now determine whether a link appeared “natural” or “manipulative,” forcing link-building tactics to shift while not really overhauling the fundamental idea behind PageRank.

Other indications of authority

Of course, links aren’t the only factor responsible for determining a domain or page’s overall authority. Google also takes the quality of on-site content into consideration, thanks in part to the sophisticated Panda update that rewards sites with “high-quality” (well-researched, articulate, valuable) content.

The functionality of your site, including its mobile-friendliness and the availability of content to different devices and browsers, can also affect your rankings. But it’s all these factors together that determine your authority, and links are still a big part of the overall mix.

Modern link building and the state of the web

Today, link building must prioritize the perception of “naturalness” and value to the users encountering those links. That’s why link building largely exists in two forms: link attraction and manual link building.

Link attraction is the process of creating and promoting valuable content in the hope that readers will naturally link to it on their own, while manual link building is the process of placing links on high-authority sources. Even though marketers are, by definition, manipulating their rankings whenever they do anything known to improve their rankings, there are still checks and balances in place that keep these tactics in line with Google’s Webmaster Guidelines.

Link attraction tactics won’t attract any links unless the content is worthy of those links, and manual link-building tactics won’t result in any links unless the content is good enough to pass a third-party editorial review.

The only sustainable, ongoing manual link-building strategy I recommend is guest blogging, the process by which marketers develop relationships with editors of external publications, pitch stories to them, and then submit those stories in the hope of having them published. Once published, these stories achieve myriad benefits for the marketer, along with (usually) a link.

Could something (such as social signals) replace links?

Link significance and PageRank have been the foundation for Google’s evaluation of authority for most of Google’s existence, so the big question is: could anything ever replace these evaluation metrics?

More user-centric factors could be a hypothetical replacement, such as traffic numbers or engagement rates, but user behavior is too variable and may be a poor indication of true authority. It also eliminates the relative authority of each action that’s currently present in link evaluation (i.e., some users wouldn’t be more authoritative than others).

Peripheral factors like content quality and site performance could also grow in their significance to overtake links as a primary indicator. The challenge here is determining algorithmically whether content is high-quality or not without using links as a factor in that calculation.

Four years ago, Matt Cutts squelched that notion, stating at SMX Advanced 2012, “I wouldn’t write the epitaph for links just yet.” Years later, in a Google Webmaster Video from February 2014, a user asked if there was a version of Google that excludes backlinks as a ranking factor. Cutts responded:

We have run experiments like that internally, and the quality looks much, much worse. It turns out backlinks, even though there’s some noise and certainly a lot of spam, for the most part, are still a really, really big win in terms of quality of our search results. So we’ve played around with the idea of turning off backlink relevance, and at least for now, backlink relevance still really helps in terms of making sure that we return the best, most relevant, most topical set of search results.

The safe bet is that links aren’t going anywhere anytime soon. They’re too integrated as a part of the web and too important to Google’s current ranking algorithm to be the basis of a major overhaul. They may evolve over the next several years, but if so, it’ll certainly be gradual, so keep link building as a central component of your SEO and content marketing strategy.

Source : http://searchengineland.com/links-still-core-authority-signal-googles-algorithm-255452


Later this year, Apple (NASDAQ:AAPL) will release macOS Sierra, the next version of its Mac operating system. Sierra will bring a number of notable improvements to Apple's PCs, including support for Apple Pay and deeper integration with iOS devices.

It also brings Siri, Apple's digital personal assistant. Siri made her debut back in 2011, but it's taken Apple a full five years to bring her to the Mac. Nevertheless, her introduction could dramatically improve the productivity of Mac users, and give them a new way to interact with their machines. It could also pose a threat to Alphabet's (NASDAQ:GOOG) (NASDAQ:GOOGL) Google.

Siri comes to the desktop

Once they've installed Sierra, Mac users will be able to call on Siri by speaking to their machines. It's a bit of a shift from the familiar mouse and keyboard, but impressions have been generally favorable. Wired's David Pierce described the process of using Siri on the Mac as "almost natural" and "definitely useful." Ultimately, he concluded that Siri may be better on the Mac than she is on the iPhone.

In addition to her standard functionality, Mac Siri can conduct special Mac-related tasks, like searching local files or changing settings. iPhone users often find Siri most useful when they aren't actually using their phones at all -- Apple has trumpeted Siri's hands-free features as a key selling point in recent years. But on the Mac, she offers the prospect of improved multi-tasking. A Mac owner can use Siri to change a setting, play a song, or find a file without leaving their current application.

The inclusion of Siri probably won't help Apple sell that many more Macs, as competing machines running Windows 10 have something similar. Still, it does enhance Apple's offerings and ensures that the company is keeping pace with its rivals.

Cortana has helped Bing capture share

Perhaps more interesting is what Siri could do to the search market. Beyond finding files and changing settings, Siri can be used to scour the web. Since 2013, she's relied on Microsoft's (NASDAQ:MSFT) Bing to do so. Unless you specifically request Google, simple commands such as "find me chicken recipes" will result in Siri conducting a Bing search. Last December, research firm Kantar found that Siri relied on Bing frequently. Analyzing the behavior of 3,000 different Siri users, Kantar found that Bing powered 63% of the searches Siri conducted.

Microsoft's share of the U.S. search market has been rising in recent quarters. In July 2015, Bing captured 20.4% of the U.S. desktop search market, according to comScore. Google's share was more than three times higher, at 64%. But over the last year, the gap has closed a bit. In June, comScore reported Bing's market share at 21.8%. Meanwhile, Google stood at 63.8%.

That shift has been driven by Windows 10 -- last quarter, over 40% of Microsoft's search revenue came from Windows 10 devices. Like macOS Sierra, Windows 10 includes a digital personal assistant: Cortana, which, unsurprisingly, is also powered by Bing. Cortana is integrated directly into the Windows 10 taskbar, and on many PCs can be summoned with a quick voice command. To date, Windows 10 users have asked Cortana more than 8 billion questions, and 100 million Windows 10 users rely on her each month.

Siri's addition to the Mac could have a similar effect. There are fewer Mac users than Windows users, obviously, but Siri could shift some search queries to Bing in the months ahead.

If Bing's share of the search market continues to rise, Microsoft may have Apple to thank.


Source : http://www.fool.com/investing/2016/08/11/apples-mac-is-now-a-threat-to-google.aspx


 

Search-driven piracy is a red herring, Google argues.

Pirate sites are now an ‘infinitesimal’ part of Google search results, according to a new copyright-focused report created by Google. “Worldwide, more than 3.5 billion searches are made each day on Google Search, making it the most widely used search engine in the world,” the report states. “Search’s popularity has tangible benefits for rightsholders, as it helps more than a billion people worldwide find licensed copies of content. For example, between our Search and Google News services, Google sends over 10 billion clicks per month to publishers’ websites.”

“There are more than 60 trillion addresses on the Web, but only an infinitesimal portion of these have any connection to piracy.”

Exactly what percentage qualifies as ‘infinitesimal’ is not specified in the document, nor are absolute amounts defined. Also unclear is why Google created this document, though major content owners, policymakers, and key members of the media appear to be the targets. The document was dated July 2016.

Throughout the substantial piracy-focused PDF, Google attempts to dismantle the supposition that search is a major driver of piracy traffic. That includes considerable attention towards YouTube, particularly the highly-effective control mechanisms offered by Content ID. Specifically, Google claims that 99.5 percent of music infringement claims are handled by Content ID, an assertion first reported by Digital Music News and heavily debated by the music industry.

Misleading ‘Long Tail’ Piracy Queries?

In the document, Google also noted that most people aren’t searching for pirated material at all. Instead, the search giant accused its critics of artificially constructing piracy-laden search queries that typically don’t exist. “The search results for the vast majority of media-related queries show results that include only legitimate sites in the top results pages,” the document continues. “This is thanks to both our constant improvements to the algorithms that power Google Search and the efforts of rightsholders to prioritize and target their copyright removal notices.”

To illustrate the point, the document contrasts search results for mainline search terms with those carrying piracy-focused additions. That includes terms like ‘Watch,’ which may indicate interest in streaming piracy sites. “Some critics paint a misleading picture by focusing on the results for rare, ‘long tail’ queries, adding terms like ‘watch’ or ‘free’ or ‘download’ to a movie title or performer’s name,” Google asserts. “While the search results for these vanishingly rare queries can include potentially problematic links, it is important to consider how rare those queries are. Look at the relative frequency of these Google searches in 2015:

‘Katy Perry’ searched 14,812× more often than ‘Katy Perry free download’
‘Taylor Swift’ searched 4,534× more often than ‘Taylor Swift download’
‘PSY Gangnam Style’ searched 104× more often than ‘PSY Gangnam Style download’
‘Star Wars The Force Awakens’ searched 402× more often than ‘Watch Star Wars The Force Awakens’
‘Pixels’ searched 240× more often than ‘Watch Pixels’

“Google Search isn’t responsible for piracy.”

That supports Google’s broader assertion that search isn’t the reason for piracy. “Google Search is not how music, movie, and TV fans intent on pirating media reach pirate sites,” the document continues. “A 2011 study found that all traffic from major search engines (Yahoo, Bing, and Google combined) accounts for less than 16% of traffic to sites like The Pirate Bay, and recent statistics from ComScore confirm these numbers. Research that Google co-sponsored with PRS for Music in the UK further confirmed that traffic from search engines is not what keeps these sites in business. These findings were confirmed in a research paper published by the Computer & Communications Industry Association.”

Source : http://www.digitalmusicnews.com/2016/08/11/piracy-infinitesimal-part-google-search/

 


For years, Yahoo has been criticized for failing to understand what it really is.

Is it a search engine? A web portal? A news site? An advertising tech company? All of the above?

Well, based on how Yahoo describes its competition in its latest quarterly filing, it looks like Yahoo still has no clue what it really wants to be.

Here's what it says:

"We face significant competition from online search engines, sites offering integrated internet products and services, social media and networking sites, e-commerce sites, companies providing analytics, monetization and marketing tools for mobile and desktop developers, and digital, broadcast and print media.

In a number of international markets, especially those in Asia, Europe, the Middle East and Latin America, we face substantial competition from local Internet service providers and other entities that offer search, communications, and other commercial services."

What does that mean? It means Yahoo's competing in all of these areas one way or another:

- Online search

- Internet services, like email

- Social media

- E-commerce

- Data analytics

- Marketing and advertising technology

- Messaging

- Media

For a company that generates about $5 billion a year, that's a lot of different areas to be in. Yahoo's scattershot approach is also pretty interesting when you compare the language to how other companies describe their competition.

Here's what Google says:

"We have many competitors in different industries, including general purpose search engines and information services, vertical search engines and e-commerce websites, social networks, providers of online products and services, other forms of advertising and online advertising platforms and networks, other operating systems, and wireless mobile device companies...Our competitors are constantly developing innovations in search, online advertising, wireless mobile devices, operating systems, and many other web-based products and services."

Here's Facebook:

"We face significant competition in every aspect of our business, including from companies that provide tools to facilitate communication and the sharing of information, companies that enable marketers to display advertising and companies that provide development platforms for applications developers."

Here's Twitter:

"Although we have developed a global platform for public self-expression and conversation in real time, we face strong competition in our business. We compete against many companies to attract and engage users, including companies which have greater financial resources and substantially larger user bases, such as Facebook (including Instagram and WhatsApp), Google, LinkedIn, Microsoft and Yahoo, which offer a variety of Internet and mobile device-based products, services and content."

And Amazon:

"Our businesses are rapidly evolving and intensely competitive, and we have many competitors in different industries, including retail, e-commerce services, digital content and electronic devices, and web and infrastructure computing services."

At least Yahoo is now under a restructuring plan that will narrow its focus to three platforms (search, email, Tumblr) and four content verticals (news, finance, sports, and lifestyle), as well as its Gemini and Brightroll ad offerings. And with its sale to Verizon, it's likely Yahoo will be a much more focused company. Still, it's an interesting reminder that spreading a company's resources too thinly across many different areas often doesn't work.

Source : http://www.businessinsider.com/yahoo-still-has-no-idea-what-it-is-2016-8


Say goodbye to Flash and hello to a pure HTML5 experience when Google rolls out the next update to its Chrome browser. In September Google will release Chrome 53, which will be designed to block Flash.

Most Flash on the web today is never even seen; as Google points out, 90% of it is loaded behind the scenes. However, this type of Flash can slow down your web browsing experience, which is the reason behind Google’s decision to block it.

In Google’s words:

HTML5 is much lighter and faster, and publishers are switching over to speed up page loading and save you more battery life. You’ll see an improvement in responsiveness and efficiency for many sites.

Google began to de-emphasize Flash last September, when Flash content started being served on a click-to-play basis, rather than autoplaying in the browser. Following the positive effect of that change, Google appears to be all but removing Flash from its browser completely.

The company even announced a future update coming this December where HTML5 will become the default experience on Chrome. Sites that only support Flash will still be accessible in Chrome, but users will first be prompted to enable it.

Once synonymous with in-browser gaming, entertainment, and cutting-edge web design, Adobe Flash has been outshone by competing technologies that are lighter, faster, and easier on your device’s battery life.

The two companies will continue to work together, with Adobe said to be assisting Google with its efforts to transition the web to HTML5.

Source : https://www.searchenginejournal.com/googles-chrome-browser-block-flash-starting-september/170548/ 


Google has multiple named parts of its algorithm that influence search rankings. Google Panda is the part of the algorithm specific to the quality of content, Penguin is the part specific to the quality of links, and Hummingbird is the part that handles conversational search queries accurately.

Google Panda

Google Panda takes the quality of a site’s content into account when ranking sites in the search results. Sites with lower-quality content are likely to find themselves negatively impacted by Panda. As a result, higher-quality content surfaces higher in the search results: high-quality content is often rewarded with higher rankings, while low-quality content drops.


When Panda originally launched, many saw it as a way for Google to target content farms specifically, which were becoming a major problem in the search results with their extremely low-quality content that tended to rank due to sheer volume. These sites were publishing a fantastic amount of low-quality content very quickly on topics with very little knowledge or research, and it was very obvious to a searcher who landed on one of those pages.

Google has now evolved Panda to be part of the core algorithm. Previously, we had a known Panda update date, making it easier to identify when a site was hit or had recovered from Panda. Now it is part of a slow rolling update, lasting months per cycle. As a result, it is hard to know whether a site is negatively impacted by Panda or not, other than doing a content audit and identifying factors that sites hit by Panda tend to have.

User Generated Content

It is important to note that Panda does not target user-generated content specifically, something that many webmasters are surprised to learn. While Panda can catch user-generated content, it tends to impact sites that are producing very low-quality content – such as spammy guest posts or forums filled with spam.

Do not remove your user-generated content, whether it is forums, blog comments or article contributions, simply because you heard it is “bad” or marketed as a “Panda proof” solution. Look at it from a quality perspective instead. There are many highly ranking sites with user-generated content, such as Stack Overflow, and many sites would lose significant traffic and rankings simply because they removed that type of content. Even comments made on a blog post can cause it to rank and even get a featured snippet.

Word Count

Word count is another aspect of Panda that is often misunderstood by SEOs. Many sites make the mistake of refusing to publish any content unless it is above a certain word count, with 250 words and 350 words often cited as minimums. Instead, Google recommends you think about how many words the content needs to be successful for the user.

For example, there are many pages out there with very little main content, yet Google thinks the page is quality enough that it has earned the featured snippet for the query. In one case, the main content was a mere 63 words, and many would have been hard pressed to write about the topic in a non-spammy way that was 350+ words in length. So you only need enough words to answer the query.

Content Matches the Query

Ensuring your content matches the query is also important. If you see Google is sending traffic to your page for specific queries, ensure that your page is answering the question searchers are looking for when they land there. If it is not, it is often as simple as adding an extra paragraph or two to ensure that this is happening.

As a bonus, these are the types of pages – ones that answer a question or implied question – that Google is not only looking to rank well but is also awarding the featured snippet for the query to.

 

Technical SEO

Technical SEO also does not play any role in Panda. Panda looks just at the content, not things like whether you are using H1 tags or how quickly your page loads for users. That said, technical SEO can be a very important part of SEO and ranking in general, so it should not be ignored. But it does not have any direct impact on Panda specifically.

Determining Quality

If you are struggling to determine whether a particular piece of content is considered quality or not, there is one surefire way to confirm. Look in Search Analytics or your site’s analytics program, such as Google Analytics, and look at the individual page. If Google is ranking a page and sending it traffic, then clearly it views the page as high enough quality to show it prominently enough in the search results that people are landing there from Google’s search results.

However, if a page is not getting traffic from Google, it does not automatically mean it is bad, but the content is worth looking at closer. Is it simply newer and has not received enough ranking signals to rank yet? Do you see areas of improvement you can make by adding a paragraph or two, or changing the title to match the content better? Or is it truly a garbage piece of content that could be dragging the site down the Panda hole?

Also, do not forget that there is traffic outside of Google. You may question a page because Google is not sending it traffic, but perhaps it does amazingly well in Bing, Baidu, or one of the other search engines instead. Diversity in traffic is always a good thing, and if you have pages that Google might not be sending traffic to, but is getting traffic from other search engines or other sites or through social media shares, then removing that content would be the wrong decision to make.

Panda Prevention

How to prevent Google Panda from negatively impacting your site is pretty simple. Create high-quality, unique content that answers the question searchers are asking.

Reading content out loud is a great way to tell if content is high-quality or not. When content is read aloud, things like overuse of repetitive keywords, grammatical errors, and other signals that the content is less than quality suddenly stand out. Read it out yourself and edit as you go, or ask someone else to read it so you can flag what should be changed.

Google Penguin


The second major Google algorithm is Penguin. Penguin deals solely with link quality and nothing else. Sites that have purchased links or have acquired low-quality links through places such as low-quality directories, blog spam, or link badges and infographics could find their sites no longer ranking for search terms.

Who Should Worry about Penguin?

Most sites do not need to worry about Penguin unless they have done some sketchy link building in the past or have hired an SEO who might have engaged in those tactics. Even if the site owner was not aware of what an SEO was doing, the owner is still ultimately responsible for those links. That is why site owners should always research an SEO or SEO agency before hiring.

If you built links in the past using tactics that were accepted at the time but are now against Google’s webmaster guidelines, you could be impacted by Penguin. For example, guest blogging was fine years ago, but it is not a great way to build links now unless you are choosing your sites well. Likewise, asking site visitors or members to post badges that linked to your site was also fine previously, but will now definitely result in Penguin or a link manual action.

Algorithmic Penguin and Link Manual Actions

Penguin is strictly algorithmic in nature. It cannot be lifted by Google manually, regardless of the reason why those links might be pointing to a website.

Confusing the issue slightly is that there is a separate manual action for low-quality links and that one can be lifted by Google once the links have been cleaned up. This is done with a reconsideration request in Google Search Console. And sites can be impacted by both a linking manual action and Penguin at the same time.

Incoming Links Only

Penguin only deals with a site’s incoming links. Google only looks at the links pointing to the site in question and does not look at the outgoing links at all from that site. It is important to note that there is also a Google manual action related directly to a site’s outgoing links (which is different from the regular linking manual action), so the pages and sites you link to could result in a manual action and the deindexing of a site until those links are cleaned up.

Finding Your Backlinks

If you suspect your site has been negatively impacted by Penguin, you need to do a link audit and remove or disavow the low quality or spammy links. Google Search Console includes a list of backlinks for site owners, but be aware that it also includes links that are already nofollowed. If the link is nofollowed, it will not have any impact on your site, but keep in mind, the site could remove that nofollow in the future without warning.

There are also many third-party tools that will show links to your site, but because some websites block those third-party bots from crawling their site, these tools will not be able to show you every link pointing at your site. And while some of the sites blocking these bots are high-quality, well-known sites that do not want to waste bandwidth on those bots, the same blocking is also used by some spammy sites to hide their low-quality links from being reported.
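As a rough illustration of how such an audit might start, the sketch below pre-screens a backlink export before the manual review described in the following sections. The CSV column names and the “suspect” keywords are assumptions made for this example; in practice, every link still needs individual human assessment.

    # Illustrative pre-screen of a backlink export (assumed CSV columns:
    # "source_url" and "rel"). The keyword heuristics are examples, not rules.
    import csv
    from urllib.parse import urlparse

    SUSPECT_HINTS = ("free-directory", "article-submission", "link-exchange")

    def screen_backlinks(path):
        flagged_domains, nofollowed = set(), []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                url, rel = row["source_url"], row.get("rel", "")
                if "nofollow" in rel:
                    nofollowed.append(url)   # no ranking impact today, but note it
                    continue
                if any(hint in url.lower() for hint in SUSPECT_HINTS):
                    flagged_domains.add(urlparse(url).netloc.lower())  # queue for review
        return sorted(flagged_domains), nofollowed

    flagged, nofollowed = screen_backlinks("backlinks.csv")
    print(len(flagged), "domains queued for manual review;",
          len(nofollowed), "links already nofollowed")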

Assessing Link Quality

When it comes to assessing the links, this is where many have trouble. Do not assume that because a link comes from an .edu site that it is high-quality. There are plenty of students who sell links from their personal websites on those .edu domains which are extremely spammy and should be disavowed. Likewise, there are plenty of hacked sites within .edu domains that have low-quality links.

 

Do not make judgments strictly based on the type of domain. While you can’t make automatic assumptions about .edu domains, the same applies to all TLDs and ccTLDs. Google has confirmed that simply being on a specific TLD does not help or hurt search rankings. But you do need to make individual assessments. There is a long-running joke that there has never been a quality page on a .info domain because so many spammers were using them, but in fact, there are some great quality links coming from that TLD, which shows why individual assessment of links is so important.

Beware of Links from Presumed High-Quality Sites


Do not look at the list of links and automatically consider links from specific websites as being a great quality link, unless you know that very specific link is high quality. Just because you have a link from a major website such as Huffington Post or the BBC does not make that an automatic high-quality link in the eyes of Google – if anything, you should question it more.

Many of those sites are also selling links, albeit some disguised as advertising or placed by a rogue contributor selling links within their articles. That links from high-quality sites can actually be low-quality has been confirmed by many SEOs who have received link manual actions that include links from these sites in Google’s examples. And yes, they could likely be contributing to a Penguin issue.

As advertorial content increases, we are going to see more and more links like these get flagged as low-quality. Always investigate links, especially if you are considering not removing any of them simply based on the site the link is from.

Promotional Links

As with advertorials, you need to think about any links that sites may have pointed to you that could be considered promotional links. Paid links do not always mean money is exchanged for the links.

Examples of promotional links that are technically paid links in Google’s eyes are any links given in exchange for a free product for review or a discount on products. While these types of links were fine years ago, they now need to be nofollowed. You will still get value from the link, but it will come through brand awareness and traffic rather than through rankings. You may have links out there from a promotional campaign done years ago that are now negatively impacting a site.

For all these reasons, it is vitally important to individually assess every link. You want to remove the poor-quality links because they are hurting you through Penguin or could cause a future manual action. But you do not want to remove the good links, because those are the links that are helping your rankings in the search results.

Promotional links that are not nofollowed can also trigger the manual action for outgoing links on the site that placed those links.

Editor Note: Removing links and submitting a disavow request is also covered in more detail in the ‘What to Do When Things Go Wrong’ section of our SEO Guide.

Link Removals

Once you have gone through your backlinks and determined that there are some that should be removed or disavowed, you will need to get these links removed. You should first approach site owners and ask them to remove the links pointing to your site. If removals are unsuccessful, add those URLs to a disavow file, one you will submit to Google.

There are tools that will automate the link removal requests and agencies that will handle the requests as well, but do not feel it is necessary to do this. Many webmasters find contact forms or emails and will do it themselves.

Some site owners will demand a fee to remove a link from a site, but Google recommends not paying for link removals. Just include them in your disavow file instead and move onto the next link removal. Some site owners are using link removals to generate revenue, so the practice is becoming more common.

Creating and Submitting a Disavow File

The next step in cleaning up Penguin issues is to submit a disavow file. The disavow file is a file you submit to Google that tells them to ignore all the links included in the file so that they will not have any impact on your site. The result is that the negative links will no longer cause negative ranking issues with your site, such as with Penguin, but it does also mean that if you erroneously included high-quality links in your disavow file, those links will no longer help your ranking. This is another reason why it is so crucial to check your backlinks well before deciding to remove them.

If you have previously submitted a disavow file to Google, they will replace that file with your new one, not add to it. So it is important to make sure that if you have previously disavowed links, you still include those links in your new disavow file. You can always download a copy of the current disavow file in Google Search Console.

 

Disavowing Individual Links Versus Domains

It is recommended that you disavow links at the domain level instead of disavowing individual links. There will be some cases where you will want to disavow specific individual links, such as on a major site that has a mix of quality and paid links. But for the majority of links, you can do a domain-based disavow. Then, Google only needs to crawl one page on that site for that link to be discounted for your site.

Doing domain-based disavows also means that you do not have to worry about those links being indexed as www or non-www, as the domain-based disavow takes this into account.

What to Include in a Disavow File

You do not need to include any notes in your disavow file, unless they are strictly for your reference. It is fine just to include the links and nothing else. Google does not read any of the notations you have made in your disavow file, as they process it automatically without a human ever reading it. Some find it useful to add internal notations, such as the date a group of URLs was added to the disavow file or comments about their attempts to reach the webmaster about getting a link removed.
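For reference, the disavow file itself is just a plain-text list with one entry per line: a domain: prefix disavows an entire domain, a full URL disavows an individual link, and lines starting with # are comments kept purely for your own records. A minimal, made-up example:

    # Added 2016-08-01: owner ignored two removal requests
    domain:spammy-article-directory.example
    domain:free-link-exchange.example

    # Single paid link on an otherwise reputable site
    http://bigpublisher.example/sponsored-post-about-widgets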

Once you have uploaded your disavow file, Google will send you a confirmation. But while Google will process it immediately, it will not immediately discount those links. So you will not instantly recover from submitting the disavow alone. Google still needs to go out and crawl those individual links you included in the disavow file, but unfortunately the disavow file itself will not prompt Google to crawl those pages specifically.

It can take six or more months for all those individual links to be crawled and disavowed. And no, there is no way to determine which links have been discounted and which ones have not been, as Google will still include both in your linking report in Google Search Console.

Speeding Up the Disavow Process

There are ways you can speed up the disavow process. The first is using domain-based disavows instead of individual links. The second is not to waste time including lengthy notations for Google’s benefit, so that you can submit your disavow faster. Because reconsideration requests require you to submit more details, some misunderstand and believe the disavow needs more details, too.

 

Lastly, if you have undergone any changes in your domain, such as switching to https or switching to a new domain, you need to remember to upload that disavow file to the new domain property in Google Search Console. This is one step that many forget to do, and they can be impacted by Penguin or the linking manual action again, even though they have cleaned it up previously.

Recovery from Penguin

When you recover from Penguin, do not expect your rankings to go back to where they used to be before Penguin, nor for the return to be immediate. Far too many site owners are under the impression that they will immediately begin ranking at the top for their top search queries once Penguin is lifted.

First, some of the links that you disavowed were likely contributing to an artificially high ranking, so you cannot expect those rankings to be as high as they were before. Second, because many site owners have trouble assessing the quality of the links, some high-quality links inevitably get disavowed in the process, links that were contributing to the higher rankings.

Add to the mix the fact that Google is constantly changing its ranking algorithm, so factors that benefited you previously might not have as big of an impact now, and vice versa.

Compensated Links via Badges and More

Also be aware of any link building campaigns you are doing, or legacy ones that could come back to impact your site. This would include things like badges you have given to other site owners to place on their sites or the requirement that someone includes a link to your site to get a directory listing or access something. In simple terms, if the link was placed in exchange for anything, it either needs to be nofollowed or disavowed.

When it comes to the disavow files that people are using to clean up poor-quality links, there is a concern that a site could be hurt by competitors placing its URLs into a disavow file uploaded to Google. But Google has confirmed that they do not use the URLs contained within a disavow file for ranking, so even if your site appears in thousands of disavows, it will not hurt. That said, if you are concerned your site is legitimately appearing in thousands of disavows, then your site probably has a quality issue you should fix.

Negative SEO

There is also the negative SEO aspect of linking, where some site owners worry that a competitor could buy spammy links and point them to their site. And many use negative SEO as an excuse when their site gets caught by Google for low-quality links.

If you are worried about this, you can proactively disavow the links as you notice them. But Google has said they are pretty good about recognizing this when it happens, so it is not something most website owners need to worry about.

Real Time Penguin

Google is expected to release a new version of Penguin soon, which will have one very notable change. Instead of site owners needing to wait for a Penguin update or refresh, the new Penguin will be real-time. This is a huge change for those dealing with the impact of spammy links and the long waits many have had to endure for a refresh after cleaning up.

Hummingbird


Google Hummingbird is part of the main Google search algorithm and was the first major change to that algorithm since 2001. But what is different about Hummingbird is that it is not specifically a spam-targeting algorithm; instead, it is meant to ensure Google is serving the best results for specific queries. Hummingbird is more about understanding search queries better, particularly with the rise of conversational search.

 

It is believed that Hummingbird is positively impacting the types of sites that are providing high-quality content that reads well to the searcher and is providing answers to the question the searcher is asking, whether it is implied or not.

Hummingbird also impacts long-tail search queries, similar to how RankBrain helps those types of queries. Google wants to ensure that it can provide high-quality results for longer queries. For example, instead of sending a specific question related to a company to the company’s homepage, Google will try to serve an internal page on the site about that specific topic or issue instead.

Hummingbird cannot be optimized for directly, outside of optimizing for the rise of conversational search. Longer search queries, such as those we see with voice search, and the types of queries searchers tend to make on mobile, are typical of conversational search. And optimizing for conversational search is easier than it sounds: make sure your content is highly readable and can answer those longer-tail queries as well as shorter ones.

Like RankBrain, Hummingbird had been running for a period before it was announced, and SEOs did not particularly notice anything different regarding the rankings. It is not known how often Hummingbird is updated or changed by Google.

Source : https://www.searchenginejournal.com/seo-guide-panda-penguin-hummingbird/169167/


Google has produced a car that drives itself and an Android operating system that has remarkably good speech recognition. Yes, Google has begun to master machine intelligence. So it should be no surprise that Google has finally started to figure out how to stop bad actors from gaming its crown jewel – the Google search engine. We say finally because it’s something Google has always talked about, but, until recently, has never actually been able to do.

With the improved search engine, SEO experts will have to learn a new playbook if they want to stay in the game.

SEO Wars

In January 2011, there was a groundswell of user complaints, kicked off by Vivek Wadhwa, about Google’s search results being subpar and gamed by black hat SEO experts – people who use questionable techniques to improve search-engine results. By exploiting weaknesses in Google’s search algorithms, these characters made search less helpful for all of us.

We have been tracking the issue for a while. Back in 2007, we wrote about Americans experiencing “search engine fatigue,” as advertisers found ways to “game the system” so that their content appeared first in search results (read more here). And in 2009, we wrote about Google’s shift to providing “answers,” such as maps results and weather above search results.

Even the shift to answers was not enough to end Google’s ongoing war with SEO experts. As we describe in this CNET article from early 2012, it turns out that answers were even easier to monetize than ads. This was one of the reasons Google has increasingly turned to socially curated links.

In the past couple of years, Google has deployed a wave of algorithm updates, including Panda, Panda 2 and Penguin, as well as updates to existing mechanisms such as Query Deserves Freshness. In addition, Google made it harder to figure out what keywords people are using when they search.

The onslaught of algorithm updates has effectively made it increasingly difficult for a host of black hat SEO techniques — such as duplicative content, link farming and keyword stuffing — to work. This doesn’t mean those techniques won’t work; one look at a query like “payday loans” or “viagra” proves they still do. But these techniques are now more query-dependent, meaning that Google has essentially given a pass to certain verticals that are naturally more overwhelmed with spam. For the most part, though, using “SEO magic” to build a content site is no longer a viable long-term strategy.

The New Rules Of SEO

So is SEO over? Far from it. SEO is as important as ever. Understanding Google’s policies and not running afoul of them is critical to maintaining placement on Google search results.

With these latest changes, SEO experts will now need to have a deep understanding of the various reasons a site can inadvertently be punished by Google and how best to create solutions needed to fix the issues, or avoid them altogether.

Here’s what SEO experts need to focus on now:

Clean, well-structured site architecture. Sites should be easy to use and navigate, employ clean URL structures that make hierarchical sense, properly link internally, and have all pages, sections and categories properly labeled and tagged.

Usable Pages. Pages should be simple, clear, provide unique value, and meet the average user’s reason for coming to the page. Google wants to serve up results that will satisfy a user’s search intent. It does not want to serve up results that users will visit, click the back button, and select the next result.

Interesting content. Pages need to have more than straight facts that Google can answer above the search results, so a page needs to show more than the weather or a sports score.

No hidden content. Google sometimes thinks that hidden content is meant to game the system. So be very careful about handling hidden items that users can toggle on and off or creative pagination.

Good mobile experience. Google now penalizes sites that do not have a clean, speedy and presentable mobile experience. Sites need to stop delivering desktop web pages to mobile devices.

Duplicate content. When you think of duplicate content you probably think of content copied from one page or site to another, but that’s not the only form. Things like a URL resolving using various parameters, printable pages, and canonical issues can often create duplicate content issues that harm a site.

Markup. Rich snippets and structured data markup will help Google better understand content, as well as help users understand what’s on a page and why it’s relevant to their query, which can result in higher click-through rates.
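As one common illustration of such markup, a page can describe itself to search engines with schema.org data embedded as JSON-LD in a script tag. The values below are placeholders for the example, not a prescription for any particular site:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Article",
      "headline": "Example headline (placeholder)",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2016-08-11",
      "image": "http://www.example.com/lead-image.jpg"
    }
    </script>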

Google chasing down and excluding content from bad actors is a huge opportunity for web content creators. Creating great content and working with SEO professionals from inception through maintenance can produce amazing results. Some of our sites have even doubled in Google traffic over the past 12 months.

So don’t think of Google’s changes as another offensive in the ongoing SEO battles. If played correctly, everyone will be better off now.

Source : https://techcrunch.com/2013/08/03/won-the-seo-wars-google-has/


 

CatholicGoogle, a new site based on Google’s custom search, is “striving to provide an easy to use resource to anyone wanting to learn more about Catholicism and provide a safer way for good Catholics to surf the web.” The site uses a permanently-on Google SafeSearch to filter out profanity and pornography, along with a filter for specific topics that floats Catholic-related sites to the top. For example, a search for “birth control” serves up pages on why birth control is viewed as a sin in the Catholic Church as its first results.

The search engine might appeal to some devout Catholics if it actually worked. However, it seems that when it comes to filtering topics beyond the standard “offensive” categories (swear words and sex), CatholicGoogle only serves to make queries potentially more offensive. A search for “drunk” yields a video of “Drunk Catholic Kids”. Perhaps even more bizarre: a search for “sex” offers an article bashing the Church’s stance on sexuality (they may have included this in the results for a balanced alternative perspective, but I doubt it). It’s as if the site just appends the word “Catholic” to whatever you’re searching for and crosses its fingers.

If this is your sort of thing, you might also be interested in GodTube, the YouTube for Christians or Gospelr (you guessed it – the Twitter for Christians).

Source : https://techcrunch.com/2009/01/02/catholicgoogle-your-search-engine-for-all-things-catholic/

 


Google dedicated its 31 July doodle to mark the 136th birth anniversary of acclaimed novelist Munshi Premchand.

"Today's homepage celebrates a man who filled many pages (of a different kind) with words that would forever change India's literary landscape," said the search engine.

"Although much of it was fiction, Premchand's writing often incorporated realistic settings and events, a style he pioneered within Hindi literature," Google said.

"His last and most famous novel, Godaan (1936), inspired the doodle, which shows Premchand (sometimes referred to as "Upanyas Samrat," or, "emperor among novelists") bringing his signature working-class characters to life. On what would have been his 136th birthday, the illustration pays tribute to the multitude of important stories he told," the description to the Google doodle reads.

Here are eight interesting facts about the 'Upanyas Samrat':

  1. Born Dhanpat Rai in a small village near Varanasi in 1880, the renowned author started writing at the age of 13.
  2. Many of his early works are in Urdu. He began writing in Hindi in 1914, with his first short story, Saut, being published in 1915.
  3. He began writing under the pen name of Nawab Rai before switching to Premchand. He produced more than a dozen novels, 250 short stories, and a number of essays, many under the pen name Premchand.
  4. Munshi Premchand worked as a teacher in a government district school in Baharaich. He quit his teaching job to join the non-cooperation movement in 1920.
  5. His works are believed to have been largely influenced by Mahatma Gandhi and the non-cooperation movement.
  6. His 1924 short story Shatranj Ke Khiladi (The Chess Players) was made into a film. The 1977 Satyajit Ray-directorial, starring Sanjeev Kumar, Shabana Azmi, Farida Jalal, Farooq Shaikh, among others, was narrated by Amitabh Bachchan. It won the National Film Award for Best Feature Film in Hindi that year.
  7. Not many know that Premchand also wrote the script for the film Majdoor.
  8. His work Godaan is considered one of the greatest Hindustani novels of modern Indian literature. The book was later translated into English and also made into a Hindi film in 1963.

Source : http://www.catchnews.com/tech-news/google-doodle-marks-munshi-premchand-s-136th-birth-anniversary-8-lesser-known-facts-about-the-emperor-among-novelists-1469948223.html/fullview

