[This article was originally published on searchengineland.com, written by Greg Sterling - Uploaded by AIRS Member: Eric Beaudoin]

Facebook users turned to Google search or went directly to publishers.

An interesting thing happened on August 3. Facebook was down for nearly an hour in Europe and North America. During that time, many users who were shut out of their Facebook News Feeds went directly to news sites or searched for news.

Direct traffic spikes during a Facebook outage. According to data presented by Chartbeat at the recent Online News Association conference, direct traffic to news publisher sites increased 11 percent (in large part from app-driven traffic) and search traffic to news sites increased 8 percent during the outage, which occurred a little after 4:00 p.m.

According to late 2017 data from the Pew Research Center:

Just under half (45 percent) of U.S. adults use Facebook for news. Half of Facebook’s news users get news from that social media site alone, with just one-in-five relying on three or more sites for news.

Algorithm change sent people back to search. From that perspective, it makes sense that when Facebook is unavailable, people will turn to direct sources to get news. Earlier this year, however, Facebook began to “fix” the News Feed by minimizing third-party “commercial content.” This impacted multiple entities, but most news publishers saw their referral traffic from Facebook decline, a pattern that predated the algorithm change.

Starting in 2017, there’s evidence that as Facebook referrals have declined, more people have turned to Google to obtain their news fix. Users no longer able to get news as easily from Facebook are going to Google or directly to news sources to get it.

Why it matters to marketers. The trends shown in this chart underscore opportunities for content creators to capitalize on well-optimized pages (and possibly ads) to reach news-seeking audiences in search. It also highlights programmatic and direct-buying ad opportunities for marketers to reach these audiences on publisher sites.

Categorized in News & Politics

According to a new study published today from the American Civil Liberties Union, major social networks including Twitter, Facebook and Instagram have recently provided user data access to Geofeedia, the location-based, social media surveillance system used by government offices, private security firms, marketers and others.

As TechCrunch previously reported, Geofeedia is one of a bevy of technologies used secretly by police to monitor activists and the contents of their discussions online.

The ACLU said in a blog post that both Twitter and Facebook (which owns Instagram) made some immediate changes in response to their study’s findings.

“Instagram cut off Geofeedia’s access to public user posts, and Facebook cut its access to a topic-based feed of public user posts,” the ACLU said.

The ACLU also noted in their post:

“Neither Facebook nor Instagram has a public policy specifically prohibiting developers from exploiting user data for surveillance purposes. Twitter does have a ‘longstanding rule’ prohibiting the sale of user data for surveillance as well as a Developer Policy that bans the use of Twitter data ‘to investigate, track or surveil Twitter users.’”

On Tuesday, following the publication of the ACLU findings, Twitter announced that it would “immediately suspend Geofeedia’s commercial access to Twitter data.”

A Facebook spokesperson tells TechCrunch:


“[Geofeedia] only had access to data that people chose to make public. Its access was subject to the limitations in our Platform Policy, which outlines what we expect from developers that receive data using the Facebook Platform. If a developer uses our APIs in a way that has not been authorized, we will take swift action to stop them and we will end our relationship altogether if necessary.”

It’s worth noting that Facebook’s platform policy broadly limits what developers can do with user data.

For example, it says developers are not permitted to “sell, license, or purchase any data obtained” from Facebook or its services. And they can’t transfer data they get from Facebook, including “anonymous, aggregate, or derived data,” to any data brokers. Finally, developers are not permitted to put Facebook data into any search engines or directories without the social network’s explicit permission.

We have reached out to Geofeedia for comment but executives were not immediately available for an interview.

A public relations consultant for Geofeedia sent a lengthy statement, attributed to Geofeedia CEO Phil Harris, defending the company’s practices in general. An excerpt follows:

“Geofeedia is committed to the principles of personal privacy, transparency and both the letter and the spirit of the law when it comes to individual rights. Our platform provides some clients, including law enforcement officials across the country, with a critical tool in helping to ensure public safety…

Geofeedia has in place clear policies and guidelines to prevent the inappropriate use of our software; these include protections related to free speech and ensuring that end-users do not seek to inappropriately identify individuals based on race, ethnicity, religion, sexual orientation or political beliefs, among other factors.

That said, we understand, given the ever-changing nature of digital technology, that we must continue to work to build on these critical protections of civil rights.”

Update: A company statement from Geofeedia was added to this post after it was originally published. 

Source : https://techcrunch.com

Categorized in Search Engine

Facebook wants you to lean back and watch its News Feed videos on your television with a new feature that lets you stream clips via Apple TV, AirPlay devices, Google Chromecast, and Google Cast devices. The move could help Facebook generate more video ad revenue, and increase usage time by giving people the richest possible viewing experience while at home.

The feature is now available on iOS and will come to Android soon. To use it, just find a video in the feed on your phone or desktop, tap the TV button in the top right, and then select the device you want to stream through.


You can keep scrolling through the feed and using Facebook while the video continues to stream. That allows Facebook to become both the first and second screen, a strategy Periscope is pursuing differently by allowing professional content broadcasts to be piped into Periscope and Twitter via its new Producer feature.

 

Facebook started testing streaming to televisions from Android back in May and iOS in August. The company actually added a way to cast via AirPlay from its iPad app back in 2011, so it’s strange that it’s taken this long to come to web and mobile.

 

Competitors like YouTube and Periscope already have ways to stream onto televisions, and today’s launch could make sure Facebook doesn’t fall behind. YouTube lets you create a queue of videos on the fly to watch sequentially, which seems like a sensible next feature for Facebook to add.

The goal for Facebook is always ubiquity, so it’s embracing as many viewing platforms as possible. While its most popular surface and core money maker might remain mobile for a long time, VR and TVs could give Facebook an even bigger presence in our lives.

Source : techcrunch.com


There are so many information portals on the web for health information, it can be tough to decipher which one is the best resource to answer a medical question. NetBase Solutions has launched healthBase, a powerful semantic search engine that aggregates medical content from millions of authoritative health sites including WebMD, Wikipedia, PubMed, and the Mayo Clinic’s health site.

HealthBase uses NetBase’s proprietary search intelligence technology to read sentences inside documents and linguistically understand the meaning of the content. Thus, healthBase’s search engine can automatically find treatments for any health condition or disease, the pros and cons of any treatment, medication or food, and more.

The search engine’s results are impressive. When you type in a search for the available treatments for diabetes, you are given results that are broken down by 63 drugs and medications used to treat the disease, 70 common treatments for diabetes, and 20 appropriate food and plants for the treatment of diabetes. You can also see the pros and cons of certain treatments. Search results appear disarmingly fast and will take you to the appropriate site where the content and information is hosted.

There’s no doubt that this is a useful site to tap into the vast variety of health information there is on the web, but I find the site to be slightly impersonal. Medical information, which can be daunting and sterile, is sometimes best served with a human touch on the web, especially when it comes to consumer knowledge. Medpedia is a good example of a site that contains a large amount of content that also has a social element.

But healthBase serves a valid purpose as an aggregator of medical content and will surely help those looking for a comprehensive research tool. Parent company NetBase won’t serve advertising on the site but monetizes its technology by powering internal search engines for companies that have large databases of content. Healthbase is a public demonstration of its technology.

https://techcrunch.com/2009/09/02/healthbase-is-the-ultimate-medical-content-search-engine/


 

As of late June, 32.5% of page one Google results now use the HTTPS protocol, according to a new study from Moz. The esteemed Dr Pete published a blog post this week on the data they’ve been tracking in the two-year period since Google announced HTTPS was to be a light ranking signal in August 2014.

The results are definitely enough to give SEOs pause for thought when it comes to considering whether to switch their sites to a secure protocol.

What is HTTPS?

In case you need a refresher, here is Jim Yu’s explanation of the difference between http and HTTPS:

HTTP is the standard form used when accessing websites. HTTPS adds an additional layer of security by encrypting in SSL and sharing a key with the destination server that is difficult to hack.
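That "additional layer of security" is visible in any modern TLS client. As a minimal sketch (using Python's standard library, not anything from the article), the default client context both encrypts traffic and refuses to talk to a server whose certificate doesn't check out:

```python
import ssl

# Python's default TLS client context illustrates what HTTPS adds over HTTP:
# traffic is encrypted, and the server must present a valid certificate
# before any application data is exchanged.
ctx = ssl.create_default_context()

# Certificate validation is mandatory by default...
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
# ...and the hostname must match the certificate.
print(ctx.check_hostname)                    # True
```

A plain HTTP connection skips both steps, which is exactly why Google treats the secure protocol as a trust signal.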

And here is Google’s 2014 announcement:

“We’re starting to use HTTPS as a ranking signal. For now, it’s only a very lightweight signal, affecting fewer than 1% of global queries, and carrying less weight than other signals, such as high-quality content.”
But over time, the promise that Google would strengthen the signal “to keep everyone safe on the Web” seems to be coming true…

HTTPS as a ranking signal in 2014

Searchmetrics found little difference between HTTP and HTTPS rankings in the months after the initial Google announcement. Hardly surprising, as it only affected 1% of results. Moz also saw very little initial difference. Prior to August 2014, 7% of page one Google results used the HTTPS protocol. A week after the update announcement, that number increased to 8%.

So we all went about our business, some of us implemented, some of us didn’t. No big whoop. It’s not like it’s AMP or anything! Amirite?

SMASH CUT TO:

HTTPS as a ranking signal in 2016

Moz has found that one-third of page one Google results now use HTTPS.

[Chart: Moz, percentage of page one Google results using HTTPS over time]

 

As Dr Pete points out, due to the gradual progression of the graph, this probably isn’t due to specific algorithm changes as you would normally see sharp jumps and plateaus. Instead it may mean that Google’s pro-HTTPS campaign has been working.

“They’ve successfully led search marketers and site owners to believe that HTTPS will be rewarded, and this has drastically sped up the shift.”
Projecting forward, it’s likely that in 16–17 months’ time HTTPS results may hit 50%, and Dr Pete predicts an algorithm change to further bolster HTTPS in about a year.

Source:  https://searchenginewatch.com/2016/07/07/https-websites-account-for-30-of-all-google-search-results/

 

 


I am not sure where these myths come from, but someone asked Google's John Mueller in Friday's Google Hangout on Google+ at the 11:15 minute mark whether Google may penalize a site that doesn't link out to other sites. The person said they had heard Google issues penalties when sites don't link to external sites.

John Mueller from Google quickly said there is no such penalty.

The question asked was:

I heard that there is a penalty if I don’t link out from my domain to different domain from any of my pages. Is that truth? Is not linking out from any of my page harmful?

John responded:

No that's not correct.

So there is no penalty for not linking out, that's definitely not the case.
Obviously for users sometimes it makes sense to provide references and other websites that they can visit to get more information on certain topics. So I think from a user experience point of view it's probably a good idea to have links on your pages.

But surely from a web spam point of view, from a Google indexing point of view, you don't need to put links on your pages.

John also told us there is no SEO benefit to link out, which was refuted by an SEO study later. There is definitely a fear of linking out which is sad.

John added his personal thoughts saying:

So I guess for me, from a personal point of view, I really like to see links on other pages because it kind of helps to keep the web vibrant, so that people go off and visit other things from time to time and see different viewpoints for the same type of information. So that's something I certainly wouldn't suppress.

Which echoed what Gary Illyes said in a more "PC" way, where Gary said it's stupid not to link out and it makes him angry.

Source:  https://www.seroundtable.com/google-link-externally-penalty-22362.html


Now, I do not know if authorship was ever a ranking signal but I assume Google tested it to see if it should be. And we now know Google said it is safe to remove authorship markup from your pages and they also said they don't know who authored something on your site.

But Google has said over the years, I remember Matt Cutts calling out certain authors as awesome and can help your site rank for stuff if you get them to write for you. But I guess that was tested and it didn't play into Google's definition of what is quality content.

John Mueller addressed the question again on Friday's Google Hangout on Google+. He said that Google doesn't know who wrote the article on your site, and even if you do have a great writer write something on your site, it might not be something great that he wrote. So each article needs to "stand on their own," John said.

He said this at the 36:41 minute mark.
Question:

Previously, you said you didn't know really who wrote an article. Does it mean it's not a ranking factor who created content? Danny Sullivan is a great author. If he guest-posted on my blog, wouldn't you think the article would be great because it's made by him?

Answer:

Probably we wouldn't know that. I mean, maybe the article is great and it would rank essentially on its own, or based on the feedback that we see from users with regards to recommendations like links. But just because a well-known author publishes on someone else's blog doesn't automatically make that blog post really relevant. So it might be that Danny Sullivan posts something on some totally random blog, and if we don't realize that and other users don't realize that, then that's something that might kind of get lost like that.

So that's also something: just because one person wrote it doesn't necessarily mean that the quality will always be really high. So we shouldn't assume that just because it has maybe Danny Sullivan's author markup on that page, this article is suddenly really valuable and should be ranking very high. So from that point of view, these pages, these articles that are written by people, really have to be able to stand on their own.

So I guess, maybe, Google tested to see if who writes something is a good ranking signal? Or maybe, SEOs faked authorship and killed it and Google couldn't use it?

Source:  https://www.seroundtable.com/google-dropped-authorship-as-a-ranking-signal-why-22364.html


Google's John Mueller covered lots and lots of myths this past Friday in the Google Hangout on Google+. He said at the 34:37 minute mark that having short articles won't give you a Google penalty. He also said that even some long articles can be confusing for users. He said that short articles can be great and long articles can be great - it is about your users, not search engines.

The question posed was:

My SEO agency told me that the longer the article I write, the more engaged the user should be, or Google will penalize me for this. I fear writing longer articles with lots of rich media inside because of this. Is my SEO agency correct or not?

Back in 2012, Google said short articles can rank well and then again in 2014 said short articles are not low quality. John said in 2016:

So I really wouldn't focus so much on the length of your article but rather making sure that you're actually providing something useful and compelling for the user. And sometimes that means a short article is fine, sometimes that means a long article with lots of information is fine.

So that's something that you essentially need to work out between you and your users.
From our point of view, we don't have an algorithm that counts words on your page and says, oh, everything under a hundred words is bad, everything between 200 and 500 is fine, and over 500 needs to have five pictures. We don't look at it like that.

We try to look at the pages overall and make sure that this is really a compelling and relevant search result for users. And if that's the case, then that's perfectly fine. If that's long or short, lots of images or not, that's essentially up to you.

Sometimes I think long articles can be a bit long-winded and might lose people along the way. But sometimes it's really important to have a long article with all of the detailed information there. That's really something that maybe is worth double-checking with your users, doing some A/B testing with them, or getting their feedback in other ways. Sometimes you can put stars on the page if you have reviews, or maybe use Google Consumer Surveys to get a quick sample of how your users are reacting to that content. But that's really something between you and your users, and not between you and the Google search engine, from that point of view.

I specifically took the Google Consumer Surveys approach when I was hit by the Panda 4.1 update, which I recovered from with Panda 4.2. I even published my results for all to see over here, and they showed that people, my readers, like my short content.

So it really isn't about how short, long or detailed you are. As long as the content satisfies the user, Google should be satisfied too.

Source:  https://www.seroundtable.com/google-short-articles-penalty-22363.html


Google is increasing the number of queries that receive a Google Quick Answer box. The number of results that had an answer box went from just over 20% in December 2014 to more than 30% in May 2016. Brands that wish to maintain a strong digital presence need to make sure their website is well represented within these rich answers.

Answer boxes provide users with scannable, easy-to-digest answers at the top of the search results so that users can find the information they seek without having to click off to another website. These answer boxes are pulled from high-ranking websites that Google trusts to provide users with the correct response. They appear most frequently in response to question queries, such as those beginning with ‘what is’ or ‘how to’.

As they become increasingly significant on SERPs, companies that are not optimized to receive Quick Answers have a good chance of falling behind and losing ground to others in their industry.

How do Google Quick Answers impact brands?

When Quick Answers first appeared, many site owners became nervous about the potential implications for site traffic. With the answer to many queries appearing right at the top of the page, users would theoretically lose their motivation to click through to the websites.

Some sites found this to be true. Wikipedia, for example, saw a drop in traffic that many attributed to the growth of Quick Answers. This is likely because the domain specializes in providing people with the type of rapid response that many can now receive right on the SERP.

However many business websites started to see tremendously positive results.

It is important to remember that Quick Answers are not just taken from results in position 1 on the SERP. They can come from any result on the page, although the majority come from the top five results. This means, however, that sites ranked in position 3 or 4 can receive an answer box and suddenly be front and centre on the page, without even earning the top ranking spot. This draws the user’s attention to that result and can have a very positive impact on site success.

Adobe, for example, benefited from a 17% incremental lift on topics for which it has secured the Quick Answer box. The results contributed to millions of additional visitors to Adobe.com. Kirill Kronrod at Adobe reported that within the sub-set of 2,000 how-to phrases, 60% produced Quick Answers, contributing to 84% share of voice with Quick Answer boxes for the main site and 98% including supporting sites.

Quick Answers help Google improve the user experience, and your brand needs to optimize to remain relevant.

The ABCs of succeeding with Google Quick Answers

[Image: example of a Google Quick Answer box]

A) Understand the four key factors that matter for Quick Answers

Although there is no concrete formula that brands have to meet before they will receive a Quick Answer, there are a few commonalities that sites which earn the answer box tend to have.

Sites have over 1,000 referring domains
Pages rank in the top 5
Pages are less than 2,000 words
Pages have strong user engagement
All of these factors demonstrate to Google that you have a site appreciated by users and that offers value to readers. These factors show that you offer an authoritative resource, making you appealing to Google.

B) Find the best opportunities to explore

It is important to find opportunities where you have a reasonable chance of gaining an answer box. Since only one site can have it at a time, you need to have the domain authority and response needed to make your page stand out. SEO software can be an enormous asset in this quest.

You can research which keywords have high traffic and which ones already have Quick Answers. If the keyword already has a Quick Answer, you will need to investigate the page to see if you can outperform it.

If it does not, then you can see if an answer box would be the optimal display for the user. Make sure that the pages you select to optimize for the Quick Answers will lend themselves easily to you fulfilling the four key factors.

C) Optimize your site for the answer box

On-page optimization: you will need to follow on-page optimization best practices to improve the ranking of your site. These will include using your target keyword in titles and headings, linking to other pages in your site, and making your page more engaging with images and other rich media.

Remember that Google wants to be able to pull the answer quickly from your text, so include the answer to the target question in the first paragraph and use lists and bullets – which are appealing both for users and search engines – where possible.

Off-page optimization: you want to focus on cultivating backlinks, so look for opportunities to write guest posts to bring links to your site. It is also important to develop a thorough content distribution system that will attract attention to your content.

When people are exposed to your content and it provides them with value, they become more likely to share it with others and link back to it themselves. For off-page optimization, you also want to submit pages to Google Search Console to maximize visibility.

Technical optimization: use schema markup to increase visibility for your site. Schema was developed as a means of providing search engines with an optimal look at your site. It will help the search engines quickly interpret your material, which will aid Google in its quest to quickly pull answers from websites.
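As a concrete illustration, structured data is typically embedded as JSON-LD in a script tag on the page. The following is a minimal sketch of a schema.org FAQPage block built in Python; the question and answer text are hypothetical placeholders, not examples from the article:

```python
import json

# Minimal sketch of JSON-LD structured data for a question/answer page,
# using schema.org's FAQPage / Question / Answer types. The text below
# is a hypothetical placeholder.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "What is HTTPS?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "HTTPS adds a layer of security by encrypting traffic with SSL/TLS.",
        },
    }],
}

# The serialized result is what you would place inside a
# <script type="application/ld+json"> tag in the page's HTML.
print(json.dumps(faq_schema, indent=2))
```

Markup like this gives Google an unambiguous, machine-readable version of the question and its answer, rather than forcing it to infer the structure from your prose.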

Of course, your pages should also be optimized for mobile, since not being mobile-friendly can hurt sites in the SERPs and hinder the user experience. Also include your page in your XML sitemap to ensure that Google can easily find and interpret the material.

Google Quick Answers offer users an improved user experience, making them popular with the search engine. To remain relevant for customers, you need to follow these ABCs and ensure your site is optimized to provide the answer box for the key terms that are important for your business. For more information, check out our Quick Answers pdf guide.

Source:  https://searchenginewatch.com/2016/07/05/the-abc-of-google-quick-answers/


The world has changed since Biblical times, but only so much. (For starters, we’re back to using tablets.)

Before, there was one Goliath and one David. Now, there are a few “Googles” and a few “Dropboxes” taking them on. But if not a sling and a handful of smooth stones, what sets the successful challengers apart from the rest?

It’s tempting to point to the founders themselves, and it would be accurate to do so. But even if an entrepreneur brings the right levels of passion, precision, talent, and temerity, these characteristics count for little unless they’re put to work executing the right business model.

Though the iterative process of developing a business model is standard practice these days, undertaking the effort does not guarantee success. Far from it. We often see evidence of this when a well-funded startup meets its demise or a former Goliath like Yahoo goes into decline – just as we did in the wreckage littered across the tech landscape after the dot-com bubble burst.

We might not be in the midst of a bursting bubble at the moment, but a shift in the business cycle is clearly underway.


One recent survey of venture-backed entrepreneurs found that 80 percent of them got what they wanted – or more – during their most recent round of funding. Yet going forward, more than 97 percent of them expected that this year, it will either get harder to raise capital (66 percent) or remain the same (31 percent).

These entrepreneurs are onto something: the National Venture Capital Association reported that VCs raised 11 percent less in the first quarter of 2016 than they did in Q1 2015.

So in addition to all the usual pressures founders find themselves under, they’re now entering an environment where capital is harder to come by. The wisest among them will be closely examining how they plan to succeed in it – especially when the “Googles” of the world will continue to have the upper hand in both human and financial resources.

A possible answer is to change the business model. Another will be to prove that the current model can expand profit margins and sales volumes, without relying on costly marketing plans to expand the customer base.


Neither option presents an easy task, and both require a shift in mindset – away from doing what is necessary to get to the next funding round and toward doing what is necessary to build an enduring business.

Setting aside the specifics of the product or service, or the industry, two of the most important steps in the process of developing and executing a business model are selecting customers and doing what is required to keep them – especially when disaster strikes. (It always does.)

For companies serving enterprise customers, there are two ways to become a big fish: Go after other big fish or go after lots of small fish.

Many entrepreneurs overlook the latter, when they should have learned a lesson from blue whales. At a hundred feet long, earth’s largest animal survives on a daily diet of tens of millions of krill that each measure less than an inch. It’s less like B-to-B and more like B-to-billions.

Even though there aren’t billions of bite-sized businesses to go after, in the United States alone the small business market is massive. They number 28 million, employ more than 56 million, and generate 46 percent of nonfarm GDP. Meanwhile, they have more room to grow and less bureaucracy to deal with.

What makes small businesses so compelling – especially in the face of headwinds like a deflating bubble – is that by virtue of their numbers alone, they present less risk and potentially more reward.

I’ve seen too many ambitious emerging companies successfully recruit a colossal enterprise customer, only to be crushed by customization requests, service demands, and SOPs. But I’ve also been fortunate to work with a few entrepreneurs who got it right.

One of our portfolio companies, AdRoll, found that building their ad retargeting platform with small business customers in mind gave them the insight and resilience to withstand their toughest challenge yet: Google entering the market with a competing product.

CEO Aaron Bell put it this way in Foundation Capital’s latest Start-Up Story: “I had always heard stories of how Google just makes a little move and wipes out a hundred companies. So I had this preconceived fear of Google knocking us out.”

As intimidating as it can be for a new company to make inroads into a new market only to see a huge player follow on, Aaron identified that Google had done them a favor by validating their platform. He focused his team on creating value for customers in a way that Google couldn’t – iterating quickly to meet the needs of new customers that Google had brought to the market, while creating tailored solutions for customers they were keen to keep.

Staying calm and staying the course in that tough moment certainly has paid off in the long run. Today, AdRoll serves more than 20,000 brands in 150 countries – seizing the mantle of “the world’s most widely used retargeting platform.”

Just as Dropbox continued to grow in the years after the launch of Google Drive, AdRoll is succeeding despite competition from Google in the same way that David beat Goliath: Not through brute strength, but with the best strategy. It goes to show that when it comes to the size of your customers’ businesses, it’s possible to think small on your way to growing big.

Source:  https://techcrunch.com/2016/06/21/to-compete-with-google-think-small/

