As always, when Google releases a new update to its search algorithm, it’s an exciting (and potentially scary) time for SEO. Google’s latest update, BERT, represents the biggest alteration to its search algorithm in the last five years.

So, what does BERT do?

Google says the BERT update means its search algorithm will have an easier time comprehending conversational nuances in a user’s query.

The best example of this is statements where prepositional words such as ‘to’ and ‘for’ inform the intent of the query.

BERT stands for Bidirectional Encoder Representations from Transformers, which is a language processing technique based on neural networking principles.
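
As a rough sketch of what that means in practice, the snippet below runs a query through the open-source bert-base-uncased model via the Hugging Face transformers library (an assumption for illustration only; Google's production Search system is not public). The point is that every word, including a small word like 'for', receives a vector that depends on all the words around it:

```python
# Illustration only: this uses the open-source bert-base-uncased model from the
# Hugging Face transformers library, not Google's production Search system.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

query = "math practice books for adults"
inputs = tokenizer(query, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Each token, including a small word like "for", gets a 768-dimensional vector
# that depends on every other word in the query (the "bidirectional" part).
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, vector in zip(tokens, outputs.last_hidden_state[0]):
    print(f"{token:12s} -> contextual vector of size {vector.shape[0]}")
```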

Google estimates the update will impact about 10% of United States-based queries and has revealed BERT can already be seen in action on featured snippets around the world.

How does Google BERT affect on-page SEO?

SEO practitioners can breathe a collective sigh of relief, because the Google BERT update is not designed to penalise websites; rather, it only improves the way the search engine understands and interprets search queries.

However, because the search algorithm is better at understanding nuances in language, it means websites with higher-quality written content are going to be more discoverable.

Websites that have a lot of detailed ‘how-to’ guides and other in-depth content designed to benefit users are going to get the most from Google BERT. This means businesses that aren’t implementing a thorough content strategy are likely to fall behind the curve.

Basically, the BERT update follows Google’s long-running trend of trying to improve the ability of its search algorithm to accurately serve conversational search queries.

The ultimate result of this trend is users being able to perform detailed search queries with the Google voice assistant as if they were speaking to a real person.

Previous algorithm updates

While BERT may be the first major change to Google search in five years, it’s not the biggest shakeup in the company’s history.

The prior Google PANDA and Google PENGUIN updates were both significant and caused a large number of websites to become penalised due to the use of SEO strategies that were considered ‘spammy’ or unfriendly to users.

PANDA

Google PANDA was developed in response to user complaints about ‘content farms’.

Basically, Google’s algorithm was rewarding quantity over quality, meaning there was a business incentive for websites to pump out lots of cheaply acquired content for the purposes of serving ads next to or even within them.

The PANDA update most noticeably affected link building or ‘article marketing’ strategies where low-quality content was published to content farms with a link to a business’ website attached to a keyword repeated throughout the article.

It meant that there was a significant push towards more ethical content marketing strategies, such as guest posting.

PENGUIN

Google PENGUIN is commonly seen as a follow-up to the work started by PANDA, targeting spammy link-building practices and ‘black-hat’ SEO techniques.

This update was focused primarily on the way the algorithm evaluates the authority of links as well as the sincerity of their implementation in website content. Spammy or manipulative links now carried less weight. 

However, this meant that if another website posted a link to yours in a spammy or manipulative way, it would negatively affect your search rankings.

This meant that webmasters and SEO-focused businesses needed to make use of the disavow tool to tell Google which inbound links they approve of and which they don’t.
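
For context, the disavow tool accepts a plain text file listing the links you want Google to ignore; the domains and URLs below are made up purely for illustration:

```
# Lines beginning with "#" are comments.

# Disavow every link from an entire domain (hypothetical example):
domain:spammy-article-farm.example.com

# Disavow a single linking page (hypothetical example):
http://low-quality-directory.example.org/widgets/page1.html
```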

[Source: This article was published in smartcompany.com.au By LUCAS BIKOWSKI - Uploaded by the Association Member: Bridget Miller]


Don't try to optimize for BERT, try to optimize your content for humans.

Google introduced the BERT update to its Search ranking system last week. The addition of this new algorithm, designed to better understand what’s important in natural language queries, is a significant change. Google said it impacts 1 in 10 queries. Yet, many SEOs and many of the tracking tools did not notice massive changes in the Google search results while this algorithm rolled out in Search over the last week.

The question is, Why?

The short answer. This BERT update really was around understanding “longer, more conversational queries,” Google wrote in its blog post. The tracking tools, such as Mozcast and others, primarily track shorter queries. That means BERT’s impact is less likely to be visible to these tools.

And for site owners, when you look at your rankings, you’re likely not tracking a lot of long-tail queries. You track queries that send higher volumes of traffic to your website, and those tend to be short-tail queries.

Moz on BERT. Pete Meyers of Moz said the MozCast tool tracks shorter head terms and not the types of phrases that are likely to require the natural language processing (NLP) of BERT.

[Embedded tweet from Dr. Pete Meyers of Moz]

RankRanger on BERT. The folks at RankRanger, another toolset provider, told me something similar. “Overall, we have not seen a real ‘impact’ — just a few days of slightly increased rank fluctuations,” the company said. Again, this is likely due to the dataset these companies track — short-tail keywords over long-tail keywords.

Overall tracking tools on BERT. If you look at the tracking tools, virtually all of them showed a smaller level of fluctuation on the days BERT was rolling out compared to what they have shown for past Google algorithm updates such as core search algorithm updates, or the Panda and Penguin updates.

Here are screenshots of the tools over the past week. If BERT had caused major ranking shifts, you would expect to see significant spikes in these charts, but the tools do not show that:

[Screenshots of rank-tracking charts from Mozcast, SERPmetrics, Algoroo, Advanced Web Ranking, AccuRanker, RankRanger and SEMrush]

SEO community on BERT. When it comes to individuals picking up on changes to their rankings in Google search, that also was not as large as a Google core update. We did notice chatter throughout the week, but that chatter within the SEO community was not as loud as is typical with other Google updates.

Why we care. We are seeing a lot of folks asking about how they can improve their sites now that BERT is out in the wild. That’s not the way to think about BERT. Google has already stated there is no real way to optimize for it. Its function is to help Google better understand searchers’ intent when they search in natural language. The upside for SEOs and content creators is they can be less concerned about “writing for the machines.” Focus on writing great content — for real people.

Danny Sullivan from Google said again, you cannot really optimize for BERT:

[Embedded tweet from Danny Sullivan]

Continue with your strategy to write the best content for your users. Don’t do anything special for BERT, but rather, be special for your users. If you are writing for people, you are already “optimizing” for Google’s BERT algorithm.

[Source: This article was published in searchengineland.com By Barry Schwartz - Uploaded by the Association Member: Joshua Simon]


Google said it is making the biggest change to its search algorithm in the past five years that, if successful, users might not be able to detect.

The search giant on Friday announced a tweak to the software underlying its vaunted search engine that is meant to better interpret queries when written in sentence form. Whereas prior versions of the search engine may have overlooked words such as “can” and “to,” the new software is able to help evaluate whether those change the intent of a search, Google has said. Put a bit more simply, it is a way of understanding search terms in relation to each other and it looks at them as an entire phrase, rather than as just a bucket of words, the company said. Google is calling the new software BERT, after a research paper published last year by Google executives describing a form of language processing known as Bidirectional Encoder Representations from Transformers.

While Google is constantly tweaking its algorithm, BERT could affect as many as 10 percent of English language searches, said Pandu Nayak, vice president of search, at a media event. Understanding queries correctly so Google returns the best result on the first try is essential to Google’s transformation from a list of links to determining the right answer without having to even click through to another site. The challenge will increase as queries increasingly move from text to voice-controlled technology.

But even big changes aren’t likely to register with the masses, he conceded.

“Most ranking changes the average person does not notice, other than the sucking feeling that their searches were better,” said Nayak.

“You don’t have the comparison of what didn’t work yesterday and what does work today,” said Ben Gomes, senior vice president of search.

BERT, said Nayak, may be able to determine that a phrase such as “math practice books for adults” likely means the user wants to find math books that adults can use, because of the importance of the word “for.” A prior version of the search engine displayed a book result targeted for “young adults,” according to a demonstration he gave.

Google is rolling out the new algorithm to U.S. users in the coming weeks, the company said. It will later offer it to other countries, though it didn’t offer specifics on timing.

The changes suggest that even after 20 years of data collection and Google’s dominance of search — with about 90 percent market share — web searches may best be thought of as equal parts art and science. Nayak pointed to examples like searches for how to park a car on a hill with no curb or whether a Brazilian needs a visa to travel to the United States as yielding less than satisfactory results without the aid of the BERT software.

To test BERT, Google turned to its thousands of contract workers known as “raters,” Nayak said, who compared results from search queries with and without the software. Over time, the software learns when it needs to read entire phrases versus just keywords. About 15 percent of the billions of searches conducted each day are new, Google said.

Google said it also considers other input, such as whether a user tries rephrasing a search term rather than initially clicking on one of the first couple of links.

Nayak and Gomes said they didn’t know whether BERT would be used to improve advertising sales that are related to search terms. Advertising accounts for the vast majority of Google’s revenue.

[Source: This article was published in unionleader.com By Greg Bensinger - Uploaded by the Association Member: Jeremy Frink]


A Boolean search, in the context of a search engine, is a type of search where you can use special words or symbols to limit, widen, or define your search.

This is possible through Boolean operators such as AND, OR, NOT, and NEAR, as well as the symbols + (add) and - (subtract).

When you include an operator in a Boolean search, you're either introducing flexibility to get a wider range of results, or you're defining limitations to reduce the number of unrelated results.

Most popular search engines support Boolean operators, but the simple search tool you'll find on a website probably doesn't.

Boolean Meaning

George Boole, an English mathematician from the 19th century, developed an algebraic method that he first described in his 1847 book, The Mathematical Analysis of Logic, and expounded upon in An Investigation of the Laws of Thought (1854).

Boolean algebra is fundamental to modern computing, and all major programming languages include it. It also figures heavily in statistical methods and set theory.

Today's database searches are largely based on Boolean logic, which allows us to specify parameters in detail—for example, combining terms to include while excluding others. Given that the internet is akin to a vast collection of information databases, Boolean concepts apply here as well.

Boolean Search Operators

For the purposes of a Boolean web search, these are the terms and symbols you need to know:

  • AND (symbol: +): all words must be present in the results. Example: football AND nfl
  • OR: results can include any of the words. Example: paleo OR primal
  • NOT (symbol: -): results include everything but the term that follows the operator. Example: diet NOT vegan
  • NEAR: the search terms must appear within a certain number of words of each other. Example: swedish NEAR minister

Note: Most search engines default to using the OR Boolean operator, meaning that you can type a bunch of words and it will search for any of them, but not necessarily all of them.

Tips: Not all search engines support these Boolean operators. For example, Google understands - but doesn't support NOT. Learn more about Boolean searches on Google for help.
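
To make the logic of the operators above concrete, here is a minimal Python sketch (not how any real search engine is built) that treats them as set operations over a handful of made-up documents:

```python
# A toy illustration of Boolean operators as set operations over a tiny
# document collection. Real search engines use inverted indexes and far more
# sophisticated matching; the documents here are made up.
documents = {
    1: "nfl football scores and highlights",
    2: "college football recruiting news",
    3: "paleo diet recipes without sugar",
    4: "vegan diet meal plans",
}

def matching(term):
    """Return the set of document ids whose text contains the term."""
    return {doc_id for doc_id, text in documents.items() if term in text.split()}

# AND (+): both terms must be present -> set intersection
print(matching("football") & matching("nfl"))   # {1}

# OR: either term may be present -> set union
print(matching("paleo") | matching("vegan"))    # {3, 4}

# NOT (-): exclude documents containing the term -> set difference
print(matching("diet") - matching("vegan"))     # {3}
```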

Why Boolean Searches Are Helpful

When you perform a regular search, such as dog if you’re looking for pictures of dogs, you’ll get a massive number of results. A Boolean search would be beneficial here if you’re looking for a specific dog breed or if you’re not interested in seeing pictures of a specific type of dog.

Instead of just sifting through all the dog pictures, you could use the NOT operator to exclude pictures of poodles or boxers.

A Boolean search is particularly helpful after running an initial search. For instance, if you run a search that returns lots of results that pertain to the words you entered but don't actually reflect what you were looking for, you can start introducing Boolean operators to remove some of those results and explicitly add specific words.

To return to the dog example, consider this: you see lots of random dog pictures, so you add +park to see dogs in parks. But then you want to remove the results that have water, so you add -water. Immediately, you've cut down likely millions of results.

More Boolean Search Examples

Below are some more examples of Boolean operators. Remember that you can combine them and utilize other advanced search options such as quotes to define phrases.

AND

free AND games

Helps find free games by including both words.

"video chat app" iOS AND Windows

Searches for video chat apps that can run on both Windows and iOS devices.

OR

"open houses" saturday OR sunday

Locates open houses that are open on either day.

"best web browser" macOS OR Mac

If you're not sure how the article might be worded, you can try a search like this to cover both words.

NOT

2019 movies -horror

Finds movies mentioning 2019, but excludes all pages that have the word horror.

"paleo recipes" -sugar

Locates web pages about paleo recipes but ensures that none of them include the word sugar.

Note: Boolean operators need to be in all uppercase letters for the search engine to understand them as an operator and not a regular word.

[Source: This article was published in lifewire.com By Tim Fisher - Uploaded by the Association Member: Jason bourne] 


[Source: This article was published in nakedsecurity.sophos.com By Mark Stockley - Uploaded by the Association Member: Deborah Tannen]

The history of computing features a succession of organisations that looked, for a while at least, as if they were so deeply embedded in our lives that we’d never do without them.

IBM looked like that, and Microsoft did too. More recently it’s been Google and Facebook.

Sometimes they look unassailable because, in the narrow territory they occupy, they are.

When they do fall, it isn’t because somebody storms that territory; they fall because the ground beneath them shifts.

For years and years Linux enthusiasts proclaimed “this will be the year that Linux finally competes with Windows on the desktop!”, and every year it wasn’t.

But Linux, under the brand name Android, eventually smoked Microsoft when ‘Desktop’ gave way to ‘Mobile’.

Google has been the 800-pound gorilla of web search since the late 1990s, and all attempts to out-Google it have failed. Its market share is rock solid and it has seen off all challengers, from lumbering tech leviathans to nimble and disruptive startups.

Google will not cede its territory to a Google clone but it might one day find that its territory is not what it was.

The web is getting deeper and darker and Google, Bing and Yahoo don’t actually search most of it.

They don’t search the sites on anonymous, encrypted networks like Tor and I2P (the so-called Dark Web) and they don’t search the sites that have either asked to be ignored or that can’t be found by following links from other websites (the vast, virtual wasteland known as the Deep Web).

The big search engines don’t ignore the Deep Web because there’s some impenetrable technical barrier that prevents them from indexing it – they do it because they’re commercial entities and the costs and benefits of searching beyond their current horizons don’t stack up.

That’s fine for most of us, most of the time, but it means that there are a lot of sites that go un-indexed and lots of searches that the current crop of engines are very bad at.

That’s why the US’s Defence Advanced Research Projects Agency (DARPA) invented a search engine for the deep web called Memex.

Memex is designed to go beyond the one-size-fits-all approach of Google and deliver the domain-specific searches that are the very best solution for narrow interests.

In its first year it’s been tackling the problems of human trafficking and slavery – things that, according to DARPA, have a significant presence beyond the gaze of commercial search engines.

When we first reported on Memex in February, we knew that it would have potential far beyond that. What we didn’t know was that parts of it would become available more widely, to the likes of you and me.

A lot of the project is still somewhat murky and most of the 17 technology partners involved are still unnamed, but the plan seems to be to lift the veil, at least partially, over the next two years, starting this Friday.

That’s when an initial tranche of Memex components, including software from a team called Hyperion Gray, will be listed on DARPA’s Open Catalog.

The Hyperion Gray team described their work to Forbes as:

Advanced web crawling and scraping technologies, with a dose of Artificial Intelligence and machine learning, with the goal of being able to retrieve virtually any content on the internet in an automated way.

Eventually our system will be like an army of robot interns that can find stuff for you on the web, while you do important things like watch cat videos.
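
To give a sense of what “web crawling and scraping” involves in practice, here is a toy Python sketch, not Memex’s or Hyperion Gray’s actual code: it fetches a page with the requests library, extracts its links with BeautifulSoup and follows them breadth-first up to a small limit. The seed URL is a placeholder:

```python
# A minimal crawler sketch for illustration only. It fetches a page, pulls out
# the links, and follows them breadth-first up to a small page limit.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed_url, max_pages=10):
    seen, queue = set(), deque([seed_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue  # skip pages that fail to load
        for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            queue.append(urljoin(url, link["href"]))
    return seen

print(crawl("https://example.com"))
```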

More components will follow in December and, by the time the project wraps, a “general purpose technology” will be available.

Memex and Google don’t overlap much: they solve different problems, they serve different needs and they’re funded in very different ways.

But so were Linux and Microsoft.

The tools that DARPA releases at the end of the project probably won’t be a direct competitor to Google but I expect they will be mature and better suited to certain government and business applications than Google is.

That might not matter to Google but there are three reasons why Memex might catch its eye.

The first is not news but it’s true none the less – the web is changing and so is internet use.

When Google started there was no Snapchat, Bitcoin or Facebook. Nobody cared about the Deep Web because it was hard enough to find the things you actually wanted and nobody cared about the Dark Web (remember FreeNet?) because nobody knew what it was for.

The second is this statement made by Christopher White, the man heading up the Memex team at DARPA, who’s clearly thinking big:

The problem we're trying to address is that currently access to web content is mediated by a few very large commercial search engines - Google, Microsoft Bing, Yahoo - and essentially it's a one-size fits all interface...

We've started with one domain, the human trafficking domain ... In the end we want it to be useful for any domain of interest.

That's our ambitious goal: to enable a new kind of search engine, a new way to access public web content

And the third is what we’ve just discovered – Memex isn’t just for spooks and G-Men, it’s for the rest of us to use and, more importantly, to play with.

It’s one thing to use software and quite another to be able to change it. The beauty of open-source software is that people are free to take it in new directions – just like Google did when it picked up Linux and turned it into Android.


[Source: This article was published in seroundtable.com By Barry Schwartz - Uploaded by the Association Member: Bridget Miller]

Google's John Mueller said it again: do not worry about words or keywords in URLs. John responded to a recent question on Twitter saying "I wouldn't worry about keywords or words in a URL. In many cases, URLs aren't seen by users anyway."

[Embedded tweet from John Mueller]

It references the video from Matt Cutts back in 2009, in which he says keywords in URLs play a small role in rankings, but a really small one.

In 2017, John Mueller said keywords in URLs are overrated, and back in 2016 he called them a small ranking factor.

Forum discussion at Twitter.


[Source: This article was published in searchengineland.com By Awario - Uploaded by the Association Member: Robert Hensonw]

Boom! Someone just posted a tweet praising your product. On the other side of the world, an article featuring your company among the most promising startups of 2019 was published. Elsewhere, a Reddit user started a thread complaining about your customer care. A thousand miles away, a competitor posted an announcement about a new product they are building. 

What if you (and everyone on your team, from Social Media to PR to Product to Marketing) could have access to that data in real time?

That’s exactly where social listening steps in.

What is social media listening?

Social listening is the process of tracking mentions of certain words, phrases, or even complex queries across social media and the web, followed by an analysis of the data.

A typical word to track would be a brand name, but the possibilities of social media monitoring go way beyond that: you can monitor mentions of your competitors, industry, campaign hashtags, and even search for people who’re looking for office space in Seattle if that’s what you’re after.

Despite its name, social listening isn’t just about social media: many listening tools also monitor news websites, blogs, forums, and the rest of the web.

But that’s not the only reason why the concept can be confusing. Social listening goes by many different names: buzz analysis, social media measurement, brand monitoring, social media intelligence… and, last but not least, social media monitoring. And while these terms don’t exactly mean the same thing, you’ll often see them used interchangeably today.

The benefits of social listening

The exciting thing about social media listening is that it gives you access to invaluable insights on your customers, market, and competition: think of it as getting answers to questions that matter to your business, but without having to ask the actual questions.

There’s an infinite number of ways you can use this social media data; here are just a few obvious ones.

1. Reputation management.

A sentiment graph showcasing a reputation crisis. Screenshot from Awario.

This is one of the most common reasons companies use social listening. Businesses monitor mentions of their brand and products to track brand health and react to changes in the volume of mentions and sentiment early to prevent reputation crises.

2. Competitor analysis.

Social media share of voice for the airlines. Screenshot from the Aviation Industry 2019 report.

Social media monitoring tools empower you with an ability to track what’s being said about your competition on social networks, in the media, on forums and discussion boards, etc. 

This kind of intelligence is useful at every step of competitor analysis: from measuring Share of Voice and brand health metrics so you can benchmark them against your own, to learning what your rivals’ customers love and hate about their products (so you can improve yours), to discovering the influencers and publishers they partner with… The list goes on. For more ways to use social media monitoring for competitive intelligence, this thorough guide to competitor analysis comes heavily recommended.
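
Share of Voice itself is simple arithmetic: each brand’s portion of all tracked mentions. Here is a minimal sketch with made-up numbers:

```python
# Share of voice is each brand's slice of all tracked mentions.
# The mention counts below are made up for illustration.
mentions = {"Your Airline": 1200, "Competitor A": 2600, "Competitor B": 900}

total = sum(mentions.values())
for brand, count in mentions.items():
    print(f"{brand}: {count / total:.1%} share of voice")
```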

3. Product feedback.

The topic cloud for Slack after its logo redesign. Screenshot from Awario.

By tracking what your clients are saying about your product online and monitoring key topics and sentiment, you can learn how they react to product changes, what they love about your product, and what they believe is missing from it. 

As a side perk, this kind of consumer intelligence will also let you learn more about your audience. By understanding their needs better and learning to speak their language, you’ll be able to improve your ad and website copy and enhance your messaging so that it resonates with your customers.

4. Customer service.

Recent tweets mentioning British Airways. Screenshot from Awario.

Let’s talk numbers.

Fewer than 30% of social media mentions of brands include their handle — that means that by not using a social listening tool you’re ignoring more than 70% of the conversations about your business. Given that 60% of consumers expect brands to respond within an hour and 68% of customers leave a company because of its unhelpful (or non-existent) customer service, not reacting to those conversations can cost your business actual money.

5. Lead generation.

Social media leads for smartwatch manufacturers. Screenshot from Awario.

While lead generation isn’t the primary use case for most social listening apps, some offer social selling add-ons that let you find potential customers on social media. For the nerdy, Boolean search is an extremely flexible way to search for prospects: it’s an advanced way to search for mentions that uses Boolean logic to let you create complex queries for any use case. Say, if you’re a NYC-based insurance company, you may want to set up Boolean alerts to look for people who’re about to move to New York so that you can reach out before they’re actually thinking about insurance. Neat, huh? 
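
Query syntax varies from tool to tool, but a Boolean alert for that hypothetical insurance example might look something like this (an illustration of the idea only, not any particular tool’s exact syntax):

```
("moving to New York" OR "moving to NYC" OR "relocating to Manhattan")
AND (apartment OR lease OR "new place")
NOT (hiring OR "job opening")
```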

6. PR.

Most influential news articles about KLM. Screenshot from Awario.

Social listening can help PR teams in more than one way. First, it lets you monitor when press releases and articles mentioning your company get published. Second, PR professionals can track mentions of competitors and industry keywords across the online media to find new platforms to get coverage on and journalists to partner with.

7. Influencer marketing.

Top influencers for Mixpanel. Screenshot from Awario.

Most social media monitoring tools will show you the impact, or reach, of your brand mentions. From there, you can find who your most influential brand advocates are. If you’re looking to find new influencers to partner with, all you need to do is create a social listening alert for your industry and see who the most influential people in your niche are. Lastly, make sure to take note of your competitors’ influencers — they will likely turn out to be a good fit for your brand as well.

8. Research.

Analytics for mentions of Brexit over the last month. Screenshot from Awario.

Social listening isn’t just for brands — it also lets you monitor what people are saying about any phenomenon online. Whether you’re a journalist writing an article on Brexit, a charity looking to evaluate the volume of conversations around a social cause, or an entrepreneur looking to start a business and doing market research, social listening software can help.

 

3 best social media listening tools

Now that we’re clear on the benefits of social media monitoring, let’s see what the best apps for social listening are. Here are our top 3 picks for every budget and company size.

1. Awario

Awario is a powerful social listening and analytics tool. With real-time search, a Boolean search mode, and extensive analytics, it’s one of the most popular choices for companies of any size.

Awario offers the best value for your buck. With it, you’ll get over 1,000 mentions for $1 — an amazing offer compared to similar tools. 

Key features: Boolean search, Sentiment Analysis, Topic clouds, real-time search.

Supported platforms: Facebook, Instagram, Twitter, YouTube, Reddit, news and blogs, the web.

Free trial: Try Awario free for 7 days by signing up here.

Pricing: Pricing starts at $29/mo for the Starter plan with 3 topics to monitor and 30,000 mentions/mo. The Pro plan ($89/mo) includes 15 topics and 150,000 mentions. Enterprise is $299/mo and comes with 50 topics and 500,000 mentions. If you choose to go with an annual option, you’ll get 2 months for free. 

2. Tweetdeck

TweetDeck is a handy (and free) tool to manage your brand’s presence on Twitter. It lets you schedule tweets, manage several Twitter accounts, reply to DMs, and monitor mentions of anything across the platform — all in a very user-friendly, customizable dashboard. 

For social media monitoring, TweetDeck offers several powerful ways to search for mentions on Twitter with a variety of filters for you to use. You can then engage with the tweets without leaving the app. 

TweetDeck is mostly used for immediate engagement — the tool doesn’t offer any kind of analytics.

Key features: User-friendly layout, ability to schedule tweets, powerful search filters.

Supported platforms: Twitter.

Free trial: N/A

Pricing: Free.

3. Brandwatch

Brandwatch is an extremely robust social media intelligence tool. It doesn’t just let you monitor brand mentions on social: the tool comes with image recognition, API access, and customizable dashboards that cover just about any social listening metric you can think of. 

Brandwatch’s other product, Vizia, offers a way to visualize your social listening data and even combine it with insights from a number of other sources, including Google Analytics.

Key features: Powerful analytics, exportable visualizations, image recognition.

Supported platforms: Facebook, Twitter, Instagram, YouTube, Pinterest, Sina Weibo, VK, QQ, news and blogs, the web.

Free trial: No.

Pricing: Brandwatch is an Enterprise-level tool. Their most affordable Pro plan is offered at $800/month with 10,000 monthly mentions. Custom plans are available upon request.

Before you go

Social media is an invaluable source of insights and trends in consumer behavior, but remember: social listening doesn’t end with the insights. It’s a continuous learning process — the end goal of which should be serving the customer better.


[Source: This article was published in searchenginejournal.com By Pratik Dholakiya - Uploaded by the Association Member: Barbara larson] 

Important changes are happening at Google and, in a world where marketing and algorithms intersect, those changes are largely happening under the radar.

The future of search looks like it will have considerably less search in it, and this isn’t just about the end of the 10 blue links, but about much more fundamental changes.

Let’s talk about some of those changes now, and what they mean for SEO.

Google Discover

Google Discover is a content recommendation engine that suggests content from across the web based on a user’s search history and behavior.

Discover isn’t completely new (it was introduced in December of 2016 as Google Feed). But Google made an important change in late October (announced in September) when they added it to the Google homepage.

The revamp and rebranding to Discover added features like:

  • Topic headers to categorize feed results.
  • More images and videos.
  • Evergreen content, as opposed to just fresh content.
  • A toggle to tell Google if you want more or less content similar to a recommendation.
  • Google claims the recommendations are personalized to your level of expertise with a topic.

Google Discover hardly feels revolutionary at first. In fact, it feels overdue.

Our social media feeds are already dominated by content recommendation engines, and the YouTube content recommendation engine is responsible for 70% of the time spent on the site.

But Discover could have massive implications for the future of how users interact with the content of the web.

While it’s unlikely Discover will ever reach the 70% level of YouTube’s content recommendation engine, if it swallows even a relatively small portion of Google search, say 10%, no SEO strategy will be complete without a tactic for earning that kind of traffic, especially since it will allow businesses to reach potential customers who aren’t even searching for the relevant terms yet.

Google Assistant

For most users, Google Assistant is a quiet and largely invisible revolution.

Its introduction to Android devices in February 2017 likely left most users feeling like it was little more than an upgraded Google Now, and in a sense that’s exactly what it is.

But as Google Assistant grows, it will increasingly influence how users interact with the web and decrease reliance on search.

Like its predecessor, Assistant can:

  • Search the web.
  • Schedule events and alarms.
  • Show Google account info.
  • Adjust device settings.

But the crucial difference is its ability to engage in two-way conversations, allowing users to get answers from the system without ever even looking at a search result.

An incredibly important change for the future of business and the web is the introduction of Google Express, the capability to add products to a shopping cart and order them entirely through Assistant.

But this feature is limited to businesses that are explicitly partnered with Google Express, an incredibly dramatic change from the Google search engine and its crawling of the open web.

Assistant can also identify what some images are. Google Duplex, an upcoming feature, will also allow Assistant to call businesses to schedule appointments and other similar actions on the user’s behalf.

The more users rely on Assistant, the less they will rely on Google search results, and the more businesses who hope to adapt will need to think of other ways to:

  • Leverage Assistant’s algorithms and other emerging technologies to fill in the gaps.
  • Adjust their SEO strategies to target the kind of behavior that is exclusive to search and search alone.

Google’s Declaration of a New Direction

Around the time of its 20th anniversary, Google announced that its search product was closing an old chapter and opening a new one, with important new driving principles added.

They started by clarifying that these old principles wouldn’t be going away:

  • Focusing on serving the user’s information needs.
  • Providing the most relevant, high-quality information as quickly as possible.
  • Using an algorithmic approach.
  • Rigorously testing every change, including using quality rating guidelines to define search goals.

This means you should continue:

  • Putting the user first.
  • Being accurate and relevant.
  • Having some knowledge of algorithms.
  • Meeting Google’s quality rating guidelines.

But the following principles represent a dramatically new direction for Google Search:

Shifting from Answers to Journeys

Google is adding new features that will allow users to “pick up where they left off,” shifting the focus away from short-term answers to bigger, ongoing projects.

This currently already includes activity cards featuring previous pages visited and queries searched, the ability to add content to collections, and tabs that suggest what to learn about next, personalized to the user’s search history.

A new Topic layer has also been added to the Knowledge Graph, allowing Google to surface evergreen content suggestions for users interested in a particular topic.

Perhaps the most important change to watch carefully, Google is looking for ways to help users who don’t even make a search query.

Google Discover is central to this effort and the inclusion of evergreen content, not just fresh content, represents an important change in how Google is thinking about the feed. This means more and more traditional search content will become feed content instead.

Shifting from Text to Visual Representation

Google is making important changes in the way information is presented by adding new visual capabilities.

They are introducing algorithmically generated AMP Stories, video compilations with relevant caption text like age and notable events in a person’s life.

New featured videos have been added to the search, designed to offer an overview on topics you are interested in.

Image search has also been updated so that images featured on pages with relevant content take priority and pages where the image is central to the content rank better. Captions and suggested searches have been added as well.

Finally, Google Lens allows you to perform a visual search based on objects that Google’s AI can detect in the image.

These changes to search are slipping under the radar somewhat for now, since user behavior rarely changes overnight.

But the likelihood that these features and Google’s new direction will have a dramatic impact on how search works is very high.

SEOs who ignore these changes and continue operating with a 2009 mindset will find themselves losing ground to competitors.

SEO After Search

While queries will always be an important part of the way we find information online, we’re now entering a new era of search.

An era that demands we start changing the way we think about SEO soon, while we can capitalize on the changing landscape.

The situation is not unlike when Google first came on the scene in 1998, when new opportunities were on the horizon that most people at the time were unaware of and ill-prepared for.

As the technological landscape changes, we will need to alter our strategies and start thinking about questions and ideas like these in our vision for the future of our brands:

  • Less focus on queries and more focus on context appears inevitable. Where does our content fit into a user’s journey? What would they have learned before consuming it, and what will they need to know next? Note that this is much more vital than simply a shift from keywords to topics, which has been happening for a very long time already. Discovery without queries is much more fundamental and impacts our strategies in a much more profound way.
  • How much can we incorporate our lead generation funnel into that journey as it already exists, and how much can we influence that journey to push it in a different direction?
  • How can we create content and resources that users will want to bookmark and add to collections?
  • Why would Google recommend our content as a useful evergreen resource in Discover, and for what type of user?
  • Can we partner with Google on emerging products? How do we adapt when we can’t?
  • How should we incorporate AMP stories and similar visual content into our content strategy?
  • What type of content will always be exclusive to query-based search, and should we focus more or less on this type of content?
  • What types of content will Google’s AI capabilities ultimately be able to replace entirely, and on what timeline? What will Google Assistant and its successors never be able to do that only content can?
  • To what extent is it possible for SEOs to adopt a “post-content” strategy?

With the future of search having Google itself doing more of the “searching” on the user’s behalf, we will need to get more creative in our thinking.

We must recognize that surfacing content has never been Google’s priority. It has always been focused on providing information.

Bigger Than Google

The changes on the horizon also signal that the SEO industry ought to start thinking bigger than Google.

What does that mean?

It means expanding the scope of SEO from search to the broader world where algorithms and marketing intersect.

It’s time to start thinking more about how our skills apply to:

  • Content recommendation engines
  • Social media algorithms
  • Ecommerce product recommendation engines
  • Amazon’s search algorithms
  • Smart devices, smart homes, and the internet of things
  • Mobile apps
  • Augmented reality

As doors on search close, new doors open everywhere users are interacting with algorithms that connect to the web and the broader digital world.

SEO professionals should not see the decline of traditional search as a death knell for the industry.

Instead, we should look at the inexorably increasing role algorithms play in peoples’ lives as a fertile ground full of emerging possibilities.


 [Source: This article was published in fastcompany.com By DOUG AAMOTH - Uploaded by the Association Member: Jay Harris]

Whether you’re privacy-conscious or just on the hunt for the perfect grilled-cheese sandwich, the best search engine for you might not be Google.

You’d be hard-pressed to find the cross-section of living people who have searched for something on the web and who haven’t ever—never, ever, ever, not even once—used Google. But even if you are among the billions who do, it’s nice to know you have alternatives. Maybe you’re concerned about your privacy. Maybe you’re looking for something pretty specific. Maybe you’re just ready to try something new.

Well, the good news is that there are plenty of search engines to try—some very Google-like, and some going out of their way to act very un-Googley. Here are a few to check out the next time you need something.


1. DON’T QUIT COLD TURKEY

Newsflash: Google is dominant. Startpage—which bills itself as the world’s most private search engine—knows this and doesn’t try to out-Google the almighty Google. Instead, it leverages Google’s search results but strips out all the tracking, data mining, and personalized results. Your IP address isn’t recorded, none of your personal data is collected, and there’s a single cookie served up that stores your preferences (but it expires if you don’t come back for 90 days).

For private, truly non-Google search, try the venerable DuckDuckGo—which leverages hundreds of sources, including Bing and its own web crawler—or Searx, which can be customized to toggle search results on and off from more than 20 engines (including Google).


2. DO SOME GOOD

The internet has it all . . . including a search engine that plants trees. Environmentally-minded Ecosia uses servers that run on renewable energy, doesn’t track users or sell data to third parties, and uses profits from text link ads and commissions from its online store to plant trees around the world. Ecosia says that it takes about 45 searches to finance a new tree, so the more curious among us may someday be responsible for entire forests. Actual search results are powered by Microsoft’s Bing technology, and there’s a cool little personal counter that lets you know how many Ecosia searches you’ve made.

3. SEARCH WITH FRIENDS

A self-described “collaborative search engine,” SearchTeam works as its name implies. Someone in your group creates a “SearchSpace” based on a specific topic and then invites others in the group to scour the web for sites and media that further the cause. Saving happens in real-time, and there are organization and commenting features that make it easy to keep everyone in the loop. And if SearchTeam’s results aren’t quite extensive enough, you can add links manually, upload documents, and create custom posts to organize additional knowledge.


4. LET YOUR TUMMY BE YOUR GUIDE

Hungry? Picky? Yummly has you covered. This food-finding search engine catalogs more than two million recipes and lets you get very specific about what you’d like to eat, peppering you with questions and qualifiers that you can answer or skip in order to narrow down the results. My search for the perfect grilled-cheese sandwich—no veggies, five or fewer ingredients, 15 minutes or less, cheddar cheese, grilled (not pressed), and easy enough for a culinary Luddite to create—started at 7,137 recipes and ended up at a very-manageable 20 to choose from. Now I need to figure out how to work my stove.


5. LOTS AND LOTS OF LISTENABLES

Podcasts are everywhere—both figuratively in popularity and literally in that they’re scattered around all corners of the web. Confidently billing itself as “the best podcast search engine,” Listen Notes does an admirable job at corralling content, boasting more than 50 million episodes to be found across almost three-quarters of a million podcasts. You can create your own listen-later playlists for individual episodes without having to subscribe to entire podcasts, which can then be slung to your player of choice via RSS (kids, look that up—it was the bee’s knees back in the day). You can even add contributors so that you and your friends can work on the same lists.

 


[Source: This article was published in observer.com By Harmon Leon - Uploaded by the Association Member: Paul L.]

On HBO’s Silicon Valley, the Pied Piper crew’s mission is to create a decentralized internet that cuts out intermediaries like Facebook, Google, and their fictional rival, Hooli. Surely a move that would make Hooli’s megalomaniac founder Gavin Belson (also fictional) furious.

In theory, no one owns the internet. No. Not Mark Zuckerberg, not Banksy, not annoying YouTube sensation Jake Paul either. No—none of these people own the internet because no one actually owns the internet.

But in practice, a small number of companies really control how we use the internet. Sure, you can pretty much publish whatever you want and slap up a website almost instantaneously, but without Google, good luck getting folks to find your site. More than 90 percent of general web searches are handled by the singular humongous search engine—Google.

If things go sour with you and Google, the search giant could make your life very difficult, almost making it appear like you’ve been washed off the entire internet planet. Google has positioned itself as pretty much the only game in town.

Colin Pape had that problem. He’s the founder of Presearch, a decentralized search engine powered by a community of roughly 1.1 million users. Presearch uses cryptocurrency tokens as an incentive to decentralize search. The origin story: before Pape started Presearch, Google tried to squash his business (well, not exactly squash it, but simply erase it from searches).

Let’s backtrack.

In 2008, Pape founded a company called ShopCity.com. The premise was to support communities and get their local businesses online, then spread that concept to other communities in a franchise-like model. In 2011, Pape’s company launched a local version in Google’s backyard of Mountain View, California.

End of story, right? No.

“We woke up one morning in July to find out that Google had demoted almost all of our sites onto page eight of the search results,” Pape explained. Pape and his crew thought it was some sort of mistake; still, the demotion of their sites was seriously hurting the businesses they represented, as well as their company. But something seemed fishy.

Pape had read stories of businesses that had essentially been shut down by Google—or suffered serious consequences such as layoffs and bankruptcy—due to the jockeying of the search engine.

“Picture yourself as a startup that launches a pilot project in Google’s hometown,” said Pape, “and 12 months later, they launch a ‘Get Your City Online’ campaign with chambers of commerce, and then they block your sites. What would you think?”

It was hard for Pape not to assume his company had been targeted because it was easy enough for Google to simply take down sites from search results.

“We realized just how much market power Google had,” Pape recalled. “And how their lack of transparency and responsiveness was absolutely dangerous to everyone who relies on the internet to connect with their customers and community.”


Google’s current search engine model makes us passive consumers who are fed search results from a black box system into which none of us have any insight. Chris Jackson/Getty Images

 

Fortunately, Pape’s company connected with a lawyer leading a Federal Trade Commission (FTC) investigation into Google’s monopolistic practices. Through the press, they put pressure on Google to resolve its search issues.

This was the genesis for Presearch, ‘the Switzerland of Search,’ a resource dedicated to a more open internet and a level playing field.

“The vision for Presearch is to build a framework that enables many different groups to build their own search engine with curated information and be rewarded for driving usage and improving the platform,” Pape told Observer.

But why is this so important?

“Because search is how we access the amazing resources on the web,” Pape continued. “It’s how we find things that we don’t already know about. It’s an incredibly powerful position for a single entity [Google] to occupy, as it has the power to shape perceptions, shift spending and literally make or break entire economies and political campaigns, and to determine what and how we think about the world.”

You have to realize that nothing is truly free.

Sure, we use Google for everything from looking for a local pet groomer to finding Tom Arnold’s IMDB page. (There are a few other things in between.) Google isn’t allowing us to search out of the goodness of its heart. When we use Google, we’re essentially partaking in a big market research project, in which our information is being tracked, analyzed and commoditized. Basically, our profiles and search results are sold to the highest bidders. We are the product—built upon our usage. Have you taken the time to read Google’s lengthy terms of service agreement? I doubt it.

How else is Sergey Brin going to pay for his new heliport or pet llama?

Stupid free Google.

Google’s current model makes us passive consumers who are fed search results from a black box system into which none of us have any insight. Plus, all of those searches are stored, so good luck with any future political career if a hacker happens to get a hold of that information.

Presearch’s idea is to allow the community to look under the hood and actively participate in this system with the power of cryptocurrency to align participant incentives within the ecosystem to create a ground-up, community-driven alternative to Google’s monopoly.

“Every time you search, you receive a fraction of a PRE token, which is our cryptocurrency,” explained Pape. “Active community members can also receive bonuses for helping to improve the platform, and everyone who refers a new user can earn up to 25 bonus PRE.”

Tokens can be swapped for other cryptocurrencies, such as Bitcoin, used to buy advertising, sold to other advertisers or spent on merchandise via Presearch’s online platform.

Presearch’s ethos is to personalize the search engine rather than allowing analytics to be gamed against us, so users are shown what they want to see. Users can specify their preferences to access the information they want, rather than being enveloped in filter bubbles that reinforce their prejudices and bad behaviors simply to make them click on more ads.

“We want to empower people rather than control them,” Pape said. “The way to do that is to give them choices and make it easy for them to ‘change the channel,’ so to speak if the program they’re being served isn’t resonating with them.”

Another thing to fear about Google, aside from the search engine being turned on its head and used as a surveillance tool in a not-so-distant dystopian future, is an idea that’s mentioned in Jon Ronson’s book, So You’ve Been Publicly Shamed. People’s lives have been ruined by Google search results that live on forever after false, scandalous accusations.

How will Presearch safeguard us against this?

“We are looking at a potential model where people could stake their tokens to upvote or downvote results, and then enable community members to vote on those votes,” said Pape. “This would enable mechanisms to identify false information and provide penalties for those who promote it. This is definitely a tricky subject that we will involve the community in developing policies for.”

Pape’s vision is very much aligned with Pied Piper’s on HBO’s Silicon Valley.

“It is definitely pretty accurate… a little uncanny, actually,” Pape said after his staff made him watch the latest season. “It was easy to see where the show drew its inspiration from.”

But truth is stranger than fiction. “The problems a decentralized internet are solving are real, and will become more and more apparent as the Big Tech companies continue to clamp down on the original free and open internet in favor of walled gardens and proprietary protocols,” he explained. “Hopefully the real decentralized web will be the liberating success that so many of us envision.”

Obviously an alternative to Google’s search monopoly is a good thing. And Pape feels that breaking up Google might help in the short term, but “introducing government control is just that—introducing more control,” Pape said. “We would rather offer a free market solution that enables people to make their own choices, which provides alignment of incentives and communities to create true alternatives to the current dominant forces.”

Presearch may or may not be the ultimate solution, but it’s a step in the right direction.
