[Source: This article was published in enca.com - Uploaded by the Association Member: Rene Meyer]

[File photo, July 10, 2019: the Google logo is seen on a computer in Washington, DC.]

SAN FRANCISCO - Original reporting will be highlighted in Google’s search results, the company said as it announced changes to its algorithm.

The world’s largest search engine has come under increasing criticism from media outlets, mainly over its algorithms - the sets of instructions followed by computers - which newspapers have often blamed for plummeting online traffic and the industry’s decline.

Explaining some of the changes in a blog post, Google's vice president of news Richard Gingras said stories that were critically important and labor intensive -- requiring experienced investigative skills, for example -- would be promoted.

Articles that demonstrated “original, in-depth and investigative reporting” would be given the highest possible rating by reviewers, he wrote on Thursday.

These reviewers - roughly 10,000 people whose feedback contributes to Google’s algorithm - will also determine the publisher’s overall reputation for original reporting, promoting outlets that have been awarded Pulitzer Prizes, for example.

It remains to be seen how such changes will affect news outlets, especially smaller online sites and local newspapers, which have borne the brunt of the changing media landscape.

And as noted by the technology website TechCrunch, it is hard to define exactly what original reporting is: many online outlets build on ‘scoops’ or exclusives with their own original information, a complexity an algorithm may have a hard time picking through.

The Verge - another technology publication - wrote that the emphasis on originality could exacerbate an already frenetic online news cycle by making it lucrative to get breaking news online even faster and without proper verification.

The change comes as Google continues to face criticism for its impact on the news media.

Many publishers say the tech giant’s algorithms - which remain a source of mysterious frustration for anyone outside Google - reward clickbait and allow investigative and original stories to disappear online.


[Source: This article was published in searchenginejournal.com by Barry Schwartz - Uploaded by the Association Member: Martin Grossner]

Google says the June 3 update is not a major one, but keep an eye out for how your results will be impacted.

Google has just announced that tomorrow it will be releasing a new broad core search algorithm update. These core updates impact how search results are ranked and listed in the Google search results.

Here is Google’s tweet:

[Embedded tweet from @searchliaison]

Previous updates. Google has done previous core updates. In fact, it does one every couple of months or so. The last core update was released in March 2019. You can see our coverage of the previous updates over here.

Why pre-announce this one? Google said the community has been asking Google to be more proactive when it comes to these changes. Danny Sullivan, Google search liaison, said there is nothing specifically “big” about this update compared to previous updates. Google is being proactive about notifying site owners and SEOs, Sullivan said, so people aren’t left “scratching their heads after-the-fact.”

[Embedded tweet from Casey Markee]

When is it going live? On Monday, June 3, Google will make this new core update live. The exact timing is not yet known, but Google will also tweet tomorrow when it does go live.

[Embedded tweet from Eric Mitz]

Google’s previous advice. Google has previously shared this advice around broad core algorithm updates:

“Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year.

As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded.

There’s no ‘fix’ for pages that may perform less well other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages.”

 


[This article was originally published in makeuseof.com, written by Dan Price - Uploaded by AIRS Member: Carol R. Venuti]

Of course, most social networks have their own search engines built in, but they’re fundamentally limited by the fact that they can only search their own databases. And how are you supposed to know whether Aunt Mary is on Facebook, Google Plus, or one of the other myriad options?

The solution? Use a network-agnostic social search engine. These tools can search all the most common networks, as well as many of the smaller, niche ones.

If you need a social search engine, you’ve come to the right place. Here are six options for you to consider.

1. Pipl

Pipl offers a vast database of online accounts – almost three billion are accessible through its search algorithms.

The search engine doesn’t only scan social media networks. It also scans a list of both personal and work emails, deep web archives such as court records, news reports, and publicly available government lists.


To use the tool, enter the person’s name, email address, or social media username into the search box. If you wish, you can also enter a location. Click on the magnifying glass icon to start the search.


The results page will show you hits from across the site’s various databases. You can use the filters on the left-hand side of the screen to narrow the results by location and age.


2. Social Mention

Social Mention is both a social search engine and a way to aggregate user-generated content across a number of networks into a single feed. It helps you search for phrases, events, and mentions, but it won’t let you find individual people.

The site supports more than 80 social networks, including Twitter, Facebook, YouTube, Digg, Google Plus, and Instagram. It can also scan blogs, bookmarks, and even comments.


In the left-hand panel of the results page, you’ll see an abundance of data about the phrase you entered. You can find out how frequently the phrase is mentioned, a list of associated keywords and hashtags, top users, and more.

On the right-hand side of the screen you’ll find links for exporting data into a CSV file, and along the top of the screen are various filter options.

3. snitch.name

The snitch.name site is one of the easiest on this list to use.

The site has several advantages over a regular search query on Google. For example, many social networks are either not indexed by Google or only have very limited indexing. It also prioritizes “people pages,” whereas a regular Google search will also return results for posts mentioning the person, associated hashtags, and other content.


Obviously, even after running a search, some profiles may remain restricted, depending on the user’s privacy settings. However, as long as you can access the account through your own social media account, you will be able to access the listing on snitch.name.

To use the site, fire up the homepage, enter your search terms, and mark the checkboxes next to the networks you want to scan. When you’re ready, click Search.

4. Social-Searcher

Social-Searcher is another web app that works across a broad array of social networks and other platforms.

You can use the site without making an account. Non-registered users can search Twitter, Google Plus, Facebook, YouTube, Instagram, Tumblr, Reddit, Flickr, Dailymotion, and Vimeo. You can also save your searches and set up email alerts.


If you need a more powerful solution, you should consider signing up for one of the paid plans. For $3.50 per month, you get 200 searches per day, three email alerts, three keyword monitors, and space for up to 3,000 saved posts. The top-level plan, which costs $20 per month, increases the limits even further.

5. Social-Searcher: Google Social Search

The team responsible for the previously mentioned Social-Searcher has also developed a Google Social Search tool.

It works with six networks: Facebook, Twitter, Google Plus, Instagram, LinkedIn, and Pinterest. You can mark the checkboxes next to the networks’ logos to limit your search to particular sites.


The usual Google search tricks apply. For example, putting quotation marks around a set of words will force Google to only return results with an exact match, adding a minus sign will exclude specific words from the results, and typing OR between words will let you roll several terms into one search result.
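
To make those operators concrete, here is a tiny Python sketch that simply prints a few illustrative query strings; the search terms are invented for this example.

```python
# Hypothetical query strings demonstrating the operators described above.
queries = [
    '"content marketing"',        # quotation marks: exact-match phrase only
    'marketing -facebook',        # minus sign: exclude a specific word
    'marketing OR advertising',   # OR: roll several terms into one search
]
for q in queries:
    print(q)
```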

Results are sorted by network, and you can click on Web or Images to toggle between the different media types.

6. Buzzsumo

Buzzsumo takes a slightly different approach from the tools we have mentioned so far. It specializes in searching for trends and keyword performance.

That makes it an ideal tool for businesses; they can find out what content is going to have the biggest impact when they share it, as well as gain insight into the words and phrases their competitors are using.

On the results page, you can use the panel on the left-hand side of the screen to create filters. Date, content type, language, country, and even word count are searchable parameters.


On the right-hand side of the page, you can see how successful each post was. Analytics for Facebook, LinkedIn, Twitter, and Pinterest are shown, as are the total number of shares.

Free users can only see the top 10 results; you will need a Pro account for $79 per month to unlock more. It’s probably too much money for individual users, but for businesses the cost is negligible.

Which Social Media Search Engines Do You Use?

In this article, we have introduced you to six of the best social media search engines. Each of them focuses on a different type of user and presents its results in a different way. If you use them all, you should be able to quickly find the topic, person, trend, or keyword you’re looking for.


Source: This article was published in searchengineland.com by Barry Schwartz - Contributed by Member: Clara Johnson

Some SEOs are seeing more fluctuations with the Google rankings now, but Google has confirmed the August 1 update has been fully rolled out.

Google has just confirmed that the core search algorithm update that began rolling out a week ago has now finished fully rolling out. Google search liaison Danny Sullivan said on Twitter, “It’s done” when I asked him if the rollout was complete.

Danny did add that if we are seeing other changes, it is because “we always have changes that happen, both broad and more specific.” The question matters because some of the tracking tools are seeing more fluctuations today, and if those are unrelated to this update, it is unclear what they can be attributed to.

Here is Danny’s tweet:

The question to @dannysullivan: “Is the rollout of the core update complete? Seeing fluctuations today.”

Danny Sullivan’s reply: “It’s done. That said, we always have changes that happen, both broad and more specific.”

Based on our research, the August 1 update was one of the more significant updates we have seen from Google on the organic search side in some time. It continued to roll out over the weekend and has now completed.

Google’s current advice on this update is that webmasters do not need to make any technical changes to their websites. In fact, the company said that “no fix” is required and that the update is aimed at promoting sites that were once undervalued. Google has said that you should continue to look at ways of making your overall website better and provide even better-quality content and experiences to your website visitors.

Now that the rollout is complete, you can check to see if your site was impacted. But as Danny Sullivan said above, there are always changes happening in search.

 


Source: This article was published in forbes.com by Jayson DeMers - Contributed by Member: William A. Woods

Some search optimizers like to complain that “Google is always changing things.” In reality, that’s only a half-truth; Google is always coming out with new updates to improve its search results, but the fundamentals of SEO have remained the same for more than 15 years. Only some of those updates have truly “changed the game,” and for the most part, those updates are positive (even though they cause some major short-term headaches for optimizers).

Today, I’ll turn my attention to semantic search, a search engine improvement that came along in 2013 in the form of the Hummingbird update. At the time, it sent the SERPs into a somewhat chaotic frenzy of changes but introduced semantic search, which transformed SEO for the better—both for users and for marketers.

What Is Semantic Search?

I’ll start with a brief primer on what semantic search actually is, in case you aren’t familiar. The so-called Hummingbird update came out back in 2013 and introduced a new way for Google to consider user-submitted queries. Up until that point, the search engine was built heavily on keyword interpretation; Google would look at specific sequences of words in a user’s query, then find matches for those keyword sequences in pages on the internet.

Search optimizers built their strategies around this tendency by targeting specific keyword sequences, and using them, verbatim, on as many pages as possible (while trying to seem relevant in accordance with Panda’s content requirements).

Hummingbird changed this. Now, instead of finding exact matches for keywords, Google looks at the language used by a searcher and analyzes the searcher’s intent. It then uses that intent to find the most relevant search results. It’s a subtle distinction, but one that demanded a new approach to SEO; rather than focusing on specific, exact-match keywords, you had to start creating content that addressed a user’s needs, using more semantic phrases and synonyms for your primary targets.
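
To make the distinction concrete, here is a toy Python sketch, with an invented synonym table and invented documents, contrasting exact keyword matching with a crude form of synonym-aware matching. It is only an illustration of the idea, not Google’s actual system.

```python
# Toy contrast between exact keyword matching and synonym-aware matching.
# The synonym table and documents below are invented for illustration.

SYNONYMS = {
    "cheap": {"cheap", "affordable", "budget", "inexpensive"},
    "laptop": {"laptop", "notebook"},
}

docs = [
    "Affordable notebook deals for students",
    "Cheap laptop deals this week",
    "History of the laptop computer",
]

def expand(term: str) -> set[str]:
    """Return the term plus any known synonyms."""
    return SYNONYMS.get(term, {term})

def matches(query: str, doc: str, semantic: bool) -> bool:
    doc_words = set(doc.lower().split())
    for term in query.lower().split():
        candidates = expand(term) if semantic else {term}
        if not candidates & doc_words:
            return False
    return True

query = "cheap laptop"
print([d for d in docs if matches(query, d, semantic=False)])
# -> only the exact-match document
print([d for d in docs if matches(query, d, semantic=True)])
# -> also the "Affordable notebook" document
```

The exact-match pass misses the page that says “affordable notebook” even though it answers the query; the synonym-aware pass catches it, which is the essence of what Hummingbird-style matching buys searchers.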

Voice Search and Ongoing Improvements

Of course, since then, there’s been an explosion in voice search—driven by Google’s improved ability to recognize spoken words, its improved search results, and the increased need for voice searches with mobile devices. That, in turn, has fueled even more advances in semantic search sophistication.

One of the biggest advancements, an update called RankBrain, utilizes an artificial intelligence (AI) algorithm to better understand the complex queries that everyday searchers use, and provide more helpful search results.

Why It's Better for Searchers

So why is this approach better for searchers?

  • Intuitiveness. Most of us have already taken for granted how intuitive searching is these days; if you ask a question, Google will have an answer for you—and probably an accurate one, even if your question doesn’t use the right terminology, isn’t spelled correctly, or dances around the main thing you’re trying to ask. A decade ago, effective search required you to carefully calculate which search terms to use, and even then, you might not find what you were looking for.
  • High-quality results. SERPs are now loaded with high-quality content related to your original query—and oftentimes, a direct answer to your question. Rich answers are growing in frequency, in part to meet the rising utility of semantic search, and it’s giving users faster, more relevant answers (which encourages even more search use on a daily basis).
  • Content encouragement. The nature of semantic search forces search optimizers and webmasters to spend more time researching topics to write about and developing high-quality content that serves search users’ needs. That means there’s a bigger pool of content developers than ever before, and they’re working harder to churn out readable, practical, and in-demand content for public consumption.

Why It's Better for Optimizers

The benefits aren’t just for searchers, though—I’d argue there are just as many benefits for those of us in the SEO community (even if it was an annoying update to adjust to at first):

  • Less pressure on keywords. Keyword research has been one of the most important parts of the SEO process since search first became popular, and it’s still important for gauging the popularity of various search queries—but it isn’t as make-or-break as it used to be. You no longer have to ensure you have exact-match keywords at exactly the right ratio in exactly the right number of pages (an outdated concept known as keyword density; a quick illustration follows this list); in many cases, merely writing about the general topic is incidentally enough to make your page relevant for your target.
  • Value optimization. Search optimizers now get to spend more time optimizing their content for user value, rather than for keyword targeting. Semantic search makes it harder to accurately predict and track how keywords are specifically searched for (and ranked for), so we can instead spend that effort on making things better for our core users.
  • Wiggle room. Semantic search considers synonyms and alternative wordings just as much as it considers exact match text, which means we have far more flexibility in our content. We might even end up optimizing for long-tail phrases we hadn’t considered before.
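
As promised above, here is a quick Python illustration of the outdated keyword-density metric: occurrences of an exact-match keyword divided by total word count. The sample text and keyword are invented.

```python
# Keyword density: share of the page's words taken up by an exact-match
# keyword. The sample text and keyword are invented for illustration.

def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    # Density is usually quoted as a percentage of total words.
    return 100.0 * hits * n / len(words)

text = ("Our budget laptops are great. Buy budget laptops here, "
        "because budget laptops ship free.")
print(f"{keyword_density(text, 'budget laptops'):.1f}%")
# about 43%: classic keyword stuffing
```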

The SEO community is better off focusing on semantic search optimization, rather than keyword-specific optimization. It’s forcing content producers to produce better, more user-serving content, and relieving some of the pressure of keyword research (which at times is downright annoying).

Take this time to revisit your keyword selection and content strategies, and see if you can’t capitalize on these contextual queries even further within your content marketing strategy.

Categorized in Search Engine

Source: This article was published in bizjournals.com by Sheila Kloefkorn - Contributed by Member: Anthony Frank

You may have heard about Google’s mobile-first indexing. Since nearly 60 percent of all searches are mobile, it makes sense that Google would give preference to mobile-optimized content in its search results pages.

Are your website and online content ready? If not, you stand to lose search-engine rankings and your website may not rank in the future.

Here is how to determine if you need help with Google’s mobile-first algorithm update:

What is mobile-first indexing?

Google creates an index of website pages and content to serve each search query. Mobile-first indexing means the mobile version of your website will weigh more heavily in Google’s indexing algorithm. Mobile-responsive, fast-loading content is given preference in first-page SERP rankings.

Mobile first doesn’t mean Google only indexes mobile sites. If your company does not have a mobile-friendly version, you will still get indexed, but your content will be ranked below mobile-friendly content. Websites with a great mobile experience will receive better search-engine rankings than a desktop-only version. Think about how many times you scroll to the second page of search results. Likely, not very often. That is why having mobile optimized content is so important.

How to determine if you need help

If you want to make sure you position your company to take advantage of mobile indexing as it rolls out, consider whether you can manage the following tasks on your own or if you need help:

  • Check your site: Take advantage of Google’s mobile-friendly test site to see if your site needs help (a scripted version of this check is sketched after this list).
  • Mobile page speed: Make sure you enhance mobile page speed and load times. Mobile-optimized content should load in 2 seconds or less, with images and other elements optimized to render well on mobile devices.
  • Content: You want high-quality, relevant and informative mobile-optimized content on your site. Include text, videos, images and more that are crawlable and indexable.
  • Structured data: Use the same structured data on both desktop and mobile pages, and use the mobile versions of URLs in the structured data on your mobile pages.
  • Metadata: Make sure your metadata, such as titles and meta descriptions, is updated for all pages.
  • XML and media sitemaps: Make sure the mobile version of your site can access any sitemap links. Include robots.txt and meta-robots tags, and include trust signals such as a link to your company’s privacy policy.
  • App indexing: If you use app indexing for your website, verify that the mobile version of your site relates correctly to your app association files.
  • Server capacity: Make sure your hosting servers have the capacity to handle both mobile and desktop crawls.
  • Google Search Console: If you use Google Search Console, make sure you add and verify your mobile site as well.
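
For the “check your site” step mentioned above, the test can also be scripted. Below is a minimal sketch using Google’s Mobile-Friendly Test API; it assumes you have created your own API key in Google Cloud, and example.com stands in for your own URL.

```python
# Minimal sketch: query Google's Mobile-Friendly Test API for one URL.
# Assumptions: you have an API key, and the API is available to you.

import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: create one in Google Cloud
ENDPOINT = (
    "https://searchconsole.googleapis.com/v1/"
    f"urlTestingTools/mobileFriendlyTest:run?key={API_KEY}"
)

payload = json.dumps({"url": "https://www.example.com/"}).encode()
req = urllib.request.Request(
    ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# Prints MOBILE_FRIENDLY or NOT_MOBILE_FRIENDLY, plus any detected issues.
print(result.get("mobileFriendliness"))
for issue in result.get("mobileFriendlyIssues", []):
    print("-", issue.get("rule"))
```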

What if you do not have a mobile site or mobile-optimized content?
If you have in-house resources to upgrade your website for mobile, the sooner you can implement the updates, the better.

If not, reach out to a full-service digital marketing agency like ours, which can help you update your website so that it can continue to compete. Without a mobile-optimized website, your content will not rank as well as websites with mobile-friendly content.


Source: This article was published in helpnetsecurity.com - Contributed by Member: Corey Parker

Ben-Gurion University of the Negev and University of Washington researchers have developed a new generic method to detect fake accounts on most types of social networks, including Facebook and Twitter.

According to their new study in Social Network Analysis and Mining, the new method is based on the assumption that fake accounts tend to establish improbable links to other users in the networks.

“With recent disturbing news about failures to safeguard user privacy, and targeted use of social media by Russia to influence elections, rooting out fake users has never been of greater importance,” explains Dima Kagan, lead researcher and a researcher in the BGU Department of Software and Information Systems Engineering.

“We tested our algorithm on simulated and real-world datasets on 10 different social networks and it performed well on both.”

The algorithm consists of two main iterations based on machine-learning algorithms. The first constructs a link prediction classifier that can estimate, with high accuracy, the probability of a link existing between two users.

The second iteration generates a new set of meta-features based on the features created by the link prediction classifier. Lastly, the researchers used these meta-features and constructed a generic classifier that can detect fake profiles in a variety of online social networks.
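
The paper’s actual features and datasets are more elaborate, but the two-stage idea can be sketched in a few lines of Python with scikit-learn. Everything below, including the random data, the four pair features, and the choice of random forests, is invented for illustration.

```python
# A minimal sketch of the two-stage approach described above.
# All data and feature choices here are invented for illustration.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stage 1: link prediction. Each row holds topological features for a
# pair of users (e.g., common neighbors, Jaccard coefficient), and the
# label says whether an edge really exists between them.
pair_features = rng.random((1000, 4))
edge_exists = rng.integers(0, 2, 1000)
link_clf = RandomForestClassifier().fit(pair_features, edge_exists)

# Stage 2: per-user meta-features derived from stage 1, e.g. statistics
# of the predicted probabilities of that user's own links. Fake accounts
# should accumulate many improbable (low-probability) links.
def user_meta_features(links):
    p = link_clf.predict_proba(links)[:, 1]  # P(link is plausible)
    return [p.mean(), p.min(), p.std()]

users = [rng.random((20, 4)) for _ in range(200)]   # 20 links per user
X = np.array([user_meta_features(u) for u in users])
y = rng.integers(0, 2, 200)                         # 1 = fake account
fake_clf = RandomForestClassifier().fit(X, y)
print(fake_clf.predict(X[:5]))
```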

[The original article embeds a video explaining how the method works.]

“Overall, the results demonstrated that in a real-life friendship scenario we can detect people who have the strongest friendship ties as well as malicious users, even on Twitter,” the researchers say. “Our method outperforms other anomaly detection methods and we believe that it has considerable potential for a wide range of applications particularly in the cyber-security arena.”

Other researchers who contributed are Dr. Michael Fire of the University of Washington (a former Ben-Gurion University doctoral student) and Prof. Yuval Elovici, director of Cyber@BGU and a member of the BGU Department of Software and Information Systems Engineering.

The Ben-Gurion University researchers previously developed the Social Privacy Protector (SPP) Facebook app, which helps users evaluate their friends list in seconds and identify which connections have few or no mutual links and might be “fake” profiles.


A new book shows how Google’s search algorithms quietly reinforce racist stereotypes.

Are search engines making us more racist?

According to Safiya Umoja Noble, a professor of communication at the University of Southern California, the answer is almost certainly yes.

Noble’s new book, Algorithms of Oppression: How Search Engines Reinforce Racism, challenges the idea that search engines like Google provide a level playing field for all ideas, values, and identities. She says they’re inherently discriminatory and favor the groups that designed them, as well as the companies that fund them.

This isn’t a trivial topic, especially in a world where people get more information from search engines than they do from teachers or libraries. For Noble, Google is not just telling people what they want to know but also determining what’s worth knowing in the first place.

I reached out to Noble last week to find out what she had learned about the unseen factors driving these algorithms, and what the consequences of ignoring them might be.

A lightly edited transcript of our conversation follows.

Sean Illing

What are you arguing in this book?

Safiya Umoja Noble

I’m arguing that large, multinational advertising platforms online are not trusted, credible public information portals. Most people think of Google and search engines in particular as a public library, or as a trusted place where they can get accurate information about the world. I provide a lot of examples to show the incredible influence of advertising dollars on the kinds of things we find, and I show how certain people and communities get misrepresented in service of making money on these platforms.

Sean Illing

Who gets misrepresented and how?

Safiya Umoja Noble

I started the book several years ago by doing collective searches on keywords around different community identities. I did searches on “black girls,” “Asian girls,” and “Latina girls” online and found that pornography was the primary way they were represented on the first page of search results. That doesn’t seem to be a very fair or credible representation of women of color in the United States. It reduces them to sexualized objects.

So that begs the question: What’s going on in these search engines? What are the well-funded, well-capitalized industries behind them that are purchasing keywords and using their influence to represent people and ideas in this way? The book was my attempt to answer these questions.

Sean Illing

Okay, so at the time you did this research, if someone went to Google and searched for “black women,” they would get a bunch of pornography. What happens if they type in “white girls” or “white women”? Or if they search for what should be a universal category, like “beautiful people”?

Safiya Umoja Noble

Now, fortunately, Google has responded to this. They suppressed a lot of porn, in part because we’ve been speaking out about this for six or seven years. But if you go to Google today and search for “Asian girls” or “Latina girls,” you’ll still find the hypersexualized content.

For a long time, if you did an image search on the word “beautiful,” you would get scantily clad images of almost exclusively white women in bikinis or lingerie. The representations were overwhelmingly white women.

People often ask what happens when you search “white girls.” White women don’t typically identify as white; they just think of themselves as girls or women or individuals. I think what you see there is the gaze of people of color looking at white women and girls and naming whiteness as an identity, which is something that you don’t typically see white women doing themselves.

Sean Illing

These search algorithms aren’t merely selecting what information we’re exposed to; they’re cementing assumptions about what information is worth knowing in the first place. That might be the most insidious part of this.

Safiya Umoja Noble

There is a dominant male, Western-centric point of view that gets encoded into the organization of information. You have to remember that an algorithm is just an automated decision tree: if these keywords are present, then a variety of assumptions have to be made about what to point to among the trillions of pages that exist on the web.

And those decisions always correlate to the relationship of advertisers to the platform. Google has a huge empire called AdWords, and people bid in a real-time auction to optimize their content.

That model — of information going to the highest bidder — will always privilege people who have the most resources. And that means that people who don’t have a lot of resources, like children, will never be able to fully control the ways in which they’re represented, given the logic and mechanisms of how search engines work.

Sean Illing

In the book, you talk about how racist websites gamed search engines to control the narrative around Martin Luther King Jr. so that if you searched for MLK, you’d find links to white supremacist propaganda. You also talk about the stakes involved here and point to Dylann Roof as an example.

Safiya Umoja Noble

In his manifesto, Dylann Roof has a diatribe against people of color, and he says that the first event that truly awakened him was the Trayvon Martin story. He says he went to Google and did a search on “black-on-white crime.” Now, most of us know that black-on-white crime is not an American epidemic — that, in fact, most crime happens within a community. But that’s a separate discussion.

So Roof goes to Google and puts in a white nationalist red herring (“black-on-white crime”). And of course, it immediately takes him to white supremacist websites, which in turn take him down a racist rabbit hole of conspiracy and misinformation. Often, these racist websites are designed to appear credible and benign, in part because that helps them game the algorithms, but also because it convinces a lot of people that the information is truthful.

This is how Roof gets radicalized. He says he learns about the “true history of America,” and about the “race problem” and the “Jewish problem.” He learns that everything he’s ever been taught in school is a lie. And then he says, in his own words, that this makes him research more and more, which we can only imagine is online, and this leads to his “racial awareness.”

And now we know that shortly thereafter, he steps into the “Mother” Emanuel AME Church in Charleston, South Carolina, and murders nine African-American worshippers in cold blood, in order to start a race war.

So the ideas that people are encountering online really matter. It matters that Dylann Roof didn’t see the FBI statistics that tell the truth about how crime works in America. It matters that he didn’t get any counterpoints. It matters that people like him are pushed in these directions without resistance or context.

Sean Illing

My guess is that these algorithms weren’t designed to produce this effect, but I honestly don’t know. What is driving the decision-making process? Is this purely about commercial interests?

Safiya Umoja Noble

It’s difficult to know exactly what Google’s priorities are, because Google’s search algorithm is proprietary, so no one can really make sense of the algorithm except by looking at the output. All of us who study this do it by looking at the end results, and then we try to reverse-engineer it as best we can.

But yes, it’s pretty clear that what’s ultimately driving tech companies like Google is profit. I don’t imagine that a bunch of racists is sitting around a table at Google thinking of ways to create a racist product. What happens, however, is that engineers simply don’t think about the social consequences of their work. They’re designing technologies for society, and they know nothing about society.

In its own marketing materials, Google says there are over 200 different factors that go into deciding what type of content they surface. I’m sure they have their own measures of relevance for what they think people want. Of course, they’re also using predictive technologies, like autosuggestion, where they fill in the blank. They’re doing that based on what other people have looked at or clicked on in the past.

Sean Illing

But the autosuggestion tool guarantees that majority perspectives will be consistently privileged over others, right?

Safiya Umoja Noble

Right. People who are a numerical minority in society will never be able to use this kind of “majority rules” logic to their benefit. The majority will always be able to control the notions of what’s important, or what’s important to click on, and that’s not how the information landscape ought to work.

Sean Illing

I’m sure some people will counter and say that these are essentially neutral platforms, and if they’re biased, they’re biased because of the human users that make them up. In other words, the problem isn’t the platform; it’s the people.

Safiya Umoja Noble

The platform exists because it’s made by people. It didn’t come down from an alien spacecraft. It’s made by human beings, and the people who make it are biased, and they code their biases into search. How can these things not inform their judgment?

So it’s disingenuous to suggest that the platform just exists unto itself and that the only people who can manipulate it or influence it are the people who use it when actually, the makers of the platform are the primary source of responsibility. I would say that there are makers, as well as users, of a platform. They have to take responsibility for their creations.

Source: This article was published in vox.com by Sean Illing


Google has confirmed rumors that a search algorithm update took place on Monday. Some sites may have seen their rankings improve, while others may have seen negative or zero change.

Google has posted on Twitter that it released a “broad core algorithm update” this past Monday. Google said it “routinely” does updates “throughout the year” and referenced the communication from the previous core update.

Google explained that core search updates happen “several times per year” and that while “some sites may note drops or gains,” there is nothing specific a site can do to tweak its rankings around these updates. In general, Google says to continue to improve your overall site quality, and the next time Google runs these updates, hopefully, your website will be rewarded.

Google explained that “pages that were previously under-rewarded” would see a benefit from these core updates.

Here is the statement Google previously made about this type of update:

Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year.

As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded.

There’s no “fix” for pages that may perform less well, other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages.

Here is Google’s confirmation from today about the update on Monday:

[Screenshot: Google’s tweet confirming the update]

 

Source: This article was published in searchengineland.com by Barry Schwartz


Google is the dominating force in the world of search engines, and there’s an entire industry dedicated to maximizing visibility within its search engine results: search engine optimization (SEO).

People like me have built their careers on finding ways to benefit from the central ranking algorithm at Google’s core. But here’s the interesting thing: Google doesn’t explicitly publish how its search algorithm works and often uses vague language when describing its updates.

So how much do we really know about Google’s ranking algorithm? And why is Google so secretive about it?

Why Google Keeps Things Secret

Google has come under fire lately, most recently by German Chancellor Angela Merkel, because it keeps its algorithm secret. Her main argument is that transparency is vitally important to maintaining a balanced society; after all, our daily searches shape our behavior in subtle and blatant ways, and not knowing the mechanisms that influence that behavior can leave us in the dark.

But Google isn’t withholding its algorithm so that it can manipulate people with reckless abandon. There are two good reasons why the company would want to keep the information a closely-guarded secret.

First, Google’s algorithm is proprietary, and it has become the dominant search competitor because of its sheer sophistication. If other competitors have free and open access to the inner workings of that algorithm, they could easily introduce a competing platform with comparable power, and Google’s search share could unfairly plummet.

Second, there are already millions of people who make a living by improving their positions within Google, and many of them are willing to use ethically questionable tactics or spam people in an effort to get more search visibility. If Google fully publishes its search algorithm, they could easily find bigger loopholes, and ruin the relatively fair search engine results pages (SERPs) we’ve come to expect from the giant.

How We Learn

So if Google withholds all the information on its algorithm, how can search optimizers know how to improve the search rankings of web pages?

  • Google revelations. Google doesn’t leave webmasters totally in the dark. While it refuses to disclose specifics about how the algorithm functions, it’s pretty open about the general intentions of the algorithm and what webmasters can take away from it. For example, Google has published and regularly updates a guidelines manual on search quality ratings; at 160 pages long, and last updated in July of last year, it’s a fairly comprehensive guidebook that explains general concepts of how Google judges the quality of a given page. Google has also been known to explain its updates as they roll out—especially the larger ones—with a short summary and a list of action items for webmasters. These are all incredibly helpful sources of information.
  • Direct research. Google doesn’t give us everything, however. If you scroll through Moz’s fairly comprehensive guide on the history of Google’s algorithm changes, you’ll notice dozens of small updates that Google didn’t formally announce and, in many cases, refuses to acknowledge. How does the search community know that these algorithm changes unfolded? We have volatility indicators like MozCast, which measure how much the SERPs are changing within a given period of time; a period of high volatility is usually the signature of some kind of algorithm change (a minimal volatility sketch follows this list). We can also conduct experiments, such as using two different tactics on two different pages and seeing which one ranks higher at the end of the experiment period. And because the SEO community is pretty open about sharing this information, one experiment is all it takes to give the whole community more experience and knowledge.
  • Experience and intuition. Finally, after several years of making changes and tracking growth patterns, you can rely a bit on your own experience and intuition. When search traffic plummets, you can usually identify a handful of potential red flags and come up with ideas for tweaks to take you back to your baseline.
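
To give a flavor of what a volatility indicator measures, here is a minimal Python sketch. It is not MozCast’s actual methodology, and the keywords and URLs are invented.

```python
# Toy SERP volatility score: for a set of tracked keywords, compare
# today's top-10 rankings with yesterday's and average the movement.

def volatility(yesterday: dict, today: dict) -> float:
    """Average rank movement per keyword across two daily SERP snapshots.

    Each snapshot maps keyword -> ordered list of top result URLs.
    A URL that drops out of the list counts as moving to position 11.
    """
    total = 0.0
    for kw in yesterday:
        old, new = yesterday[kw], today[kw]
        moves = 0
        for pos, url in enumerate(old):
            new_pos = new.index(url) if url in new else 10  # fell out
            moves += abs(new_pos - pos)
        total += moves / len(old)
    return total / len(yesterday)

# Invented example: two results swap places and one falls out entirely.
y = {"seo tips": ["a.com", "b.com", "c.com"]}
t = {"seo tips": ["b.com", "a.com", "d.com"]}
print(volatility(y, t))  # higher values suggest an algorithm update
```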

What Do We Know?

So what do we really know about Google’s search algorithm?

  • The basics. We know the basic concept behind the search platform: to give users the best possible results for their queries. Google does this by presenting results that offer a combination of relevance (how appropriate the topic is) and authority (how trustworthy the source is); a toy scoring sketch follows this list.
  • Core ranking factors. We also know the core ranking factors that will influence your rank. Some of these come directly from Google’s webmaster guidelines, and some come from the results of major experiments. In any case, we have a good idea of what changes are necessary to earn a high rank, and what factors could stand in your way. I covered 101 of them here.
  • Algorithm extensions and updates. We also know when there’s a new Google update, thanks to volatility indicators, and we can almost always form a reasonable conclusion about the update’s purpose—even when Google doesn’t tell us directly.
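
As a toy illustration of the first bullet, here is a Python sketch that ranks invented results by a weighted blend of relevance and authority. The scores and weights are hypothetical; Google’s real algorithm is far more complex.

```python
# Toy ranking by a blend of relevance and authority.
# URLs, scores, and weights are all invented for illustration.

results = [
    {"url": "blog.example.com/post",   "relevance": 0.9, "authority": 0.3},
    {"url": "news.example.com/story",  "relevance": 0.7, "authority": 0.8},
    {"url": "forum.example.com/thread","relevance": 0.5, "authority": 0.2},
]

W_RELEVANCE, W_AUTHORITY = 0.6, 0.4  # hypothetical weights

for r in sorted(results,
                key=lambda r: W_RELEVANCE * r["relevance"]
                              + W_AUTHORITY * r["authority"],
                reverse=True):
    print(r["url"])  # the authoritative, fairly relevant page wins
```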

While we still don’t know the specifics of how Google’s algorithm works—and unless the EU’s transparency campaign kicks into high gear soon, we probably won’t for the foreseeable future—we do know enough about it to make meaningful changes to our sites, and maximize our ranking potential.

Moreover, the general philosophy behind the algorithm and the basic strategies needed to take advantage of it aren’t hard to learn. If you’re willing to read Google’s documentation and learn from the experiments of others, you can get up to speed in a matter of weeks.

Source: This article was published in forbes.com by Jayson DeMers
