If you Google “Was the Holocaust real?” right now, seven out of the top 10 results will be Holocaust denial sites. If you Google “Was Hitler bad?,” one of the top results is an article titled, “10 Reasons Why Hitler Was One Of The Good Guys.”
In December, responding to weeks of criticism, Google said that it had tweaked its algorithm to push down Holocaust denial and anti-Semitic sites. But now, just a month later, that fix clearly hasn’t worked.
In addition to hateful search results, Google has had a similar problem with its “autocompletes” — when Google anticipates the rest of a query from its first word or two. Google autocompletes have often embodied racist and sexist stereotypes. Google image search has also generated biased results, absurdly tagging some photos of black people as “gorillas.”
The result of these horrific search results can be deadly. Google search results reportedly helped shape the racism of Dylann Roof, who murdered nine people in a historically black South Carolina church in 2015. Roof said that when he Googled “black on white crime,” “the first website I came to was the Council of Conservative Citizens,” which is a white supremacist organization. “I have never been the same since that day,” he said. And of course, in December, a Facebook-fueled fake news story about Hillary Clinton prompted a man to shoot up a pizza parlor in Washington, D.C. The fake story reportedly originated in a white supremacist’s tweet.
These terrifying acts of violence and hate are likely to continue if action isn’t taken. Without a transparent curation process, the public has a hard time judging the legitimacy of online sources. In response, a growing movement of academics, journalists and technologists is calling for more algorithmic accountability from Silicon Valley giants. As algorithms take on more importance in all walks of life, they are increasingly a concern of lawmakers. Here are some steps Silicon Valley companies and legislators should take to move toward more transparency and accountability:

1. Obscure content that’s damaging and not of public interest.

When it comes to search results about an individual person’s name, many countries have aggressively forced Google to be more careful in how it provides information. Thanks to the Court of Justice of the European Union, Europeans can now request the removal of certain search results revealing information that is “inadequate, irrelevant, no longer relevant or excessive,” unless there is a greater public interest in being able to find the information via a search on the name of the data subject.
Such removals are a middle ground between information anarchy and censorship. They neither disappear information from the internet (it can be found at the original source) nor allow it to dominate the impression of the aggrieved individual. They are a kind of obscurity that lets ordinary individuals avoid having a single incident indefinitely dominate the search results on their names. For example, a woman in Spain whose husband was murdered 20 years ago successfully forced Google Spain to take news of the murder off search results on her name.


2. Label, monitor and explain hate-driven search results.

In 2004, anti-Semites boosted a Holocaust-denial site called “Jewwatch” into the top 10 results for the query “Jew.” Ironically, some of those horrified by the site may have helped by linking to it in order to criticize it. The more a site is linked to, the more prominence Google’s algorithm gives it in search results.
Google responded to complaints by adding a headline at the top of the page entitled “An explanation of our search results.” A web page linked to the headline explained why the offensive site appeared so high in the relevant rankings, thereby distancing Google from the results. The label, however, no longer appears. In Europe and many other countries, lawmakers should consider requiring such labeling in the case of obvious hate speech. To avoid mainstreaming extremism, labels may link to accounts of the history and purpose of groups with innocuous names like “Council of Conservative Citizens.”
In the U.S., this type of regulation may be considered a form of “compelled speech,” barred by the First Amendment. Nevertheless, better labeling practices for food and drugs have escaped First Amendment scrutiny in the U.S., so why should information itself be different? As law professor Mark Patterson has demonstrated, many of our most important sites of commerce are markets for information: search engines are not offering products and services themselves but information about products and services, which may well be decisive in determining which firms and groups fail and which succeed. If these information markets go unregulated and remain easily manipulated by whoever can afford the best search engine optimization, people may be left at the mercy of unreliable and biased sources.


3. Audit logs of the data fed into algorithmic systems.

We also need to get to the bottom of how some racist or anti-Semitic groups and individuals are manipulating search. We should require immutable audit logs of the data fed into algorithmic systems. Machine-learning models, predictive analytics and other algorithms may be too complex for a person to understand, but the data records fed into them are not.
A relatively simple set of reforms could vastly increase the ability of entities outside Google and Facebook to determine whether and how the firms’ results and news feeds are being manipulated. There is rarely adequate profit motive for firms themselves to do this — but motivated non-governmental organizations can help them be better guardians of the public sphere.
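
As a rough illustration of what such a requirement could mean in practice, here is a minimal sketch of an append-only, hash-chained audit log of the data fed into a ranking system. The `AuditLog` class and the record fields are hypothetical; a real deployment would also need signing, replication and access controls.

```python
import hashlib
import json
import time


class AuditLog:
    """Append-only log in which each entry commits to the previous one,
    so later tampering with stored records is detectable."""

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "timestamp": time.time(),
            "record": record,        # e.g. a data point fed into the ranking system
            "prev_hash": prev_hash,
        }
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        """Recompute the hash chain; returns False if any entry was altered."""
        prev_hash = "0" * 64
        for entry in self.entries:
            expected = dict(entry)
            stored_hash = expected.pop("hash")
            if expected["prev_hash"] != prev_hash:
                return False
            recomputed = hashlib.sha256(
                json.dumps(expected, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != stored_hash:
                return False
            prev_hash = stored_hash
        return True


log = AuditLog()
log.append({"source": "example.com/page", "signal": "inbound_link", "value": 1})
print(log.verify())  # True unless an entry has been modified after the fact
```

The point is not this particular format but the property it illustrates: outside auditors could verify after the fact that the inputs they are shown are the inputs the system actually consumed.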

4. Possibly ban certain content.

In cases where computational reasoning behind search results really is too complex to be understood in conventional narratives or equations intelligible to humans, there is another regulatory approach available: to limit the types of information that can be provided.
Though such an approach would raise constitutional objections in the U.S., nations like France and Germany have outright banned certain Nazi sites and memorabilia. Policymakers should also closely study laws regarding “incitement to genocide” to develop guidelines for censoring hate speech that poses a clear and present danger of causing systematic slaughter or violence against vulnerable groups.


5. Permit limited outside annotations to defamatory posts and hire more humans to judge complaints.

In the U.S. and elsewhere, limited annotations ― “rights of reply” ― could be permitted in certain instances of defamation of individuals or groups. Google continues to maintain that it doesn’t want human judgment blurring the autonomy of its algorithms. But even spelling suggestions depend on human judgment, and in fact, Google developed that feature not only by means of algorithms but also through a painstaking, iterative interplay between computer science experts and human beta testers who report on their satisfaction with various results configurations.
It’s true that the policy for alternative spellings can be applied generally and automatically once the testing is over, while racist and anti-Semitic sites might require fresh and independent judgment after each complaint. But that is a small price to pay for a public sphere less warped by hatred.
We should commit to educating users about the nature of search and other automated content curation and creation. Search engine users need media literacy to understand just how unreliable Google can be. But we also need vigilant regulators to protect the vulnerable and police the worst abuses. Truly accountable algorithms will only result from a team effort by theorists and practitioners, lawyers, social scientists, journalists and others. This is an urgent, global cause with committed and mobilized experts ready to help. Let’s hope that both digital behemoths and their regulators are listening.
 
EDITOR’S NOTE: The WorldPost reached out to Google for comment and received the following from a Google spokesperson.
Search ranking:
Google was built on providing people with high-quality and authoritative results for their search queries. We strive to give users a breadth of content from a variety of sources, and we’re committed to the principle of a free and open web. Understanding which pages on the web best answer a query is a challenging problem, and we don’t always get it right. When non-authoritative information ranks too high in our search results, we develop scalable, automated approaches to fix the problems, rather than manually removing these one-by-one. We are working on improvements to our algorithm that will help surface more high quality, credible content on the web, and we’ll continue to improve our algorithms over time in order to tackle these challenges.
Autocomplete:
We’ve received a lot of questions about Autocomplete, and we want to help people understand how it works: Autocomplete predictions are algorithmically generated based on users’ search activity and interests. Users search for a wide range of material on the web ― 15 percent of searches we see every day are new. Because of this, terms that appear in Autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we don’t always get it right. Autocomplete isn’t an exact science, and we’re always working to improve our algorithms.
Image search:
Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online. This means that sometimes unpleasant portrayals of subject matter online can affect what image search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs.
Author : Frank Pasquale

Did Yandex's new algorithm Palekh just go head to head with Google's RankBrain?

Yandex announced on their Russian blog that they have launched a new algorithm aimed at improving how they handle long-tail queries. The new algorithm is named Palekh, which is the name of a Russian town that has a firebird on its coat of arms.

The firebird has a long tail, and Yandex, the largest Russian search engine, used that as the code name for long-tail queries. Long-tail queries are longer, multi-word queries typed into the search box, and they are increasingly common in voice search. Yandex says about 100 million queries per day fall under the “long-tail” classification within its search engine.

The Palekh algorithm allows Yandex to understand the meaning behind a query rather than just matching similar words, which is reminiscent of Google RankBrain. I asked Yandex if it is similar to Google’s RankBrain, and they said they “don’t know exactly what’s the technology behind Google’s RankBrain, although these technologies do look quite similar.”

Yandex’s Palekh algorithm has started to use neural networks as one of 1,500 factors of ranking. A Yandex spokesperson told us they have “managed to teach our neural networks to see the connections between a query and a document even if they don’t contain common words.” They did this by “converting the words from billions of search queries into numbers (with groups of 300 each) and putting them in 300-dimensional space — now every document has its own vector in that space,” they told us. “If the numbers of a query and numbers of a document are near each other in that space, then the result is relevant,” they added.
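
Yandex has not published Palekh’s internals, but the idea the spokesperson describes, scoring relevance by how close a query vector and a document vector sit in a shared space, can be sketched in a few lines. The 300-dimensional vectors below are random stand-ins for the learned embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for learned embeddings: in Palekh's description, words from
# billions of queries are converted into groups of 300 numbers, so every
# query and every document gets a vector in 300-dimensional space.
query_vector = rng.normal(size=300)
document_vectors = {
    "doc_a": rng.normal(size=300),
    "doc_b": rng.normal(size=300),
    "doc_c": query_vector + 0.1 * rng.normal(size=300),  # deliberately close to the query
}


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """High when two vectors point in nearly the same direction,
    i.e. when they are 'near each other' in the embedding space."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


ranked = sorted(
    document_vectors.items(),
    key=lambda item: cosine_similarity(query_vector, item[1]),
    reverse=True,
)
for name, vec in ranked:
    print(name, round(cosine_similarity(query_vector, vec), 3))
# doc_c ranks first: its vector sits closest to the query's, even though
# no common words were compared at any point.
```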

When I asked if they are using machine learning, Yandex confirmed that they do, explaining that training their neural network on these queries “will lead to some advancements in answering conversational based queries in the future.” They added that they “also have many targets (long click prediction, CTR, ‘click or not click’ models and so on) that are teaching our neural network — our research has showed that using more targets is more effective.”

Author : Barry Schwartz

Source : http://searchengineland.com/yandex-launches-new-algorithm-named-palekh-improve-search-results-long-tail-queries-262334


Search giant was under fire for search results dominated by neo-Nazi white supremacist sites
Online search giant Google has tweaked an algorithm to ensure that Holocaust denial sites are not the top results when someone asks whether the Holocaust occurred, Digital Trends reported on Sunday.

“We recently made improvements to our algorithm that will help surface more high quality, credible content on the web,” a Google spokesperson told Digital Trends when asked about the topic. “We’ll continue to change our algorithms over time in order to tackle these challenges.”

Google recently faced criticism after journalists noted that the top results for queries about the Holocaust's legitimacy were all links to anti-Semitic, neo-Nazi sites.

Experts have recently said that far-right groups have been using methods to manipulate search algorithms and push their propaganda higher up Google's search rankings.

While Holocaust denial and neo-Nazi sites have not been banned from Google, they are now far less prominent in search results. A link from the neo-Nazi site Stormfront was previously consistently the top result, but now appears far down the first page of results or on the second page – if browsing in "incognito mode", which does not take a user's search history into account when weighing results.

Anti-hate groups have warned of a rise in online incitement this year, with the Anti-Defamation League (ADL) telling the Israeli parliament that there has been an "explosion of hate online."

"Online hate is particularly disturbing because of the ubiquity of social media and its deep penetration into our daily lives, plus the anonymity offered by certain platforms which facilitates this phenomenon," ADL CEO Jonathan A. Greenblatt said.

Earlier this year, Google removed an extension on its Chrome browser which allowed users to identify and track suspected Jewish members of the media and entertainment industries.

While it was active, Coincidence Detector identified suspected or confirmed Jews by adding triple parentheses to their names wherever they were referenced online. According to Mic, the extension had been downloaded more than 2,700 times, had a database of 8,700 people and carried a five-star rating at the time it was removed.

Source : http://www.i24news.tv/en/news/technology/133654-161227-google-reportedly-tweaks-algorithm-over-holocaust-denying-search-results


Between the long-awaited rollout of Penguin 4.0, a strengthening of Google’s mobile-friendly ranking signal and the ‘Possum’ algorithm update impacting local search, 2016 was an interesting year for Google algorithm changes.

And with an upcoming move to a mobile-first search index already on the cards, as well as a penalty for intrusive mobile interstitials coming into effect on the 10th, 2017 promises to be just as eventful.

Looking back at 2016, which algorithm changes were the most impactful for marketers in the industry? And how can brands best prepare themselves for what might be around the corner? I spoke to Sastry Rachakonda and Ajay Rama of digital marketing agency iQuanti, along with Search Engine Watch’s regular mobile columnist Andy Favell, to get their thoughts on what’s to come in the search industry.

The most impactful algorithm updates of 2016

“Mobile-first indexing is probably the most significant change that happened this year,” said Rachakonda, who is the CEO of iQuanti, a data-driven digital marketing agency, “since companies were creating unique mobile content that was not the same as their desktop content. They did that for user experience. There were smaller snippets that were design friendly, but weren’t relevant and optimal for the search query.”

But while Google’s shift to emphasise mobile search even more heavily – which has included a much fuller rollout of Accelerated Mobile Pages into organic search results – was probably its most noteworthy update overall, Rachakonda believes that a different update was actually more impactful from a brand perspective: Possum.

‘Possum’ is the name given to a major update to local search on Google which came into effect on 1st September 2016, and which is thought to be the most significant algorithm update to local search since Pigeon in 2014. The name was coined by Phil Rozek of Local Visibility System, who thought it was fitting as after the update, many business owners thought that their Google My Business listings were gone, when in fact they were only filtered – hence, ‘playing possum’.


The apparently ‘dead’ Google My Business listings gave the Possum algorithm update its name. Image via Wikimedia Commons

The update seemed mostly aimed at improving the quality of the local search results and removing spammy listings, which meant that some businesses who had engaged in less-than-kosher practices in order to rank found themselves demoted.

“Possum has been the most impactful update for brands by far,” said Rachakonda. “One of our Fortune 500 clients in the insurance industry saw a 7% drop in keyword rankings, which resulted in a 13% loss of month-on-month traffic. We believe this was due to some outdated tactics their previous agency used to get them ranked, which clearly Google wasn’t fond of.”

While some businesses saw their traffic drop off as a result of Possum, others were seeing a remarkable recovery thanks to Penguin 4.0, which deployed after much anticipation in late September. The original Penguin update in 2012 targeted and devalued inorganic links, such as links which had been bought or placed solely to improve rankings, which led to significant losses in traffic for businesses who had engaged in those practices.

As Chuck Price explained in an article for Search Engine Watch in December 2015, “After Penguin, bad links became ‘toxic’, requiring a link audit and removal or disavow of spammy links. Even then, a Penguin refresh was usually required before one could see any signs of recovery.”

But thanks to the Penguin 4.0 update in 2016, these refreshes now take place in real-time, leading to significant recovery for brands who had already taken action to remove and disavow the bad links. Marcela De Vivo took a look at how this recovery works in practice, and what site owners can do to improve their situation if they haven’t already done so. 

What’s on the cards for 2017?

As I mentioned in my introduction, at least two updates in 2017 are already certain, both of them relevant to mobile search. One, Google’s penalty for mobile sites with annoying interstitials, is due to go live tomorrow, and our search news roundup last Friday featured some new clarifications from Google about what kind of interstitials will be affected by the penalty.

The other is Google’s move to a mobile-first search index, a major shift which reflects the fact that the majority of Google search queries are now coming from mobile devices. While we don’t yet have a date for this change, Google confirmed in October that the change would take place within the next few months, which means that Google’s primary index could switch to mobile any day now, and brands would do well to prepare themselves. I asked Andy Favell, Search Engine Watch’s resident mobile specialist, what advice he would give to brands who want to be prepared.

“Google has done an excellent job of focusing companies’ minds on the importance of having a mobile-friendly website. The stick approach – the fear of harming the search ranking – has worked wonders for driving adoption of mobile or mobile-friendly sites.

“However, companies should have been focusing on the carrot – building websites that would appeal to some of the billions of mobile web users out there. The beauty of mobile first is that a mobile-friendly site is often a much better desktop site. That is still true today.

“Rather than worrying about trying to make Google happy, brands should concentrate on the mobile users, consider who they are, their context, and what they want, and provide that the best possible way – i.e. intuitive, fast-loading, good UX and usability. Businesses that do this will get more traffic, more happy users and more conversions.

“That’s not just good for business, it’s good for your search ranking also. Because Google wants what’s best for the search user.”


Brands will need to prepare themselves for mobile search becoming Google’s primary index some time soon in 2017.

Those are the changes we know about so far. But what do those in the industry think is coming for search in 2017? Ajay Rama, Senior Vice President of Product at iQuanti, believes that the mobile-first index will take up most of SEO mindshare over the coming year, but he also has a number of predictions for how voice search – which has become a huge part of the search landscape since 2015 – may evolve and change things.

“As voice search starts becoming mainstream, we might see the beginning of a SERPless search – search without a SERP page,” predicts Rama. “We could see early tests in this space where we will see Google Assistant and search being seamlessly integrated into an interactive search experience. Assistant interacts with the user to ask the right questions and take him to the target page or a desired action, instead of showing a SERP page with various options. In this new experience, ads would have to be reinvented all over.”

Given that Google’s innovations of the past few years, from semantic search to Quick Answers, have increasingly been geared towards understanding users’ exact intentions with the aim of finding a single, ideal result or piece of information to satisfy their query, it’s not hard to imagine this happening. Rama also foresees a much more extensive rollout of Google’s voice-controlled Assistant to go along with this.

“Google Assistant will become part of Android, and will be available on all Android devices. Talking to the device in local languages becomes mainstream, and Google Assistant will lead this space. Their machines will learn all accents and all languages, and will soon become a leader in the voice devices, especially in non-English speaking nations.”


Can you imagine Google search without the SERP? With the expansion of voice search, it could become a reality.

While it’s hard to imagine that all of these developments will take place in 2017 alone, there’s definitely a possibility that we’ll see them begin. Google Assistant is already reported to be learning Hindi as a second language, and more languages could well follow if the uptake of Hindi is a success. However, Google Assistant is fairly late to the game compared to established voice assistants like Siri and Cortana who have been around much longer, and have had more time to refine their technology. So is it still possible for Google to pull ahead in this race?

With this change in the way we search comes a change in the way we market, as well; and if the search results page is to disappear one day, advertising will have no choice but to colonise whatever takes its place. We’re already seeing a shift towards an ‘always-on’, always-connected culture with devices like the Amazon Echo constantly listening out for voice commands. Rama believes that Internet of Things-connected devices could easily start to ‘spy’ on their owners, collecting data for the purposes of marketing – “Advertisers would love to get into living room and dinner table discussions.”

This might seem like sobering food for thought, or a whole new world of possibilities, depending on your perspective. Either way, it will be extremely interesting to see whether search continues to develop along the path it seems to be taking now – or whether it veers off in other, even more unexpected directions.

Author : Rebecca Sentance

Source : https://searchenginewatch.com/2017/01/09/which-google-algorithm-changes-impacted-marketers-most-in-2016-and-what-can-we-expect-from-2017/


Google says it is “thinking deeply” about improving its search results after learning that Holocaust deniers and others were successful in making their links rise to the top.

The company, a subsidiary of Alphabet, told the BBC that it is thinking about ways to improve its search results.

The focus on skewing Google’s algorithm comes as there’s more pressure on internet firms to do more to combat fake news and conspiracy theory sites. Given that many people get news and information from Google, the company’s algorithm is of particular focus. Last week, Facebook announced it was partnering with third-party fact checkers, including news organizations, to fight fake news.

In the U.S. and the United Kingdom, those searching for “Did the Holocaust happen?” received a top result linking to a website with the headline, “Top Ten Reasons why the Holocaust didn’t happen.” The site is run by Stormfront, a neo-Nazi white supremacist group.

Google changed the ranking for U.S. users. Now the link to the Holocaust denier is the second result, after an ad and three news stories about Google’s struggles with the issue.

But in the United Kingdom, the Holocaust denier’s site is still the top result, the BBC said.

The internet giant has struggled to reconcile its position that its algorithm surfaces the best content on the web with the fact that some are able to skew that algorithm. In a statement, Google said:

This is a really challenging problem, and something we’re thinking deeply about in terms of how we can do a better job….Search is a reflection of the content that exists on the web….The fact that hate sites may appear in search results in no way means that Google endorses these views.

Danny Sullivan of Search Engine Land wrote that it’s likely that Holocaust deniers and others are figuring out how to game Google’s system to bring their own results to the top of the search results.

The challenge for Google, he said, is to find a systemic fix, rather than just respond to one-off discoveries of misinformation:

It’s very easy to take a search here and there and demand Google change something and then the next day you find a different search and say, ‘why didn’t you fix that?’

The BBC also found that so-called “snippets,” short summaries of information that appear at the top of search results, also can be gamed.



Source : http://www.siliconbeat.com/2016/12/20/google-moves-past-denial-comes-skew-algorithm/


Google's search algorithm has been changed over the last year to increasingly reward search results based on how likely you are to click on them, multiple sources tell Business Insider.

As a result, fake news now often outranks accurate reports on higher quality websites.

The problem is so acute that Google's autocomplete suggestions now actually predict that you are searching for fake news even when you might not be, as Business Insider noted on December 5.

There is a common misconception that the proliferation of fake news is all Facebook's fault. Although Facebook does have a fake news problem, Google's ranking algorithm does not take cues from social shares, likes, or comments when it is determining which result is the most relevant, search experts tell Business Insider. The changes at Google took place separately from the fake news problem occurring on Facebook, experts say.

The changes to the algorithm now move links up Google's search results page if Google detects that more people are clicking on them, search experts tell Business Insider.

Joost De Valk, founder of Yoast, a search consultancy that has worked for The Guardian, told Business Insider: "All SEOs [search engine optimisation experts] agree that they include relative click-through rate (CTR) from the search results in their ranking patterns. For a given 10 results page, they would expect a certain CTR for position five, for instance. If you get more clicks than they’d expect, thus a higher CTR, they’ll usually give you a higher ranking and see if you still outperform the other ones."
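
A rough sketch of the kind of relative-CTR adjustment De Valk describes might look like the following; the baseline CTR curve and the boost formula are illustrative assumptions, not Google's actual figures.

```python
# Illustrative baseline: the share of clicks a result is "expected" to get
# at each position on a ten-result page (assumed values, not Google's).
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.02, 9: 0.02, 10: 0.01}


def ctr_adjusted_score(base_score: float, position: int,
                       impressions: int, clicks: int) -> float:
    """Boost (or demote) a result whose observed CTR beats (or misses)
    the CTR expected for the position it is currently shown at."""
    observed_ctr = clicks / impressions if impressions else 0.0
    ratio = observed_ctr / EXPECTED_CTR[position]
    return base_score * ratio


# A result at position 5 that draws a 10% CTR (double the expected 5%)
# ends up scored above an otherwise stronger result at position 3
# that only achieves the CTR expected for its slot.
print(ctr_adjusted_score(base_score=0.60, position=5, impressions=1000, clicks=100))  # 1.2
print(ctr_adjusted_score(base_score=0.70, position=3, impressions=1000, clicks=100))  # 0.7
```

The design problem this toy example makes visible is the one the article describes: nothing in a click-based signal checks whether the page being clicked is accurate.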

Search marketing consultant Rishi Lakhani said: "Though Google doesn't like to admit it, it does use CTR (click through rate) as a factor. Various tests I and my contemporaries have run indicate that. The hotter the subject line the better the clicks, right?"

It is well known that Google includes user-behaviour signals to evaluate its ranking algorithms. Google has an obvious interest in whether users like its search results. Its ranking engineers look at live traffic frequently to experiment with different algorithms. User behavior signals have the added advantage of being difficult to model, or reproduce, by unscrupulous web publishers who want to game the algorithm. 

The unfortunate side effect is that user-behaviour signals also reward fake news. Previously, Google's ranking relied more heavily on how authoritative a page is, and how authoritative the incoming links to that page are. If a page at Oxford University links to an article published by Harvard, Google would rank that information highly in its display of search results.

Now, the ranking of a page can also be boosted by how often it's clicked on even if it does not have incoming links from authoritative sources, according to Larry Kim, founder and chief technology officer of WordStream, a search marketing consultancy.

The result of all this is that "user engagement" has become a valuable currency in the world of fake news. And people who believe in conspiracy theories — the kind of person who spends hours searching for "proof" that Hillary Clinton is a child abuser, for instance — are likely to be highly engaged with the fake content they are clicking on.

Thus even months after a popular fake news story has been proven to be fake, it will still rank higher than the most relevant result showing that it's false, if a large enough volume of people are clicking on it and continuing to send engagement signals back to Google's algorithm.

Here are some examples.

President Obama has never signed an order banning the US national anthem, and yet ...

[Screenshot: Google search results for "Obama signs a nationwide order"]

And Hillary Clinton has never sold weapons to ISIS, but ...

[Screenshot: Google search results for "Hillary Clinton sold weapons to ISIS"]

De Valk says: "I think the reason fake news ranks is the same reason why it shows up in Google’s autocomplete: they’ve been taking more and more user signals into their algorithm. If something resonates with users, it’s more likely to get clicks. If you’re the number three result for a query, but you’re getting the most clicks, Google is quick to promote you to #1 ... Nothing in their algorithm checks facts."

Google never explains in full how its algorithm works but a spokesperson for the company said:

"When someone conducts a search, they want an answer, not trillions of webpages to scroll through. Algorithms are computer programs that look for clues to give you back exactly what you want. Depending on the query, there are thousands, if not millions, of web pages with helpful information. These algorithms are the computer processes and formulas that take your questions and turn them into answers. Today Google’s algorithms rely on more than 200 unique signals or 'clues' that make it possible to show what you might really be looking for and surface the best answers the web can offer."

Kim traces the changes to October 2015, when Google added a machine-learning program called "RankBrain" to the list of variables that control its central search algorithm. Google deployed the system to all search results in June 2016.

Is this machine learning's fault? "I’m certain of this," Kim said. "This is the only thing that changed in the algorithm over the last year."

RankBrain is now the third most important variable in Google's search algorithm, according to Greg Corrado, a senior research scientist at Google.

The change was intended to help Google make more intelligent guesses about the 15% of new daily search queries that Google has never encountered before. RankBrain considers previous human behaviour, such as the historical popularity of similar searches, in its attempt to get the right answer.

Kim told us: "The reason why they did this was not to create fake news. They created this because links can be vulnerable, links can be gamed. In fact it is so valuable to have these number one listings for commercial listings that there’s a lot of link fraud. User-engagement [looking at how popular a search result is] sees through attempts to inflate the value of content." 

Kim's opinion is disputed by his peers. Lakhani doubts that RankBrain is the sole cause of the proliferation of fake news. "It's been tested on and off for a while," he says.

De Valk is not so sure either. "I'm not sure it's related to that. It might be, but I'm not sure. Google does hundreds of updates every year," he told Business Insider.

Naturally, the type of content that is more likely to get clicked on is also more likely to get shared, commented on, and liked on Facebook. And Facebook and Google both reward engagement (or popularity, which gives off similar signals). That pushes an item higher on the newsfeed in Facebook's case, and on the search results page in Google's case. The performance of fake news on Google is correlated to its performance on Facebook because they both deal in the same currency — user engagement. So what does well on Facebook often does well on Google. 

This is correlation, not causation.

But the fact that the two of them are occurring at the same time exacerbates the high-level presence of fake news generally. Google and Facebook dominate the amount of time people spend online and inside apps.

The changes at Google help explain why fake news has suddenly gone from circulating in a tiny corner of the internet to something that outperforms real news, influences Google's predictive search, and has real-world consequences — such as the Comet Ping Pong shooting, carried out by a man who was convinced from his internet searches that Clinton was using the restaurant as a front for a child abuse ring. Nearly 50% of Trump voters believe the same thing, according to research by Public Policy Polling.

Author:  Hannah Roberts

Source:  http://www.businessinsider.com/


It’s a zoo out there. For marketers and advertisers alike, it’s hard to control your organic search rankings with all the Google updates to keep track of: Pandas, Penguins, Pigeons and Hummingbirds. We know it’s important, but why?

In a 2015 study done by Eli Swartz, Google dominated other search engines like Yahoo, Bing and Duck Duck Go by a landslide. Seventy-five percent of respondents to a survey stated that Google was their primary search engine, and in 2016 that percentage has only risen.

So what does that mean for local and organic search, or search in general? For one, it means that SEO practitioners have to continue to abide by the rules each algorithm update puts forth. It also means search engine marketers have to take into account Google’s new local algorithm update, Possum. Here’s why.

About Possum

You may not have seen a drastic change in your local organic listings in September, but a recent study shows that Google’s Possum Algorithm changed sixty-four percent of local SERPs. The debut of Possum in September impacted website rankings in the local 3-pack and Local Finder. The biggest impact Possum currently has on search results is filtering your business out if it has a duplicate, similar or second listing. This update runs separately from organic SERPs, and affects the following types of businesses:

  1. Businesses outside of city limits.
  2. Separate business locations at the same address as a similar business.
  3. Two or more businesses owned by the same company.

Playing Possum Outside of City Limits

One of the biggest and most beneficial changes seen with the new Possum update is improved rankings for businesses outside of their own city limits. With the algorithm in place, businesses that are attempting to rank in the local 3-pack or Local Finder for the next town over are having an easier time doing so, and may have already seen rankings in those areas increase drastically. With Possum in place, there is more need than ever for search engine marketers to go local with their SEO campaigns. Going local with your SEO will help increase traffic and revenue to your site, as well as improve local rankings and authority in the SERPs.

Separate Business Locations at the Same Address as a Similar Business

This is where Possum comes into full effect. Businesses in similar industries located in the same building will begin to be filtered out and ultimately won’t show up for the same search. Keyword variation plays a more important role as this algorithm continues to make its way across search. That means that attorneys, lawyers, dentists, and chiropractors located in the same building will rank locally for different keywords than a similar or competing business at the same location.
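
Google has not disclosed how the filter actually works, but the behaviour described here, collapsing listings that share an address and a category so that only the strongest one is shown, can be sketched roughly as follows. The listing fields and the grouping rule are assumptions for illustration only.

```python
from collections import OrderedDict

# Hypothetical local listings: two attorneys in the same building,
# plus a dentist at the same address and an attorney across town.
listings = [
    {"name": "Smith & Co Attorneys", "address": "100 Main St", "category": "attorney", "rank_score": 0.92},
    {"name": "Jones Legal",          "address": "100 Main St", "category": "attorney", "rank_score": 0.88},
    {"name": "Main St Dental",       "address": "100 Main St", "category": "dentist",  "rank_score": 0.81},
    {"name": "Westside Law",         "address": "42 Oak Ave",  "category": "attorney", "rank_score": 0.79},
]


def filter_similar_listings(listings):
    """Keep only the strongest listing per (address, category) pair,
    mimicking the way similar businesses in one building get filtered."""
    best = OrderedDict()
    for listing in sorted(listings, key=lambda l: l["rank_score"], reverse=True):
        key = (listing["address"], listing["category"])
        best.setdefault(key, listing)   # the weaker duplicate is "filtered", not deleted
    return list(best.values())


for shown in filter_similar_listings(listings):
    print(shown["name"])
# Jones Legal drops out of this result set: it shares an address and a
# category with a stronger listing, so it only surfaces for other queries.
```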

One Algorithm, One Parent Company and Two Businesses

Although not as apparent as in the other two scenarios, Possum’s update has also affected separate businesses owned by one company. One business remains filtered out of all searches for certain keyword terms while the other continues to show up in the search results. While there isn’t a way around the filter (yet), we’re trusting Google to test, tweak and update its newest algorithm to differentiate the two businesses as separate rooftops, even if they have the same parent company.

Streams Kick Start Step: If you haven’t already invested in link building, you should start. Google still views quality links that point to your website as a vote of confidence. The more local, quality and authoritative links you have pointing back to your site, the more likely you are to rank in the local SERPs.

Taking Action

While Google still seems to be working out and testing its newest algorithm, it’s always a good idea to stay ahead of the next update. One thing that is extremely beneficial for your local SEO strategy is to start incorporating local content and putting a heavy focus on off-page SEO if you haven’t already. Together, local content and off-page SEO are a recipe for maintaining and improving your local rankings.


Source : http://www.business2community.com/

Author : Keisha James


On 25 October, the German chancellor, Angela Merkel, wandered into unfamiliar territory – at least for a major politician. Addressing a media conference in Munich, she called on major internet companies to divulge the secrets of their algorithms on the grounds that their lack of transparency endangered public discourse. Her prime target appeared to be search engines such as Google and Bing, whose algorithms determine what you see when you type a search query into them. Given that, an internet user should have a right to know the logic behind the results presented to him or her.

“I’m of the opinion,” declared the chancellor, “that algorithms must be made more transparent, so that one can inform oneself as an interested citizen about questions like, ‘What influences my behaviour on the internet and that of others?’ Algorithms, when they are not transparent, can lead to a distortion of our perception; they can shrink our expanse of information.”

All of which is unarguably true. We know that search results – and social media news feeds – are assembled by algorithms that determine the websites or news items likely to be most “relevant” for each user. The criteria used for determining relevance are many and varied, but some are calibrated by what your digital trail reveals about your interests and social network and, in that sense, the search results or news items that appear in your feed are personalised for you. But these powerful algorithms, which can indeed shape how you see the world, are proprietary and secret, which is wrong. So, Merkel argues, they should be less opaque.

QED? Sadly, no. I hold no brief for Google or Facebook, but simply making their algorithms transparent would do more harm than good. The reason is that search results or news feeds could then be “gamed” by external operators whose objectives might be even more questionable – and would certainly be more opaque – than those of Google and Facebook.

That doesn’t mean that the companies are squeaky clean, by the way. In fact, at the moment, the European commission is trying to decide if Google is abusing its monopoly of search to favour its own commercial interests. But at least we know what its motives are. On the other hand, if hackers of the Russian security service, say, were able secretly to manipulate your search results then you might conclude that transparency was overrated.

As a slogan, “transparency” sounds good. Like the saying “sunlight is the best disinfectant”, it gives one a warm feeling, even if it’s baloney. But in the digital arena, at least, transparency is not necessarily the best way to achieve the accountability that Merkel rightly craves.

Just imagine for a moment that she were able to compel Google to publish its PageRank algorithm, the one that decides which pages are most relevant to your search query. Once upon a time, when Google was conceived by Sergey Brin and Larry Page, PageRank was probably a fairly compact program. Now, it’s an amalgam of hundreds, perhaps thousands, of individual modules expressed as many thousands, perhaps millions, of lines of computer code. Publishing it might indeed enable hackers armed with the right tools to find exploitable weaknesses in the code, but it wouldn’t do much to help the average citizen to figure out if search results were being skewed in some sinister or unscrupulous way.
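
For a sense of how compact that original formulation was, here is a minimal sketch of the classic PageRank power iteration run on a toy link graph; the damping factor and graph are illustrative, and today's ranking system layers hundreds of other signals and modules on top of anything like this.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the pages it links to; returns a score per page."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: spread its rank evenly
                for other in pages:
                    new_rank[other] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank


# Toy web: a, b and c link among themselves; d only links to a.
toy_web = {"a": ["b", "c"], "b": ["a"], "c": ["a", "d"], "d": ["a"]}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
# "a" ends up with the highest score because the most rank flows into it.
```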

So just publishing secret stuff doesn’t do the trick. In a way, this is the hard lesson that WikiLeaks learned. At the beginning, its animating philosophy was that if you published information that powerful and secretive organisations would prefer to keep private, then good things would happen. So WikiLeaks did indeed publish such material. But except in a few cases, nothing much happened, which is why, in the end, Julian Assange decided that the way forward was first of all to create editorial material such as the “collateral murder” video and, later, to team up with established journalistic outfits such as the Guardian, New York Times, Le Monde, El Pais and Der Spiegel to release a huge trove of US diplomatic cables.

But the German chancellor has put her finger on an important problem. Decision-making algorithms are already shaping our culture, our commerce and maybe even our politics. For the most part, they are opaque and those who deploy them are therefore unaccountable. In the long run, this is intolerable – for liberty, privacy, equity and maybe even for democracy. The good news is that the problem is not insoluble. But there’s no single solution, no magic bullet. Calling for transparency won’t do it. Sometimes, publication will be the answer; at other times, it will take a muscular inspection regime or legislative changes to enforce legal liability. But before we can get started on solutions, we need first to acknowledge that we have a problem. So two cheers for Angela Merkel!

Source:  theguardian.com


One of the biggest buzzwords around Google and the overall technology market is machine learning. Google uses it with RankBrain for search and in other ways. We asked Gary Illyes from Google in part two of our interview how Google uses machine learning with search.

Illyes said that Google uses it mostly for “coming up with new signals and signal aggregations.” So they may look at two or more different existing non-machine-learning signals and see if adding machine learning to the aggregation of them can help improve search rankings and quality.

He also said, “RankBrain, where … which re-ranks based on based on historical signals,” is another way they use machine learning, and later explained how RankBrain works and that Penguin doesn’t really use machine learning.

Danny Sullivan: These days it seems like it’s really cool for people to just say machine learning is being used in everything.

Gary Illyes: And then people freak out.

Danny Sullivan: Yeah. What is it, what are you doing with machine learning? Like, so when you say it’s not being used in the core algorithm. So no one’s getting fired. The machines haven’t taken over the algorithm, you guys are still using an algorithm. You still have people trying to figure out the best way to process signals, and then what do you do with the machine learning; is [it] part of that?

Gary Illyes: They are typically used for coming up with new signals and signal aggregations. So basically, let’s say that this is a random example and [I do] not know if this is real, but let’s say that I would want to see if combining PageRank with Panda and whatever else, I don’t know, token frequency.

If combining those three in some way would result in better ranking, and for that for example, we could easily use machine learning. And then create the new composite signal. That would be one example.

The other example would be RankBrain, where… which re-ranks based on based on historical signals.

But that also is, if you, if you think about it, it’s also a composite signal.

It’s using several signals to come up with a new multiplier for the results that are already ranked by the core algorithm.

What else?

Barry Schwartz: Didn’t you first use it as a query refinement? Right? That’s the main thing?

Gary Illyes: I don’t know that … ?

Barry Schwartz: Wasn’t RankBrain all about some type of query understanding and…

Gary Illyes: Well, making sure that for the query we are the best possible result, basically, it is re-ranking in a way.

Barry Schwartz: Danny, did you understand RankBrain to mean, maybe it was just me, to mean, alright someone searched for X, but RankBrain really makes [it] into Xish? And then the queries would be the results.

Danny Sullivan: When it first came out, my understanding was [that] RankBrain was being used for long-tail queries to correspond them to short short answers. So somebody comes along and says, Why is the tide super-high sometimes, when I don’t understand — the moon seemed to be very big, and that’s a very unusual query, right? And Google might be going, OK, there’s a lot going on here. How do [we] unpack this and to where, and then getting the confidence and using typical things where you’d be like, OK, we’ll see if we have all these words you have a link to whatever. Meanwhile, really what the person is saying is why is the tide high when the moon is full. And that is a more common query. And Google probably has much more confidence in what it’s ranking when it deals with that, and my understanding [is that] RankBrain helped Google better understand that these longer queries corresponded basically to the shorter queries where it had a lot of confidence about the answers.

That was then, that was like what, a year ago or so? At this point, Gary, when you start talking that re-ranking, is that the kind of the re-ranking you’re talking about?

Gary Illyes: Yeah.

Danny Sullivan: OK.

Barry Schwartz: All right. So we shouldn’t be classifying all these things as RankBrain, or should we? Like it could be other machine learning.

Gary Illyes: RankBrain is one component in our ranking system. There are over 200, as we said in the beginning, signals that we use, and each of them might become, like, machine learning-based.

But when you or I don’t expect that any time soon or in the foreseeable future all of them would become machine learning based. Or that’s what we call the core algorithm would become machine learning-based. The main reason for that is that debugging machine learning decisions or AI decisions, if you want, if you like, is incredibly hard, especially when you have … multiple layers of neural networks. It becomes close to impossible to debug a decision. And that’s very bad for us. And for that we try to develop new ways to to track back decisions. But if it can easily obfuscate issues, and that would limit our with our ability to improve search in general.

Barry Schwartz: So when people say Penguin is now all machine learning-based…

Gary Illyes: Penguin is not ML.

Barry Schwartz: OK, there’s a lot of people saying that Penguin [is] machine learning-based.

Gary Illyes: Of course they do. I mean if you think about it, it’s a very sexy word. Right. And if you publish it…

Danny Sullivan: People use it in bars and online all the time. Like hey, machine learning. Oh yeah.

Gary Illyes: But basically, if you publish an article with a title like machine learning is now in Penguin or Penguin generated by machine learning it’s like…. But if you publish an article with that title it’s much more likely that people could click on that title, and well, probably come up with the idea that you are insane or something like that. But it’s much more likely they would visit your site than if you publish something with a title Penguin has launched.

Source : searchengineland


First there were Panda and Penguin. Now, Google will release a mobile-friendly update on April 21. This update promises to be even wider-reaching than both of the animal-inspired updates that valued high-quality content.


Understanding the Scope of Google’s “Mobilegeddon” Update

Google’s new update promises to be a game changer. The algorithm will rank mobile-friendly sites higher than non-mobile-friendly ones. Many webmasters from around the world are (rightfully) anxious about its release since it could significantly impact traffic.

From a writer’s perspective, the update gives us something to think about as well. Does this mean we need to learn a whole new way to create web content?

There is no getting around the fact that your website must be mobile.

Before Panda and Penguin made their debuts, it was fairly easy to rank a website at the top of the search results by indiscriminately stuffing a particular keyword. These updates crippled a number of websites because they depended on that tactic to gain traffic.

The Mobilegeddon promises to do the same for webmasters who have neglected optimization for mobile browsers. This could be potentially devastating to some reaches of the Internet. Google has already stated that there will be no middle ground. Your site will either be mobile friendly or not. This could mean an entire reworking of site architecture and the content contained therein. This is of utmost importance to us as webmasters, writers, and marketers.

Content Production for the Mobilegeddon

Get ahead of this potentially game-changing update. Although it isn’t in effect yet, start thinking about how writing for a mobile site differs from writing for desktop. There is going to be a series of changes that content producers should heed if they intend to keep producing high-quality, compelling content after the update has rolled out. Read this Search Engine Land post that offers three actions to prepare your website for the impending update.

From what we know about the update, it’s likely that we will have to make changes to our content production habits. Here are a few tactics that will help:

1. Curtail Headline Length

User experience on a mobile device is different from that in a desktop browser. One of the most obvious differences is the change in screen size (and the amount of usable real estate). Currently, a headline can stretch across the full banner-length of a browser, but mobile screens change the game when it comes to headline width.

 

What this Means for Us: Create shorter headlines. For Twitter users, it just means that you can practice your 140-character limit more often. For those of us who don’t use this particular social media network, now is a good time to start. We need to learn how to condense page-width headlines into more bite-sized chunks, without sacrificing the impact potential of our headline.

2. Make Shorter Paragraphs

“Snackable content” is something that content producers are all too aware of, but is especially important for mobile optimization. Create content that the user can consume in one sitting. However, the format in which we present this content is likely to be as bite-sized as the content itself. Because of short attention spans and aversions to “walls of text” it’s likely that mobile users would feel put upon when it comes to dealing with paragraphs that fill their entire screen.

What this Means for Us: Learn to summarize your ideas. Keep to the point and make your copy more targeted in nature. In some cases, such as home pages, reduce the amount of copy there altogether. Increased copy gives the user a hard time and makes for difficult reading, especially on a tiny display. Get your message across in short bursts.

3. Fewer Words, More Action

In 1984, Orwell invented a form of the English language called “Newspeak,” in which words were combined and unnecessary or frivolous words that didn’t serve a purpose were removed. This mobile update is likely to make content producers do the same, paring content down to be less wordy while at the same time interspersing calls to action. Condensing content will require us to consider what we write and distill the message in as few words as possible.

What this Means for Us: Rethink the methodology for creating content. In addition to making content compelling and benefit-focused, we must also now take a look at the number of words we use and how often we call to action. It could possibly mean a change in the basic tenets of web writing.

The exception is blog content – it will always rank and read better in long form – but for your home and main pages, less content means a better mobile experience and happier readers.

 

Preparing for the Mobilegeddon Now

Luckily, this change does not require us to find a fallout shelter to survive. Writing habits just need to be carefully considered.

You may need to review your web writing and revamp some marketing approaches accordingly to align with what is expected from mobile-friendly sites.

Source : searchenginejournal
