William Smith

When you hear the words "penguin" or "panda," you likely think of the adorable animals you saw during your last trip to the zoo. If you are familiar with Google’s constant algorithm updates, however, these terms mean far more than adorable black-and-white animals.

Google algorithm updates can have a detrimental impact on your website. If you ever notice that your website’s search engine ranking position has suddenly plummeted, Penguin or Panda could be the culprit. And chances are, you haven't received any warning or notification from Google.

I was brought onboard to help a $1 billion company recover from a major Google penalty -- and I have helped many companies since. Once its site was dinged by Google, the company saw its rankings plummet, which directly impacted its bottom line.

My goal in this article is to teach you, much like I taught that company, how to identify whether your website has been negatively impacted by a Google penalty and what steps you can take to recover. I'll also cover proactive measures you can take during this tough time so you don’t dwell on the negatives -- but actually go on to grow your business.

What exactly are Penguin and Panda?

Jennifer Slegg of thesempost.com wrote a great article on “Understanding Google Panda.” Slegg explains Google Panda as "one of the search engine's ranking filters that seeks to downrank pages that are considered low quality. Sites with higher quality and valuable content rise in the search results." But it is easily one of the most misunderstood algorithms. Searchmetrics provided insight on the different ways you could have been hit by a Panda algorithm update and how to prevent it.

Penguin, according to Search Engine Land, launched in April 2012 to "better catch sites deemed to be spamming its search results, in particular those doing so by buying links or obtaining them through link networks designed primarily to boost Google rankings."

Barry Schwartz reported in January 2016 that Panda is now baked into Google's core ranking algorithm. Penguin is expected to follow suit, according to Schwartz: “We know the next Penguin algorithm should be the real time version and with real time algorithms, they don't get pushed out on occasion, instead, they run all the time. So new Penguin penalties and recoveries will happen all of the time.”

My issue with Google’s algorithm updates

Google has to make sure webmasters don’t manipulate its core search algorithm. Otherwise, undeserving websites would rise to the top of the search results.

If there were no penalties for insider trading, would people reap the benefits of secret information about publicly traded companies? Absolutely. This isn’t to say that the SEC catches everyone, but traders who try to manipulate the system know there is a penalty and a risk if they get caught.

The same is true for Google regarding how they monitor their search results. If they see a website trying to take advantage of their algorithm, they need to take action.

My big issue is the lack of transparency. There have been many instances when small business owners have approached me about their inability to rank anywhere on Google for keywords related to their local business.

After doing research, I would discover the reason their business was unable to rank was terrible links built to their site or duplicative content. The worst part is that the business owner would have no idea this even took place. Google only sometimes informs webmasters via its Search Console that their site has received a “manual action.” In many instances there is no warning about a penalty at all, and it takes someone with SEO knowledge and expertise to discover it.

Going back to the insider trading scenario, imagine if your financial advisor picked a stock based on inside information: his buddy worked for a publicly traded company and knew it was going to crush its upcoming earnings. The advisor made this trade without your knowledge. Now imagine the SEC came knocking on your door with a subpoena for insider trading. Would that be fair? I don’t think so!

Google must become more transparent and inform everyone whether their site is being held back by a past action that violated its guidelines. A business owner might have signed up for a $50 backlink package from an offshore vendor after hearing the pitch of “guaranteed first page results.” If you are on a shoestring budget and don’t have the time or expertise for SEO, this might sound like a good option, right?

*Note: Don’t ever sign up for an SEO package that offers “guaranteed first page results.” This is an unrealistic promise, and such services are more likely to hurt your site than help it.

These business owners often don’t have the slightest clue that these actions will negatively impact their search rankings. An attorney in Phoenix might have thought it was a good idea to create numerous local pages throughout the entire state of Arizona to get more exposure in other cities, even though he didn’t have an office in those locations. Little did he know he would get dinged by Panda, since the content was duplicative.

If Google would just inform business owners and webmasters of an action it deems detrimental, webmasters would learn their lesson and fix the issue. Instead, with the lack of transparency, business owners have no clue how to fix the issue, and this can kill their business.

How to recover when Google has dinged your site

Below are some best practices to see whether or not your site has been hit by a Google penalty. Each website and each scenario is different, so I always advise consulting with an expert.

1. Check Search Console.

Within your Google Search Console dashboard, you will see a “Search Traffic” tab. Within this tab, select “Manual Actions.” This will show you whether Google has directly flagged your site with a webspam action.

2. Check your backlink profile (Penguin related).

If your website is connected to Google Search Console (formerly Google Webmaster Tools), you can “Download your latest link report.” This will show you all of the backlinks that have been built to your site. If you notice a lot of low-quality links on spammy websites, this is a clear-cut sign that you could be suffering from a link-based algorithm update.

My recommendation is to go through each and every one of the links and add any link you deem low quality to an Excel file. Run the list you accumulate by an SEO expert. If you confirm that your links need to be disavowed, you’ll need to create a .txt file and upload your links through this process in Google’s Search Console. Marie Haynes, who writes for Moz, provided a great guide to using Google’s disavow tool.
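As a rough sketch of that last step, here is a minimal Python example, assuming your flagged links live in a hypothetical bad_links.csv export of that Excel file. It writes a disavow file in the format Google’s disavow tool accepts: lines beginning with # are comments, and domain: entries (or full URLs) name what to disavow. Have an expert confirm the list before uploading anything, since disavowing good links can hurt.

```python
import csv
from urllib.parse import urlparse

# Hypothetical export of the Excel list described above: one URL per row.
# Collect the domain of each flagged link.
domains = set()
with open("bad_links.csv", newline="") as f:
    for row in csv.reader(f):
        if row and row[0].startswith("http"):
            domains.add(urlparse(row[0]).netloc)

# Disavow file format: "#" lines are comments; "domain:" entries disavow
# every link from that domain (individual URLs are also accepted).
with open("disavow.txt", "w") as out:
    out.write("# Spammy domains identified in the link audit\n")
    for domain in sorted(domains):
        out.write("domain:" + domain + "\n")
```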

3. Analyze the content on your website (Panda related).

Is there duplicative content on your website? Could Google be seeing some of the pages you created as low quality and manipulative? It can be very tricky to detect whether your site has been dinged by a content-based algorithm, often referred to as Panda.

I would recommend sharing your site with someone knowledgeable in search engine optimization and digital marketing so they can analyze whether the content may seem manipulative or duplicative. If you have a ton of pages on your website, you can use a tool such as Screaming Frog to crawl your site and easily review the content structure of each page. A quick scripted first pass, sketched below, can also surface exact duplicates.
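Here is a minimal sketch of such an exact-duplicate check. It assumes you already have each page’s visible text (the pages dict below is made up for illustration); catching near-duplicates would need fuzzier techniques such as shingling.

```python
import hashlib
from collections import defaultdict

# Hypothetical crawl output: URL -> visible page text. In practice you
# would export this from a crawler rather than hard-code it.
pages = {
    "/phoenix-lawyer": "We are attorneys serving the Phoenix area...",
    "/tucson-lawyer": "We are attorneys serving the Phoenix area...",
    "/contact": "Call our office to schedule a consultation.",
}

# Group URLs by a hash of their normalized text; any group with more
# than one URL is an exact duplicate worth reviewing.
by_hash = defaultdict(list)
for url, text in pages.items():
    digest = hashlib.sha256(text.strip().lower().encode()).hexdigest()
    by_hash[digest].append(url)

for urls in by_hash.values():
    if len(urls) > 1:
        print("Possible duplicate content:", urls)
```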

4. Look at the change history.

Each year, Google changes its search algorithm between 500 and 600 times. While most of these changes are minor, Google occasionally rolls out a major algorithmic update -- such as Google Panda and Google Penguin -- that affects search results in significant ways.

Moz has listed the major algorithmic changes that have had the biggest impact on search. If you notice that your site dropped in rankings right around one of the dates outlined in Moz’s change history, it will be easier to identify the penalty that took place.

Focus on growing your business.

Too often, I see business owners obsessing over a Google penalty. And they are right to be concerned -- it could be dramatically impacting their bottom line. Make sure you take the appropriate actions to clean up your website so that the next time Google runs a major algorithm change -- or as updates move toward real time -- your site has a strong likelihood of recovering.

I also recommend trying to strengthen your business during this difficult time. Attend more networking events if your organic leads have dropped. Get more involved in public speaking and seminars related to your industry. Write more informative articles in which you share your industry expertise to drive high-quality referral traffic back to your site. Invest in social media marketing and Google AdWords to discover what message will generate more leads at a profitable price point.

By taking all of the aforementioned proactive steps, by the time your site recovers from one of Google’s core algorithm updates, your business will have strengthened. At the end of the day, that is what Google wants to see: natural traffic coming to and linking to your site based on the different marketing and business initiatives taking place.

Source:  https://www.entrepreneur.com/article/277467

The world’s highest-capacity undersea Internet cable, a 9,000-kilometre link between the US and Japan backed by Google, has been activated.

The fibre cable, which can transport data at 60 terabits per second (60 million Mbps), is expected to be a significant boost to trans-Pacific Internet speeds.

Google is one of six companies behind the project, alongside Asian telecoms groups. Google’s Urs Holzle said it had “more [capacity] than any active subsea cable.”


Demand for faster Internet speeds and extra capacity is increasing as more devices go online and cloud Internet services, in which Google is a significant player, continue to grow.

Almost all international Internet traffic runs via undersea cables, but in the Internet’s earlier days much of the traffic was via satellites.

The $390 million “Faster” cable system will connect to hubs in Los Angeles, San Francisco, Portland and Seattle, and two points in Japan.

Google is also backing a project to build a cable between Florida and Brazil due to be finished by the end of the year, while Microsoft and Facebook recently announced a trans-Atlantic cable between Virginia Beach and Bilbao in Spain.

“Faster is one of just a few hundred submarine cables connecting various parts of the world, which collectively form an important backbone that helps run the Internet,” Holzle said.

Laying the cable under the sea requires specially designed ships, which can lay up to 125 miles of cable per day. Although the optical fibres that transport data are extremely thin, the cables have to be reinforced with layers of tubing, steel wires and plastic to prevent damage.

The cables have become a target for shark attacks, with sharks possibly drawn to the electromagnetic signals running through them. Shark attacks forced Google to reinforce parts of its cable infrastructure with a special Kevlar coating in 2014.

Source:  http://news.nationalpost.com/news/world/google-launches-9000-km-long-undersea-internet-cable-between-japan-and-the-us

Nearly a year ago, Google expanded its search engine to begin instantly answering questions, such as when a celebrity died or the solution to a math problem. The change was a reaction to the true nature of search: nobody types something into Google without actively seeking an answer.

That answer may be a particular piece of content, a few different pieces of content, or simply a particular website, but you are asking the question: "Where is this thing I want?" The instant responsiveness of Google and its ability to query an index of the entire Internet has made other sites take notice.

That's why sites like Periscope, Medium, Vevo and Hacker News have adopted Algolia's hosted cloud search platform, an API that brings Google-style instant to near-instant search capabilities to their sites. The result is that their content is immediately searchable and relevant, so that if a user makes a complex query and/or a typo, they will still receive results that make sense for what they're looking for. "By leveraging the trove of internal data that websites or mobile apps have, we are helping them to deliver an experience that is even deeper and more personalized than what Google does for web searches," said Nicolas Dessaigne, CEO of Algolia. "Our goal is to make search seamless, nearly invisible.

"Today we can deliver relevant results at the first keystroke. In the future, all results will be personalized and delivered before the question is even completely formulated." This is an important approach for businesses large and small to take, and it closes in on AI; Algolia's technology works to not just index and search your data, but also to make sure that it produces the right answer to a query.

This is an interesting comparison to the ever-growing world of the Internet of Things, led by Amazon's Echo. Users, despite their accents, stuttering or other things that make a question "imperfect," are still able to get an answer. Algolia, its competitor Elastic and Google all recognize this; Algolia in particular even invites visitors to its website to try a test search with a typo, to show how the platform can answer the question regardless. Google will even go as far as to suggest what you may be trying to type, if not bring you the exact answer despite your mistake.
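To make typo tolerance concrete, here is a toy sketch -- a conceptual illustration, not Algolia's or Google's actual implementation -- that scores candidate records by edit distance to the query, so a misspelled query still lands on the right record. The records list is invented.

```python
# Levenshtein edit distance via dynamic programming: the number of
# single-character insertions, deletions and substitutions needed to
# turn one string into the other.
def edit_distance(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                # delete from a
                           cur[j - 1] + 1,             # insert into a
                           prev[j - 1] + (ca != cb)))  # substitute
        prev = cur
    return prev[-1]

# A made-up index of records: the closest record wins, so a query full
# of typos still finds "search engines".
records = ["search engines", "social media", "seminars"]
query = "serch engnes"
print(min(records, key=lambda r: edit_distance(query, r)))
```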

As Quartz's Leo Mirani said, there are over 10 trillion web pages to index, including but not limited to the masses of social media services pouring terabytes if not petabytes of information into the stream. This is the same problem that many startups and companies will begin to face, both from the angle of big-data overload and from the expectations of the user.

The instantaneous nature of search may make users unlikely to even browse the same way, as we move away from the original web's exploratory nature to people visiting each website with a purpose. In the same Quartz article, Mirani speaks to author Stefan Weitz, who wrote the book Search: How The Data Explosion Makes Us Faster, where Weitz argues that search must mature to mirror human nature, and be ready to answer a query at speed.

 "We must think of search as the omniscient watcher in the sky, aware of everything this happening on the ground below," said Weitz. "For this to happen, search itself needs to be deconstructed into its component tasks: indexing and understanding the world and everything in it; reading senses, so search systems can see and hear (and eventually smell and touch!) and interact with us in more natural ways; and communicating with us humans in contextually appropriate ways, whether that's in text, in speech, or simply by talking to other machines on our behalf to make things happen in the real world."

To Algolia's Dessaigne, this approach is a natural course. "Personalization of results is also going to be an important trend for websites and apps, particularly among big retailers and media websites. Along this progression, voice interfaces are going to gain traction. We are still far from truly conversational interfaces, but we'll eventually get there."

While we all dream of a day when we can have an answer as we speak, or even think of the question, we are far away from it. Nevertheless, startups are clearly ready to make the jump for us. We're in a world that's far from the days when having a search bar was a quirky feature; users have a question and to succeed in business, you'll need to have an answer.

Source:  http://www.inc.com/amy-cuddy/3-body-language-books-that-all-leaders-should-read-this-summer.html

It is no secret that trade secrets are essential to the success of many businesses. Nearly all of us are aware of examples of highly treasured trade secrets, such as the recipe for Coca-Cola, the formula for WD-40 and the methodology for creating the nooks and crannies in Thomas’ English Muffins. Less well known is the growing importance of trade secrets and the role they play in our economy, including their impact on the patent system.

Trade secrets are by definition valued by their owners. Secrecy preserves a source of competitive advantage that generally cannot be recovered once secrecy is lost. Although secrecy may be lost inadvertently through carelessness, disclosure is often the result of wrongdoing and sometimes even the actions of sophisticated international industrial espionage.

Trade-secret laws allow for the recovery of monetary damages from, or even criminal penalties against, those who steal trade secrets. Annual losses to the U.S. economy from international theft of trade secrets have been estimated to exceed $300 billion. An effective system protecting trade secrets thus enhances the economy and promotes national security.

Our government recently enacted the Defend Trade Secrets Act (DTSA), a milestone for trade-secret protection. The DTSA creates a federal civil cause of action so that companies will no longer need to navigate a maze of state laws to enforce their rights when their trade secrets are stolen. In addition, the DTSA enhances remedies available to victims — in particular by providing for ex parte seizure orders under appropriate circumstances to limit further disclosure of the trade secrets. The DTSA is a great achievement that should both deter potential wrongdoers and ease the burden of enforcement on those that have been wronged.

The DTSA is also a “wake-up call” to companies that value and protect their intellectual property as trade secrets. Companies now have increased incentive to identify, capture, inventory and protect their potential trade-secret information. Failure to do so now could forfeit the ability to leverage the DTSA downstream.

Technology and markets also are trending toward increasing importance of trade secrets. The commoditization of computer hardware drives innovation into computer-implemented software. The implications extend well beyond the computer and information technology industry because, in addition to being directly incorporated in devices such as medical diagnostics and automobiles, computer software technology is used in the development and design of virtually everything today — from computational biology to 3D printing to computer-aided design.

More and more innovation is moving behind the “firewall,” meaning that it is invisible to the public (e.g. digital 3D designs now most often reside out of reach in the cloud, rather than on a local disk drive where they would be subject to reverse engineering).

Ironically, while this trend toward reliance on secrecy is rational behavior, without changes to our patent system, it may have the unintended consequence of slowing the overall pace of innovation. Trade secrets and patents are different forms of intellectual property, both of which play an important role in a comprehensive intellectual property strategy. Secrecy protects innovation by reducing the likelihood that others possess it. On the other hand, patent protection requires the publication of a complete written description of a patented invention, thereby enhancing public knowledge and reducing the need to “reinvent the wheel.”

Unfortunately, recent court decisions have dramatically narrowed the eligibility of computer-implemented inventions for patenting. As a result, companies developing software-centric solutions are likely to rely more heavily on trade secrets to protect product innovations that can no longer be patented.

Companies may even choose to market their innovations in a way that favors trade-secret protection, i.e. by delivering innovations as services rather than products to avoid bringing the innovations out from behind their firewall (e.g. Google search services that do not reveal search algorithms or software delivered as a service).

The combination of technology and market trends and recent court decisions on patent eligibility has altered the trade secret and patent equilibrium. Enhancement of trade-secret protection via the DTSA helps offset patent system contraction for innovators, and is likely to lead to increased focus on protecting innovation through trade secrets and a reduction in patent applications. The consequences for our innovation economy as a whole are significant.

The relative shift will reduce innovation sharing and is likely to lead to reduced investment in technologies currently deemed ineligible for patent protection and which cannot be maintained in secrecy when commercialized. Notwithstanding the rational behavior of our government to enhance trade-secret protection and our businesses to use this protection, the contraction of the patent system undermines our innovation economy, especially for less-flexible industries.

Trade secrets and patents are not mutually exclusive — each can be of value. The DTSA improves trade-secret protection. Now that the DTSA is in place, we should turn our attention toward achieving a better innovation ecosystem by reversing or legislating away recent harmful court patent decisions so we restore the proper balance between trade secrets and patents. To best promote innovation, we need both strong trade-secret protection and strong patent protection going forward.

Source:  https://techcrunch.com/2016/06/20/the-changing-trade-secret-and-patent-equilibrium/

Thursday, 19 May 2016 02:37

How to Do Internet Research

The internet has made researching a topic easier than ever before. Instead of making a trip to the library, people with internet access can simply pull up a search engine, type, and click away. But in addition to making it easier to access information, the web has also made it easier to access misinformation. By following some simple rules, however, you can avoid being fooled or misinformed by a phony, inaccurate, or biased web source.

1. Knowing Where to Begin

Decide where to start your search. If your employer, college, or university provides you with a search engine or directory, begin there. If you have access to a library database of research articles, such as EBSCOhost, start there. Library databases provide you with access to peer-reviewed research, which is the gold standard for academic study. “Peer-reviewed” means that top experts in the field have reviewed the research to make sure it is accurate, trustworthy, and informed before publication. Even if you’re just trying to learn something for your own personal benefit, academic research will provide you with the most up-to-date, reliable information.

You can usually access these databases through your home library’s website. Some academic and university libraries may require a password if you are accessing them remotely (from somewhere other than in the library itself).
If you don’t have access to a library, try using Google Scholar for your searches. You can find academic research through this search engine, and Google Scholar will show you where you can find free copies of the articles online.


2. Look for subject-specific databases.

Depending on the area of your research, you have several options for online databases specific to your field. For example, if you are looking for research on education, ERIC (the Education Resources Information Center) is sponsored by the United States Department of Education and provides peer-reviewed research and informational materials on education topics. If you’re looking for medical or scientific research, PubMed, sponsored by the United States National Library of Medicine, is a great place to start.


3. Ask a librarian.

If you have access to a library, make an appointment to speak with your reference librarian. These people are specially trained in helping you access the best research and knowledge available. They can help you find sources and also help you determine whether sources are credible.


4. Use regular search engines with caution.

Search engines crawl the web, indexing pages by reading the words and phrases that appear on them. From there, the process is automated: each search engine has an algorithm that ranks results for specific searches. This means that no human is vetting the accuracy of the results. The “top” result is simply the output of an algorithm, not an endorsement of the content or quality of the result.
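As a toy illustration of that indexing step (not any particular engine's implementation), the sketch below builds an inverted index: a map from each word to the pages that contain it, which is what makes keyword lookup fast. Ranking the matches is the separate, engine-specific part; the example pages are made up.

```python
from collections import defaultdict

# Made-up pages standing in for crawled documents.
pages = {
    "page1.html": "welfare programs in the united states",
    "page2.html": "the history of social programs",
}

# The inverted index maps each word to the set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# A keyword lookup is now a single dictionary access.
print(sorted(index["programs"]))  # ['page1.html', 'page2.html']
```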

Most search engines can be “gamed” by savvy websites in order to ensure their content comes up first. Moreover, each search engine has its own algorithm, and some tailor their results based on your browsing history. So the “top” result on Google will not necessarily be the “top” result on Yahoo, even with the exact same search phrasing.
Be aware that simply because you found information online doesn’t make it credible or authoritative. Anyone can make a webpage, and the amount of poor, unverified, and just plain wrong information often outweighs the good stuff online. To help you sift through the useless stuff, talk to your teacher or librarian, and use library or academic search engines when possible.


5. Choose your keywords carefully.

For any given inquiry, there is an almost limitless number of potential word and phrase choices you could enter into a search engine. Therefore, it’s important to think carefully about what you hope your search will find, as well as to try multiple different search combinations.
If you’re using an academic search engine, such as your library’s search feature, try using a combination of keywords and Boolean operators -- words that narrow down your search: AND, OR, and NOT -- as in the examples and the short sketch below.

For example, if you are doing research on feminism in China, you might run a search for “feminism AND China.” This will return results that include both of those topic keywords.
You can use OR to run searches for related keywords. For example, you could search for “feminism OR feminist OR social justice.” This would return results that contain one or more of those terms.

You can use NOT to exclude keywords from your search. For example, you could search for “feminism AND China NOT Japan.” You would not get any results that included Japan.
You can use quote marks to search for full phrases. For example, if you want to search for academic performance, you would search for the whole phrase inside quotation marks: “academic performance.” Be aware, though, that using quotation marks will kick out any result that isn’t an exact match. For example, you would not get results about “school performance” or “academic functioning” because they are not worded exactly the way you searched.
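Here is a small sketch of what those operators do to a result set, using the feminism examples above (the document titles are invented stand-ins for indexed pages):

```python
# Invented document titles standing in for an indexed corpus.
docs = [
    "feminism in china",
    "feminist movements in japan",
    "social justice and feminism",
    "chinese economic policy",
]

def terms(doc):
    return set(doc.split())

# AND: every keyword must appear in the document.
print([d for d in docs if {"feminism", "china"} <= terms(d)])
# -> ['feminism in china']

# OR: any one of the keywords is enough.
print([d for d in docs if {"feminism", "feminist", "justice"} & terms(d)])
# -> matches the first three documents

# NOT: drop documents that mention the excluded keyword.
print([d for d in docs if "feminism" in terms(d) and "japan" not in terms(d)])
# -> ['feminism in china', 'social justice and feminism']
```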

6. Use specific keyword phrases to locate the most relevant information.

For example, if you are looking for information on social welfare expenditures in the U.S., you’re more likely to get the results you want by searching for “total yearly amount spent on welfare programs in U.S.” than by searching for “welfare,” which would bring up definitions of welfare, types of welfare in other countries, and thousands more results you don’t want. Be aware, though, that you can’t always find information like this -- the more words you enter, the fewer results you’re likely to get.

7. Use alternate words or keyword phrases to locate additional research sources.

For example, if you are researching “welfare,” consider using “safety net” or “social programs” or “public assistance” in place of “welfare” to find different results. In many cases, your word choice might unintentionally bias your results, since terms like “welfare” are often politically loaded. Using a wider variety of terms ensures that you’ll be exposed to a broader — and therefore potentially less biased — set of sources.

Source:  http://www.wikihow.com/Do-Internet-Research

Twitter plans to let people fire off links or pictures without eating into the 140-character limit set for posts at the one-to-many messaging service, Bloomberg has reported.

The change could take place by the end of this month, according to a Bloomberg report that yesterday cited someone familiar with the matter.

Twitter declined to comment, but the move would come as the San Francisco-based company strives to ramp up the number of users along with how much people communicate at the service.


Analysts have maintained that relaxing a 140-character limit set due to mobile phone text messaging constraints in place when Twitter launched in 2006 would encourage use.

And, as people increasingly communicate by sharing pictures, videos and links, making Twitter more conducive to that content would play into the trend.


Twitter last month released a quarterly update that showed little change in its user base.

In a key metric in the fast-evolving social media world, Twitter's trend was flat. The number of monthly active users was 310 million, up three percent from a year ago and only slightly higher than the 305 million in the previous quarter.


Twitter changed the way it measures active users, no longer counting those who follow via SMS text messaging. The number reported in the fourth quarter including that group was 320 million.

Twitter announced separately it has added BET Networks chief executive Debra Lee to its board.

Lee has fired off thousands of tweets and has 67,400 followers at the service, according to her account @iamdebralee.

"Twitter has been and continues to be a transformative service for the media landscape and the world," Lee said in a statement.

In a tweet, Lee said that she is "thrilled" to be on the Twitter board.

Source:  http://www.dnaindia.com/scitech/report-twitter-to-soon-exclude-photos-and-links-in-140-character-limit-2213142


Alec Green is vice president of marketing for The Search Agency, an online marketing agency specializing in SEM, SEO, Social Media, Display, CPO, & Landing Page Optimization.

Search engines adapt to compete in the ever-changing search marketplace. But because they derive revenue from ads, search engines have an incentive to convince users to click on those ads -- which may come at the cost of user experience as the lines between paid ads and organic search results blur. Some engines seem to cut corners rather than clearly earn those clicks.

This blatant disregard for online advertising rules prompted the Federal Trade Commission to revise its original 2002 guidelines for ad disclosure.


Clarity and prominence are the cornerstones of the no-longer-new guidelines -- search engines must clearly and prominently distinguish ads from organic search results. Visually, the FTC’s letter requires that advertisements use evident shading with a clear outline, a prominent border, or both. For text labels, the guidelines call for a uniform label that unambiguously identifies the result as advertising, is clearly visible, and is located close enough to the results it labels.

So how do the big three search engines stack up? All of them have room to improve. Let’s start with Google, which received 86.9 percent of the ad-click share according to a recent report.


Google features two ad displays. The ads at the top of the organic results have a distinct tan background and include a text label in the top left corner that clearly identifies the results as ads -- all good qualities. However, while the ads to the right also have a text label, they lack a shaded background or noticeable border.

How would competitor Bing stack up on the same search query?


Bing’s page resembles Google’s in its use of two sections of ads, one in a shaded box above the organic results and one in a panel to the side. Bing’s shading, however, consists of a translucent green box that nearly disappears when viewed on different monitors. It does get bonus points for adding a shaded edge to the right, giving the box a slightly 3D look that, if exaggerated, could really improve the ads’ distinction from the rest of the results. The text label, “Ads,” is clear enough but appears as tiny text in the top right, against the FTC’s recommendation that labels appear on the left. The ads in the side panel have the same text label but, like Google’s, lack background shading or a prominent border.

Finally, let’s look at Yahoo’s performance:


Yahoo is the worst offender by far. One glance at the search results page reveals a weak distinction between ads and organic results. To its credit, Yahoo does include decent text labels, even placing them in the top left corner. However, the shading, while existent, is nowhere close to the FTC’s guideline of being “prominent.”


These examples show that all the major search engines have room for improvement. The fight for ad revenue is becoming increasingly fierce, and blurring the line on ad disclosure could be one means of garnering more paid clicks. But the FTC is cracking down, and it won’t be long before the search engines are forced to respond.


Source:  http://digiday.com/agencies/search-engines-not-complying-with-ftc-guidelines/


China has launched an investigation into search giant Baidu after the death of a student who tried an experimental cancer therapy he found online.

Wei Zexi, who died last month from a rare form of cancer, had sought the treatment from a hospital that came top of the list on his Baidu web search.
Baidu has come under fire for allegedly selling listings to bidders without adequately checking their claims. In a statement, Baidu said it was investigating the matter. The company told the BBC: "We deeply regret the death of Wei Zexi and our condolences go out to his family.


"Baidu strives to provide a safe and trustworthy search experience for our users, and has launched an immediate investigation of the matter."


Baidu owns search engine and social media services, and is often compared to Google. On Baidu, listings that have paid for a prominent placement are marked at the bottom with a small sign saying "promote", but many say this does not identify them as paid-for listings sufficiently clearly.


Baidu's value fell by more than $5bn on Tuesday, after its shares slumped in the US on news of an investigation by China's internet regulator. Baidu's Nasdaq-listed shares fell 7.92%.
Baidu is China's largest search engine with 70% market share and more than 660 million people using its mobile search every month.


Stephen McDonell, BBC News, China Correspondent


China's massive search engine Baidu is facing a colossal credibility problem.
It's one thing for the company to sell top search slots to the highest bidders, but another to manage the fallout when a young man dies of cancer after trusting that a highly placed search result pointed to the most trustworthy hospital treatment available to him.
All over the world, private companies are administering what on one level are public services, and those services are being distorted by commercial deals - and this country is no exception.


Baidu says it is looking into this matter but, for the Chinese authorities, that is not enough.
The Chinese government says it will carry out an official inquiry into the role Baidu's "search results for sale" business model played in the death of Wei Zexi.
What's more, it says the findings of the investigation will be made public.
Chinese people were already annoyed that Baidu sold off search positions, but when it became a matter of life and death, that annoyance turned to rage.


According to state news agency Xinhua, Wei was diagnosed with synovial sarcoma in 2014.
He and his family said he found out about a controversial treatment at the Second Hospital of the Beijing Armed Police Corps through Baidu. The hospital was listed at the top of his search results.


But the treatment was unsuccessful and the 21-year-old student died on 12 April.
Before his death, Wei publicly accused the hospital of misleading him and his family about the treatment's effectiveness, and criticised Baidu for selling search listings for medical information to the highest bidder.


Baidu has denied ranking hospitals in promoted search results solely based on how much they paid, and says the hospital had been approved by the Beijing municipal government.


Investigations have now been launched against the hospital.


The hospital has yet to comment and efforts to contact hospital officials have been unsuccessful.
In addition to the Cyberspace Administration of China, several other government agencies including the State Administration of Industry and Commerce, and the National Health and Family Planning Commission are looking into the matter.
The outcry over the case follows a similar scandal in January involving ethical practices regarding healthcare advertising.


Source:  http://www.bbc.com
